
Ford aims at self-driving ride shares by 2021

Ford has recently announced its intent to have a high-volume, fully autonomous SAE Level 4-capable vehicle in commercial operation in 2021 in a ride-hailing or ride-sharing service.

To get there, the company is investing in or collaborating with four startups to enhance its autonomous vehicle development, doubling its Silicon Valley team and more than doubling its Palo Alto campus.

“The next decade will be defined by automation of the automobile, and we see autonomous vehicles as having as significant an impact on society as Ford’s moving assembly line did 100 years ago,” said Mark Fields, Ford president and CEO. “We’re dedicated to putting on the road an autonomous vehicle that can improve safety and solve social and environmental challenges for millions of people – not just those who can afford luxury vehicles.”

Autonomous vehicles in 2021 are part of Ford Smart Mobility, the company’s plan to be a leader in autonomous vehicles, as well as in connectivity, mobility, the customer experience, and data and analytics.

Driving autonomous vehicle leadership

Building on more than a decade of autonomous vehicle research and development, Ford’s first fully autonomous vehicle will be a Society of Automotive Engineers-rated level 4-capable vehicle without a steering wheel or gas and brake pedals. It is being specifically designed for commercial mobility services, such as ride sharing and ride hailing, and will be available in high volumes.

“Ford has been developing and testing autonomous vehicles for more than 10 years,” said Raj Nair, Ford executive vice president, Global Product Development, and chief technical officer. “We have a strategic advantage because of our ability to combine the software and sensing technology with the sophisticated engineering necessary to manufacture high-quality vehicles. That is what it takes to make autonomous vehicles a reality for millions of people around the world.”

This year, Ford will triple its autonomous vehicle test fleet to be the largest test fleet of any automaker – bringing the number to about 30 self-driving Fusion Hybrid sedans on the roads in California, Arizona and Michigan, with plans to triple it again next year.

Ford was the first automaker to begin testing its vehicles at Mcity, the University of Michigan’s simulated urban environment; the first automaker to publicly demonstrate autonomous vehicle operation in the snow; and the first automaker to test its autonomous research vehicles at night, in complete darkness, as part of LiDAR sensor development.

To deliver an autonomous vehicle in 2021, Ford is announcing four key investments and collaborations that are expanding its strong research in advanced algorithms, 3D mapping, LiDAR, and radar and camera sensors:

  • Velodyne: Ford has invested in Velodyne, the Silicon Valley-based leader in light detection and ranging (LiDAR) sensors. The aim is to quickly mass-produce a more affordable automotive LiDAR sensor. Ford has a longstanding relationship with Velodyne, and was among the first to use LiDAR for both high-resolution mapping and autonomous driving beginning more than 10 years ago
  • SAIPS: Ford has acquired the Israel-based computer vision and machine learning company to further strengthen its expertise in artificial intelligence and enhance computer vision. SAIPS has developed algorithmic solutions in image and video processing, deep learning, signal processing and classification. This expertise will help Ford autonomous vehicles learn and adapt to their surroundings
  • Civil Maps: Ford has invested in Berkeley, California-based Civil Maps to further develop high-resolution 3D mapping capabilities. Civil Maps has pioneered an innovative 3D mapping technique that is scalable and more efficient than existing processes. This provides Ford another way to develop high-resolution 3D maps of autonomous vehicle environments
  • Nirenberg Neuroscience LLC: Ford has an exclusive licensing agreement with Nirenberg Neuroscience, a machine vision company founded by neuroscientist Dr. Sheila Nirenberg, who cracked the neural code the eye uses to transmit visual information to the brain. This has led to a powerful machine vision platform for performing navigation, object recognition, facial recognition and other functions, with many potential applications. For example, it is already being applied by Dr. Nirenberg to develop a device for restoring sight to patients with degenerative diseases of the retina. Ford’s partnership with Nirenberg Neuroscience will help bring humanlike intelligence to the machine learning modules of its autonomous vehicle virtual driver system

Silicon Valley expansion

Ford also is expanding its Silicon Valley operations, creating a dedicated campus in Palo Alto.

Adding two new buildings and 150,000 square feet of work and lab space adjacent to the current Research and Innovation Center, the expanded campus grows the company’s local footprint and supports plans to double the size of the Palo Alto team by the end of 2017.

“Our presence in Silicon Valley has been integral to accelerating our learning and deliverables driving Ford Smart Mobility,” said Ken Washington, Ford vice president, Research and Advanced Engineering. “Our goal was to become a member of the community. Today, we are actively working with more than 40 startups, and have developed a strong collaboration with many incubators, allowing us to accelerate development of technologies and services.”

Since the new Ford Research and Innovation Center Palo Alto opened in January 2015, the facility has rapidly grown to be one of the largest automotive manufacturer research centers in the region. Today, it is home to more than 130 researchers, engineers and scientists, who are increasing Ford’s collaboration with the Silicon Valley ecosystem.

Research and Innovation Center Palo Alto, home to multi-disciplinary research and innovation, is the newest of nearly a dozen Ford global research, innovation, IT and engineering centers. The expanded Palo Alto campus opens in mid-2017.

Meet Aston Martin F1’s incredible moving data centre

The Aston Martin Red Bull Racing team faces far more IT challenges than your average enterprise: not many IT teams have to rebuild their data center 21 times each year and get it up and running in a matter of hours. Not many data centers are packed up and transported around the world by air and sea along with 45 tonnes of equipment. And not many IT technicians also have to double as pit stop mechanics.

The trackside garage at an F1 race is a tight working environment, and a team of only two IT technicians faces pressure from both the factory and trackside staff to get the trackside IT up and running very quickly. Yet, despite all these pressures, Aston Martin Red Bull Racing does not have a cloud-led strategy. Instead, the team has chosen to keep all IT in-house.

The reason for this is performance. F1 is arguably the ultimate performance sport. A walk around the team’s factory in Milton Keynes, England, makes it abundantly clear that the whole organization is hell-bent on maximizing performance. The 700 staff at the factory are all essentially dedicated to the creation of just two cars. The level of detail demanded in reaching peak performance is truly mind-blowing. For example, one robotic-arm machine that checks the dimensions of components built at the factory is able to measure to a scale 10 times thinner than a human hair.

This quest for maximum performance, however, is hampered at every turn by stringent rules from the F1 governing body, the FIA. Teams face restrictions on testing and technology usage in order to prevent the sport from becoming an arms race. Pre-season track testing, for example, is limited to just eight days. Furthermore, wind tunnel testing is only allowed with 60% scale models, and wind tunnel usage is balanced against the use of Computational Fluid Dynamics (CFD) software, essentially a virtual wind tunnel. Teams that overuse one lose time with the other.

In order to maximize performance within uniquely difficult logistical and regulatory conditions, the Aston Martin Red Bull Racing team has had to deploy a very powerful and agile IT estate.

According to Neil Bailey, Head of IT Infrastructure, Enterprise Architecture and Innovation, their legacy trackside infrastructure was “creaking”. Before choosing hyperconverged infrastructure, their “traditional IT had reached its limits”, says Bailey. “When things reach their limits they break, just like a car,” adds Bailey.

The team’s main reason for switching to HPE’s hyperconverged infrastructure, SimpliVity, was performance. Now, with “the extra performance of SimpliVity, it means it doesn’t get to its limits,” says Bailey. HPE SimpliVity has helped reduce space, optimized processing power and brought more agility.

One of the first and most important workloads the team moved to hyperconverged infrastructure was post-processing trackside data. During a race weekend, each car is typically fitted with over 100 sensors providing key data on things like tyre temperature and downforce multiple times per second. Processing this data and acting on the insights is key to driving performance improvements. With their legacy infrastructure, Bailey says they were “losing valuable track time during free practice waiting for data processing to take place.” Since switching to HPE SimpliVity, data processing time has dropped from more than 15 minutes to less than five minutes. Overall, the team has seen a 79% performance boost compared to the legacy architecture. This has enabled real-time race strategy analysis and improved race strategy decision making.
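
Neither the team nor HPE has published the details of this pipeline, but the general shape of such post-processing can be sketched as follows. The sensor names, sample values and tyre temperature window are purely illustrative assumptions, not the team’s actual telemetry schema.

```python
# Illustrative sketch only: the team's real telemetry schema, sampling rates
# and thresholds are not public. This shows the general shape of turning raw
# per-lap sensor samples into the kind of summary a strategist might act on.
from collections import defaultdict
from statistics import mean

# Hypothetical samples: (lap, sensor_name, value)
samples = [
    (12, "tyre_temp_front_left", 103.4),
    (12, "tyre_temp_front_left", 105.1),
    (12, "downforce_rear", 5320.0),
    (13, "tyre_temp_front_left", 112.9),
    (13, "downforce_rear", 5290.0),
]

# Assumed tyre operating window in degrees Celsius (purely illustrative).
TYRE_TEMP_WINDOW = (85.0, 110.0)


def summarise(samples):
    """Average each sensor's readings per lap."""
    grouped = defaultdict(list)
    for lap, sensor, value in samples:
        grouped[(lap, sensor)].append(value)
    return {key: mean(values) for key, values in grouped.items()}


def tyre_alerts(summary):
    """Flag laps where a tyre temperature average leaves the assumed window."""
    low, high = TYRE_TEMP_WINDOW
    return [
        (lap, sensor, avg)
        for (lap, sensor), avg in sorted(summary.items())
        if sensor.startswith("tyre_temp") and not low <= avg <= high
    ]


if __name__ == "__main__":
    for lap, sensor, avg in tyre_alerts(summarise(samples)):
        print(f"Lap {lap}: {sensor} averaged {avg:.1f} C, outside the window")
```

In practice the team processes far richer data at much higher rates; the point is simply that post-processing turns raw samples into a handful of actionable numbers, and the faster that happens, the sooner strategists can react during free practice.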

Data insights help the team stay one step ahead, as race strategy decisions are data driven. For example, real-time tyre temperature data helps the team judge tyre wear and make pit stop decisions. Real-time access to tyre data helped the team to victory at the 2018 Chinese Grand Prix, as the Aston Martin Red Bull cars pitted ahead of the rest of the field and Daniel Ricciardo swept to a memorable victory.

Hyperconverged infrastructure is also well suited to the “hostile” trackside environment, according to Bailey. With hyperconverged infrastructure, only two racks are needed at each race, of which SimpliVity takes up only about 20% of the space, freeing up room in very restricted trackside garages. Furthermore, with the team limited to 60 staff at each race, only two of Bailey’s team can travel. The reduction in equipment and the closer integration of HPE SimpliVity mean engineers can get the trackside data center up and running quickly, allowing trackside staff to start work as soon as they arrive.

Since seeing the notable performance gains from using hyperconverged infrastructure for trackside data processing, the team has also transitioned some of the factory’s IT estate over to HPE SimpliVity. This includes aerodynamic metrics, the ERP system, SQL Server, Exchange Server and Team Foundation Server for the team’s software house.

As well as seeing huge performance benefits, HPE SimpliVity has significantly impacted the work patterns of Bailey’s team of just ten. According to Bailey, the biggest operational win from hyperconverged infrastructure is “freeing up engineers’ time from focusing on ‘business as usual’ to innovation.” Traditional IT took up too much of the engineers’ time monitoring systems and just keeping things running. Now with HPE SimpliVity, Bailey’s team can “give the business more and quicker” and “be more creative with how they use technology.”

Hyperconverged infrastructure has given Aston Martin Red Bull Racing the speed, scalability and agility they require without any need to turn to the cloud. It allows them to deliver more and more resources to trackside staff in an increasingly responsive manner. However, even with all these performance gains, Aston Martin Red Bull Racing has been able to reduce IT costs. So, the users are happy, the finance director is happy and the IT team are happy because their jobs are easier. Hyperconvergence is clearly the right choice for the unique challenges of Formula 1 racing.

Body-tracking tech moves to assembly line

Technology typically used by the world’s top sport stars to raise their game, or ensure their signature skills are accurately replicated in leading video games, is now being used on an auto assembly line.

Employees at Ford’s Valencia Engine Assembly Plant, in Spain, are using a special suit equipped with advanced body tracking technology. The pilot system, created by Ford and the Instituto Biomecánica de Valencia, has involved 70 employees in 21 work areas. 

Player motion technology usually records how athletes sprint or turn, enabling sport coaches or game developers to unlock the potential of sport stars in the real world or on screen. Ford is using it to design less physically stressful workstations for enhanced manufacturing quality.

“It’s been proven on the sports field that with motion tracking technology, tiny adjustments to the way you move can have a huge benefit,” said Javier Gisbert, production area manager, Ford Valencia Engine Assembly Plant. “For our employees, changes made to work areas using similar technology can ultimately ensure that, even on a long day, they are able to work comfortably.”

Engineers took inspiration from a suit they saw at a trade fair that demonstrated how robots could replicate human movement, and then applied it to their workplace, where production of the new Ford Transit Connect and 2.0-litre EcoBoost Duratec engines began this month.

The skin-tight suit consists of 15 tiny movement tracking light sensors connected to a wireless detection unit. The system tracks how the person moves at work, highlighting head, neck, shoulder and limb movements. Movement is recorded by four specialised motion-tracking cameras – similar to those usually paired with computer game consoles – placed near the worker and captured as a 3D skeletal character animation of the user.
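
Ford and the Instituto Biomecánica de Valencia have not published how the skeletal data is processed, but a minimal sketch of the kind of calculation involved might look like this; the keypoint coordinates and the choice of an elbow angle are illustrative assumptions only.

```python
# Illustrative sketch only: the Ford / Instituto Biomecánica de Valencia
# processing code is not public. This shows how one joint angle can be read
# off the 3D skeletal keypoints that such a motion-tracking system produces.
import math


def joint_angle(a, b, c):
    """Angle at point b (in degrees) formed by 3D points a-b-c,
    e.g. shoulder-elbow-wrist for an elbow flexion measurement."""
    ab = [a[i] - b[i] for i in range(3)]
    cb = [c[i] - b[i] for i in range(3)]
    dot = sum(ab[i] * cb[i] for i in range(3))
    norm_ab = math.sqrt(sum(v * v for v in ab))
    norm_cb = math.sqrt(sum(v * v for v in cb))
    return math.degrees(math.acos(dot / (norm_ab * norm_cb)))


# Hypothetical keypoints (in metres) from one frame of the skeletal animation.
shoulder, elbow, wrist = (0.00, 1.40, 0.00), (0.05, 1.10, 0.10), (0.30, 1.00, 0.15)
print(f"Elbow angle: {joint_angle(shoulder, elbow, wrist):.1f} degrees")
```

An angle series like this, tracked across a shift, is the kind of signal ergonomists can use to spot postures held outside comfortable ranges.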

Specially trained ergonomists then use the data to help employees align their posture correctly. Measurements captured by the system, such as an employee’s height or arm length, are used to design workstations, so they better fit employees. 
