As autonomous cars get closer to becoming a reality, MIKE WHITFIELD, MD at Nissan South Africa, believes they will also be smart enough to make life-saving decisions without human intervention.
Autonomous driving technology is developing at a rapid pace. Business Insider's research platform has forecast that there will be around 10 million cars with various self-driving features on the road in the UK by 2020. But the closer we get to the ultimate goal of completely driverless cars, the more critical it becomes for manufacturers to ensure these vehicles are safe to put on the road.
It’s no secret that autonomous driving technology has the ability to change lives – and to save them. Not only is this technology expected to reduce serious traffic incidents – the Society of Motor Manufacturers and Traders (SMMT) has predicted that accidents in the UK will fall by 25 000 a year by 2030 – but it will also make automotive transportation available to people who were previously unable to drive.
But as autonomous driving technology continues to advance, so too do important questions around the complexity of having these vehicles on the road. For example, how can drivers learn to trust autonomous vehicles? How will vehicles communicate with drivers and alert them to the presence of other vehicles on the road? And what actions will vehicles take after identifying objects, signs and other road infrastructure such as painted lanes?
Can driverless cars handle unpredictable situations?
One of the biggest questions around the safety of this technology is what would happen in an unpredictable situation. Would the system make the right decision and navigate the vehicle through the scenario safely?
At the moment the autonomous driving technology used on roads is not fully autonomous. Nissan’s ProPILOT, for example, still requires a driver to be present and ready to take control of the vehicle at any moment.
The technology, which launched and went on sale in Japan last year, enables cars to drive autonomously in a single lane, including in heavy stop-and-go traffic. It’s the first time that a combination of steering, acceleration and braking has been operated in fully automatic mode, easing the workload of the driver in heavy traffic.
However, ultimate control and responsibility remains with the driver.
In fact, should the driver remove their hands from the steering wheel, a warning light will come on and an alarm will sound. The system deactivates until the driver places their hands back on the wheel.
The day is fast approaching, though, when completely driverless cars will become a reality.
When that day comes, the question of who takes control in an emergency will need to be answered – particularly in situations where the technology is required to make an ethical decision. For example, swerving to avoid a pedestrian might endanger the passengers inside the vehicle. How does the technology discern the right course of action in this instance?
Not surprisingly, the inability of autonomous vehicles to ‘handle’ these unpredictable situations is one of the major stumbling blocks to a future of fully autonomous driving.
The good news, however, is that Nissan’s Seamless Autonomous Mobility (SAM) system is designed to solve this problem. SAM can navigate unforeseen situations such as accidents, road construction and other obstacles. Ultimately, SAM will help us realise a future in which autonomous cars can operate safely and smoothly.
How does SAM work?
Basically, SAM is smart enough to know when not to navigate a potentially dangerous situation by itself.
Let’s say while driving you encounter an accident scene at which police are using hand signals to direct traffic, possibly against the normal rules of the road. In this scenario SAM will bring your vehicle to a safe stop and request help from the command centre.
This request is passed on to a mobility manager – an actual person who is using vehicle images and sensor data (streamed via the wireless network) to assess the situation, decide on the correct action, and create a safe path around the obstruction.
The mobility manager paints a virtual lane for the vehicle to drive through. Once the vehicle clears the accident scene, it resumes full autonomy.
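The workflow just described – detect an unresolvable scene, stop safely, escalate to a mobility manager, follow the painted virtual lane, then resume – can be sketched as a simple state machine. This is purely illustrative: the class, state names and messages below are assumptions made for the sketch, not Nissan’s actual SAM software.

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()  # normal self-driving
    STOPPED = auto()     # safe stop, awaiting remote assistance
    GUIDED = auto()      # following a human-painted virtual lane

class SamVehicle:
    """Illustrative model of the SAM escalation loop (hypothetical API)."""

    def __init__(self):
        self.mode = Mode.AUTONOMOUS
        self.lane = []

    def encounter_obstruction(self):
        # The vehicle recognises a scene it cannot resolve alone
        # (e.g. police hand signals) and comes to a safe stop.
        self.mode = Mode.STOPPED
        return "help requested from command centre"

    def receive_virtual_lane(self, waypoints):
        # A mobility manager reviews streamed camera and sensor data,
        # then paints a safe path around the obstruction.
        self.lane = list(waypoints)
        self.mode = Mode.GUIDED

    def clear_obstruction(self):
        # Past the scene, the vehicle resumes full autonomy.
        self.lane = []
        self.mode = Mode.AUTONOMOUS

car = SamVehicle()
print(car.encounter_obstruction())
car.receive_virtual_lane([(0, 0), (5, 2), (10, 0)])
car.clear_obstruction()
```

The key design point the article describes is the middle state: the vehicle never tries to improvise through a scene it cannot interpret; it hands control of path-planning, but not driving, to a human.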
The great thing about SAM is that it’s able to learn from experience – and as autonomous technology improves, vehicles will require less assistance from the mobility managers.
This technology could speed up the introduction of autonomous vehicles to our roads by decades.
Meet Aston Martin F1’s incredible moving data centre
The Aston Martin Red Bull Racing team faces far more IT challenges than your average enterprise. Not many IT teams have to rebuild their data center 21 times a year and get it up and running in a matter of hours. Not many data centers are packed up and transported around the world by air and sea along with 45 tonnes of equipment. And not many IT technicians have to perform a dual role as pit stop mechanic.
The trackside garage at an F1 race is a tight working environment, and a team of only two IT technicians faces pressure from both the factory and trackside staff to get the trackside IT up and running fast. Yet, despite all these pressures, Aston Martin Red Bull Racing does not have a cloud-led strategy. Instead, the team has chosen to keep all IT in house.
The reason for this is performance. F1 is arguably the ultimate performance sport. A walk around the team’s factory in Milton Keynes, England, makes it abundantly clear that the whole organization is hell-bent on maximizing performance. The 700 staff at the factory are all essentially dedicated to the creation of just two cars. The level of detail demanded in reaching peak performance is mind-blowing. For example, one machine with a robotic arm that checks the dimensions of components built at the factory can measure to tolerances 10 times finer than the width of a human hair.
This quest for maximum performance, however, is hampered at every turn by stringent rules from the F1 governing body, the FIA. Teams face restrictions on testing and technology usage to prevent the sport from becoming an arms race. Pre-season track testing, for example, is limited to only eight days. Furthermore, wind tunnel testing is only allowed with 60% scale models, and wind tunnel usage is balanced against the use of Computational Fluid Dynamics (CFD) software, essentially a virtual wind tunnel. Teams that overuse one lose time with the other.
In order to maximize performance within uniquely difficult logistical and regulatory conditions, the Aston Martin Red Bull Racing team has had to deploy a very powerful and agile IT estate.
According to Neil Bailey, Head of IT Infrastructure, Enterprise Architecture and Innovation, their legacy trackside infrastructure was “creaking”. Before choosing hyperconverged infrastructure, their “traditional IT had reached its limits”, says Bailey. “When things reach their limits they break, just like a car,” adds Bailey.
The team’s biggest reason for switching to HPE’s hyperconverged infrastructure, SimpliVity, was performance. Now, with “the extra performance of SimpliVity, it means it doesn’t get to its limits,” says Bailey. HPE SimpliVity has reduced the space required, optimized processing power and brought more agility.
One of the first and most important workloads they moved to hyperconverged infrastructure was post-processing trackside data. During a race weekend each car is typically fitted with over 100 sensors providing key data on things like tyre temperature and downforce multiple times per second. Processing this data and acting on the insights is key to driving performance improvements. With their legacy infrastructure, Bailey says they were “losing valuable track time during free practice waiting for data processing to take place.” Since switching to HPE SimpliVity, data processing times have dropped from more than 15 minutes to less than five. Overall, the team has seen a 79% performance boost compared to the legacy architecture, which has enabled real-time race strategy analysis and improved decision making.
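To give a rough sense of the kind of post-processing involved, the sketch below aggregates per-sample tyre-temperature telemetry into per-lap averages and flags laps where a tyre runs hot. The field names, sample values and the 100-degree threshold are all invented for the illustration; the team’s actual telemetry schema and tooling are not public in this article.

```python
from collections import defaultdict
from statistics import mean

# Each sample: (lap_number, sensor_name, reading). A real car streams
# readings from 100+ sensors many times per second.
samples = [
    (12, "tyre_temp_FL", 96.0),
    (12, "tyre_temp_FL", 98.5),
    (13, "tyre_temp_FL", 103.0),
    (13, "tyre_temp_FL", 105.5),
]

def lap_averages(samples, sensor):
    """Average a single sensor's readings per lap."""
    by_lap = defaultdict(list)
    for lap, name, value in samples:
        if name == sensor:
            by_lap[lap].append(value)
    return {lap: mean(vals) for lap, vals in by_lap.items()}

averages = lap_averages(samples, "tyre_temp_FL")
# Flag laps where the front-left tyre runs hot (threshold invented).
hot_laps = sorted(lap for lap, avg in averages.items() if avg > 100.0)
print(averages)   # {12: 97.25, 13: 104.25}
print(hot_laps)   # [13]
```

Summaries like these are what race engineers act on between practice sessions, which is why shaving the processing window from 15 minutes to under five matters so much.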
Data insights help the team stay one step ahead, as race strategy decisions are data driven. For example, real-time tyre temperature data helps the team judge tyre wear and make pit stop decisions. Real-time access to tyre data paid off at the 2018 Chinese Grand Prix, where the Aston Martin Red Bull cars pitted ahead of the rest of the field and Daniel Ricciardo swept to a memorable victory.
Hyperconverged infrastructure is also well suited to the “hostile” trackside environment, according to Bailey. Only two racks are needed at each race, with SimpliVity taking up only about 20% of that space, freeing up room in very restricted trackside garages. Furthermore, with the team limited to 60 staff at each race, only two of Bailey’s team can travel. The reduction in equipment and closer integration of HPE SimpliVity means engineers can get the trackside data center up and running quickly, allowing trackside staff to start work as soon as they arrive.
Since seeing the notable performance gains from using hyperconverged infrastructure for trackside data processing, the team has also transitioned some of the factory’s IT estate over to HPE SimpliVity. This includes aerodynamic metrics, the ERP system, SQL Server, Exchange Server and the team’s software house, Team Foundation Server.
As well as seeing huge performance benefits, HPE SimpliVity has significantly impacted the work patterns of Bailey’s team of just ten. According to Bailey, the biggest operational win from hyperconverged infrastructure is “freeing up engineers’ time from focusing on ‘business as usual’ to innovation.” Traditional IT took up too much of the engineers’ time monitoring systems and just keeping things running. Now with HPE SimpliVity, Bailey’s team can “give the business more and quicker” and “be more creative with how they use technology.”
Hyperconverged infrastructure has given Aston Martin Red Bull Racing the speed, scalability and agility they require without any need to turn to the cloud, allowing the team to deliver ever more resources to trackside staff in an increasingly responsive manner. Even with all these performance gains, Aston Martin Red Bull Racing has been able to reduce IT costs. So the users are happy, the finance director is happy and the IT team is happy because their jobs are easier. Hyperconvergence is clearly the right choice for the unique challenges of Formula 1 racing.
Body-tracking tech moves to assembly line
Technology typically used by the world’s top sport stars to raise their game, or ensure their signature skills are accurately replicated in leading video games, is now being used on an auto assembly line.
Employees at Ford’s Valencia Engine Assembly Plant, in Spain, are using a special suit equipped with advanced body tracking technology. The pilot system, created by Ford and the Instituto Biomecánica de Valencia, has involved 70 employees in 21 work areas.
Player motion technology usually records how athletes sprint or turn, enabling sport coaches or game developers to unlock the potential of sport stars in the real world or on screen. Ford is using it to design less physically stressful workstations for enhanced manufacturing quality.
“It’s been proven on the sports field that with motion tracking technology, tiny adjustments to the way you move can have a huge benefit,” said Javier Gisbert, production area manager, Ford Valencia Engine Assembly Plant. “For our employees, changes made to work areas using similar technology can ultimately ensure that, even on a long day, they are able to work comfortably.”
Engineers took inspiration from a suit they saw at a trade fair that demonstrated how robots could replicate human movement and then applied it to their workplace, where production of the new Ford Transit Connect and 2.0-litre EcoBoost Duratec engines began this month.
The skin-tight suit consists of 15 tiny light sensors for movement tracking, connected to a wireless detection unit. The system tracks how the person moves at work, highlighting head, neck, shoulder and limb movements. Movement is recorded by four specialised motion-tracking cameras – similar to those usually paired with computer game consoles – placed near the worker, and captured as a 3D skeletal animation of the user.
Specially trained ergonomists then use the data to help employees align their posture correctly. Measurements captured by the system, such as an employee’s height or arm length, are used to design workstations, so they better fit employees.
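As a hedged illustration of the kind of calculation an ergonomic assessment might perform on this skeletal data, the sketch below computes the angle at a joint from three tracked 3D positions. The joint positions and the idea of thresholding the angle are assumptions for the example, not details of Ford’s actual system.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by 3D points a-b-c."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ba, bc))
    norm = math.hypot(*ba) * math.hypot(*bc)
    return math.degrees(math.acos(dot / norm))

# Hypothetical shoulder, elbow and wrist positions (metres) captured
# by the suit's sensors while a worker reaches for a component.
shoulder = (0.0, 1.4, 0.0)
elbow = (0.0, 1.1, 0.2)
wrist = (0.0, 1.1, 0.5)
angle = joint_angle(shoulder, elbow, wrist)
```

From a stream of such angles over a shift, an ergonomist could spot postures held outside a comfortable range and adjust the workstation height or reach accordingly.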