As the autonomous car gets closer to becoming a reality, MIKE WHITFIELD, MD at Nissan South Africa, believes these vehicles will also be smart enough to make life-saving decisions without human intervention.
Autonomous driving technology is developing at a rapid pace. Business Insider’s research platform has forecast that around 10 million cars with various self-driving features will be on the road in the UK by 2020. But the closer we get to the ultimate goal of completely driverless cars, the more critical it becomes for manufacturers to ensure these vehicles are safe to put on the road.
It’s no secret that autonomous driving technology has the ability to change lives and to save them. Not only is this technology expected to reduce serious traffic incidents – the Society of Motor Manufacturers and Traders (SMMT) predicted that UK accidents would fall by 25 000 a year by 2030 – but it will also make automotive transportation available to people who were previously unable to drive.
But as autonomous driving technology advances, important questions about the complexity of having these vehicles on the road continue to arise. For example, how can drivers learn to trust autonomous vehicles? How will vehicles communicate with drivers and alert them to the presence of other vehicles on the road? And what actions will vehicles take after identifying objects, signs and other road infrastructure such as painted lanes?
Can driverless cars handle unpredictable situations?
One of the biggest questions around the safety of this technology is what happens in an unpredictable situation. Would the system make the right decision and navigate the vehicle through the scenario safely?
At the moment, the autonomous driving technology used on roads is not fully autonomous. Nissan’s ProPILOT still requires a driver to be present and ready to take control of the vehicle at any moment.
The technology, which launched and went on sale in Japan last year, enables cars to drive autonomously in a single lane, including in heavy stop-and-go traffic. It’s the first time that steering, acceleration and braking have been operated together in fully automatic mode, easing the workload of the driver in heavy traffic.
However, ultimate control and responsibility remains with the driver.
In fact, should the driver remove their hands from the steering wheel, a warning light comes on and an alarm sounds. The system deactivates until the driver places their hands back on the wheel.
The day is fast approaching, though, when completely driverless cars will become a reality.
When that day comes, the question of who takes control in an emergency situation will need to be answered.
That is particularly true of situations in which the technology would be required to make an ethical decision. For example, swerving to avoid hitting a pedestrian might endanger the passengers inside the vehicle. How does the technology discern the right course of action in this instance?
Not surprisingly, the inability of autonomous vehicles to ‘handle’ these unpredictable situations is one of the major stumbling blocks to a future of fully autonomous driving.
The good news, however, is that Nissan’s Seamless Autonomous Mobility (SAM) system is designed to solve this problem: it can navigate unforeseen situations such as accidents, road construction and other obstacles. Ultimately, SAM will help us realise a future in which autonomous cars can operate safely and smoothly.
How does SAM work?
Basically, SAM is smart enough to know when not to navigate a potentially dangerous situation by itself.
Let’s say while driving you encounter an accident scene at which police are using hand signals to direct traffic, possibly against the normal rules of the road. In this scenario SAM will bring your vehicle to a safe stop and request help from the command centre.
This request is passed on to a mobility manager – an actual person who is using vehicle images and sensor data (streamed via the wireless network) to assess the situation, decide on the correct action, and create a safe path around the obstruction.
The mobility manager paints a virtual lane for the vehicle to drive itself through. Once the vehicle clears the accident scene, it resumes full autonomy.
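The handoff sequence described above – stop safely, request guidance, follow the painted lane, resume autonomy – can be sketched as a tiny state machine. All class and method names here are hypothetical, purely to illustrate the flow; they are not Nissan’s actual software:

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()        # vehicle driving itself normally
    AWAITING_GUIDANCE = auto() # stopped, waiting for the command centre
    GUIDED = auto()            # following a virtual lane from a mobility manager

class SamVehicle:
    """Toy model of the SAM handoff: stop, request help,
    follow a virtual lane, then resume full autonomy."""

    def __init__(self):
        self.mode = Mode.AUTONOMOUS
        self.virtual_lane = None

    def encounter_obstruction(self):
        # The vehicle stops safely and requests help from the command centre.
        self.mode = Mode.AWAITING_GUIDANCE

    def receive_virtual_lane(self, waypoints):
        # The mobility manager paints a path around the obstruction.
        self.virtual_lane = list(waypoints)
        self.mode = Mode.GUIDED

    def clear_obstruction(self):
        # Past the scene: discard the virtual lane and resume autonomy.
        self.virtual_lane = None
        self.mode = Mode.AUTONOMOUS

car = SamVehicle()
car.encounter_obstruction()
car.receive_virtual_lane([(0, 0), (5, 2), (10, 0)])
car.clear_obstruction()
print(car.mode)  # prints "Mode.AUTONOMOUS"
```

The key design point is that the vehicle never tries to improvise in the `AWAITING_GUIDANCE` state – it stays stopped until a human supplies a safe path.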
The great thing about SAM is that it’s able to learn from experience – and as autonomous technology improves, vehicles will require less assistance from the mobility managers.
This technology could bring autonomous vehicles to our roads decades sooner.
Project Bloodhound saved
The British project to break the world land-speed record at a site in the Northern Cape has been saved by a new backer, after the project entered administration in October.
Two weeks ago, and two months after entering voluntary administration, Bloodhound Programme Ltd announced it was shutting down. This week it announced that its assets, including the Bloodhound Supersonic Car (SSC), had been acquired by an enthusiastic – and wealthy – supporter.
“We are absolutely delighted that on Monday 17th December, the business and assets were bought, allowing the Project to continue,” the team said in a statement.
“The acquisition was made by Yorkshire-based entrepreneur Ian Warhurst. Ian is a mechanical engineer by training, with a strong background in managing a highly successful business in the automotive engineering sector, so he will bring a lot of expertise to the Project.”
Warhurst and his family, says the team, have been enthusiastic Bloodhound supporters for many years, and this inspired his new involvement with the Project.
“I am delighted to have been able to safeguard the business and assets preventing the project breakup,” he said. “I know how important it is to inspire young people about science, technology, engineering and maths, and I want to ensure Bloodhound can continue doing that into the future.
“It’s clear how much this unique British project means to people and I have been overwhelmed by the messages of thanks I have received in the last few days.”
The record attempt was due to be made late next year at Hakskeen Pan in the Kalahari Desert, where retired pilot Andy Green planned to beat the 1 228km/h land-speed record he set in the United States in 1997. The target is for Bloodhound to become the first car to reach 1 000mph (1 610km/h). A track 19km long and 500 metres wide has been prepared, with members of the local community hired to clear 16 000 tons of rock and stone to smooth the surface.
The team said in its announcement this week: “Although it has been a frustrating few months for Bloodhound, we are thrilled that Ian has saved Bloodhound SSC from closure for the country and the many supporters around the world who have been inspired by the Project. We now have a lot of planning to do for 2019 and beyond.”
Motor Racing meets Machine Learning
The futuristic car technology of tomorrow is being built today in both racing cars and toys, writes ARTHUR GOLDSTUCK
The car of tomorrow, most of us imagine, is being built by the great automobile manufacturers of the world. More and more, however, we are seeing information technology companies joining the race to power the autonomous vehicle future.
Last year, chip-maker Intel paid $15.3-billion to acquire Israeli company Mobileye, a leader in computer vision for autonomous driving technology. Google’s autonomous taxi division, Waymo, has been valued at $45-billion.
Now there’s a new name to add to the roster of technology giants driving the future.
Amazon Web Services, the world’s biggest cloud computing service and a subsidiary of Amazon.com, last month unveiled a scale model autonomous racing car for developers to build new artificial intelligence applications. Almost in the same breath, at its annual re:Invent conference in Las Vegas, it showcased the work being done with machine learning in Formula 1 racing.
AWS DeepRacer is a 1/18th scale fully autonomous race car, designed to incorporate the features and behaviour of a full-sized vehicle. It boasts all-wheel drive, monster truck tyres, an HD video camera, and on-board computing power. In short, everything a kid would want in a self-driving toy car.
But then, it also adds everything a developer would need to make the car autonomous in ways that, for now, can only be imagined. It uses machine learning (ML), the technology that allows computer systems to improve their functions progressively as they receive feedback from their activities. ML is at the heart of artificial intelligence (AI), and will be core to autonomous, self-driving vehicles.
AWS has taken ML a step further with an approach called reinforcement learning, in which a model learns by trial and error, guided by a reward signal rather than labelled examples. This allows for quicker development of ML models and applications, and DeepRacer is designed to let developers experiment with and hone their skills in this area. It is built on top of another AWS platform, Amazon SageMaker, which enables developers and data scientists to build, train, and deploy machine-learning models quickly and easily.
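In DeepRacer, the developer’s main job is writing that reward signal: a small Python function that scores each moment of driving, which the reinforcement-learning algorithm then tries to maximise. The sketch below is in the spirit of AWS’s published reward-function interface (a `reward_function(params)` taking a dictionary of track measurements), but treat the parameter names as illustrative assumptions rather than the official example:

```python
def reward_function(params):
    """Illustrative DeepRacer-style reward: favour staying near the
    centre line of the track. Parameter names are assumptions based
    on AWS's documented interface, not the official sample code."""
    track_width = params["track_width"]
    distance_from_center = params["distance_from_center"]

    half_width = track_width / 2.0
    if distance_from_center >= half_width:
        # Car is at or beyond the track edge: almost no reward.
        return 1e-3

    # Reward shrinks linearly as the car drifts from the centre line.
    return float(1.0 - distance_from_center / half_width)

# A car 0.1m off-centre on a 1m-wide track earns most of the reward.
print(reward_function({"track_width": 1.0, "distance_from_center": 0.1}))
```

Training then consists of the model repeatedly driving (in simulation or on the physical car), with behaviours that earn high reward reinforced over time.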
Along with DeepRacer, AWS also announced the DeepRacer League, the world’s first global autonomous racing league, open to anyone who orders the scale model from AWS.
As if to prove that DeepRacer is not just a quirky entry into the world of motor racing, AWS also showcased the work it is doing with the Formula One Group. Ross Brawn, Formula 1’s managing director of Motor Sports, joined AWS CEO Andy Jassy during the keynote address at the re:Invent conference, to demonstrate how motor racing meets machine learning.
“More than a million data points a second are transmitted between car and team during a Formula 1 race,” he said. “From this data, we can make predictions about what we expect to happen in a wheel-to-wheel situation, overtaking advantage, and pit stop advantage. ML can help us apply a proper analysis of a situation, and also bring it to fans.
“Formula 1 is a complete team contest. If you look at a video of tyre-changing in a pit stop – it takes 1.6 seconds to change four wheels and tyres – blink and you will miss it. Imagine the training that goes into it? It’s also a contest of innovative minds.”
Formula 1 racing has more than 500 million global fans and generated $1.8 billion in revenue in 2017. As a result, there are massive demands on performance, analysis and information.
During a race, up to 120 sensors on each car generate up to 3GB of data and 1 500 data points – every second. It is impossible to analyse this data on the fly without an ML platform like Amazon SageMaker. It has a further advantage: the data scientists are able to incorporate 65 years of historical race data to compare performance, make predictions, and provide insights into the teams’ and drivers’ split-second decisions and strategies.
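Data at this volume is typically reduced before analysis. As a minimal, purely illustrative sketch (the sensor names and readings are invented, not real F1 telemetry), here is the kind of per-second aggregation a telemetry pipeline might perform before handing data to a model:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical one-second slice of telemetry: (sensor_id, reading) pairs.
telemetry = [
    ("tyre_temp_FL", 96.2), ("tyre_temp_FL", 96.8),
    ("brake_temp_R", 410.0), ("brake_temp_R", 405.5),
    ("engine_rpm", 11200), ("engine_rpm", 11350),
]

def summarise(points):
    """Collapse raw readings into one average per sensor - the sort of
    reduction applied before thousands of raw data points per second
    are fed into an analysis model."""
    grouped = defaultdict(list)
    for sensor, reading in points:
        grouped[sensor].append(reading)
    return {sensor: mean(readings) for sensor, readings in grouped.items()}

print(summarise(telemetry))
```

At real race scale the same idea applies, just with far more sensors, streaming input, and models consuming the summaries rather than a print statement.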
This means Formula 1 can pinpoint how a driver is performing and whether or not drivers have pushed themselves over the limit.
“By leveraging Amazon SageMaker and AWS’s machine-learning services, we are able to deliver these powerful insights and predictions to fans in real time,” said Pete Samara, director of innovation and digital technology at Formula 1.