Cars

Heads-up display for any car

Navdy, the world’s first augmented driving device, is now available at iStore in South Africa.

Navdy uses augmented reality (AR) technology to project information, sourced from the smartphone and the car itself, directly in the driver’s line of sight, for a new driving experience.

Navdy’s breakthrough user interface projects a transparent image on the road ahead. It incorporates hand gestures to accept calls with the wave of a hand. A specially engineered dial, which attaches to the steering wheel, allows access to features that let the driver control the phone hands-free. Maps, calls, messages, notifications, music, and car information are projected directly in front of the driver.

“The way we interact with our cars hasn’t changed much in decades, and phones were never designed to be used while driving,” says Navdy CEO, Doug Simpson. “Navdy creates and defines an entirely new augmented driving category, enabling drivers to interact with their phones and cars in a much more immersive, natural, and intuitive way. We’ve developed multiple technologies and an entirely new user interface specifically designed to enhance the driving experience. Navdy fundamentally changes how we interact with useful information while driving.”

Navdy combines popular head-up display (HUD) technology, usually offered as a pricey upgrade package in luxury cars, with a groundbreaking user interface and specially developed software, delivering a far richer driving experience at an accessible price. Since Navdy is portable, consumers can enjoy the Navdy experience in whatever car they currently drive.

Navdy provided the following information. Bear in mind that claims like “never miss a turn” cannot be guaranteed:

Navdy’s features include:

•       Look Forward Display: rich, full color, fully transparent display with unmatched clarity in any light that projects information into the distance so the road stays in focus. The most advanced display on the road.

•       Intuitive Interface: Navdy Hand Gestures are the most natural way to accept a call or message with the simple wave of your hand. The Navdy Dial is the most intuitive way to scroll, zoom and navigate menus fluidly. The Dial also serves as a convenient way to access Siri and Google Now. Never again will you have to hunt through rows of buttons or look down to poke at a touch screen while driving.

•       Never Miss a Turn: Navdy’s Projected Navigation system is powered by Google Maps — with maps and directions appearing right in front of you, you’ll never miss a turn again. Only Navdy offers full dynamic maps as a transparent image without obstructing your view of the road. It’s as easy as following the car in front of you and uniquely immersive. With its own high precision GPS chip and local storage of maps, drivers don’t have to worry about losing navigation if they are out of network coverage.

•       Stay Connected: Navdy lets you make and receive calls, listen to messages, control music, receive calendar reminders and stay connected to the apps on your phone. Navdy also connects to your car with Navdy Dash to show your speed, RPM and automatically recommend nearby gas stations when your fuel level is low.

•       Portable and Storable: works in any car with Navdy’s magnetic mounting system, making it effortless to take with you or store in the glove box.

•       Easy Setup: easy for anyone to set up and get rolling without tools in around 15 minutes.

•       Thoughtful Design: crafted from premium materials that blend in seamlessly with your car, adding a touch of sophistication.

Autonomous goes off-road

Jaguar Land Rover is developing autonomous cars capable of all-terrain, off-road driving in any weather condition. 

The CORTEX project will take self-driving cars off-road, ensuring they are fully capable in any weather condition: dirt, rain, ice, snow or fog. As part of the project, the team is engineering a “5D” technique that combines acoustic, video, radar, light detection and distance sensing (LiDAR) data in real time. Access to this combined data improves the car’s awareness of its environment, and machine learning enables the self-driving system to behave in an increasingly sophisticated way, allowing it to handle any weather condition on any terrain.
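Jaguar Land Rover has not disclosed how the five sensor streams are actually fused. Purely as an illustration (all names and the averaging step are hypothetical, standing in for CORTEX’s learned models), the simplest shape such real-time fusion could take is: buffer the latest reading from each sensor, and only produce a combined snapshot when every stream has reported within a small time window.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str       # e.g. "acoustic", "video", "radar", "lidar", "distance"
    timestamp: float  # seconds
    value: float      # simplified scalar measurement for this sketch

class FusionBuffer:
    """Keeps the latest reading per sensor and fuses them into one snapshot."""

    def __init__(self, sensors, max_skew=0.1):
        self.sensors = set(sensors)
        self.max_skew = max_skew  # streams must agree to within 100 ms
        self.latest = {}

    def push(self, reading):
        self.latest[reading.sensor] = reading

    def snapshot(self):
        # Only fuse once every sensor has reported at least once
        if set(self.latest) != self.sensors:
            return None
        times = [r.timestamp for r in self.latest.values()]
        if max(times) - min(times) > self.max_skew:
            return None  # streams too far out of sync to combine
        # Naive fusion: a plain average, where a real system would
        # feed the aligned readings into a learned model
        return sum(r.value for r in self.latest.values()) / len(self.sensors)
```

The time-skew check matters because the sensors run at different rates; fusing a fresh radar return with a stale camera frame would mislead the vehicle about its surroundings.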

“It’s important that we develop our self-driving vehicles with the same capability and performance customers expect from all Jaguars and Land Rovers,” said Chris Holmes, Connected and Autonomous Vehicle Research Manager at Jaguar Land Rover.

“Self-driving is an inevitability for the automotive industry and ensuring that our autonomous offering is the most enjoyable, capable and safe is what drives us to explore the boundaries of innovation. CORTEX gives us the opportunity to work with some fantastic partners whose expertise will help us realise this vision in the near future.”

Jaguar Land Rover is developing fully- and semi-automated vehicle technologies, offering customers a choice of the level of automation, while maintaining an enjoyable and safe driving experience. This project forms part of the company’s vision to make the self-driving car viable in the widest range of real-life, on- and off-road driving environments and weather.

CORTEX will develop the technology through algorithm development, sensor optimisation and physical testing on off-road tracks in the UK. The University of Birmingham, with its world-leading research in radar and sensing for autonomous platforms, and machine-learning specialists Myrtle AI have joined the project. CORTEX was announced as part of Innovate UK’s third round of Connected and Autonomous Vehicle Funding in March 2018.

Smart car window brings scenery to partially sighted

From rolling hills to mountain ranges, views make any road trip memorable, but blind passengers miss out on this part of the experience. A prototype smart car window aims to change this by enabling blind or partially sighted people to visualise passing scenery through touch.

Feel The View takes pictures that are turned into high-contrast monochrome images. These are then reproduced on the glass using special LEDs.

When passengers touch the image, the glass vibrates: each of 255 shades of grey corresponds to a different vibration intensity, allowing them to trace the scene with their fingers and rebuild in their minds the landscape in front of them.

Feel The View was conceived and developed by Ford of Italy and GTB Roma, in collaboration with Aedo, an Italian start-up that specialises in devices for the visually impaired.
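Ford has not published the exact conversion it uses. As a rough sketch under assumed details (standard BT.601 luma weights for the monochrome step, and one vibration level per grey shade; the function names are invented for illustration), the path from an RGB pixel to a touch intensity could look like:

```python
def to_grey(rgb):
    """Convert an RGB pixel (0-255 per channel) to an 8-bit grey level,
    using ITU-R BT.601 luma weights (an assumption, not Ford's spec)."""
    r, g, b = rgb
    return round(0.299 * r + 0.587 * g + 0.114 * b)

def intensity_at(image, x, y):
    """Vibration intensity (0-255) for the pixel under the passenger's
    finger, where `image` is a row-major grid of RGB tuples."""
    return to_grey(image[y][x])
```

Darker regions of the scene would then produce weaker vibrations and brighter regions stronger ones, so running a finger across the glass traces the contrast of the landscape.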

“We seek to make people’s lives better and this was a fantastic opportunity to help blind passengers experience a great aspect of driving. The technology is advanced, but the concept is simple – and could turn mundane journeys into truly memorable ones,” said Marco Alù Saffi, Ford of Italy.

For now, the technology is an early prototype, with no current plans for production.

Copyright © 2018 World Wide Worx