GadgetWheels
AutoSens 2025: The cockpit awakens
Your next car will know you better than your phone, suggests a report released at the recent AutoSens USA 2025 conference.
For all the talk of self-driving cars and electric vehicles, the real revolution is happening in the space we already occupy: the cabin. Not the sleek exterior, nor the drivetrain underneath, but the user experience cocooned within. The most cutting-edge technology in today’s vehicles isn’t what makes them move, but what makes them care.
This was the unspoken theme that ran through the halls and talks at AutoSens USA 2025, held in Detroit last month. While LiDAR wars and edge-processing debates played out in the main arena, the conversations that lingered longest were about what is happening inside the car. The in-cabin track, once a side session at AutoSens, has now emerged as a force in its own right, accelerated by the AutoSens partnership with Sense Media’s InCabin series and a growing community of engineers, designers and neuroscientists determined to reshape the way humans experience mobility.
The InCabin Report 2025, unveiled at the event, captures the culmination of that shift. Over the past three years, in-cabin sensing has transformed from a technical curiosity into a compliance necessity and, increasingly, a source of strategic differentiation.
Europe’s General Safety Regulation (GSR) has already mandated driver monitoring systems (DMS) for new vehicle types. From 2026, Euro NCAP will begin scoring in-cabin systems, starting with a 60-point requirement in the “Safe Driving” category for a 5-star rating and rising to 80 points by 2028. It is no longer about tinkering with dashboards. It’s about making safety personal and experiential.
“Driver Monitoring was the starting point,” says Manfred Wilck of Continental. “Today, Cabin Monitoring is making use of those same sensors to enhance occupant safety, comfort, personalised experiences and access functionalities”. That layered approach has turned vehicle interiors into complex ecosystems of sensors, algorithms and human-machine interfaces. What was once about preventing distraction is now about anticipating discomfort, intervening in moments of fatigue, and even monetising the ride itself.
Yes, monetisation. That was one of the more eyebrow-raising trends at AutoSens. Caroline Chung, senior engineering manager at Magna, pointed to a new wave of revenue models where the car becomes both a personal assistant and a retail platform. “Your car might suggest a coffee stop and get a kickback if you redeem the coupon,” she said. “Or project ads onto its windows while parked.” The once-static cabin is being reimagined as a living retail channel, fed by real-time behavioural data and edge AI.
But not all innovation comes wrapped in lights and logos. One of the most consequential trends at AutoSens USA was the convergence of neuroscience and engineering. A company called CorrActions demonstrated how changes in steering micro-movements can reveal cognitive impairment well before a driver becomes visibly drowsy. “We’re using vehicle sensor data as a proxy for brainwave activity,” said Gal Geffen-Frenkel. “It’s not just about fatigue – it’s about how the brain responds under stress, distraction or even the early onset of illness”.
Another startup, Neumo, showcased headrest-integrated EEG modules capable of reading low-power neural signals non-invasively. Their CEO, Niall Berkery, described a future where real-time brain scanning helps optimise everything from music selection to automated handover in Level 3 autonomy. “It’s the missing link between the intent of the human and the reaction of the machine,” he said.
While this sounds futuristic, the present-day reality is that most consumers are still fumbling through touchscreen interfaces just to adjust climate settings. Rob Stead, managing director of Sense Media, voiced the growing backlash against screen-only controls in his section of the report.
“I think it’s bad,” he writes. “Buttons are better.” He’s not alone. Volkswagen’s decision to reintroduce physical controls in its new Golf models – reversing years of minimalist interface design – is a sign that the industry might finally be listening to users.
Indeed, that was one of the most refreshing takeaways from AutoSens: the realisation that understanding drivers and passengers as human beings, not data points, is now central to the design process.
Melissa Agosta, a psychology and neuroscience graduate embedded in the Sense Media team, drew on multiple cognitive theories to explain why ADAS features often misfire, not because they’re technically flawed, but because they ignore how our brains handle information. “Selective attention and cognitive overload aren’t buzzwords,” she said. “They’re the difference between trust and frustration”.
That human insight becomes even more essential as the cabin becomes a canvas for wellness. Mood lighting, zonal audio, thermal cocooning and biometric feedback are being fused into a new kind of therapy on wheels.
Hugo Piccin from Forvia described it as “a multifunctional, occupant-centric environment dedicated to health and comfort.” As mental workload metrics become standard telemetry, we may soon see cabins that adapt not just to our seating preferences but to our emotional state.
What began as a sideline to keep drowsy drivers awake has morphed into a sophisticated interplay of sensors, psychology, and software. The cabin is no longer a passive enclosure. It’s a co-pilot, a confidant, and a commercial platform. At AutoSens USA 2025, the message was clear: the future of cars will be shaped not by what moves them forward, but by what happens inside.
