Preparing for the Mobility Revolution

Driver & Passenger Understanding: The Next Frontier
July 2018

So far, few solutions have been created to assess real-time traffic risks around vehicles, and even fewer to understand the behavior of the humans inside them. Safety, comfort and onboard e-commerce solutions can be enhanced by leveraging sensors, AI, onboard compute platforms and connectivity. Improvements can come from the analysis of driving habits, from comparing driving behavior with traffic risks, and from the generation of in-cabin intelligence. Whereas basic data is generated by native sensors and accessed via the OBD port, additional input can be extracted from add-on cameras or biometric sensors. Let’s look at the different value propositions and some of the enabling solutions.

Evaluate drivers by assessing their driving habits

The most basic level in this array of features identifies hard acceleration, braking and steering, as well as frequent lane changes. Depending on the solution, feedback is provided to drivers to help them drive more safely or improve fuel economy. Fleets are the prime targets for these solutions, as they provide fleet managers with insight into the driving habits of their employees. Incentives can then be put in place with the objective of curtailing accidents and reducing insurance costs. The latter benefit also appeals to private owners who agree to share their driving profile with insurance providers. These in turn offer usage-based insurance (UBI) premiums, i.e., an incentive to drive more safely.

Solutions are mostly based on connected dongles plugged into the OBD port, in which case the underlying data is generated by the vehicle’s native sensors. Some applications may instead use the sensors inside smartphones, e.g., the accelerometer. Providers of such solutions include Zendrive, Octo (insurance focus), KeepTruckin (truck-specific), Caruma and Voyomotive.
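To make the idea concrete, here is a minimal Python sketch of how such a solution might flag hard-braking events from longitudinal accelerometer samples. The -0.4 g threshold and the event logic are assumptions chosen for illustration, not any vendor’s actual algorithm.

```python
# Illustrative sketch: flag hard-braking events from longitudinal
# accelerometer samples (m/s^2). Threshold is an assumed value.
G = 9.81  # standard gravity, m/s^2

def hard_braking_events(samples, threshold_g=-0.4):
    """Return indices where acceleration first drops below the braking
    threshold, counting each sustained braking episode only once."""
    threshold = threshold_g * G
    events = []
    in_event = False
    for i, accel in enumerate(samples):
        if accel < threshold and not in_event:
            events.append(i)   # start of a new hard-braking episode
            in_event = True
        elif accel >= threshold:
            in_event = False   # episode over, re-arm the detector
    return events

# Example trace with one hard stop around samples 3-4:
trace = [0.0, -1.2, -2.5, -5.0, -4.8, -1.0, 0.0]
print(hard_braking_events(trace))  # → [3]
```

A fleet solution would aggregate such events per driver over time, which is the raw material for the incentive schemes described above.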

Analyzing traffic risks vs. driver behavior can curtail accidents

The next level in terms of value proposition builds on the previous one and aims at increasing real-time safety. These solutions monitor the driving scene ahead of the vehicle, e.g., the leading vehicle braking hard, lane departures, etc. The more sophisticated systems also monitor the driver and assess their ability to react by analyzing whether they are on the phone, texting, looking away from the road or focused on driving. A safety score can be derived from this analysis. These aftermarket systems will then warn drivers but stop short of acting as a native emergency braking system. In addition, events can be recorded to establish responsibilities in the event of an accident, serve driver training purposes, or be used as raw material for the development of autonomous vehicles.
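As a sketch of what deriving such a safety score could look like, the example below fuses a traffic-risk estimate with a driver-attention estimate into a single value. The [0, 1] scales and the weighting are illustrative assumptions, not taken from any of the systems mentioned here.

```python
# Hypothetical safety-score fusion: combines a traffic-risk estimate
# with a driver-attention estimate. Weights and scales are assumed.
def safety_score(traffic_risk, driver_attention, risk_weight=0.6):
    """Both inputs in [0, 1]; higher traffic_risk is worse, higher
    driver_attention is better. Returns a score in [0, 1], higher = safer."""
    if not (0.0 <= traffic_risk <= 1.0 and 0.0 <= driver_attention <= 1.0):
        raise ValueError("inputs must be in [0, 1]")
    return (1.0 - traffic_risk) * risk_weight \
        + driver_attention * (1.0 - risk_weight)

# A distracted driver in a risky traffic scene scores low:
print(round(safety_score(traffic_risk=0.8, driver_attention=0.2), 2))  # → 0.2
```

In practice the two inputs would themselves be outputs of vision models (scene analysis and driver monitoring), with the score driving the warnings described above.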

Solutions that simply observe traffic risks rely on a forward-facing camera, which in some cases is the driver’s smartphone camera. The full set of features, including driver behavior analysis, also requires a rear-facing camera pointed at the driver. Companies that provide such solutions include Nauto, CarVi, i4drive and SmartDrive (focused on commercial vehicles).

Highly automated vehicles require specific solutions for safety

In Level 2 (SAE), partial or “hands off” automation, the system steers, brakes and accelerates in certain conditions, but the driver must remain focused on the road. In Level 3, conditional or “eyes off” automation, the driver no longer needs to continuously watch the road but must be able to take over the driving function when requested. This creates the need to monitor the driver and assess his/her state. Different solutions will monitor eye gaze, eyelid movement, whether hands are on the wheel, etc., and will infer a state of awareness, inattention or drowsiness. Currently, drivers may be requested to signal their awareness every so often, as has been the case for train engineers for many years. For instance, Tesla warns drivers whose hands have been off the wheel for too long; these settings were strengthened last month, after a deadly accident in March, which indicates how critical this feature is. Audi and Cadillac use cameras positioned behind the wheel on the A8 (with Traffic Jam Pilot, available in Europe only for now) and the CT6 (with Super Cruise) respectively. On these two cars, the native cameras monitor the driver’s head pose and eye gaze direction to assess whether he/she is looking at the road.
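One widely used way to infer drowsiness from eyelid movement is PERCLOS: the fraction of time the eyes are closed over a rolling window. The sketch below implements that idea; the window size, openness threshold and alert level are illustrative assumptions, not the calibrations of any production system.

```python
# Minimal PERCLOS-style drowsiness monitor. All thresholds are assumed
# values for illustration, not a production calibration.
from collections import deque

class DrowsinessMonitor:
    def __init__(self, window=900, closed_below=0.2, alert_above=0.15):
        self.window = deque(maxlen=window)  # e.g. 30 s of frames at 30 fps
        self.closed_below = closed_below    # eyelid openness in [0, 1]
        self.alert_above = alert_above      # PERCLOS level that triggers an alert

    def update(self, eyelid_openness):
        """Feed one per-frame eyelid-openness estimate from the driver-facing
        camera; return (perclos, alert)."""
        self.window.append(1 if eyelid_openness < self.closed_below else 0)
        perclos = sum(self.window) / len(self.window)
        return perclos, perclos > self.alert_above

monitor = DrowsinessMonitor()
for openness in [0.9, 0.8, 0.1, 0.05, 0.1, 0.9]:
    perclos, alert = monitor.update(openness)
print(round(perclos, 2), alert)  # → 0.5 True
```

A real system would fuse this signal with head pose and gaze direction before deciding whether the driver can take over.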

The solutions currently in production are provided by Seeing Machines (Cadillac) and Smart Eye (Audi). Other providers of vision-AI solutions include Eyeris (more below) and Innov+.

Biometric measurements can help improve comfort

Today, comfort inside the cabin is optimized by personalizing settings for the seating/driving position, temperature and airflow, lighting and the infotainment system. What if your vehicle could make these adjustments based on your actual body temperature, heart rate, respiration rate, sweating, fatigue or cognitive load? By fusing these various data sets, one can determine how best to keep passengers comfortable. Life Detection Technologies and BrainCo could provide such input to enable automatic cabin-setting optimization.
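A hypothetical sketch of such data fusion: simple rules mapping a few biometric readings to cabin-setting adjustments. The sensor inputs, thresholds and settings model are all assumptions made for the sake of the example.

```python
# Purely illustrative rule-based fusion of biometric inputs into
# cabin-setting adjustments; thresholds and settings are assumed.
def adjust_cabin(body_temp_c, heart_rate_bpm, fatigue_score):
    """Map fused biometric readings to cabin-setting changes."""
    settings = {"hvac_delta_c": 0.0, "lighting": "normal", "seat_massage": False}
    if body_temp_c > 37.2:        # occupant running warm -> cool the cabin
        settings["hvac_delta_c"] = -2.0
    elif body_temp_c < 36.0:      # occupant running cold -> warm it up
        settings["hvac_delta_c"] = 1.5
    if fatigue_score > 0.7:       # drowsy occupant -> brighter and cooler
        settings["lighting"] = "bright"
        settings["hvac_delta_c"] -= 1.0
    if heart_rate_bpm > 100:      # elevated heart rate -> calming massage
        settings["seat_massage"] = True
    return settings

print(adjust_cabin(body_temp_c=37.5, heart_rate_bpm=105, fatigue_score=0.8))
```

A production system would of course learn these mappings per occupant rather than hard-code them, but the fusion principle is the same.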

In-cabin intelligence for safety, comfort and e-commerce

This constitutes the last untapped space. Who is inside the vehicle? What are the drivers’ and passengers’ mood and behavior? What are they doing? These questions become more relevant as mobility evolves towards autonomy and sharing. When drivers are conditionally relieved from driving duty, monitoring eye movement may not be enough to assess their ability to take over. Body pose (head and upper body positions) gives valuable insight to this end and can even be fused with a traffic risk assessment to maximize safety as defined above. 

In parallel, shared mobility makes it necessary to monitor what is going on inside the cabin. The identification of passengers, their age group, gender, race, mood (sad, joyful, fearful…) and activity (on the phone, texting, reading, sleeping…) gives valuable information to mobility service providers. Moreover, ensuring all passengers are properly seated inside autonomous shuttles (or robo-taxis), or identifying objects left behind, will also be critical. It is also important to know that passengers are not moving around the cabin while the vehicle is in motion. Lastly, Human Behavior Understanding (HBU) will enable targeted promotion.

At this point, Eyeris (of which I am a Board Advisor) is the only company able to offer a comprehensive HBU solution to OEMs and mobility operators, with a full suite of interior vision analytics. This includes emotion recognition from occupants’ facial micro-expressions, real-time 2D upper-body tracking, action recognition and activity prediction. The solution targets automotive OEMs and Tier 1 suppliers. It is based on a multi-modal AI engine that uses deep learning at the edge, probabilistic modeling techniques and efficient inference, and it works with any standard 2D camera placed inside the cabin.

Conclusion

The understanding of both the humans inside the cabin and the traffic risks around the vehicle, made possible mostly by AI, will improve safety and comfort. Most of the data processing is done onboard the vehicle, which helps ensure privacy where needed. However, raw or processed data can be shared for monitoring and value-adding purposes. Whereas most of the solutions presented here target the aftermarket, the underlying tech will progressively be designed into new vehicles to make our mobility experience safer and more pleasant.

Marc Amblard

Managing Director, Orsay Consulting

Feel free to like this article on LinkedIn. Thanks!

© 2020 by Orsay Consulting