Head-up displays (HUDs) were originally developed for aviation. Now they are seen as a convenient way to view navigation and safety alerts on the road. HUDs project images that appear either on the windscreen or in the driver's field of view. And soon they may start augmenting reality.
Auto Futures’ U.S. reporter, Lynn Walford, has been talking to a number of leading HUD experts who offered up future use-case scenarios.
Susan Drescher is the Advanced Product Manager and an HMI (human-machine interface) expert at Continental, one of the major suppliers of HUDs to the automotive industry. She says that the main purpose of a HUD is to reduce off-road glance time.
“When you don’t have to redirect your gaze off the windscreen it’s less likely to cause accidents,” says Drescher.
Currently, Continental HUDs are available as an option in Lincoln vehicles. The HUDs can show speed, fuel level, phone connections or navigation. The images are very bright and can be seen even in direct sunlight or through polarized sunglass lenses. Drescher says research shows that drivers like navigation and seeing speed limits; showing the limit alongside the current speed helps drivers stay under it. Most HUDs in current vehicles use mirrors to reflect images, and a wedge in the windscreen glass enables the image to be seen clearly.
The next phase of head-up displays will overlay lines on the road to warn of lane departures, or brackets to show that cruise control is operating. Eventually HUDs could show connected-car data such as traffic or road hazards. Drescher predicts that, when cars become autonomous, HUDs could show that the vehicle is aware of the lines on the road and performing its functions, reassuring the driver that everything is working.
“When a driver knows that the car’s system is taking care of business, they will be more likely to take their hands off the wheel for autonomous driving,” says Drescher.
Continental recently invested in DigiLens, which offers compact, holographic, wide field-of-view waveguide technology for automotive HUDs.
“Research shows that volumetric displays with a true 3D environment are better for the brain,” says Juliana Clegg, CEO of Falcon-AR, which builds augmented reality displays for automated vehicles. “When the brain has depth perception it can recognize images faster,” adds Clegg. She says that NASA originally developed HUD systems to keep pilots oriented when they were on the ground.
The team at Falcon-AR has been working on HUDs for the last ten years. Falcon-AR offers a HUD system that creates 3D images and does not require any changes to the windscreen glass. Falcon-AR’s technology can project an image so that it is placed accurately over the real-world object.
Clegg gives the following example. Say someone is walking a dog across the street, and the driver does not notice the person and the dog against the landscape. A symbol appears accurately over them to warn the driver. The warning symbol is a collimated image positioned in three dimensions (X, Y, Z) that makes it appear as if it is hugging the dog walker.
The Swiss startup WayRay is expanding into the automotive sector with funding from car makers such as Porsche and Hyundai. It offers a wide-angle HUD.
“It’s an optical illusion on a thin film within the structure of the glass inside the windscreen,” says Vitaly Ponomarev, WayRay’s CEO. “There is a laser projector which gives the illusion of an image at any distance from zero to infinity, but usually about 15 metres.”
WayRay developed an AR rendering engine that projects images at 60 frames per second. WayRay also offers a software development kit (True AR SDK) for developers to make apps.
“It’s super exciting to see images that keep pace with reality without vibration, and you can see the object as part of reality, such as in navigation,” says Ponomarev about test drives in WayRay HUD-equipped vehicles. He sees the major use of WayRay HUDs as navigation, but they can also be used to display the human-machine interface (HMI).
WayRay is experimenting with a new form of HUD in which the projector can be embedded into glass. It was shown off at CES 2019 in Las Vegas.
“With WayRay on the side windows, there could also be haptics and gesture control used for such things as points of interest, games, social media and dating apps,” envisions Ponomarev. “When you get to robotaxis, mobility services and mass transit there could be other things displayed such as advertising.”
Derek Vita, Senior Analyst for the In-Vehicle UX service at Strategy Analytics, says: “At first, consumers like the look of either HMI displays or HUDs when they see them. However, after three to five bits of information are shown on the screen, it becomes cluttered, the value is lessened and usability starts to become questionable.”
It’s a warning to automakers that they may well have to be careful about how much information is appropriate for HUDs.