Head’s up—driver displays are primed for some serious upgrades, thanks to advancements in augmented reality.
At least two new AR-based heads-up display systems were showcased at CES 2021, highlighting the technology’s potential—both now and for the driverless car future.
FIC (First International Computer) introduced its Intelligent AR HUD for Commercial Vehicle. The system, named an Honoree in the Vehicle Intelligence & Transportation category of the CES 2021 Innovation Awards, successfully integrates full-color laser beam scanning (FCLBS) technology for improved clarity.
“FCLBS technology is mainly designed to prevent direct sunlight from affecting the projected images, and through the automatic sensing system to detect environmental light sources and automatically adjust the projection brightness during driving,” says Alex Dee, vice president of the automotive electronic R&D department of FIC group. “It gives drivers the most comfortable, clearest and safest imaging effect in any weather.”
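The ambient-light compensation Dee describes can be illustrated with a minimal sketch: a lux reading from a light sensor is mapped to a projector luminance level, so the image brightens in direct sun and dims at night. All function names, thresholds and luminance values below are illustrative assumptions, not FIC's actual implementation.

```python
import math

def adjust_brightness(ambient_lux: float,
                      min_nits: float = 500.0,
                      max_nits: float = 12000.0) -> float:
    """Map an ambient light reading (lux) to HUD luminance (nits).

    Hypothetical sketch: real systems would also filter sensor noise
    and rate-limit changes so brightness does not flicker.
    """
    # Rough reference points: night ~1 lux, direct sunlight ~100,000 lux.
    NIGHT_LUX, SUN_LUX = 1.0, 100_000.0
    # Clamp the reading into the expected range.
    lux = max(NIGHT_LUX, min(ambient_lux, SUN_LUX))
    # Interpolate on a log scale, since perceived brightness is
    # roughly logarithmic in luminance.
    t = math.log10(lux / NIGHT_LUX) / math.log10(SUN_LUX / NIGHT_LUX)
    return min_nits + t * (max_nits - min_nits)
```

At the night floor this returns the minimum luminance, and in full sun the maximum, with a smooth ramp in between.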
In recent years, he explains, augmented reality technology has evolved from 2D images to 3D projection integrated into the real-life environment.
“When AR technology is applied to daily driving, vehicle data will no longer be limited to just the digital instrument cluster or the advanced driver-assistance system (ADAS),” he says. “It will integrate with the traffic environment through 3D virtual projection and directly display information on the windshield in front of the driver’s eyes, making driving safer and more convenient.”
As the core technology, FCLBS can clearly project AR images 16 to 164 feet ahead on the road, even in strong sunlight, heavy rain or fog, according to the company. In addition, eye-tracking technology helps display vehicle information (speed, navigation, traffic warnings, forward collision detection, lane departure warning, pedestrian detection, traffic sign recognition and other important warnings) on the windshield, keeping the driver’s eyes on the road rather than on the gauge cluster.
Furthermore, when this information is displayed through AR technology, the virtual images are integrated in 3D with the real traffic environment, providing the driver with a clearer, safer and more convenient driving experience, according to the company.
Participating at CES for the first time, Raythink held online discussions with a number of manufacturers and technical experts to explore the technological innovations and applications for its own AR HUD offering.
The company expects its wide-angle spatial imaging AR HUD system to be a core application utilized in intelligent vehicle cockpits in the future, and explained how it integrates with intelligent driving devices.
The company’s patented OpticalCore technology provides a wide-angle field of view (FOV), allowing it “to move away from a dependence on HUD displays as shown on the PVB wedge-shaped film interlayer of windshields, and present a clear AR visual effect without ghosting on any windshields,” according to the company. It notes that its AR Generator SDK, AR recognition algorithm and platform software system architecture can present a safe and comfortable AR driving experience on any HUD by integrating environmental recognition with multiple data sources and its independent AR rendering engine, in combination with AI-enabled visual-spatial coordinate correction.
The product combines ADAS multi-sensor environment perception, map and navigation data, and vision-assisted technology for precise lane-level positioning, displaying a number of intelligent driving functions on the windshield, including lane-level navigation, forward collision warning (FCW), pedestrian collision warning (PCW), lane departure warning (LDW) and more.
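Because several of those functions can fire at once, a HUD of this kind has to decide which overlays to render first. The sketch below shows one plausible approach, ranking alerts by urgency so a forward collision warning preempts a lane departure warning. The type names, priority values and function signature are illustrative assumptions, not Raythink's implementation.

```python
from dataclasses import dataclass

# Lower number = higher urgency; values are illustrative, not from Raythink.
PRIORITY = {"FCW": 0, "PCW": 1, "LDW": 2, "NAV": 3}

@dataclass
class Alert:
    kind: str       # "FCW", "PCW", "LDW" or "NAV"
    message: str

def select_overlays(alerts: list[Alert], max_items: int = 2) -> list[Alert]:
    """Return the most urgent alerts to render on the windshield.

    Limiting the count keeps the overlay sparse so it does not
    clutter the driver's view.
    """
    ranked = sorted(alerts, key=lambda a: PRIORITY.get(a.kind, 99))
    return ranked[:max_items]

alerts = [Alert("NAV", "Turn left in 200 m"),
          Alert("LDW", "Drifting right"),
          Alert("FCW", "Braking vehicle ahead")]
print([a.kind for a in select_overlays(alerts)])  # ['FCW', 'LDW']
```

The navigation prompt is dropped here because the two safety warnings outrank it; once they clear, routine guidance would return to the display.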
With the accelerated development of intelligent, network-connected vehicles and the deployment of mass-market Level 3 automated driving by various automobile manufacturers, Raythink predicts the intelligent cockpit of the future will be built around the AR HUD as its main human/machine interface.