Today we interview Valeo’s Alexander Muehlbauer and Rémi Mathieu about the Valeo innovation that has just been launched in Korea.

Valeo’s public post says their XR gaming demo was shown in Austin from 12 to 14 March, and is making its commercial debut in Korea on Renault Korea’s Filante, together with holoride. Valeo positions it as an experience built on existing vehicle sensors, vehicle motion, and software integration.
The concept does not depend on new hardware; it piggybacks on what the vehicle already carries: GNSS data for localization and mapping, vehicle-bus data for dynamic state and perception, front and 360° camera video streams for the gaming background, and the infotainment domain sending video plus processed 3D data to the application. On the user side, the experience can run on passenger displays or in a bring-your-own-device configuration, with a smartphone acting as a controller over local Wi-Fi or Bluetooth. In other words, “it is a real-time software layer connected to the perception–infotainment–display chain”.
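The chain above can be sketched, very loosely, as a software layer that bundles the existing sensor feeds for the game application. Every type, field, and function name below is a hypothetical illustration, not Valeo’s actual interface:

```python
from dataclasses import dataclass

# Hypothetical sketch of the camp-on data chain described in the article.
# All names and fields are illustrative assumptions.

@dataclass
class GnssFix:
    lat: float
    lon: float

@dataclass
class VehicleState:
    # Dynamic state as it might be read from the vehicle bus
    speed_mps: float
    yaw_rate_dps: float

@dataclass
class GameFrame:
    background: bytes     # camera video stream used as the gaming backdrop
    position: GnssFix     # localization input for mapping
    motion: VehicleState  # feeds motion-synchronized gameplay

def compose_frame(camera_stream: bytes, fix: GnssFix, state: VehicleState) -> GameFrame:
    """Infotainment-domain step: bundle video plus processed data for the app."""
    return GameFrame(background=camera_stream, position=fix, motion=state)

# Example with made-up values
frame = compose_frame(b"<jpeg>", GnssFix(48.79, 2.46), VehicleState(13.9, 2.5))
```

The point of the sketch is only the shape of the chain: no new sensor appears anywhere, the layer just aggregates what the platform already produces.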


Valeo places XR gaming within a broader SDV stack comprising services, middleware, computing resources, automated-driving sensors and interior experience devices. XR gaming appears alongside functions such as driver monitoring, data fusion, computer vision and OS/middleware. The game is not treated as an isolated rear-seat gimmick, but framed as one application made possible by a vehicle platform already organized around centralized computing, ADAS sensors, and perception algorithms. That may be the strongest message for the industry: ADAS value no longer stops at safety. It can also feed passenger experience and create new digital-service territory inside the cabin.
The demonstrated game, R:Racing, is equally revealing: it offers a virtual mode and an AR mode, both identified as implemented in the first series as of March 2026. In AR mode, the real road becomes the live backdrop of the game; in virtual mode, gameplay still remains synchronized with vehicle motion. Valeo explains that the system blends real-time vehicle data with live environmental perception to create a motion-synchronized experience, while holoride notes that synchronization can reduce signs of motion sickness, though not eliminate them entirely.
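Valeo’s actual synchronization pipeline is not public. As a rough illustration of the principle holoride describes (visuals tracking felt motion), here is a minimal kinematic sketch in which the virtual car is advanced from bus-reported speed and yaw rate; the function and update rule are assumptions, not the shipped algorithm:

```python
import math

def advance_virtual_pose(x, y, heading_rad, speed_mps, yaw_rate_rps, dt):
    """Advance a virtual car's pose so it mirrors the real vehicle's motion.

    A simple forward kinematic update: heading follows the measured yaw
    rate, position follows the measured speed along the new heading.
    """
    heading = heading_rad + yaw_rate_rps * dt
    x += speed_mps * math.cos(heading) * dt
    y += speed_mps * math.sin(heading) * dt
    return x, y, heading

# Straight-line check: 10 m/s for 1 s, heading along x, no yaw.
pose = advance_virtual_pose(0.0, 0.0, 0.0, 10.0, 0.0, 1.0)  # → (10.0, 0.0, 0.0)
```

When the visual motion of the game matches the accelerations passengers actually feel, the sensory mismatch that triggers motion sickness is reduced, which is the mechanism behind holoride’s claim.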

The demo images are also instructive from an HMI standpoint. They clearly show rear-seat screens or tablets mounted behind the front seats, with smartphones or game controllers used as input devices. The takeaways are clear: new end-consumer applications, tablet and vehicle-display usage, a kind of third-screen logic, and current driver restrictions until higher levels of autonomy are reached. In the near term, the natural playground is the passenger zone, possibly including stationary or charging scenarios for other applications. In the longer term, the challenge will be to broaden usage contexts without ever crossing the safety line on driver distraction.
From an industrial standpoint, the most serious question is therefore not whether the visuals feel closer to arcade gaming than to high-end simulation. That is beside the point. The real challenge is the technical chain: end-to-end latency, perception robustness, AR rendering quality under changing light and weather, compute availability, prioritization versus safety-critical functions, cybersecurity, and UX consistency across embedded displays and personal devices. Building a game in a car is relatively easy. Building context-aware, motion-synchronized gaming on top of production-grade perception assets without disturbing the safety architecture is a very different engineering proposition.
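One way to reason about the latency part of that chain is a stage-by-stage budget. The stage names and millisecond values below are invented for illustration; only the accounting approach matters:

```python
# Back-of-envelope end-to-end latency budget for an AR gaming chain.
# All stage names and values are made-up assumptions, not measured figures.
STAGES_MS = {
    "camera_capture": 20,        # sensor exposure + readout
    "perception": 30,            # detection / environment processing
    "infotainment_transfer": 10, # domain-to-domain data handoff
    "game_render": 16,           # one frame at ~60 fps
    "display_scanout": 8,        # panel refresh
}

BUDGET_MS = 100  # assumed target so visuals stay synchronized with felt motion

total = sum(STAGES_MS.values())
within_budget = total <= BUDGET_MS
print(f"end-to-end: {total} ms, within budget: {within_budget}")
```

If any stage blows its slice, or if a safety-critical function preempts the shared compute, the motion-visual synchronization that the whole experience depends on degrades first.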
The experience launches first in Korea, on Renault Korea’s Filante, ahead of other non-European markets. For cockpit players, the question now shifts from whether people can play in the car to what other high-value applications can be built tomorrow on the same sensor–computing–software stack.
Won’t it be thrilling to try it on the roads near Créteil, France!