Infrared and iris sensors help improve road safety

Feb 12 | 2021

The SEAT car company is pioneering a new technique to track a driver’s gaze using infrared cameras and algorithms. Knowing where drivers are looking helps to achieve a more intuitive and safer interaction with devices such as the radio, sat nav and vehicle controls.

SEAT is using infrared light sensors, high-resolution images and a sophisticated algorithm to find out exactly where people are looking. As we drive, the road must obviously be the main focus. That’s why being able to locate everything we need on the dashboard at a glance is key to safety. “We must guarantee the minimum interaction time, [so] the information must be where users intuitively and naturally look for it,” said Rubén Martínez, Head of SEAT’s Smart Quality department.

What is it? 

Eye-Tracking is a technology that enables a computer to know where a person is looking. It does so through glasses with infrared sensors in the lenses and a camera in the centre of the frame. “The sensors detect the exact position of the iris at every moment, while everything the user sees is recorded,” explained Rubén. A complex 3D eye model algorithm interprets all this data and obtains the exact viewing point. 
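SEAT has not published its algorithm, but as a purely illustrative sketch, eye-tracking data of this kind is commonly handled as timestamped gaze samples that are mapped onto areas of interest (AOIs) such as the road or the infotainment screen. The names, coordinates and helper below (GazeSample, AOIS, classify) are hypothetical, written here in Python:

    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        """One timestamped gaze estimate in normalised scene coordinates."""
        t: float   # seconds since the recording started
        x: float   # horizontal gaze position, 0.0 (left) to 1.0 (right)
        y: float   # vertical gaze position, 0.0 (top) to 1.0 (bottom)

    # Hypothetical areas of interest, as (x_min, y_min, x_max, y_max) boxes
    # in the same normalised coordinates as the gaze samples.
    AOIS = {
        "road":   (0.0, 0.0, 1.0, 0.4),
        "screen": (0.3, 0.6, 0.7, 1.0),
    }

    def classify(sample: GazeSample):
        """Return the name of the AOI the sample falls in, or None."""
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= sample.x <= x1 and y0 <= sample.y <= y1:
                return name
        return None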

What does it do? 

This technology makes it possible to carry out very precise studies of human interaction with all kinds of devices. “We can know where users expect to find information such as battery level or range of kilometres,” said Rubén. 

How is it used? 

The team is now working on a pilot test to introduce the Eye-Tracker glasses into the testing of new models. “We’ll ask [test drivers], for example, to turn up the temperature or change the radio station, and we’ll analyse which part of the screen they’ve directed their gaze at first, how long it takes them to do so and how many times they look at the road while interacting with the device,” said Rubén. Previously these tests were done by asking people questions, which was imprecise; the new system provides much more accurate data.
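As a rough illustration of how such measurements might be derived (again, not SEAT’s implementation), the time until the gaze first reaches the screen and the number of separate glances back at the road could be computed from the hypothetical gaze samples and classify() helper sketched above:

    def interaction_metrics(samples, target="screen"):
        """Seconds until the gaze first reaches the target AOI, plus the
        number of separate glances at the road during the task."""
        first_hit = None
        road_glances = 0
        on_road = False
        for s in samples:
            aoi = classify(s)
            if first_hit is None and aoi == target:
                first_hit = s.t - samples[0].t
            if aoi == "road" and not on_road:
                road_glances += 1
            on_road = (aoi == "road")
        return first_hit, road_glances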

How does SEAT interpret the data? 

At the Smart Quality department’s facilities, the algorithm turns each driver’s gaze recordings into behavioural patterns, expressed through different indicators. One of them is the heat zone indicator, which shows the intensity of each focus of attention. “The red spot, which indicates the greatest number of impacts, should always be on the road,” said Rubén. “It is the guarantee that users continue to pay attention to the road, even when interacting with the screen.” 
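A gaze heatmap of this sort can be pictured simply as a count of gaze samples per cell of a grid laid over the scene, with the busiest cell becoming the red spot. A minimal sketch, assuming the normalised gaze samples defined above:

    import numpy as np

    def gaze_heatmap(samples, bins=20):
        """Count gaze samples per cell of a bins x bins grid over the scene.
        The densest cell (the 'red spot') marks the main focus of attention."""
        grid = np.zeros((bins, bins), dtype=int)
        for s in samples:
            col = min(int(s.x * bins), bins - 1)
            row = min(int(s.y * bins), bins - 1)
            grid[row, col] += 1
        return grid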

Another indicator is the order in which drivers look at things, which is key to knowing where each driver expects to find a function. “We may think, for example, that the lower part of the screen is the most accessible, but with the Eye-Tracker glasses, we can discover that, for whatever reason, first they look at the upper part,” he explained. 
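The order-of-gaze indicator can likewise be thought of as the sequence in which areas of interest are first visited. A minimal, hypothetical sketch reusing the classify() helper from above:

    def first_look_order(samples):
        """Order in which AOIs are first visited, e.g. ['road', 'screen']."""
        order = []
        for s in samples:
            aoi = classify(s)
            if aoi is not None and aoi not in order:
                order.append(aoi)
        return order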

What future does it have? 

All these usability patterns will be key in developing the central consoles of tomorrow’s cars, determining the location, size and distribution of information that is most comfortable for users. “This technology will help us humanise the interfaces, improving the user experience. With it we’ll certainly go a step further in the quality of the infotainment console of the future,” Rubén concluded.

Photos: SEAT’s Eye-Tracking glasses are fitted with infrared sensors in the lenses and a camera. The glasses record the patterns of the driver’s behaviour which can then be analysed in detail.