Brian Blum
November 27, 2018, Updated November 28, 2018

Here’s a secret about self-driving cars: They won’t be entirely autonomous for quite some time. The near future belongs to semi-autonomous vehicles, which do most of the driving but pass control to a human behind the wheel at certain points.

But how will the car know if the driver is ready and alert for the transfer?

Israeli computer vision company Eyesight uses a single camera located near the infotainment system or above the instrument cluster, plus proprietary algorithms and artificial intelligence (AI), to judge if the driver is prepped to take the wheel.

Even in a standard vehicle equipped with Eyesight technology, if you’re not paying attention, your steering wheel will vibrate or the system will beep. It’s like Mobileye for distracted driving.

There’s a big opportunity here: In 2020, new Euro NCAP (European New Car Assessment Programme) rules come into effect, requiring any manufacturer wishing to receive a 5-star safety rating on its models to have a driver-monitoring system installed.

Eyesight just raised $15 million to drive the expansion of its vehicle AI computer-vision system. All told, the company has raised nearly $50 million, CEO Gideon Shmuel tells ISRAEL21c, including a $20 million investment in 2016 from Chinese investor Kuang-Chi.

China’s carmakers are prominent on Eyesight’s client prospect list. In June, Eyesight joined forces with Soling Industrial, a Shenzhen-based automotive integrator with sales in more than 60 countries and clients including Mazda, Toyota and Fiat.

Eyesight initially integrated its technology into consumer electronics. On the company’s website, you can see images of people gesturing with their hands rather than clutching a mouse or a keyboard to operate PCs and smart TVs. Eyesight’s facial analysis can also suss out details like age and gender to present relevant content (read: ads) to viewers.

In the consumer space, Eyesight has deals with NTT Docomo and Sony in Japan. Sony is launching a portable projector called the Xperia Touch with Eyesight gesture controls built in.

That’s what Eyesight was working on originally in cars too – the ability to control entertainment and telephone systems touch-free.

“With your finger, you can do a small circle motion to the right or left to turn volume up and down,” Shmuel says. You could flash one finger to call home, two fingers to call your office or hold your palm flat to answer an incoming call.
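The mapping Shmuel describes, from a recognized gesture to an in-car action, can be pictured as a simple dispatch table. This is only an illustrative sketch: the gesture labels and action names below are invented for the example, not Eyesight’s actual API.

```python
# Hypothetical sketch: mapping recognized gestures to in-car actions.
# The gesture labels and action names are illustrative assumptions,
# not Eyesight's real interface.

GESTURE_ACTIONS = {
    "circle_right": "volume_up",     # small circle motion to the right
    "circle_left": "volume_down",    # small circle motion to the left
    "one_finger": "call_home",       # flash one finger
    "two_fingers": "call_office",    # flash two fingers
    "open_palm": "answer_call",      # hold your palm flat
}

def dispatch(gesture: str) -> str:
    """Return the action for a recognized gesture; ignore unknown input."""
    return GESTURE_ACTIONS.get(gesture, "no_op")
```

In practice a classifier would emit the gesture label from camera frames; the point of the table is that adding a new control is just adding a new entry.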

But as the Euro NCAP rules loomed, “the market came to us,” Shmuel explains. “Sometimes, a market requires a long education for a new technology. Here, the market said, ‘Don’t educate us, we know what’s coming.’ We need driver monitoring systems to understand the state of the driver.”

That includes detecting the direction of a driver’s gaze, pupil dilation, eye openness and head position to gauge drowsiness vs. attentiveness.
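How those cues might combine into a single driver-state estimate can be sketched as a weighted score. The feature weights and alert threshold below are assumptions chosen for illustration; Eyesight’s actual algorithms are proprietary.

```python
# Illustrative sketch of driver-state scoring from per-frame signals.
# Weights and the 0.6 threshold are invented for this example and are
# not Eyesight's proprietary model.

def drowsiness_score(eye_openness: float, gaze_on_road: bool,
                     head_pitch_deg: float) -> float:
    """Combine simple cues into a 0..1 drowsiness score (1 = most drowsy)."""
    score = 0.0
    score += 0.5 * (1.0 - eye_openness)                   # eyes closing
    score += 0.3 * (0.0 if gaze_on_road else 1.0)         # gaze off the road
    score += 0.2 * min(abs(head_pitch_deg) / 30.0, 1.0)   # head nodding
    return score

def should_alert(score: float, threshold: float = 0.6) -> bool:
    """Trigger the wheel vibration / beep when the score crosses a threshold."""
    return score >= threshold
```

A production system would smooth such signals over time (e.g. PERCLOS-style eyelid-closure averaging) rather than react to a single frame.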

Eyesight sensors track drivers’ eyelid and iris movement. Photo: courtesy

A Chinese bus company is interested in having Eyesight monitor whether its drivers are eating, smoking, talking on the phone or doing anything else that could put passengers in jeopardy.

Eyesight’s system can handle less critical details as well, such as automatically adjusting a driver’s seat and mirrors or selecting a driver’s favorite playlist. As car-sharing takes off, this may make all the difference in crafting a user-friendly experience.

By calculating a passenger’s size, Eyesight can also help a car’s computer understand how to more safely deploy an airbag.

Privacy is an important concern for drivers, and Eyesight’s got that covered, too. All the AI processing is done in the car itself, and images are not sent for comparison on the cloud. Once the images are processed for the task at hand, they are discarded.
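The process-then-discard pipeline described above can be sketched as a loop that analyzes each frame locally and keeps only the derived signals. The frame source and analysis function here are placeholders, not Eyesight’s implementation.

```python
# Sketch of an on-device ("edge") processing loop: each camera frame is
# analyzed locally and discarded; only derived signals leave the loop.
# The analyze() callback is a placeholder for illustration.

def process_frames(frames, analyze):
    """Yield per-frame results while retaining no reference to raw images."""
    for frame in frames:
        result = analyze(frame)   # e.g. {"eyes_open": True, "gaze": "road"}
        del frame                 # raw image is dropped immediately
        yield result              # nothing is uploaded to a cloud service
```

The design choice matches the article’s claim: because inference runs in the car, the raw images never need to exist outside this loop.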

Gideon Shmuel, CEO of Eyesight. Photo: courtesy

Compared with the data collected and used by, say, Facebook, Instagram or your iPhone, “We’re totally private,” Shmuel says.

Eyesight can do this in part because it requires very low processing power. This is known in the industry as “edge computing” on distributed nodes (“edge devices”) as opposed to a centralized cloud environment. And Eyesight’s cameras are simple 2D infrared sensors, not sophisticated 3D sensors like Apple has on board for facial recognition.

That same technology powers a separate Eyesight division for smart homes. Two applications in development are smart thermostats that sense how many people are in a room and adjust the temperature accordingly; and surveillance and security systems. If three 18-year-old males suddenly show up at Grandma’s house, that could trigger an alert, Shmuel says.

Eyesight has its headquarters and R&D center in Herzliya, with satellite offices in Detroit, Silicon Valley and Seattle. It has representatives in Europe and China as well. The company employs 55 people. And with Eyesight’s cameras around, we bet nobody is falling asleep on the job!
