
6 New Perception Systems for AI Self-Driving Cars

January 17, 2019. 7 mins read

We recently caught an indie film called At Eternity’s Gate about the artist and ear fetishist Vincent van Gogh. The painter suffered from mental illness, and the movie attempted to capture his brilliant but mad vision through various camera tricks that were intended to mimic van Gogh’s own impressionistic aesthetic. It made us a little sick at times. But it also reminded us that we each see the world in different ways, which makes the trick of translating the wonder of human vision to machines all the more amazing. Computer vision often relies on cameras or other types of sensors to perceive the world – the eyes – and then algorithms – the brain – help it make sense of the input. Google is pretty good at identifying pictures of cats, but if we want autonomous systems like AI self-driving cars to perceive real-life cats and make quick decisions about what to do, there’s still plenty of work to be done.

There are nearly 50 corporations, not to mention all of the startups out there, working on developing AI self-driving cars. But that’s only the start: A whole ecosystem of hardware and software has sprung up to support this emerging industry. There are companies developing high-definition mapping to help autonomous vehicles navigate; creating virtual worlds where they can learn to drive on billions of simulated miles; and building sensor systems such as LiDAR to perceive the environment around them. It’s this last category, what we might call “perception systems,” that we want to talk about today. This includes not just startups developing hardware like laser-based sensors and cameras, but also machine learning systems that help AI self-driving cars interpret that input so they can see more like humans.

Seeing LiDAR in a Different Light

One of the most prominent perception systems for AI self-driving cars is actually an old technology that dates back to the 1960s. LiDAR, which stands for Light Detection And Ranging, was developed shortly after the invention of lasers. One of its first uses was actually in meteorology, in order to answer the question: “Is it wet enough for you?” The basic concept is pretty simple: The instrument shoots rapid-fire light pulses at an object or surface, some up to a million or more per second. A sensor on the instrument measures the amount of time it takes for each pulse to bounce back, which is a measure of distance. More sophisticated systems can use that data to create a 3D map of the surrounding environment.
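To make that concrete, here’s a minimal sketch of the time-of-flight arithmetic: halve each pulse’s round-trip time, multiply by the speed of light, and a scan angle turns each return into a point in a 3D map. The numbers and function names below are ours, purely for illustration.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def pulse_distance(round_trip_s: float) -> float:
    # The pulse travels out and back, so halve the round trip.
    return C * round_trip_s / 2.0

def to_point(round_trip_s: float, azimuth_rad: float, elevation_rad: float):
    # One return becomes one (x, y, z) point in the sensor's 3D map.
    r = pulse_distance(round_trip_s)
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)

# A return after ~667 nanoseconds puts the surface about 100 m away.
print(f"{pulse_distance(667e-9):.1f} m")
print(to_point(667e-9, math.radians(10.0), math.radians(-2.0)))
```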

AI self-driving cars can use an array of sensors for their perception systems, including LiDAR, radar, and cameras.

In 2016, we came across some of the first companies building LiDAR systems for AI self-driving cars, including a 35-year-old company called Velodyne. The next year we covered nine startups developing LiDAR systems for autonomous vehicles. Since then, even more have emerged, promising the latest iteration of the old technology.

One of these is a Silicon Valley startup called Aeva that was founded in 2016. It took in a $45 million Series A last October, with participation from well-known VC firm Lux Capital, bringing total funding to $48.5 million. Led by a pair of ex-Apple engineers, Aeva’s LiDAR system does one thing most others can’t: It not only records the distance to an object, but also how fast that object is moving, which is especially important if that object is, say, a child or an incoming missile. The sensor package, as one of the co-founders described it to VentureBeat, “combines the key advantages of lidar, radar, machine vision, and high-accuracy motion sensing.”

The Aeva LiDAR system. Credit: Aeva

The system actually fires one steady beam of light rather than millions of light pulses, which helps it track moving objects. The sensor not only offers greater range and better resolution, but it’s also less likely to experience interference from weather or other laser sensors, VentureBeat reported.
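The physics behind reading velocity off a continuous beam is the Doppler effect: motion toward or away from the sensor shifts the frequency of the returned light. The sketch below applies that textbook relationship; the 1550 nm wavelength and the tidy numbers are our assumptions for illustration, not Aeva’s published specs.

```python
def radial_velocity(doppler_shift_hz: float, wavelength_m: float = 1550e-9) -> float:
    """Radial speed of a target, in m/s, from the Doppler shift of the return.

    For a round trip, f_shift = 2 * v / wavelength, so v = f_shift * wavelength / 2.
    Positive means the target is approaching. 1550 nm is a common lidar
    wavelength, assumed here rather than confirmed for Aeva's sensor.
    """
    return doppler_shift_hz * wavelength_m / 2.0

# A ~12.9 MHz shift at 1550 nm corresponds to roughly 10 m/s of closing speed.
print(f"{radial_velocity(12.9e6):.1f} m/s")
```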

Sydney-based Baraja is also putting a new spin on LiDAR. The company raised $32 million this month to bring total funding to $33.6 million, with Sequoia China co-leading the latest round. Baraja calls its new technology Spectrum-Scan LiDAR, using “simple high school physics” that exploits the interaction between light and prisms. When light enters a prism, refraction sends the beam off in a new direction. By varying the color, or wavelength, of the light, the instrument can control the direction the beam is sent.

A look at what the Baraja LiDAR systems see. Credit: Baraja

The instrument makes use of off-the-shelf components like optical-grade silica-glass found in smartphone cameras, which the company says will help with integrating and scaling the Spectrum-Scan LiDAR.
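To see the “simple high school physics” at work, here’s a toy sketch of wavelength-steered refraction using Snell’s law and a Cauchy dispersion model. The coefficients are generic textbook-style values for silica glass, not Baraja’s optics, and real spectrum-scan systems amplify this small dispersion effect into a wide scan.

```python
import math

def refractive_index(wavelength_nm: float) -> float:
    # Cauchy dispersion n = A + B / wavelength^2; generic fused-silica-style
    # coefficients, for illustration only.
    return 1.458 + 3540.0 / wavelength_nm**2

def exit_direction_deg(incidence_deg: float, wavelength_nm: float) -> float:
    # Snell's law at the air-glass interface: sin(theta1) = n * sin(theta2).
    n = refractive_index(wavelength_nm)
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))

# Sweeping the laser's wavelength nudges the beam with no moving parts.
for wl in (1530.0, 1550.0, 1570.0):
    print(f"{wl:.0f} nm -> {exit_direction_deg(45.0, wl):.4f} deg")
```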

Update 03/24/2021: Baraja has raised $31 million in new funding to continue the deployment and development of its “unique and ingenious” imaging system. This brings the company’s total funding to $63.9 million to date.

LiDAR-based Perception System

One of the keys for widespread use of LiDAR is delivering a solid-state instrument without the very expensive moving parts of traditional electromechanical systems, which can cost upwards of $75,000. Unlike bucket seats or an eight-track player, LiDAR is not on the list of options you can decline if you want a true autonomous vehicle.

Based near San Francisco, AEye is a six-year-old startup that has raised about $59 million, including a $40 million Series B back in November. Investors include Intel, Airbus, and VC firm Kleiner Perkins, all of which participated in a $16 million Series A in June 2017. The CEO is a former NASA engineer who spent the bulk of his career in the defense industry, designing surveillance, reconnaissance, and defense systems for fighter jets at Lockheed Martin.

The AEye iDAR platform combines LiDAR, cameras, and AI. Credit: AEye

AEye, as the name implies, has developed a computer vision system for autonomous vehicles it calls iDAR (the “i” is for intelligent) that uses a solid-state LiDAR package combined with a high-resolution, low-light camera. Real-time integration of camera pixels and LiDAR voxels, which the company refers to as dynamic vixels, means the data are handled at the sensor level. That makes it easier and faster for the algorithms to evaluate a scene and make decisions, with less latency, bandwidth, and computing power. AEye announced just this month that it is releasing its latest iDAR package, the AE200, for Level 3 autonomy, where things start to get interesting.
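AEye hasn’t published what a vixel actually looks like, but the general idea of fusing a camera pixel with a LiDAR voxel at the sensor level might look something like this sketch, where every name and field is our own guess for illustration.

```python
from dataclasses import dataclass

@dataclass
class Vixel:
    # One fused sample: camera color plus LiDAR geometry. The field layout
    # is our illustrative guess; AEye's real format isn't public.
    u: int
    v: int
    rgb: tuple
    depth_m: float
    intensity: float

def fuse(pixels, lidar_returns):
    """Pair each LiDAR return with the camera pixel it projects onto.

    pixels: dict mapping (u, v) -> (r, g, b).
    lidar_returns: iterable of (u, v, depth_m, intensity), assumed already
    projected into the camera frame by upstream calibration.
    """
    return [
        Vixel(u, v, pixels[(u, v)], depth, intensity)
        for u, v, depth, intensity in lidar_returns
        if (u, v) in pixels
    ]

# One pixel with a return 42 m out; the second return misses the image.
print(fuse({(3, 5): (120, 80, 60)}, [(3, 5, 42.0, 0.7), (999, 9, 10.0, 0.2)]))
```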

Crossing the Road with Better Perception

We’re driving away from the hardware side of things with this next startup. Founded in 2015 just outside of Boston, Harvard spinout Perceptive Automata has taken in $20 million in disclosed funding over four rounds, including a $16 million Series A in October 2018 that included both Toyota and Hyundai. Perceptive Automata understands that the biggest impediment to the widespread adoption of AI self-driving cars is humans. We’ll let Jim Adler at Toyota AI Ventures explain more:

[Perceptive Automata uses] behavioral science techniques to characterize the way human drivers understand the state-of-mind of other humans and then train deep learning models to acquire that human ability.

So, Perceptive Automata is developing the algorithms behind the sensors. In other words, the startup’s platform can intuit that the teenager at the crosswalk looking at her smartphone and taking a step forward probably isn’t paying attention, rightly predicting that it should stop rather than teach her an important lesson about awareness. Perceptive Automata is doing more than standard algorithm training, which usually involves recognizing objects and people. Instead, the startup helps train its AI by developing deeply contextual training data that simulate real-world idiots crossing the street.
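Perceptive Automata’s actual models are deep networks trained on human judgments, but a toy logistic model over hand-picked cues shows the shape of the problem: turning behavioral signals into a probability that a pedestrian is about to cross. The cues and weights below are entirely invented.

```python
import math

def intent_score(features: dict) -> float:
    """Toy probability-like score that a pedestrian is about to cross.

    A logistic model over hand-picked cues. These cues and weights are
    invented for illustration; Perceptive Automata's real models are deep
    networks trained on human judgments of state of mind.
    """
    weights = {"facing_road": 2.0, "looking_at_phone": -1.5,
               "stepping_forward": 2.5, "near_curb": 1.0}
    z = -1.0 + sum(weights[cue] * float(on) for cue, on in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# The distracted teenager from above: stepping forward, glued to her phone.
score = intent_score({"facing_road": 1, "looking_at_phone": 1,
                      "stepping_forward": 1, "near_curb": 1})
print(f"P(crossing) ~ {score:.2f}")  # high enough that the car should stop
```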

Sensor Fusion for Perception Systems

Tel Aviv-based VayaVision also works on the software side of perception systems for AI self-driving cars. Founded in 2016, the Israeli company raised an $8 million seed round last October, with Mitsubishi among the early-stage investors, for its raw data fusion perception system. The approach sounds a bit like AEye’s: the software package (currently called VayaDrive 2.0) fuses raw data from LiDAR, cameras, and even radar to create an accurate 3D model around the autonomous vehicle, using computer vision and other computational tools.

How? VayaDrive takes the data from distance sensors like LiDAR and assigns the information to every pixel from high-resolution cameras. This allows autonomous vehicles to receive crucial information about the size and shape of objects, so they can be identified as vehicles, humans or incoming missiles.
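A simplified version of that pixel-level assignment is a standard pinhole projection: transform the LiDAR points into the camera frame, project them through the camera intrinsics, and write each point’s depth into the pixel it lands on. This sketch assumes calibration is already done upstream, and it is not VayaDrive’s actual code.

```python
import numpy as np

def project_to_pixels(points_xyz: np.ndarray, K: np.ndarray, image_shape) -> np.ndarray:
    """Write each LiDAR point's depth into the camera pixel it lands on.

    points_xyz: (N, 3) points already in the camera frame (extrinsic
    calibration assumed done upstream). K: 3x3 pinhole intrinsics.
    Returns an (H, W) depth map, NaN where no return landed.
    """
    h, w = image_shape
    depth = np.full((h, w), np.nan)
    z = points_xyz[:, 2]
    front = z > 0                              # keep points in front of the camera
    uvw = (K @ points_xyz[front].T).T          # homogeneous pixel coordinates
    u = (uvw[:, 0] / uvw[:, 2]).astype(int)
    v = (uvw[:, 1] / uvw[:, 2]).astype(int)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    depth[v[ok], u[ok]] = z[front][ok]
    return depth

# A single point 20 m ahead lands near the image center with this toy K.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
pts = np.array([[1.0, 0.5, 20.0]])
print(np.nanmin(project_to_pixels(pts, K, (480, 640))))  # -> 20.0
```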

Thermal Cameras for Perception Systems

Another Israeli company takes us back around to the hardware side. AdaSky, founded in 2015, took in $20 million last November from a South Korean outfit that specializes in automotive components. As you might expect from the heavily militarized startup nation, AdaSky’s high-resolution thermal cameras leverage military technology. Dubbed Viper, the platform collects far-infrared signals: the thermal energy that objects and bodies radiate as heat. Computer vision algorithms then process those signals to provide accurate object detection and scene analysis.
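A thermal frame makes the first step of detection almost trivial: warm bodies sit well above ambient temperature. The sketch below is only that naive first cut, with made-up numbers; AdaSky’s Viper runs full computer vision on top of calibrated far-infrared imagery.

```python
import numpy as np

def warm_regions(fir_frame: np.ndarray, ambient_c: float = 15.0) -> np.ndarray:
    """Boolean mask of pixels radiating noticeably above ambient.

    fir_frame holds per-pixel temperatures in Celsius, as a far-infrared
    camera might report after calibration. This threshold is only the
    simplest first cut, not AdaSky's detection pipeline.
    """
    return fir_frame > ambient_c + 10.0  # exposed skin sits near 30-35 C

# A pedestrian on a cold night stands out sharply against the background.
frame = np.full((4, 6), 5.0)      # 5 C scene
frame[1:3, 2:4] = 31.0            # warm body
print(warm_regions(frame).astype(int))
```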

The AdaSky thermal camera. Credit: AdaSky

Thermal cameras have advantages over other types of sensors, offering not only higher resolution than LiDAR and radar, but also better performance in poor weather conditions.

Conclusion

Making machines that can see the world like humans can is no mean feat, as we can see here. Solutions must not only be safe, but scalable and affordable. These new perception systems for AI self-driving cars promise to be all that and more. Some may run into a dead end, but others will surely be part of the future.
