Today at Tesla’s first Autonomy Day event, Elon Musk took questions from the press, including several about LIDAR. Historically, he’s been vocal about the technology, and this time he put it as clearly as he could.
“LIDAR is a fool’s errand,” Elon Musk said. “Anyone relying on LIDAR is doomed. Doomed! [They are] expensive sensors that are unnecessary. It’s like having a whole bunch of expensive appendices. Like, one appendix is bad, well now you have a whole bunch of them, it’s ridiculous, you’ll see.”
The topic was brought up by a question about whether Tesla’s just-revealed self-driving hardware could handle input from LIDAR. Tesla’s vehicles currently use several sources of data to achieve autonomous driving: radar, GPS, maps, ultrasonic sensors and more. But not LIDAR, unlike some of Tesla’s chief competitors. Elon Musk previously explained that he views LIDAR as a crutch for self-driving vehicles. For Tesla, cameras are the key, and its CEO sees a future in which cameras will enable Tesla vehicles to see through the most adverse weather conditions.
Andrej Karpathy, Senior Director of AI, took the stage and explained that the world is built for visual recognition. LIDAR systems, he said, have a hard time distinguishing between a plastic bag and a rubber tire. Neural networks and visual recognition are necessary for Level 4 and Level 5 autonomy, he said.
Uber, Waymo, Cruise and several others use LIDAR in their self-driving technology stacks. As proponents of the technology, they point to LIDAR’s ability to see through challenging weather and light conditions better than existing cameras. But LIDAR units are expensive and often power-hungry. That’s where Tesla’s camera-based solution comes in.
The company today detailed its current-generation self-driving computer, which works with all existing Tesla vehicles. Once the software is ready, it will enable all Teslas to drive autonomously with their existing sensor set — at least that’s what the company says — and that sensor set doesn’t include LIDAR. Instead, the sensors inside Tesla vehicles feed a neural network that’s trained on data collected from all Tesla vehicles.
“Everyone’s training the network all the time,” Musk said. “Whether autopilot is on or off, the network is being trained. Every mile that’s driven for the car that’s hardware 2 or above is training the network.”
The resulting data is kind of scary, Musk mused later in the press conference. But presumably not as scary as relying on LIDAR.