In today’s automotive market, multiple brands are introducing new “self-driving cars” for commercial use, with companies like Tesla and Mercedes-Benz claiming they would release fully autonomous models as early as 2020. However, while certain autonomous vehicle (AV) companies are quick to brand their newest products as “driverless,” none have achieved true autonomy as defined by the Society of Automotive Engineers (SAE), the industry’s leading standards development organization. In a 2014 document titled Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems (revised three times since), SAE defines six levels of driving automation, from Level 0 to Level 5; Level 5 indicates full autonomy under all environmental conditions, whereas Level 0 assumes human control over all aspects of driving (1). While they can automate functions such as steering, acceleration/deceleration, and lane changing, even the most advanced driver assistance systems today require the driver’s constant attention, effectively placing them at Level 2.

The major problem companies face while developing autonomous vehicles is dynamic vision: how a vehicle detects and responds to changes on the road (2). To solve this problem, companies rely on suites of sensors working in tandem. One of these sensors, LiDAR, has been the subject of ongoing debate; while the technology is widely used by AV companies, a vocal minority in the industry, including Tesla CEO Elon Musk, argues that the cost of LiDAR outweighs its benefits. During Tesla’s inaugural Autonomy Day in April 2019, Musk remarked that LiDAR is “a fool’s errand” and that “anyone who relies on LiDAR is doomed” (3).

To understand the root of this argument, one must first examine the purpose and working principles of LiDAR, which stands for Light Detection and Ranging. Essentially, LiDAR is a remote sensing technology that sends pulses of laser light at surfaces and measures the time it takes each pulse to bounce back to the sensor. Using this information, LiDAR can build detailed 3D maps of its surroundings (4). Proponents of LiDAR, such as Waymo (owned by Google’s parent company Alphabet), argue that its greatest advantage lies in its ability to both compensate for its own weaknesses and provide redundancy when used as part of a detection sensor suite. For instance, according to CTO Dmitri Dolgov, Waymo’s idealized model of an autonomous vehicle combines LiDAR, radar, and camera technologies. In this combination, LiDAR serves as the main source for modeling the vehicle’s surroundings in 3D and tracking moving objects. Cameras are simultaneously used to distinguish objects (e.g. road signs, pedestrians) and recognize color (which LiDAR is unequipped to detect) using pattern recognition software. Finally, radar compensates for LiDAR’s inability to function in severe conditions such as heavy rain, snow, or fog. By overlapping the sensors’ capabilities, Waymo also aims to ensure that each sensor can fill in for another should one fail (5).
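The time-of-flight principle described above is simple enough to sketch: range follows from the round-trip time of a pulse and the speed of light, and each range, combined with the beam’s direction, yields one point of the 3D map. The snippet below is an illustrative sketch only; the 200 ns timing and the helper names are hypothetical, not drawn from any vendor’s API.

```python
import math

# Time-of-flight ranging: a pulse travels to a surface and back, so the
# one-way distance is (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light, in m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return C * t_seconds / 2.0

def to_point(r: float, azimuth: float, elevation: float) -> tuple:
    """Convert a range plus the beam's direction angles (in radians)
    into an (x, y, z) point, the building block of a 3D point cloud."""
    x = r * math.cos(elevation) * math.cos(azimuth)
    y = r * math.cos(elevation) * math.sin(azimuth)
    z = r * math.sin(elevation)
    return (x, y, z)

# A pulse returning after ~200 nanoseconds hit a surface ~30 m away.
r = range_from_round_trip(200e-9)
print(round(r, 2))  # ~29.98 m
```

Repeating this calculation for millions of pulses per second, swept across different azimuth and elevation angles, is what produces the detailed 3D maps described above.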

However, this model has its shortcomings, the most prominent being the cost of LiDAR. While leading LiDAR manufacturers do not publish prices for all models, Velodyne (a Silicon Valley company considered the biggest supplier of LiDAR sensors to the mobility sector) is estimated to offer products ranging from $5,000 to $75,000, raising concerns that the added cost of LiDAR could make commercially viable AVs impossible (6). Another concern stems from LiDAR’s mechanical structure; since most variants of the sensor rely on moving parts to achieve a wide field of view, they risk malfunctioning or breaking down on rough terrain (e.g. bumps, potholes), ultimately increasing the long-term cost of maintaining a LiDAR-equipped AV. This is where Tesla’s own vision for autonomous vehicles comes in; the company hopes to eliminate the need for LiDAR by replicating the sensor’s depth perception and motion tracking abilities with a combination of eight cameras and a single radar. Using data gathered by approximately half a million Tesla vehicles on the road to train an advanced neural network, Tesla hopes to give its cars the ability to “reason” from stereo vision alone, akin to what humans use when driving (7). Yet critics cite past instances in which Tesla’s camera technology proved susceptible to changes in external light and unable to detect certain stationary objects, questioning whether the camera-radar combination can effectively replace LiDAR (8,9).

In light of cost-related problems, hardware issues, and companies devising more affordable alternatives, the future of LiDAR as an essential component of AV production seems uncertain. One thing is certain, however: whether detection systems built around LiDAR prevail over their alternatives or not, the market is benefiting from the competition. The pressure to be the first AV company to reach Level 5 autonomy is pushing carmakers to expedite production and pursue new ways to lower costs, with the unit cost of a radar falling to as low as $150 (10). Similarly, companies championing LiDAR are working on cheaper and more efficient models; after it began manufacturing its own LiDAR sensors, for instance, Waymo lowered the unit cost of a $75,000 model to $7,500 (6), and Luminar recently announced a $500 LiDAR (11), signaling that the end of the LiDAR era may be farther away than some expected.


References:

  1. SAE On-Road Automated Vehicle Standards Committee. (2014). Taxonomy and definitions for terms related to on-road motor vehicle automated driving systems (SAE Standard J3016), 1-16.
  2. Maqueda, A. I., et al. (2018). Event-based vision meets deep learning on steering prediction for self-driving cars. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.
  3. McFarland, M. (2019, June 18). Most self-driving companies say this tech is crucial. Elon Musk disagrees. Retrieved August 2, 2019, from https://edition.cnn.com/2019/06/17/tech/lidar-self-driving-tesla/index.html
  4. Taranovich, S. (n.d.). Autonomous automotive sensors: How processor algorithms get their inputs. Retrieved August 2, 2019, from https://www.edn.com/design/analog/4442319/autonomous-automotive-sensors--how-processor-algorithms-get-their-inputs
  5. Brown, M. (n.d.). Waymo CTO Dmitri Dolgov on Dust Storms, Lidar, Tesla, and Expansion. Retrieved August 2, 2019, from https://www.inverse.com/article/56891-waymo-s-cto-details-the-iphone-like-car-that-powers-autonomous-driving
  6. Korosec, K. (2019, March 06). Waymo to start selling standalone LiDAR sensors. Retrieved August 02, 2019, from https://techcrunch.com/2019/03/06/waymo-to-start-selling-standalone-lidar-sensors/
  7. Knight, W. (2016, March 17). A Car Drives Itself with Just One Camera. Retrieved August 2, 2019, from https://www.technologyreview.com/s/539841/one-camera-is-all-this-self-driving-car-needs/
  8. Stewart, J. (2018, December 13). People Have Got to Stop Confusing Their Teslas for Self-Driving Cars. Retrieved August 2, 2019, from https://www.wired.com/story/tesla-autopilot-crash-dui/
  9. Stewart, J. (2018, December 21). Why Tesla's Autopilot Can't See a Stopped Firetruck. Retrieved August 2, 2019, from https://www.wired.com/story/tesla-autopilot-why-crash-radar/
  10. Cost of a Self-Driving Car's Components Automotive Electronics. (2018, November 15). Retrieved August 2, 2019, from https://www.automotivelectronics.com/cost-of-components-of-a-self-driving-car/
  11. $500 Lidar From Luminar Could Move Autonomous Driving Forward. (2019, July 12). Retrieved August 02, 2019, from https://cleantechnica.com/2019/07/12/500-lidar-from-luminar-could-move-autonomous-driving-forward/