For Self-Driving Cars, Lidar Amps Up at CES 2020
Jan. 09, 2020
LAS VEGAS – If self-driving cars are failing to live up to their initial hype, it isn’t for lack of investment in lidar, which is broadly agreed to be a key sensor technology needed for their eventual success. With Bosch’s recent announcement that it’s making lidar sensors for autonomous cars, it seems like the list of auto-related companies not making lidars is shorter than the list of ones that are.
CES has become a major showcase for lidar innovation, and CES 2020 is no exception. Covering lidar here can seem a bit like Groundhog Day. Each year the vendors grandly announce that they have amazing new products that will revolutionize the automotive sensor industry. In reality, many of the new ideas have taken years to become products, and promised high-volume price points have proven elusive. As a result, it is hard to get to the bottom of exactly which products are production-ready, and what car companies are actually paying for them, but we can give you a sense of some of the key lidar products planned to roll out this year.
A relative newcomer in lidar, having been founded in 2014, Robosense has risen to prominence fairly quickly, in part because of strong backing from the Chinese auto industry. For CES it announced the smallest and smartest version of its MEMS-based solid-state lidar family, the RS-Lidar-M1 ($1,898). The base version (dubbed “Simple”) features 125 beams at 905 nm covering a 120-degree field of view. The company claims it can detect a 10%-reflectivity NIST target out as far as 150 meters. The new model is significantly smaller than the previous version, at just 4.3 x 1.9 x 4.7 inches, and can run at 15 Hz.
While those specs are impressive, there are plenty of other lidar companies promising similar devices. What makes the M1 family more interesting is the planned “Smart” version. It adds onboard analytics, which many view as an important step in reducing the processing that needs to happen on the vehicle’s main CPU/GPU complex. The device’s embedded AI capability is designed to transform the raw lidar 3D point cloud into semantic-level data that can be used directly by the vehicle’s behavior systems. Robosense has chosen a relatively low-cost silicon solution using Xilinx FPGAs. It’ll be interesting to see how that compares with some of the higher-end device-based AI solutions, like Cepton’s, which includes an Nvidia Jetson TX board.
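To get a feel for what “semantic-level data” buys you, here’s a toy sketch (my own illustration, not Robosense’s actual algorithm): a raw point cloud is collapsed into a short list of obstacle bounding boxes via naive grid-based clustering, the kind of reduction that spares the vehicle’s central computer from touching every raw point.

```python
from collections import deque

def cluster_points(points, cell=0.5):
    """Group (x, y) points into clusters of adjacent 0.5 m grid cells.

    A simple connected-components pass over an occupancy grid; real
    on-sensor perception is far more sophisticated, but the data
    reduction it achieves is the same idea.
    """
    occupied = {}
    for p in points:
        occupied.setdefault((int(p[0] // cell), int(p[1] // cell)), []).append(p)
    clusters, seen = [], set()
    for start in occupied:
        if start in seen:
            continue
        queue, members = deque([start]), []
        seen.add(start)
        while queue:
            cx, cy = queue.popleft()
            members.extend(occupied[(cx, cy)])
            for nx in (cx - 1, cx, cx + 1):
                for ny in (cy - 1, cy, cy + 1):
                    if (nx, ny) in occupied and (nx, ny) not in seen:
                        seen.add((nx, ny))
                        queue.append((nx, ny))
        # Report each cluster as a bounding box: far less data than raw points.
        xs = [p[0] for p in members]
        ys = [p[1] for p in members]
        clusters.append((min(xs), min(ys), max(xs), max(ys)))
    return clusters

points = [(1.0, 1.0), (1.2, 1.1), (1.1, 0.9),   # one object
          (8.0, 3.0), (8.3, 3.2)]               # another object
print(cluster_points(points))  # two boxes instead of five points
```

Whether the downstream planner should trust object lists produced inside the sensor is exactly the fail-safe question raised later in this article.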
Robosense isn’t the only Chinese lidar company aiming to disrupt the market. Drone giant DJI has incubated a lidar startup, Livox, that is introducing a pair of new units at CES. What makes Livox’s approach novel is its non-repeating scanning pattern. Rather than the more traditional options of either a repeated rotation or a repeated burst pattern, the Livox units trace out new ground each time they traverse a region, creating a somewhat flower-like pattern as they go. (Frankly, it reminds me of a Spirograph, but most patterns made with those repeated after a bit.)
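A minimal sketch of how such a pattern can arise (the rotation rates and amplitudes below are illustrative choices of mine, not Livox’s actual parameters): sum the deflections of two rotating elements whose rates have a non-integer ratio, and the beam path never exactly retraces itself, so coverage keeps improving the longer the sensor dwells.

```python
import math

def rosette_point(t, f1=1000.0, f2=-1618.0, r1=0.5, r2=0.5):
    """Angular beam position at time t for two stacked rotating deflectors.

    f1, f2: rotation rates in Hz (opposite signs, non-integer ratio, so
    the traced path never exactly repeats). r1, r2: deflection amplitudes.
    """
    a1 = 2 * math.pi * f1 * t
    a2 = 2 * math.pi * f2 * t
    return (r1 * math.cos(a1) + r2 * math.cos(a2),
            r1 * math.sin(a1) + r2 * math.sin(a2))

def cells_covered(duration_s, pulses_per_s=100_000, grid=50):
    """Count distinct grid cells the beam visits within duration_s seconds."""
    seen = set()
    for i in range(int(duration_s * pulses_per_s)):
        x, y = rosette_point(i / pulses_per_s)
        # Map the [-1, 1] beam coordinates onto a grid x grid occupancy map.
        seen.add((int((x + 1) / 2 * grid), int((y + 1) / 2 * grid)))
    return len(seen)

# Unlike a fixed scan line pattern, coverage keeps growing with dwell time.
print(cells_covered(0.02), cells_covered(0.5))
```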
One obvious advantage of this approach is that, given enough time, the resolution far surpasses a traditional lidar’s, and there aren’t any blind spots. If the vehicle or its surroundings are moving, however, that advantage is reduced. Livox’s Horizon unit boasts a competitive range of 260 meters and covers a field of view of 81.7 degrees horizontally, enough to span four road lanes at a distance of 10 meters; five of the units are enough for 360-degree coverage. The company claims that one unit is approximately equivalent to a 64-line scanning model. The longer-range model, the Tele-15, covers a narrower 15-degree field of view and promises to work out to 500 meters. Livox claims lower part counts and therefore lower prices than competitors, but, as usual for lidar announcements, there isn’t much provided in the way of facts to back that up.
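The lane-coverage figure checks out with basic trigonometry: a horizontal field of view θ sweeps a swath 2·d·tan(θ/2) wide at distance d. A quick check (the 3.7 m lane width is a typical highway figure, my assumption, not Livox’s):

```python
import math

def swath_width(distance_m, hfov_deg):
    """Width of the region covered at a given distance by a horizontal FOV."""
    return 2 * distance_m * math.tan(math.radians(hfov_deg) / 2)

width = swath_width(10, 81.7)          # Horizon's FOV at 10 meters
lanes = width / 3.7                    # assumed 3.7 m lane width
print(f"{width:.1f} m wide, ~{lanes:.1f} lanes")
```

That works out to roughly 17 meters, comfortably more than four lanes, so the claim is conservative.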
At last count, at least 20 companies have sent us press releases claiming that they are the leader in lidar. But if any company can legitimately make that claim, it is Velodyne. As a pioneer and “first mover,” the company seized the lion’s share of the market for its first decade and popularized the “KFC bucket” $70K lidars that sat on top of first- and second-generation autonomous test vehicles. Since then, the market has spawned dozens of competitors, but Velodyne certainly isn’t standing still. I got a tour of its new products at CES this week, and while most of them aren’t revolutionary (except perhaps for the tiny $100 Velabit), they continue to move the ball forward. The Velabit’s low price point puts it within striking distance of being an alternative to radar in the increasingly common automatic emergency braking (AEB) systems now found in millions of vehicles.
The most exciting thing about the Velodyne booth tour at CES is that it includes the single best use of VR at a tradeshow I’ve ever experienced. Once you put on the headset (an Oculus with touch controllers), you can pick up any of the company’s products and activate them, and they’ll show you what they’d record in a virtual city. Did I mention there is an entire virtual city, complete with virtual birds? It lets you evaluate how each sensor records its surroundings. The one feature I’d like to see added is an accurate representation of the actual number of channels on each sensor, rather than just a painted field of view. But that issue is quickly addressed by going “inside.”
Once you move inside a lidar, you can take a virtual drive through the city and see what it saw (this part is recorded, so you can’t drive around at random). There are various ways to help people experience point clouds, and this is one of the more effective ones.
The experience was great, but of course, the big question is whether Velodyne can continue to hold off the myriad challengers to its throne. I was fortunate enough to spend some time with the company’s new CEO, Anand Gopalan, and get his perspective. Judging by the product announcements and hype, one of the biggest threats to Velodyne is the onslaught of vendors building various kinds of AI into their lidars, especially pedestrian and vehicle detection.
Both Gopalan and executives I spoke with from several autonomous vehicle makers expressed some skepticism about whether that innovation will be ideal for L4/L5 vehicles, as it is important to have the data processed in a way that can provide a true fail-safe system. But Gopalan said that Velodyne is definitely looking at how much it can do in the lidar’s own hardware for other reasons. He explained that physics-based tasks like SLAM can be performed at a much lower power cost if they are tightly integrated into the lidar instead of handled by a centralized computer in the car. He also said that such low-level systems could serve as a potential fail-safe for autonomous vehicles, acting as a sort of “reptilian brain” if there are problems with the main control software.
While almost every Level 4 and above autonomous vehicle project includes at least one lidar, there are some exceptions. Most notable is Tesla, which claims its current array of radar and cameras is plenty to turn your Tesla into a money machine as a self-driving taxi. Personally, I don’t think that will happen without additional hardware, but the hardware might well not have to include lidar. For example, Ambarella has demonstrated impressive results with only an array of visible-light cameras, and FLIR does something very similar using thermal imaging alongside visible light. Intel’s Mobileye is running test vehicles with two parallel systems — one camera-based, and the other a combination of sensor technologies.
Lidar supplier Ouster also updated its product line, with a new wide-field-of-view lidar (OS0, 32-128 channels, 95-degree FOV) and a long-range traditional scanning lidar (OS2, 64-128 channels, 22.5-degree FOV). As a sign of progress in price reductions, the 32-channel version of the OS0 starts at $16K, while an impressive 128 channels goes for $24K — less than a third of what autonomous vehicle companies were paying for a unit with fewer channels (although in some cases a full 360-degree FOV) a few years ago. As mentioned earlier, proven lidar vendor Cepton has added “AI at the Edge” to its products, using an Nvidia GPU to run people- and vehicle-detection software. Intelligent lidar pioneer AEye is moving to the second generation of its product, which features a beam that can be aimed dynamically, with more important areas receiving more attention.
One thing that is clear is that not all of the dozens of lidar vendors will survive long-term. Many have already been acquired by larger automotive partners, some have folded, and a few are living on borrowed time. Venture capital also isn’t flowing as freely into this market as it once did, partially because autonomous vehicles aren’t imminent, and partially because consumer adoption will require component prices low enough that only a few high-volume suppliers will be able to make any money. We’ll be reporting more on this after Electronic Imaging 2020 later this month, where I’m moderating a panel on sensor technologies for autonomous vehicles that will include lidar, radar, and camera experts comparing and contrasting their technologies.