
2019 CES Highlights: Innovations in Enviro-Sensing for Robocars

Blackmore

Production of test units, still priced just under $20,000, is ramping up, but Blackmore is working with multiple Tier 1 suppliers to roll out a sub-$1,000 product in the 2023-2025 time frame. Those units will likely switch to a solid-state setup.

AEye

AEye also thinks 1,550 nanometers is the way to go, but takes things a step further by combining its dynamic iDAR solid-state lidar with a camera in one unit. Per AEye, traditional lidar is slowed down by its need to treat every pixel equally, creating a ton of environmental data for the computer to sift through, most of which gets thrown away once it's determined to be irrelevant to vehicle travel.

AEye's solution embeds micro-electro-mechanical systems into the solid-state lidar that allow it to fire photons randomly rather than in a preset pattern, which is easier for the computer to process. This also lets the sensor home in on areas of interest and direct more photons at them, rather than taking another picture of the entire environment and throwing most of it away. AEye says this reduces power consumption and latency by cutting the workload of both the lidar and the computer.
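The idea of budgeting photons toward areas of interest rather than scanning the whole scene uniformly can be illustrated with a simple scheduler. This is a hypothetical sketch of the concept, not AEye's firmware; the function names and the 70/30 budget split are illustrative assumptions.

```python
import random

def schedule_shots(grid_w, grid_h, total_shots, regions_of_interest,
                   roi_fraction=0.7):
    """Allocate a fixed photon budget per frame: most shots target
    regions of interest, the rest sample the full field of view so
    new objects can still be noticed. Returns a list of (x, y) cells."""
    shots = []
    roi_shots = int(total_shots * roi_fraction)
    # Concentrate the bulk of the budget on regions of interest
    for _ in range(roi_shots):
        x0, y0, x1, y1 = random.choice(regions_of_interest)
        shots.append((random.randint(x0, x1), random.randint(y0, y1)))
    # Spread the remaining shots over the whole scene
    for _ in range(total_shots - roi_shots):
        shots.append((random.randint(0, grid_w - 1),
                      random.randint(0, grid_h - 1)))
    return shots

# Example: 1,000 shots per frame, one tracked object in view
shots = schedule_shots(640, 480, 1000, [(100, 200, 200, 300)])
in_roi = sum(1 for x, y in shots if 100 <= x <= 200 and 200 <= y <= 300)
```

The payoff is that the downstream computer receives dense returns only where they matter, instead of a uniform point cloud it must mostly discard.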

At the same time, AEye pairs the dynamic lidar with an HD camera for both resolution and color sensing. The data from both the camera and lidar are fused at the sensor rather than at the computer, saving processing power. AEye’s software then processes what it’s seeing and directs the vehicle accordingly. The company says its system is an open book; OEMs can upload any custom detection and tracking software to enhance the system.

AEye, which was born out of target acquisition and tracking technology developed for the military, claims iDAR is the fastest scanner on the market at 100 Hz and can see the farthest, up to a kilometer away. It does this, says AEye, while consuming five to ten times less power than competitors.

RoboSense


RoboSense's RS-LiDAR-M1 Gaze system won a CES Innovation Award for its lidar/camera fusion solution. It fuses 905-nanometer pulsed time-of-flight laser technology with a two-dimensional camera image and deep-learning algorithms that establish target points of interest, focusing the laser's "gaze" on those areas to obtain rich data about their position and velocity. The lidar unit is a solid-state MEMS design (using micro-electromechanical mirrors), and the company anticipates production in 2020 at a price of around $200. It also expects the form factor to shrink roughly in half again from its already compact 2.8 x 5.5 x 6.5-inch size.

Cepton


Cepton is readying a lidar product for “millions” of units. It uses the more established 905 nanometer pulsed laser approach, but its unique twist on solid-state lidar systems is that its micro-mirrors are positioned via tiny electromagnets. The big news at CES was that the company has partnered with Japanese lighting manufacturer Koito, and plans to incorporate its compact lidar units into headlights and taillights that include cleaning functions.

Arbe Robotics

Arbe Robotics claims its 4D radar is "10 times more sensitive than any system in the market today." As a result, Arbe Robotics says, employing its 4D radar could reduce the number of sensors required in an autonomous vehicle's sensing suite, potentially reducing overall costs. The big breakthrough is Arbe's high resolution at both long distance—object detection is reliable at 300 meters (984 ft)—and a wide field of view (100 degrees wide and 30 degrees up). Resolution accuracy is said to be 1 degree of angular resolution in the horizontal direction and 2 degrees vertical. The system also works with SLAM (simultaneous localization and mapping) algorithms to help establish position. A patent-pending innovation helps its receiver reject radar waves emitted by other devices. Production is expected to ramp up in 2020.
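It's worth translating those angular-resolution figures into physical distances at the quoted range. Using the small-angle approximation (cross-range separation ≈ range × angle in radians), a quick check—an illustrative calculation, not Arbe's published math:

```python
import math

def cross_range(range_m, angle_deg):
    """Lateral distance subtended by a given angular resolution at a
    given range, via the small-angle approximation."""
    return range_m * math.radians(angle_deg)

# At the quoted 300 m detection range, 1 degree of horizontal angular
# resolution separates targets roughly 5.2 m apart laterally; 2 degrees
# vertical corresponds to roughly 10.5 m.
horizontal = cross_range(300, 1.0)
vertical = cross_range(300, 2.0)
```

In other words, at maximum range the radar can distinguish objects about one car-width-plus apart side to side, which is why radar is typically paired with higher-resolution sensors for fine discrimination.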

AdaSky

AdaSky's Viper product introduces technology developed in the security and military sectors: a far-infrared (FIR) thermal camera that merely receives thermal radiation from objects in the environment (no infrared light is generated and reflected). This means it works better in inclement weather, because objects in fog or snow still radiate heat. The system can provide detection to 120 meters (394 ft) in a 30.4-degree-wide beam, or to 200 meters (656 ft) within a 17-degree angle. Changing the optics can also provide high definition across an 87-degree beam, but only to a range of tens of meters. Image processing happens in the camera itself, with NVIDIA chips helping classify vehicles, cyclists, motorcycles, and pedestrians. On a clear Las Vegas morning, the image was nearly indistinguishable from that of an early black-and-white CMOS camera. Pricing is expected to be less than lidar, more than plain cameras. Mass production is expected by 2021, and the company is also marketing the technology for use in smart traffic light signaling.

TDK


TDK InvenSense bills its Coursa Drive as "the world's first inertial-aided positioning solution for AV platform developers." That means it provides dead reckoning to maintain positional accuracy during brief outages of the global positioning satellite network caused by tunnels, trees, and the like. To do this it uses a three-axis accelerometer and a three-axis gyroscope, and the system calibrates itself continually using absolute position inputs from either high-accuracy GPS receivers or from perception-based systems (camera, radar, lidar) and HD map-matching. Coursa Drive provides position and orientation information at a high rate of 100 updates per second—a faster refresh rate than typical GPS/perception systems deliver—so it stands to improve positional accuracy slightly even when there is a strong signal. Demonstration units will be available to developers in the first quarter of 2019.
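The core idea—integrate inertial measurements between absolute fixes, and snap back to reality whenever a fix arrives—can be sketched in two dimensions. This is a deliberately simplified illustration of dead reckoning, not TDK's algorithm; a real inertial navigation system fuses all six axes and models sensor bias and noise.

```python
import math

class DeadReckoner:
    """Minimal 2D dead reckoning: integrate yaw rate (gyro) and
    longitudinal acceleration (accelerometer) between absolute fixes."""

    def __init__(self, x=0.0, y=0.0, heading=0.0, speed=0.0):
        self.x, self.y = x, y
        self.heading = heading   # radians
        self.speed = speed       # m/s

    def step(self, accel, yaw_rate, dt):
        """Advance one IMU sample (e.g. dt = 0.01 s at 100 Hz)."""
        self.speed += accel * dt
        self.heading += yaw_rate * dt
        self.x += self.speed * math.cos(self.heading) * dt
        self.y += self.speed * math.sin(self.heading) * dt

    def correct(self, x, y):
        """Snap to an absolute fix from GPS or perception/map-matching."""
        self.x, self.y = x, y

# One second of 100 Hz samples driving straight at a constant 10 m/s:
dr = DeadReckoner(speed=10.0)
for _ in range(100):
    dr.step(accel=0.0, yaw_rate=0.0, dt=0.01)
# dr.x is now ~10 m down the road with no satellite fix required.
```

Because integration error grows with time, the `correct()` resets from GPS or perception are what keep the solution honest—the inertial system only has to bridge the gaps.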

Bosch


Bosch’s Sapcorda subsidiary proposes leveraging a network of terrestrial reference stations whose positions are precisely known. They provide correction factors that get transmitted to vehicles via the cloud or through satellite transmission in order to greatly increase the accuracy and precision of GPS satellite info—perhaps to a level sufficient for autonomous vehicle navigation.
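The reference-station concept works because a station at a precisely surveyed location can measure the current GPS error directly and broadcast it to nearby vehicles experiencing roughly the same error. The position-domain sketch below is a conceptual illustration only (production systems such as RTK correct at the carrier-phase level, and Sapcorda's exact method isn't detailed here); all names are hypothetical.

```python
def station_correction(known_pos, measured_pos):
    """A surveyed base station computes the current GPS error offset
    as (known minus measured) in local east/north meters."""
    return (known_pos[0] - measured_pos[0],
            known_pos[1] - measured_pos[1])

def apply_correction(vehicle_measured, correction):
    """A nearby vehicle applies the broadcast offset to its own fix,
    assuming it sees approximately the same atmospheric/orbit error."""
    return (vehicle_measured[0] + correction[0],
            vehicle_measured[1] + correction[1])

# Station surveyed at (1000.0, 2000.0), but GPS reads it 1.5 m east
# and 0.8 m north of truth; a vehicle's fix shares the same error.
corr = station_correction((1000.0, 2000.0), (1001.5, 2000.8))
fixed = apply_correction((5001.5, 3000.8), corr)
```

The closer the vehicle is to the station, the better the error correlation holds, which is why such services rely on a network of stations rather than a single site.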

SLD Laser

SLD Laser's flashy CES demo is an aftermarket light bar built by Baja Designs that uses laser light rather than LEDs to increase visible distance at night off-road, similar to laser high-beams from automakers like BMW. We're more excited, though, about the company's research into laser-powered "LiFi." LiFi is essentially a cross between WiFi and fiber optics, replacing WiFi's radio signals with light beams pulsing too rapidly for the human eye to detect. A few companies are up and running with LiFi networks that replace your ceiling lights with LED LiFi broadcast/receive units and dongles for your devices, but SLD proposes replacing the LEDs with lasers, which have greater range and power. The hope is that someday, automotive-grade LiFi units could allow autonomous vehicles to communicate with one another. First, they'll have to overcome obstacles like heavy weather and the fact that LiFi requires direct line-of-sight communication to function.

