PLEASANTON, Calif.--(BUSINESS WIRE)--AEye, Inc., a robotic perception pioneer, today introduced iDAR™, a new form of intelligent data collection that enables rapid, dynamic perception and path planning. iDAR (Intelligent Detection and Ranging) combines the world’s first agile MOEMS LiDAR, pre-fused with a low-light camera and embedded artificial intelligence, creating software-definable and extensible hardware that can dynamically adapt to real-time demands. iDAR will deliver higher accuracy, longer range, and more intelligent information to optimize path planning software, enabling radically improved autonomous vehicle safety and performance at reduced cost.
AEye’s iDAR is designed to intelligently prioritize and interrogate co-located pixels (2D) and voxels (3D) within a frame, enabling the system to target and identify objects within a scene 10-20x more effectively than LiDAR-only products. Additionally, iDAR is capable of overlaying 2D images on 3D point clouds for the creation of True Color LiDAR. Its embedded AI capabilities enable iDAR to utilize thousands of existing and custom computer vision algorithms, which add intelligence that can be leveraged by path planning software. The introduction of iDAR follows AEye’s September demonstration of the first 360 degree, vehicle-mounted, solid-state LiDAR system with ranges up to 300 meters at high resolution.
“AEye’s unique architecture has allowed us to address many of the fundamental limitations of first-generation spinning or raster-scanning LiDAR technologies,” said Luis Dussan, AEye founder and CEO. “These first-generation systems silo sensors and use rigid, asymmetrical data collection that either oversamples or undersamples information. This exposes an inherent tradeoff between density and latency in legacy sensors, which restricts or eliminates the ability to do intelligent sensing. For example, while a traditional 64-line system can hit an object once per frame (every 100 ms or so), we can, with intelligent sensing, selectively revisit any chosen object twice within 30 microseconds, an improvement of roughly 3,000X. This embedded intelligence optimizes data collection, so we can transfer less data while delivering better-quality, more relevant content.”
AEye is also announcing the iDAR Development Partner Program for OEM customers, Tier 1 partners, and universities interested in integrating iDAR into their vehicles. The company will demo iDAR and announce its automotive product suite at CES 2018 in Las Vegas, January 9-12. To make an appointment for a live demonstration or to learn more about participating in AEye’s iDAR Development Partner Program, contact firstname.lastname@example.org or visit AEye at CES Booth 2506.
A shortcoming of traditional LiDAR is that most systems oversample less important information like the sky, road and trees, or undersample critical information such as a fast-approaching vehicle. They then have to spend significant processing power and time extracting critical objects like pedestrians, cyclists, cars, and animals. AEye’s iDAR technology mimics how a human’s visual cortex focuses on and evaluates potential driving hazards: it uses a distributed architecture and at-the-edge processing to dynamically track targets and objects of interest, while always critically assessing general surroundings.
“Humans have an instinctive ability to respond to visual cues. By fusing intelligence within the data collection process, iDAR takes us a step closer to this instinctive response,” said Jon Lareau, AEye’s director of software. “AEye’s iDAR is also an open and extensible platform, allowing us to integrate best-of-breed sensors to improve performance, increase redundancy, and reduce cost. Most importantly, iDAR should help our customers streamline their development process and bring better autonomous vehicles to market, faster.”
AEye’s iDAR system uses proprietary low-cost, solid-state, beam-steering 1550nm MOEMS-based LiDAR, computer vision, and embedded artificial intelligence to enable dynamic control of every co-located pixel and voxel in each frame within rapidly changing scenes. This enables path planning software to address regions and objects of interest, or to apply differentiated focus to select objects or obstacles. By adding intelligence at the sensor layer, objects of interest can be identified and tracked with minimal computational latency. For example, the system can identify objects and then revisit them even within the same frame, giving the perception and path planning layers the ability to compute quantities such as multi-directional velocity and acceleration vectors simultaneously. This allows faster, more reliable prediction of behavior and intent.
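To illustrate the idea in the paragraph above: if a sensor can revisit the same object twice within one frame, velocity and acceleration vectors fall out of simple finite differences between timestamped detections. The sketch below is a hypothetical illustration under that assumption; the class and function names are invented and do not reflect AEye's actual software.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Detection:
    """One timestamped 3D detection of a tracked object (illustrative only)."""
    t: float                        # timestamp in seconds
    pos: Tuple[float, float, float] # (x, y, z) position in meters

def velocity(a: Detection, b: Detection) -> Tuple[float, ...]:
    """First-order velocity estimate from two revisits of the same object."""
    dt = b.t - a.t
    return tuple((pb - pa) / dt for pa, pb in zip(a.pos, b.pos))

def acceleration(a: Detection, b: Detection, c: Detection) -> Tuple[float, ...]:
    """Second-order acceleration estimate from three revisits."""
    v1, v2 = velocity(a, b), velocity(b, c)
    dt = (c.t - a.t) / 2
    return tuple((x2 - x1) / dt for x1, x2 in zip(v1, v2))

# Two revisits 30 microseconds apart, the interval cited in the release:
d0 = Detection(t=0.0,   pos=(10.0,    2.0, 0.0))
d1 = Detection(t=30e-6, pos=(10.0003, 2.0, 0.0))
print(velocity(d0, d1))  # per-axis m/s
```

The point of in-frame revisits is that the two samples arrive microseconds apart, so motion estimates are available to the path planner well before the next full frame completes.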
True Color LiDAR
An additional limitation of traditional LiDAR is that it doesn’t provide sufficiently detailed information on color or text, so it can’t decipher color nuances in road striping, curbs, signage, and traffic lights. iDAR’s True Color LiDAR instantaneously overlays 2D real-world color on 3D data, adding computer vision intelligence to 3D point clouds. Traditional LiDAR-based systems have to post-process, with inherent delays and computational drain due to registration and alignment challenges. True Color LiDAR’s unique fusion enables absolute color and distance segmentation and, because pixels and voxels are co-located, requires no registration processing and incurs almost no computational penalty. This enables much greater accuracy and speed in interpreting signage, emergency warning lights, brake versus reverse lights, and other scenarios that have historically been difficult for legacy LiDAR-based systems to navigate. In addition, this approach dramatically expands the ability to do “enhanced training” for autonomous vehicles.
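As a rough illustration of overlaying 2D color on a 3D point cloud: with a camera and LiDAR that share a viewpoint, each 3D point can be projected through a pinhole camera model straight to the pixel that colors it. This is a minimal sketch under that assumption, not AEye's implementation; the function name and intrinsics parameters are invented for illustration.

```python
import numpy as np

def colorize(points, image, fx, fy, cx, cy):
    """Attach an RGB color to each 3D point via pinhole projection.

    points: (N, 3) array of (x, y, z) in the camera frame, z > 0
    image:  (H, W, 3) RGB array from the co-located camera
    fx, fy, cx, cy: pinhole intrinsics (assumed known)
    Returns an (N, 6) array of (x, y, z, r, g, b).
    """
    # Project each 3D point to pixel coordinates (u, v).
    u = (fx * points[:, 0] / points[:, 2] + cx).astype(int)
    v = (fy * points[:, 1] / points[:, 2] + cy).astype(int)
    # Clamp to the image bounds, then look up each point's color.
    h, w = image.shape[:2]
    u = np.clip(u, 0, w - 1)
    v = np.clip(v, 0, h - 1)
    colors = image[v, u]
    return np.hstack([points, colors])
```

The release's claim of "no registration processing" corresponds to the absence of any extrinsic-calibration or alignment step here: because the sensors are pre-fused and co-located, the projection alone suffices.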
Software-Definable and Extensible Hardware
A key feature of AEye’s iDAR platform is its software-definable hardware. iDAR adds three feedback loops that do not exist today: one at the sensor layer, one at the perception layer, and another with path planning software. By enabling customizable data collection in real time, the system is able to adapt to the environment and dynamically change performance based on the host application’s needs. In addition, it can emulate legacy systems, define regions of interest, focus on threat detection, or be programmed for variable environments, such as highway or city driving. This configurability leads to optimized data collection, reduced bandwidth, improved vision perception and intelligence, and faster motion planning for autonomous vehicles.
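To make "software-definable" concrete, one can imagine a scan profile that trades a coarse background sweep against high-rate revisits of declared regions of interest. The configuration below is entirely hypothetical; every field name is invented for illustration, as the release does not publish AEye's actual API.

```python
# Hypothetical scan profile for highway driving: a slow full-field sweep plus
# a high-revisit-rate region of interest ahead of the vehicle. All field
# names are illustrative, not AEye's configuration schema.
HIGHWAY_PROFILE = {
    "mode": "highway",
    "background_scan_hz": 10,    # coarse sweep of the full field of view
    "regions_of_interest": [
        {
            "azimuth_deg": (-10, 10),     # narrow cone ahead of the vehicle
            "elevation_deg": (-2, 2),
            "revisit_hz": 200,            # dense interrogation of this cone
            "priority": "threat_detection",
        },
    ],
    "emulate_legacy": False,     # True would mimic a fixed raster scan
}

def revisit_budget(profile):
    """Total extra revisits per second requested by the ROI schedule."""
    return sum(r["revisit_hz"] for r in profile["regions_of_interest"])

print(revisit_budget(HIGHWAY_PROFILE))
```

Swapping in a "city" profile with wider, lower-rate regions would change sensor behavior without any hardware modification, which is the substance of the software-definable claim.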
By designing its system from the ground up to accommodate multivariable customization through its software API, AEye optimizes functional control and puts OEMs and Tier 1s in the driver’s seat. In addition, iDAR’s system architecture allows for remote updates of the firmware and software, which enables rapid prototyping without requiring hardware modifications.
iDAR is just the latest milestone AEye has achieved in advancing next-generation vision systems. In September, the company announced completion of a live metropolitan demonstration showcasing vehicle-mounted, solid-state commercial LiDAR with 360° coverage. The demo broke new ground in enabling autonomous vehicles to collect real-time, high-density point clouds while simultaneously detecting objects up to 300 meters away. Other recent company announcements include additions to its executive team and investor base.
AEye develops advanced vision hardware, software, and algorithms that act as the eyes and visual cortex of autonomous vehicles. Since demonstrating its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Airbus Ventures, and Intel Capital. For more information, please visit www.aeye.ai.