What’s New: Mobileye, an Intel company, today showed its True Redundancy™ sensing system operating hands-free in Israel – a major milestone in preparation for the debut of its planned robotaxi services in Israel and Germany. A new, unedited video shows the vehicle operating in autonomous mode while mimicking the multi-stop behavior of a ride-hailing service with humanlike skill.
“Mobileye Drive™ with True Redundancy defies industry norms with separate sensing subsystems that act as backups to one another. The very normal way in which the vehicle navigates very complex scenarios proves the value in this approach.”
–Johann Jungwirth, vice president of mobility-as-a-service at Mobileye
What It Means: The video shows the Mobileye AV going through the motions of a robotaxi service, driving to multiple destinations and pausing where it might pick up and drop off passengers. In this fifth installment of the unedited drive series, the capabilities of True Redundancy, Mobileye’s alternative to conventional autonomous vehicle (AV) sensor fusion, are on full display as the Mobileye AV robotaxi navigates the complex streets of Jerusalem at night. While previous unedited videos showed the AV driving with the camera subsystem alone, this new installment comes from the fully configured AV that Mobileye plans to use in commercial robotaxi deployments.
How It Works: True Redundancy is Mobileye’s approach to environmental sensing in which two independent subsystems – one camera-only, the other a lidar-radar combination – each serve as a backup to the other rather than as complementary inputs to a single fused model. The result is a sensing solution believed to deliver a higher mean time between failures than a conventionally fused system. Prototype AVs now driving in Israel are Mobileye’s first to combine the two subsystems in a single vehicle, demonstrating how the robotaxi is expected to perform in real-world operations.
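For readers who think in code, the contrast with conventional sensor fusion can be summarized in a short sketch. The class and function names below (EnvironmentModel, camera_perception, lidar_radar_perception, pick_model) are hypothetical placeholders, not Mobileye software; the sketch only illustrates the idea that each subsystem produces a complete model of the scene and either one alone is enough to keep driving.

```python
# Minimal, illustrative sketch of the True Redundancy concept.
# All names here are hypothetical placeholders, not Mobileye APIs.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class EnvironmentModel:
    """A complete scene description produced by ONE subsystem on its own."""
    source: str                              # "camera" or "lidar_radar"
    objects: List[str] = field(default_factory=list)
    healthy: bool = True                     # result of the subsystem's self-check

def camera_perception(frames) -> EnvironmentModel:
    # Placeholder: a camera-only pipeline that yields a full model by itself.
    return EnvironmentModel(source="camera", objects=["car", "pedestrian"])

def lidar_radar_perception(scans) -> EnvironmentModel:
    # Placeholder: an independent lidar+radar pipeline with no camera input.
    return EnvironmentModel(source="lidar_radar", objects=["car", "pedestrian"])

def pick_model(cam: EnvironmentModel,
               lr: EnvironmentModel) -> Optional[EnvironmentModel]:
    """Either model alone is sufficient to drive on; the other is a backup.
    This differs from classic fusion, where partial sensor outputs are merged
    into a single model and one faulty input can degrade the whole picture."""
    if cam.healthy:
        return cam          # drive on the camera subsystem alone
    if lr.healthy:
        return lr           # fall back to lidar+radar alone
    return None             # both unavailable: request a minimal-risk stop

if __name__ == "__main__":
    cam = camera_perception(frames=None)
    lr = lidar_radar_perception(scans=None)
    model = pick_model(cam, lr)
    print(f"driving on: {model.source if model else 'minimal-risk maneuver'}")
```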
In the 40-minute unedited video, the Mobileye AV completes complex, real-world driving maneuvers despite harsh nighttime roadway lighting and complicated road signage. The AV’s humanlike driving comes across as remarkably unremarkable: it handles challenging maneuvers smoothly.
Making it look easy, the AV negotiates with human drivers when executing a left turn at an unprotected junction (04:04; 06:48); safely navigates around jaywalking pedestrians (08:28; 10:42); seamlessly handles illegal maneuvers by other drivers (04:34); completes a 180-degree turn in an intersection with multiple traffic signals (25:18); navigates around vehicles blocking proper use of the lane (25:39); drives through a roundabout with pedestrians present (26:44); and completes other, more routine driving maneuvers.
Why It Matters: Demonstrating True Redundancy on real roads helps dispel earlier industry skepticism about whether Mobileye’s approach to environmental sensing could work. More remarkable is the almost mundane quality of the video: the AV handles the drive more or less as a human would (and in some cases better), showing its near-readiness for planned robotaxi operations. Building on the already extensive capabilities of Mobileye’s camera-first AV development fleet, the addition of radar and lidar to the sensor suite is the final piece in achieving what the company set out to do with its differentiated AV technology.
Operationalizing the True Redundancy system is a crucial milestone toward Mobileye’s planned robotaxi services, scheduled to launch later this year in Germany and Israel. Mobileye has started the permit and regulatory approval process in both countries so it can begin removing safety drivers from vehicles operating on public roads.
More Context: Mobileye Drive – Mobileye’s self-driving system – combines Mobileye’s industry-leading technologies: Road Experience Management™, the company’s proprietary approach to mapping that leverages crowdsourced data from mass-market advanced driver-assistance systems to build AV maps quickly; the Responsibility-Sensitive Safety (RSS) driving policy, a mathematical model that enhances safety and adapts to the conventions of each driving environment; and True Redundancy, which pairs two independent perception subsystems – one powered by cameras, the other by radar and lidar – each capable on its own of building a complete environmental model and ultimately supporting full end-to-end autonomous capabilities.
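As one concrete illustration of the kind of rule RSS formalizes, Mobileye’s published RSS papers define a minimum safe longitudinal distance that a following vehicle must maintain. The sketch below renders that published formula in code; the function name, parameter names, and example values are illustrative, not Mobileye’s production settings.

```python
# Illustrative rendering of the RSS minimum safe longitudinal following
# distance, as described in Mobileye's published RSS papers. Names and the
# example values are illustrative only, not production parameters.
def rss_min_safe_distance(v_rear: float, v_front: float,
                          response_time: float,
                          a_max_accel: float,
                          a_min_brake_rear: float,
                          a_max_brake_front: float) -> float:
    """Distance (m) the rear vehicle must keep so that, even if it accelerates
    for the full response time and then brakes gently while the lead vehicle
    brakes hard, no collision occurs. Speeds in m/s, accelerations in m/s^2."""
    v_after_response = v_rear + response_time * a_max_accel
    d = (v_rear * response_time
         + 0.5 * a_max_accel * response_time ** 2
         + v_after_response ** 2 / (2 * a_min_brake_rear)
         - v_front ** 2 / (2 * a_max_brake_front))
    return max(d, 0.0)

# Example: following a 50 km/h lead car at 50 km/h with a 0.5 s response time.
print(rss_min_safe_distance(13.9, 13.9, 0.5, 3.0, 4.0, 8.0))  # ~24.9 m
```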
SOURCE: Intel