
Mobileye demos self-driving car that uses cameras to get around

Mobileye, Intel’s driverless vehicle R&D division, today published a 40-minute video of one of its cars navigating a 160-mile stretch of Jerusalem streets. The video features top-down footage captured by a drone as well as an in-cabin camera recording, shown alongside an overlay of the perception system’s inputs and predictions. The perception system was introduced at the 2020 Consumer Electronics Show and features 12 cameras but no radar, lidar, or other sensors. Eight of those cameras have long-range lenses, while four serve as “parking cameras,” and all 12 feed into a compute system built atop dual 7-nanometer data-fusing, decision-making Mobileye EyeQ5 chips.
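For a sense of how such a rig might be described in software, here is a minimal, hypothetical configuration sketch reflecting the setup above (12 cameras split into eight long-range and four parking cameras, feeding two EyeQ5-class chips); the class and field names are assumptions for illustration, not Mobileye’s actual interface.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Camera:
    name: str   # e.g. "front_main" -- illustrative only
    role: str   # "long_range" or "parking"
    mount: str  # rough mounting position on the vehicle

@dataclass
class CameraRig:
    cameras: List[Camera]
    # Two data-fusing, decision-making SoCs, as described in the article.
    compute: List[str] = field(default_factory=lambda: ["EyeQ5_A", "EyeQ5_B"])

    def validate(self) -> None:
        # Sanity-check the 8 long-range + 4 parking split described above.
        assert len(self.cameras) == 12
        assert sum(c.role == "long_range" for c in self.cameras) == 8
        assert sum(c.role == "parking" for c in self.cameras) == 4
```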

Running on the compute system is an algorithm tuned to identify wheels and infer vehicle locations, along with an algorithm that identifies open, closed, and partially open car doors. A third algorithm compares images from the cameras to infer a distance for each pixel and generate a three-dimensional point cloud, which Mobileye’s software uses to identify objects in the scene. A fourth algorithm identifies pixels corresponding to drivable roadway; detected objects then pass to a suite of four algorithms that attempt to place them in space.
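As a rough illustration of the depth step, the sketch below back-projects a per-pixel depth estimate into a camera-frame point cloud using a standard pinhole model; the function name, intrinsics, and structure are assumptions for illustration, not Mobileye’s implementation.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a per-pixel depth map (meters) into camera-frame 3D points.

    depth: (H, W) array of estimated distances along the optical axis.
    fx, fy, cx, cy: pinhole intrinsics of the camera that produced the image.
    Returns an (N, 3) array of [X, Y, Z] points for pixels with valid depth.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth estimate
```

A downstream stage could then cluster this cloud into objects and test them against the drivable-roadway mask.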

The Mobileye car merges into traffic during the video, detecting vehicles traveling at speeds upwards of 56 miles per hour before veering to avoid a heavy construction zone in the right lane. The vehicle detects stationary cars, forklifts, and large trucks, yielding to a crossing pedestrian and one who hesitates before deciding not to cross the road.

Early on, Mobileye’s car changes lanes to avoid a row of stationary cars, and it’s shown slowing as it approaches a trailer protruding slightly into its lane. Later it nudges its way into a roundabout, switching lanes after spotting a less congested route. A safety driver behind the wheel takes over briefly, but only so that the drone’s battery can be replaced. Even when confronted with scenarios like a car parked in the middle of the road, Mobileye’s system attempts (and manages) to squeeze by, recognizing when the driver-side door is ajar.


“The car needs to balance agility with safety and does so using the RSS framework. The streets of Jerusalem are notoriously challenging, as other road users tend to be very assertive, adding significant challenge [for] the decision-making module of the robotic driver,” Mobileye CEO Amnon Shashua said in a statement. “The problem we aim to solve is scale. The true promise of [autonomous vehicles] can only materialize at scale — first as a means for ride-sharing via robo-shuttles and later as passenger cars offered to consumers. The challenges to support AVs at scale center around cost, proliferation of HD-maps, and safety.”

Mobileye has previously demonstrated that its perception system can detect traffic lights and signs, enabling it to handle intersections fully autonomously. But it also relies on high-definition maps of transportation lines, light rail lines, and roads themselves captured by the company’s Road Experience Management (REM) technology.

At a high level, REM is an end-to-end mapping and localization engine comprising three layers:

  • “Harvesting” agents, the Mobileye-supplied advanced driver assistance systems (ADAS) embedded in vehicles from automakers who agree to share data with the company, including Volkswagen, BMW, and Nissan. (Mobileye powered ADAS in 300 car models across 27 OEM partners as of November 2019.) The systems collect and transmit information about driving path geometries and stationary landmarks around them, leveraging real-time geometrical and semantic analysis to compress map-relevant information to less than 10KB per kilometer (0.62 miles), on average.
  • Capsules called Road Segment Data (RSD) into which the map data is packed before it’s sent to the cloud for processing, where it’s aggregated and reconciled into a map called a “Roadbook.” Mobileye collects 3.7 million miles of sensor data from vehicles on roads every day and draws on publicly available geospatial corpora like Ordnance Survey’s OS MasterMap. The company expects to have more than 1 million vehicles in its European fleet by the end of 2020 and 1 million U.S. vehicles in 2021.
  • Software running within cars — eventually including cars from Mobileye’s aforementioned data-sharing partners — that automatically localizes within the Roadbook via real-time detection of landmarks stored within it. Mobileye has nearly all of Europe mapped and anticipates it will fully map the U.S. sometime later this year.
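The localization step described in the last item boils down to associating landmarks detected by the onboard cameras with landmarks stored in the Roadbook and correcting the vehicle’s position estimate accordingly. The following is a minimal, hypothetical sketch of that idea using nearest-neighbor matching and an averaged offset; none of the names or thresholds correspond to Mobileye’s actual software.

```python
import numpy as np

def localize_against_roadbook(detected, map_landmarks, prior_pose, max_match_dist=5.0):
    """Toy landmark-based localization in 2D.

    detected: (N, 2) landmark positions, already transformed by the prior pose.
    map_landmarks: (M, 2) landmark positions from the Roadbook tile.
    prior_pose: rough [x, y] prior from GPS/odometry.
    Returns a refined [x, y] estimate.
    """
    offsets = []
    for d in detected:
        dists = np.linalg.norm(map_landmarks - d, axis=1)
        j = np.argmin(dists)
        if dists[j] < max_match_dist:             # accept only plausible matches
            offsets.append(map_landmarks[j] - d)  # how far off the prior was
    if not offsets:
        return np.asarray(prior_pose)             # no matches: fall back to the prior
    correction = np.mean(offsets, axis=0)
    return np.asarray(prior_pose) + correction
```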

Mobileye, which Intel paid $15.3 billion to acquire in March 2017, is building two independent self-driving systems. One, like the system demoed in the video, is based entirely on cameras, while the second incorporates radar, lidar sensors, modems, GPS, and other components. Both confer the full benefits of Mobileye’s Responsibility-Sensitive Safety (RSS) model, an open policy that imposes “common sense” constraints on the decisions driverless vehicles make, and Mobileye says the latter should be able to travel roughly 100 million hours without a crash.
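RSS is specified in a published paper by Shashua and colleagues; its core longitudinal rule derives a minimum following distance from a response time and assumed acceleration and braking bounds. The sketch below implements that published formula; the default parameter values are illustrative, not Mobileye’s calibration.

```python
def rss_min_following_distance(v_rear, v_front, response_time=0.5,
                               a_max_accel=3.0, a_min_brake=4.0, a_max_brake=8.0):
    """Minimum safe longitudinal gap (meters) per the published RSS rule.

    v_rear, v_front: speeds of the following and lead vehicles (m/s).
    response_time: worst-case reaction delay of the rear vehicle (s).
    a_max_accel: max acceleration the rear vehicle might apply during that delay.
    a_min_brake: minimum braking the rear vehicle is guaranteed to apply afterward.
    a_max_brake: maximum braking the lead vehicle might apply.
    Default values are illustrative, not Mobileye's calibration.
    """
    v_rear_after = v_rear + response_time * a_max_accel
    d = (v_rear * response_time
         + 0.5 * a_max_accel * response_time ** 2
         + v_rear_after ** 2 / (2 * a_min_brake)
         - v_front ** 2 / (2 * a_max_brake))
    return max(d, 0.0)
```

With both vehicles traveling at 56 mph (about 25 m/s), these illustrative parameters yield a required gap of roughly 60 meters.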

Mobileye is aiming to deploy robo-taxi fleets in three major cities — Tel Aviv; Paris; and Daegu City, South Korea — by 2022, with the hardware cost per robo-taxi coming in at around $10,000 to $15,000 per vehicle. (By 2025, Mobileye is aiming to bring the cost of a self-driving system below $5,000.) In the interim, the plan is to deploy dozens of vehicles with unrestricted travel between destinations in Israel ahead of a rollout across the country, potentially alongside the launch of a China-based service in partnership with Beijing Public Transport Corporation and Beijing Beytai.

Beyond Mobileye, a number of companies are developing autonomous vehicle systems that lean heavily (or exclusively) on cameras for routing. There’s Wayve, a U.K.-based startup that trains self-driving models solely in simulation, and Comma.ai, which sells an aftermarket self-driving kit to retrofit existing cars. And then there’s Tesla, which recently released a preview of an active guidance system that navigates a car from a highway on-ramp to off-ramp, including interchanges and lane changes. Like Mobileye, Tesla leverages a fleet of hundreds of thousands of sensor-equipped cars to collect data for analysis, which it uses to train, develop, and refine algorithms in the cloud that are then sent via over-the-air updates to those vehicles.

