
DeepMap builds high-definition maps for autonomous vehicles. | Photo credit: Nvidia

Nvidia is adding high-definition mapping to its autonomous driving arsenal with this week’s announcement of its intention to acquire DeepMap. Founded in 2016, DeepMap is a startup dedicated to building high-definition maps that let autonomous vehicles navigate safely anywhere in the world. The acquisition deepens chipmaker Nvidia’s autonomous-driving technology stack. The value of the transaction was not disclosed; it is expected to close in the third quarter of 2021.

Map accuracy from DeepMap

DeepMap, based in Silicon Valley, was founded by James Wu and Mark Wheeler, veterans of Google, Apple and Baidu. The company is working to solve the accuracy problems of maps for autonomous vehicles, and to provide highly accurate, real-time localization for those vehicles. Every autonomous vehicle developer is wrestling with these two problems.

Autonomous vehicles require centimeter-level accuracy, and DeepMap is building mapping technology to meet that requirement. It is also developing the infrastructure to deliver maps as a service, so that map data can be continuously updated and streamed to vehicles. That requires a cloud-based distribution network capable of pushing data to hundreds of millions of endpoints (i.e., vehicles).
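To make the maps-as-a-service idea concrete, here is a minimal sketch of how a vehicle client might poll a cloud map service for updated HD map tiles around its current position. This is not DeepMap’s actual API; the endpoint, tile scheme and field names are assumptions for illustration only.

```python
import requests  # generic HTTP client; any would do

MAP_SERVICE_URL = "https://maps.example.com/v1/tiles"  # hypothetical endpoint

def fetch_tile_updates(lat, lon, current_versions):
    """Ask the (hypothetical) map service for tiles near (lat, lon) that are
    newer than the versions the vehicle already has cached locally."""
    response = requests.get(
        MAP_SERVICE_URL,
        params={"lat": lat, "lon": lon, "radius_m": 2000},
        timeout=2.0,  # a vehicle cannot block for long waiting on map data
    )
    response.raise_for_status()
    updates = []
    for tile in response.json()["tiles"]:
        # Only download tiles whose version is newer than what is cached.
        if tile["version"] > current_versions.get(tile["id"], -1):
            updates.append(tile)
    return updates
```

At fleet scale a real service would more likely push deltas to vehicles rather than have every vehicle poll, but the version-per-tile bookkeeping is the same idea either way.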

Nvidia’s primary product for autonomous driving is its Nvidia Drive platform. As a chipmaker, Nvidia’s ultimate goal is to sell more silicon, and its processing platforms are uniquely suited to the artificial intelligence (AI) workflows and machine learning (ML) models behind autonomous driving. Perception for autonomous driving is one of the priority areas where AI/ML technology is deployed today and where existing Nvidia hardware already supports the computational requirements.

High level view of the neighborhood

A DeepMap map of San Jose, California, depicting highly detailed features of the road and the surrounding block, including a reliable semantic data layer with key features such as navigable boundaries, lane boundaries, intersections, traffic signals and traffic signs, explicit and implicit yield lines, and lane connections.

“The acquisition is an endorsement of DeepMap’s unique vision, technology and people,” said Ali Kani, vice president of Automotive at Nvidia. “DeepMap is expected to extend our mapping products, help us scale our map operations worldwide and expand our full self-driving expertise.”

“NVIDIA is an amazing, world-changing company that shares our vision of accelerating safe autonomy,” said James Wu, founder and CEO of DeepMap. “By joining forces with NVIDIA, our technology can scale faster and benefit more people sooner. We look forward to continuing our journey as part of the NVIDIA team.”

Autonomous car positioning is DeepMap’s core technology

Mapping and localization go hand in hand for every autonomous vehicle, from aerial drones to autonomous mobile robots to self-driving cars. DeepMap has invested heavily in improving the accuracy of AV localization. More accurate localization requires more accurate maps; conversely, more accurate maps make localization easier and more precise. Both technology areas are therefore part of DeepMap’s product portfolio.
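As a toy illustration of why map accuracy and localization accuracy are coupled, the sketch below scores candidate vehicle poses by how well observed lane-marking points line up with points stored in an HD map. This is an illustrative assumption, not DeepMap’s algorithm; the map points and the brute-force search are placeholders for real scan matching, and coarser map points directly degrade the best pose estimate it can find.

```python
# Hypothetical HD-map lane-marking points in a local frame (metres).
MAP_POINTS = [(0.0, 3.5), (5.0, 3.5), (10.0, 3.5),
              (0.0, -3.5), (5.0, -3.5), (10.0, -3.5)]

def score_pose(observed_points, dx, dy):
    """Sum of squared distances from each observed point (shifted by the
    candidate offset) to its nearest map point; lower is a better fit."""
    total = 0.0
    for ox, oy in observed_points:
        px, py = ox + dx, oy + dy
        total += min((px - mx) ** 2 + (py - my) ** 2 for mx, my in MAP_POINTS)
    return total

def localize(observed_points, search=1.0, step=0.05):
    """Brute-force search over small x/y offsets around the assumed pose;
    a stand-in for real scan matching against the HD map."""
    best, best_score = (0.0, 0.0), float("inf")
    steps = int(search / step)
    for i in range(-steps, steps + 1):
        for j in range(-steps, steps + 1):
            dx, dy = i * step, j * step
            s = score_pose(observed_points, dx, dy)
            if s < best_score:
                best_score, best = s, (dx, dy)
    return best  # estimated correction to the vehicle's assumed position
```

If the stored map points are only accurate to a metre, no amount of searching recovers a centimetre-level pose, which is the coupling the paragraph above describes.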

In the latest episode of The Robot Report Podcast, we discussed what can happen to an autonomous vehicle when it is confused by a change to its expected route. Construction zones, accidents and traffic are just some of the events that can affect an AV’s world. If AVs shared a common map and could update it in real time, vehicles could anticipate problems and avoid them: a work zone or road closure spotted by one vehicle could be folded into the shared world map for the benefit of all.
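A rough sketch of that shared-map idea follows. The event format and merge rule are assumptions, not DeepMap’s protocol: one vehicle that detects a work zone publishes a map-change event, and every subscriber folds the newest version of that change into its local copy of the map before planning.

```python
from dataclasses import dataclass, field

@dataclass
class MapChange:
    tile_id: str    # which map tile the change applies to
    kind: str       # e.g. "work_zone" or "road_closure"
    geometry: list  # (lat, lon) points outlining the affected area
    version: int    # monotonically increasing, so stale events can be ignored

@dataclass
class LocalMap:
    changes: dict = field(default_factory=dict)

    def apply(self, change: MapChange) -> bool:
        """Merge a shared change into the vehicle's local map, keeping only
        the newest version seen for each (tile, kind) pair."""
        key = (change.tile_id, change.kind)
        current = self.changes.get(key)
        if current is None or change.version > current.version:
            self.changes[key] = change
            return True   # planner should re-route around the new obstacle
        return False      # stale or duplicate event; nothing to do

# One vehicle reports a work zone; every other vehicle applies the same event.
event = MapChange("tile_42", "work_zone",
                  [(37.33, -121.89), (37.34, -121.88)], version=7)
local = LocalMap()
local.apply(event)
```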

With the acquisition, DeepMap gains access to Nvidia’s enormous resources and channel strength. That will likely accelerate the productization of DeepMap’s technology, but we will have to wait and see.
