“We envision a future society where autonomous vehicles whisk people safely and comfortably around beautiful cities,” said Jensen Huang, chief executive officer and founder of NVIDIA. “The development of a self-driving car is one of the greatest technical challenges that’s ever been tackled.”
According to Huang, Nvidia’s reason for existence is to find life after Moore’s law, the observation that the number of transistors in a dense integrated circuit doubles approximately every two years.
NVIDIA announced that it is collaborating with Toyota to power the brains of autonomous vehicles by delivering artificial intelligence hardware and software technologies. Toyota will use Nvidia’s DRIVE PX AI car computer platform to make sense of the massive volume of data generated by its sensors and to handle the broad spectrum of autonomous driving situations.
Engineering teams from Nvidia and Toyota Research Development are already developing sophisticated software on NVIDIA’s high-performance artificial intelligence platform.
The GTC conference is a must for everyone in the artificial intelligence arena; ZF, Bosch, ten of the top auto companies, fifteen of the top tech companies, twenty-five virtual reality startups, and a host of AI startups were there. GTC is where the future is invented; it is the intersection of art, science, and engineering.
“Toyota has worked on autonomous driving technologies for over 20 years with the aim of reducing traffic fatalities to zero as an ultimate goal, achieving smoother traffic, and providing mobility for all,” said Ken Koibuchi, executive general manager at Toyota. “Through this collaboration, we intend to accelerate the development of autonomous driving systems that are even more safe and capable.”
Creating data processors small enough to fit in cars and work together seamlessly is the Holy Grail of autonomous vehicles. Nvidia is agnostic when it comes to working with groups that will combine the sensors, cameras, and mapping data that make cars aware of their surroundings. Nvidia is incorporating Drive PX, equipped with the next-generation Xavier processor, into Toyota cars. Drive PX is a supercomputer small enough to fit in the palm of your hand while delivering 30 trillion deep learning operations per second.
Nvidia says the system can then use AI to understand the 360-degree environment surrounding the car, localize itself on an HD map, and anticipate potential hazards while driving. The system software also receives updates over the air, so the car can become smarter and smarter over time.
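Localizing on an HD map means snapping a noisy position estimate to known map geometry. The sketch below illustrates the simplest possible version of that idea: nearest-waypoint matching. The map data and function names are hypothetical; a production system would match against rich lane-level geometry, not bare points.

```python
import math

# Hypothetical HD map: a list of (x, y) waypoints in meters. In a real
# system these would carry lane geometry, signs, and semantic labels.
HD_MAP = [(0.0, 0.0), (5.0, 0.1), (10.0, 0.3), (15.0, 0.2)]

def localize(position, hd_map):
    """Return the index of the map waypoint nearest to a noisy
    position estimate (e.g. a fused GPS/IMU fix)."""
    return min(range(len(hd_map)),
               key=lambda i: math.dist(position, hd_map[i]))

# A noisy fix near the third waypoint snaps to index 2.
print(localize((9.6, 0.5), HD_MAP))  # → 2
```

Even this toy version shows why HD maps matter: the map, not the sensor, supplies the centimeter-level ground truth the car aligns itself against.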
At GTC17, NASA showed off an artificial intelligence-driven virtual reality simulation of being in space. During the GPU Technology Conference keynote, Nvidia CEO Jensen Huang demonstrated a project known as Holodeck. If you’ve watched Star Trek, you know what a Holodeck simulation looks like from the television show.
Christian von Koenigsegg, the founder of the supercar manufacturer Koenigsegg Automotive AB, joined Huang on the keynote stage via NVIDIA’s Project Holodeck collaborative VR environment. Koenigsegg and three other people entered the Holodeck and showed off the company’s latest hypercar, the Regera.
The audience watched, and clapped, as Huang and Koenigsegg went inside the all-carbon-fibre $1.9 million Koenigsegg Regera supercar. They watched engineers explore the car at scale and in full 3D visual fidelity. I have been to design studios around the world where variations of this concept were being used to consult on design changes in real time.
Nvidia showed us a group of sensors working inside a computing system via a gaming video. The video had everything needed to hold a gamer’s attention, but beneath it were the sensors that told the sharpshooter where the water was; he could even see his reflection in it. We talked about how close sensors have come to animal senses: the ability to see in the dark like a raccoon, or the whiskers that let a cat feel something before it touches its body.
At CES 2017, Nvidia showed off its self-driving car, affectionately called BB8. In addition to PilotNet, which handles steering, BB8 uses LaneNet for detecting lane markings, DriveNet for detecting vehicles, pedestrians, and signs, and OpenRoadNet for detecting the drivable area in front of the car. These are some of the capabilities brought together in the Drive PX technology.
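The division of labor among those networks implies a fusion step: somewhere, lane, obstacle, and free-space outputs must be combined into a driving decision. The sketch below illustrates that idea in miniature; every name, threshold, and data structure here is an illustrative assumption, not Nvidia’s actual API.

```python
# Hypothetical fusion of perception outputs, in the spirit of BB8's
# LaneNet / DriveNet / OpenRoadNet / PilotNet split. Purely illustrative.

def plan_action(lanes, obstacles, drivable_area, steering_angle):
    """Combine perception outputs into a simple driving decision."""
    # DriveNet-style output: obstacles as (distance_m, label) pairs.
    nearest = min((d for d, _ in obstacles), default=float("inf"))
    if nearest < 10.0:          # hard-coded safety margin for the sketch
        return "brake"
    # OpenRoadNet-style output: fraction of the view that is drivable road.
    if drivable_area < 0.2:
        return "stop"
    # LaneNet-style output: lane markings found, follow PilotNet's steering.
    if lanes:
        return f"steer {steering_angle:+.1f} deg"
    return "slow"

print(plan_action(lanes=["left", "right"], obstacles=[(42.0, "car")],
                  drivable_area=0.6, steering_angle=-2.5))  # → steer -2.5 deg
```

The real systems are vastly more sophisticated, but the architectural point survives: specialized networks each answer one question, and a planner arbitrates among them.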
Part geek, part Monster Jam, people were climbing all over the Peterbilt truck that was center stage in the Nvidia booth. Nvidia collaborated with PACCAR, which manufactures the Kenworth, Peterbilt, and DAF lines of trucks, to develop a proof-of-concept self-driving truck with SAE Level 4 capability built on NVIDIA DRIVE PX 2 technology, trained on deep neural networks.
It raises the question of whether Toyota will collaborate with Nvidia to bring out an autonomous hydrogen fleet of trucks once 5G is available. Continental has said that autonomous vehicles will be ready after 5G wireless technology is rolled out.
Nvidia has quietly amassed the respect of the car industry. Udacity, Audi, Tesla, and Toyota are a few of the partners shown at GTC17. Originally demoed at CES 2017 driving itself around a changing course, the Audi Q7 concept integrates BB8 technology for end-to-end deep learning.
Nvidia described the AutonomouStuff car as essentially a DRIVE PX 2 on wheels. The car comes configured with Nvidia’s AI supercomputer, loaded with NVIDIA DriveWorks (perhaps a play on DreamWorks?), and pre-wired for sensors.
Expect to see more autonomous stuff: more discussion of the Nanodegree, of cloud-to-car platforms for self-driving vehicles, and of techniques for accelerating computationally intensive algorithms for object detection, map localization, and path planning.
We can’t keep up with the pace of change, let alone get ahead of it, if all parts of the equation do not accelerate together. At the heart of accelerating every aspect of the autonomous driving architecture is the ability to recognize and handle the nearly infinite number of scenarios encountered on the road.