I wrote a post for the NVIDIA Developer Blog on how we implemented real-time autonomous vehicle perception using the specialized hardware on the NVIDIA DRIVE AGX Self-Driving Computer. The AGX is powered by Xavier, NVIDIA's in-house ARM SoC, which includes a Volta-based GPU, two programmable vision accelerators (used for tasks like edge detection and epipolar geometry), and two deep learning accelerators (fixed-function neural network accelerators), all sharing a unified memory architecture. We show how you can easily span all of these different types of accelerators simultaneously to build a real-time perception pipeline.
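
To give a flavor of what "spanning accelerators" looks like, here is a minimal, hypothetical sketch using plain CUDA streams to keep two pipeline stages for two frames in flight concurrently. This is not the DRIVE OS API from the post: on the real hardware, work for the vision accelerators and deep learning accelerators is dispatched through their own interfaces, and the `preprocess`/`detect` kernels below are toy stand-ins for those stages. The sketch only illustrates the scheduling idea, with managed memory standing in for Xavier's unified memory.

```cpp
// Hypothetical sketch, NOT the DRIVE OS API: independent CUDA streams let
// stages of a perception pipeline overlap, the same way Xavier's GPU, PVA,
// and DLA engines can each work on a different frame at the same time.

#include <cuda_runtime.h>
#include <cstdio>

__global__ void preprocess(float* img, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) img[i] *= (1.0f / 255.0f);  // toy normalization stage
}

__global__ void detect(const float* img, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = img[i] > 0.5f ? 1.0f : 0.0f;  // toy "inference" stage
}

int main() {
    const int n = 1 << 20;
    float *frameA, *frameB, *outA, *outB;
    // Managed memory: visible to CPU and GPU without explicit copies,
    // standing in for the SoC's unified memory architecture.
    cudaMallocManaged(&frameA, n * sizeof(float));
    cudaMallocManaged(&frameB, n * sizeof(float));
    cudaMallocManaged(&outA, n * sizeof(float));
    cudaMallocManaged(&outB, n * sizeof(float));

    cudaStream_t s0, s1;
    cudaStreamCreate(&s0);
    cudaStreamCreate(&s1);

    dim3 block(256), grid((n + 255) / 256);
    // Two frames in flight: each stream runs its stages in order, but the
    // streams are independent, so frame B's preprocessing can overlap
    // frame A's detection.
    preprocess<<<grid, block, 0, s0>>>(frameA, n);
    detect<<<grid, block, 0, s0>>>(frameA, outA, n);
    preprocess<<<grid, block, 0, s1>>>(frameB, n);
    detect<<<grid, block, 0, s1>>>(frameB, outB, n);

    cudaStreamSynchronize(s0);
    cudaStreamSynchronize(s1);
    printf("two frames processed concurrently\n");

    cudaStreamDestroy(s0);
    cudaStreamDestroy(s1);
    cudaFree(frameA); cudaFree(frameB);
    cudaFree(outA);   cudaFree(outB);
    return 0;
}
```

The design point is the same one the post makes at the hardware level: as long as each stage's output feeds only the next stage for the same frame, different engines (or here, streams) can run different frames' stages in parallel, which is what makes the pipeline real time.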