11:45 | 12:25
Keywords defining the session:
- Deep Learning
- Deployment to embedded systems
Takeaway points of the session:
- A workflow for Deep Learning: from data preparation to deployment
- Deep Learning plays a major role in ADAS: image classification, object detection and localization, image and LiDAR segmentation
Self-driving cars, voice assistants, autonomous robots, smart devices… Autonomous systems are reaching into and changing every part of our lives, and Deep Learning is the technology behind that change. Advanced levels of perception, enabled by Deep Learning, are key to the success of automated driving, from advanced driver assistance systems (ADAS) to fully autonomous driving.
Designing and deploying deep learning applications on embedded CPU and GPU platforms (as is the case with cars) is challenging because of the resource constraints inherent in embedded devices. In this session, you will be exposed to some of the most relevant and stimulating real-world problems in ADAS, focusing on the role played by Deep Neural Networks (DNNs): image classification, object detection and localization, semantic segmentation, and deployment. You will get a walkthrough of a complete workflow: preparing and visualizing data, including accurate ground-truth labeling of datasets and driving scenes; creating or fine-tuning DNNs; training those networks on NVIDIA GPUs to build automated driving capabilities; and generating portable, optimized CUDA code that can be deployed on boards such as the NVIDIA Jetson TX2 and NVIDIA DRIVE PX, leveraging TensorRT for very fast inference.
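To make the "create or fine-tune, then train" step of the workflow concrete, here is a minimal sketch of a supervised training loop. It is an illustration only, not the session's actual tooling: it trains a tiny fully connected network on synthetic 2-D data with NumPy, and every name, layer size, and hyperparameter in it is an assumption chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a labeled dataset:
# 2-D points, labeled 1 if inside the unit circle, else 0.
X = rng.normal(size=(512, 2))
y = (np.linalg.norm(X, axis=1) < 1.0).astype(float).reshape(-1, 1)

# Tiny fully connected network: 2 -> 16 -> 1 (illustrative sizes).
W1 = rng.normal(scale=0.5, size=(2, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1))
b2 = np.zeros(1)

def forward(X):
    h = np.maximum(X @ W1 + b1, 0.0)          # ReLU hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return h, p

lr = 0.5
for step in range(500):
    h, p = forward(X)
    # Backpropagation for sigmoid + binary cross-entropy.
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T
    dh[h <= 0] = 0.0                           # ReLU gradient mask
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

_, p = forward(X)
accuracy = float(np.mean((p > 0.5) == y))
```

In a real ADAS pipeline the same loop shape applies, but the toy network is replaced by a large classification, detection, or segmentation model, and the trained weights are then handed to a code-generation and optimization stage (CUDA plus TensorRT) for deployment on the embedded target.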