Many companies developing self-driving platforms (for automotive and drones) rely on open source software and interfaces. Here is a list of the six most used platforms, selected for their reliability and usability. The entries are in alphabetical order, not in order of relevance:
Here are some useful resources for learning to code for self-driving vehicles.
Apollo provides an open, reliable, and secure software platform that partners can use, together with reference on-vehicle and hardware platforms, to develop their own autonomous driving systems.
With Apollo, you get:
- A world leading HD map service
- The only open Autonomous Driving simulation engine
- End-to-End, a deep learning algorithm.
Apollo accelerates the development, testing, and deployment of Autonomous Vehicles. As participation grows, more accumulated data becomes available.
Compared to a closed ecosystem, Apollo can evolve faster, bring greater benefits to members, and continually grow.
Apollo Road Map
- 2018: Production-level autonomous driving in closed venues
- 2019: Production-level autonomous driving on geo-fenced city roads
- 2020: Production-level autonomous driving on simple city roads
- 2021: Autonomous driving on highways and city roads
Various sensors, such as LiDAR, cameras, and radar, collect environmental data surrounding the vehicle. Using sensor fusion technology, perception algorithms can determine in real time the type, location, velocity, and orientation of objects on the road.
This autonomous perception system is backed by Baidu’s big data and deep learning technologies, as well as a vast collection of labeled real-world driving data. The large-scale deep-learning platform and GPU clusters drastically shorten the training and iteration cycle of the perception models.
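The core idea of sensor fusion is that combining estimates from independent sensors yields a more confident estimate than any single sensor. The sketch below is a minimal illustration of that principle (it is not Apollo's actual code): it fuses a noisy radar range with a more precise lidar range, each modeled as a Gaussian estimate.

```python
# Minimal sensor-fusion illustration (not Apollo's code): fuse two
# independent Gaussian estimates of the same quantity.  The fused
# variance is always smaller than either input variance.

def fuse(mean_a, var_a, mean_b, var_b):
    """Optimally combine two independent Gaussian estimates."""
    w = var_b / (var_a + var_b)           # weight on sensor A
    mean = w * mean_a + (1.0 - w) * mean_b
    var = (var_a * var_b) / (var_a + var_b)
    return mean, var

# Radar says the obstacle is 50.2 m away (variance 4.0 m^2);
# lidar says 49.8 m (variance 0.25 m^2).  The fused estimate sits
# close to the lidar value but is tightened by the radar reading.
mean, var = fuse(50.2, 4.0, 49.8, 0.25)
```

The same weighting rule generalizes to the multi-dimensional case (type, position, velocity, orientation) via Kalman-filter updates.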
Simulation provides the ability to virtually drive millions of kilometers daily using an array of real world traffic and autonomous driving data. Through the simulation service, partners gain access to a large number of autonomous driving scenes to quickly test, validate, and optimize models with comprehensive coverage in a way that is safe and efficient.
HD Map and Localization
Baidu pioneered the extensive application of deep learning and artificial intelligence technology to map creation and is one of the few Chinese firms capable of producing HD mapping data on a large scale.
The localization system is a comprehensive positioning solution with centimeter level accuracy based on GPS, IMU, HD map, and a variety of sensor inputs.
Developers can tune precision to the requirements of different usage scenarios and thereby minimize costs.
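A common pattern behind such GPS/IMU/HD-map localization is dead reckoning between absolute fixes: integrate the IMU at high rate, then blend the accumulated drift back toward the next GPS (or map-matched) position. The function names below are illustrative, not Apollo's API.

```python
# Hypothetical 1-D localization sketch (illustrative names, not Apollo's
# interface): propagate position with the IMU at 100 Hz, then correct
# toward the GPS fix when one arrives.

def dead_reckon(pos, velocity, accel, dt):
    """Integrate IMU acceleration to propagate velocity and position."""
    velocity = velocity + accel * dt
    pos = pos + velocity * dt
    return pos, velocity

def gps_correct(pos, gps_pos, gain=0.8):
    """Blend the predicted position toward the absolute GPS fix."""
    return pos + gain * (gps_pos - pos)

pos, vel = 0.0, 10.0                      # driving at 10 m/s
for _ in range(10):                       # ten IMU steps at 100 Hz
    pos, vel = dead_reckon(pos, vel, accel=0.0, dt=0.01)
pos = gps_correct(pos, gps_pos=1.05)      # GPS fix arrives, drift corrected
```

Production systems replace the fixed `gain` with a Kalman gain derived from the sensors' error models, which is how centimeter-level accuracy is reached.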
End to End
End-to-End autonomous solutions are attractive because of their low cost and low engineering complexity. By using real road data, the lateral and longitudinal driving models are based entirely on deep learning. This allows for quick and efficient deployment onto autonomous test vehicles. Currently, the lateral and longitudinal model source code, together with 10,000 km of data, is available on Apollo.
Apollo vehicles are equipped with a planning system consisting of prediction, behavior, and motion logic. The planning system adapts to real-time traffic conditions, resulting in precise trajectories that are both safe and comfortable. Currently, the planning system operates on a fixed route in both day and night conditions.
The Apollo intelligent vehicle control and CAN-bus proxy modules are precise, broadly applicable, and adaptive to different environments. The modules handle different road conditions, speeds, vehicle types, and CAN bus protocols. Apollo provides waypoint-following capability with a control accuracy of ~10 cm.
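Waypoint following of this kind is typically driven by a lateral controller that steers against cross-track error (the lateral offset from the reference path, the quantity the ~10 cm accuracy refers to) and heading error. A minimal proportional sketch, not Apollo's control code:

```python
# Illustrative lateral controller (not Apollo's implementation): steering
# command from cross-track error (metres, positive = left of path) and
# heading error (radians).  Gains kp and kh are made-up example values.

def steer_command(cross_track_error, heading_error, kp=0.5, kh=1.0):
    """Return a steering angle that drives both errors toward zero."""
    return -(kp * cross_track_error + kh * heading_error)

# Vehicle is 0.3 m left of the lane centre, heading parallel to it,
# so the controller steers gently to the right (negative command).
cmd = steer_command(cross_track_error=0.3, heading_error=0.0)
```

Real stacks layer integral/derivative terms, speed scheduling, and actuator limits on top of this basic error-feedback structure.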
Open Data Platform
By opening the autonomous driving source code, capabilities, and data, Apollo forms a comprehensive “vehicle and cloud” open ecosystem. Apollo offers developers and partners lacking data and computing power an array of fast and flexible services. Through this, Apollo is building an open autonomous driving ecosystem that empowers each participant and broadens the widespread adoption of autonomous driving.
Apollo provides partners with a complete hardware reference design, including vehicle selection, key hardware components, peripherals, and a multifaceted hardware installation guide. This guide details the hardware installation process and offers a starting point for integration with software and outlines vehicle road testing.
The MAP Engine manages and protects the HD-Map data, as well as provides a unified data query interface. The MAP Engine includes core capabilities such as:
- Element retrieval
- Spatial retrieval
- Format adaptation
- Cache management
The MAP Engine provides a modular, hierarchical, cross platform, and a highly customizable programming interface.
DuerOS for Apollo
DuerOS for Apollo is based on DuerOS, the world’s first AI system dedicated to human-vehicle interaction. It provides solutions including human-vehicle dialogue, facial log-in, driver fatigue monitoring, AR navigation, intelligent security, vehicle-to-home (V2H), and personalized services and content.
Apollo offers the 4S solution – Scan, Shield, See, and Save – covering the full life-cycle of a vehicle’s cyber security needs. For Shield, Apollo’s security products, including IDPS, Car FireWall, and Secured OTA Kits, are currently deployed in mass-production vehicles to protect user privacy and vehicle information from network security breaches.
- Online training resources
- Compatible hardware
- Open Vehicle Certificate Platform
Autoware is ROS-based open-source software that enables self-driving mobility to be deployed in open city areas. It provides, but is not limited to, the following modules.
Localization is achieved with 3D maps and SLAM algorithms in combination with GNSS and IMU sensors. Detection uses cameras and LiDARs with sensor fusion algorithms and deep neural networks. Prediction and planning are based on probabilistic robotics and rule-based systems, partly using deep neural networks as well. The output of Autoware to the vehicle is a twist of velocity and angular velocity (or, equivalently, curvature). This is part of control, though the major part of control is expected to reside in the by-wire controller of the vehicle, where PID and MPC algorithms are often adopted.
All in all, Autoware provides a complete software stack for self-driving technology.
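The twist mentioned above pairs a linear velocity with an angular velocity; the two forms are interchangeable because, for a vehicle tracking a path, angular velocity equals linear velocity times path curvature. A plain-Python sketch of that conversion (not the ROS message types Autoware actually publishes):

```python
# Convert a commanded speed and path curvature into the (linear, angular)
# velocity pair that a twist command carries.  Plain Python illustration,
# not Autoware's ROS interface.

def twist_from_curvature(speed, curvature):
    """Return (linear velocity [m/s], angular velocity [rad/s])."""
    return speed, speed * curvature

# 10 m/s through a curve of radius 50 m (curvature = 1/50 per metre):
linear, angular = twist_from_curvature(10.0, 1.0 / 50.0)
```

The by-wire controller then turns this kinematic command into steering and throttle/brake actuation, which is where the PID or MPC loops live.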
- Download the latest release
- Docker environment to install Autoware
EB robinos is a comprehensive, hardware-agnostic software solution for highly automated driving systems. It lets you control and manage the increasing complexity of HAD systems and bring them to market quickly.
Highly automated driving (HAD) means many pieces of software and hardware need to work together seamlessly, efficiently, and safely. HAD requires creating, coordinating, and combining sensors, actuators, and components across all the different parties involved (carmakers, suppliers, etc.). The complexity of HAD systems is a huge challenge to the industry.
EB robinos is a software framework for automated driving that helps overcome this challenge by controlling and managing that complexity professionally. EB robinos is an application-layer functional software architecture with open interfaces and software modules. Its holistic approach provides control and management of complex HAD systems to accelerate, enhance, and optimize automated driving system development.
The functional architecture and the open interfaces enable cooperation among different parties and pave the way for future mobility. EB robinos is available for development, embedded prototyping, and for series use. With EB robinos, carmakers and Tier 1 suppliers can shift their focus from architecture effort to differentiation: bringing cars to market quickly and delighting consumers with distinctive, highly automated driving features that provide competitive advantages and strong market positions.
EB robinos is inspired by the open robinos specification which describes its architecture and interfaces. The specification is freely available. EB encourages industry companies and professionals to join this open effort to build the future of mobility.
EB robinos details
EB robinos, the comprehensive, hardware agnostic software framework for highly automated driving consists of:
- A functional software architecture with open interfaces
- EB robinos for development: EB robinos for EB Assist ADTF (free trial version available)
- EB robinos for embedded prototyping and volume production
- EB robinos software components
- Open robinos specification: Freely available platform specification that defines modules, interfaces, and control mechanisms (free download)
- Is a software framework for automated driving
- Defines an application-layer architecture for ADAS up to SAE Level 5 automated driving
- Is hardware agnostic
- Runs on Embedded Linux systems or integrates into AUTOSAR ECUs
- Provides Linux software modules for state-of-the-art HAD development and use
- Can be integrated into a central ADAS ECU as well as into distributed systems of several ADAS ECUs
- Incorporates existing or new, customer or third party subsystems
- Is available for development and prototyping in your environment or within EB Assist ADTF
- Is an integrated approach – available from development and embedded prototyping through series production
References and more details: https://www.elektrobit.com/products/eb-robinos/
Drivers benefit from cars that know what is on the road ahead. EB robinos Predictor provides highly accurate and up-to-date information about the road ahead for predictive driver assistance functions. Through electronic horizon-based driver assistance features, carmakers strengthen their position delivering safety and comfort while securing their way to automated driving.
- Map-based ADAS functions (commercial maps or community maps)
- Supports the latest ADASIS versions (v2 and v3)
- ASIL B for the reconstructor
- ADASISv3 provider based on NDS 2.5 (including lane building block 3.2)
- ADASISv2 provider based on EB internal map format
- Digital map information and GNSS data, independent of navigation system activity
- Complete range of electronic horizon development tools and target software modules
- High level of accuracy due to use of multiple sources
- Sensor information for bootstrapping maps based on machine learning technologies
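An electronic horizon provider like the one described above exposes the road attributes ahead of the vehicle by following the most probable path through the map. The toy data model below is invented for illustration (it is not the ADASIS wire format): from the current segment, follow the highest-probability successor at each junction and report speed limits along the way.

```python
# Toy electronic-horizon sketch (hypothetical data model, not ADASIS).
# Each segment carries attributes and weighted successor segments.

ROAD_GRAPH = {
    "A": {"speed_limit": 100, "next": [("B", 0.9), ("C", 0.1)]},
    "B": {"speed_limit": 80,  "next": [("D", 1.0)]},
    "C": {"speed_limit": 50,  "next": []},
    "D": {"speed_limit": 80,  "next": []},
}

def most_probable_path(start, horizon=3):
    """Follow the highest-probability successor up to `horizon` segments."""
    path, seg = [start], start
    while len(path) < horizon and ROAD_GRAPH[seg]["next"]:
        seg = max(ROAD_GRAPH[seg]["next"], key=lambda n: n[1])[0]
        path.append(seg)
    return path

def speed_limits(path):
    """Attributes along the horizon, e.g. for predictive speed control."""
    return [ROAD_GRAPH[s]["speed_limit"] for s in path]
```

A real provider additionally keeps the lower-probability branches as sub-paths and streams updates as the vehicle's map-matched position moves.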
EB robinos Predictor Eval Kit
EB robinos Predictor Eval Kit is a robust out-of-the-box ADASIS Provider for your R&D activities. By supporting latest map material (for ADASIS v2 and v3), using a GNSS receiver as positioning source and recording/replaying test-drives, the kit contains everything that is necessary to run EB robinos Predictor instantly within your development environment. The provided web interface (WebApp) lets you start using EB robinos Predictor Eval Kit without complex installation or configuration.
EB robinos Predictor Eval Kit is a Raspberry Pi device running the EB robinos Provider (ADASIS). It can be used to evaluate the capacity and performance of EB’s electronic horizon products. This Raspberry Pi platform is equipped with all that is necessary for a demonstration ECU. Major advantages of this kit are:
- Space-saving designs
- Easy integration and longevity due to high robustness (hardware & software)
- Saving time as the kit is ready for test drives
- Independent of ADASIS version (CAN-Bus and Ethernet support)
- Expandable: Add-on boards, e.g. inertial sensor boards, EB HD Positioning (Dead Reckoning)
References and download: https://www.elektrobit.com/products/eb-robinos/predictor/
Advances in automotive technology will continue to drive more computing power, more sensors, more functionality, more driver assistance, and more autonomy into cars. NVIDIA® DriveWorks is a Software Development Kit (SDK) that contains reference applications, tools, and library modules. It also includes a run-time pipeline framework that goes from detection to localization to planning to visualization. It is designed to be educational to use and open, so you can enhance it with your own code.
NVIDIA DriveWorks is available as part of the NVIDIA DRIVE Software provided to select Automakers, Tier 1 Suppliers, and Research Institutions working on developing systems that enable cars to drive themselves.
Developers working on autonomous driving will find the DriveWorks SDK helpful in accelerating their development on NVIDIA DRIVE PX platforms.
DriveWorks SDK Overview
- Detection libraries include sensor processing, sensor correlation, sensor fusion, segmentation and deep learning detection & classification
- Localization libraries include map localization, HD-Map interfacing, and egomotion (structure from motion and odometry)
- Planning libraries include vehicle control, scene understanding and path planning problem solvers
- Visualization libraries include streaming to a cluster display, ADAS rendering, and debug rendering
- Tools that include sensor calibration, rig calibration, camera intrinsics & extrinsics and lidar extrinsics
- NVIDIA DriveWorks is architected to be modular, educational, optimized and open.
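The detection → localization → planization → visualization pipeline shape described above can be pictured as four stages passing data forward. The functions below are plain-Python stand-ins, not the DriveWorks API:

```python
# Stand-in sketch of a detection -> localization -> planning ->
# visualization pipeline (illustrative only, not DriveWorks calls).

def detection(frame):
    """Pretend detector: extract obstacle ranges from a sensor frame."""
    return {"obstacles": frame["raw_obstacles"]}

def localization(detections, map_data):
    """Place detections into map coordinates using the vehicle's pose."""
    offset = map_data["origin"]
    return {"obstacles": [x + offset for x in detections["obstacles"]]}

def planning(world):
    """Pick a waypoint 5 m short of the nearest obstacle."""
    return {"waypoint": min(world["obstacles"]) - 5.0}

def visualization(plan):
    """Render the plan as a status string for a cluster display."""
    return f"waypoint at {plan['waypoint']:.1f} m"

frame = {"raw_obstacles": [20.0, 35.0]}
status = visualization(planning(localization(detection(frame),
                                             {"origin": 2.0})))
```

The modular, open design the list describes means each stage can be swapped for your own implementation while the pipeline framework keeps the data flowing between them.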
References and download: https://developer.nvidia.com/driveworks
OPENPILOT from comma.ai
How does openpilot work?
openpilot is an open source driving agent. It is capable of controlling the gas, brake, and steering on certain cars, reaching up to 6 minutes with no user action required (besides paying attention!). Let’s talk about how it works.
Putting it all together
manager.py is responsible for starting and stopping the constellation of processes when appropriate. It has two states, car stopped and car started, and runs different processes depending on what state it’s in. See service_list.yaml for a list.
controlsd is the main process that talks to the car. It is started when the car starts. The controlsd_thread function is probably a good place to start reading the code; it’s the main 100 Hz control loop.
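A 100 Hz control loop like controlsd_thread has to hold a fixed period regardless of how long each iteration's work takes. A simplified sketch of that fixed-rate pattern (the real loop also reads CAN messages and publishes actuator commands):

```python
# Simplified fixed-rate loop sketch (pattern only, not openpilot's code):
# schedule each iteration against an absolute deadline so small delays
# in one cycle don't accumulate into drift.
import time

def control_loop(n_iterations, rate_hz=100):
    interval = 1.0 / rate_hz
    next_deadline = time.monotonic()
    ticks = 0
    for _ in range(n_iterations):
        # ... read car state, run lateral/longitudinal control,
        # ... send gas/brake/steering commands
        ticks += 1
        next_deadline += interval
        time.sleep(max(0.0, next_deadline - time.monotonic()))
    return ticks

ticks = control_loop(10)   # ten cycles at 100 Hz, about 0.1 s
```

Sleeping until an absolute deadline (rather than sleeping a fixed interval after the work) is what keeps the loop's average rate pinned at 100 Hz.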
- Download openpilot: https://github.com/commaai/openpilot