rFpro launches the world’s first platform for testing and training autonomous vehicles in simulation
For the first time, autonomous vehicles can be accurately tested and trained without using real roads. The platform significantly reduces the cost and time involved in developing these vehicles and provides a completely safe testing environment. Two major vehicle manufacturers have already adopted the technology and are carrying out more than two million miles of testing per month.
UK-based driving simulation company, rFpro, has launched the world’s first commercially available platform to train and develop autonomous vehicles in simulation. Using a digital environment to accurately represent the real world, the technology enables vehicle manufacturers to test their systems in every scenario imaginable.
“Autonomous vehicles are the future, and the market is expected to be worth up to $10Tn, but debate is growing over whether these vehicles should be allowed on our roads. If not, how do we develop them?” said Chris Hoyle, rFpro technical director. “Our platform enables vehicle manufacturers to thoroughly test their technology and be absolutely confident in their systems before validation on real roads. The vehicle hardware, such as the cameras and sensors, is already approaching the level required to achieve a fully autonomous world, but it is the ‘brain’, the vehicle’s ability to make appropriate decisions, that needs to be further developed.”
The key to rFpro’s platform is the level of accuracy achieved in replicating the real world in simulation. This enables the various sensors used by autonomous vehicles to react naturally, so test results are completely representative. The company has been producing a library of real roads created through highly precise scanning technology, which forms the basis of the simulation. As it is a digital platform, users have control of all the variables, such as traffic, pedestrians, weather and location, enabling them to test every eventuality.
“By using multiple computers 24/7, manufacturers can undertake millions of miles of testing every month using our platform,” said Hoyle. “Humans can also be introduced into the simulation, controlling surrounding cars or pedestrians, so we can assess an autonomous vehicle’s decision making and also the interaction between the vehicle and the driver, but most importantly it is carried out in a safe environment.”
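The mileage claim above is easy to sanity-check. A back-of-envelope sketch follows, in which the machine count, average speed and utilisation are illustrative assumptions, not rFpro figures:

```python
# Back-of-envelope estimate of simulated test mileage per month.
# All figures below are illustrative assumptions, not rFpro's numbers.

def monthly_simulated_miles(machines, avg_speed_mph, hours_per_day, days):
    """Total simulated miles for a farm of machines running in parallel."""
    return machines * avg_speed_mph * hours_per_day * days

# e.g. 100 machines running 24/7 at an average of 30 mph over a 30-day month
miles = monthly_simulated_miles(machines=100, avg_speed_mph=30,
                                hours_per_day=24, days=30)
print(f"{miles:,} simulated miles/month")  # 2,160,000 simulated miles/month
```

Even with these modest assumed figures, a fairly small farm of machines running around the clock comfortably exceeds the two-million-mile monthly figure quoted above.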
The technology has been developed over the last three years and has already been adopted by two major vehicle manufacturers and three autonomous car developers. It is also being used by a driverless motorsport series.
“Autonomous vehicles will revolutionise road safety, far more than ABS (Anti-lock Braking System), AEB (Automatic Emergency Braking) or stability systems did before them,” said Hoyle. “They have the real potential to create a largely accident-free road network. Allowing autonomous vehicles onto the roads is an essential part of the validation process, but our platform enables all of the testing to be carried out in a completely safe environment. Further to this, it significantly reduces the cost and time required to develop these complex systems, bringing the vehicles to market sooner.”
Train, test and validate Deep Learning Autonomous Driving – rFpro
Creating digital road models of public test routes from kinetic LiDAR, followed by a live demonstration of the main features in rFpro for the training, testing and validation of Deep Learning Autonomous Driving:
- i) real road models, not synthetic data – avoid the patterns inherent in synthetic data that damage DNN training
- ii) correlation with HD Maps – map the road signs detected by your perception algorithms to HD Map ground truth
- iii) detailed road surface for radar sensor models – localisation from road-signature correlation
- iv) add traffic – swarm and programmed traffic
- v) add pedestrians – swarm and programmed pedestrians, cyclists, animals etc.
- vi) run real-time and add human test drivers – mix in stochastic, unpredictable, error-prone human behaviour live
- vii) massively parallel – multiple ego cars, multiple sensor feeds, multiple human drivers
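Item ii) above amounts to associating each sign reported by a perception stack with its ground-truth counterpart. A minimal sketch of one way to do this, using greedy nearest-neighbour matching within a distance threshold; the coordinates, labels and threshold are illustrative assumptions, not an rFpro API:

```python
import math

# Matching road signs detected by a perception stack against HD Map
# ground truth. Each sign is (x, y, label); all values are illustrative.

def match_detections(detections, ground_truth, max_dist_m=2.0):
    """Greedily match each detection to the nearest unmatched ground-truth
    sign within max_dist_m; return (matches, unmatched_detections)."""
    unmatched = list(ground_truth)
    matches, misses = [], []
    for det in detections:
        best, best_d = None, max_dist_m
        for gt in unmatched:
            d = math.dist(det[:2], gt[:2])
            if d <= best_d:
                best, best_d = gt, d
        if best is not None:
            unmatched.remove(best)       # each ground-truth sign matches once
            matches.append((det, best))
        else:
            misses.append(det)           # detection with no map counterpart
    return matches, misses

detected = [(10.1, 5.2, "speed_30"), (40.0, 2.0, "yield")]
hd_map   = [(10.0, 5.0, "speed_30"), (25.0, 3.0, "stop")]
matches, false_positives = match_detections(detected, hd_map)
print(len(matches), len(false_positives))  # 1 1
```

Unmatched detections are candidate false positives, and ground-truth signs left unmatched are candidate misses, which is exactly the comparison the HD Map correlation feature is for.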
rFpro Segmentation for DLAD (Deep Learning Autonomous Driving)
rFpro offers flexible classification and segmentation for scenery, scene objects, roads and lane markings. These features are important for training and testing your segmentation and detection algorithms.
rFpro’s live scene-graph allows instant changes to scenes, so you can test and train with, for example, different arrangements of roadside objects, vehicle classes, vehicle colours, manoeuvres, weather conditions and times of day.
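Testing a segmentation algorithm against per-pixel class labels of this kind typically comes down to a per-class intersection-over-union score. A minimal sketch on flat label masks; the class IDs and masks are illustrative, and rFpro’s actual label format is not assumed here:

```python
from collections import defaultdict

# Per-class IoU between a predicted label mask and a ground-truth mask,
# both given as flat sequences of class IDs (one entry per pixel).

def per_class_iou(prediction, ground_truth):
    """Intersection-over-union per class for two equal-length label masks."""
    inter, union = defaultdict(int), defaultdict(int)
    for p, g in zip(prediction, ground_truth):
        if p == g:
            inter[p] += 1   # pixel counted in both masks for this class
            union[p] += 1
        else:
            union[p] += 1   # pixel in prediction only
            union[g] += 1   # pixel in ground truth only
    return {c: inter[c] / union[c] for c in union}

# Illustrative class IDs: 0 = road, 1 = lane marking, 2 = roadside object
pred = [0, 0, 1, 1, 2, 0]
gt   = [0, 0, 1, 2, 2, 0]
iou = per_class_iou(pred, gt)
print(iou)
```

A score of 1.0 for a class means prediction and ground truth agree exactly on it; re-running the same metric across scene variations (object arrangements, colours, weather, time of day) is how the scene-graph changes described above feed into training and testing.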
Testing Deep Learning Autonomous Driving (DLAD) in rFpro at Paris
This video shows some of the ways in which rFpro is being used to test DLAD (Deep Learning Autonomous Driving) models. In some clips you can see the view from camera sensors feeding DLAD systems. You can also see the view from human-controlled cars being driven in simulation, sharing the virtual world with vehicle models under the control of autonomous systems. We have also included some ‘corner cases’ in which scenes are lit by a low sun, obscuring the white lines and creating challenging conditions; in these cases the radar and LiDAR feeds become more important.
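The low-sun corner case illustrates why multiple sensor feeds matter: when the camera is washed out, a fused estimate should lean on radar and LiDAR. A minimal sketch of confidence-weighted fusion; the sensor names, estimates and confidence values are illustrative assumptions, not rFpro outputs:

```python
# Confidence-weighted fusion of per-sensor estimates of the same quantity
# (here, a lane-centre offset in metres). All values are illustrative.

def fuse_estimates(readings):
    """Weighted average of (estimate, confidence) pairs from each sensor."""
    total = sum(conf for _, conf in readings.values())
    return sum(est * conf for est, conf in readings.values()) / total

# Low sun, white lines obscured: camera confidence collapses, so the
# fused estimate is dominated by the LiDAR and radar readings.
readings = {
    "camera": (0.9, 0.1),   # washed out by glare -> low confidence
    "lidar":  (0.25, 0.8),
    "radar":  (0.30, 0.7),
}
print(round(fuse_estimates(readings), 3))
```

With these numbers the fused offset lands near the LiDAR and radar estimates rather than the unreliable camera value, which is the behaviour the corner-case clips are designed to exercise.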
The model used is of the centre of Paris. Increasingly, OEMs and Tier-1s are requesting more complex urban environments with mixed traffic and pedestrians. rFpro is being used to train, test and validate DLAD systems.
rFpro is also being used to let human drivers test vehicles with ADAS and autonomous systems in simulation; to let humans ride in autonomous vehicles as ‘passengers’; and to let human drivers share a virtual world with autonomous vehicles, both to evaluate their behaviour subjectively and to drive in ways designed to provoke a response, for example veering into their lane or pulling out in front of them at an intersection.
rFpro JLR Test Route with Traffic
rFpro Digital Road Model of JLR’s public road test route. This model has been built to a level of accuracy suitable not just for vehicle dynamics testing but also for ADAS and autonomous testing, with very good correlation between the real world and the digital model.
Dassault Systèmes using rFpro to support their customers’ ADAS development
rFpro provides driving simulation software and 3D content for Deep Learning Autonomous Driving, ADAS and vehicle dynamics testing and validation. rFpro is being used by OEMs and Tier-1s to train, test and validate Deep Learning systems for ADAS and autonomous applications.

When developing systems based on machine learning from sensor feeds, such as camera, LiDAR and radar feeds, the quality of the 3D environment model is very important: the more accurate the virtual world, the greater the correlation when progressing to real-world testing. rFpro’s HiDef models are built around a graphics engine that includes a physically modelled atmosphere, weather and lighting, as well as physically modelled materials for every object in the scene.

Hundreds of kilometres of public road models are available off-the-shelf from rFpro, spanning North America, Asia and Europe, and including multi-lane highways as well as urban, rural and mountain routes, all copied faithfully from the real world. rFpro scales from a desktop workstation to a massively parallel real-time test environment connecting to customers’ autonomous driver models and human test drivers.