Using LG's LGSVL Simulator


Baidu Apollo AV control software running within LGSVL Simulator

There are several open-source AV (Automated/Autonomous Vehicle) simulators available, including CARLA, AirSim, Deepdrive and MADRaS. After surveying the available packages, we decided to use the LGSVL Simulator, developed by the LG Electronics America R&D Center. It includes integrations with two open-source AV control software platforms (i.e. the software that runs within the automated vehicle): Apollo and Autoware.


The initial setup of the simulator on its own is straightforward, but using it with Apollo is more involved: it requires installing NVIDIA GPU drivers and CUDA libraries, setting up Docker containers, cloning GitHub repositories, and building some software executables. Even though this process is well documented, it proved to be a time-consuming task, as there were a number of subtleties to be aware of. We set up this system not only on desktop PCs, but also on cloud-based virtual machines.


We mainly used the LGSVL Simulator with their modified (i.e. forked) version of the Apollo AV control software. This means that the automated vehicle, i.e. the ego vehicle, is fully controlled by Apollo. The LGSVL Simulator models the ego vehicle, its sensors and actuators, and its road environment. Within the environment, there is the road network, other vehicles (referred to as NPCs or non-player characters), and other road users such as pedestrians.


Modifications to the simulator source code


As the LGSVL Simulator is open source, we were able to make some changes to the source code of the software in order to tweak its behaviour to better meet our requirements.


The NPC vehicles are able to drive in two different modes:

  • Following a pre-programmed route, ignoring other vehicles, traffic signs, traffic lights, etc.

  • Driving around while reacting to and interacting with other vehicles. This interaction essentially consists of slowing down when there is an obstacle ahead and complying with traffic lights.

In the latter mode, when the NPC vehicle reaches a road junction and there is a choice of where to go next, its decision is made randomly. As we wanted more control over the NPC vehicles’ movements, we modified the simulator source code.
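
As an illustration of these two stock behaviours, here is a minimal sketch using the simulator's Python API; the map name, NPC model, waypoint coordinates and speeds are placeholders rather than values from our tests.

```python
import lgsvl

# Connect to a running instance of the LGSVL Simulator and load a map
sim = lgsvl.Simulator("127.0.0.1", 8181)
sim.load("BorregasAve")  # placeholder map name

state = lgsvl.AgentState()
state.transform = sim.get_spawn()[0]
npc = sim.add_agent("Sedan", lgsvl.AgentType.NPC, state)

# Mode 1: follow a pre-programmed route, ignoring other traffic.
# Each waypoint is a position plus the speed at which to reach it.
waypoints = [
    lgsvl.DriveWaypoint(lgsvl.Vector(10, 0, 20), 8),   # placeholder coordinates
    lgsvl.DriveWaypoint(lgsvl.Vector(30, 0, 40), 8),
]
npc.follow(waypoints)

# Mode 2: drive along the lanes of the HD map, reacting to obstacles and
# traffic lights, and choosing randomly at junctions:
# npc.follow_closest_lane(True, 10)  # 10 m/s maximum speed

sim.run(30)  # run the simulation for 30 seconds
```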


Almost all of the junctions in the maps provided are controlled by traffic lights. We wanted to carry out tests at junctions with no traffic lights but with road signs, where the ego vehicle's software had to make more of the decisions about what to do. We therefore modified the simulator software to change the main junction we were using from a three-way stop to a T-junction, with a minor road joining a major road.


Our data-logging requirements went beyond what the simulator provides by default. After receiving detailed advice from LG engineers, we modified the simulator software so that we could record the data we needed.


Setting up and running the driving scenarios


Apollo running within LGSVL showing planned route of ego vehicle

The scenarios were set up programmatically, with no manual intervention, using the simulator's supplied Python API (a rough sketch follows the list below). This involved:

  • Specifying initial positions, routes and behaviours for all the vehicles, including up to several dozen NPC vehicles

  • Setting up Apollo with the desired destination for the ego vehicle. This is normally done manually using Apollo’s Dreamview web browser interface, but we received help and software from LG to do this programmatically

  • Starting the test, and ending it early if required (e.g. if a collision occurred)
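
As a rough sketch of what these steps look like with the Python API (the map and agent names, bridge address, NPC placement and destination coordinates are placeholders; the Dreamview helper shown commented out comes from later versions of the PythonAPI rather than the exact software we received from LG):

```python
import lgsvl

sim = lgsvl.Simulator("127.0.0.1", 8181)
sim.load("BorregasAve")  # placeholder map name

# Ego vehicle, connected to Apollo via the bridge
state = lgsvl.AgentState()
state.transform = sim.get_spawn()[0]
ego = sim.add_agent("Lincoln2017MKZ (Apollo 5.0)", lgsvl.AgentType.EGO, state)
ego.connect_bridge("127.0.0.1", 9090)

# End the test early if the ego vehicle is involved in a collision
def on_collision(agent1, agent2, contact):
    print("Collision detected - stopping the test")
    sim.stop()

ego.on_collision(on_collision)

# NPC vehicles with their own initial positions and behaviours
for i in range(5):
    npc_state = lgsvl.AgentState()
    npc_state.transform = sim.get_spawn()[0]
    npc_state.transform.position.x += 10.0 * (i + 1)  # placeholder spacing
    npc = sim.add_agent("Sedan", lgsvl.AgentType.NPC, npc_state)
    npc.follow_closest_lane(True, 10)

# Set the ego vehicle's destination in Apollo. Recent versions of the
# PythonAPI ship a Dreamview helper for this; we received equivalent
# software from LG at the time.
# dv = lgsvl.dreamview.Connection(sim, ego, "127.0.0.1")
# dv.setup_apollo(dest_x, dest_z, modules=[...])

sim.run(60)  # run the scenario for up to 60 seconds
```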


Based on the logged data from each run, we calculated a number of Key Performance Indicators (KPIs) covering:

  • Safety, e.g. collisions, near misses

  • Driving comfort (ego vehicle and NPC vehicles)

  • Traffic flow disruptions

These KPIs are used by our directed search algorithms to generate subsequent tests and home in on problematic scenarios; in effect, the search finds near misses and tweaks the conditions in order to turn them into collisions.
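
As a hedged illustration of the kind of calculation involved (these are not our exact KPI definitions, and the sample data, time step and free-flow speed are made up), comfort can be scored from the logged ego acceleration via jerk, and traffic-flow disruption from how far NPC speeds drop below a free-flow speed:

```python
import numpy as np

def comfort_kpi(accel, dt):
    """Peak longitudinal jerk (m/s^3) over a logged acceleration trace."""
    jerk = np.diff(accel) / dt
    return float(np.max(np.abs(jerk)))

def traffic_flow_kpi(npc_speeds, free_flow_speed):
    """Mean shortfall of logged NPC speeds below an assumed free-flow speed (m/s)."""
    shortfall = np.clip(free_flow_speed - np.asarray(npc_speeds), 0, None)
    return float(np.mean(shortfall))

# Example with made-up logged data sampled at 10 Hz
accel = [0.0, 0.5, 1.2, 0.3, -2.5, -1.0]
print(comfort_kpi(accel, dt=0.1))
print(traffic_flow_kpi([8.0, 6.5, 9.5, 4.0], free_flow_speed=10.0))
```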


For our safety-related performance criteria, one of the calculations we performed was TTC (Time-To-Collision). At each instant, this involves extrapolating the positions of all of the vehicles into the future to see whether any pair is due to collide, and calculating the time of that collision.
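
The sketch below shows this extrapolation for a single pair of vehicles, under the simplifying assumptions of constant velocities and circular vehicle footprints; the radius and the example positions and speeds are placeholders.

```python
import numpy as np

def time_to_collision(p1, v1, p2, v2, radius=2.0):
    """Earliest future time at which two vehicles, modelled as circles of the
    given radius and moving at constant velocity, would come into contact.
    Returns None if the extrapolated paths never bring them that close."""
    p = np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)  # relative position
    v = np.asarray(v2, dtype=float) - np.asarray(v1, dtype=float)  # relative velocity
    # Solve |p + v*t| = 2*radius for the smallest non-negative t
    a = v.dot(v)
    b = 2.0 * p.dot(v)
    c = p.dot(p) - (2.0 * radius) ** 2
    if c <= 0:
        return 0.0   # already in contact
    if a == 0:
        return None  # identical velocities, gap never closes
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # closest approach is still further apart than contact
    t = (-b - np.sqrt(disc)) / (2.0 * a)
    return t if t >= 0 else None

# Two vehicles approaching a junction at right angles (made-up positions/speeds)
print(time_to_collision([0, 0], [10, 0], [30, -30], [0, 10]))
```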


Time-to-collision prediction

Conclusions


We were able to run thousands of tests in the LGSVL Simulator with Apollo as the system-under-test (SUT). As each driving scenario was reasonably complex, involving many vehicles and presenting the SUT with a myriad of choices, the resulting system behaviour was not fully deterministic. In other words, if we ran the exact same test multiple times, we sometimes got different results. This is to be expected from a complex system.


The LGSVL Simulator worked well; it is under active development and has detailed, up-to-date documentation. The team provides excellent support on their forum, and our questions normally received a comprehensive and informed response within 24 hours.


Our tests with the LGSVL Simulator gave us in-depth experience not only of using the package, but also of making software modifications to the simulator. As we were using it with Apollo, we also gained valuable insights into using this relatively mature automated driving platform and into evaluating its self-driving capabilities under a variety of scenarios.


We also gained insights into the use of KPIs to evaluate an AV system’s safety and (occupant) comfort, and assess its effects on traffic flow.
