# 4th. Sensors and data

The last step in this introduction to CARLA is sensors. They allow users to retrieve data from their surroundings and so are crucial to using CARLA as a learning environment for driving agents.
This page summarizes everything necessary to start handling sensors, including some basic information about the different types available and a step-by-step description of their life cycle. The specifics for every sensor can be found in their reference.


## Sensors step-by-step

The class carla.Sensor defines a special type of actor able to measure and stream data.

  • What is this data? It varies a lot depending on the type of sensor, but the data is always defined as an inherited class of the general carla.SensorData.
  • When do they retrieve the data? Either on every simulation step or when a certain event is registered. This depends on the type of sensor.
  • How do they retrieve the data? Every sensor has a listen() method that receives and manages the data.

Despite their differences, the way the user manages every sensor is quite similar.

### Setting

As with every other actor, the first step is to find the proper blueprint in the library and set specific attributes to get the desired results. This is essential when handling sensors, as their capabilities depend on the way these are set. Their attributes are detailed in the sensors' reference.

The following example sets up a dashboard HD camera that will later be attached to a vehicle.

```py
# Find the blueprint of the sensor.
blueprint = world.get_blueprint_library().find('sensor.camera.rgb')
# Modify the attributes of the blueprint to set image resolution and field of view.
blueprint.set_attribute('image_size_x', '1920')
blueprint.set_attribute('image_size_y', '1080')
blueprint.set_attribute('fov', '110')
# Set the time in seconds between sensor captures.
blueprint.set_attribute('sensor_tick', '1.0')
```

### Spawning

Sensors are also spawned like any other actor, only this time the two optional parameters, attach_to and attachment_type, are crucial. Sensors should be attached to another actor, usually a vehicle, to follow it around and gather information about its surroundings.
There are two types of attachment:

  • Rigid: the sensor's location is updated strictly with regard to its parent. Cameras may show "little hops", as the movement is not eased.
  • SpringArm: movement is eased with little accelerations and decelerations.

```py
transform = carla.Transform(carla.Location(x=0.8, z=1.7))
sensor = world.spawn_actor(blueprint, transform, attach_to=my_vehicle)
```

!!! Important
    When spawning an actor with an attachment, remember that its location should be relative to its parent, not global.
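To see why the relative transform matters, here is a minimal sketch in plain Python (no CARLA server needed). It converts a child's local offset into a world location given the parent's position and yaw; the helper name is hypothetical and simplified to 2D, while carla.Transform handles the full 3D rotation internally.

```python
import math

def child_world_location(parent_xy, parent_yaw_deg, offset_xy):
    """Rotate a child's local (x, y) offset by the parent's yaw and
    translate it into world coordinates. Simplified 2D illustration."""
    yaw = math.radians(parent_yaw_deg)
    ox, oy = offset_xy
    wx = parent_xy[0] + ox * math.cos(yaw) - oy * math.sin(yaw)
    wy = parent_xy[1] + ox * math.sin(yaw) + oy * math.cos(yaw)
    return (wx, wy)

# A camera 0.8 m ahead of a vehicle at (10, 5) facing 90 degrees
# ends up at roughly (10, 5.8) in world coordinates.
print(child_world_location((10.0, 5.0), 90.0, (0.8, 0.0)))
```

The same local offset `(0.8, 0.0)` lands on a different world location for every parent pose, which is exactly why the spawn transform must be expressed relative to the parent.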

### Listening

Every sensor has a listen() method to register the function that will be called every time the sensor retrieves data.
This method has one argument, callback, which is a lambda expression or function defining what the sensor should do when data is retrieved.
The callback must have at least one argument, which will be the retrieved data:

```py
# do_something() will be called each time a new image is generated by the camera.
sensor.listen(lambda data: do_something(data))

...

# This collision sensor would print every time a collision is detected.
def callback(event):
    print('Collision with %s' % event.other_actor.type_id)

sensor02.listen(callback)
```
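The listen() pattern can be mimicked without a simulator. This sketch defines a hypothetical MockSensor class (not part of the CARLA API) to show how a registered callback fires once per piece of data:

```python
class MockSensor:
    """Hypothetical stand-in for carla.Sensor: stores one callback and
    invokes it for every piece of data the 'simulation' produces."""
    def __init__(self):
        self._callback = None
        self.is_listening = False

    def listen(self, callback):
        # Register the callback and start listening.
        self._callback = callback
        self.is_listening = True

    def stop(self):
        self.is_listening = False

    def _on_new_data(self, data):
        # Called by the 'simulation' whenever new data is available.
        if self.is_listening and self._callback is not None:
            self._callback(data)

received = []
sensor = MockSensor()
sensor.listen(lambda data: received.append(data))
for frame in range(3):          # pretend three simulation steps occur
    sensor._on_new_data(frame)
print(received)                 # [0, 1, 2]
```

The callback runs once per retrieval, so any heavy processing inside it will delay the data stream; a common pattern is to push the data onto a queue and process it elsewhere.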

!!! Note
    is_listening is a sensor attribute that enables/disables data listening at will. Similarly, sensor_tick is a blueprint attribute that sets the simulation time between data retrievals, so that data is not retrieved on every step.
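sensor_tick effectively throttles how often the callback fires. The simulator does this internally; the following is only an illustrative sketch of that throttling logic, with hypothetical names:

```python
def ticks_with_sensor_tick(step_seconds, sensor_tick, n_steps):
    """Return the step indices at which a sensor with the given
    sensor_tick would produce data, assuming a fixed step length."""
    fired = []
    elapsed_since_capture = sensor_tick  # fire on the first step
    for step in range(n_steps):
        elapsed_since_capture += step_seconds
        # Small epsilon guards against floating-point accumulation error.
        if elapsed_since_capture + 1e-9 >= sensor_tick:
            fired.append(step)
            elapsed_since_capture = 0.0
    return fired

# With 0.05 s steps and sensor_tick = 0.2, data arrives every 4th step.
print(ticks_with_sensor_tick(0.05, 0.2, 10))   # [0, 4, 8]
```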

Most sensor data objects have a function to save the measurements to disk so they can later be used in other environments.
Sensor data differs a lot between sensor types, but it is always tagged with:

| Sensor data attribute | Type | Description |
| --------------------- | ---- | ----------- |
| `frame` | int | Frame number when the measurement took place. |
| `timestamp` | double | Timestamp of the measurement in simulation seconds since the beginning of the episode. |
| `transform` | carla.Transform | World reference of the sensor at the time of the measurement. |
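These three shared fields can be pictured as a common header. The dataclass below is only a hypothetical mirror of them; CARLA exposes these as read-only attributes on each carla.SensorData object, not as a dataclass:

```python
from dataclasses import dataclass

@dataclass
class SensorHeader:
    """Illustrative mirror of the fields shared by all sensor data."""
    frame: int          # frame number when the measurement took place
    timestamp: float    # simulation seconds since the episode began
    transform: tuple    # world pose of the sensor (carla.Transform in CARLA)

header = SensorHeader(frame=1024, timestamp=51.2, transform=(0.8, 0.0, 1.7))
print(header.frame, header.timestamp)   # 1024 51.2
```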

## Types of sensors

### Cameras

These sensors take a shot of the world from their point of view and then use a helper class to alter this image and provide different types of information.

Retrieve data: every simulation step.

| Sensor | Output | Overview |
| ------ | ------ | -------- |
| Depth | carla.Image | Combines the photo with the distance of the elements on scene to provide a gray-scale depth map. |
| RGB | carla.Image | Provides clear vision of the surroundings. Looks like a normal photo of the scene. |
| Semantic segmentation | carla.Image | Uses the tags of the different actors in the photo to group the elements by color. |
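As an example of the helper-class processing, the depth camera packs distance into the R, G and B channels of each pixel. Per the CARLA sensor reference, decoding follows the formula sketched below in plain Python:

```python
def depth_in_meters(r, g, b):
    """Decode a depth-camera pixel into meters. CARLA packs a normalized
    24-bit depth value into the R, G and B channels, with 1.0
    corresponding to the far plane at 1000 m."""
    normalized = (r + g * 256 + b * 256 * 256) / (256 ** 3 - 1)
    return 1000.0 * normalized

print(depth_in_meters(255, 255, 255))   # far plane: 1000.0 m
print(depth_in_meters(0, 0, 0))         # camera plane: 0.0 m
```

In practice this conversion is done per pixel over the whole image buffer, which is why the depth output looks like a gray-scale map once normalized.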

### Detectors

Sensors that retrieve data when the parent object they are attached to registers a specific event in the simulation.

Retrieve data: when triggered.

| Sensor | Output | Overview |
| ------ | ------ | -------- |
| Collision | carla.CollisionEvent | Retrieves collisions between its parent and other actors. |
| Lane invasion | carla.LaneInvasionEvent | Registers when its parent crosses a lane marking. |
| Obstacle | carla.ObstacleDetectionEvent | Detects possible obstacles ahead of its parent. |

### Other

This group gathers sensors with different functionalities: navigation, measuring the physical properties of an object, and providing 2D and 3D models of the scene.

Retrieve data: every simulation step.

| Sensor | Output | Overview |
| ------ | ------ | -------- |
| GNSS | carla.GNSSMeasurement | Retrieves the geolocation of the sensor. |
| IMU | carla.IMUMeasurement | Comprises an accelerometer, a gyroscope and a compass. |
| Lidar raycast | carla.LidarMeasurement | A rotating lidar that retrieves a cloud of points to generate a 3D model of the surroundings. |
| Radar | carla.RadarMeasurement | 2D point map that models elements in sight and their movement relative to the sensor. |
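To give a feel for what these measurements contain, the sketch below unpacks a raw lidar byte buffer into point tuples. In recent CARLA versions the lidar raw data stores four 32-bit floats per point (x, y, z, intensity); the helper name is illustrative and the buffer here is simulated rather than coming from a real sensor:

```python
import struct

def parse_lidar_buffer(raw_data):
    """Unpack a raw lidar byte buffer into (x, y, z, intensity) tuples,
    assuming four little-endian float32 values per point."""
    point_size = 4 * 4  # four float32 values per point
    count = len(raw_data) // point_size
    return [struct.unpack_from('<ffff', raw_data, i * point_size)
            for i in range(count)]

# Simulate a two-point buffer as the sensor would deliver it.
buffer = struct.pack('<ffff', 1.0, 2.0, 3.0, 0.5) + \
         struct.pack('<ffff', 4.0, 5.0, 6.0, 0.9)
points = parse_lidar_buffer(buffer)
print(points[0])   # (1.0, 2.0, 3.0, 0.5)
```

Note that older CARLA versions omit the intensity channel, so the stride per point would be three floats instead of four; always check the sensor reference for the version in use.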

That is a wrap on sensors and how they retrieve simulation data, and with it, this introduction to CARLA is finished. However, there is still a lot to learn. Here are some of the paths open at the moment:

  • Gain some practice: if diving alone into CARLA is still daunting, it may be a good idea to try some of the code recipes provided in this documentation and combine them with the example scripts or some ideas of your own.
  • Continue learning: there are other, more advanced features in CARLA, such as rendering options, the traffic manager, the recorder, and more. Now that the fundamentals of CARLA have been covered, it is a good moment to learn about these.
  • Experiment freely: but don't forget to take a look at the References section of this documentation. It contains detailed information on the classes in the Python API, sensors and their outputs, and much more.
  • Give your two cents: share your thoughts. Any doubts, suggestions, and ideas about CARLA are welcome in the forum.