carla/Docs/core_sensors.md

# 4th. Sensors and data

The last step in this introduction to CARLA is learning about sensors, which retrieve data from the surroundings and are therefore crucial to using CARLA as a learning environment for driving agents.
This page summarizes everything necessary to start handling sensors, including basic information about the different types available and a step-by-step walkthrough of their life cycle. Use it to experiment, and come back when necessary to read more about each specific type of sensor.


## Sensors step-by-step

The class carla.Sensor defines a special type of actor able to measure and stream data.

  • What is this data? It varies a lot depending on the type of sensor, but the data is always defined as an inherited class of the general carla.SensorData.
  • When do they retrieve the data? Either on every simulation step or when a certain event is registered, depending on the type of sensor.
  • How do they retrieve the data? Every sensor has a listen() method that receives and manages the data.

Although sensors differ greatly from each other, the way the user manages them is quite similar.

### Setting

As with every other actor, the first step is to find the proper blueprint in the library and set specific attributes to get the desired results. This is essential when handling sensors, as their capabilities depend on the way these attributes are set. The attributes are listed in the blueprint library, and a detailed explanation of each type of sensor is provided later on this page.

The following example prepares a dashboard HD camera that will later be attached to a vehicle.

```python
# Find the blueprint of the sensor.
blueprint = world.get_blueprint_library().find('sensor.camera.rgb')
# Modify the attributes of the blueprint to set image resolution and field of view.
blueprint.set_attribute('image_size_x', '1920')
blueprint.set_attribute('image_size_y', '1080')
blueprint.set_attribute('fov', '110')
# Set the time in seconds between sensor captures.
blueprint.set_attribute('sensor_tick', '1.0')
```

### Spawning

Sensors are spawned like any other actor, only this time the two optional parameters, attach_to and attachment_type, are crucial. Sensors should be attached to another actor, usually a vehicle, to follow it around and gather information about its surroundings.
There are two types of attachment:

  • Rigid: the sensor's location updates strictly with respect to its parent. Cameras may show "little hops", as the movement is not eased.
  • SpringArm: movement is smoothed with small accelerations and decelerations.

```python
transform = carla.Transform(carla.Location(x=0.8, z=1.7))
sensor = world.spawn_actor(blueprint, transform, attach_to=my_vehicle)
```

!!! Important
    When spawning an actor with attachment, remember that its location should be relative to its parent, not global.

### Listening

Every sensor has a listen() method that registers what to do with incoming data.
Its one argument, callback, is a function (often a lambda expression) defining what the sensor should do when data is retrieved.
The callback itself must have at least one argument, which will be the retrieved data:

```python
# do_something() will be called each time a new image is generated by the camera.
sensor.listen(lambda data: do_something(data))
```

...

```python
# This supposed collision sensor would print every time a collision is detected.
def callback(event):
    # The event's other_actor attribute identifies what the parent collided with.
    vehicle = world_ref().get_actor(event.other_actor.id)
    print('Vehicle too close: %s' % vehicle.type_id)

sensor02.listen(callback)
```
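
The world_ref() call in the snippet above is a weak reference to the world. CARLA's example scripts use this pattern so that a long-lived sensor callback does not keep the world object alive on its own. A minimal sketch of the pattern, using only the standard library and a stand-in World class for illustration:

```python
import weakref

class World:
    """Stand-in for carla.World, used only to illustrate the pattern."""
    def get_actor(self, actor_id):
        return 'actor-%d' % actor_id

world = World()
# Store a weak reference; the callback will not prevent garbage collection.
world_ref = weakref.ref(world)

def callback(actor_id):
    # Dereference the weak reference each time the callback fires.
    world = world_ref()
    if world is not None:
        print('Found: %s' % world.get_actor(actor_id))

callback(7)  # prints "Found: actor-7"
```

Once the original world object is destroyed, world_ref() returns None, so the callback simply does nothing instead of keeping stale state alive.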

!!! note
    The is_listening attribute of a sensor allows data listening to be enabled and disabled at will.

Most sensor data objects have a function to save the measurements to disk so they can later be used in other environments.
Sensor data differs a lot between sensor types, but it is always tagged with:

| Sensor data attribute | Type | Description |
| --- | --- | --- |
| frame | int | Frame number when the measurement took place. |
| timestamp | double | Timestamp of the measurement in simulation seconds since the beginning of the episode. |
| transform | carla.Transform | World reference of the sensor at the time of the measurement. |
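
Since these three fields are common to every piece of sensor data, a single logging helper can be shared across sensor types. A minimal sketch; the stand-in object below only mimics the common carla.SensorData attributes, and in CARLA the helper would be registered through sensor.listen():

```python
from types import SimpleNamespace

def describe(data):
    # frame, timestamp and transform are common to every carla.SensorData subclass.
    return 'frame=%d t=%.2fs at %s' % (data.frame, data.timestamp, data.transform)

# Stand-in for a real measurement, for demonstration only.
sample = SimpleNamespace(frame=42, timestamp=1.25, transform='Transform(Location(x=0.8))')
print(describe(sample))  # frame=42 t=1.25s at Transform(Location(x=0.8))
```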

## Types of sensors

### Cameras

These sensors take a shot of the world from their point of view and then use helper classes to alter this image and provide different types of information.
Retrieve data: every simulation step.

| Sensor | Output | Overview |
| --- | --- | --- |
| Depth | carla.Image | Combines the photo with the distance of the elements in the scene to provide a gray-scale depth map. |
| RGB | carla.Image | Provides clear vision of the surroundings. Looks like a normal photo of the scene. |
| Semantic segmentation | carla.Image | Uses the tags of the different actors in the photo to group the elements by color. |
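
Camera output is typically written to disk from the listen() callback using carla.Image's save_to_disk() method. The naming helper below is a hypothetical convention, not part of the CARLA API; it zero-pads the frame number so one file per frame sorts correctly:

```python
def frame_path(sensor_name, frame):
    # Hypothetical naming scheme: one file per frame, zero-padded so files sort correctly.
    return 'output/%s/%06d.png' % (sensor_name, frame)

# In CARLA this would be used roughly as:
#   camera.listen(lambda image: image.save_to_disk(frame_path('rgb', image.frame)))
print(frame_path('rgb', 42))  # output/rgb/000042.png
```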

### Detectors

Sensors that retrieve data when a parent object they are attached to registers a specific event in the simulation.
Retrieve data: when triggered.

| Sensor | Output | Overview |
| --- | --- | --- |
| Collision | carla.CollisionEvent | Retrieves collisions between its parent and other actors. |
| Lane invasion | carla.LaneInvasionEvent | Registers when its parent crosses a lane marking. |
| Obstacle | carla.ObstacleDetectionEvent | Detects possible obstacles ahead of its parent. |
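
Detector callbacks usually read specific fields off the event object. As an illustration, assuming carla.LaneInvasionEvent's crossed_lane_markings attribute (the list of markings that were crossed), a callback might summarize the event like this; the stand-in object is for demonstration only:

```python
from types import SimpleNamespace

def summarize_invasion(event):
    # crossed_lane_markings lists every marking crossed; deduplicate for a short summary.
    kinds = sorted(set(str(m) for m in event.crossed_lane_markings))
    return 'Crossed: %s' % ', '.join(kinds)

# Stand-in event for demonstration; in CARLA the callback would be registered with
#   lane_sensor.listen(lambda event: print(summarize_invasion(event)))
event = SimpleNamespace(crossed_lane_markings=['Broken', 'Solid'])
print(summarize_invasion(event))  # Crossed: Broken, Solid
```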

### Other

This group gathers sensors with different functionalities: navigation, measuring the physical properties of an object, and providing 2D and 3D models of the scene.
Retrieve data: every simulation step.

| Sensor | Output | Overview |
| --- | --- | --- |
| GNSS | carla.GnssMeasurement | Retrieves the geolocation of the sensor. |
| IMU | carla.IMUMeasurement | Comprises an accelerometer, a gyroscope and a compass. |
| Lidar raycast | carla.LidarMeasurement | A rotating lidar that retrieves a cloud of points to generate a 3D model of the surroundings. |
| Radar | carla.RadarMeasurement | 2D point map modeling elements in sight and their movement relative to the sensor. |
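
As a quick illustration of this group, a GNSS callback reads latitude, longitude and altitude off the measurement object. The formatting helper below is a sketch; the stand-in object only mimics those attributes:

```python
from types import SimpleNamespace

def format_fix(measurement):
    # latitude, longitude and altitude are the GNSS measurement's fields.
    return '(%.6f, %.6f) at %.1f m' % (
        measurement.latitude, measurement.longitude, measurement.altitude)

# Stand-in measurement for demonstration; in CARLA:
#   gnss_sensor.listen(lambda m: print(format_fix(m)))
fix = SimpleNamespace(latitude=48.137154, longitude=11.576124, altitude=519.0)
print(format_fix(fix))  # (48.137154, 11.576124) at 519.0 m
```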

That is a wrap on sensors and how they retrieve simulation data.
There is still a lot to learn about CARLA, but this was the last of the first steps. Now it is time to discover the possibilities of CARLA on your own. Here is some brief guidance on the different paths open right now:

  • For those who want to gain some practice: Python Cookbook
  • For those who want to continue learning: Advanced step
  • For those who want to experiment freely: References
  • For those who have something to say: Forum