# 4th. Sensors and data
Sensors are actors that retrieve data from their surroundings. They are crucial to create a learning environment for driving agents.

This page summarizes everything necessary to start handling sensors. It introduces the types available and a step-by-step guide to their life cycle. The specifics of every sensor can be found in the [sensors reference](ref_sensors.md).
* [__Sensors step-by-step__](#sensors-step-by-step)
* Setting
* Spawning
* Listening
* Data
* [__Types of sensors__](#types-of-sensors)
* Cameras
* Detectors
* Other
---
## Sensors step-by-step
The class [carla.Sensor](python_api.md#carla.Sensor) defines a special type of actor able to measure and stream data.
* __What is this data?__ It varies a lot depending on the type of sensor. All data types inherit from the general [carla.SensorData](python_api.md#carla.SensorData).
* __When do they retrieve the data?__ Either on every simulation step or when a certain event is registered, depending on the type of sensor.
* __How do they retrieve the data?__ Every sensor has a `listen()` method to receive and manage the data.

Despite their differences, all the sensors are used in a similar way.
### Setting
As with every other actor, find the blueprint and set its specific attributes. This is essential when handling sensors, as their attributes determine the results obtained. All of them are detailed in the [sensors reference](ref_sensors.md).

The following example sets up a dashboard HD camera.
```py
# Find the blueprint of the sensor.
blueprint = world.get_blueprint_library().find('sensor.camera.rgb')
# Modify the attributes of the blueprint to set image resolution and field of view.
blueprint.set_attribute('image_size_x', '1920')
blueprint.set_attribute('image_size_y', '1080')
blueprint.set_attribute('fov', '110')
# Set the time in seconds between sensor captures
blueprint.set_attribute('sensor_tick', '1.0')
```
### Spawning
The `attach_to` and `attachment_type` parameters are crucial. Sensors should be attached to a parent actor, usually a vehicle, to follow it around and gather information about its surroundings. The attachment type determines how the sensor's position is updated with regard to said vehicle.

* __Rigid attachment.__ Movement is strict with regard to its parent location. The position is updated without easing, so cameras may show "little hops" on screen.
* __SpringArm attachment.__ Movement is eased with little accelerations and decelerations.
```py
transform = carla.Transform(carla.Location(x=0.8, z=1.7))
sensor = world.spawn_actor(blueprint, transform, attach_to=my_vehicle, attachment_type=carla.AttachmentType.Rigid)
```
!!! Important
    When spawning with attachment, location must be relative to the parent actor.
### Listening
Every sensor has a [`listen()`](python_api.md#carla.Sensor.listen) method that registers a callback. The callback runs every time the sensor retrieves data.

The argument `callback` is a [lambda function](https://www.w3schools.com/python/python_lambda.asp) or any other callable. It describes what the sensor should do when data is retrieved, and it must take the retrieved data as its argument.
```py
# do_something() will be called each time a new image is generated by the camera.
sensor.listen(lambda data: do_something(data))

...

# This collision sensor would print a message every time a collision is detected.
def callback(event):
    print('Collision detected with %s' % event.other_actor.type_id)

sensor02.listen(callback)
```
### Data

Most sensor data objects have a function to save the information to disk, so it can be used in other environments.

Sensor data differs a lot between sensor types. Take a look at the [sensors reference](ref_sensors.md) for a detailed explanation. However, all of it is always tagged with some basic information.
<table class ="defTable">
<thead>
<th>Sensor data attribute</th>
<th>Type</th>
<th>Description</th>
</thead>
<tbody>
<tr>
<td><code>frame</code></td>
<td>int</td>
<td>Frame number when the measurement took place.</td>
</tr>
<tr>
<td><code>timestamp</code></td>
<td>double</td>
<td>Timestamp of the measurement in simulation seconds since the beginning of the episode.</td>
</tr>
<tr>
<td><code>transform</code></td>
<td><a href="../python_api#carlatransform">carla.Transform</a></td>
<td>World reference of the sensor at the time of the measurement.</td>
</tr>
</tbody>
</table>
<br>
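As a minimal sketch, a camera callback can dump every frame to disk with `save_to_disk`, naming files after the `frame` tag described above. The `camera` actor and the output path are illustrative assumptions.

```py
# Save every received frame as a PNG named after its frame number.
def save_image(image):
    image.save_to_disk('output/%06d.png' % image.frame)

# In a live session, register the callback on a spawned camera sensor:
# camera.listen(save_image)
```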
!!! Important
    `is_listening` is a __sensor attribute__ that enables/disables data listening at will.
    `sensor_tick` is a __blueprint attribute__ that sets the simulation time between two data receptions.
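Since every sensor exposes `is_listening` and a `stop()` method, pausing can be wrapped in a small helper. The `toggle` function below is a hypothetical convenience, not part of the API:

```py
# Hypothetical helper: stop the sensor if it is listening,
# otherwise start it with the given callback.
def toggle(sensor, callback):
    if sensor.is_listening:
        sensor.stop()
    else:
        sensor.listen(callback)
```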
---
## Types of sensors
### Cameras
Cameras take a shot of the world from their point of view. The helper class [carla.ColorConverter](python_api.md#carla.ColorConverter) can modify that image to represent different information.

* __Retrieve data__ every simulation step.
<table class ="defTable">
<thead>
<th>Sensor</th>
<th>Output</th>
<th>Overview</th>
</thead>
<tbody>
<tr>
<td>Depth</td>
<td><a href="../python_api#carlaimage">carla.Image</a></td>
<td>Renders the depth of the elements in the field of view in a gray-scale map.</td>
</tr>
<tr>
<td>RGB</td>
<td><a href="../python_api#carlaimage">carla.Image</a></td>
<td>Provides clear vision of the surroundings. Looks like a normal photo of the scene.</td>
</tr>
<tr>
<td>Semantic segmentation</td>
<td><a href="../python_api#carlaimage">carla.Image</a></td>
<td>Renders elements in the field of view with a specific color according to their tags.</td>
</tr>
</tbody>
</table>
<br>
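For instance, a semantic segmentation image can be converted when it is saved. The sketch below passes the converter as a parameter so the saving logic stays generic; `sem_camera` and the output path are illustrative assumptions:

```py
# Save an image applying a carla.ColorConverter, passed in as a parameter.
def save_converted(image, converter):
    image.save_to_disk('output/%06d.png' % image.frame, converter)

# In a live session, translate the tags to the Cityscapes color palette:
# sem_camera.listen(lambda image: save_converted(
#     image, carla.ColorConverter.CityScapesPalette))
```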
### Detectors
Detectors retrieve data when the actor they are attached to registers a specific event.
* __Retrieve data__ when triggered.
<table class ="defTable">
<thead>
<th>Sensor</th>
<th>Output</th>
<th>Overview</th>
</thead>
<tbody>
<tr>
<td>Collision</td>
<td><a href="../python_api#carlacollisionevent">carla.CollisionEvent</a></td>
<td>Retrieves collisions between its parent and other actors.</td>
</tr>
<tr>
<td>Lane invasion</td>
<td><a href="../python_api#carlalaneinvasionevent">carla.LaneInvasionEvent</a></td>
<td>Registers when its parent crosses a lane marking.</td>
</tr>
<tr>
<td>Obstacle</td>
<td><a href="../python_api#carlaobstacledetectionevent">carla.ObstacleDetectionEvent</a></td>
<td>Detects possible obstacles ahead of its parent.</td>
</tr>
</tbody>
</table>
<br>
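A detector callback usually inspects the event fields. The sketch below reports the lane markings crossed by the parent vehicle; `lane_sensor` is an assumed, already spawned `sensor.other.lane_invasion` actor:

```py
# Build a readable report from a carla.LaneInvasionEvent.
def on_invasion(event):
    crossed = sorted(set(str(marking.type) for marking in event.crossed_lane_markings))
    message = 'Crossed lane markings: %s' % ', '.join(crossed)
    print(message)
    return message

# lane_sensor.listen(on_invasion)
```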
### Other
These sensors provide different functionalities, such as navigation, measurement of physical properties, and 2D/3D point maps of the scene.
* __Retrieve data__ every simulation step.
<table class ="defTable">
<thead>
<th>Sensor</th>
<th>Output</th>
<th>Overview</th>
</thead>
<tbody>
<tr>
<td>GNSS</td>
<td><a href="../python_api#carlagnssmeasurement">carla.GnssMeasurement</a></td>
<td>Retrieves the geolocation of the sensor.</td>
</tr>
<tr>
<td>IMU</td>
<td><a href="../python_api#carlaimumeasurement">carla.IMUMeasurement</a></td>
<td>Comprises an accelerometer, a gyroscope, and a compass.</td>
</tr>
<tr>
<td>LIDAR raycast</td>
<td><a href="../python_api#carlalidarmeasurement">carla.LidarMeasurement</a></td>
<td>A rotating LIDAR. Generates a 3D point cloud modelling the surroundings.</td>
</tr>
<tr>
<td>Radar</td>
<td><a href="../python_api#carlaradarmeasurement">carla.RadarMeasurement</a></td>
<td>2D point map modelling elements in sight and their movement relative to the sensor.</td>
</tr>
</tbody>
</table>
<br>
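As a sketch of how these measurements are consumed, a GNSS callback can format the geolocation fields for logging; `gnss_sensor` is an assumed, already spawned `sensor.other.gnss` actor:

```py
# Format the fields of a carla.GnssMeasurement for logging.
def format_gnss(measurement):
    return 'lat %.6f lon %.6f alt %.1f m' % (
        measurement.latitude, measurement.longitude, measurement.altitude)

# gnss_sensor.listen(lambda measurement: print(format_gnss(measurement)))
```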
---
That is a wrap on sensors and how they retrieve simulation data.

Thus concludes the introduction to CARLA. However, there is still a lot to learn.

* __Gain some practice.__ It may be a good idea to try some of the code recipes provided in this documentation. Combine them with the example scripts and test new ideas.
<div class="build-buttons">
<p>
<a href="../ref_code_recipes" target="_blank" class="btn btn-neutral" title="Code recipes">
Code recipes</a>
</p>
</div>
* __Continue learning.__ There are some advanced features in CARLA: rendering options, the traffic manager, the recorder, and more. This is a great moment to learn about them.
<div class="build-buttons">
<p>
<a href="../adv_synchrony_timestep" target="_blank" class="btn btn-neutral" title="Synchrony and time-step">
Synchrony and time-step</a>
</p>
</div>
* __Experiment freely.__ Take a look at the __References__ section of this documentation. It contains detailed information on the classes in the Python API, sensors, and much more.
<div class="build-buttons">
<p>
<a href="../python_api" target="_blank" class="btn btn-neutral" title="Python API reference">
Python API reference</a>
</p>
</div>
* __Give your two cents.__ Any doubts, suggestions and ideas are welcome in the forum.
<div class="build-buttons">
<p>
<a href="https://forum.carla.org/" target="_blank" class="btn btn-neutral" title="Go to the CARLA forum">
CARLA forum</a>
</p>
</div>