# 4th. Sensors and data

Sensors are actors that retrieve data from their surroundings. They are crucial to create a learning environment for driving agents.

This page summarizes everything necessary to start handling sensors. It introduces the types available and a step-by-step guide of their life cycle. The specifics for every sensor can be found in the [sensors reference](ref_sensors.md).

* [__Sensors step-by-step__](#sensors-step-by-step)
    * [Setting](#setting)
    * [Spawning](#spawning)
    * [Listening](#listening)
    * [Data](#data)
* [__Types of sensors__](#types-of-sensors)
    * [Cameras](#cameras)
    * [Detectors](#detectors)
    * [Other](#other)

---
## Sensors step-by-step

The class [carla.Sensor](python_api.md#carla.Sensor) defines a special type of actor able to measure and stream data.

* __What is this data?__ It varies a lot depending on the type of sensor. All the data types inherit from the general [carla.SensorData](python_api.md#carla.SensorData).
* __When do they retrieve the data?__ Either on every simulation step or when a certain event is registered. It depends on the type of sensor.
* __How do they retrieve the data?__ Every sensor has a `listen()` method to receive and manage the data.

Despite their differences, all the sensors are used in a similar way.

### Setting

As with every other actor, the first step is to find the blueprint and set specific attributes. This is essential when handling sensors, since their attributes determine the results obtained. These are detailed in the [sensors reference](ref_sensors.md).

The following example sets a dashboard HD camera.

```py
# Find the blueprint of the sensor.
blueprint = world.get_blueprint_library().find('sensor.camera.rgb')
# Modify the attributes of the blueprint to set image resolution and field of view.
blueprint.set_attribute('image_size_x', '1920')
blueprint.set_attribute('image_size_y', '1080')
blueprint.set_attribute('fov', '110')
# Set the time in seconds between sensor captures.
blueprint.set_attribute('sensor_tick', '1.0')
```
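
The available sensor blueprints and their attributes can also be inspected before setting anything. The following is a minimal sketch, assuming the `world` and `blueprint` objects from the example above.

```py
# List every sensor blueprint available in this build.
for bp in world.get_blueprint_library().filter('sensor.*'):
    print(bp.id)

# Check an attribute of the chosen blueprint before modifying it.
attribute = blueprint.get_attribute('image_size_x')
print('Is modifiable: %s' % attribute.is_modifiable)
print('Recommended values: %s' % attribute.recommended_values)
```
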
### Spawning

When spawning a sensor, the arguments `attach_to` and `attachment_type` are crucial. Sensors should be attached to a parent actor, usually a vehicle, to follow it around and gather information. The attachment type determines how the sensor's position is updated with regard to said vehicle.

* __Rigid attachment.__ Movement is strict regarding its parent location. Cameras may show "little hops", as the position is updated without easing.
* __SpringArm attachment.__ Movement is eased with little accelerations and decelerations. A sketch using this attachment appears after the snippet below.

```py
transform = carla.Transform(carla.Location(x=0.8, z=1.7))
sensor = world.spawn_actor(blueprint, transform, attach_to=my_vehicle)
```
!!! Important
    When spawning with attachment, location must be relative to the parent actor.
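
Below is a minimal sketch of the SpringArm attachment mentioned above, for instance to place an eased chase camera behind the vehicle. The transform values are only illustrative.

```py
# Spawn a camera eased by a SpringArm attachment, e.g. a chase camera behind the vehicle.
camera_transform = carla.Transform(carla.Location(x=-5.5, z=2.5), carla.Rotation(pitch=-15))
camera = world.spawn_actor(
    blueprint,
    camera_transform,
    attach_to=my_vehicle,
    attachment_type=carla.AttachmentType.SpringArm)
```
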
### Listening

Every sensor has a [`listen()`](python_api.md#carla.Sensor.listen) method that registers a callback. The callback is called every time the sensor retrieves data.

The argument `callback` can be a [lambda function](https://www.w3schools.com/python/python_lambda.asp) or any other callable. It describes what the sensor should do when data is retrieved, and it must take the retrieved data as an argument.

```py
# do_something() will be called each time a new image is generated by the camera.
sensor.listen(lambda data: do_something(data))

...

# This collision sensor would print every time a collision is detected.
def callback(event):
    print('Collision with %s' % event.other_actor.type_id)

sensor02.listen(callback)
```
### Data

Most sensor data objects have a method to save the information to disk, so it can be used in other environments.

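For example, camera images provide a `save_to_disk()` method. The sketch below assumes the RGB camera sensor spawned in the previous steps; the output path is only illustrative.

```py
# Save every camera frame to disk as a PNG file named after its frame number.
sensor.listen(lambda image: image.save_to_disk('output/%06d.png' % image.frame))
```
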

Sensor data differs a lot between sensor types. Take a look at the [sensors reference](ref_sensors.md) to get a detailed explanation. However, all of it is always tagged with some basic information.

<table class ="defTable">
<thead>
<th>Sensor data attribute</th>
<th>Type</th>
<th>Description</th>
</thead>
<tbody>
<tr>
<td><code>frame</code></td>
<td>int</td>
<td>Frame number when the measurement took place.</td>
</tr>
<tr>
<td><code>timestamp</code></td>
<td>double</td>
<td>Timestamp of the measurement in simulation seconds since the beginning of the episode.</td>
</tr>
<tr>
<td><code>transform</code></td>
<td><a href="../python_api#carlatransform">carla.Transform</a></td>
<td>World reference of the sensor at the time of the measurement.</td>
</tr>
</tbody>
</table>
<br>

!!! Important
    `is_listening` is a __sensor attribute__ that enables/disables data listening at will.  
    `sensor_tick` is a __blueprint attribute__ that sets the simulation time between data received.
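
The following is a minimal sketch of toggling data listening at runtime, assuming the `sensor` object created above is already listening.

```py
# Stop the sensor callbacks when the data is no longer needed.
if sensor.is_listening:
    sensor.stop()
```
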
---
## Types of sensors
### Cameras

Take a shot of the world from their point of view. The helper class [carla.ColorConverter](python_api.md#carla.ColorConverter) will modify said image to represent different information.

* __Retrieve data__ every simulation step.

<table class ="defTable">
<thead>
<th>Sensor</th>
<th>Output</th>
<th>Overview</th>
</thead>
<tbody>
<tr>
<td>Depth</td>
<td><a href="../python_api#carlaimage">carla.Image</a></td>
<td>Renders the depth of the elements in the field of view in a gray-scale map.</td>
</tr>
<tr>
<td>RGB</td>
<td><a href="../python_api#carlaimage">carla.Image</a></td>
<td>Provides clear vision of the surroundings. Looks like a normal photo of the scene.</td>
</tr>
<tr>
<td>Semantic segmentation</td>
<td><a href="../python_api#carlaimage">carla.Image</a></td>
<td>Renders elements in the field of view with a specific color according to their tags.</td>
</tr>
</tbody>
</table>
<br>
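
A minimal sketch of applying a [carla.ColorConverter](python_api.md#carla.ColorConverter) when saving camera output. It assumes a depth camera sensor already spawned and stored in an illustrative variable named `depth_camera`.

```py
# Convert raw depth data to a logarithmic gray-scale image while saving it.
depth_camera.listen(
    lambda image: image.save_to_disk('output/depth_%06d.png' % image.frame,
                                     carla.ColorConverter.LogarithmicDepth))
```
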
### Detectors

Retrieve data when the object they are attached to registers a specific event.

* __Retrieve data__ when triggered.

<table class ="defTable">
<thead>
<th>Sensor</th>
<th>Output</th>
<th>Overview</th>
</thead>
<tbody>
<tr>
<td>Collision</td>
<td><a href="../python_api#carlacollisionevent">carla.CollisionEvent</a></td>
<td>Retrieves collisions between its parent and other actors.</td>
</tr>
<tr>
<td>Lane invasion</td>
<td><a href="../python_api#carlalaneinvasionevent">carla.LaneInvasionEvent</a></td>
<td>Registers when its parent crosses a lane marking.</td>
</tr>
<tr>
<td>Obstacle</td>
<td><a href="../python_api#carlaobstacledetectionevent">carla.ObstacleDetectionEvent</a></td>
<td>Detects possible obstacles ahead of its parent.</td>
</tr>
</tbody>
</table>
<br>
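
A minimal sketch of setting up one of these detectors, here a lane invasion sensor attached to the vehicle from the previous examples. The callback only prints the lane markings that were crossed; the variable names are illustrative.

```py
# Spawn a lane invasion sensor attached to the vehicle and report crossed markings.
lane_bp = world.get_blueprint_library().find('sensor.other.lane_invasion')
lane_sensor = world.spawn_actor(lane_bp, carla.Transform(), attach_to=my_vehicle)
lane_sensor.listen(lambda event: print(
    'Crossed: %s' % [str(marking.type) for marking in event.crossed_lane_markings]))
```
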
### Other

Different functionalities such as navigation, measurement of physical properties and 2D/3D point maps of the scene.

* __Retrieve data__ every simulation step.

<table class ="defTable">
<thead>
<th>Sensor</th>
<th>Output</th>
<th>Overview</th>
</thead>
<tbody>
<tr>
<td>GNSS</td>
<td><a href="../python_api#carlagnssmeasurement">carla.GnssMeasurement</a></td>
<td>Retrieves the geolocation of the sensor.</td>
</tr>
<tr>
<td>IMU</td>
<td><a href="../python_api#carlaimumeasurement">carla.IMUMeasurement</a></td>
<td>Comprises an accelerometer, a gyroscope, and a compass.</td>
</tr>
<tr>
<td>LIDAR raycast</td>
<td><a href="../python_api#carlalidarmeasurement">carla.LidarMeasurement</a></td>
<td>A rotating LIDAR. Generates a 3D point cloud modelling the surroundings.</td>
</tr>
<tr>
<td>Radar</td>
<td><a href="../python_api#carlaradarmeasurement">carla.RadarMeasurement</a></td>
<td>2D point map modelling elements in sight and their movement regarding the sensor.</td>
</tr>
<tr>
<td>RSS</td>
<td><a href="../python_api#carlarssresponse">carla.RssResponse</a></td>
<td>Modifies the controller applied to a vehicle according to safety checks. This sensor works in a different manner than the rest, and there is specific <a href="../adv_rss">RSS documentation</a> for it.</td>
</tr>
</tbody>
</table>
<br>
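
A minimal sketch of reading one of these sensors, here a GNSS sensor attached to the vehicle from the previous examples. The measurement exposes latitude, longitude and altitude; the variable names are illustrative.

```py
# Spawn a GNSS sensor and print the geolocation of the vehicle it is attached to.
gnss_bp = world.get_blueprint_library().find('sensor.other.gnss')
gnss_sensor = world.spawn_actor(gnss_bp, carla.Transform(), attach_to=my_vehicle)
gnss_sensor.listen(lambda data: print(
    'lat: %f lon: %f alt: %f' % (data.latitude, data.longitude, data.altitude)))
```
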
---

That is a wrap on sensors and how they retrieve simulation data.

Thus concludes the introduction to CARLA. However, there is still a lot to learn.

* __Gain some practice.__ It may be a good idea to try some of the code recipes provided in this documentation. Combine them with the example scripts and test new ideas.

<div class="build-buttons">
<p>
<a href="../ref_code_recipes" target="_blank" class="btn btn-neutral" title="Code recipes">
Code recipes</a>
</p>
</div>

* __Continue learning.__ There are some advanced features in CARLA: rendering options, traffic manager, the recorder, and more. This is a great moment to learn about them.

<div class="build-buttons">
<p>
<a href="../adv_synchrony_timestep" target="_blank" class="btn btn-neutral" title="Synchrony and time-step">
Synchrony and time-step</a>
</p>
</div>

* __Experiment freely.__ Take a look at the __References__ section of this documentation. It contains detailed information on the classes in the Python API, sensors, and much more.

<div class="build-buttons">
<p>
<a href="../python_api" target="_blank" class="btn btn-neutral" title="Python API reference">
Python API reference</a>
</p>
</div>

* __Give your two cents.__ Any doubts, suggestions and ideas are welcome in the forum.

<div class="build-buttons">
<p>
<a href="https://forum.carla.org/" target="_blank" class="btn btn-neutral" title="Go to the CARLA forum">
CARLA forum</a>
</p>
</div>