# Sensors and data
Sensors are actors that retrieve data from their surroundings. They are crucial for creating learning environments for driving agents.

This page summarizes everything necessary to start handling sensors. It introduces the types available and a step-by-step guide of their life cycle. The specifics for every sensor can be found in the [sensors reference](ref_sensors.md).
* [__Sensors step-by-step__](#sensors-step-by-step)
* [Setting](#setting)
* [Spawning](#spawning)
* [Listening](#listening)
* [Data](#data)
* [__Types of sensors__](#types-of-sensors)
* [Cameras](#cameras)
* [Detectors](#detectors)
* [Other](#other)
* [__Sensors reference__](ref_sensors.md)
---
## Sensors step-by-step
The class [carla.Sensor](python_api.md#carla.Sensor) defines a special type of actor able to measure and stream data.
* __What is this data?__ It varies a lot depending on the type of sensor. All the types of data are inherited from the general [carla.SensorData](python_api.md#carla.SensorData).
* __When do they retrieve the data?__ Either on every simulation step or when a certain event is registered, depending on the type of sensor.
* __How do they retrieve the data?__ Every sensor has a `listen()` method to receive and manage the data.
Despite their differences, all the sensors are used in a similar way.
### Setting
As with every other actor, find the blueprint and set specific attributes. This is essential when handling sensors. Their attributes will determine the results obtained. These are detailed in the [sensors reference](ref_sensors.md).
The following example sets a dashboard HD camera.
```py
# Find the blueprint of the sensor.
blueprint = world.get_blueprint_library().find('sensor.camera.rgb')
# Modify the attributes of the blueprint to set image resolution and field of view.
blueprint.set_attribute('image_size_x', '1920')
blueprint.set_attribute('image_size_y', '1080')
blueprint.set_attribute('fov', '110')
# Set the time in seconds between sensor captures
blueprint.set_attribute('sensor_tick', '1.0')
```
### Spawning
2020-02-19 00:58:02 +08:00
2020-07-31 05:02:05 +08:00
When spawning a sensor, the arguments `attachment_to` and `attachment_type` are crucial. Sensors should be attached to a parent actor, usually a vehicle, to follow it around and gather information. The attachment type determines how the sensor's position is updated with regard to said vehicle.

* __Rigid attachment.__ Movement is strict regarding its parent location. This is the proper attachment to retrieve data from the simulation.
* __SpringArm attachment.__ Movement is eased with little accelerations and decelerations. This attachment is only recommended to record videos from the simulation. The movement is smooth and "hops" are avoided when updating the cameras' positions.
* __SpringArmGhost attachment.__ Like the previous one but without performing the collision test, so the camera or sensor can cross walls or other geometry.

```py
transform = carla.Transform(carla.Location(x=0.8, z=1.7))
sensor = world.spawn_actor(blueprint, transform, attach_to=my_vehicle)
```
!!! Important
    When spawning with attachment, location must be relative to the parent actor.
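The `attachment_type` argument of `spawn_actor()` selects one of these modes through [carla.AttachmentType](python_api.md#carla.AttachmentType). A minimal sketch of a chase camera for recording, assuming the `blueprint` and `my_vehicle` from the previous snippets:

```py
# Place the camera behind and above the vehicle (coordinates relative to the parent).
camera_transform = carla.Transform(carla.Location(x=-5.5, z=2.5), carla.Rotation(pitch=-15.0))

# SpringArm smooths the camera movement; keep the default Rigid attachment to gather data.
chase_camera = world.spawn_actor(
    blueprint,
    camera_transform,
    attach_to=my_vehicle,
    attachment_type=carla.AttachmentType.SpringArm)
```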
### Listening
Every sensor has a [`listen()`](python_api.md#carla.Sensor.listen) method. It registers the callback that will be called every time the sensor retrieves data.

The argument `callback` is a [lambda function](https://www.w3schools.com/python/python_lambda.asp) or any other callable. It describes what the sensor should do when data is retrieved, and it takes the retrieved data as its argument.

```py
# do_something() will be called each time a new image is generated by the camera.
sensor.listen(lambda data: do_something(data))

...

# This collision sensor would print every time a collision is detected.
def callback(event):
    print('Collision detected with: %s' % event.other_actor.type_id)

sensor02.listen(callback)
```
### Data
Most sensor data objects have a method to save the information to disk, so that it can be used in other environments.
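For example, a camera frame could be written out from inside the callback. A minimal sketch, assuming `camera` is an already-spawned RGB camera and the output path is illustrative:

```py
# Save every image received from the camera to disk, named after the simulation frame.
camera.listen(lambda image: image.save_to_disk('output/%06d.png' % image.frame))
```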
Sensor data differs a lot between sensor types. Take a look at the [sensors reference](ref_sensors.md) to get a detailed explanation. However, all of them are always tagged with some basic information.
| Sensor data attribute | Type | Description |
| -------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------- |
| `frame` | int | Frame number when the measurement took place. |
| `timestamp` | double | Timestamp of the measurement in simulation seconds since the beginning of the episode. |
| `transform` | [carla.Transform](<../python_api#carlatransform>) | World reference of the sensor at the time of the measurement. |
<br>
!!! Important
    `is_listening()` is a __sensor method__ to check whether the sensor has a callback registered by `listen()`.
    `stop()` is a __sensor method__ to stop the sensor from listening.
    `sensor_tick` is a __blueprint attribute__ that sets the simulation time between data received.
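A brief sketch of this life cycle, assuming `sensor` was spawned earlier and its data is no longer needed:

```py
# Stop the callback when the data is no longer needed; the actor keeps existing.
if sensor.is_listening():
    sensor.stop()

# Remove the sensor from the simulation once it is not needed at all.
sensor.destroy()
```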
---
## Types of sensors
### Cameras
Cameras take a shot of the world from their point of view. For cameras that return [carla.Image](<../python_api#carlaimage>), you can use the helper class [carla.ColorConverter](python_api.md#carla.ColorConverter) to modify the image to represent different information; a brief sketch follows the table below.
* __Retrieve data__ every simulation step.
|Sensor |Output | Overview |
| ----------------- | ---------- | ------------------ |
| [Depth](ref_sensors.md#depth-camera) | [carla.Image](<../python_api#carlaimage>) |Renders the depth of the elements in the field of view in a gray-scale map. |
| [RGB](ref_sensors.md#rgb-camera) | [carla.Image](<../python_api#carlaimage>) | Provides clear vision of the surroundings. Looks like a normal photo of the scene. |
| [Optical Flow](ref_sensors.md#optical-flow-camera) | [carla.Image](<../python_api#carlaimage>) | Renders the motion of every pixel from the camera. |
| [Semantic segmentation](ref_sensors.md#semantic-segmentation-camera) | [carla.Image](<../python_api#carlaimage>) | Renders elements in the field of view with a specific color according to their tags. |
| [Instance segmentation](ref_sensors.md#instance-segmentation-camera) | [carla.Image](<../python_api#carlaimage>) | Renders elements in the field of view with a specific color according to their tags and a unique object ID. |
| [DVS](ref_sensors.md#dvs-camera) | [carla.DVSEventArray](<../python_api#carladvseventarray>) | Measures changes of brightness intensity asynchronously as an event stream. |
<br>
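As mentioned above, [carla.ColorConverter](python_api.md#carla.ColorConverter) can post-process a [carla.Image](<../python_api#carlaimage>) when saving it. A minimal sketch, assuming `depth_camera` is an already-spawned `sensor.camera.depth` actor and the output path is illustrative:

```py
# Convert the raw depth encoding to a gray-scale logarithmic depth map before saving.
depth_camera.listen(
    lambda image: image.save_to_disk(
        'output/depth_%06d.png' % image.frame,
        carla.ColorConverter.LogarithmicDepth))
```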
---
### Detectors
Detectors retrieve data when the object they are attached to registers a specific event.
* __Retrieve data__ when triggered.
| Sensor | Output | Overview |
| ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [Collision](ref_sensors.md#collision-detector) | [carla.CollisionEvent](<../python_api#carlacollisionevent>) | Retrieves collisions between its parent and other actors. |
| [Lane invasion](ref_sensors.md#lane-invasion-detector) | [carla.LaneInvasionEvent](<../python_api#carlalaneinvasionevent>) | Registers when its parent crosses a lane marking. |
| [Obstacle](ref_sensors.md#obstacle-detector) | [carla.ObstacleDetectionEvent](<../python_api#carlaobstacledetectionevent>) | Detects possible obstacles ahead of its parent. |
<br>
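For instance, a brief sketch of a lane invasion detector, assuming `my_vehicle` was spawned earlier; the variable names are illustrative:

```py
# The lane invasion detector needs no transform offset from its parent.
lane_bp = world.get_blueprint_library().find('sensor.other.lane_invasion')
lane_sensor = world.spawn_actor(lane_bp, carla.Transform(), attach_to=my_vehicle)

# The event lists the lane markings crossed by the parent actor.
def on_invasion(event):
    markings = set(str(marking.type) for marking in event.crossed_lane_markings)
    print('Crossed lane markings: %s' % ', '.join(markings))

lane_sensor.listen(on_invasion)
```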
### Other
These sensors provide different functionalities such as navigation, measurement of physical properties, and 2D/3D point maps of the scene.
* __Retrieve data__ every simulation step.
| Sensor | Output | Overview |
| ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [GNSS](ref_sensors.md#gnss-sensor) | [carla.GNSSMeasurement](<../python_api#carlagnssmeasurement>) | Retrieves the geolocation of the sensor. |
| [IMU](ref_sensors.md#imu-sensor) | [carla.IMUMeasurement](<../python_api#carlaimumeasurement>) | Comprises an accelerometer, a gyroscope, and a compass. |
| [LIDAR](ref_sensors.md#lidar-sensor) | [carla.LidarMeasurement](<../python_api#carlalidarmeasurement>) | A rotating LIDAR. Generates a 4D point cloud with coordinates and intensity per point to model the surroundings. |
| [Radar](ref_sensors.md#radar-sensor) | [carla.RadarMeasurement](<../python_api#carlaradarmeasurement>) | 2D point map modelling elements in sight and their movement regarding the sensor. |
| [RSS](ref_sensors.md#rss-sensor) | [carla.RssResponse](<../python_api#carlarssresponse>) | Modifies the controller applied to a vehicle according to safety checks. This sensor works in a different manner than the rest, and there is specific [RSS documentation](<../adv_rss>) for it. |
| [Semantic LIDAR](ref_sensors.md#semantic-lidar-sensor) | [carla.SemanticLidarMeasurement](<../python_api#carlasemanticlidarmeasurement>) | A rotating LIDAR. Generates a 3D point cloud with extra information regarding instance and semantic segmentation. |
<br>
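The raw buffers of these measurements can be converted to NumPy arrays for further processing. A brief sketch for the LIDAR, assuming `lidar` is an already-spawned `sensor.lidar.ray_cast` actor; the four columns correspond to the per-point coordinates and intensity mentioned above:

```py
import numpy as np

def on_lidar(measurement):
    # Each point is four 32-bit floats: x, y, z and intensity.
    points = np.frombuffer(measurement.raw_data, dtype=np.float32).reshape(-1, 4)
    print('Received %d points at frame %d' % (points.shape[0], measurement.frame))

lidar.listen(on_lidar)
```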
---
That is a wrap on sensors and how they retrieve simulation data.

This concludes the introduction to CARLA. However, there is still a lot to learn.

* __Continue learning.__ There are some advanced features in CARLA: rendering options, the traffic manager, the recorder, and more. This is a great moment to learn about them.
<div class="build-buttons">
<p>
<a href="../adv_synchrony_timestep" target="_blank" class="btn btn-neutral" title="Synchrony and time-step">
Synchrony and time-step</a>
</p>
</div>
* __Experiment freely.__ Take a look at the __References__ section of this documentation. It
contains detailed information on the classes in the Python API, sensors, code snippets and much
more.
<div class="build-buttons">
<p>
<a href="../python_api" target="_blank" class="btn btn-neutral" title="Python API reference">
Python API reference</a>
</p>
</div>
* __Give your two cents.__ Any doubts, suggestions and ideas are welcome in the forum.
<div class="build-buttons">
<p>
<a href="https://github.com/carla-simulator/carla/discussions/" target="_blank" class="btn btn-neutral" title="Go to the CARLA forum">
CARLA forum</a>
</p>
</div>