Merge branch 'og-develop' into pypi-release

commit 714487dacd
Cem Gökmen, 2024-10-01 01:05:51 -07:00 (committed via GitHub)
12 changed files with 129 additions and 2 deletions

BIN docs/assets/robots/A1.png (new file, 46 KiB; binary file not shown)

BIN docs/assets/robots/R1.png (new file, 53 KiB; binary file not shown)

BIN, filename not shown (new file, 37 KiB; binary file not shown)

BIN, filename not shown (modified, 700 KiB before, 1.8 MiB after; binary file not shown)


@@ -1,3 +1,7 @@
---
icon: material/email
---
# **Contact**
If you have any questions, comments, or concerns, please feel free to reach out to us by joining our Discord server:


@@ -1,3 +1,7 @@
---
icon: material/arm-flex
---
# **Contribution Guidelines**
We sincerely welcome contributions of any form to OmniGibson, as our aim is to make it a more robust and useful resource for the community. We have always held the belief that a collective effort from the community is essential to tailor BEHAVIOR/OmniGibson to meet diverse needs and unlock its full potential.


@@ -1,3 +1,7 @@
---
icon: material/file-question
---
# **Frequently Asked Questions**
## **What is the relationship between BEHAVIOR-1K and OmniGibson?**


@@ -1,3 +1,7 @@
---
icon: octicons/question-16
---
# **Known Issues & Troubleshooting**
## 🤔 **Known Issues**


@@ -80,7 +80,7 @@ Controllers and sensors can be accessed directly via the `controllers` and `sensors`
## Types
-**`OmniGibson`** currently supports 9 robots, consisting of 4 mobile robots, 2 manipulation robots, 2 mobile manipulation robots, and 1 anthropomorphic "robot" (a bimanual agent proxy used for VR teleoperation). Below, we provide a brief overview of each model:
+**`OmniGibson`** currently supports 12 robots, consisting of 4 mobile robots, 3 manipulation robots, 4 mobile manipulation robots, and 1 anthropomorphic "robot" (a bimanual agent proxy used for VR teleoperation). Below, we provide a brief overview of each model:
### Mobile Robots
These are navigation-only robots (an instance of [`LocomotionRobot`](../reference/robots/locomotion_robot.md)) that solely consist of a base that can move.
@@ -170,6 +170,19 @@ These are manipulation-only robots (an instance of [`ManipulationRobot`](../reference/robots/manipulation_robot.md))
<img src="../assets/robots/VX300S.png" alt="rgb">
</td>
</tr>
<tr>
<td valign="top" width="60%">
[**`A1`**](../reference/robots/A1.md)<br><br>
The 6-DOF A1 model equipped with an Inspire-Robots Dexterous Hand.<br><br>
<ul>
<li>_Controllers_: Arm, Gripper</li>
<li>_Sensors_: Wrist Camera</li>
</ul>
</td>
<td>
<img src="../assets/robots/A1.png" alt="rgb">
</td>
</tr>
</table>
@@ -203,6 +216,32 @@ These are robots that can both navigate and manipulate (and inherit from both [`LocomotionRobot`](../reference/robots/locomotion_robot.md) and [`ManipulationRobot`](../reference/robots/manipulation_robot.md))
<img src="../assets/robots/Tiago.png" alt="rgb">
</td>
</tr>
<tr>
<td valign="top" width="60%">
[**`Stretch`**](../reference/robots/stretch.md)<br><br>
The <a href="https://hello-robot.com/stretch-3-product">Stretch</a> model from Hello Robot, composed of a two-wheeled base, 2-DOF head, 5-DOF arm, and 1-DOF gripper.<br><br>
<ul>
<li>_Controllers_: Base, Head, Arm, Gripper</li>
<li>_Sensors_: Head Camera</li>
</ul>
</td>
<td>
<img src="../assets/robots/Stretch.png" alt="rgb">
</td>
</tr>
<tr>
<td valign="top" width="60%">
[**`R1`**](../reference/robots/R1.md)<br><br>
The bimanual R1 model, composed of a holonomic base (which we model as a 3-DOF (x, y, rz) set of joints), a 4-DOF torso, two 6-DOF arms, and two 2-DOF parallel-jaw grippers.<br><br>
<ul>
<li>_Controllers_: Base, Left Arm, Right Arm, Left Gripper, Right Gripper</li>
<li>_Sensors_: Head Camera</li>
</ul>
</td>
<td>
<img src="../assets/robots/R1.png" alt="rgb">
</td>
</tr>
</table>
### Additional Robots

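As a minimal sketch of how any of the robots listed above can be loaded (assuming OmniGibson's standard environment config format; the scene type and observation modalities below are illustrative choices, not part of this diff):

```python
import omnigibson as og

# Minimal sketch: load one of the supported robots by its type name.
# "A1" and the modalities here are illustrative; any type listed above
# (e.g. "Stretch", "R1") can be substituted.
cfg = {
    "scene": {"type": "Scene"},  # an empty stage
    "robots": [
        {
            "type": "A1",
            "obs_modalities": ["rgb", "proprio"],
        }
    ],
}
env = og.Environment(configs=cfg)
```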

@@ -34,3 +34,75 @@ action = teleop_sys.get_action(teleop_sys.get_obs())
```
to get the action based on the user teleoperation input, and pass the action to the `env.step` function.
## Data Collection and Playback
OmniGibson provides tools for collecting demonstration data and playing it back for further analysis, training, or evaluation. This is implemented via two environment wrapper classes: `DataCollectionWrapper` and `DataPlaybackWrapper`.
### DataCollectionWrapper
The `DataCollectionWrapper` is used to collect data during environment interactions. It wraps around an existing OmniGibson environment and records relevant information at each step.
Key features:
- Records actions, states, rewards, and termination conditions
- Optimizes the simulator for data collection
- Tracks object and system transitions within the environment
Example usage:
```python
import omnigibson as og
from omnigibson.envs import DataCollectionWrapper

# Create your OmniGibson environment
env = og.Environment(configs=your_config)

# Wrap it with DataCollectionWrapper
wrapped_env = DataCollectionWrapper(
    env=env,
    output_path="path/to/save/data.hdf5",
    only_successes=False,  # Set to True to only save successful episodes
)

# Use the wrapped environment as you would normally
obs, info = wrapped_env.reset()
for _ in range(num_steps):
    action = your_policy(obs)
    obs, reward, terminated, truncated, info = wrapped_env.step(action)

# Save the collected data
wrapped_env.save_data()
```
### DataPlaybackWrapper
The `DataPlaybackWrapper` is used to replay collected data and optionally record additional observations. This is particularly useful for gathering visual data or other sensor information that wasn't collected during the initial demonstration.
Key features:
- Replays episodes from collected data
- Can record additional observation modalities during playback
- Supports custom robot sensor configurations and external sensors
Example usage:
```python
from omnigibson.envs import DataPlaybackWrapper

# Create a playback environment
playback_env = DataPlaybackWrapper.create_from_hdf5(
    input_path="path/to/collected/data.hdf5",
    output_path="path/to/save/playback/data.hdf5",
    robot_obs_modalities=["proprio", "rgb", "depth_linear"],
    robot_sensor_config=your_robot_sensor_config,
    external_sensors_config=your_external_sensors_config,
    n_render_iterations=5,
    only_successes=False,
)

# Play back the entire dataset and record observations
playback_env.playback_dataset(record=True)

# Save the recorded playback data
playback_env.save_data()
```
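Putting the pieces together, a teleoperated data-collection session might look like the following rough sketch; it simply stitches together the teleoperation and `DataCollectionWrapper` examples above, and assumes `your_config` and `teleop_sys` are constructed as shown there:

```python
import omnigibson as og
from omnigibson.envs import DataCollectionWrapper

# Wrap the environment for recording (arguments as in the example above)
env = DataCollectionWrapper(
    env=og.Environment(configs=your_config),
    output_path="path/to/save/data.hdf5",
    only_successes=False,
)

obs, info = env.reset()
for _ in range(1000):  # replace with your own stopping criterion
    # Teleop action, as in the teleoperation snippet at the top of this page
    action = teleop_sys.get_action(teleop_sys.get_obs())
    obs, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:
        obs, info = env.reset()

# Flush the recorded episodes to disk
env.save_data()
```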

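For a quick sanity check of either output file, generic HDF5 tooling is enough; this sketch assumes nothing about OmniGibson's internal file layout beyond the file being HDF5:

```python
import h5py

# Print every group and dataset stored in the collected file
with h5py.File("path/to/save/data.hdf5", "r") as f:
    f.visititems(lambda name, obj: print(name, obj))
```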

@@ -131,9 +131,9 @@ nav:
       - FAQ: miscellaneous/faq.md
       - Known Issues & Troubleshooting: miscellaneous/known_issues.md
       - Contributing: miscellaneous/contributing.md
-      - Changelog: https://github.com/StanfordVL/OmniGibson/releases
       - Contact Us: miscellaneous/contact.md
   - API Reference: reference/*
+  - Changelog: https://github.com/StanfordVL/OmniGibson/releases
 extra:
   analytics: