# iGibson: the Interactive Gibson Environment

### Large Scale Interactive Simulation Environments for Robot Learning

iGibson, the Interactive Gibson Environment, is a simulation environment that provides fast visual rendering and physics simulation (based on Bullet). It ships with a dataset of hundreds of large 3D environments reconstructed from real homes and offices, and with interactive objects that can be pushed and actuated. iGibson allows researchers to train and evaluate robotic agents that use RGB images and/or other visual sensors to solve indoor (interactive) navigation and manipulation tasks such as opening doors, picking and placing objects, or searching in cabinets.
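
To give a feel for the API, below is a minimal, illustrative sketch of running a random agent in an iGibson environment. The `NavigateEnv` class and the config path `examples/configs/turtlebot_p2p_nav.yaml` are assumptions based on the examples shipped with this repository and may differ between releases; consult the documentation and the `examples/` folder for the exact usage.

```python
# Minimal sketch (not the canonical example): step a navigation environment
# with random actions. Class and config names are assumptions and may differ
# between iGibson releases.
from gibson2.envs.locomotor_env import NavigateEnv

env = NavigateEnv(config_file='examples/configs/turtlebot_p2p_nav.yaml',
                  mode='headless')  # use 'gui' for an on-screen viewer

for episode in range(2):
    state = env.reset()
    for _ in range(100):
        action = env.action_space.sample()            # random action
        state, reward, done, info = env.step(action)  # gym-style step
        if done:
            break
```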

## Citation

If you use iGibson or its assets and models, consider citing the following publication:

```bibtex
@article{xia2020interactive,
  title={Interactive Gibson Benchmark: A Benchmark for Interactive Navigation in Cluttered Environments},
  author={Xia, Fei and Shen, William B and Li, Chengshu and Kasimbeg, Priya and Tchapmi, Micael Edmond and Toshev, Alexander and Mart{\'\i}n-Mart{\'\i}n, Roberto and Savarese, Silvio},
  journal={IEEE Robotics and Automation Letters},
  volume={5},
  number={2},
  pages={713--720},
  year={2020},
  publisher={IEEE}
}
```

## Release

This is the repository for the iGibson (gibson2) 0.0.4 release. Bug reports, suggestions for improvement, and community contributions are encouraged and appreciated. See the Changelog for details. Support for our previous version of the environment, Gibson v1, will be moved to this repository.

## Documentation

The documentation for this repository can be found here: iGibson Environment Documentation. It includes an installation guide (including data download), a quickstart guide, code examples, and API references.

If you want to know more about iGibson, you can also check out our webpage, our RAL+ICRA20 paper, and our (outdated) technical report.

## Downloading Datasets of 3D Environments

There are several datasets of 3D reconstructed large real-world environments (homes and offices) that you can download and use with iGibson. All of them will be accessible once you fill in this form.

You will have access to ten environments with annotated instances of furniture (chairs, tables, desks, doors, sofas) that can be interacted with, and to the original 572 reconstructed 3D environments from Gibson v1 without object annotations.

You will also have access to a fully annotated environment, Rs_interactive, where close to 200 articulated objects are placed in their original locations in a real house, ready for interaction. (The original environment, Rs, is also available.) More information can be found in the installation guide.
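
Once a dataset has been downloaded and decompressed into your assets folder, a scene can be loaded by its model id. The sketch below assumes the `Simulator` and `BuildingScene` classes from `gibson2.core` and the scene id `Rs`; exact class names and arguments may differ between releases, so refer to the installation guide and the code examples for the definitive usage.

```python
# Minimal sketch (assumed API, may differ between releases): load a downloaded
# scene by its model id and run a few physics steps.
from gibson2.core.simulator import Simulator
from gibson2.core.physics.scene import BuildingScene

s = Simulator(mode='headless')   # 'gui' opens a viewer window
scene = BuildingScene('Rs')      # model id of a downloaded environment
s.import_scene(scene)

for _ in range(100):
    s.step()                     # advance physics and rendering

s.disconnect()
```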