iGibson: the Interactive Gibson Environment

Large Scale Interactive Simulation Environments for Robot Learning

iGibson, the Interactive Gibson Environment, is a simulation environment providing fast visual rendering and physics simulation (based on Bullet). It is packaged with a dataset of hundreds of large 3D environments reconstructed from real homes and offices, and with interactive objects that can be pushed and actuated. iGibson allows researchers to train and evaluate robotic agents that use RGB images and/or other visual sensors to solve indoor (interactive) navigation and manipulation tasks, such as opening doors, picking and placing objects, or searching in cabinets.

Latest Updates

[05/14/2020] Added dynamic light support 🔦

[04/28/2020] Added support for Mac OSX 💻

Citation

If you use iGibson or its assets and models, consider citing the following publication:

@article{xia2020interactive,
         title={Interactive Gibson Benchmark: A Benchmark for Interactive Navigation in Cluttered Environments},
         author={Xia, Fei and Shen, William B and Li, Chengshu and Kasimbeg, Priya and Tchapmi, Micael Edmond and Toshev, Alexander and Mart{\'\i}n-Mart{\'\i}n, Roberto and Savarese, Silvio},
         journal={IEEE Robotics and Automation Letters},
         volume={5},
         number={2},
         pages={713--720},
         year={2020},
         publisher={IEEE}
}

Release

This is the repository for the iGibson (gibson2) 0.0.4 release. Bug reports, suggestions for improvement, and community developments are encouraged and appreciated. Support for our previous version of the environment, Gibson v1, will be moved to this repository.

Documentation

The documentation for this repository can be found here: iGibson Environment Documentation. It includes an installation guide (including data download), a quickstart guide, code examples, and the APIs.

If you want to know more about iGibson, you can also check out our webpage, our RAL+ICRA20 paper and our (outdated) technical report.

Downloading Datasets of 3D Environments

There are several datasets of 3D reconstructed large real-world environments (homes and offices) that you can download and use with iGibson. All of them will be accessible once you fill in this form.

You will have access to ten environments with annotated instances of furniture (chairs, tables, desks, doors, sofas) that can be interacted with, and to the original 572 reconstructed 3D environments without annotated objects from Gibson v1.

You will also have access to a fully annotated environment, Rs_interactive, in which close to 200 articulated objects are placed at their original locations in a real house, ready for interaction. (The original environment, Rs, is also available.) More info can be found in the installation guide.
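Once a scene has been downloaded, it can be loaded into the simulator from Python. The snippet below is only a minimal sketch: the module paths and class names (gibson2.simulator.Simulator, gibson2.scenes.gibson_indoor_scene.StaticIndoorScene) are assumptions based on the current gibson2 layout and may differ slightly in this release, so treat it as illustrative rather than canonical.

# Minimal sketch -- module paths are assumptions and may differ in this release
from gibson2.simulator import Simulator
from gibson2.scenes.gibson_indoor_scene import StaticIndoorScene

s = Simulator(mode='gui')          # opens a rendering window
scene = StaticIndoorScene('Rs')    # any downloaded scene id works here
s.import_scene(scene)
for _ in range(1000):
    s.step()                       # advance physics and rendering
s.disconnect()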

VR Information

The following are instructions for installing Gibson with VR integration on Windows 10, assuming a fresh install of Windows.

These instructions partially overlap with the general Gibson installation guide (http://svl.stanford.edu/gibson2/docs/installation.html#installation-method) but are tailored to running the VR components on Windows.

VR Station

Install Steam and SteamVR, connect the VR headset and base stations, set up the VR room, and run the Steam performance test.

https://www.vive.com/eu/support/vive/category_howto/setting-up-for-the-first-time.html

Dependencies and environment:

Make sure Anaconda is added to the PATH, i.e. that the following directories are on it:
C:\Users\C\anaconda3
C:\Users\C\anaconda3\Scripts
C:\Users\C\anaconda3\Library\bin

Omitting the last of these entries produced the following error when running conda:
HTTP 000 CONNECTION FAILED for url https://repo.anaconda.com/pkgs/main/win-64/current_repodata.json Elapsed
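As a quick sanity check (not part of the official instructions), you can open a fresh Command Prompt and confirm that conda is resolvable from the PATH:

$ where conda
$ conda --version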

Download the VIVE_SRanipalInstaller msi file and install SRAnipal.

Gibson

  • Get codebase and assets:
$ git clone https://github.com/fxia22/iGibson.git --recursive
$ cd iGibson
$ git checkout vr
$ git submodule update --init --recursive

Download the Gibson assets and copy them to iGibson/gibson2/assets/. Download the environments (scenes) and copy them to iGibson/gibson2/assets/dataset.
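After copying, the layout should look roughly like this (a sketch based on the paths above; the scene names shown depend on what you downloaded):

iGibson/
  gibson2/
    assets/          <- Gibson assets go here
      dataset/       <- downloaded scenes go here (e.g. Rs, Rs_interactive)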

  • Create anaconda env:
$ conda create -n gibsonvr python=3.6

Activate the conda env:

$ conda activate gibsonvr
(use source activate gibsonvr instead if you are working in a bash shell)
  • Install Gibson in the anaconda env (if you followed the instructions above, iGibson is on the vr branch):
$ cd iGibson
$ pip install -e .

The installation should finish by printing 'Successfully installed gibson2'.
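As an optional sanity check (not part of the official instructions), you can verify that the package imports cleanly in the activated env:

$ python -c "import gibson2"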

You can find all the VR demos in iGibson/examples/demo/vr_demos.

Run:

$ python vr_playground_no_pbr.py (for a scene without PBR, i.e. physically based rendering)

or

$ python vr_playground_pbr.py (for the current state-of-the-art Gibson graphics)

Data saving/replay code can be found in vr_demos/data_save_replay. Run vr_demo_save to save a demo to a log file, and vr_demo_replay to replay it. Please see the demos and gibson2/utils/vr_logging.py for more details on the data saving/replay system.
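For example (a sketch; the .py extensions are assumed from the script names above):

$ cd iGibson/examples/demo/vr_demos/data_save_replay
$ python vr_demo_save.py (records a VR session to a log file)
$ python vr_demo_replay.py (replays the recorded session)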

To use the VR hand asset, please download and unzip the asset and put it into assets/models under the folder name 'vr_hand'. The asset is stored in a Google Drive folder as vr_hand.zip. Link to the VR hand zip: https://drive.google.com/drive/folders/1zm3ZpPc7yHwyALEGfsb0_NybFMvV81Um?usp=sharing
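For example (a sketch, assuming vr_hand.zip was downloaded to the current directory; adjust the target path if the archive already contains a top-level vr_hand folder):

$ unzip vr_hand.zip -d iGibson/gibson2/assets/models/vr_hand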

Have fun in VR!

Helpful tips:

  • Press ESCAPE to force the fullscreen rendering window to close during program execution.
  • Before using SRAnipal eye tracking, you may want to re-calibrate the eye tracker. Please go to the Vive system settings to perform this calibration.