Docker Development
cuRobo will work on most docker images that have PyTorch. We provide some example dockerfiles in
curobo/docker for reference. The dockerfiles are described in the table below.
| Tag | File | Description |
|---|---|---|
| x86 | x86.dockerfile | Dockerfile that builds cuRobo with a PyTorch base container. |
| aarch64 | aarch64.dockerfile | Dockerfile that builds cuRobo with a PyTorch base container for use on an NVIDIA Jetson. This container can only be built on an NVIDIA Jetson. |
| isaac_sim_VERSION | isaac_sim.dockerfile | Dockerfile that builds cuRobo with NVIDIA Isaac Sim VERSION and Vulkan. This docker can run Isaac Sim with a native GUI or headless. Replace VERSION with the Isaac Sim version you want to use. |
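For example, to build against a hypothetical Isaac Sim 4.0.0 install, the tag would be isaac_sim_4.0.0 (the version number here is purely illustrative; the full build steps are in the next section):

```bash
# "4.0.0" is a placeholder: substitute the Isaac Sim version on your machine.
bash build_docker.sh isaac_sim_4.0.0
```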
Building your own docker image with cuRobo
1. Add the default NVIDIA runtime to enable CUDA compilation during docker build. Edit (or create) /etc/docker/daemon.json with the content below. The "default-runtime" line is the one to add; the runtimes block will typically already exist in your file. Restart the Docker daemon afterwards; a restart-and-verify sketch appears after this list.

   ```json
   {
       "runtimes": {
           "nvidia": {
               "path": "/usr/bin/nvidia-container-runtime",
               "runtimeArgs": []
           }
       },
       "default-runtime": "nvidia"
   }
   ```
2. If you are building a docker with Isaac Sim, set up an NGC account following the instructions in Isaac Sim Container Setup.
3. Run `bash build_docker.sh TAG`, replacing `TAG` with a name from the Tag column in the table above.
4. Run `bash start_docker.sh TAG` to launch the built docker.
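After the daemon.json edit from step 1, the Docker daemon must be restarted for the new default runtime to take effect. A minimal sketch of the restart and a quick check, using standard Docker and systemd tooling:

```bash
# Restart the Docker daemon so the new default runtime takes effect.
sudo systemctl restart docker

# Verify that nvidia is now the default runtime; this should print
# a line like "Default Runtime: nvidia".
docker info | grep -i 'default runtime'
```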
If you want a docker that also enables development by mounting a folder, you can create a user docker with
the commands below (skip step 4 above). This will mount the folder /home/${USER}/code into the docker container, where you can do your
development.

1. `bash build_dev_docker.sh TAG`
2. `bash start_dev_docker.sh TAG` will start the docker.
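As an illustration, the development flow with the x86 tag from the table above would look like this (the tag choice is an example; use whichever tag matches your platform):

```bash
# Build a development image for x86 and start it; the started container
# mounts /home/${USER}/code so edits on the host are visible inside.
bash build_dev_docker.sh x86
bash start_dev_docker.sh x86
```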
Build Warp for NVIDIA Jetson (Deprecated)
Note
Warp is available on PyPI starting with version 0.11.0, so the instructions below are no longer needed.
NVIDIA Warp requires CUDA 11.5+ to compile, but NVIDIA Jetson ships with CUDA 11.4. We will install
a newer version of CUDA and then compile the library (.so) file so Warp can be used on NVIDIA Jetson.
Install a newer version of CUDA using the commands below (details):

```bash
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/arm64/cuda-ubuntu2004.pin
sudo mv cuda-ubuntu2004.pin /etc/apt/preferences.d/cuda-repository-pin-600
wget https://developer.download.nvidia.com/compute/cuda/11.8.0/local_installers/cuda-tegra-repo-ubuntu2004-11-8-local_11.8.0-1_arm64.deb
sudo dpkg -i cuda-tegra-repo-ubuntu2004-11-8-local_11.8.0-1_arm64.deb
sudo cp /var/cuda-tegra-repo-ubuntu2004-11-8-local/cuda-*-keyring.gpg /usr/share/keyrings/
sudo apt-get update
sudo apt-get -y install cuda
export CUDA_HOME=/usr/local/cuda-11.8/
```
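Optionally, confirm the new toolkit is in place before building anything against it. A small sketch, assuming the install paths from the commands above:

```bash
# Check that the CUDA 11.8 compiler is present and reports the right version.
/usr/local/cuda-11.8/bin/nvcc --version

# Persist CUDA_HOME across shells so later builds pick it up.
echo 'export CUDA_HOME=/usr/local/cuda-11.8/' >> ~/.bashrc
```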
Download and compile Warp inside a clone of cuRobo for use inside a dockerfile later:

```bash
git clone https://github.com/NVlabs/curobo.git && cd curobo && mkdir pkgs && cd pkgs
git clone https://github.com/NVIDIA/warp.git && cd warp && python3 build_lib.py --no_standalone
```
Note
Make sure that when running python3 build_lib.py for Warp, it is compiling the CUDA kernels. Warp looks for the CUDA toolkit at
CUDA_HOME, so set CUDA_HOME to your CUDA toolkit path before running build_lib.py.
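A minimal sketch of the rebuild with CUDA_HOME set, assuming the CUDA 11.8 install from above and the Warp repository layout of that era (run from inside the warp checkout):

```bash
# Point Warp's build at the CUDA 11.8 toolkit, then rebuild.
export CUDA_HOME=/usr/local/cuda-11.8/
python3 build_lib.py --no_standalone

# The compiled shared libraries should land under warp/bin/ in the repo;
# if nothing CUDA-related appears in the build output, re-check CUDA_HOME.
ls warp/bin/
```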