Hi! I read your article, and it is very impressive and interesting work. However, from the repository description it is not entirely clear how to run the model and evaluate its performance. You point to the pre-trained weights attached in the README, but at startup the script asks for completely different weights.
Besides that, I tried to run train_ae.sh, and the .py training script hard-codes paths to mount points on your local machine; when I set the path to the Gibson dataset, the model still does not start. It feels like the model needs specific images rather than the raw Gibson dataset.
Given that, I would like to know the detailed steps for successfully launching your model. Maybe the problem could be solved with a Dockerfile that pins all the dependencies. I forked the repository and added a Dockerfile; if you like, feel free to use it as a draft.
Thank you very much for your work, I will be waiting for your response!
Hello,
Thank you for your interest in our work and sorry for the very late reply as I have been on vacation.
The weights used in config.yaml can be downloaded from the checkpoints link mentioned in the README (SE-ResNeXt-50).
The main agent is DD-PPO (BatchNorm version).
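To give a rough idea, a config that follows habitat-baselines' DD-PPO layout typically wires the checkpoint in like this; the keys, checkpoint path, and backbone name below are assumptions, so check this repo's config.yaml for the actual values:

```yaml
# Hypothetical excerpt in the style of habitat-baselines' ddppo_pointnav.yaml;
# the checkpoint path and backbone name are placeholders, not this repo's actual values.
RL:
  DDPPO:
    pretrained: True
    pretrained_weights: data/checkpoints/se_resnext50.pth  # the downloaded SE-ResNeXt-50 weights (assumed path)
    backbone: se_resneXt50  # must match the backbone the agent was trained with
```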
The Gibson dataset here refers to a set of 112k images collected by robot navigation in Gibson. Due to Gibson's terms of use, I may not be able to distribute these images, but you can simply create your own Gibson image dataset by saving the camera frames of a navigation robot in Gibson scenes. You can download the Gibson scenes here: https://docs.google.com/forms/d/e/1FAIpQLScWlx5Z1DM1M-wTSXaa6zV8lTFkPmTHW1LqMsoCBDWsTDjBkQ/viewform
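As a minimal sketch (not code from this repo), you could dump frames by stepping an agent through a Gibson scene with habitat-lab; here a random agent stands in for whatever navigation policy you actually run, and the config path and "rgb" observation key are assumptions that depend on your task config and Habitat version:

```python
# Minimal sketch, assuming habitat-lab with an RGB sensor enabled in the task config.
import os

import habitat
import imageio

config = habitat.get_config("configs/tasks/pointnav_gibson.yaml")  # assumed task config
env = habitat.Env(config=config)

out_dir = "gibson_frames"
os.makedirs(out_dir, exist_ok=True)

frame_id = 0
obs = env.reset()
while frame_id < 1000:  # save as many frames as you need
    imageio.imwrite(os.path.join(out_dir, f"{frame_id:06d}.png"), obs["rgb"])
    frame_id += 1
    if env.episode_over:
        obs = env.reset()
    else:
        obs = env.step(env.action_space.sample())  # random actions as a stand-in for a real policy

env.close()
```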
If you've never used Habitat before, I suggest you first run simple navigation agents with the official Habitat code and get familiar with the essential scripts in habitat-baselines (especially the ppo and ddppo folders). Our code is a minor modification of the official code (mainly the neural-network inference parts). Please let me know if you have questions.
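For instance, the standard entry point in habitat-baselines (at least in versions from around that time) looks roughly like this; the config path and flags come from the official repo and may differ between versions:

```bash
# Train a simple PPO PointNav agent with the official habitat-baselines code
python -u habitat_baselines/run.py \
    --exp-config habitat_baselines/config/pointnav/ppo_pointnav_example.yaml \
    --run-type train

# Evaluate the saved checkpoints
python -u habitat_baselines/run.py \
    --exp-config habitat_baselines/config/pointnav/ppo_pointnav_example.yaml \
    --run-type eval
```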
A Dockerfile can be found in the official Habitat repo.