This is the code for the paper 'Human Cognition-Inspired Active Room Segmentation'. Inspired by the human cognitive system, this method incorporates vision input as an additional feature and follows a room-by-room exploration strategy to facilitate both the room segmentation and exploration tasks. For full details, refer to the paper.
The habitat-sim and habitat-api versions used in this method are the same as those used in ANS (Active Neural SLAM). Please refer to the ANS repository for installing these specific versions of habitat-sim and habitat-api.
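As a rough sketch (the commit hashes and build options are not reproduced here; use exactly the ones given in the ANS README), a from-source install of the two packages typically looks like:

git clone https://github.com/facebookresearch/habitat-sim.git
cd habitat-sim
git checkout <commit specified by ANS>
pip install -r requirements.txt
python setup.py install # add --headless when building on a server without a display
cd ..
git clone https://github.com/facebookresearch/habitat-api.git
cd habitat-api
git checkout <commit specified by ANS>
pip install -e .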
After installing habitat, clone the repository and install other requirements:
git clone https://github.com/B0GGY/Active_room_segmentation.git
cd Active_room_segmentation
pip install -r requirements.txt
In this method, we borrow the door detection network from aislabunimi. The trained parameters of the network can be downloaded from here. After downloading and unzipping the archive:
cd detr_door_detection
mkdir -p train_params/detr_resnet_50_4
mv 'path to the downloaded final_doors_dataset' train_params/detr_resnet_50_4
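As a quick sanity check (not part of the original instructions), you can confirm that the parameter files ended up where the code expects them:

ls train_params/detr_resnet_50_4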
To download the Gibson scene dataset and the task datasets (PointGoal navigation), please refer to this site.
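The code assumes an ANS-style data layout. The paths below mirror the ANS setup and are an assumption, so verify them against the ANS README:

Active_room_segmentation/
  data/
    scene_datasets/
      gibson/
        Adrian.glb
        Adrian.navmesh
        ...
    datasets/
      pointnav/
        gibson/
          v1/
            train/
            val/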
To run the active room segmentation method:
python explorable_with_door_detection.py --split val --eval 1 -n 1 -v 1 --train_global 0 --train_local 0 --train_slam 0
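Here --split val selects the validation scenes, --eval 1 runs the method in evaluation mode, -n 1 uses a single process, -v 1 enables visualization, and the three --train_* 0 flags disable training of the global, local, and SLAM modules. These arguments follow the ANS convention; see the argument parser in the code for the authoritative definitions.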