This is an official implementation of Social-NCE applied to the Trajectron++ forecasting model.
Social NCE: Contrastive Learning of Socially-aware Motion Representations
by Yuejiang Liu, Qi Yan, Alexandre Alahi at EPFL
to appear at ICCV 2021
TL;DR: Contrastive Representation Learning + Negative Data Augmentations 🡲 Robust Neural Motion Models
- Ranked 1st on the Trajnet++ challenge from November 2020 to present
- Significantly reduces the collision rate of SOTA human trajectory forecasting models
- SOTA on imitation / reinforcement learning for autonomous navigation in crowds
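Concretely, the contrastive objective behind the TL;DR above can be illustrated with a minimal InfoNCE-style sketch over motion embeddings, where negative samples are drawn around neighbouring agents' future positions. This is only an illustrative sketch, not the exact loss, sampling scheme, or tensor layout used in this repository; the function name, the shapes, and the temperature value are assumptions.

```python
import torch
import torch.nn.functional as F

def social_nce_loss(query, key_pos, key_neg, temperature=0.1):
    """Minimal InfoNCE-style sketch (illustrative only).

    query:   (B, D)    embedding of the ego agent's motion history
    key_pos: (B, D)    embedding of a positive sample (ego ground-truth future)
    key_neg: (B, N, D) embeddings of negative samples drawn near
                       neighbouring agents' future positions
    """
    query = F.normalize(query, dim=-1)
    key_pos = F.normalize(key_pos, dim=-1)
    key_neg = F.normalize(key_neg, dim=-1)

    # similarity to the positive key: (B, 1)
    sim_pos = (query * key_pos).sum(dim=-1, keepdim=True)
    # similarity to each negative key: (B, N)
    sim_neg = torch.einsum('bd,bnd->bn', query, key_neg)

    # InfoNCE: cross-entropy with the positive key as class 0
    logits = torch.cat([sim_pos, sim_neg], dim=1) / temperature
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)
```

Minimizing such a loss pushes the learned motion representation to score the ground-truth future above socially undesirable (e.g., colliding) locations, which is the intuition behind the robustness gains reported below.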
Please check out our code for experiments on different models as follows:
Social NCE + Trajectron | Social NCE + STGCNN | Social NCE + CrowdNav
Set up the environment following SETUP.md
To train a model on the ETH / UCY pedestrian datasets, specify the dataset of interest (eth, hotel, univ, zara1, or zara2), e.g.,
DATASET=univ
The vanilla Trajectron++ model can be trained and evaluated as follows:
bash scripts/run_train.sh ${DATASET} 0.0 && bash scripts/run_eval.sh ${DATASET} 0.0
To train the Trajectron++ with Social-NCE, run the following command:
bash scripts/run_train.sh ${DATASET} && bash scripts/run_eval.sh ${DATASET}
To search for hyper-parameters on different datasets, run the following bash scripts:
bash scripts/run_<dataset>.sh
Our pre-trained models can be downloaded as follows:
gdown https://drive.google.com/uc?id=1APAIlgJS9BDZHFCvwMrfzj9z_9DSS6LB
unzip pretrained_trajectron++.zip -d experiments/pedestrians/models
To compare different models, run the following command:
python benchmark.py --dataset ${DATASET}
The scripts above yield the following results (measured on a GeForce RTX 3090); results may vary slightly across different GPU devices.
On average, our method reduces the collision rate of Trajectron++ by over 45%, without degrading its prediction accuracy or diversity.
| Dataset | Trajectron++ w/o Ours | | | Trajectron++ w/ Ours | | |
|---|---|---|---|---|---|---|
| | ADE | FDE | COL | ADE | FDE | COL |
| ETH | 0.388 | 0.810 | 1.156 | 0.386 | 0.791 | 0.000 |
| HOTEL | 0.110 | 0.184 | 0.837 | 0.107 | 0.177 | 0.381 |
| UNIV | 0.199 | 0.450 | 3.378 | 0.195 | 0.435 | 3.079 |
| ZARA1 | 0.148 | 0.320 | 0.462 | 0.150 | 0.330 | 0.178 |
| ZARA2 | 0.114 | 0.250 | 1.027 | 0.114 | 0.255 | 0.993 |
| Average | 0.192 | 0.403 | 1.372 | 0.191 | 0.398 | 0.926 |
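For reference, the three metrics in the table (ADE, FDE, COL) can be computed along the following lines. This is a simplified sketch: the 0.1 m collision threshold, the tensor layout, and the per-agent reduction are assumptions and may differ from the evaluation code in this repository.

```python
import torch

def ade_fde(pred, gt):
    """pred, gt: (T, 2) predicted and ground-truth trajectories of one agent."""
    dist = torch.norm(pred - gt, dim=-1)        # (T,) per-step Euclidean error
    return dist.mean().item(), dist[-1].item()  # ADE = mean error, FDE = final-step error

def collision_indicator(pred, neighbors, threshold=0.1):
    """pred: (T, 2) ego prediction; neighbors: (N, T, 2) neighbour futures.

    Returns 1.0 if the predicted trajectory comes within `threshold` meters
    of any neighbour at any time step, else 0.0. Averaging this indicator
    over agents / scenes gives a collision rate (COL).
    """
    if neighbors.numel() == 0:
        return 0.0
    dist = torch.norm(neighbors - pred.unsqueeze(0), dim=-1)  # (N, T)
    return float((dist.min() < threshold).item())
```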
If you find this code useful for your research, please cite our paper:
@article{liu2020snce,
title = {Social NCE: Contrastive Learning of Socially-aware Motion Representations},
author = {Yuejiang Liu and Qi Yan and Alexandre Alahi},
journal = {arXiv preprint arXiv:2012.11717},
year = {2020}
}
Our code is developed upon the official implementation of Trajectron++.