This is an official implementation of Social-NCE applied to the Social-STGCNN forecasting model.

**Social NCE: Contrastive Learning of Socially-aware Motion Representations**
by Yuejiang Liu, Qi Yan, Alexandre Alahi (EPFL),
to appear at ICCV 2021
TL;DR: Contrastive Representation Learning + Negative Data Augmentations 🡲 Robust Neural Motion Models
- Ranked 1st on the Trajnet++ challenge from November 2020 to present
- Significantly reduces the collision rate of state-of-the-art human trajectory forecasting models
- Achieves state-of-the-art results on imitation/reinforcement learning for autonomous navigation in crowds (see the loss sketch below)
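For intuition, here is a minimal PyTorch sketch of an InfoNCE-style contrastive objective of the kind Social-NCE builds on: the embedding of a predicted motion is pulled toward a positive key and pushed away from negative keys produced by data augmentation. The function name, tensor shapes, and temperature value are illustrative assumptions, not this repo's exact interface.

```python
import torch
import torch.nn.functional as F

def social_nce_loss(query, pos_key, neg_keys, temperature=0.1):
    """InfoNCE-style loss sketch: pull the query embedding toward the
    positive (ground-truth future) key and push it away from negative
    (socially unfavorable, e.g. colliding) keys.

    query:    (B, D)    embedding of the predicted motion
    pos_key:  (B, D)    embedding of the positive sample
    neg_keys: (B, N, D) embeddings of the negative samples
    """
    query = F.normalize(query, dim=-1)
    pos_key = F.normalize(pos_key, dim=-1)
    neg_keys = F.normalize(neg_keys, dim=-1)

    # Similarity with the positive key: (B, 1)
    pos_logit = (query * pos_key).sum(dim=-1, keepdim=True)
    # Similarity with each negative key: (B, N)
    neg_logits = torch.einsum('bd,bnd->bn', query, neg_keys)

    logits = torch.cat([pos_logit, neg_logits], dim=1) / temperature
    # The positive key sits at index 0 of each row.
    labels = torch.zeros(query.size(0), dtype=torch.long, device=query.device)
    return F.cross_entropy(logits, labels)
```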
Please check out our code for experiments on different models:
Social NCE + STGCNN | Social NCE + CrowdNav | Social NCE + Trajectron
Set up the environment following SETUP.md.
Train from scratch:
```bash
bash train_snce.sh && python test.py --prefix snce                        # Social-NCE results
bash train_random_sampling.sh && python test.py --prefix random-sampling  # Social-NCE + random sampling
```
See the generated results_default.csv for detailed results.
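The two training scripts differ in how negative samples are drawn: `train_snce.sh` presumably uses the paper's safety-driven sampling around neighboring agents, while `train_random_sampling.sh` draws negatives at random as an ablation. Below is a hedged sketch of the two strategies, assuming 2D positions; the function names, ring radius, and sample counts are illustrative, not the repo's actual API.

```python
import math
import torch

def safety_driven_negatives(neighbor_pos, radius=0.2, n_angles=8):
    """Sketch: place candidate negative locations on a ring around each
    neighbor's position, i.e. inside the 'discomfort zone' that a
    socially-aware prediction should steer clear of.

    neighbor_pos: (N, 2) positions of neighboring agents
    returns:      (N * n_angles, 2) negative locations
    """
    angles = torch.arange(n_angles, dtype=torch.float32) * (2 * math.pi / n_angles)
    ring = radius * torch.stack([torch.cos(angles), torch.sin(angles)], dim=-1)  # (n_angles, 2)
    return (neighbor_pos[:, None, :] + ring[None, :, :]).reshape(-1, 2)

def random_negatives(center, n_samples=32, scale=2.0):
    """Ablation baseline: negatives drawn uniformly around a reference
    point, ignoring where the neighbors actually are."""
    return center + scale * (2.0 * torch.rand(n_samples, 2) - 1.0)
```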
Test the pretrained model:
```bash
python test.py --mode snce             # our Social-NCE pretrained model
python test.py --mode random-sampling  # our Social-NCE method + random sampling, for ablation study
python test.py --mode baseline         # baseline pretrained model, obtained from the official repo
```
See the generated results_default.csv for detailed results.
The scripts above produce the following results (measured on an NVIDIA GeForce RTX 3090); results may vary slightly across GPU devices. More details will be released soon!
| Scene | Social-STGCNN w/o Ours | | | Social-STGCNN w/ Ours | | |
|---|---|---|---|---|---|---|
| | ADE (m) | FDE (m) | COL (%) | ADE (m) | FDE (m) | COL (%) |
| ETH | 0.732 | 1.223 | 1.33 | 0.664 | 1.224 | 0.61 |
| HOTEL | 0.414 | 0.687 | 3.82 | 0.435 | 0.678 | 3.35 |
| UNIV | 0.489 | 0.912 | 9.11 | 0.473 | 0.879 | 6.44 |
| ZARA1 | 0.333 | 0.525 | 2.27 | 0.325 | 0.515 | 1.02 |
| ZARA2 | 0.303 | 0.480 | 6.86 | 0.289 | 0.482 | 3.37 |
| Average | 0.454 | 0.765 | 4.70 | 0.437 | 0.756 | 2.96 |
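Here ADE/FDE denote the average and final displacement errors in meters, and COL the percentage of test cases with a predicted collision. A minimal sketch of how such metrics can be computed for one trajectory pair follows; the variable names and the collision threshold are illustrative assumptions, not the benchmark's exact evaluation code.

```python
import torch

def ade_fde(pred, gt):
    """Average / final displacement error for one trajectory.

    pred, gt: (T, 2) predicted and ground-truth positions over T steps
    returns:  (ADE, FDE) in the same units as the inputs (meters here)
    """
    dist = torch.linalg.norm(pred - gt, dim=-1)  # (T,) per-step L2 error
    return dist.mean().item(), dist[-1].item()

def collides(pred_a, pred_b, threshold=0.1):
    """Two agents collide if their predicted positions come within
    `threshold` meters of each other at any time step (the threshold
    here is an illustrative value, not the benchmark's setting)."""
    return bool((torch.linalg.norm(pred_a - pred_b, dim=-1) < threshold).any())
```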
If you find this code useful for your research, please cite our paper:
```bibtex
@article{liu2020snce,
  title   = {Social NCE: Contrastive Learning of Socially-aware Motion Representations},
  author  = {Yuejiang Liu and Qi Yan and Alexandre Alahi},
  journal = {arXiv preprint arXiv:2012.11717},
  year    = {2020}
}
```
Our code is built upon the official implementation of Social-STGCNN.