[NeurIPS 2024] Hawk: Learning to Understand Open-World Video Anomalies

This is the official repository for Hawk.

Jiaqi Tang^, Hao Lu^, Ruizheng Wu, Xiaogang Xu, Ke Ma, Cheng Fang,

Bin Guo, Jiangbo Lu, Qifeng Chen and Ying-Cong Chen*

^: Equal contribution. *: Corresponding Author.


Have eyes like a HAWK!

🔍 Motivation - Have eyes like a Hawk!

  • 🚩 Current VAD systems are often limited by their superficial semantic understanding of scenes and minimal user interaction.

  • 🚩 Additionally, the prevalent data scarcity in existing datasets restricts their applicability in open-world scenarios.


📢 Updates

  • ✅ September 26, 2024 - Hawk is accepted at NeurIPS 2024.
  • ✅ July 29, 2024 - We release the Hawk dataset. Check this Google Cloud link to DOWNLOAD it.

▶️ Getting Started

🪒 Installation

💾 Dataset Preparation

  • DOWNLOAD all video datasets from their original sources.

    1. CUHK_Avenue
    2. DoTA
    3. Ped1
    4. Ped2
    5. ShanghaiTech
    6. UBNormal
    7. UCF_Crime
  • Google Drive Link to DOWNLOAD our annotations.

  • Data Structure: each dataset folder contains one annotation file (e.g., CUHK Avenue, DoTA, etc.). The All_Mix directory combines the annotations of all datasets for training and testing.

  • The dataset is organized as follows:

    data
    ├── All_Mix
    │   ├── all_videos_all.json
    │   ├── all_videos_test.json
    │   └── all_videos_train.json
    │    
    ├── CUHK_Avenue
    │   └── Avenue.json
    ├── DoTA
    │   └── DoTA.json
    ├── Ped1
    │   ├── ...
    ├── ...
    └── UCF_Crime
        └── ...
    

    Note: the data paths inside the annotation files should be redefined to match your local directory layout (a helper sketch follows below).
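A minimal sketch of how that path update could be scripted, assuming each annotation JSON is a list of records with a `video` field holding an absolute path (the key name, the old/new path prefixes, and the file layout are assumptions; check one annotation file and adapt the script to the real schema):

```python
import json
from pathlib import Path

# Hypothetical settings -- adjust to your environment and to the real JSON schema.
ANNOTATION_ROOT = Path("data")             # folder holding All_Mix/, CUHK_Avenue/, DoTA/, ...
OLD_PREFIX = "/original/path/to/videos"    # path prefix used in the released annotations (assumed)
NEW_PREFIX = "/your/local/path/to/videos"  # where the downloaded videos actually live

def redefine_paths(json_file: Path) -> None:
    """Rewrite the video path of every record in one annotation file."""
    with json_file.open("r", encoding="utf-8") as f:
        records = json.load(f)

    # The annotations are assumed to be a list of dicts with a "video" field;
    # change the key (or the structure handling) if the real schema differs.
    for record in records:
        if isinstance(record, dict) and "video" in record:
            record["video"] = record["video"].replace(OLD_PREFIX, NEW_PREFIX)

    with json_file.open("w", encoding="utf-8") as f:
        json.dump(records, f, indent=2, ensure_ascii=False)

if __name__ == "__main__":
    for json_file in ANNOTATION_ROOT.rglob("*.json"):
        redefine_paths(json_file)
        print(f"Updated {json_file}")
```

Run it once after downloading both the videos and the annotations, and keep a backup of the original JSON files since the script rewrites them in place.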

🏰 Pretrained Model

🔨 Configuration

Testing

🖥️ Training

Performance

🌐 Citations

The following is a BibTeX reference:

@inproceedings{atang2024hawk,
  title = {Hawk: Learning to Understand Open-World Video Anomalies},
  author = {Tang, Jiaqi and Lu, Hao and Wu, Ruizheng and Xu, Xiaogang and Ma, Ke and Fang, Cheng and Guo, Bin and Lu, Jiangbo and Chen, Qifeng and Chen, Ying-Cong},
  year = {2024},
  booktitle = {Neural Information Processing Systems (NeurIPS)}
}

📧 Connect with Us

If you have any questions, please feel free to send an email to [email protected].