Reference moving camera in main README and add demo.
Also improve moving camera README.
dekked committed Sep 2, 2022
1 parent 97cb9d8 commit 75764ed
Showing 2 changed files with 25 additions and 23 deletions.
5 changes: 4 additions & 1 deletion README.md
@@ -6,6 +6,8 @@ Norfair is a customizable lightweight Python library for real-time 2D object tra

Using Norfair, you can add tracking capabilities to any detector with just a few lines of code.

<img src="/docs/soccer.gif" alt="Tracking soccer players with Norfair and a moving camera." width="500px" />

## Features

- Any detector expressing its detections as a series of `(x, y)` coordinates can be used with Norfair. This includes detectors performing object detection, pose estimation, and keypoint detection (see [examples](#examples--demos)).
@@ -108,7 +110,8 @@ Most tracking demos are showcased with vehicles and pedestrians, but the detecto
### Advanced features

1. [Speed up pose estimation by extrapolating detections](demos/openpose) using [OpenPose](https://github.com/CMU-Perceptual-Computing-Lab/openpose).
2. [Re-identification (ReID) ](demos/reid) of tracked objects using appearance embeddings. This is a good starting point for scenarios with a lot of occlusion, in which the Kalman filter alone would struggle.
2. [Re-identification (ReID)](demos/reid) of tracked objects using appearance embeddings. This is a good starting point for scenarios with a lot of occlusion, in which the Kalman filter alone would struggle.
3. [Accurately track objects even if the camera is moving](demos/camera_motion), by estimating camera motion, which can account for pan, tilt, rotation, movement in any direction, and zoom.

### Benchmarking and profiling

43 changes: 21 additions & 22 deletions demos/camera_motion/README.md
@@ -5,41 +5,40 @@ In this example, we show how to estimate the camera movement in Norfair.
What's the motivation for estimating camera movement?

- When the camera moves, the apparent movement of the objects can be quite erratic and confuse the tracker; by estimating the camera movement we can stabilize the objects and improve tracking.
- By estimating the position of objects in a fixed reference we can correctly calculate their trajectory. This can help you if you are trying to determine when objects enter a predefined zone on the scene or trying to draw their trajectory
- By estimating the position of objects in a fixed reference we can correctly calculate their trajectory. This can help you if you are trying to determine when objects enter a predefined zone in the scene or trying to draw their trajectory.

Keep in mind that for estimating the camera movement we rely on a static background, if the scene is too chaotic with a lot of movement the estimation will lose accuracy. Nevertheless, even when the estimation is incorrect it will not hurt the tracking.
Keep in mind that the estimation of the camera movement works best with a static background. If the scene is too chaotic, with a lot of movement, the estimation will lose accuracy. Nevertheless, even when the estimation is incorrect, it will not hurt the tracking.
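The idea behind stabilization can be sketched in a few lines: once the camera's accumulated displacement is known, detections are shifted into a fixed reference frame before the tracker sees them. This is a minimal illustration with a hypothetical helper, not Norfair's actual API:

```python
import numpy as np

# Illustrative sketch (function name is ours, not Norfair's): compensate
# detected points by the camera's accumulated translation so that the
# tracker reasons about coordinates in a fixed reference frame.
def to_fixed_reference(points, camera_offset):
    """Shift image-space points by the camera's accumulated (dx, dy) in pixels."""
    return points + camera_offset

camera_offset = np.array([30.0, -12.0])   # accumulated pan/tilt in pixels
detection = np.array([[400.0, 250.0]])    # an (x, y) detection in the current frame
print(to_fixed_reference(detection, camera_offset))  # [[430. 238.]]
```

In the fixed reference, an object that is actually stationary stays at the same coordinates even while the camera pans, which is what keeps the tracker's motion model from being confused.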

## First Example - Translation
## Example 1: Translation

This method only works for camera pans and tilts.
This method only works for camera pans and tilts.

![Pan and Tilt](/docs/pan_tilt.png)
<img src="/docs/pan_tilt.png" alt="Pan and tilt" width="350px">

The following video shows on the left we lost the person 4 times while on the right we were able to maintain the tracked object throughout the video:
For an example of the results, see the following video. On the left, the tracker loses the person 4 times (as seen by the increasing id and the changing color of the bounding box), while on the right the tracker maintains the same tracked object throughout the video:

![camera_stabilization](/docs/camera_stabilization.gif)
![Tracking an object with Norfair, with and without camera stabilization.](/docs/camera_stabilization.gif)

> videos generated using command `python demo.py --transformation none --draw-objects --track-boxes --id-size 1.8 --distance-threshold 200 --save video.mp4` and `python demo.py --transformation translation --fixed-camera-scale 2 --draw-objects --track-boxes --id-size 1.8 --distance-threshold 200 --save video.mp4`
> Videos generated using the commands `python demo.py --transformation none --draw-objects --track-boxes --id-size 1.8 --distance-threshold 200 --save <video>.mp4` and `python demo.py --transformation translation --fixed-camera-scale 2 --draw-objects --track-boxes --id-size 1.8 --distance-threshold 200 --save <video>.mp4`
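A common way to estimate a pan/tilt translation is to match background points between consecutive frames and take the median per-axis displacement, which is robust to the minority of points that belong to moving objects. This is a sketch of that idea, not the demo's exact code:

```python
import numpy as np

# Sketch of translation estimation: given matched background points from
# two consecutive frames, the median per-axis displacement approximates the
# camera's pan/tilt in pixels. The median resists outliers from points that
# landed on moving objects rather than the static background.
def estimate_translation(prev_pts, curr_pts):
    return np.median(curr_pts - prev_pts, axis=0)

prev_pts = np.array([[10.0, 10.0], [200.0, 50.0], [120.0, 300.0]])
curr_pts = prev_pts + np.array([5.0, -3.0])  # camera panned right and tilted up
print(estimate_translation(prev_pts, curr_pts))  # [ 5. -3.]
```

In practice the point matches would come from an optical-flow tracker over features detected on the background, and the accumulated translation is what gets applied to stabilize the detections.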
## Second Example - Homographies
## Example 2: Homographies

This method can work with any camera movement, this includes pan, tilt, rotation, traveling in any direction, and zoom.
This method can work with any camera movement, including pan, tilt, rotation, movement in any direction, and zoom.

In the following video, the correct trajectory of the players is drawn even as the camera moves:
In the following video, the players are tracked and their trajectories are drawn, even as the camera moves:

![soccer](/docs/soccer.gif)
![Tracking soccer players with Norfair and a moving camera.](/docs/soccer.gif)

> video generated using command `python demo.py --transformation homography --draw-paths --path-history 150 --distance-threshold 200 --track-boxes --max-points=900 --min-distance=14 --save --model yolov5x --hit-counter-max 3 video.mp4` on a snippet of this [video](https://www.youtube.com/watch?v=CGFgHjeEkbY&t=1200s)
> Video generated using command `python demo.py --transformation homography --draw-paths --path-history 150 --distance-threshold 200 --track-boxes --max-points=900 --min-distance=14 --save --model yolov5x --hit-counter-max 3 <video>.mp4` on a snippet of this [video](https://www.youtube.com/watch?v=CGFgHjeEkbY&t=1200s).
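A homography is a 3×3 matrix that maps image points between two views of a (roughly) planar scene, which is why it can absorb pan, tilt, rotation, and zoom at once. The sketch below shows only how such a matrix is applied to points; the demo itself estimates the matrix from tracked features (this code is illustrative, not the demo's implementation):

```python
import numpy as np

# Applying a homography H to image points: lift to homogeneous coordinates,
# multiply by H, then divide by the last coordinate to return to 2D.
def apply_homography(H, points):
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coords
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# A pure translation of (5, -3) expressed as a homography; a real estimate
# would also encode rotation, perspective, and scale.
H = np.array([[1.0, 0.0,  5.0],
              [0.0, 1.0, -3.0],
              [0.0, 0.0,  1.0]])
print(apply_homography(H, np.array([[100.0, 200.0]])))  # [[105. 197.]]
```

Because every point in the frame is mapped through the same matrix, drawing object trajectories in the fixed reference stays consistent even while the camera travels and zooms.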
## Instructions

## Setup
1. Build and run the Docker container with `./run_gpu.sh`.
2. Copy a video to the `src` folder.
3. Within the container, run with the default parameters:

Build and run the Docker container with ./run_gpu.sh.
```bash
python demo.py <video>.mp4
```

Copy a video to the src folder.

Within the container, run with the default parameters:

`python demo.py <video>.mp4`

For additional settings, you may display the instructions using `python demo.py --help`.
For additional settings, you may display the instructions using `python demo.py --help`.
