Commit c6fea4c

MediaPipe Team authored and jqtang committed
Project import generated by Copybara.
GitOrigin-RevId: 1a0caa03bbf3673dbe772c8045b687c6b6821bcc
1 parent: 259b48e

File tree: 1 file changed (+41, -41 lines)
  • mediapipe/examples/desktop/youtube8m
Diff for: mediapipe/examples/desktop/youtube8m/README.md

@@ -59,62 +59,62 @@

1. Download the YT8M dataset

    For example, download one shard of the training data:

    ```bash
    curl http://us.data.yt8m.org/2/frame/train/trainpj.tfrecord --output /tmp/mediapipe/trainpj.tfrecord
    ```

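    The curl call above writes into /tmp/mediapipe, which curl will not create on its own. A minimal sketch, preparing the directory first and then sanity-checking the downloaded shard (same path as above):

    ```bash
    # Create the working directory used throughout these steps.
    mkdir -p /tmp/mediapipe

    # After the download, confirm the shard exists and is non-empty.
    ls -lh /tmp/mediapipe/trainpj.tfrecord
    ```
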
2. Copy the baseline model [(model card)](https://drive.google.com/file/d/1xTCi9-Nm9dt2KIk8WR0dDFrIssWawyXy/view) to local.

    ```bash
    curl -o /tmp/mediapipe/yt8m_baseline_saved_model.tar.gz data.yt8m.org/models/baseline/saved_model.tar.gz
    tar -xvf /tmp/mediapipe/yt8m_baseline_saved_model.tar.gz -C /tmp/mediapipe
    ```

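    To see exactly what the archive unpacks into /tmp/mediapipe, a quick sketch using tar's listing mode (no extraction, just a preview of the contents):

    ```bash
    # List the archive contents without extracting.
    tar -tzf /tmp/mediapipe/yt8m_baseline_saved_model.tar.gz | head
    ```
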
3. Build and run the inference binary.

    ```bash
    bazel build -c opt --define='MEDIAPIPE_DISABLE_GPU=1' \
      mediapipe/examples/desktop/youtube8m:model_inference

    GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/youtube8m/model_inference \
      --calculator_graph_config_file=mediapipe/graphs/youtube8m/yt8m_dataset_model_inference.pbtxt \
      --input_side_packets=tfrecord_path=/tmp/mediapipe/trainpj.tfrecord,record_index=0,desired_segment_size=5 \
      --output_stream=annotation_summary \
      --output_stream_file=/tmp/summary \
      --output_side_packets=yt8m_id \
      --output_side_packets_file=/tmp/yt8m_id
    ```

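    This run writes the annotation summary stream to /tmp/summary and the video's YT8M ID side packet to /tmp/yt8m_id. A hedged sketch for peeking at both once the graph finishes; the exact file layout is whatever the graph's writer calculators emit:

    ```bash
    # Show the start of the annotation summary and the recovered video ID.
    head /tmp/summary
    cat /tmp/yt8m_id
    ```
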
### Steps to run the YouTube-8M model inference graph with Web Interface

1. Copy the baseline model [(model card)](https://drive.google.com/file/d/1xTCi9-Nm9dt2KIk8WR0dDFrIssWawyXy/view) to local.

    ```bash
    curl -o /tmp/mediapipe/yt8m_baseline_saved_model.tar.gz data.yt8m.org/models/baseline/saved_model.tar.gz
    tar -xvf /tmp/mediapipe/yt8m_baseline_saved_model.tar.gz -C /tmp/mediapipe
    ```

2. Build the inference binary.

    ```bash
    bazel build -c opt --define='MEDIAPIPE_DISABLE_GPU=1' \
      mediapipe/examples/desktop/youtube8m:model_inference
    ```

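    Bazel drops the binary under bazel-bin, mirroring the target's package path. A quick check that the build succeeded before starting the server:

    ```bash
    # Verify the inference binary was built and is executable.
    test -x bazel-bin/mediapipe/examples/desktop/youtube8m/model_inference && echo "binary ready"
    ```
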
3. Run the Python web server.

    Note: the web server depends on absl-py (`pip install absl-py`).

    ```bash
    python mediapipe/examples/desktop/youtube8m/viewer/server.py --root `pwd`
    ```

    Navigate to localhost:8008 in a web browser.

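    If the page does not load, a minimal check from a second terminal that the viewer is answering on port 8008 (the port used above):

    ```bash
    # Expect an HTTP status code such as 200 if the server is up.
    curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8008
    ```
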
### Steps to run the YouTube-8M model inference graph with a local video

@@ -130,15 +130,15 @@

3. Build and run the inference binary.

    ```bash
    bazel build -c opt --define='MEDIAPIPE_DISABLE_GPU=1' \
      mediapipe/examples/desktop/youtube8m:model_inference

    # segment_size is the length, in seconds, of each window of frames.
    # overlap is the number of seconds that adjacent segments share.
    GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/youtube8m/model_inference \
      --calculator_graph_config_file=mediapipe/graphs/youtube8m/local_video_model_inference.pbtxt \
      --input_side_packets=input_sequence_example_path=/tmp/mediapipe/output.tfrecord,input_video_path=/absolute/path/to/the/local/video/file,output_video_path=/tmp/mediapipe/annotated_video.mp4,segment_size=5,overlap=4
    ```

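    As the comments above describe, each segment covers segment_size seconds and shares overlap seconds with its neighbor, so with segment_size=5 and overlap=4 consecutive segments start 1 second apart. A small illustrative sketch of the first few windows (not part of the pipeline):

    ```bash
    # Print the first five [start, end) windows implied by segment_size=5, overlap=4.
    segment_size=5; overlap=4; step=$((segment_size - overlap))
    for start in $(seq 0 "$step" $((4 * step))); do
      echo "segment: ${start}s to $((start + segment_size))s"
    done
    ```
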
4. View the annotated video.
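
    A minimal sketch for opening the annotated output from the previous step, assuming ffplay (part of ffmpeg) or another video player is available on the machine:

    ```bash
    # Play the annotated video written to output_video_path above.
    ffplay /tmp/mediapipe/annotated_video.mp4
    ```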
