Commit 5acd190

Merge pull request #1 from gpac/fix/lists-formatting
fix list formatting

2 parents da8c898 + 8d96311


41 files changed, +122 -0 lines changed

docs/Build/Upgrading.md

+1
@@ -20,6 +20,7 @@ If you build GPAC directly in the source tree (i.e., running `./configure && mak
 # Out of source tree building
 
 To avoid the issue of cleaning dependencies, it is safer to have one dedicated build directory for each branch you test:
+
 - `mkdir bin/master && cd bin/master && ../../configure && make -j`
 - `mkdir bin/somebranch && cd bin/master && git checkout somebranch && ../../configure && make -j`

docs/Build/build/GPAC-Build-Guide-for-Linux.md

+3
@@ -1,6 +1,7 @@
 _Preliminary notes: the following instructions will be based on Ubuntu and Debian. It should be easily applicable to other distributions, the only changes should be name of the packages to be installed, and the package manager used._
 
 GPAC is a modular piece of software which depends on third-party libraries. During the build process it will try to detect and leverage the installed third-party libraries on your system. Here are the instructions to:
+
 * build GPAC easily (recommended for most users) from what's available on your system,
 * build a minimal 'MP4Box' and 'gpac' (only contains GPAC core features like muxing and streaming),
 * build a complete GPAC by rebuilding all the dependencies manually.
@@ -47,6 +48,7 @@ _If you are upgrading from a previous version (especially going from below 1.0.0
 ## Use
 
 You can either:
+
 - `sudo make install` to install the binaries,
 - or use the `MP4Box` or `gpac` binary in `gpac_public/bin/gcc/` directly,
 - or move/copy it somewhere manually.
@@ -104,6 +106,7 @@ make
 4. Use
 
 You can either:
+
 - `sudo make install` to install the binaries,
 - or use the `MP4Box` or `gpac` binary in `gpac_public/bin/gcc/` directly,
 - or move/copy it somewhere manually.

docs/Filters/Rearchitecture.md

+4
@@ -36,6 +36,7 @@ The following lists the core principles of the re-architecture. Read the [genera
 
 # Filter Design Principles
 A filter object obeys the following principles:
+
 - may accept (consume) any number of data stream (named `PID` in this architecture)
 - may produce any number of PIDs
 - can have its input PIDs reconfigured at run time or even removed
@@ -65,6 +66,7 @@ The filter session main features are:
 - handle filters capability negotiation, usually inserting a filter chain to match the desired format
 
 The filter session operates in a semi-blocking mode:
+
 - it prevents filters in blocking mode (output PIDs buffers full) to operate
 - it will not prevent a running filter to dispatch a packet; this greatly simplifies demultiplexers writing
 
@@ -82,6 +84,7 @@ These properties may also be overloaded by the user, e.g. to assign a ServiceID
 # Media Streams internal representation
 
 In order to be able to exchange media stream data between filters, a unified data format had to be set, as follows:
+
 - a frame is defined as a single-time block of data (Access Unit in MPEG terminology), but can be transferred in multiple packets
 - frames or fragments of frames are always transferred in processing order (e.g. decoding order for MPEG video)
 - multiplexed media data is identified as `file` data, where a frame is a complete file.
@@ -155,6 +158,7 @@ gpac -i source.mp4 reframer:xround=closest:splitrange:xs=2:xe=4 -o dest.mp4
 All other functionalities of MP4Box are not available through a filter session. Some might make it one day (BIFS encoding for example), but most of them are not good candidates for filter-based processing and will only be available through MP4Box (track add/remove to existing file, image item add/remove to existing file, file hinting, ...).
 
 __Note__ For operations using a filter session in MP4Box, it is possible to view some information about the filter session:
+
 - -fstat: this will print the statistics per filter and per PID of the session
 - -fgraph: this will print the connections between the filters in the session
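The two flags documented in this hunk can be tried on any MP4Box run that spawns a filter session, for instance a DASH segmentation (a sketch only; file names are placeholders):

```
# print per-filter and per-PID statistics of the underlying filter session
MP4Box -dash 1000 source.mp4 -fstat

# print the filter connection graph of the same session
MP4Box -dash 1000 source.mp4 -fgraph
```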

docs/Howtos/avmix_tuto.md

+13
@@ -105,6 +105,7 @@ _Note_
 A sequence not attached with a scene will not be visible nor played, even if active.
 
 Now let's add :
+
 - a logo
 - a bottom rectangle with a gradient
 - some text
@@ -131,6 +132,7 @@ In the following examples, we always use [relative coordinates system](avmix#coo
 ## Animating a scene
 
 Scenes can be animated through timer objects providing value interpolation instructions. A timer provides:
+
 - a start time, stop time and a loop count
 - a duration for the interpolation period
 - a set of animation values and their targets
@@ -167,6 +169,7 @@ It can be tedious to apply the same transformation (matrix, active, ...) on a su
 The simplest way to do this is to group scenes together, and transform the group.
 
 The following animates:
+
 - the video from 90% to 100% , sticking it to the top-left corner and animated the rounded rectangle effect
 - the overlay group position from visible to hidden past the bottom-right corner
 
@@ -270,6 +273,7 @@ This works with video scenes too:
 
 You will at some point need to chain some videos. AVMix handles this through `sequence` objects describing how sources are to be chained.
 Sequences are designed to:
+
 - take care of media prefetching to reduce loading times
 - perform transitions between sources, activating / prefetching based on the desired transition duration
 
@@ -297,6 +301,7 @@ AVMix handles this by allowing scenes to use more than one sequence as input, an
 _Note: Currently, defined scenes only support 0, 1 or 2 input sequences_
 
 This is done at scene declaration through:
+
 - a `mix` object, describing a transition
 - a `mix_ratio` property, describing the transition ratio
 
@@ -347,6 +352,7 @@ Specifying an identifier on the sequence avoids that.
 ## Live mode
 
 Live mode works like offline mode, with the following additions:
+
 - detection and display of signal lost or no input sequences
 - `sequence` and `timer` start and stop time can be expressed as UTC dates (absolute) or current UTC offset
 
@@ -368,6 +374,7 @@ You should now see "no input" message when playing. Without closing the player,
 ]
 ```
 And the video sequence will start ! You can use for start and stop time values:
+
 - "now": will resolve to current UTC time
 - integer: will resolve to current UTC time plus the number of seconds specified by the integer
 - date: will use the date as the start/stop time
@@ -448,13 +455,15 @@ This is problematic if you use AVMix to generate a live feed supposed to be up 2
 To prevent this, the filter allows launching the sources as dedicated child processes. When the child process exits unexpectedly, or when source data is no longer received, the filter can then kill and relaunch the child process.
 
 There are three supported methods for this:
+
 - running a gpac instance over a pipe
 - running a gpac instance over TCP
 - running any other process capable of communicating with gpac
 
 The declaration is done at the `sourceURL` level through the port option.
 
 For each of these mode, the `keep_alive` option is used to decide if the child process shall be restarted:
+
 - if no more data is received after `rtimeout`.
 - stream is in end of stream but child process exited with an error code greater than 2.
 
@@ -598,6 +607,7 @@ return 0;
 ```
 
 Your module can also control the playlist through several functions:
+
 - remove_element(id_or_elem): removes a scene, group or sequence from playlist
 - parse_element(JSON_obj): parses a root JSON object and add to the playlist
 - parse_scene(JSON_obj, parent_group): parses a scene from its JSON object and add it to parent_group, or to root if parent_group is null
@@ -729,12 +739,14 @@ In this mode, the texturing parameters used by the offscreen group can be modifi
 AVMix can use a global alpha mask (covering the entire output frame) for draw operations, through the [mask](avmix#scene-mask) scene module.
 
 This differs from using an offscreen group as an alpha operand input to [shape](avmix#scene-shape) as discussed above as follows:
+
 - the mask is global and not subject to any transformation
 - the mask is always cleared at the beginning of a frame
 - the mask is only one alpha channel
 - the mask operations can be accumulated between draws
 
 The following example shows using a mask in regular mode:
+
 - enable and clear mask
 - draw a circle with alpha 0.4
 - use mask and draw video, which will be blended only where the circle was drawn using alpha= 0.4
@@ -768,6 +780,7 @@ The following example shows using a mask in regular mode:
 The mask can also be updated while drawing using a record mode. In this mode, the mask acts as a binary filter, any pixel drawn to the mask will no longer get drawn.
 
 The following draws:
+
 - an ellipse with first video at half opacity, appearing blended on the background
 - the entire second video at full opacity, which will only appear where mask was not set

docs/Howtos/dash/DASH-intro.md

+4
@@ -5,11 +5,13 @@ GPAC has extended support for MPEG-DASH and HLS content generation and playback.
 Basics concepts and terminology of MPEG-DASH are explained [here](DASH-basics) and, and the same terms are usually used in GPAC for both DASH and HLS.
 
 For more information on content generation:
+
 - read MP4Box [DASH options](mp4box-dash-opts)
 - read the [dasher](dasher) filter help
 - check the dash and HLS scripts in the GPAC [test suite](https://github.com/gpac/testsuite/tree/filters/scripts)
 
 For more information on content playback:
+
 - read the [dashin](dashin) filter help, used whenever a DASH or HLS session is read.
 - check the dash and HLS scripts in the GPAC [test suite](https://github.com/gpac/testsuite/tree/filters/scripts)
 
@@ -19,6 +21,7 @@ If you generate your content with an third-party application such as ffmpeg, mak
 When using GPAC, this is usually ensure by using the `fintra` option.
 
 GPAC can be used to generate both static and live DASH/HLS content. For live cases, GPAC can expose the created files:
+
 - directly through disk
 - through its own HTTP server
 - by pushing them to a remote HTTP server
@@ -28,6 +31,7 @@ We recommend reading the [HTTP server](httpout) filter help, and looking at the
 
 ## Content Playback
 GPAC comes with a various set of adaptation algorithms:
+
 - BBA0, BOLA, basic throughput (called `conventional` in the literature)
 - Custom throughput-based (`gbuf`) and buffer-based (`grate`) algorithms
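To experiment with the algorithms listed in this hunk, the adaptation logic can be selected on the DASH reader (a sketch with a placeholder URL; it assumes the [dashin](dashin) `algo` option, check `gpac -h dashin` for the exact algorithm names available in your build):

```
# play a DASH session using the buffer-based `grate` adaptation logic
gpac -i http://example.com/live.mpd:algo=grate vout
```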

docs/Howtos/dash/Fragmentation,-segmentation,-splitting-and-interleaving.md

+1
@@ -20,6 +20,7 @@ Segmentation (`-dash`) is the process of creating segments, parts of an original
 Last, MP4Box can split (-split) a file and create individual playable files from an original one. It does not use segmentation in the above sense, it removes fragmentation and can use interleaving.
 
 Some examples of MP4Box usages:
+
 - Rewrites a file with an interleaving window of 1 sec.
 
 `MP4Box -inter 1000 file.mp4`

docs/Howtos/dash/HAS-advanced.md

+2
@@ -15,6 +15,7 @@ Record the session in fragmented MP4
 gpac -i $HAS_URL -o grab/record.mp4:frag
 ```
 Note that we specify [frag](mp4mx#store) option for the generated MP4 so that:
+
 - we don't have a long multiplexing process at the end
 - if anything goes wrong (crash / battery dead / ...), we still have a file containing all media until the last written fragment.
 
@@ -80,6 +81,7 @@ gpac -i $HAS_URL dashin:forward=file -o route://225.1.1.0:6000
 
 The [DASH reader](dashin) can be configured through [-forward](dashin#forward) to insert segment boundaries in the media pipeline - see [here](dashin#segment-bound-modes) for more details.
 Two variants of this mode exist:
+
 - `segb`: this enables `split_as`, DASH cue insertion (segment start signal) and fragment bounds signalling
 - `mani`: same as `segb` and also forward manifests (MPD, M3U8) as packet properties.
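For example, the `route` forwarding command shown in the context above can be switched to the `segb` variant; only the forward mode changes:

```
gpac -i $HAS_URL dashin:forward=segb -o route://225.1.1.0:6000
```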

docs/Howtos/dash/HEVC-Tile-based-adaptation-guide.md

+4
@@ -75,10 +75,12 @@ You can now playback your MPD using GPAC, and have fun with the different adapta
 ## Live setup
 
 If you want to produce a live feed of tiled video, you can either:
+
 - produce short segments, package them and dash them using `-dash-live`, `dash-ctx` and `-subdur`, see discussion [here](https://github.com/gpac/gpac/issues/1648)
 - produce a live session with a [tilesplit](tilesplit) filter.
 
 GPAC does not have a direct wrapper for Kvazaar, but you can either:
+
 - use a FFmpeg build with Kvazaar enabled (`--enable-libkvazaar` in ffmpeg configure) - check GPAC support using `gpac -h ffenc:libkvazaar`
 - use an external grab+Kvazaar encoding and pipe its output into GPAC.
 
@@ -134,6 +136,7 @@ gpac
 
 
 The resulting filter graph is quite fun (use `-graph` to check it) and shows:
+
 - only one (or 0 depending on your webcam formats) pixel converter filter is used in the chain to feed both Kvazaar instances
 - all tile PIDs (and only them) connecting to the dasher filter
 - 21 output PIDs of the dasher: one for MPD, 2 x (1+3x3) media PIDs.
@@ -179,6 +182,7 @@ In 2D playback, the tile adaptation logic (for ROI for example) is controlled b
 
 The compositor can use gaze information to automatically decrease the quality of the tiles not below the gaze.
 The gaze information can be:
+
 - emulated via mouse using [--sgaze](compositor#sgaze) option.
 - signaled through filter updates on the [gazer_enabled](compositor#gazer_enabled) [gaze_x](compositor#gaze_x) [gaze_y](compositor#gaze_y)

docs/Howtos/dash/HEVC-Tile-multi-resolution-adaptation-guide.md

+1
@@ -68,6 +69,7 @@ Check the [HEVC Tile-based adaptation guide](HEVC-Tile-multi-resolutati
 # Content Playback
 
 The logic of content playback is as follows:
+
 - the MPD indicates SRD information and a GPAC extension for mergeable bitstream
 - when the compositor is used, the [hevcmerge](hevcmerge) filter is automatically created to reassemble the streams
 - otherwise (using vout), each PID is declared as an alternative to the other

docs/Howtos/dash/LL-DASH.md

+2
@@ -19,6 +19,7 @@ And when using gpac, you can enable real-time reporting of filters activities us
 
 
 The `gpac` application can be used for dashing whenever `MP4Box` is used, but the opposite is not true. Especially MP4Box cannot:
+
 - use complex custom filter chains while dashing, such as transcoding in several qualities
 - produce two DASH sessions at the same time
 
@@ -169,6 +170,7 @@ gpac -i source1 -i source2 reframer:rt=on -o http://ORIG_SERVER_IP_PORT/live.mpd
 
 We will now use a live source (webcam), encode it in two qualities, DASH the result and push it to a remote server. Please check the [encoding howto](encoding) first.
 Compared to what we have seen previously, we only need to modify the input part of the graph:
+
 - take as a live source the default audio video grabbed by the [libavdevice](ffavin) filter
 - rescale the video as 1080p and 720p
 - encode the rescaled videos at 6 and 3 mbps
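A single-quality sketch of that modified input part could look as follows (a hypothetical command, not the howto's exact one: the `av://` capture source and encoder options must be adapted to your system, and the 1080p/6 mbps branch is built the same way as the 720p one shown):

```
# grab the default audio/video device, rescale to 720p, encode at 3 mbps
# with a forced intra every 2s, and push the live DASH session
gpac -i av:// ffsws:osize=1280x720 c=avc:b=3M:fintra=2 -o http://ORIG_SERVER_IP_PORT/live.mpd:dmode=dynamic
```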

docs/Howtos/dash/LL-HLS.md

+1
@@ -9,6 +9,7 @@ In this howto, we will study various setups for HLS live streaming in low latenc
 The same setup for configuring segments and CMAF chunks is used as the [DASH low latency](LL-DASH#dash-low-latency-setup) setup.
 
 When you have low-latency producing of your HLS media segments, you need to indicate to the client how to access LL-HLS `parts` (CMAF chunks) while they are produced. LL-HLS offers two possibilities to describe these parts in the manifest:
+
 - file mode: advertise the chunks as dedicated files, i.e. each chunk will create its own file. This requires double storage for segments close to the live edge, increases disk IOs and might not be very practical if you setup a PUSH origin (twice the bandwidth is required)
 - byte range mode: advertise the chunks as byte range of a media file. If that media file is the full segment being produced (usually the case), this does not induce bandwidth increase or extra disk IOs.
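As a sketch, the choice between these two modes is typically made when generating the session (a hypothetical command assuming the dasher's `llhls` option and a placeholder SDP source; check `gpac -h dash` for the exact values supported by your build):

```
# hypothetical sketch: advertise LL-HLS parts as separate files
gpac -i live.sdp -o live.m3u8:segdur=2:cdur=0.2:llhls=sf

# hypothetical sketch: advertise LL-HLS parts as byte ranges of the segment
gpac -i live.sdp -o live.m3u8:segdur=2:cdur=0.2:llhls=br
```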

docs/Howtos/dash/cmaf.md

+2
@@ -5,6 +5,7 @@ GPAC can be used to generate DASH or HLS following the CMAF specification.
 
 CMAF defines two structural brands `cmfc`and `cmf2` for ISOBMFF-segmented content.
 The `cmfc` brand constraints:
+
 - some default values in ISOBMFF boxes
 - a single media per file
 - a single track fragment per movie fragment (`moof`)
@@ -16,6 +17,7 @@ The `cmfc` brand constraints:
 
 
 The `cmf2`brand further restrict the `cmfc` brand for video tracks:
+
 - no edit list shall be used
 - negative composition offset (`trun` version 1) shall be used
 - sample default values shall be repeated in each track fragment
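To request one of the two brands at generation time, a sketch could be (it assumes the dasher's `cmaf` option; see `gpac -h dash` for the accepted values):

```
# generate DASH content constrained by the cmf2 structural brand
gpac -i source.mp4 -o dest.mpd:cmaf=cmf2
```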

docs/Howtos/dash/dash_transcoding.md

+1
@@ -5,6 +5,7 @@ In this howto, we will study various setups for DASH transcoding.
 Please make sure you are familiar with [DASH terminology](DASH-basics) before reading.
 
 It is likely that your source media is not properly encoded for DASH or HLS delivery, most likely because:
+
 - openGOPs are used
 - key-frame position do not match between your different qualities
 - key-frame intervals are not constant

docs/Howtos/dash/hls.md

+1
@@ -37,6 +37,7 @@ This will generate `live.m3u8`, `video.m3u8` and `audio.m3u8`
 # Renditions
 ## Grouping
 When several renditions are possible for a set of inputs, the default behavior is as follows:
+
 - if video is present, it is used as the main content
 - otherwise, audio is used as the main content

docs/Howtos/dynamic_rc.md

+2
@@ -12,6 +12,7 @@ In this example we will use RTP as delivery mechanism and monitor loss rate of c
 ## RTP reader
 
 The reader is a regular video playback from RTP (using SDP as input). We will:
+
 - locate the `rtpin` filter in the chain, i.e. the first filter after the `fin`filter used for SDP access
 - update every 2 second the `loss_rate`option of the `rtpin` filter: this will force the loss ratio in RTCP Receiver Reports, but will not drop any packet at the receiver side
 
@@ -77,6 +78,7 @@ gpac.close()
 ## Encoder and RTP sender
 
 The encoder consists in a source (here a single video file playing in loop), an AVC encoder and an RTP output. We will:
+
 - locate the `rtpout` filter in the chain, i.e. the first filter before the `fout` filter used for SDP output
 - monitor every 2 second the statistics of the input PID of `rtpout` to get the real-time measurements reported by RTCP
 - adjust encoder max rate based on the percentage of loss packets

docs/Howtos/encoding.md

+1
@@ -67,6 +67,7 @@ The above command will encode the video track in `source.mp4` into AVC|H264 at
 ```gpac -i source.mp4 c=avc::x264-params=no-mbtree:sync-lookahead=0::profile=baseline -o test.avc```
 
 The above command will encode the video track in `source.mp4` into AVC|H264 and pass two options to ffmpeg encoder:
+
 - `x264-params`, with value `no-mbtree:sync-lookahead=0`
 - `profile`, with value `baseline`

docs/Howtos/encryption/encryption-filters.md

+1
@@ -66,6 +66,7 @@ Another possibility is to define the `CryptInfo` PID property rather than using
 gpac -i udp://localhost:1234/:#CrypTrack=(audio)drm_audio.xml,(video)drm_video.xml cecrypt -o dest.mpd:profile=live:dmode=dynamic
 ```
 This example assigns:
+
 - a `CryptInfo` property to `drm_audio.xml` for PIDs of type audio
 - a `CryptInfo` property to `drm_video.xml` for PIDs of type video
 - no `CryptInfo` property for other PIDs

docs/Howtos/filters-oneliners.md

+4
@@ -1,10 +1,12 @@
 # Foreword
 
 This page contains one-liners illustrating the many possibilities of GPAC filters architecture. For a more detailed information, it is highly recommended that you read:
+
 - the [general concepts](filters_general) page
 - the [gpac application](gpac_general) help
 
 To get a better understanding of each command illustrated in this case, it is recommended to:
+
 - run the same command with `-graph` specified to see the filter graph associated
 - read the help of the different filters in this graph using `gpac -h filter_name`
 
@@ -13,6 +15,7 @@ Whenever an option is specified, e.g. `dest.mp4:foo`, you can get more info and
 The filter session is by default quiet, except for warnings and error reporting. To get information on the session while running, use [-r](gpac_general#r) option. To get more runtime information, use the [log system](core_logs).
 
 Given the configurable nature of the filter architecture, most examples given in one context can be reused in another context. For example:
+
 - from the dump examples:
 ```
 gpac -i source reframer:saps=1 -o dump/$num$.png
@@ -34,6 +37,7 @@ _NOTE The command lines given here are usually using a local file for source or
 _Reminder_
 Most filters are never specified at the prompt, they are dynamically loaded during the graph resolution.
 GPAC filters can use either:
+
 - global options, e.g. `--foo`, applying to each instance of any filter defining the `foo` option,
 - local options to a given filter and any filters dynamically loaded, e.g. `:foo`. This is called [argument inheriting](filters_general#arguments-inheriting).
