Advanced Examples
- Rapid image capture
- Raw video from resizer
- Raw video from splitter
- Raw video from resizer with splitter component
- Encode / Decode from Stream - Image
- Encode / Decode from Stream - Video
- Static render overlay
- FFmpeg - RTMP streaming
- FFmpeg - Raw video convert
- FFmpeg - Images to video
For FFmpeg functionality, you will need to install the latest version of FFmpeg from source - do not install it from the Raspbian repositories, as those builds lack H.264 support. A guide to installing FFmpeg from source, including the H.264 codec, can be found here.
Rapid image capture
By utilising the camera's video port, we are able to retrieve image frames at a much higher rate than with the conventional still port. Images captured via the video port are of lower quality and do not support EXIF.
public async Task TakePictureFromVideoPort()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var splitter = new MMALSplitterComponent(null))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler, continuousCapture: true))
    using (var nullSink = new MMALNullSinkComponent())
    {
        cam.ConfigureCameraSettings();

        var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 90);

        // Create our component pipeline.
        imgEncoder.ConfigureOutputPort(portConfig);

        cam.Camera.VideoPort.ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(imgEncoder);
        cam.Camera.PreviewPort.ConnectTo(nullSink);

        // Camera warm up time
        await Task.Delay(2000);

        CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(15));

        // Process images for 15 seconds.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Raw video from resizer
Available in v0.5.1
The resizer component can adjust the resolution of frames coming from the camera's video port, allowing you to record raw YUV420 frames. The resizer accepts an optional port type to specify what type of port you want to use on its output. In the example below, we pass in a VideoPort type which, as the name suggests, is specifically used for video capture; if no type is passed in, the resizer will default to StillPort for still image captures.
public async Task RecordVideoDirectlyFromResizer()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var resizer = new MMALResizerComponent(vidCaptureHandler, typeof(VideoPort)))
    using (var preview = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        // Use the resizer to resize 1080p to 640x480.
        var portConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, 640, 480, 0, 0, 0, false, null);

        resizer.ConfigureOutputPort(portConfig);

        // Create our component pipeline.
        cam.Camera.VideoPort.ConnectTo(resizer);
        cam.Camera.PreviewPort.ConnectTo(preview);

        // Camera warm up time
        await Task.Delay(2000);

        CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(15));

        // Record video for 15 seconds.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Raw video from splitter
Available in v0.5.1
The splitter component can also be used to record raw video frames. The splitter accepts an optional port type to specify what type of port you want to use on all of its output ports. In the example below, we pass in a VideoPort type which, as the name suggests, is specifically used for video capture; if no type is passed in, the splitter will simply act as a pass-through component.
public async Task RecordVideoDirectlyFromSplitter()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler2 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler3 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler4 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var preview = new MMALVideoRenderer())
    using (var splitter = new MMALSplitterComponent(new[] { vidCaptureHandler, vidCaptureHandler2, vidCaptureHandler3, vidCaptureHandler4 }, typeof(VideoPort)))
    {
        cam.ConfigureCameraSettings();

        var splitterPortConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, 0, 0, 0, null);

        // Create our component pipeline.
        splitter.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, cam.Camera.VideoPort);
        splitter.ConfigureOutputPort(0, splitterPortConfig);
        splitter.ConfigureOutputPort(1, splitterPortConfig);
        splitter.ConfigureOutputPort(2, splitterPortConfig);
        splitter.ConfigureOutputPort(3, splitterPortConfig);

        cam.Camera.VideoPort.ConnectTo(splitter);
        cam.Camera.PreviewPort.ConnectTo(preview);

        // Camera warm up time
        await Task.Delay(2000);

        CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(15));

        // Record video for 15 seconds.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Raw video from resizer with splitter component
Available in v0.5.1
You can combine the two previous examples by using the resizer component together with the splitter. By combining the components, you can resize up to four separate raw video streams, which adds a lot of flexibility to your application.
public async Task RecordVideoDirectlyFromResizerWithSplitterComponent()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler2 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler3 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var vidCaptureHandler4 = new VideoStreamCaptureHandler("/home/pi/videos/tests", "raw"))
    using (var preview = new MMALVideoRenderer())
    using (var splitter = new MMALSplitterComponent())
    using (var resizer = new MMALResizerComponent(vidCaptureHandler, typeof(VideoPort)))
    using (var resizer2 = new MMALResizerComponent(vidCaptureHandler2, typeof(VideoPort)))
    using (var resizer3 = new MMALResizerComponent(vidCaptureHandler3, typeof(VideoPort)))
    using (var resizer4 = new MMALResizerComponent(vidCaptureHandler4, typeof(VideoPort)))
    {
        cam.ConfigureCameraSettings();

        var splitterPortConfig = new MMALPortConfig(MMALEncoding.OPAQUE, MMALEncoding.I420, 0, 0, 0, null);
        var portConfig = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, 1024, 768, 0, 0, 0, false, DateTime.Now.AddSeconds(20));
        var portConfig2 = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, 800, 600, 0, 0, 0, false, DateTime.Now.AddSeconds(20));
        var portConfig3 = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, 640, 480, 0, 0, 0, false, DateTime.Now.AddSeconds(15));
        var portConfig4 = new MMALPortConfig(MMALEncoding.I420, MMALEncoding.I420, 320, 240, 0, 0, 0, false, DateTime.Now.AddSeconds(20));

        // Configure the splitter and resizer ports.
        splitter.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, cam.Camera.VideoPort);
        splitter.ConfigureOutputPort(0, splitterPortConfig);
        splitter.ConfigureOutputPort(1, splitterPortConfig);
        splitter.ConfigureOutputPort(2, splitterPortConfig);
        splitter.ConfigureOutputPort(3, splitterPortConfig);

        resizer.ConfigureOutputPort(portConfig);
        resizer2.ConfigureOutputPort(portConfig2);
        resizer3.ConfigureOutputPort(portConfig3);
        resizer4.ConfigureOutputPort(portConfig4);

        // Create our component pipeline.
        cam.Camera.VideoPort.ConnectTo(splitter);
        splitter.Outputs[0].ConnectTo(resizer);
        splitter.Outputs[1].ConnectTo(resizer2);
        splitter.Outputs[2].ConnectTo(resizer3);
        splitter.Outputs[3].ConnectTo(resizer4);
        cam.Camera.PreviewPort.ConnectTo(preview);

        // Camera warm up time
        await Task.Delay(2000);

        await cam.ProcessAsync(cam.Camera.VideoPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Encode / Decode from Stream - Image
MMALSharp provides the ability to encode and decode images fed from streams. It supports the GIF, BMP, JPEG and PNG file formats, and decoding must be carried out to one of the following pixel formats:
- JPEG -> YUV420/422 (I420/422)
- GIF -> RGB565 (RGB16)
- BMP/PNG -> RGBA
Encode
public async Task EncodeFromFilestream()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var stream = File.OpenRead("/home/pi/raw_jpeg_decode.raw"))
    using (var imgCaptureHandler = new TransformStreamCaptureHandler(stream, "/home/pi/images/", "raw"))
    using (var imgEncoder = new MMALImageFileEncoder(imgCaptureHandler))
    {
        var inputPortConfig = new MMALPortConfig(MMALEncoding.I420, null, 2560, 1920, 0, 0, 0, true, null);
        var outputPortConfig = new MMALPortConfig(MMALEncoding.BMP, MMALEncoding.I420, 2560, 1920, 0, 0, 0, true, null);

        // Create our component pipeline.
        imgEncoder.ConfigureInputPort(inputPortConfig)
                  .ConfigureOutputPort(outputPortConfig);

        await imgEncoder.Convert();

        Console.WriteLine("Finished");
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Decode
public async Task DecodeFromFilestream()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var stream = File.OpenRead("/home/pi/test.jpg"))
    using (var imgCaptureHandler = new TransformStreamCaptureHandler(stream, "/home/pi/images/", "raw"))
    using (var imgDecoder = new MMALImageFileDecoder(imgCaptureHandler))
    {
        var inputPortConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 2560, 1920, 0, 0, 0, true, null);
        var outputPortConfig = new MMALPortConfig(MMALEncoding.I420, null, 2560, 1920, 0, 0, 0, true, null);

        // Create our component pipeline.
        imgDecoder.ConfigureInputPort(inputPortConfig)
                  .ConfigureOutputPort(outputPortConfig);

        await imgDecoder.Convert();

        Console.WriteLine("Finished");
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Encode / Decode from Stream - Video
You can also encode and decode video files fed from streams in MMALSharp.
Encode
public async Task EncodeVideoFromFilestream()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var stream = File.OpenRead("/home/pi/videos/decoded_rgb.raw"))
    using (var vidCaptureHandler = new TransformStreamCaptureHandler(stream, "/home/pi/videos/", "h264"))
    using (var vidEncoder = new MMALVideoFileEncoder(vidCaptureHandler))
    {
        var inputPortConfig = new MMALPortConfig(MMALEncoding.RGB16, null, 1280, 720, 25, 0, 1300000, true, null);
        var outputPortConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.RGB16, 1280, 720, 25, 0, 1300000, true, null);

        // Create our component pipeline.
        vidEncoder.ConfigureInputPort(inputPortConfig)
                  .ConfigureOutputPort(outputPortConfig);

        await vidEncoder.Convert();

        Console.WriteLine("Finished");
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Decode
public async Task DecodeVideoFromFilestream()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var stream = File.OpenRead("/home/pi/videos/test.h264"))
    using (var vidCaptureHandler = new TransformStreamCaptureHandler(stream, "/home/pi/videos/", "raw"))
    using (var vidDecoder = new MMALVideoFileDecoder(vidCaptureHandler))
    {
        var inputPortConfig = new MMALPortConfig(MMALEncoding.H264, null, 1280, 720, 25, 0, 1300000, true, null);
        var outputPortConfig = new MMALPortConfig(MMALEncoding.RGB16, null, 1280, 720, 25, 0, 1300000, true, null);

        // Create our component pipeline.
        vidDecoder.ConfigureInputPort(inputPortConfig)
                  .ConfigureOutputPort(outputPortConfig);

        await vidDecoder.Convert();

        Console.WriteLine("Finished");
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Static render overlay
MMAL allows you to create additional video preview renderers which sit alongside the usual null sink or video renderers shown in previous examples. These additional renderers allow you to overlay static content onto the display your Pi is connected to.
The overlay renderers only work with unencoded images, which must use one of the following pixel formats:
- YUV420 (I420)
- RGB888 (RGB24)
- RGBA
- BGR888 (BGR24)
- BGRA
An easy way to get an unencoded image for use with the overlay renderers is to use the raw image capture functionality as described in this example, setting the MMALCameraConfig.StillEncoding and MMALCameraConfig.StillSubFormat properties to one of the accepted pixel formats.
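For reference, capturing such a test frame might look like the following minimal sketch. It relies on the raw still capture behaviour described in the raw image capture example; the output path and resolution here are illustrative, and the resolution should match your overlay configuration.

public async Task CaptureRawOverlayFrame()
{
    MMALCamera cam = MMALCamera.Instance;

    // Capture unencoded YUV420 frames at the size the overlay will be displayed at.
    MMALCameraConfig.StillEncoding = MMALEncoding.I420;
    MMALCameraConfig.StillSubFormat = MMALEncoding.I420;
    MMALCameraConfig.StillResolution = new Resolution(640, 480);

    // Illustrative output path; the frame is written as a .raw file.
    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
    {
        await cam.TakeRawPicture(imgCaptureHandler);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}

Once you have got your test frame, follow the below example to overlay your image: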
public async Task StaticOverlayExample()
{
    MMALCamera cam = MMALCamera.Instance;

    PreviewConfiguration previewConfig = new PreviewConfiguration
    {
        FullScreen = false,
        PreviewWindow = new Rectangle(160, 0, 640, 480),
        Layer = 2,
        Opacity = 1
    };

    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
    using (var video = new MMALVideoRenderer(previewConfig))
    {
        cam.ConfigureCameraSettings();
        video.ConfigureRenderer();

        PreviewOverlayConfiguration overlayConfig = new PreviewOverlayConfiguration
        {
            FullScreen = true,
            PreviewWindow = new Rectangle(50, 0, 640, 480),
            Layer = 1,
            Resolution = new Resolution(640, 480),
            Encoding = MMALEncoding.I420,
            Opacity = 255
        };

        var overlay = cam.AddOverlay(video, overlayConfig, File.ReadAllBytes("/home/pi/test1.raw"));
        overlay.ConfigureRenderer();
        overlay.UpdateOverlay();

        var portConfig = new MMALPortConfig(MMALEncoding.JPEG, MMALEncoding.I420, 90);

        // Create our component pipeline.
        imgEncoder.ConfigureOutputPort(portConfig);

        cam.Camera.StillPort.ConnectTo(imgEncoder);
        cam.Camera.PreviewPort.ConnectTo(video);

        cam.PrintPipeline();

        await cam.ProcessAsync(cam.Camera.StillPort);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
In this example, we are using an unencoded YUV420 image and configuring the renderer using the settings in overlayConfig.
FFmpeg - RTMP streaming
public async Task FFmpegRTMPStreaming()
{
    MMALCamera cam = MMALCamera.Instance;

    // An RTMP server needs to be listening on the address specified in the capture handler. I have used the Nginx RTMP module for testing.
    using (var ffCaptureHandler = FFmpegCaptureHandler.RTMPStreamer("mystream", "rtmp://192.168.1.91:6767/live"))
    using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        var portConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 25, 10, MMALVideoEncoder.MaxBitrateLevel4, null);

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be recorded at a bitrate of 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfig);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
Note: If you intend to use the YouTube live streaming service, you will need to create the method below to return your own FFmpegCaptureHandler, and replace the internal FFmpegCaptureHandler.RTMPStreamer seen in the example above with your custom method. The reason for this is that YouTube streaming requires your RTMP stream to contain an audio input, otherwise it won't work. Internally, our RTMP streaming method does not include an audio stream, and at the current time we don't intend to change it for this specific purpose.
public static FFmpegCaptureHandler RTMPStreamerWithAudio(string streamName, string streamUrl)
    => new FFmpegCaptureHandler($"-re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv -metadata streamName={streamName} {streamUrl}");
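To use it, swap the capture handler created in the RTMP streaming example above for your custom method. A minimal usage sketch, where the stream name and URL are placeholders you should replace with your own:

// The stream name and URL below are placeholders.
using (var ffCaptureHandler = RTMPStreamerWithAudio("mystream", "rtmp://a.rtmp.youtube.com/live2/your-stream-key"))
using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
using (var renderer = new MMALVideoRenderer())
{
    // Configure and connect the components exactly as in the RTMP streaming example above.
}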
Please see here for an in-depth discussion of the issue.
FFmpeg - Raw video convert
This is a useful capture mode as it pushes the elementary H.264 stream into an AVI container, which can be opened by media players such as VLC.
public async Task FFmpegRawVideoConvert()
{
    MMALCamera cam = MMALCamera.Instance;

    using (var ffCaptureHandler = FFmpegCaptureHandler.RawVideoToAvi("/home/pi/videos/", "testing1234"))
    using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
    using (var renderer = new MMALVideoRenderer())
    {
        cam.ConfigureCameraSettings();

        var portConfig = new MMALPortConfig(MMALEncoding.H264, MMALEncoding.I420, 25, 10, MMALVideoEncoder.MaxBitrateLevel4, null);

        // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be recorded at a bitrate of 25Mb/s.
        vidEncoder.ConfigureOutputPort(portConfig);

        cam.Camera.VideoPort.ConnectTo(vidEncoder);
        cam.Camera.PreviewPort.ConnectTo(renderer);

        // Camera warm up time
        await Task.Delay(2000);

        var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

        // Take video for 3 minutes.
        await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}
FFmpeg - Images to video
This example will push all images processed by an image capture handler into a playable video.
public async Task FFmpegImagesToVideo()
{
    MMALCamera cam = MMALCamera.Instance;

    // This example will take an image every 10 seconds for 4 hours.
    using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
    {
        var cts = new CancellationTokenSource(TimeSpan.FromHours(4));
        var tl = new Timelapse { Mode = TimelapseMode.Second, CancellationToken = cts.Token, Value = 10 };

        await cam.TakePictureTimelapse(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, tl);

        // Process all images captured into a video at 2fps.
        imgCaptureHandler.ImagesToVideo("/home/pi/images/", 2);
    }

    // Only call when you no longer require the camera, i.e. on app shutdown.
    cam.Cleanup();
}