
Can't stream RTMP with .Net Core 2 #66

Closed · ghost opened this issue Jul 15, 2018 · 14 comments

@ghost commented Jul 15, 2018

Hello,

I'm unable to stream through RTMP. I have FFmpeg installed and am using this library in a .Net Core 2 Raspberry Pi app. I am streaming to YouTube Live.

I have also followed the tutorial at: https://github.com/techyian/MMALSharp/wiki/Examples#RTMP-Streaming

The error I get alternates between "Unhandled Exception: System.IO.IOException: Broken pipe" and "Unhandled Exception: System.ComponentModel.Win32Exception: No such process".

I am able to take pictures with the PiCam and can stream to YouTube Live through the command line, so I'm not sure what I'm doing wrong. Are you able to assist?

Thanks,

Jesse

@techyian (Owner)

Hi Jesse,

Please can you post the code you are using to stream to RTMP? Can you also send over the command line statement that is working for you?

Thanks,

Ian

@ghost (Author) commented Jul 15, 2018

    public void Run()
    {
        var cam = MMALCamera.Instance;

        try
        {
            AsyncContext.Run(async () =>
            {
                // An RTMP server needs to be listening on the address specified in the capture handler. I have used the Nginx RTMP module for testing.    
                using (var ffCaptureHandler = MMALSharp.Handlers.FFmpegCaptureHandler.RTMPStreamer("STREAM_KEY_HERE", "rtmp://a.rtmp.youtube.com/live2"))
                using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
                using (var renderer = new MMALVideoRenderer())
                {
                    cam.ConfigureCameraSettings();

                    // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
                    vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

                    cam.Camera.VideoPort.ConnectTo(vidEncoder);
                    cam.Camera.PreviewPort.ConnectTo(renderer);

                    // Camera warm up time
                    await Task.Delay(2000);

                    var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

                    cam.PrintPipeline();
                    

                    // Take video for 3 minutes.
                    await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
                }

            });
        } catch (Exception e) {
            Console.WriteLine(e.Message);
            Console.WriteLine(e.StackTrace);
            cam.Cleanup();
        }

        cam.Cleanup();
    }

The command that is working is

raspivid -o - -t 0 -vf -hf -fps 10 -b 500000 | ffmpeg -re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/<STREAM_NAME_HERE>

@ghost (Author) commented Jul 15, 2018

Sorry, the code above contained an error: the camera port should be 0. I have updated the post.

@techyian (Owner)

To clarify, is <STREAM_NAME_HERE> seen in the command line statement and "STREAM_KEY_HERE" seen in your code above the same value?

Could you try appending <STREAM_NAME_HERE> onto the end of your RTMP URL so it matches the following:

using (var ffCaptureHandler = MMALSharp.Handlers.FFmpegCaptureHandler.RTMPStreamer("<STREAM_NAME_HERE>", "rtmp://a.rtmp.youtube.com/live2/<STREAM_NAME_HERE>"))

I can see there are a number of other arguments you send to FFmpeg in your command line statement which MMALSharp won't be passing in its RTMPStreamer method, but I am unsure whether these are required to stream your content.

I have signed up for YouTube live streaming, but I have to wait 24 hours before it is activated.

@ghost (Author) commented Jul 15, 2018

Ok so I've amended my code to match

using (var ffCaptureHandler = MMALSharp.Handlers.FFmpegCaptureHandler.RTMPStreamer("<STREAM_NAME_HERE>", "rtmp://a.rtmp.youtube.com/live2/<STREAM_NAME_HERE>"))

and it appears to be streaming (the frame details are being printed out in the shell). However, no video appears on YouTube Live; it briefly indicates it is receiving data and then says it is receiving none.

@techyian (Owner)

Ok, that could possibly be a bitrate issue. I'll wait until I get access to YouTube streaming tomorrow and will try myself to see if a code change is required or if it's a setup issue.

@techyian (Owner)

Still looking into this for you. I've got access to YouTube streaming and can reproduce your issue. #67 has been raised off the back of this, as the bitrate for videos isn't being set correctly at present; a code change is required and I will push 0.4.4 to NuGet once I'm happy with it. That said, even after the fix, YouTube streaming still doesn't appear to be working, although the Nginx RTMP module is working with a slight delay. I suspect it may be this delay that YouTube doesn't like, so I will continue investigating and update this ticket when I've got more info.

techyian self-assigned this Jul 17, 2018
@ghost (Author) commented Jul 17, 2018

OK, cool. I had noticed the bitrate issue yesterday but hadn't gotten around to updating the ticket yet. I'll keep a watch on this thread. Thanks for investigating.

@techyian (Owner)

I think I've tracked down what's happening here. In your command line example you are adding silent audio, which YouTube appears to require for its live streaming, whereas MMALSharp isn't passing through any audio input (see here). The changes I've got locally, along with the bitrate fix, seem to have done the trick, and the green status bar stays lit for the duration of the RTMP stream.

I'll put these changes together and run through some tests before pushing out a new release, but thank you for raising this ticket and discovering the bitrate issue :)

Ian

@ghost (Author) commented Jul 17, 2018

Great news! Thanks for fixing so quickly - I'll keep an eye out for the updated build.

@techyian (Owner)

Hi again,

Have you managed to actually go live with the command line example you provided? The reason I ask is that no matter which method I try, either via the command line or via MMALSharp, the status is green but always says "Starting" and no live streaming actually occurs. Have you experienced this before?

@techyian (Owner) commented Jul 19, 2018

Also, I've just done a new build which is on its way to NuGet (0.4.4); it includes the bitrate fix and addresses a few other minor issues. With regard to the RTMP streaming method, I have kept the code as-is and recommend creating your own method returning an FFmpegCaptureHandler to use instead of the in-built one, like so:

public static FFmpegCaptureHandler RTMPStreamerWithAudio(string streamName, string streamUrl)
    => new FFmpegCaptureHandler($"-re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv -metadata streamName={streamName} {streamUrl}");
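
For clarity, a rough sketch of how that handler would slot into the pipeline from your earlier Run() example; this assumes RTMPStreamerWithAudio is accessible from your class and reuses the placeholder stream name/URL that worked for you before. Only the capture handler line changes:

// Sketch only: swap the built-in RTMPStreamer for the custom handler above.
using (var ffCaptureHandler = RTMPStreamerWithAudio("<STREAM_NAME_HERE>", "rtmp://a.rtmp.youtube.com/live2/<STREAM_NAME_HERE>"))
using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
using (var renderer = new MMALVideoRenderer())
{
    // ... rest of the pipeline as before: ConfigureCameraSettings, ConfigureOutputPort,
    // ConnectTo and ProcessAsync.
}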

I will update the Wiki to suggest this as the capture handler to use for YouTube streaming. I hope you manage to get up and running following the update. As mentioned in my previous post, when I try to live stream it says "Starting" with a green status but never changes. I'm unsure whether this is an issue with YouTube or with my account, but I can't see what else could be causing it at this point.

@ghost (Author) commented Jul 19, 2018

Hello again,

Yes, I am up and running. I tried my command directly through the command line and it streams to YouTube fine.

I've also gotten it working with the example you've provided above. I've just made a few small amendments.

public static FFmpegCaptureHandler RTMPStreamerWithAudio(string streamName, string streamUrl)
    => new FFmpegCaptureHandler($"-re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv {streamUrl}{streamName}");

I've removed the -metadata argument, swapped the streamUrl and streamName variables (and removed the space between them), and removed the 'streamName=' section of the command. With this I am up and running.
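
To illustrate the call site (just a sketch; the trailing slash on the URL is my own convention, since the handler now concatenates {streamUrl}{streamName} with nothing in between), the capture handler line becomes:

// Concatenated target: rtmp://a.rtmp.youtube.com/live2/<STREAM_NAME_HERE>
using (var ffCaptureHandler = RTMPStreamerWithAudio("<STREAM_NAME_HERE>", "rtmp://a.rtmp.youtube.com/live2/"))

The rest of the pipeline is unchanged from my original Run() example.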

Does this work for you?

@techyian (Owner)

Seems to be working now with the same code I was using a few days ago. How strange; it must have been an issue on their end or with my account. Glad you're up and running.
