- Camera and microphone streaming library via RTMP and HLS for iOS, macOS, and tvOS.
- For Issues: if you understand Japanese, please write in Japanese!
Sponsored with 💖 by
Enterprise Grade APIs for Feeds & Chat. Try the iOS Chat tutorial 💬
- If you need help with making live-streaming requests using HaishinKit, open a GitHub issue with the Bug report template.
  - The trace-level log is very useful. Please set `Logboard.with(HaishinKitIdentifier).level = .trace`.
  - If you don't use an issue template, I will immediately close your issue without comment.
- If you'd like to discuss a feature request, open a GitHub issue with the Feature request template.
- If you want e-mail based communication instead of GitHub issues:
  - The consulting fee is $50 per incident. I can respond within a few days.
- If you want to contribute, submit a pull request!
- Authentication
- Publish and Recording (H264/AAC)
- Playback (Beta)
- Adaptive bitrate streaming
  - Handling (see also #126)
  - Automatic frame dropping
- Action Message Format
  - AMF0
  - AMF3
- SharedObject
- RTMPS
  - Native (RTMP over SSL/TLS)
  - Tunneled (RTMPT over SSL/TLS) (Technical Preview)
- RTMPT (Technical Preview)
- ReplayKit Live as a Broadcast Upload Extension (Technical Preview)
- HTTPService
- HLS Publish
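Among the features above, RTMPS (Native) is expected to use the same connection API as plain RTMP, just with an `rtmps://` scheme. A minimal sketch, assuming a hypothetical endpoint and the conventional RTMPS port 443:

```swift
import HaishinKit

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)

// rtmps:// negotiates SSL/TLS before the RTMP handshake (Native RTMPS).
// The host, port, and names here are placeholders.
rtmpConnection.connect("rtmps://server-ip-address:443/appName/instanceName")
rtmpStream.publish("streamName")
```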
|  | HKView | MTHKView |
| --- | --- | --- |
| Engine | AVCaptureVideoPreviewLayer | Metal |
| Publish | ◯ | ◯ |
| Playback | × | ◯ |
| VisualEffect | × | ◯ |
| Condition | Stable | Stable |
- Support for tvOS 10.2+ (Technical Preview)
  - tvOS can't publish the camera and microphone, but the playback feature is available.
- Hardware acceleration for H264 video encoding and AAC audio encoding
- Support for the "Allow app extension API only" option
- Support for the GPUImage framework (~> 0.5.12)
- Objective-C bridging
| Version | iOS | OSX | tvOS | Xcode | Swift |
| --- | --- | --- | --- | --- | --- |
| 1.2.0+ | 9.0+ | 10.11+ | 10.2+ | 13.0+ | 5.5+ |
| 1.1.0+ | 9.0+ | 10.11+ | 10.2+ | 12.0+ | 5.0+ |
| 1.0.0+ | 8.0+ | 10.11+ | 10.2+ | 11.0+ | 5.0+ |
Please add the following keys to your Info.plist.
iOS 10.0+
- NSMicrophoneUsageDescription
- NSCameraUsageDescription
macOS 10.14+
- NSMicrophoneUsageDescription
- NSCameraUsageDescription
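The keys above go into your app's Info.plist as usage-description strings. A minimal sketch of the entries; the description strings are placeholders, so replace them with explanations appropriate to your app:

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to publish a live video stream.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to publish live audio.</string>
```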
*Please set up your project for Swift 5.5.*
source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!
def import_pods
pod 'HaishinKit', '~> 1.2.2'
end
target 'Your Target' do
platform :ios, '9.0'
import_pods
end
github "shogo4405/HaishinKit.swift" ~> 1.2.2
https://github.com/shogo4405/HaishinKit.swift
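For Swift Package Manager, a sketch of the corresponding dependency entry in Package.swift; the version pin mirrors the CocoaPods and Carthage examples and is an assumption:

```swift
// In Package.swift (hypothetical manifest excerpt):
dependencies: [
    .package(url: "https://github.com/shogo4405/HaishinKit.swift", from: "1.2.2")
]
```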
BSD-3-Clause
Paypal
Bitcoin
3FnjC3CmwFLTzNY5WPNz4LjTo1uxGNozUR
Make sure you set up and activate your AVAudioSession.
import AVFoundation
let session = AVAudioSession.sharedInstance()
do {
// https://stackoverflow.com/questions/51010390/avaudiosession-setcategory-swift-4-2-ios-12-play-sound-on-silent
if #available(iOS 10.0, *) {
try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
} else {
session.perform(NSSelectorFromString("setCategory:withOptions:error:"), with: AVAudioSession.Category.playAndRecord, with: [
AVAudioSession.CategoryOptions.allowBluetooth,
AVAudioSession.CategoryOptions.defaultToSpeaker]
)
try session.setMode(.default)
}
try session.setActive(true)
} catch {
print(error)
}
Real Time Messaging Protocol (RTMP).
let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio)) { error in
// print(error)
}
rtmpStream.attachCamera(DeviceUtil.device(withPosition: .back)) { error in
// print(error)
}
let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)
// add ViewController#view
view.addSubview(hkView)
rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)
- rtmp://server-ip-address[:port]/application/[appInstance]/[prefix:[path1[/path2/]]]streamName
- The [] brackets denote optional parts.
rtmpConnection.connect("rtmp://server-ip-address[:port]/application/[appInstance]")
rtmpStream.publish("[prefix:[path1[/path2/]]]streamName")
- rtmp://localhost/live/streamName
rtmpConnection.connect("rtmp://localhost/live")
rtmpStream.publish("streamName")
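Playback (Beta) follows the same naming scheme. A minimal sketch, assuming `RTMPStream.play` and an `MTHKView` for rendering (only `MTHKView` supports playback per the view table above):

```swift
import HaishinKit

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)

// MTHKView renders via Metal and supports playback.
let hkView = MTHKView(frame: view.bounds)
hkView.attachStream(rtmpStream)
view.addSubview(hkView)

rtmpConnection.connect("rtmp://localhost/live")
rtmpStream.play("streamName")
```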
var rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.captureSettings = [
.fps: 30, // FPS
.sessionPreset: AVCaptureSession.Preset.medium, // input video width/height
// .isVideoMirrored: false,
// .continuousAutofocus: false, // use camera autofocus mode
// .continuousExposure: false, // use camera exposure mode
// .preferredVideoStabilizationMode: AVCaptureVideoStabilizationMode.auto
]
rtmpStream.audioSettings = [
.muted: false, // mute audio
.bitrate: 32 * 1000,
]
rtmpStream.videoSettings = [
.width: 640, // video output width
.height: 360, // video output height
.bitrate: 160 * 1000, // video output bitrate
.profileLevel: kVTProfileLevel_H264_Baseline_3_1, // H264 Profile require "import VideoToolbox"
.maxKeyFrameIntervalDuration: 2, // key frame / sec
]
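These settings can also be adjusted while publishing, which is one building block for the adaptive bitrate handling mentioned in the features list. A hedged sketch; the trigger condition and the fallback values below are illustrative assumptions, not from the source:

```swift
// Lower the encoder targets at runtime, e.g. when you detect
// congestion (see issue #126 for the discussion of handling).
rtmpStream.videoSettings[.bitrate] = 96 * 1000   // drop video to ~96 kbps
rtmpStream.audioSettings[.bitrate] = 24 * 1000   // drop audio to ~24 kbps
```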
// "0" means the same as the input
rtmpStream.recorderSettings = [
AVMediaType.audio: [
AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
AVSampleRateKey: 0,
AVNumberOfChannelsKey: 0,
// AVEncoderBitRateKey: 128000,
],
AVMediaType.video: [
AVVideoCodecKey: AVVideoCodecH264,
AVVideoHeightKey: 0,
AVVideoWidthKey: 0,
/*
AVVideoCompressionPropertiesKey: [
AVVideoMaxKeyFrameIntervalDurationKey: 2,
AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
AVVideoAverageBitRateKey: 512000
]
*/
],
]
// Set the 2nd argument to false.
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio), automaticallyConfiguresApplicationAudioSession: false)
var rtmpConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:password@localhost/appName/instanceName")
// iOS
rtmpStream.attachScreen(ScreenCaptureSession(shared: UIApplication.shared))
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))
HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. A basic snippet: you can view the stream at http://ip.address:8080/hello/playlist.m3u8
var httpStream = HTTPStream()
httpStream.attachCamera(DeviceUtil.device(withPosition: .back))
httpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio))
httpStream.publish("hello")
var hkView = HKView(frame: view.bounds)
hkView.attachStream(httpStream)
var httpService = HLSService(domain: "", type: "_http._tcp", name: "HaishinKit", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)
// add ViewController#view
view.addSubview(hkView)
git clone https://github.com/shogo4405/HaishinKit.swift.git
cd HaishinKit.swift
carthage bootstrap --use-xcframeworks
open HaishinKit.xcodeproj
- Adobe’s Real Time Messaging Protocol
- Action Message Format -- AMF 0
- Action Message Format -- AMF 3
- Video File Format Specification Version 10
- Adobe Flash Video File Format Specification Version 10.1