Upload live streaming video from iPhone, like Ustream or Qik
How do I live stream video from an iPhone to a server, like Ustream or Qik do? I know there's something called HTTP Live Streaming from Apple, but most resources I've found only talk about streaming video from a server to the iPhone.
Is Apple's HTTP Live Streaming something I should use? Or something else? Thanks.
3 Answers
There isn't a built-in way to do this, as far as I know. As you say, HTTP Live Streaming is for downloads to the iPhone.
The way I'm doing it is to implement an AVCaptureSession, which has a delegate with a callback that's run on every frame. That callback sends each frame over the network to the server, which has a custom setup to receive it.
Here's the flow: https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2
And here's some code:
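(The original answer's code block isn't preserved here, so what follows is a minimal Swift sketch of the setup being described. The class name FrameCapturer, the .medium preset, and the queue label are illustrative assumptions, not part of the original answer.)

```swift
import AVFoundation

final class FrameCapturer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let outputQueue = DispatchQueue(label: "camera.frames")

    func start() throws {
        session.sessionPreset = .medium

        // Attach the default video camera as the session's input.
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        // Attach a video-data output whose delegate is called on every frame.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: outputQueue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }
}
```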
Then the output device's delegate (here, self) has to implement the callback:
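(Again, a sketch rather than the answer's original code: this continues the FrameCapturer class above, and sendFrameToServer is a hypothetical placeholder for whatever networking code talks to your custom server.)

```swift
import AVFoundation

// Continues the FrameCapturer sketch above.
extension FrameCapturer {
    // Called by AVFoundation for every captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Pull the raw pixel data out of the sample buffer.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        let length = CVPixelBufferGetDataSize(pixelBuffer)
        if let base = CVPixelBufferGetBaseAddress(pixelBuffer) {
            let data = Data(bytes: base, count: length)
            // Hypothetical placeholder: ship the frame bytes to the server
            // (e.g. over a socket or a URLSession upload task).
            sendFrameToServer(data)
        }
    }

    private func sendFrameToServer(_ data: Data) {
        // Placeholder: transmit the frame to your custom server here.
    }
}
```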
EDIT/UPDATE
Several people have asked how to do this without sending the frames to the server one by one. The answer is complex...
Basically, in the didOutputSampleBuffer function above, you add the samples into an AVAssetWriter. I actually had three asset writers active at a time -- past, present, and future -- managed on different threads. The past writer is in the process of closing the movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds, I set past = current; current = future and restart the sequence.

This then uploads video in 5-second chunks to the server. You can stitch the videos together with ffmpeg if you want, or transcode them into MPEG-2 transport streams for HTTP Live Streaming. The video data itself is H.264-encoded by the asset writer, so transcoding merely changes the file's header format.
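(As a rough Swift sketch of that rotation idea, not the answer's actual code: SegmentedRecorder, makeWriter, and upload are illustrative names, and the asset-writer inputs and H.264 output settings are omitted for brevity.)

```swift
import AVFoundation

// Rough sketch of the past/present/future writer rotation described above.
final class SegmentedRecorder {
    private var past: AVAssetWriter?
    private var current: AVAssetWriter?
    private var future: AVAssetWriter?
    private let queue = DispatchQueue(label: "segment.rotation")

    func rotate() {
        queue.async {
            // Close the oldest segment and ship it to the server.
            if let finished = self.past {
                finished.finishWriting {
                    self.upload(finished.outputURL)
                }
            }
            // Shift the pipeline: past <- current, current <- future,
            // and start preparing a brand-new future writer.
            self.past = self.current
            self.current = self.future
            self.future = try? self.makeWriter()

            // Re-run every 5 seconds, matching the segment length above.
            self.queue.asyncAfter(deadline: .now() + 5) { self.rotate() }
        }
    }

    private func makeWriter() throws -> AVAssetWriter {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString + ".mp4")
        let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
        // An AVAssetWriterInput with H.264 video settings would be added here.
        return writer
    }

    private func upload(_ url: URL) {
        // Placeholder: POST the finished 5-second segment to your server.
    }
}
```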
I have found a library that will help you with this.
HaishinKit Streaming Library
The library above gives you every option for streaming via RTMP or HLS.
Just follow the steps the library lays out and read all of its instructions carefully. Please don't run the example code from the library directly -- it has some errors. Instead, pull the required classes and pods into your own demo app.
I have just done this; with it you can record the screen, camera, and audio.
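(For orientation, RTMP publishing with HaishinKit has generally looked roughly like the sketch below. The library's API has changed across versions, so treat this as an approximation and follow the README for the release you install; the URL and stream key are placeholders.)

```swift
import AVFoundation
import HaishinKit

// Approximate HaishinKit RTMP publishing flow; API names vary by version.
let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)

// Attach the microphone and the back camera to the stream.
stream.attachAudio(AVCaptureDevice.default(for: .audio))
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera,
                                            for: .video,
                                            position: .back))

// Connect to the RTMP endpoint, then publish under a stream key.
connection.connect("rtmp://example.com/live")
stream.publish("streamKey")
```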
I'm not sure you can do that with HTTP Live Streaming. HTTP Live Streaming segments the video into chunks of roughly 10 seconds each and creates a playlist of those segments.
So if you want the iPhone to be the stream server side with HTTP Live Streaming, you will have to figure out a way to segment the video file and create the playlist.
How to do it is beyond my knowledge. Sorry.
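(For context, that playlist is just an M3U8 text file listing the segments. A minimal, hypothetical example:)

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXT-X-ENDLIST
```

A live playlist omits the #EXT-X-ENDLIST tag and is rewritten as new segments become available.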