iPhone: HTTP live streaming without any server-side processing
I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in a thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration. But I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, but that is not actually what I am searching for.

Could you give me any ideas that could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (it doesn't answer your need, but is worth mentioning) is to capture from the camera and save the video to a file. See the AV Foundation Guide for how to do that. Once it's saved, you can use the HTTP Live Streaming segmenter to generate the proper segments. Apple provides segmenter applications for Mac OS X, and there's an open source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments; there are plenty of HTTP servers out there you could adapt.
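For that non-live approach, here is a minimal Swift sketch of just the capture-to-file step, assuming modern AVFoundation APIs (error handling is trimmed, and outputURL is a placeholder you'd supply). Segmenting and serving happen afterwards, outside the app:

import AVFoundation

// Sketch: record the camera to a movie file that a segmenter can later
// split into HTTP Live Streaming segments. Not a complete implementation.
final class CameraRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let session = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()

    func start(recordingTo outputURL: URL) throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }
        session.commitConfiguration()

        session.startRunning()
        movieOutput.startRecording(to: outputURL, recordingDelegate: self)
    }

    // Called once the file is finalized; at that point a segmenter such as
    // Apple's mediafilesegmenter (or an open source port) could process it.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        session.stopRunning()
    }
}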
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264, and for that you want ffmpeg. Basically, you shove the images into ffmpeg's AVPicture, forming a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live H.264 source; I'm not sure how to do that, and it sounds like serious work. Once you've done that, you need an HTTP server serving that stream.
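To illustrate just the frame-collection step, here is a rough Swift sketch using AVCaptureVideoDataOutput. The delegate callback is where each frame would be handed to an encoder (the ffmpeg/AVPicture work described above); that part is only marked with a comment, since wiring up the encoder is the hard, open piece here:

import AVFoundation

// Sketch: pull raw BGRA frames from the camera, one delegate callback per frame.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)

        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                    kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: queue)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Hand pixelBuffer to the H.264 encoder here.
        _ = pixelBuffer
    }
}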
What might actually be easier is to use an RTP/RTSP-based stream instead. That approach is covered by open source RTP implementations, and ffmpeg supports it fully. It's not HTTP Live Streaming, but it will work well enough.
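To give a flavor of what the RTP side involves, here is a hedged Swift sketch that wraps already-encoded payload chunks in the fixed 12-byte RTP header from RFC 3550 and sends them over UDP via the Network framework (which postdates this question). The host, port, SSRC, and the dynamic payload type 96 are all assumptions, and real H.264-over-RTP additionally needs the RFC 6184 packetization rules on top of this:

import Foundation
import Network

// Sketch: minimal RTP packetizer/sender. Assumes the payload chunks were
// already encoded elsewhere (e.g. by ffmpeg) and fit in one UDP datagram.
final class RTPSender {
    private let connection: NWConnection
    private var sequenceNumber: UInt16 = 0
    private let ssrc: UInt32 = 0x1234_5678  // arbitrary stream identifier

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .udp)
        connection.start(queue: .global())
    }

    // Prepends the 12-byte RTP header (version 2, payload type 96) and sends.
    func send(payload: Data, timestamp: UInt32, lastPacketOfFrame: Bool) {
        var packet = Data(capacity: 12 + payload.count)
        packet.append(0x80)                                   // V=2, no padding/extension/CSRC
        packet.append((lastPacketOfFrame ? 0x80 : 0x00) | 96) // marker bit + payload type
        withUnsafeBytes(of: sequenceNumber.bigEndian) { packet.append(contentsOf: $0) }
        withUnsafeBytes(of: timestamp.bigEndian) { packet.append(contentsOf: $0) }
        withUnsafeBytes(of: ssrc.bigEndian) { packet.append(contentsOf: $0) }
        packet.append(payload)
        sequenceNumber &+= 1
        connection.send(content: packet, completion: .contentProcessed { _ in })
    }
}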