iOS video encoding library
I'm really stuck on this problem, because I haven't found enough information on the internet about video encoding on iOS, yet we can see plenty of apps that handle video streaming successfully (Skype, Qik, Justin.tv, etc.).
I'm going to develop an application that should send video frames, obtained from the camera and encoded in H.263 (H.264 or MPEG-4 is still under consideration), to a web server. For this I need a video encoding library. Obviously, ffmpeg can handle the task, but it is under the LGPL license, which could lead to problems when submitting the app to the App Store. On the other hand, there are some applications that seem to use the ffmpeg library, but only Timelapser clearly states this fact in its app description. Does this mean that the other apps are not using ffmpeg, or are they just hiding this information?
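For context, grabbing the raw frames is the straightforward part; a minimal AVFoundation capture sketch might look like the following (class and queue names are illustrative, not from any particular library), and the open question is what to feed those frames into for encoding:

```swift
import AVFoundation

// Minimal sketch of the capture side: AVCaptureVideoDataOutput delivers raw
// frames (CMSampleBuffer) that would then be handed to whatever encoder is
// chosen. Error handling and session configuration are trimmed down.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each uncompressed frame arrives here; this is where an encoder
        // (ffmpeg, x264, or a hardware-backed library) would be fed.
    }
}
```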
Please share your thoughts and experience on this topic. I'm open to discussion.
Comments (1)
After googling and doing some research in this area, I found this library: http://www.foxitsolutions.com/iphone_h264_sdk.html. They really do use hardware encoding. I examined the demo example with Instruments, which showed that while encoding, about 12% CPU is used and the read() syscall is called constantly. From that I conclude that their library uses the standard AVFoundation AVAssetWriter to write to a temporary file, and (most probably) a concurrent thread reads this temp file to retrieve the encoded frames. Also, take a look at http://www.videolan.org/developers/x264.html. It is under the GPL, but can still be useful.
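For illustration, here is a rough sketch of that suspected temp-file approach built on AVAssetWriter (the class and settings below are my own guess at the technique, not code from the Foxit SDK; the reader thread that tails the file for encoded data is only hinted at in a comment):

```swift
import AVFoundation

// Sketch of the temp-file technique described above: AVAssetWriter encodes
// camera sample buffers to an H.264 .mp4 file, which a separate thread could
// then read back to retrieve the encoded frames for streaming.
final class TempFileEncoder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264, // hardware-accelerated on iOS
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ]
        input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        writer.add(input)
    }

    // Called from the AVCaptureVideoDataOutput delegate with each camera frame.
    func append(_ sampleBuffer: CMSampleBuffer) {
        if writer.status == .unknown {
            writer.startWriting()
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        }
        if writer.status == .writing, input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
        // A concurrent reader would tail outputURL here (hence the constant
        // read() calls seen in Instruments) to pull encoded frames -- omitted.
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

The appeal of this route is that AVAssetWriter drives the hardware H.264 encoder, which would explain the low CPU usage seen in Instruments; the awkward part is having to read the growing temp file back to get at the encoded frames for streaming.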