Transferring video from the iPhone
I can get individual frames from the iPhone's cameras just fine. What I need is a way to package them up with sound for streaming to the server. Sending the files once I have them isn't much of an issue. It's the generation of the files for streaming that I am having problems with. I've been trying to get FFMpeg to work without much luck.
Anyone have any ideas on how I can pull this off? I would like a known working API, or instructions on getting FFMpeg to compile properly in an iPhone app.
Answer
You could divide your recording into separate files with a length of, say, 10 seconds, then send them separately. If you use AVCaptureSession's beginConfiguration and commitConfiguration methods to batch your output changes, you shouldn't drop any frames between the files. This has many advantages over frame-by-frame upload:
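As a rough sketch of the batching idea (the helper name and the segment-rotation setup here are illustrative, not from the original post): wrapping the output swap in beginConfiguration/commitConfiguration makes the session apply all changes atomically, which is what avoids dropped frames between segments.

```swift
import AVFoundation

// Illustrative helper: swap the movie-file output on a running capture
// session when one 10-second segment finishes and the next begins.
// begin/commitConfiguration batch the changes so the session applies
// them in a single atomic reconfiguration.
func swapMovieOutput(on session: AVCaptureSession,
                     replacing old: AVCaptureMovieFileOutput,
                     with new: AVCaptureMovieFileOutput) {
    session.beginConfiguration()    // start batching changes
    session.removeOutput(old)       // detach the finished segment's output
    if session.canAddOutput(new) {
        session.addOutput(new)      // attach the next segment's output
    }
    session.commitConfiguration()   // apply everything in one step
}
```

Each finished segment file can then be uploaded in the background while the next one records.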