How do I actually use ffmpeg on Android?
I have a very basic question regarding Android and ffmpeg. I obtained ffmpeg from http://bambuser.com/opensource and was able to compile it for ARM.
The results are the binaries (ffmpeg) as well as several libsomething.so files.
My question is: Is this enough to decode videos? How do I actually use ffmpeg then?
To load the library I have:
static {
    // Load the ffmpeg library compiled for ARM from the app's data directory.
    System.load("/data/data/com.package/lib/libavcodec.so");
}
It loads fine. But what then?
More explanation: I saw other projects where people had their ffmpeg source in a JNI directory in the project. They also created some Android.mk files and some C code along with it. Would I need this as well? Why would I create the .so files first and then copy the ffmpeg source code again?
I know the NDK and how it should work but I've never seen an example of how one would actually call ffmpeg functions using it, because people seem to be hiding their implementations (which is sort of understandable) but not even giving useful pointers or examples.
Let's just say I wanted to decode a video file. Which kind of native methods would I need to implement? How do I run the project? Which data types need to be passed? etc. There are certainly a few people here who have at least done that; I know this from hours and hours of searching.
For your first question:
Just building is not enough to use the ffmpeg libraries properly. You also have to wrap those .so files in the right order, because each .so file needs the others at link time (libavformat depends on libavcodec, which in turn depends on libavutil). You can display a .so file's header information, including the libraries it needs, with a tool such as readelf -d.
So you need to wrap these .so files through Android.mk.
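A minimal sketch of what such an Android.mk might look like, assuming the prebuilt .so files and the ffmpeg headers were copied into the project's jni/ directory (all module names and paths here are illustrative, not taken from the original answer):

LOCAL_PATH := $(call my-dir)

# Wrap the prebuilt libavcodec.so as a module other modules can link against.
# (Repeat a block like this for libavutil, libavformat, and the rest,
#  in dependency order.)
include $(CLEAR_VARS)
LOCAL_MODULE := avcodec
LOCAL_SRC_FILES := prebuilt/libavcodec.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
include $(PREBUILT_SHARED_LIBRARY)

# The JNI glue code, compiled against the ffmpeg headers and linked
# against the prebuilt libraries above.
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-jni
LOCAL_SRC_FILES := ffmpeg-jni.c
LOCAL_SHARED_LIBRARIES := avcodec
include $(BUILD_SHARED_LIBRARY)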
The second one:
You only need the header files from the ffmpeg project; the implementation will be linked in from the .so libraries. The projects you saw probably copied the whole source tree because the developers didn't bother to filter out just the headers.
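To illustrate, a JNI glue file might look like the sketch below. The class name (com.package.FFmpegWrapper), method name (openFile), and file name (ffmpeg-jni.c) are all hypothetical, and the sketch assumes an ffmpeg version recent enough to have avformat_open_input (older builds used av_open_input_file instead). The Java side would declare private static native int openFile(String path); and call it after the System.load calls from the question.

// ffmpeg-jni.c -- compiled against the ffmpeg headers only; the actual
// implementation is resolved at load time from the prebuilt .so libraries.
#include <jni.h>
#include <libavformat/avformat.h>

JNIEXPORT jint JNICALL
Java_com_package_FFmpegWrapper_openFile(JNIEnv *env, jclass clazz, jstring path)
{
    AVFormatContext *fmt_ctx = NULL;
    const char *cpath = (*env)->GetStringUTFChars(env, path, NULL);
    int ret;

    av_register_all();                  /* register all demuxers/decoders */
    ret = avformat_open_input(&fmt_ctx, cpath, NULL, NULL);

    (*env)->ReleaseStringUTFChars(env, path, cpath);
    if (ret == 0)
        avformat_close_input(&fmt_ctx); /* just probing here; close again */
    return ret;                         /* 0 on success, AVERROR on failure */
}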
And the last one:
Your thoughts seem right for the time being: most developers are struggling to use ffmpeg, and documentation and sample code are scarce.