H264 encoder in Android?
I've been having some problems while trying to fix a simple video recording app*. I think I followed the sequence of steps correctly. The following is a simplification of the part of the code that is giving me problems. This code is executed only as a callback once a button is pressed:
if (mRecorder != null) {
    mRecorder.reset();
    mRecorder.release();
}
mRecorder = new MediaRecorder();
if (mViewer.hasSurface) {
    mRecorder.setPreviewDisplay(mViewer.holder.getSurface());
    Log.d(TAG, "Surface has been set");
}
try {
    Log.d(TAG, "Sleeping for 4000 mili");
    Thread.sleep(4000);
    Log.d(TAG, "Waking up");
} catch (InterruptedException e) {
    Log.e(TAG, "InterruptedException");
    e.printStackTrace();
}
mRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mRecorder.setVideoFrameRate(12);
mRecorder.setVideoSize(176, 144);
mRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mRecorder.setMaxDuration(MAX_DURATION_TEST);
String targetFile = "/sdcard/webcamera/temp.mp4";
File localFile = new File(targetFile);
if (localFile.exists()) {
    Log.d(TAG, "Local file exists");
} else {
    Log.d(TAG, "Local file does not exist");
}
mRecorder.setOutputFile(targetFile);
try {
    mRecorder.prepare();
    bPrepared = true;
    Log.i(TAG, "prepared");
    return;
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (IOException e) {
    Log.e(TAG, "IOException");
    Log.e(TAG, "Message: " + e.getMessage());
    StackTraceElement[] array = e.getStackTrace();
    for (StackTraceElement element : array) {
        Log.e(TAG, "" + element.toString());
    }
}
The important thing I don't understand here is that whenever I set the video encoder to MPEG_4_SP it works, but whenever I set it to H264 it does not. The problem is that this piece of code is just part of a bigger project, and the rest of it expects the video to be encoded with H264.
I'm testing on a Samsung Galaxy I-7500 running Froyo, by the way. I think the Galaxy I-9000 has the same problem.
The puzzling thing for me is that according to the documentation here: http://developer.android.com/guide/appendix/media-formats.html, MPEG_4_SP encoding should not be supported at all, while H264 has only been supported since Honeycomb. So why does it work with MPEG_4_SP at all? And is it possible to make it work with H264?
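One way to see what the device's recorder actually supports, rather than relying on the documentation table, is to read the device's built-in camcorder profile before configuring the encoder. This is a hedged sketch, not code from the project: `CamcorderProfile` exists since API level 8 (Froyo), and the class name `CodecProbe` is hypothetical.

```java
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.util.Log;

// Sketch: inspect the platform's default recording profile to see which
// video codec the device itself would pick. On many Froyo-era phones this
// reports H263 or MPEG_4_SP rather than H264.
public final class CodecProbe {
    private static final String TAG = "CodecProbe";

    public static boolean deviceDefaultsToH264() {
        CamcorderProfile p = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
        Log.d(TAG, "videoCodec=" + p.videoCodec
                + " size=" + p.videoFrameWidth + "x" + p.videoFrameHeight
                + " fps=" + p.videoFrameRate);
        return p.videoCodec == MediaRecorder.VideoEncoder.H264;
    }
}
```

If the profile does not report H264, configuring `MediaRecorder` for it will typically fail at `prepare()`, which matches the behavior described above.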
The error I get is not really clear.
07-11 00:01:40.626: ERROR/MediaSource(1386): Message: prepare failed.
07-11 00:01:40.766: ERROR/MediaSource(1386): android.media.MediaRecorder._prepare(Native Method)
07-11 00:01:40.766: ERROR/MediaSource(1386): android.media.MediaRecorder.prepare(MediaRecorder.java:508)
07-11 00:01:40.766: ERROR/MediaSource(1386): com.appdh.webcamera.MediaSource.prepareOutput(MediaSource.java:74)
07-11 00:01:40.766: ERROR/MediaSource(1386): com.appdh.webcamera.MainActivity.startDetectCamera(MainActivity.java:312)
*Actually, the app is a little more complicated than just that, as it also streams the video over LAN, but the part I am concerned with here has nothing to do with that. You can check this interesting project out here: http://code.google.com/p/ipcamera-for-android/
As you already wrote, H.264 encoding support can only be expected from devices running Honeycomb and later, which currently means only tablets. If you need H.264, you should test for the "prepare failed" error and either tell the user that the device is not supported or, better, block devices without H.264 using Market filters. Or you can compile ffmpeg for Android, like several other projects do. Have a look at these links:
http://odroid.foros-phpbb.com/t338-ffmpeg-compiled-with-android-ndk
http://bambuser.com/opensource
FFmpeg on Android
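The "test for prepare failed" idea can be sketched as follows. This is a minimal illustration, not the project's actual code: the class and method names are hypothetical, and the configuration is trimmed to the calls relevant to encoder selection.

```java
import android.media.MediaRecorder;
import java.io.IOException;

// Sketch: try to prepare the recorder with H264 first; if prepare() fails
// on this device, retry with MPEG_4_SP, and report the device as
// unsupported (null) if neither works.
public final class RecorderSetup {

    public static MediaRecorder prepareWithFallback(String outputPath) {
        int[] encoders = {
                MediaRecorder.VideoEncoder.H264,
                MediaRecorder.VideoEncoder.MPEG_4_SP
        };
        for (int encoder : encoders) {
            MediaRecorder r = new MediaRecorder();
            r.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            r.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            r.setVideoEncoder(encoder);
            r.setOutputFile(outputPath);
            try {
                r.prepare();
                return r;            // this encoder works on this device
            } catch (IOException e) {
                r.release();         // prepare failed: try the next encoder
            } catch (IllegalStateException e) {
                r.release();
            }
        }
        return null;                 // no supported video encoder found
    }
}
```

The two separate catch blocks (instead of Java 7 multi-catch) keep the sketch compatible with the Java 6 toolchain used for Froyo-era projects.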
You can also use JCodec.
It supports Android and has a few samples for it.
The best way to pull it in is as a Gradle dependency, but for the latest improvements and bug fixes you need to compile from the latest commits (there has still been no new release since 2016).
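For illustration, here is a minimal sketch using JCodec's Android helper. It assumes the `jcodec-android` artifact, whose `AndroidSequenceEncoder` encodes bitmaps to H.264/MP4 in pure Java; since you may be building from arbitrary commits, the exact API can differ from this.

```java
import android.graphics.Bitmap;
import org.jcodec.api.android.AndroidSequenceEncoder;
import java.io.File;
import java.io.IOException;
import java.util.List;

// Sketch: encode a list of frames to an H.264/MP4 file without relying on
// the device's hardware encoder. Pure-Java encoding is slow, so this suits
// short clips rather than live recording or streaming.
public final class JCodecExample {

    public static void encodeFrames(List<Bitmap> frames, File out)
            throws IOException {
        AndroidSequenceEncoder encoder =
                AndroidSequenceEncoder.createSequenceEncoder(out, 12); // 12 fps
        try {
            for (Bitmap frame : frames) {
                encoder.encodeImage(frame);
            }
        } finally {
            encoder.finish(); // flushes and writes the MP4 index
        }
    }
}
```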