FFmpeg does not decode the H.264 stream
I am trying to decode an H.264 stream from an RTSP server and render it on the iPhone.
I found some libraries and read some articles about this.
The libraries are RTSPClient and DecoderWrapper, from the dropCam sample for iPhone.
But I cannot decode the frame data with the DecoderWrapper, which uses FFmpeg.
Here is my code.
VideoViewer.m
- (void)didReceiveFrame:(NSData*)frameData presentationTime:(NSDate*)presentationTime
{
    [VideoDecoder staticInitialize];
    mConverter = [[VideoDecoder alloc] initWithCodec:kVCT_H264 colorSpace:kVCS_RGBA32 width:320 height:240 privateData:nil];
    [mConverter decodeFrame:frameData];
    if ([mConverter isFrameReady]) {
        UIImage *imageData = [mConverter getDecodedFrame];
        if (imageData) {
            [mVideoView setImage:imageData];
            NSLog(@"decoded!");
        }
    }
}
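Side note on the block above: FFmpeg's H.264 decoder is stateful (it keeps the SPS/PPS and reference frames it has already seen), so recreating the VideoDecoder on every callback discards that state. A minimal sketch of creating it lazily just once, reusing the class and method names from the snippet above:

- (void)didReceiveFrame:(NSData*)frameData presentationTime:(NSDate*)presentationTime
{
    // Create the decoder once and keep reusing it, so FFmpeg can retain the
    // SPS/PPS and reference-frame state it builds up across packets.
    if (!mConverter) {
        [VideoDecoder staticInitialize];
        mConverter = [[VideoDecoder alloc] initWithCodec:kVCT_H264
                                              colorSpace:kVCS_RGBA32
                                                   width:320
                                                  height:240
                                             privateData:nil];
    }
    [mConverter decodeFrame:frameData];
    if ([mConverter isFrameReady]) {
        UIImage *imageData = [mConverter getDecodedFrame];
        if (imageData) {
            [mVideoView setImage:imageData];
        }
    }
}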
---VideoDecoder.m---
- (id)initWithCodec:(enum VideoCodecType)codecType
         colorSpace:(enum VideoColorSpace)colorSpace
              width:(int)width
             height:(int)height
        privateData:(NSData*)privateData {
    if (self = [super init]) {
        codec = avcodec_find_decoder(CODEC_ID_H264);
        codecCtx = avcodec_alloc_context();

        // Note: for H.264 RTSP streams, the width and height are usually not specified (width and height are 0).
        // These fields will become filled in once the first frame is decoded and the SPS is processed.
        codecCtx->width = width;
        codecCtx->height = height;

        codecCtx->extradata = av_malloc([privateData length]);
        codecCtx->extradata_size = [privateData length];
        [privateData getBytes:codecCtx->extradata length:codecCtx->extradata_size];
        codecCtx->pix_fmt = PIX_FMT_RGBA;
#ifdef SHOW_DEBUG_MV
        codecCtx->debug_mv = 0xFF;
#endif
        srcFrame = avcodec_alloc_frame();
        dstFrame = avcodec_alloc_frame();
        int res = avcodec_open(codecCtx, codec);
        if (res < 0)
        {
            NSLog(@"Failed to initialize decoder");
        }
    }
    return self;
}
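The privateData argument ends up in codecCtx->extradata, but the call in VideoViewer.m passes nil. For an RTSP source the SPS/PPS usually arrive out-of-band in the SDP ("sprop-parameter-sets"), and one way to hand them to FFmpeg is to build Annex B style extradata from them. A hypothetical helper (not part of DecoderWrapper), assuming spsData and ppsData already hold the base64-decoded NAL units taken from the SDP:

// Hypothetical helper: build Annex B extradata by prefixing each
// parameter-set NAL unit with a 00 00 00 01 start code.
static NSData *annexBExtradataFromParameterSets(NSData *spsData, NSData *ppsData)
{
    static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
    NSMutableData *extradata = [NSMutableData data];
    [extradata appendBytes:startCode length:sizeof(startCode)];
    [extradata appendData:spsData];
    [extradata appendBytes:startCode length:sizeof(startCode)];
    [extradata appendData:ppsData];
    return extradata;
}

The result could then be passed as the privateData argument instead of nil.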
- (void)decodeFrame:(NSData*)frameData {
    AVPacket packet = {0};
    packet.data = (uint8_t*)[frameData bytes];
    packet.size = [frameData length];

    int frameFinished = 0;
    NSLog(@"Packet size===>%d", packet.size);

    // Is this a packet from the video stream?
    if (packet.stream_index == 0)
    {
        int res = avcodec_decode_video2(codecCtx, srcFrame, &frameFinished, &packet);
        NSLog(@"Res value===>%d", res);
        NSLog(@"frame data===>%d", (int)srcFrame->data);
        if (res < 0)
        {
            NSLog(@"Failed to decode frame");
        }
    }
    else
    {
        NSLog(@"No video stream found");
    }

    // Need to delay initializing the output buffers because we don't know the dimensions until we decode the first frame.
    if (!outputInit) {
        if (codecCtx->width > 0 && codecCtx->height > 0) {
#ifdef _DEBUG
            NSLog(@"Initializing decoder with frame size of: %dx%d", codecCtx->width, codecCtx->height);
#endif
            outputBufLen = avpicture_get_size(PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
            outputBuf = av_malloc(outputBufLen);
            avpicture_fill((AVPicture*)dstFrame, outputBuf, PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
            convertCtx = sws_getContext(codecCtx->width, codecCtx->height, codecCtx->pix_fmt,
                                        codecCtx->width, codecCtx->height, PIX_FMT_RGBA,
                                        SWS_FAST_BILINEAR, NULL, NULL, NULL);
            outputInit = YES;
            frameFinished = 1;
        }
        else {
            NSLog(@"Could not get video output dimensions");
        }
    }

    if (frameFinished)
        frameReady = YES;
}
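getDecodedFrame is not shown in the snippets above; for context, here is a rough sketch of what the conversion step typically looks like once frameReady is set, using the sws context and RGBA buffer prepared in decodeFrame (the UIImage construction here is an assumption, not the dropCam code):

- (UIImage*)getDecodedFrame {
    if (!frameReady || !outputInit)
        return nil;

    // Convert the decoder's native (normally YUV420) frame into the RGBA
    // buffer that avpicture_fill attached to dstFrame.
    sws_scale(convertCtx,
              (const uint8_t* const*)srcFrame->data, srcFrame->linesize,
              0, codecCtx->height,
              dstFrame->data, dstFrame->linesize);

    // Wrap the RGBA pixels in a CGImage/UIImage; one possible approach.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef bitmapCtx = CGBitmapContextCreate(dstFrame->data[0],
                                                   codecCtx->width, codecCtx->height,
                                                   8, dstFrame->linesize[0],
                                                   colorSpace, kCGImageAlphaNoneSkipLast);
    CGImageRef cgImage = CGBitmapContextCreateImage(bitmapCtx);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(bitmapCtx);
    CGColorSpaceRelease(colorSpace);
    frameReady = NO;
    return image;
}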
The console shows the following.
2011-05-16 20:16:04.223 RTSPTest1[41226:207] Packet size===>359
[h264 @ 0x5815c00] no frame!
2011-05-16 20:16:04.223 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.224 RTSPTest1[41226:207] frame data===>101791200
2011-05-16 20:16:04.224 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.225 RTSPTest1[41226:207] decoded!
2011-05-16 20:16:04.226 RTSPTest1[41226:207] Packet size===>424
[h264 @ 0x5017c00] no frame!
2011-05-16 20:16:04.226 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.227 RTSPTest1[41226:207] frame data===>81002704
2011-05-16 20:16:04.227 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.228 RTSPTest1[41226:207] decoded!
2011-05-16 20:16:04.229 RTSPTest1[41226:207] Packet size===>424
[h264 @ 0x581d000] no frame!
2011-05-16 20:16:04.229 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.230 RTSPTest1[41226:207] frame data===>101791616
2011-05-16 20:16:04.230 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.231 RTSPTest1[41226:207] decoded!
. . . . .
But the simulator shows nothing.
What's wrong with my code?
Please help me solve this problem.
Thanks for your answers.
1 Answer
I've had a similar problem with H.264 and FFmpeg.
My problem was that some devices do not send the sequence parameter set (SPS) and picture parameter set (PPS) with every frame, so I needed to modify my frame data slightly.
Maybe this post will help: FFmpeg can't decode H264 stream/frame data
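As a rough illustration of that workaround (a sketch, not the exact code from the linked post): each incoming NAL unit can be prefixed with an Annex B start code and, for keyframes, with the SPS/PPS taken from the SDP before it is handed to avcodec_decode_video2. The helper name and the spsData/ppsData inputs below are hypothetical:

// Hypothetical helper: prepend start codes (and, when needed, the SPS/PPS)
// to a raw NAL unit received over RTP so FFmpeg sees a complete Annex B frame.
static NSData *annexBFrameFromNAL(NSData *nalData, NSData *spsData, NSData *ppsData)
{
    static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
    if ([nalData length] == 0)
        return nil;

    NSMutableData *frame = [NSMutableData data];
    const uint8_t *bytes = (const uint8_t *)[nalData bytes];
    int nalType = bytes[0] & 0x1F;   // low 5 bits of the first NAL byte

    // IDR slices (type 5) need the parameter sets in front of them if the
    // camera does not repeat SPS (type 7) / PPS (type 8) in-band.
    if (nalType == 5) {
        [frame appendBytes:startCode length:sizeof(startCode)];
        [frame appendData:spsData];
        [frame appendBytes:startCode length:sizeof(startCode)];
        [frame appendData:ppsData];
    }

    [frame appendBytes:startCode length:sizeof(startCode)];
    [frame appendData:nalData];
    return frame;
}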