Putting an H.264 I-frame into AVSampleBufferDisplayLayer, but no video image is displayed

After going through WWDC 2014 Session 513 in detail, I tried to write an app on iOS 8.0 that decodes and displays a live H.264 stream. First, I construct the H.264 parameter sets successfully. When I get a frame with a 4-byte start code, such as "0x00 0x00 0x00 0x01 0x65 ...", I put it into a CMBlockBuffer. Then I construct a CMSampleBuffer from that CMBlockBuffer. After that, I enqueue the CMSampleBuffer into an AVSampleBufferDisplayLayer. Everything works (I checked the returned status values), except that the AVSampleBufferDisplayLayer does not show any video image. Since these APIs are fairly new to everyone, I could not find anything that resolves this problem.
The key code is given below, and I would really appreciate it if you could help figure out why the video image cannot be displayed. Many thanks.
(1) AVSampleBufferDisplayLayer initialization. dspLayer is a property of my main view controller.
@property(nonatomic,strong)AVSampleBufferDisplayLayer *dspLayer;
if (!_dspLayer)
{
    _dspLayer = [[AVSampleBufferDisplayLayer alloc] init];
    [_dspLayer setFrame:CGRectMake(90, 551, 557, 389)];
    _dspLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    _dspLayer.backgroundColor = [UIColor grayColor].CGColor;

    // Drive the layer from the host clock, starting at time zero, rate 1.
    CMTimebaseRef tmBase = nil;
    CMTimebaseCreateWithMasterClock(NULL, CMClockGetHostTimeClock(), &tmBase);
    _dspLayer.controlTimebase = tmBase;
    CMTimebaseSetTime(_dspLayer.controlTimebase, kCMTimeZero);
    CMTimebaseSetRate(_dspLayer.controlTimebase, 1.0);

    [self.view.layer addSublayer:_dspLayer];
}
(2) In another thread, I get an H.264 I-frame.
//Constructing the H.264 parameter sets succeeds
CMVideoFormatDescriptionRef formatDesc;
OSStatus formatCreateResult =
    CMVideoFormatDescriptionCreateFromH264ParameterSets(NULL, ppsNum + 1, props, sizes, 4, &formatDesc);
NSLog(@"construct h264 param set: %d", (int)formatCreateResult);
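As an aside, the pointers in `props` passed to CMVideoFormatDescriptionCreateFromH264ParameterSets must point at the raw SPS/PPS payloads with the start codes already stripped. A minimal sketch of locating NAL-unit payloads in an Annex B buffer (plain C; the function name and buffers are illustrative, not from the question):

```c
#include <stddef.h>
#include <stdint.h>

/* Find the next 3- or 4-byte Annex B start code in buf[pos..len).
 * On success, stores the offset of the first payload byte (the NAL
 * header) in *payload and returns 0; returns -1 if none is found. */
static int next_nal_payload(const uint8_t *buf, size_t len,
                            size_t pos, size_t *payload)
{
    for (size_t i = pos; i + 3 < len; i++) {
        if (buf[i] == 0 && buf[i + 1] == 0) {
            if (buf[i + 2] == 1) { *payload = i + 3; return 0; }
            if (i + 4 < len && buf[i + 2] == 0 && buf[i + 3] == 1) {
                *payload = i + 4;
                return 0;
            }
        }
    }
    return -1;
}
```

The low 5 bits of the byte at `*payload` give the NAL-unit type (7 = SPS, 8 = PPS), which is how you can pick out the two parameter sets from the stream.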
//Construct the CMBlockBuffer. dataBuf points to the H.264 data, starting with "0x00 0x00 0x00 0x01 0x65 ..."
CMBlockBufferRef blockBufferOut = nil;
CMBlockBufferCreateEmpty(0, 0, kCMBlockBufferAlwaysCopyDataFlag, &blockBufferOut);
CMBlockBufferAppendMemoryBlock(blockBufferOut,
                               dataBuf,
                               dataLen,
                               NULL,
                               NULL,
                               0,
                               dataLen,
                               kCMBlockBufferAlwaysCopyDataFlag);
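One likely cause of the blank layer: a format description created with CMVideoFormatDescriptionCreateFromH264ParameterSets (with NALUnitHeaderLength = 4) expects AVCC-style samples, so each NAL unit in the block buffer must be prefixed with a 4-byte big-endian length, not the Annex B start code 0x00 0x00 0x00 0x01. A minimal in-place conversion sketch (plain C; assumes the buffer holds exactly one NAL unit with a 4-byte start code):

```c
#include <stdint.h>
#include <string.h>

/* Overwrite the 4-byte Annex B start code at the front of `buf` with a
 * 4-byte big-endian NAL-unit length, in place. Returns 0 on success,
 * -1 if the buffer does not begin with a 4-byte start code. */
static int annexb_to_avcc(uint8_t *buf, size_t len)
{
    static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
    if (len < 5 || memcmp(buf, startCode, 4) != 0)
        return -1;
    uint32_t nalLen = (uint32_t)(len - 4);  /* payload length after prefix */
    buf[0] = (uint8_t)(nalLen >> 24);       /* big-endian length prefix */
    buf[1] = (uint8_t)(nalLen >> 16);
    buf[2] = (uint8_t)(nalLen >> 8);
    buf[3] = (uint8_t)(nalLen);
    return 0;
}
```

After this rewrite, dataBuf can be appended to the CMBlockBuffer exactly as above.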
//Constructing the CMSampleBuffer succeeds
size_t sampleSizeArray[1] = {0};
sampleSizeArray[0] = CMBlockBufferGetDataLength(blockBufferOut);
CMSampleTimingInfo tmInfos[1] = {
    {CMTimeMake(5, 1), CMTimeMake(5, 1), CMTimeMake(5, 1)}  // {duration, presentationTimeStamp, decodeTimeStamp}
};
CMSampleBufferRef sampBuf = nil;
formatCreateResult = CMSampleBufferCreate(kCFAllocatorDefault,
                                          blockBufferOut,
                                          YES,
                                          NULL,
                                          NULL,
                                          formatDesc,
                                          1,
                                          1,
                                          tmInfos,
                                          1,
                                          sampleSizeArray,
                                          &sampBuf);
//Enqueue into the AVSampleBufferDisplayLayer. It is just one frame, but I cannot see any video frame in my view.
if ([self.dspLayer isReadyForMoreMediaData])
{
    [self.dspLayer enqueueSampleBuffer:sampBuf];
}
[self.dspLayer setNeedsDisplay];
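Since the code above enqueues a single frame against a running control timebase, another thing worth trying (suggested by the linked Stack Overflow discussion; not verified here) is to mark the sample buffer for immediate display, so rendering does not wait on presentation-timestamp scheduling:

```objc
// Ask the layer to render this sample as soon as it is enqueued,
// bypassing timestamp scheduling on the control timebase.
CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampBuf, YES);
CFMutableDictionaryRef dict =
    (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately,
                     kCFBooleanTrue);

if ([self.dspLayer isReadyForMoreMediaData])
{
    [self.dspLayer enqueueSampleBuffer:sampBuf];
}
```

If the layer still stays blank, checking `self.dspLayer.error` after enqueueing can reveal whether the decoder rejected the sample.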
Scythe42's answer might solve your problem. I also had some trouble getting it to work, but in the end I managed. You should [take a look](http://stackoverflow.com/questions/25980070/how-to-use-avsamplebufferdisplaylayer-in-ios-8-for-rtp-h264-streams-with-gstream). – Zappel 2014-10-29 19:09:07
Same here. I have a valid & ready CMSampleBuffer, but it will not display on the screen... :( – zaxy78 2017-11-01 14:15:19