
AVFoundation - Retiming CMSampleBufferRef video output

First time asking a question here. I hope this post is clear and the sample code is formatted correctly.

I am experimenting with AVFoundation and time-lapse photography.

My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, version 4) and write each of those frames out to a file to create a time-lapse. I am using AVCaptureVideoDataOutput, AVAssetWriter, and AVAssetWriterInput.

The problem is, if I use the CMSampleBufferRef passed to

captureOutput:didOutputSampleBuffer:fromConnection:

the playback of each frame is the length of time between the original input frames. A frame rate of 1 fps. I am looking to get 30 fps.

I tried using

CMSampleBufferCreateCopyWithNewTiming()

but after 13 frames are written to the file,

captureOutput:didOutputSampleBuffer:fromConnection:

stops being called. The interface is still active and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back the way I want, at 30 fps, but it only has those 13 frames.

How can I accomplish my goal of 30 fps playback? How can I tell where the app is getting lost, and why?

I have placed a flag called useNativeTime so I can test both cases. When it is set to YES, I get all the frames I am interested in because the callback does not "get lost". When I set the flag to NO, only 13 frames are ever processed and the method is never called again. As mentioned above, in both cases I can play back the video.

Thanks for any help.

Here is where I am attempting to do the retiming.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{ 
    BOOL useNativeTime = NO; 
    BOOL appendSuccessFlag = NO; 

    //NSLog(@"in captureOutpput sample buffer method"); 
    if(!CMSampleBufferDataIsReady(sampleBuffer)) 
    { 
     NSLog(@"sample buffer is not ready. Skipping sample"); 
     //CMSampleBufferInvalidate(sampleBuffer); 
     return; 
    } 

    if (! [inputWriterBuffer isReadyForMoreMediaData]) 
    { 
     NSLog(@"Not ready for data."); 
    } 
    else { 
     // Write only the first of every n frames (the camera natively delivers 30 fps). 
     intervalFrames++; 
     if (intervalFrames > 30) { 
      intervalFrames = 1; 
     } 
     else if (intervalFrames != 1) { 
      //CMSampleBufferInvalidate(sampleBuffer); 
      return; 
     } 

     // Need to initialize start session time. 
     if (writtenFrames < 1) { 
      if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
      else imageSourceTime = CMTimeMake(0 * 20, 600); //CMTimeMake(1,30); 
      [outputWriter startSessionAtSourceTime: imageSourceTime]; 
      NSLog(@"Starting CMtime"); 
      CMTimeShow(imageSourceTime); 
     } 

     if (useNativeTime) { 
      imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
      CMTimeShow(imageSourceTime); 
      // CMTime myTiming = CMTimeMake(writtenFrames * 20,600); 
      // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried, but it has no effect. 
      appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer]; 
     } 
     else { 
      CMSampleBufferRef newSampleBuffer; 
      CMSampleTimingInfo sampleTimingInfo; 
      sampleTimingInfo.duration = CMTimeMake(20,600); 
      sampleTimingInfo.presentationTimeStamp = CMTimeMake((writtenFrames + 0) * 20,600); 
      sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid; 
      OSStatus myStatus; 

      //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer)); 
      myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, 
                  sampleBuffer, 
                  1, 
                  &sampleTimingInfo, // maybe a little confused on this param. 
                  &newSampleBuffer); 
      // These confirm the good health of our newSampleBuffer. 
      if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i",myStatus); 
      if (! CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!"); 

      // No effect. 
      //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer); // How is this different from CMSampleBufferSetDataReady? 
      //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i",myStatus); 

      imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer); 
      CMTimeShow(imageSourceTime); 
      appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer]; 
      //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe what this does. It doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe? 
      //CFRelease(sampleBuffer); // - Not surprisingly - “EXC_BAD_ACCESS” 
     } 

     if (!appendSuccessFlag) 
     { 
      NSLog(@"Failed to append pixel buffer"); 
     } 
     else { 
      writtenFrames++; 
      NSLog(@"writtenFrames: %i", writtenFrames); 
      } 
    } 

    //[self displayOuptutWritterStatus]; // Expect and see AVAssetWriterStatusWriting. 
} 

My setup routine.

- (IBAction) recordingStartStop: (id) sender 
{ 
    NSError * error; 

    if (self.isRecording) { 
     NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~"); 
     self.isRecording = NO; 
     [recordingStarStop setTitle: @"Record" forState: UIControlStateNormal]; 

     //[self.captureSession stopRunning]; 
     [inputWriterBuffer markAsFinished]; 
     [outputWriter endSessionAtSourceTime:imageSourceTime]; 
     [outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs. 
     NSLog(@"finished CMtime"); 
     CMTimeShow(imageSourceTime); 

     // Really, I should loop through the outputs and close all of them or target specific ones. 
     // Since I'm only recording video right now, I feel safe doing this. 
     [self.captureSession removeOutput: [[self.captureSession outputs] objectAtIndex: 0]]; 

     [videoOutput release]; 
     [inputWriterBuffer release]; 
     [outputWriter release]; 
     videoOutput = nil; 
     inputWriterBuffer = nil; 
     outputWriter = nil; 
     NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~"); 
     NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum."); 
     NSLog(@"filePath: %@", [projectPaths movieFilePath]); 
     if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) { 
      NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum."); 
      UISaveVideoAtPathToSavedPhotosAlbum ([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError: contextInfo:), nil); 
     } 
     NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~"); 
    } 
    else { 
     NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~"); 
     projectPaths = [[ProjectPaths alloc] initWithProjectFolder: @"TestProject"]; 
     intervalFrames = 30; 

     videoOutput = [[AVCaptureVideoDataOutput alloc] init]; 
     NSMutableDictionary * cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease]; 
     NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
     NSNumber* value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]; 
     [cameraVideoSettings setValue: value forKey: key]; 
     [videoOutput setVideoSettings: cameraVideoSettings]; 
     [videoOutput setMinFrameDuration: CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps 
     [videoOutput setAlwaysDiscardsLateVideoFrames: YES]; 

     queue = dispatch_queue_create("cameraQueue", NULL); 
     [videoOutput setSampleBufferDelegate: self queue: queue]; 
     dispatch_release(queue); 

     NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease]; 
     [outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey]; 
     [outputSettings setValue: [NSNumber numberWithInt: 1280] forKey: AVVideoWidthKey]; // currently assuming 
     [outputSettings setValue: [NSNumber numberWithInt: 720] forKey: AVVideoHeightKey]; 

     NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease]; 
     [compressionSettings setValue: AVVideoProfileLevelH264Main30 forKey: AVVideoProfileLevelKey]; 
     //[compressionSettings setValue: [NSNumber numberWithDouble:1024.0*1024.0] forKey: AVVideoAverageBitRateKey]; 
     [outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey]; 

     inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings]; 
     [inputWriterBuffer retain]; 
     inputWriterBuffer.expectsMediaDataInRealTime = YES; 

     outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error]; 
     [outputWriter retain]; 

     if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:"); 
     if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer]; 
     else NSLog(@"can not add input"); 

     if (![outputWriter canApplyOutputSettings: outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported"); 

     if ([captureSession canAddOutput: videoOutput]) [self.captureSession addOutput: videoOutput]; 
     else NSLog(@"could not addOutput: videoOutput to captureSession"); 

     //[self.captureSession startRunning]; 
     self.isRecording = YES; 
     [recordingStarStop setTitle: @"Stop" forState: UIControlStateNormal]; 

     writtenFrames = 0; 
     imageSourceTime = kCMTimeZero; 
     [outputWriter startWriting]; 
     //[outputWriter startSessionAtSourceTime: imageSourceTime]; 
     NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~"); 
     NSLog (@"recording to fileURL: %@", [projectPaths movieURLPath]); 
    } 

    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO"); 

    [self displayOuptutWritterStatus]; 
} 

Answers


With a little more searching and reading, I have a working solution. I don't know that it is the best method, but so far, so good.

In my setup area I set up an AVAssetWriterInputPixelBufferAdaptor. The added code looks like this.

inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor 
      assetWriterInputPixelBufferAdaptorWithAssetWriterInput: inputWriterBuffer 
      sourcePixelBufferAttributes: nil]; 
[inputWriterBufferAdaptor retain]; 

To fully understand the code below, I also have these three lines in the setup method.

fpsOutput = 30; //Some possible values: 30, 10, 15, 24, 25, 30/1.001 or 29.97; 
cmTimeSecondsDenominatorTimescale = 600 * 100000; //To more precisely handle 29.97. 
cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale/fpsOutput; 
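
As a sanity check of that arithmetic, here is a small standalone sketch (it assumes fpsOutput is declared as a double so the 29.97 case divides cleanly; the variable names mirror the ones above):

double fpsOutput = 29.97; 
int32_t cmTimeSecondsDenominatorTimescale = 600 * 100000; // 60,000,000 ticks per second. 
int64_t cmTimeNumeratorValue = (int64_t)(cmTimeSecondsDenominatorTimescale/fpsOutput); // ~2002002 ticks per frame. 

// Frame N is then stamped at N * 2002002/60,000,000 seconds of presentation time: 
// frame 0 -> 0.0 s, frame 1 -> ~0.0334 s, frame 30 -> ~1.001 s. 
imageSourceTime = CMTimeMake(writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale); 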

Instead of applying the retiming to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Note the adaptor's withPresentationTime parameter; by passing it my custom value, I get the correct timing I was looking for.

CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer(sampleBuffer); 
imageSourceTime = CMTimeMake(writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale); 
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: myImage withPresentationTime: imageSourceTime]; 

There may be some gain in using the pixelBufferPool property of the AVAssetWriterInputPixelBufferAdaptor, but I have not figured that out yet.
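
For what it is worth, here is a minimal sketch of how that pool might be used inside the delegate callback. This is untested on my part: pooledBuffer is a hypothetical local, the pool is nil until the writer has started writing, and I have not measured whether it is actually faster than appending the capture buffer directly.

CVPixelBufferRef pooledBuffer = NULL; 
CVReturn poolStatus = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, 
           inputWriterBufferAdaptor.pixelBufferPool, 
           &pooledBuffer); 
if (poolStatus == kCVReturnSuccess) { 
    // ... copy or render the captured frame's pixels into pooledBuffer here ... 
    appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: pooledBuffer 
            withPresentationTime: imageSourceTime]; 
    CVPixelBufferRelease(pooledBuffer); // Balance the Create above. 
} 
else NSLog(@"CVPixelBufferPoolCreatePixelBuffer() poolStatus: %i", poolStatus); 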


Okay, I found the bug in my first post.

When you use

myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, 
               sampleBuffer, 
               1, 
               &sampleTimingInfo, 
               &newSampleBuffer); 

you need to balance that call with a CFRelease(newSampleBuffer);
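
So the tail end of the retiming branch from the first post should look roughly like this (a minimal sketch showing just the fix):

myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, 
               sampleBuffer, 
               1, 
               &sampleTimingInfo, 
               &newSampleBuffer); 

imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer); 
appendSuccessFlag = [inputWriterBuffer appendSampleBuffer: newSampleBuffer]; 
CFRelease(newSampleBuffer); // The word Create in the function name means we own this reference. 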

The same idea holds true when using a CVPixelBufferRef with the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance. After calling the appendPixelBuffer:withPresentationTime: method, you would use CVPixelBufferRelease(yourCVPixelBufferRef);

Hope this helps someone else.

Thanks, this saved me a lot of pain. – 2011-03-11 02:12:36

You're welcome. Glad to hear the post helped. – 2011-03-11 08:27:33

Thanks! This really saved my day. If I hadn't found this... your first, but helpful, question, much more tinkering would have been needed. – CipherCom 2013-05-20 15:31:10
