2012-03-27 66 views

Memory problem while reading video frames on iPhone

I'm running into memory problems while reading frames from an existing video selected from the iPhone library. At first I added the UIImage frames themselves to an array, but I figured that after a while the array became too large for memory, so instead I now save the UIImages to the Documents folder and add only the image paths to the array. However, I still get the same memory warnings, even when checking allocations with Instruments. The total allocated memory never goes above 2.5 MB, and no leaks are found either... Can anyone think of anything?

-(void)addFrame:(UIImage *)image 
{ 
    NSString *imgPath = [NSString stringWithFormat:@"%@/Analysis%d-%d.png", docFolder, currentIndex, framesArray.count];  
    [UIImagePNGRepresentation(image) writeToFile:imgPath atomically:YES]; 
    [framesArray addObject:imgPath];  
    frameCount++;  
} 

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info 
{ 
    [picker dismissModalViewControllerAnimated:YES]; 
    [framesArray removeAllObjects];  
    frameCount = 0;   

    // incoming video 
    NSURL *videoURL = [info valueForKey:UIImagePickerControllerMediaURL]; 
    //NSLog(@"Video : %@", videoURL); 

    // AVURLAsset to read input movie (i.e. mov recorded to local storage) 
    NSDictionary *inputOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey]; 
    AVURLAsset *inputAsset = [[AVURLAsset alloc] initWithURL:videoURL options:inputOptions];  

    // Load the input asset tracks information 
    [inputAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler: ^{   

     NSError *error = nil; 
     nrFrames = CMTimeGetSeconds([inputAsset duration]) * 30; 
     NSLog(@"Total frames = %d", nrFrames); 

     // Check status of "tracks", make sure they were loaded  
     AVKeyValueStatus tracksStatus = [inputAsset statusOfValueForKey:@"tracks" error:&error]; 
     if (tracksStatus != AVKeyValueStatusLoaded) 
      // failed to load 
      return;   

     /* Read video samples from input asset video track */ 
     AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:inputAsset error:&error]; 

     NSMutableDictionary *outputSettings = [NSMutableDictionary dictionary]; 
     [outputSettings setObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (NSString*)kCVPixelBufferPixelFormatTypeKey]; 
     AVAssetReaderTrackOutput *readerVideoTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[[inputAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] outputSettings:outputSettings]; 


     // Assign the tracks to the reader and start to read 
     [reader addOutput:readerVideoTrackOutput]; 
     if ([reader startReading] == NO) { 
      // Handle error 
      NSLog(@"Error reading"); 
     } 

     NSAutoreleasePool *pool = [NSAutoreleasePool new]; 
     while (reader.status == AVAssetReaderStatusReading) 
     {    
      if(!memoryProblem) 
      { 
       CMSampleBufferRef sampleBufferRef = [readerVideoTrackOutput copyNextSampleBuffer]; 
       if (sampleBufferRef) 
       { 
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBufferRef); 
        /*Lock the image buffer*/ 
        CVPixelBufferLockBaseAddress(imageBuffer,0); 
        /*Get information about the image*/ 
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
        size_t width = CVPixelBufferGetWidth(imageBuffer); 
        size_t height = CVPixelBufferGetHeight(imageBuffer); 

        /*We unlock the image buffer*/ 
        CVPixelBufferUnlockBaseAddress(imageBuffer,0); 

        /*Create a CGImageRef from the CVImageBufferRef*/ 
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
        CGImageRef newImage = CGBitmapContextCreateImage(newContext); 

        /*We release some components*/ 
        CGContextRelease(newContext); 
        CGColorSpaceRelease(colorSpace); 

        UIImage *image= [UIImage imageWithCGImage:newImage scale:[UIScreen mainScreen].scale orientation:UIImageOrientationRight];   
        //[self addFrame:image]; 
        [self performSelectorOnMainThread:@selector(addFrame:) withObject:image waitUntilDone:YES]; 

        /*We release the CGImageRef*/ 
        CGImageRelease(newImage);      

        CMSampleBufferInvalidate(sampleBufferRef); 
        CFRelease(sampleBufferRef); 
        sampleBufferRef = NULL; 
       } 
      } 
      else 
      {     
       break; 
      }    
     } 
     [pool release]; 

     NSLog(@"Finished");   
    }]; 
} 

Answer

Try one change.

Move the NSAutoreleasePool inside the while loop, and drain it on each iteration.

So it would look like this:

while (reader.status == AVAssetReaderStatusReading) 
{    
    NSAutoreleasePool *pool = [NSAutoreleasePool new]; 

    ..... 

    [pool drain]; 
} 
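On newer toolchains (LLVM 3.0+, with or without ARC), the same fix can be written with the `@autoreleasepool` language construct instead of an explicit NSAutoreleasePool object. A minimal sketch of the loop, assuming the same `reader` and `readerVideoTrackOutput` variables as in the question:

    while (reader.status == AVAssetReaderStatusReading) 
    { 
        @autoreleasepool { 
            CMSampleBufferRef sampleBufferRef = [readerVideoTrackOutput copyNextSampleBuffer]; 
            if (sampleBufferRef) 
            { 
                // ... convert the buffer to a UIImage and save it, as in the question ... 
                CMSampleBufferInvalidate(sampleBufferRef); 
                CFRelease(sampleBufferRef); 
            } 
        } // pool drains here, so autoreleased objects are freed every iteration 
    } 

The point is the same either way: the pool's scope must end once per iteration so that the autoreleased objects created while processing each frame are released before the next frame is read.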

You're a genius, it works! I'd love to know how you figured that out. Why does it work when the autorelease pool is placed inside the while loop? – 2012-03-27 12:06:56


If the pool is outside the loop, it is only drained after the loop ends. But before the loop finishes, memory accumulates and the app crashes. If the pool is inside, the autoreleased objects are released on every iteration. – Ilanchezhian 2012-03-27 12:11:12


Ah, of course, that seems logical now. Anyway, thanks a lot! – 2012-03-27 12:20:56