
On OS X I am using AVFoundation to capture images from a USB camera. Everything works, but the captured image comes out darker than the live video preview.

Device capture configuration:

-(BOOL)prepareCapture{ 
captureSession = [[AVCaptureSession alloc] init]; 
NSError *error; 

imageOutput=[[AVCaptureStillImageOutput alloc] init]; 
NSNumber * pixelFormat = [NSNumber numberWithInt:k32BGRAPixelFormat]; // legacy QuickTime constant; same value as kCVPixelFormatType_32BGRA
[imageOutput setOutputSettings:[NSDictionary dictionaryWithObject:pixelFormat forKey:(id)kCVPixelBufferPixelFormatTypeKey]]; 

videoOutput=[[AVCaptureMovieFileOutput alloc] init]; 

AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:MyVideoDevice error:&error]; 
if (videoInput) { 
    [captureSession beginConfiguration]; 
    [captureSession addInput:videoInput]; 
    [captureSession setSessionPreset:AVCaptureSessionPresetHigh]; 
    //[captureSession setSessionPreset:AVCaptureSessionPresetPhoto]; 
    [captureSession addOutput:imageOutput]; 
    [captureSession addOutput:videoOutput]; 
    [captureSession commitConfiguration]; 
} 
else { 
    // Handle the failure. 
    return NO; 
} 
return YES; 
} 

Adding the view for the live preview:

-(void)settingPreview:(NSView*)View{ 
// Attach preview to session 
previewView = View; 
CALayer *previewViewLayer = [previewView layer]; 
[previewViewLayer setBackgroundColor:CGColorGetConstantColor(kCGColorBlack)]; 
AVCaptureVideoPreviewLayer *newPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession]; 
[newPreviewLayer setFrame:[previewViewLayer bounds]]; 
[newPreviewLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable]; 
[previewViewLayer addSublayer:newPreviewLayer]; 
//[self setPreviewLayer:newPreviewLayer]; 
[captureSession startRunning]; 
} 

Code to capture an image:

-(void)captureImage{ 
AVCaptureConnection *videoConnection = nil; 
for (AVCaptureConnection *connection in imageOutput.connections) { 
    for (AVCaptureInputPort *port in [connection inputPorts]) { 
     if ([[port mediaType] isEqual:AVMediaTypeVideo]) { 
      videoConnection = connection; 
      break; 
     } 
    } 
    if (videoConnection) { break; } 
} 
[imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: 
^(CMSampleBufferRef imageSampleBuffer, NSError *error) { 
    CFDictionaryRef exifAttachments = 
    CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL); 
    if (exifAttachments) { 
     // Do something with the attachments. 
    } 
    // Continue as appropriate. 
    //IMG is a global NSImage 
    IMG = [self imageFromSampleBuffer:imageSampleBuffer]; 
    [[self delegate] imageReady:IMG]; 
}]; 
} 

Creating an NSImage from the data in the sample buffer — I think the problem is here:

- (NSImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer 
{ 
// Get a CMSampleBuffer's Core Video image buffer for the media data 
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
// Lock the base address of the pixel buffer 
CVPixelBufferLockBaseAddress(imageBuffer, 0); 

// Get the base address of the pixel buffer 
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

// Get the number of bytes per row for the pixel buffer 
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
// Get the pixel buffer width and height 
size_t width = CVPixelBufferGetWidth(imageBuffer); 
size_t height = CVPixelBufferGetHeight(imageBuffer); 

// Create a device-dependent RGB color space 
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

// Create a bitmap graphics context with the sample buffer data 
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
              bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
// Create a Quartz image from the pixel data in the bitmap graphics context 
CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
// Unlock the pixel buffer 
CVPixelBufferUnlockBaseAddress(imageBuffer,0); 

// Free up the context and color space 
CGContextRelease(context); 
CGColorSpaceRelease(colorSpace); 

// Create an image object from the Quartz image 
//UIImage *image = [UIImage imageWithCGImage:quartzImage]; 
NSImage * image = [[NSImage alloc] initWithCGImage:quartzImage size:NSZeroSize]; 
// Release the Quartz image 
CGImageRelease(quartzImage); 

return (image); 
} 

I have not found the cause, but creating the NSImage this way works:

// Continue as appropriate. 
//IMG = [self imageFromSampleBuffer:imageSampleBuffer]; 
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer); 
if (imageBuffer) { 
    CVBufferRetain(imageBuffer); 
    NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]]; 
    IMG = [[NSImage alloc] initWithSize:[imageRep size]]; 
    [IMG addRepresentation:imageRep]; 
    CVBufferRelease(imageBuffer); 
} 

Nothing jumps out from a quick look through your code... but there is one reason an AVCapture image can come out dark: the camera needs some time to auto-adjust focus, exposure, and so on. Could you be calling your 'captureImage' method immediately after the 'settingPreview' method that starts the capture session running? – rickster


The code mostly comes from Apple's samples. **settingPreview is called when the program starts.** The point is that the saved image is a bit darker; the difference from the live view is small, as if I had slightly lowered the brightness. I think the problem happens in the conversion from **CMSampleBufferRef** to **NSImage**. – Mex

Answer


Solution found

The problem was in imageFromSampleBuffer: I used the code from this answer instead, and the image is perfect.