2012-04-07 88 views
5

How to capture an image in iOS without showing a preview

I want to capture an image at a specific moment, for example when a button is pressed, but I do not want to display any video preview screen. I guess captureStillImageAsynchronouslyFromConnection: is what I need for this situation. Currently I can capture an image if I display the video preview. However, if I remove the code that shows the preview, the app crashes with the output below:

2012-04-07 11:25:54.898 imCapWOPreview[748:707] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection:completionHandler:] - inactive/invalid connection passed.' *** First throw call stack: (0x336ee8bf 0x301e21e5 0x3697c35d 0x34187 0x33648435 0x310949eb 0x310949a7 0x31094985 0x310946f5 0x3109502d 0x3109350f 0x31092f01 0x310794ed 0x31078d2d 0x37db7df3 0x336c2553 0x336c24f5 0x336c1343 0x336444dd 0x336443a5 0x37db6fcd 0x310a7743 0x33887 0x3382c) terminate called throwing an exception (lldb)

So here is my implementation:

BIDViewController.h:

#import <UIKit/UIKit.h> 
#import <AVFoundation/AVFoundation.h> 

@interface BIDViewController : UIViewController 
{ 
    AVCaptureStillImageOutput *stillImageOutput; 
} 
@property (strong, nonatomic) IBOutlet UIView *videoPreview; 
- (IBAction)doCap:(id)sender; 

@end 
Inside BIDViewController.m, the relevant parts are:

#import "BIDViewController.h" 

@interface BIDViewController() 

@end 

@implementation BIDViewController 
@synthesize capturedIm; 
@synthesize videoPreview; 

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setupAVCapture];
}

- (BOOL)setupAVCapture
{
    NSError *error = nil;

    AVCaptureSession *session = [AVCaptureSession new];
    [session setSessionPreset:AVCaptureSessionPresetHigh];

    /*
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = self.videoPreview.bounds;
    [self.videoPreview.layer addSublayer:captureVideoPreviewLayer];
    */

    // Select a video device, make an input
    AVCaptureDevice *backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
    if (error)
        return NO;
    if ([session canAddInput:input])
        [session addInput:input];

    // Make a still image output
    stillImageOutput = [AVCaptureStillImageOutput new];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    if ([session canAddOutput:stillImageOutput])
        [session addOutput:stillImageOutput];

    [session startRunning];

    return YES;
}

- (IBAction)doCap:(id)sender {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) {
            // Do something with the captured image
        }];
}

With the code above, the crash occurs as soon as doCap is called. On the other hand, if I remove the following comments in the setupAVCapture function:

/* 
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session]; 
captureVideoPreviewLayer.frame = self.videoPreview.bounds; 
[self.videoPreview.layer addSublayer:captureVideoPreviewLayer];  
*/ 

then it works without any problem.

In summary, my question is: how can I capture an image at a controlled instant without displaying a preview?

+0

The simplest way would be 'self.videoPreview.hidden = YES;' – Felix 2012-04-07 11:26:54

+0

That code works on my iPhone 4S – Felix 2012-04-07 11:39:10

+0

@phix23 Hiding videoPreview also works for me... The next question is: is there a performance penalty with this approach, i.e. redundant processing spent sending the video preview data to a hidden layer? – 2012-04-07 14:11:12

Answers

8

I use the following code to capture from the front-facing camera (if available) or, failing that, from the back camera. Works on my iPhone 4S.

-(void)viewDidLoad{
    [super viewDidLoad];

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureDevice *device = [self frontFacingCameraIfAvailable];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    // stillImageOutput is an instance variable declared in the .h file:
    // "AVCaptureStillImageOutput *stillImageOutput;"
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];

    [session addOutput:stillImageOutput];

    [session startRunning];
}

-(AVCaptureDevice *)frontFacingCameraIfAvailable{

    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;

    for (AVCaptureDevice *device in videoDevices){
        if (device.position == AVCaptureDevicePositionFront){
            captureDevice = device;
            break;
        }
    }

    // Couldn't find one on the front, so just get the default video device.
    if (!captureDevice){
        captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    return captureDevice;
}

-(IBAction)captureNow{

    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections){
        for (AVCaptureInputPort *port in [connection inputPorts]){
            if ([[port mediaType] isEqual:AVMediaTypeVideo]){
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error){

        // kCGImagePropertyExifDictionary comes from the ImageIO framework.
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments){
            // Do something with the attachments if you want to.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else {
            NSLog(@"no attachments");
        }

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];

        // vImage is assumed to be a UIImageView outlet.
        self.vImage.image = image;
    }];
}
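
As a usage note, the "do something" step in the completion handler could also persist the capture instead of only displaying it. Below is a minimal sketch of one way to do that; the helper method name is hypothetical, and UIImageWriteToSavedPhotosAlbum is the standard UIKit function for saving to the photo library:

// Hypothetical helper: call it from the completionHandler above, passing
// imageSampleBuffer, to save the captured still image to the photo library.
- (void)saveStillImageFromSampleBuffer:(CMSampleBufferRef)imageSampleBuffer
{
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    // Passing nil/NULL skips the save-completion callback.
    UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);
}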
+0

Works great. Thanks! – 2012-10-10 15:40:29

+1

I get an error, or stillImageOutput.connections contains 0 objects. What is wrong? – 2013-08-14 10:44:32

+0

This is a really good solution. But after capturing the image, the screen goes blank for a second. It is not like that in the SnapChat app. How can I get the behavior of the SnapChat app? – Satyam 2014-05-02 05:42:14

1

Well, I was facing a similar problem, where captureStillImageAsynchronouslyFromConnection:stillImageConnection was raising an exception saying the connection passed is invalid. Later on, I figured out that the problem was solved once I made the session and stillImageOutput into properties so that their values are retained.
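
For illustration, here is a minimal sketch of that fix under ARC. The property declarations could go in a class extension in BIDViewController.m (as in the question's code); the exact names are just one possible choice, and the point is that strong references keep the session alive after setup returns, so the capture connection stays valid.

// Sketch: hold strong references so the session is not released
// when setupAVCapture returns (assumed ARC).
@interface BIDViewController ()
@property (strong, nonatomic) AVCaptureSession *session;
@property (strong, nonatomic) AVCaptureStillImageOutput *stillImageOutput;
@end

- (BOOL)setupAVCapture
{
    NSError *error = nil;

    self.session = [[AVCaptureSession alloc] init];
    [self.session setSessionPreset:AVCaptureSessionPresetHigh];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (!input) {
        NSLog(@"ERROR: could not open camera: %@", error);
        return NO;
    }
    if ([self.session canAddInput:input])
        [self.session addInput:input];

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                    AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:outputSettings];
    if ([self.session canAddOutput:self.stillImageOutput])
        [self.session addOutput:self.stillImageOutput];

    [self.session startRunning];
    return YES;
}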
