2012-01-06

I want to build a camera app using AVCaptureSession. For now I just want to verify that the video input works, but it doesn't seem to receive any input and I can't understand why. It's as if the AVCaptureSession isn't found; the console logs "Couldn't add video input".

- (void)viewDidLoad 
{ 
    [super viewDidLoad]; 

    session = [[AVCaptureSession alloc] init]; 

    [self addVideoPreviewLayer]; 

    CGRect layerRect = [[[self view] layer] bounds]; 

    [[self previewLayer] setBounds:layerRect]; 
    [[self previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect), 
                    CGRectGetMidY(layerRect))]; 
    [[[self view] layer] addSublayer:[self previewLayer]]; 

    UIButton *myButton = [UIButton buttonWithType:UIButtonTypeRoundedRect]; 
    myButton.frame = CGRectMake(80, 320, 200, 44); 
    [myButton setTitle:@"Click Me!" forState:UIControlStateNormal]; 
    [myButton addTarget:self action:@selector(scanButtonPressed) forControlEvents:UIControlEventTouchDown]; 
    [self.view addSubview:myButton]; 
} 

-(void)addVideoPreviewLayer 
{ 
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self session]] autorelease]]; 
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill]; 
} 

-(void)addVideoInput 
{ 
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; 
    if (videoDevice) 
    { 
        NSError *error; 
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error]; 
        if (!error) 
        { 
            if ([[self session] canAddInput:videoIn]) 
                [[self session] addInput:videoIn]; 
            else 
                NSLog(@"Couldn't add video input");  
        } 
        else 
            NSLog(@"Couldn't create video input"); 
    } 
    else 
        NSLog(@"Couldn't create video capture device"); 
} 

-(IBAction)scanButtonPressed 
{ 
    [self addVideoInput]; 
} 
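
For context, the code above relies on a `session` ivar and a `previewLayer` property that aren't shown in the question. A minimal sketch of what the controller's interface might look like (these declarations are my assumption, not part of the original post, written for the manual retain/release era the code uses):

    #import <UIKit/UIKit.h>
    #import <AVFoundation/AVFoundation.h>

    @interface CameraViewController : UIViewController
    {
        AVCaptureSession *session;
    }

    // Assumed declarations; the question only shows them being used via
    // [self session] and [self previewLayer].
    @property (nonatomic, retain) AVCaptureSession *session;
    @property (nonatomic, retain) AVCaptureVideoPreviewLayer *previewLayer;

    @end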

What does this code produce (console output)? – Till 2012-01-06 21:32:57


@Till It logs "Couldn't add video input". – ilaunchpad 2012-01-06 21:43:40

Answer


Here's how I do it. This is condensed from several separate methods, so it may not compile as-is, and most of the error handling has been removed.

// Create the session and pick a preset suitable for frame processing.
captureSession = [[AVCaptureSession alloc] init]; 
captureSession.sessionPreset = AVCaptureSessionPresetMedium; 

// Prefer the front camera; fall back to the default video device.
AVCaptureDevice *videoDevice = [self frontFacingCamera]; 
if (videoDevice == nil) { 
    videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];  
} 

if (videoDevice) { 
    NSError *error = nil; 
    videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error]; 

    if ([captureSession canAddInput:videoInput]) 
        [captureSession addInput:videoInput]; 
    else 
        NSLog(@"Couldn't add video input"); 
} 

videoOutput = [[AVCaptureVideoDataOutput alloc] init]; 

[videoOutput setAlwaysDiscardsLateVideoFrames:NO]; 

// Deliver BGRA frames to the sample buffer delegate.
// capture_queue is a serial dispatch queue created elsewhere in the original code.
NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                                                           forKey:(id)kCVPixelBufferPixelFormatTypeKey]; 
[videoOutput setVideoSettings:videoSettings]; 
[videoOutput setSampleBufferDelegate:self queue:capture_queue]; 

if ([captureSession canAddOutput:videoOutput]) 
    [captureSession addOutput:videoOutput]; 
else 
    NSLog(@"Couldn't add video output");  

// The connection only exists after the output has been added to the session,
// so constrain the frame rate here. frameRate is defined elsewhere in the original code.
AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo]; 

if (conn.supportsVideoMinFrameDuration) 
    conn.videoMinFrameDuration = CMTimeMake(1, frameRate); 
if (conn.supportsVideoMaxFrameDuration) 
    conn.videoMaxFrameDuration = CMTimeMake(1, frameRate); 

// Start the session and attach it to the preview layer.
[captureSession startRunning]; 

previewLayer.session = captureSession; 
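
The `frontFacingCamera` helper referenced above isn't shown in the answer. A minimal sketch, assuming the pre-iOS 10 device enumeration API that the rest of this answer uses, might look like this:

    // Returns the front camera, or nil if the device doesn't have one.
    // Sketch only; uses the iOS 4/5-era enumeration API
    // ([AVCaptureDevice devicesWithMediaType:]).
    - (AVCaptureDevice *)frontFacingCamera
    {
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if ([device position] == AVCaptureDevicePositionFront) {
                return device;
            }
        }
        return nil;
    }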

Thanks. I changed a few things and it works. – ilaunchpad 2012-01-09 16:42:57