2011-12-15

I'm recording audio/video with AVFoundation. Before starting to capture video/audio, I need to play a sound via system sound. This works the first time, but when I try a second time the system audio doesn't play. My guess is that something in AVFoundation isn't being released properly. In short: I can't play system sounds after capturing audio/video.

In my app delegate, I have this code in the applicationDidFinishLaunching method:

VKRSAppSoundPlayer *aPlayer = [[VKRSAppSoundPlayer alloc] init]; 
[aPlayer addSoundWithFilename:@"sound1" andExtension:@"caf"]; 
self.appSoundPlayer = aPlayer; 
[aPlayer release]; 

And this method:

- (void)playSound:(NSString *)sound 
{ 
    [appSoundPlayer playSound:sound]; 
} 

As you can see, I'm using VKRSAppSoundPlayer, and it works great!

In the view, I have this code:

- (void) startSession 
{ 
    self.session = [[AVCaptureSession alloc] init]; 

    [session beginConfiguration]; 
    if([session canSetSessionPreset:AVCaptureSessionPresetMedium]) 
        session.sessionPreset = AVCaptureSessionPresetMedium; 
    [session commitConfiguration]; 

    CALayer *viewLayer = [videoPreviewView layer]; 

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session]; 
    captureVideoPreviewLayer.frame = viewLayer.bounds; 
    [viewLayer addSublayer:captureVideoPreviewLayer]; 

    self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:[self frontFacingCameraIfAvailable] error:nil]; 
    self.audioInput = [AVCaptureDeviceInput deviceInputWithDevice:[self audioDevice] error:nil]; 

    if(videoInput){ 
        self.videoOutput = [[AVCaptureMovieFileOutput alloc] init]; 

        [session addOutput:videoOutput]; 
        //[videoOutput release]; 

        if([session canAddInput:videoInput]){ 
            [session addInput:videoInput]; 
        } 
        //[videoInput release]; 

        [session removeInput:[self audioInput]]; 
        if([session canAddInput:audioInput]){ 
            [session addInput:audioInput]; 
        } 
        //[audioInput release]; 

        NSLog(@"startRunning!"); 
        [session startRunning]; 

        [self startRecording]; 

        if(![self recordsVideo]) 
            [self showAlertWithTitle:@"Video Recording Unavailable" msg:@"This device can't record video."]; 
    } 
} 

- (void) stopSession 
{ 
    [session stopRunning]; 
    [session release]; 
} 


- (AVCaptureDevice *)frontFacingCameraIfAvailable 
{ 
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]; 
    AVCaptureDevice *captureDevice = nil; 

    BOOL cameraFound = NO; 

    for (AVCaptureDevice *device in videoDevices) 
    { 
        NSLog(@"1 frontFacingCameraIfAvailable %d", device.position); 
        if (device.position == AVCaptureDevicePositionBack){ 
            NSLog(@"1 frontFacingCameraIfAvailable FOUND"); 

            captureDevice = device; 
            cameraFound = YES; 
            break; 
        } 
    } 

    if(cameraFound == NO){ 
        for (AVCaptureDevice *device in videoDevices) 
        { 
            NSLog(@"2 frontFacingCameraIfAvailable %d", device.position); 
            if (device.position == AVCaptureDevicePositionFront){ 
                NSLog(@"2 frontFacingCameraIfAvailable FOUND"); 

                captureDevice = device; 
                break; 
            } 
        } 
    } 

    return captureDevice; 
} 

- (AVCaptureDevice *) audioDevice 
{ 
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio]; 
    if ([devices count] > 0) { 
        return [devices objectAtIndex:0]; 
    } 
    return nil; 
} 

- (void) startRecording 
{ 
#if _Multitasking_ 
    if ([[UIDevice currentDevice] isMultitaskingSupported]) { 
        [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]]; 
    } 
#endif 

    [videoOutput startRecordingToOutputFileURL:[self generatenewVideoPath] 
                             recordingDelegate:self]; 
} 

- (void) stopRecording 
{ 
    [videoOutput stopRecording]; 

} 

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput 
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL 
      fromConnections:(NSArray *)connections error:(NSError *)error 
{ 
    NSFileManager *man = [[NSFileManager alloc] init]; 
    NSDictionary *attrs = [man attributesOfItemAtPath:[outputFileURL path] error:NULL]; 
    NSString *fileSize = [NSString stringWithFormat:@"%llu", [attrs fileSize]]; 
    [man release]; 

    // close this screen 
    [self exitScreen]; 
} 

-(BOOL)recordsVideo 
{ 
    AVCaptureConnection *videoConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo 
                    fromConnections:[videoOutput connections]]; 
    return [videoConnection isActive]; 
} 

-(BOOL)recordsAudio 
{ 
    AVCaptureConnection *audioConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeAudio 
                    fromConnections:[videoOutput connections]]; 
    return [audioConnection isActive]; 
} 

If I call [videoInput release]; and [audioInput release]; I get a bad access error. That's why they are commented out. This may be part of the problem.

If I just play the system sound n times, it works; but if I record first, the sound won't play afterwards.

Any ideas?


You need a better understanding of what self.iVar means versus what release means. Release decrements the retain count; at 0 the object becomes eligible for deallocation. Assigning through self.iVar (assuming you've declared it as a property) retains the ivar, so you can release your local reference afterwards. But I don't think that's your audio problem. – Rayfleck 2011-12-15 17:50:30
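To illustrate the retain/release pattern Rayfleck describes, here is a minimal pre-ARC sketch using the asker's own VKRSAppSoundPlayer snippet; it assumes appSoundPlayer is declared as a retain property, which the original code implies but does not show:

```objc
// Assumed declaration in the app delegate's @interface:
// @property (nonatomic, retain) VKRSAppSoundPlayer *appSoundPlayer;

VKRSAppSoundPlayer *aPlayer = [[VKRSAppSoundPlayer alloc] init]; // retain count 1 (alloc)
self.appSoundPlayer = aPlayer;  // retain-property setter retains -> count 2
[aPlayer release];              // balances the alloc -> count 1, now owned solely by the property
```

Releasing the local reference here is correct precisely because the property setter took its own ownership first; releasing an ivar that was never retained through the setter is what produces the bad-access crashes mentioned below.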

Answer


The correct way to release an AVCaptureSession is:

- (void) destroySession { 

    // Notify the view that the session will end 
    if ([delegate respondsToSelector:@selector(captureManagerSessionWillEnd:)]) { 
        [delegate captureManagerSessionWillEnd:self]; 
    } 

    // remove the device inputs 
    [session removeInput:[self videoInput]]; 
    [session removeInput:[self audioInput]]; 

    // release 
    [session release]; 

    // remove AVCamRecorder 
    [recorder release]; 

    // Notify the view that the session has ended 
    if ([delegate respondsToSelector:@selector(captureManagerSessionEnded:)]) { 
        [delegate captureManagerSessionEnded:self]; 
    } 
} 
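The delegate callbacks used above are not declared anywhere in the answer. A protocol matching the two respondsToSelector: checks might look like this; the selector names come from the code above, but the protocol and class names are assumptions for illustration:

```objc
@class CaptureManager; // hypothetical class that owns destroySession

@protocol CaptureManagerDelegate <NSObject>
@optional
// Called just before the inputs are removed and the session released
- (void)captureManagerSessionWillEnd:(CaptureManager *)manager;
// Called once teardown is complete
- (void)captureManagerSessionEnded:(CaptureManager *)manager;
@end
```

Declaring the methods @optional is what makes the respondsToSelector: guards in destroySession necessary.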

If you have some kind of release problem (bad access), I can suggest moving your code out of the current "messy" project into a fresh project and debugging the problem there.

That's just what I did when I ran into a similar problem. I shared it on GitHub; you may find the project useful: AVCam-CameraReleaseTest