2016-09-17

I'm writing a plugin for a program that doesn't live in a Cocoa environment (think a C++ command-line program) — specifically, the v8 Node addon system. I want this plugin to record the screen, and therefore to use AVCaptureSession and friends. So basically: calling Foundation from a non-Cocoa application, do I need an NSRunLoop?

void start(/*entry*/) 
{ 
    // No run loop is *necessarily* present. 
    AVCaptureSession * session = ... 
} 

void stop (/*entry*/) 
{ 
    // etc.. 
} 

In practice, I will most likely spawn a new thread to do all of this, so that nothing blocks. My question is how much infrastructure I need to set up around it. I'm almost certain I need an @autoreleasepool {}, but should I really start my own default NSRunLoop running in that thread? If I don't, I get the impression that some of the trickiness in AVCapture and the like could fail:

BOOL isStillRecording = YES; 
void start(/*entry*/) 
{ 
    // setup avcapture and what have you. 
    // Drive the thread's run loop. (NSRunLoop can't usefully be alloc/init'ed;
    // use the one Foundation lazily creates for the current thread.)
    NSRunLoop *theRL = [NSRunLoop currentRunLoop]; 
    while (isStillRecording && [theRL runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]]); 
} 

void stop(/*entry*/) 
{ 
    // Tear down AVCapture, possibly via dispatch_async so we don't block the thread running start(). 
    isStillRecording = NO; 
} 

Answer


UPDATE Actually, I take that back. It looks like you do need to use the current run loop to get some delegate callbacks, such as AVCaptureFileOutput's. This creates a fine QuickTime movie of the screen and microphone:

#import <AVFoundation/AVFoundation.h> 

@interface Capturer : NSObject <AVCaptureFileOutputRecordingDelegate> 

@property(nonatomic) AVCaptureSession *session; 
@property(nonatomic) AVCaptureMovieFileOutput *movieOutput; 
@property(nonatomic, copy) void(^finishBlock)(NSError*); 

@end 

@implementation Capturer 
- (instancetype)init 
{ 
    self = [super init]; 
    if (self) { 
     [self setup]; 
    } 
    return self; 
} 

- (void)setup { 
    self.session = [[AVCaptureSession alloc] init]; 

    // capture the screen 
    CGDirectDisplayID displayId = CGMainDisplayID(); 
    AVCaptureScreenInput *screenInput = [[AVCaptureScreenInput alloc] initWithDisplayID:displayId]; 

    [self.session addInput:screenInput]; 

    // capture microphone input 
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio]; 
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil]; 
    [self.session addInput:audioInput]; 

    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init]; 

    [self.session addOutput:self.movieOutput]; 
} 

- (void)start { 
    [self.session startRunning]; 
    NSURL *movieURL = [[NSURL fileURLWithPath:[[NSFileManager defaultManager] currentDirectoryPath]] URLByAppendingPathComponent:@"output.mov"]; 
    [[NSFileManager defaultManager] removeItemAtURL:movieURL error:nil]; 
    NSLog(@"recording to %@", movieURL.path); 
    [self.movieOutput startRecordingToOutputFileURL:movieURL recordingDelegate:self]; 
} 

- (void)stop:(void (^)(NSError *))finishBlock { 
    self.finishBlock = finishBlock; 
    [self.movieOutput stopRecording]; 
} 

// MARK: AVCaptureFileOutputRecordingDelegate 
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error { 
    NSLog(@"Finished recording to %@ with error %@", outputFileURL.path, error); 
    [self.session stopRunning]; 
    self.finishBlock(error); 
} 

@end 


int main(int argc, const char * argv[]) { 
    @autoreleasepool { 
     Capturer *c = [[Capturer alloc] init]; 
     NSRunLoop *runLoop = [NSRunLoop currentRunLoop]; 

     [c start]; 

     // record 10s' worth 
     __block BOOL finished = NO; 
     [c performSelector:@selector(stop:) withObject:^(NSError *error) { 
      finished = YES; 
     } afterDelay:10]; 

     // cribbed from https://gist.github.com/syzdek/3220789 
     while(!finished && [runLoop runMode:NSDefaultRunLoopMode beforeDate:[NSDate dateWithTimeIntervalSinceNow:2]]) NSLog(@"waiting"); 

    } 
    return 0; 
} 

PREVIOUSLY

I've used AVFoundation in command-line apps, and I haven't needed an NSRunLoop. I did need to

  1. create an @autoreleasepool (as you say), and
  2. since AVFoundation is fairly asynchronous, use semaphores to make sure I didn't exit before processing had finished.
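The semaphore pattern in point 2 can be sketched as follows. This is a minimal illustration, not code from the answer: the dispatch_async block stands in for whatever asynchronous AVFoundation call you're waiting on (for example, the completion handler of an export or capture teardown).

    #import <Foundation/Foundation.h>

    int main(int argc, const char * argv[]) {
        @autoreleasepool {
            // Semaphore starts at 0, so the first wait blocks until a signal.
            dispatch_semaphore_t done = dispatch_semaphore_create(0);

            // Stand-in for an asynchronous AVFoundation operation whose
            // completion handler fires on some background queue.
            dispatch_async(dispatch_get_global_queue(QOS_CLASS_DEFAULT, 0), ^{
                NSLog(@"async work finished");
                dispatch_semaphore_signal(done);
            });

            // Block main() until the completion handler has run, so the
            // process doesn't exit while AVFoundation is still working.
            dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
        }
        return 0;
    }

Unlike the run-loop approach in the update above, this works without servicing a run loop at all, which is why it suffices for the AVFoundation APIs that deliver results through completion blocks rather than delegate callbacks scheduled on a run loop.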