2014-03-24 62 views
1

I'm using GPUImage in my app and trying to filter videos. Live video filtering works fine. The problem arises when I try to read a video from the file system into memory and apply a filter using the code posted on the sunsetlakessoftware tutorial page and in the SimpleVideoFileFilter demo: Filtering video with GPUImage

EDIT: I realized my original post may not have asked a specific enough question. What I'm asking is: how exactly can I read a video from disk into memory, apply a GPUImageFilter, and then overwrite the original with the filtered version?

The app crashes with the following error:

-[AVAssetWriter startWriting] Cannot call method when status is 2

Status 2 is AVAssetWriterStatusCompleted. I've seen the same failure occur with all three of the other AVAssetWriterStatuses as well.

I've posted the relevant code below.

GPUImageFilter *selectedFilter = [self.allFilters objectAtIndex:indexPath.item]; 

// get the file url I stored when the video was initially captured 
NSURL *url = [self.videoURLsByIndexPath objectForKey:self.indexPathForDisplayedImage]; 

GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:url]; 
movieFile.runBenchmark = YES; 
movieFile.playAtActualSpeed = NO; 
[movieFile addTarget:selectedFilter]; // apply the user-selected filter to the file 

unlink([url.absoluteString UTF8String]); // delete the file that was at that file URL so it's writeable 

// A different movie writer than the one I was using for live video capture. 
GPUImageMovieWriter *editingMovieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:url size:CGSizeMake(640.0, 640.0)]; 

[selectedFilter addTarget:editingMovieWriter]; 

editingMovieWriter.shouldPassthroughAudio = YES; 
movieFile.audioEncodingTarget = editingMovieWriter; 
[movieFile enableSynchronizedEncodingUsingMovieWriter:editingMovieWriter]; 

[editingMovieWriter startRecording]; 
[movieFile startProcessing]; // Commenting out this line prevents crash 

// weak variables to prevent retain cycle 
__weak GPUImageMovieWriter *weakWriter = editingMovieWriter; 
__weak id weakSelf = self; 
[editingMovieWriter setCompletionBlock:^{ 
    [selectedFilter removeTarget:weakWriter]; 
    [weakWriter finishRecording]; 
    [weakSelf savePhotosToLibrary]; // use ALAssetsLibrary to write to camera roll 
}]; 

Maybe my problem is with the scope of editingMovieWriter. Or maybe it's that I'm initializing the GPUImageMovie instance with the same URL I'm trying to write to. I've read several posts on the GPUImage GitHub issues page, several related posts on SO, the README, and the tutorial linked above.

Any insight into this problem would be much appreciated. Thanks.

Answers

1

At least one thing here could be behind this: in the code above, you aren't holding on to a strong reference to your movieFile source object.

If this is an ARC-enabled project, that object will be deallocated the instant you complete your setup method (if not, you'll be leaking it). That will stop the movie playback, deallocate the movie itself, and lead to black frames being sent down the filter pipeline (among other potential instabilities).

You need to make movieFile a strongly referenced instance variable to make sure it hangs around past your setup method, since all of the movie processing is asynchronous.
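A minimal sketch of that fix, assuming ARC and a view controller; the class name, method name, and property name here are illustrative, not from the original post:

```objc
#import <GPUImage/GPUImage.h>

@interface FilterViewController ()
// A strong property keeps the source movie alive for the whole
// asynchronous processing run.
@property (nonatomic, strong) GPUImageMovie *movieFile;
@end

@implementation FilterViewController

- (void)applyFilter:(GPUImageFilter *)selectedFilter toVideoAtURL:(NSURL *)url
{
    // Assign to the strong property instead of a local variable, so the
    // movie is not deallocated when this method returns.
    self.movieFile = [[GPUImageMovie alloc] initWithURL:url];
    [self.movieFile addTarget:selectedFilter];
    // ... movie writer setup as in the question, then:
    [self.movieFile startProcessing];
}

@end
```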

+0

Thanks for the response. Good point. I added a strong property to hold my movie file. Unfortunately, it didn't fix the problem. I've found that when `[movieFile startProcessing]` is called, the status of `editingMovieWriter.assetWriter` changes to `AVAssetWriterStatusFailed`. I've confirmed that the GPUImageMovieWriter is a distinct instance (which of course means the AVAssetWriter is a fresh instance as well). I can't seem to pin down why the writer is failing. If you have a moment and any further ideas, that would be great. – geraldWilliam

+1

@geraldWilliam - Can you take the SimpleVideoFileFilter example (which should work cleanly) and modify it to the point where it breaks? That could help identify where things go wrong. –

+0

OK, when I initialized the movie writer with a file URL I knew was invalid, I got the same error. That was a big hint. I went back to my project and generated a new file URL (not the one I was reading from) to write the modified video to. No more crashes. Is the library actively reading from that URL, so that it isn't a valid write destination? I guess I had assumed that initializing the GPUImageMovie with a URL loaded the whole file into memory. Anyway, thanks for your help. – geraldWilliam
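For reference, the change described in this comment can be sketched as follows; the output file name is illustrative, and the final replace step is an assumption about how the original would be overwritten once recording has finished:

```objc
// Read from the original URL, but write the filtered movie to a
// separate temporary URL rather than unlinking the source.
NSURL *outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"filtered.mov"]];
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];

GPUImageMovieWriter *editingMovieWriter =
    [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL
                                             size:CGSizeMake(640.0, 640.0)];

// ... target wiring, startRecording, and startProcessing as before ...

// In the writer's completion block, after -finishRecording, the file at
// outputURL can be moved over the original (e.g. with NSFileManager's
// replaceItemAtURL:withItemAtURL:... method) or saved to the camera roll.
```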

0
Here is a solution:

Declare these as properties so they stay alive during the asynchronous processing:

    var movieFile: GPUImageMovie!
    var movieWriter: GPUImageMovieWriter!
    var filter: GPUImageInput!
    var paths: NSURL!

Then filter the movie and write the result to a second file:

    func startWriting()
    {
        // Show a progress HUD while the asynchronous processing runs.
        let loadingNotification = MBProgressHUD.showHUDAddedTo(self.view, animated: true)
        loadingNotification.mode = MBProgressHUDMode.Indeterminate
        loadingNotification.labelText = "Loading"

        // Step 1: read the source movie from the Documents directory.
        let documentsURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0] as! NSURL
        let pathToMovie = documentsURL.URLByAppendingPathComponent("temp.mov")
        self.movieFile = GPUImageMovie(URL: pathToMovie)
        self.movieFile.runBenchmark = true
        self.movieFile.playAtActualSpeed = false

        // Step 2: attach the filter.
        self.filter = GPUImageGrayscaleFilter()
        self.movieFile.addTarget(self.filter)

        // Step 3: write to a *different* URL than the one being read from,
        // and make sure nothing is already at that path.
        self.paths = documentsURL.URLByAppendingPathComponent("temp1.mov")
        NSFileManager.defaultManager().removeItemAtURL(self.paths, error: nil)

        // Use the source track's natural size for the output movie.
        let anAsset = AVAsset.assetWithURL(pathToMovie) as! AVAsset
        let videoAssetTrack = anAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
        let naturalSize = videoAssetTrack.naturalSize

        self.movieWriter = GPUImageMovieWriter(movieURL: self.paths, size: naturalSize)
        (self.filter as! GPUImageOutput).addTarget(self.movieWriter)

        // Pass audio through only if the asset actually has an audio track.
        self.movieWriter.shouldPassthroughAudio = true
        if anAsset.tracksWithMediaType(AVMediaTypeAudio).count > 0 {
            self.movieFile.audioEncodingTarget = self.movieWriter
        } else {
            self.movieFile.audioEncodingTarget = nil
        }

        self.movieFile.enableSynchronizedEncodingUsingMovieWriter(self.movieWriter)
        self.movieWriter.startRecording()
        self.movieFile.startProcessing()

        self.movieWriter.completionBlock = { [weak self] () -> Void in
            if let strongSelf = self {
                strongSelf.movieWriter.finishRecording()
                // Hand the filtered file off (obj is this class's own helper).
                strongSelf.obj.performWithAsset(strongSelf.paths)
            }
        }

        // Hide the HUD after the processing has had time to finish.
        let delayTime = dispatch_time(DISPATCH_TIME_NOW, Int64(15 * Double(NSEC_PER_SEC)))
        dispatch_after(delayTime, dispatch_get_main_queue()) {
            MBProgressHUD.hideAllHUDsForView(self.view, animated: true)
        }
        hasoutput = true
    }