2012-04-23 183 views
11

EDIT: The strangest thing: when this code runs from the full application, everything works fine, but I had always been running my movie-creation code from a unit test, and only there does it fail. Trying to figure out why... When run from a unit test, the CATextLayer does not appear in the AVMutableComposition.

I want to combine video + audio + text using AVMutableComposition and export the result to a new video.

My code is based on the AVEditDemo sample from WWDC '10.

I gave the CATextLayer a purple background so I could verify that it really is exported into the movie, but no text is displayed... I tried playing with various font, position and color settings, but nothing helped, so I decided to post the code here and see whether anyone has run into something similar and can tell me what I'm missing.

The code is below (self.audio and self.video are AVURLAssets):

CMTime exportDuration = self.audio.duration; 

AVMutableComposition *composition = [[AVMutableComposition alloc] init]; 

AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
AVAssetTrack *videoTrack = [[self.video tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

// add the video in loop until the audio ends 
CMTime currStartTime = kCMTimeZero; 
while (CMTimeCompare(currStartTime, exportDuration) < 0) {
    CMTime timeRemaining = CMTimeSubtract(exportDuration, currStartTime);
    CMTime currLoopDuration = self.video.duration;

    if (CMTimeCompare(currLoopDuration, timeRemaining) > 0) {
        currLoopDuration = timeRemaining;
    }
    CMTimeRange currLoopTimeRange = CMTimeRangeMake(kCMTimeZero, currLoopDuration);

    [compositionVideoTrack insertTimeRange:currLoopTimeRange ofTrack:videoTrack
                                    atTime:currStartTime error:nil];

    currStartTime = CMTimeAdd(currStartTime, currLoopDuration);
}

AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; 

AVAssetTrack *audioTrack = [self.audio.tracks objectAtIndex:0]; 
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.audio.duration) ofTrack:audioTrack atTime:kCMTimeZero error:nil]; 

AVMutableVideoComposition *videoComposition; 

// the text layer part - THIS IS THE PART THAT DOESN'T WORK WELL 
CALayer *animatedTitleLayer = [CALayer layer]; 
CATextLayer *titleLayer = [[CATextLayer alloc] init]; 
titleLayer.string = @"asdfasdf"; 
titleLayer.alignmentMode = kCAAlignmentCenter; 
titleLayer.bounds = CGRectMake(0, 0, self.video.naturalSize.width/2, self.video.naturalSize.height/2); 
titleLayer.opacity = 1.0; 
titleLayer.backgroundColor = [UIColor purpleColor].CGColor; 

[animatedTitleLayer addSublayer:titleLayer]; 
animatedTitleLayer.position = CGPointMake(self.video.naturalSize.width/2.0, self.video.naturalSize.height/2.0); 

// build a Core Animation tree that contains both the animated title and the video. 
CALayer *parentLayer = [CALayer layer]; 
CALayer *videoLayer = [CALayer layer]; 
parentLayer.frame = CGRectMake(0, 0, self.video.naturalSize.width, self.video.naturalSize.height); 
videoLayer.frame = CGRectMake(0, 0, self.video.naturalSize.width, self.video.naturalSize.height); 
[parentLayer addSublayer:videoLayer]; 
[parentLayer addSublayer:animatedTitleLayer]; 

videoComposition = [AVMutableVideoComposition videoComposition]; 
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer]; 

AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, exportDuration); 
AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack]; 

passThroughInstruction.layerInstructions = [NSArray arrayWithObject:passThroughLayer]; 
videoComposition.instructions = [NSArray arrayWithObject:passThroughInstruction]; 

videoComposition.frameDuration = CMTimeMake(1, 30); 
videoComposition.renderSize = self.video.naturalSize; 

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality]; 

exportSession.videoComposition = videoComposition; 
exportSession.outputURL = [NSURL fileURLWithPath:self.outputFilePath]; 
exportSession.outputFileType = AVFileTypeQuickTimeMovie; 

[exportSession exportAsynchronouslyWithCompletionHandler:^() { 
    // save the video ... 
}]; 
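
(Not part of the original question, but when debugging an export like this it can help to check the session status rather than leaving the completion handler empty; a minimal sketch using the same exportSession variable:)

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // Sketch only: report whether the export actually succeeded,
    // so a silent export failure is not mistaken for a missing overlay.
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"export finished: %@", exportSession.outputURL);
    } else {
        NSLog(@"export failed (%ld): %@", (long)exportSession.status, exportSession.error);
    }
}];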
+0

I'm not sure... maybe this can help: http://stackoverflow.com/questions/7205820/iphone-watermark-on-recorded-video – 2012-04-28 15:30:25

+0

Thanks, but I'm doing almost exactly what is described there. If anyone spots a significant difference between my code and the code in that answer, I'd really appreciate it. – yonix 2012-04-29 08:19:33

+0

Have you tried setting the font size? – 2012-04-30 00:55:56
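
(For illustration, the kind of font-related properties the comment refers to, applied to the question's titleLayer; the font name and size below are arbitrary examples, not values from the question:)

titleLayer.font = (__bridge CFTypeRef)@"Helvetica-Bold";  // font name is an arbitrary example
titleLayer.fontSize = 36.0;                                // explicit size instead of the default
titleLayer.foregroundColor = [UIColor whiteColor].CGColor; // explicit text color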

Answers

0

Still investigating, but AFAICT a CATextLayer inside an AVMutableVideoComposition simply does not work from a logic unit-test target right now, and this functionality has to be tested from a regular application target.

1

I ran into the same problem in a different context. In my case I had moved the preparation of the AVMutableComposition onto a background thread. Moving the preparation back to the main queue/thread made the CATextLayer overlay work correctly again.

This may not apply directly to your unit-test context, but my guess is that CATextLayer/AVFoundation depends on some part of UIKit/AppKit being up and available (a drawing context? the current screen?), which could explain the failures we are both seeing.
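
A rough sketch of the change (the helper method names here are placeholders, not from the original code):

// Do the composition + Core Animation layer setup on the main queue,
// then kick off the export. -buildCompositionAndVideoComposition and
// -startExport are hypothetical helpers standing in for the code in the question.
dispatch_async(dispatch_get_main_queue(), ^{
    [self buildCompositionAndVideoComposition]; // creates the AVMutableComposition, CATextLayer tree and animationTool
    [self startExport];                         // creates the AVAssetExportSession and exports asynchronously
});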

+0

Hi Adam, do you know whether it's possible to merge images and video using AVMutableCompositionTracks (similar to how you merge videos), or whether you need CALayers to composite images into the merged video? Question here: http://stackoverflow.com/questions/34937862/merge-videos-images-in-avmutablecomposition-using-avmutablecompositiontrack – Crashalot 2016-01-22 03:00:51

1

I had a problem where almost everything rendered fine, including a CALayer with its contents set to a CGImage. Everything except the CATextLayer text: if I gave the CATextLayer a background color, it rendered perfectly with the correct beginTime and duration; only the actual text refused to appear. This was all on the simulator. Then I ran it on the phone: it worked perfectly.

Conclusion: the simulator renders the video just fine... until you use a CATextLayer.

0

My problem was that I needed to set contentsGravity = kCAGravityBottomLeft; otherwise my text ended up off screen.
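
For example (titleLayer as in the question's code):

// Pin the layer's contents to the bottom-left so the text stays inside
// the rendered frame instead of ending up off screen.
titleLayer.contentsGravity = kCAGravityBottomLeft;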
