
iOS: modifying a video frame by frame on the iPhone (adding closed captions)

I'm interested in adding text (subtitle-style) to an existing video on the iPhone. Some answers to this on Stack Overflow suggest overlaying the video with a transparent UIView. That works for playback, but it doesn't let me save the result as a new, modified video.

The only solution I can see is to take the text, pull a frame out of the video, draw the text onto that frame, and then push the modified frame back into the video, replacing the original frame.

Does anyone know how to pull a frame out of a video (I think I can figure out how to add the text) and then push that frame back into the video? If you have any ideas, or know of a tutorial, I would really appreciate it.


Have you looked through the AVFoundation framework documentation? – zoul 2013-03-18 15:06:31


Why save it as a modified video at all, rather than just overlaying the text each time the video is played? Editing video is CPU-intensive and therefore drains the battery on a mobile device. – codeghost 2013-03-18 15:10:21


The iPhone already has video-editing capabilities, as well as a full iMovie suite, so I don't see how adding text to frames would be any more taxing... – geekyaleks 2013-03-18 15:29:45

Answers

5

You don't need to do this on a frame-by-frame basis. AVFoundation has supported subtitles since iOS 4.0.

For example, you can create an AVMutableComposition and add a subtitle track on top of the video track. AVMediaTypeSubtitle is the media type for subtitles (or AVMediaTypeClosedCaption for closed captions). You can then hand the composition to a player or to an AVAssetWriter. Saves you all the hassle.
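In case it helps, here is a minimal sketch of that idea in the same Swift style as the code further down. It assumes the subtitles already exist as a subtitle track inside a second asset (for example a QuickTime movie with an embedded subtitle track); videoURL, subtitleURL and the variable names are placeholders, not part of any existing project:

    // Source assets: the video, plus a second asset that already contains a subtitle track
    var videoAsset = AVURLAsset(URL: videoURL, options: nil)
    var subtitleAsset = AVURLAsset(URL: subtitleURL, options: nil)

    var composition = AVMutableComposition()
    var fullRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration)

    // Copy the video track into the composition
    var videoTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    videoTrack.insertTimeRange(fullRange, ofTrack: videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack, atTime: kCMTimeZero, error: nil)

    // Lay the subtitle track on top (AVMediaTypeClosedCaption works the same way)
    var subtitleTrack = composition.addMutableTrackWithMediaType(AVMediaTypeSubtitle, preferredTrackID: kCMPersistentTrackID_Invalid)
    subtitleTrack.insertTimeRange(fullRange, ofTrack: subtitleAsset.tracksWithMediaType(AVMediaTypeSubtitle)[0] as! AVAssetTrack, atTime: kCMTimeZero, error: nil)

    // The composition can now be played (AVPlayerItem(asset: composition)) or handed to an export/writer pipeline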


Awesome - thank you, I will look into it.. – geekyaleks 2013-03-18 16:39:07


I'm interested in your solution, but I couldn't find out how to set the text of a subtitle using "AVMediaTypeSubtitle". Could you give an example? Thanks! – lansher1985 2014-05-09 09:55:56

0

For those who want to edit a movie frame by frame, use AVReaderWriter. Although it is an OS X Apple code sample, AVFoundation is available on both platforms with very few changes.
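For reference, the heart of that approach (decode each frame, modify it, re-encode it) looks roughly like this in current Swift. This is only a sketch of the general AVAssetReader/AVAssetWriter technique, not the AVReaderWriter sample itself; rewriteFrames, srcURL and dstURL are placeholder names, and audio handling is omitted:

    import AVFoundation

    // Sketch: decode every video frame, let the caller modify it, re-encode to a new file
    func rewriteFrames(from srcURL: URL, to dstURL: URL) throws {
        let asset = AVURLAsset(url: srcURL)
        let track = asset.tracks(withMediaType: .video)[0]

        // Reader: hands back decoded BGRA pixel buffers
        let reader = try AVAssetReader(asset: asset)
        let output = AVAssetReaderTrackOutput(track: track, outputSettings:
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
        reader.add(output)

        // Writer: re-encodes the (possibly modified) frames as H.264
        let writer = try AVAssetWriter(outputURL: dstURL, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: track.naturalSize.width,
            AVVideoHeightKey: track.naturalSize.height])
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
            sourcePixelBufferAttributes: nil)
        writer.add(input)

        reader.startReading()
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)

        input.requestMediaDataWhenReady(on: DispatchQueue(label: "frame-rewrite")) {
            while input.isReadyForMoreMediaData {
                guard let sample = output.copyNextSampleBuffer(),
                      let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else {
                    // No more frames: close the writer input and finish the file
                    input.markAsFinished()
                    writer.finishWriting { }
                    return
                }
                // ...draw the caption text into pixelBuffer here (e.g. with Core Graphics)...
                adaptor.append(pixelBuffer,
                               withPresentationTime: CMSampleBufferGetPresentationTimeStamp(sample))
            }
        }
    }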

1

You can edit the video frame by frame with AVFoundation...

Here is an example:

    // Step 1: create an AVAsset from the file URL
    var assetTrack = AVAsset.assetWithURL(filePath1) as! AVURLAsset
    var mutableComposition = AVMutableComposition()

    // video 

    var compositionVideoTrack = AVMutableCompositionTrack() 
    compositionVideoTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID()) 


    var assetVideoTrack = assetTrack.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack 
    compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero , assetTrack.duration), ofTrack: assetVideoTrack, atTime: kCMTimeZero, error: nil) 

    // audio 

    var compositionAudioTrack = AVMutableCompositionTrack() 
    compositionAudioTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID()) 

    var assetAudioTrack = assetTrack.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack 
    compositionAudioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero , assetTrack.duration), ofTrack: assetAudioTrack, atTime: kCMTimeZero, error: nil) 

    var videoAssetOrientation_: UIImageOrientation = .Up 
    var isVideoAssetPortrait_: Bool = false 

    var videoTransform: CGAffineTransform = assetTrack.preferredTransform 



    var videosize = CGSize() 
    videosize = assetVideoTrack.naturalSize 

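    // Build the Core Animation layer tree: parentLayer holds the video layer plus any drawing/text overlay layers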
    var parentLayer = CALayer() 
    var videoLayer = CALayer() 
    var textLayer = CALayer() 

    parentLayer.frame = CGRectMake(0, 0, videosize.width, videosize.height) 
    videoLayer.frame = CGRectMake(0, 0, videosize.width, videosize.height) 
    textLayer.frame = CGRectMake(0, 0, videosize.width, videosize.height) 

    parentLayer.addSublayer(videoLayer) 

    if drawingView.image != nil 
    { 
     var drawingLayer = CALayer() 
     drawingLayer.frame = CGRectMake(0, 0, videosize.width, videosize.height) 
     drawingLayer.contents = drawingView.image.CGImage 

     var image = UIImage() 
     parentLayer.addSublayer(drawingLayer) 

    } 


    var textV = UIView() 
    textV.backgroundColor = UIColor.clearColor() 
    textV.layer.backgroundColor = UIColor.clearColor().CGColor 
    textV.frame = CGRectMake(self.captureView.frame.size.width, 0, self.captureView.frame.size.width, self.captureView.frame.size.height) 

    var textL = UILabel() 
    textL = textShowOnPreview 
    textV.addSubview(textL) 

     if textL.text != "" 
    { 
     UIGraphicsBeginImageContext(textV.bounds.size) 
     textV.layer.renderInContext(UIGraphicsGetCurrentContext()) 
     var image1: UIImage = UIGraphicsGetImageFromCurrentImageContext() 
     UIGraphicsEndImageContext() 
     if (TextAnimation == "") 
     { 
      textLayer.contents = image1.CGImage 
      parentLayer.addSublayer(textLayer) 
     } 

     else if (TextAnimation == "flip") 
     { 

      var overlayer1 = CALayer() 
      overlayer1.backgroundColor = UIColor.clearColor().CGColor 
      let screenSize: CGRect = UIScreen.mainScreen().bounds 
      overlayer1.contents = image1.CGImage 
      overlayer1.masksToBounds = true 
      overlayer1.frame = CGRectMake(videosize.width/2-300, videosize.height/2 - 400, videosize.width,videosize.width); 
      var animation : CABasicAnimation = CABasicAnimation(keyPath: "transform.rotation") 
      animation.duration=5.0; 
      animation.repeatCount=5; 
      animation.autoreverses = true; 
      // rotate from 0 to 360 
      animation.fromValue = 0 
      animation.toValue = (2.0 * M_PI); 
      animation.beginTime = AVCoreAnimationBeginTimeAtZero; 
      overlayer1.addAnimation(animation, forKey:"rotation") 
      parentLayer.addSublayer(overlayer1) 

     } 
     else if (TextAnimation == "fade") 
     { 

         // "fade": pulse the layer's scale 
      var overlayer1 = CALayer() 
      overlayer1.backgroundColor = UIColor.clearColor().CGColor 
      overlayer1.contents = image1.CGImage 
      overlayer1.masksToBounds = true 

      overlayer1.frame = CGRectMake(videosize.width/2 - 300, videosize.height/2 - 100 , videosize.width+20, videosize.width); 
      var animation : CABasicAnimation = CABasicAnimation(keyPath: "transform.scale") 
      animation.duration = 2.0; 
      animation.repeatCount = 3; 
      animation.autoreverses = true; 
      // scale from 0.5 to 1.0 

      animation.fromValue = 0.5; 
      animation.toValue = 1.0; 
      animation.beginTime = AVCoreAnimationBeginTimeAtZero; 

      overlayer1.addAnimation(animation, forKey:"scale") 
      parentLayer.addSublayer(overlayer1) 

     } 
     else if (TextAnimation == "bounce") 
     { 
      var overlayer1 = CALayer() 

      var bounce : CABasicAnimation = CABasicAnimation (keyPath:"position.y"); 
      overlayer1.backgroundColor = UIColor.clearColor().CGColor 
      overlayer1.contents = image1.CGImage 
      overlayer1.masksToBounds = true 
      overlayer1.frame = CGRectMake(videosize.width/2 - 300, videosize.height/2 - 100 , videosize.width, videosize.width); 
      bounce.duration = 1.0; 
      bounce.fromValue = overlayer1.frame.origin.y 

      bounce.toValue = overlayer1.frame.origin.y - 100 
      bounce.repeatCount = 10 
      bounce.autoreverses = true; 
      overlayer1.addAnimation(bounce, forKey: "y") 

      var animation = CABasicAnimation(keyPath: "transform.scale") 
      animation.toValue = NSNumber(float: 0.9) 
      animation.duration = 1.0 
      animation.repeatCount = 10; 
      animation.autoreverses = true 
      overlayer1.addAnimation(animation, forKey: nil) 
      parentLayer.addSublayer(overlayer1) 

     } 

    } 

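    // The animation tool renders parentLayer (video + overlay layers) into every output frame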
    var mutableVideoComposition = AVMutableVideoComposition() 
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30) 
    mutableVideoComposition.renderSize = videosize 

    mutableVideoComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer) 


    var passThroughInstruction = AVMutableVideoCompositionInstruction() 
    passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComposition.duration) 


    var passThroughLayerInstruction = AVMutableVideoCompositionLayerInstruction() 


    // video 

    var assestVideoMutableCompositionVideo = mutableComposition.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack 
    passThroughLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: assestVideoMutableCompositionVideo) 



    if isVideoAssetPortrait_ == false 
    { 

     var FirstAssetScaleFactor: CGAffineTransform = CGAffineTransformMakeScale(1, 1) 

     passThroughLayerInstruction.setTransform(CGAffineTransformConcat(assetVideoTrack.preferredTransform, FirstAssetScaleFactor), atTime: kCMTimeZero) 

    } 


    passThroughInstruction.layerInstructions = [passThroughLayerInstruction] 
    mutableVideoComposition.instructions = [passThroughInstruction] 

    let documentsURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0] as! NSURL 
    let filePath = documentsURL.URLByAppendingPathComponent("NewWatermarkedVideo.mov") as NSURL 


    var fileManager:NSFileManager = NSFileManager.defaultManager() 
    fileManager.removeItemAtURL(filePath, error: nil) 

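    // Export the composition with the overlay video composition to Documents/NewWatermarkedVideo.mov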
    var exporter: AVAssetExportSession = AVAssetExportSession(asset: mutableComposition, presetName: AVAssetExportPresetMediumQuality) 
    exporter.videoComposition = mutableVideoComposition 
    exporter.outputFileType = AVFileTypeQuickTimeMovie 
    exporter.outputURL = filePath 
    exporter.shouldOptimizeForNetworkUse = true 
    self.captureView.addSubview(textShowOnPreview) 

    exporter.exportAsynchronouslyWithCompletionHandler({() -> Void in 

     println(exporter.status) 

     if exporter.status == AVAssetExportSessionStatus.Completed 
     { 


      dispatch_async(dispatch_get_main_queue(), {() -> Void in 




       MBProgressHUD.hideAllHUDsForView(self.view, animated: true) 

       self.topicSelectedImage.highlighted = false 
       self.timelineSelectedImage.highlighted = false 
       self.selectCat = "" 
       self.postView.hidden = false 

      }) 

      println("Completed") 
      self.mediaData = NSData(contentsOfURL:filePath, options: nil, error: nil)! 


      var err: NSError? = nil 
      var asset = AVURLAsset(URL: filePath, options: nil) 
      var imgGenerator = AVAssetImageGenerator(asset: asset) 
      var cgImage = imgGenerator.copyCGImageAtTime(CMTimeMake(0, 30), actualTime: nil, error: &err) 
      var uiImage = UIImage(CGImage: cgImage)! 
      self.videoThumbData = UIImageJPEGRepresentation(uiImage, 0.1) 

      var assetTrack = AVAsset.assetWithURL(filePath) as! AVURLAsset 
      self.videoTime = Int(CMTimeGetSeconds(assetTrack.duration)) + 3 

      println(self.videoTime) 


     } 
     else if exporter.status == AVAssetExportSessionStatus.Cancelled 
     { 


     } 
     else if exporter.status == AVAssetExportSessionStatus.Failed 
     { 


     } 
    })