2015-04-04 116 views

Here is a link to the project on GitHub: https://github.com/spennyf/cropVid/tree/master. Try it yourself to see what I am talking about; it only takes a minute to test. Thanks! Why is the cropped area different from the selected area in iOS?

I am recording a video with a square overlay that shows which part of the video will be cropped, like this:

[screenshot: camera preview with the square overlay]

To test this, I filmed a piece of paper that had exactly four lines inside the square, with half a line of margin at the top and bottom. Then I cropped the video using the code I will post below, but when I played the result I saw this (ignore the background and the green circle):

[screenshot: cropped video result]

As you can see, there are more than four lines, so even though the rectangle shown over the camera and the rectangle used for cropping are exactly the same, the crop includes extra content.

So my question is: why is the crop not the same size?

Here is the code I use for the overlay and for displaying the result:

//this is the square on the camera
UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height - 80)];
UIImageView *image = [[UIImageView alloc] init];
image.layer.borderColor = [[UIColor whiteColor] CGColor];
image.frame = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
CALayer *imageLayer = image.layer;
[imageLayer setBorderWidth:1];
[view addSubview:image];
[picker setCameraOverlayView:view];

//this is the crop rect
CGRect rect = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
//note: use the asset's duration directly; duration.value is in timescale units, not seconds
[self applyCropToVideoWithAsset:assest AtRect:rect OnTimeRange:CMTimeRangeMake(kCMTimeZero, assest.duration)
        ExportToUrl:exportUrl ExistingExportSession:exporter WithCompletion:^(BOOL success, NSError *error, NSURL *videoUrl) {
    //here is the player
    AVPlayer *player = [AVPlayer playerWithURL:videoUrl];
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
    //the layer must be added to the view hierarchy to be visible
    [self.view.layer addSublayer:layer];
    [player play];
}];
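A likely source of the mismatch: the overlay rect is expressed in view points (116×116 inside self.view), while renderSize and the layer translations in the cropping code operate in the video's pixel space (the track's naturalSize, often something like 1080×1920). If those two spaces differ by a scale factor, the exported crop will cover a larger physical area than the on-screen square. A minimal sketch of the conversion, assuming the preview is stretched to the view's size (the helper name is hypothetical, not from the question's project):

```objc
// Hypothetical helper: convert a crop rect given in view points into the
// video's pixel coordinate space, assuming the preview fills viewSize.
- (CGRect)videoCropRectForViewRect:(CGRect)viewRect
                         videoSize:(CGSize)videoSize
                          viewSize:(CGSize)viewSize
{
    // One view point corresponds to (videoSize / viewSize) video pixels.
    CGFloat scaleX = videoSize.width / viewSize.width;
    CGFloat scaleY = videoSize.height / viewSize.height;
    return CGRectMake(viewRect.origin.x * scaleX,
                      viewRect.origin.y * scaleY,
                      viewRect.size.width * scaleX,
                      viewRect.size.height * scaleY);
}
```

If the camera preview uses aspect-fill rather than stretch, the cropped region shifts as well as scales, so the exact mapping also depends on the preview's video gravity.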

Here is the code I use for the crop:

- (UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGSize size = [videoTrack naturalSize];
    CGAffineTransform txf = [videoTrack preferredTransform];

    if (size.width == txf.tx && size.height == txf.ty)
        return UIImageOrientationLeft;  //UIInterfaceOrientationLandscapeLeft
    else if (txf.tx == 0 && txf.ty == 0)
        return UIImageOrientationRight; //UIInterfaceOrientationLandscapeRight
    else if (txf.tx == 0 && txf.ty == size.width)
        return UIImageOrientationDown;  //UIInterfaceOrientationPortraitUpsideDown
    else
        return UIImageOrientationUp;    //UIInterfaceOrientationPortrait
}

Here is the rest of the cropping code:

- (AVAssetExportSession*)applyCropToVideoWithAsset:(AVAsset*)asset AtRect:(CGRect)cropRect OnTimeRange:(CMTimeRange)cropTimeRange ExportToUrl:(NSURL*)outputUrl ExistingExportSession:(AVAssetExportSession*)exporter WithCompletion:(void(^)(BOOL success, NSError* error, NSURL* videoUrl))completion 
{ 

// NSLog(@"CALLED"); 
//create an avassetrack with our asset 
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

//create a video composition and preset some settings 
AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition]; 
videoComposition.frameDuration = CMTimeMake(1, 30); 

CGFloat cropOffX = cropRect.origin.x; 
CGFloat cropOffY = cropRect.origin.y; 
CGFloat cropWidth = cropRect.size.width; 
CGFloat cropHeight = cropRect.size.height; 
// NSLog(@"width: %f - height: %f - x: %f - y: %f", cropWidth, cropHeight, cropOffX, cropOffY); 

videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight); 

//create a video instruction 
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
instruction.timeRange = cropTimeRange; 

AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack]; 

UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset]; 

CGAffineTransform t1 = CGAffineTransformIdentity; 
CGAffineTransform t2 = CGAffineTransformIdentity; 

switch (videoOrientation) { 
    case UIImageOrientationUp: 
     t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropOffX, 0 - cropOffY); 
     t2 = CGAffineTransformRotate(t1, M_PI_2); 
     break; 
    case UIImageOrientationDown: 
     t1 = CGAffineTransformMakeTranslation(0 - cropOffX, clipVideoTrack.naturalSize.width - cropOffY); // when upside down, naturalSize.width is the displayed height
     t2 = CGAffineTransformRotate(t1, - M_PI_2); 
     break; 
    case UIImageOrientationRight: 
     t1 = CGAffineTransformMakeTranslation(0 - cropOffX, 0 - cropOffY); 
     t2 = CGAffineTransformRotate(t1, 0); 
     break; 
    case UIImageOrientationLeft: 
     t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropOffX, clipVideoTrack.naturalSize.height - cropOffY); 
     t2 = CGAffineTransformRotate(t1, M_PI ); 
     break; 
    default: 
     NSLog(@"no supported orientation has been found in this video"); 
     break; 
} 

CGAffineTransform finalTransform = t2; 
[transformer setTransform:finalTransform atTime:kCMTimeZero]; 

//add the transformer layer instructions, then add to video composition 
instruction.layerInstructions = [NSArray arrayWithObject:transformer]; 
videoComposition.instructions = [NSArray arrayWithObject: instruction]; 

//Remove any previous video at that path
[[NSFileManager defaultManager] removeItemAtURL:outputUrl error:nil]; 

if (!exporter){ 
    exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality] ; 
} 
// assign the instructions for the video processing (in this case, the transformation for cropping the video)
exporter.videoComposition = videoComposition; 
exporter.outputFileType = AVFileTypeQuickTimeMovie; 
if (outputUrl){ 
    exporter.outputURL = outputUrl; 
    [exporter exportAsynchronouslyWithCompletionHandler:^{ 
     switch ([exporter status]) { 
      case AVAssetExportSessionStatusFailed: 
       NSLog(@"crop Export failed: %@", [[exporter error] localizedDescription]); 
       if (completion){ 
        dispatch_async(dispatch_get_main_queue(), ^{ 
         completion(NO,[exporter error],nil); 
        }); 
        return; 
       } 
       break; 
      case AVAssetExportSessionStatusCancelled: 
       NSLog(@"crop Export canceled"); 
       if (completion){ 
        dispatch_async(dispatch_get_main_queue(), ^{ 
         completion(NO,nil,nil); 
        }); 
        return; 
       } 
       break; 
      default: 
       break; 
     } 
     if (completion){ 
      dispatch_async(dispatch_get_main_queue(), ^{ 
       completion(YES,nil,outputUrl); 
      }); 
     } 

    }]; 
} 

return exporter; 
} 
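As a sanity check after export, the resulting file's pixel dimensions can be read back; if they are not the expected 116×116 (or the scaled equivalent), the renderSize or transform is off. A sketch, assuming videoUrl is the URL delivered to the completion block:

```objc
// Hypothetical check (not in the original code): log the exported
// video's pixel dimensions to confirm the crop size.
AVAsset *exported = [AVAsset assetWithURL:videoUrl];
AVAssetTrack *track = [[exported tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGSize size = CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform);
NSLog(@"exported crop size: %.0f x %.0f", fabs(size.width), fabs(size.height));
```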

So my question is: why is the video area different from the crop/camera area, when I use exactly the same coordinates and the same square?


Just to be sure: once the cropped video has been produced (so, in the completion block), it should be saved on the iPhone's disk. Please check that file directly; I mean, access it by connecting the iPhone to a Mac and using a tool such as iExplorer or iFunBox. Then copy it to the Mac and open it with the default QuickTime Player. That way you can verify whether the resulting cropped video really matches what you saw in the square. Also, make sure the crop rect uses coordinates relative to the appropriate view, for both the x and y axes – 2015-04-09 08:28:14


@LucaIaco OK, I used iExplorer to copy the video onto my Mac and played it with QuickTime, and the cropped area is still incorrect. I have checked the coordinates again and again and I believe they are correct. I will post a GitHub project at the link, so if you don't mind you can download it, run it, and see for yourself. Right now I film a green square and crop just a part of that square, but when I play the crop I see white. I would really appreciate it if you looked at the project – iqueqiorio 2015-04-09 18:01:06


Here is the correct link: https://github.com/spennyf/cropVid – iqueqiorio 2015-04-10 04:44:28

Answer


Maybe check this previous question.

It looks like it may be similar to what you are running into. A user on that question suggested cropping this way:

CGImageRef imageRef = CGImageCreateWithImageInRect([originalImage CGImage], cropRect); 
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef]; 
CGImageRelease(imageRef); 

I hope this helps, or at least gets you started in the right direction.


This answer is completely unrelated. The question is about cropping a video, not an image. – bgfriend0 2015-04-15 21:09:18


Deeply sorry about that, it was very late when I wrote this and I definitely misread the question! Thanks for the heads-up. – Kleigh 2015-04-16 01:35:27


Haha, no problem, happens to all of us. – bgfriend0 2015-04-16 01:35:56