2013-04-29 144 views

Exporting a wav file using AVAssetExportSession

I am trying to add a fade-in to a wav file and then export a new file, with the fade applied, using AVAssetExportSession. All the examples I have seen export to m4a — is it even possible to do this with wav or aif?

The error I get is:

AVAssetExportSessionStatusFailed Error Domain=AVFoundationErrorDomain Code=-11822 "Cannot Open" UserInfo=0x1f01c9f0 {NSLocalizedDescription=Cannot Open, NSLocalizedFailureReason=This media format is not supported.} 

My code looks like the following:

NSString *inpath = [path stringByAppendingFormat:@"/%@",file]; 

    NSString *ename = [file stringByDeletingPathExtension]; 
    NSString *incname = [ename stringByAppendingString:@"1t"]; 
    NSString *outname = [incname stringByAppendingPathExtension:@"wav"]; 
    NSString *outpath = [path stringByAppendingFormat:@"/%@",outname]; 

    NSURL *urlpath = [NSURL fileURLWithPath:inpath]; 
    NSURL *urlout = [NSURL fileURLWithPath:outpath]; 



    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] 
                 forKey:AVURLAssetPreferPreciseDurationAndTimingKey]; 
    AVURLAsset *anAsset = [[AVURLAsset alloc] initWithURL:urlpath options:options]; 


    //check the soundfile is greater than 50seconds 
    CMTime assetTime = [anAsset duration]; 
    Float64 duration = CMTimeGetSeconds(assetTime); 
    if (duration < 50.0) return NO; 

    // get the first audio track 
    NSArray *tracks = [anAsset tracksWithMediaType:AVMediaTypeAudio]; 
    if ([tracks count] == 0) return NO; 

    AVAssetTrack *track = [tracks objectAtIndex:0]; 

    // create trim time range - 20 seconds starting from 30 seconds into the asset 
    CMTime startTime = CMTimeMake(30, 1); 
    CMTime stopTime = CMTimeMake(50, 1); 
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime); 

    // create fade in time range - 10 seconds starting at the beginning of trimmed asset 
    CMTime startFadeInTime = startTime; 
    CMTime endFadeInTime = CMTimeMake(40, 1); 
    CMTimeRange fadeInTimeRange = CMTimeRangeFromTimeToTime(startFadeInTime, 
                  endFadeInTime); 

    // setup audio mix 
    AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix]; 
    AVMutableAudioMixInputParameters *exportAudioMixInputParameters = 
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track]; 
    [exportAudioMixInputParameters setVolumeRampFromStartVolume:0.0 toEndVolume:1.0 timeRange:fadeInTimeRange]; 

    exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters]; 

    AVAssetExportSession *exportSession = [AVAssetExportSession 
              exportSessionWithAsset:anAsset presetName:AVAssetExportPresetPassthrough]; 


    //NSArray *listof = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset]; 
    //NSLog(@"LISTOF %@",listof); 

    id desc = [track.formatDescriptions objectAtIndex:0]; 
    const AudioStreamBasicDescription *audioDesc = CMAudioFormatDescriptionGetStreamBasicDescription((CMAudioFormatDescriptionRef)desc); 
    FourCharCode formatID = audioDesc->mFormatID; 

    NSString *fileType = nil; 
    NSString *ex = nil; 

    switch (formatID) { 

     case kAudioFormatLinearPCM: 
     { 
      UInt32 flags = audioDesc->mFormatFlags; 
      if (flags & kAudioFormatFlagIsBigEndian) { 
       fileType = @"public.aiff-audio"; 
       ex = @"aif"; 
      } else { 
       fileType = @"com.microsoft.waveform-audio"; 
       ex = @"wav"; 
      } 
     } 
      break; 

     case kAudioFormatMPEGLayer3: 
      fileType = @"public.mp3"; 
      ex = @"mp3"; 
      break; 

     case kAudioFormatMPEG4AAC: 
      fileType = @"com.apple.m4a-audio"; 
      ex = @"m4a"; 
      break; 

     case kAudioFormatAppleLossless: 
      fileType = @"com.apple.m4a-audio"; 
      ex = @"m4a"; 
      break; 

     default: 
      break; 
    } 



    exportSession.outputFileType = fileType; 
    exportSession.outputURL = urlout; 

    //exportSession.outputFileType = AVFileTypeWAVE; // output file type 
    exportSession.timeRange = exportTimeRange; // trim time range 
    exportSession.audioMix = exportAudioMix; // fade in audio mix 


    // perform the export 
    [exportSession exportAsynchronouslyWithCompletionHandler:^{ 

     if (AVAssetExportSessionStatusCompleted == exportSession.status) { 
      NSLog(@"AVAssetExportSessionStatusCompleted"); 
     } else if (AVAssetExportSessionStatusFailed == exportSession.status) { 
      // a failure may happen because of an event out of your control 
      // for example, an interruption like a phone call coming in 
      // make sure to handle this case appropriately 
      NSLog(@"AVAssetExportSessionStatusFailed %@", exportSession.error); 
     } else { 
      NSLog(@"Export Session Status: %ld", (long)exportSession.status); 
     } 
    }]; 

    return YES; 
    } 

Which operating system are you talking about? – 2013-04-29 11:42:55


This is for iOS version 6 – user1347149 2013-04-29 13:34:10


Did you solve your problem? I have recently been looking for a way to export wav files. – 2013-05-20 06:49:52

Answer


You can't do this with AVAssetExportSession, because its presets are quite rigid in what they allow. The AVAssetExportPresetPassthrough preset passes the source samples through unmodified, so the output keeps the input's format — which also means no audio mix (and therefore no fade) can be applied — and the other presets only write compressed container formats such as m4a.
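You can check this restriction directly: each session exposes the container types its preset can actually write. A small diagnostic sketch, reusing `anAsset` from the question's code:

```objc
// Inspect what a given preset is actually able to write.
AVAssetExportSession *session =
    [AVAssetExportSession exportSessionWithAsset:anAsset
                                      presetName:AVAssetExportPresetPassthrough];
NSLog(@"Supported output types: %@", session.supportedFileTypes);

// Before starting an export, verify the desired container is supported.
if (![session.supportedFileTypes containsObject:AVFileTypeWAVE]) {
    NSLog(@"This preset cannot write a .wav file");
}
```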

Since your task requires operating directly on the audio sample buffers, you should use the second option AVFoundation offers: a paired AVAssetReader / AVAssetWriter setup. You will find suitable sample code in AVReaderWriterOSX from the Apple developer resources. Apart from the fact that you can use different I/O format settings, the same approach also works on iOS. It gives you the ability to decompress the audio to PCM and write it back out as an uncompressed .wav file.
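A minimal sketch of that reader/writer pairing, assuming input and output URLs `urlpath` and `urlout`, and reusing `exportAudioMix` and `exportTimeRange` from the question's code. AVAssetReaderAudioMixOutput accepts the audio mix, so the fade-in is applied while the samples are decoded to PCM:

```objc
#import <AVFoundation/AVFoundation.h>

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:urlpath options:nil];
AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];

// Decompress to 16-bit little-endian integer PCM, the natural payload of a .wav file.
// (Sample rate and channel count are taken from the source when left unspecified.)
NSDictionary *pcmSettings = @{
    AVFormatIDKey:            @(kAudioFormatLinearPCM),
    AVLinearPCMBitDepthKey:   @16,
    AVLinearPCMIsBigEndianKey: @NO,
    AVLinearPCMIsFloatKey:    @NO,
    AVLinearPCMIsNonInterleaved: @NO
};
AVAssetReaderAudioMixOutput *mixOutput =
    [AVAssetReaderAudioMixOutput assetReaderAudioMixOutputWithAudioTracks:@[audioTrack]
                                                            audioSettings:pcmSettings];
mixOutput.audioMix = exportAudioMix;   // the fade-in mix from the question
[reader addOutput:mixOutput];
reader.timeRange = exportTimeRange;    // the 30s–50s trim from the question

AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:urlout
                                                 fileType:AVFileTypeWAVE
                                                    error:&error];
// nil outputSettings: append the already-PCM buffers unchanged.
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:nil];
[writer addInput:writerInput];

[writer startWriting];
[reader startReading];
// Buffer timestamps begin at the trim start, so the session must too.
[writer startSessionAtSourceTime:exportTimeRange.start];

dispatch_queue_t queue = dispatch_queue_create("audio.export.queue", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while (writerInput.isReadyForMoreMediaData) {
        CMSampleBufferRef buffer = [mixOutput copyNextSampleBuffer];
        if (buffer) {
            [writerInput appendSampleBuffer:buffer];
            CFRelease(buffer);
        } else {
            // No more samples: finish the file and check writer.status / writer.error.
            [writerInput markAsFinished];
            [writer finishWritingWithCompletionHandler:^{ }];
            break;
        }
    }
}];
```

Unlike the export-session route, every stage here is under your control: the reader applies the trim and the audio mix, and the writer is free to use AVFileTypeWAVE because it receives uncompressed PCM.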