
AVAudioEngine in iOS 8 - completionHandler() is called too early

It looks like player.scheduleFile's completionHandler() is called before the sound file has finished playing.

I am using a sound file that is 5 s long, and the println() message appears about 1 s before the end of the sound.

Am I doing something wrong, or am I misunderstanding the idea of a completionHandler?

Thanks!


Here is some code:

import AVFoundation

class SoundHandler {
    let engine: AVAudioEngine
    let player: AVAudioPlayerNode
    let mainMixer: AVAudioMixerNode

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        engine.attachNode(player)
        mainMixer = engine.mainMixerNode

        // connect the player to the mixer before starting the engine
        engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))

        var error: NSError?
        if !engine.startAndReturnError(&error) {
            if let e = error {
                println("error \(e.localizedDescription)")
            }
        }
    }

    func playSound() {
        let soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")
        let soundFile = AVAudioFile(forReading: soundUrl, error: nil)

        player.scheduleFile(soundFile, atTime: nil, completionHandler: { println("Finished!") })

        player.play()
    }
}

Answers

Answer (6 votes)

This looks like a bug, and we should file a Radar! http://bugreport.apple.com

In the meantime, as a workaround, I noticed that if you use scheduleBuffer:atTime:options:completionHandler: instead, the callback fires as expected (after playback finishes).

Sample code:

AVAudioFile *file = [[AVAudioFile alloc] initForReading:_fileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:nil];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
NSError *error = nil;
[file readIntoBuffer:buffer error:&error];

[_player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
    // reminder: we're not on the main thread in here
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"done playing, as expected!");
    });
}];
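
If you'd rather stay in Swift, the same workaround might look roughly like this in current Swift syntax (a sketch; the playSoundViaBuffer name is mine, and the player node is assumed to be attached and connected as in the question's SoundHandler):

import AVFoundation

func playSoundViaBuffer(player: AVAudioPlayerNode) throws {
    guard let soundUrl = Bundle.main.url(forResource: "Test", withExtension: "m4a") else { return }
    let file = try AVAudioFile(forReading: soundUrl)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: AVAudioFrameCount(file.length)) else { return }
    try file.read(into: buffer)

    player.scheduleBuffer(buffer, at: nil, options: .interrupts) {
        // reminder: we're still not on the main thread in here
        DispatchQueue.main.async {
            print("done playing, as expected!")
        }
    }
    player.play()
}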
Comments:

Works as a workaround. Thanks! – Oliver 2015-04-21 17:44:12

Nice workaround. – 2015-06-13 21:40:27

Love it. Works like a charm! – 2016-01-31 21:43:12

Answer (6 votes)

I am seeing the same behavior.

From my experimentation, I believe the callback is called once the buffer/segment/file has been "scheduled", not when it has finished playing.

Yet the documentation explicitly says: "Called after the buffer has completely played or the player is stopped. May be nil."

So I think it's either a bug or incorrect documentation. No idea which.
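
A quick way to observe this (a minimal sketch in current Swift syntax, assuming the soundFile and player from the question) is to compare when the handler fires against the file's actual duration:

let duration = Double(soundFile.length) / soundFile.processingFormat.sampleRate
let startTime = Date()

player.scheduleFile(soundFile, at: nil) {
    let elapsed = Date().timeIntervalSince(startTime)
    // per the behavior described above, this prints noticeably
    // less than the file's duration
    print("callback fired after \(elapsed) s; file duration is \(duration) s")
}
player.play()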

Answer (4 votes)

You can always compute the future time at which audio playback will complete, using AVAudioTime. The current behavior is useful because it lets you schedule additional buffers/segments/files to play from within the callback, before the end of the current buffer/segment/file, avoiding a gap in audio playback. This lets you create a simple loop player without much work. Here's an example:

class Latch {
    var value: Bool = true
}

func loopWholeFile(file: AVAudioFile, player: AVAudioPlayerNode) -> Latch {
    let looping = Latch()
    let frames = file.length

    let sampleRate = file.processingFormat.sampleRate
    var segmentTime: AVAudioFramePosition = 0
    var segmentCompletion: AVAudioNodeCompletionHandler!
    segmentCompletion = {
        if looping.value {
            // schedule the next pass through the file before the current one ends
            segmentTime += frames
            player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
        }
    }
    player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
    segmentCompletion()
    player.play()

    return looping
}

The code above schedules the whole file twice before calling player.play(). Then, as each pass nears completion, it schedules another whole pass in the future, avoiding gaps in playback. To stop looping, use the returned Latch, like this:

let looping = loopWholeFile(file, player) 
sleep(1000) 
looping.value = false 
player.stop() 
Answer (0 votes)

Yes, it gets called slightly before the file (or buffer) completes. If you call [myNode stop] from inside the completion handler, the file (or buffer) will not fully finish. However, if you call [myEngine stop], the file (or buffer) will play through to the end.
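
As a sketch of what that looks like (current Swift syntax; soundFile, player, and engine assumed from the question):

player.scheduleFile(soundFile, at: nil) {
    // fires slightly before the file actually finishes; per the note
    // above, player.stop() here would cut the tail off, while
    // engine.stop() reportedly lets the file play through to the end
    DispatchQueue.main.async {
        engine.stop()
    }
}
player.play()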

Answer (1 vote)

My bug report on this was closed as "works as intended", but Apple pointed me to new variants of the scheduleFile, scheduleSegment, and scheduleBuffer methods in iOS 11. These add a completionCallbackType argument that you can use to specify that you want the completion callback when playback has finished:

[self.audioUnitPlayer
    scheduleSegment:self.audioUnitFile
    startingFrame:sampleTime
    frameCount:(AVAudioFrameCount)sampleLength
    atTime:nil
    completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
    completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
        // do something here
    }];

The documentation doesn't say anything about how this works, but I tested it and it works for me.
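
For reference, the Swift spelling of the same iOS 11 variant looks like this (a sketch assuming a player node named player and a file named file):

player.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { callbackType in
    // callbackType is .dataPlayedBack here, i.e. playback has actually finished
    DispatchQueue.main.async {
        // do something here
    }
}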

I had been using this workaround on iOS 8-10:

- (void)playRecording {
    [self.audioUnitPlayer scheduleSegment:self.audioUnitFile
                            startingFrame:sampleTime
                               frameCount:(AVAudioFrameCount)sampleLength
                                   atTime:nil
                        completionHandler:^{
        // the handler is not called on the main thread, and
        // performSelector:afterDelay: needs a run loop, so hop to main
        dispatch_async(dispatch_get_main_queue(), ^{
            float totalTime = [self recordingDuration];
            float elapsedTime = [self recordingCurrentTime];
            float remainingTime = totalTime - elapsedTime;
            [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
        });
    }];
}

- (float)recordingDuration {
    float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
    if (isnan(duration)) {
        duration = 0;
    }
    return duration;
}

- (float)recordingCurrentTime {
    AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
    AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
    AVAudioFramePosition sampleTime = playerTime.sampleTime;
    if (sampleTime == 0) { return self.audioUnitLastKnownTime; } // this happens when the player isn't playing
    sampleTime += self.audioUnitStartingFrame; // if we trimmed from the start, or moved the playhead with the location slider, the time before that point isn't included in the player time, so we track it ourselves and add it here
    float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
    self.audioUnitLastKnownTime = time;
    return time;
}
Answer (0 votes)
// audioFile here is our original audio

audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: {
    print("scheduleFile Complete")

    var delayInSeconds: Double = 0

    if let lastRenderTime = self.audioPlayerNode.lastRenderTime,
       let playerTime = self.audioPlayerNode.playerTime(forNodeTime: lastRenderTime) {

        // frames left to play, divided by the sample rate (and by the
        // playback rate, if one is set in the surrounding code)
        if let rate = rate {
            delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / audioFile.processingFormat.sampleRate / Double(rate)
        } else {
            delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / audioFile.processingFormat.sampleRate
        }
    }

    // schedule a stop timer for when the audio finishes playing
    DispatchQueue.main.asyncAfter(deadline: .now() + delayInSeconds) {
        audioEngine.mainMixerNode.removeTap(onBus: 0)
        // Playback has completed
    }
})