In RenderTone I fill the buffer with one frequency and with a second, lower frequency. How do I clear, or avoid setting, kLinearPCMFormatFlagIsNonInterleaved?

When I run the code, the output sounds as though the buffer were not interleaved, even though the flag is in fact set in createToneUnit: the sound plays only through the left speaker. When both frequencies are written to the buffer, both tones play in the left speaker. When a frequency is not written to the buffer (e.g. leftON = 0), it is not played. So the buffer-writing code itself looks fine.

Since I suspect I should not have kLinearPCMFormatFlagIsNonInterleaved set in createToneUnit, I have tried to "clear" that flag. I spent hours reading the documentation but never found a way to do it, and experimenting only produced crashes at app launch.

How can I clear kLinearPCMFormatFlagIsNonInterleaved? Or how can I avoid setting it in the first place? (Commenting out the streamFormat.mFormatFlags assignment also causes a crash.)

Perhaps some other setting affects whether playback is created interleaved.
OSStatus RenderTone(
    void *inRefCon,
    AudioUnitRenderActionFlags *ioActionFlags,
    const AudioTimeStamp *inTimeStamp,
    UInt32 inBusNumber,
    UInt32 inNumberFrames,
    AudioBufferList *ioData)
{
    // Cast inRefCon back to the view controller (the original line was
    // truncated; the class name is assumed from the usage below)
    ToneGeneratorViewController *viewController =
        (ToneGeneratorViewController *)inRefCon;
    float sampleRate = viewController->sampleRate;
    float frequency = viewController->frequency;
    // etc.
    float theta_increment = 2.0 * M_PI * frequency / sampleRate;
    float wave;
    float theta2;
    float wave2;
    float theta_increment2 = 0.3 * theta_increment;

    const int channel = 0;
    Float32 *buffer = (Float32 *)ioData->mBuffers[channel].mData;

    for (UInt32 frame = 0; frame < inNumberFrames;)
    {
        theta += theta_increment;
        wave = sin(theta) * playVolume;
        theta2 += theta_increment2;
        wave2 = sin(theta2) * playVolume;

        buffer[frame++] = wave * leftON;   // leftON = 1 or 0
        buffer[frame++] = wave2 * rightON; // rightON = 1 or 0

        if (theta > 2.0 * M_PI)
        {
            theta -= 2.0 * M_PI;
        }
    }
    // etc.
}
- (void)createToneUnit
{
    AudioComponentDescription defaultOutputDescription;
    defaultOutputDescription.componentType = kAudioUnitType_Output;
    defaultOutputDescription.componentSubType = kAudioUnitSubType_RemoteIO;
    defaultOutputDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
    defaultOutputDescription.componentFlags = 0;
    defaultOutputDescription.componentFlagsMask = 0;
    // etc.
    err = AudioUnitSetProperty(toneUnit,
                               kAudioUnitProperty_SetRenderCallback,
                               kAudioUnitScope_Input,
                               0,
                               &input,
                               sizeof(input));

    const int four_bytes_per_float = 4;
    const int eight_bits_per_byte = 8;
    AudioStreamBasicDescription streamFormat;
    streamFormat.mSampleRate = sampleRate;
    streamFormat.mFormatID = kAudioFormatLinearPCM;
    streamFormat.mFormatFlags =
        kLinearPCMFormatFlagIsFloat | kLinearPCMFormatFlagIsNonInterleaved;
    streamFormat.mBytesPerPacket = four_bytes_per_float;
    streamFormat.mFramesPerPacket = 1;
    streamFormat.mBytesPerFrame = four_bytes_per_float;
    streamFormat.mChannelsPerFrame = 2; // 2 = stereo
    streamFormat.mBitsPerChannel = four_bytes_per_float * eight_bits_per_byte;
    err = AudioUnitSetProperty(toneUnit,
                               kAudioUnitProperty_StreamFormat,
                               kAudioUnitScope_Input,
                               0,
                               &streamFormat,
                               sizeof(AudioStreamBasicDescription));
}
Answers the question 100%. This is one of those cases where I ask myself why I didn't think of this elegant answer. – user1251228 2012-03-17 23:10:21