Why does Audio Unit RemoteIO initialisation work on the iPhone but not in the simulator?

Posted 2024-09-25 07:41:05

I am using the Audio Unit services to set up an output rendering callback so I can mix together synthesized audio. The code seems to work perfectly on the devices I have (iPod Touch, iPhone 3G, and iPad) but fails to work on the simulator.

On the simulator, the AudioUnitInitialize call fails and returns -10851 (kAudioUnitErr_InvalidPropertyValue, according to Apple's documentation).

Here is my initialisation code. Does anyone with more experience with this API see anything I'm doing incorrectly here?

#import <AudioUnit/AudioUnit.h>

#define kOutputBus 0
#define kInputBus  1

... 

static OSStatus playbackCallback(void *inRefCon, 
                             AudioUnitRenderActionFlags* ioActionFlags, 
                             const AudioTimeStamp*       inTimeStamp, 
                             UInt32                      inBusNumber, 
                             UInt32                      inNumberFrames, 
                             AudioBufferList*            ioData) 
{
    // Mix audio here - but it never gets here on the simulator
    return noErr;
}

...



{
    OSStatus status;

    // Describe audio component
    AudioComponentDescription desc;
    desc.componentType         = kAudioUnitType_Output;
    desc.componentSubType      = kAudioUnitSubType_RemoteIO;
    desc.componentFlags        = 0;
    desc.componentFlagsMask    = 0;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Get component
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);

    // Create the audio unit instance
    status = AudioComponentInstanceNew(inputComponent, &m_audio_unit);
    if(status != noErr) {
        NSLog(@"Failed to get audio component instance: %d", status);
    }

    // Enable IO for playback
    UInt32 flag = 1;
    status = AudioUnitSetProperty(m_audio_unit, 
                                  kAudioOutputUnitProperty_EnableIO, 
                                  kAudioUnitScope_Output, 
                                  kOutputBus,
                                  &flag, 
                                  sizeof(flag));
    if(status != noErr) {
        NSLog(@"Failed to enable audio i/o for playback: %d", status);
    }

    // Describe format
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate       = 44100.00;
    audioFormat.mFormatID         = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket  = 1;      
    audioFormat.mChannelsPerFrame = 2;
    audioFormat.mBitsPerChannel   = 16;
    audioFormat.mBytesPerPacket   = 4;
    audioFormat.mBytesPerFrame    = 4;      

    // Apply format
    status = AudioUnitSetProperty(m_audio_unit, 
                                  kAudioUnitProperty_StreamFormat, 
                                  kAudioUnitScope_Input, 
                                  kOutputBus, 
                                  &audioFormat, 
                                  sizeof(audioFormat));
    if(status != noErr) {
        NSLog(@"Failed to set format descriptor: %d", status);
    }

    // Set output callback
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc       = playbackCallback;
    callbackStruct.inputProcRefCon = self;
    status = AudioUnitSetProperty(m_audio_unit, 
                                  kAudioUnitProperty_SetRenderCallback, 
                                  kAudioUnitScope_Global, 
                                  kOutputBus,
                                  &callbackStruct, 
                                  sizeof(callbackStruct));
    if(status != noErr) {
        NSLog(@"Failed to set output callback: %d", status);
    }

    // Initialize (This is where it fails on the simulator)
    status = AudioUnitInitialize(m_audio_unit);
    if(status != noErr) {
        NSLog(@"Failed to initialise audio unit: %d", status);
    }

}
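For completeness: AudioUnitInitialize only prepares the unit; the render callback does not fire until the unit is started. The start call is not shown above, and presumably happens elsewhere since playback works on the devices. It would look roughly like this:

// Start the output unit so it begins pulling playbackCallback.
// (Sketch only; the actual start call is elided from the question.)
OSStatus startStatus = AudioOutputUnitStart(m_audio_unit);
if(startStatus != noErr) {
    NSLog(@"Failed to start audio unit: %d", startStatus);
}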

My Xcode version is 3.2.2 (64-bit).
My Simulator version is 3.2 (though the same issue occurs on 3.1.3, Debug or Release).

Thanks, I appreciate it!

Comments (3)

戴着白色围巾的女孩 2024-10-02 07:41:05

Compiling for a device and for the simulator is totally different. Most common things have the same expected result: loading a view, switching between views, playing sounds, and so on. However, when it comes to other things, like playing sound with OpenAL, loading 10 buffers, and then switching between them, the simulator cannot handle it, but the devices can.

The way I see it, as long as it works on the device, that's all I care about. Try not to pull your hair out just to make an application work on the simulator when it works fine on the device.

Hope that helps,

Pk

柒夜笙歌凉 2024-10-02 07:41:05

Did you configure and enable an Audio Session prior to calling your RemoteIO initialization code?
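
For reference, a minimal sketch using the C Audio Session Services API from that era's SDK (this is a guess at the missing setup, not the asker's actual code):

#import <AudioToolbox/AudioToolbox.h>

// Minimal audio session setup; error handling omitted for brevity.
// Initialize the session with no run loop or interruption listener.
AudioSessionInitialize(NULL, NULL, NULL, NULL);

// Request a playback category so output rendering is permitted.
UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                        sizeof(sessionCategory), &sessionCategory);

// Activate the session before creating and initialising the RemoteIO unit.
AudioSessionSetActive(true);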

深海夜未眠 2024-10-02 07:41:05

When you are setting the stream properties to the input bus, you are using kOutputBus for your input scope. That's probably not good. Also, you probably don't need to apply the render callback to the global scope, as you only need it for output. Furthermore, I think that your definitions of kOutputBus and kInputBus are wrong... when I look at working iPhone Audio code, it uses 0 for the input bus and 1 for the output bus.

I can also think of a few minor things with regard to the AudioStreamBasicDescription, though I don't think these will make much of a difference (a sketch follows the list):

  1. Add the kAudioFormatFlagsNativeEndian flag to your format flags.
  2. Explicitly set the mReserved field to 0.
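
A minimal sketch of the stream description with both tweaks applied (reusing the asker's format values; untested):

AudioStreamBasicDescription audioFormat = {0};  // zero-fill also clears mReserved
audioFormat.mSampleRate       = 44100.0;
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                              | kAudioFormatFlagIsPacked
                              | kAudioFormatFlagsNativeEndian;  // tweak 1
audioFormat.mFramesPerPacket  = 1;
audioFormat.mChannelsPerFrame = 2;
audioFormat.mBitsPerChannel   = 16;
audioFormat.mBytesPerPacket   = 4;  // 2 channels x 2 bytes each
audioFormat.mBytesPerFrame    = 4;
audioFormat.mReserved         = 0;  // tweak 2: explicit, though the zero-fill covers it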