Detecting when a user blows into the mic

Published 2024-11-08 05:52:52


For a project of mine I need to detect when the user blows into the mic. I've followed this tutorial: http://www.mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/ and this question: Detect blow in Mic and do something.
But I still don't get the results I want. The blow is detected far too late, or sometimes not at all. When I tweak some values the blow is detected correctly, but then it also triggers too easily, i.e. talking or a clicking sound is detected as a blow too.

Has anyone found a good way of detecting a blow? Thanks.

Comments (3)

笨死的猪 2024-11-15 05:52:52


The AVAudioRecorder sound level API is not designed to give you reliable results in separating blowing sounds from other types of sounds received by the mic.

I suggest using the Audio Queue or the Audio Unit RemoteIO API, measuring RMS signal energy, envelope duration, and then using the Accelerate FFT library to check the spectrum for broadband noise vs. peaks that would suggest voiced talking instead of blowing.

In other words, a more reliable result will require considerably more work than a single OS call.

千柳 2024-11-15 05:52:52
Return as soon as the low-pass result first exceeds 0.55.

I have solved the issue; have a look:

    - (void)readyToBlow1 {
        // Record to /dev/null: we only need the level meters, not the audio.
        NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];

        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithFloat: 44100.0],                 AVSampleRateKey,
                                  [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                                  [NSNumber numberWithInt: 1],                         AVNumberOfChannelsKey,
                                  [NSNumber numberWithInt: AVAudioQualityMax],         AVEncoderAudioQualityKey,
                                  nil];

        NSError *error;
        recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];

        if (recorder) {
            [recorder prepareToRecord];
            recorder.meteringEnabled = YES;
            [recorder record];
            // Poll the meters every 10 ms.
            levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.01
                                                          target:self
                                                        selector:@selector(levelTimerCallback1:)
                                                        userInfo:nil
                                                         repeats:YES];
        } else {
            NSLog(@"%@", [error description]);
        }
    }

    - (void)levelTimerCallback1:(NSTimer *)timer {
        [recorder updateMeters];

        const double ALPHA = 0.05;
        // Convert the peak power in dB to a linear value, then low-pass filter it.
        double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
        lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;

        if (lowPassResults > 0.55) {
            lowPassResults = 0.0;
            [self invalidateTimers];

            NextPhase *objNextView = [[NextPhase alloc] init];
            [UIView transitionFromView:self.view
                                toView:objNextView.view
                              duration:2.0
                               options:UIViewAnimationOptionTransitionCurlUp
                            completion:^(BOOL finished) {}];
            [self.navigationController pushViewController:objNextView animated:NO];

            return;  // bail out on the first detection
        }
    }
孤君无依 2024-11-15 05:52:52


I've had good success using AudioQueueGetProperty() with the kAudioQueueProperty_CurrentLevelMeter property.
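A minimal sketch of that approach (assuming `queue` is an already-running single-channel input AudioQueueRef; error handling omitted). Note that level metering must be enabled on the queue before the meter property returns useful values:

```c
#include <AudioToolbox/AudioToolbox.h>

// Enable metering once, after creating the input queue.
UInt32 on = 1;
AudioQueueSetProperty(queue, kAudioQueueProperty_EnableLevelMetering,
                      &on, sizeof(on));

// Poll from a timer: one AudioQueueLevelMeterState per channel.
AudioQueueLevelMeterState level[1];
UInt32 size = sizeof(level);
AudioQueueGetProperty(queue, kAudioQueueProperty_CurrentLevelMeter,
                      level, &size);
// level[0].mAveragePower and level[0].mPeakPower are linear values;
// use kAudioQueueProperty_CurrentLevelMeterDB for decibels instead.
```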
