How do I pass an audio format of AudioFormat.CHANNEL_OUT_MONO to AudioRecord.getMinBufferSize on Android?
I would like to do real-time audio streaming in my application. I have a function named listenUserVoice(String userVoice) that accepts Base64-encoded voice data. My problem is that when I retrieve the buffer size using AudioRecord.getMinBufferSize with AudioFormat.CHANNEL_OUT_MONO, it returns -2 (ERROR_BAD_VALUE). While trying to track the error down, I found that there is no switch case for AudioFormat.CHANNEL_OUT_MONO in Android's AudioRecord.java class.
public void listenUserVoice(String userVoice) {
    Log.d(TAG, "voice data " + userVoice);
    try {
        // Returns -2 (ERROR_BAD_VALUE) with CHANNEL_OUT_MONO
        playBufSize = AudioRecord.getMinBufferSize(frequency, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        Log.d(TAG, "-------------- playBufSize " + playBufSize);
        audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, frequency, AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT, playBufSize, AudioTrack.MODE_STREAM);
        bytes = new byte[playBufSize];
        if (audioTrack.getState() == AudioTrack.STATE_INITIALIZED) {
            audioTrack.play();
        }
        byte[] decoded = Base64.decode(userVoice, Base64.DEFAULT);
        int i = 0;
        InputStream inputStream = new ByteArrayInputStream(decoded);
        while ((i = inputStream.read(bytes)) != -1) {
            audioTrack.write(bytes, 0, i);
        }
        inputStream.close();
    } catch (IOException e) {
        Log.d(TAG, "Error " + e.getMessage());
        e.printStackTrace();
    }
}
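A likely explanation: AudioRecord.getMinBufferSize is the capture-side query and only recognizes the CHANNEL_IN_* input masks, so an output mask like CHANNEL_OUT_MONO falls through to the default branch of that switch and yields ERROR_BAD_VALUE. For a playback buffer, the matching call would be AudioTrack.getMinBufferSize, which does accept CHANNEL_OUT_MONO. The sketch below runs off-device: the channel-mask constants are mirrored from the AudioFormat source (assumed values, verify against your SDK), the switch is a simplified re-creation of the one inside AudioRecord, and the streaming loop from the question writes to a ByteArrayOutputStream as a stand-in for audioTrack.write.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Base64;

public class ChannelMaskDemo {
    // Constants mirrored from android.media.AudioFormat (assumed values):
    static final int CHANNEL_OUT_MONO  = 0x4;  // an OUTPUT mask
    static final int CHANNEL_IN_MONO   = 0x10; // the INPUT mask getMinBufferSize expects
    static final int CHANNEL_IN_STEREO = 0xC;
    static final int ERROR_BAD_VALUE   = -2;

    // Simplified re-creation of the channel switch inside
    // AudioRecord.getMinBufferSize: only input masks are handled.
    static int channelCount(int channelConfig) {
        switch (channelConfig) {
            case CHANNEL_IN_MONO:   return 1;
            case CHANNEL_IN_STEREO: return 2;
            default:                return ERROR_BAD_VALUE; // CHANNEL_OUT_MONO lands here
        }
    }

    // The decode-and-stream loop from the question, with audioTrack.write
    // replaced by a ByteArrayOutputStream so it runs off-device.
    static byte[] decodeAndStream(String base64Voice, int bufSize) {
        byte[] decoded = Base64.getDecoder().decode(base64Voice);
        ByteArrayOutputStream sink = new ByteArrayOutputStream(); // stand-in for AudioTrack
        byte[] buf = new byte[bufSize];
        InputStream in = new ByteArrayInputStream(decoded);
        try {
            int n;
            while ((n = in.read(buf)) != -1) {
                sink.write(buf, 0, n); // audioTrack.write(buf, 0, n) on-device
            }
            in.close();
        } catch (IOException e) {
            throw new RuntimeException(e); // cannot happen for in-memory streams
        }
        return sink.toByteArray();
    }

    public static void main(String[] args) {
        System.out.println(channelCount(CHANNEL_OUT_MONO)); // the observed -2
        System.out.println(channelCount(CHANNEL_IN_MONO));  // 1
        String voice = Base64.getEncoder().encodeToString("pcm-bytes".getBytes());
        System.out.println(decodeAndStream(voice, 4).length);
    }
}
```

On-device, the corrected buffer-size query for this playback path would be AudioTrack.getMinBufferSize(frequency, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT); AudioRecord.getMinBufferSize is only appropriate when recording, with CHANNEL_IN_MONO or CHANNEL_IN_STEREO.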