NAudio C#: how to set/control the number of bytes read from IWavePlayer?
I have the following code:
IWavePlayer player = new WaveOut();
WaveStream reader = new WaveFileReader(filePath);
WaveChannel32 input = new WaveChannel32(reader);
AmplitudeStream ampStream = new AmplitudeStream(input);
ampStream.AmplitudeEventHandler += new EventHandler<AmplitudeArgs>(AmplitudeStreamHandler);
player.Init(ampStream);
player.Play();
AmplitudeStream is a class I made that implements WaveStream so that I can 'track' the data and send an event whenever it is read from the file. However, my problem is that I want to control how many bytes are read. Since the Read function of AmplitudeStream is called by Play(), I have no idea how I can limit the number of bytes read (it is always 35,280).
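For reference, here is a minimal sketch of what such a wrapper might look like; the actual AmplitudeStream is not shown in the question, so the class body and the AmplitudeArgs payload below are assumptions based on its description:

using System;
using NAudio.Wave;

// Hypothetical event args; the question's AmplitudeArgs is not shown.
public class AmplitudeArgs : EventArgs
{
    public byte[] Buffer { get; private set; }
    public int BytesRead { get; private set; }

    public AmplitudeArgs(byte[] buffer, int bytesRead)
    {
        Buffer = buffer;
        BytesRead = bytesRead;
    }
}

// Pass-through WaveStream that raises an event on every Read call.
public class AmplitudeStream : WaveStream
{
    private readonly WaveStream source;

    public event EventHandler<AmplitudeArgs> AmplitudeEventHandler;

    public AmplitudeStream(WaveStream source)
    {
        this.source = source;
    }

    public override WaveFormat WaveFormat { get { return source.WaveFormat; } }
    public override long Length { get { return source.Length; } }
    public override long Position
    {
        get { return source.Position; }
        set { source.Position = value; }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        // count is chosen by the output device (the 35,280 in the question);
        // this wrapper can only observe what was read, not change the request size.
        int bytesRead = source.Read(buffer, offset, count);
        var handler = AmplitudeEventHandler;
        if (bytesRead > 0 && handler != null)
        {
            handler(this, new AmplitudeArgs(buffer, bytesRead));
        }
        return bytesRead;
    }
}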
How can I do this? I am using the Play() command because I want my file to actually play so that I can observe it. So, is there another way to play the file while I drive the Read() calls myself, something like this:
byte[] buffer = new byte[8192];
int bytesRead = 0;
do
{
    bytesRead = ampStream.Read(buffer, 0, buffer.Length);
} while (bytesRead != 0);
public override int Read(byte[] buffer, int offset, int count)
{
    // some command to play the file here
}
I would like to know. Thanks! Aside from that, is there a way to avoid using the WaveChannel32 class as the input to my AmplitudeStream?
Comments (1)
Have a look in the NAudio demo at how the waveform drawing is done, since this sounds like exactly what you want to do. It essentially intercepts the Read call and raises an event every n samples that the drawing code can subscribe to.
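A minimal sketch of that pattern (the class and member names here are illustrative, not the demo's actual API): accumulate samples inside Read and raise a notification every n samples, so the notification rate is independent of how large each Read request happens to be. It assumes the source delivers 32-bit IEEE float samples, as WaveChannel32 does.

using System;
using NAudio.Wave;

// Wraps a 32-bit float WaveStream (e.g. WaveChannel32) and reports the peak
// amplitude every NotificationCount samples, regardless of the Read block size.
public class SampleNotifyingStream : WaveStream
{
    private readonly WaveStream source;
    private int sampleCounter;
    private float maxSample;

    // How many samples to accumulate before raising an event (illustrative default).
    public int NotificationCount { get; set; }

    // Raised with the peak absolute sample value of the last block of samples.
    public event Action<float> MaximumCalculated;

    public SampleNotifyingStream(WaveStream source)
    {
        this.source = source;
        NotificationCount = 882;
    }

    public override WaveFormat WaveFormat { get { return source.WaveFormat; } }
    public override long Length { get { return source.Length; } }
    public override long Position
    {
        get { return source.Position; }
        set { source.Position = value; }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int bytesRead = source.Read(buffer, offset, count);

        // Walk the buffer as 32-bit floats and fire an event every n samples.
        for (int i = 0; i + 4 <= bytesRead; i += 4)
        {
            float sample = BitConverter.ToSingle(buffer, offset + i);
            maxSample = Math.Max(maxSample, Math.Abs(sample));
            sampleCounter++;
            if (sampleCounter >= NotificationCount)
            {
                var handler = MaximumCalculated;
                if (handler != null) handler(maxSample);
                sampleCounter = 0;
                maxSample = 0f;
            }
        }
        return bytesRead;
    }
}

Wiring this into the question's chain would simply replace AmplitudeStream: new SampleNotifyingStream(new WaveChannel32(new WaveFileReader(filePath))).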
Update: the number of bytes read is a function of the playback latency - lower latency = fewer bytes read. If your source stream can only be read in block sizes of 8192, though, I would be tempted to introduce an intermediate buffering stream.
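To illustrate the latency point: WaveOut exposes DesiredLatency and NumberOfBuffers, and each Read request is roughly one buffer's worth of audio, so lowering the latency shrinks the count passed to Read. For what it's worth, 35,280 bytes corresponds to 100 ms of 44.1 kHz stereo 32-bit float audio (44100 samples/s × 2 channels × 4 bytes × 0.1 s), which is the format WaveChannel32 produces. A sketch with illustrative values, reusing the question's AmplitudeStream:

using NAudio.Wave;

string filePath = "example.wav"; // placeholder path to a WAV file

// Lower latency = smaller buffers = fewer bytes requested per Read call.
var player = new WaveOut
{
    DesiredLatency = 100,  // total playback latency in milliseconds
    NumberOfBuffers = 2    // the latency is split across this many buffers
};

var reader = new WaveFileReader(filePath);
var input = new WaveChannel32(reader);
var ampStream = new AmplitudeStream(input);
player.Init(ampStream);
player.Play();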