Creating a MediaStream from Hls.js

Posted 2025-01-24 09:46:16


I have an HLS stream that I am attaching to an audio element using the Hls.js library. I then want to take that stream and feed it into Wave.js, using the fromStream method rather than fromElement to construct the audio waveform visualiser, so that I can set the optional connectDestination argument to false.
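
For context, the stream is attached using the standard Hls.js flow, presumably along these lines (the manifest URL and element lookup here are placeholders, not the original code):

  import Hls from 'hls.js';

  // Standard Hls.js flow: load the HLS manifest and attach it to the
  // <audio> element via Media Source Extensions.
  const audio = document.querySelector('audio');
  if (Hls.isSupported()) {
    const hls = new Hls();
    hls.loadSource('https://example.com/stream.m3u8'); // placeholder URL
    hls.attachMedia(audio);
  }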

To create the media stream that is fed into Wave.fromStream(), I am following this example, which shows how captureStream() can be used to mirror the playback of an audio or video element.
My implementation of the stream capture is as follows.

  // Grab the native <audio> element from the Angular ViewChild reference.
  let audio = this.audioPlayer.nativeElement;
  let stream;
  if (audio.captureStream) {
    // Standard API: mirror the element's playback as a MediaStream.
    stream = audio.captureStream();
  } else if (audio.mozCaptureStream) {
    // Firefox ships the same API behind a moz- prefix.
    stream = audio.mozCaptureStream();
  } else {
    console.error('Stream capture is not supported');
    stream = null;
  }

The stream is then passed into Wave.fromStream().
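
For reference, the call looks roughly like this; the canvas id and options object are placeholder values, and the four-argument form simply follows the fromStream signature with the optional connectDestination parameter described above:

  // Sketch only: 'waveCanvas' and the options object are placeholders.
  const wave = new Wave();
  wave.fromStream(stream, 'waveCanvas', { type: 'wave' }, false); // connectDestination = false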

Unfortunately, when Wave.fromStream() is executed, I get the following error.

core.js:6498 ERROR DOMException: Failed to execute 'createMediaStreamSource' on 'AudioContext': MediaStream has no audio track

This means that the media stream passed into Wave.fromStream has no audio tracks associated with it. And when I inspect my audio element, even with the HLS stream attached, logging audio.audioTracks returns undefined, even though there is audio playing and being controlled by that element. So creating the media stream from the audio element is not the issue; the problem is how Hls.js attaches the stream to the audio element.
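
One hedged diagnostic worth trying: captureStream() only populates the stream's tracks once the element is actually rendering media, so a stream captured before playback has started can legitimately have zero audio tracks at the moment Wave.fromStream() runs. A sketch that defers the capture until playback:

  // Capture only once the element is actually playing, and log the
  // track list before handing the stream to Wave.fromStream().
  audio.addEventListener('playing', () => {
    const stream = audio.captureStream
      ? audio.captureStream()
      : audio.mozCaptureStream();
    console.log('audio tracks:', stream.getAudioTracks()); // expect length > 0
  }, { once: true });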

Is there another way to create a MediaStream object from an HLS stream created by Hls.js?


Comments (1)

初雪 2025-01-31 09:46:16


I have found a solution to the problem, which was to create my own customised version of the @foobar404/wave npm package and modify it to accept an extra input argument: a pre-existing audio context and the final node in that context. This means I can connect the waveform to the final node in the audio context without also connecting it to the audio context's destination or source, which is where my errors were coming from.

See: https://www.npmjs.com/package/wave-external-audio-context
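
The modified package itself is not reproduced here, but the graph it plugs into is essentially the standard Web Audio pattern of tapping the element with createMediaElementSource and letting the visualiser observe an existing node through an AnalyserNode. A sketch of that idea (all names are illustrative; the package's real API may differ):

  // Build the audio graph once, outside the wave package.
  const audioCtx = new AudioContext();
  const source = audioCtx.createMediaElementSource(audio);
  const finalNode = audioCtx.createGain(); // the "final node" handed to the package
  source.connect(finalNode);
  finalNode.connect(audioCtx.destination); // audio still reaches the speakers

  // The customised package can now attach its analyser to `finalNode`
  // instead of calling createMediaStreamSource on a track-less MediaStream.
  const analyser = audioCtx.createAnalyser();
  finalNode.connect(analyser); // observes only; no re-routing of audio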
