MediaStreamTrackAudioSourceNode - Web APIs
The MediaStreamTrackAudioSourceNode interface is a type of AudioNode which represents a source of audio data taken from a specific MediaStreamTrack obtained through the WebRTC or Media Capture and Streams APIs. The audio itself might be input from a microphone or other audio sampling device, or might be received through an RTCPeerConnection, among other possible options.

A MediaStreamTrackAudioSourceNode has no inputs and exactly one output, and is created using the AudioContext.createMediaStreamTrackSource() method. This interface is similar to MediaStreamAudioSourceNode, except it lets you specify exactly which track to use, rather than assuming the first audio track on a stream.
| Number of inputs | 0 |
| --- | --- |
| Number of outputs | 1 |
| Channel count | Defined by the first audio MediaStreamTrack passed to the AudioContext.createMediaStreamTrackSource() method that created it. |
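For instance, the following sketch (the variable names stream, track, and source are just illustrative) grabs an audio stream with getUserMedia(), picks out its first audio track explicitly, and turns that track into a source node with createMediaStreamTrackSource():

navigator.mediaDevices.getUserMedia({audio: true})
  .then(function(stream) {
    var audioCtx = new AudioContext();

    // Pick out a specific audio track rather than relying on
    // whichever track happens to be first on the stream
    var track = stream.getAudioTracks()[0];

    // Create the MediaStreamTrackAudioSourceNode for that track
    var source = audioCtx.createMediaStreamTrackSource(track);

    // Route the track's audio straight to the speakers
    source.connect(audioCtx.destination);
  })
  .catch(function(err) {
    console.log('getUserMedia error: ' + err);
  });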
Constructor
new MediaStreamTrackAudioSourceNode()
- Creates a new MediaStreamTrackAudioSourceNode object instance with the specified options.
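As a rough sketch (assuming an existing AudioContext named audioCtx and an audio MediaStreamTrack named track), the constructor takes the context plus an options object whose mediaStreamTrack member names the track to use:

// audioCtx is an existing AudioContext; track is an audio
// MediaStreamTrack obtained from getUserMedia() or WebRTC
var sourceNode = new MediaStreamTrackAudioSourceNode(audioCtx, {
  mediaStreamTrack: track
});

// The node can then be wired up like any other source AudioNode
sourceNode.connect(audioCtx.destination);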
Properties
The MediaStreamTrackAudioSourceNode interface has no properties of its own; however, it inherits the properties of its parent, AudioNode.
Methods
Inherits methods from its parent, AudioNode.
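For example (a brief sketch reusing the sourceNode and audioCtx names from the constructor sketch above), the inherited connect() and disconnect() methods wire the node into and out of an audio graph:

// Route the track's audio into the context's output
sourceNode.connect(audioCtx.destination);

// Later, stop routing it
sourceNode.disconnect(audioCtx.destination);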
Example
In this example, we grab a media (audio + video) stream from navigator.mediaDevices.getUserMedia, feed the media into a <video> element to play it, mute the audio, and then also feed the audio into a MediaStreamAudioSourceNode. Next, we feed this source audio into a lowshelf BiquadFilterNode (which effectively serves as a bass booster), then into an AudioDestinationNode.

The range slider below the <video> element controls the amount of gain given to the lowshelf filter; increase the value of the slider to make the audio sound more bass heavy!
Note: You can see this example running live, or view the source.
var pre = document.querySelector('pre');
var video = document.querySelector('video');
var myScript = document.querySelector('script');
var range = document.querySelector('input');

// getUserMedia block - grab stream,
// put it into a MediaStreamAudioSourceNode,
// also output the visuals into a video element

if (navigator.mediaDevices) {
  console.log('getUserMedia supported.');
  navigator.mediaDevices.getUserMedia({audio: true, video: true})
    .then(function(stream) {
      video.srcObject = stream;
      video.onloadedmetadata = function(e) {
        video.play();
        video.muted = true;
      };

      // Create a MediaStreamAudioSourceNode
      // and feed the stream into it
      var audioCtx = new AudioContext();
      var source = audioCtx.createMediaStreamSource(stream);

      // Create a lowshelf biquad filter to act as a bass booster
      var biquadFilter = audioCtx.createBiquadFilter();
      biquadFilter.type = "lowshelf";
      biquadFilter.frequency.value = 1000;
      biquadFilter.gain.value = range.value;

      // Connect the source to the biquad filter, and the filter
      // to the destination, so we can hear the boosted audio
      source.connect(biquadFilter);
      biquadFilter.connect(audioCtx.destination);

      // Update the filter's gain whenever the range slider changes
      range.oninput = function() {
        biquadFilter.gain.value = range.value;
      }
    })
    .catch(function(err) {
      console.log('The following gUM error occurred: ' + err);
    });
} else {
  console.log('getUserMedia not supported on your browser!');
}

// dump script to pre element
pre.innerHTML = myScript.innerHTML;
Note: As a consequence of calling createMediaStreamSource(), audio playback from the media stream will be re-routed into the processing graph of the AudioContext. So playing/pausing the stream can still be done through the media element API and the player controls.
Specifications
| Specification | Status | Comment |
| --- | --- | --- |
| Web Audio API: The definition of 'MediaStreamTrackAudioSourceNode' in that specification. | Working Draft | |
Browser compatibility
See also