MediaStreamAudioDestinationNode - Web APIs
The MediaStreamAudioDestinationNode interface represents an audio destination consisting of a WebRTC MediaStream with a single audio MediaStreamTrack, which can be used in a similar way to a MediaStream obtained from Navigator.getUserMedia(). It is an AudioNode that acts as an audio destination, created using the AudioContext.createMediaStreamDestination() method.
Number of inputs | 1 |
---|---|
Number of outputs | 0 |
Channel count | 2 |
Channel count mode | "explicit" |
Channel count interpretation | "speakers" |
Constructor
MediaStreamAudioDestinationNode.MediaStreamAudioDestinationNode()
- Creates a new MediaStreamAudioDestinationNode object instance.
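
As a minimal sketch (variable names here are assumptions, not taken from this page), a node can be obtained either from this constructor, which takes the owning AudioContext, or from the context's equivalent factory method:

var audioCtx = new AudioContext();

// Using the constructor…
var destNode = new MediaStreamAudioDestinationNode(audioCtx);

// …or, equivalently, the factory method on the context:
var destNode2 = audioCtx.createMediaStreamDestination();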
Properties
Inherits properties from its parent, AudioNode.
MediaStreamAudioDestinationNode.stream
- A MediaStream containing a single MediaStreamTrack whose kind is audio and with the same number of channels as the node. You can use this property to get a stream out of the audio graph and feed it into another construct, such as a MediaRecorder.
Methods
Inherits methods from its parent, AudioNode.
Example
In the following simple example, we create a MediaStreamAudioDestinationNode, an OscillatorNode, and a MediaRecorder (the example will therefore only work in Firefox and Chrome at this time). The MediaRecorder is set up to record information from the MediaStreamAudioDestinationNode.

When the button is clicked, the oscillator starts and the MediaRecorder starts recording. When the button is clicked again, the oscillator and the MediaRecorder both stop. Stopping the MediaRecorder causes the dataavailable event to fire, and each piece of event data is pushed into the chunks array. After that, the stop event fires, a new Blob of type Opus is made from the data in the chunks array, and the audio element's src is set to a URL created from that blob. From there, you can play and save the Opus file.
<!DOCTYPE html>
<html>
  <head>
    <title>createMediaStreamDestination() demo</title>
  </head>
  <body>
    <h1>createMediaStreamDestination() demo</h1>
    <p>Encoding a pure sine wave to an Opus file</p>
    <button>Make sine wave</button>
    <audio controls></audio>
    <script>
      var b = document.querySelector("button");
      var clicked = false;
      var chunks = [];
      var ac = new AudioContext();
      var osc = ac.createOscillator();
      var dest = ac.createMediaStreamDestination();
      var mediaRecorder = new MediaRecorder(dest.stream);
      osc.connect(dest);

      b.addEventListener("click", function (e) {
        if (!clicked) {
          mediaRecorder.start();
          osc.start(0);
          e.target.textContent = "Stop recording";
          clicked = true;
        } else {
          mediaRecorder.stop();
          osc.stop(0);
          e.target.disabled = true;
        }
      });

      mediaRecorder.ondataavailable = function (evt) {
        // Push each chunk (blob) into an array.
        chunks.push(evt.data);
      };

      mediaRecorder.onstop = function (evt) {
        // Make a blob out of our chunks, and point the audio element at it.
        var blob = new Blob(chunks, { type: "audio/ogg; codecs=opus" });
        document.querySelector("audio").src = URL.createObjectURL(blob);
      };
    </script>
  </body>
</html>
Note: You can view this example live, or study the source code, on GitHub.
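
As an aside — a hedged sketch reusing the variable names from the example above, not part of the demo itself — the oscillator can also be connected to the context's default destination in parallel, so the sine wave is audible through the speakers while it is being recorded:

// In addition to the MediaStreamAudioDestinationNode…
osc.connect(dest);
// …the oscillator can feed the context's normal output at the same time.
osc.connect(ac.destination);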
Specifications
Specification | Status | Comment
---|---|---
Web Audio API: The definition of 'MediaStreamAudioDestinationNode' in that specification. | Working Draft |
Browser compatibility