AudioContext - Web APIs
The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode. An audio context controls both the creation of the nodes it contains and the execution of the audio processing, or decoding. You need to create an AudioContext before you do anything else, as everything happens inside a context. It's recommended to create one AudioContext and reuse it instead of initializing a new one each time, and it's OK to use a single AudioContext for several different audio sources and pipelines concurrently.
<div id="interfaceDiagram" style="display: inline-block; position: relative; width: 100%; padding-bottom: 11.666666666666666%; vertical-align: middle; overflow: hidden;"><svg style="display: inline-block; position: absolute; top: 0; left: 0;" viewbox="-50 0 600 70" preserveAspectRatio="xMinYMin meet"><a xlink:href="/wiki/en-US/docs/Web/API/EventTarget" target="_top"><rect x="1" y="1" width="110" height="50" fill="#fff" stroke="#D4DDE4" stroke-width="2px" /><text x="56" y="30" font-size="12px" font-family="Consolas,Monaco,Andale Mono,monospace" fill="#4D4E53" text-anchor="middle" alignment-baseline="middle">EventTarget</text></a><polyline points="111,25 121,20 121,30 111,25" stroke="#D4DDE4" fill="none"/><line x1="121" y1="25" x2="151" y2="25" stroke="#D4DDE4"/><a xlink:href="https://developer.mozilla.org/wiki/en-US/docs/Web/API/BaseAudioContext" target="_top"><rect x="151" y="1" width="160" height="50" fill="#fff" stroke="#D4DDE4" stroke-width="2px" /><text x="231" y="30" font-size="12px" font-family="Consolas,Monaco,Andale Mono,monospace" fill="#4D4E53" text-anchor="middle" alignment-baseline="middle">BaseAudioContext</text></a><polyline points="311,25 321,20 321,30 311,25" stroke="#D4DDE4" fill="none"/><line x1="321" y1="25" x2="351" y2="25" stroke="#D4DDE4"/><a xlink:href="https://developer.mozilla.org/wiki/en-US/docs/Web/API/AudioContext" target="_top"><rect x="351" y="1" width="120" height="50" fill="#F4F7F8" stroke="#D4DDE4" stroke-width="2px" /><text x="411" y="30" font-size="12px" font-family="Consolas,Monaco,Andale Mono,monospace" fill="#4D4E53" text-anchor="middle" alignment-baseline="middle">AudioContext</text></a></svg></div>
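The reuse recommendation above can be sketched as a small lazy singleton. `getAudioContext` and its injectable `factory` parameter are hypothetical helper names for illustration, not part of the Web Audio API:

```javascript
// A minimal sketch of the "create one AudioContext and reuse it" advice.
// `getAudioContext` and the injectable `factory` are hypothetical helpers.
let sharedCtx = null;

function getAudioContext(factory = () => new AudioContext()) {
  // Create the context on first use, then hand back the same instance.
  if (sharedCtx === null) {
    sharedCtx = factory();
  }
  return sharedCtx;
}
```

The injectable factory is only there to make the pattern easy to exercise outside a browser; in page code you would simply call `getAudioContext()`.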
Constructor
AudioContext()
- Creates and returns a new AudioContext object.
Properties
Also inherits properties from its parent interface, BaseAudioContext.
AudioContext.baseLatency Read only
- Returns the number of seconds of processing latency incurred by the AudioContext passing the audio from the AudioDestinationNode to the audio subsystem.
AudioContext.outputLatency Read only
- Returns an estimation of the output latency of the current audio context.
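As a rough illustration, the two latency properties can be read and formatted together. `describeLatency` is a hypothetical helper; both properties report seconds, and `outputLatency` is not implemented in every browser, hence the fallback to 0:

```javascript
// Hypothetical helper: format a context's latency figures in milliseconds.
// Both properties are reported in seconds; outputLatency is missing in some
// browsers, so fall back to 0 when it is undefined.
function describeLatency(ctx) {
  const base = ctx.baseLatency ?? 0;
  const output = ctx.outputLatency ?? 0;
  return `base: ${(base * 1000).toFixed(1)} ms, output: ${(output * 1000).toFixed(1)} ms`;
}

// In a browser: console.log(describeLatency(new AudioContext()));
```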
Methods
Also inherits methods from its parent interface, BaseAudioContext.
AudioContext.close()
- Closes the audio context, releasing any system audio resources that it uses.
AudioContext.createMediaElementSource()
- Creates a MediaElementAudioSourceNode associated with an HTMLMediaElement. This can be used to play and manipulate audio from <video> or <audio> elements.
AudioContext.createMediaStreamSource()
- Creates a MediaStreamAudioSourceNode associated with a MediaStream representing an audio stream which may come from the local computer microphone or other sources.
AudioContext.createMediaStreamDestination()
- Creates a MediaStreamAudioDestinationNode associated with a MediaStream representing an audio stream which may be stored in a local file or sent to another computer.
AudioContext.createMediaStreamTrackSource()
- Creates a MediaStreamTrackAudioSourceNode associated with a MediaStream representing a media stream track.
AudioContext.getOutputTimestamp()
- Returns a new AudioTimestamp object containing two audio timestamp values relating to the current audio context.
AudioContext.resume()
- Resumes the progression of time in an audio context that has previously been suspended/paused.
AudioContext.suspend()
- Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
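suspend() and resume() are often paired behind a single play/pause control. The sketch below assumes a hypothetical `togglePlayback` helper (not part of the API) and relies on the `state` property inherited from BaseAudioContext:

```javascript
// Hypothetical helper pairing suspend() and resume() behind one toggle.
// Both methods return promises that resolve once the state change lands.
async function togglePlayback(ctx) {
  if (ctx.state === 'running') {
    await ctx.suspend();   // halt audio hardware access, save CPU/battery
  } else if (ctx.state === 'suspended') {
    await ctx.resume();    // restart the progression of time in the graph
  }
  // A 'closed' context can be neither suspended nor resumed, so it is
  // left untouched here.
  return ctx.state;
}
```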
Examples
Basic audio context declaration:
var audioCtx = new AudioContext();
Cross-browser variant:
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();
var oscillatorNode = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();
var finish = audioCtx.destination;
// etc.
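A typical graph for createMediaElementSource() routes an <audio> element through a gain node to the destination. `connectElement` and its `volume` parameter are illustrative names for this sketch, not part of the API:

```javascript
// Hypothetical helper: wire a media element through a gain node to the
// speakers. Returns the gain node so the caller can adjust volume later.
function connectElement(audioCtx, mediaElement, volume = 0.5) {
  const source = audioCtx.createMediaElementSource(mediaElement);
  const gainNode = audioCtx.createGain();
  gainNode.gain.value = volume;          // 1.0 is unity gain
  source.connect(gainNode);              // element -> gain
  gainNode.connect(audioCtx.destination); // gain -> speakers
  return gainNode;
}

// In a browser:
// const gain = connectElement(audioCtx, document.querySelector('audio'), 0.8);
```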
Specifications
Specification | Status | Comment
---|---|---
Web Audio API, the definition of 'AudioContext' in that specification. | Working Draft |
Browser compatibility
See also