public/webaudio library

Classes

AudioContext
The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode. An audio context controls both the creation of the nodes it contains and the execution of the audio processing, or decoding. You need to create an AudioContext before you do anything else, as everything happens inside a context. It's recommended to create one AudioContext and reuse it instead of initializing a new one each time, and it's OK to use a single AudioContext for several different audio sources and pipelines concurrently.
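A minimal sketch of the recommended pattern, assuming this library's surface mirrors the null-safe dart:web_audio API (used below as a stand-in import): one AudioContext is created up front and reused for every node the page needs.

    import 'dart:web_audio';

    // One shared context for the whole page; it owns every node it creates
    // and drives all of the audio processing.
    final AudioContext sharedContext = AudioContext();

    GainNode makeVolumeControl() {
      // Nodes are always created by an existing context, never on their own.
      final gain = sharedContext.createGain();
      gain.gain!.value = 0.5; // half volume via the node's AudioParam
      return gain;
    }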
AudioContextOptions
AudioDestinationNode
The AudioDestinationNode interface represents the end destination of an audio graph in a given context — usually the speakers of your device. It can also be the node that will "record" the audio data when used with an OfflineAudioContext. An AudioDestinationNode has no output (as it is the output, no more AudioNode can be linked after it in the audio graph) and one input. The number of channels in the input must be between 0 and the maxChannelCount value or an exception is raised. The AudioDestinationNode of a given AudioContext can be retrieved using the AudioContext.destination property.
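A small sketch, under the same dart:web_audio-style assumptions, showing that the destination is retrieved from the context rather than constructed directly, and that maxChannelCount bounds the input:

    import 'dart:web_audio';

    void inspectDestination(AudioContext context) {
      // The destination belongs to the context; you never construct one yourself.
      final AudioDestinationNode destination = context.destination!;

      // Inputs with more channels than maxChannelCount raise an exception.
      print('destination accepts up to ${destination.maxChannelCount} channels');
    }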
AudioNode
The AudioNode interface is a generic interface for representing an audio processing module. Examples include audio sources (such as MediaElementAudioSourceNode), the audio destination (AudioDestinationNode), and intermediate processing modules such as GainNode or StereoPannerNode.
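A hedged sketch of a typical graph built from classes in this library, again assuming the dart:web_audio surface, where connectNode is the Dart spelling of Web Audio's connect():

    import 'dart:html' show AudioElement;
    import 'dart:web_audio';

    // Wires source -> gain -> panner -> destination; every stage is an AudioNode.
    void buildGraph(AudioContext context, AudioElement element) {
      final source = context.createMediaElementSource(element);
      final gain = context.createGain();
      final panner = context.createStereoPanner();

      source.connectNode(gain);
      gain.connectNode(panner);
      panner.connectNode(context.destination!);
    }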
AudioNodeOptions
AudioParam
BaseAudioContext
Element
ElementEvents
ElementStream<T extends Event>
A specialized Stream available to Elements to enable event delegation.
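For illustration, a delegation sketch in the dart:html style this class follows; the '.play-button' selector is hypothetical:

    import 'dart:html';

    void delegateClicks(Element container) {
      // matches() filters bubbled events by CSS selector, so one listener
      // on the container handles clicks from every matching descendant.
      container.onClick.matches('.play-button').listen((MouseEvent event) {
        print('a play button was clicked');
      });
    }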
Event
Events
Base class that supports listening for and dispatching browser events.
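A short sketch of both halves (listening and dispatching), assuming the dart:html Events/EventTarget behavior; the 'refresh' event name is illustrative:

    import 'dart:html';

    void relay(Element element) {
      // The on[] lookup yields a Stream for any event name, custom ones included.
      element.on['refresh'].listen((Event event) => print('refreshed'));

      // Dispatching a matching event fires the listener above.
      element.dispatchEvent(Event('refresh'));
    }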
EventStreamProvider<T extends Event>
A factory to expose DOM events as Streams.
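A usage sketch following the dart:html pattern; the 'play' event name is only an example:

    import 'dart:html';

    // A provider wraps a DOM event name and exposes it as a typed Stream.
    const EventStreamProvider<Event> playEvent = EventStreamProvider<Event>('play');

    void watchPlayback(EventTarget target) {
      // forTarget produces the Stream for one concrete target.
      playEvent.forTarget(target).listen((Event event) {
        print('playback started');
      });
    }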
EventTarget
GainNode
GainOptions
JsUtil
MediaElement
MediaElementAudioSourceNode
MediaElementAudioSourceOptions
StereoPannerNode
StereoPannerOptions

Typedefs

EventListener = dynamic Function(Event event)
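Any function accepting an Event satisfies this typedef, so it can be passed straight to EventTarget.addEventListener; the handler below is a hypothetical example:

    import 'dart:html';

    // Matches EventListener: dynamic Function(Event event).
    dynamic logEvent(Event event) {
      print('got ${event.type}');
    }

    void attach(EventTarget target) {
      target.addEventListener('ended', logEvent);
    }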