public/etau_class library
Classes
- AnalyserNode
  The AnalyserNode represents a node able to provide real-time frequency and time-domain analysis information. It is an AudioNode that passes the audio stream unchanged from the input to the output, but allows you to take the generated data, process it, and create audio visualizations.
- AnalyserOptions
- AsyncWorkletNode
- AudioBuffer
  The AudioBuffer represents a short audio asset residing in memory, created from an audio file using the BaseAudioContext.decodeAudioData method, or from raw data using BaseAudioContext.createBuffer. Once put into an AudioBuffer, the audio can then be played by being passed into an AudioBufferSourceNode.
- AudioBufferOptions
- AudioBufferSourceNode
  The AudioBufferSourceNode is an AudioScheduledSourceNode which represents an audio source consisting of in-memory audio data, stored in an AudioBuffer.
- AudioBufferSourceOptions
- AudioContext
  The AudioContext represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode.
- AudioContextOptions
- AudioDestinationNode
  The AudioDestinationNode represents the end destination of an audio graph in a given context, usually the speakers of your device. It can also be the node that will "record" the audio data when used with an OfflineAudioContext.
- AudioListener
  The AudioListener represents the position and orientation of the unique person listening to the audio scene, and is used in audio spatialization. All PannerNodes spatialize in relation to the AudioListener stored in the BaseAudioContext.listener attribute.
- AudioNode
  The AudioNode is a generic interface for representing an audio processing module.
- AudioNodeOptions
- AudioParam
  The Web Audio API's AudioParam represents an audio-related parameter, usually a parameter of an AudioNode (such as GainNode.gain).
- AudioParamMap
  The AudioParamMap of the Web Audio API represents an iterable and read-only set of multiple audio parameters.
- AudioProcessingEvent
  The AudioProcessingEvent of the Web Audio API represents events that occur when a ScriptProcessorNode input buffer is ready to be processed.
- AudioProcessingEventInit
- AudioScheduledSourceNode
  The AudioScheduledSourceNode interface, part of the Web Audio API, is a parent for several types of audio source node interfaces which share the ability to be started and stopped, optionally at specified times. Specifically, this defines the AudioScheduledSourceNode.start and AudioScheduledSourceNode.stop methods, as well as the AudioScheduledSourceNode.ended_event event.
- AudioSinkOptions
- AudioTimestamp
- AudioWorklet
  The AudioWorklet of the Web Audio API is used to supply custom audio processing scripts that execute in a separate thread to provide very low latency audio processing.
- AudioWorkletGlobalScope
  The AudioWorkletGlobalScope of the Web Audio API represents a global execution context for user-supplied code, which defines custom AudioWorkletProcessor-derived classes.
- AudioWorkletNode
  Note: Although the AudioWorkletNode interface is available outside secure contexts, the BaseAudioContext.audioWorklet property is not, thus custom AudioWorkletProcessors cannot be defined outside them.
- AudioWorkletNodeOptions
- AudioWorkletProcessor
  The AudioWorkletProcessor of the Web Audio API represents the audio processing code behind a custom AudioWorkletNode. It lives in the AudioWorkletGlobalScope and runs on the Web Audio rendering thread. In turn, an AudioWorkletNode based on it runs on the main thread.
- BaseAudioContext
  The BaseAudioContext of the Web Audio API acts as a base definition for online and offline audio-processing graphs, as represented by AudioContext and OfflineAudioContext respectively. You wouldn't use BaseAudioContext directly; you'd use its features via one of these two inheriting interfaces.
- BiquadFilterNode
  The BiquadFilterNode represents a simple low-order filter, and is created using the BaseAudioContext.createBiquadFilter method. It is an AudioNode that can represent different kinds of filters, tone control devices, and graphic equalizers. A BiquadFilterNode always has exactly one input and one output.
- BiquadFilterOptions
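A biquad is a second-order IIR filter, and its behavior can be sketched outside this library with the standard "Audio EQ Cookbook" low-pass coefficients. This is an illustrative Python concept sketch, not this library's Dart API; the function names are mine:

```python
import math

def lowpass_coeffs(fs, f0, q):
    """Low-pass biquad coefficients (RBJ Audio EQ Cookbook), normalized by a0."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    cosw = math.cos(w0)
    a0 = 1 + alpha
    b = [(1 - cosw) / 2 / a0, (1 - cosw) / a0, (1 - cosw) / 2 / a0]
    a = [1.0, -2 * cosw / a0, (1 - alpha) / a0]
    return b, a

def biquad(samples, b, a):
    """Direct Form I: y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out
```

For a low-pass filter the DC gain is exactly 1, so a constant input settles to the same constant at the output, which is an easy sanity check on the coefficients.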
- Blob
  The Blob interface represents a blob, which is a file-like object of immutable, raw data; it can be read as text or binary data, or converted into a ReadableStream so its methods can be used for processing the data.
- BlobEvent
  The BlobEvent interface of the MediaStream Recording API represents events associated with a Blob. These blobs are typically, but not necessarily, associated with media content.
- BlobEventInit
- BlobPropertyBag
- ChannelMergerNode
  The ChannelMergerNode interface, often used in conjunction with its opposite, ChannelSplitterNode, reunites different mono inputs into a single output. Each input is used to fill a channel of the output. This is useful for accessing each channel separately, e.g. for performing channel mixing where gain must be separately controlled on each channel.
- ChannelMergerOptions
- ChannelSplitterNode
  The ChannelSplitterNode interface, often used in conjunction with its opposite, ChannelMergerNode, separates the different channels of an audio source into a set of mono outputs. This is useful for accessing each channel separately, e.g. for performing channel mixing where gain must be separately controlled on each channel.
- ChannelSplitterOptions
- ConstantSourceNode
  The ConstantSourceNode interface, part of the Web Audio API, represents an audio source (based upon AudioScheduledSourceNode) whose output is a single unchanging value. This makes it useful for cases in which you need a constant value coming in from an audio source. In addition, it can be used like a constructible AudioParam by automating the value of its ConstantSourceNode.offset or by connecting another node to it; see Controlling multiple parameters with ConstantSourceNode.
- ConstantSourceOptions
- ConvolverNode
  The ConvolverNode is an AudioNode that performs a Linear Convolution on a given AudioBuffer, often used to achieve a reverb effect. A ConvolverNode always has exactly one input and one output.
- ConvolverOptions
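The linear convolution a ConvolverNode performs combines each input sample with an impulse response. A minimal direct-form Python sketch of the underlying math (illustrative only, not this library's Dart API):

```python
def convolve(signal, impulse_response):
    """Direct linear convolution: y[n] = sum over k of h[k] * x[n - k]."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for n, x in enumerate(signal):
        for k, h in enumerate(impulse_response):
            out[n + k] += x * h
    return out
```

Real convolver implementations use FFT-based (partitioned) convolution for long reverb impulse responses; the direct form above is only practical for short kernels, but it defines the result.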
- DelayNode
  The DelayNode represents a delay-line; an AudioNode audio-processing module that causes a delay between the arrival of input data and its propagation to the output.
- DelayOptions
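A delay-line can be modeled as a fixed-length FIFO: output begins with silence, and each input sample reappears a fixed number of samples later. An illustrative Python sketch of the concept (not this library's Dart API):

```python
from collections import deque

def delay_line(samples, delay_samples):
    """Run samples through a fixed delay; the first outputs are silence."""
    buf = deque([0.0] * delay_samples)  # pre-filled with silence
    out = []
    for s in samples:
        buf.append(s)
        out.append(buf.popleft())
    return out
```

In a real node the delay is expressed in seconds and converted using the context's sample rate, e.g. a 10 ms delay at 44100 Hz is 441 samples.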
- DynamicsCompressorNode
  The DynamicsCompressorNode provides a compression effect, which lowers the volume of the loudest parts of the signal in order to help prevent clipping and distortion that can occur when multiple sounds are played and multiplexed together at once. This is often used in musical production and game audio. DynamicsCompressorNode is an AudioNode that has exactly one input and one output.
- DynamicsCompressorOptions
- Event
- EventInit
- EventTarget
- GainNode
  The GainNode represents a change in volume. It is an AudioNode audio-processing module that causes a given gain to be applied to the input data before its propagation to the output. A GainNode always has exactly one input and one output, both with the same number of channels.
- GainOptions
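Applying gain is the simplest audio-processing operation: every sample is multiplied by a constant factor. A one-line Python sketch of the concept (illustrative only, not this library's Dart API):

```python
def apply_gain(samples, gain):
    """Scale every sample by a constant gain factor (1.0 = unchanged)."""
    return [s * gain for s in samples]
```

A gain of 0.5 roughly halves the perceived loudness contribution of each sample; a gain of 0.0 silences the stream entirely.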
- IIRFilterNode
  The IIRFilterNode of the Web Audio API is an AudioNode processor which implements a general infinite impulse response (IIR) filter; this type of filter can be used to implement tone control devices and graphic equalizers as well. It lets the parameters of the filter response be specified, so that it can be tuned as needed.
- IIRFilterOptions
- MediaDeviceInfo
- MediaDevices
- MediaElement
- MediaElementAudioSourceNode
  The MediaElementAudioSourceNode represents an audio source consisting of an HTML audio or video element. It is an AudioNode that acts as an audio source.
- MediaElementAudioSourceOptions
- MediaRecorder
- MediaRecorderOptions
- MediaStream
- MediaStreamAudioDestinationNode
  The MediaStreamAudioDestinationNode represents an audio destination consisting of a WebRTC MediaStream with a single AudioMediaStreamTrack, which can be used in a similar way to a MediaStream obtained from MediaDevices.getUserMedia.
- MediaStreamAudioSourceNode
  The MediaStreamAudioSourceNode is a type of AudioNode which operates as an audio source whose media is received from a MediaStream obtained using the WebRTC or Media Capture and Streams APIs.
- MediaStreamAudioSourceOptions
- MediaStreamTrack
- MediaStreamTrackAudioSourceNode
  The MediaStreamTrackAudioSourceNode is a type of AudioNode which represents a source of audio data taken from a specific MediaStreamTrack obtained through the WebRTC or Media Capture and Streams APIs.
- MediaStreamTrackAudioSourceOptions
- MessagePort
- OfflineAudioCompletionEvent
  The Web Audio API OfflineAudioCompletionEvent represents events that occur when the processing of an OfflineAudioContext is terminated. The OfflineAudioContext.complete_event event uses this interface.
- OfflineAudioCompletionEventInit
- OfflineAudioContext
  The OfflineAudioContext is an AudioContext interface representing an audio-processing graph built from linked-together AudioNodes. In contrast with a standard AudioContext, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an AudioBuffer.
- OfflineAudioContextOptions
- OscillatorNode
  The OscillatorNode represents a periodic waveform, such as a sine wave. It is an AudioScheduledSourceNode audio-processing module that causes a specified frequency of a given wave to be created: in effect, a constant tone.
- OscillatorOptions
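An oscillator producing a constant sine tone just evaluates the waveform once per sample tick at the context's sample rate. An illustrative Python sketch of the concept (not this library's Dart API):

```python
import math

def sine_wave(frequency, sample_rate, num_samples):
    """Generate num_samples of a sine tone at the given frequency (Hz)."""
    return [math.sin(2 * math.pi * frequency * n / sample_rate)
            for n in range(num_samples)]
```

Other oscillator types (square, sawtooth, triangle, or a custom PeriodicWave) follow the same pattern with a different waveform function of the phase.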
- PannerNode
  The PannerNode defines an audio-processing object that represents the location, direction, and behavior of an audio source signal in a simulated physical space. This AudioNode uses right-hand Cartesian coordinates to describe the source's position as a vector and its orientation as a 3D directional cone.
- PannerOptions
- ParameterData
- PeriodicWave
  The PeriodicWave defines a periodic waveform that can be used to shape the output of an OscillatorNode.
- PeriodicWaveConstraints
- PeriodicWaveOptions
- ProcessorOptions
- ScriptProcessorNode
  The ScriptProcessorNode allows the generation, processing, or analysis of audio using JavaScript.
- StereoPannerNode
  The StereoPannerNode of the Web Audio API represents a simple stereo panner node that can be used to pan an audio stream left or right. It is an AudioNode audio-processing module that positions an incoming audio stream in a stereo image using a low-cost equal-power panning algorithm.
- StereoPannerOptions
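The equal-power panning algorithm mentioned above maps the pan position onto a quarter circle so that the left and right gains always satisfy L² + R² = 1, keeping the total power constant as the source moves. An illustrative Python sketch for a mono input (not this library's Dart API):

```python
import math

def equal_power_pan(sample, pan):
    """pan in [-1, 1]: -1 is hard left, 0 is center, +1 is hard right."""
    theta = (pan + 1) * math.pi / 4  # maps [-1, 1] onto [0, pi/2]
    return sample * math.cos(theta), sample * math.sin(theta)
```

At center (pan = 0) both channels get a gain of cos(π/4) ≈ 0.707 rather than 0.5, which is what prevents the perceived dip in loudness that simple linear panning produces.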
- WaveShaperNode
  The WaveShaperNode represents a non-linear distorter.
- WaveShaperOptions
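A wave shaper distorts a signal by mapping each sample through a shaping curve: the sample's value in [-1, 1] selects a (linearly interpolated) position in the curve array, and the curve value becomes the output. An illustrative Python sketch of the lookup (not this library's Dart API):

```python
def wave_shape(samples, curve):
    """Map each sample in [-1, 1] through curve with linear interpolation."""
    n = len(curve)
    out = []
    for s in samples:
        s = max(-1.0, min(1.0, s))        # clamp into the curve's domain
        pos = (s + 1) / 2 * (n - 1)       # map [-1, 1] to [0, n - 1]
        i = int(pos)
        frac = pos - i
        hi = curve[min(i + 1, n - 1)]
        out.append(curve[i] + (hi - curve[i]) * frac)
    return out
```

A curve sampled from tanh gives soft clipping; the identity curve [-1.0, 0.0, 1.0] leaves the signal unchanged, which makes a handy sanity check.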
Enums
Typedefs
- AudioContextLatencyCategory = String
- AudioContextRenderSizeCategory = String
- AudioContextState = String
- AudioSinkType = String
- AudioWorkletProcessorConstructor = void Function()
- AutomationRate = String
- BiquadFilterType = String
- BitrateMode = String
- BlobPart = TauAny
- ChannelCountMode = String
- ChannelInterpretation = String
- DataEventHandler = void Function(Float32List)
- DecodeErrorCallback = void Function()
- DecodeSuccessCallback = void Function()
- DistanceModelType = String
- DOMHighResTimeStamp = int
- EndingType = String
- EventHandler = void Function()
- Message = dynamic
- MessageFn = void Function(dynamic msg)
- OnAudioBufferUnderflowFn = void Function(int outputNo)
- OnDataAvailableFn = void Function(Float32List data)
- OnReceiveDataFn = void Function(int inputNo, List<Float32List> data)
- OscillatorType = String
- OverSampleType = String
- PanningModelType = String
- RecordingState = String
- TauAny = Object
- TauArray<T> = List<T>
- TauArrayBuffer = ByteBuffer
- TauFloat32Array = Float32List
- TauHighResTimeStamp = double
- TauNumber = num
- TauObject = Object
- TauPromise<T> = Future<T>
- TauSampleRate = double
- TauTime = int
- TauUint8Array = Uint8List