What is the function of HTML5 Audio API Web Audio

This article introduces the HTML5 audio API, Web Audio, and what each of its pieces does. Many people run into trouble with it in real projects, so the sections below walk through the main building blocks step by step. I hope you read it carefully and get something out of it!

The main framework and workflow of the HTML5 audio API are as follows: inside an AudioContext, the audio file is decoded into buffer format; playback begins at an audio source (source), the audio is processed through one or more AudioNode modules, and the result is output at the destination. This forms an audio channel, with each module linked to the next by the connect method.
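In code, that channel looks roughly like the following sketch (buffer is assumed to already hold decoded audio data; the sections below show how each piece is created):

// a minimal sketch of a Web Audio channel: source -> processing -> destination
let audioContext = new (window.AudioContext || window.webkitAudioContext)();
let source = audioContext.createBufferSource();  // audio source
source.buffer = buffer;                          // assumed: decoded audio data
let gainNode = audioContext.createGain();        // an intermediate AudioNode
source.connect(gainNode);                        // source -> processing
gainNode.connect(audioContext.destination);      // processing -> output
source.start(0);                                 // start playback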

AudioContext

AudioContext is the audio context. Like a big factory, all audio processing happens inside this context.

let audioContext = new (window.AudioContext || window.webkitAudioContext)();

The AudioContext audio context provides many properties and methods for creating various audio sources and audio processing modules. Only some of them are described here; more can be found in the MDN documentation.

Property

AudioContext.destination

Returns an AudioDestinationNode object representing the final node in the current AudioContext, which generally corresponds to the audio rendering device.

Method

AudioContext.createBufferSource()

Creates an AudioBufferSourceNode object, which plays and processes the audio data contained in an AudioBuffer object.

AudioContext.createGain()

Creates a GainNode, which controls the overall volume of the audio.

AudioContext.createBiquadFilter()

Creates a BiquadFilterNode, which represents a biquad filter and supports several common filter types: high-pass, low-pass, band-pass, and so on.

AudioContext.createOscillator()

Creates an OscillatorNode, which represents a periodic waveform; essentially, it generates a tone.
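OscillatorNode is not used elsewhere in this article, but as a quick illustration, here is a minimal sketch that plays a 440 Hz tone for one second:

// play a 440 Hz sine tone for one second
let oscillator = audioContext.createOscillator();
oscillator.type = 'sine';                      // waveform shape
oscillator.frequency.value = 440;              // frequency in Hz
oscillator.connect(audioContext.destination);  // wire straight to the output
oscillator.start(0);
oscillator.stop(audioContext.currentTime + 1); // stop after one second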

Convert audio to Buffer format

Use the decodeAudioData() method to decode the audio file into buffer format.

function decodeAudioData(audioContext, url) {
  return new Promise((resolve) => {
    let request = new XMLHttpRequest();
    request.open('GET', url, true);
    request.responseType = 'arraybuffer';
    request.onload = () => {
      audioContext.decodeAudioData(request.response, (buffer) => {
        if (!buffer) {
          alert('error decoding file data: ' + url);
          return;
        }
        resolve(buffer);
      });
    };
    request.onerror = function () {
      alert('BufferLoader: XHR error');
    };
    request.send();
  });
}

// the function returns a Promise, so the decoded buffer must be awaited
let buffer;
decodeAudioData(audioContext, './sounds/music.mp3').then((result) => {
  buffer = result;
});
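As an aside, the same loading-and-decoding step can be written more compactly with fetch and the promise-based form of decodeAudioData, both widely supported today; a minimal sketch:

// a minimal sketch using fetch and the promise-based decodeAudioData
function loadBuffer(audioContext, url) {
  return fetch(url)
    .then((response) => response.arrayBuffer())
    .then((data) => audioContext.decodeAudioData(data));
}

loadBuffer(audioContext, './sounds/music.mp3').then((buffer) => {
  // buffer is ready to assign to an AudioBufferSourceNode
});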

AudioNode

The AudioNode interface represents an audio processing module, including audio sources, audio outputs, and intermediate processing modules.

Method

AudioNode.connect()

Links two AudioNode nodes, sending the audio output of one AudioNode into another and forming an audio channel.

AudioNode.disconnect()

Disconnects the AudioNode from the nodes it is connected to.

AudioBufferSourceNode

There are many kinds of audio source; only the buffer audio source is introduced here. A buffer audio source is created with the createBufferSource method of AudioContext. The audio source node inherits from AudioNode.

let bufferSource = audioContext.createBufferSource();

After the AudioBufferSourceNode object is created, assign the buffer-format audio data to its buffer property. The audio has now been handed to the audio source and can be processed or output.

bufferSource.buffer = buffer;

Method

AudioBufferSourceNode.start([when][, offset][, duration])

Starts playback.

when: the delay before playback begins (in seconds).

offset: the position in the audio, in seconds, at which playback starts.

duration: how long to play, in seconds; playback stops automatically after this duration.

AudioBufferSourceNode.stop([when])

Stops playback. Note that after calling this method, AudioBufferSourceNode.start cannot be called again.

when: the delay before playback stops (in seconds).
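A brief sketch of these parameters in use:

// start immediately, 10 seconds into the audio, and play for 30 seconds
bufferSource.start(0, 10, 30);

// schedule a stop 5 seconds from now; the source cannot be restarted afterwards
bufferSource.stop(audioContext.currentTime + 5);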

AudioDestinationNode

The audio endpoint is accessed through the destination property of the AudioContext interface. The audio endpoint inherits from AudioNode.

The AudioDestinationNode cannot pass audio on to another audio node; that is, it cannot connect to further audio nodes, because it is already the end point and has no output. It can be thought of as being the output itself.

let audioDestinationNode = audioContext.destination;

At this point we have an audio start point, AudioBufferSourceNode, and an audio end point, AudioDestinationNode. Linking them with the AudioNode.connect() method forms an audio channel with input and output, which can then play the audio once start is called.

bufferSource.connect(audioDestinationNode);
bufferSource.start(0); // nothing is heard until start is called

GainNode

Used to change the volume. It is an audio processing module of type AudioNode.

let gainNode = audioContext.createGain();

Link the audio source, the audio processing module, and the audio output together to form audio with controllable volume.

bufferSource.connect(gainNode);
gainNode.connect(audioDestinationNode);

let controlVolume = (value) => {
  gainNode.gain.value = value;
};

// play at double volume
controlVolume(2);
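Since gain is an AudioParam, volume changes can also be scheduled rather than set instantly; for example, a sketch of a two-second fade-in:

// fade in from silence to full volume over two seconds
gainNode.gain.setValueAtTime(0, audioContext.currentTime);
gainNode.gain.linearRampToValueAtTime(1, audioContext.currentTime + 2);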

BiquadFilterNode

Represents a simple low-order filter (a low-pass filter by default) that can be used to change the tone. It is an audio processing module of type AudioNode.

let filterNode = audioContext.createBiquadFilter();

Output audio with an altered tone:

bufferSource.connect(filterNode);
filterNode.connect(audioDestinationNode);

let controlFrequency = function (value) {
  filterNode.frequency.value = value;
};

// set the filter's cutoff frequency to 1000 Hz
controlFrequency(1000);
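The filter type defaults to 'lowpass' but can be changed; a brief sketch:

// switch to a high-pass filter: frequencies below the cutoff are attenuated
filterNode.type = 'highpass';
filterNode.frequency.value = 1000; // cutoff frequency in Hz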

Multiple audio sources

Within one audio context there can be multiple audio processing channels, that is, multiple audio sources playing at the same time. The operations in each channel are independent and do not affect the other channels.
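A sketch with two independent sources (buffer1 and buffer2 are assumed to be two already-decoded buffers):

// two independent channels in the same AudioContext
let source1 = audioContext.createBufferSource();
source1.buffer = buffer1;                   // assumed decoded buffer
source1.connect(audioContext.destination);  // channel 1: source -> output

let source2 = audioContext.createBufferSource();
source2.buffer = buffer2;                   // assumed decoded buffer
let gain2 = audioContext.createGain();
source2.connect(gain2);                     // channel 2: source -> gain -> output
gain2.connect(audioContext.destination);

source1.start(0);
source2.start(0);                           // both play at the same time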

Multiple audio processing modules

An audio source can pass through multiple audio processing modules; their effects are applied in sequence before the audio is output.
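For example, a sketch chaining the filter and gain modules from above on a single source:

// one source processed by two modules in sequence
bufferSource.connect(filterNode);             // source -> filter
filterNode.connect(gainNode);                 // filter -> gain
gainNode.connect(audioContext.destination);   // gain -> output
bufferSource.start(0);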

This is the end of "What is the function of HTML5 Audio API Web Audio". Thank you for reading! If you want to learn more about the industry, you can follow the site, where the editor will keep publishing high-quality practical articles for you!
