Web Audio API guidelines
I have a few questions about what is or isn't allowed when using the Web Audio API in webpages.
According to the guidelines,
Webpages should only create sounds using the files available to ProcessingJS programs or the Web Audio API oscillator.
I assume that, along with OscillatorNodes, GainNodes are also allowed, since they're needed to control volume. However, there are other audio nodes that can be used to create different effects, none of which are mentioned in the guidelines. Additionally, some effects can be created using LFOs (low-frequency oscillators), which are themselves oscillators, but I'm not sure whether that's what the statement in the guidelines intended.
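To be concrete, the kind of LFO technique I have in mind looks roughly like this (the frequencies and gain values are just example numbers I picked for illustration):

// An oscillator plays a tone through a gain node, and a second, slow oscillator
// (the LFO) modulates that gain to create a tremolo effect.
var audioCtx = new AudioContext();

var osc = audioCtx.createOscillator();
osc.frequency.value = 440;

var gain = audioCtx.createGain();
gain.gain.value = 0.5;

// The LFO and a depth control; connecting them to gain.gain modulates the volume.
var lfo = audioCtx.createOscillator();
lfo.frequency.value = 4; // 4 Hz wobble
var lfoDepth = audioCtx.createGain();
lfoDepth.gain.value = 0.3;
lfo.connect(lfoDepth);
lfoDepth.connect(gain.gain);

osc.connect(gain);
gain.connect(audioCtx.destination);
osc.start();
lfo.start();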
The guidelines do state that the audio files available to ProcessingJS programs can be used, but those files can also be modified using the Web Audio API. Is that allowed, or do the audio files have to be played unchanged?
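By "modified" I mean something along these lines, where one of the available files is routed through a filter node before reaching the speakers (the file path here is only a placeholder, not an actual ProcessingJS sound path):

// Decode a sound file and play it through a low-pass filter,
// so the original recording is heard with a changed timbre.
var audioCtx = new AudioContext();
fetch("sounds/example.mp3") // placeholder path for one of the available files
    .then(function(response) { return response.arrayBuffer(); })
    .then(function(data) { return audioCtx.decodeAudioData(data); })
    .then(function(buffer) {
        var source = audioCtx.createBufferSource();
        source.buffer = buffer;

        var filter = audioCtx.createBiquadFilter();
        filter.type = "lowpass";
        filter.frequency.value = 800; // example cutoff frequency

        source.connect(filter);
        filter.connect(audioCtx.destination);
        source.start();
    });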
Furthermore, along with playing sound files and using oscillators, there are other, possibly less well-known ways to make sound with the Web Audio API. For example, audio buffers can be filled with random samples to produce sounds such as white noise, which isn't currently covered by the guidelines.
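The white-noise approach I mean is roughly this (one second of noise, just as an example):

// Fill an AudioBuffer with random samples between -1 and 1 and play it back,
// which produces white noise without any sound file or oscillator.
var audioCtx = new AudioContext();
var length = audioCtx.sampleRate; // one second of audio
var noiseBuffer = audioCtx.createBuffer(1, length, audioCtx.sampleRate);
var channel = noiseBuffer.getChannelData(0);
for (var i = 0; i < length; i++) {
    channel[i] = Math.random() * 2 - 1;
}

var noiseSource = audioCtx.createBufferSource();
noiseSource.buffer = noiseBuffer;
noiseSource.connect(audioCtx.destination);
noiseSource.start();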
Some specific examples of what I'm referring to can be found here:
https://www.khanacademy.org/computer-programming/web-audio-api/5925273191628800
Could the guidelines for using the Web Audio API be made more specific, and possibly expanded to cover cases like these? I just want to make sure what I'm doing is allowed before using these techniques in a project.