Assignment 1
web-based musical instrument
For this assignment, students will design and develop a web-based musical instrument. The instrument should allow users to interact with it and generate sounds in a meaningful way. This should not be a web-based version of an existing instrument (like a guitar or piano) but rather something more experimental which embraces the creative possibilities of the Web. Students should explore different interfaces and consider the use of sensors or controllers to enhance the user experience. The final deliverable will be a functional web-based instrument that can be shared online, along with a brief written reflection on the creative process and technical challenges.
some points to consider...
1. Level of Abstraction: think about which tools you'd like to use to generate sound. We've introduced multiple layers of abstraction; the higher the level, the easier it is to use (but the more pre-defined the sound will be), while the lowest level is the most difficult to use (but the least defined && most malleable). These break down as follows (from high-level to low-level):
- Tone.js "Instruments" (docs / demos) Tone.js provides a list of classes (which it calls "instruments") for generating sounds. Most are created using some pre-built combination of Oscillators (one exception being the Tone.js Sampler: docs / demo) attached to an AmplitudeEnvelope (so that our sounds have that more natural-sounding ADSR envelope when we play them). These come with a variety of different settings we can modify, either by setting them manually (as with the present examples), adjusting them with UI we create (like buttons or input range sliders), or algorithmically (ex: by "modulating" them with an LFO).
- Tone.js "Sources" (docs / demos) Tone.js also provides different types of Oscillators (beyond the basic "sine", "square", "triangle" && "sawtooth") which are made from some combination of nodes && thus have other properties besides frequency (or pitch), volume (or amplitude) && type (or shape). Similarly, the Tone.js Player (docs / demo) and Players (docs) allow us to create instruments from pre-recorded sound files without having to work directly with audio buffers.
- Keep in mind that the Tone.js classes (both the oscillators && higher level "instruments") are really just different combinations of a small set of lower level nodes, namely the basic Oscillator, the basic GainNode && the LFO (with occasional help from some of Tone.js's signal math classes). The big difference between the Tone.js Oscillators and Instruments is that the latter all have an AmplitudeEnvelope plugged in at the end (for that ADSR).
- At the lowest level you can work directly with audio buffers. At the end of the day, all the interfaces/nodes mentioned above are just ways of creating buffers by describing higher-level concepts (like "frequency") rather than thinking mathematically about the actual values representing that signal (or sound wave). You could, of course, create any sound from pure maths, which is what working with buffers directly allows for.
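To make the buffer idea concrete, here's a minimal plain-JavaScript sketch (no Tone.js, no Web Audio API) of what the higher-level classes ultimately produce: an array of sample values. It fills a `Float32Array` with one second of a 440 Hz sine wave and applies a simple hand-rolled attack/release amplitude envelope — a bare-bones version of what an AmplitudeEnvelope does for you. All the names here are illustrative, not library API:

```javascript
const sampleRate = 44100;          // samples per second
const duration = 1;                // seconds
const frequency = 440;             // Hz — "frequency" is just how fast the phase cycles
const buffer = new Float32Array(sampleRate * duration);

const attack = 0.1 * sampleRate;   // first 10% of samples ramp up
const release = 0.1 * sampleRate;  // last 10% of samples ramp down

for (let i = 0; i < buffer.length; i++) {
  const t = i / sampleRate;        // time in seconds
  let amp = 1;
  if (i < attack) amp = i / attack;                                     // fade in
  if (i > buffer.length - release) amp = (buffer.length - i) / release; // fade out
  buffer[i] = amp * Math.sin(2 * Math.PI * frequency * t);
}

console.log(buffer.length); // 44100 samples of "pure maths"
```

To actually hear this in the browser you'd copy the values into an `AudioBuffer` and play it through an `AudioBufferSourceNode` — but the point is that the "sound" itself is nothing more than this list of numbers between -1 and 1.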
2. Timbre: think about what you want your instrument to sound like. On one end of the spectrum we have a simple sine wave; on the other end we have pure noise; there's loads of space in-between. A big part of making your instrument your own will be shaping its timbre (its texture or flavor).
- Besides adjusting the properties built into the Oscillators && Instruments in Tone.js, we can also shape our sound by adding some of Tone.js's Effects classes (docs / demos) to our audio graph (or signal chain). These too have their own properties which can be modified, controlled (with UI) or modulated (with code like LFOs). And don't forget, as with the source nodes themselves, you can create your own types of effects from some of the more "core" or basic types of effect nodes, namely Delay, Reverb and Distortion (most of the other effects are really some combination of those with Oscillators && LFOs).
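As a sketch of what one of those "core" effects actually does to the signal, here's a basic feedback delay written in plain JavaScript over a buffer of samples (again illustrative, not the Tone.js implementation): each output sample is the input plus an attenuated copy of the output from `delaySamples` ago, so a single click turns into a train of quieter echoes.

```javascript
// Feedback delay: out[i] = in[i] + feedback * out[i - delaySamples]
function feedbackDelay(input, delaySamples, feedback) {
  const out = new Float32Array(input.length);
  for (let i = 0; i < input.length; i++) {
    const delayed = i >= delaySamples ? out[i - delaySamples] : 0;
    out[i] = input[i] + feedback * delayed;
  }
  return out;
}

// A single impulse (a click) becomes a series of decaying echoes:
const impulse = new Float32Array(8);
impulse[0] = 1;
const echoed = feedbackDelay(impulse, 2, 0.5);
console.log(Array.from(echoed)); // [1, 0, 0.5, 0, 0.25, 0, 0.125, 0]
```

Because the delayed copy is fed back from the *output*, the echoes repeat forever (halving each time here); a feedback value at or above 1 would make the signal grow instead of decay, which is why effect UIs usually clamp it below 1.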
3. Controls: think about what aspect of the sound generation you want to be able to control && in what way. Some might be obvious (most instruments let you control the frequency, i.e. which note/pitch is played) but some controls might be less obvious && more specific to the design of your instrument. We've gone over some basic UI, how to use the mouse/trackpad and the keyboard as inputs, as well as MIDI controllers and even our video camera (with the help of AI).
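One small but important detail when mapping a control (say, normalized mouse X in the range 0 to 1) to frequency: pitch perception is roughly logarithmic, so an exponential mapping between a low and a high frequency usually feels more "musical" than a linear one. A plain-JavaScript sketch (function name and parameters are illustrative, not a Tone.js API):

```javascript
// Exponential interpolation from lowHz to highHz as x goes 0 → 1.
// x = 0.5 lands on the geometric midpoint (an equal musical interval
// from both ends), not the arithmetic midpoint.
function mouseXToFrequency(x, lowHz = 110, highHz = 880) {
  return lowHz * Math.pow(highHz / lowHz, x);
}

console.log(mouseXToFrequency(0));   // 110  (left edge → low A)
console.log(mouseXToFrequency(1));   // 880  (right edge → high A)
console.log(mouseXToFrequency(0.5)); // ~311 (geometric midpoint, not 495)
```

The same idea applies to volume controls (our ears hear loudness logarithmically too), which is why straight linear sliders often feel like all the action happens at one end.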