Anatomy of sound


Understanding the different components of sound, such as pitch, frequency, amplitude, and timbre, and how they relate to one another.

Frequency: The number of wave cycles per second, measured in hertz (Hz).
Waveform: The shape of a sound wave, which affects its timbre.
Amplitude: The strength of a sound wave, which determines loudness and is commonly expressed in decibels (dB).
Pitch: The perceived frequency of a sound, which determines its musical note.
Envelope: The shape of a sound's volume over time, consisting of attack, decay, sustain, and release (see the sketch after this list).
Harmonics: Frequencies at whole-number multiples of the fundamental that sound along with it and contribute to a sound's timbre.
Filters: Devices that allow certain frequencies through while attenuating others, used for shaping a sound's frequency spectrum.
Synthesis: The process of creating sounds electronically, such as through subtractive, additive, or granular synthesis.
Sampling: The technique of recording and reusing real sounds, commonly used in music production and sound effects.
Effects: Devices or software that modify the sound in various ways, such as reverb, delay, distortion, and EQ.
Spatialization: The placement of sounds in three-dimensional space, achieved through panning and reverb.
Foley: The art of creating everyday sound effects for film and TV, often by performing and recording sounds made with real-world objects in sync with the picture.
Musical instrument anatomy: The physical components of musical instruments, such as strings, keys, and valves, and how they generate sound.
Psychoacoustics: The study of how humans perceive sound, including topics such as loudness, pitch, and masking.
Acoustics: The physics of sound waves and how they interact with the environment, including room acoustics and soundproofing.
Recording techniques: The ways in which sound is captured and recorded, such as microphone placement, gain staging, and mixing.
MIDI: The Musical Instrument Digital Interface, a protocol for digital communication between musical instruments and computers.
DAWs: Digital Audio Workstations, software used for recording, editing, and mixing audio.
Music theory: The principles and rules governing the construction of music, including scales, chords, and harmony.
Sound design history: The evolution and development of sound design in film, video games, and other media.
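
Several of the terms above (frequency, waveform, amplitude, envelope, synthesis) come together in even the simplest electronic tone. The sketch below is a minimal, illustrative Python example, not a reference implementation: it generates a sine waveform at 440 Hz, shapes its amplitude with a piecewise-linear ADSR envelope, and writes the result to a WAV file. The 440 Hz pitch, the envelope times, and the "tone.wav" filename are arbitrary choices for illustration.

```python
# Minimal sketch: sine oscillator + ADSR envelope + amplitude scaling.
import numpy as np
import wave

SAMPLE_RATE = 44100  # samples per second

def adsr_envelope(n_samples, attack=0.05, decay=0.1, sustain_level=0.7, release=0.2):
    """Piecewise-linear ADSR envelope (times in seconds), returned as a gain per sample."""
    a = int(attack * SAMPLE_RATE)
    d = int(decay * SAMPLE_RATE)
    r = int(release * SAMPLE_RATE)
    s = max(n_samples - (a + d + r), 0)                        # samples left for sustain
    env = np.concatenate([
        np.linspace(0.0, 1.0, a, endpoint=False),              # attack: silence -> peak
        np.linspace(1.0, sustain_level, d, endpoint=False),    # decay: peak -> sustain level
        np.full(s, sustain_level),                             # sustain: held level
        np.linspace(sustain_level, 0.0, r),                    # release: sustain -> silence
    ])
    return env[:n_samples]

def sine_tone(freq_hz=440.0, duration_s=1.0, amplitude=0.8):
    """Sine waveform at the given frequency, shaped by the ADSR envelope."""
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    waveform = np.sin(2 * np.pi * freq_hz * t)                 # pure tone: a single frequency
    return amplitude * adsr_envelope(len(t)) * waveform

if __name__ == "__main__":
    samples = sine_tone(440.0, 1.0)                            # one second of concert A
    pcm = (samples * 32767).astype(np.int16)                   # 16-bit PCM
    with wave.open("tone.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(pcm.tobytes())
```

Swapping the sine for another waveform (a sawtooth, say) changes the timbre without changing the pitch, and changing the envelope times changes how the note speaks and dies away.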
Frequency: The frequency of sound is the number of cycles per second that a sound wave goes through, measured in hertz (Hz). It determines the pitch of the sound.
Amplitude: The amplitude of sound is the strength or intensity of the sound wave, measured in decibels (dB). It determines the volume or loudness of the sound.
Duration: The duration of sound is the length of time that a sound lasts, measured in seconds.
Timbre: The timbre of sound is the characteristic quality that distinguishes one sound from another of the same pitch and loudness, such as the tone or color of a voice; it is shaped largely by a sound's harmonic content and envelope.
Envelope: The envelope of sound is the way in which the amplitude of the sound changes over time, typically characterized by four stages: attack, decay, sustain, and release.
Spatialization: The spatialization of sound refers to the process of creating the illusion of sound sources located at different positions and distances in the surrounding environment (see the panning sketch after this list).
Reverberation: Reverberation is the persistence of sound in a space as sound waves reflect off its surfaces, creating a sense of spaciousness; widely spaced, discrete reflections are heard as echoes.
Foley: Foley refers to the process of creating sound effects for films and other media, typically by recording and manipulating everyday sounds.
Synthesis: Synthesis refers to the creation of sounds using electronic or digital means, often used in music production and sound design.
Soundscapes: Soundscapes refer to the environmental sounds that create a sense of place, such as the sound of waves on a beach or the chirping of birds in a forest.
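
Spatialization in its simplest stereo form is panning, as noted in the entry above. The sketch below shows one common approach, equal-power (sine/cosine) panning, which keeps perceived loudness roughly constant as a mono source moves between the left and right channels. The function and variable names are illustrative, not taken from any particular library.

```python
# Minimal sketch: equal-power stereo panning of a mono signal.
import numpy as np

def equal_power_pan(mono, pan):
    """Place a mono signal in the stereo field.

    mono: 1-D array of samples.
    pan:  -1.0 = hard left, 0.0 = centre, +1.0 = hard right.
    Returns an (n_samples, 2) stereo array.
    """
    theta = (pan + 1.0) * np.pi / 4.0        # map [-1, 1] onto [0, pi/2]
    left_gain = np.cos(theta)                # 1 at hard left, 0 at hard right
    right_gain = np.sin(theta)               # 0 at hard left, 1 at hard right
    return np.column_stack([mono * left_gain, mono * right_gain])

if __name__ == "__main__":
    t = np.arange(44100) / 44100.0
    tone = 0.5 * np.sin(2 * np.pi * 220.0 * t)    # one second of a 220 Hz tone
    stereo = equal_power_pan(tone, pan=0.5)       # placed halfway to the right
    print(stereo.shape, stereo[:, 0].max(), stereo[:, 1].max())
```

Full three-dimensional spatialization layers further distance and direction cues on top of this basic placement, such as level, delay, filtering, and reverberation.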
"Sound is a vibration that propagates as an acoustic wave, through a transmission medium such as a gas, liquid or solid."
"Sound is the reception of such waves and their perception by the brain."
"Only acoustic waves that have frequencies lying between about 20 Hz and 20 kHz, the audio frequency range, elicit an auditory percept in humans."
"These represent sound waves with wavelengths of 17 meters (56 ft) to 1.7 centimeters (0.67 in)."
"Sound waves above 20 kHz are known as ultrasound and are not audible to humans."
"Sound waves below 20 Hz are known as infrasound."
"Yes, different animal species have varying hearing ranges."
"Sound is a vibration that propagates as an acoustic wave, through a transmission medium such as a gas, liquid, or solid."
"In air at atmospheric pressure, these represent sound waves with wavelengths of 17 meters (56 ft) to 1.7 centimeters (0.67 in)."
"Only acoustic waves that have frequencies lying between about 20 Hz and 20 kHz, the audio frequency range, elicit an auditory percept in humans."
"No, ultrasound waves are not audible to humans."
"Sound waves below 20 Hz are known as infrasound."
"Sound is the reception of such waves and their perception by the brain."
"Only acoustic waves that have frequencies lying between about 20 Hz and 20 kHz, the audio frequency range, elicit an auditory percept in humans."
"Yes, different animal species have varying hearing ranges."
"Only acoustic waves that have frequencies lying between about 20 Hz and 20 kHz, the audio frequency range, elicit an auditory percept in humans."
"Yes, sound is a vibration that propagates as an acoustic wave."
"Sound is a vibration that propagates as an acoustic wave, through a transmission medium such as a gas, liquid, or solid."
"These represent sound waves with wavelengths of 17 meters (56 ft) to 1.7 centimeters (0.67 in)."
"Sound waves below 20 Hz are known as infrasound."