Technology has always shaped music. As new instruments were invented, new genres came along with them, just as the architectural features of the spaces where music was performed had a profound effect on the harmonies and rhythms classical composers used. Innovations in electronics were quickly adopted by musicians and manufacturers alike, contributing significantly to the art and business of making music.
The reverse is not necessarily true. Sure, there is a music technology industry that develops tools specifically to make music, but the real technological revolutions are almost always propelled by fields much larger than just music.
Vacuum tubes and solid-state circuitry first appeared in telecommunications, the digital audio revolution was just a small facet of computers changing all of civilization, and Auto-Tune was developed by an oil-drilling engineer.
When acoustics met electronics
By the time big mixing consoles were the norm, a major practical limitation in post-production was that any change to the mix took a very long time and several Polaroid pictures to get every knob and fader set just right again. The first recall system was a game-changer for this reason, even though it neither processed nor generated sound itself.
Another workflow issue was that changing any parameter over time (like riding a vocal, meaning moving the fader to make sections and even syllables louder and quieter) required keeping your hands on the control throughout the song as it was recorded from the mixing desk. In the case of a full band and complex arrangements, a whole team played the console like an instrument. This too became much less labor-intensive when automation came about: fader movements could be stored and replayed, so a single engineer could handle the mix and increasingly focus on the creative aspects.
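Conceptually, stored automation boils down to a list of fader positions over time that gets replayed against the audio. As a minimal illustrative sketch (not any actual console's implementation; the function names and breakpoint format here are hypothetical), a fader ride can be stored as (time, gain-in-dB) breakpoints and interpolated for every sample:

```python
def db_to_linear(db):
    # fader positions are usually thought of in decibels
    return 10 ** (db / 20)

def automate_gain(samples, breakpoints, sample_rate):
    """Replay a stored fader ride over a list of samples.

    breakpoints: list of (time_seconds, gain_dB), sorted by time.
    """
    out = []
    for n, x in enumerate(samples):
        t = n / sample_rate
        if t <= breakpoints[0][0]:
            db = breakpoints[0][1]          # before the first move
        elif t >= breakpoints[-1][0]:
            db = breakpoints[-1][1]         # after the last move
        else:
            # find the segment this sample falls in; interpolate linearly
            for (t0, g0), (t1, g1) in zip(breakpoints, breakpoints[1:]):
                if t0 <= t <= t1:
                    frac = (t - t0) / (t1 - t0)
                    db = g0 + frac * (g1 - g0)
                    break
        out.append(x * db_to_linear(db))
    return out

# ride the fader from 0 dB down to -6 dB over one second
ride = [(0.0, 0.0), (1.0, -6.0)]
quiet = automate_gain([1.0, 1.0, 1.0, 1.0], ride, sample_rate=4)
```

Because the ride is data rather than a live performance, it can be edited, copied, and replayed exactly, which is what freed the engineer's hands.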
Digital love
Audio, like almost every other field, was completely transformed when computers took the world by storm. The introduction of CDs on the consumer side and digital audio tapes (digital signals recorded to traditional magnetic tape) on the production side led to a further increase in dynamic range and spectral bandwidth. The real revolution came with hard disk recording, where splicing tape with literal razor blades gave way to digital editing. The steady increase in CPU power and memory allowed for ever more simultaneous tracks, efficient non-destructive editing, cheap and virtually limitless takes, and real-time processing in the form of software plugins. Soon enough, a general-purpose desktop or laptop computer was almost all anyone needed to record or synthesize music, a stark contrast to the earlier mixing consoles going for the price of a nice house. Throw in a good interface and a versatile microphone and you’re good to go.
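The non-destructive part is worth unpacking: instead of cutting the recording itself (the digital equivalent of razor-blade splicing), an editor keeps a list of regions that point into the untouched source, and playback is assembled from that list. A hypothetical, heavily simplified sketch of the idea (real editors use far richer edit decision lists):

```python
# stand-in for a recorded take: 10 samples we never modify
source = list(range(10))

# edit list: (start, end) slices of the source, in playback order
edits = [(0, 3), (7, 10), (3, 5)]

def render(source, edits):
    """Assemble the edited playback without touching the source."""
    out = []
    for start, end in edits:
        out.extend(source[start:end])
    return out

mix = render(source, edits)
# the original take is intact, so rearranging or undoing edits is free
```

Undo, alternate takes, and limitless experimentation all fall out of this design: only the lightweight edit list changes, never the recording.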
Interestingly, while the underlying technology is completely different, digital recording still follows the workflow, architecture, and even the look of the analog days to a large extent. This is probably due to the many gradual innovations, and an industry that has a soft spot for nostalgia. You’re unlikely to find a telecom engineer promoting manual switchboards, but in music there are definitely those who swear by analog tape and processors, if not the digital emulations that come ever closer to the originals.
And yet, some software (and hardware) is breaking away from the traditional paradigms, leveraging the unique possibilities of digital interfaces and state-of-the-art algorithms.
As always, opponents of democratization will point to the mass of bad content being produced now that the barrier to entry is so low. While it may be hard to change their minds, few will dispute that the sound quality and efficiency of music production have improved by several orders of magnitude in the last century alone. The result is not only an internet awash in mediocre material, but also an abundance of good content, more than you could ever consume!
Beyond recording
Aside from recording technology, new musical instruments such as the electric guitar, the synthesizer and the drum machine changed music too, though rarely in ways imagined at their conception. The fact that they were often underwhelming at the job they were marketed to do – provide a cheap, lightweight, quiet/loud, or versatile alternative to their acoustic equivalent – made them all the more legendary. Indeed, the sound of these electric and electronic instruments was rarely mistaken for that of the instruments they were supposed to mimic, but popular music history would be unrecognizable without them.
On the consumption side of music, the development of sound carriers also influenced the music that was created. The limited duration of early media gave rise to shorter songs; long-play formats inspired concept albums; and lower noise floors meant a higher dynamic range could be used. More strikingly, the advent of stereo sound opened up a new range of sonic possibilities, soon moving far beyond the left-right gimmick to add space, depth, and intelligibility to the sources. With streaming, the business of recorded music has shifted radically once again: music as a commodity you can physically buy is quickly becoming an obsolete model. The effects on artists’ priorities and on music as a whole are only starting to show.
Future music
If history has taught us anything, it’s that things will continue to change. Incremental improvements are still being made every day, for instance in affordable yet professional-grade interfaces, and in increasingly realistic emulations of everything from microphones to analog effects to instruments. Even computer performance is still going up, resulting in ever higher track and plugin counts. But more fundamental shifts are happening, too: look at the increasingly widespread adoption of immersive formats for music. Be part of the future, and use whatever technology you have available to make something new today!