
Use your ears – How intuitive is music production today?

Estimated reading time: 12 minutes

Video killed the Radio Star

MTV was just the beginning. It seems that video has gained total dominance over good old audio. We know from communication science that pictures are the most direct form of communication. But does this mean that visual communication is generally superior? What influence does it have on the way we consume and produce music? Is seeing really more important than hearing?

The coronavirus pandemic has changed our world permanently. Since its beginning, video traffic in Germany has quadrupled. Instead of picking up the phone, people prefer to make Zoom calls. This has a clear impact on our communication structures. But as with any mass phenomenon, there is also a countermovement. It manifests itself in the form of the good old record player.

For the first time, more vinyl records than CDs were sold in Germany in 2021. This decelerated form of music consumption is completely at odds with the prevailing zeitgeist. The desire to hold your favorite music in your hands as a vinyl record is stronger than ever. The fact that we experience music from a record player exclusively through hearing is so archaic that it seems out of step with the times.


At the same time, enjoying music with the help of a record player corresponds completely, phylogenetically speaking, to human nature. In the following, we will clarify why this is so. We learn from the past for the future, and this also applies to producing music. The goal of a successful music production should be to move the audience; music is not an end in itself. To understand this, we only need to look at the origins of music.

The origin of music

Germany is an old cultural nation. A very old one, to be precise. This is shown by archaeological finds from excavations in a cave in the Swabian Alb. Researchers found flutes made of bone and ivory there that are believed to be up to 50,000 years old. The flute makers even carved finger holes that allowed the pitch to be manipulated. Experts estimate that humanity has been expressing itself rhythmically and melodically for a very long time. These non-linguistic acoustic events are believed to have served primarily social purposes. Music enabled emotional vocal expression and established itself as a second communication system parallel to language. Much of the emotional dimension of music-making has survived to this day, such as the so-called “chill effect”.

This occurs when music gives you goosebumps. The goosebumps are the physical reaction to a chill-effect moment. The chill effect also stimulates the brain’s reward system and releases feel-good hormones. This happens when the music provides special moments for the listener, and these moments are often very subjective. Yet this is precisely what listeners get out of consuming music. Emotionality is the currency of music. For this reason, every child should have the opportunity to learn a musical instrument. Along with language, music is a profoundly human means of expression. Music teaches children to experience emotions and to express their own feelings. It is an alternative means of expression when language fails. And it is the desire for emotionality that makes us reach for the vinyl record as our preferred medium in special moments.

Then and now

The vinyl record is preserved music. The flutists of the Swabian Alb could only ever perform their music in the here and now. No recording, no playback: handmade music for the moment. That is what making music meant for most of human history. With the digital revolution, music-making changed radically. In addition to traditional instruments, keyboards, drum machines, samplers, and sequencers arrived in the 80s. The linearity of music-making was broken. Music no longer had to be played simultaneously. Instead, a single musician could gradually record a wide variety of instruments and was no longer dependent on fellow musicians. As a result, several new musical styles emerged side by side within a short time, a trademark of the 80s.

The Nineties

In the 90s, the triumph of digital recording and sampling technology continued. Real sounds were replaced by samplers and romplers, which in turn faced competition from MIDI programming. With MIDI sequencers, screens and monitors increasingly entered recording studios, and music became visible for the first time. The arrangement could now be heard and seen simultaneously. The 2000s were the era of the comprehensive visualization of music production. Drums, guitars, basses, and synths: everything became available as a VST instrument and has since been virtually at home inside our monitors.

At the same time, the DAW replaced the hard disk recorders that had been common until then. The waveform display in a DAW is the most comprehensive visual representation of music to date and allows precise intervention in the audio material. For many users, the DAW has become a universal production tool, providing theoretically infinite resources in terms of mix channels, effects, EQs, and dynamics tools. In recent years, the familiar personnel structure has also changed: it is no longer the band but the producer who creates the music. Almost everything takes place on the computer.

This paradigm shift gave rise to new music genres, particularly in the electronic field (trap, dubstep, EDM). It is not uncommon for these productions to use no audio hardware or real instruments at all.

Burnout from Wellness Holidays

A computer with multiple monitors is the most important production tool for many creatives. The advantages are obvious: cost-effective, an unlimited number of tracks, lossless recordings, complex arrangements, an unlimited number of VST instruments and plug-ins. Everything can be automated and saved; total recall is standard. If you get stuck at any point in the production, YouTube offers suitable tutorials on almost any audio topic. Painting by numbers. Music from the vending machine. Predefined ingredients guarantee a predictable result without much headache.

Stone Age

Our Swabian flutists would be surprised. Music only visual? No more hardware needed? No need to play by hand? The Neanderthal hidden in our brain stem subconsciously resists. The eye replaces the ear? Somehow, something feels wrong. In fact, this way of producing contradicts the natural prioritization of the human senses. The Stone Age flute player could usually hear dangers before he could see them. Thanks to our ears, we can locate with amazing accuracy the direction from which a saber-toothed tiger is approaching.

Evolution had its reasons for making hearing the only sense that cannot be completely shut off. You can hold your nose or close your eyes, but even with fingers in your ears, a human being perceives the approaching mammoth. The dull vibrations trigger a sensation of fear. This was and is essential for survival. Sounds are always associated with emotions. According to Carl Gustav Jung (1875–1961), the human psyche carries collective memories in the subconscious. He called these archetypes.


Sounds such as thunder, wind, or water generate immediate emotions in us. Conversely, emotions such as joy or sadness are best expressed with music. In this context, hearing is eminently important. Hands and ears are the most important tools of the classical musician, which is why quite a few blind musicians play at the highest level. Those who rely exclusively on the computer for music production are depriving themselves of one of their best tools. Music production with keyboard and mouse is rarely more than sober data processing with an artificial candy coating. Operating a DAW with the mouse demands constant control by our eyes; there is no tactile feedback. In the long run, this is tiring and does not come without collateral damage. Intuition is usually the first casualty.

Seeing instead of listening?

The visualization of music is not problematic in itself. Quite the opposite, in fact: sometimes it is extremely helpful. Capturing complex song structures or precisely editing audio files is a blessing with adequate visualization. But when it comes to the core competence of music production, the balance is far more ambivalent. Adjusting an EQ, a compressor, an effect, or even volume relationships exclusively with monitor and mouse is ergonomically questionable. It is like trying to saw through a wooden board with a wood planer: simply an unfortunate choice of tool.

Another aspect also has a direct impact on our mix.

The visual representation of the EQ curve in a DAW or digital mixer has a lasting effect on how we process signals with the EQ. Depending on the resolution of the display, we use the filters sometimes more and sometimes less drastically. If the visual representation creates a massive EQ hump on the screen, our brain inevitably questions the EQ decision. Experience has shown that with an analog EQ without a graphical display, these doubts are much less pronounced.

The reason: the reference of an analog EQ is the ear, not the eye. If a guitar needs a wide boost at 1.2 kHz to assert itself in the mix, we are more likely to make drastic corrections with an analog EQ than with a DAW EQ whose visualization piles up a massive hump on the monitor screen. Successful producers and mixers sometimes work with drastic EQ settings without giving it much thought. Inexperienced users who reach for an equalizer with a visual curve display too often search for suitable settings with their eyes instead of their ears. This often leads to wrong decisions.
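For the technically curious, the gap between what the screen shows and what the ear hears can be put into numbers. The following minimal sketch assumes the widely used “Audio EQ Cookbook” biquad formulas by Robert Bristow-Johnson; the 1.2 kHz / +6 dB / Q = 0.7 settings are purely illustrative, not a recommendation. It computes the magnitude response of a peaking (bell) EQ: the hump that can fill an entire plugin window peaks at just +6 dB at the center frequency and falls away toward 0 dB on either side.

```python
import math

def peaking_eq_response(f, f0=1200.0, gain_db=6.0, q=0.7, fs=48000.0):
    """dB magnitude of an RBJ-cookbook peaking-EQ biquad at frequency f (Hz)."""
    a_lin = 10 ** (gain_db / 40)              # square root of the linear gain
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    # Biquad coefficients (Audio EQ Cookbook, "peakingEQ" case)
    b = [1 + alpha * a_lin, -2 * math.cos(w0), 1 - alpha * a_lin]
    a = [1 + alpha / a_lin, -2 * math.cos(w0), 1 - alpha / a_lin]
    # Evaluate H(z) on the unit circle at z = e^{jw}
    w = 2 * math.pi * f / fs
    z_inv = complex(math.cos(w), -math.sin(w))
    num = b[0] + b[1] * z_inv + b[2] * z_inv * z_inv
    den = a[0] + a[1] * z_inv + a[2] * z_inv * z_inv
    return 20 * math.log10(abs(num / den))

for f in (300, 600, 1200, 2400, 4800):
    print(f"{f:5d} Hz: {peaking_eq_response(f):+.1f} dB")
```

However dramatic the bell looks on a stretched dB axis, the numbers stay modest: exactly +6 dB at 1.2 kHz, progressively less above and below, and 0 dB at the spectrum edges. Which is precisely why the ear, not the plotted curve, should have the final say.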

Embrace the chaos 

When asked what is most lacking in current music productions, the answer is intuition, interaction, and improvisation. When interacting with other musicians, we are forced to make spontaneous decisions and sometimes make modifications to chords, progressions, tempos, and melodies. Improvisation leads to new ideas or even a song framework, the DNA of which can be traced back to the sense of hearing and touch.

Touch and Feel

The sense of touch, in combination with a real instrument, offers unfiltered access to the subconscious. Or, loosely following Carl Gustav Jung, to the primal images: the archetypes. Keyboard and mouse do not have this direct connection. To interact musically with VST instruments and plugins, we therefore need new user interfaces that serve our desire for a haptic and tactile experience. Precisely here, a lot has happened in recent years. The number of DAW and plug-in controllers is steadily increasing, forming a countermovement to keyboard and mouse.

Feeling potentiometer positions allows operation without consciously looking, like a car radio. For this very reason, the Federal Motor Transport Authority considers the predominantly touchscreen-based operation of a modern electric car problematic. The fact is: with this operating concept, the driver’s attention shifts from the road to the touchscreen more often than in conventional automobiles with hardware buttons and switches. The wrong tool for the job? The parallels are striking. A good drummer plays a song in a few takes, yet some producers prefer to program the drums, even if it takes significantly longer. Especially if they then want to bring something like feel and groove to the rigidly quantized drum tracks.

The same goes for automation curves for synth sounds, for example, the cutoff of a TB-303. Riding them in by hand is faster than programming them, and the result is organic every time. It is no accident that experienced sound engineers see their old SSL or Neve console as an instrument, and in the literal sense. Intuitive interventions in the mix with pots and faders put the focus on the ear and deliver original results in real time.

Maximum reduction as a recipe for success

In the analog days, you could only afford a limited number of instruments and pro audio devices. Purchasing decisions were made more consciously, and the limited equipment available was used to its full potential. Today it is easy to flood the plugin slots of your DAW with countless plugins on a small budget. But one fact is often overlooked: the reduction to carefully selected instruments is very often style-defining. Many musicians develop a clear musical fingerprint precisely through their limited choice of instruments.

Concentrating on a few consciously selected tools defines a signature sound, which in the best case becomes an acoustic trademark. This is true for musicians as well as for sound engineers and producers. Would Andy Wallace deliver the same mixes if he swapped his favorite tool (an SSL 4000 G+) for a plugin bundle complete with DAW? It is no coincidence that plugin manufacturers try to port the essence of successful producers and sound engineers to the plugin level. Plugins are supposed to capture the sound of Chris Lord-Alge, Al Schmitt, or Bob Clearmountain.

An understandable approach, albeit with the curious aftertaste that these very gentlemen are not exactly known for preferring plugins. Another curiosity is the revival of popular hardware classics as plugin emulations. A respectable GUI is supposed to convey a value comparable to that of the hardware, even though it is solely the code that determines the sound of the plugin. Another example of how visualization influences the choice of audio tools.

Just switch off

Don’t get me wrong: good music can also be produced with mouse and keyboard. But there are good reasons to question this way of working. We are not preaching an audio engineering gospel. We just want to offer an alternative to visualized audio production and shift the focus from the eye back to the ear. That music often fades into the background noise of the zeitgeist is something we will hardly be able to reverse.

But maybe it helps to remember the archetypes of music. Listening to music instead of seeing it and, in the literal sense, taking a hands-on approach again. Using real instruments, interacting with other musicians, using pro audio hardware that allows tactile feedback.

Limiting yourself to a few deliberately selected instruments, analog audio hardware, and plug-ins with hardware controller connectivity: this intuitive workflow can help break through familiar structures and ultimately create something new that touches the listener. Ideally, this is how we find our way back to the very essence of music: emotion!

Finally, one last tip: “Just switch it off!” Namely, the DAW monitor. Listen through the song instead of watching it. No plugin windows, no meter displays, no waveform display – listen to the song without any visualization. Like a record, because unlike MTV, it has a future.

Yours, Ruben