Visual music & concert visuals

In a previous blog post I discussed a piece of mine that combines visuals and music. To make this piece I researched how music influences our perception of visuals, and also how visuals influence our perception of music. This area seems somewhat overlooked: in film, for instance, sound and music are usually created after, and based on, the visuals. However, there are also fields where the visuals are created after and in reaction to the sound or music, such as visual music and concert visuals.

Here I’ll give you some quick information on this topic, and show you some of the works I found really interesting.

You have probably noticed that what we see influences what we hear, and vice versa. The McGurk effect shows this very clearly: you watch someone saying the syllable ‘gah’ while the video is dubbed with the syllable ‘bah’. Because the brain uses both vision and sound to infer what it hears, it combines the two cues and we end up hearing something in between ‘gah’ and ‘bah’, namely ‘dah’.

Music and sound can also strongly influence the way we perceive a story. A famous example of this is the ‘horror Mary Poppins’ video, in which horror music has been set to the trailer of Mary Poppins, completely changing the atmosphere and the story.

One area where the music comes first and the visuals are created second, based on that music, is the field of visual music. Visual music artists try to emulate musical structures with visuals in a variety of ways; Oskar Fischinger was one of the pioneers of this field.

https://www.youtube.com/watch?v=they7m6YePo Oskar Fischinger – An Optical Poem

These films were made by hand because of the limited technological possibilities of the time. Many newer visual music pieces, however, make use of intricate software to analyse sound and generate visuals from it. For his piece Patah, Diego Garro used computer-generated animations to represent the spectro-morphologies of the sound visually, and the result is quite impressive:
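I don’t know which tools Garro used exactly, but to give a rough idea of what this kind of sound analysis can look like, here is a minimal Python sketch (my own illustration, not his software) that steps through a sound frame by frame and extracts two simple descriptors, loudness and spectral centroid, which an animation could then be driven by:

```python
import numpy as np

def frame_features(samples, sr, frame_size=2048, hop=512):
    """Analyse a mono signal frame by frame and return, for each frame,
    its loudness (RMS) and spectral centroid (the 'brightness' of the
    sound) -- two simple descriptors an animation could follow."""
    window = np.hanning(frame_size)
    freqs = np.fft.rfftfreq(frame_size, d=1.0 / sr)
    features = []
    for start in range(0, len(samples) - frame_size, hop):
        frame = samples[start:start + frame_size] * window
        spectrum = np.abs(np.fft.rfft(frame))
        rms = np.sqrt(np.mean(frame ** 2))
        centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
        features.append((start / sr, rms, centroid))
    return features

if __name__ == "__main__":
    # Synthetic test signal (a rising tone plus a little noise),
    # standing in for an actual recording loaded from disk.
    sr = 44100
    t = np.linspace(0.0, 4.0, 4 * sr, endpoint=False)
    sound = 0.5 * np.sin(2 * np.pi * (220 + 200 * t) * t) + 0.02 * np.random.randn(t.size)

    for time, rms, centroid in frame_features(sound, sr)[::40]:
        # In an actual piece these numbers would drive visual parameters,
        # e.g. loudness -> size of a shape, centroid -> its colour.
        print(f"t={time:5.2f}s  loudness={rms:.3f}  centroid={centroid:7.1f} Hz")
```

Real visual music tools extract far richer spectro-morphological information than this, but the principle stays the same: turn a time-frequency analysis into a stream of numbers that control the visuals.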

Another field where visuals are based on music is that of concert visuals. Here the visuals are often generated live, either by a VJ (visual jockey) or by responsive software that analyses the music and creates visuals in real time. One example, which is already a few years old but still amazes me, is Amon Tobin’s use of 3D projection mapping for his ISAM show.
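Such audio-reactive software comes in many flavours, but the core loop is usually simple: for every small block of incoming audio, measure something (overall level, bass energy, an onset) and map it, with a bit of smoothing, onto a visual parameter. Below is a minimal, simulated sketch of that idea in Python; the pulsing test signal and the mapping are my own assumptions, not a description of any particular tool:

```python
import numpy as np

def bass_energy(block, sr, cutoff=150.0):
    """Energy in the low end of one audio block (a crude 'kick detector')."""
    spectrum = np.abs(np.fft.rfft(block))
    freqs = np.fft.rfftfreq(len(block), d=1.0 / sr)
    return float(np.sum(spectrum[freqs < cutoff]))

def reactive_visuals(blocks, sr, attack=0.5, release=0.05):
    """Map the bass energy of each incoming block to a smoothed 0..1
    value that could drive the size or brightness of a projected shape."""
    level = 0.0
    peak = 1e-9
    for block in blocks:
        energy = bass_energy(block, sr)
        peak = max(peak, energy)                  # running normalisation
        target = energy / peak
        coeff = attack if target > level else release
        level += coeff * (target - level)         # simple envelope follower
        yield level

if __name__ == "__main__":
    # Simulated input: 512-sample blocks of a pulsing 60 Hz tone,
    # standing in for blocks captured live from a sound card.
    sr, block_size = 44100, 512
    t = np.linspace(0.0, 2.0, 2 * sr, endpoint=False)
    audio = np.sin(2 * np.pi * 60 * t) * (np.sin(2 * np.pi * 2 * t) > 0)
    blocks = audio[: len(audio) // block_size * block_size].reshape(-1, block_size)

    for i, level in enumerate(reactive_visuals(blocks, sr)):
        if i % 8 == 0:
            print(f"block {i:3d}: visual intensity {'#' * int(level * 40)}")
```

The attack/release smoothing is what keeps the visuals from flickering on every block while still letting them snap to the beat.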

For his recent shows, Flying Lotus has made use of see-through mesh screens onto which visuals can be projected, using three of these ‘layers’ to create impressive visuals with real depth.

A Dutch VJ, Tarik Barri, decided to build his own software to mix visuals and music in a more intuitive way. In his software, called Versum, musical objects representing tones and sounds are placed in a 3D space. The closer you get to an object, the louder its sound becomes, so the way you travel through the 3D space determines how the music sounds. I think this is a really cool example of integrating music and visuals, and one that could also be used in other settings such as interactive installations and virtual reality environments. Here you can see a demo video of his software:
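I don’t know how Versum is implemented internally, but the basic principle Barri describes, that sounds placed in a 3D space get louder as you approach them, can be sketched in a few lines: each sound object has a position, and its gain in the mix falls off with the distance to the listener. The scene and the attenuation curve below are purely illustrative:

```python
import numpy as np

class SoundObject:
    """A sound source placed somewhere in a 3D 'composition space'."""
    def __init__(self, name, position, base_gain=1.0):
        self.name = name
        self.position = np.asarray(position, dtype=float)
        self.base_gain = base_gain

    def gain_at(self, listener_pos, ref_distance=1.0):
        """Inverse-distance attenuation: the closer the listener flies
        to this object, the louder it is in the mix."""
        distance = np.linalg.norm(self.position - listener_pos)
        return self.base_gain * ref_distance / max(distance, ref_distance)

if __name__ == "__main__":
    # A tiny 'scene': three sound objects and a straight flight path.
    scene = [
        SoundObject("drone", (0.0, 0.0, 0.0)),
        SoundObject("melody", (10.0, 0.0, 0.0)),
        SoundObject("percussion", (5.0, 8.0, 0.0)),
    ]
    # The listener flies from the drone towards the melody object;
    # the changing gains are effectively an automated mix of the piece.
    for step in range(6):
        listener = np.array([2.0 * step, 0.0, 0.0])
        mix = {obj.name: round(obj.gain_at(listener), 2) for obj in scene}
        print(f"listener at x={listener[0]:4.1f}: {mix}")
```

Moving the listener along a path then effectively plays an automated mix of the piece, which is what makes navigating such a space feel like performing the music.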