Apple Vision Pro is the latest mixed reality headset to hit the market, and it's been making some serious waves. Apple already supported 360-degree spatial audio on its other devices, but the Vision Pro is the first opportunity for app developers to combine that capability with immersive visual experiences.
Within days of Apple Vision Pro's release, the DJ software company Algoriddim announced that it would support spatial music making in Djay. A second app, Animoog Galaxy by the modular synth company Moog, has also been released and is making the rounds through music producer circles.
In this article we'll define mixed reality music and survey a variety of programs available across several devices for those who want to explore.
What is mixed reality music?
Mixed reality music is a broad label for any audio-reactive technology that blends the physical and digital worlds in a single experience. These experiences range from live performances by celebrity holograms to mixed reality DAWs that take your audio workflow off-screen.
Major labels started experimenting with augmented reality music videos and live holograms as early as 2012. Rihanna famously collaborated with Doritos to bring her track Who's That Chick? into an augmented reality format. Fans used their webcams to scan an AR marker located on the bag of chips, triggering a custom performance of the song that changed depending on how the bag was held and moved.
2012 was also the year that deceased rapper Tupac made an appearance with Snoop Dogg at Coachella, marking one of the first ever major debuts of a 3D hologram live on stage. Mixed reality has remained a popular attraction at music festivals ever since.
In the ten years that have followed, consumer access to mixed reality technology has continued to advance. Hardware solutions like Microsoft HoloLens remain cost prohibitive for most, ranging from $3,500 to $5,000, but that hasn't stopped software companies from jumping in to contribute to the app ecosystem.
A team of Japanese software developers created and promoted a mixed reality piano teacher called Teomirn in 2017. You can watch a clip of that experience below:
A separate group of students from Carnegie Mellon University released their own AR piano experience called Music Everywhere in 2017. It offered a different take on piano lessons, seeking to create a colorful gamified experience with tiny musician holograms that sit atop the player's electric keyboard.
Six years have passed since these initial efforts to enhance the piano playing experience, and many new developments have taken place since then. It was only one week ago that the first major mixed reality DAW appeared.
Apple Vision Pro music making workflows
The tutorial above from L Dre offers a deep dive into the pros and cons of using the Apple Vision Pro for music production. His workflow centers around the option to "spatialize" a computer monitor view within the headset. Rather than relying on an app built specifically for the AVP, this technique works with any DAW, MIDI controller, and so forth.
Once Ableton is open, L Dre continues to use the trackpad from his laptop and an OP-1 from Teenage Engineering to trigger his samples. Near the end of the video, he shares a quick demo of the Algoriddim Djay app, shown below.
The Vision Pro quickly ran low on battery. It holds a two-hour charge, but most beat makers will spend more than that during a single session. The solution was simply to plug in the battery and continue working. The charging cable is long enough that he was able to remain at his desk without restricting his movement.
Musicians who are used to working with dual monitors may be frustrated to learn that the AVP supports only one screen view at a time. However, as L Dre points out, you can make that screen as large as you'd like, collapsing your DAW into narrower views and bringing in other standalone applications in a "bento"-style layout.
It's apparently uncomfortable to wear headphones over the Vision Pro headset, so if you're not used to mixing with monitors, you may need to get earbuds instead. That being said, L Dre said that he found the headset comfortable on its own and that he enjoyed using it more than the Meta Quest headsets.
Animoog Galaxy: A synthesizer for the Vision Pro
If the Djay experience doesn't align with your existing music creation workflows, this one probably will. Animoog Galaxy is a spatial synthesizer with a large set of sound design parameters and audio visualization animations. Its sci-fi aesthetic matches the futuristic nature of the Vision Pro perfectly and proves that Moog is determined to keep evolving its brand in step with the latest tech.
Animoog Galaxy includes an intelligent step sequencer with smart randomization that helps create melodic material on the fly. It can record loops and build layers around any sound. The "pinch and drag" gestural keyboard lets you configure scales, glide between notes, and perform pitch correction. Under the hood is what Moog calls the Anisotropic Synth Engine (ASE), which detects spatial positioning and creates new, expressive timbres in response to movement.
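To make the idea of spatial control a little more concrete, here is a minimal sketch of how a gestural position could be mapped onto synthesis parameters. It is purely illustrative: it is not Moog's ASE or any Animoog Galaxy API, and the scale, frequency ranges, and parameter names are invented for the example.

```python
import math
from dataclasses import dataclass

# Conceptual sketch only: maps a normalized pinch-and-drag position in 3D
# space onto a few hypothetical synthesis parameters.

@dataclass
class SynthParams:
    pitch_hz: float       # fundamental frequency of the note
    cutoff_hz: float      # filter cutoff
    timbre_blend: float   # 0.0 = wavetable A, 1.0 = wavetable B

# A hypothetical scale to quantize pitch to (minor pentatonic degrees).
SCALE_SEMITONES = [0, 3, 5, 7, 10]

def position_to_params(x: float, y: float, z: float) -> SynthParams:
    """Map a normalized hand position (each axis in 0..1) to synth parameters."""
    # Horizontal position picks a scale degree across two octaves.
    degree_index = int(round(x * (len(SCALE_SEMITONES) * 2 - 1)))
    octave, step = divmod(degree_index, len(SCALE_SEMITONES))
    semitones = octave * 12 + SCALE_SEMITONES[step]
    pitch_hz = 220.0 * math.pow(2.0, semitones / 12.0)

    # Height opens the filter; depth crossfades between two timbres.
    cutoff_hz = 200.0 + y * 8_000.0
    return SynthParams(pitch_hz=pitch_hz, cutoff_hz=cutoff_hz, timbre_blend=z)

print(position_to_params(0.5, 0.8, 0.25))
```

The takeaway is simply that every axis of movement becomes a continuous controller, which is what makes a gestural instrument feel expressive compared to tapping a flat screen.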
Spatial Labs presents Light Field, an AR DAW
Digital music creatives and producers are notorious for putting in long hours at the studio. Ear fatigue is a well-documented audio editing problem that can impact your performance as you try to perfect a mix.
It's far less common to hear about eye fatigue among music producers, despite the risks that screen time may pose. Secondary health problems like inhibited melatonin production are also associated with blue light exposure during long computer sessions.
Spatial Labs recently announced their efforts to create a viable alternative to making music on LCD screens, potentially reducing the strain on your eyes as you work on a track. Their mixed reality music software, called Light Field, has not been released commercially yet.
The impressive promo video above dropped just one week ago, without much clarity regarding the release timeline. This lack of transparency has attracted skepticism on internet forums, where one user speculated that the company could be hyping a product that doesn't exist in order to attract investors.
It's too early to know whether that's the case, but we'll give Spatial Labs the benefit of the doubt. Comparable technology already exists, like the niche of laser projection keyboards that replace a physical QWERTY keyboard with a flat projection of keys onto your desk.
So despite the impressive set of features that Light Field claims to offer, there's no reason to jump to conclusions about fakery. It's reasonable to believe that their team has in fact made headway on this tool, even if it's not publicly available yet.
Lucas Rizzotto's mixed reality synthesizer
Spatial Labs aren't the only ones cooking up new mixed reality music experiences. Futurist Lucas Rizzotto has been breaking new ground year after year, from his AI Beatles covers (back when AI music was "still underground") to current experiments with holographic synthesizers.
A video published recently on LinkedIn showed Lucas manipulating a granular synthesizer using a mixed reality hologram. He explained to one of his fans that each particle in the block represents a variation of a sample: the system detects the location of your hand relative to the holographic cube and activates the particles as they are touched. Check that out below:
The Y and Z dimensions of the cube change pitch and other attributes in the synthesizer. Lucas designed scripts to record the movement of his hands as he performs live so that he can play back the performance in Unity. From there, he places the video on Unity's Timeline system, lines up the virtual elements with his camera footage, and replays the hand data from his recorded performance.
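For readers curious how such a cube might work under the hood, here is a rough sketch of the two ideas described above: a hand position inside the cube selects a grain of the sample and sets its pitch, and every touch is time-stamped so the performance can be replayed later. This is an illustrative assumption on our part, not Lucas's actual Unity implementation, and all names and mappings are invented.

```python
import time
from dataclasses import dataclass, field

GRID = 8  # imagine the cube as an 8 x 8 x 8 lattice of particles

@dataclass
class GrainEvent:
    timestamp: float      # seconds since the performance started
    grain_index: int      # which slice of the sample to trigger
    pitch_ratio: float    # playback-rate multiplier (1.0 = original pitch)
    intensity: float      # e.g. grain volume

@dataclass
class GrainCube:
    start_time: float = field(default_factory=time.monotonic)
    recording: list[GrainEvent] = field(default_factory=list)

    def touch(self, x: float, y: float, z: float) -> GrainEvent:
        """Convert a normalized hand position (0..1 per axis) into a grain event."""
        # X picks a particle column, i.e. which variation of the sample to play.
        grain_index = min(int(x * GRID), GRID - 1)
        # Y shifts pitch across +/- one octave; Z controls intensity.
        pitch_ratio = 2.0 ** (y * 2.0 - 1.0)
        event = GrainEvent(
            timestamp=time.monotonic() - self.start_time,
            grain_index=grain_index,
            pitch_ratio=pitch_ratio,
            intensity=z,
        )
        self.recording.append(event)  # stored so the take can be replayed later
        return event

cube = GrainCube()
print(cube.touch(0.9, 0.75, 0.4))
```

Because every event carries a timestamp, the same list of events could later be stepped through in sync with camera footage, which is essentially what replaying recorded hand data on a timeline amounts to.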
Lucas has also designed an interactive musical body suit that acts as a kind of MIDI controller. He's not the first to attempt this - artists like Red Letter J have built physical, interactive audio outfits like the one below.
The key innovation we're witnessing today is the option to strip away physical sensors in favor of lidar and machine vision. Lucas shares his inventions on LinkedIn and answers questions about how everything works. We encourage you to reach out to him and show support for all the progress he's made in this field.
Audio-reactive elements in Adobe Aero & Spark AR
Artists have been experimenting with reactive audio for decades, but Brian Eno was one of the first to push 2D music visualizers into 3D space. He launched his mobile app Bloom back in 2008, correctly predicting that users would enjoy making ambient music from the manipulation of simple shapes. Ten years later, in 2018, Eno leveled up the Bloom experience with a mixed reality music exhibit of the same name.
If you're itching to get into the mixed reality music space yourself, there are at least two popular options available. Augmented reality artist Heather Dunaway Smith has been leading the way as one of the most accomplished creators in this space. As a product evangelist, she also educates consumers on how to build their own AR music experiences.
In the video above, she shows how anyone can create audio-reactive elements in augmented reality using Meta's Spark AR. The tutorial includes step-by-step instructions for constructing nodes in the patch editor. It's refreshing to see this kind of transparency in an artist's process, considering this same AR mirage became a featured exhibit at Coachella this year.
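If the patch editor is unfamiliar, the underlying idea of an audio-reactive element is straightforward: measure how loud the audio is at each moment and drive a visual property with that value. The sketch below shows that concept in plain Python; it is not Spark AR's patch editor or scripting API, and the frame size and scale range are arbitrary choices for the example.

```python
import numpy as np

SAMPLE_RATE = 44_100   # samples per second
FRAME_SIZE = 1_024     # analysis window of roughly 23 ms

def rms_envelope(signal: np.ndarray, frame_size: int = FRAME_SIZE) -> np.ndarray:
    """Return one RMS loudness value per analysis frame."""
    n_frames = len(signal) // frame_size
    frames = signal[: n_frames * frame_size].reshape(n_frames, frame_size)
    return np.sqrt(np.mean(frames ** 2, axis=1))

def loudness_to_scale(rms: np.ndarray, min_scale: float = 1.0, max_scale: float = 2.5) -> np.ndarray:
    """Map normalized loudness to an object scale factor (the "reactive" part)."""
    normalized = rms / (rms.max() + 1e-9)
    return min_scale + normalized * (max_scale - min_scale)

if __name__ == "__main__":
    # Fake one second of audio: a 220 Hz tone that swells from silence.
    t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
    audio = np.sin(2 * np.pi * 220 * t) * np.linspace(0, 1, SAMPLE_RATE)

    scales = loudness_to_scale(rms_envelope(audio))
    print(f"{len(scales)} frames, scale range {scales.min():.2f} to {scales.max():.2f}")
```

In a node-based tool the same chain (audio analysis feeding a transform) is wired up visually rather than written out, which is what makes these platforms approachable for artists.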
Meta's Spark AR is not the only option available to musicians who want to expand their creative horizons. Adobe Aero also offers extensive resources for creating immersive experiences with audio.
As an aside, we would love to see a creative application of AudioCipher to an augmented reality experience. Our article on musical synesthesia might give you some ideas. Have fun and let us know if you come up with something - we'll be happy to promote it!
You can check out our previous article on VR music experiences for more information on VR DAWs and immersive music technology.