At the LIVELab, technology unlocks the mystery of our musical minds
It's a concert hall, a living lab and the only one of its kind in the world. McMaster's cutting-edge LIVELab is allowing us to quantify one of our most ancient relationships — the deep connection between music and the human brain.
By Kaleigh Rogers for the Faculty of Science
March 4, 2022
It’s a 600-square-metre concrete box with floating walls and floors balanced on rubber discs. Massive two-metre-wide vents circulate air without a sound. The room is laced with motion capture sensors, cameras, and brain wave-collecting wires. Due to its construction, the space is remarkably soundproof. You can’t hear the air ambulances that routinely fly overhead to bring patients to the hospital next door.
It sounds like something from a spy flick: a villain's lair, or a high-tech vault storing priceless treasure. But it's actually home to something much more quotidian: music.
LIVELab is a concert hall-cum-science lab at McMaster where researchers study every aspect of live musical performance, from its therapeutic potential for people with memory loss to how an audience reacts to a crescendo. It's a unique space, but the work there exemplifies a larger trend in research: cutting-edge technology is finally allowing us to quantify one of our most ancient relationships — the deep connection between music and the human brain.
“The more we discover, the more we come to understand just how important music is,” says McMaster professor Laurel Trainor, the founding director of the McMaster Institute for Music and the Mind and LIVELab.
“In our society, we tend to think of music as entertainment. But if you look at history, music was traditionally a part of medicine.”
High-tech ways to study music and the mind
Research on the connection between music and the human brain dates back to the earliest days of neuroscience. In the mid and late 19th century, psychologists, musicologists and neurologists explored how people perceive music, and the physical and psychological reactions music can elicit. But the technological limitations of the era meant that much of this work relied on observational studies, theorizing, or the study of individuals whose brain damage had impaired music-related functions.
Throughout the 20th century, advances in imaging technology gave us better glimpses into the human brain. Magnetic resonance imaging (MRI), computed tomography (CT), and positron emission tomography (PET) allowed us to map the brain with increasing detail, but it wasn’t until the emergence of the functional MRI (fMRI) — which visually maps activity in different regions of the brain by measuring blood flow changes — in 1990 that scientists could truly begin to investigate what’s happening between our ears when we listen to, play, or read music.
“You can see the potential in all of these early papers, but it kind of got busted open in the last 10 to 15 years,” says Charles Limb, a surgeon and neuroscientist at the University of California, San Francisco.
fMRI has enabled researchers to quantify some of our most basic understandings about music and the brain, such as music’s connection to memory. We’ve all experienced that moment when a certain song comes on the radio and suddenly you’re transported back to a specific time and place, a waterfall of memories reflexively cascading over your mind. That connection between music and memory has been captured using fMRI.
In a 2009 study, for example, Petr Janata, a cognitive neuroscientist at the University of California, Davis, had students lie in an fMRI scanner and listen to a random sampling of songs that were popular during their teenage years. Afterwards, he had them indicate which songs they knew, and which, if any, held personal significance and elicited that nostalgic reaction.
The songs that the students identified as most salient to their own history were associated with the strongest activation of the medial prefrontal cortex, a part of the brain located just behind the forehead that is associated with self-knowledge and autobiographical information.
This link — between music and the personal memories from our life — may help explain why music has proven to be therapeutic for memory disorders like dementia and Alzheimer’s disease.
Music and emotions
Matthew Sachs, a postdoctoral research fellow at Columbia University, uses fMRI to help answer questions about music and emotions. Why is it, for example, that some people get chills when listening to a particular piece of music, but other people do not? In a study published in 2016, Sachs and his colleagues found that part of it may be due to their individual brain structure. The researchers used fMRI to compare the brains of individuals who experience chills when listening to certain pieces of music (this was both self-reported and measured using electrodes) to individuals who don’t.
“People who tend to get chills from music have stronger connections, stronger fibers, between auditory and emotional processing regions of the brain,” Sachs says. “In people who don’t, they have fewer fibers, so we’ve shown it both ways. That’s one possible answer for the difference.”
fMRI has also allowed Limb to gain insight into how musical geniuses improvise. In a study published in 2018, Limb had jazz musicians play their instruments inside an fMRI scanner to compare how their brains function when playing a known piece of music versus improvising something completely original. He found that some areas of the brain that lit up while performing a known piece of music turned off when the musicians were improvising, and an entirely different region of the brain was activated instead.
“If you can imagine memorizing a speech and what brain capacities you need to be able to do that and deliver that speech, versus being able to extemporaneously lecture or debate. You can imagine that the brain mechanisms have to be different because the tasks are fundamentally different,” Limb says. “That’s really what we were getting at here with music. When you start to try to improvise and you’re not an improviser, you kind of freeze, and that’s your brain saying ‘I don’t know how to do this.’”
While fMRI is the centrepiece technology for many researchers studying music and the brain, there is a host of other tools in the toolbox that allow for complex exploration of our neurological processes. Those electrodes Sachs used to track musical chills, for example? They measure something called skin conductance response: in a moment of physiological arousal (such as getting chills from a moving piece of music), the skin momentarily becomes better at conducting electricity. Electrodes passing a mild electric current between them can detect the change; when a skin conductance response happens, the current flows more easily, and that shift can be measured and tracked.
There’s also the electroencephalogram (EEG), a measurement of brain waves captured through electrodes placed on the scalp. EEG measures brain activity in a slightly different way, and doesn’t offer the spatial precision of fMRI. But it can be particularly useful in studies where confining participants inside an fMRI machine isn’t possible, such as with ensemble musicians, or certain types of instruments — try squeezing a double bass into an fMRI scanner.
Dimitrios Adamos, an honorary research fellow in the Department of Computing at Imperial College London, has been using EEG to help pinpoint what our brain waves look like when we’re grooving to music we enjoy. Adamos has identified a biomarker in those brain waves that can predict whether the listener is enjoying a particular song, and he thinks it could be used to help streaming services offer more customized recommendations.
Music and the body
Of course, there’s more to the human experience of music than just our heads. Our bodies can help unveil an even more complex understanding of music’s relationship to the brain: think of musicians swaying while in the midst of song, or the sudden urge to shimmy you get when Taylor Swift starts blasting through your ear buds (or insert your artist of choice, but it should be Taylor Swift).
This is where motion capture technology can come in handy for researchers. The same method that’s used to create special effects in blockbuster movies can be used to help scientists track the movements of musicians and audience members. Small markers are attached to strategic points on a person’s body. The markers reflect infrared light, and special cameras that emit IR pulses record each marker’s position in 2D. From there, computer algorithms combine the camera views to reconstruct the markers’ positions in 3D, allowing researchers to analyse the movement.
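That last step — recovering a 3D position from multiple 2D camera views — is a classic triangulation problem. The sketch below is a simplified illustration only, not LIVELab's actual pipeline: it uses two hypothetical calibrated cameras with made-up projection matrices and the standard linear (direct linear transform) method to recover a single marker's position.

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Recover a 3D point from its 2D image coordinates in two
    calibrated cameras, using the linear (DLT) method."""
    u1, v1 = pt1
    u2, v2 = pt2
    # Each camera view contributes two linear constraints on the 3D point.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Least-squares solution: the right singular vector for the
    # smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # convert from homogeneous coordinates

# Two hypothetical cameras: one at the origin, one shifted one unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

marker = np.array([0.2, 0.1, 2.0, 1.0])  # true 3D position (homogeneous)

def project(P, X):
    """Pinhole projection of a homogeneous 3D point to 2D image coords."""
    x = P @ X
    return x[:2] / x[2]

recovered = triangulate(P1, P2, project(P1, marker), project(P2, marker))
print(recovered)  # with noise-free data, matches the true position [0.2, 0.1, 2.0]
```

With real cameras the image coordinates are noisy, so the SVD step returns a least-squares estimate rather than an exact solution; a production system would also calibrate the projection matrices and match markers across many cameras, which this sketch glosses over.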
It’s how Trainor and some of her colleagues at McMaster’s LIVELab made some fascinating discoveries about how musicians in ensemble groups silently communicate with one another while playing. They invited two different string quartets to perform at the concert hall, and used motion capture to track their movements as they played. They even asked the musicians to try performing while facing away from one another, to determine just how much the subtle movements of the other musicians impacted their playing.
What they discovered is that the musicians sway their bodies as they play, and through this are able to communicate a host of information to one another, including coordinating timing and who is leading.
Adrian Fung was one of those musicians. A world-class cellist and founding member of the critically acclaimed Afiara Quartet, Fung has always had an interest in musical innovation. He says even just participating in the study — before the results were gathered — helped him and his fellow musicians gain a richer understanding of their craft.
He compared growth as a musician to moving to a new city. If during your first month in the city you take the same route to get home every day, you’re not going to learn much about that city.
“But if I walk a different way every time, take a bicycle, take an Uber, get my mom to drive me, take a helicopter, take a horseback, at the end of the month I would know that city a lot better,” Fung says. “Finding new pathways to understand how we communicate, how things work, is terrific.”
While most researchers in this field may have access to one or two of these options for a study, McMaster’s LIVELab is decked out with a suite of these tools. Along with EEG caps for both performers and audience members, and motion trackers, LIVELab can also monitor heart rate, breathing, and skin conductance. It has an active acoustic system that can recreate the soundscape of virtually any environment on earth, from Carnegie Hall to a crowded coffee shop. There’s an acoustic mannequin that can measure exactly how a sound hits the human ear. Even the concert hall’s grand piano is high-tech: it can record every keystroke, including the velocity with which it’s played.
This collection of state-of-the-art technology in a custom-built space is what makes LIVELab unparalleled, and a veritable playground for researchers working at the intersection of music, neurology and psychology.
“The level of innovation that’s there, but also just the consistently excellent science coming out of LIVELab, is really impressive,” Limb says. “I don’t think you’ll find another lab like it. It’s so unique.”
Yet the ambitious team behind LIVELab still has more it would love to do. Trainor says eye trackers are currently on the wish list; they would allow researchers to follow the gaze of musicians and audience members throughout a performance, potentially unlocking further understanding of the wordless communication that takes place.
These high-tech tools open a new window onto the connection between music and the mind, and the field has made great progress over the last two decades.
Though music’s relationship with the human brain has been a topic of research for more than a century, it hasn’t enjoyed the same level of focus as some other disciplines. But that’s starting to change. Part of it is thanks to technological progress, but it’s also due to a generation of scientist-musicians who are fascinated by the mysteries yet to be solved. Many of the top researchers in this field are also musicians — Trainor is a symphony flautist, Limb plays sax, piano and bass, and Sachs is a pianist and bassoonist — with their passion for the arts helping to drive their scientific focus.
As Fung says about the study of music and the mind, “I’m more of a bird than an ornithologist. Laurel is both.”
Along with the general benefit of a richer understanding of the human brain, this research also offers the potential for specific advancements in medicine. Researchers in this field have been able to refine music-based treatments for Alzheimer’s and mood disorders, and study hearing loss. The potential good from this research is only limited by imagination. Music is nearly as ancient as humanity itself. Now, technology has created a bridge between our innate connection to music, what that connection means, and how it can be harnessed to heal and entertain.