Singing in the brain
What does anger sound like? What music does sorrow imply? Human emotion is being given a new soundtrack thanks to an exciting new collaboration between art and neuroscience.
Concordia University researcher Erin Gee is taking feelings to a new level by tapping directly into the human brain, delivering music powered purely by the human body and its emotions. Using data collected from physiological displays of emotion, Gee is creating a software and hardware system that incorporates a set of experimental musical instruments that will perform a symphony of sentiments.
This research could have significant therapeutic benefits for those who have difficulty expressing emotion. Individuals with autism spectrum disorders, for example, often struggle to understand the emotions of others. Gee's robotic technology could be used to teach them to identify feelings by externalizing and exaggerating those emotions in forms such as music.
Having developed strong research connections in Australia around the topic of human bodies and electronic voices, Gee, who is pursuing a Master of Fine Arts in Studio Arts, linked her work to that of neurophysiologist Vaughan Macefield at the University of Western Sydney.
Their process begins with inserting very fine microelectrode needles into a peripheral nerve. This allows the researchers to eavesdrop on the subject’s emotions by recording nerve activity transmitted through a single neuron. Essentially, they are listening in on electrical signals emitted directly from the brain through the nervous system. It’s a less invasive procedure than inserting electrodes directly into the brain. To build a more accurate emotional map, blood flow, heart rate, sweat release and respiration are also recorded.
These signals, which paint an electronic picture of the subject’s emotions, are fed into Gee's computer, where custom-made software converts them into a chorus of chimes and bells, creating music literally composed by feelings.
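To give a sense of how such a conversion can work, here is a minimal sketch in Python of one way physiological readings might be mapped to chime-like notes. It is not Gee's actual software; the sensor names, value ranges, and pitch set are illustrative assumptions only.

```python
# Minimal sketch (not Gee's software): mapping physiological readings
# to glockenspiel-style notes. Sensor ranges and pitches are assumed.

# A pentatonic set of MIDI pitches, loosely evoking chimes and bells.
PITCHES = [72, 74, 76, 79, 81, 84]  # C5, D5, E5, G5, A5, C6


def scale(value, lo, hi):
    """Clamp a reading into [lo, hi] and normalize it to 0.0-1.0."""
    value = max(lo, min(hi, value))
    return (value - lo) / (hi - lo)


def reading_to_note(heart_rate, skin_conductance):
    """Pick a pitch from arousal (skin conductance) and a loudness from heart rate."""
    arousal = scale(skin_conductance, 2.0, 20.0)   # microsiemens, assumed range
    intensity = scale(heart_rate, 50.0, 120.0)     # beats per minute, assumed range
    pitch = PITCHES[int(arousal * (len(PITCHES) - 1))]
    velocity = int(40 + intensity * 80)            # MIDI velocity between 40 and 120
    return pitch, velocity


if __name__ == "__main__":
    # A short, made-up stretch of sensor data: calm, then increasingly aroused.
    stream = [(62, 3.1), (65, 4.0), (78, 8.5), (95, 14.2), (110, 18.9)]
    for hr, sc in stream:
        print(reading_to_note(hr, sc))
```

In a scheme like this, the performer's rising arousal would push the notes higher and louder, so the music tracks the emotional state rather than a written score.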
Gee hopes to transform her research into an emotional symphony unlike any other attempted before. Actors – chosen as subjects for their expert ability to manifest emotion on demand – will be fitted with sensors that monitor their bodies’ responses to emotion. On stage, they will perform an emotional score that will require them to summon a broad range of feelings. "It will be like seeing someone expertly playing their emotions, as one would play a cello," says Gee.
Now back at Concordia, Gee is working in close collaboration with technicians in the Faculty of Fine Arts to produce the robotic musicians that will play this emotional symphony. The robots are based on the percussive, xylophone-like glockenspiel, which, she says, “reflects the on/off nature of data itself.”
Gee is looking forward to giving her robots their debut during a world premiere in Montreal next fall through chamber music organization Innovations en Concert. “Each performance will be truly unique,” she says. “Our specialized musical instruments will allow the emotional state of performers to drive the musical composition.” Thanks to Gee and her collaborators, audiences will soon know what happiness sounds like.
Partners in research: This research was funded in part by Concordia International, the Social Sciences and Humanities Research Council of Canada, and the MARCS Institute of the University of Western Sydney. The performance, scheduled for fall 2013, will be presented by Innovations en Concert with the support of the Conseil des arts de Montréal.
Related links:
• Concordia University Department of Studio Arts
• Erin Gee student profile
• Erin Gee’s personal website
• University of Western Sydney
• Innovations en Concert
• Concordia International
• Conseil des arts de Montréal
• Social Sciences and Humanities Research Council
• MARCS Institute of the University of Western Sydney