
Seeing emotions just the start for brain-reading technology

Systems that sense a person’s thoughts or emotions have the potential to help us develop empathy and foster teamwork. : Nathan Semertzidis, Monash University (supplied) CC BY 4.0

Technology that can interpret mood and visualise it using augmented reality is just the first step in a promising new field of brain science.

In a flat in Melbourne, Australia, Charlie* can clearly see her housemate Robbie’s* emotions. It isn’t Robbie’s facial expressions or movements helping Charlie understand – his face is obscured by augmented-reality goggles – instead, swarms of colourful patterns swirl around Robbie, moving with him, changing in colour and shape. Charlie can see the swarms through her own augmented-reality headset: they’re both using Neo-Noumena, brain-sensing software.

Brain-machine interfaces have recently been employed as assistive devices, restoring mobility and communication to people suffering from various forms of paralysis. They’ve allowed people to control robotic limbs, wheelchairs, and computer keyboards, all with the power of thought.

But newer generations of brain-computer interfaces are taking advantage of artificial intelligence’s pattern recognition to decode more complex information from the brain, such as emotion.

Neo-Noumena, a headset developed by Monash University researchers, reads a person’s emotions by detecting electrical activity on their scalp. When users see themselves or other users through the headset’s glasses, they are surrounded by colourful repeating patterns known as fractals. The fractals move and change with a user’s emotions, like an aura around them, allowing everyone to understand them in real-time.
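To give a sense of how emotion can be decoded from scalp electrical activity, the sketch below computes frontal alpha asymmetry, one widely studied EEG marker of emotional valence. It is an illustration only, not Neo-Noumena’s actual method: the sampling rate, channel names and the interpretation threshold are assumptions.

```python
# Illustrative sketch: frontal alpha asymmetry as a crude valence marker.
# All names and parameters here are assumptions, not Neo-Noumena's code.
import math
import cmath

FS = 128  # assumed sampling rate in Hz


def band_power(signal, low, high, fs=FS):
    """Power in the [low, high] Hz band via a naive DFT.

    Fine for a short one-second window; real systems would use an FFT
    with windowing and artifact rejection.
    """
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if low <= freq <= high:
            coeff = sum(
                signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)
            )
            power += abs(coeff) ** 2 / n
    return power


def valence_index(left_frontal, right_frontal):
    """Frontal alpha asymmetry: log(right alpha) - log(left alpha).

    Positive values are often read as approach/positive affect,
    negative as withdrawal/negative affect.
    """
    alpha_left = band_power(left_frontal, 8.0, 13.0)
    alpha_right = band_power(right_frontal, 8.0, 13.0)
    return math.log(alpha_right) - math.log(alpha_left)


# Synthetic one-second windows: stronger 10 Hz alpha on the left channel.
t = [i / FS for i in range(FS)]
left = [2.0 * math.sin(2 * math.pi * 10 * x) for x in t]
right = [0.5 * math.sin(2 * math.pi * 10 * x) for x in t]

print(valence_index(left, right) < 0)  # more left alpha -> negative index
```

A production system would feed features like this, across many frequency bands and electrodes, into a trained classifier rather than a single hand-set rule.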


When study participants took the system home to use in pairs, the Monash researchers didn’t know how they would use it. Participants began exploring the possibilities immediately, using the system in emotionally charged activities to see how it would change their fractal swarms, discovering new things about themselves and their emotions in the process.

The technology could one day help people with autism in social situations; treat brain conditions that resist other forms of therapy; and perhaps even create a networked human consciousness.

Charlie and Robbie became better at regulating their emotions as a pair as they grew more aware of their emotional patterns and reactions over time. Interestingly, the transparency and constant availability of each user’s emotional state gave rise to unexpected social phenomena. One half of each pair began to infer information about the environment by interpreting the emotions of the other. Another pair of participants found they could gauge how good their partner’s hand was in a card game without speaking.


Brain-machine interfaces that can read emotions are closer than some might think. Some commercial products are already available to buy, geared toward meditation training, sleep tracking, productivity, and gaming. A growing open-source community has also emerged, sharing tools and lessons for enthusiasts to make the technology more accessible.

But the legal and ethical risks of ‘mind reading’ technology need to be addressed before it becomes mainstream. Only a handful of jurisdictions have so far passed ‘neuro rights’ laws to protect a person’s right to keep their thoughts and emotions their own.

The law is struggling to keep up with the privacy risks of current technologies, but even more egregious privacy breaches may yet be possible. The ability to read thought as if it were spoken word or text is still far beyond our grasp — at least without invasive surgery — but recent studies have demonstrated that identifying information can be extracted from brain-machine signals.

Emerging ‘bidirectional’ interfaces, which promise to modify brain activity using electrical or magnetic impulses, further amplify these risks. This research is still in its earliest stages, limited to invoking perceptions of light flashes, gross limb movements, and mild stimulation of large swathes of the outer layer of the brain. But it may one day be possible to send and receive targeted, precise messages, which would completely change our legal ideas of autonomy and individual personhood.

Despite these risks, the potential good is even more profound. In the near future, devices like Neo-Noumena could help people with conditions that can disrupt individual and interpersonal interpretation of emotion, like autism. Conditions like these can make social interaction extremely difficult.

Similarly, technologies that ‘write’ to the brain, like deep brain stimulation and transcranial magnetic stimulation, already show strong promise in treating seriously debilitating mental health conditions such as treatment-resistant depression.

The promise of this technology hints at possibilities that even science fiction writers would struggle to write. With brain-machine interfaces, we may very well be able to link our minds together in vast networks of deep empathy, challenging every preconceived notion of what makes us ‘human’.

* Names changed for compliance with study ethics

Originally published under Creative Commons by 360info™.
