
The Technology To Read Minds Might Be Here Sooner Than You Think

19 March, 2019

An Artificial Neural Network can reconstruct intelligible speech from the auditory cortex. Isn’t it amazing?

The sounds you hear in the video, counting from zero to nine, were reconstructed directly from an analysis of the auditory cortex of the human brain. They are not a recording of what the person heard, but a voice reconstructed from the brain activity generated in response to listening. What does it sound like? Listen for yourself here:

An article recently published in Nature's Scientific Reports, titled “Towards reconstructing intelligible speech from the human auditory cortex”, describes an experimental system that has achieved something once thought confined to science-fiction movies: reading minds.


When we talk, or merely imagine talking, the brain creates signals similar to those detected in the auditory cortex during listening. The idea is that electrodes implanted at a specific location in the brain can record these activity signals, which can then be decoded into text or synthesized speech.
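Decoding starts from the raw electrode recordings. The sketch below is purely illustrative, not taken from the study: the function name, the 50 ms window length, and the high-gamma frequency band are my assumptions. It turns simulated multi-electrode signals into per-window band-power features, the kind of representation such decoders are typically trained on.

```python
import numpy as np

def extract_band_power(signals, fs, band=(70.0, 150.0), win_s=0.05):
    """Average power in a frequency band (e.g. high-gamma) per time
    window, per electrode. `signals` has shape (n_electrodes, n_samples)."""
    win = int(fs * win_s)                      # samples per window
    n_el, n_samp = signals.shape
    n_win = n_samp // win
    feats = np.empty((n_win, n_el))
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)   # frequency of each FFT bin
    mask = (freqs >= band[0]) & (freqs <= band[1])
    for w in range(n_win):
        seg = signals[:, w * win:(w + 1) * win]
        spec = np.abs(np.fft.rfft(seg, axis=1)) ** 2   # power spectrum
        feats[w] = spec[:, mask].mean(axis=1)          # mean power in band
    return feats

# Simulated 4-electrode recording, 2 seconds at 1 kHz
rng = np.random.default_rng(0)
ecog = rng.standard_normal((4, 2000))
features = extract_band_power(ecog, fs=1000.0)
print(features.shape)  # (40, 4): 40 windows of 50 ms, one value per electrode
```

Each row of `features` summarizes one moment of brain activity, which is what the decoder maps to sound.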


What are the possible applications of this technology? Most obviously, it could allow people whose physical disabilities prevent speech to communicate again, directly from the brain.


People without speech disabilities may also be interested in such capabilities, perhaps to communicate with a future version of Alexa or Siri without speaking a command aloud.


So how does it work? The research was conducted with the assistance of epilepsy patients undergoing brain surgery. The patients agreed to let the researchers record their brain activity during the operation while they listened to recordings of spoken counting. Such operations are often performed while the patient is conscious (though feeling no pain): to make sure the brain is not damaged during the operation, the patient interacts with the surgeon throughout.


Later, the researchers collated the signal recordings and trained an artificial neural network to decode the noisy signals, feeding its output through a voice synthesizer to produce sound that the human ear can understand.
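As a rough illustration of that training step, here is a minimal sketch on synthetic data. The study trained a deep network driving a vocoder; this stand-in fits the simplest possible decoder, a linear map from hypothetical neural features to spectrogram frames, and every array size here is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: 500 time frames of neural features
# (64 electrodes) paired with target spectrogram frames (32 bins).
n_frames, n_electrodes, n_bins = 500, 64, 32
true_map = rng.standard_normal((n_electrodes, n_bins)) * 0.1
X = rng.standard_normal((n_frames, n_electrodes))
Y = X @ true_map + 0.01 * rng.standard_normal((n_frames, n_bins))

# Fit a linear decoder by least squares (the study used a deep
# network; linear regression is the simplest stand-in for the mapping).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Decode unseen neural activity into spectrogram frames, which a
# vocoder would then turn into audible speech.
X_new = rng.standard_normal((10, n_electrodes))
Y_hat = X_new @ W
print(Y_hat.shape)  # (10, 32)
```

The final step in the real system, turning those predicted frames back into a waveform, is what makes the reconstruction audible.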


As technology advances, non-invasive probes may become possible: in the future we might not need to implant actual electrodes in the brain to read its signals. That could open up a new world of possibilities, such as giving destination instructions to an autonomous vehicle, or offering greater support to people with communication difficulties.

If you are interested in how Artificial Intelligence is changing cognitive and medical research, you are invited to hear futurist Dr. Roey Tzezana and join the discussion with MedTech entrepreneurs at QODE Brisbane, 2-3 April 2019.