
Using AI to read minds

Jonn


Postdoctoral research fellow Daniel Leong sits in front of a computer at the Australian Artificial Intelligence Institute wearing what looks like a rubber swimming cap with wires coming out of it.

The 128 electrodes in the cap are detecting electrical impulses in Dr Leong's brain cells and recording them on a computer.

It's called an electroencephalogram (EEG), and it's the same technology doctors use to diagnose brain conditions.

The UTS (University of Technology Sydney) team is using it to read his thoughts.
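
For a sense of what that recording looks like computationally: the 128 channels sampled over time give a 2-D array that is usually cut into short windows before any decoding happens. A minimal NumPy sketch, where the sampling rate, duration and window length are illustrative assumptions rather than the UTS team's actual settings:

    import numpy as np

    # Hypothetical recording: 128 EEG channels sampled at 250 Hz for 10 seconds.
    n_channels, sfreq, seconds = 128, 250, 10
    eeg = np.random.randn(n_channels, sfreq * seconds)  # stand-in for real electrode data

    # Slice the continuous signal into 1-second windows, the unit a decoder typically sees.
    window = sfreq
    n_windows = eeg.shape[1] // window
    epochs = eeg[:, :n_windows * window].reshape(n_channels, n_windows, window)
    epochs = epochs.transpose(1, 0, 2)  # -> (n_windows, n_channels, n_samples)
    print(epochs.shape)  # (10, 128, 250)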

A pioneering AI model, developed by PhD student Charles (Jinzhao) Zhou and his supervisors, Professor Chin-Teng Lin and Dr Leong, uses deep learning to translate the EEG brain signals into specific words.

Deep learning is a form of AI that uses artificial neural networks, loosely modelled on how the human brain works, to learn from data: in this case, large amounts of EEG data.

Dr Leong slowly and silently reads the simple phrase "jumping happy just me" on the screen.

He also mouths the words, which aids detection because mouthing activates the parts of the brain involved in speech.

The AI model works instantly to decode the words and produce a probability ranking, based on what it has learned from EEG recordings of 12 volunteers reading texts.
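
The article doesn't publish the model itself, but the behaviour it describes (one EEG window in, a ranked list of word probabilities out) maps onto an ordinary neural classifier. A hedged PyTorch-style sketch, with the vocabulary, layer sizes and window length all assumed for illustration rather than taken from the UTS system:

    import torch
    import torch.nn as nn

    VOCAB = ["jumping", "happy", "just", "me"]  # tiny assumed word list, as in the demo phrase

    class EEGWordDecoder(nn.Module):
        """Toy decoder: one EEG window -> probabilities over a small vocabulary."""
        def __init__(self, n_channels=128, n_words=len(VOCAB)):
            super().__init__()
            self.conv = nn.Conv1d(n_channels, 64, kernel_size=7, padding=3)
            self.pool = nn.AdaptiveAvgPool1d(1)
            self.head = nn.Linear(64, n_words)

        def forward(self, x):                # x: (batch, channels, samples)
            h = torch.relu(self.conv(x))
            h = self.pool(h).squeeze(-1)     # (batch, 64)
            return self.head(h)              # logits over the vocabulary

    model = EEGWordDecoder()
    window = torch.randn(1, 128, 250)        # one fake 1-second EEG window
    probs = torch.softmax(model(window), dim=-1)
    ranking = sorted(zip(VOCAB, probs[0].tolist()), key=lambda p: -p[1])
    print(ranking)                           # candidate words, most probable first

A real system would be trained on labelled EEG windows (here, the recordings from the 12 volunteers); the sketch only shows the shape of the input and output.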

At this stage, Professor Lin says the AI model has learned from a limited collection of words and sentences to make it easier to detect individual words.

A second type of AI, a large language model, matches the decoded words and corrects mistakes in the EEG decoding to come up with a sentence.

Large language models, like ChatGPT, have been trained on massive text datasets to understand and generate human-like text.
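
Putting the two stages together as described, the language-model step essentially takes the decoder's best guesses and smooths them into a grammatical sentence. A rough sketch of that hand-off, where correct_with_llm is a hypothetical placeholder for whichever language model is used (the actual UTS pipeline isn't public):

    def correct_with_llm(prompt: str) -> str:
        """Hypothetical placeholder: call whatever large language model is available."""
        raise NotImplementedError  # e.g. a locally hosted model or an API

    def decode_sentence(word_rankings):
        # word_rankings: one (word, probability) ranking per EEG window, e.g.
        # [[("jumping", 0.61), ("just", 0.22)], [("happy", 0.54), ("me", 0.31)], ...]
        best_guesses = [ranking[0][0] for ranking in word_rankings]
        prompt = (
            "These words were decoded from noisy EEG, in order: "
            + " ".join(best_guesses)
            + ". Rewrite them as one short, natural English sentence."
        )
        return correct_with_llm(prompt)

In the article's example, rankings dominated by "jumping happy just me" would be handed to the language model, which returns the corrected sentence quoted below.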

"I am jumping happily, it's just me" is the sentence the AI model has come up with, with no input from Dr Leong apart from his brainwaves.
Source: MSN
 
@Jonn

Do you know if it works without the lip movements?

Lip reading is certainly possible, so AI should be able to do it sooner or later.
And presumably the gross electrical signals involved in speaking can be picked up and interpreted.

Actual mind reading "feels" like it would be much more difficult.

Getting words from electrical signals isn't necessarily impossible, and it may be simpler than it sounds.
But the brain does a lot more than just speech. And it's more or less "always on".

You'd expect a significant "signal to noise" problem getting words directly from the brain.

BTW - OFC it's very interesting that they're working on this, and even if it doesn't deliver "real" mind reading, I can see it being useful.

But if, for example, one positive outcome would be to help deaf people, and the AI is "interpreting speech" from the signals driving the muscles of the vocal system, it might be simpler and more convenient to use optical inputs (i.e. actual AI-supported lip-reading).

Also real mind reading would be extremely easy to misuse. And there are some unpleasant possible scenarios. I can see North Korea being an early adopter.

I've been hoping that mind reading would be too difficult ever since modern AI was quite new (the early model that was good at playing Go/Weichi/Baduk). Around then I saw a summer entertainment thing for kids (and adults if they wanted) where you could turn on and run a toy train with (IIRC) Alpha waves
(i.e. the gear for simple measurement of the electrical activity in a human brain had become cheap, compact, and comfortable).

Nothing like mind reading OFC, but it connected the two developing technologies.

Anyway, for deaf people, it would be way better to put the gear on them first.
The next step would be e.g. ASL interpretation for people who can't use it, but I'd expect that to be possible with a mobile phone as the sensor and (sooner or later) the AI runtime platform.
 
I am wondering, given that brain signals are involved in the motor movement of the facial muscles around the mouth, as well as in the act of vocalization, whether this influences the AI program's ability to recognize and interpret speech?

OR...is it actually reading the associated thought patterns? I have a built-in "narrator" when I am thinking and speaking, but evidently, some people do not. In which case, could it separate what was thought versus what was said, thereby "outing" the person in a lie? Could it be used as a universal translator, allowing people speaking different languages and dialects to communicate? What about people who have more than one thought going on at the same time? I have my primary thoughts out "front", but I may be thinking about two or three different things in the "background".

Too many questions. LOL! :)
 
Would be ironic, yet disappointing, if this amounted to a cyber version of what is known as "cold reading".

"Cold reading is a set of techniques used by mentalists, psychics, and fortune-tellers to gather information about a person without prior knowledge. It involves making high-probability guesses and observing body language and responses to create the illusion of having special insight or abilities."

Sorry...with such skepticism I must be channeling Elizabeth Holmes and Theranos. ;)
 
I am wondering, given that brain signals are involved in the motor movement of the facial muscles around the mouth, as well as in the act of vocalization, whether this influences the AI program's ability to recognize and interpret speech?

OR...is it actually reading the associated thought patterns?

<...>
That's what I was thinking too.

IMO "electronic lip reading" (interpreting the signals directing the vocal system) would be simpler than any obvious (to me anyway :) alternatives.

Also a lot less scary.
 
