
AIs Can Read, Visualize, and Decode Human Thought From Photo and Video

  • 1 March 2018
  • Cas Proffitt

Japanese scientists have created an AI that can visualize your thoughts and memories.

Have you ever seen that episode of Black Mirror where the characters are able to transmit their memories to a television screen so that they, or anyone else, can review them?

Well, we aren’t quite there yet, but we’re one step closer.

Artificial intelligence can visualize static thoughts 

A group of Japanese scientists has published a paper called “Deep image reconstruction from human brain activity,” and their results are staggering.

The scientists showed images to multiple human subjects over the course of 10 weeks, recording their brainwaves both while the subjects looked at the images and while they recalled what they had seen.

Image taken from Deep image reconstruction from human brain activity

With this data, a deep learning network was trained to decode the brainwave recordings and reconstruct, as images, what each participant had seen or imagined.

The images are still fuzzy, as you can see in the video below, but they are eerily close.

As with any technology, we can expect the accuracy of these reconstructions to keep improving.
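The core idea can be sketched in miniature: learn a mapping from brain-activity patterns to image features, then apply that mapping to new recordings. The toy example below uses random stand-in data and a simple linear decoder; all numbers and variable names are illustrative, not from the study, and the real paper pairs a decoder like this with a deep generator network that renders the decoded features back into a picture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: 100 "brain scans" (20 values each) and the image
# features (10 values each) recorded alongside them. Real fMRI data
# is vastly larger and noisier than this.
true_map = rng.normal(size=(20, 10))
brain = rng.normal(size=(100, 20))
image_features = brain @ true_map + 0.1 * rng.normal(size=(100, 10))

# Fit a linear decoder (least squares): brain activity -> image features.
decoder, *_ = np.linalg.lstsq(brain, image_features, rcond=None)

# Decode a brand-new scan into predicted image features.
new_scan = rng.normal(size=(1, 20))
predicted = new_scan @ decoder
```

In the actual research, the predicted features would then drive an image-generating network, which is where the fuzzy-but-recognizable reconstructions come from.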

AI can also decode what video you’re watching from your brain activity

Similarly, a group of scientists published “Neural Encoding and Decoding with Deep Learning for Dynamic Natural Vision” in Cerebral Cortex. In their experiments, three women watched hours of short videos spanning fifteen categories, such as bird and airplane, while their brainwaves were recorded.

Image taken from Neural Encoding and Decoding with Deep Learning for Dynamic Natural Vision

Then, one artificial neural network was trained to associate the recorded brain activity with the corresponding visuals, and a second network decoded new brain activity to guess what a participant was watching, with approximately 50% accuracy.

Even for a person whose brainwaves were not used to train the network, the system still achieved 25% accuracy in determining which category of video that person was watching, well above the roughly 7% expected from random guessing across fifteen categories.
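To put those percentages in context, here is a quick back-of-the-envelope check using only the numbers reported in the article: with fifteen categories, a random guesser would be right about one time in fifteen.

```python
# Numbers from the article: 15 video categories, 50% accuracy for a
# subject the network was trained on, 25% for a new subject.
num_categories = 15
chance = 1 / num_categories            # about 6.7%

same_subject_accuracy = 0.50
new_subject_accuracy = 0.25

print(f"chance:       {chance:.1%}")
print(f"same subject: {same_subject_accuracy / chance:.1f}x chance")
print(f"new subject:  {new_subject_accuracy / chance:.1f}x chance")
```

So even the cross-subject result is several times better than guessing, which is what makes it notable.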

Conclusion

Although we’re not yet living in a dystopian sci-fi, AI thought-decoding research is steadily advancing, and we can expect these new technologies to bring real improvements for people with disabilities. They may pave the way for stroke victims and others with limited means of communication to communicate readily with those around them.

How do you feel about the potential for AI to read your thoughts? Let us know in the comments below!

About Cas Proffitt

Cas is a B2B Content Marketer and Brand Consultant who specializes in disruptive technology. She covers topics like artificial intelligence, augmented and virtual reality, blockchain, and big data, to name a few. Cas is also co-owner of an esports organization and spends much of her time teaching gamers how to make a living doing what they love while bringing positivity to the gaming community.
