
We Know What You’re Thinking: 'Mind Reading' Technology Decodes Complex Thoughts


Carnegie Mellon University scientists can now use brain activation patterns to identify complex thoughts, such as, "The witness shouted during the trial."


This latest research led by CMU's Marcel Just builds on the pioneering use of machine learning algorithms with brain imaging technology to "mind read."


 The findings indicate that the mind's building blocks for constructing complex thoughts are formed by the brain's various sub-systems and are not word-based.

 

Published in Human Brain Mapping and funded by the Intelligence Advanced Research Projects Activity (IARPA), the study offers new evidence that the neural dimensions of concept representation are universal across people and languages.

 

"One of the big advances of the human brain was the ability to combine individual concepts into complex thoughts, to think not just of 'bananas,' but 'I like to eat bananas in evening with my friends,'" said Just, the D.O. Hebb University Professor of Psychology in the Dietrich College of Humanities and Social Sciences.


"We have finally developed a way to see thoughts of that complexity in the fMRI signal.

 

The discovery of this correspondence between thoughts and brain activation patterns tells us what the thoughts are built of."

 

Previous work by Just and his team showed that thoughts of familiar objects, like bananas or hammers, evoke activation patterns that involve the neural systems that we use to deal with those objects.

 

For example, how you interact with a banana involves how you hold it, how you bite it and what it looks like.

 

The new study demonstrates that the brain's coding of 240 complex events (sentences like the shouting-during-the-trial scenario) uses an alphabet of 42 meaning components, or neurally plausible semantic features, such as person, setting, size, social interaction and physical action.

 

Each type of information is processed in a different brain system -- which is how the brain also processes the information for objects.

 

By measuring the activation in each brain system, the program can tell what types of thoughts are being contemplated.

 

For seven adult participants, the researchers used a computational model to assess how the brain activation patterns for 239 sentences corresponded to the neurally plausible semantic features that characterized each sentence.

 

The program was then able to decode the features of the 240th, left-out sentence. The researchers repeated this, leaving out each of the 240 sentences in turn, in what is called cross-validation.

 

The model was able to predict the features of the left-out sentence with 87 percent accuracy, despite never having been exposed to its activation before.


It was also able to work in the other direction, to predict the activation pattern of a previously unseen sentence, knowing only its semantic features.
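The procedure the article describes (fit a mapping between activation patterns and semantic features on 239 sentences, then test it on the left-out one, and run the mapping in both directions) can be sketched in a few lines. The toy example below uses synthetic data with the dimensions reported here (240 sentences, 42 semantic features); the voxel count, the ridge-regression model, and the correlation scoring are assumptions chosen purely for illustration, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut

n_sentences, n_voxels, n_features = 240, 2000, 42
rng = np.random.default_rng(0)

# Stand-in data: a semantic-feature vector per sentence and a simulated
# activation pattern derived from it (real data would come from fMRI).
features = rng.normal(size=(n_sentences, n_features))
weights = rng.normal(size=(n_features, n_voxels))
activation = features @ weights + rng.normal(scale=0.5, size=(n_sentences, n_voxels))

# Leave-one-out cross-validation: fit on 239 sentences, decode the 240th.
decoded = np.zeros_like(features)
for train_idx, test_idx in LeaveOneOut().split(activation):
    model = Ridge(alpha=1.0).fit(activation[train_idx], features[train_idx])
    decoded[test_idx] = model.predict(activation[test_idx])

# Score each left-out prediction by its correlation with the true features.
scores = [np.corrcoef(decoded[i], features[i])[0, 1] for i in range(n_sentences)]
print(f"mean decoding correlation: {np.mean(scores):.2f}")

# The mapping also runs in reverse: predict an activation pattern for a
# previously unseen sentence knowing only its semantic features.
encoder = Ridge(alpha=1.0).fit(features, activation)
predicted_activation = encoder.predict(features[:1])
```

With real fMRI data the features would come from human ratings of each sentence rather than random numbers, but the leave-one-out logic is the same.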

 

"Our method overcomes the unfortunate property of fMRI to smear together the signals emanating from brain events that occur close together in time, like the reading of two successive words in a sentence," Just said.


"This advance makes it possible for the first time to decode thoughts containing several concepts.

 

That's what most human thoughts are composed of."


He added, "A next step might be to decode the general type of topic a person is thinking about, such as geology or skateboarding.

 

We are on the way to making a map of all the types of knowledge in the brain."


CMU's Jing Wang and Vladimir L. Cherkassky also participated in the study.

 

Discovering how the brain decodes complex thoughts is one of the many brain research breakthroughs to happen at Carnegie Mellon.

 

CMU has created some of the first cognitive tutors, helped to develop the Jeopardy-winning Watson, founded a groundbreaking doctoral program in neural computation, and is the birthplace of artificial intelligence and cognitive psychology.

 

Building on its strengths in biology, computer science, psychology, statistics and engineering, CMU launched BrainHub, an initiative that focuses on how the structure and activity of the brain give rise to complex behaviors.

https://pionic.org/we-know-what-youre-thinking-mind-reading-technology-decodes-complex-thoughts

 

BACKGROUND ??


Scientists extract images directly from human brain

 

Is this how Santa "Knows All"? :lol:

 

"They" won't tell you how much they can do today, but,


It is far more invasive than even the most enlightened
suspect...


Journalist Bob Woodward of the Washington Post newspaper knows a little, but he is sworn to secrecy...


Be cautious what you think, because the thought police are very real...


Not only can they see, they can also implant...

 

Researchers from Japan’s ATR Computational Neuroscience Laboratories have developed new brain analysis technology that can reconstruct the images inside a person’s mind and display them on a computer monitor, it was announced on December 11.

 

According to the researchers, further development of the technology may soon make it possible to view other people’s dreams while they sleep.

 

The scientists were able to reconstruct various images viewed by a person by analyzing changes in their cerebral blood flow.

 

Using a functional magnetic resonance imaging (fMRI) machine, the researchers first mapped the blood flow changes that occurred in the cerebral visual cortex as subjects viewed various images held in front of their eyes.

 

 

Subjects were shown 400 random 10 x 10 pixel black-and-white images for a period of 12 seconds each.

 

 

While the fMRI machine monitored the changes in brain activity, a computer crunched the data and learned to associate the various changes in brain activity with the different image designs.

 

Then, when the test subjects were shown a completely new set of images, such as the letters N-E-U-R-O-N, the system was able to reconstruct and display what the test subjects were viewing based solely on their brain activity.
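The train-then-reconstruct step described above can be sketched along similar lines. The toy example below substitutes synthetic data for the fMRI recordings; the 400 training images and 10 x 10 resolution come from the article, while the voxel count and the plain linear-regression decoder are illustrative assumptions rather than the ATR group's actual method, which the article does not detail.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

n_train, n_voxels, img_shape = 400, 1000, (10, 10)
n_pixels = img_shape[0] * img_shape[1]
rng = np.random.default_rng(0)

# Stand-in data: random black-and-white training patterns and simulated
# visual-cortex responses (real data would come from the fMRI scans).
train_images = rng.integers(0, 2, size=(n_train, n_pixels)).astype(float)
mixing = rng.normal(size=(n_pixels, n_voxels))
train_voxels = train_images @ mixing + rng.normal(scale=0.1, size=(n_train, n_voxels))

# Learn to map visual-cortex activity back to pixel intensities.
decoder = LinearRegression().fit(train_voxels, train_images)

# Reconstruct a previously unseen pattern from its simulated brain response.
test_image = rng.integers(0, 2, size=(1, n_pixels)).astype(float)
test_voxels = test_image @ mixing
reconstruction = decoder.predict(test_voxels).reshape(img_shape)
print("pixel accuracy:", (reconstruction.round() == test_image.reshape(img_shape)).mean())
```

The point of the sketch is only that a per-pixel mapping learned from many viewed images can be inverted on new brain responses; reproducing letters like N-E-U-R-O-N from real scans is considerably harder than this simulation suggests.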

 

For now, the system is only able to reproduce simple black-and-white images. But Dr. Kang Cheng, a researcher from the RIKEN Brain Science Institute, suggests that improving the measurement accuracy will make it possible to reproduce images in color.

 

 

“These results are a breakthrough in terms of understanding brain activity,” says Dr. Cheng. “In as little as 10 years, advances in this field of research may make it possible to read a person’s thoughts with some degree of accuracy.”

 

 

The researchers suggest a future version of this technology could be applied in the fields of art and design — particularly if it becomes possible to quickly and accurately access images existing inside an artist’s head.

 

 

The technology might also lead to new treatments for conditions such as psychiatric disorders involving hallucinations, by providing doctors a direct window into the mind of the patient.

 

 

ATR chief researcher Yukiyasu Kamitani says, “This technology can also be applied to senses other than vision. In the future, it may also become possible to read feelings and complicated emotional states.”

 

 

The research results appear in the December 11 issue of US science journal Neuron.


Original link:
 

http://www.pinktentacle.com/2008/12/scientists-extract-images-directly-from-brain/

 
