Peacepink

Worldwide Campaign to stop the Abuse and Torture of Mind Control/DEWs

Mind Reading -- 60 minutes CBS News video
June 28, 2009 4:50 PM
Neuroscience has learned so much about how we think and the brain activity linked to certain thoughts that it is now possible - on a very basic scale - to read a person's mind. Lesley Stahl reports.
How Technology May Soon "Read" Your Mind
Read more: 
http://www.cbsnews.com/stories/1998/07/08/60minutes/main4694713.sht...

http://www.foxnews.com/story/0,2933,426485,00.html

LiveScience Topics: Mind Reading

Mind-machine interfaces can read your mind, and the science is improving. Devices scan the brain and read brain waves with electroencephalography, or EEG, then use a computer to convert thoughts into action. Some mind-reading research has recorded the electrical activity of firing nerve cells by placing electrodes directly in the brain. These studies could lead to brain implants that would move a prosthetic arm or other assistive devices controlled by a brain-computer interface.

http://utdev.livescience.com/topic/mind-reading
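The implanted-electrode decoding described above is often illustrated with a population-vector scheme: each recorded neuron "votes" for a preferred movement direction, weighted by how far its firing rate rises above a resting baseline. Everything below (the neuron count, tuning directions and baseline rate) is invented for the sketch, not taken from any particular study.

```python
# Hypothetical preferred movement directions (x, y) for four recorded neurons.
PREFERRED = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]

def decode_velocity(rates, baseline=10.0):
    """Population-vector decode: each neuron votes for its preferred
    direction, weighted by firing rate above a resting baseline."""
    vx = sum((r - baseline) * px for r, (px, _) in zip(rates, PREFERRED))
    vy = sum((r - baseline) * py for r, (_, py) in zip(rates, PREFERRED))
    return vx, vy

# A neuron tuned to +x fires well above baseline: decoded motion is along +x.
v = decode_velocity([30.0, 10.0, 10.0, 10.0])
```

In a real prosthetic-arm system the decoded velocity would be filtered and fed to the arm's controller many times per second; the toy version just shows how rate differences become a direction.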

 

16:09 03/11/2010 © Alex Steffler

Rossiiskaya Gazeta
Mind-reading devices to help screen Russian cops

It reads like science fiction, but it’ll soon be science fact. Special mind-reading devices are to be rolled out across Russia’s revamped police force.
http://en.rian.ru/papers/20101103/161202851.html

 

Homeland Security Detects Terrorist Threats by Reading Your Mind
Tuesday, September 23, 2008
By Allison Barrie
Baggage searches are SOOOOOO early-21st century. Homeland Security is now testing the next generation of security screening — a body scanner that can read your mind.

Most preventive screening looks for explosives or metals that pose a threat. But a new system called MALINTENT turns the old school approach on its head. This Orwellian-sounding machine detects the person — not the device — set to wreak havoc and terror.

MALINTENT, the brainchild of the cutting-edge Human Factors division in Homeland Security's directorate for Science and Technology, searches your body for non-verbal cues that predict whether you mean harm to your fellow passengers.

It has a series of sensors and imagers that read your body temperature, heart rate and respiration for unconscious tells invisible to the naked eye — signals terrorists and criminals may display in advance of an attack.

But this is no polygraph test. Subjects do not get hooked up or strapped down for a careful reading; those sensors do all the work without any actual physical contact. It's like an X-ray for bad intentions.

Currently, all the sensors and equipment are packaged inside a mobile screening laboratory about the size of a trailer or large truck bed, and just last week, Homeland Security put it to a field test in Maryland, scanning 144 mostly unwitting human subjects.

While I'd love to give you the full scoop on the unusual experiment, testing is ongoing and full disclosure would compromise future tests.


But what I can tell you is that the test subjects were average Joes living in the D.C. area who thought they were attending something like a technology expo; in order for the experiment to work effectively and to get the testing subjects to buy in, the cover story had to be convincing.

While the 144 test subjects thought they were merely passing through an entrance way, they actually passed through a series of sensors that screened them for bad intentions.

Homeland Security also selected a group of 23 attendees to be civilian "accomplices" in their test. They were each given a "disruptive device" to carry through the portal — and, unlike the other attendees, were conscious that they were on a mission.

In order to conduct these tests on human subjects, DHS had to meet rigorous safety standards to ensure the screening would not cause any physical or emotional harm.

So here's how it works. When the sensors identify that something is off, they transmit warning data to analysts, who decide whether to flag passengers for further questioning. The next step is micro-facial scanning: measuring minute muscle movements in the face for clues to mood and intention.

Homeland Security has developed a system to recognize, define and measure seven primary emotions and emotional cues that are reflected in contractions of facial muscles. MALINTENT identifies these emotions and relays the information back to a security screener almost in real-time.

This whole security array — the scanners and screeners who make up the mobile lab — is called "Future Attribute Screening Technology" — or FAST — because it is designed to get passengers through security in two to four minutes, and often faster.

If you're rushed or stressed, you may send out signals of anxiety, but FAST isn't fooled. It's already good enough to tell the difference between a harried traveler and a terrorist. Even if you sweat heavily by nature, FAST won't mistake you for a baddie.

"If you focus on looking at the person, you don't have to worry about detecting the device itself," said Bob Burns, MALINTENT's project leader. And while there are devices out there that look at individual cues, a comprehensive screening device like this has never before been put together.

While FAST's batting average is classified, Undersecretary for Science and Technology Adm. Jay Cohen declared the experiment a "home run."

As cold and inhuman as the electric eye may be, DHS says scanners are unbiased and nonjudgmental. "It does not predict who you are and make a judgment, it only provides an assessment in situations," said Burns. "It analyzes you against baseline stats when you walk in the door, it measures reactions and variations when you approach and go through the portal."

But the testing — and the device itself — are not without their problems. This invasive scanner, which catalogues your vital signs for non-medical reasons, seems like an uninvited doctor's exam and raises many privacy issues.

But DHS says this is not Big Brother. Once you are through the FAST portal, your scrutiny is over and records aren't kept. "Your data is dumped," said Burns. "The information is not maintained — it doesn't track who you are."

DHS is now planning an even wider array of screening technology, including an eye scanner next year and pheromone-reading technology by 2010.

The team will also be adding equipment that reads body movements, called "illustrative and emblem cues." According to Burns, this is achievable because people "move in reaction to what they are thinking, more or less based on the context of the situation."

FAST may also incorporate biological, radiological and explosive detection, but for now the primary focus is on identifying and isolating potential human threats.

And because FAST is a mobile screening laboratory, it could be set up at entrances to stadiums, malls and in airports, making it ever more difficult for terrorists to live and work among us.

Burns noted his team's goal is to "restore a sense of freedom." Once MALINTENT is rolled out in airports, it could give us a future where we can once again wander onto planes with super-sized cosmetics and all the bottles of water we can carry — and most importantly without that sense of foreboding that has haunted Americans since Sept. 11.

Allison Barrie, a security and terrorism consultant with the Commission for National Security in the 21st Century, is FOX News' security columnist.

 

Please go to LAST PAGE OF "Replies to this Discussion" to read NEWEST Information


Replies to This Discussion

September 12, 2010 |
Mind-Reading Tools Go Commercial
The tools used by the commercial industry to detect our thoughts and brain states are very different, and somewhat limited, compared with those used in the research lab.
Christie Nicholson reports
http://www.scientificamerican.com/podcast/episode.cfm?id=mind-readi...
The ability for brains to control inanimate objects, such as computer cursors, robotic arms and wheelchairs, has seen significant progress in the last decade. A case in point is the recent success at Andrew Schwartz's lab at the University of Pittsburgh, where macaque monkeys fed themselves using a robotic arm controlled only by their thoughts.

Even commercial companies are now using brain-computer interfaces (or BCIs). Products like Mattel's MindFlex or the Star Wars Force Trainer allow players to move a ball with thoughts alone. There is also the consumer product Zeo, which follows your sleeping brain waves in order to diagnose restless nights.

But it’s important to note that there are very different tools being used in the lab versus the marketplace. In Schwartz’s lab, an electrode placed beneath the skull of the macaque can detect spikes from single neurons. The pattern of neurons firing is then translated into code that a computer can understand.

The commercial products, however, cannot be so invasive. These companies use an electroencephalography (or EEG) cap that is placed on top of your head and reads your overall brain state. Here the results are fairly crude. We can detect whether someone is calm, angry, excited or distracted, and we can use those brain states to activate switches, like moving a ball forward and back. But if we want to go beyond binary on/off activation, we need to get deeper into the brain.

To do anything more complex with an EEG cap is like trying to distinguish the cello in an orchestra from outside of Lincoln Center.

To put this into perspective, the electrodes that are placed under the skull, tapping directly into our grey matter to move robotic arms or surf the Web, are not only inside Lincoln Center but right smack in the front row, directly monitoring every string bowed on that same cello. And it is this sort of extreme detail that we are probably going to need for any complex task done with thoughts alone.
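The crude EEG-cap control described above can be sketched as a simple power threshold: if the measured signal energy in some band is high enough, the switch is "on" and the ball moves. The signals, threshold and the use of raw mean-squared amplitude as a stand-in for filtered band power are all invented for illustration.

```python
import math

def band_power(samples):
    """Mean squared amplitude of an EEG segment -- a crude stand-in for
    power in one frequency band after band-pass filtering."""
    return sum(s * s for s in samples) / len(samples)

def switch_state(samples, threshold=0.5):
    """Binary on/off control: treat high signal power as 'concentrating'."""
    return band_power(samples) > threshold

# Synthetic low- and high-amplitude signals standing in for two brain states.
calm = [0.1 * math.sin(0.3 * t) for t in range(200)]
focused = [1.5 * math.sin(0.3 * t) for t in range(200)]

ball_moves = switch_state(focused)       # high power crosses the threshold
ball_stops = not switch_state(calm)      # low power stays below it
```

Anything richer than this one-bit decision is exactly where, as the article says, the scalp-level signal runs out of resolution.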

—Christie Nicholson
http://www.cbsnews.com/stories/2010/09/15/ap/tech/main6869275.shtml...
Sep. 15, 2010
Mind-reading Phones? The Tech's Evolving There
Intel Research Chief Says Rise of Mobile Phones Helping Hit Goal of 'Context-Aware Computing'

SAN FRANCISCO (AP) - How smart do you want your smart phone to actually be? Do you want it to read your mind, even a little bit?

Intel Corp.'s research chief, Justin Rattner, says that technology has advanced to the point that "context-aware computing," an idea that's been around for two decades, is becoming more of a reality with the rise of mobile devices.

That could lead to phones that act as psychics in your pocket. Rather than simply amass secrets about you, the devices could be doing things with that information, such as predicting what you might do next and offering suggestions.

Rattner showed a few examples during his keynote speech Wednesday at Intel's annual developer conference in San Francisco.

Among them: a prototype application Intel worked on with Fodor's Travel. It learns what types of foods you like to eat and what types of attractions you like to visit, based on searches you type into the phone or locations identified using GPS. The software makes similar recommendations when you visit a new city.

Tech companies are already working to predict what people want, but only in pieces.

Search engine Google Inc., movie-rental service Netflix Inc. and online radio service Pandora try to anticipate what people want even before they know they want it.

Stringing those types of functions together with the wealth of other information that phones collect about people could pave the way for even more helpful electronics, Rattner said.

A challenge is training computers to analyze data from "hard sensors" (which measure location, motion, voice patterns, temperature and the like) and combining those findings with data from "soft sensors" (such as calendar appointments and Web browsing history).

For example, your phone could detect that you've just left work and seem to be on your way home - a location it might know from your address book. It could then automatically recommend the best route around traffic.

"Things don't get really interesting until you fuse that hard sensor data with soft sensor data," Rattner said. "It gives devices almost this sixth sense of anticipating what a user will need in the future, whether that's the next few moments or at dinner later in the day."
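The hard/soft sensor fusion Rattner describes can be sketched as a rule that combines both kinds of input before making a suggestion. The rules, sensor names and suggestion strings below are toy stand-ins; Intel's actual models are not described in the article.

```python
def suggest(hard, soft):
    """Fuse 'hard' sensor readings (location, motion) with 'soft' data
    (calendar, history) to anticipate what the user needs next."""
    if hard.get("location") == "office exit" and hard.get("moving"):
        return "route home avoiding traffic"
    if "dinner reservation" in soft.get("calendar", []):
        return "remind about tonight's dinner"
    return "no suggestion"

# Phone detects the user leaving work (hard) with a dinner booked (soft).
tip = suggest({"location": "office exit", "moving": True},
              {"calendar": ["dinner reservation"]})
```

The point of the fusion is ordering: the same calendar entry produces a different suggestion depending on what the hard sensors say the user is doing right now.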

Rattner added that researchers are even making steps toward the "ultimate form of sensing" - a computer understanding a human's thoughts.

He acknowledged the need for stronger privacy controls as phones gain a better ability to think for themselves.
http://www.cbsnews.com/8301-504763_162-20015935-10391704.html?tag=m...
September 9, 2010 10:31 AM
Mind Reading Machine Shocks Scientists, Recognizes Words
Posted by Neil Katz
(CBS) Scientists at the University of Utah have moved one step closer to using electrodes to read someone's mind.

No need to think nefariously. It's all for a good purpose - helping people who are paralyzed speak and operate machines.

In the experiment, doctors implanted tiny electrodes on the brain of an epileptic with severe seizures. The man already had part of his skull removed so doctors could treat his epilepsy.

According to a statement: "Scientists recorded brain signals as the patient repeatedly read each of 10 words that might be useful to a paralyzed person: yes, no, hot, cold, hungry, thirsty, hello, goodbye, more and less."

"Later, they tried figuring out which brain signals represented each of the 10 words. When they compared any two brain signals - such as those generated when the man said the words 'yes' and 'no' - they were able to distinguish brain signals for each word 76 percent to 90 percent of the time."

But the doctors ran into trouble when they looked at all 10 signals simultaneously. Their accuracy dropped to as low as 28 percent.

"This is proof of concept," says Bradley Greger, an assistant professor of bioengineering at the university. "We've proven these signals can tell you what the person is saying well above chance. But we need to be able to do more words with more accuracy before it is something a patient really might find useful."

The hope, they say, is to one day help patients who are "locked in" and have very few, if any, ways to communicate. That includes patients with advanced-stage Lou Gehrig's disease.


Part of the cool geek factor here is that scientists used a new kind of electrode, called the microECoG. It sits on top of the brain, but doesn't penetrate. That makes it safer to place electrodes on sensitive areas. Doctors are already using them to help severe epileptics.

In this case, scientists placed a net of 16 miniature electrodes and collected a web of data, which later helped them get such high accuracy.
http://www.eurekalert.org/pub_releases/2010-09/uou-tbs090110.php
Public release date: 6-Sep-2010
Contact: Lee J. Siegel
leesiegel@ucomm.utah.edu
801-581-8993
University of Utah

The brain speaks
Scientists decode words from brain signals

SALT LAKE CITY, Sept. 7, 2010 – In an early step toward letting severely paralyzed people speak with their thoughts, University of Utah researchers translated brain signals into words using two grids of 16 microelectrodes implanted beneath the skull but atop the brain.

"We have been able to decode spoken words using only signals from the brain with a device that has promise for long-term use in paralyzed patients who cannot now speak," says Bradley Greger, an assistant professor of bioengineering.

Because the method needs much more improvement and involves placing electrodes on the brain, he expects it will be a few years before clinical trials on paralyzed people who cannot speak due to so-called "locked-in syndrome."

The Journal of Neural Engineering's September issue is publishing Greger's study showing the feasibility of translating brain signals into computer-spoken words.

The University of Utah research team placed grids of tiny microelectrodes over speech centers in the brain of a volunteer with severe epileptic seizures. The man already had a craniotomy – temporary partial skull removal – so doctors could place larger, conventional electrodes to locate the source of his seizures and surgically stop them.

Using the experimental microelectrodes, the scientists recorded brain signals as the patient repeatedly read each of 10 words that might be useful to a paralyzed person: yes, no, hot, cold, hungry, thirsty, hello, goodbye, more and less.

Later, they tried figuring out which brain signals represented each of the 10 words. When they compared any two brain signals – such as those generated when the man said the words "yes" and "no" – they were able to distinguish brain signals for each word 76 percent to 90 percent of the time.

When they examined all 10 brain signal patterns at once, they were able to pick out the correct word any one signal represented only 28 percent to 48 percent of the time – better than chance (which would have been 10 percent) but not good enough for a device to translate a paralyzed person's thoughts into words spoken by a computer.
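The chance baselines quoted above (50 percent when comparing two words, 10 percent when picking among ten) can be checked with a quick simulation of random guessing. The trial count and seed are arbitrary choices for the sketch.

```python
import random

random.seed(0)  # fixed seed so the estimates are reproducible

def chance_accuracy(n_classes, n_trials=20000):
    """Empirical accuracy of guessing uniformly at random: the baseline
    the decoding results are compared against."""
    hits = sum(random.randrange(n_classes) == random.randrange(n_classes)
               for _ in range(n_trials))
    return hits / n_trials

pairwise = chance_accuracy(2)    # hovers near 0.50
ten_way = chance_accuracy(10)    # hovers near 0.10
```

This is why 76-90 percent on pairs is strong evidence while 28 percent on ten words, though triple the baseline, is still far from a usable speech device.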

"This is proof of concept," Greger says, "We've proven these signals can tell you what the person is saying well above chance. But we need to be able to do more words with more accuracy before it is something a patient really might find useful."

People who eventually could benefit from a wireless device that converts thoughts into computer-spoken words include those paralyzed by stroke, Lou Gehrig's disease and trauma, Greger says. People who are now "locked in" often communicate with any movement they can make – blinking an eye or moving a hand slightly – to arduously pick letters or words from a list.

University of Utah colleagues who conducted the study with Greger included electrical engineers Spencer Kellis, a doctoral student, and Richard Brown, dean of the College of Engineering; and Paul House, an assistant professor of neurosurgery. Another coauthor was Kai Miller, a neuroscientist at the University of Washington in Seattle.

The research was funded by the National Institutes of Health, the Defense Advanced Research Projects Agency, the University of Utah Research Foundation and the National Science Foundation.

Nonpenetrating Microelectrodes Read Brain's Speech Signals

The study used a new kind of nonpenetrating microelectrode that sits on the brain without poking into it. These electrodes are known as microECoGs because they are a small version of the much larger electrodes used for electrocorticography, or ECoG, developed a half century ago.

For patients with severe epileptic seizures uncontrolled by medication, surgeons remove part of the skull and place a silicone mat containing ECoG electrodes over the brain for days to weeks while the cranium is held in place but not reattached. The button-sized ECoG electrodes don't penetrate the brain but detect abnormal electrical activity and allow surgeons to locate and remove a small portion of the brain causing the seizures.

Last year, Greger and colleagues published a study showing the much smaller microECoG electrodes could "read" brain signals controlling arm movements. One of the epileptic patients involved in that study also volunteered for the new study.

Because the microelectrodes do not penetrate brain matter, they are considered safe to place on speech areas of the brain – something that cannot be done with penetrating electrodes that have been used in experimental devices to help paralyzed people control a computer cursor or an artificial arm.

EEG electrodes used on the skull to record brain waves are too big and record too many brain signals to be used easily for decoding speech signals from paralyzed people.

Translating Nerve Signals into Words

In the new study, the microelectrodes were used to detect weak electrical signals from the brain generated by a few thousand neurons or nerve cells.

Each of the two grids of 16 microECoGs, spaced 1 millimeter (about one-25th of an inch) apart, was placed over one of two speech areas of the brain: first, the facial motor cortex, which controls movements of the mouth, lips, tongue and face – basically the muscles involved in speaking; second, Wernicke's area, a little-understood part of the human brain tied to language comprehension.

The study was conducted during one-hour sessions on four consecutive days. Researchers told the epilepsy patient to repeat one of the 10 words each time they pointed at the patient. Brain signals were recorded via the two grids of microelectrodes. Each of the 10 words was repeated from 31 to 96 times, depending on how tired the patient was. Then the researchers "looked for patterns in the brain signals that correspond to the different words" by analyzing changes in strength of different frequencies within each nerve signal, says Greger.
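The frequency-strength analysis Greger describes can be sketched as follows: take a discrete Fourier transform of each recorded segment and sum the spectral magnitudes within a few bands, giving one feature vector per word repetition. The band edges, sampling rate and test signal below are illustrative choices, not the study's parameters.

```python
import cmath
import math

def band_strengths(signal, bands=((1, 4), (4, 8), (8, 13), (13, 30)), fs=100):
    """Crude spectral features: summed DFT magnitude in a few frequency
    bands (Hz). O(n^2) DFT is fine for short illustrative segments."""
    n = len(signal)
    spectrum = [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) for k in range(n // 2)]
    hz_per_bin = fs / n
    return [sum(m for k, m in enumerate(spectrum) if lo <= k * hz_per_bin < hi)
            for lo, hi in bands]

# A pure 10 Hz test tone should concentrate its energy in the 8-13 Hz band.
tone = [math.sin(2 * math.pi * 10 * t / 100) for t in range(100)]
feats = band_strengths(tone)
```

A classifier would then learn which feature patterns go with which spoken word; the sketch only shows the feature-extraction step.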

The researchers found that each spoken word produced varying brain signals, and thus the pattern of electrodes that most accurately identified each word varied from word to word. They say that supports the theory that closely spaced microelectrodes can capture signals from single, column-shaped processing units of neurons in the brain.

One unexpected finding: When the patient repeated words, the facial motor cortex was most active and Wernicke's area was less active. Yet Wernicke's area "lit up" when the patient was thanked by researchers after repeating words. It shows Wernicke's area is more involved in high-level understanding of language, while the facial motor cortex controls facial muscles that help produce sounds, Greger says.

The researchers were most accurate – 85 percent – in distinguishing brain signals for one word from those for another when they used signals recorded from the facial motor cortex. They were less accurate – 76 percent – when using signals from Wernicke's area. Combining data from both areas didn't improve accuracy, showing that brain signals from Wernicke's area don't add much to those from the facial motor cortex.

When the scientists selected the five microelectrodes on each 16-electrode grid that were most accurate in decoding brain signals from the facial motor cortex, their accuracy in distinguishing one of two words from the other rose to almost 90 percent.

In the more difficult test of distinguishing brain signals for one word from signals for the other nine words, the researchers initially were accurate 28 percent of the time – not good, but better than the 10 percent random chance of accuracy. However, when they focused on signals from the five most accurate electrodes, they identified the correct word almost half (48 percent) of the time.
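Selecting the most informative electrodes, as described above, amounts to ranking them by how well each one individually decodes words and keeping the top five. The per-electrode accuracy values below are invented for the sketch; the release does not publish them.

```python
# Hypothetical per-electrode decoding accuracies for one 16-electrode grid.
accuracies = {f"e{i}": acc for i, acc in enumerate(
    [0.31, 0.44, 0.28, 0.52, 0.36, 0.47, 0.30, 0.41,
     0.33, 0.49, 0.27, 0.38, 0.45, 0.29, 0.35, 0.40])}

def top_electrodes(per_electrode_acc, k=5):
    """Keep the k electrodes that individually decode words best."""
    ranked = sorted(per_electrode_acc, key=per_electrode_acc.get, reverse=True)
    return ranked[:k]

best = top_electrodes(accuracies)
```

Restricting the decoder to these channels discards noisy electrodes, which is consistent with the jump from 28 to 48 percent the researchers report.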

"It doesn't mean the problem is completely solved and we can all go home," Greger says. "It means it works, and we now need to refine it so that people with locked-in syndrome could really communicate."

"The obvious next step – and this is what we are doing right now – is to do it with bigger microelectrode grids" with 121 microelectrodes in an 11-by-11 grid, he says. "We can make the grid bigger, have more electrodes and get a tremendous amount of data out of the brain, which probably means more words and better accuracy."

###
University of Utah Public Relations
201 Presidents Circle, Room 308
Salt Lake City, Utah 84112-9017
(801) 581-6773 fax: (801) 585-3350
www.unews.utah.edu
Intel's Mind-Reading Software
April 8, 2010 12:03 PM

The next version of the iPhone OS will be unwrapped today, how to get your mom to stop stalking you on Facebook, and Intel shows off brain-scanning software that can pretty much read your mind.
Read more: http://www.cbsnews.com/video/watch/?id=6376405n&tag=mncol;lst;6...
'Mind-reading machine' can convert thoughts into speech
A mind reading machine is a step closer to reality after scientists discovered a way of translating people's thoughts into words.
By Richard Alleyne, Science Correspondent
Published: 5:30AM BST 08 Sep 2010
http://www.telegraph.co.uk/science/science-news/7987821/Mind-readin...
Researchers have been able to translate brain signals into speech using sensors attached to the surface of the brain for the first time.

The breakthrough, which is up to 90 per cent accurate, offers a way to communicate for paralysed patients who cannot speak and could eventually lead to being able to read anyone's thoughts.

"We were beside ourselves with excitement when it started working," said Professor Bradley Greger, a bioengineer at Utah University who led the team of researchers.

"It was just one of the moments when everything came together.

"We have been able to decode spoken words using only signals from the brain with a device that has promise for long-term use in paralysed patients who cannot now speak.

"I would call it brain reading and we hope that in two or three years it will be available for use for paralysed patients."

The experimental breakthrough came when the team attached two button-sized grids of 16 tiny electrodes to the speech centres of the brain of an epileptic patient. The sensors were attached to the surface of the brain. The patient had had part of his skull removed for another operation to treat his condition.

Using the electrodes, the scientists recorded brain signals in a computer as the patient repeatedly read each of 10 words that might be useful to a paralysed person: yes, no, hot, cold, hungry, thirsty, hello, goodbye, more and less.

Then they got him to repeat the words to the computer, and it was able to match the brain signals for each word 76 per cent to 90 per cent of the time. The computer picked up the patient's brain waves as he talked and did not use any voice recognition software.

Because just thinking a word – and not saying it – is thought to produce the same brain signals, Prof Greger and his team believe that soon they will be able to build a translation device and voice box that repeats the word you are thinking.

What is more, the brains of people who are paralysed are often healthy and produce the same signals as those of able-bodied people – it is just that injury blocks those signals from reaching the muscles.

The researchers said the method needs improvement, but could lead in a few years to clinical trials on paralysed people who cannot speak due to so-called "locked-in" syndrome.

“This is proof of concept,” Prof Greger said, “We’ve proven these signals can tell you what the person is saying well above chance.

"But we need to be able to do more words with more accuracy before it is something a patient really might find useful.”

People who eventually could benefit from a wireless device that converts thoughts into computer-spoken words include those paralysed by stroke, disease and injury, Prof Greger said.

People who are now “locked in” often communicate with any movement they can make – blinking an eye or moving a hand slightly – to arduously pick letters or words from a list.

The new device would allow them freedom to speak on their own.

"Even if we can just get them 30 or 40 words that could really give them so much better quality of life," said Prof Greger.

“It doesn’t mean the problem is completely solved and we can all go home. It means it works, and we now need to refine it so that people with locked-in syndrome could really communicate.”

The study, published in the Journal of Neural Engineering, used a new kind of nonpenetrating microelectrode that sits on the brain without poking into it.

The first was attached to the face motor cortex, which controls facial movement and is on the top left hand side of the brain.

The second was attached to Wernicke's area, an area just above the left ear that acts as a sort of language translator for the brain.

Because the microelectrodes do not penetrate brain matter, they are considered safe to place on speech areas of the brain – something that cannot be done with penetrating electrodes that have been used in experimental devices to help paralysed people control a computer cursor or an artificial arm.

The researchers were most accurate – 85 per cent – in distinguishing brain signals for one word from those for another when they used signals recorded from the facial motor cortex.

They were less accurate – 76 per cent – when using signals from Wernicke’s area.

Last year, Prof Greger and colleagues published a study showing electrodes could “read” brain signals controlling arm movements.
Mind-Reading Tech Predicts Terrorist's Intentions
Brain electrodes coupled with imagery can identify who is planning an attack on what city, when and how.
By Eric Bland
Thu Aug 12, 2010 03:07 AM ET
http://news.discovery.com/tech/mind-reading-tech-predicts-terrorism...
THE GIST
A computer records specific brainwaves that elevate when a person has a strong reaction.
Such a device could be used to stop illegal acts before they happen.
Criminals could be pinpointed for capers committed in the past.

A simple slide show could be the next weapon against terrorists. Using a brain-electrode cap and imagery, scientists at Northwestern University can pick the date, location and means of a future terrorist attack from the minds of America's enemies.

The new research could not only stop terrorist attacks before they happen, but could also be used to help prevent other capers, or convict criminals after they break the law.

"We presented [the mock terrorists] with stimuli that are rational choices of what they might do," said J. Peter Rosenfeld, a scientist at Northwestern University and coauthor of a new study in the journal Psychophysiology.

"They sit in a chair, we put brain wave recording electrodes on their scalp and they look at the screen."

The electrodes measure the P300 brain wave, an involuntary response to stimuli that starts in the temporoparietal junction and spreads across the rest of the brain. When the wave hits the surface of the brain, the electrodes detect the signal. The stronger the reaction of the subject to a particular stimulus, the stronger the P300 brain wave.

Rosenfeld and his co-author, graduate student John Meixner, divided 29 Northwestern University students into two groups. One group planned a vacation while the other group planned a terrorist attack. The students then had electrodes placed on their scalp, and were shown a series of images of various cities, such as Boston and Houston, and various means of attack, along with other related, but irrelevant, images as controls.

As the slide show advanced, the electrodes recorded the P300 waves. When, for instance, the mock terrorists saw an image of the city they planned to attack, the electrodes recorded strong P300 brain waves. The Northwestern scientists then compared the strength of all the brain waves to find out who was planning an attack on which city, when they were planning it and how they meant to carry out the attack.

The Northwestern scientists correlated the strongest brain waves with "guilty knowledge" every time. Weaker P300 waves were seen when subjects saw images not associated with their planned attack. Scientists also examined P300 waves from the students in the group that was planning vacations, and did not falsely identify any of them as terrorists.
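The analysis described above can be sketched in two steps: average many stimulus-locked EEG epochs to pull the P300 out of the noise, then compare the peak amplitude for the "probe" stimulus (the planned target) against an irrelevant control stimulus. The epochs, time window and 2x decision rule below are synthetic, for illustration only.

```python
def average_epochs(epochs):
    """Average stimulus-locked segments sample by sample; random noise
    cancels while the event-related response survives."""
    n = len(epochs)
    return [sum(e[t] for e in epochs) / n for t in range(len(epochs[0]))]

def p300_amplitude(avg, window=(3, 6)):
    """Peak of the averaged waveform in a post-stimulus window
    (roughly 300 ms after the stimulus in real recordings)."""
    lo, hi = window
    return max(avg[lo:hi])

# Synthetic epochs: the probe image evokes a larger deflection.
probe_epochs = [[0, 0, 1, 4, 6, 4, 1, 0]] * 30
irrelevant_epochs = [[0, 0, 1, 1, 2, 1, 1, 0]] * 30

guilty = (p300_amplitude(average_epochs(probe_epochs))
          > 2 * p300_amplitude(average_epochs(irrelevant_epochs)))
```

The countermeasure discussed later in the article works precisely by inflating the response to irrelevant stimuli, shrinking this probe-versus-control gap.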

The P300 waves could be even more pronounced in real terrorists, said Rosenfeld. The college students spent a mere 30 minutes planning out their attack. Real terrorists would likely spend days, weeks or months planning an attack, which should mean even stronger P300 waves.

While the terrorists plan their attack, they could also plan their defense. That's because P300 brain waves can, or at least could, be defeated. By artificially creating a strong response to an image that is unrelated to their attack, terrorists could have a countermeasure against the electrodes.

"The subject just has to be on the alert for any other stimulus, and when they see an irrelevant stimulus, they secretly make a response," said Rosenfeld. "That response could be anything from thinking about their girlfriend or wiggling a toe."

The new research can identify, and counter, the countermeasures up to 83 percent of the time, according to the Psychophysiology article. In his as yet unpublished work, Rosenfeld says that he can defeat nearly 100 percent of countermeasures.

"This new research is very impressive," said Gershon Ben-Shakhar, a scientist who also studies brain waves and deception at the Hebrew University of Jerusalem. Not only did Rosenfeld show that it's possible to find the location and means of a terrorist attack, "but he also invented and demonstrated a technique to detect and prevent countermeasures."

The new Northwestern research won't put police officers or soldiers out of work anytime soon though, said Ben-Shakhar.

It's still very difficult to detect someone who has actually committed a crime, let alone someone who is only planning one. The chances that this technology will be used anytime soon to prevent terrorist attacks are low, for now at least, but it offers intriguing possibilities for the future of law enforcement.
Brain Scan Can Read Your Thoughts
New insights into brain activity could explain how memories are formed and how they change over time.
Thu Mar 11, 2010 02:30 PM ET
Content provided by AFP
http://news.discovery.com/tech/brain-scan-mind-reading.html
THE GIST:
• Brain scans could reveal what a person is thinking.
• Using fMRI scans, scientists can distinguish memories of a past event a person is recalling.
• The brain scans could provide fresh insight into how memories are stored and how they may change through time.

A scan of brain activity can effectively read a person's mind, researchers said Thursday.

British scientists from University College London found they could differentiate brain activity linked to different memories and thereby identify thought patterns by using functional magnetic resonance imaging (fMRI).

The evidence suggests researchers can tell which memory of a past event a person is recalling from the pattern of their brain activity alone.

"We've been able to look at brain activity for a specific episodic memory -- to look at actual memory traces," said senior author of the study, Eleanor Maguire.

"We found that our memories are definitely represented in the hippocampus. Now that we've seen where they are, we have an opportunity to understand how memories are stored and how they may change through time."

The results, reported in the March 11 online edition of Current Biology, follow an earlier discovery by the same team that they could tell where a person was standing within a virtual reality room in the same way.

The researchers say the new results move this line of research along because episodic memories -- recollections of everyday events -- are expected to be more complex, and thus more difficult to crack than spatial memory.

In the study, Maguire and her colleagues Martin Chadwick, Demis Hassabis, and Nikolaus Weiskopf showed 10 people each three very short films before brain scanning. Each movie featured a different actress and a fairly similar everyday scenario.

The researchers scanned the participants' brains while the participants were asked to recall each of the films. The researchers then ran the imaging data through a computer algorithm designed to identify patterns in the brain activity associated with memories for each of the films.

Finally, they showed that those patterns could be used to accurately predict which film a given person was thinking about when he or she was scanned.
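The article does not describe the team's actual algorithm, but the general approach of matching a new brain scan against patterns learned from earlier scans can be sketched with a simple nearest-centroid classifier. Everything below, including the voxel values and film labels, is a hypothetical illustration.

```python
# Illustrative sketch (not the study's actual method): training scans for
# each film are averaged into a "centroid" pattern, and a new scan is
# assigned to the film whose centroid is closest in Euclidean distance.

def centroid(patterns):
    """Average a list of equal-length voxel-activity vectors."""
    return [sum(vals) / len(patterns) for vals in zip(*patterns)]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(scan, centroids):
    return min(centroids, key=lambda film: distance(scan, centroids[film]))

# Made-up training scans: two scans per film, three voxels each.
training = {
    "film_a": [[1.0, 0.2, 0.1], [0.9, 0.3, 0.2]],
    "film_b": [[0.1, 1.1, 0.9], [0.2, 0.9, 1.0]],
}
centroids = {film: centroid(scans) for film, scans in training.items()}

print(classify([0.95, 0.25, 0.15], centroids))  # -> film_a
```

Real fMRI decoding works on tens of thousands of voxels and typically uses regularized linear classifiers with cross-validation, but the principle of comparing a new activity pattern against learned patterns is the same.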

The results imply that the traces of episodic memories are found in the brain, and are identifiable, even over many re-activations, the researchers said.

The results reinforce the findings of a 2008 US study that showed similar scans can determine what images people are seeing based on brain activity.
Mind-Reading Devices to Help the Speechless Speak
University of Utah Researchers Decode Brain Signals Using Microelectrodes
By ALYSSA DANIGELIS
Sept. 12, 2010
http://abcnews.go.com/Technology/mind-reading-devices-speechless-sp...
The thoughts are there, but there is no way to express them. For "locked in" patients, many with Lou Gehrig's disease, the only way to communicate tends to be through blinking in code.


"They're perfectly aware. They just can't get signals out of their brain to control their facial expressions. They're the patients we'd like to help first," said University of Utah's Bradley Greger, an assistant professor of bioengineering who, with neurosurgery professor Paul House, M.D., published the study in the October issue of the Journal of Neural Engineering.

How to Read Thoughts
Some severely epileptic patients have the seizure-causing parts of the brain removed. This standard procedure requires cutting the skull open and putting large, button-sized electrodes on the brain to determine just what needs removal. The electrodes are then taken off the brain.

The University of Utah team worked with an epileptic patient who let them crowd together much smaller devices, called micro-electrocorticography grids, onto his brain prior to surgery.

"The microelectrode grids that we placed on top of the brain are actually simple technology," Greger said. Made from platinum wires and silicone, a grid of 16 microelectrodes is less than a centimeter in diameter.

"The hard part for us was to figure out how to take the recordings we got from the microelectrodes and relate it to the words that the patients were speaking," Greger said.

Microelectrode grids sat on two parts of the volunteer's brain crucial for speech: the facial motor cortex and Wernicke's area. Both grids were hooked up to a computer, and the recorded signals were run through an algorithm. The researchers asked the volunteer to repeat a string of 10 words out loud while the computer read the brain signals: "yes," "no," "hot," "cold," "hungry," "thirsty," "hello," "goodbye," "more" and "less."
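One simple way to relate recordings to spoken words, sketched below purely as an illustration (the Utah team's actual algorithm is not described in the article), is template matching: average the training trials for each word into a feature "template," then decode a new trial as the word whose template it correlates with best. All feature values here are invented.

```python
# Hedged sketch of template-matching word decoding from neural features.
# Each word has an averaged template vector; a new trial is decoded as
# the word whose template has the highest Pearson correlation with it.

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def decode(trial, templates):
    return max(templates, key=lambda word: pearson(trial, templates[word]))

# Made-up per-word templates over four neural features.
templates = {
    "yes": [2.0, 0.5, 1.5, 0.2],
    "no":  [0.3, 1.8, 0.4, 2.1],
}

print(decode([1.9, 0.6, 1.4, 0.3], templates))  # -> yes
```

With only ten candidate words, even a crude matcher like this can perform well above chance, which is why small, fixed vocabularies are a natural starting point for speech-decoding research.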
Patients with locked-in syndrome will soon be able to communicate with mind-reading technology, predicts scientist
By Daily Mail Reporter
Last updated at 5:00 PM on 21st September 2010
Read more: http://www.dailymail.co.uk/sciencetech/article-1313860/Patients-loc...
Psychic 'mind-reading' computer will show your thoughts on screen
By David Derbyshire
Last updated at 7:55 AM on 2nd November 2009
http://www.dailymail.co.uk/sciencetech/article-1224489/Psychic-plug...
A mind-reading machine that can produce pictures of what a person is seeing or remembering has been developed by scientists.
The device studies patterns of brainwave activity and turns them into a moving image on a computer screen.
While the idea of a telepathy machine might sound like something from science fiction, the scientists say it could one day be used to solve crimes.

In a pioneering experiment, an American team scanned the brain activity of two volunteers watching a video and used the results to recreate the images they were seeing.
Although the results were crude, the technique was able to reproduce the rough shape of a man in a white shirt and a city skyline.
Professor Jack Gallant, who carried out the experiment at the University of California, Berkeley, said: 'At the moment when you see something and want to describe it you have to use words or draw it and it doesn't work very well.

'This technology might allow you to recover an eyewitness's memory of a crime.'
The experiment is the latest in a series of studies designed to show how brain scans can reveal our innermost thoughts.

Using a functional magnetic resonance imaging (fMRI) scanner, normally found in hospitals, the American team scanned the brains of two volunteers while they watched videos.
The results were fed into a computer which looked for links between colours, shapes and movements on the screen, and patterns of activity in the brain.
The computer software was then given the brain scans of the volunteers as they watched a different video and was asked to recreate what they were seeing.
According to Dr Gallant, who has yet to publish the results of the experiment, the software was close to the mark.
In one scene featuring comic actor Steve Martin in a white shirt, the computer reproduced his white torso and rough shape, but was unable to handle details of his face.

In another, the volunteers watched an image of a city skyline with a plane flying past.
The software was able to recreate the skyline - but not the aircraft.
Mind Reading, Brain Fingerprinting and the Law
ScienceDaily (Jan. 24, 2010)
http://www.sciencedaily.com/releases/2010/01/100120085459.htm
What if a jury could decide a man's guilt through mind reading? What if reading a defendant's memory could betray their guilt? And what constitutes 'intent' to commit murder? These are just some of the issues debated and reviewed in the inaugural issue of WIREs Cognitive Science, the latest interdisciplinary project from Wiley-Blackwell, which for registered institutions will be free for the first two years.
--------------------------------------------------------------------------------
In the article "Neurolaw," in the inaugural issue of WIREs Cognitive Science, co-authors Walter Sinnott-Armstrong and Annabelle Belcher assess the potential for the latest cognitive science research to revolutionize the legal system.

Neurolaw, also known as legal neuroscience, builds upon the research of cognitive, psychological, and social neuroscience by considering the implications for these disciplines within a legal framework. Each of these disciplinary collaborations has been ground-breaking in increasing our knowledge of the way the human brain operates, and now neurolaw continues this trend.

One of the most controversial ways neuroscience is being used in the courtroom is through 'mind reading' and the detection of mental states. While only courts in New Mexico currently permit traditional lie detector, or polygraph, tests, there are a number of companies claiming to have used neuroscience methods to detect lies. Some of these methods involve electroencephalography (EEG), whereby brain activity is measured through small electrodes placed on the scalp. This widely accepted method of measuring brain electrical potentials has already been used in two forensic techniques which have appeared in US courtrooms: brain fingerprinting and brain electrical oscillations signature (BEOS).

Brain fingerprinting purportedly tests for 'guilty knowledge,' or memory of a kind that only a guilty person could have. Other forms of guilt detection, using functional magnetic resonance imaging (fMRI), are based on the assumption that lying and truth-telling are associated with distinctive activity in different areas of the brain. These and other potential forms of 'mind reading' are still in development but may have far-reaching implications for court cases.

"Some proponents of neurolaw think that neuroscience will soon be used widely throughout the legal system and that it is bound to produce profound changes in both substantive and procedural law," conclude the authors. "Other leaders in neurolaw employ a less sanguine tone, urging caution so as to prevent misuses and abuses of neuroscience within courts, legislatures, prisons, and other parts of the legal system. Either way we need to be ready to prevent misuses and encourage legitimate applications of neuroscience and the only way to achieve these goals is for neuroscientists and lawyers to work together in the field of neurolaw."

As this paper shows, WIREs Cognitive Science takes an original interdisciplinary approach to understanding the key issues surrounding state-of-the-art cognitive research. Disciplines featured in the inaugural issue include topics as diverse as cognitive biology, computer science, linguistics, neuroscience, philosophy and psychology.

Editor's Note: This article is not intended to provide medical advice, diagnosis or treatment.


© 2020 Created by Soleilmavis.