Original link: http://www.us.net/signal/Archive/Oct01/decoding-oct.html

October 2001

©SIGNAL Magazine 2001

Decoding Minds, Foiling Adversaries

Information warfare is no longer just about machines; it is also about how users think.

By Sharon Berry

Whether a threat comes from pilot error or enemy aggression, scientists are finding that multisensor mapping and analysis of the brain lead to systems with human-machine interfaces that can correct human error, aid counterintelligence work and guard against attacks.

A technology known as bio-fusion combines multiple sensors to examine biological systems, with the aim of understanding how information and neural structures produce thought and of expressing that thought in mathematical terms. By creating an advanced database containing these terms, researchers now can look at brain activity and determine if a person is lying, receiving instructions incorrectly or concentrating on certain thought types that may indicate aggression.

Mapping human brain functions is not new; however, using multiple components of the electromagnetic spectrum allows investigators to produce a different snapshot of the brain to gain additional insight.

Dr. John D. Norseen, systems scientist for embedded systems, Lockheed Martin Aeronautics Company, Marietta, Georgia, is developing the bio-fusion concept further. "If you went into a hospital and had an EEG [electroencephalogram], it is just telling you if your electrical patterns look fine, but maybe your magnetic components are not functioning properly," he explains. "What I am encouraging is multisensor analysis of the brain--looking at many areas of the spectrum to get a different picture."

After the information is placed in a database, a composite model of the brain is created. "Now, just by getting an EEG, we can begin to interpolate a better hyperspectral analysis," Norseen says. "The model provides us amplified information."
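One way to read the interpolation Norseen describes is as a learned mapping from EEG features to the other sensing modalities, fit on subjects who were recorded with the full multisensor suite and then applied when only an EEG is available. The sketch below uses plain least squares and invented array sizes purely for illustration; nothing here reflects the actual Lockheed Martin model.

```python
# Illustrative sketch only: learn a linear map from EEG features to the other
# modalities using sessions recorded with the full sensor suite, then use it
# to estimate those modalities when only an EEG is available. All shapes and
# data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
eeg_train = rng.normal(size=(200, 32))    # 200 sessions, 32 EEG features each
other_train = (eeg_train @ rng.normal(size=(32, 16))
               + 0.1 * rng.normal(size=(200, 16)))  # 16 features from other sensors

# Fit W minimizing ||eeg_train @ W - other_train||^2 (ordinary least squares).
W, *_ = np.linalg.lstsq(eeg_train, other_train, rcond=None)

def estimate_other_modalities(eeg_features: np.ndarray) -> np.ndarray:
    """Estimate the magnetic/hyperspectral feature vector from EEG alone."""
    return eeg_features @ W

estimate = estimate_other_modalities(rng.normal(size=32))
print(estimate.shape)  # (16,)
```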

Simple interactions with subjects have been used to test the system. A researcher shows a picture to a person or asks the person to think of a number between one and nine. The gathered information is displayed on a monitor, much like a television screen, showing that the person is thinking of the number nine. The researcher then tells the person to say the same number, an action that appears in another part of the brain, the parietal region. "By looking at the collective data, we know that when this person thinks of the number nine or says the number nine, this is how it appears in the brain, providing a fingerprint, or what we call a brainprint," Norseen offers.

"We are at the point where this database has been developed enough that we can use a single electrode or something like an airport security system where there is a dome above your head to get enough information that we can know the number you're thinking," he adds. "If you go to an automatic teller machine and the sensor system is in place, you could walk away and I would be able to access your personal identification code."

Norseen shares that the defense industry is interested because this type of data is culturally independent information. Worldwide, most individuals process certain information in the same regions of the brain.

Brainprints are unique to each person. While the number nine will appear in the same brain areas of different people, it still occurs as a unique signature of how a person specifically thinks of the number. Biology has the tendency to create things that are self-similar, Norseen says. "The proteins that lay down your fingerprints are the same protein materials that lay down the neurons of the brain," he offers.

He also has been asked by military and law enforcement agencies to show how brainprints can be used to determine probable cause, which could apply to an anti-terrorism situation. "If someone is walking through the airport and he goes through the security checkpoint, and we get a feeling that this person is preoccupied with certain numbers or certain thought types that may indicate hostility or aggression, we could ask him questions and verify the answers. Then it gives you probable cause to say, 'Sir/Ma'am, may we step aside with you and ask you additional questions?' It allows you to find a problem set within a large group." Norseen is confident that if such a system were fully developed, it would be accepted if it meant everyone would be safer at the airport gate.

The data he collects may show not only probable cause but also truth verification, he adds. The brain, which uses energy, does not want to expend it needlessly, he says. If someone is telling the truth, the activity stays in low-energy areas on the outer portion of the brain. "If someone starts to light up in more areas of the brain and at a higher energy level, it means that the person is now starting to confabulate or obfuscate." Research so far indicates a 90 to 95 percent accuracy rate.
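The energy-and-spread heuristic in that description might be written roughly as follows; the region labels, the thresholds and the very idea of a simple threshold rule are assumptions made for illustration, not a validated detector.

```python
# Illustrative only: flag a response when many brain regions light up at high
# energy, the pattern the article associates with confabulation. Thresholds
# and region labels are invented.
from dataclasses import dataclass

@dataclass
class RegionActivity:
    region: str    # anatomical label, e.g. "parietal"
    energy: float  # normalized activation energy for this region

def flag_possible_deception(activity: list[RegionActivity],
                            energy_threshold: float = 0.6,
                            max_hot_regions: int = 3) -> bool:
    """Return True when more than max_hot_regions exceed the energy threshold."""
    hot = [a for a in activity if a.energy > energy_threshold]
    return len(hot) > max_hot_regions

# Example: widespread, high-energy activation trips the flag.
sample = [RegionActivity(r, e) for r, e in
          [("occipital", 0.7), ("parietal", 0.8), ("frontal", 0.9),
           ("temporal", 0.7), ("cingulate", 0.65)]]
print(flag_possible_deception(sample))  # True
```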

Now that bio-fusion research has developed beyond the initial stages and the database of what, how and where thoughts occur in the brain is mature, scientists are looking at information injection, a contentious issue, Norseen admits. The concept is based on the fact that human perception consists of certain invariant electromagnetic and biochemical lock-and-key interactions with the brain that can be identified, measured and altered by mathematical operations. If researchers can re-create the inverse function of what has been observed, they gain the ability to communicate or transmit that information back--intact or rearranged--to the individual or someone else, Norseen says. "When you get down to the mathematical properties, information injection is beginning to be demonstrated."

The brain is very susceptible to accepting information that either is real and comes from its own memory mechanisms or is injected from an outside source, Norseen notes. "I am sure you have memories of when the lawn was being cut in late summer and of the smell of the chlorophyll," he says. "The chlorophyll would then evoke other memories. I could possibly ping you with a light sequence or with an ELF [extremely low frequency] radiation sequence that will cause you to think of other things, but they may be in the area that I am encouraging. Those are direct ways in which I can cause the inverse function of something to be fired off in the brain so that you are thinking about it. I have now caused you to think about something you would not have otherwise thought about."

By using information injection, a person could be isolated from a group and made to believe that something is happening, while others in the group are being left alone. Likewise, someone at a command post monitoring information on a screen could be affected. Some experts believe that adversaries now are designing techniques that could affect the brain and alter the human body's ability to process stimuli.

Norseen hopes his work will lead to filters and walls that would block intentionally or unintentionally corrupted information. "Look at the incident in Japan where a lot of young children were watching a cartoon, and it caused many of them to have cerebral seizures," he explains. "The information that came over the screen showed lights at particular timing and pulsing frequencies and in a certain combination of colors that caused the brain to go into a seizure. If you were alerted, you could slow parts of the video stream or change the timing mechanisms so the stream would not have a negative impact on the brain."
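A filter of the kind Norseen imagines might look something like the sketch below, which counts rapid luminance reversals in a video stream and flags segments that flash faster than a safe rate. The limit of roughly three flashes per second echoes published photosensitivity guidance, but the frame representation and numbers here are illustrative.

```python
# Sketch of an "information countermeasure" for video: detect rapid flashing
# in a stream of per-frame luminance values so the player can slow or damp it.
# Thresholds and the luminance representation are illustrative.
def count_flashes(luminance: list[float], delta: float = 0.2) -> int:
    """Count large bright/dark reversals across consecutive frames."""
    flashes, direction = 0, 0
    for prev, cur in zip(luminance, luminance[1:]):
        step = cur - prev
        if abs(step) < delta:
            continue  # ignore small brightness changes
        new_dir = 1 if step > 0 else -1
        if new_dir != direction:
            flashes += 1
            direction = new_dir
    return flashes

def needs_damping(luminance: list[float], fps: float,
                  max_flashes_per_second: float = 3.0) -> bool:
    """Return True when the segment flashes faster than the allowed rate."""
    seconds = max(len(luminance) / fps, 1e-9)
    return count_flashes(luminance) / seconds > max_flashes_per_second

# A one-second segment alternating bright and dark every frame trips the check.
strobe = [1.0 if i % 2 else 0.0 for i in range(30)]
print(needs_damping(strobe, fps=30.0))  # True
```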

Modifying corrupt information may not always be enough. Norseen compares this type of offensive attack to cyberspace attacks in which viruses infiltrate computers. "Now there's potential for the viruses to affect the video stream," he says. "They can be corrected or defended against, but more complex protective measures would have to be installed. Instead of electronic warfare countermeasures and software virus countermeasures, we're getting into information countermeasures."

Norseen believes his work with bio-fusion and the human-machine interface is revolutionary and that a new set of questions must be asked when looking at the state of information warfare. "We're so concerned with information corrupting our machines that we're spending millions of dollars for our protection against people writing Trojan horses. What about the human side of the human-machine interface? What's happening to the operator?" he asks.

Some experts believe that the information operator is a weak spot in the nation's military assets. Additionally, some developers in the field see Russia, China and several Middle Eastern countries as more advanced than the United States in this area.

"The United States is not behind other countries [in this field]," Norseen argues. "Government leaders are very aware of the information threat to the soldier, but they are concerned about being careful to work on the defensive side. However, other countries may be more interested in the offensive/exploitative side. When we talk about our ability to have information dominance, we know that our machines can be better and faster, but sometimes we underplay what could happen to the operator. We are aware that the enemy is going to go after the mind of the operator to bring down the system, not by corrupting the machine but by corrupting the individual soldier or decision maker."

One of the challenges of addressing the human side of the human-machine interface is creating quantitative means to measure the impact of information on the human brain and neurophysiology. "We're looking at incidents such as Columbine or teenagers playing games like Doom," he says. "How are they being influenced negatively? There have been no quantitative measures like what I've been developing. When we can show that, we can identify more ways to protect the human side of the human-machine envelope."

Bio-ethics specialists are reviewing bio-fusion and its applications, specifically neural emulation software. Rather than involving a human subject, the software emulates human mental activity and can be tested against psychological attacks. Software corrections and builds then would be put in place to protect users. "Our ethics people are excited about this because this is a way to protect people without subjecting them to experimentation," he says.

Synthetic reality is another approach to protection. "If I pick up a phone right now, is it a person I talked to or a recording of the person's voice? Or was it synthetically generated?" Norseen asks. Scientists can look at components of a personality in a software application, select certain components of the personality and create a synthetic person. "We can look at 150 things that Joe Smith, a special forces agent, does. He smiles 20 percent of the time. He has a tic in his eye. We can extract those features and create mini-morphs of him--we create identical Joes on the computer. I want to communicate with Joe about secret information, but I want other parts of my system to communicate with avatars of Joe. If someone tried to find our communication, they would have to sift through a lot of other communications that looked an awful lot like Joe. I would bury Joe in the noise of himself."
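The decoy scheme could be sketched as below; the behavioral profile, the decoy generator and the channel count are all invented for illustration. The real message rides on one of many channels while the rest carry traffic shaped by the same profile, so an eavesdropper has to sift through messages that all look like Joe.

```python
# Hedged sketch of "burying Joe in the noise of himself": one real channel,
# many profile-shaped decoy channels. Profile fields and decoy text are made up.
import random

def build_channels(real_message: str, profile: dict, n_channels: int = 8) -> list[str]:
    """Return n_channels messages; one is real, the rest are decoys."""
    traits = list(profile.items())
    decoys = [f"[avatar] {name}={value} routine report {i}"
              for i, (name, value) in enumerate(random.choices(traits, k=n_channels - 1))]
    channels = decoys + [real_message]
    random.shuffle(channels)  # the real message sits among look-alike traffic
    return channels

channels = build_channels(
    "rendezvous at 0400",
    {"smile_rate": 0.2, "eye_tic": True, "gait": "brisk"},
)
print(len(channels))  # 8
```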

Scientists have found that human errors also could possibly be corrected using external means. A recent discovery indicates that the brain's error correction coding involves the globus pallidus, a powerful error correction mechanism. When people consider a decision, they visualize it, talk to themselves and send it to the kinesthetic nerve to ask, "Do you feel good about this?" Then it comes back through the globus pallidus for one more visual look, and people decide to do it or not do it--a go/no go decision. "If we proceed and do something that is in error, our globus pallidus comes into play," Norseen notes. "It is connected to the kinesthetic nerve, which is ineffable. It can't talk to the 'talk' areas of the brain, but it can send signals that go back through the stomach, and that's why you get that sick feeling in your stomach. Something's wrong here."

For example, a pilot in the cockpit and the aircraft's system both may hear an instruction, "Come right 90 degrees." The human hears the instruction, but the brain may actually have heard 80 degrees. "Even though the pilot may confirm 90 degrees, the system can see that the person actually misunderstood," Norseen notes. "The machine can say, 'I'm monitoring you and even though you said you're coming 90 degrees, your brainprint analysis indicates that you only understood 80 degrees. I request you come an additional 10 degrees so we're in compliance with the overall command of the system.' If we can show the globus pallidus 'go/no go' display of error correction, we can create a checklist that says, 'Am I in accordance with the globus pallidus?'" Today, more than 70 percent of all accidents are caused not by the machine but by humans using information incorrectly, Norseen says.
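That cross-check could be sketched as follows; the perceived heading would come from the brainprint analysis the article describes, but here it is simply passed in as a parameter, and the prompt text is illustrative.

```python
# Sketch of the readback cross-check: compare the commanded heading change,
# the pilot's verbal readback and the value the brainprint analysis suggests
# the pilot actually understood. Tolerance and wording are illustrative.
from typing import Optional

def crosscheck_heading(commanded: float, readback: float, perceived: float,
                       tolerance: float = 2.0) -> Optional[str]:
    """Return a corrective prompt when readback or perception disagrees."""
    if abs(readback - commanded) > tolerance:
        return (f"Readback was {readback:.0f} degrees, commanded "
                f"{commanded:.0f}; please confirm.")
    if abs(perceived - commanded) > tolerance:
        return (f"You acknowledged {commanded:.0f} degrees but appear to have "
                f"understood {perceived:.0f}; request you come an additional "
                f"{abs(commanded - perceived):.0f} degrees.")
    return None  # command, readback and inferred perception all agree

print(crosscheck_heading(commanded=90, readback=90, perceived=80))
```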

Many of Norseen's ideas are still in early development. "As the areas of the brain that reflect behavioral components of the human are identified and understood, and as the software components are laid down, I can begin to conduct tests on the synthetic human without using a real human," he says. "I can find out more things about the human now in a year than it took me in the past 10 years ... where I can actually launch a truth verification system or a knowledge warfare protection system. To do what? Enhance, strengthen and protect the human side of the human-machine interface in any domain, any weapon system," he concludes.


