John J. McMurtrey, M. S., Copyright 2004,[a] 12 Sept. 05
Co-authorship is negotiable towards professional publication in an NLM indexed journal, Email- [email protected]
Donations toward future research are gratefully appreciated at http://www.slavery.org.uk/FutureResearch.htm
ABSTRACT
Reports of specific concept recognition in humans by technical means on hearing words, viewing images or words, and prior to vocalization are examined. These reports are consistent with an extensive literature on word category differentiation by electrophysiology and blood flow, which is reviewed. The EEG literature on discriminating emotional states and deception is surveyed, along with non-invasive brain computer interface reports. Non-contact and remote methods of brain wave assessment are also considered. The literature treated lends some substantiation to press accounts indicating thought reading is possible and has had covert development.
INTRODUCTION
The Bible attributes to God the capacity to know the thoughts of men. [1] Most scientists are unaware that thought reading by electroencephalogram (EEG) was reported as feasible in work begun over 30 years ago, [2] which more recently a number of groups have confirmed by EEG, Magnetoencephalography (MEG), and functional Magnetic Resonance Imaging (fMRI) technologies. This review focuses on literature relating to technologic thought reading, though also treated are the discrimination of more general cognitive states, brainwave capture methods, and reports of thought reading development apparently covert to the open literature.
METHODS OF SPECIFIC CONCEPT RECOGNITION
The Defense Advanced Research Projects Agency in 1972 contracted Pinneo & Hall for work described in a 1975 US technical report entitled “Feasibility Study For Design of a Biocybernetic Communication System.” The study concludes “that it is feasible to use the human EEG coincident with overt and covert speech as inputs to a computer for such communication” (covert speech is defined as verbal thinking). 2 The 149 page report [b] states: “enough information has been obtained . . . to specify the optimum parameters to use for an EEG operating system, and to suggest future research towards that end.”
Pinneo & Hall constructed templates for EEG word recognition by averaging the EEG patterns evoked by 9 visually presented words in each subject, and primarily utilized 4 electrodes over brain language areas for prediction. Subjects with high hemispheric lateralization had EEG patterns for some words that frequently classified 100% correctly, regardless of the number of repetitions, and with stability over time. Over all words, however, classification accuracy for these subjects was 85% for overtly spoken words, and 72% for words repeated to oneself by mental means alone without vocalization. Across all subjects, specific word EEG patterns were classified 35% correctly for overtly, and 27% correctly for covertly spoken words, but more people fell in the 70-100% classification range than in the 10-15% range. [c] Subjects with low hemispheric laterality, particularly stutterers, had near chance EEG classification. EEG concept recognition was actually 10-15% higher for pictures than for words. Phrases containing similarly articulated words or homonyms were better recognized than these words alone without context.
Suppes et al. have the most extensive recent publications supporting and reporting specific EEG thought recognition. [3] [4] [5] [6] [7] This work largely compares recognition improvement methods, with some emphasis on a relative invariance of EEG concept representations across individuals. The procedures generally applied Fourier transforms to both the templates for recognizing words and the test samples, with an optimal EEG frequency window, or filter, selected for each subject. EEG word templates constructed by averaging each subject’s responses (50 trials) at single electrodes resulted in less EEG word recognition 3 than recognition templates averaged across all subjects (700 trials) [d] for bipolar electrode differences. The latter technique produced recognition rates over seven words of 100% for visual images and auditory words. 5 [e] However, for visually presented words, recognition templates generated by excluding the tested subject from the average performed better (75%) than averaging within subject or over all subjects. The waveforms for each presentation modality were very similar, and when recognition templates averaged across subjects in the modalities of visual images or words were utilized for recognizing other modalities (visual images or words & auditory), recognition was still generally 60-75%. Such results were obtained despite the inclusion of three subjects with English as a second language, and obvious hemispheric laterality confounds important to Pinneo & Hall, [f] such as one left handed and another ambidextrous subject. These results indicate a relative invariance of EEG representations for different concepts between subjects and perception modality, when averaging out and filtering noise. Templates are matched to words by the amplitude difference between template and test word waveforms, sampled at 814 points, squared and summed (Pinneo & Hall had 255 samples per word).
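Neither group publishes its code, but the matching rule described (summed squared amplitude differences between an averaged template and a test waveform, with the smallest sum winning) is simple to sketch. The Python fragment below is a minimal illustration under assumed conditions: the epoching, array shapes, synthetic data, and word labels are illustrative assumptions, not the published implementations.

```python
import numpy as np

def build_templates(epochs_by_word):
    """Average each word's epochs (trials x samples) into one template waveform."""
    return {word: trials.mean(axis=0) for word, trials in epochs_by_word.items()}

def classify(test_epoch, templates):
    """Assign the test waveform to the template with the smallest sum of squared
    amplitude differences (cf. the 814 sample points described above)."""
    scores = {word: np.sum((test_epoch - tmpl) ** 2)
              for word, tmpl in templates.items()}
    return min(scores, key=scores.get)

# Illustrative synthetic data: 3 "words", 50 trials each, 814 samples per epoch.
rng = np.random.default_rng(0)
n_samples = 814
true_waveforms = {w: rng.standard_normal(n_samples) for w in ("chair", "desk", "table")}
epochs = {w: true_waveforms[w] + 0.5 * rng.standard_normal((50, n_samples))
          for w in true_waveforms}

templates = build_templates(epochs)
test = true_waveforms["desk"] + 0.5 * rng.standard_normal(n_samples)
print(classify(test, templates))   # expected: "desk" at this noise level
```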
Brain wave patterns for sentences have also been examined. Recognizing the first word of a sentence, using templates from the same words presented individually and from the same words cut and pasted from within sentences, was successful at a 50% recognition rate (with 8.3% as chance). 4 Even when excluding a subject from the averaged template, over 90% recognition was obtained for 48 sentences, as visually presented one word at a time. 6
Averaged unfiltered auditory responses are classified 100% correctly by the superposition of 3 sine waves chosen from the frequency domain maxima for each word. 7 The same procedure, when averaged across subjects and presentation modalities (visual images, visual and auditory words), classifies 100% of the words by 5 frequencies per word, while data fit decreased only 6% compared to the filtered templates. Syllable classification is less successful, with six correct classifications out of eight examples from a superposition of nine frequencies.
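The superposition procedure can be pictured as retaining only the few strongest peaks of the averaged response’s Fourier spectrum and reconstructing the waveform from those sinusoids. The sketch below illustrates that idea with an FFT on a synthetic averaged response; the sampling rate, signal, and choice of three components are assumptions for demonstration, not the published fitting method.

```python
import numpy as np

def top_k_sinusoid_fit(signal, k=3):
    """Reconstruct a real signal from the k largest-magnitude components of its
    FFT (a crude analogue of the 3-5 sine-wave representation described above)."""
    spectrum = np.fft.rfft(signal)
    keep = np.argsort(np.abs(spectrum))[-k:]      # indices of the k strongest components
    reduced = np.zeros_like(spectrum)
    reduced[keep] = spectrum[keep]
    return np.fft.irfft(reduced, n=len(signal))

# Illustrative averaged "word response": two sinusoids plus noise, 1-second epoch at 256 Hz.
fs = 256
t = np.arange(fs) / fs
avg_response = (np.sin(2 * np.pi * 6 * t) + 0.6 * np.sin(2 * np.pi * 11 * t)
                + 0.2 * np.random.default_rng(1).standard_normal(t.size))

approx = top_k_sinusoid_fit(avg_response, k=3)
residual = np.mean((avg_response - approx) ** 2) / np.mean(avg_response ** 2)
print(f"relative residual power: {residual:.2f}")
```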
A Korean group reports yes/no decision discrimination of 86% by spatio-temporal cross correlation. [8] This was achieved from 4 electrodes over bilateral frontal and occipital sites. Differential equation measures of synchronization rate and average polarity also had high recognition rates of 78% and 81% respectively.
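The Korean report gives only summary figures, but spatio-temporal cross correlation amounts to correlating a multichannel test epoch against class-average epochs across channels and small time shifts, then choosing the better-matching class. The sketch below is one hedged reading of that idea; the channel count, lag range, and synthetic data are illustrative assumptions.

```python
import numpy as np

def spatiotemporal_corr(test, template, max_lag=10):
    """Best normalized correlation between two (channels x samples) epochs
    over small circular time shifts, pooled across channels."""
    best = -np.inf
    for lag in range(-max_lag, max_lag + 1):
        shifted = np.roll(template, lag, axis=1)
        corr = np.sum(test * shifted) / (np.linalg.norm(test) * np.linalg.norm(shifted))
        best = max(best, corr)
    return best

def classify_yes_no(test, yes_template, no_template):
    return "yes" if spatiotemporal_corr(test, yes_template) >= \
                    spatiotemporal_corr(test, no_template) else "no"

# Illustrative data: 4 electrodes (cf. bilateral frontal/occipital), 1-second epochs at 128 Hz.
rng = np.random.default_rng(2)
yes_tmpl, no_tmpl = rng.standard_normal((2, 4, 128))
test = yes_tmpl + 0.8 * rng.standard_normal((4, 128))
print(classify_yes_no(test, yes_tmpl, no_tmpl))   # expected: "yes"
```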
Other investigators publish magnetoencephalographic (MEG) recognition of viewed words significantly above chance, by 27% for recognition and 44% for accuracy. [9] Although these results were only somewhat above chance, MEG was also less successful for Suppes et al., 3 4 and a speech recognition optimized artificial intelligence system was utilized without filters or recognition templates. The authors expressed surprise that any recognition was possible, considering that the input utilized only a simple technique: root mean squares of foci.
There is apparently a Russian report of specific EEG word recognition before 1981. [10] The work is known only from a science reporter and is not directly available, but is mentioned to aid this report’s discovery, and because of the claim that specific words contain category information, which is of possible significance for word category differentiation studies.
Patents for EEG thought recognition exist. Kiyuna et al., in Patent # 5785653 “System and method for predicting internal condition of live body,” claim electroencephalographic (EEG) instant detection by syllables of “a content of category which the testee wishes to speak.” [11] A stated use: “the present invention may be use (sic) to detect the internal condition of surveillance in criminal investigation” by EEG. NEC Corporation licensed this patent. Mardirossian Patent # 6011991 “Communication system and method including brain wave analysis and/or use of brain activity” includes remote EEG communication, with armed forces or clandestine applications. [12] This patent proposes transmitter capable skin implants, utilizes artificial intelligence, and is licensed by Technology Patents, LLC.
Studies of brain blood flow changes detected by functional Magnetic Resonance Imaging (fMRI) confirm that viewing pictures of objects activates specifically identifiable brain patterns. Comparisons of the distributed brain activity observed by fMRI for viewing faces, houses, cats, chairs, bottles, shoes, and scissors were 90-100% correct in all two category comparisons (with 50% as chance). [13] A different group replicates the results of this report. [14] Even though all these objects are described as categories because different exemplars and views were presented, discrimination of these objects generally requires an adjective, so the distinctions qualify as specific concepts. A further report examined just 20 seconds of fMRI data, rather than one half of an fMRI session as in the previous studies, and utilized different exemplars of an object category for training classifiers from those utilized during classification. A support vector classifier provided the best results, with 59-97% accuracy among ‘categories’ of baskets, birds, butterflies, chairs, teapots, cows, horses, tropical fish, garden gnomes, and African masks (with 10% as chance). [15] “Brain reading” is the descriptive term titling the report.
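A support vector approach of this kind treats each scan’s voxel activity as a feature vector, trains a classifier on labeled category examples, and tests on held-out data. A minimal sketch of such a pipeline, using scikit-learn on synthetic data, follows; the voxel count, noise level, and train/test split are assumptions for illustration and do not reproduce the cited analyses.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
categories = ["face", "house", "cat", "chair", "bottle", "shoe", "scissors"]
n_voxels, trials_per_cat = 500, 40

# Illustrative data: each category gets its own mean voxel pattern plus noise.
means = {c: rng.standard_normal(n_voxels) for c in categories}
X = np.vstack([means[c] + 1.5 * rng.standard_normal((trials_per_cat, n_voxels))
               for c in categories])
y = np.repeat(categories, trials_per_cat)

# Train on part of the data, score category predictions on the held-out rest.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    stratify=y, random_state=0)
clf = SVC(kernel="linear").fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```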
Numerous fMRI studies show similarly activated brain regions for viewing images or words, and hearing words. Viewing pictures of objects or the words naming them activates similar distributed brain systems for storing semantic knowledge, [16] [17] [18] and auditory presentation also shares the same [19] or a similar [20] system with that of viewing these words. These studies give an anatomical basis for the high cross modality recognition rates of concepts observed by Suppes et al. 5 7
PHYSIOLOGIC DISCRIMINATION OF WORD CATEGORIES
Broca and Wernicke originally defined the anatomy pertinent to aphasia resulting from brain injury. [21] More recently described are brain lesion patients who have very selective agnosias, inabilities to name or recognize specific object classes. [22] [23] [24] Many of the word category differentiation reports reviewed below were initiated to explain and substantiate such deficits. This literature is consistent with specific word recognition, because word responses are averaged by category and distinguished only by statistical inspection, without the template generation or specific comparison thereto required for thought recognition. Brain cell assembly activation provides a theoretical framework for both specific concept recognition and word category discrimination. [25]
Electroencephalogram and Magnetoencephalogram Word Category Discrimination
Evoked EEG responses discriminate nouns and verbs. Nouns elicit more theta power than verbs, but verbs have greater theta coherence decrease, particularly in frontal versus posterior sites. [26] Noun waveforms generally are more negative than verb responses at post-stimulus intervals of both 200-350 and 350-450 milliseconds (msec.) [27] [28] [29] [30] Ambiguous noun/verbs are more negative than unambiguous nouns or verbs in the early latency interval, and when context indicates noun meaning versus verb use, are more negative over both these latency windows. 30 Anterior-posterior electrode activity also differs for ambiguous versus unambiguous nouns and verbs. 30 [31]
Action verb waveforms differ in amplitude, 28 and central versus posterior distribution compared to visual nouns, [32] with particular 30 Hz increase over the motor cortex for action verbs, and over the visual cortex for visual nouns. [33] [34] Face, arm, or leg action verbs differ in amplitude by time interval, and activity increases over the specific corresponding motor strip locus as well as by frontal electrode. [35] [36] Low resolution electromagnetic tomography finds irregular verb activity more in the left superior and middle temporal gyri, while regular verbs are more active in the right medial frontal gyrus at 288-321 msec. [37] Irregular verbs respond more in the left ventral occipito-temporal cortex than regular verbs at ~340 msec. by MEG, which localizes perpendicular sources undetectable by EEG. [38] Regular verb activity modulates more the left inferior prefrontal region including Broca’s area at ~470 msec with MEG, but irregular verbs have more right dorsolateral prefrontal cortex activity at ~570 msec. Priming evoked patterns occur for regular but not irregular verbs, [39] [40] while incorrect irregular noun plural [41] and verb participle [42] [43] waveforms differ from that of incorrect regular forms.
Abstract word waveforms onset more positively about 300 msec., persist longer at lateral frontal sites, and distribute more to both hemispheres compared to concrete words. 28 [44] [45] β-1 frequency coherence during memorization of concrete nouns indicates left hemisphere electrode T5 as the main brain processing node. [46] Left hemisphere electrode T3 is similarly important for abstract nouns, which have more frontal area contribution, and massive right posterior hemisphere coupling. Abstract versus concrete memorization distinctly changes other frequency bands, [47] [48] and theta synchronization predicts efficient encoding. [49]
Content words yield a more negative peak at 350-400 msec. than functional grammar words, with a subsequent occipital positivity that function words lack, and more electrode and hemisphere differences from 400- 700 msec. [50] [51] In sentences, the late component of function words resembles preparatory slow waves that apparently subserve their introductory and conjunctive grammatical function. [52] Other studies show content versus function word differences at additional intervals and more bi-hemispheric effects,[53] with right visual field advantage for function words. [54] MEG distinguishes functional grammar words, or content words such as multimodal nouns, visual nouns, or action verbs, each by response strength and laterality at intervals of both ~100 and greater than 150 msec. [55]
Proper name amplitudes peak more just after 100 msec. negatively, and just after 200 msec. positively than common nouns, while one’s own name accentuates these peaks relative to other proper names with further positive and negative components. [56] Proper names, animals, verbs, and numerals show electrode site differences: proper name temporal negativity extends to inferior electrodes bilaterally; verbs and animal names are less negative and similar, but verbs have left frontal inferior positivity; while numerals have less waveform negativity, and bilateral parietal positivity. [57] Non-animal objects are more negative in both the 150-250 and 350-500 msec. intervals than animals, while animals are more positive in the 250-350 msec. interval. [58] [59] Animals are more positive in approximately the same latter interval than vegetables/fruits, while vegetables/fruits are more negative in about the earlier interval (150-250 msec.), and have stronger frontal region current sources than animals. [60] Animals in natural scenes evoke different waveforms than just natural scene or building pictures. [61] Responses to words for living things are less negative over the right occipital-temporal region than artifactual objects, while pictorial presentations of the same items further differ and have hemisphere effects noted as unreported. [62] EEG waveforms for specific meanings could be as discretely categorized as indicated by the reported but unspecified Russian work, which claims that “the waves for such concepts as “chair”, “desk”, and “table” are all overlapped by another wave that corresponds” to the concept of furniture. 10
Affective word meanings such as good-bad, strong-weak, or active-passive are discriminated [63] by both category and meaning polarity according to response latency, amplitude, and scalp distribution at intervals of 80-265 and 565-975 msec. [64] Positive words have amplitude increases peaking at 230 msec. compared to negative words, and relative to neutral words increase a subsequent peak amplitude as well as a slow wave component. [65] Emotional words also show less amplitude decrease on repetition than neutral words. [66]
Some of these word category differentiation reports are consistent with both the specific recognition reports, and/or the discrimination of non-verbal cognition. Based on EEG/MEG responses, words are readily distinguished from non-words, [67] [68] [69] pictures, [70] and as to length. [71] Even commas have a characteristic waveform similar to the speech phrase closure evoked pattern called closure positive shift. [72] Color selection modulates the EEG. [73] EEG discriminates the judgement of gender for both faces and hands. [74]
Positron Emission Tomography (PET) and Functional Magnetic Resonance Imaging (fMRI) Word Category Discrimination
Positron Emission Tomography (PET) and Functional Magnetic Resonance Imaging (fMRI) localize brain blood flow, with the ability to distinguish perceptual categories. Some studies locate recognition of places [75] [76] and faces [77] within certain brain areas; however, expertise can recruit the face recognition area, [78] and other studies show these areas only responding maximally for specific stimuli. [79] Word category activity is both distributed and overlapping 79 [80] in a somewhat lumpy manner. [81] Though regions of word category difference are indicated below, brain comprehension is not solely dependent on these areas. The emergence of discrete category responsive regions may have some resemblance to category segregation in the feature processing of artificial neural networks that self organize without programming. [82]
Meta-analysis of 14 studies locating activity for face, natural, and manufactured object recognition shows ventral temporal cortex differences. Face recognition activates more inferior ventral temporal portions, including the fusiform gyrus, of which manufactured objects activate more medial aspects than face or natural objects, while natural objects distribute more widely in this region. [83] Eighty-eight percent of face studies converged on mid fusiform gyrus activity, while natural and manufactured objects converged no more than 50% for any discrete area. Manufactured object activity localizes to the middle temporal cortex, as distinct from natural objects, which localize more to the superior temporal cortex. Face and natural object activity is more bilateral and extends into the left inferior frontal cortex, while tools in particular activate the premotor area. These studies also feature activity in the inferior occipital/posterior fusiform and the medial occipital structures of the lingual gyrus, calcarine sulcus, and cuneus.
There is some agreement that verbs have greater activity in temporal, parietal, and premotor/prefrontal regions than nouns, while nouns have few [84] or no [85] areas more activated than verbs; yet no noun/verb difference at all is also reported. [86] German regular noun and verb fMRI responses, compared to irregular words, differ significantly in the right precentral gyrus, the left prefrontal cortex, bilateral posterior temporal lobes, and bilateral complexes including the superior parietal lobules, supramarginal gyri, and angular gyri. [87] Regular words are left hemisphere lateralized, while irregular words have somewhat greater distribution to the right hemisphere, and greater activation over all cortical areas. Irregular verbs activate more total cortex than regular verbs, but lack the motor strip, insular, and most of the occipital cortex activity present for regular verbs. [88] Though both forms activate the inferior parietal lobule, irregular verbs activate more posterior and superior portions than regular verbs.
Depending on control task correction, naming actions activates the left inferior parietal lobule, an activation lacking for locative prepositions, which instead activate the left supramarginal gyrus selectively relative to actions. [89] Furthermore, naming abstract shape locations compared to locating concrete items increases right supramarginal gyrus activity, 89 which also activates specifically on long-term memory for spatial relations [90] and in American sign language prepositions. [91] The supramarginal gyrus is encompassed by the temporal-parietal-occipital junction active for location judgments, and is separate from temporal activity for judging color. [92] Action word generation activity is just anterior to the motion perception area, while color word generation activity is just anterior to the color perception area. [93] Naming object color activates distinct brain regions from naming the object, with color knowledge retrieval activity being slightly removed from that of naming colors. [94] Irrespective of language and visual or auditory modality, the naming of body parts activates the left intraparietal sulcus, precentral sulcus, and medial frontal gyrus, while naming numbers activates the right post central sulcus as joined to the intraparietal sulcus. 19
Concrete words are discriminated from abstract words in both noun and verb forms, 85 with more right hemisphere activity for abstract words than concrete words. [95] [96] [97] Abstract/concrete contrasts feature right or left temporal areas, while the reverse concrete/abstract comparison features frontal activity. [98] [99] [100] [101] [102] Besides distinction from abstract nouns, the concrete categories of animals contrasted to implements respond selectively in posterior-lateral temporal and frontal cortex areas across studies. 95 100 Limbic activity, particularly the cingulate, distinguishes emotional words from both abstract and concrete words. 96
Naming pictures of animals, tools, and famous people is discriminated [103] by increased regional blood flow in the left inferior frontal gyrus for animals, the premotor area for tools, and the left middle frontal gyrus for people. [104] Faces activate the right lingual and bilateral fusiform gyri, while the left lateral anterior middle temporal gyrus response differs among famous faces, famous proper names, and common names. [105] The left anterior temporal cortex in particular responds to names, faces, and buildings when famous relative to non-famous stimuli. 105 [106] Viewing photographs of faces, buildings, and chairs evokes activity distributed across several cortical areas, which are each locally different in both the visual ventral temporal 79 and occipital cortices. [107] Photograph perception of these same categories shows more hemispheric lateralization and activation than non-perceptual imagery, [108] while short term memory face imagery activity is stronger than that of long term memory. [109]
More advanced fMRI techniques discriminate further word or object classes. In a high resolution fMRI study limited to a brain cross section, activity differs for animals, furniture, fruit, or tools in discrete sites of the left lateral frontal cortex and 3 separate medial temporal cortex loci respectively. [110] The application of artificial intelligence to fMRI patterns distinguishes between 12 noun categories (fish, four legged animals, trees, flowers, fruits, vegetables, family members, occupations, tools, kitchen items, dwellings, and building parts). [111] Finally, there are the reports, discussed previously, of discriminating the viewing of 7 13 14 and 10 15 different ‘categories’ so discrete as to require an adjective for distinction.
Some cognitive functions are related to or partly dependent on language. Letters activate the left insula more than objects and exclusively activate the left inferior parietal cortex. [112] Letters also activate an area in the left ventral visual cortex more than digits in most subjects. [113] [114] Brain activations of mathematical thinking are partly dependent on language. [115] Subtraction activates bilaterally the anterior intraparietal sulcus and a phoneme area in the intraparietal sulcus mesial to the angular gyrus, selectively from simple motor tasks. [116] Number comparison activates right hemisphere intraparietal and prefrontal areas, while multiplication localizes more to the left hemisphere. [117]
ELECTROENCEPHALOGRAM DISCRIMINATION OF OTHER COGNITIVE STATES
Other literature indicates EEG differentiation of completely non-verbal cognition. Greater left prefrontal activity predicts positive affect, while greater right prefrontal activity predicts negative disposition in psychological testing. [118] However, the stability of hemispheric activation is important for such a trait characteristic, [119] and more transient mood states have exactly the opposite arousal asymmetry. [120] Decreased left prefrontal activity is also found in depression, [121] [122] and in the anxiety situations of social phobics. [123] More specific attitude, mood, and emotion differentiation, by plotting at least two and as many as five EEG frequencies, has been patented with reference to Air Force research. [124] EEG patterns discriminate relative misanthropy and philanthropy in facial preferences, and favorable or negative responses to faces, [125] while waveform topography identifies sad face perception. [126] Another EEG emotion indicator is the stimulus-preceding negativity (SPN). Although slight SPNs can precede instruction cues, this wave is most pronounced while awaiting performance assessment and reward or aversive feedback. [127] [128] [129] [130]
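Left-versus-right prefrontal activation in this literature is commonly indexed by alpha band power asymmetry, since alpha power varies inversely with cortical activation. A hedged sketch of computing such an index from two frontal channels follows; the channel names, band limits, and synthetic data are illustrative assumptions rather than any cited study’s protocol.

```python
import numpy as np
from scipy.signal import welch

def alpha_asymmetry(left_ch, right_ch, fs=256, band=(8.0, 13.0)):
    """log(right alpha power) - log(left alpha power); because alpha is inversely
    related to activation, larger values suggest relatively greater left-frontal
    activation."""
    def band_power(x):
        f, pxx = welch(x, fs=fs, nperseg=fs * 2)
        return pxx[(f >= band[0]) & (f <= band[1])].sum()
    return np.log(band_power(right_ch)) - np.log(band_power(left_ch))

# Illustrative 60-second recordings from hypothetical left (F3) and right (F4) frontal sites.
rng = np.random.default_rng(4)
fs = 256
t = np.arange(fs * 60) / fs
f3 = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)   # weaker alpha on the left
f4 = 1.5 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)   # stronger alpha on the right
print(f"asymmetry index: {alpha_asymmetry(f3, f4, fs):.2f}")          # positive in this example
```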
A number of groups have developed procedures to detect deception based on the P300 (a positive potential at about 300 msec) event related potential (ERP) from EEG. [131] [132] [133] [134] [135] [136] A commercial system, Brain Fingerprinting, [137] which includes additional frequency analysis, particularly a late negative ERP potential, cites 100% accuracy over five separate studies. [138] [139] [140] [141] [142] Though most EEG deception detection concerns situation specific knowledge, a late positive potential approximate to the P300 is reported to vary as a function of real attitude rather than reported attitude. [143]
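Concealed-information protocols of this kind generally compare averaged responses to “probe” items against responses to irrelevant items in a window roughly 300-600 msec after stimulus onset. The sketch below shows that comparison at its simplest on synthetic epochs; the window, epoch length, and data are illustrative assumptions, and the commercial system’s bootstrapping and late negative component analysis are omitted.

```python
import numpy as np

def mean_window_amplitude(epochs, fs=256, window=(0.30, 0.60)):
    """Average epochs (trials x samples, stimulus at t=0) and return the mean
    amplitude in the given post-stimulus window (seconds)."""
    avg = epochs.mean(axis=0)
    i0, i1 = int(window[0] * fs), int(window[1] * fs)
    return avg[i0:i1].mean()

# Illustrative epochs: probes carry a simulated P300-like bump, irrelevants do not.
rng = np.random.default_rng(5)
fs, n = 256, 256                                            # 1-second epochs
t = np.arange(n) / fs
p300 = 5.0 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))    # bump near 400 msec
probe_epochs = p300 + rng.standard_normal((30, n))
irrelevant_epochs = rng.standard_normal((30, n))

diff = (mean_window_amplitude(probe_epochs, fs)
        - mean_window_amplitude(irrelevant_epochs, fs))
print(f"probe minus irrelevant window amplitude: {diff:.2f}")   # clearly positive here
```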
BRAIN COMPUTER INTERFACES
EEG cortical potentials are detected for both actual movement [144] and movement readiness potentials (bereitschaftspotential). [145] [146] EEG sufficiently differentiates the mere imagination of movement to operate switches, [147] move a cursor in one [148] or two dimensions, [149] control prosthesis grasp, [150] and guide wheelchairs left or right [151] in a prompted manner. EEG detects such potentials to play Pac Man, [152] and imagining the spinning of cubes or the raising of an arm in the appropriate direction guides robots through simulated rooms, [153] [154] [155] both achieved without response prompting. Unprompted slow cortical potentials can also turn on computer programs. [156] Signals from implanted brain electrodes in monkeys achieve even more complex grasping and reaching robot arm control without body arm movement. [157] Some ability to recognize evoked responses to numbers [158] and tones [159] in real time by a commercial system called BrainScope has limited report.
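Several of the imagined-movement interfaces cited rely on sensorimotor rhythm changes: imagining a hand movement typically suppresses the mu rhythm (roughly 8-12 Hz) over the contralateral sensorimotor cortex. A hedged sketch of turning that band power contrast into a left/right command follows; the decision rule, channel labels, and synthetic data are illustrative assumptions, not any cited system’s algorithm.

```python
import numpy as np
from scipy.signal import welch

def mu_power(x, fs=256, band=(8.0, 12.0)):
    """Band power in the mu range from a Welch power spectrum."""
    f, pxx = welch(x, fs=fs, nperseg=fs)
    return pxx[(f >= band[0]) & (f <= band[1])].sum()

def decode_direction(c3, c4, fs=256):
    """Imagined right-hand movement suppresses mu over the left cortex (C3), and
    vice versa; compare the two band powers to issue a left/right command."""
    return "right" if mu_power(c3, fs) < mu_power(c4, fs) else "left"

# Illustrative 2-second windows: mu suppressed at C3 -> imagined right-hand movement.
rng = np.random.default_rng(6)
fs = 256
t = np.arange(2 * fs) / fs
c3 = 0.3 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
c4 = 1.2 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
print(decode_direction(c3, c4, fs))   # expected: "right"
```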
REMOTE AND PROXIMATE BRAIN WAVE CAPTURE METHODS
EEG is typically recorded with contact electrodes and conductive paste, while MEG detectors are arrayed slightly removed from the head. Remote detection of brain rhythms by electrical impedance sensors is described. [160] Though for EEG this sensor is only non-contact rather than truly remote, the same detector design has been applied to monitoring the electrocardiogram from a wrist sensor location. [161] Passive brain wave fields extend as far as 12 feet from man, as detected by a cryogenic antenna. [162] This device is entirely adaptable to clandestine applications, and pointed comments have been made on the disappearance of physiological remote sensing literature for animals and humans since the 1970’s, while all other categories of remote sensing research greatly expanded. [163]
In 1976, the Malech Patent # 3951134 “Apparatus and method for remotely monitoring and altering brain waves” was granted. [164] Example operation is at 100 and 210 MHz, frequencies that penetrate obstructions. [165] Quotes indicating remote capacity include: “The individual components of the system for monitoring and controlling brain wave activity may be of conventional type commonly employed in radar” and “The system permits medical diagnosis of patients, inaccessible to physicians, from remote stations.” The patent was licensed to Dorne & Margolin Inc., but protection has now expired and the patent is in the public domain. The Malech patent utilizes interference of 210 and 100 MHz frequencies resulting in a 110 MHz return signal, which is demodulated to give the EEG waveform.
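The 210/100 MHz scheme rests on ordinary frequency mixing: when two tones combine in a nonlinear element, components appear at their sum and difference (here 310 and 110 MHz). The scaled-down sketch below only demonstrates that mixing arithmetic on synthetic sinusoids; it says nothing about whether tissue actually returns such a signal, and the frequencies and sample rate are reduced for illustration.

```python
import numpy as np

# Scaled-down illustration: two tones at "210" and "100" (arbitrary units) multiplied
# together, as in a nonlinear mixer, yield sum (310) and difference (110) components.
fs, dur = 2000, 2.0
t = np.arange(int(fs * dur)) / fs
mixed = np.cos(2 * np.pi * 210 * t) * np.cos(2 * np.pi * 100 * t)

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(mixed.size, 1 / fs)
peaks = freqs[np.argsort(spectrum)[-2:]]       # the two strongest spectral peaks
print(np.sort(np.round(peaks, 1)))             # approximately [110. 310.]
```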
The capability of remote EEG is predicted by electromagnetic scattering theory using ultrashort pulses, [166] which differs from the unpulsed Malech patent. Ultrashort pulses are currently defined in the range of 10^-12 to 10^-15 second. Considering that EEG word elicited potentials are comparatively long (hundreds of milliseconds), remote radar brain wave capture appears adequate for word recognition, with ultrashort pulses allowing some 10^9 or more radar reflections in a millisecond (10^-3 sec.).
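The pulse-count figure is simply the epoch duration divided by the pulse interval; the small sketch below makes that arithmetic explicit. It assumes back-to-back pulses (repetition interval equal to pulse width), so it is an upper bound rather than a realistic radar repetition rate.

```python
# Upper-bound arithmetic from the text: a millisecond epoch divided by a picosecond
# pulse width gives the claimed ~1e9 pulses, assuming back-to-back pulses.
epoch = 1e-3          # seconds, order of an evoked-potential component
pulse_width = 1e-12   # seconds, upper end of the "ultrashort" range cited
print(f"pulses per epoch (upper bound): {epoch / pulse_width:.0e}")   # 1e+09
```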
THOUGHT READING COVERT DEVELOPMENT EVIDENCE
The research arm of agencies with missions to covertly acquire information would certainly develop to operational capability any thought reading potential, which was reported feasible 30 years ago to the Defense Advanced Research Projects Agency (DARPA). Reports that such development has progressed are multiple, and two are confirmed by details of the 1975 DARPA EEG specific word recognition report, which itself is evidence of development covert to open databases. 2 An International Committee of the Red Cross Symposium synopsis notes EEG computer mind reading development by Lawrence Pinneo in 1974 at Stanford. [167] A letter by the Department of Defense Assistant General Counsel for Manpower, Health, and Public Affairs, Robert L. Gilliat, affirmed brain wave reading by the Advanced Research Projects Agency in 1976, [168] the same year as the Malech remote EEG patent grant. Such a capacity would be unlikely to be neglected by DARPA in the 22 years between these confirmations and the Pinneo report.
In fact, news reports assert such development. Articles quote Dr. John Norseen of Lockheed Martin Aeronautics that thought reading is possible and has had development. [169] [170] He predicted by 2005 the deployment of thought reading detectors for profiling terrorists at airports. 170 A further acknowledgement of developing a device to read terrorists’ minds at airports was made in a NASA presentation to Northwest Airlines security specialists. [171] Statements in all articles indicate some remoteness of brain wave detection, though at relatively close range.
“Thought reading or synthetic telepathy” communications technology procurement is considered in a 1993 Jane’s [g] Special Operations Forces (SOF) article: “One day, SOF commandos may be capable of communicating through thought processes.” [172] Descriptive terms are “mental weaponry and psychic warfare.” Although contemplated in a future context, implied is the availability of a technology with limited mobility, since anticipation of troop deployment must assume prior development. Victim complaints that mind reading is part of an assault upon them closely parallel such a capacity. Other complaints by these victims, such as technologic internal voice assault, are upheld by considerable documentation that internal voice transmission is feasible, even at a distance and within structures, 165 and a presumptive diagnosis of such complaints is largely consistent with microwave exposure [173]--a basis for both internal voice and EEG capture technologies.
DISCUSSION
There is considerable confirmation of an ability to recognize specific concepts by brain activity across subjects. Identifying visual images viewed by a subject solely by measures of mental activity is replicated across five groups by two methods, with best recognition rates of 100%. Three groups report success in identifying visually viewed words by brain waves by two methods, with best recognition rates of 75%. Isolated groups report EEG word recognition on auditory perception and prior to vocalization, with best results of 100% for auditory perception and 35% for vocalization. Although single reports examine lesser vocabularies, over all open studies of thought recognition some 80 words have been examined. In all, seven groups have reported some level of specific concept recognition by EEG, MEG, or fMRI. Word category distinctions would be expected to follow from such individual word differences. EEG, MEG, PET, or fMRI techniques discriminate some 42 word class or dimension distinctions, many of which would survive separate direct comparison just by reported results.
The finding that words can be classified by superposition of sine waves suggests an obvious interpretation, when considering the word category blood flow activations of cell assemblies. 7 The frequencies resulting from neuron firing rates in distributed, yet somewhat discrete, regions, when phase summed and subtracted by interference on arrival from different locations, result in word representation in the brain’s language.
Considerable capacity to specifically detect and differentiate mental states is evident from the EEG literature. The fact that EEG signals are detected on a voluntary unprompted basis for turning on computer programs, 156 playing Pac Man, 152 and robot guidance 153 154 155 suggests the feasibility of a similar capacity for specific EEG concept recognition. Although most concept recognition work relates to stimulus prompted responses, unprompted detection of numbers, apparently as a class, has limited report. 158 The references to remote EEG provide plausibly exploitable mechanisms, for which covert development has some indication. Making the more proximate electromagnetic detectors (MEG, cryogenic antenna, or electrical impedance sensor) the focus of a parabolic antenna would be obvious to remote brain wave detection engineers to extend range and provide directionality, and is a simple, common design innovation.
The plausibility of thought reading has not completely escaped scientific attention, as a French government panel expresses concern about the potential for thought reading, including such a remote capacity. [174] Complete rejection of reports of a remote mind reading capability is just as presumptuous, in the face of complaints, as has been the dismissal of internal voice capacity. 165 News reports of covert thought reading development have confirmation in the Pinneo study, and independent assertions of more proximate thought reading development “against terrorists” affirm each other. Special operations officials consider procurement of a remote capacity similar to that of which many victims complain. Though victims will regard their experience as affirming such a thought reading capability, professional prejudice regards such complaints as defining a psychiatric condition. The certain fact is that these claims have had no adequate investigation, and the available evidence questions the routinely egregious denial of civil rights to such individuals. Mind reading development must at least be considered plausible, even regarding very remote methods.
It is known that government elements have done work in thought reading development. The logic that, in the 30 years since the Pinneo work started, this capacity has been operationally applied is too sound to dismiss victim corroboration and other evidence without appropriate investigation. It must be admitted that funding for projects by the defense and security agencies is considerably greater than for open science, and that thought reading would be a priority area. Particularly disturbing is the existence of a remote EEG method in the public domain. Educated democracies should not be complacent at any prospect of mind reading, given the potential for privacy loss, civil rights violation, and political control.
Acknowledgements: Thanks are given to God for inspiration and guidance as well as Mr. John Allman, Secretary of Christians Against Mental Slavery for invaluable materials and support (website http://www.slavery.org.uk/ ).
EEG concept recognition articles are printable through PubMed as designated.
All patents are printable from the U. S. Patent Office website.
Each is free of charge.
REFERENCES
[a] This article has been partly supported by substantial financial contributions from Christians Against Mental Slavery http://www.slavery.org.uk
[b] Pinneo’s report does not include all experiments reported to the Defense Advanced Research Projects Agency in the six annual reports over the 3 year contract.
[c] Over the experiments presented by the report, chance would be from 6.5 to 14% depending on the size of tested vocabulary.
[d] Suppes points out that this may have been due to increased averaging per se.
[e] Though apparently only single electrodes or pairs were utilized for prediction, the best recognition rates were not always from the same electrode or pair.
[f] Almost half of the Pinneo report is devoted to resolving such confounds.
[g] Jane’s is the most respected and authoritative of defense reporting services.
[1] The Bible Job 42: 2, Psalms 139: 2, 94: 11, I Chronicles 28: 9, Isaiah 66: 18.
[2] Pinneo LR and Hall DJ. “Feasibility Study for Design of a Biocybernetic Communication System” Report #ADA017405 National Technical Information Service (NTIS), 1975. Prepared for the Advanced Research Projects Agency Order #2034, Program Code #2D20, Contractor: Stanford Research Institute Contract dates: 2/9/72-8/31/76, SRI Project LSU-1936. (US cost ~$50.) Available at http://www.slavery.org.uk/Pinneo.doc and http://www.sysos.co.uk/Pinneo.doc
[3] Suppes P, Lu Z, and Han B. “Brain wave recognition of words” Proc Natl Acad Sci 94: 14965-69, 1997. Printable free online thru Pubmed or at http://www.pnas.org/cgi/content/full/94/26/14965
[4] Suppes P, Han B, and Lu Z. “Brain-wave recognition of sentences” Proc Natl Acad Sci 95: 15861-66, 1998. Printable free online thru Pubmed or at http://www.pnas.org/cgi/content/full/95/26/15861
[5] Suppes P, Han B, Epelboim J, and Lu Z. “Invariance of brain-wave representations of simple visual images and their names” Proc Natl Acad Sci 96: 14658-63, 1999. Printable free online thru Pubmed or at http://www.pnas.org/cgi/content/full/96/25/14658
[6] Suppes P, Han B, Epelboim J, and Lu ZL. “Invariance between subjects of brain wave representations of language” Proc Natl Acad Sci 96(22): 12953-8, 1999. Printable free online thru PubMed or at http://www.pnas.org/cgi/content/full/96/22/12953
[7] Suppes P and Han B. “Brain-wave representation of words by superposition of a few sine waves” Proc Natl Acad Sci 97: 8738-43, 2000. Printable free online thru Pubmed or at http://www.pnas.org/cgi/content/full/97/15/8738
[8] Kim M-J, Shin S-C, Song Y, and Ryu CS. “Yes/No Discrimination With Spatio-Temporal Characteristics of EEG” 23rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society Oct 25-28, Istanbul Turkey. Obtained from the Storming Media Pentagon collection of technical papers at a cost of ~$10. Abstract at http://www.stormingmedia.us/30/3012/A301214.html?searchTerms=~Kim~Shin~Song~Ryu The paper was apparently collected from the conference with the Koreans not under Pentagon contract. Entire conference is #ADM001351 on cd-rom.
[9] Assadullahi R and Pulvermuller F. “Neural Network Classification of Word Evoked Neuromagnetic Brain Activity” In: Wermter S, Austin J, and Willshaw D (eds.) Lecture Notes in Artificial Intelligence: Emergent Neurocomputational Architectures Based on Neuroscience Heidelberg Springer, p 311-20, 2001. More limited preliminary communication at http://www.his.sunderland.ac.uk/durhamab/ramin.doc
[10] Selden G. “Machines That Read Minds” Sci Digest Oct 89: 60-6, 1981. Also at http://www.datafilter.com/mc/machinesThatReadMinds.html
[11] Kiyuna T, Tanigawa T, and Yamazaki T. Patent #5785653 “System and method for predicting internal condition of live body” USPTO granted 7/28/98.
[12] Mardirossian A. Patent #6011991 “Communication system and method including brain wave analysis and/or use of brain activity” USPTO granted 1/4/00.
[13] Haxby JV, Gobbini MI, Furey ML, Ishai A, Schouten JL, and Pietrini P. “Distributed and Overlapping Representations of Faces and Objects in Ventral Temporal Cortex” Science 293(5529): 2425-30, 2001.
[14] Spiridon M, and Kanwisher N. “How Distributed is Visual Category Information in Human Occipito-Temporal Cortex? An fMRI Study” Neuron 35: 1157-1165, 2002.
[15] Cox DD, and Savoy RL. “Functional magnetic resonance imaging (fMRI) “brain reading”: detecting and classifying distributed patterns of fMRI activity in human visual cortex” Neuroimage 19: 261-70, 2003.
[16] Vandenberghe R, Price C, Wise R, Josephs O, and Frackowiak RSJ. “Functional anatomy of a common semantic system for words and pictures” Nature 383: 354-6, 1996.
[17] Chao LL, Haxby JV, and Martin A. “Attribute-based neural substrates in temporal cortex for perceiving and knowing about objects” Nature Neurosci 2(10): 913-9, 1999.
[18] Moore CJ and Price CJ. “Three Distinct Ventral Occipitotemporal Regions for Reading and Object Naming” NeuroImage 10: 181-92, 1999.
[19] Le Clec’H G, Dehaene S, Cohen L, Mehler E, Dupoux E, Poline JB, Lehericy S, van de Moortele PF, and Le Bihan D. “Distinct Cortical Areas for Names of Numbers and Body Parts Independent of Language and Input modality” NeuroImage 12: 381-91, 2000.
[20] Chee MWL, O’Craven KM, Bergida R, Rosen BR, and Savoy RL. “Auditory and Visual Word Processing Studied With fMRI” Hum Brain Mapp 7: 15-28, 1999.
[21] Isselbacher KJ, Adams RD, Braunwald E, Petersdorf RG, and Wilson JD (eds.) Harrison’s Principles of Internal Medicine 9th edition, McGraw-Hill, p 141-2, 1980.
[22] Warrington EK and Shallice T. “Category specific semantic impairments” Brain 107(Pt 3): 829-54, 1984.
[23] Damasio H, Grabowski TJ, Tranel D, Hichwa RD, and Damasio AR. “A neural basis for lexical retrieval” Nature 380: 499-505, 1996.
[24] De Renzi E. “Disorders of Visual Recognition” Semin Neurol 20(4): 479-85, 2000.
[25] Pulvermuller F. “Words in the brain’s language” Behav Brain Sci 22: 253-336, 1999.
[26] Khader P and Rosler F. “EEG power and coherence analysis of visually presented nouns and verbs reveals left frontal processing differences” Neurosci Lett 354: 111-14, 2004.
[27] Preissl H, Pulvermuller F, Lutzenberger W, and Birbaumer N. “Evoked potentials distinguish between nouns and verbs” Neurosci Lett 197: 81-3, 1995.
[28] Kellenbach ML, Wijers AA, Hovius M, Mulder J, and Mulder G. “Neural Differentiation of Lexico-Syntactic Categories or Semantic Features? Event-Related Potential Evidence for Both” J Cog Neurosci 14(4): 561-77, 2002.
[29] Khader P, Scherag A, Streb J, and Rosler F. “Differences between noun and verb processing in minimal phrase context: a semantic priming study using event-related brain potentials” Cogn Brain Res 17: 293-313, 2003.
[30] Federmeier KD, Segal JB, Lombrozo T, and Kutas M. “Brain responses to nouns, verbs and class-ambiguous words in context” Brain 123(12): 2552-66, 2000. Also at http://www.ncbi.nlm.nih.gov/entrez/utils/fref.cgi?http://brain.oupjournals.org/cgi/pmidlookup?view=full&pmid=11099456
[31] Brown WS, Lehmann D, and Marsh JT. “Linguistic Meaning Related Differences in Evoked Potential Topography: English, Swiss-German, and Imagined” Brain Lang 11: 340-53, 1980.
[32] Pulvermuller F, Mohr B, and Schleichert H. “Semantic or lexico-syntactic factors: what determines word-class specific activity in the human brain?” Neurosci Lett 275: 81-4, 1999.
[33] Pulvermuller F, Lutzenberger W, and Preissl H. “Nouns and Verbs in the Intact Brain: Evidence from Event-related Potentials and High-frequency Cortical Responses” Cerebral Cortex 9(5): 497-506, 1999. Also at http://www.ncbi.nlm.nih.gov/entrez/utils/fref.cgi?http://cercor.oupjournals.org/cgi/pmidlookup?view=full&pmid=10450894
[34] Pulvermuller F, Preissl H, Lutzenberger W, and Birbaumer N. “Brain Rhythms of Language: Nouns Versus Verbs” Eur J Neurosci 8: 917-41, 1996.
[35] Pulvermuller F, Harle M, and Hummel F. “Walking or Talking? Behavioral and Neurophysiological Correlates of Action Verb Processing” Brain Lang 78: 143-68, 2001.
[36] Pulvermuller F, Harle M, and Hummel F. “Neurophysiological distinction of verb categories” Cog Neurosci 11(12): 2789-93, 2000.
[37] Lavric A, Pizzagalli D, Forstmeier S, and Rippon G. “A double dissociation of English past-tense production revealed by event-related potentials and low-resolution electromagnetic tomography (LORETA)” Clin Neurophysiol 112: 1833-1849, 2001.
[38] Dhond RP, Marinkovic K, Dale AM, Wotzel T, and Halgren E. “Spatiotemporal maps of past-tense verb inflection” Neuroimage 19: 91-100, 2003.
[39] Weyerts H, Munte TF, Smid HGOM, and Heinze H-J. “Mental representations of morphologically complex words: an event-related potential study with adult humans” Neurosci Lett 206: 125-8, 1996.
[40] Munte TF, Say T, Clahsen H, Schlitz K, and Kutas M. “Decomposition of morphologically complex words in English: evidence from event-related potentials” Cogn Brain Res 7: 241-53, 1999.
[41] Weyerts H, Penke M, Dohrn U, Clahsen H, and Munte TF. “Brain potentials indicate differences between regular and irregular German plurals” Neuroreport 8(4): 957-62, 1997.
[42] Penke M, Weyerts H, Gross M, Zander E, Munte TF, and Clahsen H. “How the brain processes complex words: an event-related potential study of German verb inflection” Cogn Brain Res 6: 37-52, 1997.
[43] Gross M, Say T, Kleingers M, Clahsen H, and Munte TF. “Human brain potentials to violations in morphologically complex Italian words” Neurosci Lett 241: 83-6, 1998.
[44] Kounios J and Holcomb PJ. “Concreteness Effects in Semantic Processing: ERP Evidence Supporting Dual-Coding Theory” J Exp Psychol 20(4): 804-23, 1994.
[45] West CW and Holcomb PJ. “Imaginal, Semantic, and Surface-Level Processing of Concrete and Abstract Words: An Electrophysiological Investigation” J Cogn Neurosci 12: 1024-37, 2000.
[46] Weiss S and Rappelsberger P. “EEG coherence within the 13-18 Hz band as a correlate of a distinct lexical organization of concrete and abstract nouns in humans” Neurosci Lett 209: 17-20, 1996.
[47] Schack B, Weiss S, and Rappelsberger P. “Cerebral Information Transfer During Word Processing: Where and When Does It Occur and How Fast is it?” Hum Brain Mapp 19: 18-36, 2003.
[48] Weiss S and Rappelsberger P. “Left Frontal EEG Coherence Reflects Modality Independent Language Processes” Brain Topogr 11(1): 33-42, 1998.
[49] Weiss S, Muller HM, and Rappelsberger P. “Theta synchronization predicts efficient memory encoding of concrete and abstract nouns” NeuroReport 11(11): 2357-61, 2000.
[50] Neville HJ, Mills D, and Lawson DS. “Fractionating Language: Different Neural Subsystems with Different Sensitive Periods” Cerebral Cortex 2: 244-58, 1992.
[51] Munte TF, Wieringa BM, Weyerts H, Szentkuti A, Matzke M, and Johannes S. “Differences in brain potentials to open and closed class words: class and frequency effects” Neuropsychologia 39: 91-102, 2001.
[52] Van Petten C and Kutas M. “Influences of semantic and syntactic contex on open- and closed-class words” Mem Cogn 19: 95-112, 1991.
[53] Pulvermuller F, Lutzenberger W, and Birbaumer N. “Electrocortical distinction of vocabulary types” Electroenceph Clin Neurophysiol 94: 357-70, 1995.
[54] Mohr B, Pulvermuller F, and Zaidel E. “Lexical Decision After Left, Right, and Bilateral Presentation of Function Words, Content Words, and Non-Words: Evidence For Interhemispheric Interaction” Neuropsychologia 32(1): 105-24, 1994.
[55] Pulvermuller F, Assadollahi R, and Elbert T. “Neuromagnetic evidence for early semantic access in word recognition” J Neurosci 13: 201-5, 2001.
[56] Muller HM and Kutas M. “What’s in a name? Electrophysiological differences between spoken nouns, proper names and one’s own name” Neuroreport 8: 221-5, 1996.
[57] Dehaene S. “Electrophysiological evidence for category-specific word processing” Neuroreport 6: 2153-7, 1995.
[58] Antal A, Keri S, Kovacs G, Janka Z, and Benedek G. “Early and Late components of visual categorization: an event-related potential study” Cogn Brain Res 9: 117-19, 2000.
[59] Antal A, Keri S, Kovacs G, Liszli P, Janka Z, and Benedek G. “Event-related potentials from a visual categorization task” Brain Res Protocols 7: 131-6, 2001.
[60] Ji J, Porjesz B, and Begleiter H. “ERP components in category matching tasks” Electroencephalogr Clin Neurophysiol 108: 380-9, 1998.
[61] Thorpe S, Fize D, and Marlot C. “Speed of processing in the human visual system” Nature 381: 520-2, 1996.
[62] Kiefer M. “Perceptual and semantic sources of category-specific effects: Event-related potentials during picture and word categorization” Mem Cogn 29(1): 100-116, 2001.
[63] Skrandies W and Chiu MJ. “Dimensions of affective meaning – behavioral evoked potential correlates in Chinese subjects” Neurosci Lett 341: 45-8, 2003.
[64] Skrandies W. “Evoked potential correlates of semantic meaning—A brain mapping study” Cog Brain Res 6: 175-183, 1998.
[65] Schapkin SA, Gusev AN, and Kuhl J. “Categorization of unilaterally presented emotional words: an ERP analysis” Acta Neurobiol Exp 60: 17-28, 2000.
[66] Dietrich DE, Waller C, Johannes S, Wieringa BM, Emrich HM, and Munte TF. “Differential Effects of Emotional Content on Event-Related Potentials in Word Recognition Memory” Neuropsychobiol 43: 96-101, 2001.
[67] Krause CM, Korpilahti P, Porn B, Joskim J, and Lang HA. “Automatic auditory word perception as measured by 40 Hz EEG responses” Electroencephal Clin Neurophysiol 107: 84-7, 1998.
[68] Diesch E, Biermann S, and Luce T. “The magnetic mismatch field elicited by words and phonological non-words” Neuroreport 9(3): 455-60, 1998.
[69] Lutzenberger W, Pulvermuller F, and Birbaumer N. “Words and pseudowords elicit distinct patterns of 30-Hz EEG responses” Neurosci Lett 176: 115-18, 1994.
[70] Kiefer M. “Perceptual and semantic sources of category-specific effects: Event-related potentials during picture and word categorization” Mem Cog 29(1): 100-16, 2001.
[71] Assadollahi R and Pulvermuller F. “Neuromagnetic evidence for early access to cognitive representations” Cog Neurosci Neurophysiol 12(2): 207-13, 2001.
[72] Steinhauer K. “Electrophysiological correlates of prosody and punctuation” Brain Lang 86: 142-164, 2003.
[73] Lange JJ, Wijers AA, Mulder LJM, and Mulder G. “Color selection and location selection in ERPs: differences, similarities and ‘neural specificity’” Biol Psychology 48: 53-82, 1998.
[74] Mouchetant-Rostaing Y, Giard M-H, Bentin S, Aguera P-E, and Pernier J. “Neurophysiological correlates of face gender processing in humans” Eur J Neurosci 12: 303-12, 2000.
[75] Aguirre GK, Zarahn E, and D’Esposito M. “An Area within the Human Ventral Cortex Sensitive to “Building” Stimuli: Evidence and Implications” Neuron 21: 373-83, 1998.
[76] Epstein R, Harris A, Stanley D, and Kanwisher N. “The Parahippocampal Place Area: Recognition, Navigation, or Encoding?” Neuron 23: 115-25, 1999.
[77] Kanwisher N, McDermott J, and Chun MM. “The Fusiform Face Area: A Module in Human Extrastriate Cortex Specialized for Face Perception” J Neurosci 17(11): 4302-11, 1997.
[78] Gauthier I, Scudlarski P, Gore JC, and Anderson AW. “Expertise for cars and birds recruits brain areas involved in face recognition” Nature Neurosci 3(2): 191-7, 2000.
[79] Ishai A, Ungerleider LG, Martin A, Schouten JL, and Haxby JV. “Distributed representations of objects in the human ventral visual pathway” Proc Natl Acad Sci 96(16): 9379-84, 1999. Also at http://www.ncbi.nlm.nih.gov/entrez/utils/fref.cgi?http://www.pnas.org/cgi/pmidlookup?view=full&pmid=10430951
[80] Martin A. “Functional Neuroimaging of Semantic Memory” In: Cabeza R and Kingstone A (eds.) Handbook of Functional Neuroimaging of Cognition MIT Press, Cambridge, Mass p 153-86, 2001.
[81] Martin A and Chao LL. “Semantic memory and the brain: structure and processes” Curr Opinion Neurobiol 11(2): 194-201, 2001.
[82] Small SL, Hart J, Nguyen T, and Gordon H. “Distributed representations of semantic knowledge in the brain” Brain 118: 441-53, 1995.
[83] Joseph JE. “Functional Neuroimaging Studies of Category Specificity in Object Recognition: A critical review and meta-analysis” Cog Affect Behav Neurosci 1(2): 119-36, 2001.
[84] Warburton E, Wise RJS, Price CJ, Weiller C, Hadar U, Ramsay S, and Frackowiak RSJ. “Noun and verb retrieval by normal subjects. Studies with PET” Brain 119(Pt 1): 159-79, 1996.
[85] Perani D, Cappa SF, Schnur T, Tettamanti M, Collina S, Rosa MM, and Fazio F. “The neural correlates of verb and noun processing: a PET study” Brain 122(12): 2237-44, 1999. Also at http://brain.oupjournals.org/cgi/content/full/122/12/2337
[86] Tyler LK, Russell R, Fadili J, and Moss HE. “The neural representation of nouns and verbs: PET studies” Brain 124(8): 1619-34, 2001.
[87] Beretta A, Campbell C, Carr TH, Huang J, Schmitt LM, Cristianson K, and Cao Y. “An ER-fMRI investigation of morphological inflection in German reveals that the brain makes a distinction between regular and irregular forms” Brain Lang 85: 67-92, 2003.
[88] Jaeger JJ, Lockwood AH, Kemmerer DL, Van Valin RD, Murphy BW, and Khalak HG. “A Positron Emission Tomographic Study of Regular and Irregular Verb Morphology in English” Language 42(3): 451-97, 1996.
[89] Damasio H, Grabowski TJ, Tranel D, Ponto LLB, Hichwa RD, and Damasio AR. “Neural Correlates of Naming Actions and of Naming Spatial Relations” NeuroImage 13: 1053- 64, 2001.
[90] Moscovitch M, Kapur S, Kohler S, and Houle S. “Distinct neural correlates of visual long-term memory for spatial location and object identity: a positron emission tomography study in humans” Proc Natl Acad Sci 92: 3721-5, 1995. Also at http://www.pnas.org/cgi/reprint/92/9/3721.pdf
[91] Emmorey K, Damasio H, McCullough S, Grabowski T, Ponto LLB, Hichwa RD, and Bellugi U. “Neural Systems Underlying Spatial Language in American Sign Language” Neuroimage 17: 812-24, 2002.
[92] Mummery CJ, Patterson K, Hodges JR, and Price CJ. “Functional Neuroanatomy of the Semantic System: Divisible by What?” J Cogn Neurosci 10(6): 766-77, 1998.
[93] Martin A, Haxby JV, Lalonde FM, Wiggs CL, and Ungerleider LG. “Discrete Cortical Regions Associated with Knowledge of Color and Knowledge of Action” Science 270: 102-5, 1995.
[94] Chao LL and Martin A. “Cortical Regions Associated with Perceiving, Naming, and Knowing about Colors” J Cogn Neurosci 11(1): 25-35, 1999.
[95] Kounios J, Koenig P, Glosser G, DeVita C, Dennis K, Moore P, and Grossman M. “Category-specific medial temporal lobe activation and the consolidation of semantic memory: evidence from fMRI” Cogn Brain Res 17: 484-94, 2003.
[96] Beauregard M, Chertkow H, Bub D, Murtha S, Dixon R, and Evans A. “The Neural Substrate for Concrete, Abstract, and Emotional Word Lexica: A Positron Emission Tomographic Study” J Cogn Neurosci 9(4): 441-61, 1997.
[97] Fiebach CJ and Friederici AD. “Processing concrete words: fMRI evidence against a specific right-hemisphere involvement” Neuropsychologia 42: 62-70, 2003.
[98] Kiehl KA, Liddle PF, Smith AM, Mendrek A, Forster BB, and Hare RD. “Neural Pathways Involved in the Processing of Concrete and Abstract Words” Hum Brain Mapp 7: 225-33, 1999.
[99] Mellet E, Tzourio N, Denis M, and Mazoyer B. “Cortical Anatomy of mental imagery of concrete nouns based on their dictionary definition” NeuroReport 9: 803-8, 1998.
[100] Grossman M, Koenig P, DeVita C, Glosser G, Alsop D, Detre J, and Gee J. “The Neural Basis for Category Specific Knowledge: An fMRI Study” NeuroImage 15: 936-48, 2002.
[101] Jessen F, Heun R, Erb M, Granath D-O, Klose U, Papassotiropoulos A, and Grodd W. “The Concreteness Effect: Evidence for Dual Coding and Context Availability” Brain Lang 74: 103-112, 2000.
[102] Grossman M, Koenig P, DeVita C, Glosser G, Alsop D, Debre J, and Gee J. “Neural Representation of Verb Meaning: An fMRI Study” Hum Brain Mapp 15: 124-34, 2002.
[103] Damasio H, Grabowski TJ, Tranel D, Hichwa RD, and Damasio AR. “A neural basis for lexical retrieval” Nature 380: 499-505, 1996.
[104] Grabowski TJ, Damasio H, and Damasio AR. “Premotor and Prefrontal Correlates of Category-Related Lexical Retrieval” NeuroImage 7: 232-43, 1998.
[105] Gorno-Tempini ML, Price CJ, Josephs R, Vandenberghe R, Cappa SF, Kapur N, and Frackowiak RSJ. “The neural systems sustaining face and proper name processing” Brain 121: 2103-18, 1998. Also at http://brain.oupjournals.org/cgi/reprint/121/11/2103.pdf
[106] Gorno-Tempini ML and Price CJ. “Identification of famous faces and buildings: A functional neuroimaging study of semantically unique items” Brain 124: 2087-97, 2001. Also at http://brain.oupjournals.org/cgi/content/full/124/10/2087
[107] Ishai A, Ungerleider LG, Martin A, and Haxby JV. “The Representation of Objects in the Human Occipital and Temporal Cortex” J Cogn Neurosci 12(Supp 3): 35-52, 2000.
[108] Ishai A, Ungerleider LG, and Haxby JV. “Distributed Neural Systems for the Generation of Visual Images” Neuron 28: 979-90, 2000.
[109] Ishai A, Haxby JV, and Ungerleider LG. “Visual Imagery of Famous Faces: Effects of Memory and Attention Revealed by fMRI” NeuroImage 17: 1729-41, 2002.
[110] Spitzer M, Kwong KK, Kennedy W, Rosen BR, and Belliveau JW. “Category-specific brain activation in fMRI during picture naming” NeuroReport 6: 2109-12, 1995.
[111] Mitchell TM, Hutchinson R, Just MA, Niculescu RS, Pereira F, and Wang X. “Classifying Instantaneous Cognitive States from fMRI Data” Am Med Informatics Assoc, November, 2003. Also at http://www-2.cs.cmu.edu/~tom/amia2003-final.pdf
[112] Joseph JE, Gathers AD, and Piper GA. “Shared and dissociated cortical regions for object and letter processing” Cogn Brain Res 17: 56-67, 2003.
[113] Polk TA and Farah MJ. “The neural development and organization of letter recognition: Evidence from functional neuroimaging, computational modeling, and behavioral studies” Proc Natl Acad Sci 95(3): 847-52, 1998. Also at http://www.pnas.org/cgi/content/full/95/3/847
[114] Polk TA, Stallcup M, Aguirre GK, Alsop DC, D’Esposito M, Detre JA, and Farah MJ. “Neural Specialization for Letter Recognition” J Cogn Neurosci 14(2): 145-59, 2002.
[115] Dehaene S, Spelke E, Pinel P, Stanescu R, and Tsivkin S. “Sources of Mathematical Thinking: Behavioral and Brain-Imaging Evidence” Science 284(5416): 970-4, 1999.
[116] Simon O, Mangin JF, Cohen L, Le Bihan D, and Dehaene S. “Topographical Layout of Hand, Eye, Calculation, and Language-Related Areas in the Human Parietal Lobe” Neuron 33: 475-87, 2002.
[117] Chochon F, Cohen L, van de Moortele PF, and Dehaene S. “Differential Contributions of the Left and Right Inferior Parietal Lobules to Number Processing” J Cogn Neurosci 11: 617-30, 1999.
[118] Davidson RJ. “Affective Style and Affective Disorders: Perspectives from Affective Neuroscience” Cogn Emotion 12(3): 307-30, 1998.
[119] Davidson RJ. “Anterior electrophysiological asymmetries, emotion, and depression: Conceptual and methodological conundrums” Psychophysiol 35: 607-14, 1998.
[120] Papousek B and Schulter G. “Covariations of EEG asymmetries and emotional states indicate that activity at frontopolar locations is particularly affected by state factors” Psychophysiol 39: 350-60, 2002.
[121] Tomarken AJ and Keener AD. “Frontal Brain Asymmetry and Depression: A Self-regulatory Perspective” Cogn Emotion 12(3): 387-420, 1998.
[122] Heller W and Nitschke JB. “The Puzzle of Regional Brain Activity in Depression and Anxiety: The Importance of Subtypes and Comorbidity” Cogn Emotion 12(3): 421-47, 1998.
[123] Davidson RJ, Marshall JR, and Tomarken AJ. “While a Phobic Waits: Regional Brain Electrical and Autonomic Activity in Social Phobics during Anticipation of Public Speaking” Biol Psychiatry 47: 85-95, 2000.
[124] Patton RE. Patent #6292688 “Method and apparatus for analyzing neurological response to emotion-inducing stimuli” USPTO granted 9/18/01.
[125] Pizzagalli D, Koenig T, Regard M, and Lehmann D. “Faces and emotions: brain electric field sources during covert emotional processing” Neuropsychologia 36(4): 323-32, 1998.
[126] Eger E, Jedynak A, Iwaki T, and Skrandies W. “Rapid extraction of emotional expression: evidence from evoked potential fields during brief presentation of face stimuli” Neuropsychologia 41: 808-17, 2003.
[127] Bocker KBE, Baas JMP, Kenemans JL, and Verbaten MN. “Stimulus-preceding negativity induced by fear: a manifestation of affective anticipation” Int J Psychophysiol 43: 77-90, 2001.
[128] Kotani Y, Hiraku S, Suda K, and Aihara Y. “Effect of positive and negative emotion on stimulus-preceding negativity prior to feedback stimuli” Psychophysiol 38: 873-78, 2001.
[129] Kotani Y, Kishida S, Hiraku S, Suda K, Ishii M, and Aihara Y. “Effects of information and reward on stimulus-preceding negativity prior to feedback stimuli” Psychophysiol 40(5): 818-28, 2003.
[130] Baas JMP, Kenemans JL, Bocker KBE, and Verbaten MN. “Threat-induced cortical processing and startle potentiation” Neuroreport 13(1): 133-7, 2002.
[131] Farwell LA and Donchin E. “The Truth Will Out: Interrogative Polygraphy (“Lie Detection”) With Event-Related Brain Potentials” Psychophysiology 28(5): 531-47, 1991.
[132] Johnson MM and Rosenfeld JP. “Oddball-evoked P300-based method of deception detection in the laboratory II. Utilization of non-selective activation of relevant knowledge” Int J Psychophysiol 12: 289-306, 1992.
[133] Rosenfeld JP, Ellwanger J, and Sweet J. “Detecting simulated amnesia with event-related brain potentials” Int J Psychophysiol 19: 1-11, 1995.
[134] Allen JJB and Iacono WG. “A Comparison of methods for the analysis of event-related potentials in deception detection” Psychophysiology 34: 234-40, 1997.
[135] Lorenz J, Kunze K, and Bromm B. “Differentiation of conversive sensory loss and malingering by P300 in a modified oddball task” Neuroreport 9: 187-91, 1998.
[136] Tardif HP, Barry RJ, and Johnstone SJ. “Event-related potentials reveal processing differences in honest vs. malingered memory performance” Int J Psychophysiol 46: 147-58, 2002.
[137] Brain Fingerprinting Laboratories, Inc., 108 West Palm Drive, Fairfield, IA 52556 at http://www.brainwavescience.com/
[138] Farwell LA and Smith SS. “Using Brain MERMER Testing to Detect Knowledge Despite Efforts to Conceal” J Forensic Sci 46(1): 135-46, 2001.
[139] Farwell LA. “Two new twists on the truth detector: brain-wave detection of occupational information” Psychophysiology 29(4A): S3, 1992.
[140] Farwell LA. Patent #5363858 “Method and apparatus for multifaceted electroencephalographic response analysis” USPTO granted 11/15/94.
[141] Farwell LA and Conte FL. Patent #5406956 “Method and apparatus for truth detection” USPTO granted 4/18/95.
[142] Farwell LA and Conte FL. Patent #5467777 “Method for electroencephalographic information detection” USPTO granted 11/21/95.
[143] Crites SL, Cacioppo JT, Gardner WL, and Berntson GG. “Bioelectrical Echoes From Evaluative Categorization: II. A Late Positive Brain Potential That Varies as a Function of Attitude Registration Rather Than Attitude Report” J Person Soc Psychol 68(6): 997-1013, 1995.
[144] Chen R and Hallett M. “The Time Course of Changes in Motor Cortex Excitability Associated with Voluntary Movement” Can J Neurol Sci 26(3): 163-9, 1999.
[145] Deecke L. “Bereitschaftspotential as an indicator of movement preparation in supplementary motor area and motor cortex” Ciba Found Symp 182: 132-231, 1987.
[146] Cui RQ, Lang HW, and Deecke L. “Neuroimage of Voluntary Movement: Topography of the Bereitschaftspotential, a 64-Channel DC Current Source Density Study” NeuroImage 9: 124-34, 1999.
[147] Birch GE. “Initial On-Line Evaluations of the LF-ASD Brain-Computer Interface With Able Bodied and Spinal-Cord Subjects Using Imagined Voluntary Motor Potentials” IEEE Trans Neural Syst Rehabil Eng 10(4): 219-24, 2002.
[148] Penny WD, Roberts SJ, Curran EA, and Stokes MJ. “EEG-Based Communication: A Pattern Recognition Approach” IEEE Trans Rehab Eng 8(2): 214-15, 2000.
[149] Kostov A and Polak M. “Parallel Man-Machine Training in Development of EEG-Based Cursor Control” IEEE Trans Rehab Eng 8(2): 203-5, 2000.
[150] Guger C, Harkam W, Hertnaes C, and Pfurtscheller G. “Prosthetic Control by an EEG-based Brain-Computer Interface (BCI)” In: Bühler C and Knops H (eds.) Assistive Technology on the Threshold of the new Millennium 2003. Also at http://www.gtec.at/research/Publications/aaate.pdf
[151] Tanaka K, Matsunaga K, and Wang HO. “Electroencephalogram-based Control of an Electric Wheelchair” IEEE Transactions on Robotics 21(4): 762-66, 2005.
[152] Krepki R, Blankertz B, Curio G, and Muller K-R. “The Berlin Brain-Computer Interface (BBCI) towards a new communication channel for online control of multimedia applications and computer games” 9th International Congress on Distributed Multimedia Systems, 2003. At http://ida.first.fhg.de/publications/KreBlaCurMue03.pdf
[153] Millan JR. “Adaptive Brain Interfaces” Communications of the ACM 46(3): 74-80, 2003. Abstract at http://www.idiap.ch/publications/millan-2003-comm-acm.bib.abs.html
[154] Millan JR and Mouriño J. “Asynchronous BCI and Local Neural Classifiers: An Overview of the Adaptive Brain Interface Project” IEEE Transactions on Neural Systems and Rehabilitation Engineering (Brain-Computer Interface Technology) 11(2): 159-61, 2003. Article also at ftp://ftp.idiap.ch/pub/reports/2003/millan_2003_nsre.pdf
[155] Millan JR, Renkens F, Mouriño J, and Gerstner W. “Non-Invasive Brain-Actuated Control of a Mobile Robot” Proceedings of the 18th International Joint Conference on Artificial Intelligence, Aug 9-15, in press, 2003. Article also at ftp://ftp.idiap.ch/pub/reports/2003/millan_2003_ijcai.pdf
[156] Kaiser J, Perelmouter J, Iversen IH, Neumann N, Ghanayim N, Hinterberger T, Kubler A, Kotchoubey B, and Birbaumer N. “Self-initiation of EEG-based communication in paralysed patients” Clin Neurophysiol 112: 551-4, 2001.
[157] Carmena JM, Lebedev MA, Crist RE, O’Doherty JE, Santucci DM, Dimitrov DF, Patil PG, Henriquez CS, and Nicolelis MAL. “Learning to Control a Brain-Machine Interface for Reaching and Grasping by Primates” Public Library of Science, Biology 1(2), Oct 2003. At http://www.plosbiology.org/archive/1545-7885/1/2/pdf/10.1371_journal.pbio.0000042-L.pdf
[158] Roscher G, Pogrzeba G, Emde D, and Neubauer. “Application of a Multi-Processor System for Recognition of EEG-Activities in Amplitude, Time and Space in Real Time” In: D’Hollander EH, Joubert GR, Peters FJ, Trottenberg U (Eds.) Parallel Computing: Fundamentals, Applications and New Directions Elsevier Science B. V., p 89-96, 1998. Also at http://www.icsroscher.de/Parco.htm
[159] Roscher G, Herrmann WM, Henning K, Wendt D, Fechner S, Godenschweger F, Weiß C, Abel E, Rijhwani A, Martinez J, Karawas A, and Dahan N. “A System to Recognize, Estimate and Describe Single Events in the Spontaneous Electroencephalogram: Example for Single Sweep N1 and P2 Detection” Clinical Applications of Advanced EEG Data Processing Rome, May 8-9 p 47, 1995. At http://www.icsroscher.de/Rom.html
[160] Harland CJ, Clark TD, and Prance RJ. “Remote detection of human electroencephalograms using ultrahigh input impedance electric potential sensors” Appl Physics Lett 81(17): 3284-6, 2002. Abstract at http://content.aip.org/APPLAB/v81/i17/3284_1.html
[161] Harland CJ, Clark TD, and Prance RJ. “High resolution ambulatory electrocardiographic monitoring using wrist mounted electric potential sensors” Meas Sci and Technol 14: 923-8, 2003. Abstract at http://www.iop.org/EJ/abstract/0957-0233/14/7/305
[162] Taff BE and Stoller KP. Patent #4940058 “Cryogenic remote sensing physiograph” USPTO granted 7/10/90.
[163] Stoller KP and Taff BE. “Remote Physiological Sensing: Historical Perspective, Theories and Preliminary Developments” Med Instrum 20(5): 260-5, 1986.
[164] Malech RG. Patent #3951134 “Apparatus and method for remotely monitoring and altering brain waves” USPTO granted 4/20/76.
[165] McMurtrey JJ. “Inner Voice, Target Tracking, and Behavioral Influence Technologies” in press 2004. Accessed 8/11/04 at http://www.slavery.org.uk/InnerVoiceTargTrackBehavInflu.doc
[166] Department of the Air Force, USAF Scientific Advisory Board. “New World Vistas: Air and Space Power for the 21st Century” 14 vol. (Ancillary Volume) p 89-90, 1996. Also at http://www.azstarnet.com/~freetht/biologic.htm
[167] Guyatt DG. “Some Aspects of Anti-Personnel Electromagnetic Weapons” International Committee of the Red Cross Symposium: The Medical Profession and the Effects of Weapons, ICRC publication ref. 06681996 (The paper is available from the Health Division of the ICRC.) Also at http://www.mahlers.com/wompaoapew.htm
[168] Brodeur P. The Zapping of America Norton, New York, p 299 & 105, 1977.
[169] Berry S. “Decoding Minds, Foiling Adversaries” Signal Magazine 56(2): 56-7 Oct 2001. Also at http://cartome.org/brainmap.htm
[170] Pasternak D. “Reading your mind-and injecting smart thoughts” U S News and World Report p 67-8, Jan 3, 2000. Also at http://www.raven1.net/norsee2.htm
[171] Murray FJ. “NASA plans to read terrorists’ minds at airports” Washington Times, Aug 17, p A1, 2002.
[172] Lopez R. “Special operations survives Pentagon budget constraints” Jane’s International Defense Review 26(3): 247-51, 1993.
[173] McMurtrey JJ. “Microwave Bioeffect Congruence with Schizophrenia” In press, 2004. Available at http://www.slavery.org.uk/MicrowaveCongruenceSchiz.doc and http://www.grn.es/electropolucio/microwav.rtf
[174] Butler D. “Advances in neuroscience ‘may threaten human rights’” Nature 391: 316, 22 January 1998. Also at http://raven1.net/nature1.htm