Sometimes, these arguments should simply be left for the birds. But, in light of these recent papers and technologies, it is interesting to think about defining telepathy.
Here are some of the definitions I have found:
(1) "Communication between minds by some means other than sensory perception."
(2) "The communication between people of thoughts, feelings, desires, etc., involving mechanisms that cannot be understood in terms of known scientific laws."
(3) "Communication through means other than the senses, as by the exercise of an occult power."
I am quite fond of definition number 3 because it implies that one must employ magic to use one's telepathic abilities. Quite frankly, if you didn't understand some of the neuroscience involved in these pieces of technology, they just might appear to be magic.
A lot of these devices work by translating weak electrical signals from your brain, using a non-invasive technique called electroencephalography (EEG). Traditionally, going back to Hans Berger, who published the first human EEG recordings in 1929, this meant having your head shaved and electrodes placed across the entire extent of your scalp. Each electrode represents a discrete 'channel': the electrical activity of the brain as read at that spot on your scalp. More sophisticated modern EEG devices can now record an EEG with a single channel on your forehead (without shaving your hair off). Although single-channel devices are now available, multi-channel devices still have their uses. For example, take a look at the EEG recording caps pictured below:
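To give a feel for what "translating weak electrical signals" means in practice, here is a minimal sketch using synthetic data. The sampling rate, amplitudes, and the 10 Hz rhythm are all made up for illustration; real devices use proprietary pipelines, but estimating power in a frequency band (such as the 8-13 Hz alpha band) from a single channel is one of the most common first steps:

```python
import numpy as np

fs = 256  # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)

# Synthetic single-channel "EEG": a 10 Hz alpha-like rhythm buried in noise.
rng = np.random.default_rng(0)
signal = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.standard_normal(t.size)

# Estimate power per frequency bin from the FFT of the recording.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, 1.0 / fs)

# Compare power inside the alpha band (8-13 Hz) to everything above it.
alpha_power = spectrum[(freqs >= 8) & (freqs <= 13)].sum()
rest_power = spectrum[freqs > 13].sum()

print(alpha_power > rest_power)  # the injected alpha rhythm dominates
```

A multi-channel cap simply gives you one such signal per electrode, which is what lets a neurologist localize where on the scalp a pattern appears.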
This type of cap lets you place electrodes over your scalp (again, without shaving your hair) and record those weak electrical signals from multiple channels across your entire head. Why not just use one channel? Well, if you did, and the recording showed electrical activity suggestive of epilepsy, a neurologist would have no way of knowing where that activity came from. With a cap like the ones pictured above, a neurologist gets a good idea of where the epileptic brain tissue is located. These devices are not used only for medical and research purposes; they have been adapted for some really cool, fun, and sometimes frightening applications.
Check out this new multi-channel EEG device from Emotiv. Devices like this are proliferating right now. Emotiv claims its device can help you harness your thoughts via your EEG to express your creative juices and make art, play computer games designed to respond to the equipment, or help disabled patients by controlling a keyboard or even a wheelchair. Another example of this type of technology comes from a company called NeuroVigil. Unlike Emotiv's device, this is a single-channel EEG system. Its inventor, Philip Low, built it around a sophisticated but simple algorithm for analyzing the EEG signal. The math of the algorithm takes up less than a page, which resulted in a one-page thesis for his Ph.D. in computational neurobiology at UCSD, reportedly the shortest thesis in history. Now, as a graduate student working on my own Ph.D. in neuroscience, I have two reactions to this: 1) Ugh, one page?! I don't think my thesis is going to be one page, and 2) That's awesome!! This company received major publicity last year because NeuroVigil is working with Stephen Hawking to develop a more efficient means of communication for him, as well as for the many other patients suffering from amyotrophic lateral sclerosis and, of course, other debilitating diseases.
When I read articles about these new devices, I get excited. Exhilarated, even. I have this reaction because this really is telepathy. Sure, it's machine-assisted, but still: I can communicate my thoughts and feelings to instruct a device to perform actions through "means other than sensory perception." Are there machine-assisted examples of telepathy between organisms? Yes. Probably the most exciting paper in this field, and in all of neuroscience so far this year, is "A Brain-to-Brain Interface for Real-Time Sharing of Sensorimotor Information." Published last month in Scientific Reports, this work was led by Miguel A.L. Nicolelis at Duke University in collaboration with the Edmond and Lily Safra International Institute of Neuroscience of Natal in Natal, Brazil. In short, two rat brains on two separate continents were linked and shared information in real time.
One rat, known as the encoder, learned a behavior in which it had to press the correct lever in order to receive a reward. This rat then had 32 microelectrodes implanted into its primary motor cortex, which recorded activity from populations of neurons involved in learning and performing the behavior. Once the encoder rat was accurately performing the task, the information collected across multiple trials was averaged and transmitted to a 'decoder' rat, which had 4 to 6 microstimulating electrodes implanted into its own primary motor cortex. Information from the encoder rat could thus be delivered into the brain of the decoder rat by stimulating its primary motor cortex in the pattern observed while the encoder rat performed the behavior correctly. When the decoder rat pressed the correct lever (based on information provided by the encoder rat, thousands of miles away), the encoder rat received an additional reward. As a result, the two rats performed the behavior as a pair, with the encoder rat depending on the decoder rat to perform the task correctly.
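A toy sketch can make the encoder-to-decoder transfer concrete. Everything below is invented for illustration (the firing rates, the 32-channel shape, and the threshold rule are not from the paper, which used far richer population decoding), but the core idea is the same: average the encoder's recorded activity across trials, reduce it to a stimulation signal, and let the decoder's rule map that signal onto a lever choice.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical encoder recordings: spike counts from 32 motor-cortex
# electrodes over 50 trials where the rat chose the LEFT vs RIGHT lever.
left_trials = rng.poisson(lam=8, size=(50, 32))   # higher firing rates
right_trials = rng.poisson(lam=3, size=(50, 32))  # lower firing rates

def encode(trials):
    """Average across trials and channels, yielding one scalar to transmit."""
    return trials.mean()

# Midpoint between the two conditions serves as the decoder's threshold.
threshold = (encode(left_trials) + encode(right_trials)) / 2

def decode(mean_rate, threshold):
    """Decoder rat's learned rule: strong stimulation -> press left."""
    return "left" if mean_rate > threshold else "right"

print(decode(encode(left_trials), threshold))   # -> left
print(decode(encode(right_trials), threshold))  # -> right
```

The real experiment closed the loop behaviorally as well: the encoder earned an extra reward whenever the decoder chose correctly, which this sketch leaves out.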
The authors performed an additional experiment to determine whether the brain-to-brain interface could transfer not only motor information but also tactile information between pairs of rats. In this case, recording electrodes were implanted into the sensory cortex of the encoder rat, and microstimulating electrodes into the sensory cortex of the decoder rat. The encoder rat was trained to use its whiskers to judge the width of an aperture: if the aperture was narrow, it learned to nose-poke a water port on the left side of the cage; if the aperture was wide, it nose-poked the right side. Again, activity recorded from the sensory cortex of a well-trained encoder rat was converted into stimuli applied to the sensory cortex of the decoder rat. The decoder rat then knew to nose-poke the left water port when the aperture was narrow (signaled by microstimulation of its sensory cortex) or the right side of the cage when the aperture was wider (signaled by the absence of stimulation).
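The tactile version boils down to a simple contingency, which can be sketched in a few lines. The pulse count and the exact mapping below are hypothetical placeholders, not values from the paper:

```python
def sensory_transfer(aperture_is_narrow: bool) -> int:
    # Narrow apertures evoked strong whisker responses in the encoder;
    # here that is reduced to a made-up count of stimulation pulses.
    return 20 if aperture_is_narrow else 0

def decoder_choice(n_pulses: int) -> str:
    # Decoder rat's learned rule: feel stimulation -> poke the left port,
    # feel nothing -> poke the right port.
    return "left" if n_pulses > 0 else "right"

print(decoder_choice(sensory_transfer(True)))   # -> left
print(decoder_choice(sensory_transfer(False)))  # -> right
```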
As the authors discuss, all of this could have been done by recording from the encoder rat first and then, at a later time, applying the stimuli to the decoder rat. But based on the behavior they observed, the rats worked together when the information transfer was made in real time: the rats appeared to respond to incoming information at a faster rate. For example, in the first experiment, the encoder rat would perform the behavior quickly and accurately so that the decoder rat could also perform the task quickly and accurately. This mattered because the pair relationship allowed the encoder rat to earn an additional reward by cooperating with the decoder rat.
I would love to hear what some of you think about this work. If you're interested in the actual technical details (much of which I did not cover here), I highly recommend reading the open-access article linked at the paper's title above. When I first read this paper, I was immediately reminded of the scene from the movie 'The Matrix' where Keanu Reeves opens his eyes after undergoing a machine-assisted learning program and says, "I know kung fu." To me, the implications of this research resemble the machine-assisted learning that characters in 'The Matrix' underwent. Although the rat experiments were performed in real time, one person could very well learn a martial art, a new language, or hey, maybe even the skills required to be an electrophysiologist (now wouldn't that be nice?!) simply by downloading the stimuli from someone who has already learned them.
Working in the greater Washington, D.C. area, you cannot avoid hearing or thinking about security. These devices may allow us to communicate sensory and motor information to one another, to assistive devices for the disabled, or even to airplanes we fly with our thoughts, so the question naturally arises: does this technology raise security concerns? Well, think about it. With an EEG device that lets you control a keyboard without physically typing, what happens when you type a password? Spyware could be embedded in your system that records your EEG signals while you type passwords, access your bank accounts, or enter personal information in any number of potentially compromising settings, all in the privacy of your own home.
Although these security concerns are alarming, I am more excited by the prospect of these technological advances than I am frightened. As with any major advance like this, there will be malicious actors who use the technology for personal gain, but I think the predominant impact of this research will be for the betterment of society. I would love to hear what some of you think about this research and these technologies. What is most exciting about this technology to you?
Pais-Vieira M, Lebedev M, Kunicki C, Wang J, & Nicolelis MA (2013). A Brain-to-Brain Interface for Real-Time Sharing of Sensorimotor Information. Scientific Reports, 3. PMID: 23448946