With the continued development of artificial intelligence (AI), ethical questions multiply and fears escalate as the differences between man and machine dissolve. Unlike AI, a field in which machines are made to act more like man, Brain-Machine Interfaces (BMIs), also called brain-computer interfaces (BCIs), act to make man more like machine. 

Several essential neuroscientific concepts help in understanding BMIs. For one, the brain transmits information in a way that is inherently compatible with computerized systems. Signals are sent between neurons as electrical impulses, driven by differences in electrical potential created by membrane ions (1). These signals, sent in response to sensory inputs, are encoded within the brain and can be recalled through short-term or long-term memory (2). BMIs seek to tap into this encoding process and allow bidirectional communication: not only signals picked up from the brain by a device (e.g., an EEG), but also communication from a machine to the brain. The plasticity of the brain is a significant characteristic that makes such technology plausible. Plasticity, or moldability, is the way in which our brain adapts to the external environment: essentially, neurons that repeatedly fire in a particular pattern become easier to fire over time. Knowing such aspects of the brain sets current developments in neurotechnology in clearer perspective. 
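The plasticity idea above, that neurons which repeatedly fire together become easier to fire, can be sketched as a toy Hebbian weight update. The function name, learning rate, and saturating form below are illustrative assumptions, not a model any actual BMI uses.

```python
# Toy Hebbian plasticity: repeated co-activation strengthens a connection.
# All names and parameters are illustrative, not from any real BMI system.

def hebbian_update(weight, pre_active, post_active, rate=0.1):
    """Strengthen a synaptic weight only when both neurons fire together."""
    if pre_active and post_active:
        weight += rate * (1.0 - weight)  # saturating growth toward 1.0
    return weight

w = 0.2
for _ in range(5):  # five repeated co-activations of the same pattern
    w = hebbian_update(w, True, True)
# after repeated firing, the connection is stronger -- "easier to fire"
```

Here the weight grows with each co-activation but saturates below 1.0, a simple stand-in for how repeated firing in a pattern reinforces that pattern.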

Early neural prosthetics focused on exploiting the plasticity of the brain. An early pioneer, Dr. Bach-y-Rita, first proposed the idea of sensory substitution: linking one sensory input to a normally unrelated stimulus (3). One device created by Dr. Bach-y-Rita, for example, used touch to convey light intensity in order to help the blind to "see". An external camera detected light, which then stimulated electrodes on an array placed on the tongue. After only 15 minutes, users of the device were able to interpret spatial information (4). Today, patients with particular vision impairments have sight partially restored in a similar fashion, with an electrode array placed directly on the visual cortex. The BCI component is the processor and software that converts images into instructions, which are then sent to an implanted chip in the brain. In one sense, this is a sort of induced neuroplasticity (5). 
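The camera-to-electrode pipeline described above can be sketched as a simple brightness-downsampling step: each electrode's stimulation level is the average brightness of the patch of the image it covers. The 4x4 grid and 0-255 intensity scale below are assumptions for illustration, not the actual device's specification.

```python
# Hypothetical sketch of sensory substitution: downsample a camera frame's
# brightness into stimulation levels for a small electrode grid.

def frame_to_electrodes(frame, grid=4):
    """Average a square brightness image (list of rows, values 0-255)
    into a grid x grid array of electrode stimulation levels."""
    n = len(frame)
    block = n // grid  # side length of the image patch per electrode
    levels = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            patch = [frame[y][x]
                     for y in range(gy * block, (gy + 1) * block)
                     for x in range(gx * block, (gx + 1) * block)]
            row.append(sum(patch) // len(patch))  # mean brightness -> intensity
        levels.append(row)
    return levels

# A bright region in the top-left of an 8x8 frame maps to strong stimulation
# at the corresponding top-left electrodes, and none elsewhere.
frame = [[255 if x < 4 and y < 4 else 0 for x in range(8)] for y in range(8)]
levels = frame_to_electrodes(frame)
```

The point of the sketch is the mapping itself: spatial structure in the visual input is preserved as spatial structure on the electrode array, which is what let users learn to interpret the tactile pattern as spatial information.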

In recent years, companies such as Neuralink have looked to take BCIs into the 21st century. Neuralink was founded in 2016, and a summit this past July revealed a bit more about the company's current progress. The overarching goal of Neuralink is to allow humans to keep up with AI by closing the input and output gap. Currently, we are limited to relatively rudimentary input methods, like typing or speaking, but fully integrated interfaces would diminish this communication lag and thereby increase efficiency (6). For instance, if you were in the midst of a conversation and wanted to search for something online, you would have to divert your attention entirely to look up the information on your phone. Researchers (separate from Neuralink) have already reduced the latency of communication between humans and computers, allowing for more seamless information transmission (7). As of now, Neuralink has revealed limited details, but the company demonstrated some of its BCI developments for medical conditions (ranging from chemical depression to epilepsy). Most significant is its creation of a flexible polymer electrode array with a robotic probe-insertion system. This is unique, as most current arrays are stiff and inflexible, which limits their longevity; moreover, these conventional probes inflict chronic damage on the brain (8).

While much research remains before BCIs can fully integrate with the brain (independently transmitting and encoding information), current technological developments such as smaller chip sizes, faster computer processing, and quicker data analysis make this goal much more plausible (9). Even though BCI technology still has a long way to go, many ethical debates already loom. In particular, how do informed consent and privacy come into play? Ultimately, to what ends are we willing to change man in an effort to forestall the infamous Singularity, the moment Kurzweil predicts machine intelligence will surpass our own?



References

  1. Grabianowski, E. (2007). How Brain-Computer Interfaces Work. Retrieved from https://computer.howstuffworks.com/brain-computer-interface.htm
  2. McLeod, S. (2013). Stages of Memory. Retrieved from https://www.simplypsychology.org/memory.html
  3. Rutkin, A.H. (2013, September 01). Champagne for the Blind: Paul Bach-y-Rita, Neuroscience’s Forgotten Genius. Retrieved from https://cmsw.mit.edu/paul-bach-y-rita-neurosciences-forgotten-genius/
  4. Kendrick, M. (2009, August 13). Tasting the Light: Device Lets the Blind "See" with Their Tongues. Retrieved from https://www.scientificamerican.com/article/device-lets-blind-see-with-tongues/
  5. Mullin, E. (2017, September 18). Blind Patients to Test Bionic Eye Brain Implants. Retrieved from https://www.technologyreview.com/s/608844/blind-patients-to-test-bionic-eye-brain-implants/
  6. Etherington, D. (2019, July 10). Elon Musk-backed Neuralink to detail its progress on upgrading the brain to keep pace with AI. Retrieved from https://techcrunch.com/2019/07/12/elon-musk-backed-neuralink-to-detail-its-progress-on-upgrading-the-brain-to-keep-pace-with-ai/
  7. Hardesty, L. (2018, April 4). Computer system transcribes words users “speak silently”. Retrieved from http://news.mit.edu/2018/computer-system-transcribes-words-users-speak-silently-0404
  8. Hanson, T.L. (2019, March 14). The "sewing machine" for minimally invasive neural recording. Retrieved from https://www.biorxiv.org/content/early/2019/03/14/578542.full.pdf
  9. Miller, A. (2019, May 13). The intrinsically linked future for human and Artificial Intelligence interaction. Retrieved from https://journalofbigdata.springeropen.com/articles/10.1186/s40537-019-0202-7