A few days ago, as I was thinking of iced coffee, an advert for iced coffee appeared on my iPhone screen. This has happened numerous times with different thoughts, and recently I realized that it is not mere coincidence. The emergence of neuroscience and neurotechnology in the 21st century has made it possible for third parties to know what your thoughts are and, equally, what your thoughts could be. Neuroscience and neurotechnology have created many avenues through which brain data or information can be predicted, known, changed or replaced. This advancement in science and technology has its advantages as well as disadvantages. However, in the face of recent developments, it is obvious that if we do not create adequate regulations, we might just be looking at the gradual demise of one of the safest domains of the human being: the mind.
The Society for Neuroscience (SfN) describes neuroscience as ‘the study of the nervous system, including the brain, spinal cord, and networks of sensory nerve cells called neurons’. Neurotechnology, on the other hand, has been defined as ‘the assembly of methods and instruments that enable a direct connection of technical components with the nervous system. These technical components are electrodes, computers or intelligent prostheses which are meant to either record signals from the brain and translate them into technical control commands, or to manipulate brain activity by applying electrical or optical stimuli’.
Many inventions in neuroscience and neurotechnology have been highly beneficial in fields such as medicine, psychology, biology, engineering and business. In the medical field, neuroscience and neurotechnology are used in pharmaceuticals, virtual reality treatments, implant technologies and imaging technologies, amongst others. For instance, neurotechnology is used in medical and psychological diagnostics and treatments to determine the efficacy of such treatments for patients. In business, the term ‘neuromarketing’ has been coined for the use of imaging techniques such as electroencephalography (EEG) to monitor and influence consumer behavior. Popular brands such as Airbnb, PepsiCo and Disney have successfully used these techniques and technologies. Other neuroscience and neurotechnology inventions include brain stimulators, lie detectors, functional MRI, brain-wave controlled gaming and neurogadgets. According to neurotechnology experts, the near future will see inventions such as brain-wave passwords, brain-wave text messaging, and mind-controlled robotic exoskeletons.
Generally, inventions in neuroscience and neurotechnology have been described as non-invasive simply because they do not introduce new matter into the human body. However, this characterization is questionable. As correctly stated by psychologist and academic Nick J. Davis, the use of the term ‘non-invasive’ is rather misleading. In his article titled ‘The Regulation of Consumer tDCS: Engaging a Community of Creative Self-Experimenters’, he stated, ‘I believe that however the stimulation is delivered, directly influencing brain physiology is an invasion of some sort’.
The truth is that many of these inventions, if abused, can forcibly allow access to the brain and alter or control behavior. These inventions, if unregulated, create the possibility of the leakage of brain data or brain information to third parties. In fact, if unrestrained, they may constitute one of the worst infringements on the privacy and other rights of the human being. For example, for the past ten years, brain scans have been admitted as valid evidence in Indian courts to determine the guilt, deception or otherwise of accused persons. In the case of State of Maharashtra v Aditi Sharma, the accused was found guilty of murder partly on the basis of admitted brain scan evidence. This controversial judgement sparked an international debate on the accuracy of brain scanning technologies, and the thin line between legitimacy and violation of privacy. Fortunately, the judgement has since been overturned.
In China, mind-reading technology is often used to monitor the emotions and abilities of workers, through electronic sensors in hats and helmets. By doing this, the mental and emotional states of employees are directly monitored by their employers. Even though practicing firms claim that this technology assists them in increasing efficiency to maximize profits, the risk of abuse is very prominent. According to Professor Qiao Zhian of Beijing Normal University, the lack of regulation of this technology may place employers at an unfair advantage, in terms of privacy and the demanded level of productivity. In the United States, a procedure called transcranial direct current stimulation (tDCS) has been used to boost the mental skills of military personnel. Unfortunately, this procedure has become commercialized and is increasingly used in personal devices, without regard for the risks involved in terms of health and privacy.
Frankly, the major problem is clearly not the creation of these technologies and techniques but the unavailability and inadequacy of regulations, which has led to misuse and abuse, and has created a leeway for further abuse and misuse. In the wake of the recent Facebook, Google Plus and Marriott data scandals, it has become clear that the existing regulations are not adequate to protect our physical and virtual data. So, it is quite obvious that our brain data and information would fare much worse. As rightly observed by lawyer and bioethicist Nita Farahany, there are no laws that protect us from the risk of third parties having access to our brain information. Anyone, including the government, criminal agents and malicious third parties, could have access to our minds with our consent, without it, or through forcefully obtained consent. And so, what happens to our rights as human beings? What happens to our internationally guaranteed fundamental human rights of freedom of thought and conscience, the right to mental integrity, freedom from discrimination, the right to fair hearing and the principle against self-incrimination? These are important questions, with regard to Ienca and Andorno’s paper.
According to Marcello Ienca and Roberto Andorno, academic researchers at the Institute for Biomedical Ethics, University of Basel and the School of Law, University of Zurich respectively, there is an important nexus connecting neurotechnology and human rights. So, we must carefully consider the legal and ethical implications of neurotechnological advancement, and ask ourselves an important question: ‘is our current human rights framework capable of protecting our minds?’ As cited by Ienca and Andorno, specific international treaties including the Universal Declaration on the Human Genome and Human Rights (UDHGHR), the International Declaration on Human Genetic Data (IDHGD) and the Universal Declaration on Bioethics and Human Rights are inadequate safeguards due to their anachronistic nature and their failure to address neurotechnology specifically. Although the Universal Declaration of Human Rights (UDHR) and the International Covenant on Civil and Political Rights (ICCPR) provide for the right to privacy, in Article 12 and Article 17 respectively, these guarantees are grossly inadequate in the face of emerging neurotechnological innovations. Dr. Alta Charo, a professor of law and bioethics, has emphasized this in a statement: ‘technology innovates faster than the regulatory system can adapt’.
In their 2017 paper, ‘Towards New Human Rights in the Age of Neuroscience and Neurotechnology’, Ienca and Andorno noted that the human mind is ‘a kind of last refuge of personal freedom and self-determination’. They further noted that, ‘with advances in neural engineering, brain imaging and pervasive neurotechnology, the mind might no longer be such an unassailable fortress’. In their paper, Ienca and Andorno advocate for four new human rights, namely ‘the right to cognitive liberty’, ‘the right to mental privacy’, ‘the right to mental integrity’ and ‘the right to psychological continuity’. These human rights are designed to meet the demands of emerging neurotechnology and neuroscience which have so far been ignored by existing human rights. According to the authors, the new human rights would ‘give people the right to refuse coercive and invasive neurotechnology, protect the privacy of data collected by neurotechnology and protect the physical and psychological aspects of the mind from damage by the misuse of neurotechnology’.
The right to cognitive liberty advocated by these authors is synonymous with the right to mental self-determination. According to Bublitz, as cited by Ienca and Andorno, mental self-determination consists in ‘the right of individuals to use emerging neurotechnologies and the protection of individuals from the coercive and unconsented use of these technologies’. Quite similar to the traditional freedom of thought, cognitive liberty protects the right of the individual to decide whether procedures such as brain alteration may be performed on his or her own brain. It protects the human mind from unlawful invasion, and introduces an ethical and legal obligation to promote cognitive liberty. Thus, cognitive liberty means that people can make their own choices regarding their brains, without unwanted or unsolicited interventions. In addition, cognitive liberty includes the right to mental integrity, control of one’s mental life and, according to Ienca and Andorno, the right to refuse the coercive use of technology. Existing human rights do not cover these risks and vulnerabilities, and so the right to cognitive liberty cannot be dismissed as an inflated human right.
As stated earlier, the right to privacy is contained in several international treaties and regulations, including the Universal Declaration of Human Rights (UDHR) and the 2018 European Union (EU) General Data Protection Regulation (GDPR), which regulates the protection of consumer data within Member States of the European Union. Despite this, the existing right to privacy is not sufficient to protect brain data or information. The right to mental privacy proposed by Ienca and Andorno is an extension of the right to privacy, and is supportive of the right to remain silent and the principle against self-incrimination. It aims to protect persons from the risk of unlawful access to, or leakage of, brain data or information by or to third parties. For instance, brain records, like DNA and fingerprints, could be used for biometric identification, because they can be traced back to particular individuals. This, if misused, can be dangerous. According to Ienca and Andorno, the right to mental privacy is for the protection of ‘any bit of brain information about an individual recorded by a neurodevice and shared across the eco-system’. Notably, the right to mental privacy is deemed not absolute by the authors, as certain derogations are allowed in the public interest. However, as with physical data, and as the European Court of Human Rights ruled in Funke v France, the line must be drawn to prevent self-incrimination.
The right to mental integrity currently exists in international human rights law. However, it mostly relates to mental health, and so the right to mental integrity is seen as an extension of physical health. Although the European Union’s Charter of Fundamental Rights provides for the right to mental integrity with respect to medicine and biomedicine, it does not consider mental integrity in terms of neurotechnology. The new right to mental integrity proposed by Ienca and Andorno aims to enhance the current right while catering to the demands of neurotechnological advancement. The term ‘malicious brain-hacking’, coined by Ienca and Haselager, refers to ‘neurocriminal activities that influence directly neural computation in the users of neurodevices in a manner that resembles how computers are hacked in computer crime’. The human brain runs the risk of manipulations which can control and determine the actions of the brain-hacked person, in procedures such as brain-washing and the implantation of memories. Ienca and Andorno believe that, ‘for an action to qualify as a threat to mental integrity, it must involve the direct access to and manipulation of neural signaling, be unauthorized and result in physical and psychological harm’. Like the right to mental privacy, the right to mental integrity is not absolute. Violations of the right to mental integrity may be excused in the public interest.
The right to psychological continuity relates to the right to personal identity and the right to personality. Neurotechnological procedures such as tDCS and Deep Brain Stimulation (DBS) have the ability to affect people’s perception of their personal identities, inducing personality changes such as aggressiveness and impulsiveness. During these procedures, memories which hold significance for personal identity may be removed, altered, added or replaced. According to Ienca and Andorno, the major question is ‘whether such personality changes induced by neurostimulation or memory manipulating technology could constitute in some circumstances a violation of a basic human right’. Unfortunately, such changes are already occurring. It has been reported that many of these procedures are carried out for experimental purposes on unaware and sometimes unwilling victims, who suffer acute personality changes. Also, there are concerns over the vulnerability of the human brain to brain-jacking. As defined by Pycroft, and cited by Ienca and Andorno, brain-jacking is the unauthorized use of neurodevices by third parties. Brain-jacking could result in brain information theft, tissue damage, induction of pain and modification of emotions. The right to psychological continuity would protect individuals from brain-jacking, thereby protecting their personal identities and personalities.
Critics of Ienca and Andorno’s work have argued that existing human rights provide adequate protection, and so these new rights are unnecessary, contributing to human rights inflation and proliferation. In response, Ienca and Andorno have argued that these rights meet the list of criteria for human rights posited by Australian international law scholar Philip Alston. These requirements, according to the authors, state that human rights must ‘reflect a fundamentally important social value’, ‘be consistent but not merely repetitive of the existing body of human rights law’, ‘be capable of achieving a very high degree of international consensus’ and ‘be sufficiently precise as to give rise to identifiable rights and obligations’. In their paper, Ienca and Andorno argue that neurotechnology is actively influencing the infosphere and digital infrastructures in our societies. Therefore, we must ensure that our current ethical and legal frameworks are capable of meeting the demands of these advancements. Other critics have suggested that these new rights are over-complicated and so cannot be enforced or accurately interpreted. Another open question is whether these human rights are absolute. Could the right to mental privacy be violated for judicial reasons? Is the duty to prevent crime greater than the human rights of the individual? Can these rights be violated in order to protect the individual from physical or psychological harm?
As the authors have proposed, these new rights are absolutely necessary, because the existing human rights framework is inadequate to protect our minds. The existing rights to privacy, dignity, expression, thought and so on must not be assumed to adequately cover neurotechnological breaches. These rights, as emphasized earlier, are rather simplistic, and are yet to be interpreted by any court of law to protect against abuse caused by neurotechnology. It has also been argued that some concerns about the dangers of neurotechnology are unfounded, and run ahead of the technology itself. However, in this case, prevention is better than cure. A world where anyone can gain access to your thoughts and intentions can be very dangerous. Our world is already troubled enough by political, religious and social difficulties; the misuse of brain technology would render it worse. Dictatorships and authoritarian governments could use this technology to induce fear, curtail the freedoms of citizens, and punish citizens for their opinions and beliefs. Currently, there are speculations about the ongoing abuse of these kinds of technologies by the Chinese and North Korean authorities. Radicals and extremists, through neurotechnological abuse, could inflict more harm on individuals and societies through the ability to control people’s brains and determine what their actions should be. Even mere differences would be revealed in seconds or minutes. Political opposition, sexual orientation, religious beliefs and other personal opinions or activities that people seek to keep private would no longer be secrets, but could be used to the detriment of individuals. Bioethicist and lawyer Nita Farahany has emphasized this in her TED Talk, ‘When Technology Can Read Our Minds, How Will We Protect Our Privacy?’
Neuralink, headed by Elon Musk, is developing a brain interface which would merge artificial intelligence and the human brain. Thoughts could then be transferred directly to digital devices such as phones and computers, without physical contact. Facebook is already working on this kind of technology as well. Imagine texting with your brain, without having to type a word. This innovation, while remarkable, underscores the point which Ienca and Andorno have espoused in their paper. With platforms in the infosphere like Facebook, and systems like India’s Aadhaar, unable to protect billions of records of consumer information from breaches, and with the potential access to this information by malicious third parties, an innovation like Neuralink would pose worse vulnerabilities if unregulated. The future is already here, so our laws must catch up.
The right to privacy may currently exist in international human rights law. However, many of the aforementioned treaties guarding privacy are unfortunately outdated, as many of them were created in the 20th century, before these neurotechnological advancements had emerged. Therefore, these new human rights are a welcome proposal. However, there is a need for further research to determine the thin line between these rights and the derogation clause. The authors have failed to determine within what confines the right to mental privacy and the right to mental integrity can be derogated. This uncertainty could lead to further violations of the human mind, and of the human being as a whole, breaching other human rights including but not limited to the right to liberty, the right to fair hearing, freedom of thought and the right to the dignity of the human person.
Furthermore, the requirements of the right to mental integrity are unclear, and must be expanded in meaning. It is unclear whether all three requirements must be met before an action qualifies as a violation of the right to mental integrity. Two of the three requirements, namely the absence of authorization and the occurrence of physical or psychological harm, are too stringent, and should be expanded to cover cases where there is no evidence of harm, as well as cases of abuse of consent. In criminal law, the mental element (mens rea) of a crime is established as long as there is an intention to cause harm to the person. This must be considered in respect of the requirement of physical or psychological harm, because the right to mental integrity seeks to protect the mind from neurocriminal activities. In a situation where an individual is maliciously brainwashed by his or her physician, who has been authorized to carry out neuro-procedures on his or her mind, but where there is no evidence of harm, these requirements would allow the victim no recourse to justice.
In addition, the authors, despite their laudable proposals, have failed to identify the means of implementing these new human rights. If these new rights are to make any impact, the issue of implementation must be addressed. We must determine whether they should be enacted as new laws and regulations, or whether existing laws and regulations should be amended to include them. As stated earlier, the new human rights proposed by Ienca and Andorno are remarkable proposals, capable of serving as adequate guidelines for the use of neuroscience and neurotechnology in the near future. But the loopholes must be addressed, and further work remains to be done, before these proposals can be truly effective. This, however, does not in any manner undermine the necessity of these four new human rights.
Titilope Adedokun is a 400 Level student of the Faculty of Law, University of Lagos. She is interested in international law, international development, human rights, women’s rights and literature.