While cybersecurity is a touchstone for many businesses and individuals, neural security may be the next frontier of privacy in technology. A new study from NCC Group goes into detail on Brain Computer Interfaces (BCIs), which could pose a significant threat to personal freedom. BCIs raise a number of ethical, legal and existential questions about how to provide assurances for systems that cannot be controlled, such as the human brain.
Transhumanism is a key topic, outlining the way in which people will merge their minds with technology to upgrade their cognitive and physical capabilities. One example of this is Neuralink, which has attempted to make breakthroughs by embedding BCIs directly into the brain. While these devices may have their benefits, as with all technology they may be susceptible to attack, posing a very real threat to those who adopt BCIs once they are made available.
SEE: Artificial Intelligence Ethics Policy (TechRepublic Premium)
How do BCIs work?
BCIs fall into three different categories:
- Non-invasive BCIs
- Partially invasive BCIs
- Invasive BCIs
According to NCC Group’s findings, non-invasive BCIs are typically sensors applied to a person’s head, or worn via a helmet or exoskeleton fitted with sensors. These devices typically just read data from a person’s brain and are used in a monitoring function. Because the skull attenuates higher-frequency signals, these BCIs are limited in what they can effectively capture. They work similarly to an electroencephalogram (EEG).
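To make that frequency limitation concrete, the short Python sketch below (not from the NCC Group report; it assumes a synthetic signal and the SciPy library) band-pass filters a simulated scalp recording down to the roughly 8–30 Hz range that an EEG-style, non-invasive headset can realistically work with.

```python
# Minimal sketch: why non-invasive BCIs operate in the lower EEG frequency bands.
# We generate a synthetic "scalp" signal and keep only the 8-30 Hz range
# (roughly the alpha/beta bands a consumer headset might target).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256                      # sample rate in Hz, typical for consumer EEG gear
t = np.arange(0, 2, 1 / fs)   # two seconds of data

# Synthetic signal: a 10 Hz alpha rhythm, a weak 80 Hz component and noise.
# In practice the skull heavily attenuates the higher-frequency activity.
signal = (np.sin(2 * np.pi * 10 * t)
          + 0.2 * np.sin(2 * np.pi * 80 * t)
          + 0.3 * np.random.randn(len(t)))

def bandpass(data, low_hz, high_hz, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    nyquist = fs / 2
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, data)

alpha_beta = bandpass(signal, 8, 30, fs)
print(f"raw std: {signal.std():.2f}, filtered std: {alpha_beta.std():.2f}")
```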
The partially invasive versions are implanted inside the human skull but do not rest directly on a person’s brain. Because of this, these BCIs carry a lower risk of creating scar tissue, since they do not sit within the brain’s gray matter. Using these devices, people have shown the ability to operate electronic devices without the use of their hands, via signals transmitted through the sensor.
Neuralink, outlined above, is an example of an invasive BCI; these devices are implanted directly into the brain to transmit signals in and out of it. The major downside is the need for invasive surgery and the potential for scar tissue to form around the implant, which may lead to side effects such as seizures. Neuralink’s chip allows information to be transmitted via Bluetooth.
A key part of BCI technology is the way it can be combined with AI and machine learning (ML). Using AI and ML, brain activity has been mapped to movement to the point where models can predict a user’s intended movement from that same signal data. BCIs can also be paired with other interfaces such as eye tracking, speech recognition, AR/VR, wireless technologies and muscle-sensing pads.
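As a rough illustration of how ML can turn brain-signal features into movement predictions, the sketch below (purely hypothetical, using synthetic data and scikit-learn rather than anything described in the report) trains a simple classifier to guess an intended left-or-right movement from simulated band-power features.

```python
# Minimal sketch: predict intended movement (left vs. right) from synthetic
# "brain signal" band-power features. Illustrative only; real BCI pipelines
# use far richer features and models.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials = 400
# Pretend each trial is summarized by band-power features from a few channels.
features = rng.normal(size=(n_trials, 8))
labels = rng.integers(0, 2, size=n_trials)   # 0 = left, 1 = right
# Inject a weak, learnable relationship between one channel and the movement.
features[:, 0] += labels * 0.8

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```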
The problems with BCIs
The biggest issue with using BCIs is the security threats that come with them. Whereas the user of a hacked PC may lose data or the use of their device, the cost of having an implanted BCI hacked is much greater. If a malicious actor were to gain access to a user’s BCI, it could lead to paralysis, severe brain damage or even loss of life.
While these risks can be dire, in the short term they warrant immediate investigation into how such attacks could be mitigated. According to the report, “much of the impact of adversarial attacks can be potentially mitigated through thoughtful security by design in the creation of BCIs.”
The problems start with the pre-implantation process and extend all along the way, from ensuring the supply chain remains secure to protecting the wireless technologies, such as Bluetooth, that the devices rely on. Until these issues are ironed out and regulation is brought to the forefront, it is unlikely that BCIs will become a household item anytime soon, as many industries are just beginning to scratch the surface of what the tech is capable of and the potential dangers of using it.
SEE: Metaverse cheat sheet: Everything you need to know (free PDF) (TechRepublic)
Can these devices really be the future?
As research and development ramps up on these devices for numerous uses, it is important to note that security and safety measures are not yet at an acceptable level. The lack of regulation around security and safety requirements makes the technology extremely vulnerable and risky in the short term. Neuroprivacy will only become achievable as the BCI devices themselves mature.
Potential solutions for these issues range from something as involved as maintaining firmware updates, so the device always has the necessary security patches, to something as simple as a shutdown switch that can disable the BCI should it be compromised. Either way, resilience is essential, as dependence on this tech will undoubtedly become a real factor in the decades to come.
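As a rough sketch of what those two mitigations could look like in software, the example below uses a hypothetical device interface (not any real BCI API) to verify a firmware image against a vendor-published SHA-256 digest before installing it, and falls back to an emergency shutdown if verification fails. A production device would rely on signed firmware and hardware-backed keys; this only illustrates the idea.

```python
# Minimal sketch, assuming a hypothetical device interface: accept a firmware
# update only if its digest matches the vendor-published value, otherwise
# trigger the "shutdown switch" fallback.
import hashlib
import hmac

def firmware_is_authentic(image: bytes, expected_sha256_hex: str) -> bool:
    """Compare the image digest against the vendor-published value."""
    digest = hashlib.sha256(image).hexdigest()
    return hmac.compare_digest(digest, expected_sha256_hex)

def apply_update(device, image: bytes, expected_sha256_hex: str) -> None:
    """Install the update only if it verifies; otherwise shut the device down."""
    if firmware_is_authentic(image, expected_sha256_hex):
        device.install_firmware(image)    # hypothetical device call
    else:
        device.emergency_shutdown()       # the shutdown-switch fallback

class _DemoDevice:
    """Stand-in for a real BCI controller, used only to make the sketch runnable."""
    def install_firmware(self, image): print("firmware installed")
    def emergency_shutdown(self): print("device shut down")

image = b"example firmware payload"
good_digest = hashlib.sha256(image).hexdigest()
apply_update(_DemoDevice(), image, good_digest)   # prints "firmware installed"
apply_update(_DemoDevice(), image, "0" * 64)      # prints "device shut down"
```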