The Future of Computing: How Brain-Computer Interfaces Will Change Our Relationship with Computers
Ever since Elon Musk's Neuralink showcased the monkey Pager controlling a game with its mind, Brain-Computer Interfaces (BCIs) have been in the spotlight.
A brain-computer interface (BCI), also known as a brain-machine interface, mind-machine interface, or neural-control interface, has been around for a while. In fact, research on BCIs began as early as the 1970s at the University of California.
This initial research focused on restoring damaged sight, hearing and movement, and BCI research has made significant progress since then. Much of that work centres on pattern recognition: a BCI detects the minuscule electrical signals your brain produces when you think and, helped by developments in AI and machine learning, recognises the patterns within them. Researchers have become very good at this.
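To make "recognising patterns" a little more concrete, here is a minimal sketch in Python of the general recipe: extract a simple feature from a window of signal and train an off-the-shelf classifier on it. The data is synthetic and the feature (alpha-band power) is just one classic example; this is not how any particular vendor's system works.

```python
# Illustrative only: synthetic "brain" signals, not real recordings.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
fs = 250                            # sampling rate in Hz (typical for consumer EEG)
n_trials, n_samples = 200, fs * 2   # 2-second windows

def synth_trial(alpha_amplitude):
    """One noisy window with a 10 Hz (alpha-band) oscillation of given strength."""
    t = np.arange(n_samples) / fs
    return alpha_amplitude * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, n_samples)

# Class 0 = "relaxed" (strong alpha), class 1 = "focused" (weak alpha).
X_raw = np.array([synth_trial(1.5) for _ in range(n_trials // 2)] +
                 [synth_trial(0.3) for _ in range(n_trials // 2)])
y = np.array([0] * (n_trials // 2) + [1] * (n_trials // 2))

def alpha_power(window):
    """Average spectral power in the 8-12 Hz band, a classic EEG feature."""
    freqs, psd = welch(window, fs=fs, nperseg=fs)
    return psd[(freqs >= 8) & (freqs <= 12)].mean()

X = np.array([[alpha_power(w)] for w in X_raw])   # one feature per trial

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

Real systems use many channels, richer features and heavier models, but the division of labour (signal in, features out, classifier on top) is the same.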
Thanks to Magnetic Resonance Imaging (MRI), we can now see in extreme detail which parts of the brain light up when we think about something. This has enabled scientists to decode dreams, read our thoughts and infer our feelings.
However, an MRI machine is not very portable, is very expensive (an entry-level machine starts at around $250,000) and requires a team of researchers or doctors to operate. Therefore, the race is on to create brain-computer interfaces that are cheap, small and can be operated by anyone. Within this category, there are two types of BCI: internal BCIs that use delicate cybernetic implants, such as those developed by Neuralink, and external BCI devices, such as those developed by NextMind.
Neuralink: from pigs to monkeys to humans
Let's first look at Neuralink, the company that made headlines with the monkey playing Pong with its mind. It is truly amazing to see a monkey playing games using only its thoughts, and being very good at it. I have covered Neuralink before, in one of my recent Tech Journals by The Digital Speaker.
The vision of Neuralink is to develop ultra-high bandwidth brain-machine interfaces to connect humans and computers to survive the upcoming age of AI. Elon Musk believes that our only way to stand a chance when super-intelligent AI arrives is to merge with AI, hence Neuralink's mission statement: "if you can't beat them, join them".
They have made remarkable progress. In the summer of 2020, they demonstrated three pigs with surgically implanted Neuralinks, recording everyday brain activity such as smelling and moving. Only a year later, they showed Pager playing games with its mind.
Pager had the coin-sized link disc installed in its brain via a surgical robot, connecting thousands of micro threads from the chip to specific neurons to record the brain patterns. After a lot of training, it could play the game simply by thinking about its actions. If this is what a monkey can achieve with an internal BCI, imagine what humans can do when our brains are connected to computers. It also immediately raises a plethora of ethical considerations, which I will cover below.
However, implanting a tiny computer in your brain is not everyone's idea of connecting with computers. Fortunately, there are also wearable BCIs on the market that do not require a robot to operate on your most precious organ.
NextMind: Your Plug-and-Play BCI
These less invasive Brain-Computer Interfaces use an external device that records your brainwaves. Such devices have been around for a long time as well, in the form of EEG caps. Electroencephalography, or EEG, is probably the second-best-known technique for recording neural activity, after MRI.
Using electrodes placed on the scalp, an EEG cap records the brain's electrical activity. While these caps are cheap to buy (you can find them on Alibaba for as little as $20), they are not the most comfortable, or fashionable, things to wear. In addition, they require a lot of software development to become useful.
This is where NextMind comes in. They have developed a portable, easy-to-use, plug-and-play device that measures your brain's activity using EEG sensors that are comfortable and easy to wear. Disclaimer: I have been given one of these devices to test-drive.
And it blew my mind. After several attempts to get the technology calibrated and working (testing it in hotel quarantine with a two-year-old around is not the right environment for being hyper-focused), I was able to play a game with my mind, change TV channels with my mind and use some basic controls on my computer. I was experiencing the future.
At this moment, the device requires you to be super focused, without any distraction, to read your intentions accurately. But the fact that it comes as a small and easy-to-wear device definitely makes up for that.
Using your brain as a mouse is like training a new muscle: it needs practice, as with any new technology. The longer I played with it, the better I became at it. According to neuroscientist and NextMind founder Sid Kouider, however, this is just the beginning: within a few decades, he believes, we will be able to transmit thoughts telepathically to each other with a device like this.
At the moment, however, the product is only offered as a developer kit, and there are not yet commercially available games or products that incorporate the technology. I am sure that is only a matter of time, as thousands of developers are working on various applications.
What is available is a handful of test applications and games, such as Tic Tac Toe, Platformer and an application for typing with your thoughts (which currently works much like typing on an old Nokia phone). NextMind is also working on integrating with virtual reality, arguing that it would make a VR experience more natural, especially when combined with eye and hand tracking.
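To give a feel for why the typing application mentioned above resembles an old Nokia keypad, here is a purely hypothetical sketch: the interface cycles through groups of letters, and the only input is a single "select" signal that fires when a simulated focus score crosses a threshold. All names and values are invented for illustration; this is not the NextMind SDK.

```python
# Hypothetical illustration of multi-tap-style "mental typing".
# A real headset would supply focus_level(); here it is simulated.
import itertools
import random
import time

LETTER_GROUPS = ["abc", "def", "ghi", "jkl", "mno", "pqrs", "tuv", "wxyz"]
FOCUS_THRESHOLD = 0.8          # invented value: "selection" fires above this

def focus_level():
    """Stand-in for a decoded focus score in [0, 1] from a BCI device."""
    return random.random()

def type_one_letter(group):
    """Cycle through a letter group until the user's focus selects one."""
    for letter in itertools.cycle(group):
        print(f"highlighting: {letter}")
        time.sleep(0.5)                      # give the user time to focus
        if focus_level() > FOCUS_THRESHOLD:  # the "mental click"
            return letter

if __name__ == "__main__":
    word = "".join(type_one_letter(g) for g in random.sample(LETTER_GROUPS, 3))
    print("typed:", word)
```

Because each selection waits on a focus event rather than a keypress, throughput is limited, which is exactly the Nokia-keypad feeling described above.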
There are also some proofs of concept by corporates exploring mind control. One is a mind-controlled DJ set developed in collaboration with Absolut. The other was developed by Mercedes-Benz, who have incorporated the technology in their VISION AVTR concept car, allowing drivers to control dashboard functions with their minds.
Image: VISION AVTR x BCI IAA21
Using the brain-sensing wearable, users can take control of the car simply with their thoughts. They can perform functions on the dashboard, such as selecting entertainment channels, and navigate a car in a virtual world; controlling a car with our minds in the real world, however, remains science fiction for now.
Of course, NextMind is not the only company working on BCI applications. There are plenty of companies in a variety of industries exploring brain-computer interfaces.
Brain-Computer Interfaces and Healthcare, the Military and Gaming
According to neuroscientist Andrea Kübler, writing in The Neurology of Consciousness, a "BCI allows users to directly communicate their intention without any involvement of the motor periphery." Using a BCI, people can make what they want to happen, happen, without needing to move a muscle. That, of course, has tremendous benefits for healthcare, the military and gaming.
BCI and Healthcare
In a world-first, scientists developed a brain-computer interface that can instantly turn mental handwriting into text on a screen.
The system, designed by the BrainGate consortium, works similarly to other invasive BCI devices. A sensor implanted into the brain monitors specific brain signals, in this case the ones associated with handwriting.
These signals are recorded and sent, in real time, to a computer that displays the text on screen, enabling a locked-in patient to write at a rate of ninety characters per minute. That is a lot faster than the NextMind typing application mentioned earlier (and a lot faster than your old Nokia), which shows the difference between invasive and non-invasive BCIs.
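The published BrainGate decoder is far more sophisticated (it reportedly uses a recurrent neural network trained on the patient's own neural activity), but the overall shape of a real-time decoder can be sketched as a simple loop: read a short window of neural features, map it to a character, append it to the output. The stub below only illustrates that loop and the ninety-characters-per-minute pacing; the classifier and the electrode count are placeholders.

```python
# Illustrative real-time decoding loop; the classifier is a stub, not BrainGate's model.
import time
import random
import string

CHARS_PER_MINUTE = 90                    # throughput reported for the handwriting BCI
WINDOW_SECONDS = 60 / CHARS_PER_MINUTE   # roughly 0.67 s of neural data per character

def read_neural_window(duration_s):
    """Stand-in for reading `duration_s` of implant data; returns fake features."""
    time.sleep(duration_s)
    return [random.random() for _ in range(192)]   # e.g. one feature per electrode

def decode_character(features):
    """Stub decoder: a real system maps features to a letter with a trained model."""
    return random.choice(string.ascii_lowercase + " ")

def run_decoder(seconds=5):
    text = ""
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        features = read_neural_window(WINDOW_SECONDS)
        text += decode_character(features)
        print("\r" + text, end="", flush=True)
    print()

if __name__ == "__main__":
    run_decoder()
```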
While this system only works with people who learned to handwrite and were paralysed later in life, it is a dramatic step forward. It proves that brain-computer interfaces have the potential to help so many paralysed patients.
Nathan Copeland, a man who broke his spine in a car accident and is paralysed from the chest down, can personally testify to the potential this technology holds. Nathan is one of the first people in history to have a working BCI and regularly tests BCI hardware and software. He also recently challenged Neuralink's monkey to a game of Pong.
BCI and the Military
Another industry that expects to benefit from BCI is the military, because new technologies bring new ways to wage war, and BCI is no different. It may be no surprise that DARPA, the Defense Advanced Research Projects Agency, has been funding various BCI projects since the early 1970s.
In addition, the US Department of Defense's University Research Instrumentation Program lists brain-computer interfacing as a critical area of funded research. And that is only what they are telling us. Similar projects in China, Russia, the EU and elsewhere are undoubtedly going on behind closed doors. The main area of interest for the military is telepresence.
Telepresence, the technology of controlling mobile robot agents with your mind from an area of safety, holds the potential to revolutionise the front lines.
In 2013, this tech was already successfully tested under closed laboratory conditions, when paralysed individuals used a BCI to navigate a robot through a complex obstacle course. The tech uses virtual reality as the soldier's eyes and, through BCI, allows them to encroach on enemy territory without risking their lives. Right now, the technology has three applications: relaying orders, medics, and scouting.
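Stripped to its essentials, BCI telepresence is a mapping from a small vocabulary of decoded intentions to robot commands, much as a gamepad would provide. The sketch below is hypothetical: the intent source and the robot interface are invented placeholders, not any military or research system.

```python
# Hypothetical mapping from decoded mental commands to robot motion.
import random
import time
from dataclasses import dataclass

@dataclass
class VelocityCommand:
    forward: float   # metres per second
    turn: float      # radians per second

# A small command vocabulary keeps the decoding problem tractable.
INTENT_TO_COMMAND = {
    "forward": VelocityCommand(0.5, 0.0),
    "left":    VelocityCommand(0.0, 0.5),
    "right":   VelocityCommand(0.0, -0.5),
    "stop":    VelocityCommand(0.0, 0.0),
}

def decoded_intent():
    """Stand-in for a BCI classifier's output; a real one reads brain signals."""
    return random.choice(list(INTENT_TO_COMMAND))

def send_to_robot(cmd: VelocityCommand):
    """Placeholder for a real robot API (e.g. a velocity topic or radio link)."""
    print(f"robot <- forward={cmd.forward:+.1f} m/s, turn={cmd.turn:+.1f} rad/s")

if __name__ == "__main__":
    for _ in range(10):          # ten control ticks
        send_to_robot(INTENT_TO_COMMAND[decoded_intent()])
        time.sleep(0.2)
```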
"Silent Talk" is another key DARPA project, one that lets soldiers communicate silently. Sometimes on the battlefield, especially when stealth is a must, open communication is inadvisable. Right now, soldiers get around that by using hand signals, but with Silent Talk, hand signals will be a thing of the past, enabling soldiers to talk to each other without making a sound.
In combination with the HoloLens 2 AR headset now being produced for the US military by Microsoft and Airbus, this technology spells a highly digitalised battlefield in the near future.
One where robot units, controlled via telepresence from a distant outpost by a team of soldiers hooked up to BCIs, silently relay orders to frontline troops wearing Microsoft-made HoloLens AR helmets.
BCI and Gaming
If BCIs are suitable for war, they are undoubtedly good for gaming. Fans, programmers and engineers alike constantly push the gaming industry to the cutting edge of digital performance, and none more so than Valve and its CEO, Gabe Newell.
Valve has always pushed the boundaries of what is possible, making leaps and bounds in the digital realm. In 2004, Valve created a revolutionary game engine, Source, and followed it up with Source 2, an engine that saw only light use after its 2015 debut until it was evolved for VR headsets with the release of Half-Life: Alyx.
More recently, Newell shared his vision for BCI in gaming.
In an interview with 1 News, a New Zealand news channel, Newell confirmed that Valve is working with OpenBCI headsets to develop open-source software to help game developers better understand what responses they are stimulating in a player's brain.
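This kind of tooling is something developers can already experiment with. The sketch below assumes the open-source BrainFlow library, which supports OpenBCI boards, and uses its built-in synthetic board so it runs without hardware; the beta/alpha "engagement" ratio at the end is a crude illustrative proxy, not a validated measure of player response.

```python
# Streams simulated EEG via BrainFlow's synthetic board (no hardware required)
# and computes a crude beta/alpha "engagement" ratio. Illustrative only.
import time
import numpy as np
from scipy.signal import welch
from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams

board_id = BoardIds.SYNTHETIC_BOARD.value
board = BoardShim(board_id, BrainFlowInputParams())

board.prepare_session()
board.start_stream()
time.sleep(5)                          # collect five seconds of data
data = board.get_board_data()          # rows are channels, columns are samples
board.stop_stream()
board.release_session()

fs = BoardShim.get_sampling_rate(board_id)
eeg_rows = BoardShim.get_eeg_channels(board_id)

def band_power(signal, low, high):
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    return psd[(freqs >= low) & (freqs <= high)].mean()

channel = data[eeg_rows[0]]            # first EEG channel
alpha = band_power(channel, 8, 12)     # relaxation-linked band
beta = band_power(channel, 13, 30)     # attention-linked band
print(f"beta/alpha ratio (toy engagement proxy): {beta / alpha:.2f}")
```

A studio could log a metric like this alongside gameplay events to see which moments actually hold players' attention, which is roughly the use case Newell describes.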
In November 2020, OpenBCI unveiled a headset called Galea, specifically designed to work in unison with Valveâs own VR headset, the Valve Index, and according to Newell, BCI is the future of gaming.
Newell stated that "if you're a software developer in 2022 who doesn't have one of these in your test lab, you're making a silly mistake."
Put bluntly, this giant of the games industry has no doubt that game developers will not only use BCIs to fine-tune their games, but that players will be able to experience games in a whole new way. As such, Valve is exploring BCI to change the way gamers control and view a game.
According to Mike Ambinder, an experimental psychologist at Valve, the company aims to move away from games with seventeen buttons as the standard and towards something more naturalistic. On the visual side, Newell speaks critically of our natural eyesight and instead envisions a future where BCIs beam visuals directly into our heads.
This more direct path would make games much more natural, turning what was once a flat and colourless experience into something richer than we could ever imagine.
The Ethics of Brain-Computer Interfaces
Brain-Computer Interfaces promise us an entirely new way of interacting with computers, machines and eventually with each other. However, with this new way of communication also comes a new potential privacy disaster. After all, our thoughts are currently one of the last truly private domains, and BCI is likely to change this.
If we can't trust companies with our data today, why would that be any different tomorrow? Imagine what can happen when the companies of tomorrow obtain access to a plethora of extremely valuable user data that reveals our true selves, our very thoughts. With a non-invasive wearable, you at least retain full control because you can take it off, but that control will disappear with any invasive BCI.
According to NextMind, the algorithms used are not learning from a personâs brain waves directly. Instead, they use anonymised data from users who opt-in, and they use that data to improve their algorithms over time. Moreover, any developer using the NextMind SDK and developing applications for the wearable BCI will not have access to raw brain data. Instead, they are provided with the data analysis, which they can use to improve their applications. For now, this seems to work, especially because NextMind does not require any user details or personally identifiable information, so it appears that this anonymised data can indeed remain anonymous.
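The boundary NextMind describes, where developers receive analysis rather than raw brain data, can be illustrated with a small hypothetical sketch: the "SDK" below only ever hands an application a derived event, never the underlying samples. Every class and method name here is invented for illustration and is not NextMind's actual API.

```python
# Hypothetical privacy boundary: applications see derived events, never raw samples.
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class FocusEvent:
    target_id: str       # which on-screen object the user focused on
    confidence: float    # decoder confidence in [0, 1]; no neural data included

class NeuroDecoderSDK:
    """Invented sketch of an SDK layer that keeps raw signals private."""

    def __init__(self):
        self._raw_buffer = []            # private: raw samples never leave this class

    def _acquire_raw(self):
        self._raw_buffer = [random.random() for _ in range(256)]  # stand-in samples

    def poll(self, targets):
        """Return only a derived event; the raw buffer is never exposed."""
        self._acquire_raw()
        score = sum(self._raw_buffer) / len(self._raw_buffer)     # toy "decoder"
        return FocusEvent(target_id=random.choice(targets), confidence=round(score, 2))

if __name__ == "__main__":
    sdk = NeuroDecoderSDK()
    print(sdk.poll(["play_button", "pause_button"]))
```

Keeping the raw buffer behind such a boundary is a design choice, not a guarantee; it only holds as long as the vendor enforces it.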
However, with any other company, especially a social network, developing wearable BCIs, I am fearful of the privacy consequences of using your brain to control a computer. Fortunately, Facebook recently cancelled its mind-reading research project. I would certainly not trust Facebook with my brain data; you would get advertising beamed straight into your brain before you know it.
What is necessary is a discussion on how we want to deal with this precious personal data, whether it is created using invasive or non-invasive BCIs. How long can this data be stored, for what purposes, and can it be linked to a username and other data sources? We should be cautious about giving up the last genuinely private domain of our lives.
BCI and our Relationships with Computers
The more we integrate BCIs into our lives, the more they will change our relationship with computers. In the past decades, we have gone from typing to speaking, and soon we might move on to simply thinking at our computers. Similar to how Alexa, Siri and other voice assistants have changed our relationship with computers and brands (they decide what is good for you, instead of you choosing), the same will likely happen when we move to telepathy using BCIs.
Would we still need physical computers, mobile devices or email when we can simply send our thoughts to the other side of the world? How would social media change if everyone "shouted" their thoughts on a future version of social media instead of tweeting? How could we block certain messages from entering our brains, and how could we stop certain thoughts from accidentally being sent to our boss?
Exploring the first opportunities with this new technology is undoubtedly cool, but there are many unknowns when it comes to how it will change our relationship with computers and machines. It will require significant research to understand the best way of dealing with Brain-Computer Interfaces, whether invasive or non-invasive.