Back in 2015, DARPA was experimenting with Brain-Computer Interface (BCI) control of simulated advanced fighter aircraft. By 2018, a volunteer with a BCI implant could successfully control up to three fighter aircraft simultaneously in the simulated environment. And that’s just the publicly released information.
A Brain-Computer Interface, sometimes referred to as a Mind-Machine Interface, is, in simple words, a system that hooks the brain up to a computer through either implanted or external electrodes. The computer analyses electrical signals from the brain and translates them into computer commands. The commands can then be used to trigger computer, machine or robotic actions. Using implants, it is also possible for the computer to provide feedback: electrical input into the brain produces sensations that a person can learn to interpret and act on.
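The pipeline described above — sample electrical signals, extract a feature, map it to a command — can be sketched in a few lines. This is a minimal, hypothetical illustration: the feature (mean signal power), the threshold and the command names are all invented for the example, not taken from any real BCI device.

```python
# Hypothetical sketch of a BCI decode step: a window of electrode
# samples is reduced to one feature, which is mapped to a command.

def band_power(samples):
    """Mean squared amplitude of a window of electrode samples."""
    return sum(s * s for s in samples) / len(samples)

def decode_command(samples, threshold=0.5):
    """Translate a window of samples into a discrete machine command.

    The 0.5 threshold is illustrative; a real system would calibrate
    it per user and per electrode.
    """
    return "MOVE" if band_power(samples) > threshold else "REST"

# A strong oscillation decodes to an action; near-silence decodes to rest.
active = [0.9, -0.8, 1.0, -0.9, 0.85, -0.95]
idle = [0.05, -0.04, 0.06, -0.05, 0.04, -0.06]
print(decode_command(active))  # MOVE
print(decode_command(idle))    # REST
```

Real systems replace the single threshold with trained classifiers over many electrodes and frequency bands, but the structure — signal in, feature out, command out — is the same.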
Incidentally, BCI research (on primates) was already being carried out with some success way back in 1969. By 1999, researchers were able to capture and reconstruct movie scenes shown to cats, using electrodes embedded in the cats’ brains: the electrodes captured the brain’s neuronal firings and computers decoded them. Essentially, people could see on their computers what the cats were seeing. By 2016, researchers were successfully achieving prosthetic finger control using BCI interfaces. The research and documentation have come so far that there are now open-source, DIY, research-grade BCI EEGs available, along with the requisite computer boards and programs.
Research has clearly come far enough that cutting-edge technology companies have already entered the fray. Brainhacking is no longer limited to universities, government agencies, medicine and individual enthusiasts; it’s where the money is headed. With computer AI and machine-learning capabilities improving exponentially, people will have to find ways of communicating, interacting and even coexisting with advanced machines. It’ll be BCI for the masses. A brainhacking revolution.
But before we’re all controlling computers, machines, robots and perhaps each other through advanced BCI gear, there will be leaps of advancement in areas that are already close to the science.
Medical science would of course be one of the first places where we’d see the greatest impact, especially for the disabled. There have already been tremendous leaps in BCI-enabled prosthetics. Going forward, most disabilities may have enabling solutions, as long as the person has a relatively healthy brain. The science so far has been all about detecting and understanding how neurons function; the leap here is that detecting brain and mental states would become much easier.
Brain-to-brain communication is a bit more complicated, but once the associated machine learning has adequately caught up with everyone’s brain patterns, communication could be transmitted readily between people. Further along the communication curve, we could have opt-in and voting functions. Imagine being able to have a direct voice in important issues: something close to absolute democracy.
Interactive media will be one of the areas where BCI makes gains most rapidly. Computer gaming without the fidgety controllers or complicated consoles: just imagine being able to prop your feet up, wear your headset and become part of a game world. Similarly, the movie industry could see a transformation of its own, especially once advanced animation software becomes readily available. Once that’s on the creative table, most people with a story to tell could literally ‘dream up’ a movie.
Education would likely be heavily impacted too, simply by getting real-time feedback from individual students on the effectiveness of educational material. Study material could then be dynamically adjusted toward whatever presentation holds each student’s attention best.
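The adaptive-material idea above amounts to a simple feedback loop: record an attention score while each presentation format is shown, then serve the format with the best running average. The sketch below is purely illustrative — the format names, the 0-to-1 attention scores and the class itself are invented, and a real BCI would supply the scores from live EEG measurements.

```python
# Hypothetical sketch: pick the study-material format with the highest
# average BCI-measured attention. All names and values are invented.

class AdaptiveLesson:
    def __init__(self, formats):
        # One list of attention readings per presentation format.
        self.scores = {f: [] for f in formats}

    def record(self, fmt, attention):
        """Log an attention reading (0.0-1.0) taken while fmt was shown."""
        self.scores[fmt].append(attention)

    def best_format(self):
        """Return the format with the highest mean attention so far."""
        def mean(f):
            readings = self.scores[f]
            return sum(readings) / len(readings) if readings else 0.0
        return max(self.scores, key=mean)

lesson = AdaptiveLesson(["text", "video", "interactive"])
lesson.record("text", 0.4)
lesson.record("video", 0.7)
lesson.record("interactive", 0.9)
print(lesson.best_format())  # interactive
```

A production system would also need to handle novelty effects and decay old readings, but the core loop — measure, compare, re-serve — is this simple.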
There would of course be numerous other applications. Where do you think this technology will lead?