Have you ever dreamt of controlling things at will using just your brain? If you thought that was impossible, think again, because many companies are developing mind-reading technologies that let you do just that.

This technology is called a brain-computer interface (BCI): a system that enables direct communication between the brain and an external device.

It’s truly mind-blowing, and the opportunities are endless, from changing channels on a TV to controlling the movement of a drone, no remote required. In fact, these use cases have already been demonstrated.

BCIs are also enabling amputees to move robotic limbs with their thoughts, and even to feel the sensation of touch.

I could see this technology revolutionising the world of gaming as well, especially in the VR category. Just imagine being able to control your character in-game using your brain, from moving around to picking up objects. Now that would be the complete immersive experience.

All in all, BCIs could change forever the way we interact with devices: our smartphones, tablets, voice assistants and cars.

What’s also really exciting is that tech leaders such as Elon Musk and Mark Zuckerberg are backing companies experimenting with the technology.

Now let’s take a look at how this mind-boggling, futuristic technology works…

How does the mind-reading technology work?

There are currently two approaches to BCIs: invasive and non-invasive. Invasive systems use hardware that is in direct contact with the brain, while non-invasive systems use hardware, usually worn on the head, that detects brain signals through the scalp.

A BCI device uses electrodes to acquire brain signals and translate them into commands that are relayed to an output device to carry out a desired action.

Before a BCI device can function properly, it first needs to learn how your brain works. Essentially, it needs to map different neural patterns to the user’s intended actions. For example, a user’s neural activity might increase when they attempt to push an on-screen object, but increase far less when they pull it.

Using AI algorithms, a computer can analyse these patterns and distinguish between the user’s different intended actions.
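To make the idea concrete, here’s a toy sketch of that calibration-then-classification loop: labelled “neural activity” readings are averaged into one centroid per action, and a new reading is matched to the nearest centroid. The feature values and action names are invented for illustration; real BCIs use far richer signal features and far more sophisticated models.

```python
# Toy sketch: map brain-signal feature vectors to intended actions.
# All numbers below are made up for illustration.

def centroid(samples):
    """Mean feature vector of a list of equal-length samples."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def classify(features, centroids):
    """Return the action whose centroid is closest (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda action: dist(features, centroids[action]))

# Calibration phase: record labelled neural activity for each action.
training = {
    "push": [[0.9, 0.8], [1.0, 0.7], [0.8, 0.9]],   # higher activity
    "pull": [[0.3, 0.2], [0.2, 0.3], [0.4, 0.25]],  # lower activity
}
centroids = {action: centroid(samples) for action, samples in training.items()}

# Online phase: translate a fresh reading into a command.
print(classify([0.85, 0.75], centroids))  # a "push"-like reading
print(classify([0.25, 0.30], centroids))  # a "pull"-like reading
```

The two-phase structure mirrors the article’s description: a training step that associates neural patterns with actions, then a live step that turns new signals into commands.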

Let’s take a look at an example of Neuralink to simplify things:

The video above showed an example of an invasive BCI system: a ‘link’ was surgically inserted into the monkey’s brain in order to pick up its brain signals.

Initially, the monkey used a joystick to move the player around while a computer studied its brain activity, learning which brain signals corresponded to which actions.

The output device was then programmed to use this information to perform specific tasks, allowing the monkey to move the player around using just its neural activity (brain signals).

It’s unclear how long it took the computer to fully analyse the monkey’s brain activity, but with significant advances in AI this process could be sped up in the future.

The end goal for Neuralink is to allow people with paralysis to interact with their phones and computers.

A look at some other companies using BCIs

As mentioned before, BCIs are revolutionising prostheses, allowing amputees to move robotic limbs with their brains, just as they would a natural limb. They also enable amputees to feel the sensation of touch, making it easier to pick up objects.

A study in the New England Journal of Medicine reported on four patients in Sweden who have lived with this type of prosthesis for three to seven years.

The research was led by Max Ortiz Catalan, Associate Professor at Chalmers University of Technology, in collaboration with Sahlgrenska University Hospital, University of Gothenburg, and Integrum AB, all in Gothenburg, Sweden.

The prosthetic is anchored to the skeleton in the amputation stump and can send signals in both directions between the prosthesis and the brain. Using electrodes that are implanted in muscles and nerves inside the stump, brain signals are passed into the implant and interpreted by an embedded control system, allowing for movement of the robotic limb.

An image of a revolutionary prosthetic that allows a person to move their robotic limb and feel things

The touch sensations arise from force sensors in the prosthetic thumb. The signals from the sensors are converted by the control system in the prosthesis into electrical signals which are sent to stimulate a nerve in the arm stump. The nerve leads to the brain, which then perceives the pressure levels against the hand.
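The touch-feedback path described above can be sketched as a simple mapping from a force reading to a stimulation level. The sensor range, scaling and clamping here are assumptions for illustration, not values from the actual prosthesis.

```python
# Minimal sketch of the touch-feedback path: a force reading from the
# prosthetic thumb is converted into a nerve-stimulation intensity.
# FORCE_MAX and the linear scaling are invented for illustration.

STIM_MIN, STIM_MAX = 0.0, 1.0   # normalised stimulation intensity
FORCE_MAX = 20.0                # assumed sensor range, in newtons

def force_to_stimulation(force_newtons):
    """Map a thumb force reading to a clamped stimulation intensity."""
    level = force_newtons / FORCE_MAX
    return max(STIM_MIN, min(STIM_MAX, level))

print(force_to_stimulation(5.0))   # light grip  -> 0.25
print(force_to_stimulation(40.0))  # clamped at the maximum -> 1.0
```

The clamp matters in a design like this: stimulation sent to a nerve must stay within a safe band regardless of how hard the hand is pressed.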

CES 2020 also gave people a taste of what this tech has to offer.

NextMind showed off technology that enables people to change channels, mute and pause video using their brains.

BrainCo has developed a headband called FocusOne that changes colour depending on the user’s level of engagement. A light on the front of the headband turns red when your brain is intensely focused, yellow if you’re in a relaxed state and blue if you’re in a meditative state.
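The colour mapping just described boils down to bucketing an engagement score. Here’s a hedged sketch of that logic; the score scale and thresholds are assumptions, not BrainCo’s actual values.

```python
# Sketch of a FocusOne-style mapping from engagement to headband colour.
# The [0, 1] score scale and the 0.7 / 0.4 thresholds are assumptions.

def engagement_colour(score):
    """Map an engagement score in [0, 1] to a headband colour."""
    if score >= 0.7:
        return "red"     # intensely focused
    if score >= 0.4:
        return "yellow"  # relaxed
    return "blue"        # meditative

print(engagement_colour(0.9))  # red
print(engagement_colour(0.5))  # yellow
print(engagement_colour(0.1))  # blue
```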

BrainCo believes this technology could be used in classrooms so that teachers can tell whether students are focused. Lucky I didn’t have that when I was at school!

BrainCo’s FocusOne

There are a few downsides to the technology though:

First of all, implanting invasive BCIs inside your brain sounds risky, but if many people begin using them and they’re found to be safe, consumer trust may build.

On the other hand, most non-invasive BCIs look absolutely ridiculous, and you wouldn’t want to embarrass yourself in public. Companies such as Emotiv have suggested embedding them inside hats, while others offer sleeker designs that you’d feel much more comfortable wearing out and about.

Lastly, BCIs are still in their infancy, and can only handle simple tasks such as moving a player up or down, or gauging someone’s concentration. As AI and machine learning advance, hopefully more sophisticated actions will become possible with a BCI, such as fully controlling your phone: swiping, clicking and even typing.

The main message of this article is that mind-reading technologies have enormous potential and could soon make their mark on the world. How soon?

Well, that depends on how quickly AI advances and how many more companies are willing to experiment with the technology.