Bioinformatics and Brain-Computer Interfaces - The Digital Speaker Series EP14

👋 Hi, I am Mark. I am a strategic futurist and innovation keynote speaker. I advise governments and enterprises on emerging technologies such as AI or the metaverse. My subscribers receive a free weekly newsletter on cutting-edge technology.

If a human can use a brain-computer interface, can an AI use a computer-brain interface?

Hello everyone and welcome to the Tech Journal. My name is Mark van Rijmenam, and I am The Digital Speaker.

Having stepped through the looking glass, I am Mark van Rijmenam's digital twin, and I look to explore this digital wonderland, unveiling the latest and greatest in digital innovation.

During my digital travels, I'll be taking a closer look at what these innovations mean for both our professional and private lives.

From neural networks and deep minds to metaverses and digital churches, let's go on a journey behind the digital veil and see what we can find.

Today, I am going to take a more in-depth look into some of the latest breakthroughs and innovations in bioinformatics.

After that, I am going to delve back into cyberpunk and take another look at brain-computer interfacing.

I will be exploring the latest medical, military and gaming advancements, taking a peek at what the future might hold for this cutting-edge industry.

And so, what are we waiting for? It is time to start today's digital download.

You can view the episode above, watch it on Vimeo or YouTube, or listen to it on Anchor.fm, SoundCloud or Spotify.

Introduction to Bioinformatics

Bioinformatics is a modern discipline going back only half a century, to the 1970s, when the concept was initially defined as the study of information processes in biotic systems.

The field has grown far more complex since then. Now an interdisciplinary science, it combines biology, digitalisation, information engineering, maths, and statistics to analyse and interpret biological data, letting us predict even the most complex of biological structures.

And it is there that we can thank algorithms and digitalised machinery for those predictions.

Thanks to digitalisation, we now understand DNA and RNA better, and can accurately predict how protein structures are formed or folded, as I discussed in a previous video.

This brings me to the first advancement in bioinformatics for today.

The team at DeepMind, one of the world's most cutting-edge AI development companies, published a paper alongside an announcement marking an end to the fifty-year protein-folding problem with their solution, AlphaFold.

AlphaFold, a deep-learning algorithm, can now officially make atomically accurate protein structure predictions.

In addition to the roughly 20,000 human protein structures, AlphaFold boasts 350,000 protein structures in total, encompassing twenty other organisms, in their words, "from E. coli to yeast, and from the fruit fly to the mouse."

DNA and RNA

For those who do not know, DNA replicates, stores, and acts as a blueprint for all genetic information. It's like the blueprint for an entire house. The same DNA can be found in every cell of an organism.

RNA, on the other hand, is the detailed blueprint of each specific room.

RNA translates the genetic information found in DNA into different formats according to where it needs to be. This means RNA is specific to certain areas, and it is the reason why feet don't grow where hands are supposed to be.
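
To make that blueprint analogy a bit more concrete, here is a minimal Python sketch of the DNA-to-RNA-to-protein flow. The sequence and the tiny codon table are purely illustrative, not real gene data.

```python
# A minimal sketch of the DNA -> RNA -> protein flow described above.
# The sequence and the three-codon table are illustrative, not real gene data.

CODON_TABLE = {  # tiny excerpt of the standard genetic code
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP",
}

def transcribe(dna: str) -> str:
    """DNA coding strand -> messenger RNA (T becomes U)."""
    return dna.replace("T", "U")

def translate(mrna: str) -> list[str]:
    """Read the mRNA three bases at a time and map codons to amino acids."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE.get(mrna[i:i + 3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

dna = "ATGTTTGGCTAA"           # toy coding sequence
mrna = transcribe(dna)          # "AUGUUUGGCUAA"
print(translate(mrna))          # ['Met', 'Phe', 'Gly']
```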

Alright, so this sounds all fancy and scientific. But why is any of this actually useful?

Well, the theory goes that once we are able to understand our DNA, the blueprint, our RNA, the localised blueprint, and our protein structures, the tools which make the blueprints a reality, we will be able to understand our bodies to a level never reached before.

Once we know how they work, well, it is only a matter of time before we learn how to manipulate, improve, and adapt them, giving us a much firmer grip on the human condition.

Advancements in the Field of Bioinformatics

Another advancement in the field of bioinformatics comes thanks to algorithmic work by the NCBI, the National Center for Biotechnology Information based in Maryland, USA, which created the Basic Local Alignment Search Tool, known as BLAST.

BLAST includes a variety of algorithms, turning complex sequence analysis into the press of a button.

These algorithms are used to compare primary biological sequence information: essentially, the amino-acid sequences of proteins or the nucleotide sequences of DNA and RNA molecules, biology's own ones and zeros.

A super important part of biology, whether it is biomedical or biochemical, is being able to tell when one data set is similar to another.

This tool is essentially able to take complex biological data and read and compare it to a library of other biological sources in a matter of moments.
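
To give a feel for what "comparing sequences" means in code, here is a toy Python sketch of the local-alignment scoring that BLAST approximates with much faster heuristics. The query, the three-entry "library", and the scoring values are all invented for illustration.

```python
# A toy illustration of the idea behind BLAST-style searches: score how well a
# query sequence locally aligns against each entry in a small reference
# "library". Real BLAST uses fast heuristics over huge databases; this is the
# exact (and much slower) Smith-Waterman scoring it approximates.

def local_alignment_score(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
    """Best local alignment score between sequences a and b (Smith-Waterman)."""
    rows, cols = len(a) + 1, len(b) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, rows):
        curr = [0] * cols
        for j in range(1, cols):
            diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            curr[j] = max(0, diag, prev[j] + gap, curr[j - 1] + gap)
            best = max(best, curr[j])
        prev = curr
    return best

library = {                       # made-up reference sequences
    "seq_mouse": "ACCTGAGGTTAC",
    "seq_yeast": "TTTTGCGCAAAT",
    "seq_ecoli": "ACCTGACGTTAC",
}
query = "CCTGAGGTT"

# Rank the library entries by how well they match the query.
for name, seq in sorted(library.items(),
                        key=lambda kv: local_alignment_score(query, kv[1]),
                        reverse=True):
    print(name, local_alignment_score(query, seq))
```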

This momentous achievement may sound unremarkable, but this is truly the first point in history where humanity has both such a substantial library of genomes and the ability to accurately analyse similarities between them.

But it did not stop there. The ISCB, the International Society for Computational Biology, converted BLAST into a computational pipeline designed not to see what is the same, but what is different.

A pipeline, often called a data pipeline, is just a chain of processes where the output of one step becomes the input of the next: if X, then Y.

This simple change ended up having huge ramifications.

The pipeline is used to identify DNA variations. The tool takes the data being studied and examines it against genomes in BLAST's library, looking for even the smallest of differences.

If it finds one, scientists can then draw their own conclusions.
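
As a rough illustration of that pipeline idea, here is a small Python sketch in which each step feeds the next: align, compare, report. It is not the actual ISCB pipeline, and the reference and sample sequences are invented.

```python
# A minimal sketch of the "look for differences" idea behind a variant-finding
# pipeline: each step's output feeds the next (align -> compare -> report).
# Illustrative only; not the actual ISCB/BLAST pipeline.

def align(reference: str, sample: str) -> tuple[str, str]:
    # Placeholder alignment step: assume the sequences are already the same
    # length and aligned base-for-base (a real pipeline would run BLAST here).
    assert len(reference) == len(sample)
    return reference, sample

def compare(aligned: tuple[str, str]) -> list[tuple[int, str, str]]:
    """Return every position where the sample differs from the reference."""
    ref, sample = aligned
    return [(pos, r, s) for pos, (r, s) in enumerate(zip(ref, sample)) if r != s]

def report(variants: list[tuple[int, str, str]]) -> None:
    for pos, ref_base, sample_base in variants:
        print(f"position {pos}: reference {ref_base} -> sample {sample_base}")

# Chain the steps: if X, then Y.
report(compare(align("ACCTGAGGTTAC", "ACCTGACGTTAC")))
# -> position 6: reference G -> sample C
```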

Alongside AlphaFold, BLAST, and its computational pipeline sibling, we have another useful digitalised tool: STAR.

STAR, the ultrafast universal RNA-seq aligner, is an alignment algorithm used alongside BLAST that enables scientists to analyse RNA in record time.

At no other point in history have we been able to accurately align RNA-sequencing reads in such short timeframes, achieving 45 million paired reads per hour per processor.

The free open-source software outperforms other aligners by a factor of fifty.

Meaning, even on a modest server, it can work through the 3 billion base pairs in the human genome in an afternoon, as opposed to only a few hundred using the 1977 Maxam–Gilbert manual sequencing method.
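
A quick back-of-the-envelope calculation shows why "an afternoon" is plausible, using the published figure of 45 million paired reads per hour per processor. The read length, coverage depth, and core count below are my own assumptions, not numbers from the episode.

```python
# Back-of-the-envelope check of the "afternoon on a modest server" claim,
# using the figure of 45 million paired reads per hour per processor.
# Read length, coverage depth and core count are assumptions for illustration.

genome_bases = 3_000_000_000       # ~3 billion base pairs in the human genome
coverage = 30                      # assumed sequencing depth
bases_per_read_pair = 2 * 100      # assumed 2 x 100-base paired-end reads
pairs_per_hour_per_cpu = 45_000_000
cpus = 8                           # a modest multi-core server

read_pairs = genome_bases * coverage / bases_per_read_pair
hours = read_pairs / (pairs_per_hour_per_cpu * cpus)
print(f"{read_pairs:.2e} read pairs, roughly {hours:.1f} hours to align")
# -> 4.50e+08 read pairs, roughly 1.2 hours to align
```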

This tool can be used for a plethora of medical reasons, but right now it is at the front lines of the pandemic.

Ever wonder how the COVID vaccine was produced so fast? This is why. It has been helping researchers across the world understand the virus at lightning speed.

These digital tools are making bioinformatics more useful, with potentially groundbreaking implications across the health field, including disease prevention, vaccination, treatment, diagnosis, and drug development. It is another move in the cosmic game of chess we have been playing with the grim reaper since time immemorial.

It just seems the more we digitalise, the further we get from the sweet release of mortality.

Brain-Computer Interfaces

Fighting the organic elements nature has thrown at us is not the only way we look for immortality. Another way is through BCI, otherwise known as brain-computer interfacing, which covers everything from brain chips to exoskeletons.

But let's break these advancements into three sections: medical, military, and gaming.

Way back in 2009, Andrea Kübler, writing in The Neurology of Consciousness, stated that "BCI allows users to directly communicate their intention without any involvement of the motor periphery."

In layman's terms, people using BCI can make what they want to happen, happen, without needing to move.

And just over 10 years later, we are seeing more and more evidence that she was right.

Medical BCIs

In a world-first, scientists have developed a brain-computer interface that can instantly turn mental handwriting into text on a screen.

The system, designed by the BrainGate consortium, works similarly to other BCI devices: a sensor implanted in the brain monitors specific brain signals, in this case the ones associated with handwriting.

These signals are recorded and sent, in real time, to a computer that displays the text on-screen, enabling the locked-in patient to write at a rate of ninety characters per minute.
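
The loop itself is conceptually simple: read neural features, decode a character, put it on screen. The Python sketch below mimics that loop with simulated signals and a toy nearest-template decoder standing in for BrainGate's real decoder, which is far more sophisticated.

```python
# A highly simplified sketch of the sensor -> decoder -> screen loop described
# above. The "neural features" are simulated noise around per-character
# templates, and a toy nearest-template classifier stands in for the decoder.

import random

CHARS = "hi"
TEMPLATES = {c: [random.gauss(0, 1) for _ in range(16)] for c in CHARS}

def read_neural_features(intended_char: str) -> list[float]:
    """Stand-in for the implanted sensor: the template plus a little noise."""
    return [x + random.gauss(0, 0.2) for x in TEMPLATES[intended_char]]

def decode(features: list[float]) -> str:
    """Toy decoder: pick the closest character template."""
    def dist(template):
        return sum((f - t) ** 2 for f, t in zip(features, template))
    return min(TEMPLATES, key=lambda c: dist(TEMPLATES[c]))

screen = ""
for intended in "hi":                 # the patient "writes" each letter
    screen += decode(read_neural_features(intended))
print(screen)                         # almost always prints "hi"
```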

While this system only works with people who learnt to handwrite and were paralyzed later in life, it is a dramatic step forward and proves that brain-computer interfaces have the potential to help so many paralyzed patients.

Nathan Copeland, a man who broke his spine in a car accident and is paralyzed from the chest down, can personally testify to the potential this technology holds.

Nathan is one of the first in history to have a working BCI, and regularly tests BCI machinery and software.

He also recently challenged Neuralink's monkey to a game of Pong.

Paralyzed people playing Pong with monkeys. Don't you just love the future?

It will not be easy though, this monkey has had a lot of practice.

Earlier this year, Neuralink addressed the public once more, demonstrating what progress has been made since the three pigs demonstration in 2020. In that demonstration, Gertrude the pig was showcased with a surgically implanted Neuralink device recording everyday brain activities, like moving and smelling.

This time, a nine-year-old macaque monkey lovingly named Pager showed off its impressive Pong skills in front of a worldwide audience.

Pager had the coin-sized Link disc installed in its brain via surgical robot, connecting thousands of micro threads from the chip to specific neurons.

Once the Link was installed, Pager was trained to play Pong using a joystick.

Then, once Pager had demonstrated its amazing Pong skills, the joystick was disconnected and, without realising it, Pager played Pong directly using its Link.
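
The calibration trick is worth spelling out: while the joystick is still connected, you can fit a decoder that maps neural activity to joystick movement, and afterwards the decoder alone drives the game. The Python sketch below uses simulated firing rates and a plain least-squares fit; Neuralink's actual decoder is not public, so this is only the general idea.

```python
# A toy sketch of the calibration idea behind the Pager demo: while the monkey
# uses the joystick, fit a decoder mapping neural activity to joystick
# velocity; afterwards the decoder alone drives the paddle. Firing rates here
# are simulated, and plain least squares stands in for the real decoder.

import numpy as np

rng = np.random.default_rng(0)

n_samples, n_channels = 500, 64
true_weights = rng.normal(size=(n_channels, 2))           # unknown neural->velocity map
rates = rng.normal(size=(n_samples, n_channels))          # simulated firing rates
joystick = rates @ true_weights + 0.1 * rng.normal(size=(n_samples, 2))

# Calibration phase: fit decoder weights from paired (rates, joystick) data.
decoder, *_ = np.linalg.lstsq(rates, joystick, rcond=None)

# "Joystick unplugged" phase: predict paddle velocity from neural data alone.
new_rates = rng.normal(size=(1, n_channels))
predicted_velocity = new_rates @ decoder
print(predicted_velocity)   # 2-D paddle velocity decoded without the joystick
```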

While the progress is impressive, not everyone thinks Neuralink is going the right way.

According to Inbrain, a Spanish tech start-up, Neuralink is wasting its time trying to create a long-lasting device with a fast-degrading polymer called PEDOT, and, of course, they are offering a solution.

Carolina Aguilar, Inbrain co-founder and chief executive, holds up graphene, the Nobel Prize-winning material at the heart of their own devices.

The one-atom-thick material has proven to be the strongest, most conductive, and longest-lasting of any material suitable for BCIs.

The market seems to agree too, with the EU's Graphene Flagship programme putting over a billion euros behind the material as it looks for commercial applications.

Maybe a future partnership with Neuralink and Inbrain is on the cards. If that is the case, the future looks brighter for all BCIs.

Outside of communication, there are also physical augmentations that are coming on in leaps and bounds.

There have been two big moves forward in the robotic limb aspect of BCI, both of which feed into each other.

The first, a closed-loop system combining AI, robotics, and BCI tech, enabled a quadriplegic man to cut his food and feed himself.

The second: researchers at the University of Pittsburgh's Rehab Neural Engineering Labs were able to induce sensation in robotic limbs, enabling the user to feel what the arm feels.

Adding sensation to robotic limbs doubled the speed at which tasks could be performed, letting the user rely on touch instead of sight.

These huge breakthroughs hint at a near-future where prosthetic limb wearers can not only control their arms via BCI, but can also feel what the arms feel.

While robotic limb sensation has many applications in the public sphere, where the user may need to be careful and pick up delicate objects, it may not be so suitable for military applications.

Military and BCI

As expected, with new technology comes new ways to wage war, and BCI is no different.

The US Department of Defense's University Research Instrumentation Program lists brain-computer interfacing as a key area of funded research, while DARPA, the Defense Advanced Research Projects Agency, has been funding a variety of BCI projects since the early 1970s.

And that is only what they are telling us. Projects elsewhere, in China, Russia, the EU, and beyond, are undoubtedly going on behind closed doors.

So, what does the near future hold?

Telepresence, the technology of controlling mobile robot agents with your mind from an area of safety, holds the potential to revolutionise the front lines.

Back in 2013, this tech was already tested in closed laboratory conditions, when paralyzed individuals used BCI to successfully navigate a robot through a complex obstacle course.

With hints of 'Ghost in the Shell' and 'Avatar', this tech uses VR as the soldier's eyes and, through BCI, gives them the opportunity to encroach on enemy territory without risking their life.

The technology right now has three applications: relaying orders, medical support, and scouting.

However, this would not be the first time that military tech intended for scouting ended up far more lethal.

When UAVs, unmanned aerial vehicles, now just known as drones, first appeared, they were supposed to be scouts. Now they have missiles attached…

It is not just giant robots controlled by BCI that the military is interested in, no.

'Silent Talk' is a key DARPA project which aims to let soldiers communicate silently.

Sometimes on the battlefield, especially when stealth is a must, open communication is inadvisable.

Right now, they get around that by using hand signals, but with Silent Talk, those signals would be a thing of the past, enabling soldiers to talk to each other without making a sound.

This technology, in combination with the current HoloLens 2 AR headset, which is now being produced for the US military by Microsoft, spells a highly digitalised battlefield in the near future.

One where telepresence robot units, controlled from a distant outpost by a team of soldiers hooked up to BCIs, silently relay orders to frontline soldiers wearing Microsoft-made AR HoloLens helmets.

These possibilities are frightening, but at least it is good news for arms traders, I guess.

So if you're looking to make your fortune trading digitalised BCI military gear, now's the time to invest.

At least, thanks to bioinformatics, the future of medicine has never looked so bright.

With DeepMind's AlphaFold program marking the end of half a century of scientific toil, we enter a new era of medical understanding.

One where we understand each protein structure, each genome, RNA sequence and DNA strand.

Thanks to digitalisation and bioinformatics, another piece of the human puzzle has been laid currently on the table, another step forward towards the full picture.

Once we have laid them all, once we truly know how the body works, we will no longer be a slave to it, and may not even need to wage war.

Digitalisation holds the potential to commit disease and illness to the past, at least for those who can afford it.

Hand in hand with medical BCI advancements, physical, chemical, and genetic ailments could become as outdated as smallpox.

With all of these advancements, there is a lot of potential for good, and as much for bad, and where there is either, there is always a lot of money to be made.

Gaming and Brain-Computer Interfaces

Speaking of money to be made, I still have not touched on the world's top-grossing entertainment industry: the game industry.

Over the past three decades, gaming has gone from being a niche hobby to a mainstream sport.

Fans, programmers, and engineers alike constantly push the gaming industry to the cutting edge of digital performance, and none more so than Valve and its CEO, Gabe Newell.

Valve has always pushed the boundaries of what is possible, making leaps and bounds in the digital realm, quite literally.

Back in 2004, Valve created a revolutionary game engine, Source, and followed it up with Source 2, an engine used only lightly since 2015 until it was evolved for VR headsets with the release of Half-Life: Alyx.

More recently, Newell talked about his vision for BCI in gaming.

In Newell's interview with 1 News, a New Zealand news channel, he confirmed that Valve is working with OpenBCI headsets to develop open-source software to help game devs better understand what responses they are stimulating in the player's brain.

In November 2020, OpenBCI unveiled a headset called Galea, specifically designed to work in unison with Valveโ€™s own VR headset, the Valve Index.

Newell went on to state, "If you're a software developer in 2022 who doesn't have one of these in your test lab, you're making a silly mistake."

Putting it bluntly, this giant of the game industry has no doubt that not only are game devs going to use BCIs to fine-tune their games, but players are going to be able to experience games in a whole new way.
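
What might "understanding player responses" look like in practice? One simple, commonly used signal is EEG band power. The Python sketch below computes alpha and beta band power on a synthetic two-second window; it is not Valve's or OpenBCI's software, just an illustration of the kind of metric a developer could log alongside gameplay events.

```python
# A rough sketch of the kind of signal game developers could pull from a
# headset like Galea: estimate how much power a short EEG window carries in
# the alpha (8-12 Hz) and beta (13-30 Hz) bands, often used as crude proxies
# for relaxation versus engagement. The signal below is synthetic.

import numpy as np

fs = 256                                 # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)            # a two-second window
signal = (np.sin(2 * np.pi * 10 * t)             # 10 Hz alpha component
          + 0.5 * np.sin(2 * np.pi * 20 * t)     # 20 Hz beta component
          + 0.2 * np.random.randn(t.size))       # noise

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, 1 / fs)

def band_power(lo, hi):
    """Total spectral power between lo and hi Hz."""
    return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

alpha, beta = band_power(8, 12), band_power(13, 30)
print(f"alpha/beta ratio: {alpha / beta:.2f}")   # higher -> a "calmer" window
```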

Valve is in the process of using BCI to change the way gamers control and view a game.

Controller-wise, Mike Ambinder, an experimental psychologist at Valve, aims to move away from games with seventeen buttons as standard and towards something more naturalistic.

On the visual side, Newell speaks critically of the limits of our natural eyesight and instead envisions a future where BCIs would beam visuals directly into our heads.

This more direct path would enable games to be much more real, turning what was once a flat and colourless experience into something richer than we could ever imagine.

Final Thoughts

Where do you imagine these advancements taking us?

Are you optimistic that bioinformatics will save us from our mortal prisons, or do you see a bit of Black Mirror in Gabe Newell's vision of the future?

Leave your thoughts in the comment section below.

And on that e-note, I have been your digitised host, Mark van Rijmenam, The Digital Speaker.

This has been The Tech Journal.

If digital tech gets your ones and zeros firing, press the subscribe button and don't forget to leave a like.

See you next time for your information download.

Stay digital.

Dr Mark van Rijmenam

Dr. Mark van Rijmenam is a strategic futurist known as The Digital Speaker. He is a true Architect of Tomorrow, bringing both vision and pragmatism to his keynotes. As a renowned global keynote speaker, a Global Speaking Fellow, recognized as a Global Guru Futurist and a 5-time author, he captivates Fortune 500 business leaders and governments globally.

Recognized by Salesforce as one of 16 must-know AI influencers, he combines forward-thinking insights with a balanced, optimistic dystopian view. With his pioneering use of a digital twin and his next-gen media platform Futurwise, Mark doesn't just speak on AI and the future; he lives it, inspiring audiences to harness technology ethically and strategically. You can reach his digital twin via WhatsApp at: +1 (830) 463-6967
