Brain Implants Help Paralyzed People Move and Communicate Again

Image credits: Mike Kai Chen/The New York Times

Welcome to the 17th issue of the AstroFeather AI newsletter!

With a handful of stories detailing exciting advances in brain implants making national and global headlines, I thought now would be a good time to take a deep dive into the technology, specifically invasive brain-computer interfaces (BCIs)!

In this week's issue, you'll also find some exciting updates on medical AI product launches and FDA approvals, as well as recent implementations of generative AI in hospital systems, including Providence Health and NYU Langone Health.

I hope you enjoy reading, and if you have any helpful feedback, feel free to reply to this email, contact me directly on LinkedIn (@adideswilliams), or follow the AstroFeather LinkedIn page!

Thanks - Adides Williams, Founder @ AstroFeather

In this week’s recap (15 min read):

  • Update #1: Paralyzed Patients Can Move and Communicate Again Thanks to Brain Implants and AI

  • Update #2: Medical AI Product Launches and FDA Approvals

  • Update #3: Recent Generative AI Implementations in the Hospital Setting

Must-Read News Articles and Updates

Update #1. Paralyzed Patients Can Move and Communicate Again Thanks to Brain Implants and AI.

The latest: A patient with a rare form of ALS that makes it nearly impossible to use his hands to type or write is pioneering a brain-computer interface (BCI) called the Stentrode that allows him to control digital devices with his brain, according to a recent AFP report [news report | video].

Image credits: AFP

ALS (amyotrophic lateral sclerosis), more commonly known as Lou Gehrig's disease, is a neurological disorder characterized by the progressive degeneration and death of motor neurons (a class of nerve cells responsible for voluntary muscle movement), ultimately leading to muscle wasting and paralysis throughout the body (including the arms and legs). Unfortunately, as this incurable disease progresses, patients lose the ability to perform everyday activities such as chewing, walking, and communicating with others.

To help patients with neurodegenerative diseases such as ALS or severe paralysis regain their ability to communicate, Synchron developed the Stentrode invasive brain-computer interface (BCI), which allows them to control electronic devices such as computers and cell phones to communicate with the outside world by texting, emailing, or accessing social media.

How it works (Part 1): BCIs (like the Stentrode) are fascinating devices that often seem like something out of a science fiction novel! Simply put, they are advanced systems that can connect our brain to an electronic device (such as a computer) or robotic system (such as an exoskeleton), allowing us to control these devices using our brain activity. In general, BCIs consist of four components:

  1. A hardware component that "listens" to brain signals. For invasive BCIs, this component is surgically implanted on the surface of the brain or within the brain tissue; for non-invasive BCIs, it is placed on the surface of the scalp. Once in place, the BCI can "listen" to brain activity.

  2. A software component that decodes the brain signals. In general, machine learning (ML) algorithms can be used to recognize which specific patterns of brain activity correspond to specific physical movements. For example, if a person wants to raise his or her hand, the mere thought of doing so generates a brain signal pattern that's different from the brain signal pattern that corresponds to lifting the legs.

  3. A software component that translates the decoded signals into something a computer can understand. ML algorithms can be used to map decoded signals to computer actions. For example, the brain signal pattern that corresponds to lifting your leg can be mapped to a computer action on the screen, such as a "click" of the cursor (mouse).

  4. A hardware component that sends the decoded brain signals to a target device. After a user's intended action is mapped to a computer command, it is sent to an external device to generate an action.
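To make the four components above concrete, here is a minimal, purely illustrative Python sketch of the pipeline. The two-number "feature vectors", the nearest-centroid "decoder", and the intent-to-action mapping are all invented stand-ins — real BCIs decode multi-channel voltage time series with trained ML models and talk to actual device drivers.

```python
# Stage 1: acquisition -- pretend the implant hands us a feature vector
# summarizing recent brain activity (e.g., band power per electrode).
def acquire_signal(source):
    return source()  # real hardware would stream samples here

# Stage 2: decoding -- a trained model maps features to an intended movement.
# A trivial nearest-centroid lookup stands in for a real ML decoder.
CENTROIDS = {
    "raise_hand": [1.0, 0.1],
    "lift_leg":   [0.1, 1.0],
}

def decode(features):
    def dist(label):
        return sum((f - c) ** 2 for f, c in zip(features, CENTROIDS[label]))
    return min(CENTROIDS, key=dist)

# Stage 3: translation -- map the decoded intent to a computer action.
INTENT_TO_ACTION = {"raise_hand": "move_cursor", "lift_leg": "click"}

def translate(intent):
    return INTENT_TO_ACTION[intent]

# Stage 4: output -- send the command to the target device.
def send(action, device):
    device.append(action)  # a real system would call a device driver

device_log = []
send(translate(decode(acquire_signal(lambda: [0.2, 0.9]))), device_log)
print(device_log)  # the "lift_leg" signal pattern becomes a "click"
```

Each stage is deliberately decoupled: swapping the toy centroid decoder for a real classifier, or the log for a real device driver, would not change the other stages.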

How it works (Part 2) - The Stentrode: Now that we have an (admittedly simplified) understanding of how BCIs work, we can look at the Stentrode, which is a small (8-millimeter) cylindrical wire mesh stent-mounted electrode array that can be permanently implanted into a blood vessel in the brain without the need for invasive (open) brain surgery. Once implanted, it captures brain signals that are ultimately translated into digital instructions that enable patients to perform everyday tasks (hands-free) on external devices, including emailing and texting:

  1. Surgical implantation through the jugular vein: A surgeon makes an incision in the patient's neck and maneuvers the Stentrode via a guide catheter through the jugular vein to a region of the superior sagittal sinus (a large vein at the top of the brain) adjacent to the motor cortex (a region of the brain responsible for movement). Once the superior sagittal sinus is reached, the guide catheter is retracted to release the Stentrode, which expands and places its electrodes along the blood vessel wall.

  2. "Listening" to brain signals: Over time, endothelial cells grow over the electrode array, chronically embedding it into the endothelium of the blood vessel like a tattoo under the skin. At this point, the Stentrode captures brain signals generated by the user's intention to move a limb and transmits them to a receiver.

  3. Signal transmission: A second set of procedures connects the Stentrode via a wire to a receiver-transmitter unit implanted under the skin in the patient's chest. Signals from the Stentrode are sent to this receiver, which in turn transmits them to an external system.

  4. Decoding and translation of brain signals: Once received by the external system, the brain signals are then decoded (to determine the action from which they originated) and translated into digital instructions that correspond to actions such as "clicking" the cursor or "zooming in" (magnifying) portions of the screen.

  5. Sending commands to the computer: Finally, digital instructions to perform the desired action are sent to the computer from the connected external system.

As you can imagine, patients must work with a field technician to set up the device. In an interview with the University of Melbourne, a Stentrode user briefly describes this process: First, he is asked to think of a physical action, such as tapping his left ankle 10 times. An AI/ML algorithm is then trained to recognize the user's specific brain signal pattern resulting from the intention to tap his ankle, and then maps it to a computer action. So, when he thinks "tap my left ankle," the brain signal pattern associated with that thought is translated into an on-screen action like "left mouse click".
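The calibration routine described above can be sketched in a few lines of Python. Everything here is hypothetical — the two-number "feature vectors" stand in for real recorded brain activity, and the nearest-centroid fit stands in for Synchron's actual (unpublished) decoding models — but the loop is the same: record labeled repetitions, fit a decoder, bind the decoded intent to an on-screen action.

```python
def fit_centroids(labeled_samples):
    """Average the feature vectors recorded for each imagined movement."""
    sums, counts = {}, {}
    for label, features in labeled_samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, f in enumerate(features):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Return the imagined movement whose centroid is nearest."""
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(features, centroids[lab])))

# Calibration: ~10 repetitions of "tap left ankle" vs. a resting baseline.
training = (
    [("tap_left_ankle", [0.9 + 0.01 * i, 0.1]) for i in range(10)]
    + [("rest", [0.1, 0.1 + 0.01 * i]) for i in range(10)]
)
model = fit_centroids(training)

# Binding: the decoded intent drives a computer action.
ACTION = {"tap_left_ankle": "left_mouse_click", "rest": None}
print(ACTION[predict(model, [0.85, 0.12])])  # a new sample near the
                                             # "ankle tap" pattern clicks
```

This is also why calibration is per-user: the centroids (or, in reality, the model weights) encode one person's idiosyncratic signal patterns and would not transfer to another user.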

Image credits: Mt. Sinai Hospital

Results: Early patient feedback and results for the Stentrode have been positive. In general, patients have demonstrated a restored ability to use digital devices to communicate (via email and text) and engage in online activities such as shopping and gaming. Early safety studies of the device indicate that its long-term use is safe, results that undoubtedly contributed to Synchron's FDA approvals in 2020 and 2021.

Yes, but: While the Stentrode has been an early success in the use of permanently implanted (endovascular) BCIs, there are still some limitations associated with the use of the device. These include the quality of the brain signal (compared to BCIs whose electrodes penetrate the brain tissue) and the lack of a universal brain-to-computer language:

  • Signal quality: During a discussion with CNBC, Synchron's Senior Director of Neuroscience, Peter Yoo, mentioned that the quality of the brain signal "isn't perfect" because the device is not inserted directly into the brain tissue. This could result in users having to make multiple attempts to complete a task if the brain signals are incompletely (or incorrectly) captured due to low signal quality.

  • Lack of a universal brain-to-computer language: For the Stentrode BCI system to be successful, its brain signal decoding AI must be trained to understand each user's unique brain signal patterns that result from intent. Because people's minds "speak" to their bodies in different ways, the challenge is to train the AI to understand intent regardless of who's thinking it.

Behind the news: BCIs are not new and have been around for several decades. In recent years, however, there has been a growing interplay between BCIs and AI (machine learning) that has sparked intense interest in BCI development, leading to invasive and non-invasive devices in multiple industries, including medicine, virtual reality (VR) and gaming, and mental health and wellness.

By now, many have heard of arguably the industry's "gold standard" invasive BCI, the NeuroPort Array from Blackrock Neurotech (no relation to the investment firm), and/or the highly publicized N1 from Neuralink. However, I wanted to highlight two additional invasive BCI competitors that are expected to make an impact as they gain funding and momentum in this highly competitive space:

[#1] Connexus Direct Data Interface (DDI) from Paradromics: The Connexus DDI is a fully implantable system that uses biocompatible microelectrode arrays to record from neurons in the cerebral cortex, collects the resulting brain signal data, and transmits it wirelessly through the skin to an external device.

  • Surgical implantation procedures: During surgery, the cranial hub of the Connexus DDI system is placed to sit flush with the surface of the skull. It contains an array of microelectrodes that can reach 1.5 mm below the cortical surface to pick up signals from more than 1,600 neurons.

  • Clinical trials and FDA approval status: Although the Connexus DDI has not yet undergone clinical trials, the FDA has granted it Breakthrough Device Designation.

  • Recent fundraising: In May, Paradromics raised $33 million in a Series A round to support the launch of its first-in-human clinical trial of the Connexus DDI. This brings Paradromics' total funding to $91.3 million, of which approximately $66 million has been raised in the last 2 years (according to Crunchbase records).

[#2] Layer 7 Cortical Interface from Precision Neuroscience: The Layer 7 Cortical Interface is a thin, flexible-film microelectrode array (1,024 electrodes covering an area of one square centimeter) that conforms to the brain's cortex without causing tissue damage. It can deliver thousands of recording channels to any location on the brain's surface.

  • Surgical implantation procedures: The patented surgical procedure for this device is designed to be minimally invasive and reversible.

  • Clinical trials and FDA approval status: Precision Neuroscience has conducted a first-in-human study using the Layer 7 Cortical Interface to map human brain activity (including reading, recording, and mapping electrical activity from the surface of the brain). However, the device has not yet received FDA approval.

  • Recent fundraising: In January, Precision Neuroscience raised $41 million in a Series B round to continue development of its minimally invasive (and reversible) brain implants. This brings Precision's total funding to $53 million in the two years since its founding (according to Crunchbase records).

Driving the news: While BCI companies continue to make great strides in bringing BCIs to those in need, research labs around the world are testing new and inventive ways to use the technology to restore mobility. Some of these stories have even made national and/or global headlines, including the development of a novel brain-spine interface (BSI) and a first-of-its-kind BCI approach:

[#1] A "wireless digital bridge" of brain and spine implants restores the ability to walk: An international group of researchers reported (in May) that a new BSI successfully restored the connection between a paralyzed patient's brain and spinal cord, allowing him to walk, navigate obstacles, and traverse steep ramps.

  • The BSI consists of two disk-shaped devices surgically implanted in the skull, with electrode arrays that sit on top of the motor cortex and capture brain signals generated by the intention to walk. These signals are transmitted wirelessly to sensors attached to a helmet on the patient's head, and from there to a laptop in a backpack (also worn by the patient).

  • On the laptop, ML algorithms decode the brain signals and translate them into instructions to move the leg and foot muscles, which are sent to a spinal stimulator that delivers different patterns of electrical pulses to the spinal cord to enable movement.

  • After several sessions of neurorehabilitation, including walking, balance, and single-joint movements with the BSI, the patient was able to walk with crutches even when the BSI implant was turned off.
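As a rough illustration of the decode-then-stimulate loop in the bullets above, the sketch below maps an (already decoded) walking intent to a stimulation pattern for the spinal stimulator. The pattern names, sites, and pulse rates are invented for illustration; the actual stimulation programs are far more complex and run continuously on streamed signals.

```python
# Hypothetical mapping from decoded walking intent to a stimulation program.
STIM_PATTERNS = {
    "swing_left":  {"site": "left_lumbar",  "pulse_hz": 40},
    "swing_right": {"site": "right_lumbar", "pulse_hz": 40},
    "stand":       {"site": "bilateral",    "pulse_hz": 15},
}

def bridge_step(decoded_intent, stimulator):
    """One cycle of the digital bridge: intent in, stimulation command out."""
    pattern = STIM_PATTERNS.get(decoded_intent)
    if pattern is None:
        return None  # unrecognized intent: stimulate nothing (fail safe)
    stimulator.append(pattern)
    return pattern

# Simulate a few decoded intents arriving from the laptop's ML decoder.
commands = []
for intent in ["stand", "swing_left", "swing_right", "swing_left"]:
    bridge_step(intent, commands)
print([c["site"] for c in commands])
```

The fail-safe default (no stimulation on an unrecognized intent) mirrors a general safety principle in closed-loop stimulators: when the decoder is unsure, do nothing rather than fire a wrong pattern.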

Image credits: Gilles Weber/CHUV

[#2] "Double neural bypass" restores use of arms and hands: Researchers and surgeons at Northwell Health's Feinstein Institutes for Medical Research issued a press release (in July) announcing early results from a clinical trial demonstrating that a first-of-its-kind BCI approach durably restored movement and sensation (the sense of touch) to a patient's paralyzed arm.

  • The BCI consists of five microchips implanted in the brain (two chips in the motor cortex and three chips in areas responsible for touch and feeling in the fingers) and two ports surgically implanted in the skull that can be connected to a computer via an HDMI cable.

  • When the patient thinks about moving, the resulting electrical signals from the brain are captured by the microchips and transmitted to the ports embedded in the skull, which then relay the signals to a computer where they are decoded by ML algorithms. The decoded movement signals are then sent to non-invasive electrode patches placed on the spine and forearm muscles that cause the hands to move.

  • Importantly, when the patient touches an object or person, sensors on the skin can send signals to the computer, which then communicates with microchips in the brain to (partially) restore sensation and touch. This allows touch information to flow from his hand to his brain, and movement instructions to flow from his brain to his arm and hand, creating a double neural bypass.
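The two-way traffic in the bullets above — motor commands flowing out of the brain and touch signals flowing back in — can be pictured as a simple message router. The device names and message formats here are placeholders, not the trial's actual hardware interfaces.

```python
def route(message):
    """Route one decoded message along the correct leg of the double bypass."""
    kind, payload = message
    if kind == "motor":   # brain -> computer -> forearm electrode patches
        return ("muscle_electrodes", payload)
    if kind == "touch":   # skin sensors -> computer -> sensory-cortex chips
        return ("brain_microchips", payload)
    raise ValueError(f"unknown message type: {kind}")

print(route(("motor", "close_hand")))    # outbound: drive the hand muscles
print(route(("touch", "pressure=0.6")))  # inbound: restore a sense of touch
```

The "double" in double neural bypass is exactly this: two independent routes around the injured spinal cord, one for each direction of travel.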

Image credits: Matthew Libassi/Feinstein Institutes for Medical Research

Why it matters:

  • According to the results of the US Paralysis Prevalence and Health Disparities Survey, 1.7% of the US population is living with paralysis, which translates to approximately 5.6 million US citizens in 2023. While it is difficult to quantify global cases of paralysis, the World Health Organization (WHO) estimates that between 250,000 and 500,000 people worldwide suffer from spinal cord injury, a leading cause of paralysis, each year. While rarer in occurrence, ALS (which often results in paralysis) affects approximately 360,000 people worldwide (assuming a prevalence of 4.42/100,000 population) and between 18,000 and 30,000 people in the US, with an estimated economic burden of $1.02 billion each year.

  • It's clear from the stories discussed above that the BCI community has made great strides in developing new and exciting systems and methods that can help this growing number of people with paralysis and ALS regain their ability to communicate and move around their environment by translating brain signals into computer (and/or machine) commands.

  • While there are several concerns associated with BCIs (especially surgical risks such as infection that invasive BCIs pose), the technology has great potential to restore a paralyzed patient's ability to perform some daily tasks and interact meaningfully with friends and family, resulting in a significant improvement in their quality of life.

Update #2. Medical AI Product Launches and FDA Approvals.

Vital launches AI tool to translate medical jargon: Health tech company Vital has launched an AI doctor-to-patient translator that turns complex medical terminology into plain language. The free tool uses third-party large language models (LLMs) and natural language processing (NLP) to translate medical notes, radiology reports, discharge summaries, and test results to a 5th grade reading level. The translator aims to improve patient health literacy, reduce clinician workload, and minimize miscommunication and adverse clinical events.

Image credits: Vital.io

Vanderbilt University Medical Center (VUMC) has launched its first virtual nurse program: A team of dedicated VUMC nurses can now perform tasks (such as documentation) remotely via teleconferencing in the patient's room, freeing bedside nurses to spend more time with patients. Initially piloted on a 36-bed ventricular assist device and transplant unit, up to three virtual nurses at a time work on the unit 24 hours a day, focusing on tasks such as admission and discharge documentation. The program is designed to improve patient and staff satisfaction and is supported by an integrated AI-based technology system from vendor care.ai.

Image credits: Vanderbilt University Medical Center

Viz.ai has received De Novo FDA clearance for its AI detection algorithm for hypertrophic cardiomyopathy (HCM): The Viz HCM module analyzes electrocardiograms to detect signs of HCM, a condition characterized by thickened or enlarged heart muscle. Developed with support from Bristol Myers Squibb, the module can detect subtle signs of thickened heart tissue in adult patients, does not provide a diagnosis, and is intended to complement the use of ultrasound to detect HCM.

Image credits: Viz.ai

IMIDEX has received 510(k) clearance from the FDA for its AI-powered medical device: VisiRad XR uses machine learning to analyze chest X-rays to identify lung abnormalities, potentially helping to detect lung cancer at an early stage. VisiRad XR is designed to improve the detection of lung nodules and masses during routine care without requiring additional test orders. In a retrospective study, the software demonstrated 83% sensitivity in identifying lung nodules and masses.

Image credits: Imidex

GE Healthcare has received FDA 510(k) clearance for its Portrait Mobile wearable patient monitoring system for use in US hospitals: The system consists of wearable sensors that collect vital signs from patients and transmit them wirelessly to a smartphone-like monitor. The data is then displayed on a dashboard for clinicians to monitor in real time, helping caregivers identify early signs of deterioration in patients recovering from critical illness and enabling timely intervention. A real-world study conducted at a London hospital showed that nurses found the system helpful in identifying patient deterioration earlier than typical observation methods.

Image credits: GE Health

Update #3. Recent Generative AI Implementations in the Hospital Setting.

Epic relaunches app marketplace with generative AI services: EHR vendor Epic has introduced the Partners and Pals programs as part of its initiative to collaborate with third-party developers. Abridge has joined the program (in partnership with Emory Healthcare) to integrate its AI clinical documentation tools into Epic's clinical workflows, saving providers time. Talkdesk also joined the program, offering its AI cloud contact center services, including the Talkdesk Healthcare Experience Cloud, to improve patient access, revenue cycle management and patient care within Epic's Cheers CRM suite.

Providence Health and Services is using generative AI in its MedPearl clinical decision support tool: MedPearl helps primary care providers make specialty care referrals by providing advice and guidance. Since its launch in January, MedPearl has been used by approximately 2,700 out of 4,000 primary and urgent care providers, who have searched the application more than 90,000 times. The integration of generative AI has enhanced MedPearl's search capabilities and interpretation of medical acronyms. Providence plans to further enhance the tool by incorporating semantic search and promptly updating it with new medical guidelines.

Real Chemistry and WhizAI have partnered to develop patient journey software using generative AI: Real Chemistry's analytics engine, combined with WhizAI's generative AI visualization platform, will provide a unique solution for patient journey mapping. By understanding the patient experience throughout the diagnosis and treatment process, the software helps hospitals improve their marketing strategies. Users can also leverage the platform to ask questions about specific disease states and gain data-driven insights that (hopefully) lead to improved patient outcomes.

NYU Langone Health goes all-in on generative AI: Earlier this year, NYU Langone partnered with Microsoft to create one of the first HIPAA-compliant GPT-4 ecosystems in healthcare. Since then, the medical center has continued its adoption of generative AI by hosting "the first Generative AI Prompt-A-Thon in healthcare." During the event, teams of clinicians, educators, and researchers collaborated to find GPT-4-based solutions to healthcare challenges using de-identified patient data in a secure and HIPAA-compliant manner. The goal of the Prompt-A-Thon was to empower and educate the workforce on the benefits of generative AI.

Thanks for reading this issue of the AstroFeather newsletter and I’ll see you in the next issue!

If you enjoy AstroFeather content, be sure to share this newsletter!

Adides Williams, Founder @ AstroFeather (astrofeather.com)
