Bionic limbs that merge with nerves and muscles can be used in daily life

PLUS: Forward Health launches AI self-contained medical stations

Background image credits: Anna-Lena Lundqvist / Science Robotics

Welcome to the latest issue of the AstroFeather Tech Review!

A patient in Sweden with a below-elbow amputation has been pioneering a novel neuromusculoskeletal artificial hand that integrates with the body's nerves, muscles, and skeleton, and can be intuitively controlled and comfortably used in daily activities for years!

Public interest in this topic has grown since the international research team that created the artificial limb system shared promising results, and in this issue, we’ll take a deep dive into the neuromusculoskeletal device and some other models that merge artificial limbs with the nervous and muscular systems.

In this issue, you'll also find some helpful updates on medical AI technology (FDA approvals and product launches) and recent implementations of AI in the healthcare space, including the rollout of AI self-service clinics.

Thanks - Adides Williams, Founder @ AstroFeather AI

In this issue's recap (25 – 30 min read time):

Update #1: Bionic limbs that merge with nerves and muscles can be used in daily life.

Update #2: Medical Tech: Product Launches and FDA Approvals.

Update #3: Venture capital funding, partnerships, and acquisitions in AI healthcare.

Update #1. Bionic limbs that merge with nerves and muscles can be used in daily life.

The latest: A groundbreaking bionic hand that connects directly to a person's nerves, muscles, and skeleton has shown promising results in restoring a basic sense of touch, improving quality of life, and remaining functional after years of daily use.

This fusion of human and machine (called a neuromusculoskeletal interface) translates the user’s brain signals into precise movements of the artificial limb (for intuitive control), enables bidirectional communication between the artificial limb and the body (to restore a basic sense of touch), and comfortably anchors the artificial limb to the body (removing the need for sockets), according to some exciting results shared in Science Robotics!

Image credits: M. Ortiz-Catalan et al. Science Robotics (doi: 10.1126/scirobotics.adf7360)

How it works (Part 1) – Neuromusculoskeletal Prostheses Basics

Human movement is a complex process that requires seamless coordination between the nervous, muscular, and skeletal systems. When we decide to move, electrical signals originating in a region of the brain responsible for voluntary movement (the primary motor cortex) travel down the spinal cord through specialized neurons (called motor neurons) and along the peripheral nerves to reach the muscles of the limbs (such as the arms and legs). Once these electrical signals reach the muscles, they cause the muscle fibers to contract, which in turn pull on the bones in the limb via attached tendons to create movement.  

When a person's limb is amputated, the bones, muscles, and nerves responsible for limb movement are often left intact in the residual limb. This provides an opportunity to design an advanced artificial limb that can interface with the remaining nervous and skeletal systems to restore both movement and sensation. For example, when a person with an amputation attempts to move their missing (phantom) limb, the brain is still sending electrical signals through the residual peripheral nerves that were once connected to the residual limb, and these signals can be detected and used to control an artificial limb. Moreover, the bones in the residual limb can provide a suitable foundation for direct skeletal attachment of the artificial limb (via a process called osseointegration), allowing it to function as a (near) natural extension of the residual limb.

Thus, a neuromusculoskeletal prosthesis that integrates with the nerves, muscles, and skeleton is possible thanks to three main interfaces between it and the human body: a skeletal interface that anchors the artificial limb to the body, a neuromuscular interface that captures signals from the muscles and nerves generated by the intention to move, and an artificial limb controller where machine learning (ML) algorithms convert these signals into instructions that tell the artificial limb how to move:

  1. Skeletal interface: To create the skeletal interface, surgeons use a procedure based on the principle of osseointegration, which involves inserting a metal implant (usually made of titanium) into the bone of the residual limb. As the bone heals, it attaches directly to the implant, which extends through the skin and serves as a secure foundation for attaching an artificial limb. By anchoring the titanium implant to the patient's skeleton, osseointegration creates a mechanical connection between the artificial limb and the body, allowing the artificial limb to function as a direct extension of the residual limb and to be worn all day, every day, without the complications associated with traditional socket attachment (such as chafing and skin sores).

  2. Neuromuscular interface: To create the neuromuscular interface, surgeons first identify the nerves and muscles in the residual limb that controlled movement in the fully intact limb before amputation. Next, they carefully attach electrodes to either the surface of or inside the identified nerves and muscles. Whenever a person with an amputation tries to move their missing limb, the identified nerves and muscles produce electrical signals; the goal, therefore, is to position the electrodes so that they accurately detect these electrical signals and transmit them through wires that pass through the titanium bolt (of the skeletal interface) and into the artificial limb.

  3. Artificial limb controller with ML algorithms: Electrical signals (from the muscles and nerves) ultimately arrive at the artificial limb as distinct patterns that correspond to an intended movement. For example, the intention to close the hand would result in a pattern of electrical signals (a "close hand" pattern) that would be different from the pattern of electrical signals associated with the intention to open the hand (an "open hand" pattern). ML algorithms stored on the artificial limb controller are trained to recognize a variety of these electrical signal patterns and then translate them into instructions for the artificial limb.

  4. Artificial (prosthetic) limb: Finally, when instructions are received from the controller, the artificial limb performs the desired action. For example, instructions for a robotic hand would include opening and closing the hand, extending and curling the fingers, and forming different grips to grasp and pinch objects.
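The pattern-matching step described above can be sketched in a few lines. This is a minimal, purely illustrative example (the channel values, movement templates, and a nearest-centroid rule are all invented for clarity; the actual controller trains ML classifiers on multichannel electrode recordings):

```python
# Hypothetical sketch: matching an incoming EMG feature pattern to the
# closest known movement "template" via nearest-centroid classification.

import math

# Invented templates: mean signal amplitude per electrode channel that
# (hypothetically) characterizes each intended movement after training.
TEMPLATES = {
    "open_hand":  [0.80, 0.20, 0.10],
    "close_hand": [0.10, 0.90, 0.30],
    "rest":       [0.05, 0.05, 0.05],
}

def classify(features):
    """Return the movement whose template is closest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda move: dist(TEMPLATES[move], features))

print(classify([0.12, 0.85, 0.25]))  # nearest to the "close_hand" template
```

In the real system, the recognized pattern is then emitted as a movement command to the prosthetic hand rather than printed.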

Image credit: M. Ortiz-Catalan et al. Science Robotics (doi: 10.1126/scirobotics.adf7360)

How it works (Part 2) – Self-Contained Neuromusculoskeletal Prosthetic Hand

The first-ever clinical implementation of a neuromusculoskeletal bionic hand for below-elbow amputation has recently been achieved through a combination of surgical and engineering techniques that integrate an advanced robotic prosthesis with the user's nervous and skeletal systems. The self-contained system consists of a robotic hand, a wrist-shaped battery, an embedded artificial limb controller that translates muscle and nerve signals into movement commands for the robotic hand, and neuromuscular and skeletal interfaces that allow bidirectional communication between the user and the prosthesis:

  1. Skeletal interface – creating a bone-anchored (osseointegrated) implant system: To create the skeletal interface, a flap of skin is lifted at the end of the residual limb, and the bones in the forearm (radius and ulna) are located and adjusted to equal length. For each bone, the central cavity (medullary canal) is opened, a threaded titanium implant (called a fixture) is inserted, and the wound is closed. After approximately six months, the implanted fixture is re-exposed, and a skin-penetrating cylindrical titanium implant (the abutment) is attached to the fixture with a titanium screw (the abutment screw) that holds the implant system together. The wound is closed with the titanium abutment extending through the skin to attach to the robotic hand.

  2. Neuromuscular interface – implanting the electrodes: To create the neuromuscular interface, residual nerve endings (neuromas) are first removed from the three main nerves (median, radial, and ulnar) in the residual forearm, which are then divided into individual nerve bundles (fascicles). Healthy pieces of thigh muscle (free muscle grafts that act as biological amplifiers of nerve signals) are then wrapped around the ends of the nerves, sutured in place, and an electrode is inserted directly into the muscle tissue, while a remaining part of the ulnar nerve is wrapped with a cuff electrode for sensory feedback. Electrodes are also attached to the native muscles of the residual forearm on the surface of the muscle (in the epimysium - a layer of dense connective tissue that surrounds the muscle) and directly into the muscle tissue.

  3. Connecting the neuromuscular and skeletal interfaces: Multiwire connectors are used to combine the electrode lead wires (from the neuromuscular interface) to form two separate wires that pass through the ulna and radius bones, respectively. These wires are then attached to feedthrough connectors located in the bone-anchored titanium implant system, allowing wired electrical communication from the electrodes inside the body to the titanium rods (abutments) that connect to the robotic hand. In this way, muscle signals generated by muscle contraction (because of an intention to move) can travel through the electrodes and their lead wires to the artificial limb controller (in the attached robotic hand), where they are converted into movement commands for the robotic hand.

  4. Artificial limb controller with pattern recognition algorithms: Once the robotic hand receives electrical signals from contracting muscles, machine learning (ML) algorithms (on a custom embedded controller) process them into movement instructions for the robotic hand. In a previous study, the research team described their custom control system, which uses two classifier algorithms, linear discriminant analysis (LDA) and support vector machine (SVM), trained to recognize patterns of muscle signals that correspond to different actions and then translate them into instructions for the robotic limb. Shortly after implantation, patients use a "virtual training prosthesis" to train the ML algorithms by "moving" their phantom hand to mirror a virtual avatar hand on the screen as it cycles through different hand poses.

  5. Robotic hand: After receiving instructions from the embedded controller based on the pattern of muscle signals, the prosthesis performs the desired action. Notably, a basic sense of touch is also restored thanks to sensors in the fingers of the robotic hand that can measure the force applied (by pinching and grasping objects). Once an object is grasped, electrical signals from the force sensors in the fingers are sent to the artificial limb controller, where they are translated into electrical pulses that are sent by a neurostimulator module to stimulate the cuff electrode around the ulnar nerve, allowing the patient to “perceive” changing pressure levels against the robotic hand that are interpreted as "touch".
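The LDA classification step (point 4) can be illustrated with a toy two-class Fisher linear discriminant, the same family of algorithm as the LDA classifier the team describes. All feature values below are synthetic stand-ins for recorded muscle signals:

```python
# Toy Fisher LDA on synthetic "muscle signal" features (two channels,
# two intents). Numbers are invented; the real controller is trained on
# the patient's own electrode recordings.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic feature vectors (e.g., mean amplitude per channel) per intent.
open_hand  = rng.normal(loc=[1.0, 0.2], scale=0.1, size=(50, 2))
close_hand = rng.normal(loc=[0.2, 1.0], scale=0.1, size=(50, 2))

# Fisher discriminant: w = Sw^-1 (mu1 - mu0), threshold at the midpoint.
mu0, mu1 = open_hand.mean(axis=0), close_hand.mean(axis=0)
Sw = np.cov(open_hand.T) + np.cov(close_hand.T)   # within-class scatter
w = np.linalg.solve(Sw, mu1 - mu0)                # projection direction
threshold = w @ (mu0 + mu1) / 2

def predict(features):
    """Project features onto w and compare against the midpoint threshold."""
    return "close_hand" if w @ np.asarray(features) > threshold else "open_hand"

print(predict([0.25, 0.95]))  # -> close_hand
```

An SVM would draw a similar decision boundary but maximize the margin between the two classes instead of using class means and scatter.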

Image credit: M. Ortiz-Catalan et al. Science Robotics (doi: 10.1126/scirobotics.adf7360)

Results

Patient feedback for the neuromusculoskeletal prosthetic hand (consisting of an osseointegrated implant that supports bidirectional communication between a robotic hand and implanted electrodes) has been extremely positive! Not only has the patient successfully used the prosthesis for more than three years to perform a range of daily activities, including packing a suitcase and preparing food, but she also mentioned in an interview that her excruciating phantom pain, which constantly felt like her hand was "in a meat grinder," has been reduced to more tolerable levels.

Image credit: M. Ortiz-Catalan et al. Science Robotics (doi: 10.1126/scirobotics.adf7360)

Research studies: Prior to the surgical procedures (pre-intervention) described above (in the "How it works (Part 2)" section), the patient used a myoelectric prosthetic hand that was attached to her body using a cup-like device called a socket, and muscle signals were sent to the prosthesis through electrodes attached to the surface of the skin. However, during a one-year follow-up after surgery (post-intervention), the patient demonstrated improved control of the prosthesis, reported improvements in quality of life and perception of disability, and reduced pain levels (thanks to the neuromusculoskeletal interface):

  • Reduced pain related to amputation: Changes in phantom limb pain (PLP), stump pain, and how much the phantom limb interfered with daily activities were assessed using the Questionnaire for Phantom Limb Pain Tracking (Q-PLPT), with a scale ranging from 0 (no pain or interference with daily life) to 10 (extreme pain or full interference with daily life). At the one-year follow-up, stump pain was completely eliminated (from 6 before to 0 after the intervention) and PLP intensity decreased by 2 points (from 5 to 3). Perceived interference of PLP during daily activities, work, and sleep was also reduced. A likely explanation for the reduced pain is the surgical removal of neuromas (which are associated with phantom limb pain) and their subsequent replacement with healthy muscle grafts, which inhibit neuroma regrowth.

  • Quality of life (QOL) questionnaire results: Several self-assessed questionnaires were used to determine improvements in patient QOL, including the EQ-5D-5L (which asks questions about topics such as mobility, self-care, and anxiety/depression), the Disabilities of the Arm, Shoulder, and Hand (DASH - which asks users to rate difficulties in performing physical activities with the arm, shoulder, and hand), and the Questionnaire for Upper Limb Amputation (Q-ULA - which assesses problems with prosthesis use). During the one-year follow-up, the patient's EQ-5D-5L increased (by 0.4 points relative to pre-intervention), indicating (likely) clinically relevant improvement; the DASH score decreased (by 9 points), indicating greater ability to perform upper limb tasks and suggesting less perceived disability; and the Q-ULA decreased (by 27 points), indicating that the patient experienced fewer problems related to the prosthesis and amputation.

  • ACMC test for prosthesis functionality: The Assessment of Capacity for Myoelectric Control (ACMC) is an observational assessment that evaluates a person's ability to use a myoelectric prosthesis to perform pre-defined daily tasks, including setting a table and packing luggage. According to the ACMC, the patient's luggage packing and table setting scores improved by 13% and 23%, respectively, and the overall ACMC score was in the "extremely capable" range.

  • SHAP test for prosthesis functionality: The Southampton Hand Assessment Procedure (SHAP) is a timed assessment that measures the effectiveness of an upper limb prosthesis and consists of tasks that require participants to grasp and move objects such as cylinders and spheres, as well as perform everyday tasks such as turning a door handle and picking up coins. During the one-year follow-up, the patient showed an overall 23% improvement in completing the SHAP tasks (compared to pre-intervention assessments).

FDA approval status – Neuromusculoskeletal interface: While the neuromusculoskeletal interface has not been FDA approved, the collection of required surgical procedures has been used in proof-of-concept studies in Sweden for patients with above-elbow (transhumeral) and below-elbow (transradial) amputations.

FDA approval status – Osseointegration: It is worth noting that the osseointegration procedures for the neuromusculoskeletal interface are performed using the OPRA Implant System, which has its own FDA clearance status.

Yes, the results are exciting, but...

From the study results, it's clear that the use of the self-contained neuromusculoskeletal bionic hand resulted in many improvements in quality of life and could be comfortably used in daily activities (outside of the laboratory) for years! However, the small number of participants in the study makes it difficult to understand how a larger group of individuals with below-elbow amputations would perform with the neuromusculoskeletal bionic hand and respond to the required invasive procedures:

  • Single patient case study only (additional studies needed): At the time of this writing, only one patient with a below-elbow amputation has been treated with the proposed combination of surgical and engineering techniques for integrating a prosthetic hand with the skeletal and nervous systems. Additional studies are necessary to determine the overall safety of the required surgeries and whether the techniques described will generally result in improved user control of a prosthetic hand (considering that each below-elbow amputation will result in varying amounts of soft and scar tissue).

  • Required osseointegration surgeries have long recovery times and associated risks: A hallmark of the surgical procedures described by the researchers is the two-stage osseointegration process, which requires roughly nine months of healing and rehabilitation time before the prosthesis can be fully used. Placement of the titanium implant into the bone (stage 1) typically requires a six-month healing period, while placement of the abutment (the titanium rod that connects to the prosthesis) (stage 2) requires a combined (post-surgery) rehabilitation and training period of roughly three months. Bone implant surgeries are also known to carry significant risks, including implant-associated bone infection (osteomyelitis) and fractures in the bone surrounding the implant (periprosthetic fracture).

  • Complications associated with bone implants: Aseptic loosening and mechanical failure of implant components are two notable complications associated with osseointegration procedures. Aseptic loosening occurs when the bond between the bone and the implant fails (in the absence of infection). During the study, researchers found that the titanium fixture in one of the forearm bones (the radius) failed to osseointegrate and had to be replaced with a larger fixture. Thankfully, no infection was detected (suggesting aseptic loosening), but it took 10 months to resolve the complication (4 months to heal after removal of the failed implant, plus 6 months to heal after insertion of the new implant). To add to this, the researchers noted that the abutment screw of the ulna implant had to be replaced due to mechanical failure.

Behind the news

Research on prostheses controlled by electrical signals generated by the muscles (and associated nerves) in the residual limb has been ongoing since the 1940s, with clinically relevant myoelectric prostheses emerging between the 1960s and 1980s. More recently, international research teams have been pursuing clinical implementations of neuromuscular interfaces that provide intuitive prosthesis control suitable for long-term use in daily life.

While the development of self-contained neuromusculoskeletal prostheses for above-elbow (transhumeral) and below-elbow (transradial) amputations (led by the Center for Bionics and Pain Research (Mölndal, Sweden)) is well known, I wanted to highlight two additional noteworthy efforts to integrate advanced prosthetics with the muscular and nervous systems in patients with upper- and lower-limb amputations:  

[#1] Regenerative peripheral nerve interface (RPNI): A multidisciplinary team of computer scientists and surgeons at the University of Michigan developed the RPNI procedure to capture and amplify electrical signals from peripheral nerves of the arm (radial, median, and ulnar) that were severed during amputation. RPNIs were successfully created in patients, as determined by ultrasound assessment, and the amplified RPNI electrical signals were used by AI/ML algorithms to control the movements of a prosthetic arm for almost a year (300 days) without the need for adjustments! [ interactive website | live-demo ]

Using the RPNI platform, patients experienced reduced phantom limb pain and gained fine control of an advanced prosthetic hand, allowing them to pick up miniature play blocks, grasp soda cans, and play a modified version of Rock, Paper, Scissors called Rock, Paper, Pliers. The design of the RPNI platform can be roughly described as follows: 

  • RPNIs as a neuromuscular interface: To create RPNIs, surgeons first harvest grafts of healthy thigh muscle (about the size of a small paper clip) from the patient. Next, the peripheral nerves of the arm are identified, neuromas (which are associated with phantom limb pain) are removed, the nerves are divided into nerve bundles (fascicles), and the muscle grafts are wrapped around the ends of the nerves and sutured in place. During a three-month recovery period, the peripheral nerves of the arm reinnervate the muscle grafts, and ultrasound is used to visually confirm and measure the physical contractions of the RPNIs as the patient thinks about moving their phantom hand.

  • Capturing electrical signals from RPNIs: Electrodes are first implanted in the RPNIs, where they pick up peripheral nerve signals that are generated by the intention to move and amplified by the electric field generated by the reinnervated muscle grafts during each contraction. The electrode lead wires are then passed through the patient's limb and combined into a multi-pin data port connector that's attached to the arm and serves as a data interface for an external computer.

  • Decoding and translating muscle signals: After nerve and muscle signals are recorded from the RPNIs, patients are connected to a computer (via the RPNI data ports on their arms) and begin training machine learning (ML) algorithms by attempting to "move" their phantom hand to mirror an on-screen virtual avatar hand as it cycles through various hand postures (such as fist and pinch). After several mirroring and hand posturing exercises, the following ML algorithms are trained in a matter of minutes: a Naive Bayes (NB) classifier (to detect muscle signal patterns and associate them with different hand gestures) and a Kalman filter (to estimate the intended position and velocity of specific fingers).

  • Bionic arm with hand sensors: After algorithm training, the patient puts on a Mobius Bionics LUKE Arm (formerly DEKA Arm System) in the “radial configuration”, and both the patient and the LUKE Arm are connected to a computer housing the ML algorithms. Now, when the patient thinks to move their phantom hand, the resulting pattern of nerve and muscle signals is decoded by the ML algorithms and used as movement instructions for the LUKE Arm, which provides dexterous hand and finger movements and has force sensors in the hand that can communicate when an object is being touched or grasped. [As an interesting aside, the LUKE Arm (whose development was funded by the US Defense Advanced Research Projects Agency (DARPA)) was first prescribed to two US military veterans, and its name is a nod to the futuristic bionic hand that Luke Skywalker receives in Star Wars: The Empire Strikes Back.]
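The Kalman filter mentioned above estimates intended finger position and velocity from noisy decoded signals. Here is a minimal one-dimensional, constant-velocity sketch (the update rate, noise levels, and measurement stream are all assumed values, not the Michigan team's actual parameters):

```python
# Illustrative 1-D Kalman filter: track a "finger position" that ramps up
# linearly, from noisy measurements. All constants below are invented.

import numpy as np

dt = 0.02                              # assumed 50 Hz update rate
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition (position, velocity)
H = np.array([[1.0, 0.0]])             # we observe noisy position only
Q = np.eye(2) * 1e-4                   # process noise covariance
R = np.array([[0.05]])                 # measurement noise covariance

def kalman_step(x, P, z):
    """One predict + update cycle for measurement z; returns new (x, P)."""
    x = F @ x                          # predict state forward
    P = F @ P @ F.T + Q
    y = z - (H @ x)[0]                 # innovation (measurement residual)
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T / S[0, 0]              # Kalman gain, shape (2, 1)
    x = x + K[:, 0] * y                # correct the estimate
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
rng = np.random.default_rng(1)
for t in range(100):                   # noisy ramp: true position = 0.01 * t
    x, P = kalman_step(x, P, 0.01 * t + rng.normal(0.0, 0.02))
print(round(x[0], 2))                  # estimate near the true final position
```

The appeal of the Kalman filter here is that it yields smooth, continuous position and velocity estimates rather than the discrete gesture labels a classifier produces.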

Image credits: Evan Dougherty / University of Michigan Engineering

[#2] Neuroprosthetic leg system: An international group of researchers led by ETH Zurich and SensArs Neuroprosthetics (a startup based in Lausanne, Switzerland) has developed a neuroprosthetic leg system that connects a leg prosthesis to the residual nerves in the user’s thigh, allowing the user to feel (in real time) when the foot of the prosthetic leg touches the ground and when the knee bends and extends! [ live + animated demo ]

The neuroprosthetic leg was initially tested during a three-month clinical trial on patients with an above-knee (transfemoral) leg amputation, and follow-up reports have demonstrated improvements in mobility, reduced falls, and increased perception of the neuroprosthetic leg as part of the body. Active complex tasks, including walking on sandy terrain, navigating platforms with obstacles, and climbing and descending stairs, were accomplished with less effort when the neuroprosthetic leg was activated (“turned on”). The design of the neuroprosthetic leg system can be roughly described as follows:

  • Using TIMEs to create a neuromuscular interface: The neuroprosthetic leg uses transversal intraneural multichannel electrodes (TIMEs) to stimulate the tibial nerve in a way that the user interprets as physical sensations of touch (tactile sensory feedback) in the phantom leg. To implant TIMEs, surgeons locate the tibial branch of the sciatic nerve, make a small opening in its outermost layer (epineurium), and then use a guide needle to pull the TIME transversely through (horizontally across) the inside of the nerve until it reaches the desired depth. Once the electrode is positioned, its special fixation tabs are folded over the surface of the nerve and sutured in place, its active sites are placed in direct contact with various nerve bundles (fascicles), and its lead wires are then tunneled through the thigh and pulled out of the leg through small incisions to connect to an external neurostimulator.

  • Characterization of sensations derived from stimulating TIMEs: After implantation, patients are connected to an external stimulator through their TIME implants. Short pulses of (painless) electrical currents (of varying intensity, duration, and frequency) are delivered through the wires of the electrode and into each of its active sites. These electrical currents then stimulate different areas of the tibial nerve, and the patient interprets these stimulations as different sensations (such as touch, pressure, and vibration) that vary in perceived intensity and location on the sole of the phantom foot and lower leg. A mapping exercise is then performed to determine which specific electrode active site will produce sensation in the calf of the lower phantom limb (interpreted as knee movement) and in the forefoot region (central metatarsal), midfoot region (lateral metatarsal), and heel of the phantom foot.

  • Bionic leg with foot sensors: Patients are then fitted with a custom-made (transfemoral) leg prosthesis assembled from commercially available components, including a prosthetic knee that can wirelessly communicate changes in knee joint angle, indicating knee flexion (a movement that brings the calf muscle closer to the hamstring, reducing the angle between the two) or knee extension (a movement that straightens the leg, increasing the angle between the calf muscle and the hamstring). A prosthetic foot equipped with an insole of pressure sensors (sensorized insole) is also attached and can wirelessly communicate the pressure changes experienced in the forefoot, midfoot, and heel regions with each step the user takes.

  • Body attachment method: After assembly, the neuroprosthetic leg is then fitted to the user using a traditional socket-based approach, including the use of a silicone suction liner that provides an airtight seal between the residual limb and the socket (which connects to the rest of the prosthetic leg).

  • Decoding and translating signals from prostheses: When the patient walks with the neuroprosthetic leg, pressure data from the sensorized insole and motion data from the prosthetic knee are transmitted (via Bluetooth) to an external controller in a backpack (also worn by the patient). The controller, implemented on a Raspberry Pi, runs a "sensory encoding algorithm" that uses linear amplitude modulation (LAM) to convert the sensory data from the prosthetic knee and sensorized insole into stimulation parameters (instructions) for an external stimulator that is wired to the controller and connected to the patient's TIME implants.

  • Restoring touch in the lower leg and foot: Upon receiving instructions from the controller, the external stimulator delivers electrical pulses to the TIME implants, which then stimulate selected areas of the tibial nerve to produce sensation in the patient's phantom foot and lower leg.
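The linear amplitude modulation (LAM) encoding step can be sketched as a simple rescaling from sensor readings to stimulation amplitude. The pressure and amplitude ranges below are made-up placeholders (in practice these would be calibrated per patient, between the perception threshold and the maximum comfortable level found during the mapping exercise):

```python
# Sketch of linear amplitude modulation: map insole pressure readings onto
# a stimulation amplitude range. All ranges here are hypothetical.

P_MIN, P_MAX = 0.0, 100.0     # assumed sensor pressure range (arbitrary units)
AMP_MIN, AMP_MAX = 0.2, 1.0   # assumed stimulation range (normalized units)

def lam(pressure):
    """Linearly map pressure to stimulation amplitude; no contact -> no pulse."""
    if pressure <= P_MIN:
        return 0.0                        # below threshold: do not stimulate
    p = min(pressure, P_MAX)              # clamp to the sensor's range
    frac = (p - P_MIN) / (P_MAX - P_MIN)  # fraction of full-scale pressure
    return AMP_MIN + frac * (AMP_MAX - AMP_MIN)

print(round(lam(50.0), 2))  # -> 0.6
```

In the actual system, this mapping runs on the backpack-mounted Raspberry Pi controller, and the resulting amplitudes parameterize the pulses the external stimulator delivers to the TIME implants.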

Image credits: F. M. Petrini et al. (doi: 10.1038/s41591-019-0567-3)

Why it matters and a brief history of prosthetic limbs

A WWI-era prosthetic arm via Wikimedia Commons

The development of artificial limbs is a profoundly human endeavor that spans several millennia, and it all began with a prosthetic toe. While ancient literature contains several references to artificial limbs in mythology and poetry, the approximately 3,000-year-old "Cairo Toe" remains the oldest known prosthesis to be made and worn for practical, everyday use. The Cairo Toe was the result of extraordinary craftsmanship (for its time - circa 950 BCE), as it was shaped to look like a human toe (for aesthetics), constructed with a combination of wood and leather (for durability and comfort), meticulously fitted to the wearer's foot (for stable attachment to the body), and could even bend (for practical use).

During the Roman Republic, other examples of practical prostheses were developed, including simple leg prostheses made of wood and bronze plates, such as the "Capua Leg" (300 BCE), and an iron prosthetic hand designed for Marcus Sergius during the Second Punic War (218 BCE), which allowed him to grasp his shield and return to battle after the amputation of his right hand. In the period from the end of the Roman Republic (27 BCE) to the end of the Middle Ages (~1490 CE), there were few advances in the practical development of artificial limbs, with most prosthetics being heavy, crude devices for combat that focused on concealing deformity and/or had limited functionality (such as peg legs and hand hooks).

In the mid-1500s, Ambroise Paré, a French military surgeon, advanced the development of prosthetics with the introduction of jointed artificial limbs. His standout design was "Le Petit Lorrain," a mechanical hand operated by hinges, catches, and springs that simulated the joints of a biological hand. He also designed an above-knee (transfemoral) leg prosthesis with a locking knee joint and a prosthetic foot. The development of lower-limb prostheses continued with contributions from Pieter Verduyn, who introduced the first non-locking below-knee (transtibial) prosthesis in 1696. Just over a century later, in 1800, James Potts advanced prosthetic design with the introduction of the wooden "Anglesey Leg," the world's first artificial leg with articulating joints at the toe, ankle, and knee, allowing its users to walk and even ride horses.

From the 1860s to the mid-1940s, large-scale conflicts such as the American Civil War and World Wars I and II led to an exponential increase in the need for prosthetic limbs that were both functional and allowed those with an amputation to continue working. In 1871, James Hanger received a US patent for the "Hanger Limb," a double-jointed prosthetic leg that flexed at the ankle and knee and was popular with American Civil War veterans because of its affordability, functionality, and sturdiness. In the early years of World War I, Germany's Siemens-Schuckertwerke began manufacturing the "Siemens Universal Work Arm" (~1915), which was essentially a tool holder with interchangeable inserts, such as a simple hammer or adapters for attaching welding tools.

Shortly after World War II, the US National Academy of Sciences established the "Artificial Limb Program" (1945) to promote scientific progress in prosthetic development, and UC Berkeley introduced a new fitting technique for above-knee prostheses called the "suction socket" (1946). From the end of World War II to the present day, prosthetic development has advanced by leaps and bounds. The advent of clinically relevant myoelectric prostheses (between the 1960s and 1980s) and targeted muscle reinnervation (TMR) paved the way for several groundbreaking advances in so-called "mind-controlled" prosthetics that arrived between 2010 and 2020. These include the self-contained neuromusculoskeletal interface (from the Center for Bionics and Pain Research in Sweden) used to power the “Mia Hand” and the regenerative peripheral nerve interface (from the University of Michigan) used to power the “LUKE Arm”.

Progress from wooden toes to mind-controlled bionic limbs has been driven by an understanding of the devastating impact of limb loss and a desire to restore quality of life to those with an amputation. Worldwide, an estimated 58 million people live with a limb amputation due to traumatic causes, and approximately 35 to 40 million will require prosthetics and orthotics, according to the World Health Organization. In the US, an estimated 150,000 to 185,000 amputations are performed each year, and the number of people with limb loss is expected to more than double from 1.6 million (in 2005) to 3.6 million (in 2050).

It's clear that there will be a continuing need for prosthetic limbs for the foreseeable future. Thanks to the efforts of artificial limb pioneers and nearly 3,000 years of advances, researchers are closer than ever to replicating the full function of a biological limb with prostheses that can be intuitively controlled, worn comfortably (all day, every day), and used to restore a sense of touch.

Update #2. Medical Tech: Product Launches, Trials, and FDA Approvals (for Oct 26 – Nov 13, 2023).

RapidAI recently received FDA clearance for Rapid SDH, which uses AI to identify signs of subdural hematoma in brain CT scans: Trained on hundreds of patient scans, Rapid SDH's AI detects potential indicators of acute and chronic subdural hematoma (brain bleeds) with 93% sensitivity and 99% specificity. The module can also deliver results within one minute via the RapidAI mobile app, email, and existing hospital systems, allowing the entire trauma team to assess the findings and initiate remote triage if necessary.
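For readers less familiar with how diagnostic AI tools are scored: sensitivity is the fraction of true positives caught, and specificity is the fraction of true negatives correctly passed. A minimal sketch of the arithmetic, using made-up counts for illustration only (not RapidAI's validation data):

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical example: 93 of 100 hematoma scans flagged,
# 198 of 200 hematoma-free scans correctly cleared.
sens, spec = sensitivity_specificity(tp=93, fn=7, tn=198, fp=2)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")  # sensitivity=93%, specificity=99%
```

In practice, a screening tool like this trades the two off: high sensitivity matters most for a triage product (missed brain bleeds are costly), while high specificity keeps false alarms from flooding the trauma team.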

Image credits: RapidAI

Smileyscope has become the first virtual reality (VR) device to receive FDA Class II clearance for acute pain: This clearance recognizes the company's Procedural Choreography technology, which uses positive virtual stimuli to reduce pain and anxiety during medical procedures. The FDA's decision follows positive results from studies conducted by Smileyscope that demonstrated reductions in pain, anxiety, caregiver distress, and the need for physical restraints, making the company's VR technology a promising solution for a wide range of patients.

Image credits: Smileyscope

Neurovalens' Modius Sleep device has received FDA clearance as a drug-free option for the treatment of chronic insomnia: The headset uses neuromodulation technology to target the underlying neurological processes responsible for insomnia. By delivering small electrical pulses through an electrode placed behind the ear, it stimulates brain areas associated with the sleep-wake cycle and circadian rhythm. Users can wear the device for a 30-minute session before bed while engaging in activities such as watching television or reading. In a pilot study, participants experienced significant improvements in their Insomnia Severity Index scores and reported feeling better rested throughout the day.

Image credits: Neurovalens

Paige Lymph Node has received FDA Breakthrough Device Designation for its ability to detect breast cancer metastasis: Developed as in vitro diagnostic software, Paige Lymph Node uses AI to identify suspicious areas of potential breast cancer metastasis, including micrometastases and isolated tumor cells in lymph nodes. The software has demonstrated over 98% sensitivity thanks to its deep learning models trained on over 32,000 digitized lymph node slides and is part of the broader Paige Breast Suite, which aims to streamline breast cancer diagnosis and improve pathologist efficiency.

Image credits: Paige

Eko Health is partnering with Imperial College London to bring AI stethoscopes to primary care practices in the UK: The program, called TRICORDER, aims to help healthcare workers identify signs of heart failure using Eko Health's stethoscopes, which are equipped with machine learning algorithms trained to detect structural heart murmurs associated with valvular heart disease (VHD). News of Eko Health's TRICORDER program follows the company's US launch of SENSORA and recent study results suggesting that the AI-equipped stethoscope detected signs of VHD with a sensitivity of approximately 94%, compared to 41% for the standard analog stethoscope.

Image credits: Eko Health

Update #3. Venture Capital Funding, Partnerships, and Acquisitions in AI Healthcare (for Oct 27 – Nov 15, 2023).

Virgin Pulse completed a $3 billion merger with HealthComp to form a new unified company focused on creating a comprehensive employer health benefits platform-as-a-service (PaaS). According to the company press release, the integrated platform will use a combination of “leading technologies and AI-driven data models” to deliver health plan designs aimed at lowering health care costs for both members and employers.

Thoma Bravo acquired NextGen Healthcare for $1.8 billion, in a deal that gives Thoma Bravo a foothold in the electronic health record (EHR) market. NextGen Healthcare is an EHR vendor with over 100,000 provider clients in the US, including accountable care organizations and independent physician associations, according to a recent 10-K filing with the SEC.

IKS Health acquired AQuity Solutions for $200 million to expand its Care Enablement Platform for revenue optimization, clinical support, and value-based care. AQuity Solutions uses AI (and virtual scribes) for clinical documentation, medical coding, and revenue integrity; as such, AQuity's datasets and AI/ML expertise should help IKS Health scale its current proprietary AI products and services.

The UK Government invested £100 million (~$120 million) to establish the “AI Life Sciences Accelerator Mission,” a new life sciences and healthcare initiative focused on using AI to develop new, transformational therapies for diseases such as cancer and dementia.  

Forward Health raised $100 million in a Series E funding round to support the launch of its CarePods, which are self-contained medical stations that use AI to provide various medical tests and diagnoses. Users can enter the CarePods (which will be deployed in shopping malls, gyms, and office buildings) and receive tests such as blood work, throat swabs, and blood pressure readings. AI is used to screen and diagnose health conditions, and doctors provide prescriptions behind the scenes.

Elucid raised $80 million in a Series C funding round to expand development of its AI diagnostic software that uses cardiac CT scans to create 3D models of plaque buildup in the blood vessels. Elucid will also develop its PlaqueIQ software toward a new indication for measuring fractional flow reserve from CT scans.  

MD Anderson Cancer Center raised $61 million (toward a $100 million fundraising goal) to launch the Institute for Data Science in Oncology (IDSO). Institutional and philanthropic support for the IDSO will bring together data scientists, clinicians, and cancer researchers to tackle a variety of challenges, including using AI (and other automated tools) to rapidly diagnose and characterize cancer and to develop novel therapeutics.

Covera Health raised $50 million in a Series C funding round to expand its AI and data analytics services, which help improve radiology diagnostic accuracy (and reduce misdiagnosis). Covera Health also has a “Radiology Centers of Excellence” data program that’s used by health plans to send patients to high-quality radiology providers (based on specific needs).

Royal Philips received $44.6 million from the Bill & Melinda Gates Foundation to support the global development of its handheld AI ultrasound system called Lumify, which helps healthcare workers such as midwives identify complications during pregnancy. The latest grant brings total funding from the Bill & Melinda Gates Foundation to $60 million (including an initial $15.4 million used to support a pilot of Lumify in Kenya).

Thanks for reading this issue of the AstroFeather newsletter and I’ll see you in the next issue!

If you enjoy AstroFeather content, be sure to share this newsletter!

Adides Williams, Founder @ AstroFeather (astrofeather.com)
