There are many powerful examples of major technology advancements in healthcare, some of which we’ve highlighted in previous blogs. Throughout history, certain developments have dramatically shifted the level of care patients receive—sometimes in direct ways, and other times through more indirect effects.
Louis Pasteur, the dapper French chemist, demonstrated that the fermentation of wine and the souring of milk were caused by living microbes. Joseph Lister, an English surgeon, was the first to apply germ theory to the care of patients in the operating room. The evolution of anesthesiology has enabled the impossible (simple and increasingly complex painful procedures) to become possible. Insulin, after years of intense work, was first used to treat patients in 1922. Since then, it has saved millions of lives, and today a diagnosis of Type 1 diabetes is often compatible with a full, normal life. CRISPR-Cas9, often called “molecular scissors,” can repair, edit, and modify faulty DNA. It has already been shown to successfully treat, and even cure, some life-threatening genetic conditions that arise when nature doesn’t get it quite right. We could go on and on!
What advancements recently intrigued us—and that’s an understatement!
While not pretending to be prescient in our prognostications, and certainly not intending to elevate what follows to the lofty impact of the developments mentioned above, a few advancements have recently caught our attention, and we want to give each an admittedly early, and possibly premature, shout-out. We tend toward the optimistic side of the see-saw of life, but as always, our intent is not to influence but to illuminate, and in that process to let you be the judge.
1. An artificial limb and its conscious control
Written in the usual dry vernacular of the scientist-clinician, a report in Nature Medicine, “Continuous neural control of a bionic limb restores biomimetic gait after amputation,” caught our attention. This study of seven patients with missing limbs turns science fiction’s portrayal of neurally controlled bionic legs and arms into a pathway to reality. Until recently, bionic limbs depended entirely on preset robotic control mechanisms to generate movement or locomotion.
Something we generally take for granted is that walking or lifting requires coordinated motor control, at once thoughtful and reflexive, built on a combination of afferent (sensory) feedback and efferent (motor) commands. An artificial limb, divorced from such control, lacks these characteristics, resulting in robotic, herky-jerky movements. But wait! Investigators and clinicians at Harvard and MIT have demonstrated that continuous neural control of a bionic limb can create a proper biomimetic gait, one almost indistinguishable from normal, following amputation below the knee.
Apart from obvious refinements, the surgical technique of amputation has remained largely unchanged since before World War I. What’s different about the Harvard/MIT approach is that the residual limb is managed in an entirely new way. Muscle pairs that are traditionally severed are instead surgically reconnected, creating an “agonist-antagonist myoneural interface” (AMI). In doing so, and marrying the surgery with some amazing technology, the reconnection permits the individual’s brain to send signals to the muscles that control the prosthetic’s movement in a not-so-robotic manner.
In the seven patients who underwent the procedure, only about 18% of the afferent signaling of a biologically intact muscle was restored, yet that proved sufficient to produce a 41% increase in walking speed compared with matched amputee controls who did not undergo the procedure. That degree of afferent restoration permitted biomimetic adaptation to various walking speeds, different terrain (including stairs, inclines, and obstacles), and the nuanced adjustments that typify the real world.
These investigators demonstrated that the patient, not preprogrammed robotic mechanics, used their intrinsic, volitional nervous system to continuously control, in real time, their gait and walking speed and to navigate complex terrain. We think this opens a very big door to what might lie ahead for those with missing limbs.

2. A new class of nonopioid, oral pain reliever: a game changer?
The search for the holy grail of analgesia, an oral, nonopioid, non-addictive, potent drug, has proven elusive; it has long exemplified the British idiom of ‘nailing jelly to a wall,’ shorthand for a particularly daunting task. A recent development, however, suggests that the grail may be within reach. Pain, in its ubiquitous acute and chronic presentations, carries significant personal and societal costs, the latter exemplified by the ongoing opioid epidemic.
CRNAs are at the interface of caring for patients who are taking analgesics, often opioids, or who will most certainly require them as part of their perioperative management. Among our options, besides opioids, are aspirin, NSAIDs, local anesthetics, antidepressants, NMDA antagonists, acetaminophen, central alpha-2 agonists, nitrous oxide, and several potent inhaled agents. Each has its place, but all carry significant drawbacks that we need not enumerate for you, savvy practitioner!
The ‘new kid’ on the block, and a much different one at that
An oral sodium channel blocker, suzetrigine blocks the NaV1.8 sodium channel in peripheral, not central, sensory neurons, thereby quieting, and in some cases completely blocking, the transmission of pain signals. FDA-approved on January 30, 2025, for moderate to severe acute pain, this drug represents an entirely new class of analgesics. High-quality studies demonstrate its effectiveness as a potent and highly selective inhibitor of pain signaling. Regarding NaV1.8, the ‘NaV’ refers to the voltage-gated sodium channel, and the ‘1.8’ identifies the channel subtype, one expressed in peripheral sensory neurons. You may recall that there are at least nine subtypes (NaV1.1 to NaV1.9) of voltage-gated sodium channels in the body.
Unless you’ve been practicing for more than 20 years, you can’t recall a brand-new class of analgesic receiving FDA approval. Acting selectively at peripheral neurons, suzetrigine obviates concerns about addictive properties. Marketed as Journavx™, it earned its approval by navigating robust randomized controlled trials and safety dosing studies involving a wide range of operative and nonoperative pain scenarios. Post-marketing (Phase 4) surveillance is now underway to further confirm the drug’s effectiveness and safety.
Adverse events observed in the trials included pruritus, the most common, affecting 2% of patients, as well as lower incidences of muscle spasms, rash, elevated creatine phosphokinase, and modest decreases in glomerular filtration rate. Some of the trials involved patients taking suzetrigine for up to two weeks. Typical administration involved a 100 mg oral loading dose followed by 50 mg every 12 hours.
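For the curious, here is a minimal back-of-the-envelope sketch, in Python, of the cumulative intake such a regimen implies over a two-week course like those studied. The function and its framing are our own illustration of the arithmetic, not prescribing guidance.

```python
def cumulative_dose_mg(days: int,
                       loading_mg: int = 100,
                       maintenance_mg: int = 50,
                       interval_h: int = 12) -> int:
    """Total drug taken over a course: one loading dose, then a
    maintenance dose every `interval_h` hours. Illustration only,
    not dosing guidance."""
    doses_per_day = 24 // interval_h              # q12h -> 2 doses per day
    total_doses = days * doses_per_day            # 14 days -> 28 dose slots
    maintenance_doses = total_doses - 1           # first slot is the loading dose
    return loading_mg + maintenance_doses * maintenance_mg

print(cumulative_dose_mg(14))   # 100 + 27 * 50 = 1450 mg over two weeks
```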
Considering the potential of this new class of analgesics, we recalled the evolution of calcium channel blockers (CCBs). Verapamil, the first in its class approved by the FDA in 1981, has undergone significant subsequent modifications. Under the broad umbrella of CCBs, we’ve come a long way, with some targeting the heart and its rhythm specifically, while others focus on the vasculature. We could not be more excited about the potential for Journavx™ and other oral analgesic sodium channel blockers that are likely to emerge in its wake. Perhaps not the holy grail achieved, but a game changer, for sure. Unbridled scientific progress at work.
3. An astonishing, magnificent “glove”
Every CRNA likely has experience, either from their training program or postgraduate professional work, with caring for patients in the burn unit. The sheer physicality and emotion invested in caring for such patients can be overwhelming. The devotion of the burn unit staff to their patients, in our experience, is unrivaled, given the intense pain, risk of infection, body-image threats, and often long-term hospitalization required.
When we are called to the burn unit, it is for one or some combination of dressing change, debridement, and grafting. The extraordinary intensity of pain involved in each necessitates our expertise. A common procedure involves the burn surgeon sewing on harvested or artificial skin to cover the burn wound and promote healing.
Let’s take a case in point: a severely burned hand where perhaps the entire surface area has been lost, and small segments of replacement skin require intricate placement and challenging sewing that must conform to the complex architecture of the hand. In our clinical experience, and based on discussions with full-time burn unit surgeons, this often involves the meticulous placement of up to 15-20 skin segments, a process that is technically challenging and time-consuming, especially with respect to the fingers. And even when optimally accomplished by the artisan-surgeon, there remains the matter of healing, which involves intense pain, scarring, contracture risks, and the ever-present threat of infection and tissue sloughing.
Enter a team of dedicated Columbia University bioengineers, dermatologists, surgeons, and hand specialists who have developed an engineered skin “glove” customized to fit the patient precisely. Although successful in animal models, the approach has yet to be applied to humans, as additional regulatory hurdles must be overcome. But imagine: after a precise laser scan of the patient’s hand is generated, an AI-enabled program creates a 3D-printed biological scaffold, seeded with progenitor cells tailored to the intended patient. This generates real skin that does not trigger immune rejection, nourished with a blood-like medium to promote growth over approximately three weeks. The bioengineered glove, once mature, can be slipped over the awaiting hand, anchored at the wrist, and allowed to integrate into the underlying tissue as it heals.
When adventurous, prescient, educated minds encounter a problem
While some may accuse us of being overly enthusiastic about the possibilities of scientific curiosity and effort, we want to point to a story unrelated to the three healthcare advancements described above. We applaud innovators who take advantage of the freedom to pursue new ideas and approaches whose short-term and long-term implications are often underappreciated.
Events occurring in the early 1700s and mid-1800s conspired to create a technology that predated any consideration of its potential health applications. Bouguer, in 1729, introduced what came to be called “photometry,” describing how the intensity of light decreases as it passes through various media. Lambert, three decades later, formalized a mathematical relationship that amplified Bouguer’s work. In 1852, Beer extended the work of Bouguer and Lambert, describing light’s absorbance as proportional to the concentration of a solute in a sample. These observations were to foster, a century later, the development of pulse oximetry, grounded in the Lambert-Beer Law, with consequences that those early technological explorers could not have envisioned.
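As a quick refresher, the relationship Beer arrived at, now familiar as the Lambert-Beer (or Beer-Lambert) law, can be written in modern notation as:

```latex
% Lambert-Beer law: absorbance grows with absorptivity, path length, and concentration
A \;=\; \log_{10}\!\left(\frac{I_{0}}{I}\right) \;=\; \varepsilon \, \ell \, c
% A: absorbance; I_0, I: incident and transmitted light intensity;
% \varepsilon: molar absorptivity; \ell: optical path length; c: solute concentration
```

Pulse oximetry applies this relationship at two wavelengths, red and infrared, to estimate the proportion of oxygenated versus deoxygenated hemoglobin in pulsatile blood.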
While our focus here is on improving quality of life through the three breakthroughs described above, we remain optimistic, indeed excited, about both their immediate impact and their long-term potential, the latter akin to the fate of Bouguer’s “photometry.” There are many other advances we could have discussed, and we may explore some of them with you in future posts.
We’re CRNAs ourselves, and we understand the challenge of fitting CRNA continuing education credits into your busy schedule. Whenever you’re ready, we’re here to help.
