Trends in tobacco use in the US have shifted dramatically over the past half century, thanks in part to improved scientific and medical knowledge, policies aimed at reducing cigarette use, and the emergence of tobacco-free products like electronic cigarettes (e-cigarettes). Although these trends demonstrate how major strides can be made in public health in a matter of decades, a need remains to strengthen public health protections around tobacco use.

Epidemiological cohort studies (in which groups of people are followed over a long period to measure the impact of exposure to a specific factor) providing clear evidence of the link between cigarette smoking and lung cancer were first conducted in the 1950s.1 Around that time, the majority of US adults were smokers: smoking prevalence was 56.9% in 1955.2 Public opinion and behavior, however, began to shift after the release of the 1964 US Surgeon General's report, which concluded that the average smoker had a 9- to 10-fold increased risk of developing lung cancer compared to non-smokers, and that smokers had a 70% higher mortality rate than non-smokers.3 Smoking prevalence dropped shortly thereafter: among US adults, it was 42.4% in 1965 but fell to 33.2% by 1980, and by 2022 it was only 11.6%, according to data from the American Lung Association.4 According to one study, over 8 million US deaths attributable to smoking have been averted since 1964.5
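As a quick sanity check on these figures, the reported prevalence values can be converted into relative declines with a few lines of arithmetic; the sketch below simply restates the numbers cited above.

```python
# Smoking prevalence among US adults, as reported in the cited sources (%).
prevalence = {1955: 56.9, 1965: 42.4, 1980: 33.2, 2022: 11.6}

def relative_decline(start_year: int, end_year: int) -> float:
    """Percent decline in smoking prevalence between two survey years."""
    start, end = prevalence[start_year], prevalence[end_year]
    return 100 * (start - end) / start

print(f"1965-1980: {relative_decline(1965, 1980):.1f}% relative decline")
print(f"1965-2022: {relative_decline(1965, 2022):.1f}% relative decline")
```

Roughly a fifth of the 1965 prevalence was lost by 1980, and nearly three quarters by 2022.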

Increased taxes on tobacco products, the introduction of smoking bans in places such as workplaces and restaurants, restrictions on advertising, and increased support for smoking cessation have all contributed to the dramatic decline in tobacco use in the US. The trends also hold true for adolescents. In 1997, 36.4% of US high school students reported smoking cigarettes, but only 3.8% did in 2021.4 Between 1991 and 2021, daily cigarette use decreased from 9.8% to 0.6%, while frequent use—defined as smoking on 20 or more days during the 30 days before the survey—dropped from 12.7% to 0.7%.6

In recent years, however, the trend of decreased tobacco use in the US has been matched—and in some ways even exceeded—by a trend towards increased use of e-cigarettes. Though they do not contain tobacco, most e-cigarette products contain nicotine, which is often extracted from the tobacco plant, and carry a range of health risks, including lung injury, nicotine addiction, and harms to brain development in adolescents.7 According to data from the Centers for Disease Control and Prevention, whereas only 4.5% of US adults used e-cigarettes in 2019, 6.5% did so in 2023.8

Additionally, between 2017 and 2023, the number of US adults who exclusively smoke cigarettes decreased by 6.8 million, while the number who exclusively use e-cigarettes increased by 7.2 million.9 E-cigarette use among adolescents surged in the 2010s, peaking at 27.5% of high school students in 2019.10 By 2024, only 5.9% of US youths reported e-cigarette use,11 but it remains a serious public health concern. Though the decline in tobacco smoking among the US population remains a public health achievement, the newer trend of increased e-cigarette use demands attention and intervention.

References

1. Di Cicco, M. E., Ragazzo, V. & Jacinto, T. Mortality in relation to smoking: the British Doctors Study. Breathe (Sheff) 12, 275–276 (2016), PMID: 28210302

2. Stone, M. D. et al. State and Sociodemographic Trends in US Cigarette Smoking With Future Projections. JAMA Netw Open 8, e256834 (2025), DOI:10.1001/jamanetworkopen.2025.6834

3. The 1964 Report on Smoking and Health. Reports of the Surgeon General – Profiles in Science https://profiles.nlm.nih.gov/spotlight/nn/feature/smoking (2019).

4. American Lung Association. Tobacco Trends Brief: Overall Tobacco Trends Brief. https://www.lung.org/research/trends-in-lung-disease/tobacco-trends-brief/overall-smoking-trends.

5. Holford, T. R. et al. Tobacco Control and the Reduction in Smoking-Related Premature Deaths in the United States, 1964-2012. JAMA 311, 164–171 (2014), DOI: 10.1001/jama.2013.285112

6. Mejia, M. C., Adele, A., Levine, R. S., Hennekens, C. H. & Kitsantas, P. Trends in Cigarette Smoking Among United States Adolescents. Ochsner J 23, 289–295 (2023), DOI: 10.31486/toj.23.0113

7. CDC. Health Effects of Vaping. Smoking and Tobacco Use https://www.cdc.gov/tobacco/e-cigarettes/health-effects.html (2025).

8. Products – Data Briefs – Number 524 – January 2025. https://www.cdc.gov/nchs/products/databriefs/db524.htm (2025) doi:10.15620/cdc/174583.

9. Arrazola, R. A. Notes from the Field: Tobacco Product Use Among Adults — United States, 2017–2023. MMWR Morb Mortal Wkly Rep 74, (2025), DOI: 10.15585/mmwr.mm7407a3

10. Cullen, K. A. et al. e-Cigarette Use Among Youth in the United States, 2019. JAMA 322, 2095–2103 (2019), DOI: 10.1001/jama.2019.18387

11. Office of the Commissioner. Youth E-Cigarette Use Drops to Lowest Level in a Decade. FDA https://www.fda.gov/news-events/press-announcements/youth-e-cigarette-use-drops-lowest-level-decade (2024).

Endotracheal intubation is a common but often challenging part of anesthetic and critical care, marked by a high risk of hypoxemia. Traditionally, oxygenation during intubation has relied on bag-mask ventilation (BMV) or other forms of positive-pressure ventilation between induction and laryngoscopy. However, there is growing interest in high-flow nasal oxygenation (HFNO) as an alternative to ventilation, particularly for apneic oxygenation during intubation.

High-flow nasal oxygenation delivers warmed, humidified oxygen at flow rates typically ranging from 30 to 70 L/min via wide-bore nasal cannula. This approach provides several physiological advantages, including washout of nasopharyngeal dead space, generation of low-level positive airway pressure, and continuous delivery of oxygen during apnea. High-flow nasal oxygenation may extend safe apnea time and reduce the risk of desaturation, thus facilitating intubation attempts.

Multiple randomized and observational studies in operating room, emergency department, and intensive care unit settings have demonstrated that HFNO improves oxygen saturation during intubation compared with standard low-flow oxygen delivery. In patients with anticipated difficult airways or mild-to-moderate hypoxemia, HFNO has been shown to delay desaturation and increase first-pass intubation success by maintaining higher oxygen reserves. Its hands-free nature also facilitates airway manipulation and may reduce gastric insufflation compared with positive-pressure mask ventilation.

However, high-flow nasal oxygenation does not provide ventilation or carbon dioxide clearance during apnea, which some patients require during intubation. In patients with severe hypoxemia, obesity, or high oxygen consumption, oxygenation alone may be insufficient, and rapid desaturation can still occur. Additionally, HFNO may be less effective in cases of upper airway obstruction or poor nasal patency. Studies of this approach have shown mixed results: one randomized trial found that HFNO did not significantly change the lowest oxygen saturation during intubation in critically ill patients with acute respiratory failure.

Bag-mask ventilation remains a reliable method for both oxygenation and ventilation during intubation, particularly in patients with severe hypoxemia or hypercapnia. When performed with appropriate technique and airway adjuncts, BMV can effectively maintain oxygenation and carbon dioxide elimination. Concerns regarding aspiration risk with positive-pressure ventilation have been challenged by evidence suggesting that gentle ventilation with low inspiratory pressures does not significantly increase aspiration events and may reduce hypoxemia-related complications.

In elective surgical patients with adequate preoxygenation and low risk of complications, HFNO may be used as a primary strategy for apneic oxygenation during rapid sequence induction. However, many patients—particularly those who are hypoxemic or have comorbidities that increase the risk of desaturation—may benefit more from active ventilation using bag-mask ventilation, noninvasive ventilation, or a combination of high-flow nasal oxygenation with assisted ventilation prior to intubation. Some studies suggest that combining HFNO with noninvasive ventilation during preoxygenation offers superior oxygenation compared with either technique alone.

High-flow nasal oxygenation and conventional ventilation during intubation have distinct advantages and limitations. HFNO is particularly useful for extending safe apnea time and maintaining oxygenation during laryngoscopy, while bag-mask ventilation provides more reliable gas exchange in high-risk patients. Optimal airway management should be tailored to patient physiology, clinical setting, and operator expertise, rather than relying on a single universal strategy.

References

1. Patel A, Nouraei SA. Transnasal humidified rapid-insufflation ventilatory exchange (THRIVE): a physiological method of increasing apnoea time in patients with difficult airways. Anaesthesia. 2015;70(3):323-329. DOI: 10.1111/anae.12923

2. Miguel-Montanes R, Hajage D, Messika J, et al. Use of high-flow nasal cannula oxygen therapy to prevent desaturation during tracheal intubation of ICU patients. Am J Respir Crit Care Med. 2015;193(10):1140-1147. DOI: 10.1097/CCM.0000000000000743

3. Casey JD, Janz DR, Russell DW, et al. Bag-mask ventilation during tracheal intubation of critically ill adults. N Engl J Med. 2019;380(9):811-821. DOI: 10.1056/NEJMoa1812405

4. Semler MW, Janz DR, Lentz RJ, et al. Randomized trial of apneic oxygenation during endotracheal intubation of critically ill patients. Am J Respir Crit Care Med. 2016;193(3):273-280. DOI: 10.1164/rccm.201507-1294OC

5. Frat JP, Thille AW, Mercat A, et al. High-flow oxygen through nasal cannula in acute hypoxemic respiratory failure. N Engl J Med. 2015;372(23):2185-2196. DOI: 10.1056/NEJMoa1503326

Neuromuscular blockade is an anesthetic technique that allows clinicians to immobilize patients in a controlled way and safely perform precise, complex surgery. Nondepolarizing neuromuscular blocking agents (NMBAs) such as rocuronium and vecuronium produce paralysis by competitively inhibiting nicotinic acetylcholine receptors (nAChRs) at the motor endplate, preventing acetylcholine (ACh) from activating the receptor to generate muscle contraction. Agents used for the reversal of neuromuscular blockade operate through two fundamentally different molecular strategies: boosting synaptic ACh by inhibiting acetylcholinesterase (AChE) or directly removing aminosteroid NMBAs from circulation through host-guest encapsulation. Each method has distinct molecular targets, kinetics, and side-effect profiles that shape clinical use and guide ongoing drug development.1

AChE contains two key functional regions: an anionic site that attracts the positively charged ammonium of ACh and an esteratic site that catalyzes hydrolysis. Neostigmine is a neuromuscular blockade reversal agent that works on a molecular level by inhibiting AChE, allowing ACh to build up and disrupt the competitive inhibition of the blocking agent. It is a quaternary ammonium carbamate that binds deep within the enzyme’s gorge, occupying both the anionic and esteratic domains. This interaction forms a carbamylated enzyme intermediate that is hydrolyzed far more slowly than the natural substrate.1 As long as the esteratic site remains carbamylated, ACh cannot access the catalytic machinery, effectively inhibiting AChE until spontaneous decarbamylation restores activity.1

By inhibiting AChE, neostigmine prolongs the synaptic lifetime of ACh and raises its steady-state concentration at the neuromuscular junction. Elevated ACh then competes more effectively with nondepolarizing NMBAs for nAChR binding. As circulating NMBA levels fall through redistribution and elimination, ACh occupancy of the receptor increases, allowing endplate potentials to return above the threshold needed for muscle contraction.1 However, this approach is limited by a ceiling effect: once AChE is maximally inhibited, additional drug provides no further benefit, especially in deep blockade. Muscarinic side effects—bradycardia, secretions, bronchospasm—require co-administration of anticholinergics.

Host-guest encapsulation offers a fundamentally different molecular mechanism for neuromuscular blockade reversal. Sugammadex, a modified γ-cyclodextrin, features a hydrophobic central cavity and negatively charged exterior groups. The cavity fits the hydrophobic steroid backbone of rocuronium and vecuronium, while the outer charges attract their cationic regions, enabling tight but reversible 1:1 binding.2,3 Molecular simulations show that rocuronium first docks near the rim of sugammadex, then rotates and slides into the cavity to reach a low-energy, stable configuration.3

Clinically, sugammadex rapidly decreases the free plasma concentration of rocuronium or vecuronium by encapsulating them. This creates a concentration gradient that draws more NMBA molecules away from the neuromuscular junction and into the bloodstream. Once removed from the receptor environment, paralysis resolves without needing to increase ACh levels or inhibit AChE.3

Calabadions represent the next generation of encapsulating agents. These pumpkin-shaped molecular “containers,” derived from the cucurbituril family, bind multiple classes of cationic NMBAs, including steroidal and benzylisoquinolinium drugs.2 Calabadion 1 has affinity comparable to the sugammadex–rocuronium complex, while calabadion 2 exhibits dramatically higher binding strength, approximately two orders of magnitude greater than sugammadex in experimental systems. In vitro and ex vivo studies show calabadion 2 binds rocuronium with roughly 89-fold higher affinity than sugammadex.2,4 Its adjustable cavity size allows encapsulation of benzylisoquinoliniums like cisatracurium, expanding its therapeutic reach beyond what sugammadex can achieve.2,4

These complexes are cleared renally, and animal studies demonstrate that calabadion 2 reverses neuromuscular block more rapidly and at lower doses than sugammadex, without major cardiovascular effects.2,4 Because calabadions do not rely on increasing ACh, they avoid muscarinic adverse effects and the ceiling limitation of AChE inhibitors, and they may work even in profound blockade. Their main barriers to clinical adoption include defining dosing in relation to NMBA burden, assessing off-target binding to other cationic drugs, and establishing human safety. Current evidence remains preclinical, but if validated, calabadions could function as broad—potentially universal—NMBA reversal agents.
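The practical meaning of an affinity difference can be made concrete with the standard 1:1 host-guest binding equilibrium. The sketch below is illustrative only: the sugammadex dissociation constant and the equimolar micromolar concentrations are assumptions chosen for demonstration, with the ~89-fold affinity ratio taken from the comparison above; none of these are clinical doses.

```python
import math

def complex_conc(host_tot, guest_tot, kd):
    """Equilibrium [HG] for 1:1 host-guest binding, from the exact quadratic
    solution of Kd = [H][G]/[HG]. All quantities in the same units (here uM)."""
    b = host_tot + guest_tot + kd
    return (b - math.sqrt(b * b - 4 * host_tot * guest_tot)) / 2

# Assumed, illustrative numbers (not from the cited studies):
kd_sugammadex = 0.06              # uM, representative assumed Kd
kd_calabadion2 = kd_sugammadex / 89  # ~89-fold tighter binding

host, guest = 10.0, 10.0          # uM each: equimolar reversal agent and NMBA
for name, kd in [("sugammadex", kd_sugammadex), ("calabadion 2", kd_calabadion2)]:
    bound = complex_conc(host, guest, kd)
    free = guest - bound
    print(f"{name}: free NMBA = {free:.3f} uM ({100 * free / guest:.1f}% unbound)")
```

Under these assumptions the tighter binder leaves roughly an order of magnitude less free NMBA in solution, which is the molecular basis of the steeper concentration gradient drawing drug away from the neuromuscular junction.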

The molecular mechanisms behind neuromuscular blockade reversal define both the strengths and limits of current therapy. Traditional AChE inhibitors like neostigmine boost synaptic acetylcholine. However, they only work when NMBA levels are low, and they bring muscarinic side effects and a ceiling on efficacy. Encapsulation agents such as sugammadex mark a real shift: by directly sequestering NMBAs, they allow rapid, reliable reversal even from deep block with fewer autonomic complications. Emerging calabadions may extend this model further, potentially offering universal reversal across NMBA classes.

References

1. Ji W, Zhang X, Liu J, et al. Efficacy and safety of neostigmine for neuromuscular blockade reversal in patients under general anesthesia: a systematic review and meta-analysis. Ann Transl Med. 2021;9(22):1691. doi:10.21037/atm-21-5667

2. Haerter F, Simons JCP, Foerster U, et al. Comparative effectiveness of calabadion and sugammadex to reverse non-depolarizing neuromuscular blocking agents. Anesthesiology. 2015;123(6):1337-1349. doi:10.1097/ALN.0000000000000868

3. Irani AH, Voss L, Whittle N, Sleigh JW. Encapsulation Dynamics of Neuromuscular Blocking Drugs by Sugammadex. Anesthesiology. 2023;138(2):152-163. doi:10.1097/ALN.0000000000004442

4. de Boer HD, Carlos RV. New drug developments for neuromuscular blockade and reversal: Gantacurium, CW002, CW011, and calabadion. Curr Anesthesiol Rep. 2018;8(1):119-124. doi:10.1007/s40140-018-0262-9

Blood pressure measurement is essential to clinical care, especially in surgical and intensive care settings. However, measured values can vary markedly depending on the method used. Differences in device technology, measurement environment, user technique, and physiologic factors all contribute to this variability. Understanding potential variation in blood pressure measurement across measurement methods is important for interpreting readings accurately and selecting the most appropriate monitoring strategy for each patient.

Office blood pressure measurement, traditionally performed by auscultation or automated oscillometric devices, has been the foundation of hypertension diagnosis for decades. Despite its widespread use, office blood pressure often overestimates a patient’s usual blood pressure because of the white-coat effect, improper technique, or suboptimal conditions such as talking or inadequate rest before measurement. Although office readings are standardized and performed by trained personnel, they capture only a brief moment in time and may not reflect typical blood pressure patterns during daily life.

Home blood pressure monitoring (HBPM) provides a more realistic assessment of blood pressure under everyday conditions. Measurements taken at home are typically 5 to 15 mmHg lower than office values due to reduced sympathetic activation and greater patient comfort. HBPM improves diagnostic accuracy for hypertension and enhances long-term management by allowing patients and clinicians to track trends. Accuracy, however, depends on using a validated device, correct cuff size, proper arm positioning, and adherence to measurement protocols.

Ambulatory blood pressure monitoring (ABPM) is considered the gold standard for evaluating true blood pressure patterns. ABPM measures blood pressure at regular intervals over 24 hours, capturing daytime and nighttime values and revealing circadian patterns. Normal individuals experience a 10% to 20% dip in nighttime blood pressure; deviations such as non-dipping or reverse-dipping patterns have been associated with increased cardiovascular morbidity. ABPM is particularly valuable for diagnosing masked hypertension, assessing resistant hypertension, and evaluating treatment effectiveness. Limitations include patient discomfort from repeated cuff inflations and limited accessibility in some clinical settings.
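The dipping assessment described above reduces to simple arithmetic on the daytime and nighttime means. A minimal sketch follows, using the conventional 10% to 20% normal-dip range; the exact category names and the cutoffs outside that range are common conventions, not taken from the cited guidelines.

```python
def dipping_status(day_mean_sbp: float, night_mean_sbp: float) -> str:
    """Classify nocturnal dipping from mean daytime/nighttime systolic BP (mmHg)."""
    dip = 100 * (day_mean_sbp - night_mean_sbp) / day_mean_sbp
    if dip >= 20:
        return "extreme dipper"
    if dip >= 10:
        return "dipper (normal)"
    if dip >= 0:
        return "non-dipper"
    return "reverse dipper"

print(dipping_status(135, 118))  # ~12.6% dip: within the normal 10-20% range
print(dipping_status(135, 130))  # ~3.7% dip: non-dipping pattern
```

A reading where nighttime pressure exceeds daytime pressure would fall into the reverse-dipping category associated with increased cardiovascular morbidity.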

Wrist and finger monitoring methods, sometimes used in nonclinical environments, are generally not reliable indicators of true blood pressure because they produce substantial measurement variability. Wrist monitors are sensitive to positioning, and inaccurate readings often result when the wrist is not held at heart level. Finger devices rely on photoplethysmographic measurements and are affected by peripheral vasoconstriction or poor perfusion, making them unreliable for clinical decision-making. For accurate assessment, upper-arm devices remain the preferred noninvasive option.

Invasive arterial blood pressure monitoring provides continuous, beat-to-beat measurements through an arterial catheter and is used primarily in intensive care and surgical settings. Although invasive measurements tend to show slightly higher systolic and lower diastolic pressures than noninvasive cuff methods due to waveform amplification, they offer unmatched accuracy and responsiveness to hemodynamic changes. Risks such as thrombosis, bleeding, and infection restrict invasive monitoring to high-acuity settings.

Variations in blood pressure readings across measurement methods highlight the need for careful interpretation and selection of the most reliable approach for each clinical situation. While office measurements remain important for diagnostic pathways, HBPM and ABPM provide superior insight into true blood pressure patterns and cardiovascular risk, and intensive care and surgery may require the greater precision and accuracy offered by invasive monitoring. Ensuring proper technique and awareness of method-specific limitations is essential to achieving accurate and clinically meaningful blood pressure assessment.

References

1. Pickering TG, Miller NH, Ogedegbe G, et al. Call to action on use and reimbursement for home blood pressure monitoring: a joint scientific statement. Hypertension. 2008;52(1):10-29. DOI: 10.1161/HYPERTENSIONAHA.107.189011

2. O’Brien E, Parati G, Stergiou G, et al. European Society of Hypertension position paper on ambulatory blood pressure monitoring. J Hypertens. 2013;31(9):1731-1768. DOI: 10.1097/HJH.0b013e328363e964

3. Stergiou GS, Palatini P, Asmar R, et al. Recommendations for home blood pressure monitoring. Eur Heart J. 2021;42(39):4007-4020.

4. Sharman JE, Howes F, Head GA, McGrath BP, Stowasser M, Schlaich M. Home and ambulatory blood pressure monitoring: recent evidence and clinical relevance. Med J Aust. 2015;203(3):106-109.

5. Wax DB, Lin H-M, Leibowitz AB. Invasive and noninvasive blood pressure monitoring: when are both needed? Anesthesiology. 2010;112(3):707-722.

Normal saline (0.9% sodium chloride) has traditionally been considered the standard isotonic solution for fluid resuscitation and maintenance. Yet, substantial evidence indicates that large-volume administration of normal saline can produce a non-anion gap, hyperchloremic metabolic acidosis that disrupts both acid-base equilibrium and organ perfusion (1). Once regarded as a benign biochemical effect, this iatrogenic acidosis is now recognized as clinically significant, especially in perioperative and critical care settings.

The mechanism of saline-induced acidosis can be understood using the physicochemical, or Stewart, approach to acid-base balance. In this model, plasma pH depends on three independent variables: the partial pressure of carbon dioxide, the total concentration of weak acids, and the strong ion difference (SID), which is the numerical difference between fully dissociated cations and anions. Under normal physiological conditions, the SID of human plasma averages 38–42 mEq/L. However, the SID of normal saline is essentially zero because it contains equal concentrations of sodium and chloride. Therefore, infusion of saline reduces the plasma SID, leading to an increase in hydrogen ion concentration and a decline in pH, that is, acidosis (2). In contrast, balanced crystalloids, such as Ringer’s lactate or Plasma-Lyte, contain metabolizable anions (lactate, acetate, or gluconate) that are consumed after infusion, yielding an effective SID closer to that of plasma and preventing the development of acidosis.
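The SID arithmetic in the Stewart model is simple enough to sketch directly. The electrolyte values below are representative textbook numbers, chosen for illustration rather than drawn from the cited studies:

```python
def strong_ion_difference(cations_meq, anions_meq):
    """SID = sum of fully dissociated (strong) cations minus strong anions, mEq/L."""
    return sum(cations_meq) - sum(anions_meq)

# Representative plasma strong ions (mEq/L) -- illustrative values.
plasma_sid = strong_ion_difference(
    cations_meq=[140, 4.0, 2.5, 1.5],  # Na+, K+, ionized Ca2+, Mg2+
    anions_meq=[104, 1.5, 2.5],        # Cl-, lactate-, other strong anions
)

# Normal saline: 154 mEq/L Na+ balanced exactly by 154 mEq/L Cl-.
saline_sid = strong_ion_difference(cations_meq=[154], anions_meq=[154])

print(plasma_sid)  # falls in the normal 38-42 mEq/L range
print(saline_sid)  # zero: large infusions dilute plasma SID toward zero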

In addition to its effects on acid-base balance, chloride influences kidney and circulatory function. Elevated plasma chloride levels (compared to baseline) cause the afferent arterioles in the kidneys to constrict, decreasing renal blood flow and glomerular filtration. Wilcox first demonstrated this mechanism in human and animal studies, showing that chloride itself (rather than sodium or osmotic factors) drives these changes (3). This reduction in renal perfusion provides a physiological explanation for the risk of kidney injury associated with chloride-rich fluids. Yunos and colleagues later confirmed the clinical importance of this mechanism by finding that adopting a chloride-restrictive fluid strategy in critically ill adults lowered the incidence of acute kidney injury and reduced the need for renal replacement therapy (4).

Clinically, saline-induced acidosis manifests as a gradual decrease in serum bicarbonate concentration, accompanied by a proportional increase in chloride. This condition can develop rapidly following the administration of several liters of saline, especially in patients with limited renal compensatory capacity. Although respiratory compensation through hyperventilation can offset some of the decrease in pH, sustained metabolic acidosis can impair myocardial contractility, blunt catecholamine responsiveness, and worsen systemic inflammation. McFarlane and Lee observed that patients who received balanced crystalloids during surgery had more stable hemodynamic and acid-base profiles than patients who received normal saline, which highlights the clinical consequences of fluid selection (5).

Furthermore, the traditional characterization of normal saline as “physiologic” is misleading. Its chloride content (154 mmol/L) substantially exceeds normal plasma levels, creating an ionic environment that is not representative of human plasma. For this reason, many experts now recommend balanced crystalloids as the preferred resuscitation fluid for most clinical situations and reserve normal saline for conditions such as hyponatremia or traumatic brain injury, where hypotonic fluids may be contraindicated.

Acidosis resulting from normal saline administration is a well-characterized and preventable disturbance in acid-base homeostasis. The Stewart framework provides a clear mechanistic basis for understanding this effect, and clinical studies continue to demonstrate its relevance to renal and systemic outcomes. Appropriate fluid choice, guided by physiological principles rather than convention, remains essential to optimizing patient safety and therapeutic efficacy, especially when large volumes of fluid are needed.

References

  1. Kellum JA. Saline-induced hyperchloremic metabolic acidosis. Crit Care Med. 2002;30(1):259-261. doi:10.1097/00003246-200201000-00046
  2. Morgan TJ. The meaning of acid-base abnormalities in the intensive care unit: part III — effects of fluid administration. Crit Care. 2005;9(2):204-211. doi:10.1186/cc2946
  3. Wilcox CS. Regulation of renal blood flow by plasma chloride. J Clin Invest. 1983;71(3):726-735. doi:10.1172/jci110820
  4. Yunos NM, Bellomo R, Hegarty C, Story D, Ho L, Bailey M. Association between a chloride-liberal vs chloride-restrictive intravenous fluid strategy and kidney injury in critically ill adults. JAMA. 2012;308(15):1566–1572. doi:10.1001/jama.2012.13356
  5. McFarlane C, Lee A. A comparison of Plasmalyte 148 and 0.9% saline for intra-operative fluid replacement. Anaesthesia. 1994;49(9):779-781. doi:10.1111/j.1365-2044.1994.tb04450.x

The reversal of neuromuscular blockade during anesthesia ensures that patients regain safe respiratory function and airway protection following surgery. In the United States, anesthesiologists primarily rely on two drugs for this purpose: neostigmine, a long-established cholinesterase inhibitor, and sugammadex, a newer selective relaxant binding agent. Their usage reflects a balance of tradition, cost considerations, and emerging evidence on patient outcomes.

Neostigmine acts by inhibiting acetylcholinesterase, allowing acetylcholine to accumulate and compete with neuromuscular blocking agents at the receptor level. This indirect mechanism means that its effectiveness depends on partial spontaneous recovery having already occurred. As a result, reversal from deep blockade may be incomplete or delayed, increasing the risk of residual paralysis.

Sugammadex, in contrast, provides a more direct and predictable effect. By encapsulating rocuronium or vecuronium molecules, it rapidly clears them from the neuromuscular junction, leading to swift restoration of muscle function. Some clinical trials suggest a faster return to normal function with sugammadex compared with neostigmine, along with better recovery of respiratory muscle strength.1–7

The side effect profiles of these drugs differ significantly. Neostigmine is almost always administered with an anticholinergic agent such as glycopyrrolate to counteract muscarinic effects like bradycardia, bronchospasm, and increased salivation. Despite this precaution, hemodynamic fluctuations remain a concern.

Sugammadex does not share this cholinergic mechanism, and patients treated with it often experience more stable cardiovascular responses. Although rare reports of hypersensitivity or anaphylaxis exist, the overall safety record of sugammadex is favorable, particularly in higher-risk populations such as those with cardiovascular disease or respiratory compromise.3,4,8–11

Cost has been the most significant barrier to broader adoption of sugammadex in the United States. A single dose of sugammadex costs several times more than the combination of neostigmine and glycopyrrolate. As a result, many hospitals restrict its use to select scenarios, preserving neostigmine as the default agent for routine cases. However, when broader economic factors are considered, the value proposition shifts: sugammadex frequently reduces time to extubation, accelerates operating room turnover, and decreases postoperative pulmonary complications. Replacing neostigmine with sugammadex in a portion of procedures could therefore yield overall savings once downstream costs are factored in.12–17
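A back-of-the-envelope break-even sketch shows the structure of this trade-off. Every number below is hypothetical, chosen only to illustrate the calculation; none of these figures come from the cited cost studies.

```python
# HYPOTHETICAL illustrative figures -- not from the cited studies.
DRUG_COST_SUGAMMADEX = 100.0   # $ per case, assumed
DRUG_COST_NEOSTIGMINE = 15.0   # $ per case incl. glycopyrrolate, assumed
OR_MINUTE_COST = 40.0          # $ per operating-room minute, assumed

def net_cost_difference(minutes_saved_per_case: float) -> float:
    """Extra drug cost of sugammadex minus the value of OR time saved.
    Positive means sugammadex costs more overall; negative means it saves money."""
    extra_drug = DRUG_COST_SUGAMMADEX - DRUG_COST_NEOSTIGMINE
    return extra_drug - minutes_saved_per_case * OR_MINUTE_COST

print(net_cost_difference(1))  # modest time savings: still a net cost
print(net_cost_difference(3))  # larger time savings: the drug premium is recouped
```

Under these assumed figures, the drug premium is recovered once a little over two operating-room minutes are saved per case, before even counting avoided pulmonary complications.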

In everyday practice in the United States, institutions often adopt a hybrid strategy, with neostigmine remaining a familiar and cost-effective option for patients at lower risk or when only shallow blockade is anticipated. Sugammadex, however, is increasingly favored in situations involving deeper blockade, higher-risk patients, or settings where efficiency and safety are prioritized.3,4,18,19

As more outcome-driven evidence becomes available and hospitals refine cost–benefit considerations, the role of sugammadex is expected to expand, with its broader adoption ultimately depending on how institutions balance immediate drug costs against potential long-term gains in reduced complications, improved workflow, and enhanced patient safety.

References

1. Moss, A. P., Powell, M. F., Morgan, C. J. & Tubinis, M. D. Sugammadex versus neostigmine for routine reversal of neuromuscular blockade and the effect on perioperative efficiency. Proc (Bayl Univ Med Cent) 35, 599–603 (2022). DOI: 10.1080/08998280.2022.2079921

2. Maqusood, S., Bele, A., Verma, N., Dash, S. & Bawiskar, D. Sugammadex vs Neostigmine, a Comparison in Reversing Neuromuscular Blockade: A Narrative Review. Cureus 16, e65656. DOI: 10.7759/cureus.65656

3. Chandrasekhar, K., Togioka, B. M. & Jeffers, J. L. Sugammadex. in StatPearls (StatPearls Publishing, Treasure Island (FL), 2025).

4. Neely, G. A., Sabir, S. & Kohli, A. Neostigmine. in StatPearls (StatPearls Publishing, Treasure Island (FL), 2025).

5. Calvey, T. N., Wareing, M., Williams, N. E. & Chan, K. Pharmacokinetics and pharmacological effects of neostigmine in man. Br J Clin Pharmacol 7, 149–155 (1979). DOI: 10.1111/j.1365-2125.1979.tb00915.x

6. Nguyen-Lee, J. et al. Sugammadex: Clinical Pharmacokinetics and Pharmacodynamics. Curr Anesthesiol Rep 8, 168–177 (2018). DOI: 10.1007/s40140-018-0266-5

7. Panhuizen, I. F. et al. Efficacy, safety and pharmacokinetics of sugammadex 4 mg kg−1 for reversal of deep neuromuscular blockade in patients with severe renal impairment. British Journal of Anaesthesia 114, 777–784 (2015). DOI: 10.1093/bja/aet586

8. Herring, W. J. et al. A randomized trial evaluating the safety profile of sugammadex in high surgical risk ASA physical class 3 or 4 participants. BMC Anesthesiol 21, 259 (2021). DOI: 10.1186/s12871-021-01477-5

9. Mao, X. et al. A pharmacovigilance study of FDA adverse events for sugammadex. Journal of Clinical Anesthesia 97, 111509 (2024). DOI: 10.1016/j.jclinane.2024.111509

10. Luo, J., Chen, S., Min, S. & Peng, L. Reevaluation and update on efficacy and safety of neostigmine for reversal of neuromuscular blockade. Ther Clin Risk Manag 14, 2397–2406 (2018). DOI: 10.2147/TCRM.S179420

11. Hristovska, A.-M., Duch, P., Allingstrup, M. & Afshari, A. The comparative efficacy and safety of sugammadex and neostigmine in reversing neuromuscular blockade in adults. A Cochrane systematic review with meta-analysis and trial sequential analysis. Anaesthesia 73, 631–641 (2018). DOI: 10.1111/anae.14160

12. Neostigmine Prices, Coupons, Copay Cards & Patient Assistance. Drugs.com https://www.drugs.com/price-guide/neostigmine.

13. Benscheidt, A. & Daratha, K. B. Sugammadex versus Neostigmine: Operating Room Time and Cost. Books, Presentations, Posters, Etc. (2019).

14. Bartels, K., Fernandez-Bustamante, A. & Melo, M. F. V. Reversal of neuromuscular block: what are the costs? British Journal of Anaesthesia 131, 202–204 (2023). DOI: 10.1016/j.bja.2023.04.037

15. Lan, W. et al. Cost-Effectiveness of Sugammadex Versus Neostigmine to Reverse Neuromuscular Blockade in a University Hospital in Taiwan: A Propensity Score-Matched Analysis. Healthcare (Basel) 11, 240 (2023). DOI: 10.3390/healthcare11020240

16. Maddock, A. The ‘cost’ of sugammadex. Anaesthesia 72, 1558–1559 (2017). DOI: 10.1111/anae.14150

17. Bruceta, M., Singh, P. M., Bonavia, A., Carr, Z. J. & Karamchandani, K. Emergency use of sugammadex after failure of standard reversal drugs and postoperative pulmonary complications: A retrospective cohort study. J Anaesthesiol Clin Pharmacol 39, 232–238 (2023). DOI: 10.4103/joacp.joacp_289_21

18. Adeyinka, A. & Layer, D. A. Neuromuscular Blocking Agents. in StatPearls (StatPearls Publishing, Treasure Island (FL), 2025).

19. Nemes, R. & Renew, J. R. Clinical Practice Guideline for the Management of Neuromuscular Blockade: What Are the Recommendations in the USA and Other Countries? Curr Anesthesiol Rep 10, 90–98 (2020). DOI: 10.1007/s40140-020-00389-3

Anesthesia management for patients with a history of stem cell transplant poses unique clinical challenges due to the complex interplay of immunosuppression, systemic toxicity, and organ dysfunction commonly observed in this patient population. These patients often have significant comorbidities resulting from their underlying hematologic malignancies or the intensive chemoradiotherapy regimens they undergo before transplant. A comprehensive preoperative evaluation, meticulous intraoperative planning, and vigilant postoperative care are essential to optimizing outcomes and reducing complications.

The preoperative assessment should include a detailed history of the transplant process, including the conditioning regimen, the source of the graft (autologous or allogeneic), and post-transplant complications, such as graft-versus-host disease (GVHD). Pulmonary complications are common in this group and include idiopathic pneumonia syndrome, bronchiolitis obliterans, and infectious pneumonias. These complications can severely impair gas exchange and increase the risk of perioperative hypoxia (Ji et al., 2022). Thus, pulmonary function tests and imaging studies are often warranted preoperatively. Cardiovascular toxicity from chemotherapy agents such as anthracyclines may manifest as cardiomyopathy. Echocardiographic evaluation is necessary to assess ejection fraction and valvular function before anesthesia induction (Mahadeo et al., 2021).

Evaluating the hematologic status of stem cell transplant recipients who need surgery is crucial to anesthesia and surgical planning. Profound cytopenias, including thrombocytopenia, may be present and increase the risk of perioperative bleeding. Platelet transfusions are frequently required and must be timed appropriately to ensure hemostatic efficacy during invasive procedures (Russell et al., 2024). Additionally, neutropenia increases the risk of perioperative infections, necessitating strict aseptic techniques and potentially prophylactic antibiotics.

Anesthetic medications must be carefully selected in patients with stem cell transplant. Hepatic and renal dysfunction, which are not uncommon due to previous chemotherapy or GVHD-related organ damage, can alter drug metabolism and excretion. Agents with a short context-sensitive half-time and relatively predictable kinetics, such as propofol, are often preferred. However, volatile anesthetics may exacerbate hepatic injury and require cautious use. Neuromuscular blockers that do not rely on hepatic or renal clearance, such as cisatracurium, which is degraded by organ-independent Hofmann elimination, are preferred in patients with end-organ impairment (Alodhaib et al., 2025).

Intraoperative monitoring should be comprehensive. Arterial lines should be considered for hemodynamic monitoring and blood gas analysis in any patients with cardiovascular instability or respiratory compromise. Temperature regulation is critical due to the patients’ immunocompromised state; even mild hypothermia can predispose them to infections and coagulopathy. Furthermore, fluid management must be judicious, especially in cases of capillary leak syndrome or hypoalbuminemia, both of which may occur post-transplant (Mahadeo et al., 2021).

Postoperative care includes aggressive pain management while minimizing the immunosuppressive and respiratory depressive effects of opioids. Regional anesthesia techniques can provide superior analgesia and reduce systemic opioid use when not contraindicated by thrombocytopenia or coagulopathy. However, the risks of bleeding must be weighed against the benefits of invasive blocks, especially in patients with platelet counts below 50,000/µL (Russell et al., 2024). Additionally, anesthesiologists must be alert to potential complications, such as septic shock or acute respiratory distress syndrome (ARDS), which can occur even during the immediate postoperative period. Due to the complex physiology of stem cell transplant recipients, even minor procedures require careful planning and execution.
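The platelet threshold cited above can be expressed as a minimal screening sketch. This is purely illustrative: the 50,000/µL cutoff is the figure from the text, while the function name and output labels are invented for the example; real practice weighs many additional factors (coagulation status, block site, urgency) that no single threshold captures.

```python
# Illustrative sketch only, not a clinical decision tool.
# Encodes the platelet threshold mentioned in the text (50,000/µL)
# as a coarse screening check for regional anesthesia planning.

PLATELET_THRESHOLD_PER_UL = 50_000  # cutoff cited in the text

def regional_block_platelet_screen(platelet_count_per_ul: int) -> str:
    """Return a coarse screening label based on the platelet count."""
    if platelet_count_per_ul < PLATELET_THRESHOLD_PER_UL:
        return "below threshold: heightened bleeding risk, reassess invasive block"
    return "above threshold: proceed with standard risk assessment"

print(regional_block_platelet_screen(42_000))
print(regional_block_platelet_screen(180_000))
```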

Patients with stem cell transplants are at high risk for anesthesia complications due to multisystem involvement, immunosuppression, and the potential for rapid decompensation. Tailored anesthetic strategies that consider the transplant timeline and associated complications are critical for safe and effective perioperative care.

References

  1. Liang J, Chen Y, Zhou J, et al. Lung transplantation for bronchiolitis obliterans after hematopoietic stem cell transplantation: a retrospective single-center study. Ann Transl Med. 2022;10(12):659. doi:10.21037/atm-22-2517
  2. Ragoonanan D, Khazal SJ, Abdel-Azim H, et al. Diagnosis, grading and management of toxicities from immunotherapies in children, adolescents and young adults with cancer. Nat Rev Clin Oncol. 2021;18(7):435-453. doi:10.1038/s41571-021-00474-4
  3. Anthon CT, Pène F, Perner A, et al. Platelet transfusions in adult ICU patients with thrombocytopenia: A sub-study of the PLOT-ICU inception cohort study. Acta Anaesthesiol Scand. 2024;68(8):1018-1030. doi:10.1111/aas.14467
  4. Khan MA, Abu Esba LC, Yousef CC, et al. Practical challenges and considerations in adopting biosimilars in oncology clinical practice within a large healthcare system. Expert Rev Clin Pharmacol. 2025;18(6):323-331. doi:10.1080/17512433.2025.2492063
  5. Anthon CT, Pène F, Perner A, et al. Platelet transfusions and thrombocytopenia in intensive care units: Protocol for an international inception cohort study (PLOT-ICU). Acta Anaesthesiol Scand. 2022;66(9):1146-1155. doi:10.1111/aas.14124

As healthcare policy has evolved, healthcare delivery has become increasingly influenced by administrative procedures such as prior authorization. Although prior authorization was initially implemented to control healthcare costs and ensure appropriate care, its effect on perioperative services, including anesthesia, is becoming a major concern. Delays caused by prior authorization requirements can affect operating room scheduling, anesthesia planning, and, ultimately, patient outcomes. Mounting evidence suggests that these delays introduce clinical inefficiencies and potential risks for patients requiring time-sensitive surgical interventions.

A significant barrier to timely anesthesia services is the delay in scheduling procedures while insurance approval is pending. This is particularly problematic for outpatient and elective procedures, where administrative hurdles may delay surgery by days or weeks. These delays have more than just logistical consequences: they may lead to worsening patient conditions, the need to reassess preoperative plans, or disease progression that requires more complex anesthesia care. Although anesthesia itself is not typically the direct target of prior authorization, the services it supports, such as imaging, surgery, and certain pain management interventions, frequently are. As such, anesthesiology teams may experience indirect but significant disruptions in workflow and resource allocation (1).

Socioeconomic factors have been shown to correlate with longer wait times for procedures involving anesthesia, especially in pediatric populations. For instance, a study of pediatric outpatient MRI found that patients with public insurance, those from racial minority groups, and those from non-English-speaking families had significantly longer wait times to imaging completion, a process that requires anesthesia in many cases. These findings suggest that systemic inequities linked to prior authorization disproportionately affect vulnerable populations, exacerbating healthcare disparities and potentially influencing anesthetic outcomes through delayed diagnosis or treatment (1).

The potential downstream effects of these delays on trauma and acute care cases have also been observed. Although trauma care is often exempt from prior authorization requirements due to its emergent nature, subsequent surgical planning may be delayed when post-acute interventions, such as imaging or certain medications, require authorization. In such cases, the anesthesia team may face uncertainty in scheduling and resource mobilization, which can affect the safety and effectiveness of perioperative care. Furthermore, since trauma cases can evolve quickly, delays in post-trauma surgical intervention can necessitate changes in anesthetic plans, including different pharmacological approaches or monitoring standards, due to progression in clinical status (2).

Another area in which prior authorization impacts anesthesia care is chronic pain management. Interventional pain procedures, particularly those requiring guided injections or radiofrequency ablations, often fall under the purview of the anesthesiology department. These services are commonly flagged for prior authorization, which can lead to frustration among providers and patients. Delays in these procedures can lead to prolonged opioid use, increased emergency department visits, or deterioration in patient function. All of these outcomes increase the burden on the anesthesia teams involved in the longitudinal care of these patients (3).

While definitive studies specifically quantifying anesthesia-related complications from prior authorization delays are still limited, the indirect evidence is compelling. Prior authorization often presents a significant operational and clinical hurdle. Reforming the system to prioritize efficiency and transparency could mitigate its negative impact on anesthesiology and surgical services.

References

  1. Noda SM, Alp Oztek M, Sullivan E, Otto RK, Stanford S, Iyer RS. The Effects of Race, Primary Language, Insurance and Other Factors on Time to Pediatric Outpatient MRI Completion: A Retrospective Cohort Study. Acad Radiol. 2024;31(11):4643-4649. doi:10.1016/j.acra.2024.08.042
  2. Scott JW, Groner JI, Jensen AR; ACS TQIP Mortality Reporting System Writing Group. Trauma Quality Improvement Program mortality reporting system case reports: Unanticipated mortality because of imaging-related delays in care. J Trauma Acute Care Surg. Published online July 3, 2025. doi:10.1097/TA.0000000000004691
  3. Slat S, Yaganti A, Thomas J, et al. Opioid Policy and Chronic Pain Treatment Access Experiences: A Multi-Stakeholder Qualitative Analysis and Conceptual Model. J Pain Res. 2021;14:1161-1169. Published 2021 Apr 27. doi:10.2147/JPR.S282228

Glucagon-like peptide-1 (GLP-1) receptor agonists have transformed the therapeutic landscape for type 2 diabetes mellitus (T2DM) and obesity. These agents mimic endogenous GLP-1, which enhances glucose-dependent insulin secretion, delays gastric emptying, and promotes satiety. In randomized controlled trials (RCTs), GLP-1 receptor agonists have demonstrated significant efficacy in improving glycemic control and reducing weight. However, RCTs often fail to capture the heterogeneity of real-world clinical populations. Therefore, real-world data are essential for evaluating the broader clinical utility, comparative effectiveness, and safety of GLP-1 agonists across different patient demographics and comorbidity profiles.

A comprehensive study by Thomsen et al. used nationwide registries to evaluate real-world outcomes associated with GLP-1 agonists, particularly with newer agents such as semaglutide and liraglutide. Their analysis revealed consistent weight loss and glycemic improvement across various subgroups, including individuals with significant comorbid conditions, such as cardiovascular disease or renal impairment (1).

Real-world data suggest that GLP-1 agonists may offer advantages over traditional therapies, such as basal insulin. In another study, Yang et al. analyzed the cost-effectiveness and clinical outcomes of GLP-1 agonists versus insulin using population-level datasets. Their findings revealed that GLP-1 agonists not only resulted in similar or superior glycemic control but also significantly reduced body weight and the risk of hypoglycemia (2).

Safety is an important concern in long-term pharmacotherapy. Gastrointestinal (GI) adverse events are the most commonly reported side effect associated with GLP-1 receptor agonist use. Liu et al. analyzed real-world data from the FDA Adverse Event Reporting System (FAERS) to compare the incidence of GI events among different GLP-1 receptor agonists. They found that semaglutide was associated with a higher incidence of nausea and vomiting, while exenatide was associated with a higher incidence of diarrhea. Despite these associations, the events were largely self-limiting and rarely led to treatment discontinuation, particularly when slow dose escalation protocols were employed (3).
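Disproportionality analyses of spontaneous-reporting databases such as FAERS, like the study described above, typically rest on a 2×2 contingency calculation such as the reporting odds ratio (ROR), with a confidence interval derived from the log-odds standard error. The sketch below shows that arithmetic; the counts are entirely hypothetical and are not data from the cited study.

```python
import math

def reporting_odds_ratio(a: int, b: int, c: int, d: int) -> tuple[float, float, float]:
    """Reporting odds ratio and 95% CI from a 2x2 table:
        a = reports of the event of interest with the drug of interest
        b = all other event reports with the drug of interest
        c = reports of the event of interest with all other drugs
        d = all other event reports with all other drugs
    """
    ror = (a / b) / (c / d)
    # Standard error of ln(ROR) via the Woolf method
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(ror) - 1.96 * se)
    upper = math.exp(math.log(ror) + 1.96 * se)
    return ror, lower, upper

# Hypothetical counts, for illustration only
ror, lower, upper = reporting_odds_ratio(a=120, b=880, c=400, d=9600)
print(f"ROR = {ror:.2f} (95% CI {lower:.2f}-{upper:.2f})")
```

An ROR whose confidence interval excludes 1 flags a disproportionate reporting signal, but, as the cited authors note for tumor events, such signals do not establish causality.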

Tumor-related adverse effects are a more serious but less frequent safety concern. Yang et al. analyzed FAERS data spanning nearly two decades to investigate potential oncogenic risks associated with GLP-1 agonist therapy. The study identified a small number of reports indicating possible associations with medullary thyroid carcinoma and pancreatic neoplasms. However, the authors cautioned that these findings were based on spontaneous reporting systems and did not establish causality. They emphasized the need for continued surveillance and mechanistic studies to clarify any potential biological links (4).

Renal and cardiovascular outcomes also merit consideration when evaluating GLP-1 agonists in the real world. In a meta-analysis of observational studies, Caruso et al. reported favorable renal outcomes among GLP-1 agonist users, including slower progression of albuminuria and a reduced incidence of acute kidney injury, compared to other glucose-lowering drugs. Cardiovascular outcomes also trended positively, consistent with RCT findings such as those from the LEADER and SUSTAIN-6 trials. However, the authors noted the need for longer follow-up periods and stratification by baseline renal function and cardiovascular risk to validate these findings in the general population (5).

In conclusion, the growing body of real-world evidence confirms that GLP-1 agonists effectively improve glycemic control and promote weight loss and are generally safe when used in appropriate populations. Their comparative advantage over insulin and other antidiabetic agents makes them an appealing choice in many clinical scenarios. Nevertheless, careful attention to adverse event profiles, particularly GI symptoms and potential tumor risks, is warranted. As their use continues to expand, real-world studies will remain indispensable in refining clinical guidelines and optimizing patient outcomes.

References

  1. Thomsen RW, Mailhac A, Løhde JB, Pottegård A. Real-world evidence on the utilization, clinical and comparative effectiveness, and adverse effects of newer GLP-1RA-based weight-loss therapies. Diabetes Obes Metab. 2025;27 Suppl 2(Suppl 2):66-88. doi:10.1111/dom.16364
  2. Yang CY, Chen YR, Ou HT, Kuo S. Cost-effectiveness of GLP-1 receptor agonists versus insulin for the treatment of type 2 diabetes: a real-world study and systematic review. Cardiovasc Diabetol. 2021;20(1):21. Published 2021 Jan 19. doi:10.1186/s12933-020-01211-4
  3. Liu L, Chen J, Wang L, Chen C, Chen L. Association between different GLP-1 receptor agonists and gastrointestinal adverse reactions: A real-world disproportionality study based on FDA adverse event reporting system database. Front Endocrinol (Lausanne). 2022;13:1043789. Published 2022 Dec 7. doi:10.3389/fendo.2022.1043789
  4. Yang Z, Lv Y, Yu M, et al. GLP-1 receptor agonist-associated tumor adverse events: A real-world study from 2004 to 2021 based on FAERS. Front Pharmacol. 2022;13:925377. Published 2022 Oct 25. doi:10.3389/fphar.2022.925377
  5. Caruso I, Cignarelli A, Sorice GP, et al. Cardiovascular and Renal Effectiveness of GLP-1 Receptor Agonists vs. Other Glucose-Lowering Drugs in Type 2 Diabetes: A Systematic Review and Meta-Analysis of Real-World Studies. Metabolites. 2022;12(2):183. Published 2022 Feb 15. doi:10.3390/metabo12020183

Intravenous (IV) access is essential in anesthesia and surgery for rapid and reliable administration of fluids and medications. While upper extremity veins are preferred due to lower complication rates and easier access, there are situations (such as trauma, burns, or inaccessible upper extremity veins) where IV access in the lower extremities is required to proceed with anesthesia and surgery.

The most common lower extremity sites for peripheral IV access are the great saphenous vein at the ankle and the lesser saphenous vein, chosen for their superficial locations and consistent anatomy. In emergency or critical care situations, the femoral vein is often chosen for central venous access because it is large, easy to identify, and suitable for rapid infusion. However, IV access in the lower extremities carries unique risks, particularly a higher incidence of complications such as thrombophlebitis, deep vein thrombosis (DVT), and catheter-related infections compared to upper extremity sites. These risks are amplified in adults due to factors such as slower blood flow, increased limb dependence, and a higher prevalence of peripheral vascular disease. Because of these concerns, guidelines consistently recommend limiting the duration of lower extremity IV access and transitioning to upper extremity access as soon as possible (1).

Recent advances in ultrasound technology have significantly improved the safety and success rates of obtaining IV access in the lower extremities, making ultrasound an increasingly useful tool for anesthesia and surgery. Ultrasound guidance provides direct visualization of veins, which not only facilitates cannulation in patients with difficult vascular access but also reduces the risk of complications such as arterial puncture or multiple failed attempts. Several studies support the use of ultrasound for both peripheral and femoral vein cannulation, showing improvements in first-pass success rates and reductions in procedure-related complications (1). This is particularly relevant for central access via the femoral vein, which, while easy to access, is associated with higher rates of infection and thrombotic events compared to central access via the subclavian or internal jugular vein. In a landmark study, Merrer et al. found that femoral catheterization had higher rates of both infection and thrombosis than subclavian access, reinforcing the recommendation that femoral sites should be reserved for specific clinical situations or emergencies (2).

IV access in the lower extremities is generally considered safer and more feasible in pediatric patients than in adults because children have fewer vascular comorbidities and their veins are less prone to thrombosis. However, complications such as infiltration, infection, and phlebitis still occur and require regular evaluation and prompt intervention if problems arise (3). In adult patients, early removal and routine monitoring are essential strategies to reduce complications, especially in those with risk factors for thrombosis or infection.

To minimize risk, several key practices are recommended: use of ultrasound guidance, strict aseptic technique during insertion, regular inspection of the IV site, and prompt removal of lower extremity catheters when no longer needed. Providers should monitor for signs of DVT, such as swelling, pain, and erythema, as well as for local or systemic infection. In addition, when femoral access is required for central venous catheterization, careful attention to catheter care protocols and early transition to safer sites are critical to reducing adverse outcomes (4).

Although lower extremity IV access is sometimes unavoidable in anesthesia and surgery, its use is associated with increased risks. Modern techniques such as ultrasound have improved the safety profile of these procedures, but clinicians must remain vigilant for potential complications and limit the use of lower extremity access when possible. With careful management, lower extremity IV access can be a safe and effective option for patients with limited alternatives.

References

  1. Witting MD. IV access difficulty: incidence and delays in an urban emergency department. J Emerg Med. 2012;42(4):483-487. doi:10.1016/j.jemermed.2011.07.030
  2. Merrer J, De Jonghe B, Golliot F, et al. Complications of femoral and subclavian venous catheterization in critically ill patients: a randomized controlled trial. JAMA. 2001;286(6):700-707. doi:10.1001/jama.286.6.700
  3. Geerts WH, Code KI, Jay RM, Chen E, Szalai JP. A prospective study of venous thromboembolism after major trauma. N Engl J Med. 1994;331(24):1601-1606. doi:10.1056/NEJM199412153312401
  4. Dargin JM, Rebholz CM, Lowenstein RA, Mitchell PM, Feldman JA. Ultrasonography-guided peripheral intravenous catheter survival in ED patients with difficult access. Am J Emerg Med. 2010;28(1):1-7. doi:10.1016/j.ajem.2008.09.001