Letter Open access Peer reviewed

Temperature Monitoring and Management During Neuraxial Anesthesia

1999; Lippincott Williams & Wilkins; Volume: 88; Issue: 2; Language: English

10.1097/00000539-199902000-00003

ISSN

1526-7598

Authors

Daniel I. Sessler

Topic(s)

Respiratory Support and Mechanisms

Abstract

Frank et al. [1] surveyed 102 members of the ASA to determine temperature-monitoring practices during neuraxial anesthesia. Their results indicate that only a third of the anesthesiologists in the United States routinely monitor temperature during neuraxial anesthesia. This conclusion is probably correct; nonetheless, it is worth considering some limitations of the study. For example, only 60 questionnaires were returned, which is a relatively low response rate. The authors did not attempt secondary questionnaires or telephone contact. Consequently, they were unable to use well-established statistical methods to evaluate response bias. It is thus possible that a disproportionate fraction of the responses came from a self-selected group of anesthesiologists who were especially aware of and interested in the study topic. This possibility is supported by the fact that 60% of the returned questionnaires were from academic practitioners, which is hardly the typical ratio among ASA members. An additional concern is that respondents may have overestimated the proportion of neuraxial anesthetics during which they monitored temperature.

Local anesthetics do not trigger malignant hyperthermia [2], nor do the sedatives typically used during neuraxial anesthesia. Unless fever is likely, detection of inadvertent hypothermia is the major reason for monitoring temperature during neuraxial anesthesia.

The three major defenses against hypothermia in humans are vasoconstriction, shivering, and behavior (i.e., putting on a sweater or moving to a warmer environment). Neuraxial anesthesia impairs central autonomic thermoregulatory control [3], possibly by increasing apparent (as opposed to actual) leg skin temperature [4]. This inhibition is proportional to block height [5] but is small compared with the central inhibition produced by general anesthetics [6]. Neuraxial anesthesia also impairs behavioral thermoregulation, with the result that patients often do not consciously perceive that they are hypothermic [7]. Few surgical patients are in a position to do much to alter their environment, but the lack of thermal complaints lulls anesthesiologists into believing that their patients are near-normothermic. Most importantly, however, major conduction anesthesia blocks autonomic control of the affected region, thus preventing vasoconstriction and shivering in the legs [8].

Hypothermia during neuraxial anesthesia develops initially from a core-to-peripheral redistribution of body heat [9,10], with the amount depending on numerous factors, including the patient's previous thermal environment [11] and medication use [12]. Subsequent hypothermia, as during general anesthesia, results from heat loss exceeding heat production. The extent to which core temperature decreases during this phase depends largely on ambient temperature [13], the magnitude and duration of the surgical procedure [14], and the amount of unwarmed IV fluid given [15]. At some point, reemergence of thermoregulatory defenses will moderate further cooling; however, defenses restricted to the upper body are often insufficient to prevent further hypothermia.

Most studies evaluating the adverse consequences of mild hypothermia were performed in patients given general anesthesia; however, there is no reason to believe that neuraxial anesthesia in any way protects patients from hypothermia-induced complications.
The major consequences of mild perioperative hypothermia (i.e., 1-2°C) include morbid myocardial outcomes [16], augmented blood loss and allogeneic transfusion requirement [17], reduced resistance to surgical wound infections, and a 20% prolongation of hospitalization [18]. Other important consequences that have been verified by prospective, controlled trials include delayed postanesthetic recovery [19], protein wasting [20], reduced drug metabolism [21], and shivering [22].

Having established the mechanisms by which neuraxial anesthesia causes hypothermia, and that the consequences are likely to be severe, we must then consider the frequency of hypothermia in this population. Hypothermia during neuraxial anesthesia is certainly far more common than generally appreciated. Several studies indicate that hypothermia during neuraxial anesthesia for extensive operations is nearly as common and severe as that during general anesthesia [23,24]. The difficulty, however, is that neuraxial anesthesia is often used for relatively small operations. The frequency of hypothermia in patients undergoing short, minor procedures under neuraxial anesthesia is unknown. The study by Frank et al. [1] is thus limited in having failed to evaluate the types and durations of procedures for which the respondents used neuraxial anesthesia. Furthermore, they did not actually evaluate core temperature in these patients. It is therefore impossible to determine from the presented data what we really need to know, which is whether inadequate temperature monitoring led to serious underestimation of hypothermia. However, given that one third of anesthesiologists rarely monitor patient temperature during neuraxial anesthesia and that another third do so only occasionally, it seems likely that considerable hypothermia is being missed in these patients.

The final issue addressed by Frank et al. [1] is the site at which body temperature is monitored. Their survey indicates that skin-surface liquid-crystal thermometers were used most commonly, followed by axillary probes. Bladder temperature was occasionally monitored, but other sites were rarely used. The four reliable core temperature-monitoring sites are the pulmonary artery, distal esophagus, nasopharynx, and tympanic membrane. Values recorded from these sites are good indicators of brain temperature and are comparable during all but the most extreme thermal perturbations (1). It is, however, important to distinguish tympanic-membrane thermocouples from infrared aural-canal thermometers: carefully positioned tympanic-membrane thermocouples are among the best core-temperature measurement sites, whereas the accuracy and precision of many infrared aural-canal monitors are poor [26-28]. Other sites that are generally useful, except during cardiopulmonary bypass, include the mouth, axilla, bladder, and rectum [29,30]. "Deep-tissue" temperatures are also useful, but they are not currently available in the United States or Europe [31]. As long as the measurements are made properly and reasonable precautions are observed, any of these sites can be substituted for the four core sites during most surgical procedures. Skin-surface temperatures deserve special consideration because they are the measurement sites most commonly used during neuraxial anesthesia.

(1) Stone JG, Young WL, Smith CR, et al. Do temperatures recorded at standard monitoring sites reflect actual brain temperature during deep hypothermia? [abstract]. Anesthesiology 1991;75:A483.
Liquid-crystal strips are themselves reasonably accurate thermometers. The difficulty is that skin-surface temperatures are typically 2-4°C less than core temperature. Manufacturers compensate for this difference by adding an offset to the displayed reading. Although one might expect vasomotor activity and ambient temperature variation to influence the core-to-skin gradient, the effects are actually relatively small [32]. Forehead skin-temperature monitors are insufficiently accurate for fever screening [33], and they completely fail to detect malignant hyperthermia [34]. Their ability to reliably identify perioperative thermal disturbances is also unclear; some studies support their use [35], but others find them insufficiently accurate for clinical use [36,37]. In one other case, the authors concluded that skin-temperature monitoring is reliable, although their data indicate just the opposite [38]. Available data thus offer considerable support for the assertion of Frank et al. [1] that skin-surface temperatures are suboptimal. In contrast, it seems likely that axillary temperatures obtained with a carefully positioned probe, or intermittent oral temperatures, will be suitable during most neuraxial anesthetics.

Of course, simply detecting thermal perturbations is insufficient: avoiding hypothermia-induced complications requires that hypothermia be prevented. Fortunately, good methods are now readily available. The most important are IV fluid warming and forced-air heating [39]. Each liter of fluid infused at ambient temperature decreases mean body temperature 0.25°C in an average-sized patient [40]. Fluid warming obliterates this source of cooling. Forced-air heating prevents the typical 50-75 W intraoperative cutaneous loss [41] and transfers an additional 25-50 W across the skin surface [42]. The combination of fluid warming and forced-air heating keeps nearly all surgical patients normothermic.

In summary, local anesthetics and sedatives do not trigger malignant hyperthermia. There are, however, ample data indicating that perioperative hypothermia is common during neuraxial anesthesia. Furthermore, these patients are almost surely susceptible to the well-documented adverse effects of mild perioperative hypothermia. It is thus appropriate to monitor patient temperature during major conduction blocks when hypothermia is likely. This certainly includes patients undergoing body-cavity surgery and those undergoing other involved or long procedures. Axillary or oral temperatures are likely to be suitable in most patients, although numerous other sites are also adequate. Monitoring is only the first step: anesthesiologists must also intervene as necessary to maintain normothermia, that is, a core temperature of at least 36°C.
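The 0.25°C-per-liter and forced-air wattage figures cited above follow from simple heat-balance arithmetic. The short sketch below reproduces them under assumed values that are illustrative rather than taken from the letter: a 70-kg patient, a whole-body specific heat of roughly 3.5 kJ/(kg·°C), IV fluid given at about 21°C, and a 37°C core temperature.

```python
# Rough, illustrative heat-balance sketch of the figures quoted above.
# Assumed values (not from the letter): 70 kg patient, whole-body specific
# heat ~3.5 kJ/(kg*C), IV fluid at ~21 C room temperature, 37 C core.

BODY_MASS_KG = 70.0
BODY_SPECIFIC_HEAT = 3.5       # kJ per kg per deg C (approximate whole-body value)
FLUID_SPECIFIC_HEAT = 4.18     # kJ per kg per deg C (water)
CORE_C, AMBIENT_FLUID_C = 37.0, 21.0

body_heat_capacity = BODY_MASS_KG * BODY_SPECIFIC_HEAT                 # ~245 kJ per deg C

# Heat the body donates to bring 1 L (~1 kg) of unwarmed fluid to core temperature,
# and the resulting fall in mean body temperature.
heat_per_litre = FLUID_SPECIFIC_HEAT * (CORE_C - AMBIENT_FLUID_C)      # ~67 kJ
drop_per_litre = heat_per_litre / body_heat_capacity                   # ~0.27 C, close to the cited 0.25 C
print(f"Each litre of ambient-temperature IV fluid: ~{drop_per_litre:.2f} C drop in mean body temperature")

# Forced-air heating both eliminates the quoted 50-75 W cutaneous loss and adds
# 25-50 W across the skin, so the net swing in cutaneous heat balance is roughly:
net_swing_low, net_swing_high = 50 + 25, 75 + 50                       # 75-125 W
print(f"Net improvement in cutaneous heat balance with forced air: ~{net_swing_low}-{net_swing_high} W")
```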
