Special Report: From the Battlefield to the ED
2010; Lippincott Williams & Wilkins; Volume 32; Issue 10; Language: English
DOI: 10.1097/01.eem.0000389818.25738.60
ISSN: 1552-3624
Topic(s): Healthcare Systems and Technology
An Army nurse administers CPR to an American soldier just after he arrived at a military hospital in Iraq. Techniques employed during wartime often find their way back to emergency departments in the United States.

When George Washington took command of the Continental Army, he gained almost immediate attention for a maneuver that had more to do with medicine than military tactics, but it proved a bold stroke toward victory: he instituted mandatory vaccination against smallpox. His reasoning at the time, according to archives still ensconced near his home at Mount Vernon, was to exercise “utmost vigilance against this most dangerous enemy,” even though inoculation was so primitive that one common method was simply to inhale material from the pustules. But his logic has lasted more than two centuries: that a monstrous threat lies in potentially lethal and transmissible infection, and containment is critical.

“Although decisions to implement these measures may infringe upon individual or community rights, they are ethically acceptable and justified under these circumstances, if there is consensus that they are effective and for the common good.” Washington didn't write those words; emergency physicians at Wayne State University in Detroit did, just two years ago, but they seemed to take a page right out of the general's own wartime experience. (Prehosp Disaster Med 2009;24[2]:115.)

In fact, advances from the field of battle to the field of emergency medicine have a very long history. Since that first intercontinental American battle, war has driven first-class care: advances in hemostatic resuscitation, point-of-care imaging, skin preservation for burns, and even refinements in needles that make insertion into veins possible during bumpy transport. (BMJ 2008;336[7653]:1098.)

How did it happen? “Ironically, though not uncommonly, subsequent advancements in care for the critically ill occurred during the wars of the twentieth century,” observed Matthew Rosengart, MD, MPH, a surgeon who summed up historic landmarks in critical care as part of a grant from the National Institutes of Health. “Identifying shock and instituting appropriate intravascular fluid and resuscitation was well-established at the conclusion of World War I, and the techniques of blood transfusion became operant during World War II.” (Surg Clin North Am 2006;86[6]:1305.)

“I think we can learn a lot from what has already been worked on by the military,” observed Christopher Colwell, MD, the director of emergency medicine at Denver Health Medical Center and an associate professor of emergency medicine at the University of Colorado School of Medicine. “It was the military publishing their experience with tourniquets that really got that idea going again.”

Less than 100 years after the British were overthrown, another American war, the one between the states, would inaugurate an organized system of emergency care. The system was originally developed in France during the Napoleonic Wars by the military surgeon Dominique Jean Larrey, who used horse-drawn “flying ambulances” modeled on fire wagons and instituted public-health reforms aimed at aseptic technique, but his organization and approach were largely discontinued with his death in 1842. (J Emerg Primary Health Care 2003;1[3-4]:990004.) It was Jonathan Letterman who adopted the methods and widely implemented them for the bloodiest war in U.S. history, establishing field hospitals, protocols for triage, and ambulances as vehicles for carrying the seriously injured.
At the rear, “the surgeon and his assistants receive the poor wounded soldiers, and swiftly minister to their needs. Arteries are tied, ligatures and tourniquets applied, flesh wounds hastily dressed, broken limbs set, and sometimes, where haste is essential, amputations performed within sight and sound of the cannon,” according to an account of the day in archival records. (http://sonofthesouth.com)

But the starting point for the practice of urgent medical care appears to be the war for independence, in which medicine was transformed into a profession practiced under the pressure of immediacy. Neither side expected the war to drag on, and facilities for the wounded in both the North and the South were often makeshift, unsanitary, and crowded. (A History of Medicine. London: Taylor & Francis; 2005.) Some doctors provided their own surgical cases and instruments, precious tools that would ignite a debate years later about how and why some troop units seemed to have such equipment, including state-of-the-art stethoscopes, when there was no proof in requisition records that they were ever provided by the quartermasters. This would pave the way for wider use of these devices in later wars, as casualty care moved closer to the battle lines to save time. During the Civil War, 14 percent of soldiers died from their injuries; by Vietnam, that figure had plummeted to four percent. (Crit Care Clin 2009;25[1]:31.)

Triage was first implemented on the battlefields of New England to determine where tourniquets were most needed following musket fire. By the Civil War, this rush to get patients in for treatment had become known for a time limit, the need to fall within only an “hour's neglect,” a term that presumably gave rise to the “golden hour.” (Harper's Weekly. June 7, 1862.)

Preventive measures also were first used in the American Revolution, including the “variolation” against smallpox ordered by Washington, who had a realistic fear of biological weaponry. The pox had been intentionally used by the British before, on Native Americans during the French and Indian Wars. (J Military History 2004;68[2]:381.) In fact, an account from the 1760s indicates that a small tin box of the infectious agent was given to a tribe thought to be sympathetic to France, bestowed as a gift by the English, with the promise that the little container would bring the population good luck when they opened it among their own people. Soon a village 15 miles long was devoid of any living inhabitants, its lodges filled with the dead. (Major Problems in the History of American Medicine and Public Health. New York: Houghton Mifflin Co.; 2001.)

The results of the inoculation program would now be considered astounding. By 1778, the death rate from smallpox had fallen to three individuals per thousand, down from 160 per thousand only the year before. (Foreign Policy Research Institute, Wachman Center, 2010;15[4]; http://bit.ly/Wachman.) The man who would become the first American president couldn't have known that his actions would not only help popularize the procedure but arguably set the stage for medicolegal challenges to it. Years later, doctors would be expected to root out the unvaccinated, making home visits to families to ensure such prevention.
In 1895, in one of the first such malpractice lawsuits, a father prevailed as a litigant against a doctor who seized him by the arm and declared: “You shall be vaccinated or I shall die for it!” (The Centers for Law and the Public's Health at Johns Hopkins and Georgetown Universities; www.publichealthlaw.net.)

And tourniquets, not just vaccinations, became a source of controversy over the years, despite the fact that they saved lives. Two hundred years after Washington's medical corps relied on fabric and brass to staunch bleeding, researchers were claiming overuse. By the 1980s, the method had become a matter of argument, only to spring up as a point of contention all over again during the past decade. Tourniquet application is an easily applied approach to preventing prehospital exsanguination, stated Israeli researchers who reviewed the cases of 550 soldiers. (J Trauma 2003;54[5 Suppl]:S221.) British and American investigators countered that there are few, if any, medical reasons for a tourniquet to be used to stop extremity hemorrhage. (J Trauma 2003;54[5 Suppl]:S219.)

The use of tourniquets in emergency medicine is complicated by worries over the potential for ischemia and nerve damage, explained Col. David Della-Giustina, MD, an assistant professor of military and emergency medicine at the Uniformed Services University of the Health Sciences and the chair of emergency medicine at Madigan Army Medical Center in Tacoma, WA. But they have proven valuable everywhere from civilian emergency departments to the clinics of Middle East deserts. In fact, in the award-winning documentary “Restrepo,” a group of soldiers occupying the Korengal Valley in Afghanistan is heard crediting tourniquet application with saving the life of a comrade who took fire. The observation doesn't seem dramatically different from that of a Civil War general who suffered a bone-shattering “ball” and received similar interim tourniquet treatment. “They then bore me to the operating room,” wrote General Oliver Hammond, adding that it was “a place a little gruesome with arms, legs and hands not yet all carried off.”

“We must be aware of categorical imperatives, such as statements like ‘Emergency physicians should or should not do X,’” warned Jerris Hedges, MD, the dean of the John A. Burns School of Medicine at the University of Hawaii, in an essay he wrote on historical perspectives in practice. “For example, 30 years ago we were told, ‘Emergency physicians should not use neuromuscular blocking agents.’ Similarly, 20 years ago we were told, ‘Emergency physicians should not use cardiac markers in their decision-making.’” (Acad Emerg Med 2007;14[11]:924.)

But sometimes the answer is equivocal. The standard strap-and-buckle tourniquet does not work; the cravat-and-stick tourniquet can, and some of the newer tourniquets seem to be 100 percent effective, according to Col. Ian Wedmore, MD, the U.S. Army Emergency Medicine Consultant to the Surgeon General and a clinical adjunct professor at the Medical College of Georgia. “I see the recent advances as (those) in hemorrhage control — hemostatic dressings and tourniquet use — and in prehospital treatment protocols and training,” he said. Other innovations include pain control and antibiotic use as well as hypothermia prevention, he added.

For the most recent wars in Afghanistan and Iraq, the wounds appear to be far less visible and more difficult to treat.
More troops have been sidelined from duty than in any other conflict, and the major reason is psychological duress, particularly post-traumatic stress disorder. For a time, suicide had even overtaken enemy fire as a cause of death, according to statistics compiled by Foreign Policy magazine in August 2010. Will these wars provide a new realm of research, with answers for stress-related battle injuries that lead to incapacitation?

At the University of Connecticut, a pair of emergency physicians took a careful look at the role of emergency medicine in the military, specifically by focusing on the Israeli army. They suggest that in wars to come, emergency medicine will provide knowledge and expertise to the military, although in the past the converse seems to have largely occurred. “Emergency medicine would seem to offer much to military medicine in both peace time and war,” they concluded. “The routine chief complaints of the regular army soldier are well within the scope of those in any emergency department, and the same judgment in terms of referral can be brought to bear.” (Israeli J Emerg Med 2006;6[4]:32.)

The Missing Ear That Launched an Early American War — and a Libation

Was a little war off the coast of Florida the battle that launched a thousand bar tabs? If so, it was the result of sound medical practice in an earlier era. The concoction lives on in the lore as a popular combination of two historic health preventives, one for malaria, the other for malnutrition. It's a gin and tonic, straight up, from long before “on the rocks” was even a possibility.

Prior to the advent of chloroform, liquor generally was relied upon as an anesthetic. Gin, perfected as a potent spirit by the British in the 1600s, served not just to relax the reflexes but to make more palatable the bitter quinine taken as a “tonic” for malaria treatment. One of the earliest wars of the New World may have required such a remedy. It took place around 1740 and was known as the War of Jenkins' Ear. It got its name when Spaniards sliced off the ear of an English naval officer, who carried the severed body part back to British superiors as evidence that retaliation was warranted, which they duly undertook. Neither side gained much in the four-year skirmish, but a drink lived on in the lore to help recall the tale of that conflict. At the time, limes were the most readily available citrus for scurvy prevention in sailors, and it is theorized that the tangy juice of the fruit was added to the gin-flavored antimalarial tonic. “A drink that prevents malaria and scurvy, remembers The War of Jenkins' Ear, and perfectly compliments a warm summer evening,” proffers http://historydrinks.com/. “The gin and tonic deserves its canonical place in the ranks of history drinks.”

Photos of medical equipment from the Revolutionary and Civil War may be viewed at www.surgicaltechnologists.net and www.braceface.com.

Comments about this article? Write to EMN at [email protected].