The history of blood transfusion
2000; Wiley; Volume 110, Issue 4; Language: English
DOI: 10.1046/j.1365-2141.2000.02139.x
ISSN: 1365-2141
Topic(s): Medical History and Innovations
'We few, we happy few, we band of brothers; For he today that sheds his blood with me Shall be my brother.' Shakespeare: Henry V (act 4, scene 3)

It is not possible in a review of this length to cover the history of all aspects of transfusion medicine comprehensively. This article focuses principally on the key developments in the early history of transfusion medicine, rather than on more recent developments. The major landmarks are summarized in Table I. The basic techniques involved in this life-saving procedure are relatively simple, and it is thus perhaps surprising that blood transfusion became part of routine clinical practice only relatively recently. Blood-letting (venesection) was widely practised for a variety of medical conditions from the time of Hippocrates (≈ 430 BC) through to the nineteenth century in Europe, and yet transfusion became a commonplace therapeutic intervention less than 100 years ago. This is because an understanding of both the nature of blood and the physiology of the circulation was required as a foundation for the development of blood transfusion, and neither was forthcoming until the middle of the seventeenth century.

The views of the Romans and ancient Greeks exerted a profound influence on both the traditions and practice of Western medicine for nearly 2000 years. The principal beliefs of the ancient Greeks and Romans were based on the writings of Hippocrates. The central doctrine of the humoral theory is set out in the treatise entitled 'On the nature of man', in which it was proposed that all living matter is composed of four basic ingredients, namely blood, phlegm, yellow bile and black bile (Lloyd, 1978). Although recognized as a vital element in the constitution of man, blood was certainly not viewed as being more or less important than the other humours. Differences in personality were viewed as reflecting different mixes of humours in people, and this belief extended well into the Renaissance period (Fig 1). Indeed, words still used today such as 'sanguine', 'phlegmatic', 'melancholic' and 'choleric' can be considered linguistic fossils from that era. An important consequence of the acceptance of the humoral theory was that it encouraged a holistic approach to medicine, in which illness came to be regarded as being due to an imbalance of the four humours. Correction of the imbalance was thus required for restoration of health; this could be achieved by attention to diet and environment, although medical procedures such as dieting, purging and blood-letting could also be used.

Anatomical knowledge was also required for an understanding of the circulation of the blood, the cornerstone of the practice of transfusion. The anatomical knowledge of the Greeks was very limited: they believed that blood simply ebbed and flowed through the peripheral veins, with some blood passing through the pores of the interventricular septum to mix with the 'pneuma' (or vital spirit) that was inspired with the air and which fed the brain. William Harvey (1578–1657), who studied medicine in Padua after graduating from Cambridge, was the first to understand the circulation of the blood; his treatise entitled 'Exercitatio anatomica de motu cordis et sanguinis in animalibus' was first published in 1628.

Fig 1. Plates from a German calendar of around 1480 portraying the influence of the four humours (top left, phlegm; top right, blood; bottom left, black bile; bottom right, yellow bile) on personality.
A preponderance of blood is associated with lust and arrogance, according to the text.

Experiments with blood transfusion proceeded in steps, initially involving transfusions from one animal to another and then transfusions from animals to man. The first written evidence of experiments with blood transfusion comes from Oxford in 1666, where the intellectual climate was particularly favourable for such physiological experiments. William Harvey spent a brief period in the city as Warden of Merton College. Distinguished scientists such as Robert Boyle, Thomas Willis, Christopher Wren and Robert Hooke formed the Oxford Experimental Philosophy Club during the 1650s. Wren conducted experiments showing that intravenous injection of substances into animals could exert a systemic effect, and Richard Lower (1631–1703) demonstrated that blood turned red after passage through the lungs. In 1666, Lower went on to conduct experiments in which blood was transfused from one dog to another which had been venesected. Samuel Pepys, who was subsequently elected to the office of President of the Royal Society in 1684, describes in his diary the events of an evening spent at Gresham College on 14 November 1666, where he witnessed such an experiment: 'At the meeting of Gresham College tonight, there was a pretty experiment of the blood of one dog let out, till he died, into the body of another on one side, while all his own ran out the other side. The first died upon the place, and the other very well and likely to do well. This did give occasion to many pretty wishes, as of the blood of a Quaker to be let into an Archbishop and such like; but may, if it takes, be of mighty use to man's health for the mending of bad blood by borrowing from a better body.'

Jean Denis, Professor of Philosophy and Mathematics at Montpellier in France, published an account of his work in the Philosophical Transactions of the Royal Society in July 1667 (Keynes, 1967). He transfused the blood of calves and lambs into humans, and it is interesting to note that the indication was usually not blood loss but symptoms of mental illness. In line with the humoral theory, he believed that the transfusion of blood from a docile animal might exert a calming influence on a troubled and deranged mind. Lower himself went on to transfuse Arthur Coga, a Cambridge university student described by Pepys as 'cracked a little in the head', with the blood of a sheep on 23 November 1667; Coga survived and underwent a second transfusion on 12 December. However, others were not so lucky, and transfusion soon fell into disrepute, so that no further advances were made for some time.

The first person credited with transfusing blood from one human to another was James Blundell, an obstetrician at Guy's and St. Thomas' Hospitals in London. He had seen many cases of postpartum haemorrhage, and this stimulated research into blood transfusion using dogs. He showed that death from haemorrhage could be prevented in dogs by transfusion, and that venous blood was just as effective as arterial blood for resuscitation. He concluded that 'only human blood should be employed' after observing that dogs given human blood invariably died. He developed a syringe with a two-way stopcock, and this was used with a considerable degree of success to treat women with postpartum haemorrhage (Blundell, 1828; Jones & Mackmul, 1928). He gave his first report of a blood transfusion from man to man in a paper to the Medico-Chirurgical Society of London presented on 22 December 1818.
This represented the beginning of the modern era of transfusion medicine. It is remarkable that blood transfusion was initially carried out with considerable success even without any knowledge of blood groups. Differences in compatibility of blood between species were recognized before differences within a species. Landois had published a treatise entitled 'Die Transfusion des Blutes' in 1875, in which he reported the observation that mixing blood cells from one animal with serum from another species often resulted in lysis within 2 min. Karl Landsteiner (1868–1943), an assistant at the Pathological–Anatomical Institute in Vienna, was aware of this work and carried out experiments to see whether there were demonstrable differences between individuals in man. He published his results in 1901, describing the reactions between the red cells and serum of 22 subjects (Landsteiner, 1961, text in translation). He observed that the addition of serum from some individuals would cause clumping of the red cells of others, and realized that this was a phenomenon with an immunological basis. He initially identified only three blood groups, which he termed A, B and C. Serum from group C subjects clumped the cells of those from groups A and B. The following year, Decastello and Stürli, two of Landsteiner's pupils in Vienna, confirmed his findings in a larger study of 155 individuals and also identified four subjects (2·5%) with no agglutinins in their own serum but whose red cells were agglutinated by serum from subjects of all three previously identified blood groups (group AB). They also reported that isoagglutinins were found in healthy individuals and were certainly not merely associated with disease (Decastello & Stürli, 1902).

The importance of Landsteiner's work, written in German and published in an Austrian journal, was not recognized immediately, and blood grouping did not become part of universal practice until the 1920s. In part, this was because Landsteiner himself moved on to other areas of research and did not pursue his early observations. He moved to Holland in 1919 and then emigrated to New York in 1922 to take up a new position at the Rockefeller Institute, where he embarked on work in other areas of immunology. He was awarded the Nobel Prize for Medicine in 1930 for his work on blood groups (Fig 2). He was described by his colleagues as a man with 'military bearing' but 'pervading modesty and simplicity' (Levine, 1961). He declined to address the audience at the presentation ceremony and asked Sinclair Lewis, the Nobel Laureate in Literature, to speak on his behalf. 'You may call me the master of words, but what is he?' said Lewis, 'he has been in a thousand cases the master of death.'

Fig 2. Karl Landsteiner (1868–1943), who was awarded the Nobel Prize for Physiology or Medicine in 1930 for his discovery of the ABO antigen system.

Several other groups with no knowledge of Landsteiner's work duplicated his findings. In 1907, Jansky in Prague named the four blood groups I, II, III and IV in order of their frequency, and Moss in Baltimore (USA) described the four blood groups in 1910 in the reverse order of IV, III, II and I (Moss, 1910). The Moss nomenclature was widely used in England, but there was the potential for dangerous confusion. The confusion was eventually resolved at the 1937 Congress of the International Society of Blood Transfusion, held in Paris, when the current ABO terminology was adopted.
The Mendelian basis of blood group inheritance was not appreciated at first and was only conclusively established by Bernstein in 1924. Differences in the relative distribution of blood groups between races were first documented during the First World War by Ludwik and Hanka Hirszfeld (Hirszfeld & Hirszfeld, 1919). The data were manipulated during the subsequent period of National Socialism in Germany, and blood group B came to be identified as a marker for Slavic and Jewish races, whereas group A came to be associated with positive traits such as intelligence and industry. Race was to become an issue in the Second World War, too, when the German army only accepted blood from certified 'Aryan' donors. In the USA, the American Red Cross segregated blood according to race and declined to include any donations from black donors in the pooled plasma used for the manufacture of albumin. Segregation of blood according to race remained in place in some American states until the late 1960s. In the late 1950s, a law was even passed in Louisiana which made it a misdemeanour for physicians to give blood from a black donor to a white person without consent.

A quarter of a century passed before other blood group systems were recognized. One of Landsteiner's first pupils at the Rockefeller Institute was Philip Levine (1900–1987), who began work there in 1925. In 1939, he published a case history of post-transfusion haemolysis in a woman with blood group O who had received blood from her husband, who had the same blood group (Levine & Stetson, 1939). There was a past history of stillbirth owing to erythroblastosis fetalis. Incubation of the woman's serum with cells from her husband resulted in agglutination and, when her serum was incubated with 104 other ABO-compatible samples, agglutination was seen in 80 cases. This, of course, was the first report of Rhesus antibodies. Levine did not suggest a name for the new system he had identified; the name was derived from parallel experimental work carried out by Landsteiner and Wiener involving the immunization of rabbits and guinea pigs with blood from Rhesus monkeys (Landsteiner & Wiener, 1940). The antibodies obtained in these animals were also found to agglutinate the erythrocytes of 85% of humans tested, who were classified as 'Rhesus positive'. Antisera of different specificities were subsequently recognized, and it was appreciated that the Rhesus system was a complex one with several alleles. The Cambridge geneticist Sir Ronald Fisher proposed the current system of nomenclature in 1944, with three sets of alleles referred to as c and C, d and D, and e and E. An alternative nomenclature was proposed by Alexander Wiener, but this was more complex and has not stood the test of time (Wiener, 1943).

This work stimulated much similar research, and many other antigens were recognized in the ensuing few years. The name of a system is often derived from the first patient described (as opposed to the researcher involved). The identification of new antigenic systems was facilitated by the development of the anti-globulin test (Coombs et al, 1945), as well as by the recognition that incubation of erythrocytes with enzymes such as trypsin enhanced the expression of some antigens (Morton & Pickles, 1947).
The Kell system was first identified in 1946 by Coombs himself, through application of his newly described anti-globulin test in the case of an infant with haemolytic disease that could not be explained by Rhesus incompatibility between mother and child (Coombs et al, 1946). The Duffy (Fy) and Kidd (Jk) systems were also first identified using this test. Joseph Duffy was a haemophiliac who had received several blood transfusions over the preceding 20 years (Cutbush et al, 1950). Mrs. Kidd hailed from Boston (USA), and her fifth baby was born with haemolytic disease of the newborn; an antibody in her blood was found to agglutinate the red cells of 146 out of 189 (77%) donors (Allen et al, 1951). The antibody was called Jk, taking the initials of her son.

The rapid and inevitable coagulation of blood imposed a natural limit on the quantity of fresh blood that could be transfused in the early days. A surgical technique was developed by Alexis Carrel (1873–1948), a French vascular surgeon originally from Lyon who also moved to the Rockefeller Institute in New York, to address this problem and permit transfusion of larger quantities of blood. This involved the temporary anastomosis of an artery of the donor to a vein of the recipient. He first undertook this procedure in March 1908 to transfuse the newborn daughter of one of his medical colleagues with blood from her father. The left radial artery of the father was anastomosed at the wrist to a vein in the leg of the infant, whose condition improved dramatically. Years later, another of his colleagues recalled Carrel's personal account of the incident (Clarke, 1949): 'I went to the house … Here the sight that met my eyes was pitiful indeed. The young mother was lying in her bed still weak from her confinement and terribly worried over the condition of her little babe. The baby was beside her, as white as the sheet on which it lay, apparently almost bloodless and quite unconscious. I feared that it would die before I could prepare for the operation. While the baby was at first so weak from loss of blood that it did not stir or cry when I dissected out the vein, in a few moments after the anastomosis was completed and the father's blood was entering the vein, it began to move. Soon it was whimpering and a faint pink colour appeared in its cheeks. It was not many minutes before it was red all over and crying lustily. I then tied off the father's artery and the baby's vein, cut them apart, sewed up the wounds in the skin, accepted the parents' profound thanks and came home to bed. It was a most interesting experience.'

Carrel received the Nobel Prize for Medicine in 1912 in recognition of his work in the field of vascular surgery, and his method was widely adopted by other surgeons of the time. However, the problems associated with this surgical approach included the need for the donor to be available for surgery, and also the fact that it was not easy to judge exactly how much blood had passed from donor to recipient, so that often the donor became hypotensive or the recipient showed signs of circulatory overload.

Another common approach was simply to use defibrinated blood, in which blood was collected in an open vessel and stirred to promote clotting. The clot could then be lifted out and the remaining fluid used for transfusion.
As early as 1821, Prevost and Dumas in France showed that defibrinated blood was effective in resuscitating animals whose blood had been removed, and the method was thereafter adopted in clinical practice. However, severe febrile reactions were not infrequent. It was clear that there was a need for a stable but non-toxic anticoagulant which could be added to collected blood and permit long-term storage. The British obstetrician Braxton Hicks experimented with a solution of phosphate of soda, but this also proved toxic (Hicks, 1868). Richard Lewisohn of the Mount Sinai Hospital in New York is credited with introducing sodium citrate into clinical practice as an anticoagulant. In fact, a 1% solution of sodium citrate was already widely used in laboratories as an anticoagulant. This high concentration was toxic to humans but, as Lewisohn himself recalled, 'Nobody had ever followed the simple thought of carrying out experiments to ascertain whether a much smaller dose might not be sufficient' for use as an anticoagulant. In 1915, he published the results of four years of experiments showing that a 0·2% solution of sodium citrate was effective as an anticoagulant for blood, while at the same time having no toxicity even when as much as 2500 ml of citrated blood was transfused (Lewisohn, 1915). At first, blood anticoagulated with sodium citrate was generally used within hours of donation, and it was certainly not anticoagulated with a view to long-term storage. The following year, experimental work showed that the addition of dextrose to blood stored for 2 weeks preserved its effectiveness in correcting anaemia after transfusion to rabbits which had been bled (Rous & Turner, 1916). Acid–citrate–dextrose (ACD) solution was adopted in the UK for anticoagulation of donated blood after a clinical review conclusively demonstrated improved red cell survival on storage without any disturbance of acid–base balance in the recipient (Loutit & Mollison, 1943). Citrate–phosphate–dextrose (CPD) solution was subsequently adopted as the anticoagulant of choice after clinical studies were conducted using blood stored for up to 28 d (Gibson et al, 1961).

The first blood donor service in the world was established in London in 1921 by Percy Oliver (1878–1944), Secretary of the Camberwell Division of the British Red Cross (Gunson & Dodsworth, 1996a). He was not a physician but a civil servant who had worked with refugees during the First World War, for which he was awarded the OBE (Order of the British Empire) in 1918. In October 1921, his branch of the Red Cross received a call from King's College Hospital for volunteers to give blood. A nurse in the group, Sister Linstead, gave blood, and this spurred Oliver to establish a panel of potential donors from among his acquaintances who could be contacted at short notice to give fresh blood. It was agreed that volunteers should only accept calls through Oliver's office at 5 Colyton Road, London SE22, which he called the British Red Cross Blood Transfusion Service. The donors often had to be summoned by the police, as telephones were not commonly found in private homes. Each volunteer donor underwent a physical examination and serological tests to establish the blood group and exclude infection with syphilis before being enrolled in the panel. The service was provided entirely free of charge, and administrative costs were recovered from charitable donations.
The donors were entitled to reclaim their expenses, but many chose not to do so. Oliver's services were called upon only 13 times in 1922, but word of the service he provided soon spread, and hospitals sought his assistance 428 times in 1925. Oliver actively recruited donors, and he was assisted in this task by Sir Geoffrey Keynes, a distinguished surgeon from St. Bartholomew's Hospital who was appointed as medical adviser to the organization. A set of regulations was established, with the help of Keynes, designed to ensure that donors remained on the panel. Hospitals were expected to treat donors with courtesy and also to protect them from witnessing anything that might cause them distress, as this was a significant cause of resignation from the panel. Many doctors were also still reluctant to use any form of anticoagulant and relied on direct donation. In addition, the techniques of venesection needed to be improved. The method initially involved a cut-down to the vein, which was then tied off after the donation, with the consequence that the vein would be useless for donation on another occasion. Many hospitals were still not able to type blood reliably, and there were several deaths as a result of ABO incompatibility. Some hospitals simply demanded group O blood rather than spend money on reagents to test the blood groups of recipients, but this was resisted as it put undue pressure on a limited number of donors.

It was some years before a similar donor panel system was set up in other cities, but Sheffield, Manchester and Norwich were among the first to establish their own donor panels. An important difference was that some of the provincial centres had no link with the Red Cross and also paid donors for their blood. In Manchester, donors were given three guineas by the Public Health Department for their blood in 1930. Similar systems were also adopted in other countries, among the first being France, Germany, Austria, Belgium, Australia and Japan. Tribute was duly paid to the work of Oliver at the first Congress of the International Society of Blood Transfusion, held in Rome in 1935: 'It is to the Red Cross in London that the honour is due to having been the first, in 1921, to solve the problem of blood donation by organizing a transfusion service available at all hours, and able to send to any place a donor of guaranteed health, whose blood has been duly verified.'

Obviously, it was inconvenient to have to call in donors at short notice to give blood. Bernard Fantus, of the Cook County Hospital in Chicago, is credited with establishing the first blood bank, in 1937, in which blood was collected in bottles and stored in a refrigerator for up to 10 d (Fantus, 1937). A shortage of blood donors prompted physicians in Russia to explore another avenue in the 1930s, namely the use of blood taken from cadavers. The first documented case was carried out by Shamov in Russia in 1930, after preliminary experimental work in animals, when blood removed from the inferior vena cava of a victim of a road traffic accident was used successfully to resuscitate a young man who had cut the arteries in his wrists in an attempt to commit suicide. Success with this case encouraged further work and, within a few years, Shamov published accumulated experience of the use of cadaver blood in some 2500 subjects, of whom only seven died (Shamov, 1937; Tarasov, 1960).
Only the blood of those who had died suddenly, typically of cardiac arrest, was used; obviously, the bodies of those who had died of systemic illness were not drained. Two to four litres of blood could be obtained from each cadaver via the jugular vein, and no anticoagulant was added. Prior experimental work with animals had shown that the risk of bacterial contamination was minimal if the blood was collected within a few hours of death, although a serological test for syphilis was performed before the blood was used. The establishment of blood banks rendered such practices redundant, but not before physicians in other countries had also experimented with the use of cadaver blood. One report from Michigan in 1964 described the successful use of cadaver blood in seven patients, of whom five made a full recovery; the principal author of this report was Dr Jack Kevorkian, who subsequently attained notoriety through his involvement with the 'assisted suicide' of patients with terminal illnesses (Kevorkian & Marra, 1964). Several groups also experimented with transfusion of placental blood, which was readily available in fairly large quantities. However, bacterial contamination proved to be a much greater problem with placental blood than with cadaver blood, and so this source of fresh blood for transfusion was abandoned (Goodall et al, 1938; Boland et al, 1939).

The outbreak of the Second World War provided a great stimulus for the development of blood transfusion services. During the Spanish Civil War (1936–39), Frederic Durán-Jordà, a physician from Barcelona, organized a blood bank composed exclusively of blood from group O donors that could be transported wherever needed. He fled to London after it became clear that the Nationalists would win the war, and helped Dr Janet Vaughan to establish a blood bank at Hammersmith Hospital in 1938. It was obvious to many that war with Germany was imminent, and preparations were put in place behind the scenes to establish four blood depots in London, to be administered by the Medical Research Council. In the autumn of 1938, the War Office also created the Army Blood Supply Depot (ABSD) in Bristol under the control of Dr Lionel Whitby (Gunson & Dodsworth, 1996b). The Army adopted a policy of supplying units at the battle fronts with blood that had been collected centrally and transported forward, rather than relying on bleeding military personnel at the front. This proved to be a remarkably successful strategy; both the German and American armies were constantly short of blood, as it proved very difficult to bleed soldiers at the front in times of battle. The initial target set by the ABSD Committee had been to supply 100 units/d for military hospitals, but by the end of the war 1300 units were supplied and used each day. Recruitment from among civilians was remarkably successful (Fig 3), and by the end of the war 756 046 donors had been bled. With the establishment of the National Health Service after the end of the war, a National Blood Transfusion Service (NBTS) was set up in 1946 under the control of the Ministry of Health.

Fig 3. A 1944 poster urging civilians to donate blood for military casualties. The Army Blood Supply Depot was based in Bristol, and blood from the south-west region was processed for military use.

The outbreak of war also stimulated work directed towards fractionation of blood.
It was appreciated at an early stage that it would not be possible to transport large quantities of blood from civilian donor centres to battle zones around the world, often in adverse climatic conditions. The value of plasma as an alternative for volume expansion was appreciated, but this, too, was difficult to transport and administer under field conditions. The breakthrough in this area came from the USA, in the laboratory of Edwin Cohn, Professor of Physical Chemistry at Harvard Medical School (Boston). Cohn was approached by the United States Office of Scientific Research and Development and set about his work using blood provided by the American Red Cross. Working through the summer of 1940, Cohn isolated the various fractions of plasma proteins by adding ethyl alcohol to the plasma several times in succession, each time varying conditions such as salt content, temperature or pH (Cohn et al, 1946). Fraction I contained mostly fibrinogen, fractions II and III contained mainly globulins, and fraction V contained mainly albumin. Limited clinical studies in volunteers and victims of accidents showed that the albumin-rich fraction V reversed the symptoms of circulatory collapse in subjects who had lost blood, with no discernible adverse effects. The Japanese attacked Pearl Harbor, Hawaii, on 7 December 1941, and supplies of albumin were shipped out immediately to treat the injured. A total of 87 subjects, mainly patients with burns, received infusions of albumin. Only four minor reactions were noted, and some dramatic improvements were reported. The reputation of albumin as a life-saver was thus established without the clinical trials that would now be required, and only very recently has this reputation been called into question (Cochrane Injuries Group Albumin Reviewers, 1998).

Immunoglobulins derived from Cohn fractions II and III proved effective in the prevention of a variety of infectious diseases. One of the earliest clinical studies showed that a dose of immunoglobulin could provide temporary protection against measles (Ordman et al, 1944). Another important clinical application of these proteins was in the prevention of Rhesus haemolytic disease. In 1947, Louis Diamond, then working as a paediatrician in Boston, introduced the technique of exchange transfusion for the treatment of haemolytic disease of the newborn.