Compressed Foresight and Narrative Bias: Pitfalls in Assessing High Technology Futures
2006; Taylor & Francis; Volume: 15; Issue: 4 Linguagem: Inglês
10.1080/09505430601022668
ISSN1470-1189
Autores Tópico(s)Sustainability and Climate Change Governance
Acknowledgements
An early version of this paper was presented at the ESRC Centre for Genomics in Society (Egenis) at the University of Exeter, 30 November 2004. In addition to these discussions, the author is very grateful for feedback from colleagues at the University of Edinburgh (especially Wendy Faulkner, Donald MacKenzie, Sarah Parry and Joyce Tait) and beyond (in particular Nik Brown, David Guston, Arie Rip, Richard Twine and Brian Wynne), as well as two anonymous referees.
Notes
1. 4S & EASST Conference 'Public Proofs: Science, Technology and Democracy', Paris, 25–28 August 2004; a joint meeting of the (mainly North American) Society for Social Studies of Science and the European Association for the Study of Science and Technology.
2. There are, of course, important differences between proponents and opponents of a technology in the way in which future visions of an emerging technology are projected. I am grateful to my colleague Joyce Tait for this observation regarding the differences between 'positive' and 'negative' foresight, reflecting their proponents' diverging aims of targeting money to specific expected areas of innovation versus stopping developments (personal communication with author).
3. Brown (2003) makes a similar argument, that technology hype mobilizes the future into the present.
4. The idea of nanotechnology has a long history, of course. The earliest accounts of nanotechnology can be traced back to a 1959 speech by the physicist Richard P. Feynman to the American Physical Society entitled 'There's Plenty of Room at the Bottom'. More recently the concept of nanotechnology was promoted in the writings of Eric Drexler (Drexler, 1986; Drexler et al., 1991), including his PhD thesis, Molecular Engineering: An Approach to the Development of General Capabilities for Molecular Manipulation, published in 1981 and now available at his website: http://www.imm.org/. What is perhaps most striking is how well-rehearsed this self-history of nanotechnology is (Bennett and Sarewitz, 2006). The history and claims surrounding nanotechnology have not been without controversy—as exemplified by the highly publicized dispute between Scientific American and the Foresight Nanotech Institute (http://www.foresight.org/). The latter, founded by Drexler with a mission 'to ensure the beneficial implementation of nanotechnology', objected to an article in Scientific American (Stix, 1996) which contrasted the utopian futuristic visions articulated by Drexler and other 'Nanoists' with the more uneven and mundane achievements of 'Real Nanotechnology', in particular querying the prospects of self-assembling nanodevices. López (2004) suggests that the interpenetration of S&T and science fiction discourses presents a particular problem for critical analysis of nanotechnology. The issue of who is constructing these discourses is important. Bennett and Sarewitz (2006) highlight the historical exclusion of the STS community from discussions of nanotechnology futures. Though a large body of findings is coming on stream (Schummer, 2004) following the US National Nanotechnology Programme (see below), it is striking that content analysis of the two main academic sociology of technology journals—Science, Technology and Human Values and Social Studies of Science—reveals only two papers that mention nanotechnology, even in the most tangential manner (Saari and Miettinen, 2001; Gorman, 2002).
5. In particular, the European Commission adopted a Communication, Towards a European Strategy for Nanotechnology (European Commission, 2004), followed by an Action Plan, Nanosciences and Nanotechnologies: An Action Plan for Europe 2005–2009 (European Commission, 2005). These proposed a safe, integrated and responsible strategy for Europe in nanoscience and nanotechnology. See http://www.cordis.lu/nanotechnology/actionplan.htm (accessed 15 June 2006).
6. For example, research programme builders may see articulating promises and bigger visions as a necessary tactic for winning research funding; for the practising researcher, however, it may be business as usual—there is no necessity that they buy in to these particular visions about the speed and outcomes of change.
7. Artificial Intelligence (AI) in the UK provides one illustration. The Lighthill Report (Science Research Council, 1973), which concluded that the ambitious claims confidently projected by early AI proponents in the 1950s and 1960s showed no immediate prospect of being fulfilled, resulted in the near cessation of AI research funding in the UK for over a decade—the so-called AI Winter (http://www.dai.ed.ac.uk/AI_at_Edinburgh_perspective.html, accessed 15 June 2006; Arnall, 2003).
8. Langdon Winner, in his testimony to the Committee on Science of the US House of Representatives on The Societal Implications of Nanotechnology, states: 'I would not advise you to pass a Nanoethicist Full Employment Act, sponsoring the creation of a new profession. Although the new academic research in this area would be of some value, there is also a tendency for those who conduct research about the ethical dimensions of emerging technology to gravitate toward the more comfortable, even trivial questions involved, avoiding issues that might become a focus of conflict. The professional field of bioethics, for example (which might become, alas, a model for nanoethics), has a great deal to say about many fascinating things, but people in this profession rarely say "no"' (Winner, 2003).
9. The ELSI Research Programme, founded in 1988, received 3–5% of the total funding for the US Human Genome Project. Between 1990 and 2001, the ELSI programme devoted more than $86 million to support some 235 research and education projects and conferences; see http://www.genome.gov/10001798. James Watson promoted the ELSI programme with the ambitious objective 'to address, anticipate, and develop suggestions for dealing with such problems in order to forestall adverse effects' (Watson, 1989). The specific goals adopted by the programme included developing policy options that would assure that the genetic information arising from the HGP is used for the benefit of individuals and society (McCain, 2002).
10. The question of why some new technologies have not been a source of concern, while others have provoked fierce contestation, is addressed by a considerable body of literature, much of it sparked initially by the battles over nuclear power in the late 1960s and 1970s. Classic discussions include Starr (1969) and Slovic (1987). Although useful (Slovic's work in effect predicted the controversy over genetically modified foodstuffs), the classic analyses are rather narrowly psychological in their focus. For wider, more sociological viewpoints see, for example, Douglas (1986), Beck (1992), Krimsky and Golding (1992) and Luhmann (1993).
11. Thus James Fleck was intrigued that robotics, and their societal and especially employment implications, were the subject of much debate in the 1970s and 1980s, whereas other simpler but less imaginatively compelling technologies, such as programmable logic controllers, received negligible attention, despite having far more profound employment effects (Fleck et al., 1990).
12. The different usages of the term ELSI-fication—reflecting a range of more or less divergent concerns—point to the complex implications of the new engagement between S&T and the social sciences/humanities. The term ELSI-fication gained everyday currency in the USA in the HGP era but seems to have been used first in STS discussions by David Guston in a presentation to the 4th Triple Helix Conference, Copenhagen, 6–9 November 2002 (see Guston, 2002; Leydesdorff and Etzkowitz, 2003; http://users.fmg.uva.nl/lleydesdorff/th4/spp.htm). Guston raises a concern that the volumes of ELSI research might swamp and distort social science. In contrast, Davenport and Leith have recently used the term to refer to 'The increased participation of "society" in science and also that social science and humanities understandings can be brought to bear on issues of science in society' (Davenport and Leith, 2005, p. 138). Arie Rip has raised the risk that 'STS can become the victim of "ELSI-fication"' (Rip, 2005), becoming compromised if, in becoming engaged, it loses critical distance. The latter usage is the focus of this paper.
13. This is not to suggest that everyone undertaking ELSI assessments accepts such a mechanistic view. However, the commissioning and conduct of much ELSI work conveys a sense of a determinate assessment being undertaken, which perhaps corresponds to the kinds of answers that policy and practitioner audiences are seeking. It often appears closely tied to the conduct and timing of particular S&T research projects or programmes, and to particular issues that need to be resolved for the successful/effective handling of certain kinds of pragmatic problems for S&T research (the proper handling of confidentiality issues in genetic medical studies being a case in point). This framing of the questions to be addressed and the timing of ELSI studies may restrict the scope of enquiry (for example, linking it to particular S&T developments or stakeholders). The attendant focussing involves opportunity costs, insofar as it may divert attention from other pertinent research goals, targets and frameworks. It may thus leave little scope for broader socio-economic enquiry and may be pursued at the cost of more fundamental academic research.
14. Indeed, amongst some commentators the ethical dimension seems to be used as a euphemism for political/ideological commitment—here 'ethics' refers to a politics of technology and to commitments and values that may remain unspoken to wider audiences.
15. For example, proposals for assessing the ethical, environmental, economic, legal and social implications (E3LS) of technology, rather than ELSI, point to this plurality of concerns (Mnyusiwalla et al., 2003).
16. For example, the Human Genome Project ELSI programme included an award on the 'Theological Questions Raised by the Human Genome Initiative' [Grant # R01 HG00487].
17. The Collingridge Dilemma can be briefly summarized. At the initial stages of a technological system, knowledge about its potential hazards and other detrimental consequences will be limited; it is therefore difficult at this stage to win support/legitimacy for public intervention and control. Conversely, when a technological system is more developed, it will also be well-entrenched: though we will have more systematic knowledge about the costs and benefits of the technology, attempts to regulate it will have to confront powerful vested interests (Collingridge, 1980).
18. Contexts in which consumer choices might not avoid deleterious outcomes include:
• risks which are not known/readily appreciated;
• collective risks not experienced or perceived as sufficiently hazardous by the individual to deter adoption on the grounds of self-interest, but sufficiently large to motivate collective action;
• outcomes which are seen as unacceptable to 'society' though accepted/desired by individuals/groups (e.g. criminal uses);
• externalized costs (e.g. the undermining of public transport by the private car or, likewise, of public phone provision by the adoption of mobile phones).
19. The socio-technical outcomes of innovation may be patterned in different ways. For example, Arie Rip, in mapping innovation pathways and outcomes, has distinguished between capabilities with 'generic richness' and wide-ranging potential and those directed towards specific innovations (see Spinardi and Williams, 2005). Interesting work has been done, for instance, by the European ATBEST project, which examined techniques and tools for assessing breakthrough and emerging S&T; see http://www.rcss.ed.ac.uk/atbest/ for more details. Given the uncertainties that surround attempts to anticipate innovation pathways and outcomes, it remains a highly moot point whether it would be sensible to abandon S&T investigations where prior assessment had flagged the possibility of eventual problematic outcomes.
20. Attention to potential health and environmental hazards is, of course, important. However, some elements of this presumed novelty of risk need to be examined. It is surely strange to find discussions of the hazards of nano-particles as if these were only being encountered for the first time, notwithstanding two or more decades of preceding discussion about the health hazards of welding fumes and vehicle exhaust particulates. The idea that nano-scale materials need to be subject to specific hazard testing (Royal Society/Royal Academy of Engineering, 2004) has been rapidly accepted by the scientific and policy elite—perhaps because it could allay fears whilst not presenting a new challenge to existing regulatory regimes.
21. For example, the early popular movements often emerged in reaction against a few heroic large-scale technology projects (nuclear power, supersonic aircraft). Today, when innovation is more rapid, diverse and widely distributed across social actors, there is a shift from pro- and anti-debates to the more diverse assessment of a plethora of competing technical solutions, in which the opportunity costs of choosing one path over another may figure as highly as costs and benefits.
22. In addition, as Grove-White et al. (2004) point out, the issue of promised social benefits has been systematically excluded from established regulatory processes.
23. Wendy Faulkner has drawn my attention to a parallel discussion in relation to feminist epistemology (see, for example, Haraway, 1988).
References
Arnall, A. H. (2003) Future Technologies, Today's Choices: Nanotechnology, Artificial Intelligence and Robotics; A Technical, Political and Institutional Map of Emerging Technologies. London: Greenpeace Environmental Trust.
Beck, U. (1992) Risk Society: Towards a New Modernity. London: Sage.
Bennett, I. and Sarewitz, D. (2006) 'Too little, too late? Research policies on the societal implications of nanotechnology in the United States', Science as Culture, 15(4).
Brown, N. (2003) 'Hope against hype—accountability in biopasts, presents and futures', Science Studies, 16(2), pp. 3–21.
Collingridge, D. (1980) The Social Control of Technology. London: Frances Pinter.
Davenport, S. and Leith, S. (2005) 'Public participation: agoras, ancient and modern, and a framework for science–society debate', Science and Public Policy, 32(2), pp. 137–153.
Douglas, M. (1986) Risk Acceptability According to the Social Sciences. London: Routledge.
Drexler, K. E. (1986) Engines of Creation: The Coming Era of Nanotechnology. New York: Anchor Books/Doubleday.
Drexler, K. E., Peterson, C. and Pergamit, G. (1991) Unbounding the Future: The Nanotechnology Revolution. New York: William Morrow and Company.
European Commission (2004) Towards a European Strategy for Nanotechnology, Communication COM(2004) 338, 12 May 2004.
European Commission (2005) Nanosciences and Nanotechnologies: An Action Plan for Europe 2005–2009, COM(2005) 243, 7 June 2005.
Fleck, J., Webster, J. and Williams, R. (1990) 'The dynamics of I.T. implementation: a reassessment of paradigms and trajectories of development', Futures, 22, pp. 618–640.
Gorman, M. E. (2002) 'Levels of expertise and trading zones: a framework for multidisciplinary collaboration', Social Studies of Science, 32(5–6), pp. 933–938.
Grove-White, R., Kearnes, M., Miller, P., Macnaghten, P., Wilsdon, J. and Wynne, B. (2004) Bio-to-Nano? Learning the Lessons, Interrogating the Comparison. Lancaster: Lancaster University.
Guston, D. H. (2002) 'CRIs in the wilderness: toward centers for responsible innovation in the commercialized university', theme paper for the 4th Triple Helix Conference, Copenhagen, 6–9 November 2002, mimeo; a modified version appears in Guston (2004).
Haraway, D. (1988) 'Situated knowledges: the science question in feminism and the privilege of partial perspective', Feminist Studies, 14(3), pp. 575–599.
Krimsky, S. and Golding, D. (eds) (1992) Social Theories of Risk. London: Praeger.
Leydesdorff, L. and Etzkowitz, H. (2003) 'Can "the public" be considered as a fourth helix in university–industry–government relations? Report of the Fourth Triple Helix conference', Science and Public Policy, 30(1), pp. 55–61.
López, J. (2004) 'Bridging the gaps: science fiction in nanotechnology', HYLE—International Journal for Philosophy of Chemistry, 10(2), pp. 129–152.
Luhmann, N. (1993) Risk: A Sociological Theory. Berlin: de Gruyter.
McCain, L. (2002) 'Informing technology policy decisions: the US Human Genome Project's Ethical, Legal, and Social Implications programs as a critical case', Technology in Society, 24, pp. 111–132.
Mnyusiwalla, A., Daar, A. S. and Singer, P. A. (2003) ''Mind the gap': science and ethics in nanotechnology', Nanotechnology, 14, pp. R9–R13.
Rip, A. (2005) 'There is mainstreaming, loss of critical distance: are STS scholars finally growing up?', presentation to the workshop 'Does STS Mean Business Too?', Said Business School, Oxford, 29 June 2005, mimeo. Available at: www.sbs.ox.ac.uk/downloads/sts2-rip.pdf (accessed 15 June 2006).
Royal Society/Royal Academy of Engineering (2004) Nanoscience and Nanotechnologies: Opportunities and Uncertainties. London: The Royal Society/The Royal Academy of Engineering.
Saari, E. and Miettinen, R. (2001) 'Dynamics of change in research work: constructing a new research area in a research group', Science, Technology & Human Values, 26(3), pp. 300–321.
Schummer, J. (2004) 'Bibliography of studies on nanoscience and nanotechnology', in Baird, D., Nordmann, A. and Schummer, J. (eds) Discovering the Nanoscale. Amsterdam: IOS Press, pp. 311–316.
Slovic, P. (1987) 'Perception of risk', Science, 236 (17 April), pp. 280–285.
Spinardi, G. and Williams, R. (2005) 'The governance challenge of breakthrough science and technology', in Lyall, C. and Tait, J. (eds) New Modes of Governance: Developing an Integrated Policy Approach to Science, Technology, Risk and the Environment. Aldershot: Ashgate, pp. 45–66.
Starr, C. (1969) 'Social benefit versus technological risk', Science, 165 (19 September), pp. 1232–1238.
Stix, G. (1996) 'Waiting for breakthroughs', Scientific American, 274(4), p. 36.
Watson, J. D. (1989) Testimony before the Senate Subcommittee on Science, Technology, and Space, US Congress, Senate Hearing 101–528, 9 November 1989. Washington, DC: US Government Printing Office.
Winner, L. (2003) Testimony to the Committee on Science of the U.S. House of Representatives on The Societal Implications of Nanotechnology, House Committee on Science Hearings, 9 April 2003. Available at: http://www.house.gov/science/hearings/full03/apr09/winner.htm (accessed 15 June 2006).