Reviews
Significance; Volume 9, Issue 4; Oxford University Press; 2012; Language: English
DOI: 10.1111/j.1740-9713.2012.00594.x
ISSN: 1740-9713
Topic(s): Complex Network Analysis Techniques
First published: 09 August 2012

Abstract: Books reviewed in this issue: Networks, Crowds, and Markets: Reasoning about a Highly Connected World, by David Easley and Jon Kleinberg; Consent of the Networked: The Worldwide Struggle for Internet Freedom, by Rebecca MacKinnon; The Data Journalism Handbook, edited by Jonathan Gray, Lucy Chambers and Liliana Bounegru; and The Filter Bubble: What the Internet is Hiding from You, by Eli Pariser.

Networks, Crowds, and Markets: Reasoning about a Highly Connected World
David Easley and Jon Kleinberg; Cambridge University Press; 744 pp.; September 2010; £30.00, $50.00; ISBN 9780521195331

The moment when the hyperconnectedness of our world became most evident to me was on October 20th, 2011, when images of the death of Colonel Muammar el-Qaddafi, the former dictator of Libya, flooded media outlets and YouTube and made the immediate aftermath of his assassination a global experience. Any spectator who saw the National Transitional Council fighters circling the slain leader, their rifles replaced with mobiles to capture the sanguinary victory, may have wondered, as I did, which was the cellular node that linked them to this revolutionary event.
Professors David Easley of the Department of Economics and Information Science at Cornell University and Jon Kleinberg of Cornell's Department of Computer Science have devoted their careers to disentangling the properties and topologies of intricate social networks like the one that made el-Qaddafi's death an international spectacle. In Networks, Crowds, and Markets they have brought together decades of research and lecture material to introduce the field of network analysis to undergraduate students in the social sciences. The text is a topical and highly readable resource for teachers of network and information theory.

Easley and Kleinberg's framework rests on a fundamental dyad: graph theory and game theory. Graph theory is the science of modelling network topology – the arrangement of all the nodes, edges, and paths of an interconnected system. A graph is a ball-and-stick model, similar in appearance to the models of molecular structure we all used to see in chemistry lessons at school; it provides a way to formalise a network's architecture, and the authors nicely lay out basic graph properties with illustrations of familiar social phenomena like the “small world” effect or the flocking together of “birds of a feather”. The foundation for understanding the behaviour of human nodes in a network is game theory. Chapters 6 and 7 describe how, by relating human decision-making under uncertainty to contests among players with specific rewards and losses, one can begin to analyse the nature and consequences of choice. Easley and Kleinberg build on this marriage to discuss the study of markets (Part III), information networks (Part IV), network dynamics (Parts V and VI), and the behaviour of institutions (Part VII).

Clearly statistics is integral to the authors' subject. Strangely, though, they treat it as the elephant in the room, the large object that no one will talk about.
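The graph model the authors build on can be sketched in a few lines. The example below is purely illustrative and not from the book: a hypothetical friendship network stored as an adjacency list, with breadth-first search recovering the short path lengths behind the “small world” effect the authors discuss.

```python
from collections import deque

# Hypothetical friendship network as an adjacency list -- the "ball-and-stick"
# graph model: nodes are people, edges are friendships (names are invented).
graph = {
    "Ana": ["Ben", "Cal"],
    "Ben": ["Ana", "Cal", "Dee"],
    "Cal": ["Ana", "Ben"],
    "Dee": ["Ben", "Eve"],
    "Eve": ["Dee"],
}

def path_length(graph, start, goal):
    """Breadth-first search: number of edges on a shortest path, or None."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            return seen[node]
        for neighbour in graph[node]:
            if neighbour not in seen:
                seen[neighbour] = seen[node] + 1
                queue.append(neighbour)
    return None  # no path: the nodes lie in different components

print(path_length(graph, "Ana", "Eve"))  # Ana -> Ben -> Dee -> Eve: 3 edges
```

In a real small-world network the striking empirical fact is that such shortest paths stay short even as the number of nodes grows into the millions; the toy graph above only shows the mechanics of measuring them.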
They assume readers will have had a course in pre-calculus, but they do not state whether any prior statistical training is assumed, and no introduction to statistical or probabilistic concepts is provided. This could make it a challenge for instructors to venture into any of the advanced material if their students do not yet have the necessary quantitative training. On the other hand, instructors of more advanced students in computer science or mathematics-related fields will find the main content too qualitative to serve as the primary text of an introductory course in network analysis. Students of statistics in particular will be disappointed by the lack of hands-on examples, data sets, or any discussion of the computational aspects of studying networks. In building a course for these students, I recommend that instructors use selected advanced material from this book as a supplement to Newman's Networks: An Introduction [1] or Kolaczyk's Statistical Analysis of Network Data: Methods and Models [2].

Stephanie Kovalchik, Maryland

Consent of the Networked: The Worldwide Struggle for Internet Freedom
Rebecca MacKinnon; Basic Books; 352 pp.; 31 January 2012; £17.99, $26.99; ISBN 9780465024421

On January 18th, 2012, people took to the streets and protested online against two proposed laws in the US Congress: the Stop Online Piracy Act (SOPA) and the Preventing Real Online Threats to Economic Creativity and Theft of Intellectual Property Act (PROTECT IP Act, or PIPA). The English version of Wikipedia was one high-profile protester among several that took their websites down for 24 hours. The protesters feared that freedom of speech would be harmed, and they claimed victory when numerous senators withdrew support for the bills and the votes were postponed. Meanwhile, 7000 miles away, President Hosni Mubarak switched off the internet in Egypt ahead of protests in the country's revolution. These events, in part, are what Rebecca MacKinnon's Consent of the Networked is about.
The book screams “the worldwide struggle for internet freedom” on its front cover. That phrase sums up the entire book. MacKinnon explores the internet's impact on civil liberties and privacy, as well as on democracy. Descriptive terms contained – indeed, coined – in the book such as “networked authoritarianism” and “digital bonapartism” set a sinister undertone; referring to the internet as a “politically contested space” immediately forms a picture of distrust and concern in the mind of the reader. But the internet as a weapon is far from one-sided: the book highlights the power it gives activists to protest against oppression. Bloggers and citizens in Egypt, Tunisia, China, Iran, Russia and South Korea, amongst others, appear, and organisations such as Facebook, Google (MacKinnon names one chapter “Facebookistan and Googledom”), WikiLeaks and Twitter feature heavily.

MacKinnon's book also gives readers lessons in history (Magna Carta and the Treaty of Westphalia are both relevant in defining internet users' rights) and in political theory and philosophy (the title of the book is a play on the phrase “consent of the governed”, and the book refers to the US Constitution, de Tocqueville, Hobbes and Locke); she regularly reminds readers that democracy is there to hold those in power to account and to protect citizens' rights. This is important in keeping the reader grounded before the issues of states and corporations fill the book. She revisits and explores in detail the academic debate over nation-states having their sovereignty undermined – an undermining that is this time accelerated by a digitally connected world in addition to the “geopolitical power of corporations”. But the book finishes positively with a “clarion call to action”, alerting readers to their “responsibility to do whatever they can to prevent abuse of digital power, and avoid abusing it”.
MacKinnon's book is an important and informative read about the internet and, more importantly, about the responsibilities that come with it.

Abdel Khairoun, London

The Data Journalism Handbook
Edited by Jonathan Gray, Lucy Chambers and Liliana Bounegru; O'Reilly Media; 120 pp.; May 2012; $24.99; or available for free download at http://datajournalismhandbook.org

One of the biggest news stories of the past year has been the WikiLeaks affair. A single source, the hapless Bradley Manning, downloaded a single database. It contained US diplomatic cables – 251,287 of them, totalling 261 million words. (The Bible, Old and New Testaments, contains 750,000 words.) Somehow journalists found and identified the 0.01% of those cables (which still came to a lot) that were incendiary. How?

Such data-dredging is the job of an analyst, an information processor, a data miner, a statistician, an IT expert, a programmer, a code-writer. In fact, in the world of big data all these are skills that journalists, or journalism teams, are starting to need. Pioneers are starting to acquire them. Some of those pioneers have edited and contributed to this book. They are the journalists who have realised that hidden inside huge digital databases are wonderful stories about the world – human stories that are interesting and important and exciting, that are just waiting to be extracted. (Which of course is what Significance tries to do in its own way as well.) It needs analysis to uncover those stories; it needs clever ways to explain those stories: new ways of drawing graphs and of visualising data, of using multimedia and interactive connectivity as well as print and broadcast media. But those stories are there, and so are those ways of explaining them. All this is called data journalism. The book traces data journalism back to Florence Nightingale.
But perhaps Big Data journalism began in 1993, when a Miami Herald journalist by the name of Steve Doig joined two different datasets from Hurricane Andrew: one mapped the level of destruction caused by the hurricane and the other showed wind speeds. This allowed him to pinpoint areas where weakened building codes and poor construction practices contributed to the impact of the disaster. He won a Pulitzer Prize for the story.

Data journalism is about the new possibilities that open up when you combine the traditional ‘nose for news’ and ability to tell a compelling story with the sheer scale and range of digital information now available. It is about connections. With digitisation, and with public access to data, it has been quietly and steadily growing. The field is so new that the book is not exactly a guide to ‘how to do it’, more a guide to ‘what it is now possible to do’. But that in itself is immensely valuable. If you want to mine the datafields and use all kinds of skills – statistical expertise included, but a whole lot of others as well – to tell the world what is within them, read it now, and get in on the ground floor.

Julian Champkin, Sussex

The Filter Bubble: What the Internet is Hiding from You
Eli Pariser; Viking; 304 pp.; 23 June 2011; £12.99, $16.00; ISBN 9780670920389

Earlier this year Google updated its privacy policy. The new policy allows the search giant to combine data it has gathered about its users across its different products into one big file. What this means is that Google can now mine more data about our online personas so it can hash out more personalised search results (and, by the way, more targeted advertisements). At first sight, it does appear that the more personalised our search results, the better – right? For instance, by tracking our locations, Google can give us weather forecasts for our parts of the world. It can also curate a list of videos that may potentially interest us by monitoring what we watch on YouTube.
But as internet activist Eli Pariser details in The Filter Bubble, a personalised internet casts a looming shadow as well. Pariser discusses a number of reasons why personalisation of the web is not necessarily a good thing, and they all stem from one central theme which he takes considerable time to convey: personalisation creates a filter bubble around us that pampers us and only shows us what we want to see, effectively undemocratising the internet for each one of us. For example, if you are a conservative, you will get results tailored to your conservative views. The chance to challenge your views, to test them against those who disagree with you, will be denied to you. If you are a climate change denier, you will not be linked to articles which explain the consequences of global warming.

As he dives into the subject more deeply, Pariser not only looks at how Google and Facebook (which personalises its news feed) are personalising our internet but also gives us a crash course in new-age marketing. He opens a window into the world of virtual Madison Avenue, if you will. In this world, carefully hidden from us, statisticians, data analysts, programmers, psychologists and marketers are conspiring to collect and interpret as much data about us as they can, with one simple aim: to find out how they can more effectively push their products on us.

Pariser rings the alarm bell loud and clear to warn us about the deceiving appearance of an internet that is quietly being corrupted by a handful of companies. We are being fed news that Google and Facebook think is more suited to us, and are being tracked and monitored – or spied upon – so that we spend more money buying products that some lurking companies think we want. We are, in other words, being dictated to – and we are not even aware of it. In this excellent book, Pariser forces us to open our eyes.

Khalil A. Cassimally, Mauritius

References
[1] Newman, M. E. J. (2010) Networks: An Introduction. Oxford: Oxford University Press.
[2] Kolaczyk, E. D. (2010) Statistical Analysis of Network Data: Methods and Models. New York: Springer.

Significance, Volume 9, Issue 4 (Special Issue: Big Data), August 2012, pages 43–44.