Global Governance 23 (2017), 71–82

The Social Life of Data: The Production of Political Facts in EU Policy Governance

Christopher C. Leite and Can E. Mutlu

Data presented in European Union documents have a social life of their own; they are collected, analyzed, disseminated, used, and reused. Along the way, data interact with different external elements—economic, social, and political actors, positions, and preferences—transforming their potential for different contexts and meanings. The Directorates-General are significant data actors, involved in both the data-gathering and fact-dissemination stages of data’s social life and controlling the flows of data through the EU. This article focuses on the social life of two sets of data, tracing the evolution of these datasets into political discourses about disaster response and migration, and arguing that data never speak for themselves but rather evolve into facts through networks, which shape official policy discourses.

Keywords: data, European Union, migration, disaster response.

The processes of data collection and data analysis, which are central to the social construction of data’s meaning, include numerous actors and networks that are active in the construction of political facts at the European Union (EU) level. Data presented in EU documents have a social life of their own; they are collected, analyzed, disseminated, used, and reused in various official and unofficial formats and media. With this article, we seek to understand: How do data get collected, analyzed, and contextualized in the EU’s policy governance practices? In particular, we discuss ways to conceptualize the social life of data as an ongoing process based on constant transformations of actor and network relationships. Rather than a precise model for understanding data governance, we are proposing a conceptual alternative for thinking about the role of data in the EU’s policymaking processes. By the social life of data, we are referring to both the relationalities and the transformations of relevant data as well as the social processes of turning these data into facts at the EU level, a process itself shaped by those relationalities and transformations. EU actors and networks project their socioeconomic and political preferences and statuses onto data during the process of transforming or translating data into facts. In our approach to understanding how the EU constructs political data, we reject the notion that data speak for themselves,1 or even the general notion that there is such a thing as raw data.
With each step along the way, data interact with different external elements—various economic, social, and political actors, positions, and preferences—transforming their potential for different contexts and meanings. In our approach to the social life of data in the EU’s policymaking practices, we draw inspiration from science and technology studies2 and, in particular, from Latour’s actor-network theory (ANT).3 We make our argument through two empirical cases, outlining how the social life of data is particularly visible in two policy areas increasingly under EU governance jurisdiction: disaster governance and security provision related to migration and mobility. In both instances, the production of, access to, and use of data are not only an intrinsic part of maintaining these policy competencies at the EU level, but also inform how these policy areas are actually governed on a daily basis. In the field of disaster governance, the Directorate-General for Civil Protection and Humanitarian Aid (DG ECHO) uses the European Commission’s Joint Research Centre (JRC) to produce data that are then run through DG ECHO’s European Response Coordination Centre (ERCC) to monitor the onset of disasters and coordinate their response. Similarly, DG Migration and Home Affairs (DG HOME) gathers intelligence and risk assessment data produced in agencies and departments from around the EU and centralizes these data in DG HOME’s Strategic Analysis and Response Centre (STAR) to monitor for and coordinate scenarios requiring a Commission-level response. While we discuss these two complementary empirical cases in detail in the third and fourth sections, in the next section we discuss the particular theoretical framework used to understand the social life of data. In particular, we look at the ways in which data get translated into facts and the relationalities of this process.

Conceptualizing the Social Life of Data: An Actor-Network Approach

The political impact of data is becoming a popular avenue for academic research. Ian Hacking famously looked at how the production and manipulation of statistical information were central to the formation of the early modern state.4 Alain Desrosières and Michael Oakeshott both separately traced the development of the notion that the world can be readily calculated and acted on, discrediting the rise of “Liberal Instrumental Reason” and “rationalist” models of policymaking and decisionmaking on the grounds that not all circumstances can be calculated into useful and meaningful data.5 Similarly, Theodore Porter has argued that data and statistics are used to justify social decisions and root them in an appeal to objectivity—this, of course, presupposes that objectivity itself is a legitimating factor in policy implementation.6 A similar critique is currently being made of the way support is garnered for transnational governance projects.7

Some have stressed the fact that a reliance on governing through the use of data has created “blind spots” in policymaking,8 or that the rampant uncertainty of everyday life means that new or different ways of understanding governance institutions are needed.9 Despite these important forays into the political use of data and the logics behind it, data as a concept has not received sustained academic attention in political studies. It is therefore important to extend this debate by using empirical examples to show that data’s relational and constantly transforming ontology has impacts on the volume, scope, and speed with which institutional actors perform their governance roles.

There is thus nothing raw about data; whether consciously or unconsciously, they are always produced, processed, and consumed with a social objective or purpose in mind. Data, in this sense, can never speak for themselves; they always need someone to make them speak or render them intelligible. The construction, interpretation, utilization, and reproduction of data are thus always dependent on someone or some actor to collect, make sense of, contextualize, and translate them. The meaning or interpretation of particular data changes from person to person and across time, and varies with the types of interfaces or technologies used to interact with them. It is this relationality and transformability of data that pushes us to think about data in terms of social relations, or as having a social life of their own.

Building from the work of Bruno Latour, we argue that data are instead built on associational ontologies—they circulate across actors and networks. Data are actants that are both continuously transformed and continuously transforming through various associations and translations.10 Latour developed the term “actant” to draw attention to the sociological link between human and nonhuman—the way inanimate objects and artifacts are embedded in relations with humans and the way those relations alter or influence the institutions within which these relations exist. This emphasizes, for example, that the use of computers and their interfaces to generate and manage data does not exist as a disembodied process, and that the objects humans use are not merely passive entities to be manipulated; instead, it highlights the social role of what these objects or artifacts do to the humans with whom they interact.11 The fact that these objects or artifacts have meaning that is altered or variable depending on social contexts is why Latour was so interested in the concept of “translation.” According to Latour, translation is “a relation that does not transport causality but induces two mediators into coexisting”12—an object does not have to be thought of as being acted on or as causing a specific reaction; rather, an object and its user both alter each other in the interaction of use. Similarly, Latour has also said that “in place of a rigid opposition between context and content, chains of translation refer to work through which actors modify, displace, and translate their various and contradictory interests.”13 It is this interplay of associations, mediations, and translations that generates the relations between actors, actants, and networks that result in the social life of data.

This continuous relationship between various actors and networks results in an associational ontology that defines the social life of data as an actant. This builds from Latour’s twin logics that, on one hand, “instead of constantly predicting how an actor should behave and which associations are allowed a priori, ANT makes no assumption at all”14 while, on the other hand, “the notion of network helps us to lift the tyranny of geographers in defining space and offers us a notion which is neither social nor ‘real’ space, but associations.”15 Similar to the discussion raised by Monique Jo Beerli in this special section, we argue that the relational ontology of data has distinctive organizational and social effects that shape policy practices, that is, a social life: data exist through the meaning they are imbued with by the actors and networks that sustain them.

The social life of data is structured around the twin processes of data collection and analysis. The data collection process is about making the social world legible. It is about creating charts, documents, graphs, texts, visualizations, or, in general, records of social phenomena. It is a process designed to translate events, discourses, numbers, processes, and other social phenomena into a set of information that has qualitative and quantitative characteristics. This process precedes data analysis. Data analysis is the process through which we make sense of data. This sense-making process is more than just analysis, modeling, or regression, though; it is primarily about translating the social world that has been made legible through data collection into actionable facts. In the case of policymaking specifically, data analysis is about translating data into political facts that can be acted on. This is not to say that institutional actors use data strategically, in a rational instrumentalist sense, but that data are used as tools through which other priorities or decisions are mediated. Whether we are talking about big or small data, these processes of translating general events into data are central to the social life of data. To understand the impact the translation process has on the social life of data, one must ask: Who or what is translating the data? Who or what else can be a translator? For what purpose are they translating the data? What are the other possible translations? What is omitted? And what remains? The answers to these questions demonstrate the various kinds of negotiations that sustain the social life of data.

Data live a social life. Various actors and networks of material actants act on and sustain these translation processes and, through their involvement in the data collection and analysis process, interact with data and with each other. Through continued interventions and translations, data evolve into facts. Such translations, which include various EU-level actors and their networks, transform data into vibrant political discourses related to disaster response and migration. Actors in EU networks discuss and negotiate what kinds of data to collect and how to sort, categorize, display, visualize, store, and share data. Interrogating the production of, access to, and use of data across EU actors serves to highlight that data have a fluid and socially contingent life, but also that data
alter and are altered by the contexts in which they are mobilized—the meaning of data changes depending on their context. These different stages of the social life of data are discussed below, as well as what these evolutions mean for two different EU policy areas.

The Social Life of Data in EU Policy Governance: DG ECHO

DG ECHO is the Directorate-General tasked with managing civil protection response, humanitarian aid coordination, and development project implementation at the EU level. In terms of deploying in-field operations, civil protection is the only policy area that ECHO runs in-house as part of its disaster governance platform.16 Controlling key data allows actors from DG ECHO to be the de facto leaders in the field of disaster response, coordinating civil protection expert deployments in Europe and beyond. Working closely with individual state civil protection teams, ministries of the interior and defense, and UN agencies such as the United Nations Disaster Assessment and Coordination (UNDAC), DG ECHO not only runs its own in-house disaster response teams but also coordinates disaster response initiatives by other response personnel. For example, DG ECHO–coordinated teams were the first to land in Port-au-Prince after the 2010 earthquake in Haiti, and DG ECHO teams were some of the first to offer search and rescue assistance after earthquakes rocked Nepal in the spring of 2015. DG ECHO actors have become central fixtures in responding to disasters specifically because of their institutionalized ability to develop and manage data.

ECHO actors have an important hand in producing the data they use to monitor for disaster scenarios. According to high-ranking DG ECHO officials, DG personnel contact the Joint Research Centre and outline a specific need—for example, a need to assess tsunami early-warning capabilities. The JRC will then act as a liaison with the scientific community most involved in developing these systems (i.e., seismologists or oceanographers, in the case of tsunami monitoring), develop a model, and pass it on to the requesting DG. These JRC analyses and systems are scientifically sophisticated and often based on new and emerging technologies. This also means, though, that the parameters by which the JRC conducts its analyses and sets up its systems are inherently tailored to DG ECHO’s range of operations. As a result, while the systems themselves may be highly specific and quite precise, the definition of what is being studied at all comes from a preexisting policy decision already made at the DG. The data that are used by the leaders in the field of disaster response are thus produced specifically by those same leaders: the JRC designs data-generating monitoring systems based on specifications from DG ECHO. Some of these systems include the Meteoalarm Europe system, which links the Network of European Meteorological Services (EUMETNET); the European Flood Awareness System (EFAS); and the European Forest Fire Information System (EFFIS); as well as a system for collecting data from third-party data producers.
The data produced by the different JRC systems come from buoys off the Atlantic coast that automatically measure wave heights, soil dryness evaluations that are manually entered at data collection centers, and seismic sensors that detect tectonic plate shifts, to name three examples. The data are then transmitted through constantly evolving spreadsheets, charts, maps, and graphs that are sent to DG ECHO through the Interactive Disaster Analysis System. Once these disaster monitoring data reach Brussels, they are all centralized through ECHO’s monitoring center, the ERCC.

The ERCC allows DG ECHO actors to control the flow of disaster-relevant data. For example, every morning the ERCC gathers data from its global monitoring systems and publishes anything of note in its “ECHO Daily Flash.” These factsheets include any data that the JRC monitors have picked up, as well as supplementary information such as the history of disasters in the region or background on conflict-caused human displacement, along with detailed maps or briefing notes providing further information about a given ongoing situation: the number of people affected, other potential hazards nearby, and so forth. Also included in these Flashes are data gathered from other international monitoring systems and government systems, further highlighting how the ERCC acts as an intermediary of disaster monitoring data. These Flashes are thus technical artifacts, tools that work to translate the disaster monitoring data into accessible information used by disaster responders.

The ERCC is staffed by a rotation of thirty-three duty officers, seconded from countries participating in ECHO’s disaster response programs. The job of these duty officers is to constantly monitor the JRC data that forecast or anticipate natural disasters, as well as the internationally provided sensor data that appear on the ERCC monitoring screens along with the JRC-generated information; they then alert response coordinators in the case of a disastrous event. How these officers share this information is another way that DG ECHO actors control access to disaster data: all disaster data and coordination information are run through the Common Emergency Communication and Information System (CECIS). CECIS is a peer-to-peer communication system that works as a text-based messenger service operating on a closed communication network—it does not send messages to regular e-mail addresses or mobile phones. CECIS is located in-house at the ERCC and, by nature of being a closed emergency network, it limits how it can be used, when it can be used, and who has access. This means that all disaster response communications are physically run through the ERCC.

In terms of how these data are actually used, the JRC uses its disaster monitoring interfaces, which set automatic triggers in motion when a disaster scenario has occurred, alerting the duty officers to call for a response based on preapproved thresholds—water levels for floods, wave heights and frequencies for tsunamis, wind speeds for hurricanes, and so forth. The ERCC
also uses its monopoly on disaster data and its in-house CECIS to coordinate disaster responders in their deployment and through a field headquarters once on the ground. This ensures that the ECHO actors in the ERCC play a mediating role in how a deployment works, as well as controlling the call for a deployment in the first place—two coordinating roles afforded them by the ERCC’s capabilities. In the field, CECIS shares ongoing data updates run through the ERCC. Data shared in this way help DG ECHO and its ERCC to act as coordinators for multifaceted disasters—for example, to monitor the food shortage effects reported by the World Food Programme when a flood destroys crops—in a way that most of the other, more niche disaster response bodies cannot.

Disaster monitoring data play an inherently social role in the way disasters are governed and should not be thought of as mere tools, but as sociological and political entities that impact and alter their contexts. Disaster monitoring data are clearly not somehow divorced from the sociopolitical processes behind their generation and use. These data are instead part of the larger way that the social world is rendered legible and actionable. Data allow phenomena to be quantified and qualified as needing a policy response. A buoy measuring wave heights off the coast of Portugal is not only calculating probabilities of tsunamis occurring but also reproducing a physical space delineated as worth protecting (Portugal, or Europe more generally) and outlining the types of scenarios that are worth mitigating (Atlantic tsunamis). The data produced and transmitted by the buoy and, to some extent, even the physical location of the buoy itself, translate these sociological preferences and choices into a clear, tangible policymaking goal: Portugal and Europe must be protected from tidal wave heights that signal potential tsunamis. In this sense, the buoy not only acts as a passive technology of data collection, but itself becomes a producer of legitimate techniques by which these sociological preferences can be rendered visible—it becomes an artifact.

The data used in disaster governance are thus never just facts that help coordinate response teams—these data are carefully cultivated sets of practices that allow ECHO and ERCC personnel to claim the legitimate governance of disaster management. They can claim this precisely because of the data they create: disaster monitoring data turn natural phenomena such as earthquakes and floods into actionable areas for interventions that they themselves coordinate. It is important to think of how disaster monitoring data exist socially: ERCC personnel exist as a node within a larger network of civil protection experts and coordinators and the instruments they use to monitor and respond to a disastrous scenario. They generate and manage data by and for this relationship of actors to allow ERCC personnel to do their jobs and to legitimate the particular institutional approach to disaster response that they represent: voluntarily seconded personnel, trained according to ECHO-negotiated specifications, and deployed and coordinated from the ERCC. The creation and use of disaster monitoring data should thus be seen as a technique of ERCC and
ECHO actors stabilizing their influence or roles in the way disasters are governed in Europe, thereby underlining the social life of this disaster monitoring data. The inherently social dimension of data, in this case the production and use of disaster monitoring data, thus resides in the fact that the data are produced for a specific social purpose—in the case of disaster governance, this allows ERCC actors to claim and assume a position of authority in the way disasters are governed more generally.

The Social Life of Data in EU Policy Governance: DG HOME

Just as ECHO actors use disaster monitoring data to act as de facto leaders in their policy field, actors at DG HOME have begun controlling the production of, access to, and use of data about (im)migration and internal security threats to become de facto leaders in the field of EU internal security provision. HOME is tasked with tackling organized crime and terrorist attacks inside the EU, as well as with managing migration and the external borders of the European Union through its Frontex border agency. HOME handles these different tasks by relying on the risk and intelligence data generated by its different agencies and departments, both to determine how it will manage those agencies and to be able to control EU internal security. This work takes place at HOME’s Strategic Analysis and Response Centre, established in 2012 and designed to mimic ECHO’s ERCC as a coordinator for EU security intelligence and risk analysis data. Like the ERCC, neither HOME’s interior offices nor STAR produce data themselves; rather, they are collectors of data produced by others. The list of data producers sending their work to STAR is incredibly extensive.17 These more traditional security actors are all in addition to the less obvious internal security actors, such as the DG for Mobility and Transport (DG MOVE), which sends reports from the European Maritime Safety Agency, the European Aviation Safety Agency, the European Railway Agency, and the Trans-European Transport Network Executive Agency, or DG ENER—responsible for EU energy policy—which similarly sends reports from the European Atomic Energy Community (Euratom) Safeguards Office monitoring nuclear safety in Europe.

With this long list of agencies, centers, and departments producing the data that STAR manages, its personnel are primarily concerned with sifting through these data and finding which sets represent scenarios needing a response. Just as DG ECHO actors contact the JRC to develop the disaster data monitoring systems required for the ERCC, STAR receives risk or intelligence assessment data produced elsewhere and develops what its members call new methodological approaches to analyze them. This means coming up with ways to frame and understand those assessments. Whether it is determining thresholds of lives lost, cost to EU institutions, or simply the likelihood of an event taking place, STAR develops the methodological approaches by which all other data are examined once they reach the STAR secured offices in Brussels. The individual monitors that forward risk and intelligence data to STAR have
their own sophisticated expertise in a given issue area, which they use to determine the likelihood of a given event taking place—for example, a sudden mass migration, a rapid influx of illicit drugs, or the spread of an organized crime network. To reiterate, just as the ERCC and ECHO personnel set the parameters for the type of data that they need the JRC systems to produce, STAR and DG HOME actors have developed their own in-house methodologies for determining what parts of their massive stores of data need to be acted on.

Unlike in DG ECHO’s case, DG HOME actors do not control access to the data they receive, as these data originate in third-party agencies; instead, DG HOME actors have worked to centralize all of these data through STAR and have restricted who has access to the full sets of amalgamated data. This is explicitly STAR’s mandate: its primary function is receiving weekly, daily, or even more frequent risk or intelligence packages from the individual centers. This is intriguing, as operational information sharing has been woefully neglected throughout EU-level institutional partnerships at both military and police levels. Information security procedures remain intensely protected state capacities. While standard risk and threat analysis reports are widely distributed, the methodologies and technologies that the EU and its member state governments use to calculate these risks remain tightly guarded secrets. That all of these types of reports are sent to one location marks a pivotal moment in the way EU institutions gather and store data.

In terms of how STAR and DG HOME actors use these amassed data, STAR understands itself as a customer of intelligence and risk data—a fact emphatically clarified by more than one HOME bureaucrat. STAR acts as a filter for the data produced in various EU agencies and offices, passing on relevant information to policymakers for further action when needed.18 One STAR member even went so far as to say that “I want to be very straightforward and clear: we are not doing any intelligence, because it is not the mandate of the Commission. . . . From this comes the concept of being an intelligence customer.”19 It acts as a filter through which policy decisions are made. STAR receives sometimes hourly updates from those centers; its personnel decide if action is needed and, if so, who should be taking or leading that action. This, of course, provides STAR and HOME with the particularly enviable role of having the most complete picture of ongoing security situations in Europe, by virtue of being the filters through which ongoing security emergencies are communicated.

The way that actors at DG HOME use the data they collect highlights the social life of data. The data that STAR receives are created in a different setting, for a different purpose, and with a different policy action in mind. For example, Frontex’s Risk Analysis Unit (RAU) collects data about political upheavals in countries targeted as potential sources of migration, using its in-house Common Integrated Risk Analysis Model (CIRAM) to analyze the data and decide which political events might trigger a mass migration; the RAU then sends these data to STAR for secondary analysis. STAR personnel combine this information with data sent from the European Asylum Support Office:
data about countries of origin for asylum seekers. If this new combination of data can be made to show, again for example, that a coup taking place in a North or West African country will trigger mass migrations into the EU via Spain’s enclave border cities in northern Morocco, then STAR will push these findings up to the HOME executive to develop a political response. Two recent instances of this include STAR using its data to signal to the HOME executive that a migration surge was under way in the Strait of Gibraltar in late 2014, leading to an increased marine presence in the area, and using data to request that Frontex step in with its Rapid Border Intervention Team (RABIT) on the Greek-Turkish border in 2012.

The datasets sent to STAR are changed by their social context. At STAR, new methodologies are applied, meaning that STAR personnel use the same sets of data to look for potential problems or indicators that are different from the ones examined and analyzed by the sources of these data. The data literally play two different social functions: one function in their original site of production and use, and another function at STAR. When STAR personnel talk about the new methodological approaches they use when trying to assess a set of data, they are literally translating the dataset so that it represents a new social phenomenon that better fits with what HOME is tasked with managing. This, perhaps, is the most important thing to highlight about the social life of data. Data do not only allow actors to claim authority, as seen in the way data are used to let ERCC personnel assume positions of authority in the field of disaster governance. The way that datasets are used at STAR demonstrates that data are created with specific social purposes in mind, and that data can be changed and altered to fit new and different social purposes—data are not a raw or unassailable set of facts, but socially produced and dependent on context for meaning.

Conclusion: What Can the Social Life of Data Tell Us?

In tracing the social life of two sets of data through the process of data production, dispersion, and use in policymaking, we made two key theoretical arguments about how data should be understood. First, we highlighted how data change the way governance institutions work because data are social and have social effects. The way that the meaning of data, their very ontology, changes according to their social context underlines that data are socially produced entities that exist only because of the meaning given to them by the social actors and networks in which they exist—they are never raw, objective, or unaltered. Second, and similarly, we provided a sociology of data by looking at two EU governance institutions’ production, management, and use of data. By emphasizing the role of data in altering how policymaking fields function, we pointed to how data translate phenomena into social events that can be politically managed and governed, which in turn allows actors to use data to establish their relative positions within policymaking networks. Given these two points, and because the production of data is an inherently social process, the way data are
used to justify or facilitate decisionmaking must also be understood as a relational process between actors, networks, and their material capacities—a sorely needed contribution to actor-network theory accounts of EU policymaking processes.

These arguments about how data must be thought of in terms of their social life have implications that stretch beyond academia, however. Tracing the production, management, and use of data in the disaster response and migration policy fields allowed us to explain two aspects of how EU policymaking actors use data to establish and legitimate a particular EU way of developing and implementing policies. In the case of ECHO, producing and managing the use of disaster monitoring data allowed ERCC personnel to become the de facto leaders in their field. Similar to how Beerli, in this special section, discusses the way that data about humanitarian vulnerability are used to establish hierarchies in assistance provision, ERCC actors use disaster monitoring data to decide which types of disasters are responded to and to coordinate the corresponding response interventions. Just as Isabel Rocha de Siqueira’s article in this special section outlines how datasets about state fragility are used differently by different actors to alter the way that development projects are implemented, risk and intelligence data are manipulated through new methodological approaches by STAR personnel to empower DG HOME to influence how migration concerns are managed at the EU level.

Hans Krause Hansen and Tony Porter’s article in this special section emphasizes big data’s capacity to change the way we understand political decisionmaking and the flow of information. This touches on the final conclusion to draw from our insistence that data are an inherently social phenomenon. New ways of understanding the impacts of complex disaster scenarios and cross-border flows of people are being developed precisely because of emerging techniques of data collection. This also means, though, that new types of policymaking are taking shape to manage these scenarios and flows. Therefore, to properly account for these different types of policies, we need to develop theoretical and methodological frameworks that better capture the logics of governing through data, in order to understand how modern complex networks of political institutions govern.

Notes

Christopher C. Leite is a PhD candidate in the School of Political Studies at the University of Ottawa, Ontario, Canada. Can E. Mutlu is assistant professor of international relations at Acadia University, Wolfville, Nova Scotia, Canada.

1. See Peter Gould, “Letting the Data Speak for Themselves,” Annals of the Association of American Geographers 71, no. 2 (1981): 166–176; Michael X. Delli Carpini and Scott Keeter, What Americans Know About Politics and Why It Matters (New Haven: Yale University Press, 1997).
2. John Law and Kevin Hetherington, “Materialities, Spatialities, Globalities,” in John R. Bryson, Peter W. Daniels, Nick Henry, and Jane Pollard, eds., Knowledge, Space, Economy (London: Routledge, 2000), pp. 34–49; John Law, “On Sociology and STS,” The Sociological Review 56, no. 4 (2008): 623–649.
3. Bruno Latour, Reassembling the Social: An Introduction to Actor-Network Theory (Oxford: Oxford University Press, 2007).
4. Ian Hacking, The Emergence of Probability: A Philosophical Study of Early Ideas About Probability, Induction and Statistical Inference (New York: Cambridge University Press, 2009) (Original work published 1975); Ian Hacking, The Taming of Chance (New York: Cambridge University Press, 2010) (Original work published 1990).
5. Michael Oakeshott, Rationalism in Politics and Other Essays (London: Methuen, 1967); Alain Desrosières, La Politique des Grands Nombres: Histoire de la Raison Statistique (Paris: La Découverte & Syros, 1993).
6. Theodore M. Porter, Trust in Numbers: The Pursuit of Objectivity in Science and Public Life (Princeton, NJ: Princeton University Press, 1996).
7. Hans Krause Hansen and Tony Porter, “What Do Numbers Do in Transnational Governance?” International Political Sociology 6, no. 4 (2012): 409–426; Isabel Rocha de Siqueira, “Measuring and Managing ‘State Fragility’: The Production of Statistics by the World Bank, Timor-Leste, and the g7+,” Third World Quarterly 35, no. 2 (2014): 268–283.
8. See Louise Amoore, “Security and the Incalculable,” Security Dialogue 45, no. 5 (2014): 423–439.
9. Michel Callon, Pierre Lascoumes, and Yannick Barthe, Acting in an Uncertain World: An Essay on Technical Democracy, trans. Graham Burchell (Cambridge: MIT Press, 2009) (Original work published 2001).
10. Bruno Latour, “On Actor-network Theory: A Few Clarifications,” Soziale Welt 47, no. 4 (1996): 369.
11. This emphasis on the relational and mediational aspects of social relationships is different from, for example, the Foucauldian type of calculating governmental logics studied or the Bourdieusian focus on the social relations between actors within their institutions, as well as between actors and the institutions themselves.
12. Bruno Latour, Reassembling the Social: An Introduction to Actor-Network Theory (Oxford: Oxford University Press, 2007), p. 109.
13. Bruno Latour, Pandora’s Hope: Essays on the Reality of Science Studies (Cambridge: Harvard University Press, 1999), p. 301 (emphasis added).
14. Latour, “On Actor-network Theory,” p. 374 (emphasis added).
15. Ibid., p. 371 (emphasis added).
16. Humanitarian aid and development policies are run from either the DG for International Cooperation and Development (DEVCO) or the European External Action Service (EEAS). ECHO acts as a partner in creating new policies or coordinating existing policy implementation.
17. The list includes the EEAS Intelligence Analysis Centre (INTCEN); the European Police Office’s (Europol) Analysis System (EAS); Europol’s Scanning, Analysis and Notification (SCAN) system; the Europol Drugs Unit (EDU); individual UK, French, and German intelligence agencies; the Frontex border agency’s Risk Analysis Unit; the European Asylum Support Office (EASO); the European Border Surveillance System (EUROSUR) and its closely related reporting bodies from member states and border agencies; Interpol’s international criminal tracking system; the UN Office on Drugs and Crime (UNODC); and the European Monitoring Centre for Drugs and Drug Addiction (EMCDDA).
18. DG HOME employee, interviewed by the author, 27 February 2013; DG ECHO employee, interviewed by the author, 25 March 2013.
19. DG ECHO employee, interviewed by the author, 25 March 2013.