Climate Science as Sensory Infrastructure

Welcome to the Anthropocene, that planetary tempo in which all the metabolic rhythms of the world start dancing to crazy new tunes. Sure, you can join the Heideggerians and blame western metaphysics for all this. You could put it down to Walter Benjamin’s angel of history. Or, perhaps it is time to find some new characters to talk about, and new objects of thought. Maybe critical writing could get its head out of the cloudy superstructures and think again about this base and vulgar world.

The problem with the traditional humanist disdain for science and technology is that it is now a line of thought pursued most vigorously again by reactionaries and fascists. If you want to accept the reality of climate change, that most awkward rift in the planet’s metabolism, then that means accepting the science on which it is based. Accepting the science, it turns out, means relying on a particular kind of infrastructure that produces it.


Perhaps it is time then to turn to a kind of critical theory that was particularly interested in infrastructures, in technologies, and in sciences. For example: let’s talk about Alexander Bogdanov. Lenin’s rival for the leadership of the Bolshevik party, he was an early theorist of the biosphere, and founder of Proletkult, the movement for workers’ knowledge. Let’s talk about Andrey Platonov, the finest product of Proletkult, who gave up writing during the Russian Civil War to become an engineer and fight famine in the countryside. Those seem to me the kinds of writers we might have need of again.


Platonov worked on four kinds of infrastructure: the railways, electricity, irrigation, and the rather more subtle but pervasive one of standards, when he worked on weights and measures. He gives a vivid description of the struggle to build up infrastructure in his story ‘The Motherland of Electricity’. Set in 1921, it follows a young engineer, ‘haunted by the task of the struggle against ruin’, who is summoned to a remote village by a rather poetic communiqué from the secretary of the village soviet. The land is parched, the peasants are starving. So the engineer concocts a scheme to build some infrastructure to help them.


There are few resources. Searching through the loot expropriated from the landlord, the poet secretary and his new comrade the engineer find some Picasso paintings, but no pumps. There are irrigation channels but no way to move the water. The young engineer persuades an old peasant to give up the alcohol he makes in his still, ‘fuelled by rhizomes’. The alcohol will run the one engine in the village, an old motorcycle. The engineer then persuades the local mechanic to take some scrap metal and make a paddlewheel. This replaces the back wheel of the bike, and now pumps water through the old irrigation channels.


The engineer succeeds – at least for a moment – in building an infrastructure out of available parts in which the technologies – such as they are – interlock. In this case the social practices that might go with this system don’t quite take. The still explodes, killing the old peasant. He was using his own stomach as the apparatus to test the proof of the liquor, and fell asleep in the process. Often in Platonov the infrastructure fails to live up to the superstructure of the revolution.


Platonov even thought of a knowledge infrastructure in his plan for a literary factory, a distributed network of specialised text filtering centres with central nodes for the final synthesis of literary works. This, like Bogdanov’s Socialist University, did not come to pass. What both of them were concerned with was building infrastructures through which we can sense the world. How could our collective labours know and organise the world and themselves? How can the resistance of matter and the mental resistance of labour be overcome? That, to Bogdanov, was the revolution.


It did not come to pass, but a quite surprising knowledge infrastructure that did get built is the one through which scientists came to know about climate. What distinguishes climate science from particle physics or even genetics is that the data has to come from the whole of the big, bad, outside world. It’s not possible to build an apparatus that makes a cut, separating and delineating the object of study in its totality. Atmosphere is all around. You are sucking it in right now. The study of climate called into being not an apparatus but a whole infrastructure of separate apparatuses, of distinctive cuts. Climate science thus poses some fairly novel problems as to what a science is or should be, and not least what a comradely, cooperative science could be.


Our guide to climate science is A Vast Machine by Paul Edwards, which offers a history of the models, data and politics through which scientists came to understand global warming. It takes its title from a remark by John Ruskin, who in 1839 imagined a panoptic system for knowing simultaneously the state of the weather on the entire surface of the planet. Ruskin prefigures in some respects what actually happened: the building up, bit by bit, often from existing parts, first of partial systems, then a network, and by the twenty-first century an entire infrastructure for gathering weather data.


In some other respects, perfect weather surveillance turned out to be an impossibility. What Edwards calls friction, for which Bogdanov’s term resistance could well substitute, stands in the way of a perfect knowledge of real weather. There is data resistance, which impedes the gathering of weather data readings. There is metadata resistance, which is a limit to what is known about the circumstances relevant to particular weather readings. There is computational resistance, which limits the accuracy of the calculation of future weather states from weather data. Edwards also gives many examples of kinds of cultural and political, or in Bogdanov’s terms organisational resistance, that had to be overcome to produce a global knowledge of global climate.


Even with the technology of the twenty-first century, human meteorologists are not entirely obsolete as components in weather forecasting cyborgs. (Cyborg to be understood here in its original formulation as a portmanteau of ‘cybernetic organism’: something with both organic and biomechatronic parts.) Predicting future weather is hard. Predicting past weather isn’t much easier. Climate is the history of weather. The resistances which make predicting weather hard make the rather different science of understanding climate even harder. Weather is a chaotic system. Small variations can have rather large consequences. Even when the friction that results from gathering data, knowing about the gathering of the data, and processing the data is small, that small variation can produce significant differences in the prediction of both past and future climate, even in today’s advanced climate modelling computation systems.


Despite the differences in the results that climate modelling scientists get from their efforts, there is an overwhelming consensus around certain propositions that could at this point be called facts. Average temperatures around the world are rising. One of the causes of that trend is increases in the levels of atmospheric carbon. The cause of that increase is collective human labour. As in any established science, there is plenty of controversy, but it is within this consensus.


The scientific recognition that collective human labour is causing climate change could well be one of the great discontinuities in perspective. The heliocentric universe of Copernicus and Galileo, the evolution of species in Darwin and Wallace, and what Louis Althusser rather problematically calls Marx’s opening up of the ‘continent of history’ might be its precursors. Some neutral, pre-given planetary nature is no longer available as a fiction of the real.


Nature always appears in the rear-view mirror, as a reflection of that which recedes from this juggernaut of an SUV in which we rattle on into the future. Or perhaps nature is just the movie that’s playing on the screen now embedded in the dashboard, while the SUV idles in a ditch, no longer going anywhere, just waiting for the wave of travail it left behind to catch up with it. Here comes the carbon liberation front, and the slo-mo replay of ecological crisis catching up with the vehicle which tore across the world excreting its molecular cause.


The Anthropocene era is a historical moment when one true worldview is superseded by one of greater generality. Mother Nature as passive resister to he who acts no longer holds as a basic metaphor. Resistance has to be rethought a bit. There can be no resort to the figure of Gaia any more than to Prometheus. The gods cannot save us; the goddess cannot save us. As Donna Haraway, author of A Cyborg Manifesto, insists, the engendering habits of thought about nature have to become an object of critique as a whole, rather than merely reversing the poles and celebrating a divine feminine nature.


Climate science understands the planet as a system with a history. It is the coming together of three separate kinds of knowledge: weather forecasting, the empirical study of climate and theoretical meteorology. Bringing those three fields together required the growth of an extensive infrastructure of observation, communication, processing and modelling. This infrastructure is a vastly more elaborate form of exactly the same sort of thing Platonov was working on in the countryside. It is embedded, extensive, standardised, modular, and – as he knew all too well – visible when it fails.


Infrastructure connects diverse structures, such as the liquor still, the motor bike and the irrigation ditch. Science, like irrigation, requires an infrastructure, and any Marxist theory of knowledge rightly begins with some study of that infrastructure. In A Vast Machine, Edwards is not offering a Marxist theory of knowledge, but he usefully provides the tools with which to start building one. Such a theory starts and ends not with mathematics or method but with the infrastructure by which a science is made.


Climate science differs from bench science in the extent of its dependence on cooperative labour and communication for the production of data. Its history is one of various bits of infrastructure slowly accreting and linking. Colonial occupations provided some of the first opportunities for recording data from very different climates. The US Navy early on started keeping detailed meteorological records. The first theories of global atmospheric circulation arose in part out of the global vectors of trade and empire and the still-partial sensation of a global space they produced.


Through war and peace, the various sciences of weather and climate oscillated between forms of scientific nationalism and internationalism. Imperial power brought with it attempts at universal standards, a universal time, Morse code and the vector of telegraphic communication, which moves information faster than weapons or commodities, although as yet not much faster than weather itself. Attempts at international cooperation on weather collapsed during both world wars. Yet war made weather an even higher priority. Meteorology aids accurate ballistics, not to mention the timing of troop movements and air sorties. Just as the ship had once become a mobile weather station, so too did aircraft, adding a third dimension to data-space.


As a field, climatology started off as descriptive and empirical. The notion that clearing forests could cause local changes in climate had already occurred to David Hume, but it was Eduard Brückner who in the 1880s came up with some plausible ideas about different causes of climate change. He thought German climate data showed periodic fluctuations around a thirty-five-year cycle, and he linked increasing desertification and drought to deforestation. Svante Arrhenius brought concepts from physics into climate thinking. He calculated that doubling atmospheric carbon dioxide would raise global temperatures by 5 or 6 degrees Celsius, but he was more interested in explaining periods of global cooling.
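Arrhenius’s famous figure can be restated in modern terms as a logarithmic relation between carbon dioxide concentration and warming: each doubling of CO2 adds the same increment of temperature. A minimal sketch, using this modern logarithmic approximation with Arrhenius’s own sensitivity estimate plugged in (his 5–6 degrees per doubling is well above today’s central estimates of roughly 3 degrees):

```python
import math

def warming_from_co2(c_ratio, sensitivity_per_doubling=5.5):
    """Estimated warming for a given ratio of CO2 concentration
    to a baseline, assuming warming scales with the logarithm of
    concentration. The sensitivity value is Arrhenius's, not a
    modern consensus figure."""
    return sensitivity_per_doubling * math.log(c_ratio, 2)

# One doubling of atmospheric CO2 gives exactly the per-doubling figure:
print(round(warming_from_co2(2.0), 1))  # 5.5
# A quadrupling is two doublings, so twice that:
print(round(warming_from_co2(4.0), 1))  # 11.0
```

The logarithmic form is why each successive doubling, not each successive tonne, buys roughly the same warming.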


Thomas Chamberlin started combining gas physics theories with geological data, and arrived at both a concept of the carbon cycle, and of the role of life in it. The idea that the carbon cycle was a driver of global climate changes fell out of favour until the 1930s, as other research with other apparatuses seemed to show the water vapour in the atmosphere had a stronger greenhouse effect than carbon dioxide. Bogdanov’s interest in it in the 1910s and 1920s appears idiosyncratic. In his science fiction novel Red Star he imagined planet-wide global warming caused by collective labour. The prevailing idea linked climate change to variations in the earth’s orbit and tilt.


The carbon cycle climate theory was revived by the steam engineer G. S. Callendar, who also grasped the role of fossil fuel combustion in the carbon cycle. More sensitive instruments now showed carbon absorbing heat in a wider spectrum than water vapour. Callendar calculated that temperatures were already rising. The Royal Society was unimpressed, and rightly so, given the poor state of theoretical climate science and the unreliability of the data at the time.


A major advance on the theoretical side came when the Norwegian physicist and meteorologist Vilhelm Bjerknes applied concepts from fluid dynamics and modelled weather with a system of seven equations. The problem was that, as yet, there was no way to gather the necessary data, nor any efficient way to make the computations.


Edwards: ‘Data are things.’ If we are to avoid fetishising these things, then we must inquire as to how data are produced. Data are the product of a whole series of labours, of observing, recording, collecting, transmitting, verifying, reconciling, storing, cataloguing and retrieving. In each of these processes, human labour and the apparatus intra-act in all sorts of ways. The most utopian thing in Bogdanov’s novel Red Star is that data and computational resistance have been overcome. In climate science, we have just such a story of the labour of information, which overcomes resistances that correspond to neither Bogdanov’s mental nor his manual resistances to organisation.


The exigencies of war were a great fillip to overcoming such resistances, and they left their traces. The image of a storm ‘front’ is a substitution that arrives from military forecasting. Speed was the essence of weather forecasting, and so the forecasters did not want too much data. The overcoming of communication resistance had outstripped the speed of the processing of data. The labour of processing data caught up with its transmission with the punch-card Hollerith machines. In the United States, there was even a depression-era WPA project to put all past ship weather data onto punch cards for the study of climate.


The Second World War was the weather war. Military aircraft discovered the jet streams, and finally theories about how upper level weather controls ground weather could be confirmed. Weather data captured from the Germans was treated as an intelligence find and several tons of records were carefully shipped back to the United States. It was an era when computing resistance was falling, but data resistance was still high. Too much manual processing was still involved. The human labour took the form of a series of analogue-digital conversions: recording an observation from an instrument in a log book, then transcribing those numbers for communication by telegraph, then writing them back into tables by hand, and so on.


The Normandy invasion confirmed the military value of forecasting. It began under cover of appalling weather conditions, based on a meteorological prediction that the weather would clear once the Allied forces had crossed the English Channel. It could be said that the weather was not a chance circumstance in this instance but had been fully weaponised.


Post-war weather data infrastructure was a vast improvement on its pre-war state but, as in Platonov, had been cobbled together from different kinds of parts. The wartime development of computing made numerical weather prediction a possibility for the first time. The Hungarian mathematician John von Neumann grasped how computers could be used for weather data. Like the hydrogen bomb, weather is a matter of fluid dynamics. What worked in one field might work in another. The American meteorologist Jule Charney used the ENIAC computer (the first electronic general-purpose computer, unveiled in 1946) for forecasting experiments, although it took more than 24 hours to compute a 24-hour weather forecast. This was the beginning of a shift from analogue to digital forecasting. As in other fields, early results led to optimism about computing power which took a long time to pay off.


What particularly appealed to military taskmasters was the prospect of controlling the weather. The Normandy invasion used accurate prediction to weaponise weather that happened to occur. What would appear a next logical step would be to command the weather. But before it could be controlled, it had to be measured and modelled. Not for the first or last time, a fairly remote military objective fuelled a whole range of research. The post-war period is one of the growth over time of model sophistication, model resolution and computing speed, in part fuelled by military funding and imperatives, but in the process creating something else.


Something quite strange happens when computing power reaches the point that it can handle the complexity of a climate model: as Edwards points out, ‘If you can simulate the climate, you can do experiments.’ The computer itself becomes the laboratory. Over time, more and more processes can be integrated into the model, starting from a – relatively – simple thermodynamic engine which transports air from the equator to the poles, and eventually including a comprehensive model of how the atmosphere and the oceans process solar energy, the effect of clouds, of the albedo effect of different surfaces, ocean currents and turbulence, and so on.


Climate models become earth system models that include the entire carbon cycle. Such an apparatus still can’t predict the actual weather too far in advance. Weather is chaotic, so predictive models aren’t accurate past a week or two. No model yet reproduces all features of recorded climate, but they do simulate its general patterns. The computer becomes the apparatus for a kind of infinite forecast, predicting both the past and future climates of simulated planet Earths.


What made all this possible is supercomputing infrastructure. Edwards: ‘No fields other than nuclear weapons research and high-energy physics have ever demanded so much calculating capacity’. Climate science is one of the few fields to have influenced the development path of supercomputing. The first Cray-1 supercomputer delivered was a prototype that went to Los Alamos in 1976 for weapons work, but one of the very next machines went to climate research.


Modelling problems remain. There is a trade-off between the resolution and the complexity of a climate model: should it have more finely grained detail and only a few processes, or more processes but less fine detail? Then there is the problem of round-off error. To how many decimal places should a result be calculated? Tiny errors accumulate and affect the outcome. Even when starting with the same data, small differences in computer code can result in different simulation outcomes when modelling the same climate on different machines. The simulated Earths of climate science all differ a bit from each other – although all of them heat up.
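The way tiny round-off differences snowball can be seen without any climate model at all. A minimal sketch using the logistic map, a standard toy chaotic system standing in for the fluid equations: two runs that begin one part in a trillion apart, roughly the scale of a floating-point rounding difference between machines, end up wholly divergent after a few dozen steps.

```python
def logistic_run(x, steps, r=3.9):
    """Iterate the logistic map x -> r*x*(1-x), which is chaotic
    for r = 3.9. A stand-in for any chaotic dynamics, including
    weather; not a climate model."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic_run(0.400000000000, 60)
b = logistic_run(0.400000000001, 60)  # differs by one part in a trillion
print(abs(a - b))  # the trajectories have long since decorrelated
```

This is why simulated Earths computed from the same data on different machines each heat up along slightly different paths.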


Meanwhile, on the actual Earth, connecting up the weather data gathering practices of the planet turns out to be an interesting precursor to the creation of the internet. Overcoming geopolitical resistances to get agreements on standards, and getting them implemented, took patience and persistence. Cold War geopolitics and decolonisation movements favoured a kind of infrastructural globalism. The superpowers sought global information in all sorts of areas, of which weather was just one, and preferably from some direct technical means, rather than relayed via allies. Former colonies attempting to establish themselves as sovereign states wanted to participate in global information sharing, but were not in a position to make major infrastructural commitments. Global weather data came to rely on two standard kinds of Cold War apparatus: the computer and the satellite.


The Cold War saw an escalation of what Edwards calls the ‘techno-politics of altitude’. When the Soviets shot down Gary Powers’ U-2 spy plane in 1960, the Americans claimed it was doing weather research, which may even have been slightly true. Climate science intersected with the Cold War in one particularly eerie way: the ‘atmospheric’ testing of nuclear weapons also turned out to be a nuclear testing of the atmosphere. Tracking radioactive fallout was a boon to the development of three-dimensional models of global circulation patterns, and also to atmospheric chemistry.


The United States and the Soviet Union each launched their own weather satellite systems, but superpower rivalry generated a subsidiary counter-discourse of peace, cooperation, and scientific progress. Weather was an area in which to realise all three. Edwards: ‘The project enrolled scientists and weather forecasters in the competitive goals of the superpowers while simultaneously enrolling the superpowers… in scientific cooperation.’ The Soviet Union shared weather data from the vast territory it controlled, although it withheld the locations of certain weather stations, which of course were likely military installations.


With some sort of infrastructure cobbled together, it was feasible by the late seventies to execute something called the Global Weather Experiment, one of the largest scientific experiments ever conducted, and the first to attempt to produce a fully global data set about planetary weather. It also led to regular meetings of climate modellers from around the world. Climate emerges as a global object of a global knowledge practice, but overcoming geopolitical resistances hardly resolves the other kinds of resistance.


A key part of Edwards’ presentation of the genesis of climate science is his insistence on what Karen Barad might recognise as a kind of intra-action, the term she coined to describe the inseparability of entangled agencies and factors. The data and the models co-produce each other. It’s not that there is data and then a model of that data. At a certain point in its evolution, the model acquires the capacity to create a certain quality of the data set itself. Edwards: ‘Adding the third dimension, computer model grids became an ideal Earth: the world in a machine. Data from the real atmosphere had to first pass through this abstract grid-space, which transformed them into new data calculated for the model’s demands.’ Interpolation algorithms adjust data to fit the model’s predictions.


To test how a model fits the world, climate science does not compare model to data, but model to another model, to a model of the data. The mutual adjustment of model grid-points and observational records is a kind of data assimilation. Data and models intra-act. Sometimes models with simulated data predict better than ones with actual observations.
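The assimilation step can be caricatured in a few lines. This is a toy sketch of my own, not anything from Edwards’s book: a single grid point’s ‘analysis’ value is a confidence-weighted blend of what the model forecast and what an instrument reported, so the record the model is later judged against is already part model.

```python
def assimilate(forecast, observation, forecast_var, obs_var):
    """Blend a model forecast with an observation, weighting by
    their variances (a scalar Kalman-style update). Low obs_var
    means a trusted instrument; low forecast_var, a trusted model."""
    gain = forecast_var / (forecast_var + obs_var)
    return forecast + gain * (observation - forecast)

# Model forecasts 15 C at a grid point; a nearby station reads 17 C.
# With equal trust in each, the analysis lands halfway between them:
print(assimilate(15.0, 17.0, forecast_var=1.0, obs_var=1.0))  # 16.0
```

A grid point no instrument ever observes simply keeps its modelled value, which is the sense in which the global data set is co-produced with the model rather than lying beneath it.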


Part of the difficulty for climate science is that the difference between ‘raw data’ and analysis is much wider than in a lot of fields, due to the kind of data. The apparatus is inside-out, as it were. In a bench science, the apparatus would effect a cut from the world, producing a situation where an event can happen that can be contained and repeated. In climate science, there are thousands upon thousands of instruments, each of which is an apparatus that makes a particular and local cut, measuring the value for temperature, or humidity or barometric pressure. What is between and beyond these cuts remains indeterminate. The data have to be gathered ‘in the wild’, and will always fall short of being perfect records due to the instabilities inherent in the placing of the apparatus within the situation it records, rather than having an apparatus produce the situation then record it. The theory-laden nature of data is pushed to the extreme in this science.


Climate science is not a reductionist science; it is a reproductionist science. Edwards: ‘Reproductionism seeks to simulate a phenomenon, regardless of scale, using whatever combination of theory, data, and “semi-empirical” parameters may be required. It’s a “whatever works” approach.’ From a rationalist perspective, this looks like ‘anything goes’. From a positivist perspective, it starts to look like it’s ‘not science’. But it makes sense from the point of view Bogdanov enunciates, which hews close to the sensations that the apparatus produces, which toggles between data, model and theory, and which uses conceptual, even critical tools to understand how each element in the practice of producing this knowledge, in and against the world, proceeds.


Models emerge as a relatively new category of knowledge production in this account. Models are mediators between the apparatus, the data, and the theory. They are a distinct kind of technology, embodied in code and the machines that run it. There is no purity in either the data or the theory. How is the apparatus itself turning up as an artefact in the results? An ‘artefact’ meaning an observed effect that was in fact introduced by the technology used in the investigation, or by experimental error. An unwanted refraction may show up in an image made with a compound lens as lens flare. Digital devices produce quite different and strange artefacts of their own.


The vast panoptic machine Ruskin imagined did not come to pass, even in the era of big data. More data seems also always to mean more resistance. There are limits to observability. The apparatus produces the data it’s designed to produce. Edwards: ‘You will never get perfect knowledge of initial conditions.’ Any particular model will always diverge from observations, in part because of the determinate quality of those very observations.


The problem with climate data is that most of it is actually weather data. It was collected for the purposes of short-run prediction, not long-run modelling. There is no consistent metadata attached to the data. There are not always records of what instruments were used, or of whether there were changes in the conditions of their use, nor are there always calibrations of a new instrument against the one it replaces. Artefacts in the data set can result from all sorts of such little changes. For example, there appears to be a sudden drop in sea temperature of 0.3 degrees in 1945, but it might be caused by a switch from data gathered by American ships, which measured engine intake water, to British ones, which dropped a bucket over the side.


Edwards: ‘Historically, climate scientists have often found themselves stuck with damaged goods – leftovers from a much larger forecasting enterprise that ignored consistency and reliability in its constant drive to move forward with newer, better technology.’ Ironically, this is a situation where the constant revolutionising of the means of production introduces resistances of its own. Climate science requires a theory and a method of studying the means of production of its own data.


This makes climate science vulnerable to attack by those with an interest in preserving the carbon-fuelled commodity economy, or those with an emotional investment in maintaining the everyday life it appears to sustain. Techniques pioneered in the defence of tobacco, and honed in the fights over acid rain and ozone depletion, now appear in attacks on climate science. Edwards: ‘During the George W. Bush administration, even as the scientific consensus grew even stronger, political appointees carried the manufacture of controversy to the point of criminal corruption.’ Enter the era, since experienced in other fossil fuel producer states such as Canada and Australia, of blocking appointments, censoring scientists, doctoring reports.


Crucially: the critique of science now shifts to the right. Hence the need for progressive forces to think more tactically about where and when the critique of science is appropriate. Perhaps returning to something like the classical Marxist and Bogdanovite open-mindedness towards the sciences might be appropriate, rather than the Heidegger-inflected critique of Marcuse and others, which sees all science and technology as embedded in the same western metaphysics.


Here we arrive at something of a crisis point for the romantic left. It wanted a totalising critique of technology that would ground its rejection of techno-modernity on a claim to something prior to or outside of it: on being, on nature, on poetry, on the body, on the human, or on communism as event or leap. But the problem is that the attendant closing-off of thought to science and technology now plays into the hands of climate denial. That the carbon liberation front is changing the climate is a knowledge that can only be created via a techno-science apparatus so extensive that it is now an entire planetary infrastructure. To reject techno-science altogether is to reject the means of knowing about ‘metabolic rift’ – John Bellamy Foster’s term for Marx’s prediction of the tendency of capitalism towards ecological crisis. We are cyborgs, making a cyborg planet with cyborg weather, a crazed, unstable disingression, whose information and energy systems are out of whack. It’s a de-natured nature without ecology.


Climate science has no need for Marxist theories, but Marxist theories – critical theories more broadly – have need of climate science. The critical project has to move on, from the critique of political economy, to the critique of their Darwinian descendants in biology, to the critique of climate science as a knowledge which shapes the most general worldview of the Anthropocene era.


It may sound like hubris to name an entire epoch the Anthropocene. It seems at odds with the decentring and demoting of the human that was a significant achievement of ecological thought. But the Anthropocene draws attention to anthropogenic climate change as an unintended consequence of collective human labour. It calls attention not to the psychic unconscious or the political unconscious but to the infrastructural unconscious. Moreover, viewed via the Anthropocene, human action remains quite modest and minor. It’s just that in fluid systems a small action can have disproportionately large effects, as Platonov’s characters found out the hard way.


This is an extract from McKenzie Wark’s Molecular Red: Theory for the Anthropocene (Verso, April 2015).


McKenzie Wark is the author of A Hacker Manifesto (Harvard 2004), Gamer Theory (Harvard 2007), The Beach Beneath the Street (Verso 2011), and various other things. He teaches at the New School for Social Research in New York City.