
The Digital Humanities Stack

Thinking about the structure of the digital humanities, it is always helpful if we can visualise it to provide some sort of map or overview. Here, I am exploring a way of representing the digital humanities through the common computer science technique of a software “stack”. This is the idea that a set of software components provides the infrastructure for a given computer system or platform. In a similar way, I illustrate the discipline of digital humanities with a pictorial representation of its layers of abstraction in the image below. This gives the reader an idea of what I am calling the digital humanities stack.

The Digital Humanities Stack, illustration by Marcus Leis Allion (Berry 2016)

This type of diagram is common in computing and computer science to show how technologies are “stacked” on top of each other in growing levels of abstraction. Here, I use the method in a more illustrative and creative sense, showing the range of activities, practices, skills, technologies, and structures that could be said to make up the digital humanities as an ideal type. This is clearly a simplification, and is not meant to be prescriptive; rather, it is aimed at the newcomer to the digital humanities, as it helps one to understand how the varied elements that make up the field fit together. Whilst I can foresee criticisms about the make-up and ordering of the stack I present here, I nonetheless think it provides, more or less, a useful visual guide to how we can think about the various components of the digital humanities and contributes towards further understanding of the field. I deliberately decided to leave out the “content” elements in terms of their specificity, for example, the different kinds of digital archive that we see across the digital humanities. I think that this is acceptable, as the term digital archive does capture a wide range of digital databases and archival forms, although it perhaps does not signify strongly enough the related material elements, for example in a “postdigital archive” that includes both digital and non-digital elements. Relatedly, the diagram perhaps does not sufficiently capture something like the inclusion of a media archaeological collection in its materiality.

This diagram can thus be read from the bottom up: the lower levels indicate some of the fundamental elements of the digital humanities stack, such as computational thinking and knowledge representation, on which the other elements later build. Of course, diagrams simplify, and even though I would have preferred the critical and cultural critique to run through more of the layers, in the end it made for a more easily digestible visual representation if I did not over-complicate the diagram. The illustration stretches the concept of a stack, in a strict computer science sense, as it includes institutional layers and non-computational elements, but as a heuristic for thinking about the digital humanities in its specificity, I think it can be helpful. As a version 1.0 of the digital humanities stack, I look forward to reworkings, complications and re-articulations of it in the comments.
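Read bottom-up, the stack can be sketched as a simple ordered structure. Only the two bottom layers, computational thinking and knowledge representation, are named in the text above, so the upper layer names in this sketch are hypothetical placeholders rather than a transcription of the illustration.

```python
# A minimal sketch of reading a "stack" bottom-up, in growing
# levels of abstraction. Only the two bottom layers are named in
# the text; the upper layers are illustrative placeholders.
dh_stack = [
    "computational thinking",      # fundamental layer (named in the text)
    "knowledge representation",    # fundamental layer (named in the text)
    "digital archives and data",   # hypothetical placeholder
    "tools and methods",           # hypothetical placeholder
    "interfaces and publication",  # hypothetical placeholder
    "institutions and critique",   # hypothetical placeholder
]

def read_stack(layers):
    """Pair each layer with its level of abstraction, bottom-up."""
    return [(level, name) for level, name in enumerate(layers)]

for level, name in read_stack(dh_stack):
    print(f"layer {level}: {name}")
```

The point of the sketch is simply that each layer presupposes, and builds upon, the layers beneath it.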


Questions from a Worker Who Codes

In relation to the post-digital, it is interesting to ask to what extent the computational is both the horizon of, and the gatekeeper to, culture today (Berry 2014a). If code operates as the totalising mediator of culture, if not the condition for such culture, then access to both culture and code becomes a social, political and aesthetic question. This is partially bound up with questions of literacy and the scope of such knowledges, usually framed within the context of computational competence in a particular programming language. This question returns again and again in relation to the perceived educative level of a population required to partake of the commonality shared within a newly post-digital culture – should one code? In other words, to what extent must a citizen be able to read and interact with the inscriptions that are common to a society? Indeed, in the register of art, for example, Brecht considered the question itself to be superfluous, inasmuch as providing an opportunity of access, and therefore praxis, opens the possibility of such experiences and understanding. He writes,

one need not be afraid to produce daring, unusual things for the proletariat so long as they deal with its real situation. There will always be people of culture, connoisseurs of art, who will interject: “Ordinary people do not understand that.” But the people will push these persons impatiently aside and come to a direct understanding with artists (Brecht 2007: 84).

In relation to the practices of code itself, it is, of course, not a panacea for all the ills of society. It is, on the other hand, a competence that increasingly marks itself out as a practice which creates opportunities to interact with, operate and define how the computational functions in relation to one's life and processes of individuation (see Stiegler 2013, Cowen 2013, Economist 2014). Not only that: as the epistemic function of code grows in relation to the transformation of previous media forms into a digital substrate, and the associated softwarization of the process, culture is itself transformed and the possibilities for using and accessing that culture change too. Indeed, Bifo argues that without such competences, “the word is drawn into this process of automation, so we find it frozen and abstract in the disempathetic life of a society that has become incapable of solidarity and autonomy” (Berardi 2012: 17). For Berardi, cognitive labour would then have become disempowered and subjected to what he calls “precarization” (Berardi 2012: 141). In response he calls for an “insurrection”, inasmuch as “events” can generate the “activation of solidarity, complicity, and independent collaboration between cognitarians”, that is, “between programmers, hardware technicians, journalists, and artists who all take part in an informational process” (Berardi 2012: 142-3).

The aim of this literacy in relation to the computational, if we can call it that, which is similar to what I have called iteracy elsewhere (Berry 2014b), is also connected to notions of reflexivity, critique, and emancipation in relation to the mechanisation not only of labour, but also of culture and intellectual activities more generally. Understanding the machine, as it were, creates the opportunity to change it, and gives citizens the capacity to imagine that things might be other than they are.

This is important if we are to avoid a situation whereby the proletarianisation of labour is followed by the capacity of machines to proletarianise intellectual thought itself. That is, machines come to define the boundaries of how, as a human being, one must conduct oneself, as revealed by a worker at a factory in France in the 1960s: “to eat, in principle, one must be hungry. However, when we eat, it’s not because we’re hungry, it’s because the electronic brain thought that we should eat because of a gap in production” (Stark 2012: 125). Delegation into the machine of the processes of material and intellectual production abstracts the world into a symbolic representation within the processes of machine code. It is a language of disconnection, a language that disables the worker, but simultaneously disables the programmer, or cognitive worker, who no longer sees another human being, but rather an abstract harmony of interacting objects within a computational space – that is, through the application of compute (Berry 2014c). This is, of course, a moment of reification, and as such code and software act as an ideological screen for the activities of capitalism, and for the harsh realities of neoliberal restructuring and efficiencies, the endless work,[1] made possible by such softwarization. Indeed, under capital,

time sheds its qualitative, variable, flowing nature; it freezes into an exactly delimited, quantifiable continuum filled with quantifiable ‘things’ (the reified, mechanically objectified ‘performance’ of the worker, wholly separated from his total human personality): in short, it becomes space. In this environment where time is transformed into abstract, exactly measurable, physical space, an environment at once the cause and effect of the scientifically and mechanically fragmented and specialised production of the object of labour, the subjects of labour must likewise be rationally fragmented. On the one hand, the objectification of their labour-power into something opposed to their total personality (a process already accomplished with the sale of that labour-power as a commodity) is now made into the permanent ineluctable reality of their daily life. Here, too, the personality can do no more than look on helplessly while its own existence is reduced to an isolated particle and fed into an alien system. On the other hand, the mechanical disintegration of the process of production into its components also destroys those bonds that had bound individuals to a community in the days when production was still ‘organic’. In this respect, too, mechanisation makes of them isolated abstract atoms whose work no longer brings them together directly and organically; it becomes mediated to an increasing extent exclusively by the abstract laws of the mechanism which imprisons them (Lukács 1971: 90).

But of course here it is not seconds and minutes measured by “the pendulum of the clock [that] has become as accurate a measure of the relative activity of two workers as it is of the speed of two locomotives”, but rather the microsecond and millisecond time of code, combined with new forms of sensors and distributed computational devices that measure time. Indeed, “time is everything, [humans are] nothing; they are at the most the incarnation of time. Quality no longer matters. Quantity alone decides everything: hour for hour, day for day” (Marx 1976: 125). For it is in the spaces of such quantification that lies the obfuscation of the realities of production, but also the possibility of changing production to a more democratic and humane system that makes, as Stiegler claims, “a life worth living” (Stiegler 2009).[2]


[1] It is interesting to think about the computational imaginary in relation to the notion of “work” that this entails or that is coded/delegated into the machine algorithms of our post-digital age. Campagna (2013) has an interesting formulation of this in what Newman (2013) has called “nothing less than a new updated Ego and Its Own for our contemporary neoliberal age” (Newman 2013: 93). Indeed, Campagna writes, “westerners had to find a way of adapting this mystical exercise to the structures of contemporary capitalism. What would a mantra look like, in the heart of a global metropolis of the 21st Century? What other act might be able to host its obsessive spirit, whilst functioning like a round, magic shield, covering the frightened believers from their fear of freedom? There was only one possible, almost perfect candidate. The activity of repetition par excellence: Work. The endless chain of gestures and movements that had built the pyramids and dug the mass graves of the past. The seal of a new alliance with all that is divine, which would be able to bind once again the whole of humanity to a new and eternal submission. The act of submission to submission itself. Work. The new, true faith of the future” (Campagna 2013: 10). Here, though, I argue that it is not immaterial apparitions and spectres which are haunting humanity and from which the Egoist can break free, but the digital materiality of computers’ abstractions, formed of algorithms and code, which are a condition of possibility for individuation and subjectivity itself within cognitive capitalism.
[2] As Stark writes, “for a worker to claim the right to create—to theoretically “unalienated” labor—was a gesture as threatening to the factory bosses as it was to the official organs of the left, with their vision of the worker acceding to a state of being-in-oneself through work. Regarding this form of sociological indeterminacy, Rancière argues that “perhaps the truly dangerous classes are . . . the migrants who move at the border between classes, individuals and groups who develop capabilities within themselves which are useless for the improvement of their material lives and which in fact are liable to make them despise material concerns.” Further, for Rancière, “Working-class emancipation was not the affirmation of values specific to the world of labor. It was a rupture in the order of things that founded these ‘values,’ a rupture in the traditional division [partage] assigning the privilege of thought to some and the tasks of production to others.” Binetruy affirms this rupture, recalling that while initially wary of “these Parisians who came stuffed with film and cameras,” he quickly realized that “they did not come to teach us any lessons, but rather to transmit technical training that would liberate our spirits through our eyes. Once you have put your eyes behind a camera, you are no longer the same man, your perspective has changed”” (Stark 2012: 150).


Berardi, F. (2012) The Uprising: On Poetry and Finance, London: Semiotext(e).

Berry, D. M. (2014a) The Post-Digital, Stunlaw, accessed 14/1/2014,

Berry, D. M. (2014b) Critical Theory and the Digital, New York: Bloomsbury.

Berry, D. M. (2014c) On Compute, Stunlaw, accessed 14/1/2014,

Brecht, B. (2007) Popularity and Realism, in Aesthetics and Politics, London: Verso Press.

Campagna, F. (2013) The Last Night: Anti-Work, Atheism, Adventure, London: Zero Books.

Cowen, T. (2013) Average Is Over: Powering America Beyond the Age of the Great Stagnation, London: Dutton Books.

Economist (2014) Coming to an office near you, The Economist, accessed 16/01/2014,

Lukács, G. (1971) History and Class Consciousness: Studies in Marxist Dialectics, MIT Press.

Marx, K. (1976) The Poverty of Philosophy, in Karl Marx and Frederick Engels, Collected Works, Volume 6, 1845–1848, London: Lawrence & Wishart.

Newman, S. (2013) Afterword, In Campagna, F. (2013) The Last Night: Anti-Work, Atheism, Adventure, London: Zero Books, pp. 92-5.

Stark, T. (2012) “Cinema in the Hands of the People”: Chris Marker, the Medvedkin Group, and the Potential of Militant Film, OCTOBER, 139, Winter 2012, pp. 117–150.

Stiegler, B. (2009) What Makes Life Worth Living: On Pharmacology, Cambridge: Polity Press.

On Compute

Today, the condition of possibility for the milieu of contemporary life is compute. That is, compute as the abstract unit of computation, both as dunamis (potentiality) and energeia (actuality), that is, as the condition of possibility for the question of the in-itself and the for-itself. Compute as a concept exists in two senses: as the potential contained in a computational system, or infrastructure, and as the actuation of that potential in actual work. Whilst always already a theoretical limit, compute is also the material that may be brought to bear on a particular computational problem – and many problems are now indeed computational problems. The theoretical question posed by compute is thus directly relevant to the study of software, algorithms and code, and therefore to the contemporary condition in computal society, because it represents the moment of potential in the transformation of inert materials into working systems. It is literally the computational unit of “energy” that is supplied to power the algorithms of the world’s systems. Compute, then, is a notion of abstract computation, but it is also the condition of possibility for, and the potential actuation of, that reserve power of computation in a particular task. Compute becomes a key noetic means of thinking through the distribution of computation in the technological imaginary of computal society.

In a highly distributed computational environment, such as we live in today, compute is itself distributed around society: carried in pockets, accessible through networks and wireless connections, and pooled in huge computational clouds. Compute, then, is not only abstract but lived and enacted in everyday life; it is part of the texture of life, not just as a layer upon life but as a structural possibility for, and mediation of, such living. But crucially, compute is also an invisible factor in society, partially due to the obfuscation of the technical conditions of the production of compute, but also due to the necessity for an interface, a surface, with which to interact with compute. Compute as a milieu is thus never seen as such, even as it surrounds us and is constantly interacting with and framing our experiences. Indeed, Stiegler (2009) writes that,

Studying the senses, Aristotle underlines in effect that one does not see that, in the case of touching, it is the body that forms the milieu, whereas, for example, in the case of sight, the milieu is what he calls the diaphane. And he specifies that this milieu, because it is that which is most close, is that which is structurally forgotten, just as water is for a fish. The milieu is forgotten, because it effaces itself before that to which it gives place. There is always already a milieu, but this fact escapes us in the same way that “aquatic animals,” as Aristotle says, “do not notice that one wet body touches another wet body” (423ab): water is what the fish always sees; it is what it never sees. Or, as Plato too says in the Timaeus, if the world was made of gold, gold would be the sole being that would never be seen – it would not be a being, but the inapparent being of that being, appearing only in the occurrence of being, by default (Stiegler 2009: 13-14).

In this sense compute is the structural condition of possibility that makes the milieu possible by giving it place, inasmuch as it creates those frameworks within which technicity takes place. The question of compute, both as a theoretical concept and as a technical definition, is then crucial for thinking through the challenge of computation more broadly. But in a rapidly moving world of growing computational power, comparative analysis of computational change is difficult without a metric by which to compare different moments historically. This is made much more difficult by the reality that compute is not simply the speed and bandwidth of a processor as such, but includes a number of other related technical considerations, such as the speed of the underlying motherboard, RAM, graphics processor(s), storage system and so forth.

Compute is therefore a relative concept and needs to be thought about in relation to previous iterations, and this is where benchmarking has become an important part of the assessment of compute – for example SPECint, a computer benchmark specification for a processor’s integer processing power maintained by the Standard Performance Evaluation Corporation (SPEC 2014). Another, GeekBench (2013), scores compute against a baseline score of 2500, which is the score of an Intel Core i5-2520M @ 2.50 GHz. In contrast, SYSmark 2007 attempts to bring “real world” applications into the measurement by including a number of ideal systems that run canned processing tasks (SYSmark 2007). Comparing compute thus becomes a spectrum of benchmarks that test a variety of working definitions of processing capacity. It is also unsurprising that, as a result, many manufacturers create custom modes within their hardware to “game” these benchmarks, unfortunately obfuscating these definitions and comparators. For example,

Samsung created a white list for Exynos 5-based Galaxy S4 phones which allow some of the most popular benchmarking apps to shift into a high-performance mode not available to most applications. These apps run the GPU at 532MHz, while other apps cannot exceed 480MHz. This cheat was confirmed by AnandTech, who is the most respected name in both PC and mobile benchmarking. Samsung claims “the maximum GPU frequency is lowered to 480MHz for certain gaming apps that may cause an overload, when they are used for a prolonged period of time in full-screen mode,” but it doesn’t make sense that S Browser, Gallery, Camera and the Video Player apps can all run with the GPU wide open, but that all games are forced to run at a much lower speed (Schwartz 2013).
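Gaming aside, the relative scoring that such benchmarks perform is straightforward to sketch: a raw score is expressed as a multiple of a baseline machine, in GeekBench's case a baseline of 2500 for an Intel Core i5-2520M @ 2.50 GHz, as noted above. The scores for the three machines below are invented for illustration.

```python
GEEKBENCH_BASELINE = 2500  # score of an Intel Core i5-2520M @ 2.50 GHz

def relative_compute(score, baseline=GEEKBENCH_BASELINE):
    """Express a raw benchmark score as a multiple of the baseline machine."""
    return score / baseline

# Hypothetical scores, for illustration only:
machines = {"laptop": 2500, "workstation": 7500, "phone": 1250}
for name, score in machines.items():
    print(f"{name}: {relative_compute(score):.2f}x baseline")
```

The benchmark thus only ever measures compute relatively, against a particular historical machine running a particular workload, which is precisely why a spectrum of benchmarks, and their gaming, matters.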

On a material register, the unit of compute can be thought of as roughly the maximum potential processing capacity of a computer processing chip running for a notional hour. In today’s softwarized landscape, of course, processing power itself becomes a service and hence is more often framed in terms of virtual machines (VMs) rather than actual physical machines – a number of compute instances can be realised on a single physical processor using sophisticated software to manage the illusion. Amazon itself defines compute through an abstraction of actual processing as follows,

Transitioning to a utility computing model fundamentally changes how developers have been trained to think about CPU resources. Instead of purchasing or leasing a particular processor to use for several months or years, you are renting capacity by the hour. Because Amazon EC2 is built on commodity hardware, over time there may be several different types of physical hardware underlying EC2 instances. Our goal is to provide a consistent amount of CPU capacity no matter what the actual underlying hardware (Amazon 2013).
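The utility model Amazon describes, in which a number of compute instances are realised on shared physical hardware, can be sketched as follows; the core counts and the oversubscription ratio are assumed values for illustration, not Amazon's actual provisioning scheme.

```python
def physical_compute_hours(num_cores, hours):
    """Notional compute on a material register: capacity multiplied by time."""
    return num_cores * hours

def vm_instances(num_physical_cores, vcpus_per_instance, oversubscription=1.0):
    """How many VM instances a hypervisor can present on one machine.
    An oversubscription ratio > 1 lets the 'illusion' of virtual
    capacity exceed the underlying physical capacity."""
    return int((num_physical_cores * oversubscription) // vcpus_per_instance)

print(physical_compute_hours(16, 1))              # 16 core-hours from one machine
print(vm_instances(16, 2))                        # 8 two-vCPU instances, no oversubscription
print(vm_instances(16, 2, oversubscription=2.0))  # 16 instances on the same hardware
```

The point is that the renter of capacity never encounters the physical processor itself, only the abstracted, and possibly oversubscribed, unit of compute.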

Indeed, Amazon tends to discuss compute in relation to its unit of the EC2 Compute Unit (ECU) to enable this discretisation.[1] Google also uses an abstract quantity, measuring “minute-level increments” of computational time (Google 2013). The key is to begin thinking about how an instance provides a predictable amount of dedicated compute capacity, and as such is a temporal measure of computational power, albeit one seemingly defined rather loosely in the technical documentation.
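Billing compute in “minute-level increments” amounts to rounding usage time up to an increment, often with a minimum charge, before pricing it. The increment and the minimum used in this sketch are assumptions for illustration, not any provider's actual terms.

```python
import math

def billed_minutes(seconds_used, increment_minutes=1, minimum_minutes=10):
    """Round usage up to the billing increment, then apply a minimum charge.
    The one-minute increment and ten-minute minimum here are assumed
    values for illustration."""
    minutes = math.ceil(seconds_used / 60 / increment_minutes) * increment_minutes
    return max(minutes, minimum_minutes)

print(billed_minutes(95))    # ~1.6 minutes of use -> billed at the 10-minute minimum
print(billed_minutes(3605))  # just over an hour -> billed as 61 minutes
```

Compute is thus discretised twice over: once into an abstract capacity unit, and again into a quantised unit of billable time.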

The question of compute is then a question of the origin of computation more generally, but also of how the infrastructure of computation can be understood both qualitatively and quantitatively. Indeed, it is clear that the quantitative changes that greater compute capacity introduces make possible the qualitative experience of computation that we increasingly take for granted in our use of a heavily software-textured world. To talk about software, processes, algorithms and code is deficient without a corresponding understanding of the capacity of compute in relation to them – a key question for thinking about the conditions of possibility that computation creates for our lives today.


[1] Amazon used to define the ECU directly, stating: “We use several benchmarks and tests to manage the consistency and predictability of the performance of an EC2 Compute Unit. One EC2 Compute Unit provides the equivalent CPU capacity of a 1.0-1.2 GHz 2007 Opteron or 2007 Xeon processor. This is also the equivalent to an early-2006 1.7 GHz Xeon processor referenced in our original documentation” (Berninger 2010). They appear to have stopped using this description in their documentation (see Amazon 2013). 


Amazon (2013) Amazon EC2 FAQs, accessed 05/01/2014,

Berninger, D. (2010) What the heck is an ECU?,  accessed 05/01/2014,

GeekBench (2013) GeekBench Processor Benchmarks, accessed 05/01/2014,

Google (2013) Compute Engine — Google Cloud Platform, accessed 05/01/2014,

Schwartz, R. (2013) The Dirty Little Secret About Mobile Benchmarks,  accessed 05/01/2014,

SPEC (2014) The Standard Performance Evaluation Corporation (SPEC), accessed 05/01/2014,

Stiegler, B. (2009) Acting Out, Stanford University Press.

SYSmark (2007) SYSmark 2007 Preview, accessed 05/01/2014,

The Post-Digital

Gustave Courbet, The Painter’s Studio: A Real Allegory (1855)

As we increasingly find that the world of computational abundance is normalised, the application of cheap digital technologies to manage or partially augment traditionally analogue experiences, technologies and practices will doubtless grow.[1] That is, the power of “compute” is growing in both breadth and depth as it permeates society and culture (see Davies 2013; Berry 2014a). All around us we are increasingly surrounded by new fields and flows of computation that co-construct and stabilise a new artifice for the human sensorium – streams, clouds, sensors and infrastructures. Not unlike previous moments in which mediums became part of everyday life, this new field is noticeable for its ability to modulate and transform itself through the use of algorithms and code, not just as a general plasticity but as a flexible structure that adapts to context and environment, tailored to the individual or, perhaps better, the dividual of the computational age. This new field of computation is not necessarily top-down and corporate-controlled either. Thus we see, at a bottom-up level, the emergence of a market in cheap digital processors that enable the implementation of innovative new forms of culture and cultural experimentation. We might think of these moments as part of the constellation I am calling the “post-digital” (see also Berry 2013a; Cramer 2013; Cox 2013; Philipsen 2013; Sable 2012).

Museu de Arte de São Paulo (MASP), 1968.
Designed by Lina Bo Bardi

Thus, the historical distinction between the digital and the non-digital becomes increasingly blurred, to the extent that to talk about the digital presupposes a disjuncture in our experience that makes less and less sense. Thus computation becomes spatial in its implementation, embedded within the environment and part of the texture of life itself which can be walked around, touched, manipulated and interacted with in a number of ways and means – life becomes mediated in and through the computal (Berry 2014b). Indeed, in a similar way in which the distinction between “being online” or “being offline” has become anachronistic, with our always-on smart phones and tablets and widespread wireless networking technologies, so too, perhaps, the term “digital” describes a world of the past.

This is not to say that time is not an important aspect of computation in this post-digital world. The compressive effects of computation, and its flattening metaphors and visual language, tend, maximised perhaps by its tendency toward spatiality, to transform time from a diachronic to a synchronic experience. Indeed, history itself may be re-presented through the screen by a number of computational functions and methods that make it seem geometric, flat and simultaneous. A sense of history is then a sense of real-time flows, not so much distant and elusive, whether as cultural or individual memory, but here and now, spectacular and vividly represented and re-presented. Time in this sense is technical time, and the history attendant to it is technical history, presented through databases, code and algorithms.

Thus, in a time of computational abundance we might think in relation to the question of the “post-digital”, inasmuch as we are rapidly entering a moment when the difficulty will be to encounter culture outside of digital media. Or perhaps the non-digital will largely be the preserve of the elite (by choice, education and wealth) or the very poor (by necessity). The detritus of society will be cast into the non-digital, and the fading and ephemeral will be preserved within computational databanks only, if preserved at all. Indeed, even the non-digital becomes bound up in the preservation possibilities offered by the digital,

Non-digital media technologies… become post-digital when they are not simply nostalgically revived, but functionally repurposed in (often critical) relation to digital media technologies: zines that become anti- or non-blogs, vinyl as anti-CD, cassette tapes as anti-mp3, analog film as anti-video (Cramer 2013).

Computal Surfaces: main stage for the
Republican convention in Tampa, Fla (2012)

In a post-digital age, whether something is digital or not will no longer be seen as the essential question. Or rather, the question as to whether something is or is not “digital” will be increasingly meaningless as all forms of media become themselves mediated, produced, accessed, distributed or consumed through digital devices and technologies. This is to move away from a comparative notion of the digital, contrasted with other material forms such as paper, celluloid or photopaper, and instead to begin to think about how the digital is modulated within various materialities. It is also when the contrast between “digital” and “analogue” no longer makes sense either. This spectrum of the digital, a distribution across an axis of more or less computal, gives rise to the expectation of the always already computational of everyday life.

Muffwiggler, Modular Synth Meetup,
University of Sussex (2013).

Thus, the post-digital is represented by and indicative of a moment when the computational has become both hegemonic and post-screenic (see Bosma 2013; Ludovico 2013). As Cramer argues, “the distinction between ‘old’ and ‘new’ media collapses in theory as well as in practice. As Kenneth Goldsmith observes, his students ‘mix oil paint while Photoshopping and scour flea markets'” (Cramer 2013). The “digital” is then understood as a previous historic moment when computation as digitality was understood in opposition to the analogue, although that is not to say that it will not remain as a marginal notion with related practices within post-digitality. Thus, under our contemporary conditions it might be better to think about modulations of the digital or different intensities of the computational as a post-digital moment rather than digital versus analogue as such. We should therefore critically think about the way in which cadences of the computational are made and materialised. In other words, notions of quantitative and qualitative dimensions of “compute” will be increasingly important for thinking about culture, economics, society, politics and everyday life. Tracing power will in many cases be tracing compute, both in terms of the reservoirs of compute managed by gigantic computational Stacks, but also in the places where compute is thin and poorly served. By Stacks, I am referring to the corporations that increasingly rely on computational “technology stacks” for profit and power, such as Google, Apple, Facebook, Twitter and Amazon but also the technical imaginary formed through the notion of these stacks as a diagram (Berry 2013b).

“Cuddlebot”: low-tech touch/haptic sensing hardware (2013)

Compute as always already part of life might also herald that the moment of the digital as digitalisation is already past, and that new challenges lie ahead for thinking about the way in which the computal saturates our culture, institutions and everyday life in varying degrees of modularity and intensity. This growth in computation has put citizens at an obvious disadvantage in a society that not only has historically tended to disavow the digital as a form of knowledge or practice, but also has not seen computational thinking or skills as part of the educational requirements of a well-informed citizen. For example, the lack of understanding of the importance of encryption and cryptography in digital society was humbly described recently by Glenn Greenwald, who one might have thought would have been better schooled in these technologies (Greenwald 2013). Indeed, as computer power has increased, so has the tendency to emulate older media forms to provide content within simulations of traditional containers, such as “e”-books, through techniques of skeuomorphism and glossy algorithmic interface design – rather than learning and teaching computational practices as such. This, perhaps, has the advantage that new computational forms can be used and accessed without the requisite computational skills to negotiate the new literary machines of computation, such as the underlying logics, structures, processes and code. However, it also means that in many cases today we are unable to read what we write, and are not always the writers of the systems that are built around us (Berry 2011; Oliver, Savičić and Vasiliev 2011; Allen 2013). This illiteracy does not seem to provide the ideal conditions for the emergence of an informed and educated citizenry able to engage with the challenges and dangers of a fully softwarized post-digital society. It also points to the urgent need for a critical and engaged Bildung for the post-digital world, if it is not to become precariously post-democratic.


[1] This post was inspired by attending “Muffwiggler” at the University of Sussex, Saturday 16 November 2013, organised by Andrew Duff, and funded by the Centre for Digital Material Culture. The event was notionally a homage to analogue synths, but in reality was colonised by digital/analogue hybrid synthesisers and controllers which were properly post-digital in both form and function. More information and


Allen, J. (2013) Critical Infrastructure, accessed 31/12/2013,

Berry, D. M. (2011) The Philosophy of Software, London: Palgrave Macmillan.

Berry, D. M. (2013a) Post-Digital Humanities, Stunlaw, accessed 30/12/2013,

Berry, D. M. (2013b) Digital Breadcrumbs, Stunlaw, accessed 30/12/2013,

Berry, D. M. (2014a) On Compute, Stunlaw, accessed 05/01/2014,

Berry, D. M. (2014b) Critical Theory and the Digital, New York, Continuum/Bloomsbury Academic.

Bosma, J. (2013) Post-Digital is Post-Screen – Shaping a New Visuality, accessed 30/12/2013,

Cox, G. (2013) some old problems with post–anything (draft version), accessed 30/12/2013,

Cramer, F. (2013) Post-digital: a term that sucks but is useful (draft 2), accessed 30/12/2013,

Davies, J. (2013) Compute Power with Energy-Efficiency, accessed 30/12/2013,

Greenwald, G. (2013) 30c3 Keynote, Chaos Computer Club, accessed 30/12/2013,

Ludovico, A. (2013) Post Digital Publishing, Hybrid and Processual Objects in Print, accessed 30/12/2013,

Oliver, J. Savičić, G. and Vasiliev, D. (2011) Critical Engineering Manifesto, accessed 31/12/2013,

Philipsen, L. (2013) Do not Return to Sender – Why post-digital aesthetic research should actually distinguish between artist, critics, and audience, accessed 30/12/2013,

Sable, D. (2012) A “Post Digital” World, Really?, Google Think Insights, accessed 30/12/2013,

Digital Breadcrumbs

In April 2013, the world population was 7,057,065,162 (Hunt 2013). This is a population that increasingly accesses and uses communications and digital media, and creates huge quantities of real-time and archived data, although it remains divided in its access to digital technology (Berry 2011). We often talk about the vast increase in data creation and transmission, but it is sometimes difficult to find recent and useful quantitative measures of the current contours of digital media. Indeed, the internet, as we tend to think of it, has become increasingly colonised by massive corporate technology stacks. These companies (Google, Apple, Facebook, Amazon, Microsoft) are called collectively "the Stacks" (Berry 2013). Helpfully, the CIA's chief technology officer, Ira Hunt (2013), has listed the general data numbers for the "stacks" and given some useful comparative numbers in relation to telecoms and SMS messaging (see figure 1).

Data Provider: Quantitative Measures

Google (2009 stats from SEC filing): more than 100 petabytes of data; one trillion indexed URLs; three million servers; 7.2 billion page-views per day.

Facebook (August 2012): more than one billion users; 300 petabytes of data, growing by more than 500 terabytes per day; holds 35% of the world's photographs.

YouTube (2013): more than 1,000 petabytes of data (1 exabyte); more than 72 hours of video uploaded per minute, or 37 million hours per year; 4 billion views per day.

Twitter (2013): more than 124 billion tweets per year; 390 million tweets per day, or ~4,500 tweets per second.

Global text messaging (2013): more than 6.1 trillion text messages per year; 193,000 messages sent per second, or 876 per person per year.

US cell calls (2013): more than 2.2 trillion minutes per year; 19 minutes per person per day; a year of uncompressed telephone call data is smaller than a year of YouTube data.

figure 1: Growth in Data Collections and Archives (adapted from Hunt 2013)

The CIA has a particular interest in big data and the growth in the "digital breadcrumbs" left by digital devices. The agency is tasked with the security of the United States and has always had an interest in data collection and analysis, but it is fascinating to see how the value of data increasingly comes to shape the collection of SIGINT, which is now digital and subject to computational analysis. Hunt argued,

“The value of any piece of information is only known when you can connect it with something else that arrives at a future point in time… Since you can’t connect dots you don’t have, it drives us into a mode of, we fundamentally try to collect everything and hang on to it forever” (Sledge 2013)

It is also interesting to note the implicit computationality that shapes and frames the way in which intelligence is expected to develop due to the trends in data and information growth. Nevertheless, these desires shape not just the CIA or other security services, but any organisation that is interested in using archival and real-time data to undertake analysis and prediction based on data – which is increasingly all organisations in a computational age.

Information has time value and can soon lose its potency. This drives the growth not just of big data but of real-time analysis, particularly where real-time streams and archival databases can be compared and processed together in real-time. Real-time processing is currently a huge challenge for computational systems and pushes at the limits of current computal systems and data analytic tools. Unsurprisingly, new levels of expertise are called for, usually grouped under the notion of "data science", a thoroughly interdisciplinary approach sometimes understood as the movement from "search" to "correlation". Indeed, as Hunt argues,

“It is really very nearly within our grasp to be able to compute on all human generated information,” Hunt said. After that mark is reached, Hunt said, the [CIA] agency would also like to be able to save and analyze all of the digital breadcrumbs people don’t even know they are creating (Sledge 2013).

In a technical sense, the desire in these "really big data" applications is the move from what is called "batch map/reduce", as represented by Hadoop and related computational systems, to "real-time map/reduce", whereby real-time analytics are made possible, currently represented by technologies like Google's Dremel (Melnik et al 2010), Caffeine (Higgenbotham 2010), Impala (Brust 2012), Apache Drill (Vaughan-Nichols 2013), Spanner (Iqbal 2013), etc. This is the use of real-time stream processing combined with complex analytics and the ability to manage large historical data sets. The challenges for the hardware are considerable, requiring peta-scale RAM architectures so that the data can be held in memory, but also the construction of huge distributed memory systems enabling in-memory analytics (Hunt 2013).
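The contrast between batch and real-time map/reduce can be sketched in a few lines of Python. This is an illustrative toy, not the Hadoop or Dremel APIs: the batch version scans a complete dataset before answering, while the streaming version maintains a running aggregate that is updated as each new record arrives, so an answer is always available.

```python
from collections import Counter

# Batch map/reduce (Hadoop-style): process the whole dataset in one pass.
def batch_word_count(documents):
    counts = Counter()
    for doc in documents:           # "map": emit the words of each document
        counts.update(doc.split())  # "reduce": sum the counts per word
    return counts

# Streaming ("real-time") equivalent: keep a running aggregate that is
# updated incrementally as each record arrives.
class StreamingWordCount:
    def __init__(self):
        self.counts = Counter()

    def update(self, doc):
        self.counts.update(doc.split())
        return self.counts  # the current answer is available at any moment

docs = ["big data", "big archives"]
print(batch_word_count(docs)["big"])  # 2

stream = StreamingWordCount()
for doc in docs:
    stream.update(doc)
print(stream.counts["big"])           # 2
```

Both arrive at the same totals; the difference is when an answer can be read off, which is precisely what the in-memory, peta-scale RAM architectures described above are meant to support.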

Traditional Computer Processing → Real-Time Analytics/Big Data

Data on storage area network (SAN) → Data at processor
Move data to question → Move question to data
Backup → Replication management
Vertical scaling → Horizontal scaling
Capacity after demand → Capacity ahead of demand
Disaster recovery → Continuity of Operations Plan (COOP)
Size to peak load → Dynamic/elastic provisioning
Tape → Storage area network (SAN)
Storage area network (SAN) → Disk
Disk → Solid-state disk
RAM limited → Peta-scale RAM

figure 2: Tectonic Technology Shifts (adapted from Hunt 2013)

These institutional demands are driving the development of new computing architectures and their associated principles: data close to compute; power at the edge; optical computing and optical buses; the end of the motherboard and the use of shared pools of everything; and new softwarized hardware systems that allow compute, storage, networking, and even the entire data centre, to be placed under software control and management (Hunt 2013). This is the final realisation of the importance of the network, and it shows the limitations of current network technologies, which become one of the constraints on future softwarized system growth.

This continues the move towards context as the key technical imaginary shaping the new real-time streaming digital environment (see Berry 2012). Its principles include "schema on read", which enables the data returned to be shaped in relation to the context of the question asked; "user-assembled analytics", which allows answers to be composed for a given set of research questions; and elastic computing, which enables computing power to be utilised in proportion to a query or processing demand in real-time, much as electricity is drawn in greater quantities from the mains as it is required.
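"Schema on read" can be illustrated with a minimal Python sketch (the records and field names here are hypothetical): raw records are stored with no schema imposed at write time, and a schema is applied only at query time, shaped by the question being asked of the same underlying data.

```python
import json

# Raw records stored as-is: no schema is imposed when the data is written,
# and records need not share the same fields.
raw_store = [
    '{"user": "a", "lat": 50.8, "lon": -0.1, "sms_count": 3}',
    '{"user": "b", "lat": 48.9, "lon": 2.3}',
]

def read_with_schema(raw_records, fields):
    """Apply a schema only at read time: keep just the fields the current
    question needs, tolerating records in which they are absent."""
    for record in raw_records:
        data = json.loads(record)
        yield {f: data.get(f) for f in fields}

# A location-oriented question applies one schema to the raw data...
locations = list(read_with_schema(raw_store, ["user", "lat", "lon"]))
# ...while a messaging-oriented question applies another to the same store.
messaging = list(read_with_schema(raw_store, ["user", "sms_count"]))
print(messaging[1])  # {'user': 'b', 'sms_count': None}
```

The same store answers both questions; the "schema" exists only in relation to the query, which is what allows the data returned to be shaped by the context of the question asked.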

These forces are combining in ways that accelerate the pace of data collection, whether from the data exhausts left by users or through open-source intelligence that literally vacuums data from the fibre-optic cables that straddle the globe. As such, they also raise important questions about the forms of critical technical practice that are relevant to them, and about how we can ensure that citizens remain informed in relation to them. To take one small example, the mobile phone is now packed with real-time sensors that constantly monitor and process contextual information about its location, use and the activities of its user. This data is not always under the control of the user, and in many cases is easily leaked, hacked or collected by third parties without the user's understanding or consent (Berry 2012).

The notion that we leave behind "digital breadcrumbs", not just on the internet but across the whole of society, the economy, culture and even everyday life, is an issue that societies are only now coming to terms with. Notwithstanding the recent Snowden revelations (see Poitras et al 2013), the new computational techniques outlined in this article demonstrate the disconnect between people's everyday understanding of technology and the reality of its penetration of life and of total surveillance. Not just the lives of others are at stake here, but the very shape of public culture and the ability of individuals to make a "public use of reason" (Kant 1784) without being subject to the chilling effects of state and corporate monitoring of our public activities. Indeed, computal technologies such as those described here have little respect for the public/private distinction that our political systems have naturalised as part of a condition of possibility for political life at all. This makes it ever more imperative that we give citizens the ability to undertake critical technical practices, both in order to choose how to manage the digital breadcrumbs they leave as trails in public spaces, and to pull down the blinds on the post-digital gaze of state and corporate interests through the use of cryptography and critical encryption practices.


Berry, D. M. (2011) The Philosophy of Software: Code and Mediation in the Digital Age, London: Palgrave.

Berry, D. M (2012) The social epistemologies of software, Social Epistemology, 26 (3-4), pp. 379-398. ISSN 0269-1728

Berry, D. M. (2013) Signposts for the Future of Computal Media, Stunlaw, accessed 14/10/2013,

Brust, A. (2012) Cloudera's Impala brings Hadoop to SQL and BI, accessed 14/10/2013,

Higgenbotham, S. (2010) How Caffeine Is Giving Google a Turbo Boost, accessed 14/10/2013,

Hunt, I. (2013) The CIA’s “Grand Challenges” with Big Data, accessed 14/10/2013,

Iqbal, M. T. (2013) Google Spanner : The Future Of NoSQL, accessed 14/10/2013,

Kant, I. (1784) What Is Enlightenment?, accessed 14/10/2013,

Melnik, S., Gubarev, A., Long, J. J., Romer, G., Shivakumar, S., Tolton, M. and Vassilakis, T. (2010) Dremel: Interactive Analysis of Web-Scale Datasets, Proc. of the 36th Int'l Conf on Very Large Data Bases (2010), pp. 330-339.

Poitras, L., Rosenbach, M., Schmid, F., Stark, H. and Stock, J. (2013) How the NSA Targets Germany and Europe, Spiegel, accessed 02/07/2013,

Sledge, M. (2013) CIA’s Gus Hunt On Big Data: We ‘Try To Collect Everything And Hang On To It Forever’, accessed 14/10/2013,

Vaughan-Nichols, S. J. (2013) Drilling into Big Data with Apache Drill, accessed 14/10/2013,