Category Archives: Computationality

Curatorialism as New Left Politics

It is often argued that the left is increasingly unable to speak a convincing narrative in the digital age. Caught between the neoliberal language of contemporary capitalism, with its political articulations linked to economic freedom and choice, and a welfare statism that appears counter-intuitively unappealing to modern political voters and supporters, the left is often claimed to suffer from a lacuna in its political imaginary. Here, I want to explore a possible new articulation for a left politics that moves beyond the seeming technophilia and technological determinisms of left accelerationisms and the related contradictions of “fully automated luxury communism”. Broadly speaking, these positions tend to argue for a post-work, post-scarcity economy within a post-capitalist society based on automation, technology and cognitive labour. The aim here is to move beyond the assertion that embracing technology itself solves the problem of a political articulation that has to be accepted and embraced by a broader constituency within the population (or party). Technophilic politics is not, of itself, going to be enough to convince an electorate, nor a population, to move towards leftist conceptualisations of possible restructuring or post-capitalist economics. Moreover, it seems to me that the abolition of work is not a desirable political programme for the majority of the population, nor does a seemingly utopian notion of post-scarcity economics make much sense under conditions of neoliberal economics. These programmes are thus simultaneously too radical and not radical enough. I also want to move beyond the staid and unproductive arguments often articulated in the UK between a left-Blairism and a more statist orientation associated with a return to traditional left concerns, personified in Ed Miliband.

Instead, I want to consider what a politics of the singularity might be, that is, to follow Fredric Jameson’s conceptualisation of the singularity as “a pure present without a past or a future”, such that,

today we no longer speak of monopolies but of transnational corporations, and our robber barons have mutated into the great financiers and bankers, themselves de-individualized by the massive institutions they manage. This is why, as our system becomes ever more abstract, it is appropriate to substitute a more abstract diagnosis, namely the displacement of time by space as a systemic dominant, and the effacement of traditional temporality by those multiple forms of spatiality we call globalization. This is the framework in which we can now review the fortunes of singularity as a cultural and psychological experience (Jameson 2015: 128). 

That is, the removal of temporality as a specific site of politics as such, or the successful ideological deployment of a new framework for understanding oneself within temporality, whether through the activities of the media industries or through the mediation of digital technologies and computational media. This has the effect of transforming temporal experience into new spatial experiences, whether through translating media or through the intensification of a now that constantly presses upon us and pushes away both historical time and the possibility of political articulations of new forms of futurity. Thus the politics of the singularity points to spatiality as the key site of political deployment within neoliberalism, and by this process undercuts the left’s arguments, which draw simultaneously on a shared historical memory of hard-won rights and benefits and on the notion of political action to fight for a better future. Indeed, one might ask if the green critique of the Anthropocene, with its often misanthropic articulations, in some senses draws on a notion of a singularity produced by humanity which has undercut the time of geological or planetary-scale change. The only option remaining, then, is to seek to radically circumscribe, if not outline, a radical social imaginary that does not include humans in its conception, and hence to return the planet to the stability of a geological time structure no longer undermined by human activity. Similarly, neoliberal arguments over political imaginaries highlight the intensity and simultaneity of the present mode of capitalist competition and the individualised (often debt-funded) means of engagement with economic life.

What, then, might a politics of the singularity be that moved beyond a politics drawing on forms of temporality for its legitimation? In other words, how could a politics of spatiality be articulated and deployed which re-enabled the kind of historical project towards a better future for all that was traditionally associated with leftist thought?

To do this I want to think through the notion of the “curator”, which Jameson disparagingly sees as an outcome of the singularity in terms of artistic practice and experience. He argues that today we are faced with the “emblematic figure of the curator, who now becomes the demiurge of those floating and dissolving constellations of strange objects we still call art.” Further,

there is a nastier side of the curator yet to be mentioned, which can be easily grasped if we look at installations, and indeed entire exhibits in the newer postmodern museums, as having their distant and more primitive ancestors in the happenings of the 1960s—artistic phenomena equally spatial, equally ephemeral. The difference lies not only in the absence of humans from the installation and, save for the curator, from the newer museums as such. It lies in the very presence of the institution itself: everything is subsumed under it, indeed the curator may be said to be something like its embodiment, its allegorical personification. In postmodernity, we no longer exist in a world of human scale: institutions certainly have in some sense become autonomous, but in another they transcend the dimensions of any individual, whether master or servant; something that can also be grasped by reminding ourselves of the dimension of globalization in which institutions today exist, the museum very much included (Jameson 2015: 110-111).

However, Jameson himself makes an important link between spatiality as the site of contestation and the making-possible of new spaces, something that curatorial practice, with its emphasis on the construction, deployment and design of new forms of space, points towards. Indeed, in relation to theoretical constructions Jameson suggests “perhaps a kind of curatorial practice, selecting named bits from our various theoretical or philosophical sources and putting them all together in a kind of conceptual installation, in which we marvel at the new intellectual space thereby momentarily produced” (Jameson 2015: 110).

In contrast, the question for me concerns the radical possibilities suggested by this event-like construction of new spaces, and how they can be used to reverse or destabilise the time-axis manipulation of the singularity. The question then becomes: could we tentatively think in terms of a curatorial political practice, which we might call curatorialism? Indeed, could we fill out the ways in which this practice could aim to articulate, assemble and, more importantly, provide a site for a renewal and (re)articulation of left politics? How could this politics be mobilised into the nitty-gritty of actual political practice, policy, activist politics, and engender the affective relation that inspires passion around a political programme and suggests itself to the kinds of singularities that inhabit contemporary society? To borrow the language of the singularity itself, how could one articulate a new disruptive left politics?

At this early stage of thinking, it seems to me that in the first case we might think about how curatorialism points towards the need to move away from a concern with internal consistency in the development of a political programme. Curatorialism gathers its strength from the way in which it provides a political pluralism, an assembling of multiple moments into a political constellation that takes into account and articulates its constituent moments. This is the first step in the mapping of the space of a disruptive left politics. It is the development of a spatial politics inasmuch as, crucially, the programme calls for a weaving together of multiplicity into this constellational form. Secondly, we might think about the way in which this spatial diagram can then be translated into a temporal project, that is, the transformation of a mapping programme into a political programme linked to social change. This requires the capture and illumination of the multiple movements of each moment, and their re-articulation through a process of reframing the condition of possibility in each constellational movement in terms of a political economy that draws on the historical possibilities the left has opened up previously, but also on the new concepts and ideas needed to link the politics of necessity to the huge capacity of a left project for mitigating and/or replacing a neoliberal capitalist economic system. Lastly, it seems to me that a truly curatorial politics must link to the singularity itself as a source of strength for left politics, such that the articulation of individual political needs is made possible through the curatorial mode, and through the development of disruptive left frameworks that link individual need, social justice, institutional support, and a left politics that reconnects the passions of interests to the passion for justice and equality with the singularity’s concern with intensification.[1] This can, perhaps, be thought of as the replacement of a left project of ideological purity with a return to the Gramscian notions of strategy and tactics through the deployment of what he called a passive revolution, mobilised partially in the new forms of civil society created through collectivities of singularities within social media, computational devices and the new infrastructures of digital capitalism, but also through older forms of social institutions, political contestation and education.[2]

Notes

[1] This remains a tentative articulation that is inspired by the power of knowledge-based economies both to create the conditions of singularity through the action of time-axis manipulation (media technologies), and to provide – through their (arguably) countervailing power – the tools, spaces and practices for the contestation of a singularity connected only with a neoliberal political moment. That is, how can these new concepts and ideas, together with the frameworks that are suggested in their mobilisation, provide new means of contestation, sociality and broader connections of commonality and political praxis?
[2] I leave to a later paper the detailed discussion of the possible subjectivities both in and for themselves within a framework of a curatorial politics. But here I am gesturing towards political parties as the curators of programmes of political goals and ends, able then to use the state as a curatorial enabler of such a political programme. This includes the active development of the individuation of political singularities within such a curatorial framework.  

Bibliography

Jameson, F. (2015) The Aesthetics of Singularity, New Left Review, no. 92.


Signal Lab

As part of the Sussex Humanities Lab at the University of Sussex, we are developing a research group clustered around information-theoretic themes of signal/noise, signal transmission, sound theorisation, musicisation, simulation/emulation, materiality, game-studies theoretic work, behavioural ideologies and interface criticism. The cluster is grouped under the label Signal Lab and we aim to explore the specific manifestations of the mode of existence of technical objects. This is explicitly a critical and political-economic confrontation with computation and computational rationalities.

Signal Lab will focus on techno-epistemological questions around the assembly and re-assembly of past media objects, postdigital media and computational sites. This involves both attending to the impressions of the physical hardware (as a form of techne) and the logical and mathematical intelligence resulting from software (as a form of logos). Hence we aim to undertake an exploration of the technological conditions of the sayable and thinkable in culture, and of how the inversion of reason as rationality calls for the excavation of how techniques, technologies and computational media direct human and non-human utterances without reducing techniques to mere apparatuses.

This involves the tracing of the contingent emergence of ideas and knowledge in systems in space and time, to understand distinctions between noise and speech, signal and absence, message and meaning. This includes an examination of the use of technical media to create the exclusion of noise as both a technical and political function and the relative importance of chaos and irregularity within the mathematization of chaos itself. It is also a questioning of the removal of the central position of human subjectivity and the development of a new machine-subject in information and data rich societies of control and their attendant political economies.

Within the context of information-theoretic questions, we revisit the old chaos, and the return of the fear of, if not aesthetic captivation toward, a purported contemporary gaping meaninglessness – often associated with a style of nihilism, a lived cynicism and a jaded glamour of emptiness or misanthropy. This is particularly so in relation to a political aesthetic that desires the liquidation of the subject, which, in the terms of our theoretic approach, creates not only a regression of consciousness but also a regression to real barbarism. That is, data, signal, mathematical noise, information and computationalism conjure the return of fate and the complicity of myth with nature, a concomitant total immaturity of society, and a return to a society in which self-reflection can no longer open its eyes, and in which the subject not only does not exist but instead becomes understood as a cloud of data points, a dividual and an undifferentiated data stream.

Signal Lab will therefore pay attention to both the synchronic and diachronic dimensions of computational totality, taking the concrete meaningful whole and the essential elements of computational life and culture. This involves explaining the emergence of presently given social forces in terms of past structures and general tendencies of social change. That is, within a given totality there is a process of growing conflict among opposite tendencies and forces, which constitutes the internal dynamism of a given system and can be examined partly at the level of behaviour and partly at the level of subjective motivation. This is to examine the critical potentiality of signal in relation to the possibility of social forces and their practices and articulations within a given situation, and how they can play their part in contemporary history. This potentially opens the door to new social imaginaries and political possibilities for emancipatory politics in a digital age.

Interview with David M. Berry at re:publica 2013

Open science interview at re:publica conference in Berlin, 2013, by Kaja Scheliga.

Kaja Scheliga: So to start off… what is your field, what do you do?


David M. Berry: My field is broadly conceived as digital humanities or software studies. I focus in particular on critical approaches to understanding technology, through theoretical and philosophical work, so, for example, I have written a book called Philosophy of Software and I have a new book called Critical Theory and The Digital but I am also interested in the multiplicity of practices within computational culture as well, and the way the digital plays out in a political economic context.

KS: Today, here at the re:publica you talked about digital humanities. What do you associate with the term open science?

DB: Well, open science has very large resonances with Karl Popper’s notion of the open society, and I think the notion of open itself is interesting in that kind of construction, because it implies a “good”. To talk about open science implies firstly that closed science is “bad”, that science should be somehow widely available, that everything is published and there is essentially a public involvement in science. It has a lot of resonances, which are not necessarily clear. It is a cloudy concept.

KS: So where do you see the boundary between open science and digital humanities? Do they overlap or are they two separate fields? Is one part of the other?


DB: Yes, I think, as I was saying in the previous talk about how digital humanities should be understood within a constellation, I think open science should also be understood in that way. There is no single concept as such, and we can bring up a lot of different definitions, and practitioners would use it in multiple ways depending on their fields. But I think there is a kind of commitment towards open access, the notion of some kind of responsibility to a public, the idea that you can have access to data and to methodologies, and that it is published in a format that other people have access to; and also there is a certain democratic value that is implicit in all of these constructions of the open: open society, open access, open science, etc. And that is really linked to a notion of a kind of liberalism in which the public has a right, and indeed a need, to understand – and to understand in order to be the kind of citizen that can make decisions themselves about science. So in many ways it is a linked and legitimating discourse about science itself, and it is a way of presenting science as having a value to society.

KS:  But is that justified, do you agree with this concept? Or do you rather look at it critically?

DB: Well, I am a critical theorist. So, for me these kinds of concepts are never finished. They always have within them embedded certain kinds of values and certain kinds of positions. And so for me it is an interesting concept and I think “open science” is interesting in that it emerges at a certain historical juncture, and of course with the notion of a “digital age” and all the things that have been talked about here at the re:publica, everyone is so happy and so progressive and the future looks so bright – apparently…

KS: Does it?

DB: Yes, well, from the conference perspective, because re:publica is a technology conference, there is this whole discourse of progress – which is kind of an American techno-utopian vision that is really odd in a European context – for me anyway. So, being a critical theorist does not necessarily mean that I want to dismiss the concept, but I think it is interesting to unpick the concept and see how it plays out in various ways. In some ways it can be very good, it can be very productive, it can be very democratic; in other ways it can be used, for example, as a certain legitimating tool to get funding for certain kinds of projects, which means other projects, which are labelled “closed”, are no longer able to get funded. So, it is a complex concept, it is not necessarily “good” or “bad”.

KS: So, not saying ‘good’ or ‘bad’, but looking at the dark side of say openness, where do you see the limits? Or where do you see problem zones?

DB: Well, again, to talk about the “dark side” is kind of like Star Wars or something. We have to be very careful with that framework, because the moment you start talking about the dark side of the digital – and there is a big discussion going on at the moment, for example, about the dark side of the digital humanities – I think it is a bit problematic. That is why thinking in terms of critique is a much better way to move forward. So for me, what would be more interesting would be to look at the actual practices of how open science is used and deployed. Which practitioners are using it? Which groups align themselves with it? Which policy documents? And which government policies are justified by appealing to open science itself? And then, it is important to perform a kind of genealogy of the concept of “open science” itself. Where does it come from? What is it borrowing from? Where is the discussion over that term? Why did we come to this term being utilised in this way? And I think that then shows us the force of a particular term, and places it within an historical context. Because open science ten years ago may have meant one thing, but open science today might mean something different. So, it is very important we ask these questions.

KS: All right. And are there any open science projects that come to mind, spontaneously, right now?


DB: I’m not sure they would brand themselves as “open science”, but I think CERN would be for me a massive open science project, and one which likes to promote itself in these kinds of ways. So, the idea of a public good, publishing their data, having a lot of cool things on their website the public can look at; but ultimately, that justification for open science is disconnected because, well, what is the point of finding the Higgs boson, what is the actual point, where will it go, what will it do? And that question never gets asked because it is open science, so the good of open science makes it hard for us to ask these other kinds of questions. So, those are the kinds of issues that I think are really important. And it is also interesting in terms of, for example, the American version of CERN which was cancelled. So why was CERN built, and how did open science enable that? I mean, we are talking huge amounts of money, large amounts of effort; would this money have been better transferred to solving the problem of unemployment? You know, we are in a fiscal crisis at the moment, a financial catastrophe, and these kinds of questions get lost because open science itself gets divorced from its political economic context.

KS: Yes. But it is interesting that you say that within open science certain questions are maybe not that welcome; so actually, it seems to be in certain places still pretty closed, right?

DB: Well, that is right, open itself is a way of closing down other kinds of debates. So, for example, in the programming world open source was promoted in order not to have a discussion about free software, because free software was just too politicised for many people. So using the term open – it was a nice woolly term that meant everything to a lot of different people, did not feel political and therefore could be promoted to certain actors, many governments, but also corporations. And people sign up to open source because it just sounds – “open source, yes, who is not for open source?” I think if you were to ask anyone here you would struggle to find anybody against open source. But if you ask them if they are for free software a lot of people would not know what it is. That concept has been pushed away. I think the same thing happens in science by these kinds of legitimating discourses. Certain kinds of critical approaches get closed down. I think you would not be welcomed if, at the CERN press conference for the Higgs boson, you put up your hand and asked: “well actually, would it not have been better spending this money on solving poverty?” That would immediately not be welcomed as a legitimate line of questioning.

KS: Yes, right. Okay, so do you think science is already open, or do we need more openness? And if so, where?

DB: Well, again, that is a strange question that assumes that I know what “open” is. I mean openness is a concept that changes over time. I think that the project of science clearly benefits from its ability to be critiqued and checked, and I do not necessarily just want to have a Popperian notion of science here – it is not just about falsification – but I think verification and the ability to check numbers is hugely important to the progress of science. So that dimension is a traditional value of science, and very important that it does not get lost. Whether or not rebranding it as open science helps us is not so straightforward. I am not sure that this concept does much for us, really. Surely it is just science? And approaches that are defined as “closed” are perhaps being defined as non-science.

KS: What has the internet changed about science and working in research?

DB: Well, I am not a scientist, so –   

KS: – as in science, as in academia. Or, what has the internet changed in research?

DB: Well, this is an interesting question. Without being too philosophical about it, I hope: Heidegger was talking about the fact that science was not science anymore, and actually technology had massively altered what science was. Because science now is about using mechanisms, tools, digital devices, and computers, in order to undertake the kinds of science that are possible. So it becomes this entirely technologically driven activity. Also, today science has become much more firmly located within economic discourse, so science needs to be justified in terms of economic output, for example. It is not just the internet and the digital that have introduced this; there are larger structural conditions that I think are part of this. So, what has the internet or the web changed about science? One thing is allowing certain kinds of scientism to be performed in public. And so you see this playing out in particular ways: certain movements – really strange movements – have emerged that are pro-science and they just seek to attack people they see as anti-science. So, for example, the polemical atheist movement led by Richard Dawkins argues that it is pro-science and anyone who is against it is literally against science – they are anti-science. This is a very strange way of conceptualising science. And some scientists I think are very uncomfortable with the way Dawkins is using rhetoric, not science, to actually enforce and justify his arguments. And another example is the “skeptics” movement, another very “pro-science” movement that has very fixed ideas about what science is. So science becomes a very strong, almost political philosophy, a scientism. I am interested in exploring how digital technologies facilitate a technocratic way of thinking: a certain kind of instrumental rationality, as it were.

KS: How open is your research, how open is your work? Do you share your work in progress with your colleagues?

DB: Well, as an academic, sharing knowledge is a natural way of working – we are very collaborative, go to conferences, present new work all the time, and publish in a variety of different venues. In any case, your ability to be promoted as an academic, to become a professor, is based on publishing, which means putting work out there in the public sphere which is then assessed by your colleagues. So the very principles of academia are about publishing, peer review, and so on and so forth. So, we just have to be a bit careful about the framing of the question in terms of: “how ‘open’ is your work?”, because I am not sure how useful that question is inasmuch as it is too embedded within certain kinds of rhetorics that I am a little bit uncomfortable with. So the academic pursuit is very much about sharing knowledge – but also knowledge being shared.

KS: Okay. I was referring to, of course, when you do work and when you have completed your research, you want to share it with others, because that is the point of doing the research in the first place: to find something out and then to tell the world, look, this is what I found out, right?

DB: Possibly. No.

KS: No?

DB: This is what I am saying. I mean –

KS: I mean, of course in a simplified way.

DB: Well, disciplines are not there to “tell the world”. Disciplines are there to do research and to create research cultures. What is the point of telling the world? The world is not necessarily very interested. And so you have multiple publics – which is one way of thinking about it. So one of my publics, if you like, is my discipline, and cognate disciplines, and then broader publics like re:publica and then maybe the general public. And there are different ways of engaging with those different audiences. If I was a theoretical physicist, for example, and I published in complex mathematical formulae, I could put that on the web but you are not really going to get an engagement from a public as such. That will need to be translated. And therefore maybe you might write a newspaper article which translates that research for a different public. So, I think it is not about just throwing stuff on the web or what have you. I think that would be overly simplistic. It is also about translation. So do I translate my research? Well, I am doing it now. I do it all the time. So, I talk to Ph.D. students and graduates, that is part of the dissemination of information, which is, I think, really what you are getting at. How do you disseminate knowledge?

KS: Exactly. And knowledge referring not only to knowledge that is kind of settled and finished, you know, I have come to this conclusion, this is what I am sharing, but also knowledge that is in the making, in the process, that was what I was referring to.

DB: Sure, yes. I mean, good academics do this all the time. And I am talking particularly about academia here. I think good academics do research and then they are teaching and of course these two things overlap in very interesting ways. So if you are very lucky to have a good scholar as a professor you are going to benefit from seeing knowledge in the making. So that is a more general question about academic knowledge and education. But the question of knowledges for publics, I think that is a different question and it is very, very complex and you need to pin down what it is you want to happen there. In Britain we have this notion of the public engagement of science and that is about translation. Let’s say you do a big research project that is very esoteric or difficult to understand, and then you write a popular version of it – Stephen Hawking is a good example of this – he writes books that people can read and this has major effects beyond science and academia itself. I think this is hugely important, both in terms of understanding how science is translated, but also how popular versions of science may not themselves be science per se.

KS: So, what online tools do you use for your research?

DB: What online tools? I do not use many online tools as such. I mean I am in many ways quite a traditional scholar, I rely on books – I will just show you my notes. I take notes in a paper journal and I write with a fountain pen, which I think is a very traditional way of working. The point is that my “tools” are non-digital, I hardly ever digitise my notes, and I think it is interesting to go through the medium of paper to think about the digital, because digital tools seem to offer us solutions and we are very caught up in the idea that the digital provides answers. I think we have to pause a little bit, and paper forces you to slow down – that is why I like it. It is this slowing down that I think is really important when undertaking research, giving time to think by virtue of making knowledge embodied. Obviously, when it comes to collecting data and following debates I will use digital tools. Google of course is one of the most important, Google Scholar and social media are really interesting tools, Gephi is a very interesting social network analysis tool. I use Word and Excel as does pretty much everybody else. So the important issue is choosing which digital tools to use in which contexts. One thing I do much less of is, for example, the kind of programming where people write APIs and scrapers and all these kinds of approaches; I have been involved in some projects doing that, but I just do not have time to construct those tools, so I sometimes use other people’s software (such as digital methods tools).

Notes, reproduced in Lewandowska and Ptak (2013)


KS: Okay, and how about organising ideas, do you do that on paper? Or for example do you use a tool for task managing?

DB: Always paper. If you have a look in my journal you can see that I can choose any page and there is an organisation of ideas going on here. For me it is a richer way to work through ideas and concepts. Eventually, you do have to move to another medium – you know I do not type my books on typewriters! – I use a word processor, for example. So eventually I do work on a computer, but by that point I think the structure is pretty much in my head, but mediated through paper and ink – the computer is therefore an inscription device at the end of thinking. I dwell on paper, as it were, and then move over into a digital medium. You know, I do not use any concept mapping software, I just find it too clumsy and too annoying actually.

KS: Okay, so what puts you off using, or stops you being tempted by, all those tools that offer you help and offer to make you more productive?

DB: Well, because firstly, I do not want to be more productive, and secondly I do not think they help. So the first thing I tell my new students, including new Ph.D. students, is: buy a notebook and a pen and start taking notes. Do not think that the computer is your tool, or your servant. The computer will be your hindrance, particularly in the early stages of a Ph.D. It is much more important to carefully review and think through things. And that is actually the hardest thing to do, especially in this world of tweets and messages and emails – distractions are everywhere. There are no tweets in my book, thankfully, and it is the slowness and leisureliness that enables me to create a space for thinking. It is a good way of training your mind to pause and think before responding.

KS: So, you are saying that online tools kind of distract us from thinking and actually we think that we are doing a lot of stuff but actually we are not doing that much, right?

DB: Well, the classic problem is students who, for example, think they are doing an entirely new research project and map it all out in a digital tool that allows you to do fancy graphs, etc. – but they are not asking any kind of interesting research questions, because they have not actually looked at the literature and they do not know the history of their subject. So it is very important that we do this; indeed some theorists have made the argument that we are forgetting our histories. And I think this is very true. The temptation to be in the future, to catch the latest wave or the latest trend, affects Ph.D. students and academics as much as everybody else. And there are great dangers in chasing those kinds of solutions. Academia used to be about taking your time and being slow and considering things. And I think in the digital age academia’s value is that it can continue to do that, at least I hope so.

KS: Okay, but is there not a danger that if you say: okay, I am taking my time, I am taking my paper and my pen, while others are hacking away, being busy using all those online tools – and in a way you could say, okay, that speeds up some part of research, at least when you draw out the cumulative essence of it – can you afford to invest the time?

DB: Well, it is not either/or. It is both. The trouble is, I find anyway, with Ph.D. students, that their rush to use the digital tools is to prevent them from having to use the paper. And a classic example of this is EndNote. Everybody rushes to use EndNote because they do not like doing bibliographies. But actually, doing the bibliography by hand is one of the best things you can do, because you learn your field’s knowledge, and you immediately recognise names because you are the one typing them in. Again this is a question of embodiment. When you leave that to a computer program to do for you, laziness emerges – and you just pick and choose names to scatter over your paper. So, I am not saying you should not use such tools, I am saying that you should maybe do both. I mean, I never use these tools to construct bibliographies, I do them by hand because it encourages me to think through: what is this person really contributing, what do they add? And I think that is really important.

KS: Although, it probably should be more about: okay, what do I remember of this person’s writing, and what have they contributed? And not so much about whose name sounds fancy and which names do I need to drop here.

DB: Totally. Well, there has been some interesting work on this. Researchers have undertaken bibliometric analysis to show how references are used in certain disciplines and how common citations crop up again and again because they were used in previous papers and researchers feel the need to mention them again – so it becomes a name-checking exercise. Interestingly, few people go back and read these original canonical papers. So it is really important to read early work in a field, and place it within an historical context and trajectory, if one is to make sense of the present.

KS: A last question, I want to ask you about collaborative writing, do you write with other people and if so, how does that work? Where do you see advantages and where do you see possible trouble?

DB: Yes, I do. I have been through the whole gamut of collaborative writing, so I have seen both the failures and the successes. Collaborative writing is never easy, first and foremost. Particularly, I think, for humanities academics, because we are taught, and we are promoted, on the basis of our name being on the front of a paper or on the cover of a book. This obviously adds its own complications; plus, you know, academics tend to be very individualistic, and there are always questions about –

KS: …in spite of all the collaboration, right?


DB: Indeed, yes of course, I mean that is just the academic way, but I think you need that, because writing a book requires you to sit in a room for months and months and months while the sun is shining and everyone else is having fun, and you are sitting there in a gloomy room typing away, so you need that kind of self-drive and belief, and that, of course, causes frictions between people. So I have tried various different methods of working with people, but one method I found particularly interesting is a method called booksprinting. It is essentially a time-boxed process where you come together with, let us say, four or five other scholars, you are locked in a room for the week (figuratively speaking!), except to sleep, and you eat together, write together, concept map and develop a book, collaboratively. And then the book that is produced is jointly authored; there are no arguments over that – if you do not agree you can leave – but the point is that the collaborative output is understood and bought into by all the participants. Now, to many academics this sounds like absolute horror, and indeed when I was first asked if I would like to be involved I was sceptical – I went along but I was sure this was going to be a complete failure. However, it was one of the most interesting collaborative writing processes I have been involved in. I have taken part in two book sprints to date (three including 2014). You are welcome to have a look at the first book, it is called New Aesthetic New Anxieties. It is amazing how productive those kinds of collaborative writing processes can be. But it has to be a managed process. So, do check out booksprinting, it is very interesting – see also Imaginary museums, Computationality & the New Aesthetic and On Book Sprints.

KS: Okay, but then for that to work what do you actually / from your experience, can you draw out factors that make it work?

DB: Sure. The most important factor is having a facilitator, someone who does not write. And the facilitator’s role is to make sure that everybody else does write. And that is an amazing ability – a key person – because they have to manage difficult people and situations; it is like herding cats. Academics do not like to be pushed, for example. And the facilitator I have worked with is very skilled at this kind of facilitation. The second thing is the kinds of writing that you do and how you do it. The booksprinting process I have been involved in has been very paper-based, so again there is a lot of paper everywhere, there are post-it notes, there is a lot of sharing of knowledge. And this is probably the bit you are going to find interesting: there is, nonetheless, a digital tool which enables you to write collaboratively. It is a cleverly written tool, it has none of the bells and whistles, it is very utilitarian and really focuses the writing process and working together. And, having seen this used on two different booksprints, I can affirm that it does indeed help the writing process. I recommend you have a look.

KS: So, what is the tool?

DB: It is called Booktype. And Adam Hyde is the facilitator who developed the process of Book Sprints, and is also one of the developers of the software.

KS: Okay, interesting. Any questions? Or any question I did not ask you, anything you want to add that we have missed out, any final thoughts? Any questions for me?

DB: Yes, I do think that a genealogy of “open science” is important, and your questions are really interesting because they are informed by certain assumptions about what open science is. In other words, there is a certain position you are taking which you do not make explicit, and which I find interesting. So it might be useful to reflect on how “open science” needs to be critically unpacked further.

KS: Okay, great, thank you very much.

DB: My pleasure.

KS: Thanks.

DB: Thank you.






Interview archived at Zenodo. Transcript corrected from the original to remove errors and clarify terms and sentences. 

The Antinomies of Computation

AntiSurveillance Feminist Poet Hair & Makeup Party

In this post I explore what I want to call the antinomies of computation.[1] This is part of a larger project to map out these contradictions but here I will only talk about one of the antinomies that I think is interesting, namely visibility/opacity. In subsequent posts I hope to explore multiple strata to map out different moments in these antinomies. This is an attempt to contribute to a critique of an increasingly softwarized society and economy that requires analysis and contestation (see Berry 2011, 2014).

Computation makes the collection of data relatively easy. This increases visibility through what Rey Chow (2012) calls “Capture”. Software enables more effective systems of surveillance and hence new capture systems. As Foucault argues, “full lighting and the eyes of a supervisor capture better than darkness, which ultimately protected. Visibility is a trap” (Foucault 1991: 200). The question is also linked to who is made visible in these kinds of systems, especially where, as feminist theorists have shown, visibility itself can be a gendered concept and practice, as demonstrated in the historical invisibility of women in the public sphere, for example. Here we might also reflect on the way in which the practice of making-visible also entails making-invisible – computation involves making choices about what is to be captured. For example, Zach Blas’s work is helpful in showing the various forms of race, gender and class-based exclusion in computational and biometric systems (Magdaleno 2014).

The question then becomes how to “darken” this visibility in order to prevent the totalising full top-view that computational society makes possible. Using the metaphor of “black boxes” – the technical notion of objects which have opaque or impossible-to-read internal states but readable surfaces – how can we think about spaces that paradoxically enable democracy and the political, whilst limiting the reading of the internal processes of political experimentation and formation? Thus, how to create the conditions of possibility for an “opaque presence” working on the edges or at the limits of legibility? These spaces we might call opaque temporary autonomous zones, which seek to enable democratic deliberation and debate. They should be fully political spaces, open and inclusive, but nonetheless opaque to the kinds of transparency that computation makes possible. As Rossiter and Zehle (2014) argue, we need to move towards a “politics of anonymity”, part of which is an acknowledgement of the way in which the mediation of algorithms could operate as a plane of opacity for various actors.
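To make the “black box” notion concrete, here is a toy sketch in Python (all names are hypothetical): the internal state is held privately, and only a derived, one-way surface is readable from outside. This is a minimal illustration of the metaphor, not a model of any particular system discussed here.

```python
# Toy illustration of a "black box": readable surface, opaque internals.
# Purely illustrative; the class and method names are hypothetical.
import hashlib

class BlackBox:
    def __init__(self, internal_state: bytes) -> None:
        self.__state = internal_state  # name-mangled attribute, kept internal

    def surface(self) -> str:
        # The readable surface is a one-way digest of the internal state:
        # observers can read and compare it, but cannot invert it to
        # recover the internals.
        return hashlib.sha256(self.__state).hexdigest()

box = BlackBox(b"deliberations happening inside")
print(box.surface())   # readable surface
# print(box.__state)   # raises AttributeError: the internals stay opaque
```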

It is important to note that this is not to create paranoid spaces or secret societies, but conditional and temporary moments – glitches in the regime of computational visibility. The idea is not to recreate notions of individual privacy as such, but rather collective spaces of critical reflection for practices of creating a political response. That is, to draw on theory and “un-theory” as a way of proceeding theoretically as “an open source theory [and practice] in constant reformulation from multiple re-visions and remixings” (Goldberg 2014), what CTI (2008) calls “poor theory”. Indeed, crypto practices can create shadows in plain sight thus tipping the balance away from systems of surveillance and control. Of course, paradoxically these opaque spaces themselves may draw attention to state authorities and the intelligence community who monitor the use of encryption and cryptography – demonstrating again the paradox of opacity and visibility.

CV Dazzle Project by Adam Harvey

By crypto practices, or crypto-activism, I mean the notion of “hiding in plain sight”, a kind of steganography of political practice. This is not merely a technical practice but a political and social one too. Here I am thinking of the counter-surveillance art of Adam Harvey, such as “CV Dazzle”, which seeks to design make-up that prevents facial recognition software from identifying faces, or the “Stealth Wear” which creates the “potential for fashion to challenge authoritarian surveillance” (Harvey 2014). Some examples in political practice can also be seen at the AntiSurveillance Feminist Poet Hair and Makeup Party. Additionally, Julian Oliver’s work has also been exemplary in exploring the ideas of visibility and opacity. Here I am thinking in particular of Oliver’s works that paradoxically embed code executables in images of the software objects themselves, such as “Number was the substance of all things” (2012), but also “PRISM: The Beacon Frame” (2013), which makes visible the phone radio networks, and hence the possibility of surveillance in realtime of networks and data channels (Oliver 2014).
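To give a sense of what “hiding in plain sight” can mean technically, here is a minimal sketch of least-significant-bit (LSB) image steganography in Python, using the Pillow library. This is a generic textbook technique offered purely as an illustration – it is not the method used by Harvey or Oliver – and the file names are hypothetical.

```python
# Minimal LSB steganography sketch: hide bytes in the lowest bit of each
# RGB channel of a lossless image. Illustrative only.
from PIL import Image

def embed(cover_path: str, message: bytes, out_path: str) -> None:
    img = Image.open(cover_path).convert("RGB")
    flat = [c for px in img.getdata() for c in px]  # flatten channel values
    payload = len(message).to_bytes(4, "big") + message  # 4-byte length header
    bits = [(b >> i) & 1 for b in payload for i in range(7, -1, -1)]
    if len(bits) > len(flat):
        raise ValueError("message too large for cover image")
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & ~1) | bit  # overwrite the least significant bit
    stego = Image.new("RGB", img.size)
    stego.putdata(list(zip(flat[0::3], flat[1::3], flat[2::3])))
    stego.save(out_path, "PNG")  # a lossless format preserves the hidden bits

def extract(stego_path: str) -> bytes:
    flat = [c for px in Image.open(stego_path).convert("RGB").getdata() for c in px]
    def read(start_bit: int, n: int) -> bytes:
        out = bytearray()
        for k in range(n):
            byte = 0
            for i in range(8):
                byte = (byte << 1) | (flat[start_bit + k * 8 + i] & 1)
            out.append(byte)
        return bytes(out)
    length = int.from_bytes(read(0, 4), "big")
    return read(32, length)

# Usage (hypothetical file names):
# embed("street_photo.png", b"meeting moved to thursday", "street_photo_stego.png")
# print(extract("street_photo_stego.png"))
```

The stego image looks unchanged to the eye, since only the lowest bit of each colour channel is touched; the message sits in plain sight, recoverable only by those who know to look.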

These artworks point towards the notion of “opaque presence” explored by Broeckmann (2010), who argues that in “the society of late capitalism – whether we understand it as a society of consumption, of control, or as a cybernetic society – visibility and transparency are no longer signs of democratic openness, but rather of administrative availability” (Broeckmann 2010). It is also suggestively explored by the poet Edouard Glissant, who believes that we should “agree not merely to the right to difference but, carrying this further, agree also to the right to opacity that is not enclosure within an irreducible singularity. Opacities can coexist and converge, weaving fabrics” (Glissant 1997: 190).

So this is not just a technical (e.g. cryptographic) practice. Indeed, crypto practices have to be rethought to operate on the terrain of the political and the technical simultaneously. Political activity, for example, is needed to legitimate these cryptographically enabled “dark places”: both within the system (to avoid paranoia and attack), with the public (to educate and inform about them), and with activists and others.

That is, we could think about these crypto-practices as (re)creating the possibility of being a crowd, both in the terms of creating a sense of solidarity around the ends of a political/technical endeavour and the means which act as a condition of possibility for it. Thus we could say in a real sense that computer code can act to create “crowd source”, as it were, both in the technical sense of the computer source code, and in the practices of coming together to empower actors within a crowd, to connect to notions of the public and the common. But these crypto-practices could also help individuals to “look to comprehend how things fit together, how structural conditions and cultural conceptions are mutually generative, reinforcing, and sustaining, or delimiting, contradictory, and constraining. [They] would strive to say difficult things overlooked or purposely ignored by conventional thinking, to speak critically about challenging matters, to identify critical and counter-interests” (Goldberg 2014).

In contrast, to think for a moment about the other side of the antinomy: liberal societies have a notion of a common good of access to information to inform democratic citizens, whilst also seeking to valorise it. That is, the principle of visibility is connected not only to the notion of seeing one’s representatives and the mechanisms of politics themselves, but also to the knowledge that makes the condition of acting as a citizen possible.

Meanwhile, with the exploding quantity of information in society and the moves towards a digital economy, information is increasingly seen as a source of profit for capitalism if captured in an appropriate way. Indeed, data and information are said to be the new ‘oil’ of the digital age (e.g. Alan Greenspan 1971; Berry 2008: 41, 56). This highlights both the political and the economic desire for data. At the same time, the digital enables exploding quantities of data that are increasingly hard to contain within organisational boundaries.

One response to computational changes in politics and the economy has been the kinds of digital activism connected with whistleblowing and megaleaks, that is, the release of massive amounts of data into the public sphere and the use of social media and the internet to distribute it. These practices tend to take information out of the “black boxes” of corporations, governments and security services and provide information in the public domain about their mechanisms, practices and machinations. They seek, then, to counter the opaqueness of the organisational form, making use of the copyable nature of digital materials.

However, as megaleaks place raw data into the public sphere – usually as files and spreadsheets of data – there is a growing problem of being able to read and comprehend it, hence the growing need for journalists to become data journalists. Ironically, then, “opening the databanks” (Berry 2014: 178; Lyotard 1984: 67) creates a new form of opaqueness. Computational strategies are needed to read these new materials (e.g. algorithmic distant readings), as sketched below. Attached to the problem of information overload is the fact that this mechanism can also be harnessed by states seeking to attack megaleaks by counter-leaking and delegitimating them. Additionally, in some senses the practices of Wikileaks are connected to creating an informational overload within organisations, both in terms of their inability to cope with the release of their data, but also the requirement to close communicational channels within the organisation. So information overload can become a political tactic both for control and for resistance.
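As a gesture towards what such an algorithmic distant reading might look like, here is a minimal, hypothetical Python sketch: a term-frequency pass over a directory of leaked plain-text files. Real data-journalism pipelines are far more involved; this only illustrates the kind of computational strategy referred to above, and the corpus directory name is invented.

```python
# A minimal "distant reading" of a document dump: rank the most frequent
# terms across a directory of plain-text files. Illustrative sketch only.
import re
from collections import Counter
from pathlib import Path

STOPWORDS = {"that", "this", "with", "from", "have", "were", "which"}

def distant_read(corpus_dir: str, top_n: int = 20) -> list[tuple[str, int]]:
    counts: Counter = Counter()
    for path in Path(corpus_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        tokens = re.findall(r"[a-z]{4,}", text)  # crude tokenisation
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts.most_common(top_n)

if __name__ == "__main__":
    # "leaked_documents/" is a hypothetical corpus directory.
    for term, freq in distant_read("leaked_documents/"):
        print(f"{freq:8d}  {term}")
```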

But what is at stake here is not just the relationship between visibility and incarceration, nor the deterritorialisation and becoming-mobile made possible by computation. Rather, it is the collapse of the “time lag between the world and its capture” (Chow 2012) when capture becomes real-time through softwarized monitoring technologies, and the mediation of “police” functions and control that this implies.

The question then becomes: what social force is able to realise the critique of computational society, but also to block the real-time nature of computational monitoring? What practices become relevant when monitoring and capture become not only prevalent but actively engaged in? Tentatively, I would like to suggest embedding critical cryptographic practices in what Lovink and Rossiter (2013) call OrgNets (organised networks).

AntiSurveillance Feminist Party

But also what we might call crypto-activism: the creation of systems of inscription that enable the writing of opaque codes and the creation of “opaque places”. This is not just about making possible the spaces of collectivity (“crowd source”) but also about hacking and jamming the realtime mediation of politics, dissent and everyday life (Deleuze 1992). As Glissant argues, “We clamour for the right to opacity for everyone” (Glissant 1997: 194). This, I think, calls both for a cartography of the hybridity of digital media (its post-digital materiality) and, importantly, for the possible translation of crypto, as a concept and as a technical practice, into digital activism tactics.

Notes

[1] This post is drawn from a talk given at Digital Activism #Now: Information Politics, Digital Culture and Global Protest Movements, at Kings College, London (KCL), 04/04/14. See http://www.kcl.ac.uk/aboutkings/worldwide/initiatives/global/nas/news-and-events/events/eventrecords/Digital-Activism-Now-Information-Politics,-Digital-Culture-and-Global-Protest-Movements.aspx


Bibliography

Berry, D. M. (2008) Copy, Rip, Burn: The Politics of Copyleft and Open Source, London: Pluto Press.

Berry, D. M. (2011) The Philosophy of Software, London: Palgrave.

Berry, D. M. (2014) Critical Theory and the Digital, New York: Bloomsbury.

Broeckmann, A. (2010) Opaque Presence / Manual of Latent Invisibilities, Berlin: Diaphanes Verlag.

Chow, R. (2012) Entanglements, or Transmedial Thinking about Capture, London: Duke University Press.

CTI (2008) Poor Theory Notes: Toward a Manifesto, Critical Theory Institute, accessed 14/4/2014, https://www.humanities.uci.edu/critical/poortheory.pdf

Deleuze, G. (1992) Postscript on the Societies of Control, October, vol. 59, pp. 3-7. Available at https://files.nyu.edu/dnm232/public/deleuze_postcript.pdf

Foucault, M. (1991) Discipline and Punish, London: Penguin Social Sciences.

Glissant, E. (1997) The Poetics of Relation, Michigan: The University of Michigan Press.

Goldberg, D. T. (2014) Afterlife of the Humanities, accessed 14/04/2014, http://humafterlife.uchri.org

Harvey, A. (2014) Stealth Wear, accessed 04/04/2014, http://ahprojects.com/projects/stealth-wear/

Lovink, G. and Rossiter, N. (2013) Organised Networks: Weak Ties to Strong Links, Occupy Times, accessed 04/04/2014, http://theoccupiedtimes.org/?p=12358

Lyotard, J. F. (1984) The Postmodern Condition: A Report on Knowledge, Manchester: Manchester University Press.

Magdaleno, J. (2014) Is Facial Recognition Technology Racist?, The Creators Project, accessed 05/04/2014, http://thecreatorsproject.vice.com/blog/is-facial-recognition-technology-racist

Oliver, J. (2014) Julian Oliver, accessed 05/04/2014, http://julianoliver.com/output/

Rossiter, N. and Zehle, S. (2014) Toward a Politics of Anonymity: Algorithmic Actors in the Constitution of Collective Agency and the implications for Global Justice Movements, in Parker, M., Cheney, G., Fournier, V. and Land, C. (eds.) The Routledge Companion to Alternative Organization, London: Routledge.

Digital/Post-digital

I want to take up the question of the definition of the “post-digital” again, because I think that what the post-digital points towards as a concept is the multiple moments in which the digital was operative in various ways (see Berry 2014a, 2014b, 2014c). Indeed, historicising the “digital” can be a useful, if not crucial, step in understanding the transformation(s) of digital technologies. That is, we are at a moment whereby we are able to survey the various constellations of factors that made up a particular historical configuration around the digital, in which the “digital” formed an “imagined” medium to which existing analogue mediums were often compared, and in which the digital tended to be seen as suffering from a lack, e.g. not a medium for “real” news, for film, etc. The digital was another medium to place at the end (of the list) after all the other mediums were counted – and not a very good one. It was where the digital was understood, if it were understood at all, as a complement to other media forms, somewhat lacking, geeky, glitchy, poor quality and generally suited for toys, like games or the web, or for “boring” activities like accountancy or infrastructure. The reality is that in many ways the digital was merely a staging post, whilst computing capacity, memory, storage and display resolutions fell in price and rose in power enough to enable a truly “post-digital” environment that could produce new mediated experiences. That is, it appears that the digital was “complementary”, but the post-digital is zero-sum. Here is my attempt to sum up some of the moments that I think might serve as a provocation to debate the post-digital.


DIGITAL → POST-DIGITAL

Non-zero sum → Zero-sum
Objects → Streams
Files → Clouds
Programs → Apps
SQL databases → NoSQL storage
HTML → node.js/APIs
Disciplinary → Control
Administration → Logistics
Connect → Always-on
Copy/Paste → Intermediate
Digital → Computal
Hybrid → Unified
Interface → Surface
BitTorrent → Scraping
Participation → Sharing/Making
Metadata → Metacontent
Web 2.0 → Stacks
Medium → Platform
Games → World
Software agents → Compactants
Experience → Engagement
Syndication → Push notification
GPS → Beacons (IoTs)
Art → Aesthetics
Privacy → Personal Cloud
Plaintext → Cryptography
Responsive → Anticipatory
Tracing → Tracking
Surfing → Reading

figure 1: Digital to Post-Digital Shifts 
This table offers constellations or moments within a “digital” as opposed to a “post-digital” ecology, as it were, and, of course, a provocation to thought. But they can also be thought of as ideal types that provide some conceptual stability for thinking in an environment of accelerating technical change and of dramatic and unpredictable social tensions in response to it. The question then becomes: to what extent can the post-digital counteract the tendencies towards the domination of specific modes of thought in relation to instrumentality, particularly as manifested in computational devices and systems? For example, the contrast between the moments represented by Web 2.0 / Stacks provides an opportunity for thinking about how new platforms have been built on the older Web 2.0 systems, in some cases replacing them, and in others opening up new possibilities, which Tiziana Terranova (2014) has pointed to in her intriguing notion of “Red Stacks”, for example (in contrast to Bruce Sterling’s notion of “The Stacks”, e.g. Google, Facebook, etc.). Here I have been thinking of the digital as representing a form of “weak computation/computationality”, versus the post-digital as “strong computation/computationality”. What would the consequences be for a society that increasingly finds the weak computational forms (CDs, DVDs, laptops, desktops, blogs, RSS, Android Open Source Platform [AOSP], open platforms and systems, etc.) replaced by stronger, encrypted and/or locked-in versions (FairPlay DRM, Advanced Access Content System [AACS], iPads, Twitter, push notification, Google Mobile Services [GMS], trackers, sensors, ANTICRISIS GIRL, etc.)?

These are not meant to be thought of only in a technical register; rather, the notion of “weak computation” points towards a “weak computational sociality” and “strong computation” towards a “strong computational sociality”, highlighting the deeper penetration of computational forms into everyday life through social media and push notification, for example. Even as the post-digital opens up new possibilities for contestation – e.g. megaleaks, data journalism, hacks, cryptography, dark nets, torrents, piratization, sub rosa sharing networks such as the Alexandria Project – and new opportunities for creating, sharing and reading knowledges, the “strong computation” of the post-digital always already suggests the shadow of computation reflected in the heightened tracking, surveillance and monitoring of a control society. The post-digital points towards a reconfiguration of publishing away from the (barely) digital techniques of the older book publishing industry and towards the post-digital singularity of Amazonized publishing, with its accelerated, instrumentalised forms of softwarized logistics, whilst simultaneously supporting new forms of post-digital craft production of books and journals and providing globalised distribution. How then can we think about these contradictions in the unfolding of the post-digital and its tendencies towards what I am calling here “strong computation”? And in what way, even counter-intuitively, does the digital (weak computation) offer alternatives, even as marginal critical practice, and the post-digital (strong computation) create new critical practices (e.g. critical engineering), against the increasing interconnection, intermediation and seamless functioning of the post-digital as pure instrumentality, horizon and/or imaginary?



Bibliography

Berry, D. M. (2014a) The Post-Digital, Stunlaw, accessed 14/1/2014, http://stunlaw.blogspot.co.uk/2014/01/the-post-digital.html

Berry, D. M. (2014b) Critical Theory and the Digital, New York: Bloomsbury.

Berry, D. M. (2014c) On Compute, Stunlaw, accessed 14/1/2014,  http://stunlaw.blogspot.co.uk/2014/01/on-compute.html

Terranova, T. (2014) Red stack attack! Algorithms, capital and the automation of the common, EuroNomade, accessed 20/2/2014,  http://www.euronomade.info/?p=1708


Questions from a Worker Who Codes

In relation to the post-digital, it is interesting to ask to what extent the computational is both the horizon of, and the gatekeeper to, culture today (Berry 2014a). If code operates as the totalising mediator of culture, if not the condition for such culture, then access to both culture and code should become social, political and aesthetic questions. This is partially bound up with questions of literacy and the scope of such knowledges, usually framed within the context of computational competence in a particular programming language. This question returns again and again in relation to the perceived educative level of a population required to partake of the commonality shared within a newly post-digital culture – should one code? In other words, to what extent must a citizen be able to read and interact with the inscriptions that are common to a society? Indeed, in the register of art, for example, Brecht considered the question itself to be superfluous, in as much as providing an opportunity of access, and therefore praxis, opens the possibility of such experiences and understanding. He writes,

one need not be afraid to produce daring, unusual things for the proletariat so long as they deal with its real situation. There will always be people of culture, connoisseurs of art, who will interject: “Ordinary people do not understand that.” But the people will push these persons impatiently aside and come to a direct understanding with artists (Brecht 2007: 84).

In relation to the practices of code itself, it is, of course, not a panacea for all the ills of society. It is, however, a competence that increasingly marks itself out as a practice which creates opportunities to interact with and guide one’s life, in relation to being able to operate, and define, how the computational functions in relation to individuation processes (see Stiegler 2013, Cowen 2013, Economist 2014). Not only that: as the epistemic function of code grows in relation to the transformation of previous media forms into a digital substrate, and the associated softwarization of the process, culture is itself transformed and the possibilities for using and accessing that culture change too. Indeed, Bifo argues that without such competences, “the word is drawn into this process of automation, so we find it frozen and abstract in the disempathetic life of a society that has become incapable of solidarity and autonomy” (Berardi 2012: 17). For Berardi, cognitive labour would then have become disempowered and subjected to what he calls “precarization” (Berardi 2012: 141). In response he calls for an “insurrection”, in as much as “events” can generate the “activation of solidarity, complicity, and independent collaboration between cognitarians”, that is, “between programmers, hardware technicians, journalists, and artists who all take part in an informational process” (Berardi 2012: 142-3).

The aim of this literacy, if we can call it that, in relation to the computational – similar to what I have called iteracy elsewhere (Berry 2014b) – is also connected to notions of reflexivity, critique and emancipation in relation to the mechanisation not only of labour, but also of culture and intellectual activities more generally. Understanding the machine, as it were, creates the opportunity to change it, and gives citizens the capacity to imagine that things might be other than they are.

This is important to avoid a situation whereby the proletarianisation of labour is followed by the capacity of machines to proletarianise intellectual thought itself. That is, that machines define the boundaries of how, as a human being, one must conduct oneself, as revealed by a worker at a factory in France in the 1960s, who commented that “to eat, in principle, one must be hungry. However, when we eat, it’s not because we’re hungry, it’s because the electronic brain thought that we should eat because of a gap in production” (Stark 2012: 125). Delegation into the machine of the processes of material and intellectual production abstracts the world into a symbolic representation within the processes of machine code. It is a language of disconnection, a language that disables the worker, but simultaneously disables the programmer, or cognitive worker, who no longer sees another human being, but rather an abstract harmony of interacting objects within a computational space – that is, through the application of compute (Berry 2014c). This is, of course, a moment of reification, and as such code and software act as an ideological screen for the activities of capitalism, and the harsh realities of neoliberal restructuring and efficiencies, the endless work,[1] made possible by such softwarization. Indeed, under capital,

time sheds its qualitative, variable, flowing nature; it freezes into an exactly delimited, quantifiable continuum filled with quantifiable ‘things’ (the reified, mechanically objectified ‘performance’ of the worker, wholly separated from his total human personality): in short, it becomes space. In this environment where time is transformed into abstract, exactly measurable, physical space, an environment at once the cause and effect of the scientifically and mechanically fragmented and specialised production of the object of labour, the subjects of labour must likewise be rationally fragmented. On the one hand, the objectification of their labour-power into something opposed to their total personality (a process already accomplished with the sale of that labour-power as a commodity) is now made into the permanent ineluctable reality of their daily life. Here, too, the personality can do no more than look on helplessly while its own existence is reduced to an isolated particle and fed into an alien system. On the other hand, the mechanical disintegration of the process of production into its components also destroys those bonds that had bound individuals to a community in the days when production was still ‘organic’. In this respect, too, mechanisation makes of them isolated abstract atoms whose work no longer brings them together directly and organically; it becomes mediated to an increasing extent exclusively by the abstract laws of the mechanism which imprisons them (Lukács 1971: 90).

But of course here, it is not seconds and minutes measured in “the pendulum of the clock [that] has become as accurate a measure of the relative activity of two workers as it is of the speed of two locomotives”, but rather the microsecond and millisecond time of code, combined with new forms of sensors and distributed computational devices that measure time. Indeed, “time is everything, [humans are] nothing; they are at the most the incarnation of time. Quality no longer matters. Quantity alone decides everything: hour for hour, day for day” (Marx 1976: 125). For it is in the spaces of such quantification that lies the obfuscation of the realities of production, but also the possibility for changing production to a more democratic and humane system that makes, as Stiegler claims, “a life worth living” (Stiegler 2009).[2]

Notes

[1] It is interesting to think about the computational imaginary in relation to the notion of “work” that this entails, or that is coded/delegated into the machine algorithms of our post-digital age. Campagna (2013) has an interesting formulation of this, in a book Newman (2013) has called “nothing less than a new updated Ego and Its Own for our contemporary neoliberal age” (Newman 2013: 93). Indeed, Campagna writes, “westerners had to find a way of adapting this mystical exercise to the structures of contemporary capitalism. What would a mantra look like, in the heart of a global metropolis of the 21st Century? What other act might be able to host its obsessive spirit, whilst functioning like a round, magic shield, covering the frightened believers from their fear of freedom? There was only one possible, almost perfect candidate. The activity of repetition par excellence: Work. The endless chain of gestures and movements that had built the pyramids and dug the mass graves of the past. The seal of a new alliance with all that is divine, which would be able to bind once again the whole of humanity to a new and eternal submission. The act of submission to submission itself. Work. The new, true faith of the future” (Campagna 2013: 10). Here, though, I argue that it is not immaterial apparitions and spectres which are haunting humanity, and from which the Egoist can break free, but the digital materiality of computers’ abstractions, formed of algorithms and code, which are a condition of possibility for individuation and subjectivity itself within cognitive capitalism.
[2] As Stark writes,  “for a worker to claim the right to create—to theoretically “unalienated” labor—was a gesture as threatening to the factory bosses as it was to the official organs of the left, with their vision of the worker acceding to a state of being-in-oneself through work. Regarding this form of sociological indeterminacy, Rancière argues that “perhaps the truly dangerous classes are . . . the migrants who move at the border between classes, individuals and groups who develop capabilities within themselves which are useless for the improvement of their material lives and which in fact are liable to make them despise material concerns.” Further, for Rancière, “Working- class emancipation was not the affirmation of values specific to the world of labor. It was a rupture in the order of things that founded these ‘values,’ a rupture in the traditional division [partage] assigning the privilege of thought to some and the tasks of production to others.” Binetruy affirms this rupture, recalling that while initially wary of “these Parisians who came stuffed with film and cameras,” he quickly realized that “they did not come to teach us any lessons, but rather to transmit technical training that would liberate our spirits through our eyes. Once you have put your eyes behind a camera, you are no longer the same man, your perspective has changed.”” (Stark 2012: 150).

Bibliography

Berardi, F. (2012) The Uprising: On Poetry and Finance, London: Semiotext(e).

Berry, D. M. (2014a) The Post-Digital, Stunlaw, accessed 14/1/2014, http://stunlaw.blogspot.co.uk/2014/01/the-post-digital.html

Berry, D. M. (2014b) Critical Theory and the Digital, New York: Bloomsbury.

Berry, D. M. (2014c) On Compute, Stunlaw, accessed 14/1/2014,  http://stunlaw.blogspot.co.uk/2014/01/on-compute.html

Brecht, B. (2007) Popularity and Realism, in Aesthetics and Politics, London: Verso Press.

Campagna, F. (2013) The Last Night: Anti-Work, Atheism, Adventure, London: Zero Books.

Cowen, T. (2013) Average Is Over: Powering America Beyond the Age of the Great Stagnation, London: Dutton Books.

Economist (2014) Coming to an office near you, The Economist, accessed 16/01/2014, http://www.economist.com/news/leaders/21594298-effect-todays-technology-tomorrows-jobs-will-be-immenseand-no-country-ready

Lukács, G. (1971) History and Class Consciousness: Studies in Marxist Dialectics, MIT Press.

Marx, K. (1976) The Poverty of Philosophy, in Karl Marx and Frederick Engels, Collected Works, Volume 6, 1845–1848, London: Lawrence & Wishart.

Newman, S. (2013) Afterword, In Campagna, F. (2013) The Last Night: Anti-Work, Atheism, Adventure, London: Zero Books, pp. 92-5.

Stark, T. (2012) “Cinema in the Hands of the People”: Chris Marker, the Medvedkin Group, and the Potential of Militant Film, OCTOBER, 139, Winter 2012, pp. 117–150.

Stiegler, B. (2009) What Makes Life Worth Living: On Pharmacology, Cambridge: Polity Press

On Compute

Today, the condition of possibility for the milieu of contemporary life is compute. That is, compute as the abstract unit of computation, both as dunamis (potentiality) and energeia (actuality) – as the condition of possibility for the question of the in-itself and the for-itself. Compute as a concept exists in two senses: as the potential contained in a computational system, or infrastructure, and in the actuation of that potential in actual work, as such. Whilst always already a theoretical limit, compute is also the material that may be brought to bear on a particular computational problem – and now many problems are indeed computational problems. The theoretical question posed by compute is thus directly relevant to the study of software, algorithms and code, and therefore to the contemporary condition in computal society, because it represents the moment of potential in the transformation of inert materials into working systems. It is literally the computational unit of “energy” that is supplied to power the algorithms of the world’s systems. Compute, then, is a notion of abstract computation, but it is also the condition of possibility for, and the potential actuation of, that reserve power of computation in a particular task. Compute becomes a key noetic means of thinking through the distribution of computation in the technological imaginary of computal society.

In a highly distributed computational environment, such as we live in today, compute is itself distributed around society, carried in pockets, accessible through networks and wireless connections, and pooled in huge computational clouds. Compute, then, is not only abstract but lived and enacted in everyday life; it is part of the texture of life, not just as a layer upon life but as a structural possibility for, and mediation of, such living. But crucially, compute is also an invisible factor in society, partially due to the obfuscation of the technical conditions of the production of compute, but also due to the necessity for an interface, a surface, with which to interact with compute. Compute as a milieu is thus never seen as such, even as it surrounds us and is constantly interacting with and framing our experiences. Indeed, Stiegler (2009) writes that,

Studying the senses, Aristotle underlines in effect that one does not see that, in the case of touching, it is the body that forms the milieu, whereas, for example, in the case of sight, the milieu is what he calls the diaphane. And he specifies that this milieu, because it is that which is most close, is that which is structurally forgotten, just as water is for a fish. The milieu is forgotten, because it effaces itself before that to which it gives place. There is always already a milieu, but this fact escapes us in the same way that “aquatic animals,” as Aristotle says, “do not notice that one wet body touches another wet body” (423ab): water is what the fish always sees; it is what it never sees. Or, as Plato too says in the Timaeus, if the world was made of gold, gold would be the sole being that would never be seen – it would not be a being, but the inapparent being of that being, appearing only in the occurrence of being, by default (Stiegler 2009: 13-14)

In this sense, compute is the structural condition of possibility that makes the milieu possible by giving it place, in as much as it creates those frameworks within which technicity takes place. The question of compute, both as a theoretical concept and as a technical definition, is therefore crucial for thinking through the challenge of computation more broadly. But, in a rapidly moving world of growing computational power, comparative analysis of computational change is made difficult without a metric by which to compare different moments historically. This is made much harder by the reality that compute is not simply the speed and bandwidth of a processor as such, but includes a number of other related technical considerations, such as the speed of the underlying motherboard, RAM, graphics processor(s), storage system and so forth.

Compute is therefore a relative concept and needs to be thought about in relation to previous iterations, and this is where benchmarking has become an important part of the assessment of compute – for example SPECint, a computer benchmark specification for a processor’s integer processing power maintained by the Standard Performance Evaluation Corporation (SPEC 2014). Another, called GeekBench (2013), scores compute against a baseline score of 2500, which is the score of an Intel Core i5-2520M @ 2.50 GHz. In contrast, SYSmark 2007, another benchmark, attempts to bring “real world” applications into the processing measurement by including a number of ideal systems that run canned processing tasks (SYSmark 2007). As can be seen, comparing compute becomes a spectrum of benchmarks that test a variety of working definitions of forms of processing capacity. It is also unsurprising that, as a result, many manufacturers create custom modes within their hardware to “game” these benchmarks, unfortunately obfuscating these definitions and comparators. For example,

Samsung created a white list for Exynos 5-based Galaxy S4 phones which allow some of the most popular benchmarking apps to shift into a high-performance mode not available to most applications. These apps run the GPU at 532MHz, while other apps cannot exceed 480MHz. This cheat was confirmed by AnandTech, who is the most respected name in both PC and mobile benchmarking. Samsung claims “the maximum GPU frequency is lowered to 480MHz for certain gaming apps that may cause an overload, when they are used for a prolonged period of time in full-screen mode,” but it doesn’t make sense that S Browser, Gallery, Camera and the Video Player apps can all run with the GPU wide open, but that all games are forced to run at a much lower speed (Schwartz 2013).
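Benchmark gaming aside, the underlying idea – compute as a relative measure expressed against a reference machine – can be made concrete in a few lines of Python. The machine names and scores below are invented for illustration; only the baseline of 2500 (GeekBench’s Intel Core i5-2520M @ 2.50 GHz) is taken from the discussion above.

# Sketch: normalising benchmark scores against a reference baseline so
# that "compute" can be compared across machines and across time.

BASELINE = 2500  # GeekBench baseline: Intel Core i5-2520M @ 2.50 GHz

machines = {                 # hypothetical benchmark scores
    "2011 laptop": 2400,
    "2013 desktop": 6800,
    "2014 workstation": 13500,
}

def relative_compute(score, baseline=BASELINE):
    """Express a benchmark score as a multiple of the baseline machine."""
    return score / baseline

for name, score in machines.items():
    print(f"{name}: {relative_compute(score):.2f}x baseline compute")

The normalisation itself is trivial; the point is that any such comparison presupposes a stable reference quantum of compute, which the benchmark wars above constantly destabilise.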

On a material register, the unit of compute can be thought of as roughly the maximum potential processing capacity of a computer processing chip running for a notional hour. In today’s softwarized landscape, of course, processing power itself becomes a service, and hence is more often framed in terms of virtual machines (VMs) rather than actual physical machines – a number of compute instances can be realised on a single physical processor using sophisticated software to manage the illusion. Amazon itself defines compute through an abstraction of actual processing, as follows,

Transitioning to a utility computing model fundamentally changes how developers have been trained to think about CPU resources. Instead of purchasing or leasing a particular processor to use for several months or years, you are renting capacity by the hour. Because Amazon EC2 is built on commodity hardware, over time there may be several different types of physical hardware underlying EC2 instances. Our goal is to provide a consistent amount of CPU capacity no matter what the actual underlying hardware (Amazon 2013).

Indeed, Amazon tends to discuss compute in relation to its unit of the EC2 Compute Unit (ECU) to enable this discretisation.[1] Google also uses an abstract quantity and measures “minute-level increments” of computational time (Google 2013). The key is to begin thinking about how an instance provides a predictable amount of dedicated compute capacity, and as such is a temporal measure of computational power, albeit one seemingly defined rather loosely in the technical documentation.
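As a toy illustration of compute as a discretised, temporal quantity, the short Python sketch below totals a month of usage in “compute-unit hours”. The instance names, per-instance ratings and hours are invented assumptions for illustration, not Amazon’s published ECU figures.

# Sketch: compute as a discretised, temporal quantity. The instance
# types, per-instance compute ratings and usage hours below are
# invented assumptions, not authoritative Amazon ECU figures.

ratings = {           # assumed compute units per instance type
    "small": 1,
    "large": 4,
    "xlarge": 8,
}

usage_hours = {       # hypothetical usage over a month
    "small": 720,     # one instance running the whole month
    "large": 80,
    "xlarge": 12,
}

total = sum(ratings[t] * h for t, h in usage_hours.items())
print(f"Total compute consumed: {total} compute-unit hours")  # 1136

The point of the abstraction is visible here: the total is indifferent to which physical processors actually did the work, exactly as Amazon’s definition above intends.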

The question of compute is then a question of the origin of computation more generally, but also of how the infrastructure of computation can be understood both qualitatively and quantitatively. Indeed, it is clear that the quantitative changes that greater compute capacity introduces make possible the qualitative experience of computation that we increasingly take for granted in our use of a heavily software-textured world. To talk about software, processes, algorithms and code is then deficient without a corresponding understanding of the capacity of compute in relation to them – a key question for thinking about the conditions of possibility that computation creates for our lives today.

Notes

[1] Amazon used to define the ECU directly, stating: “We use several benchmarks and tests to manage the consistency and predictability of the performance of an EC2 Compute Unit. One EC2 Compute Unit provides the equivalent CPU capacity of a 1.0-1.2 GHz 2007 Opteron or 2007 Xeon processor. This is also the equivalent to an early-2006 1.7 GHz Xeon processor referenced in our original documentation” (Berninger 2010). They appear to have stopped using this description in their documentation (see Amazon 2013). 

Bibliography

Amazon (2013) Amazon EC2 FAQs, accessed 05/01/2014, http://aws.amazon.com/ec2/faqs/#What_is_an_EC2_Compute_Unit_and_why_did_you_introduce_it

Berninger, D. (2010) What the heck is an ECU?,  accessed 05/01/2014, http://cloudpricecalculator.com/blog/hello-world/

GeekBench (2013) GeekBench Processor Benchmarks, accessed 05/01/2014, http://browser.primatelabs.com/processor-benchmarks

Google (2013) Compute Engine — Google Cloud Platform, accessed 05/01/2014, https://cloud.google.com/products/compute-engine/

Schwartz, R. (2013) The Dirty Little Secret About Mobile Benchmarks,  accessed 05/01/2014, http://mostly-tech.com/tag/geekbench/

SPEC (2014) The Standard Performance Evaluation Corporation (SPEC), accessed 05/01/2014, http://www.spec.org

Stiegler, B. (2009) Acting Out, Stanford University Press.

SYSmark (2007) SYSmark 2007 Preview, accessed 05/01/2014, http://bapco.com/products/sysmark-2007#details-product-info

The Post-Digital

Gustave Courbet, The Painter’s Studio: A Real Allegory (1855)

As we increasingly find that the world of computational abundance is normalised, the application of cheap digital technologies to manage or partially augment traditionally analogue experiences, technologies and practices will doubtless grow.[1] That is, the power of “compute” is growing both in breadth and depth as it permeates society and culture (see Davies 2013; Berry 2014a). All around us we are increasingly surrounded by new fields and flows of computation that co-construct and stabilise a new artifice for the human sensorium – streams, clouds, sensors and infrastructures. Not unlike previous moments in which mediums become part of everyday life, this new field is noticeable for its ability to modulate and transform itself through the use of algorithms and code: not just a general plasticity, but a flexible structure that adapts to context and environment, tailored to the individual, or perhaps better, dividual, of the computational age. This new field of computation is not necessarily top-down and corporate controlled either. Thus we see, at a bottom-up level, the emergence of a market in cheap digital processors that enable the implementation of innovative new forms of culture and cultural experimentation. We might think of these moments as part of the constellation I am calling the “post-digital” (see also Berry 2013a; Cramer 2013; Cox 2013; Philipsen 2013; Sable 2012).

Museu de Arte de São Paulo (MASP), 1968.
Designed by Lina Bo Bardi

Thus, the historical distinction between the digital and the non-digital becomes increasingly blurred, to the extent that to talk about the digital presupposes a disjuncture in our experience that makes less and less sense. Computation becomes spatial in its implementation, embedded within the environment and part of the texture of life itself, which can be walked around, touched, manipulated and interacted with in a number of ways and means – life becomes mediated in and through the computal (Berry 2014b). Indeed, in a similar way in which the distinction between “being online” or “being offline” has become anachronistic, with our always-on smart phones and tablets and widespread wireless networking technologies, so too, perhaps, the term “digital” describes a world of the past.

Which is not to say that time is not an important aspect of computation in this post-digital world. The compressive effects of computation, and the flattening metaphors and visual language of computation, tend towards an encounter, maximised perhaps by its tendency toward spatiality, that transforms time from a diachronic to a synchronic experience. Indeed, history itself may be re-presented through the screen through a number of computational functions and methods that make it seem geometric, flat and simultaneous. A sense of history is then a sense of real-time flows, not so much distant and elusive, whether as cultural or individual memory, but here and now, spectacular and vividly represented and re-presented. Time in this sense is the time of technical time, and the history attendant to it is technical history, presented through databases, code and algorithms.

Thus within a time of computational abundance we might think in relation to the question of the “post-digital”, in as much as we are rapidly entering a moment when the difficulty will be found in encountering culture outside of digital media. Or perhaps the non-digital will largely be the preserve of the elite (by choice, education and wealth) or the very poor (by necessity).  The detritus of society will be cast into the non-digital and the fading and ephemeral will be preserved within computational databanks only, if it is preserved at all. Indeed, even the non-digital becomes bound up in the preservation possibilities offered by the digital,

Non-digital media technologies… become post-digital when they are not simply nostalgically revived, but functionally repurposed in (often critical) relation to digital media technologies: zines that become anti- or non-blogs, vinyl as anti-CD, cassette tapes as anti-mp3, analog film as anti-video (Cramer 2013).

Computal Surfaces: main stage for the
Republican convention in Tampa, Fla (2012)

In a post-digital age, whether something is digital or not will no longer be seen as the essential question. Or rather, the question as to whether something is or is not “digital” will be increasingly meaningless as all forms of media become themselves mediated, produced, accessed, distributed or consumed through digital devices and technologies. This is to move away from a comparative notion of the digital, contrasted with other material forms such as paper, celluloid or photopaper, and instead to begin to think about how the digital is modulated within various materialities. It is also the point at which the contrast between “digital” and “analogue” no longer makes sense. This spectrum of the digital, a distribution across an axis of more or less computal, gives rise to the expectation of the always already computational of everyday life.

Muffwiggler, Modular Synth Meetup,
University of Sussex (2013).

Thus, the post-digital is represented by and indicative of a moment when the computational has become both hegemonic and post-screenic (see Bosma 2013; Ludovico 2013). As Cramer argues, “the distinction between ‘old’ and ‘new’ media collapses in theory as well as in practice. As Kenneth Goldsmith observes, his students ‘mix oil paint while Photoshopping and scour flea markets’” (Cramer 2013). The “digital” is then understood as a previous historical moment, when computation as digitality was understood in opposition to the analogue – although that is not to say that it will not remain as a marginal notion, with related practices, within post-digitality. Thus, under our contemporary conditions it might be better to think about modulations of the digital, or different intensities of the computational, as a post-digital moment, rather than digital versus analogue as such. We should therefore think critically about the way in which cadences of the computational are made and materialised. In other words, notions of quantitative and qualitative dimensions of “compute” will be increasingly important for thinking about culture, economics, society, politics and everyday life. Tracing power will in many cases be tracing compute, both in terms of the reservoirs of compute managed by gigantic computational Stacks, and in terms of the places where compute is thin and poorly served. By Stacks, I am referring to the corporations that increasingly rely on computational “technology stacks” for profit and power, such as Google, Apple, Facebook, Twitter and Amazon, but also to the technical imaginary formed through the notion of these stacks as a diagram (Berry 2013b).

“Cuddlebot”: low-tech touch/haptic sensing hardware (2013)

Compute as always already part of life might also herald that the moment of the digital as digitalisation is already past, and that new challenges lie ahead for thinking about the way in which the computal saturates our culture, institutions and everyday life in varying degrees of modularity and intensity. This growth in computation has put citizens at an obvious disadvantage in a society that not only has historically tended to disavow the digital as a form of knowledge or practice, but also has not seen computational thinking or skills as part of the educational requirements of a well-informed citizen. For example, the lack of understanding of the importance of encryption and cryptography in digital society was humbly described recently by Glenn Greenwald, who one might have thought to have been better schooled in these technologies (Greenwald 2013). Indeed, as computer power has increased, so has the tendency to emulate older media forms to provide content within simulations of traditional containers, such as “e”-books, through techniques of skeuomorphism and glossy algorithmic interface design – rather than learning and teaching computational practices as such. This, perhaps, has the advantage that new computational forms can be used and accessed without the requisite computational skills to negotiate the new literary machines of computation, such as the underlying logics, structures, processes and code. However, it also means that in many cases today we are unable to read what we write, and are not always the writers of the systems that are built around us (Berry 2011; Oliver, Savičić and Vasiliev 2011; Allen 2013). This illiteracy does not seem to provide the ideal conditions for the emergence of an informed and educated citizenry able to engage with the challenges and dangers of a fully softwarized post-digital society. It also points to the urgent need for a critical and engaged Bildung for the post-digital world, if it is not to become precariously post-democratic.


Notes

[1] This post was inspired by attending “Muffwiggler” at the University of Sussex, Saturday 16 November 2013, organised by Andrew Duff, and funded by the Centre for Digital Material Culture. The event was notionally a homage to analogue synths, but in reality was colonised by digital/analogue hybrid synthesisers and controllers which were properly post-digital in both form and function. More information at http://www.muffwiggler.com and http://www.flickr.com/photos/du_ff/sets/72157632801557258/

Bibliography

Allen, J. (2013) Critical Infrastructure, accessed 31/12/2013, http://post-digital.projects.cavi.dk/?p=356

Berry, D. M. (2011) The Philosophy of Software, London: Palgrave Macmillan.

Berry, D. M. (2013a) Post-Digital Humanities, Stunlaw, accessed 30/12/2013,  http://stunlaw.blogspot.co.uk/2013/10/post-digital-humanities.html

Berry, D. M. (2013b) Digital Breadcrumbs, Stunlaw, accessed 30/12/2013, http://stunlaw.blogspot.co.uk/2013/10/digital-breadcrumbs.html

Berry, D. M. (2014a) On Compute, Stunlaw, accessed 05/01/2014, http://stunlaw.blogspot.co.uk/2014/01/on-compute.html

Berry, D. M. (2014b) Critical Theory and the Digital, New York: Bloomsbury Academic.

Bosma, J. (2013) Post-Digital is Post-Screen – Shaping a New Visuality, accessed 30/12/2013, http://post-digital.projects.cavi.dk/?p=580

Cox, G. (2013) some old problems with post–anything (draft version), accessed 30/12/2013, http://post-digital.projects.cavi.dk/?p=230

Cramer, F. (2013) Post-digital: a term that sucks but is useful (draft 2), accessed 30/12/2013, http://post-digital.projects.cavi.dk/?p=295

Davies, J. (2013) Compute Power with Energy-Efficiency, accessed 30/12/2013, http://developer.amd.com/wordpress/media/2013/06/Compute_Power_with_Energy-Efficiency_Jem_AMD_v1.1.pdf

Greenwald, G. (2013) 30c3 Keynote, Chaos Computer Club, accessed 30/12/2013,  http://media.ccc.de/browse/congress/2013/30C3_-_5622_-_en_-_saal_1_-_201312271930_-_30c3_keynote_-_glenn_greenwald_-_frank.html

Ludovico, A. (2013) Post Digital Publishing, Hybrid and Processual Objects in Print, accessed 30/12/2013, http://post-digital.projects.cavi.dk/?p=323

Oliver, J. Savičić, G. and Vasiliev, D. (2011) Critical Engineering Manifesto, accessed 31/12/2013, http://criticalengineering.org

Philipsen, L. (2013) Do not Return to Sender – Why post-digital aesthetic research should actually distinguish between artist, critics, and audience, accessed 30/12/2013, http://post-digital.projects.cavi.dk/?p=350

Sable, D. (2012) A “Post Digital” World, Really?, Google Think Insights, accessed 30/12/2013, http://www.google.com/think/articles/a-post-digital-world-really.html

Post-Digital Humanities

This post forms part of a panel discussion as a response to a lecture given by Professor Willard McCarty at King’s College London, at the event “The Digital and the Human: Remembering the Future of the Digital Humanities” on 17th October 2013.

Today we live in computational abundance, whereby our everyday lives and the environment that surrounds us are suffused with digital technologies. This is a world of anticipatory technology and contextual computing that uses smart, diffused computational processing to create a fine web of computational resources embedded into the material world. Thus, the historical distinction between the digital and the non-digital becomes increasingly blurred, to the extent that to talk about the digital presupposes a disjuncture in our experience that makes less and less sense. Indeed, in a similar way to that in which “being online” has become anachronistic, with our always-on smart phones and tablets and widespread wireless networking technologies, so too, perhaps, the term “digital” assumes a world of the past.

Post-Digital Philips Transitions (lighting installation, 2007)

Cascone argued as long ago as 2000 that “the revolutionary period of the digital information age has surely passed. The tendrils of digital technology have in some way touched everyone”, coining the term post-digital as a means of thinking about this condition (Cascone 2000: 12). Indeed, the explosion of digital information and data, combined with a contraction in the time available to deal with it, has created a tendency to understand the “digital” as a “spectatorial condition”, whereby “the result is that we filter and graze, skim and forward” (Bishop 2012). In a similar way to that of the art world, whereby “mainstream contemporary art simultaneously disavows and depends on the digital revolution”, today mainstream humanities research equally disavows and depends on the digital – to the extent that the question of the distinction raised by the digital for the humanities is one that the digital humanities has sought to address (McCarty 2005, 2013).

The digital humanities, at its most straightforward, is the application of computational principles, processes and machinery to humanities texts – and here I use “texts” to refer to all materialised cultural forms, such as images, books, articles, sound, film, video, and so on. Indeed, “the digital humanities try to take account of the plasticity of digital forms and the way in which they point towards a new way of working with representation and mediation, what might be called the digital ‘folding’ of memory and archives, whereby one is able to approach culture in a radically new way” (Berry 2012: 2). Much of the early work of the digital humanities was focused on getting traditional humanities materials into a form whereby they could be subject to computational work – through digitalisation projects, new digital archives and databases, and the “marking up” of texts to enable computational analysis. However, the digital humanities have also had to come to terms with new forms of digital collections and archives, such as the web itself and the archives made from it, e.g. the Internet Archive, and indeed with archives and databases that may be made up of data about data, so-called metadata, and computational non-human-readable materials.

Thus, we enter a time of a new illegibility, whereby we might say that we can no longer read what we are writing – we increasingly rely on digital technology both to write and to read for us, as a form of algorithmic inscription. Not only are there new forms of grammatization, but we are entering a phase whereby the grammatization process produces symbols and discrete representational units which become opaque to us, even as they are drawn from us through technical devices that monitor and track us. As Stiegler writes, digital technology engenders,

a process of the grammatization of flows, as a process of discretization – for example, of the flow of speech, or the flow of gestures of the worker’s body – [this] is what makes possible… technical reproducibility and thus… control, and therefore, by the same stroke, the organization of short circuits inside the long circuits constitutive of transindividuation (Stiegler 2009: 40).

This process of transindividuation, through practices such as a humanities education, creates the psychic structures for the possibility of thinking at all. They are constitutive of the development of the “maturity” of the individual and the collective formation and education of intelligence and knowledge. It is through transindividuation, Stiegler argues, that the ability to “think for oneself” is developed, and which he has used to outline what is a “life worth living”, a concern to which the humanities have traditionally been linked (Stiegler 2013). It is here, in its destabilising and deconstructing moment, that Stiegler argues the digital, as presently constructed, undermines the development of attention, memory, concentration and intelligence.

Indeed, the question the digital poses to the humanities is addressed directly at what Lakatos (1980) would have called the “hard-core” of the humanities: the unspoken assumptions and ontological foundations which support the “normal” research that humanities scholars undertake on an everyday basis – for example, the notion of a hermeneutically accessible “text” as a constitutive and foundational concept. Digital humanities has attempted to address these issues with notions of “close” and “distant” reading, particular practices related to dealing with both small and large numbers of texts. However, it remains somewhat ill-equipped to deal with the hermeneutic challenges of computer-generated data which nonetheless contains some sense of internal structure, meaning and, in some instances, narrative, but which is structured in “chains” that are not conducive to human memory and understanding. Indeed, it raises the question of what the research programmes relevant to a post-digital humanities might be – and this is a question of both research and practice, theoretical work and building things, technologically engaged work and critical technical practice.

At the same time, and from a different direction, digital technologies have undermined and reconfigured the very texts that humanities and digital humanities scholars have taken as their research objects, and re-presented them as fragmentary forms, often realigned and interleaved with fragments from other texts. This interdiscursivity and intertextuality of the digital has, of course, been much remarked upon and even used creatively in the writing of new forms of digital and e-literature. However, in a new way, this process has, to follow Stiegler, begun to undermine the “long circuits” of culture, such that we no longer describe a method, such as the production of concordances within digital humanities, but actually a logic of computational media from which no “long chains” are reconstructed from their shorter counterparts. This, Stiegler diagnoses as a serious danger to societies as they deconstruct the very structures of education and learning on which they are built. Indeed he calls for the creation of counter-products that might reintroduce singularity into cultural experience and somehow disconnect desire from the imperatives of consumption.
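The concordance mentioned above – the classic method of humanities computing – can be sketched in a few lines. Below is a minimal keyword-in-context (KWIC) routine in Python; the sample sentence is invented for illustration.

# Minimal keyword-in-context (KWIC) concordance sketch: each occurrence
# of a keyword is shown with a window of surrounding words.

def kwic(text, keyword, width=3):
    """Return each occurrence of keyword with `width` words of context."""
    words = text.lower().split()
    lines = []
    for i, word in enumerate(words):
        if word.strip(".,;:") == keyword:
            left = " ".join(words[max(0, i - width):i])
            right = " ".join(words[i + 1:i + 1 + width])
            lines.append(f"{left:>30} | {keyword} | {right}")
    return lines

sample = ("The digital is modulated within various materialities, "
          "and the digital presupposes a disjuncture in our experience.")
for line in kwic(sample, "digital"):
    print(line)

Such short chains of mechanical lookup are precisely what the passage above describes: the method survives, but as one small operation within a much larger logic of computational media.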

In which case, in the particular constellation of concepts and ideas represented by our increasingly computational societies, which Weiner described as a “social experiment” over twenty years ago (Weiner 1994: xv), should we ask about a post-digital humanities that is adequate to begin to address this problem?

In other words, in a time of computational abundance might it not be better to raise the question of the “post-digital”, in as much as we are rapidly entering a moment when the difficulty will be found in encountering culture outside of digital media? Indeed, as Cramer argues, “in a post-digital age, the question of whether or not something is digital is no longer really important – just as the ubiquity of print, soon after Gutenberg, rendered obsolete all debates (besides historical ones) about the ‘print revolution’” (Cramer 2012: 162). This is to move away from a comparative notion of the digital, contrasted with other material forms such as paper, celluloid or photopaper, and instead begin to think about how the digital is modulated within various materialities.

Thus, the post-digital is represented by and indicative of a moment when the computational has become hegemonic. The digital is then understood as an historic moment defined in opposition to the analogue. We might no longer talk about digital versus analogue, but instead modulations of the digital or different intensities of the computational. We should therefore critically analyse the way in which cadences of the computational are made and materialised.

The post-digital humanities would then be attentive to our massively computational world and culture, but nonetheless attend to the ways in which culture is materialised and fixed in forms specific to digital material culture. That is, to how culture is inscribed not just in moments of culture created by human actors, but also in the technical devices, recording systems, trackers, web bugs and beacons of the digital age, together with the giant databases they fill with narratives and documentaries about the great and the mundane, the event and the everyday. Attentive, that is, to the way in which human culture, writ large, is also written digitally, in an open-ended arrangement of diverse practices and parts. A digital humanities with cultural critique, as called for by Liu (2012), is a necessary precondition for asking the kinds of questions that the post-digital raises in relation to power, domination, myth and exploitation, but also in relation to the historical, social, political and cultural contexts that cadences of the digital make possible today.

Bibliography

Berry, D. M. (2012) Understanding Digital Humanities, London: Palgrave.

Bishop, C. (2012) Digital Divide, Artforum, September 2012.

Cascone, K. (2000) The Aesthetics of Failure: “Post-Digital” Tendencies in Contemporary Computer Music, in Computer Music Journal, 24:4, pp. 12-18.

Cramer, F. (2012) Afterword, in Ludovico, A. Post-Digital Publishing, Onomatopee 77: Cabinet Project.

Lakatos, I. (1980) Methodology of Scientific Research Programmes, Cambridge: Cambridge University Press.

Liu, A. (2012) Where Is Cultural Criticism in the Digital Humanities?, Gold, M. K. (ed.) Debates in the Digital Humanities, accessed 11/1/2013, http://dhdebates.gc.cuny.edu/debates/text/20

Ludovico, A. (2012) Post-Digital Publishing, Onomatopee 77: Cabinet Project.

McCarty, W. (2005) Humanities Computing, London: Palgrave.

McCarty, W. (2013) Getting there from here: Remembering the future of digital humanities, 2013 Roberto Busa Award lecture, DH2013, University of Nebraska (Lincoln).

Stiegler, B. (2009) Teleologics of the Snail: The Errant Self Wired to a WiMax Network, Theory Culture Society, 26, pp. 33-45.

Stiegler, B. (2013) What Makes Life Worth Living: On Pharmacology, Cambridge: Polity.

Weiner, L. R. (1994) Digital Woes. New York: Addison Wesley.

Digital Breadcrumbs

In April 2013, the world population was 7,057,065,162 (Hunt 2013). This is a population that increasingly accesses and uses communications and digital media, and creates huge quantities of real-time and archived data, although it remains divided in its access to digital technology (Berry 2011). We often talk about the vast increase in data creation and transmission, but it is sometimes difficult to find recent and useful quantitative measures of the current contours of digital media. Indeed, the internet as we tend to think of it has become increasingly colonised by massive corporate technology stacks. These companies – Google, Apple, Facebook, Amazon, Microsoft – are called collectively “the Stacks” (Berry 2013). Helpfully, the CIA’s chief technology officer, Ira Hunt (2013), has listed the general data numbers for the “stacks” and given some useful comparative numbers in relation to telecoms and SMS messaging (see figure 1).


Data Provider → Quantitative Measures

Google (2009 stats from SEC filing) → More than 100 petabytes of data; one trillion indexed URLs; three million servers; 7.2 billion page-views per day.

Facebook (August 2012) → More than one billion users; 300 petabytes of data, growing by more than 500 terabytes per day; holds 35% of the world’s photographs.

YouTube (2013) → More than 1,000 petabytes of data (1 exabyte); more than 72 hours of video uploaded per minute (37 million hours per year); 4 billion views per day.

Twitter (2013) → More than 124 billion tweets per year; 390 million tweets per day (~4,500 tweets per second).

Global text messaging (2013) → More than 6.1 trillion text messages per year; 193,000 messages sent per second (876 per person per year).

US cell calls (2013) → More than 2.2 trillion minutes per year; 19 minutes per person per day; a year of uncompressed telephone data is smaller than a year of YouTube data.

figure 1: Growth in Data Collections and Archives (adapted from Hunt 2013)
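The derived rates in figure 1 can be checked with a line or two of arithmetic; a quick Python verification of two of the figures above:

# Quick arithmetic check of the derived rates in figure 1.
SECONDS_PER_YEAR = 365 * 24 * 3600          # 31,536,000

sms_per_year = 6.1e12
print(f"SMS per second: {sms_per_year / SECONDS_PER_YEAR:,.0f}")  # ~193,000

tweets_per_day = 390e6
print(f"Tweets per second: {tweets_per_day / 86400:,.0f}")        # ~4,500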

The CIA have a particular interest in big data and the growth in the “digital breadcrumbs” left by digital devices. Indeed, they are tasked with the security of the United States and have always had an interest in data collection and analysis, but it is fascinating to see how, increasingly, the value of data comes to shape the collection of SIGINT, which is now digital and subject to computational analysis. Hunt argued,

“The value of any piece of information is only known when you can connect it with something else that arrives at a future point in time… Since you can’t connect dots you don’t have, it drives us into a mode of, we fundamentally try to collect everything and hang on to it forever” (Sledge 2013)

It is also interesting to note the implicit computationality that shapes and frames the way in which intelligence is expected to develop due to the trends in data and information growth. Nevertheless, these desires shape not just the CIA or other security services, but any organisation that is interested in using archival and real-time data to undertake analysis and prediction based on data – which is increasingly all organisations in a computational age.

Information has time value and can soon lose its potency. This drives the growth not just of big data, but of real-time analysis – particularly where real-time and archival data or databases can be compared and processed in real-time. Real-time is currently a huge challenge for computational systems, pushing at the limits of computal systems and data-analytic tools. Unsurprisingly, new levels of expertise are called for, usually grouped under the notion of “data science”, a thoroughly interdisciplinary approach sometimes understood as the movement from “search” to “correlation”. Indeed, as Hunt argues,

“It is really very nearly within our grasp to be able to compute on all human generated information,” Hunt said. After that mark is reached, Hunt said, the [CIA] agency would also like to be able to save and analyze all of the digital breadcrumbs people don’t even know they are creating (Sledge 2013).

In a technical sense, the desire in these “really big data” applications is the move from what is called “batch map/reduce”, as represented by Hadoop and related computational systems, to “real-time map/reduce”, whereby real-time analytics are made possible – represented currently by technologies like Google’s Dremel (Melnik et al 2010), Caffeine (Higgenbotham 2010), Impala (Brust 2012), Apache Drill (Vaughan-Nichols 2013), Spanner (Iqbal 2013), etc. This is the use of real-time stream processing combined with complex analytics and the ability to manage large historical data sets. The challenges for the hardware are considerable, requiring peta-scale RAM architectures so that the data can be held in memory, but also the construction of huge distributed memory systems enabling in-memory analytics (Hunt 2013).
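The difference between the two modes can be shown in miniature. The Python sketch below contrasts a batch computation, which recomputes over the whole archive, with a streaming computation, which folds each record into a running result as it arrives; it is a toy stand-in for illustration, not the Hadoop or Dremel APIs themselves.

# Toy contrast between "batch map/reduce" and real-time/streaming
# processing. The records are invented for illustration.

from collections import Counter

archive = ["alpha beta", "beta gamma", "alpha alpha"]

def batch_count(records):
    """Batch: map over the complete dataset, then reduce."""
    mapped = (word for record in records for word in record.split())
    return Counter(mapped)

running = Counter()

def on_arrival(record):
    """Streaming: fold each record into a running result on arrival."""
    running.update(record.split())

print(batch_count(archive))   # available only after the batch completes
for record in archive:
    on_arrival(record)        # `running` is queryable at any moment
print(running)                # same totals, but computed incrementally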


Traditional Computer Processing → Real-Time Analytics/Big Data

Data on storage area network (SAN) → Data at processor
Move data to question → Move question to data
Backup → Replication management
Vertical scaling → Horizontal scaling
Capacity after demand → Capacity ahead of demand
Disaster recovery → Continuity of Operations Plan (COOP)
Size to peak load → Dynamic/elastic provisioning
Tape → Storage area network (SAN)
Storage area network (SAN) → Disk
Disk → Solid-state disk
RAM limited → Peta-scale RAM

figure 2: Tectonic Technology Shifts (adapted from Hunt 2013)

These institutional demands are driving the development of new computing architectures with principles associated with them, such as: data close to compute, power at the edge, optical computing/optical bus, the end of the motherboard, and the use of shared pools of everything – new softwarized hardware systems that allow compute, storage, networking, and even the entire data centre, to be subject to software control and management (Hunt 2013). This is the final realisation of the importance of the network, and it shows the limitations of current network technologies, which become one of the constraints on future softwarized system growth.

This continues the move towards context as the key technical imaginary shaping the new real-time streaming digital environment (see Berry 2012), with principles such as “schema on read”, which enables the data returned to be shaped in relation to the context of the question asked; “user-assembled analytics”, which requires answers to be given for a set of research questions; and elastic computing, which enables computing power to be utilised in proportion to a query or processing demand in real-time, similar to the way electricity is drawn in greater quantities from the mains as it is required.
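A minimal sketch of what “schema on read” means in practice: raw, heterogeneous records are stored as they arrive, and a schema – here just a set of required fields – is imposed only at query time, shaped by the question being asked. The field names below are invented for illustration.

# Sketch of "schema on read": the schema is applied while reading,
# not while writing. Field names are invented for illustration.

import json

raw_store = [
    '{"ts": 1381750000, "lat": 50.86, "lon": -0.08, "device": "phone"}',
    '{"ts": 1381750060, "msg": "hello", "device": "phone"}',
]

def query(raw_records, wanted_fields):
    """Yield only the records that satisfy the schema of this question."""
    for raw in raw_records:
        record = json.loads(raw)
        if all(field in record for field in wanted_fields):
            yield {field: record[field] for field in wanted_fields}

# Different questions impose different schemas on the same raw data.
print(list(query(raw_store, ["ts", "lat", "lon"])))  # a location question
print(list(query(raw_store, ["ts", "msg"])))         # a message question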

These forces are combining in ways that accelerate the pace of data collection, whether from the data exhausts left by users, or through open-source intelligence that literally vacuums data from the fibre-optic cables that straddle the globe. As such, they also raise important questions about the forms of critical technical practice that are relevant to them, and about how we can ensure that citizens remain informed in relation to them. To take one small example, the mobile phone is now packed with real-time sensors that constantly monitor and process contextual information about its location, use and the activities of its user. This data is not always under the control of the user, and in many cases is easily leaked, hacked or collected by third parties without the understanding or consent of the user (Berry 2012).

The notion that we leave behind “digital breadcrumbs”, not just on the internet, but across the whole of society, the economy, culture and even everyday life, is an issue that societies are just coming to terms with. Notwithstanding the recent Snowden revelations (see Poitras et al 2013), new computational techniques, as outlined in this article, demonstrate the disconnect between people’s everyday understanding of technology and its penetration of life, and the reality of total surveillance. Not just the lives of others are at stake here, but the very shape of public culture and the ability for individuals to make a “public use of reason” (Kant 1784) without being subject to the chilling effects of state and corporate monitoring of our public activities. Indeed, computal technologies such as those described here have little respect for the public/private distinction that our political systems have naturalised as a condition of possibility for political life at all. This makes it ever more imperative that we provide citizens with the ability to undertake critical technical practices, both in order to choose how to manage the digital breadcrumbs they leave as trails in public spaces, and to pull down the blinds on the post-digital gaze of state and corporate interests through the use of cryptography and critical encryption practices.
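By way of one small example of such critical encryption practices, the sketch below uses the Fernet recipe from the widely used Python cryptography library (installed with pip install cryptography) to encrypt and recover a message; a minimal illustration rather than a complete practice.

# Symmetric encryption with the Fernet recipe from the Python
# "cryptography" library: ciphertext is opaque without the key.

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the secret; must be stored safely
f = Fernet(key)

token = f.encrypt(b"a digital breadcrumb best not left in the clear")
print(token)                  # opaque ciphertext, safe to transmit
print(f.decrypt(token))       # recoverable only with the key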

Bibliography

Berry, D. M. (2011) The Philosophy of Software: Code and Mediation in the Digital Age, London: Palgrave.

Berry, D. M (2012) The social epistemologies of software, Social Epistemology, 26 (3-4), pp. 379-398. ISSN 0269-1728

Berry, D. M. (2013) Signposts for the Future of Computal Media, Stunlaw, accessed 14/10/2013, http://stunlaw.blogspot.co.uk/2013/08/signposts-for-future-of-computal-media.html

Brust, A. (2012) Cloudera’s Impala brings Hadoop to SQL and BI, accessed 14/10/2013, http://www.zdnet.com/clouderas-impala-brings-hadoop-to-sql-and-bi-7000006413/

Higgenbotham, S. (2010) How Caffeine Is Giving Google a Turbo Boost, accessed 14/10/2013, http://gigaom.com/2010/06/11/behind-caffeine-may-be-software-to-inspire-hadoop-2-0/

Hunt, I. (2013) The CIA’s “Grand Challenges” with Big Data, accessed 14/10/2013,  http://new.livestream.com/gigaom/structuredata/videos/14306067

Iqbal, M. T. (2013) Google Spanner : The Future Of NoSQL, accessed 14/10/2013,  http://www.datasciencecentral.com/profiles/blogs/google-spanner-the-future-of-nosql

Kant, I. (1784) What Is Enlightenment?, accessed 14/10/2013, http://www.columbia.edu/acis/ets/CCREAD/etscc/kant.html

Melnik, S., Gubarev, A., Long, J. J., Romer, G., Shivakumar, S., Tolton, M. and Vassilakis, T. (2010) Dremel: Interactive Analysis of Web-Scale Datasets, Proc. of the 36th Int’l Conf on Very Large Data Bases, pp. 330-339.

Poitras, L., Rosenbach, M., Schmid, F., Stark, H. and Stock, J. (2013) How the NSA Targets Germany and Europe, Spiegel, accessed 02/07/2013, http://www.spiegel.de/international/world/secret-documents-nsa-targeted-germany-and-eu-buildings-a-908609.html

Sledge, M. (2013) CIA’s Gus Hunt On Big Data: We ‘Try To Collect Everything And Hang On To It Forever’, accessed 14/10/2013, http://www.huffingtonpost.com/2013/03/20/cia-gus-hunt-big-data_n_2917842.html

Vaughan-Nichols, S. J. (2013) Drilling into Big Data with Apache Drill, accessed 14/10/2013, http://blog.smartbear.com/open-source/drilling-into-big-data-with-apache-drill/
