I’m enrolled in the “E-Learning and Digital Cultures” MOOC (#edcmooc) that the University of Edinburgh is offering through Coursera, and it’s providing an interesting bit of synchronicity with some of the other things I’m working on, including taking part in a reading group with five graduate students as we work our way through Marx’s Capital volume 1, and teaching the spring-semester iteration of a 300-level WSU course (DTC356) called “Electronic Research and the Rhetoric of Information.” As you might imagine, reader, there’s a bit of overlap, and some curiously shifting perspectives.
In the reading group, we just finished the notorious Chapter 3, the chapter on money, and the dialectical back-and-forth got a bit head-spinning. It’s the first chapter where Marx mentions accumulation, and the impulse toward accumulation, but it’s also an amazing analysis of how capitalism, when it works perfectly, inevitably tends toward crisis precisely because of the way it works perfectly. The chapter takes Marx’s foundational work with the commodity (and its instantiation of frozen socially necessary abstract labor: in other words, the first way we see labor undergoing its transformation into capital) as its starting point and then investigates the curious and contradictory ways that money functions, winding its analysis toward the function of paper money and credit as a human-created technology. Marx notes that there are some items that possess value (in that they are frozen labor) and a price, and that there are other items that possess no value in his technical sense of the term (because no labor went into them: his examples are honor and conscience) but that do possess a price. I’ll leave my quibbles with that second half of the definition for later — I believe that social constructs like honor and conscience themselves require labor to produce, even if we are seldom conscious of that labor — because the important thing to note is that there are some things that have prices but that do not have values.
I would extend this to say that there are some things that have prices but that have negative values: for example, the collateralized debt obligations (CDOs) and credit default swaps (CDSs) that were intentionally crafted to be so mathematically complex as to be beyond understanding, and so able to hide the so-called “toxic” mortgage loans incorporated into them. That complexity allowed bankers to sell those instruments to investors while simultaneously betting against them as investments, and thereby to profit from the collapse of a product they had sold knowing they had designed it to fail. Those CDOs and CDSs are human-designed technologies of capitalism, and they carry prices as mechanisms for the redistribution of wealth (from sucker investors to savvy bankers, apparently), but I’m still wondering whether they fulfill Marx’s definition of a commodity as carrying the value of the abstract social labor that went into their production.
Here’s an analogue for that question: given enough computational and analytical power — or, in other words, given enough human labor translated into the digital capital of financial-systems analytics software via lines of code written and accounting formulae written and aggregated study and expertise, all operating on machines designed by teams of engineers and experts who relied on previous insights and innovations going back even prior to the invention of the transistor — could the ways that CDOs and CDSs contributed to the Crash of 2008 have been anticipated or prevented? Did CDOs and CDSs as technologies of capitalism determine that such a Crash *must* have happened at some point? In the DTC356 course I’m teaching, we’re reading about Claude Shannon as an information theorist who believed the necessary step to decode information was to discard meaning: we don’t care about meaning, Shannon argued. We care about the signal, about the code. Focus enough on the code, discard the context, and one can decode any information. In this sense, I suspect Shannon was largely a technological instrumentalist of the sort produced by the first half of the 20th century, particularly if we understand “technology” to exist as a field that includes “tools, instruments, machines, organizations, media, methods, techniques, and systems” (“Reification”). Technological instrumentalists believe technologies to be use-neutral and subject only to human intention, even as their invention seems to demand their use, even as they seem to exist as autonomous entities divorced from us, apart from society, simply things lying to hand to be used.
To my mind, though, what Marx helps to show is the ways in which human social arrangements give rise to systems that blinker us in specific ways, that point us toward certain ways of being and certain technologies, so that in a capitalist system CDOs and CDSs make perfect sense even as they precipitate crises that demolish enormous amounts of actually existing value (as instances of frozen human labor). I don’t (or won’t) identify as a technological determinist (although I tend much more easily toward an overdetermined technological determinism than toward a technological instrumentalism), but when I look at the intersection of social, political, and economic habits and practices with technologies like computers, cell phones, CDOs, and CDSs, I can’t help but think of the end of the classic Raymond Carver story “The Bridle” and its attitude about technologies like the bridle: Marge looks at the bridle — that instance of frozen labor, that commodity, that technology — after all that has gone on in the story, and thinks, “If you had to wear this thing between your teeth, I guess you’d catch on in a hurry. When you felt it pull, you’d know it was time. You’d know you were going somewhere.” That moment at the end of the story, though, seems to me to point to the circumstance that finally arrived, however inexorably, in 2008: the overdetermined combination of heterogeneously massed human intent and reified technologies that some understood better than others produced a perfect crisis. We socially design our own technological affordances, and often, as with the bridle, we elect to wear those affordances.