The prescience of Joe Weizenbaum continues. After long detours, I have returned to the end of his book, where he asks whether the world has any understanding of what computers will do. He saw very clearly how a “system” would grow which had its own logic, and before which everyone seemed to stand powerless. What set off this riff was the production of an early seismology database, which made it possible to visualise 8,000 known earthquakes and so see the world’s tectonic plates, but which simultaneously wiped from the map everything that had happened before 1960, because the earlier data was too difficult to scan in.
The computer has thus begun to be an instrument for the destruction of history. For when society legitimates only those “data” that are “in one standard format” and that “can easily be told to the machine,” then history, memory itself, is annihilated. The New York Times has already begun to build a “data bank” of current events. Of course, only those data that are easily derivable as by-products of typesetting machines are admissible to the system. As the number of subscribers to this system grows, and as they learn more and more to rely on “all the news that [was once] fit to print,” as the Times proudly identifies its editorial policy, how long will it be before what counts as fact is determined by the system, before all other knowledge, all memory, is simply declared illegitimate? Soon a supersystem will be built, based on the New York Times’ data bank (or one very like it), from which “historians” will make inferences about what “really” happened, about who is connected to whom, and about the “real” logic of events. There are many people now who see nothing wrong in this.
This was written in 1975 or a little earlier.
Of course, this is a concern – if your web page isn’t indexed by Google, does it really exist?
But hasn’t this always been the case? You can argue that an editor, being human (there is some evidence for such chimeras), is thus better qualified than any computer to write a history of each day by what they include in their paper. But exclusion is part of any editorial, curatorial or collation task: the British Library may have copies of all periodicals, but it never got my father’s Gestetner’d parish newsletter. Which, bless it, would no doubt be of some interest to anthropologists of the future picking through the bones of the Anglican communion, being around twenty years’ worth of fairly intimate detail of the quotidian life of a country parish which only made the nationals a handful of times and the Western Evening Herald only rarely.
All systems define and exclude. That’s what makes being an historian such a skill: the reading between the lines, inferring what was missed and why it mattered. Given that the overwhelming mass of human existence was and is never recorded, the real danger isn’t in collating incomplete data but in not recognising its incompleteness. That’s not a risk invented by computers: if anything, they give us better tools to propagate that recognition. And they’d probably have helped preserve my father’s Church News – I think the archives got dumped when he retired, but I must ask.
I’m sympathetic to the Weizenbaum programme, but with Google’s book-scanning project well under way, it may be only books published too early to be “easily derivable as by-products of typesetting machines” that we’re going to have access to, thanks to increasingly restrictive copyright laws.