People here in the Year of Our Simulation 2024 have never been better at hating the forces underlying that simulation – at hating, in other words, digital technology itself. And good for them. These ubiquitous critics of technology no longer rely on vague, nostalgic, technophobic feelings for their fashionable views. Now they have research papers to back them up. They have bestsellers by Harari and Haidt, among others. They have – imagine their smugness – statistics. The kids, in case you haven't heard, are killing themselves.
None of this bothers me. Well, teen suicide obviously does – it's horrible – but arguments that blame technology for it aren't hard to debunk. What is hard to refute, and what does bother me, is what I consider the one exception to this rule: the anti-tech argument of the contemporary philosopher.
By philosopher I don't mean some statistic-spewing writer of glorified self-help. I mean a ridiculously learned overanalyzer, someone who breaks problems down into their constituent pieces at the deepest possible level, so that when those pieces are put back together, nothing looks quite the same. Descartes didn't just blurt out “I think, therefore I am” off the top of his head. He had to go as far inside his mind as he humanly could, stripping away everything else, before he could arrive at his classic one-liner. (His one-liner, plus God. People always seem to forget that Descartes, the inventor of the so-called rational mind, couldn't strip away God.)
For someone who wants to make a case against technology, a Descartes-style line of attack might go something like this: dive as far into the technology as possible, strip away everything else, and break the problem down into its component parts. So where do we end up? Exactly there, of course: at the literal bits, the 1's and 0's of digital computation. And what do bits tell us about the world? I'm simplifying here, but pretty much: everything. Cat or dog. Harris or Trump. Black or white. Everyone thinks in binary terms these days, because that is what the dominant machinery enforces and entrenches.
So, in short, the sharpest argument against digital technology goes: 'I binarize,' the computers teach us, 'therefore I am.' Certain technoliterates have been venturing versions of this Theory of Everything for some time now; earlier this year an English professor at Dartmouth, Aden Evens, published what is, as far as I know, the first truly philosophical codification of it, The Digital and Its Discontents. I had a quick chat with Evens. Nice guy. Not a technophobe, he insists, but still: it's clear that digital life troubles him on a world-historical scale, and that he traces the trouble to the technology's very foundations.
Maybe I should agree with him. Like I said, the argument bothers me; I am discontented. But the more I think about Evens et al.'s technophilosophy, the less I want to accept it. There are two reasons for my discontent, I think. One: since when do the basic units of something dictate the entirety of its expression at a higher level? Genes, the basic units of life, account for less than half of how we develop and behave. Quantum mechanical phenomena, the basic units of physics, have no discernible influence on my physical actions. (Otherwise I'd be walking through walls – when I wasn't dead half the time, that is.) So why must binary digits forever define the boundaries of computation and our experience of it? When complex systems interact, new behaviors can emerge, mysteriously, at higher levels. Nowhere in the individual bird will you find the flocking algorithm! Turing himself said that you can't look at a computer's code and know, completely, what will happen.
And two: attributing our technological discontent to the 1's and 0's treats the digital as an endpoint, as a kind of logical conclusion to the history of human thought – as if humanity, as Evens suggests, had finally realized the dreams of an Enlightened rationality. There is no reason to believe such a thing. Computing, for most of its history, was not digital. And if predictions of an analog comeback are correct, it won't remain purely digital for much longer. I'm not here to say whether computer scientists should build analog chips or not, only to say this: if they did, it would be foolish to suggest that all the binarisms of modern existence, so thoroughly instilled in us by our digitized machinery, would suddenly dissolve into nuance and glorious analog complexity. We invent technology. Technology does not invent us.