
    A Law of Deceleration

    How I dumped the internet and learned to love technology again.

    By Paul McDonnold

    April 26, 2021
    1 Comment
    • Alan Griffin

      A very interesting article, but I do believe that, like anything man-made, the internet can be used for good as well as bad; it depends on the individual how they use it.

    In 1983, the Commodore VIC-20 went platinum – the first computer ever to sell a million units. With a price point that had fallen below $100, it was a major advance in the democratization of home computing. In television ads, William Shatner pitched the machine, essentially a bloated keyboard that plugged into your television, as “The wonder computer of the 1980s.” I was a skinny fifteen-year-old in Spring, Texas, a fast-growing suburb of Houston, and was enchanted.

    I don’t remember exactly when I got one. It was probably either my birthday or Christmas. But I do remember being unable to pull my eyes from the spiral-bound user manual, even reading it while riding in the family car to the supermarket or the mall. The VIC-20 was much more than a gaming box – like the Atari or Intellivision that so many of my generation owned. You could program it, make it do things.

    Soon I added a modem and a subscription to CompuServe – America’s first big consumer-oriented online information service. This allowed me to do things that seemed amazing at the time, like read newspapers from other cities or text-chat with people in faraway places. It didn’t occur to me that the strange gasps and tweets I heard as the modem connected to the phone line might be the murmurings of a nascent monster that would one day take over my life and the lives of millions of others. I just remember staying up nights exploring CompuServe or dialing into Telenet – a backbone network to which many government and business computers of the time were connected. I imagined myself hacking into exotic supercomputers like Matthew Broderick in the 1983 film WarGames. My skill level precluded this from actually happening, but I was so transfixed that I frequently lost all sense of time in front of the keyboard. I would look up to see the hands of the clock spun far past my usual bedtime.


    One reason I loved the computer was that, like reading, I could do it by myself. But it wasn’t just about isolation. In the computer math class at Spring High School, I met others who had gotten into this new technology, from stereotypical nerds to popular girls to football players. Meanwhile, outside of Spring, down through Houston and all across America, the copper spiderweb of the telephone network was being increasingly coopted by computers speaking to each other in their strange language of gasps and tweets. The digital age had begun.

    I went to college at Texas A&M, then into the workforce. At Klein Bank, where I had my first real job, personal computers were sprouting from more and more workspaces. Some employees plunged into them excitedly; others regarded them as if a penguin had appeared on their desks. Evangelists promised that the fast-evolving technology would free us from drudgery, make our jobs easier. But the keyboard, I soon learned, could bring its own drudgery. My work hours, if anything, seemed to grow as technology advanced, and I seemed to be serving it rather than vice versa. The enchantment I once felt at the keyboard became, like my childhood, a memory.

    As the 1990s progressed, computers in homes and businesses increasingly wove themselves into the internet, while its physical infrastructure – the copper-stranded telephone and cable television networks – was being augmented with fiber-optic lines, along which computers could speak in quiet light pulses rather than gasps and tweets.

    Before I’d even gotten my VIC-20, Microsoft’s co-founder Bill Gates had dreamed of a personal computer in every home. But even he hadn’t envisioned one in every hand, which happened in the 2000s with the advent of smartphones. With that, the digital monster became a pulsing complex of copper wires, fiber-optic cables, and electromagnetic waves entangling and hypnotizing the planet.

    An old Commodore computer in pink and blue lighting. Photograph by Lorenzo Herrera.

    By this time, the monster had taken over my work life, home life, and many of the spaces in between. My one-time enchantment was now disgust, and in 2019 I decided to disconnect, or at least pull way back. As much as possible, I began reading and writing with paper and pen instead of pixels. I dropped my home broadband service. My only personal internet came from a smartphone, which had a 3-gigabyte monthly limit. Beyond that, I used public wi-fi at the library. Email became a once-a-day thing, and I stopped scanning Google News. I let my Facebook page languish for weeks, then months. Then I deleted it. My life decelerated, and time seemed to expand. I was able to do more, read more, and think more. And I felt better. But with so many people still paying near-constant obeisance to digital screens, I also began to feel like I was in a science fiction movie – the only human who had snapped out of the monster’s malevolent hypnosis.

    Then Covid-19 hit, and I had to make some concessions to the monster. With the library closed during lockdown, I upped my phone’s data limit to 10 gigs. But I remained committed to a less-digital lifestyle, and managed not to backslide into digital addiction. I came to realize that for all the silicon, copper, and glass that form its infrastructure, the internet is the intellectual equivalent of plastic – its screens an artificial environment of endless malleability. If not kept in check, they tempt us to shift attention constantly from one thing to the next, accelerating our minds into anxiety and worse.

    A 2019 article in World Psychiatry entitled “The ‘online brain’: How the Internet may be changing our cognition” reviewed numerous psychological, psychiatric, and neuroimaging studies, elucidating disturbing correlations between a digital lifestyle and cognitive problems such as diminished attentional capacity, decreased verbal intelligence, and mental health problems. Neuroimaging studies even indicated that internet use is changing the “neuronal architecture” of our brains. The monster is rewiring us.

    I had read a chapter from The Education of Henry Adams titled “A Law of Acceleration” for a college English class in 1987. I remember being mostly baffled by it at the time. But dipping back into the book recently, I found its portrayal of life accelerated to dizzying, horrifying speed by science and technology to be both relatable and prescient, especially considering it was written around the turn of the twentieth century: “Prosperity never before imagined, power never yet wielded by man, speed never reached by anything but a meteor, had made the world irritable, nervous, querulous, unreasonable and afraid,” it said, “and every municipal election shrieked chaos.”

    But where Adams was fatalistic, I was sanguine, having found my own law of deceleration, a life in which to be still and know God better, away from the contextless chaos of the information age. Having watched the unfolding of the digital revolution, I am glad I was not born into it. I’ve lived through the revolution, seen the world around me change radically. But unlike digital natives – those born since about 1980 – I could remember another way to live, and have found it again. That’s not to say that my way should be universal. Everyone has to find their own balance point where technology is concerned. But I suspect most people could find a much better one than they currently have.

    Recently, after a day that included hours with the feel of pen and paper on my fingers, I wanted to unwind. I had data left on my smartphone, so I kicked back and watched music videos on YouTube. As someone who came of age in the early years of MTV, I had to give the internet its due. Here I was with a depth and breadth of music videos that no 1980s VJ could have imagined, literally at my fingertips. Watching videos that night, I felt like a teenager in front of my VIC-20 again. Time suspended, and I looked up to see the clock spun into the morning hours.

    Deep in the 2019 World Psychiatry article is an interesting finding. The internet use which seems to be damaging so many also appears to be helping older adults retain cognitive function longer, by providing them with a “new and novel” form of engagement. I think of my eighty-three-year-old mother’s frequent Twitter use. Digital technology, I’ve come to realize, is not so much a monster as a mirror. When we look at it we see ourselves, and lives that for many have grown grotesque through over-connectivity. But the stillness and peace such connectivity hides is still right there, if we have the wisdom and will to take it.

    Contributed By

    Paul McDonnold is a freelance writer whose work has appeared in the Christian Science Monitor, World Magazine, Arkansas Life, and Texas Highways, among other publications. He is the author of the novel The Economics of Ego Surplus.

