Let’s track some changes to technology over the recent past. Although the technological and digital trends ran alongside the social changes in the previous post, it’s worth putting the tech ones in a separate pile and spending a little more time on them, because they are the background to most of what we have to deal with later on.
So first I’ll sketch out some trends, and then we can consider how Christians have responded to them, both to the background noise of technological changes over the decades, and the sudden crisis of the pandemic.
I was at school during that first moon landing, and I remember watching the flickering footage on a big old TV in the physics classroom, with Mr. Colman in charge. It was remarkable that all the calculations were done with pencil, paper, slide rules and log tables; today they could be done in a fraction of a second by a computer. If you don’t know about slide rules and log tables, let’s just say that your calculator has saved you many hours of misery and given you unimaginable accuracy.
Behind the physics classroom was the small space set aside for the Computer Club, where I was a happy member. Let me describe the cutting edge of technology in those days: we had two languages, BASIC and FORTRAN; one required you to punch small holes in long streams of pink paper tape, the other used thick stacks of preprinted cards, and again you had to punch holes. In early James Bond movies, the baddies used punched cards and pink paper tape. That’s how cool we were.
The ultimate skill was to design a program (all drawn out on paper first, as a flow chart) which would allow a printer to produce an outline of Snoopy, using the letter X.
I tell you that, not to make you gasp at my age, but to point out that computing was at that stage still quite a physical, analog activity. At times we wondered when computers might have sufficient power to produce a realistic image, or even a movie, but that was still unimaginably difficult.
At this stage, computers just helped us to do what we were doing anyway, but much faster.
They were sharper pencils.
Bicycles for the mind
Of course, suburban south London was not where the brightest minds were working on the issue. Across in the States, young men like Bill Gates, Steve Jobs, and Steve Wozniak were making their mark, not only asking those questions but beginning to answer them. Computers grew screens, keyboards, mice. There was a trend, first observed by one of the co-founders of Intel, Gordon Moore, and nicknamed ‘Moore’s Law’, that computing power seemed to double, and its price to halve, every couple of years. Gates, Jobs and their teams were asking what they might be able to do with such technology, imagining the exponential power of the decades to come: not just, what can we do faster, but what could we do that we could never have imagined? Jobs, in particular, made great play of the way enhanced computing power could make computers easy to use, and therefore liberating. He famously drew a word-picture to show they were tools to take people to new, unexplored possibilities of freedom. He said it himself: ‘What a computer is to me, is it’s the most remarkable tool we’ve ever come up with. It’s the equivalent of a bicycle for our minds.’ And an Apple ad in the Wall Street Journal captured it too:
‘When man created the bicycle, he created a tool that amplified an inherent ability. That’s why I like to compare the personal computer to the bicycle. The Apple personal computer is a 21st century bicycle if you will, because it’s a tool that can amplify a certain part of our inherent intelligence.’
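To put rough numbers on that exponential curve — a minimal sketch of the arithmetic, not anything from the book, assuming the commonly quoted two-year doubling period:

```python
# A rough sketch of the 'Moore's Law' arithmetic described above:
# if computing power doubles every couple of years, the growth
# compounds dramatically over decades. The two-year doubling period
# is the commonly quoted figure, used here purely for illustration.

def growth_factor(years, doubling_period=2):
    """Return the multiplicative growth after `years` of repeated doubling."""
    return 2 ** (years / doubling_period)

print(growth_factor(40))  # forty years -> 2**20 = 1048576.0
```

Forty years of doubling every two years compounds to roughly a million-fold increase — which is why Gates and Jobs could imagine uses that were, at the time, literally unimaginable.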
It’s important not to miss this transition. At this stage, buying a computer meant having to program it too. Basically, you had to join the Computer Club. Even industry giants like IBM couldn’t imagine the value of every home, every student, having a computer – their own personal computer.
There has always been room for a geek, and there always will be. Classic car clubs exist for people who want to take car engines apart and make them work well, with spanners and oil. When our great-grandparents started to drive, everyone had to do that, but now it’s a hobby, and the rest of us drive more reliable cars. So with computers. With my first few computers I had to learn how to take the back off to put in a second floppy drive, a CD-ROM drive, or more memory. To print, you had to get inside the printer and check the combination of an array of tiny ‘DIP’ switches, to make sure the printer and computer would talk to each other. To use italics meant inserting a new daisy wheel to print from.
I’ve no doubt there are lots of groups of computer fans, who still meet to swap their daisy wheels, and manipulate their ‘DIP’ switches with the tip of an empty biro.
Because what the next generation of computers was offering was something that anyone could use, and almost forget about while using it. And it has been a remarkable achievement. Typing on my laptop is the same kind of exercise as writing with a pencil – I don’t have to think about the technology of either tool in order to use it. Italics are as easy as pressing ⌘I.
So suddenly, the user is liberated – how do I want to use this power? Watch a movie? Browse the news? Listen to some music? All at the same time? More than that, I can make something: Shall I edit a film? Record a track? You might like my style of photographs, commission me to take some for you, and I can sell them to you, without a retailer in between us.
And we’ve moved from being consumers of culture, to creators.
The increased power of the computer went alongside a second part of the Moore’s Law trend, which is miniaturisation. Each generation of chip can become smaller – which means the manufacturer can either get increased power in the same format, like each iteration of a laptop, or make each format smaller.
Small is not new. Thanks to Guardians of the Galaxy, a new generation of kids has been introduced to a technological wonder. As Peter Quill boogied his way across planet Morag to the soundtrack of the seventies, he wore headphones tied to a plastic box of tricks with a mix-tape on a cassette. ‘What’s one of those things, daddy?’ the youngsters cried. ‘That, my child, is a Sony Walkman.’ And the best kind of tribal elder would go to the spare bedroom and rootle round in a dusty drawer till he found one. ‘Like this.’
At least, that’s how I like to imagine it was in our house. Rather than hoots of derision.
That little box of tricks was a revolution when it came out. Suddenly, your choice of music could go with you wherever you went. Cycling, skateboarding, even commuting – basically, the world now had a soundtrack. Your own personally chosen soundtrack.
Then you could get one which played CDs, though it did skip tracks when you were jogging.
But the big transfer occurred with the MP3 players, popularised overwhelmingly through the iPod, and then onto phones. Suddenly you could carry thousands of your songs in your pocket, curated as you chose. And not just songs: podcasting meant anyone could make their own niche broadcasts, to reach anyone else who shared that niche.
This was going on at the same time as the opening up of broadcasting. Put Netflix, Sky and YouTube into the mix, and suddenly the options for personal choice become dazzling. You are at the centre of your own entertainment choices. Remember – I grew up in a world on black and white TV, two channels, and nothing on in the daytime. I still think Quill’s Walkman is cutting edge.
Multiply power by miniaturisation by creativity, and you have a wonder in your pocket: your smartphone, on which the least interesting thing you can do is make a phone call.
It’s been a long time coming. Way before Star Trek had its communicators, or Arthur Dent had his Hitchhiker’s Guide to the Galaxy, the eccentric, flawed genius Nikola Tesla had outlined what was coming in the wake of ‘the wireless’. This comes from 1909:
“The practical applications of the revolutionary principles of the wireless art have only begun. What will be accomplished in the future baffles one’s comprehension… It will soon be possible, for instance, for a business man in New York to dictate instructions and have them appear in type in London or elsewhere. He will be able to call from his desk and talk with any telephone subscriber in the world. It will only be necessary to carry an inexpensive instrument no bigger than a watch, which will enable its bearer to hear anywhere on sea or land for distances of thousands of miles.” Nikola Tesla, in the New York Times in 1909
Speech, and speech to type. Global mobile phone coverage. What’s more, in 1926 he even intuited the internet:
“When wireless is perfectly applied the whole earth will be converted into a huge brain, which in fact it is, all things being particles of a real and rhythmic whole. We shall be able to communicate with one another instantly, irrespective of distance. Not only this, but through television and telephony we shall see and hear one another as perfectly as though we were face to face, despite intervening distances of thousands of miles; and the instruments through which we shall be able to do this will be amazingly simple compared with our present telephone. A man will be able to carry one in his vest pocket.” Nikola Tesla, in Collier’s magazine, Jan 30, 1926.
There were some false starts along the way, as well, with a number of tech giants trying to run before they could walk. But over the past decade or so, you’ve moved from seeing your mobile as a phone, to seeing it as a multipurpose tool. I’ve even written parts of this book on mine – and the fact that you don’t think that’s weird, shows how fast we’ve all adapted and embraced the shift.
Let’s be honest – having a computer this powerful in your pocket can be seriously useful and also massive fun. Take it out and look at it, as a thing. In the last day alone I have checked my bank balance, spoken to a friend, read a book, skimmed the news, edited a photo, and reset our burglar alarm. From the same device, smaller than a bar of chocolate.
We shall consider some of the downsides of that capacity, but the headline is this: distraction. Because each of those activities is possible, they itch at me all the time. The slightest need to check that I’ve been paid, and I’m back on the screen. More than that, that screen pulls my eyes away from everything else. Remember that photo I was editing? Well, I wrote this paragraph in the glorious Lake District, surrounded by 360 degrees of God’s greatest work of art – and instead of soaking that up, I was being sucked into my screen.
Arguably the greatest change that those technological shifts have brought, though, is connectivity with each other. The development of Facebook, Twitter, Instagram, WhatsApp and a dozen more that I’m too old to know about, has gone hand in hand with both the technology they need to be useable, and the social consequences they have produced. Let’s pause to register what is going on.
We have become aware of each other, and of ourselves, as no previous generation has had the opportunity, or the need, to do. Or indeed, expected to. Back to the Future II held out a series of fun inventions, like self-drying clothes, the time-travelling DeLorean or – my favourite – the hoverboard, but they were all within the imagination of a 1980s film-making crew, and just extended what they knew. The hyper-connectivity of our world, a mere forty years on, was unimaginable. Which means we have been unprepared in our imaginations; we haven’t been expecting this one, in the way that movies have been preparing us for years for watches-as-phones or flat-screen TVs.
And for all the claimed opportunities, it is becoming increasingly well-documented that this is causing disproportionate problems. It seems hard for humans to keep social media on the level of sharing motivational quotes and pictures of what you had for breakfast. We compare ourselves with each other, and feel twinges of pride, or envy, or greed, or hate, or lust. Our vanity kicks in as we add filters to our selfies. Our fear of being left out means we are ‘on’ all the time.
We also struggle with being ‘off’. Humans are social animals, but we’re also independent. Historically, solitude and silence were as much a part of our lives as company and noise. Yes, and loneliness and boredom, too. But the solitude and the silence and the boredom were where many people got their best work done. Our current reality, though, is that solitude and silence have to be deliberately sought out and chosen. They don’t just exist as a normal part of the day, to be used or lived with. We have to put them into the diary, and then force ourselves to use them. And boredom? Who allows themselves to be bored these days?
What’s more, social media is an unforgiving place where only perfection is allowed. One careless tweet can lose you a job, a recording contract, a friendship. Years later. Your future employers will read the tweets you posted yesterday in the light of the morals of tomorrow, and judge you. It’s not hard to come across well-known people being de-platformed from conferences or campuses, not because the views they hold now are unacceptable, but because views they once held are. Now this is a wider cultural development, of course, currently being expressed over whether statues should be pulled down, or books pulped. That’s not for us to discuss here. But the subjective experience is ours to reckon with: if you feel the need to be permanently on show, getting it right, not slipping up, then you feel that pressure too. Some people make a living and a reputation by courting criticism; the rest of us have learnt to fear it.
We’re back with the idea that the internet is the Wild West – but this time, you’ve woken up in Dodge City.
What’s the most remarkable technological invention you’ve seen?
Are there any which still give you pleasure, years later?
What are you still waiting for?
This is an adapted excerpt from @church: is online, off limits?, now available on a Kindle near you!