I have experienced firsthand all of the digital disruptions since the arrival of the personal computer in the 1980s. I remember, as a Computer Science (CS) major in college, having to go to designated buildings on campus to use “dumb terminals” that connected us to our accounts on “the VMS mainframe.” We did have some PCs in the lab here and there (IBM PC, Macintosh, and even Lisa), but they were amusing toys, not the serious machinery a CS major needed to handle thousands of lines of Assembly, FORTRAN, Pascal, or C code.

When the PC took hold in earnest over a decade later, it felt like someone had thrown us a curveball: ‘To do your work, you don’t need a dumb terminal or a VMS account, and you don’t need to be in a specific location either. You can just go to a store, buy some software, install it on your computer, and there you go. Do your work as you please from the comfort of your home, lab, office, or wherever your PC is located.’ Reasonable and exciting enough. And yet, for some reason, I remember it taking me and my fellow developers a while to get used to that idea. In our heads, we still had images of those plastic toy PCs in our labs, and it took a while for us to extend some respect to the PC.

The next disruption, the arrival of the Internet and the Cloud, threw us something akin to a change-up: ‘Remember that beautiful mainframe you used to love, the one with the dumb terminals? Well, we are going back to something like that, except that you can now do it from your PC, from anywhere, and not from designated buildings!’ That was, for some reason, even harder for me to grasp. (I’m embarrassed to say that it took me a while to understand how Hotmail worked.)

Next was the arrival of the cell phone. This one I embraced with plenty of glee and no confusion, and I cherish fond memories of those early years in the late 1990s. (I still vividly remember my very first tiny cell phone: light as a feather, solid black with orange-lettered and numbered buttons, and it felt perfect in the palm of my hand.) My glee intensified further when the Blackberry descended upon us like a blanket of sky-blue locusts. Now I could access my email while walking; while waiting for the elevator; while riding the elevator; and, most conveniently, while in the bathroom! And ah, that flashing red dot: how I miss, if nothing else, at the very least that self of mine who was so innocent at the time that he saw nothing wrong with that flashing light…

The arrival of the iPhone next I viewed, as did almost everyone around me at work (a technology company), as, how shall I put it, extravagantly unnecessary. Its beauty, compared to the industrial, unapologetically rational, sturdy, lean, and mean Blackberry, came across not only as superfluous but as obviously irrational: the easy-to-break glass screen (drop the thing and it’s kaput), the absurd flat-surface keyboard, the fact that it was absolutely awful with email, and the absence of the flashing red light made it a non-starter for many of us. This was a fad, and that was going to be the end of that. But, somehow, here we are, with smartphones nothing short of an extension of ourselves. Being without your smartphone in 2022 is like walking city streets without shoes on. Only the odd and the eccentric would do it.

The next disruption was the arrival of social media. Here, I immediately understood Facebook, and I embraced it tightly; for the longest time I saw it as pure good. Why? For the very good reason that it enabled me to be in contact with my mother, brothers, sister, and old friends from the old country in a way that I simply could not before. Facebook brought me closer to my family and my real friends, and for this, no matter how toxic the platform has become, I am thankful, and I will remain thankful.

Twitter, on the other hand, I simply did not understand for several years, even though I created my account there in 2007. The same went for YouTube. I just couldn’t figure out why they existed and why so much good money was being thrown at them (at the time, a hundred million dollars made a person whistle, and a billion dollars made their jaw drop).

Then came my favorite of all the disruptions: the arrival of the Voice Assistant, first with Siri in 2011, then the staggeringly magical Amazon Echo in 2014, and then its accomplished imitator, Google Home, in 2016. I say my favorite because it was the one I longed for the most and yet the one I least expected to become reality.

Looking back at the last 30-plus years of absorbing such digital disruptions, here are my key takeaways: (1) I never once saw the next disruption coming; (2) every single disruption took many years to take hold, and none of them happened overnight; (3) every single disruption makes total sense in hindsight; and (4) I never grasped the full implications of the disruptions. In fact, I would say I had almost no real sense of just how deeply they were going to change our lives, society, culture, and politics.

As we launch into the Third Year of the Third Decade of the Third Millennium, the most obvious looming disruptions underway, from my narrow perspective, are those of Crypto, Virtual Reality, and electrically powered transportation. Having witnessed what I have witnessed so far, I can say two things for certain: first, it is impossible for me to fully imagine how things will turn out, and second, it is impossible for me to believe that these disruptions will be anything but unimaginably seismic.

Art: Disruption, by Nigel Radcliffe.