Even for the skeptics among us, it's hard to overstate the importance of this anniversary: 75 years ago - at the height of the Second World War - a 31-year-old German civil engineer called Konrad Zuse presented the Z3. It was the first programmable, automatic computer - and is widely viewed as the ancestor of the family of machines we take for granted today, from desktop computers and mobile devices to the massive data centers controlling the world.
Compared to the phones and pads we carry in our pockets, however, the Z3 was huge. It was a cluster of glass-fronted wooden cabinets and wiring looms.
And it was not intended for gaming or social networking on trams and in school yards, but for the German Aircraft Research Institute, which used it to perform statistical analyses of wing flutter.
The Z3 was an entirely new concept, built well before the invention of the transistors used in contemporary computer chips, and a good 40 years before Richard Feynman even proposed using quantum mechanics for computation.
Instead, the Z3 was built with electromechanical relays as its switching elements.
The Z3, zeros and ones
So what Google, Facebook, Apple and others do today all started in wartime Germany - like so much else. But it's unlikely that Zuse and his collaborator Helmut Schreyer grasped the full significance of what they were building.
"In 1941, probably no one - not even Zuse or one of the other inventors of computing machines - could have imagined how, decades later, such machines would have an impact on our everyday lives," says Matthias Hagen, a professor of big data analytics at Bauhaus University Weimar. "Not to mention how small, powerful and cheap these devices would get."
General Purpose Technology
As with the anniversary itself, it's equally hard to overstate the ubiquity of computers today. Computing - whether as hardware or software - is omnipresent.
Eric Schmidt, the executive chairman of Alphabet Inc. (formerly Google), has said these technologies will "disappear" - we will no longer see them - once they have infiltrated every aspect of our lives.
Essentially, we are already there. It was to be expected.
"Computers are one of the 'general purpose technologies,'" says Hagen.
That puts them in the same league as steam power, electricity and the internal combustion engine, as described by Erik Brynjolfsson and Andrew McAfee in "Race Against The Machine."
"Computers are the GPT of our era," they wrote in 2011, "especially when combined with networks and labeled 'information and communication technology.'"
And, thanks to Konrad Zuse, computers and computer networks are what we have.
"Computers are a consciousness changing technology, perhaps the biggest one since the invention of fire," says Matt Black, a musician and creative technologies pioneer. "I read 'The Shockwave Rider' in 1976 and it blew my mind with a vision of computers and networks. Also 'The Selfish Gene' with its analogy between DNA code and computer code. Computers are nothing less than the next stage in our evolution."
How far we have come
If you're struggling to imagine the significance of the Z3 in 1941, try thinking back to your first computer and how it compares to the devices you use today. What was it? How much memory did it have? What was its processor speed?
Here's what some of our Twitter followers started out with:
I used a BBC Micro at school and had an Atari 1040ST at home for gaming and desktop publishing (...the young writer in me).
For Ijad Madisch, co-founder and CEO of ResearchGate, it was a Commodore 386SX-16.
"I became fascinated with computers around the same time I became fascinated with viruses. I asked myself why we couldn't compute the many different forms the [HIV] virus could take to help our immune system tackle it," says Madisch. "I finally got a 386SX-16 that I mainly used for playing games and programming simple websites. But the initial idea stuck. Programming and science belong together and are key to tackling the challenges we face today."
These and other computers, like the ZX Spectrum, were light years ahead of the Z3, which took between 0.8 and 3 seconds per calculation and had a data memory of 64 words of 22 bits each.
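For a sense of scale, the Z3's entire data store can be tallied in a couple of lines (a back-of-the-envelope calculation based on the figures above, not part of any original specification document):

```python
# The Z3's data memory: 64 words of 22 bits each.
words = 64
bits_per_word = 22

total_bits = words * bits_per_word
total_bytes = total_bits / 8

print(f"{total_bits} bits = {total_bytes} bytes")  # 1408 bits = 176.0 bytes
```

Fewer than 200 bytes in total - smaller than a short paragraph of plain text.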
But at the same time, they are positively infantile compared to the laptops, smartphones and wearables we casually carry - and occasionally drop - in the street every day.
Quantum computing or bust
The development of computing won't slow now - at least not for the foreseeable future. It's a digital "revolution" only an alien counter-revolution could stop.
"I started with a [Sinclair] ZX81, a tape drive and a television," says Andreas Fuhrer, a quantum scientist at IBM. "And at that point, I did not think about quantum computing. I had my one-meg hard drive for a thousand bucks or whatever. But as I studied physics, I started to hear about these ideas."
The estimated power of quantum computing is staggering: a qubit can represent a one, a zero, or both at once - a superposition. And because a register of n qubits can hold a superposition of all 2^n possible combinations, every extra qubit doubles the state space a quantum machine can work with.
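The superposition idea can be illustrated with a toy state-vector simulation - a minimal sketch in plain Python, nothing like real quantum hardware. A Hadamard gate takes a qubit from a definite zero into an equal mix of zero and one:

```python
import math

# A single qubit as a state vector [amplitude_of_0, amplitude_of_1].
# Start in the definite state |0>.
state = [1.0, 0.0]

# Apply a Hadamard gate: |0> -> (|0> + |1>) / sqrt(2).
s = 1 / math.sqrt(2)
state = [s * (state[0] + state[1]),
         s * (state[0] - state[1])]

# Measurement probabilities are the squared amplitudes: an even 50/50 split.
probs = [round(a * a, 10) for a in state]
print(probs)  # [0.5, 0.5]
```

Measuring the qubit then yields zero or one with equal probability - the "both at once" exists only until you look.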
Fuhrer says we're in a similar "initial phase" with quantum computing as people were in with "the big" Zuse machines in the 1930s and 40s.
"People are learning how to program these systems and finding applications as they go along. We've shown we can build the basic unit, this 5 qubit unit cell, and we have an idea how it can be scaled," Fuhrer says.
And isn't scale everything? We're told we want ever-faster, ever-smaller devices. For now, though, IBM's system is only available via a cloud platform - it's too big and fragile to move around.
But I'd like to suggest it won't be long before we all get networked, quantum implants at birth.
"I see no end to the technological developments regarding computing," says Professor Matthew Bailes, an astronomer at Swinburne University of Technology. "I suspect that in the future we will soon wonder how we managed with such primitive devices in 2016. Artificial intelligence will be the dominant issue. It will be the last great revolution in human technological evolution."