This public health emergency has been a rude awakening for people opposed to a full — some may say invasive — digital society. Others say it's the wake-up call that sectors such as education and primary care have long needed.
"Whole sectors — after about 20 years of talking about doing more things online — have suddenly done it in a week," says Helen Margetts, Professor of Society and the Internet at the Oxford Internet Institute and Director of the Public Policy Programme at the Alan Turing Institute.
But the more we do online, the more data we create. And more data means more risks, but also more potential benefits.
"Data is really important, and that applies through all the different kinds of data — testing data, contact tracing data and data that enables you to assess policy interventions and deal with the consequences of policy interventions that are being made in a rapid-fire way," says Margetts.
Even before this pandemic started, says Margetts, we could have analyzed data to assess whether it makes sense to have a lot of sick people sitting in a crowded waiting room before they speak to a general practitioner for 10 minutes.
"It could be that a lot of that activity is moved online and that it's for the better," says Margetts, "but knowing that will be crucial to the kind of data that is generated by that very activity."
"So, there is an argument for thinking, 'How can we learn from all this?' It's like a massive natural experiment," says Margetts, "although a real experiment would be randomized."
Digitalization and its discontents
That experiment, randomized or not, is also happening in schools, where, at least in Germany, it has highlighted a range of inefficiencies. Digital education experts say the sector is stuck in its analog ways.
"We've had to catch up in a very short space of time because of the coronavirus crisis," says Jens Brandenburg, a member of German parliament with the Free Democratic Party. "Data protection, technical implementation, and the ongoing training of our teaching staff have become urgent issues, because we've ignored them for so long."
For instance, some schools lack a dedicated cloud for exchanging learning materials. Teachers often use their private computers and private email addresses, with no standards for updates or data security. And there's hardly any IT backup.
Since the start of the lockdown, all that has been down to the teachers. They have done a grand job managing a difficult situation — but schools, and teachers individually, have had to make important decisions on the fly.
As a result, digital platforms like the open source Moodle, Zoom, Instagram, YouTube, Sofatutor, or German-language learning apps like Anton, have profited from a lack of forward-thinking in German schools. They have stepped in where schools have traditionally feared to tread. But there's nothing necessarily nefarious about that.
Some federal states, such as Baden-Württemberg, may have been better placed to weather what has become a digital storm. Schools there have long used and tested teaching platforms like Moodle and other cloud systems.
Elsewhere, there were desperate attempts to get organized at the start, via email and the occasional impromptu, real-world letterbox drop-off.
As the lockdown got extended, more schools turned to Moodle and invitations to meetings on video conferencing platforms — mostly Zoom — became regular fixtures on the schedule.
That has raised all manner of questions about data protection, with schools getting parents to sign (or re-sign) release forms.
Some teachers are keen adopters of video conferencing, while others refuse to use it.
"There is a real worry that data governance rules are being ripped up and challenged by the need to move quickly. Managing to do that in a crisis, in an ethical way, is something we should prioritize," says Margetts.
In April, a virtual meeting organized by a school in Baden-Württemberg was apparently hacked and pornographic images were posted into the session.
The state's data protection authority, perhaps more vocal than others, warned of the risks — some caused by so-called "user error" and others by bugs built into Zoom's system. The authority suggested two alternatives: an encrypted messaging app called Threema, and an open source platform called BigBlueButton for video conferencing. Zoom has since taken steps to improve its security.
Technological safety issues on digital platforms are well-documented, but difficult to address or fix fully.
"There are two aspects to data protection, and first there's user error," says Dr. Lutz Hellmig, a computer scientist at the University of Rostock. He's talking about a user intentionally or unintentionally doing something that puts them or other people at risk.
"Then there's a general lack of respect and understanding for the technology," says Hellmig. "But if schools provided teachers with computers that met data protection rules, some forms of misuse just wouldn't be possible."
Hellmig says we will never achieve 100% data security online. But leaving teachers to their own devices — quite literally — is "unacceptable," he says.
That some adults, politicians included, don't understand the risks is a given.
"There are many people in education politics who feel no personal connection with the digital world," says Brandenburg. "They don't understand what younger generations are going through and that can lead to a lack of innovation at schools."
There are also those who seem happy to take a gamble on technology, like British Prime Minister Boris Johnson. He tweeted an image of the UK's first digital cabinet meeting that was held via Zoom at the start of the lockdown.
The tweet, replete with the meeting ID number, went viral and Johnson was promptly criticized for the potential security breach.
And then there are the kids.
We call them "digital natives" as though they understand everything digital. But some experts, such as Professor Birgit Eickelmann at the University of Paderborn, say that's not strictly true.
During a webinar on digital education policy, hosted by Germany's "Gesellschaft für Informatik," an association of computer science educators and researchers, Eickelmann cited studies showing that people use technology more than they truly understand it.
So, take a group of kids who are more at ease with technology than their teachers, but who are just "using" it. During a school session on Zoom, they start posting random text in the platform's chat function. If the teacher polices that, there's no problem.
But if an artificial intelligence wanted to trawl through Zoom meetings for personal, identifying information, even if it was "just" for advertising purposes, or a hacker was looking for kids to groom, text would be the easiest point of entry.
There have been huge advances in automated audio and video analysis. But you still can't beat a text search for its simplicity and speed.
Then, of course, any kid who has ever used a smartphone will know about screenshots — so, even if your virtual meeting was fully secure and perhaps even "sandboxed" or locked in a dedicated school environment, a kid could take a compromising screenshot and post it anywhere else.
Once that's done, those parental release forms become meaningless. Because what's done online is done.
"Yes, that is true, but that was also true in analog times," says Hellmig. "You would have kids secretly taking photos of teachers with a camera hidden under a desk. But you've just got to teach the kids that they face a form of prosecution if they do that, and then actually follow through."
But on the second anniversary of the European Union's data protection legislation, the General Data Protection Regulation (GDPR), some watchdogs say the rules aren't enforced rigorously enough.
Add to that a lack of training among teachers. If teachers can't assess the risks, how can they help their students protect themselves?
Hellmig says teachers need better, ongoing digital training to give them the tools and knowledge to judge the risks and benefits of technology as it develops. He is one of the main authors of a new working paper by the "Gesellschaft für Informatik," which includes recommendations for the future of digital education in Germany.
"The coronavirus may have presented us with an opportunity to change education in unprecedented ways, but that opportunity may soon be gone again," says Hellmig, hinting at a time when the pandemic is considered over and schools reopen fully.
Data: A double-edged sword
We all need data — even just basic information and advice. And that may have presented new opportunities for data-driven tech companies during the pandemic.
But should we worry if it helps us stay safe?
In the UK, for instance, social media platforms like Twitter are promoting COVID-19 information from the country's National Health Service at the top of people's feeds, putting it above other posts.
"It raises a red flag," says Dr. Monica Horten, a visiting fellow at the London School of Economics and Political Science and author of The Closing of the Net.
"The global platforms may be using the public health emergency to embed themselves with governments," says Horten.
Horten is concerned that social media platforms have placed themselves in a dominant position to influence the flow of information, even after the crisis has passed. For now, it's just an observation. Something to watch.
She is also concerned about the data collected by contact tracing apps.
As economies reopen, despite high-level warnings that we're heading into a second wave of coronavirus infections, governments and health authorities will want to track our movements to trace the spread of the virus.
"If we open schools, or parts of the economy, we will need to understand what we're doing," says Margetts. "What evidence is there to say we should open schools or not? That's crucially important."
And that's a clear, short-term objective. What is less clear is what will happen to the data later.
"It must not be possible for governments or commercial providers [of contact tracing apps] to obtain the 'social graph' of individuals," says Horten. "The social graph of a contact tracing app would reveal not only data about people the user knows, or those with whom they have had contact, but it could also [be] people they have just been near. This would be overly intrusive and is not necessary for the purpose of addressing the public health emergency."
Horten says there should be a termination arrangement, so that when the public health emergency is declared over, the contact tracing apps cease to function.
But will the data die at the same time? Or will it sit somewhere for years to come, waiting to be bought by health insurance companies down the line?
"There are questions of transparency, accountability, bias and fairness, what are the plans for decommissioning a project," says Margetts. "And you may get easy answers at the moment, like 'Don't worry, we'll work it out later,' but it shouldn't be too much later. You've got to lay down those principles early on."
All that, it seems, has yet to be resolved.