Several hundred thousand computer systems were hit by the "WannaCry" ransomware. The blame lies not with those who created it, but with the intelligence community, which quietly exploited the security gap the attack was built on, says Konstantin Klein.
The search is on for those responsible after the first wave of "WannaCry" attacks. That is how it always is when something major goes wrong. Who are the mysterious Shadow Brokers, the group that in April published information on two highly sophisticated NSA hacking tools? And who used those tools to unleash "WannaCry" on the general public?
Then there are the usual suspects - those held responsible for virus attacks on Microsoft Windows for years. But the days when Microsoft chose user friendliness over security are long gone; the company has been releasing regular security updates for its operating systems for quite some time. Microsoft had already patched the gap "WannaCry" exploited in March - a month before the NSA's hacking tools were published.
Admins under pressure
That is exactly the problem: Patches and updates make computers safer. They also make them more complicated.
Particular software and tools may malfunction after an update, forcing large companies to run intensive tests before installing security patches, let alone entirely new operating systems. In that sense, the IT personnel of Deutsche Bahn, Renault and the UK's National Health Service bear part of the responsibility: it is precisely their job not to roll out a patch immediately, but to test it first.
Yet in the public eye, the software that displays train information, helps build cars or gets patients the care they need plays no role. The names that stick are Microsoft and Windows. That is cold comfort for the company. Microsoft's president and chief legal officer, Brad Smith, was therefore quick to pin responsibility on government intelligence agencies. It was, after all, the NSA that exploited the vulnerabilities for its own purposes without informing Microsoft - and the NSA that allowed its tools to be stolen. For Smith, the theft is akin to cruise missiles being carted off a poorly guarded military base. It is a terrifying comparison.
National security, individual insecurity
As members of an information society, we have learned to live with a certain amount of insecurity. Software is made by humans and is therefore fallible. We can only expect relative security, which is achieved by learning from mistakes - and that requires knowing about them. Keeping mistakes secret in the name of national security only increases insecurity for individuals. British hospital patients are just one example.
Germany's Minister of Transport and Digital Infrastructure, Alexander Dobrindt, recognizes this: he wants security gaps to be made public. Microsoft's Smith has long said the same, calling for a digital Geneva Convention. That would make life harder for the spy community, but easier for the rest of us. Even the most zealous security policymaker will think twice before demanding that back doors be built into encryption for state access. A back door is only as trustworthy as the custody of its key; if the key leaks, the encryption is worthless. That even the almighty NSA failed to guard its tools is reason enough to doubt that lesser authorities could guard such keys.
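The key-custody point can be made concrete with a toy sketch. The following Python snippet (a hypothetical illustration using only the standard library, not a vetted cipher and not any real escrow scheme) shows what a mandated back door amounts to: a second copy of the decryption key held by the state. Whoever obtains that copy - lawfully or by theft - reads everything, which is why the whole arrangement stands or falls with how well the copy is guarded.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a deterministic pseudo-random keystream from the key.
    # Toy construction for illustration only - not a real cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    # XOR the data with the keystream; XOR-ing again reverses it.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # same operation in both directions

user_key = secrets.token_bytes(32)
escrow_copy = user_key  # the mandated "back door": a duplicate key

ciphertext = encrypt(user_key, b"patient records")

# Anyone holding the escrow copy recovers the plaintext in full:
print(decrypt(escrow_copy, ciphertext))  # b'patient records'
```

The sketch makes the editorial's argument mechanical: the encryption itself is irrelevant once `escrow_copy` leaks, exactly as the NSA's stolen tools rendered the secrecy around EternalBlue irrelevant.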