Rights experts have responded to a call for more online extremist propaganda to be removed in the wake of the Würzburg train attack. The debate follows revelations that the assailant was likely radicalized online.
Digital rights groups have warned the German government against giving huge internet platforms like Google and Facebook greater responsibility for removing extremist propaganda following the knife and ax attack on a train in Bavaria this week, which left five people injured.
Authorities believe the attacker, a 17-year-old asylum seeker, radicalized himself in a short period of time by consuming jihadi propaganda online. On Thursday, Interior Minister Thomas de Maiziere called for web platforms to delete more illegal content to stop others from being tempted by extremist groups.
But Markus Beckedahl, Editor-in-Chief of the digital rights blog Netzpolitik, has warned against increasingly handing removal decisions to tech companies, saying it would lead the country down a dangerous path towards censorship.
"Our politicians are forcing a privatization of law enforcement. What should be the decision of a judge for deleting content is being given to internet platforms," Beckedahl told DW, adding that only the judiciary can decide whether something is propaganda or legal under freedom of expression laws.
Currently, internet service providers (ISPs) and other platforms such as Facebook and YouTube can be requested to remove extremist content, which is often flagged by intelligence agencies and police. If they refuse, the courts can order them to do so.
Within the past year, Europol has established a dedicated unit to tackle the issue at the European level. Intelligence officials have admitted that jihadist groups have achieved a "sophisticated understanding of how social networks operate" and have "launched well-organized campaigns to recruit followers and promote or glorify acts of terrorism."
In May, the European Commission went further, signing up many of the big internet players to a voluntary code of conduct under which they would "take the lead on countering the spread of illegal hate speech online" and agreed to remove violent material within 24 hours.
But privacy rights lawyer Christian Solmecke told DW that internet companies were themselves wary of taking on additional responsibility.
"The problem is that many of the platforms don't have enough staff to verify the content that is reported to them," Solmecke said, adding that de Maiziere's pledge to require operators to actively check more content is "not feasible."
Where's the debate?
Netzpolitik's Beckedahl warned that despite the dangers of giving more control to internet firms, there has been little public discourse about the issue.
"Politicians have been working on this for more than two years, but no one is discussing it. We need a real debate. As a society, we must decide whether to go down this route or find legal, constitutional ways to resolve the problem," Beckedahl insisted.
German officials have confirmed that a threatening video purporting to be from the 17-year-old asylum seeker who carried out Monday night's train attack near Würzburg was authentic. In it, he said: "I am a soldier of the caliphate and I am going to carry out a suicide attack in Germany."
The self-styled "Islamic State" (IS) group and other terror networks often post videos glorifying violence and praising terrorists. In recent years, radical preachers have uploaded their hate speeches or published essays, stirring up religious or racial aggression, while social media posts by radicalized individuals sometimes incite others to commit acts of terrorism.
On Thursday, de Maiziere specifically pointed to "hate mail, bomb-making instructions and other similar content," saying it wasn't "too much to ask" that offensive content was removed from the web "faster."
But Beckedahl insisted that there is still no clear legal definition of what constitutes extremist content, and that web platforms apply differing rules on what is illegal.
New technology planned
Last month, the Reuters news agency reported that Facebook and YouTube were deploying automated software that can block or remove IS videos and other violent propaganda. The technology matches unique digital fingerprints assigned to specific content, allowing known material to be blocked quickly.
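Reuters did not describe the systems in detail, but the fingerprinting approach can be sketched in a few lines. The example below is a minimal, hypothetical illustration using plain cryptographic hashes; real deployments reportedly use robust "perceptual" fingerprints that survive re-encoding, which a plain hash does not. All names here are illustrative, not an actual platform API.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known extremist files.
# A cryptographic hash only matches byte-identical copies; production
# systems use perceptual hashes that tolerate re-encoding and cropping.
BLOCKED_FINGERPRINTS = set()

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint for an uploaded file (here: SHA-256)."""
    return hashlib.sha256(data).hexdigest()

def register_blocked(data: bytes) -> None:
    """Add a known-bad file's fingerprint to the blocklist."""
    BLOCKED_FINGERPRINTS.add(fingerprint(data))

def is_blocked(data: bytes) -> bool:
    """Check an upload against the blocklist before publishing it."""
    return fingerprint(data) in BLOCKED_FINGERPRINTS

register_blocked(b"known propaganda video bytes")
print(is_blocked(b"known propaganda video bytes"))  # exact copy -> True
print(is_blocked(b"unrelated home video bytes"))    # -> False
```

The speed claim in the report follows from this design: once a fingerprint is on the list, checking a new upload is a single set lookup rather than a human review.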
Separately, US scientists from the Counter Extremist Project (CEP) have created an algorithm that they claim also works to reduce the amount of extremist propaganda online.
Although the authors have promised to share the technology sparingly, both ideas have been met with some skepticism.
"Algorithms can help platforms to find [extremist] content more quickly and delete it. But, the assessment of whether content is illegal or not must always be done by people," said Solmecke, from the Cologne-based WBS Law.
Beckedahl was more explicit, saying: "I wouldn't really rely on this [algorithm]. Of course, you can try to block certain keywords. But this would impact all internet users. A journalist, for example, wouldn't be able to report stories that use these words because they would be blocked too."
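The over-blocking Beckedahl describes is easy to demonstrate: a naive keyword filter cannot distinguish propaganda from reporting about propaganda. The sketch below is purely illustrative; the keyword list and function names are assumptions, not any platform's actual filter.

```python
# Hypothetical keyword blocklist; entries are illustrative only.
BLOCKED_KEYWORDS = {"caliphate", "jihad"}

def is_flagged(text: str) -> bool:
    """Naive filter: flag any text containing a blocked keyword."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_KEYWORDS)

propaganda = "Join the caliphate now"
news_report = "The video claimed he was a soldier of the caliphate"

print(is_flagged(propaganda))   # True: the filter catches propaganda
print(is_flagged(news_report))  # True: but it also blocks journalism
```

Both strings are flagged, which is exactly Beckedahl's point: keyword-level blocking suppresses legitimate reporting along with the material it targets.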
He warned that the implementation of automatic technical filters would amount to censorship.
"We thought this was only possible in repressive regimes. Whenever we've seen it before, it has been very opaque and often violates freedom of expression and freedom of information laws. ISPs will also have to invest money to make this happen," said Beckedahl.
Most internet platforms have so far remained non-committal about whether they'll use the technology.
Growing public pressure
Intelligence agencies, opposition MPs and others have warned for months that the kind of terror that has struck several European cities would soon reach German soil.
Although the Würzburg train attack was minor compared to the recent violence in Nice, Brussels and Paris, rising concern among the German public is likely to lead to demands for quick answers to ensure a major attack doesn't happen here.
Amid concerns that the slow pace of the judiciary will allow extremist groups to remain one step ahead, some politicians are clearly prepared to live with accusations of censorship if it means keeping the public safe from terrorism - much to the ire of rights groups.
"When you live in a constitutional democracy, this [going through the courts] is the normal way," argued Beckedahl. "You can always invest more money in the legal system so it can work faster."
Solmecke said Germany's current laws protect freedom of expression and "should not be changed."
"Instead, prosecutors should ensure that platform operators assume a stronger liability [if they don't comply]," he told DW, adding that it would be possible within existing laws, but is "curiously" not enforced at present.