People tend to surround themselves with like-minded people - filter bubbles have taken that to a new level. Two German reporters were shocked when they entered the world of the far-right on Facebook via a fake account.
What goes on in far-right filter bubbles on Facebook?
To find out first-hand, two TV reporters for Germany's ZDF broadcaster created a fake account - 33-year-old "Hans Mayer," a proud German patriot with a clear penchant for right-wing topics. They encountered a world of closed groups, hatred, lies and agitation.
"Mayer," the reporters quickly learned, was surrounded by many like-minded men and women in a filter bubble that had little to do with reality and where objections never stood a chance. A filter bubble results from a personalized search and a website's algorithm selecting information a user might want to see, withholding information that disagrees with his or her viewpoints.
These filter bubbles are a "great threat to democracy," ZDF reporter Florian Neuhann says. He and his colleague David Gebhard had an idea of what went on in far-right filter bubbles, Neuhann told DW, but were "totally taken aback" by the speed at which the fake account accumulated Facebook friends and by the utter absurdity of the stories being spread.
People in filter bubbles focus their hatred on the same person or phenomenon - like Chancellor Angela Merkel or refugees - and they whip each other into a frenzy to outdo one another with abuse, explains Wolfgang Schweiger, a communication scientist at Hohenheim University.
On day three of the experiment, "Hans Mayer's" timeline brimmed with fake news and lurid headlines: stories about government plans to take away the children of right-wing critics, a report claiming that the city of Cologne had canceled its carnival celebrations for fear of refugees, fake Merkel quotes - all shared thousands of times. The reports often followed a pattern, with an actual source hidden somewhere in the story that had dealt with the issue at hand, however remotely.
Worldwide, populists benefit from such activities; their supporters rarely challenge the "facts" they are presented with.
Humans, Schweiger says, tend to believe information passed on by people who hold the same or similar views as their own.
A week into the experiment, "Mayer" had many friends on Facebook and was invited into closed groups where users openly called for resistance to the system. Inhibitions were gone: Interspersed between cute cat photos and wedding pictures, posts would read "Shoot the refugees, stand them in front of a wall and take aim," while others denied the Holocaust. No one objected.
Blind to other people's views
By day 12, "Mayer" had 250 Facebook friends - real people who never met him in person but felt he shared their beliefs. Neuhann and Gebhard wondered what would happen if "Mayer" were to pipe up and disagree.
So they posted official statistics showing that crime rates have not risen despite the influx of hundreds of thousands of refugees into Germany. To no avail, Neuhann says: "We were either ignored or insulted."
It's a parallel world, Neuhann says. Part of the bubble is so far gone that reasonable arguments cannot reach them, he adds. Others, he argues, are only partially involved: They still have a life and maybe a job, so they might be approachable, though "perhaps not as much online as offline."
Asked whether the reporters are afraid now that their story is out in the open, Neuhann says no, since the "Hans Mayer" account was not registered under their real names.
The account hasn't been deactivated, but the journalists broke off their experiment after three weeks. The right-wing filter bubble continues to exist.