Image caption: Elon Musk recently acquired Twitter for $44 billion (Stephen Lam/San Francisco Chronicle/AP/picture alliance)
Twitter's sacking of content moderators raises concerns
November 16, 2022
Elon Musk has fired a large number of staffers who used to monitor Twitter for toxic content. A former contract worker told DW that she expects abuse to surge on the platform as a result. Experts are alarmed.
Until last Saturday, it was Melissa Ingle's job to help keep Twitter safe.
As a member of the company's "civic integrity" team, the data scientist monitored the platform for political misinformation about elections from Brazil to the US, and she wrote algorithms to automatically detect similar content.
"We wanted to make sure Twitter was a healthy platform," Ingle told DW.
Then, on November 12, a notification popped up on her phone, telling her she no longer had access to her work emails. When she realized that she had also been logged out of her Slack account, the 48-year-old knew she had been fired. She wasn't alone. Thousands of other contract workers had their contracts terminated last weekend, in addition to about 3,700 employees Twitter had already fired earlier this month.
"The cuts didn't really seem to be targeted," Ingle told DW. "It seems to be huge swaths of people who were just fired."
It is unclear how many people within Twitter currently work on content moderation efforts; the company, whose communications team has also been decimated, did not reply to a request for comment.
But Ingle warned that the recent layoffs would lead to "rises in abuse" and that a lack of human oversight could render the company's automated detection systems increasingly unreliable.
"Twitter will be a less enjoyable place to be," she said.
The role of human reviewers
Social media companies such as Twitter deploy complex mechanisms, parts of which rely on human-machine collaboration, to moderate content on their platforms.
Whenever users post new content, it is automatically scanned by software. If the program finds indications that the post could be illegal or harmful, it is sent to human content moderators for review. Their job is to decide whether the post complies with the rules set by the platform, as well as local laws, and to take it down if it does not.
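The two-stage flow described above can be sketched in a few lines of Python. This is a purely illustrative toy, not Twitter's actual system: the classifier, threshold, and flagged terms are all invented stand-ins for the real machine-learning components.

```python
# Hypothetical sketch of two-stage moderation: an automated classifier
# scores each post, and only posts above a threshold are routed to a
# human review queue. All names and terms are illustrative.

from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    """Posts flagged by the automated scan, awaiting human review."""
    pending: list = field(default_factory=list)

    def add(self, post: str, score: float) -> None:
        self.pending.append((post, score))

def toxicity_score(post: str) -> float:
    """Stand-in for a trained classifier: here, a crude keyword check."""
    flagged_terms = {"fraudulent ballots"}  # illustrative only
    hits = sum(term in post.lower() for term in flagged_terms)
    return min(1.0, hits * 0.6)

def scan_post(post: str, queue: ReviewQueue, threshold: float = 0.5) -> bool:
    """Return True if the post was escalated for human review."""
    score = toxicity_score(post)
    if score >= threshold:
        queue.add(post, score)
        return True
    return False

queue = ReviewQueue()
scan_post("Lovely weather today", queue)              # stays up, no review
scan_post("They counted fraudulent ballots!", queue)  # escalated to a human
```

The point of the design is the hand-off: the software only triages, while the final keep-or-remove decision rests with the human reviewers whose ranks have now been thinned.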
Employing those human content moderators, however, is expensive — and Ingle said that after Musk completed his multi-billion dollar acquisition in late October, staffers were instructed to find ways to minimize their role.
"The only instruction we got after the takeover was that things needed to be automated," Ingle told DW.
But such efforts underestimated the role human reviewers play in keeping a platform safe, she warned. Not only are people often needed to understand subtle cues in a language, such as sarcasm, but they also play an important role when it comes to testing the automated scanning mechanisms of the platform. They ensure that algorithms remain up-to-date, as the nature and language of online debates keep changing.
"My fear is that with not enough people there to update the script of what we're looking for … the filter that exists will get more and more porous," Ingle said.
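Ingle's "porous filter" worry can be made concrete with a small sketch. A static keyword filter catches yesterday's phrasing, but once users coin a new euphemism it silently misses it until a human adds the new term. The terms below are invented examples, not real filter entries.

```python
# Illustrative only: a static keyword filter drifts out of date as
# online language evolves, unless humans keep updating the term list.

def build_filter(blocked_terms: set[str]):
    """Return a predicate that flags posts containing any blocked term."""
    def is_flagged(post: str) -> bool:
        return any(term in post.lower() for term in blocked_terms)
    return is_flagged

# Term list as curated before the layoffs (invented example phrases).
is_flagged = build_filter({"stolen election"})

is_flagged("The stolen election must be overturned")  # caught
is_flagged("The rigged vote must be overturned")      # new phrasing slips through

# A human reviewer notices the gap and updates the list; with the
# moderation teams gone, this maintenance step is what stops happening.
is_flagged = build_filter({"stolen election", "rigged vote"})
is_flagged("The rigged vote must be overturned")      # caught again
```

Without that maintenance loop, each new turn of phrase widens the holes in the filter, which is exactly the decay Ingle predicts.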
'Heading in the wrong direction'
Ingle, a mother of two, said that she had been an avid Twitter user long before she started working for the company about a year ago.
"But now, I am very worried," she said.
Within the two weeks between Musk's takeover and her firing, Ingle said she saw how users increasingly "let loose and attacked people because they feel like that is going to be permitted on the new site."
Initial research seems to back her observation: A study conducted by Tufts University in the US, for instance, found that "post-Musk takeover, the quality of the conversation has decayed," warning that "early signs show the platform is heading in the wrong direction."
Against that backdrop, the ousting of content moderators from Twitter is "worrying on multiple levels," said Eliska Pirkova, a policy analyst at Brussels-based digital rights NGO Access Now.
She warned that more toxic content could flood the platform once Musk decides to reinstate blocked accounts of users like former US President Donald Trump. Whether and when that will happen remains unclear, as does the launch of the "content moderation council" that Musk announced in late October.
At the same time, Pirkova said that she expects marginalized groups and minorities to be particularly targeted by toxic content on Twitter – an assessment echoed by Julian Jaursch, a platform regulation expert at Berlin-based think tank Stiftung Neue Verantwortung.
"Less content moderation will disproportionately affect vulnerable groups," he said.
The recent developments at Twitter, he added, highlight the problems that come with having one person, who is accountable to no one but himself, in charge of one of the world's most important social networks.
"The case of Twitter shows that it's not in the public interest if one person alone controls such an important digital information space."