Fake news spread via Facebook has triggered several communal clashes in Bangladesh, resulting in deaths. Bangladesh's post and telecommunication minister told DW that Facebook has failed to come up with measures to identify hate speech.
Bangladesh has one of the fastest-growing internet user bases in Asia, with government data showing that more than 50% of its population actively uses the medium.
Among internet users, Facebook is the most popular social networking platform, with some 50 million subscribers — almost one-third of the country's population. In 2017, a study by We Are Social and Hootsuite found that Bangladesh's capital, Dhaka, was the city with the second-largest number of active Facebook users.
But while the platform does contribute largely to making the voices of marginal communities heard, in recent years it has sparked some significant social and communal conflicts in the country.
Fake news sparks violence
A number of violent clashes in Bangladesh following rumors and fake news spread through Facebook have resulted in deaths.
On October 20, hundreds of Muslims took to the streets in the town of Borhanuddin in Bangladesh's Bhola district, 195 kilometers (120 miles) from the capital, Dhaka, to protest a derogatory Facebook post about Islam's Prophet Muhammad that was allegedly written by a Hindu man.
Clashes between protesters and police ensued, and four people were killed. Police said the young man's Facebook account had been hacked and that the hackers had been trying to orchestrate a clash between the two communities.
Earlier, in June, a rumor spread on Facebook that a bridge under construction required human sacrifices as offerings and that people were consequently looking for children to kidnap. Agitated mobs in Bangladesh beat several people to death on the street after suspecting them of being kidnappers.
Facebook's efforts not enough
A spokesperson from Facebook told DW that the company is equipped with proactive tools to detect hate speech, rumors, and related content. It has a team of 15,000 people who speak 50 different languages and are employed to review content from around the world.
The company also said it is working to develop effective artificial intelligence (AI) to proactively detect violent content.
But experts say the steps Facebook has taken are not enough to tackle the millions of texts and images uploaded by almost 2 billion users every day.
In September 2018, the United Nations Fact Finding Mission in Myanmar released its full report on potential genocide, human rights abuses and war crimes against the Rohingya ethnic minority.
The report stated that, "Facebook has been a useful instrument for those seeking to spread hate, in a context where, for most users, Facebook is the internet."
The Mission noted that "the response of Facebook has been slow and ineffective" and that "Facebook is unable to provide country-specific data about the spread of hate speech on its platform, which is imperative to assess the adequacy of its response."
Bangladesh has expressed similar concerns regarding Facebook's steps on reviewing content.
Parallels drawn between Myanmar and Bangladesh
Bangladesh's Post and Telecommunication Minister Mustafa Jabbar criticized Facebook "for its inability to come up with immediate measures to identify content spreading hate speech and disinformation in Bangladesh."
"It seems to me, Facebook is either unable or not paying necessary attention in this regard,'' Jabbar, who is also an IT expert, told DW.
Jabbar noted that Facebook's automatic system often fails to identify the Bengali language, and questioned how a system could review content if it cannot even detect the language properly.
Facebook told DW that, in light of the incident in Bhola, Bangladesh, it was closely monitoring the situation and was in touch with local authorities and its partners on the ground.
"We will take any action necessary to remove content that violates our policies or poses a risk to people's safety," Facebook said. "We also urge everyone to use our reporting tools if they see any behavior that puts people's safety at risk."
Culture-specific measures needed
Atiqur Rahman, a media expert and PhD researcher at Australia's Queensland University of Technology, said that Bangladesh's collective social structure means it does not take much time for information to spread widely.
Atiqur, who is also an associate professor of Bangladesh's state-run Chittagong University, said Facebook would only be able to tackle such a situation if they set up a monitoring team that considered the nature of Bangladesh's society.
Minister Jabbar echoed Atiqur's sentiment, saying that instead of looking at Bangladesh's culture from a European perspective, Facebook should focus on understanding the local context.
Media literacy lacking
Analysts have said the responsibility for checking hate speech and rumors rests not only with Facebook authorities but also the subscribers using the platform.
Facebook has urged the Bangladeshi government to develop Facebook users' media literacy to help them use social media in a responsible manner.
Checking such unwanted content requires a high level of awareness among users, said Atiqur, adding that social and political awareness among the people, along with good governance, should be in place.
The Bangladeshi government has initiated a number of awareness-raising programs related to social media education, according to Minister Jabbar.
Jabbar said his government is planning to introduce the concept of media literacy in a textbook, which he believes could effectively help younger generations behave responsibly on social media.