Facebook has responded to mounting criticism over fake news and the enforcement of its community standards.
As one of the world’s biggest communication platforms, Facebook acts as a virtual space for around a quarter of the world’s population, but until now the company has been loath to police its own world. It has policed non-violent issues like nudity vigorously, but has been slow to respond to widely shared content such as fake news and the promotion of violence.
This came to a head in Myanmar, when messages, posts and comments attacking the Rohingya, a Muslim minority group, created an atmosphere in which violence flourished.
“We were too slow to get started in Myanmar,” admitted Facebook CEO Mark Zuckerberg in a statement last week.
This statement came in the wake of a report on the state of Facebook-incited violence in Myanmar, released earlier this month. Facebook commissioned Business for Social Responsibility (BSR), an independent non-profit organisation with expertise in human rights practices, to evaluate the extent of Facebook’s involvement in the recent spread of violent attacks in Myanmar. BSR’s report focused on Facebook’s role in the persecution of the Rohingya.
A strong message throughout the report was that Facebook struggles to manage the spread of fake news, creating an atmosphere that promotes violence. The misinformation ultimately distorts Myanmar users’ perceptions of what is true and what is false.
“Facebook has become a means for those seeking to spread hate and cause harm,” said BSR in its 60-page report, which gives Facebook a long list of potential fixes to implement. The organisation warned that, if the situation were not remedied before Myanmar’s parliamentary elections in 2020, the vote would be “likely to be a flashpoint for hate speech, harassment, misinformation, incitement to violence, and other actions designed to undermine the political process”.
Zuckerberg said that Facebook had taught him painful lessons about the way people connect: “One of the most painful lessons I’ve learned is that when you connect two billion people, you will see all the beauty and ugliness of humanity.”
According to a Facebook statement released with the BSR report, there are still challenges ahead for the platform’s efforts to remedy the situation, because Myanmar has not standardised on Unicode for the digital representation of the Burmese language. Facebook said this makes it difficult to detect posts that contain hate speech. As a fix, the company will encourage the country’s transition to Unicode, and will not allow new users to make use of non-Unicode typefaces.
Unicode is an international standard for representing languages and scripts, in which each letter, digit or symbol is assigned a unique number, known as a code point.
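The idea can be illustrated with a short Python sketch (the sample word is illustrative, chosen for this example): under Unicode, each Burmese character maps to a single code point in the dedicated Myanmar block (U+1000 to U+109F), which is what makes automated detection of text in the language tractable once everyone uses the same standard.

```python
# A minimal illustration of Unicode code points for Burmese script.
# The word below is "Myanmar" written in Burmese; each character has
# a unique code point in the Myanmar block (U+1000 to U+109F).
word = "မြန်မာ"

for ch in word:
    # ord() returns the character's Unicode code point as an integer.
    print(f"{ch!r} -> U+{ord(ch):04X}")

# Because every character sits in a known range, software can reliably
# identify Burmese text -- a precondition for filtering it at scale.
is_burmese = all(0x1000 <= ord(ch) <= 0x109F for ch in word)
print(is_burmese)
```

The difficulty Facebook describes arises because the widely used Zawgyi font reuses these same code points with different meanings, so the byte sequences alone do not tell you which character the author intended.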
Facebook also said in a statement issued on Thursday that it realises that nuances exist in how the platform is used in different countries. In Myanmar, it is primarily used as an information source; users in Myanmar often refer to Facebook and the Internet interchangeably. This is due to Facebook’s Free Basics service, which allows users to make use of the platform without paying for data.
This highlights the power Facebook has over the South-East Asian country’s communications and level of information. For this reason, the company said, it would be launching the Facebook Journalism Project, which includes journalist safety training for Myanmar’s journalists.
While this news comes as a relief to some, many within Myanmar are less convinced. An activist who goes only by the name Zarni has called the report “a complete whitewash”. He said that Facebook was “resorting to PR” instead of taking stronger measures to stop the spread of fake news on its platform.