The Overlooked Problem with Facebook’s Content Moderation

Facebook is a large and consequential influence in today’s world. Starting out as a way to connect Harvard students, it has morphed into today’s most prominent and powerful distributor of news, shaping our everyday lives along the way. In fact, as of August 2017, 2.047 billion people were active on Facebook, according to a report by Statista, a statistics and market data company based in Germany. Recently, however, social media platforms like Facebook have faced increased pressure to moderate content, and the way Facebook handles flagged posts is impractical and prone to more than the average number of mistakes.

To give a quick synopsis of how moderation is handled on Facebook: whenever a user flags a post, it goes to a division of the company called the “community operations team,” which comprises several thousand people who review flagged content. In an article from NPR, employees of this division say they are told to work quickly when evaluating a flagged report, and they are measured on the speed at which they handle these posts. To put it into perspective, the average worker in the department makes a decision on a flagged post once every 10 seconds. In addition, the people handling these flagged posts do not see the full context, such as the profiles of the users involved. This methodology has produced considerable controversy; in 2016, Facebook censored the Pulitzer Prize-winning photo known as “Napalm Girl” because it contained nudity. Facebook faced an uproar of criticism from Norwegian users and press, who accused it of censoring one of the most powerful images in history. Espen Egil Hansen, the editor-in-chief of Aftenposten, the largest Norwegian newspaper, stated that Zuckerberg was “the world’s most powerful editor” (from an article by NPR).

More recently, Facebook has been in the news for censoring accounts reporting on the Myanmar conflict. The Myanmar (Burmese) conflict is an ongoing military crackdown against an ethnic group known as the Rohingya, an estimated one million people in Myanmar. Conflict between the Rohingya and their government is not new, as they are stateless under the 1982 Myanmar nationality law (the Myanmar government requires that a person belong to a recognized ethnic group in order to be classified as a citizen). On February 3rd, 2017, the United Nations released a report on the treatment of the Rohingya people by the Myanmar government, documenting “mass gang-rape, killings, including of babies and young children, brutal beatings, disappearances and other serious human rights violations by the country’s security forces.” As of October 2, the conflict was still ongoing, and the United Nations reported that about 720,000 children had been displaced and were fleeing Myanmar because of it. In response, Rohingya users turned to social media platforms to report accounts of the violence. However, Facebook removed their posts and shut down their accounts, as stated in an article by the Daily Beast. Facebook responded that it removed the posts because the content “did not follow Facebook Community Standards,” referring to posts about military activity in Rakhine State that depicted helicopters flying over the villages where most of the Rohingya population lives.

Facebook’s method of handling these controversial posts poses a problem: at the rapid pace at which Facebook reviews its posts, it is difficult to distinguish context, because the subjects involved can be very complex.