What Facebook's Improved Community Standards Mean for Filipino Users
Picture this: you’re coming home from work. It’s already late, and you’ve decided to tough it out standing on the bus rather than wait out the tide of commuters. You pull out your phone to pass the time while traffic is at a standstill. You tap the icon and begin browsing Facebook when, suddenly, a friend’s shared post pops up. It’s a long read, but the picture grabs your attention: a naked woman in bed. You read the post, taking care that nobody sees your screen and mistakes it for you looking at porn in public. The poster is the woman’s ex-boyfriend; she apparently cheated on him, and he is sharing the photo as revenge.
Sound familiar? If Facebook does their job right, it shouldn’t be. Posts like these violate their Community Standards, the rules that outline what can and can’t be shared on the platform. And they’ve been doing a lot on this front: from July to September alone, Facebook removed 30.8 million pieces of content that violated the Community Standards, 96% of which were taken down automatically before anybody could report them.
Recently, Facebook has been updating and fine-tuning its Community Standards to better ensure that no unsavory content gets through. As Sheen Handoo, Facebook Asia-Pacific’s Content Policy Manager, puts it, the goal is to “create an environment where people feel both free and safe to express themselves.” They achieve this by consulting with experts, employing over 20,000 people around the globe to deal with safety and security issues, and investing heavily in AI to proactively detect violating content.
All of these measures are in place to make sure any content that violates Facebook’s standards is removed, if not instantly, then within 24 hours of being reported. Facebook currently classifies violating content into eight categories: fake accounts, spam, nudity and sexual activity, violent content, terrorist propaganda, hate speech, bullying and harassment, and child nudity and sexual exploitation.
As of last year, 69 million Filipinos had access to the Internet, and almost all of them have a Facebook account. For a sizeable percentage of them, Facebook constitutes the entirety of their internet usage. Facebook has evolved from a site for reconnecting with friends into a platform analogous to a public forum, an extension of the real world. Issues on the platform get magnified and affect millions of people in major ways.
Facebook acknowledges this and understands that their current system, robust as it is, still isn’t perfect. Some categories, like hate speech and bullying, are heavily context-dependent and still require manual review.
Another issue especially relevant to Filipino Facebook users involves fake accounts and blatant misinformation. Most fake accounts are caught by automated systems at registration, and Facebook partners with independent fact-checkers to keep blatant misinformation from spreading across the platform, but some inevitably fall through the cracks.
This gray area is what makes “troll farms” so effective and dangerous: they skirt the Community Standards on technicalities and push their agendas, usually achieving virality through sheer numbers. And when a sizeable number of people get most of their information exclusively from Facebook, public opinion can easily be swayed.
Ultimately, though, Facebook’s greatest resource is its users. While the Community Standards serve as guidelines on what is and isn’t acceptable, it still falls on users to discern and fine-tune what shouldn’t be allowed on Facebook, especially when it comes to content that falls through the cracks. The best thing anybody can do upon seeing offending content is to report it. In uncertain times, the best defense is constant vigilance.