The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed

This story was originally published by Wired.

Wired's Adrian Chen goes down the internet rabbit hole to find out who is really responsible for the moderation of platforms like Facebook and Twitter.

The campuses of the tech industry are famous for their lavish cafeterias, cushy shuttles, and on-site laundry services. But on a muggy February afternoon, some of these companies’ most important work is being done 7,000 miles away, on the second floor of a former elementary school at the end of a row of auto mechanics’ stalls in Bacoor, a gritty Filipino town 13 miles southwest of Manila.

When I climb the building’s narrow stairwell, I need to press against the wall to slide by workers heading down for a smoke break. Up one flight, a drowsy security guard staffs what passes for a front desk: a wooden table in a dark hallway overflowing with file folders.

Past the guard, in a large room packed with workers manning PCs on long tables, I meet Michael Baybayan, an enthusiastic 21-year-old with a jaunty pouf of reddish-brown hair. If the space does not resemble a typical startup’s office, the image on Baybayan’s screen does not resemble typical startup work: It appears to show a super-close-up photo of a two-pronged dildo wedged in a vagina. I say appears because I can barely begin to make sense of the image, a baseball-card-sized abstraction of flesh and translucent pink plastic, before he disappears it with a casual flick of his mouse.

Baybayan is part of a massive labor force that handles “content moderation,” the removal of offensive material, for US social-networking sites. As social media connects more people more intimately than ever before, companies have been confronted with the Grandma Problem: Now that grandparents routinely use services like Facebook to connect with their kids and grandkids, they are potentially exposed to the Internet’s panoply of jerks, racists, creeps, criminals, and bullies. They won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video. Social media’s growth into a multibillion-dollar industry, and its lasting mainstream appeal, has depended in large part on companies’ ability to police the borders of their user-generated content, to ensure that Grandma never has to see images like the one Baybayan just nuked.

Read the full piece on Wired here
