The panic attacks started after Chloe watched a man die. She has spent the past three and a half weeks in training, trying to harden herself against the daily onslaught of disturbing posts: the hate speech, the violent attacks, the graphic pornography. In a few more days, she will become a full-time Facebook content moderator, or what the company she works for, a professional services vendor named Cognizant, opaquely calls a “process executive.”

For this portion of her education, Chloe will have to moderate a Facebook post in front of her fellow trainees. When it’s her turn, she walks to the front of the room, where a monitor displays a video that has been posted to the world’s largest social network. None of the trainees have seen it before, Chloe included. She presses play.

The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice…
Source: “The secret lives of Facebook moderators in America,” www.theverge.com, February 25, 2019.