Thursday 16 March 2017

Facebook: Moderation tool flawed during BBC investigation

Facebook has been under fire after a report from the BBC last week called out the social network for not taking down sexualized images of children on its site.
On Tuesday, Facebook said the moderation tool that should have flagged the images during the investigation wasn't working, according to a follow-up report by the BBC.
"We welcome when a journalist or a safety organisation contacts us and says we think there is something going wrong on your platform," Facebook UK Director Simon Milner told members of Parliament, according to the report. "We welcome that because we know that we do not always get it right."
Facebook didn't immediately respond to a request for comment.
The social network, with its 1.86 billion members, has been grappling with what to show on its site. In addition to the controversy over sexualized images of children, it has wrestled with censorship issues as fake news stories spread on its platform, expanded its focus on live video, and struggled with how to handle violence in live broadcasts.
Specifically, the BBC story pointed to 100 posts reported to the social network that featured sexualized images of children or sexualized comments about them. Only 18 were removed at the time, though all have since been taken down.
Among the allegations in the BBC report: Facebook hosted groups created by pedophiles, an image that appeared to be a still frame from a child abuse video, and five accounts belonging to convicted pedophiles -- whom Facebook explicitly bans.
