
Facebook Under Fire for Bizarre Child Predator Survey Question

By David Jones
Mar 5, 2018 3:57 PM PT

Facebook has come under fire after posing a survey question over the weekend about how it should deal with predatory sexual behavior against children.

Survey participants were asked whether sexual predators should be allowed to request photographic images from 14-year-old girls online. Further, users were queried about how Facebook should handle such a request if it learned about it.

The survey also asked whether the site should better manage content involving extremist behavior and whether cultural norms should be taken into account.

Mea Culpa

After Jonathan Haynes, digital editor at The Guardian, tweeted how "out of touch" he thought Facebook was in connection with the survey, Guy Rosen, vice president of product at Facebook, issued a response.

At its annual Global Safety Summit last week, Facebook addressed several key issues related to making sure the site offers a safe and protective environment for users to navigate.

The summit included panel discussions on a variety of online safety topics that Facebook has had to grapple with over the years -- including how to use cutting-edge technology to combat exploitative and predatory behavior, how to keep children from overusing social media, and how to make Facebook more family-friendly.

However, the survey question was unrelated to the event, according to Facebook.

Regaining Control

Safety issues have come up repeatedly for Facebook in recent years, as it has come under fire for how it deals with abusive and obscene material. How it works with government agencies to combat hate speech and calls for violence from extremists and terrorist organizations also has been a matter of controversy.

Other problems include the use of Facebook Live to live-stream suicides and criminal behavior, and the ongoing use of the platform to spread fake news.

A coalition of former Google and Facebook executives last month formed a group to combat the growing use of social media sites by young children, who increasingly engage with them for so many hours a day that many parents have struggled to keep them from being exposed to strangers.

That has been exacerbated by social media sites specifically targeting children with apps for instant messaging and social media interactions.

The latest Facebook episode is part of a wider problem the company has faced as it has tried to gain control over the fake news filtering through the site, observed Rick Edmonds, media business analyst at Poynter.

Facebook has been playing a game of whack-a-mole, in essence, and as soon as one controversy begins to subside, another rears its head, he told TechNewsworld.

"Here's an analogy," Edmonds said. "If I call up and threaten to kill you, I have committed a crime, but the phone company hasn't. Facebook would like to be regarded the same way -- or did until the events of the last few years."

Activities like threats of violence, sexual grooming of children and spreading fake news shine a spotlight on Facebook's role as a publisher rather than a passive utility.

David Jones is a freelance writer based in Essex County, New Jersey. He has written for Reuters, Bloomberg, Crain's New York Business and The New York Times.
