Facebook requests input on hard questions about fake news and censorship

How should Facebook decide what’s allowed on its social network, and how should it balance safety and truth with diverse opinions and cultural norms? Facebook wants your feedback on the toughest issues it’s grappling with, so today it published a list of seven “hard questions” and an email address — hardquestions@fb.com — where you can send feedback and suggestions for more questions it should address.

Facebook’s plan is to publish blog posts examining its logic around each of these questions, starting later today with one about responding to the spread of terrorism online, and how Facebook is attacking the problem.

“Even when you’re skeptical of our choices, we hope these posts give a better sense of how we approach them — and how seriously we take them,” Facebook’s VP of public policy Elliot Schrage writes. “And we believe that by becoming more open and accountable, we should be able to make fewer mistakes, and correct them faster.”

Here’s the list of hard questions with some context from TechCrunch about each:

  • How should platforms approach keeping terrorists from spreading propaganda online?

Facebook has worked in the past to shut down Pages and accounts that blatantly spread terrorist rhetoric. But the tougher decisions come at the gray-area fringes, where it must determine where to draw the line between outspoken discourse and propaganda.

  • After a person dies, what should happen to their online identity?

Facebook currently makes people’s accounts into memorial pages that can be moderated by a loved one that they designate before they pass away, but it’s messy to give that control to someone, even a family member, if the deceased didn’t make the choice.

  • How aggressively should social media companies monitor and remove controversial posts and images from their platforms? Who gets to decide what’s controversial, especially in a global community with a multitude of cultural norms?

Facebook has to walk a thin line between making its app safe for a wide range of ages as well as advertisers, and avoiding censorship of hotly debated topics. Facebook has recently gotten into hot water over temporarily taking down videos of the aftermath of police violence, and of child nudity in a newsworthy historical photo pointing out the horrors of war.

  • Who gets to define what’s false news — and what’s simply controversial political speech?

Facebook has been racked with criticism since the 2016 US presidential election over claims that it didn’t do enough to prevent the spread of fake news, including right-wing conspiracy theories and exaggerations that may have given Donald Trump an advantage. But if Facebook becomes the truth police and makes polarizing decisions, it could alienate the conservative side of its user base and further fracture online communities.

  • Is social media good for democracy?

On a similar front, Facebook is dealing with how peer-to-peer distribution of “news” omits the professional editors who typically protect readers from inaccuracy and misinformation. That problem is exacerbated because sensationalist or deceitful content is often the most engaging, and that’s what the News Feed highlights.

  • How can we use data for everyone’s benefit, without undermining people’s trust?

Facebook is a data mining machine, for better or worse. This data powers helpful personalization of content, but it also enables highly targeted advertising and gives Facebook massive influence over a wide range of industries, as well as over our privacy.

  • How should young internet users be introduced to new ways to express themselves in a safe environment?

What’s important news or lighthearted entertainment for adults can be shocking or disturbing for kids. Meanwhile, Facebook must balance giving younger users the ability to connect with each other and form support networks with keeping them safe from predators.