Twitter Creates Advisory Panel on Tweet Speech

Twitter on Tuesday announced the formation of a panel to advise it on speech, abuse and safety issues.

The Trust & Safety Council, which has more than 40 members, will be part of a new strategy to ensure that people can feel safe when they express themselves on Twitter, said Patricia Cartes, head of global policy outreach.

“As we develop products, policies and programs, our Trust & Safety Council will help us tap into the expertise and input of organizations at the intersection of these issues more efficiently and quickly,” she said.

The groups and individuals making up the council include the following:

  • Safety advocates, scholars, and researchers focused on children, media literacy, digital citizenship and efforts that promote greater compassion and empathy on the Internet;
  • Grassroots advocacy groups that use Twitter to build movements and momentum; and
  • Community organizations that work to prevent abuse, harassment and bullying, as well as those involved in mental health and suicide prevention.

Sounding Board

“Twitter is trying to put a little bit more structure and formal process around work they’ve been doing, like a lot of other social media companies, in getting input from advocacy organizations and civil rights and civil liberties groups about what Twitter’s content policies are and how those policies affect their users,” said Emma Llansó, director of the Center for Democracy & Technology’s Free Expression Project and a member of the council.

“Instead of doing a lot of one-off conversations or ad hoc discussions when an issue reaches a boiling point, Twitter will have something a little bit more consistent and regular in place to consult about policies and ideas that they are considering or working on,” she told TechNewsWorld.

The council will be a sounding board for Twitter, added Steve Freeman, director of legal affairs for the Anti-Defamation League, which is also a member of the council.

“We’ve been doing a lot of work for a long time on issues of hate speech and hate online, and the best way to respond to it is to balance respect for freedom of expression and creation of a place where people can communicate without being harassed and bullied,” he told TechNewsWorld.

“Twitter, as have other companies, has had issues with that, and they’re just looking for people who have thought about it and can help them think through some of the challenges,” Freeman added.

More Productive Free Speech

By forming the council, Twitter is seeking assistance in dealing with problems all social media companies face.

“These platforms introduce, like all human relationships, social problems and complexities, like feeling excluded or flaming or bullying or aggressive speech or hate speech,” noted Dacher Keltner, a psychology professor at UC Berkeley and a member of the council.

“This group that Twitter’s brought together are people devoted to figuring out how to use Twitter to promote more productive free speech,” he told TechNewsWorld.

Will the panel’s efforts to create more productive free speech actually restrict expression on Twitter? Keltner doesn’t think so.

“It isn’t about regulating as much as it is providing people with resources and framing the positive potential of Twitter,” he said.

“The idea of providing resources is to give people places to go when they feel there are excesses of free speech or harmful speech,” Keltner added. “I don’t think it’s going to infringe on free speech.”

Problem Bigger Than Council

The Center for Democracy & Technology agreed to be on the council to make sure there is a strong voice for free speech on the panel, Llansó said.

“I hope through this process we can see even more transparency from Twitter about what their policies are, how they enforce them, and create a feedback loop for getting information back to the company about the consequences of the decisions that they’re making,” she said.

The biggest problem with abuse on Twitter may be beyond the reach of the council, however, maintained Bennett Haselton, webmaster of Peacefire.org.

“With or without the Trust & Safety Council, Twitter needs to establish a system where abuse reports can be handled in a scalable way,” he told TechNewsWorld.

“One such way is the simple random-sample-voting algorithm, where abuse reports are reviewed by a random sample of Twitter volunteers, who vote on whether a tweet violated the terms of service, and then Twitter can look at the tweets that get the most abuse votes and make the final decision,” Haselton continued.

“Currently, if abuse reports come from users and are reviewed by Twitter employees, that’s too much of a bottleneck,” he said.

What’s more, Haselton added, “it’s too easy to game the system by having a mob of your friends file phony abuse reports.”
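
As a rough illustration of the random-sample-voting idea Haselton describes, the sketch below shows how reported tweets might be routed to a random handful of volunteer reviewers, with only the most-voted reports escalated to staff. All of the names and thresholds here (VOLUNTEER_POOL, review_abuse_reports, ESCALATION_THRESHOLD) are hypothetical; this is not Twitter's actual moderation pipeline, just a minimal sketch of the scheme as he outlines it.

```python
import random
from collections import Counter

# Hypothetical pool of volunteer reviewers and tuning parameters.
VOLUNTEER_POOL = [f"volunteer_{i}" for i in range(1000)]
SAMPLE_SIZE = 5           # reviewers drawn at random per abuse report
ESCALATION_THRESHOLD = 3  # "violates" votes needed to escalate to staff


def volunteer_vote(volunteer: str, tweet_text: str) -> bool:
    """Placeholder for a volunteer's judgment: True if the tweet looks abusive.

    In a real system this would be a human decision made through a review UI,
    not a keyword check.
    """
    return "abuse" in tweet_text.lower()


def review_abuse_reports(reports: list[dict]) -> list[dict]:
    """Send each reported tweet to a random sample of volunteers and return
    only the reports that collect enough 'violates' votes for a final staff review."""
    escalated = []
    for report in reports:
        reviewers = random.sample(VOLUNTEER_POOL, SAMPLE_SIZE)
        votes = Counter(volunteer_vote(r, report["text"]) for r in reviewers)
        if votes[True] >= ESCALATION_THRESHOLD:
            escalated.append({**report, "violation_votes": votes[True]})
    # Staff look at the most-voted tweets first and make the final call.
    return sorted(escalated, key=lambda r: r["violation_votes"], reverse=True)


if __name__ == "__main__":
    sample_reports = [
        {"tweet_id": 1, "text": "Totally normal tweet about lunch"},
        {"tweet_id": 2, "text": "This is targeted abuse aimed at another user"},
    ]
    for item in review_abuse_reports(sample_reports):
        print(item["tweet_id"], item["violation_votes"])
```

Because each report goes to a random sample of reviewers rather than to the reporter's own followers, a coordinated mob filing phony reports cannot by itself push a tweet past the threshold, which is the scalability and anti-gaming point Haselton raises.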

John Mello is a freelance technology writer and contributor to Chief Security Officer magazine. You can connect with him on Google+.