A special team within Facebook has the difficult job of policing hate and harassment on its site, a virtual police squad wielding unprecedented power to regulate speech.
The New York Times
December 12, 2010
By Miguel Helft
PALO ALTO, Calif. — Mark Zuckerberg, the co-founder and chief executive of Facebook, likes to say that his Web site brings people together, helping to make the world a better place. But Facebook isn’t a utopia, and when it comes up short, Dave Willner tries to clean up.
Dressed in Facebook’s quasi-official uniform of jeans, a T-shirt and flip-flops, the 26-year-old Mr. Willner hardly looks like a cop on the beat. Yet he and his colleagues on Facebook’s “hate and harassment team” are part of a virtual police squad charged with taking down content that is illegal or violates Facebook’s terms of service. That puts them on the front line of the debate over free speech on the Internet.
That role came into sharp focus last week as the controversy about WikiLeaks boiled over on the Web, with coordinated attacks on major corporate and government sites perceived to be hostile to that group.
Facebook took down a page used by WikiLeaks supporters to organize hacking attacks on the sites of such companies, including PayPal and MasterCard; it said the page violated the terms of service, which prohibit material that is hateful, threatening, pornographic or incites violence or illegal acts. But it did not remove WikiLeaks’s own Facebook pages.
Facebook’s decision in the WikiLeaks matter illustrates the complexities that the company grapples with, on issues as diverse as that controversy, verbal bullying among teenagers, gay-baiting and religious intolerance.
With Facebook’s prominence on the Web — its more than 500 million members upload more than one billion pieces of content a day — the site’s role as an arbiter of free speech is likely to become even more pronounced.
“Facebook has more power in determining who can speak and who can be heard around the globe than any Supreme Court justice, any king or any president,” said Jeffrey Rosen, a law professor at George Washington University who has written about free speech on the Internet. “It is important that Facebook is exercising its power carefully and protecting more speech rather than less.”
But Facebook rarely pleases everyone. Any piece of content — a photograph, video, page or even a message between two individuals — could offend somebody. Decisions by the company not to remove material related to Holocaust denial or pages critical of Islam and other religions, for example, have annoyed advocacy groups and prompted some foreign governments to temporarily block the site.
Some critics say Facebook does not do enough to prevent certain abuses, like bullying, and may put users at risk with lax privacy policies. They also say the company is often too slow to respond to problems.
For example, a page lampooning and, in some instances, threatening violence against an 11-year-old girl from Orlando, Fla., who had appeared in a music video, was still up last week, months after users reported the page to Facebook. The girl’s mother, Christa Etheridge, said she had been in touch with law enforcement authorities and was hoping the offenders would be prosecuted.
“I’m highly upset that Facebook has allowed this to go on repeatedly and to let it get this far,” she said.
A Facebook spokesman said the company had left the page up because it did not violate its terms of service, which allow criticism of a public figure. The spokesman said that by appearing in a band’s video, the girl had become a public figure, and that the threatening comments had not been posted until a few days ago. Those comments, and the account of the user who had posted them, were removed after The New York Times inquired about them.
Facebook says it is constantly working to improve its tools to report abuse and trying to educate users about bullying. And it says it responds as fast as it can to the roughly two million reports of potentially abusive content that its users flag every week.
“Our intent is to triage to make sure we get to the high-priority, high-risk and high-visibility items most quickly,” said Joe Sullivan, Facebook’s chief security officer.
In early October, Mr. Willner and his colleagues spent more than a week dealing with one high-risk, highly visible case: rogue citizens of Facebook’s world had posted antigay messages and threats of violence on a page inviting people to remember Tyler Clementi and other gay teenagers who had committed suicide, on so-called Spirit Day, Oct. 20.
Working with colleagues here and in Dublin, they tracked down the accounts of the offenders and shut them down. Then, using an automated technology to tap Facebook’s graph of connections between members, they tracked down more profiles of people who, as it turned out, had also been posting violent messages.
“Most of the hateful content was coming from fake profiles,” said James Mitchell, who is Mr. Willner’s supervisor and leads the team. He said that because most of these profiles, created by people he called “trolls,” were connected to those of other trolls, Facebook could track down and block an entire network relatively quickly.
Using the system, Mr. Willner and his colleagues silenced dozens of troll accounts, and the page became usable again. But trolls are repeat offenders, and it took Mr. Willner and his colleagues nearly 10 days of round-the-clock monitoring to take down more than 7,000 profiles that kept resurfacing to attack the Spirit Day event page.
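The article describes this only in broad strokes: because fake troll profiles tended to be connected to other troll profiles, following the graph of connections from known offenders surfaced the rest of the network. As a rough illustration, and not a description of Facebook's actual system, the idea amounts to a breadth-first traversal of a friend graph starting from flagged accounts; all names and data below are hypothetical.

```python
# Hypothetical sketch of expanding a network of flagged accounts by
# traversing a connection graph, as the article describes in broad terms.
# This is NOT Facebook's actual tooling; names and data are invented.
from collections import deque

# Toy adjacency map: account -> accounts it is connected to
graph = {
    "troll_a": {"troll_b", "troll_c"},
    "troll_b": {"troll_a", "troll_d"},
    "troll_c": {"troll_a"},
    "troll_d": {"troll_b"},
    "bystander": {"troll_c"},  # links *to* a troll but is never reached from the seeds
}

def expand_network(graph, seeds):
    """Breadth-first traversal from known offending accounts,
    returning every account reachable through the connection graph."""
    seen = set(seeds)
    queue = deque(seeds)
    while queue:
        account = queue.popleft()
        for neighbor in graph.get(account, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

# Starting from one flagged profile, the traversal recovers the
# connected cluster of four troll accounts.
network = expand_network(graph, {"troll_a"})
```

In practice, of course, accounts surfaced this way would still be reviewed before being blocked, since ordinary users can be connected to abusive ones; the sketch only shows why a tightly interconnected cluster of fake profiles can be found, and shut down, relatively quickly.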
Most abuse incidents are not nearly as prominent or public as the defacing of the Spirit Day page, which had nearly 1.5 million members. As with schoolyard taunts, they often happen among a small group of people, hidden from casual view.
On a morning in November, Nick Sullivan, a member of the hate and harassment team, watched as reports of bullying incidents scrolled across his screen, full of mind-numbing meanness. “Emily looks like a brother.” (Deleted) “Grady is with Dave.” (Deleted) “Ronald is the biggest loser.” (Deleted) Although the insults are relatively mild, as attacks on specific people who are not public figures, these all violated the terms of service.
“There’s definitely some crazy stuff out there,” Mr. Sullivan said. “But you can do thousands of these in a day.”
Nancy Willard, director of the Center for Safe and Responsible Internet Use, which advises parents and teachers on Internet safety, said her organization frequently received complaints that Facebook does not quickly remove threats against individuals. Jim Steyer, executive director of Common Sense Media, a nonprofit group based in San Francisco, also said that many instances of abuse seemed to fall through the cracks.
“Self-policing can take some time, and by then a lot of the damage may already be done,” he said.
Facebook maintains it is doing its best.
“In the same way that efforts to combat bullying offline are not 100 percent successful, the efforts to stop people from saying something offensive about another person online are not complete either,” Joe Sullivan said.
Facebook faces even thornier challenges when policing activity that is considered political by some, and illegal by others, like the controversy over WikiLeaks and the secret diplomatic cables it published.
Last spring, for example, the company declined to take down pages related to “Everybody Draw Muhammad Day,” an Internetwide protest to defend free speech that surfaced in repudiation of death threats received by two cartoonists who had drawn pictures of Muhammad. A lot of the discussion on Facebook involved people in Islamic countries debating with people in the West about why the images offended.
Facebook’s team worked to separate the political discussion from the attacks on specific people or Muslims. “There were people on the page that were crossing the line, but the page itself was not crossing the line,” Mr. Mitchell said.
Facebook’s refusal to shut down the debate caused its entire site to be blocked in Pakistan and Bangladesh for several days.
Facebook has also sought to walk a delicate line on Holocaust denial. The company has generally refused to block Holocaust denial material, but has worked with human rights groups to take down some content linked to organizations or groups, like the government of Iran, for which Holocaust denial is part of a larger campaign against Jews.
“Obviously we disagree with them on Holocaust denial,” said Rabbi Abraham Cooper, associate dean of the Simon Wiesenthal Center. But Rabbi Cooper said Facebook had done a better job than many other major Web sites in developing a thoughtful policy on hate and harassment.
The soft-spoken Mr. Willner, who on his own Facebook page describes his political views as “turning swords into plowshares and spears into pruning hooks,” makes for an unlikely enforcer. An archaeology and anthropology major in college, he said that while he loved his job, he did not love watching so much of the underbelly of Facebook.
“I handle it by focusing on the fact that what we do matters,” he said.
Copyright 2010 The New York Times Company