Social media companies ramp up censorship in response to alt-right extremism

In the wake of the Charlottesville car attack, Reddit and Facebook are removing hate groups from their platforms. Reddit banned r/Physical_Removal, a forum that advocated concentration camps and racial segregation, hosted the wish that CNN would be bombed, and threatened that if Trump were impeached, people would be massacred, maimed, and left to die in the streets. This year Facebook has banned eight hate groups, and this weekend it removed the Charlottesville Unite the Right page. (CNET, August 15, 2017, by Morgan Little and Sean Hollister)

Facebook, Twitter, and YouTube, already responding to European pressure to curtail the use of their platforms by Islamic extremists, are now under the gun to react to racist and neo-Nazi content. The companies have tried to provide forums free of censorship, but aggressive far-right activity on their platforms has moved them to act. (Yahoo!, August 14, 2017, by Jim Finkle and Salvador Rodriguez of Reuters)

GoDaddy booted the neo-Nazi website Daily Stormer for inciting violence, a move prompted by a post on the site by Andrew Anglin mocking Heather Heyer, who was killed in Charlottesville by an alt-right protestor. Harvard law professor Laurence Tribe applauded the move: “It’s well past time for platforms that already exercise some discretion to stop pretending they are just dumb pipes that allow all types of garbage to flow through them.” But the ACLU cautioned that censorship can bite back and that allowing vile speech enables Americans to stand up and counter it. Tech companies remain concerned about sites that promote violence. (The Washington Post, August 14, 2017, by Elizabeth Dwoskin and Tracy Jan, with contributions from Andrew de Grandpre)