Deplatforming

From Wikipedia, the free encyclopedia

Deplatforming, also known as no-platforming, has been defined as an "attempt to boycott a group or individual through removing the platforms (such as speaking venues or websites) used to share information or ideas",[1] or "the action or practice of preventing someone holding views regarded as unacceptable or offensive from contributing to a forum or debate, especially by blocking them on a particular website."[2]

History

In the United States, the banning of speakers on university campuses dates back to the 1940s and was carried out under the policies of the universities themselves. The University of California had a policy known as the Speaker Ban, codified in university regulations under President Robert Gordon Sproul, that mostly, but not exclusively, targeted communists. One rule stated that "the University assumed the right to prevent exploitation of its prestige by unqualified persons or by those who would use it as a platform for propaganda." This rule was used in 1951 to block Max Shachtman, a socialist, from speaking at the University of California at Berkeley. The rule was not applied exclusively to communists: in 1947, former U.S. Vice President Henry A. Wallace was banned from speaking at UCLA because of his views on U.S. Cold War policy,[3] and in 1961, Malcolm X was prohibited from speaking at Berkeley as a religious leader.

Controversial speakers invited to appear on college campuses have faced deplatforming attempts to disinvite them or to otherwise prevent them from speaking.[4] The British National Union of Students established its No Platform policy as early as 1973.[5] In the mid-1980s, visits by South African ambassador Glenn Babb to Canadian college campuses faced opposition from students opposed to apartheid.[6]

In the United States, recent examples include the March 2017 disruption by protestors of a public speech at Middlebury College by political scientist Charles Murray.[4] In February 2018, students at the University of Central Oklahoma rescinded a speaking invitation to creationist Ken Ham, after pressure from an LGBT student group.[7][8] In March 2018, a "small group of protesters" at Lewis & Clark Law School attempted to stop a speech by visiting lecturer Christina Hoff Sommers.[4] Adam Carolla and Dennis Prager documented their disinvitation, and others, in their 2019 film No Safe Spaces.[9] As of February 2020, the Foundation for Individual Rights in Education, a speech advocacy group, documented 469 disinvitation or disruption attempts at American campuses since 2000,[10] including both "unsuccessful disinvitation attempts" and "successful disinvitations"; the group defines the latter category as including three subcategories: formal disinvitation by the sponsor of the speaking engagement; the speaker's withdrawal "in the face of disinvitation demands"; and "heckler's vetoes" (situations when "students or faculty persistently disrupt or entirely prevent the speakers' ability to speak").[11]

Social media

Beginning in 2015, Reddit banned several communities on the site ("subreddits") for violating the site's anti-harassment policy.[12] A 2017 study published in the journal Proceedings of the ACM on Human-Computer Interaction, examining "the causal effects of the ban on both participating users and affected communities," found that "the ban served a number of useful purposes for Reddit" and that "Users participating in the banned subreddits either left the site or (for those who remained) dramatically reduced their hate speech usage. Communities that inherited the displaced activity of these users did not suffer from an increase in hate speech."[12] In June 2020 and January 2021, Reddit also banned two prominent online pro-Trump communities over violations of the website's content and harassment policies.

On May 2, 2019, Facebook and the Facebook-owned platform Instagram announced a ban of "dangerous individuals and organizations" including Nation of Islam leader Louis Farrakhan, Milo Yiannopoulos, Alex Jones and his organization InfoWars, Paul Joseph Watson, Laura Loomer, and Paul Nehlen.[13][14] In the wake of the 2021 storming of the US Capitol, Twitter banned then-president Donald Trump, as well as 70,000 other accounts linked to the event and the far-right movement QAnon.

Donald Trump

On January 6, 2021, in a joint session of the United States Congress, the counting of the votes of the Electoral College was interrupted by a breach of the United States Capitol chambers. The rioters were supporters of President Donald Trump who hoped to delay and overturn the President's loss in the 2020 election. The event resulted in five deaths and at least 400 people being charged with crimes.[15] The certification of the electoral votes was completed only in the early morning hours of January 7, 2021. In the wake of several tweets by President Trump on January 7, 2021, Facebook, Instagram, YouTube, Reddit, and Twitter all deplatformed Trump to some extent.[16][17][18][19] Twitter suspended his personal account, which the company said risked being used to promote further violence. Trump subsequently tweeted similar messages from the President's official U.S. government account, @POTUS, which led Twitter to ban him permanently from the platform on January 8.[20]

According to adviser Jason Miller, speaking on a Fox News broadcast, Trump planned to return to social media by May or June 2021 through a new platform of his own.[21][22]

Other examples

In print media

In December 2017, after learning that a French artist it had previously reviewed was a neo-Nazi, the San Francisco punk magazine Maximum Rocknroll apologized and announced that it had "a strict no-platform policy towards any bands and artists with a Nazi ideology".[23]

Criticism

In 2019, students at the University of the Arts in Philadelphia circulated an online petition demanding that Camille Paglia "should be removed from UArts faculty and replaced by a queer person of color."[24] Paglia, a tenured professor for over 30 years who identifies as transgender, had long been outspoken on controversial "matters of sex, gender identity, and sexual assault".[24] Conor Friedersdorf, writing in The Atlantic about the 2019 campaign to remove Paglia, wrote: "It is rare for student activists to argue that a tenured faculty member at their own institution should be denied a platform. Otherwise, the protest tactics on display at UArts fit with standard practice: Activists begin with social-media callouts; they urge authority figures to impose outcomes that they favor, without regard for overall student opinion; they try to marshal antidiscrimination law to limit freedom of expression."[24] Friedersdorf pointed to evidence of a chilling effect on free speech and academic freedom. Of the faculty members he had contacted for interviews, a large majority "on both sides of the controversy insisted that their comments be kept off the record or anonymous. They feared openly participating in a debate about a major event at their institution—even after their university president put out an uncompromising statement in support of free speech".[24]

According to technology journalist Declan McCullagh, "Silicon Valley's efforts to pull the plug on dissenting opinions" began around 2018 with Twitter, Facebook, and YouTube denying service to selected users of their platforms, "devising excuses to suspend ideologically disfavored accounts."[25] In 2019, McCullagh predicted that paying customers would become targets for deplatforming as well, citing protests and open letters by employees of Amazon, Microsoft, Salesforce, and Google who opposed policies of U.S. Immigration and Customs Enforcement (ICE), and who reportedly sought to influence their employers to deplatform the agency and its contractors.[25]

Law professor Glenn Reynolds dubbed 2018 the "Year of Deplatforming" in an August 2018 article in The Wall Street Journal. Reynolds criticized the decision of "internet giants" to "slam the gates on a number of people and ideas they don't like", naming Alex Jones and Gavin McInnes, and stated, "If you rely on someone else's platform to express unpopular ideas, especially ideas on the right, you're now at risk."[26] Reynolds cited further restrictions on "even mainstream conservative figures" such as Dennis Prager, as well as Facebook's blocking of a campaign advertisement by a Republican candidate "ostensibly because her video mentioned the Cambodian genocide, which her family survived."[26] Reynolds wrote that in contrast, "Extremists and controversialists on the left have been relatively safe from deplatforming," concluding that "the fact that a few corporations can play such a disproportionate role in deciding what subjects are open for debate is a problem" as a matter of free speech.[26]

Defense

Supporters of deplatforming have justified the action on the grounds that it produces the desired effect of reducing what they characterize as "hate speech".[12][27][28] Angelo Carusone, president of the progressive organization Media Matters for America, who had run deplatforming campaigns against conservative talk hosts Rush Limbaugh in 2012 and Glenn Beck in 2010, pointed to Twitter's 2016 ban of Milo Yiannopoulos, stating that "the result was that he lost a lot.... He lost his ability to be influential or at least to project a veneer of influence."[27]

According to its defenders, deplatforming has been used as a tactic to prevent the spread of hate speech and disinformation.[12] As social media has evolved into a significant source of news for its users, content moderation and the banning of inflammatory posters have been defended as an editorial responsibility comparable to that of news outlets.[29]

The First Amendment is sometimes invoked against deplatforming in the United States, but according to Audie Cornish, host of the NPR show Consider This, modern deplatforming is not a government issue. She states that "...the government can't silence your ability to say almost anything you want on a public street corner. But a private company can silence your ability to say whatever you want on a platform they created."[30] Because of this, proponents argue, deplatforming is a legal way of dealing with controversial users online or in other digital spaces, so long as the government is not involved in causing the deplatforming.

Legislation

United Kingdom

In May 2021, the UK government under Boris Johnson announced a Higher Education (Freedom of Speech) Bill that would allow speakers at universities to seek compensation for no-platforming, impose fines on universities and student unions that promote the practice, and establish a new ombudsman charged with monitoring cases of no-platforming and academic dismissals.[31] In addition, the government published an Online Safety Bill that would prohibit social media networks from discriminating against particular political views or removing "democratically important" content, such as comments opposing or supporting political parties and policies.[32]

References

  1. ^ The Good, The Bad, & The Semantically Imprecise: The words that defined the week of August 10th, 2018, Merriam-Webster (August 8, 2018).
  2. ^ Deplatforming, Lexico.com (Dictionary.com/Oxford University Press).
  3. ^ Freeman, Jo (2000). "A Short History of the University of California Speaker Ban". JoFreeman.com. Archived from the original on 2019-12-08.
  4. ^ Jump up to: a b c Young, Cathy (April 8, 2018). "Half of college students aren't sure protecting free speech is important. That's bad news". Los Angeles Times. Archived from the original on 2019-02-08.
  5. ^ German, Lindsey (April 1986). "No Platform: Free Speech for all?". Socialist Worker Review (86).
  6. ^ Bueckert, Michael (April 2018). "No platform for Apartheid". africasacountry.com. Retrieved 2020-11-17.
  7. ^ Hinton, Carla (February 8, 2018). "UCO Student Group Rescinds Invitation to Christian Speaker Ken Ham". The Oklahoman. Archived from the original on 2018-05-28.
  8. ^ Causey, Adam Kealoha (February 8, 2018). "Creationist's speech canceled at university in Oklahoma". Houston Chronicle. Associated Press. Archived from the original on 2018-02-09.
  9. ^ Fund, John (November 3, 2019). "In No Safe Spaces, an Odd Couple Teams Up to Fight Free-Speech Bans". National Review. Archived from the original on 2019-12-18.
  10. ^ "Disinvitation Database". Foundation for Individual Rights in Education. Retrieved 2021-02-16.
  11. ^ "User's Guide to FIRE's Disinvitation Database". Foundation for Individual Rights in Education. June 9, 2016. Retrieved 2021-02-16.
  12. ^ Jump up to: a b c d Chandrasekharan, Eshwar; Pavalanathan, Umashanti; et al. (November 2017). "You Can't Stay Here: The Efficacy of Reddit's 2015 Ban Examined Through Hate Speech" (PDF). Proc. ACM Hum.-Comput. Interact. 1 (2): Article 31. doi:10.1145/3134666. S2CID 22713682.
  13. ^ Wells, Georgia (May 2, 2019). "Facebook Bans Louis Farrakhan, Alex Jones and Others as 'Dangerous'". The Wall Street Journal. Archived from the original on 2019-05-03.
  14. ^ Lorenz, Taylor (May 2, 2019). "Instagram and Facebook Ban Far-Right Extremists". The Atlantic. Archived from the original on 2019-05-03.
  15. ^ "The Capitol Siege: The Arrested And Their Stories". NPR.org. Retrieved 2021-04-18.
  16. ^ Healy, John (January 8, 2021). "Opinion: It took a mob riot for Twitter to finally ban Trump".
  17. ^ Crichton, Danny (January 9, 2021). "The deplatforming of President Trump".
  18. ^ Newton, Casey (January 6, 2021). "It's time to deplatform Trump".
  19. ^ Diaz, Jaclyn (January 13, 2021). "YouTube Joins Twitter, Facebook In Taking Down Trump's Account After Capitol Siege".
  20. ^ "The expulsion of Donald Trump marks a watershed for Facebook and Twitter". The Economist. January 10, 2021. ISSN 0013-0613. Retrieved 2021-01-10.
  21. ^ Pengelly, Martin (March 21, 2021). "Trump will use 'his own platform' to return to social media after Twitter ban".
  22. ^ Breuninger, Kevin (June 2, 2021). "Trump blog page shuts down for good". CNBC.
  23. ^ "Letters". Maximum Rocknroll (editorial statement). No. 415. December 2017. p. 8.
  24. ^ Jump up to: a b c d Friedersdorf, Conor (May 2019). "Camille Paglia Can't Say That". The Atlantic. Archived from the original on 2019-05-01.
  25. ^ Jump up to: a b McCullagh, Declan (February 2019). "Deplatforming Is a Dangerous Game". Reason. Archived from the original on 2019-03-31.
  26. ^ Jump up to: a b c Reynolds, Glenn Harlan (August 18, 2018). "When Digital Platforms Become Censors". The Wall Street Journal. Archived from the original on 2019-03-30.
  27. ^ Jump up to: a b Koebler, Jason (August 10, 2018). "Deplatforming Works". Motherboard. Vice Media. Archived from the original on 2019-03-19.
  28. ^ Wong, Julia Carrie (September 4, 2018). "Don't give Facebook and YouTube credit for shrinking Alex Jones' audience". The Guardian. London.
  29. ^ Yaraghi, Niam (January 10, 2021). "How should social media platforms combat misinformation and hate speech?". Brookings.
  30. ^ "Deplatforming: Not A First Amendment Issue, But Still A Tough Call For Big Tech : Consider This from NPR". NPR.org. Retrieved 2021-04-18.
  31. ^ "Universities could face fines over free speech breaches". BBC News. 12 May 2021. Retrieved 13 May 2021.
  32. ^ Hern, Alex (12 May 2021). "Online safety bill 'a recipe for censorship', say campaigners". The Guardian. Retrieved 13 May 2021.

External links

  • The dictionary definition of deplatform at Wiktionary