The Art of Deplatforming | Part IV | Artificial Horizons
IV of IV | demonetizing, blacklisting, whitelisting, shadowbanning, chilling |
We are not actually doing what we say we do publicly.
— Internal Review at Facebook, 2019
Interface: singular as a noun, plural by nature.
I. Death of the Editor
Like a crack through concrete, with neither good nor bad intentions, the platform has torn through the foundations of society, replastering the world beneath our feet with machine learning (ML), artificial intelligence (AI) and big data (Uh-Oh).
Unsteadied, a growing number of reporters and pundits have begun to wax apocalyptic about the sudden groundswell of content-creating algorithms: GPT-3, DALL-E, Imagen, et al. It is no longer merely the cashiers and truckers who have to worry, writes Vanity Fair reporter Nick Bilton: “new advancements in A.I. have made it clear that writers, illustrators, photographers, journalists, and novelists could soon be driven from the workforce and replaced by high-tech player pianos.”ᶦ
What Bilton and his colleagues did not cover, however, was the flipside of content-creation: content-deletion. If the role of the author is unsafe, can the role of the editor be much safer? What about that of the fact-checker, the fact-finder? For those who opt to colour outside the lines, as well as those who do so accidentally—there is rarely a royal road to the right answer—the future is something of an editorial singularity: a point past which one cannot see.
II. Deleting
Deplatformings are hypervisible acts of excommunication—somewhat of a catch-22 for self-described platforms, which are not only designed for communication, but have designs on infrastructural invisibility. In the high-profile cases of Laura Loomer, Milo Yiannopoulos, Alex Jones and (when conditions were ripe) Donald Trump, deplatforming made both sense and cents: not only had each “creator” become a hypervisible edge case, but theirs was a species of hypervisibility “unfriendly” to core audiences and core advertisers.
Cases are no longer so open-and-shut, however.
“Given the scale that Twitter is at, a one-in-a-million chance happens 500 times a day,” said Del Harvey, VP of Trust and Safety at Twitter. “For us, edge cases, those rare situations that are unlikely to occur, are more like norms.”ᶦᶦ That was almost a decade ago. Who knows how many times one-in-a-million chances happen today.
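The arithmetic behind Harvey’s remark is worth making explicit. A minimal back-of-the-envelope reconstruction, assuming the roughly 500 million tweets per day Twitter was reporting around the time of the remark (a widely cited figure, though not one Harvey herself gives):

$$ 500{,}000{,}000 \ \text{tweets per day} \times \frac{1}{1{,}000{,}000} = 500 \ \text{edge cases per day} $$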
Moreover, scale is not merely a Twitter problem. Every minute, over 500 hours of video are uploaded to YouTube;ᶦᶦᶦ Instagram processes 95 million photos per day;ᶦᵛ Facebook has grown from one million users in 2004 to almost three billion in 2022.ᵛ
To cope, platforms have taken an algorithmic turn toward the unholy alliance of ML and AI. Just as the biases of Google Search are more statistical than journalistic, the crises and curiosities of content moderation often no longer revolve around questions of sense and sensibility, but around a new centre of gravity: pattern recognition. It is a centre of gravity as enormous as it is error-prone.
“Platforms dream of electric shepherds,” writes Tarleton Gillespie, whose Custodians of the Internet, unfortunately, has only become more relevant.ᵛᶦ For untold thousands (if not millions) of creators—past, present, and future—that dream has become a nightmarish reality.
Between October and December of 2019, YouTube deleted 5 million videos; 109,000 of those deletions were appealed, and 23,000 were reinstated.ᵛᶦᶦ From January to June of 2020, Twitter removed potentially offensive content from 1.9 million accounts, suspending 925,700 accounts for violating its “Rules.”ᵛᶦᶦᶦ
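Expressed proportionally (simple arithmetic on the figures above): roughly one deletion in forty-six was appealed, and fewer than one in two hundred was ultimately restored.

$$ \frac{109{,}000}{5{,}000{,}000} \approx 2.2\% \ \text{appealed}, \qquad \frac{23{,}000}{5{,}000{,}000} \approx 0.46\% \ \text{reinstated} $$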
“Today, more than 90% of the content that we remove from YouTube is first detected by machine learning systems, and most of the videos we remove for violating our policies have fewer than 10 views on them,” states Jennifer Flannery O’Connor, Director of Trust and Safety.ᶦˣ In theory, according to YouTube, deplatforming is simple, stupid: “Our principles are organized around the four ‘Rs’: remove harmful content, raise authoritative voices, reduce borderline content, and reward trusted creators.”ˣ In practice, however, the principles of YouTube-the-company have lagged behind the prejudices of YouTube-the-machine. ‘Harmful,’ ‘authoritative,’ ‘borderline,’ ‘trusted’: apparently, such words are not so easily reduced to 1s and 0s.
III. Demonetizing
Near the middle of The Children of Men, P. D. James hits upon a political truism that has only grown truer over the years: “It’s taken governments a long time to realize that you don’t need to manipulate unwelcome news. Just don’t show it.”ˣᶦ It has taken platforms less time to realize that, oftentimes, you do not need to deplatform people. Just show less of them… to advertisers, especially.
In the era of machine learning, demonetizing (the confiscation of ad revenue) has emerged as the first line of defence: a line that cuts not through rule-breakers in particular, but through risk-takers in general. It is a better-to-ask-for-forgiveness-than-permission form of taxation, one which disproportionately hurts YouTube’s working class.
When YouTube deliberately demonetized Steven Crowder, whose channel has millions of subscribers, Crowder was unfazed: “This really isn’t that big of a ding for us.”ˣᶦᶦ When YouTube accidentally demonetized Felix “PewDiePie” Kjellberg, whose channel has tens of millions of subscribers, Kjellberg was unsurprised: “It’s inefficient, it’s unstable, and an insecure revenue model.”ˣᶦᶦᶦ
But what about the creators without the fame or funds to weather the storm? Increasingly, as content moderation becomes algorithmic, automatic, nobody knows: not us, not YouTube, nobody—except, of course, the nobody at home; the nobody at work; the nobody who coulda been a contender.
IV. Blacklisting
In the twentieth century, the Hollywood blacklist was deplatforming in analogue form: a means of reducing the cultural visibility of supposed Communists and alleged sympathizers. In the twenty-first century, however, as culture has transitioned from a filter-then-publish to a publish-then-filter model,ˣᶦᵛ the blacklist has become the norm, and whitelisting the gatekeeping strategy du jour: a subtle reframing of content moderation from the demotion of “inappropriate content”ˣᵛ to the promotion of “trusted creators.”ˣᵛᶦ
On Facebook, the XCheck system reroutes the content of “high-profile” users to a specific and specialized area deep within the stack: a VIP black box, safely out of range of the ordinary “rough justice” of machine learning.ˣᵛᶦᶦ In tandem, Instagram has continuously flexed the soft power of the XCheck system, overlooking the violations of “protected” accounts—particularly those of the “algorithmically recognisable” and “advertiser friendly” variety.ˣᵛᶦᶦᶦ Moreover, as one member of Facebook’s Mistakes Prevention Team lamented, “VIP lists continue to grow.”ˣᶦˣ
On YouTube, whitelisting has become a feature of the appeals process, whereby channels with over 10,000 subscribers automatically skip the queue: “We do this because we want to make sure that videos from channels that could have early traffic to earn money are not caught in a long queue behind videos that get little to no traffic and have nominal earnings.”ˣˣ
Moreover, YouTube’s whitelisting is not confined to content moderation: as well as “offering different users different sets of rules, different material resources and opportunities, and different procedural protections when content is demonetized,” the platform offers special treatment to “premium tier” companies.ˣˣᶦ
The reason why Jimmy Kimmel Live! could play unskippable ads during its coverage of the 2017 Las Vegas shooting, while other YouTubers, popular and unpopular alike, could not, was neither human error nor algorithmic bias: ABC, a whitelisted company, had free rein to run third-party ads on YouTube—an affordance not extended to YouTubers themselves.ˣˣᶦᶦ
V. Shadowbanning
For a long time, the existence of shadowbanning was considered little more than Instagram folklore: a figment of the imagination of the cultural fringe. “Shadowbanning is essentially Instagram’s way of policing the community without overt censorship,” writes Taylor Lorenz.ˣˣᶦᶦᶦ Yet, despite the mainstream coverage, Instagram has yet to acknowledge the existence of the practice.
In a since-deleted Facebook post, Instagram alluded to the fact that large swathes of users had experienced problems “that caused posts not to be surfaced,” yet stopped short of providing an explanation. It was a two-handed “slap in the face,” remarked Currie Lee, a shadowbanned fashion photographer whose account has since been deactivated.ˣˣᶦᵛ
On Twitter, meanwhile, shadowbanning has been taking place in the cold light of day: “Tweets that are popular are likely to be interesting and should be higher ranked… Tweets from bad-faith actors who intend to manipulate or divide the conversation should be ranked lower.”ˣˣᵛ In emphasizing that the platform does not “shadow ban” but may “limit Tweet visibility,” rendering certain categories of content harder but not impossible to find, Twitter makes a distinction without a difference.ˣˣᵛᶦ
For disobedient or unlucky users, running afoul of Twitter’s “Rules and Policies” can trigger a cascade of visibility-limiting effects, ranging from “Downranking Tweets in replies” and “mak[ing] Tweets ineligible for amplification in Top Search results” to “Excluding Tweets and/or accounts in email or in-product recommendations.”ˣˣᵛᶦᶦ Not life or death for Twitter accounts with large followings, yet crippling for accounts trying to build one.
VI. Chilling
To recycle a tired phrase, “YouTube’s unfeeling, opaque and shifting algorithms”ˣˣᵛᶦᶦᶦ have become part of the lived experience of artists and creatives, marginal and mainstream alike. On Facebook and Twitter, likewise, Taina Bucher argues, “there is not so much a ‘threat of visibility’ as there is a ‘threat of invisibility.’”ˣˣᶦˣ
For creators who skirt close to the edge, however, one can certainly become too visible to the wrong pair of eyes (human or otherwise). While the centre of the platform affords standing and support, the edge affords nudging, shadowboxing, spooking, staring, threatening, menacing—a broad spectrum of (mis)behaviour, which has a visible and explicit effect on the ecosystem, despite the invisibility and implicitness of the cause.
Along the border, platforms have fostered an omertà-like system of implicit incentives and informal constraints: an unspoken agreement between creators and moderators, whose unwritten rules take such linguistic forms as “trusted creators” and “High Quality Principles.”ˣˣˣ
In the words of Glenn Loury: “It is not the iron fist of repression, but the velvet glove of seduction that is the real problem.”ˣˣˣᶦ Insidiously, as the opacities and uncertainties of algorithms become part of the cultural furniture, new habits and routines concretize: new normals take root, which not only celebrate the status quo, but camouflage viable, vital alternatives.
The fact that Apple will not autocorrect “obscene” words like “abortion” has little consequence in and of itself, yet strange things happen at scale. As Philip W. Anderson famously wrote, more is different.ˣˣˣᶦᶦ
Like spirals of silence, the asymmetrical effects of deplatforming threaten to reverberate throughout the creator economy like an aesthetic form of behavioural sink: a phenomenon discovered by ethologist John B. Calhoun, whose caged rats worked together, like a silent symphony, to raze utopia to the ground.ˣˣˣᶦᶦᶦ
VII. Death of the Poet
In 1891, the French journalist Jules Huret wrote of how creativity comes to be defined by the finest of margins: “At fifteen, nature tells a young man whether he is cut out to be a poet or should be content with mere prose.”ˣˣˣᶦᵛ Nowadays, at an earlier and earlier age, platforms are nudging creators away from the poetic and toward the prosaic, the programmatic, the “content that meets a higher level of brand safety.” In a future-focused book from 1978, Erik Barnouw writes of how “the sponsor, the merchant, has been living at the summit of our communications system,” lamenting the fact that market values had formed “a buffer zone of approved ‘culture.’”ˣˣˣᵛ In the digital age, such buffer zones have only grown more powerful and less visible.
Deplatforming (and the threat thereof) has heralded a new dawn of precarity. It is a precarity that, with regard to artists, relies on a “blending of apparent contradictions”ˣˣˣᵛᶦ that strengthens the core and weakens the edge, crippling the creative freedom and career prospects of the avant-garde, the up-and-coming, the niche, the fringe: all of whom are reduced to little more than the flotsam and jetsam of the mainstream. It is a precarity that, with regard to audiences, interferes at the horizon of the potential and possible, widening the array of our consumptive present, whilst narrowing the diversity of our creative future.
Day by day, the famous words of Percy Shelley are fading behind the postscript of political theorist Langdon Winner.ˣˣˣᵛᶦᶦ It is no longer poets, but platforms, that are the unacknowledged legislators of the world.
References
ᶦ Nick Bilton, “The New Generation of A.I. Apps Could Make Writers and Artists Obsolete,” Vanity Fair, 2 June 2022. https://www.vanityfair.com/news/2022/06/the-new-generation-of-ai-apps-could-make-writers-and-artists-obsolete.
ᶦᶦ Quoted in Tarleton Gillespie, Custodians of the Internet (New Haven: Yale University Press, 2018), 74.
ᶦᶦᶦ Thomas Poell, David B. Nieborg & Brooke Erin Duffy, Platforms and Cultural Production (New York: Wiley, 2021), 7.
ᶦᵛ Reuters Staff, “Instagram's User Base Grows to More than 500 Million,” Reuters, 21 June 2016. https://www.reuters.com/article/us-facebook-instagram-users-idUSKCN0Z71LN.
ᵛ Andrew Hutchinson, “Facebook Closes in on New Milestone of 3 Billion Total Users Across its Platforms,” Social Media Today, 29 April 2020. https://www.socialmediatoday.com/news/facebook-closes-in-on-new-milestone-of-3-billion-total-users-across-its-pla/577048.
ᵛᶦ Gillespie, Custodians of the Internet, 107.
ᵛᶦᶦ Julia Alexander, “YouTube Rarely Reinstates Removed Videos—Even When Creators Appeal,” The Verge, 28 February 2020. https://www.theverge.com/2020/2/28/21157476/youtube-video-removal-appeal-takedown-community-guidelines-report.
ᵛᶦᶦᶦ BBC News, “Twitter Tells Users to Be Nice and Think Twice Before Replying,” BBC, 6 May 2021. https://www.bbc.com/news/business-57004794.
ᶦˣ YouTube, “Rules and Policies,” How YouTube Works. https://www.youtube.com/howyoutubeworks/policies/.
ˣ James Beser, “Our Responsibility Approach to Protecting Kids and Families on YouTube,” YouTube Blog, 25 October 2021. https://blog.youtube/news-and-events/our-responsibility-approach-protecting-kids-and-families-youtube.
ˣᶦ P. D. James, The Children of Men (London: Faber & Faber, 1992), 123.
ˣᶦᶦ Julia Alexander, “YouTube Looks to Demonetization as Punishment for Major Creators, but It Doesn’t Work,” The Verge, 25 June 2019. https://www.theverge.com/2019/6/25/18744246/youtube-demonetization-steven-crowder-patreon-advertising-merch.
ˣᶦᶦᶦ Ibid.
ˣᶦᵛ Clay Shirky, Here Comes Everybody (London: Penguin Books, 2008).
ˣᵛ Abby Ohlheiser, “YouTube Is ‘looking into’ Complaints that It Unfairly Censors LGBT Videos,” The Washington Post, 20 March 2017. https://www.washingtonpost.com/news/the-intersect/wp/2017/03/20/youtube-is-looking-into-complaints-that-it-unfairly-censors-lgbt-videos.
ˣᵛᶦ Beser, “Our Responsibility Approach to Protecting Kids and Families on YouTube.”
ˣᵛᶦᶦ Jeff Horwitz, “Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That’s Exempt,” The Wall Street Journal, 13 September 2021. https://www.wsj.com/amp/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353.
ˣᵛᶦᶦᶦ Tarleton Gillespie, “The Relevance of Algorithms,” in Media Technologies, eds. Tarleton Gillespie, Pablo Boczkowski & Kirsten Foot (Cambridge: MIT Press, 2014), 184.
ˣᶦˣ Horwitz, “Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That’s Exempt.”
ˣˣ Quoted in Sangeet Kumar, “Algorithmic Dance: YouTube’s Adpocalypse and the Gatekeeping of Cultural Content on Digital Platforms,” Internet Policy Review, Vol. 8, No. 2 (2019), 6.
ˣˣᶦ Robyn Caplan & Tarleton Gillespie, “Tiered Governance and Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy,” Social Media + Society, Vol. 6, No. 2 (2020), 2.
ˣˣᶦᶦ Julia Alexander, “YouTube Fixing Monetization Issues with ‘Premium Tier’ Partners After Complaints,” Polygon, 10 October 2017. https://www.polygon.com/2017/10/10/16453306/youtube-monetization-ads-casey-neistat-philip-defranco.
ˣˣᶦᶦᶦ Taylor Lorenz, “Instagram’s ‘Shadowban’ Explained: How to Tell If Instagram Is Secretly Blacklisting Your Post,” MIC, 7 June 2017. https://www.mic.com/articles/178987/instagrams-shadowban-explained-how-to-tell-if-instagram-is-secretly-blacklisting-your-posts.
ˣˣᶦᵛ Ibid.
ˣˣᵛ Vijaya Gadde & Kayvon Beykpour, “Setting the Record Straight on Shadow Banning,” Twitter Blog, 26 July 2018. https://blog.twitter.com/official/en_us/topics/company/2018/Setting-the-record-straight-on-shadow-banning.html.
ˣˣᵛᶦ Twitter, “Rules and Policies,” Twitter Help Center, 2021. https://help.twitter.com/en/rules-and-policies.
ˣˣᵛᶦᶦ Ibid.
ˣˣᵛᶦᶦᶦ Amanda Hess, quoted in Sangeet Kumar, “Algorithmic Dance: YouTube’s Adpocalypse and the Gatekeeping of Cultural Content on Digital Platforms.”
ˣˣᶦˣ Taina Bucher, “Want to Be on the Top? Algorithmic Power and the Threat of Invisibility on Facebook,” New Media & Society, Vol. 14, No. 7 (2012), 1171.
ˣˣˣ YouTube, “Best Practices for Children’s and Family Content,” YouTube Help Centre, 1 November 2021. https://support.google.com/youtube/answer/10774223.
ˣˣˣᶦ Glenn C. Loury, “Self-Censorship in Public Discourse: A Theory of ‘Political Correctness’ and Related Phenomena,” Rationality and Society, Vol. 6, No. 4 (1994), 430.
ˣˣˣᶦᶦ Gillespie, Custodians of the Internet, 186.
ˣˣˣᶦᶦᶦ John B. Calhoun, “Population Density and Social Pathology,” Scientific American, Vol. 206, No. 2 (1962).
ˣˣˣᶦᵛ Quoted in Pierre Bourdieu, “The Field of Cultural Production, or: The Economic World Reversed,” Poetics, Vol. 12, No. 4-5 (1983), 342.
ˣˣˣᵛ Erik Barnouw, The Sponsor (New York: Oxford University Press, 1978), 182.
ˣˣˣᵛᶦ Albert O. Hirschman, Exit, Voice, and Loyalty (Cambridge: Harvard University Press, 1970), 32.
ˣˣˣᵛᶦᶦ Langdon Winner, “Engineering Ethics and Political Imagination,” in Broad and Narrow Interpretations of Philosophy of Technology, ed. Paul T. Durbin (Dordrecht: Kluwer Academic Publishers, 1990), 59.