ALGORITHMS and content moderation policies in internet spaces discriminate against minority groups and rob people of their livelihoods and sense of community.
That’s the conclusion an expert panel came to at Exhibit, an online forum organised by Digital Rights Watch Australia.
Author, researcher and former Penthouse Pet Zahra Stardust kicked off the evening with an explanation that governments and social media platforms deploy legislation and technology to censor sex-related content online.
“The regulation of sex is political,” she said.
But the notions of offence, obscenity and profanity driving that censorship, she argued, are middle-class values that punish sex workers, BIPOC and members of the LGBTQ+ community.
Big Tech’s heavy-handed content dragnets are unable, or unwilling, to distinguish truly objectionable material from, for example, sex worker ads and sex education resources.
The unfortunate result is that escorts and queer people are often shadow-banned or de-platformed on social media.
Making matters worse are governments that refuse to see algorithms as part of the problem and continue to tie online safety to refining these automated rules.
Model April Hélene-Horton, another speaker at the online event, described the heartbreak of having her online community ripped away from her because of censorship.
She pointed out the community guidelines governing social media platforms are certainly not those of the fat, Black and queer communities.
“The degree of hypocrisy shown by the different treatment of slim white women compared with fat Black women on social media in terms of content removal is disgusting,” she said.
“A slim white woman holding her breast is allowed, yet a fat Black woman in the same pose has her content removed.
“That is infuriating and shows how current online moderation is flawed and disproportionately affects minorities or those who are non-normative.
“People like me, in various intersections, need to be represented.”
Queer philosopher, writer and poet Joshua Badge said repressive content moderation online has a pernicious effect on people being able to find, access and create community, including contact with vital health and support information.
“Moderation is simply the enforcement of conservative moral norms… that really comes down on non-normative people,” Joshua said.
“The attempt to moderate queer art and culture is little more than a sinister de-queering of queerness, a forced assimilation into normative culture.”
Activist Eliza Sorensen denounced the real-world implications of shadow-banning and de-platforming on social media, including losing friends and clients.
“Sex work is work, and we deserve the same protections as [workers in] any other industry,” Eliza said.
Eliza called for creating alternative sex-friendly social media spaces like Switter.at and Tryst.link.
Eliza holds out little hope Big Tech will be willing and able to respond to the criticisms.
“As long as tech companies refuse to accept that their decisions are not apolitical, these same issues will keep coming up again and again.”
Holding Big Tech to account while building new online safe havens was a constant theme for Joshua and April as well.
Joshua called for “local rather than global rules” on social media to encourage community-led moderation and avoid defaulting to “lowest common denominator” moralising.
Joshua acknowledged reform is an uphill battle, lamenting, “maybe tech companies are not for us.”
April demanded humans, not machines, be at the heart of the moderation process, and particularly those most affected by discriminatory algorithms.
Zahra said the goal was a genuinely equal internet, free from the tentacles of neoliberalism and surveillance capitalism.
“Sex should be able to have a place online just like any other cultural media,” she said.