Australia: Sexual Content Moderation and Platform Governance Through a Sex-Positive Framework
A collective framework for rethinking how platforms govern sex, sexuality, and visibility online, outlining concrete principles for building more equitable, accountable, and sex-positive digital environments.
Does the Internet Feed Off of Women’s Bodies? How algorithms, AI, and platforms profit from the exploitation and silencing of women
From facial recognition bias to AI-generated sexual violence and reproductive health censorship, digital systems increasingly shape how women’s bodies and voices appear online. As platforms prioritize engagement and profit, algorithmic infrastructures are amplifying exploitation, surveillance, and the silencing of feminist and reproductive health information.
We asked artists in China about censorship. Here’s what they told us.
We interviewed artists in China about censorship and self-censorship in artistic practice. Their reflections describe a system shaped less by explicit prohibition and more by uncertainty, where creators often anticipate consequences and quietly adjust expression in advance.
Shadowbanned: Art, Language, and Survival Under Digital Censorship
When platforms label marginalized voices as “dangerous,” censorship becomes violence. In this essay, Vianney Harelly writes about art, borders, Palestine solidarity, and the real cost of being visible under digital censorship.
Victims of Algorithmic Hyperreality: One Real Death, Infinite Fabrications
An essay on how algorithms, disinformation, and generative AI distort reality in moments of violence. When truth is overwritten by hyperreality, victims are forced to exist in both the real and the fabricated at once.
Gendered Censorship in the Age of AI: Artist Violet on Platforms, Visibility, and Feminine Labor
Artist Violet shares a firsthand account of gendered censorship, algorithmic visibility, and AI-driven extraction, examining how platforms shape creative labor and safety online.
Meta’s Vendetta Against Queer Culture & Sex: Double Standards in Platform Policy (and it’s not just Meta)
Big Tech platforms claim to allow queer, sexual, and reproductive health content, yet their automated systems routinely erase it. Across Meta, YouTube, TikTok, and beyond, accounts are deleted, content is demonetised, and pleasure-affirming information is mislabelled as “adult,” while misogynistic and male-centred sexual content continues to thrive.
How the UK Online Safety Act is harming marginalized communities and setting a dangerous global precedent
As the UK’s Online Safety Act moves into enforcement, “safety” is increasingly achieved through removal rather than protection. Sex workers, queer communities, and sexual and reproductive health advocates are losing visibility, income, and access to essential information under compliance-driven moderation.
When Environmental Disaster Is Silenced: How Algorithmic Power Erased Indonesia’s Flood Crisis
When an environmental disaster struck Indonesia, the world barely noticed. As floods swallowed entire communities and thousands were displaced, the crisis failed to register beyond local networks. This silence was not accidental. It was produced by an ecosystem where algorithms, media concentration, and political power quietly determine which lives are seen and which are rendered invisible.
Digital Anarchy, Cybernetics, and the Politics of Feedback
Digital systems are often framed as neutral tools, yet they are built from choices that shape whose voices are amplified and whose are erased. Drawing on cybernetics and anarchist thought, this piece examines how feedback, power, and governance operate within digital infrastructures and why reclaiming collective agency over these systems is essential for justice, accountability, and care.
Appealing into a void: can the DSA protect Europe's marginalised?
This research documents how reproductive health, sex worker-led, and queer organizations across Europe continue to face censorship despite the Digital Services Act. It shows how appeal and enforcement mechanisms remain inaccessible in practice, leaving lawful communities without effective protection.
Repro Uncensored × The Guardian: Our Investigation on Meta’s Global Censorship of Abortion Advice and Queer Content
This research, conducted by Repro Uncensored, exposes a global escalation in the censorship of abortion access, sexual health, and queer content across Meta platforms, revealing opaque enforcement practices, ineffective appeals mechanisms, and material harm.
Visibility Isn’t Access: How Platforms Erase Disabled Creators
Visibility on social media does not equal access. Platforms are built around speed, constant output, and algorithmic reward systems that are not designed for disabled people.
Censoring Erotics, Censoring Community Art under Platform Surveillance
Erotic artists sit at the frontlines of content policing. Through SPUNK ROCK’s story, this piece examines the patterns of moderation that disproportionately target queer and body-positive creators, and the emotional and economic toll of sustaining an erotic practice under platform surveillance.
Meta Is Deleting Queer and Sex Worker Accounts, and Our Communities Are Being Erased With Them
More than 45 queer and sex worker accounts have been removed across the UK, the Netherlands, and other countries in a coordinated wave of digital erasure. This is not random. It is part of a wider political and economic logic that silences communities perceived as a threat to conservative agendas, reframes sex work as “exploitation,” and prioritises corporate risk over human rights, safety, and free expression online.
Repro Uncensored: Bridging Online and Offline Community Organizing for Stronger Movements
Repro Uncensored is exploring how open-source platforms like Decidim can bridge online and offline community organizing. In a time of increasing surveillance, censorship, and repression, autonomous digital spaces are essential for protecting movements, strengthening participation, and ensuring collective decision-making remains in the hands of communities.
The Death of Teen Vogue Is Not an Accident, It’s Digital Suppression in Disguise
When Teen Vogue is folded into Vogue.com, it’s not just a business move; it’s a warning. In this sharp, urgent essay, Ana Karen Flores argues that what’s being sold as “streamlining” is in fact digital suppression in disguise: a quiet erasure of young, queer, and political voices that once challenged power and shaped a generation’s understanding of justice.
Your Menstrual Health Data is Very Valuable, and Big Tech Wants It
Google and period-tracking app Flo Health have been ordered to pay a combined $56M in damages to settle a class-action lawsuit, after Flo shared users’ intimate menstrual cycle and fertility data with Meta, Google, and two other platforms between 2016 and 2019.
When Khmer Became a Target: Censorship of SRHR Online
Censorship of sex education in Cambodia has shifted from euphemisms and stigma in everyday life to new digital suppression in Khmer. As algorithms increasingly flag SRHR content, accurate information is being silenced, leaving young people without the knowledge they need for bodily autonomy and reproductive rights.
Documenting Censorship: Farsi/Dari SRHR Voices
The primary challenge facing these communities is that anti-rights and anti-SRHR forces hold power in both Iran and Afghanistan. This not only strips people of access to their rights but also cuts them off from the information they need to understand those rights.
Publish research with us!
Researchers, activists, and organizations: join us in exploring the intersection of reproductive health, digital rights, artificial intelligence, and more. Together, we can tackle challenges like online censorship of abortion information, access to care in underserved communities, and advocacy for digital freedom.