Meta’s Vendetta Against Queer Culture & Sex: Double Standards in Platform Policy (and it’s not just Meta)
By Rhian Farnworth
Meta’s recent vendetta against queer culture, reproductive health, sexual content, and women's bodies has produced one of the largest waves of account deletions in recent years. With over 50 accounts removed overnight in late 2025 - many without any prior warning - account owners are left bewildered by what has happened. They follow Meta’s rules closely and adhere to community guidelines, yet account removals, content takedowns, and shadow-bans are abundant and increasing.
Similar patterns of content removal and restriction are not unique to Meta. Platforms like YouTube, TikTok, Microsoft, and others regularly block and remove similarly themed information, accounts, and content, despite platform rules stating they allow it. Additionally, the digital advertisement of sexual and reproductive products, and the monetisation of sexualised, bodily, and queer content, is deeply restricted. Queer and sexual content is often miscategorised or bundled into general “adult” or “mature” topic themes, governed by intricate, restrictive, medicalised, and ambiguous rules. Disparities between a platform's policy and its technology are becoming increasingly common: a platform's regulations promise one thing, while its technologies and automated content regulation (ACR) systems do something completely different.
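To make that gap concrete, here is a deliberately simplified sketch of the kind of keyword-based filter many ACR pipelines are thought to rely on. The terms, threshold, and function name are hypothetical and this is not any platform's actual system, but it shows how an automated filter can block content the written policy says is allowed.

```python
# Illustrative sketch only: a naive keyword-based content filter of the kind
# automated content regulation (ACR) pipelines are often assumed to use.
# The blocked terms and threshold are hypothetical, not any platform's real list.

BLOCKED_TERMS = {"sex", "condom", "lube", "orgasm", "queer", "nude"}

def acr_flag(post_text: str, threshold: int = 1) -> bool:
    """Return True if the post would be flagged for removal or reduced reach."""
    words = post_text.lower().split()
    hits = sum(1 for word in words if word.strip(".,!?") in BLOCKED_TERMS)
    return hits >= threshold

# Policy may say sex-worker rights advocacy and educational nudity are allowed,
# but a filter like this cannot tell education apart from "adult" content:
print(acr_flag("Free workshop on condom use and consent for teens"))  # True: flagged
print(acr_flag("Ten tips to maximise your ad revenue this quarter"))  # False: passes
```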
If you run a queer culture account, work as a sex educator or sex worker, share sexual and reproductive health and rights (SRHR) information or products, or discuss bodily and pleasure-related topics, understanding platform content policies is essential for sharing information, promoting yourself or your brand, or simply talking about topics that everyone needs (and wants) to know about. Access to queer and sexual content is not only fun and normal, it's a vital lifeline. For 71% of youth globally, online sources are their primary means of accessing information on bodies, relationships, and sex education. For many queer people, digital access may be their main or only route to their community. This is not only essential health and cultural information: it is some of the most suppressed, removed, banned, and blocked digital information in existence.
This censorship is not benign, and it extends well beyond social media platforms. Content blockages reach into search engines, email service providers, wifi networks, generative AI, and other digital and AI systems, essentially treating information about queer culture and sexual bodies as ‘illegal’ across much of the digital landscape. Yet platform policy often contradicts the censorship we see happening daily. According to Meta, they allow ‘the discussion of sex worker rights advocacy and sex work regulation’ across their platforms, as well as nudity for educational purposes (we assume this would include sex-ed). Further, YouTube states that creators can monetise content about non-graphic sex education. In practice, this is not what happens. Statements like these in platform policy become increasingly confusing when the exact opposite happens in reality, and when the multiple, conflicting, and intersecting policies are examined more closely.
Policy, Technology, and Governance Complexity
To understand why queer and sex-related content is or is not digitally sanctioned, it's necessary to examine the complexities of platform policy, and of wider digital technology policy too. This is where it gets really tricky, and/or very interesting, depending on your perspective. The policies are numerous, deeply intersecting, and often contradictory, and all of them must be considered when publishing queer and sexual content. Just some of the policy types that can and do affect information sharing include: best practices, acceptable use, content guidelines, community guidelines, privacy policies, technical governance & IT policies, service & vendor management, security policies, incident response policies, AI and emerging tech, platform regulations, trade & advertising, digital inclusion policies, accessibility policies, age policies, and many more.
Another aspect to consider is the technical and data-driven infrastructure: automated and algorithmic decision-making, recommender systems, AI and machine learning, databases, and the other technical underpinnings of digital platforms. Lastly, intersecting legal frameworks and their relation to big tech and digital technologies must also be considered. While big tech's platforms operate internationally, with something close to sovereign control over digital landscapes, law operates on a country or regional basis, making overarching technology policy challenging to develop. Additionally, many legal professionals and judges do not necessarily understand how technology works, yet they make rulings about it. One highly demonstrative example of how technology, platform, and legal aspects collide was the blocking of Women on Web’s site in Spain. Spanish authorities removed access to the site within Spain, and the remedy ordered to reinstate it, devised by lawmakers who lacked technical expertise, was technologically impossible to implement. On paper the case was resolved and the site restored; in reality, the website remained blocked in Spain.
When publishing and sharing digital information, different content types bring further layers of policies and standards. Organic, paid, and monetised content each come with their own rules about which topics are allowed to be published, searchable, and visible. It's crucial to understand that this isn’t only about freedom to share and access information: it's big business for big tech, and capitalism in action. Meta, the parent company of Facebook, Instagram, WhatsApp, Threads, and the metaverse, is above all a digital advertising network; advertising accounts for 97% of its revenue, and in September 2025 its trailing 12-month revenue was stated at $189.45B. The same applies to Google, Amazon, and Microsoft, with digital advertising revenues of around $296B, $60-62B, and $14B respectively, while TikTok is also expected to hit the $40B mark. So why are these platforms so anxious about queer and sexy content, while at the same time promoting and profiting from misogynistic content in the manosphere? Without getting into the finer details of the history of censorship programmed into digital systems, cultural and societal norms, and of course patriarchy, let's take a closer look at some confusing examples of big tech policy.
Sex Ed and YouTube’s adult content monetisation policy
If content creators wish to make money from their work, it's essential to become familiar with YouTube’s content monetisation policies. An incredible amount of thought has gone into these, specifying in exquisite detail exactly what type of twerking, jiggling, and moving a body in various apparently scandalous outfits can or cannot be monetised. Whoever designed these policies has seemingly not received good, pleasure-inclusive comprehensive sexuality education (CSE) either, as YouTube’s regulations around discussing CSE are also confusing and contradictory, essentially banning all useful information about the physical mechanics of intercourse. While it may at first appear possible to monetise sex-ed content and start earning a living as a digital sex educator, once you compare what can and cannot be monetised, the regulations become very blurry and confusing.
YouTube’s Adult Content Monetisation Policy splits content into ‘will earn’, ‘may earn’, and ‘will not earn/no ad revenue’. Taking sex education as an example, this is what the policy states:
Will earn: non-graphic sex education;
May earn: non-arousing sexual education containing animated sex acts;
Will not earn/no ad revenue:
Discussion of sexual topics, including fetishes, tips, and experiences
Descriptions of or implicit references to sexual activities (such as implicit reference to sexual body parts using emojis or graphics)
A medical object which resembles genitalia introduced during a discussion.
Discussions of intimate sexual experiences, such as masturbation, orgasm, intercourse, tips, or other sexual acts. This may also include sexual innuendos or sexually explicit or obscene text or audio, such as detailed conversations about sex.
Explicit discussions on sex tips or how to have sex.
According to YouTube, non-graphic sex education can be monetised and discussed, but once ‘animated sex acts’ are involved, content drops to the ‘may earn’ category, and there is no transparency about what exactly counts as ‘animated’. As soon as sexual tips, experiences, activities, intercourse, detailed conversations about sex, or anything around orgasms and pleasure are discussed, content immediately becomes ineligible for monetisation. Stripped of these conversations, sex-ed content is left highly restricted, centred on negative narratives of risk, infections, shame, and danger.
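Read literally, these tiers behave like an automated decision rule. The sketch below encodes them using hypothetical content tags; the tag names and the default outcome are assumptions, and YouTube's actual review process is not public. The point it illustrates is how almost any substantive sex-ed discussion falls straight into the no-ad-revenue bucket.

```python
# Hypothetical encoding of YouTube's published tiers as a decision function.
# Tag names are invented for illustration; this is not YouTube's real system.

WILL_NOT_EARN_TRIGGERS = {
    "sexual_tips", "sexual_experiences", "fetish_discussion",
    "implicit_sexual_reference", "masturbation", "orgasm", "intercourse",
}

def monetisation_tier(tags: set[str]) -> str:
    """Map content tags to 'will earn', 'may earn', or 'no ad revenue'."""
    if tags & WILL_NOT_EARN_TRIGGERS:
        return "no ad revenue"
    if "animated_sex_acts" in tags and "non_arousing" in tags:
        return "may earn"
    if "non_graphic_sex_education" in tags:
        return "will earn"
    return "no ad revenue"  # assumption: anything ambiguous defaults to demonetisation

# Abstract sex-ed earns; the moment it explains how sex actually works, it does not:
print(monetisation_tier({"non_graphic_sex_education"}))                 # will earn
print(monetisation_tier({"non_graphic_sex_education", "intercourse"}))  # no ad revenue
```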
TikTok, Meta, and Microsoft’s war on advertising pleasure
In other particularly dry policies, the digital advertising of condoms, lube, and anything hinting at sexual pleasure is highly restricted or forbidden under TikTok, Meta, YouTube, and Microsoft's advertising policies. While rules are market-based and vary by locale, locations that do or may allow condom and lubricant advertising specifically state that ads must not be sexualised or focus on sexual pleasure or enhancement. Ads may only centre medicalised messages of reproduction, contraception, and infection prevention, directed at those who are 18+. According to TikTok’s policies, several countries do not prohibit pleasure (Argentina, Cambodia, Chile, Colombia, Ecuador, and Peru), with the caveat that local authorities must approve the adverts. Microsoft goes further, adding sex education as another forbidden advertising topic and stating that only exact-match keywords from its approved adult keyword list can be bid on in text-based advertising for adult campaigns. The keyword list itself is available only to advertisers approved to participate in Microsoft's adult advertising program.
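Exact-match gating of this kind is easy to illustrate. In the sketch below the keyword list entries are invented, since the real list is confidential, but the mechanism is the point: a bid succeeds only when the advertiser is pre-approved and the keyword matches an approved entry exactly, so a phrase like ‘sex education’ simply never qualifies.

```python
# Sketch of exact-match-only bidding for an adult advertising program.
# The approved keywords here are hypothetical; the real list is not public.

APPROVED_ADULT_KEYWORDS = {"adult toys", "adult dvds"}  # invented example entries

def can_bid(keyword: str, advertiser_approved: bool) -> bool:
    """A bid succeeds only for approved advertisers and only on an exact keyword match."""
    return advertiser_approved and keyword.lower() in APPROVED_ADULT_KEYWORDS

print(can_bid("adult toys", advertiser_approved=True))     # True
print(can_bid("sex education", advertiser_approved=True))  # False: not on the list
print(can_bid("adult toys", advertiser_approved=False))    # False: not in the program
```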
While medicalised messages about contraception for people 18+ are not per se bad messages, such digital advertising frameworks lack vital elements of CSE, like pleasure and consent. Condoms and lube are used far beyond sex for the purpose of reproduction; they are normal, safe, and fun everyday items that all partners benefit from and need. For many people globally, particularly young people who are experimenting and exploring their bodies, it is important that available information around sexual items is pleasure-based, encouraging agency, communication, enjoyment, and consent. Pleasure is an important message often left out of everyday and school-based CSE programming - and it is one of the main reasons people have sex, because it's fun and feels good! Not all intercourse is for reproductive purposes. Yet when online messaging and digital exposure to basic sexual health and pleasure-based products centre dry, medicalised messages that forbid pleasure and frame sex as risky or an obligatory act, these stigmas are reinforced through promoted digital messaging.
While advertising for male-oriented sexual health products, including erectile dysfunction treatments, is routinely permitted, reproductive health organisations report that ads mentioning condoms, lubrication, or sexual pleasure are often rejected or required to strip out non-reproductive language. The result is a digital advertising environment with double standards: male-centred sexual health and performance are privileged, while broader, pleasure-affirming, female-centred, and rights-based sexual health information is restricted.
If your content on abortion, sex education, queer culture, pleasure, condoms, lube, twerking, bodily autonomy, or similar has been restricted, shadow-banned, demonetised, or removed, report it. Individual cases are dismissed as mistakes. Patterns are not. Collective documentation is one of the few ways to expose the gap between platform policy and reality, and to demand accountability.