Your Menstrual Health Data is Very Valuable, and Big Tech Wants It.
By Rhian Farnworth, Digital Rights Specialist.
Google and period-tracking app Flo Health have agreed to pay a combined $56 million to settle a class-action lawsuit, after Flo shared users' intimate menstrual cycle and fertility data with Meta, Google, and two other companies between 2016 and 2019. The data was used for targeted advertising without prior consent, violating California privacy laws.
Menstruation, fertility, and contraception apps are now a billion-dollar industry: the knowledge that someone is expecting a baby can be worth up to 200 times more to advertisers than data about their age, gender, or location. Cycle Tracking Apps (CTAs) are not only a goldmine for advertisers; they also provide users with essential cycle-tracking and fertility information, plus valuable hormonal insights, like when in the month it's a good time to start a new project, or when hormonal shifts might make you feel more conversational (or cranky). CTAs are quickly becoming an everyday tool that menstruating people rely on to plan holidays and shopping, track fertile days, plan or avoid conception, monitor symptoms, improve self-care, and much more, often without thinking about where that precious cycle data is stored, or who can access it. However, CTAs can also reveal at what point in your cycle you're more susceptible to influence and likely to buy something with less persuasion - hello databrokers, advertisers, and big tech!
On the surface, CTAs can appear neutral: they offer beneficial health insights and are even presented as empowerment tools addressing gendered health gaps. Peek underneath, though, and it's easy to see they are designed around commercial, for-profit, highly monetised business models, reliant on accessing, surveilling, tracking, and selling your menstrual cycle data. With the global CTA market predicted to be worth $6.9 billion by 2032 and $11.2 billion by 2034, it's clear why CTA data are so desirable to advertisers and databrokers. But it's not just about the money; other actors, including law enforcement, have their own interests in cycle data. As reproductive rights narrow globally and access to safe abortion is increasingly restricted, developments like the June 2025 guidance instructing UK police to examine CTA data when investigating pregnancy loss demonstrate how menstrual cycle data is increasingly 'valuable' to actors keen to use it for other purposes, such as criminal evidence.
We might assume our reproductive health data - including very intimate records of cycles, intercourse, yeast infections, masturbation, diet, menstrual flow, mood, ovulation, and more - is kept private and securely stored, yet data breaches and illegal sales of reproductive health data are rife. As mentioned above, the Flo case came to a head in summer 2025, after Meta had illegally collected menstrual data from millions of Flo users. In 2023, data on 100 million users of American telecommunications giant AT&T was hacked, exposing sensitive call and text records - including when people accessed abortion healthcare - alongside web browsing and geolocation data that can be accessed by law enforcement. This is especially concerning, as such data can show when someone visited an abortion clinic. In 2022, Vice was able to buy Mobile Advertising IDs (or 'MAIDs' - device identifiers used by the mobile advertising industry) and data harvested from the period-tracking app Clue, which could be used to identify individual users, within minutes of signing up to the data-broker platform Narrative.io. Narrative.io has, luckily, since removed this sort of data. And while MAIDs are supposedly anonymous, there are established sub-industries that cross-identify MAIDs with real people, stripping away the anonymity MAIDs claim to provide. Needless to say, reproductive health data is not as safe as CTAs may lead us to believe.
Dark Patterns, Data Harvesting, and Little Regulation of Cycle Tracking Apps
The unauthorised sharing, tracking, and sale of cycle data often happens without users' knowledge or consent, as apps harvest data from the moment you register and start logging intimate information. The first thing most apps do when you log in is present data and privacy agreements, and apps frequently operate with 'give-us-everything' or 'pay-for-privacy' models, offering little information at registration about what happens to your data. Further, 'dark pattern' design techniques are often employed to coerce users into sharing all their data, both with the app and beyond it, with databrokers. Dark patterns rely on overly complex, confusing, and lengthy privacy agreements that produce consent fatigue, leading users to share everything without fully understanding the implications. Privacy prompts will use techniques like prominently displaying a large, green 'agree to everything' button while hiding privacy options in small, hard-to-read text, or forcing you through lengthy terms before any data-sharing options appear. This manipulates new users into skimming past the privacy terms and agreeing to share all their intimate data with an app. Privacy agreements also regularly fail to mention the types of intimate health data being collected, and offer insufficient options to accept or refuse sharing with third parties and databrokers. The result: users often agree to share everything without realising what intimate data they are handing over and to whom, unaware of the far-reaching impact this could have on their privacy and lives.
Once data is harvested, databrokers enter the picture. Databrokers are companies specialising in collecting, storing, and selling vast amounts of information to other companies. This includes buying data from third parties (CTAs, credit card companies, grocery stores, and so on), collecting it from public records (social media, courthouses, housing records, and similar), and directly tracking your wider online activity. Data is then cleaned, cross-referenced, merged, categorised, and sold to advertisers - for example as lists of 'high-earning yoga enthusiasts', 'women under 30 with missed periods', 'rape victims' (yes, this actually happened), or 'individuals experiencing X illness'. The lists are supposedly anonymous, but once advertisers buy them they are relatively easy to de-anonymise, and individuals end up targeted with highly specific advertising across their devices. CTA users may start to suspect something about their data privacy when, for example, adverts for pregnancy and baby-related items start appearing in their feeds.
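To see why 'supposedly anonymous' lists are so easy to de-anonymise, here is a toy sketch in Python of the kind of join a databroker or buyer can perform when two datasets happen to share a device identifier (MAID). All the data, names, and field labels below are invented for illustration; real broker pipelines operate at the scale of millions of records, but the mechanism is this simple:

```python
# Two datasets, each "anonymous" on its own: a health-app export keyed by MAID,
# and a broker identity file keyed by the same MAID. All values are invented.
maid_health = {
    "a1-b2-c3": {"app": "cycle_tracker", "last_logged": "missed_period"},
}
maid_identity = {
    "a1-b2-c3": {"name": "Jane Doe", "home_zip": "90210"},
}

def cross_identify(health: dict, identity: dict) -> dict:
    """Join the two datasets on their shared MAID key.

    Any MAID appearing in both datasets yields a merged record that links
    intimate health entries directly to a named individual.
    """
    return {
        maid: {**identity[maid], **record}
        for maid, record in health.items()
        if maid in identity
    }
```

A single shared key is enough: `cross_identify(maid_health, maid_identity)` attaches a name and location to the 'missed_period' entry, which is exactly the cross-identification sub-industry the article describes.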
The lack of regulation of CTAs leaves users further vulnerable to data protection violations. One tactic CTAs use to avoid firmer regulation is labelling themselves as 'wellness' products rather than medical apps, which are more heavily regulated. Existing data regulations also lack the language and nuance to deal with the intimate data CTAs store, particularly when they are labelled as wellness products. While many regulatory mechanisms exist to protect personal and sensitive data, they were not designed for the type of data CTAs collect, leaving users open to data exploitation. Existing data protection regulations include:
GDPR: The European Union's General Data Protection Regulation.
CCPA: California’s Consumer Privacy Act.
HIPAA: The US Health Insurance Portability and Accountability Act.
FD&C Act: The US Federal Food, Drug, and Cosmetic Act.
MHRA: The United Kingdom Medicines & Healthcare products Regulatory Agency.
FTC Act: The US Federal Trade Commission Act.
FADP: The Swiss Federal Act on Data Protection.
MDR: The EU Medical Devices Regulation.
If you've got this far, we highly recommend reading this excellent article - Mind the FemTech Gap: Regulation Failings and Exploitative Systems - which highlights and explores data protection regulations and key failings in FemTech and CTA regulation in greater detail. These include:
Lack of explicit FemTech (CTA) terminology: there are no legal definitions of FemTech or CTA data in existing regulatory data protection mechanisms.
No cross-jurisdictional consistency: regulations in different countries do not match, while CTAs are available globally or from multiple countries.
Classifications do not cover CTAs: while devices and apps classified as 'medical' face strict regulatory requirements, CTAs are often classified as 'wellness-orientated products', allowing them to sidestep regulations, benefit from fewer data-protection requirements, and avoid stricter certifications.
'Special data' category exceptions: GDPR, for example, gives extra protection to 'special category' data requiring greater sensitivity and regulation. There are, however, ten exceptions, including where people give 'explicit consent' to data sharing - and CTA users may not be aware of the consequences of this exception. CTAs can then leverage GDPR's narrow definitions to operate in a legal vacuum that lacks a specific CTA category.
No sector-specific standards: there are no dedicated certifications, compliance testing, or policy frameworks with best-practice guidelines or regulations for FemTech or CTA products.
With reproductive rights narrowing and safe abortion becoming increasingly difficult to access globally, it's imperative to be aware of the regulatory gaps, data exploitation, and privacy failings of CTAs, and the potential impact these can have. If cycle data gets into the wrong hands - including "(ex-)partner and family, employers and colleagues, insurance firms, advertising companies, political and religious organisations, governments, and medical and research companies" (Mind the FemTech Gap) - it can have devastating effects, threatening menstruating people's equity, prompting pregnancy-related redundancies, and leading to (health insurance) discrimination, among others. In certain US states and other places where abortion and even pregnancy loss is highly restricted, illegal, or criminalised, cycle data can be used as a tool of criminalisation.
What can you do to protect your digital body and CTA data?
We get it: CTAs are super useful for a multitude of reasons, and you may not want to hear 'omg they're a privacy nightmare, delete them!' and give them up. There are a few things you can do to keep tracking your cycle with care and privacy. Our tips are as follows:
Read privacy and data-sharing policies closely, and understand how your data will be used and shared, and with whom, before using new apps on your phone or other digital devices. It can be a tedious and annoying process, but ultimately it is the best way to understand what happens to your CTA data once you start entering it into an app.
Use your phone's built-in health app. Remember to check your phone's device-sharing settings; if you've ever shared health data with someone else, they may still be able to access it.
Make your own period-tracking spreadsheet or database. Follow this handy step-by-step guide to creating a cycle-tracking spreadsheet.
Use a calendar - on your phone, computer, or a physical paper one. Digital calendars are more susceptible to leaks, so if you're concerned about others accessing the data, you can use emojis or other code words to document your period discreetly. If you're using a physical calendar, remember that it too could be used by law enforcement as a record.
Use ‘anonymous mode’. Some CTAs offer this, allowing you to access the app without providing identifying information.
Read up on data privacy and on safer, more private CTAs. This guide to the best period-tracking apps for data privacy was released in September 2025, and we've shared some further reading below. We strongly encourage all CTA users to do their own research and decide what is best for themselves <3
Ultimately, if you are someone who menstruates and documents intimate health data on digital devices, do everything you can to control who has access to your intimate and reproductive health data. Data breaches happen often, supposedly anonymous data is sold by databrokers to third parties, and privacy is not guaranteed. Don’t divulge ANY data you want to keep private!
Further Reading
Did we catch your attention? Here are some further resources to read about the topic at a deeper level.
Felsberger, S. (2025). The High Stakes of Tracking Menstruation. Minderoo Centre for Technology and Democracy. https://doi.org/10.17863/CAM.118325
Key, K. (2025). Don't Give Big Tech Your Period Info. Here's How to Track It Privately. PCMag. https://www.pcmag.com/explainers/dont-give-big-tech-your-period-info-heres-how-to-track-it-privately
Chong, H. (2023). Data for Sale: Navigating the Role of Data Brokers and Reproductive Health Information in a Post-Dobbs World. SMU Science & Technology Law Review, 26, 99.
Kuźnicka-Blaszkowska, D., & Joachimska, J. (2025). When Your Phone Knows You're Pregnant Even If You Don't: Period Tracking Applications and Threats to Privacy. Journal of International Women's Studies, 27(1), Article 9. https://vc.bridgew.edu/jiws/vol27/iss1/9
Mehrnezhad, M., Van Der Merwe, T., & Catt, M. (2024). Mind the FemTech Gap: Regulation Failings and Exploitative Systems. Frontiers in the Internet of Things, 3:1296599. https://doi.org/10.3389/friot.2024.1296599
Stempel, J. (2025). Google, Flo Health to Pay $56 Million in Period-Tracking App Case. Reuters.
Kelly, B. G., Junchaya, O., Min, J., & Burdan, M. (2025). Safeguarding Autonomy: Examining the Complexities and Implications of Under-Regulated Period-Tracking Apps and Paired Devices in a Post-Roe Landscape. GW Authored Works, Paper 7425. https://hsrc.himmelfarb.gwu.edu/gwhpubs/7425