Go With The Flo? What Should The Digital Health & Wellness Industry Learn From The Flo Health Lawsuit?
Verdict: My Body, My Choice.
[Quick note: No newsletter next week! Back in your inboxes on Friday 29th August.]
Last week the latest court case involving the data sharing practices of Flo Health drew to a close in the U.S., and all companies involved in digital health and wellness should sit up and take notice of the verdict. There was a twist in the tale: Meta, the last company standing after Flo and the other parties settled, was ultimately found liable for privacy invasions arising from the collection of data from Flo app users. Meta’s damages bill could run into billions of dollars.
Flo might be the largest in its market, but they are no Meta and can’t absorb huge fines and settlements as part of the cost of doing business. Startups entering the market may continue to fall into the same traps, risking financial and reputational damage that sinks the business, so this case provides important lessons.
Now, I’m not a lawyer. I’m a human rights and tech expert who has spent years holding tech companies accountable for privacy harms. I have extensively researched the digital advertising and Femtech industry, following this particular case with great interest. My work focuses on human rights impacts that go beyond current legal compliance and this deep-dive into the case is from that perspective.
It’s a longer article this week but the TL;DR version is:
Users, especially women, care deeply about health and intimate data privacy and are willing to defend this in court.
Users in this case all accepted Meta’s T&Cs but a jury ultimately decided that this did not mean they had signed away their privacy. This could have implications for the use of sensitive data for advertising/marketing purposes going forward.
This case goes deep into the stack of “industry standard” developer tools like software development kits (SDKs) that extract valuable data without the user’s knowledge.
Similar companies should look deeply into their own stack to truly understand what “third parties” are getting out of their users’ data, and prioritise user privacy.
If you would like to chat further about the themes of this week’s article, get in touch.
Ok, let’s dive in…
Flo Health is a UK-based company also registered in California that makes the Flo period and fertility tracker app. In a crowded market of cycle trackers they are No.1, with 380 million downloads and over 68 million active monthly users according to their website. In 2024, Flo became the first Femtech company in Europe to achieve “unicorn” status, meaning their valuation had gone beyond $1 billion. In short, they are massive in the field of women’s digital health.
Even before the overturning of Roe v. Wade in the U.S. removed federal protections for abortion and raised the stakes and tension around reproductive surveillance, period and fertility trackers were on the radar of many civil society organisations and journalists, including Privacy International, where I worked until 2022, over the collection and sharing of sensitive data which could be weaponised against women seeking an abortion. These concerns were starkly put into context following the very public bowing down of Big Tech to the Trump administration and Meta proving its loyalty by doing away with many protections for users.
In short, sharing of sensitive health data with Big Tech is a problem.
Back to where it all began…
Flo’s problems began in 2019, when the Wall Street Journal reported that it was able to intercept unencrypted health information transmitted by the Flo app to Facebook. Data included the user’s intention to get pregnant, and when the user was having a period. The report stated the information also included a unique advertising identifier, which can be matched to a specific device or social media profile, identifying the user. This is despite Flo’s privacy policies stating that Flo would not share users' health details with any third parties.
The third parties in question were Google, Facebook (now Meta), Fabric, AppsFlyer and Flurry. You’ve probably only heard of two out of the five, and that demonstrates the complex and opaque nature of the digital advertising world and just how many companies are in the chain. You may think you are communicating with one company or brand providing an app, but you’re actually communicating with many companies analysing all your interactions on that one app.
The Federal Trade Commission (FTC) in the U.S., a regulator whose mission is to protect the public from deceptive or unfair business practices, was interested. The FTC had at that point (pre-second-term Trump) cracked down on tech companies for unfair data sharing practices under consumer law, and it began investigating Flo’s data sharing practices between 2016 and 2019.
Under consumer law, the FTC’s case against Flo essentially boiled down to, ‘You said you wouldn’t share data with third parties and you did’ - a simple premise that resonates with users. Broken promises = broken trust, and that is bad for business.
This is significant when considering the kind of information users were entering into the Flo app. When users signed up, a survey asked all kinds of intimate questions about physical and mental health conditions, medication and sex life, along with name, email address, date of birth and other personal identifying information.
The FTC ordered Flo to change its practices; Flo agreed to settle without admission of wrongdoing and said at the time that the Wall Street Journal report was based on “inaccurate representations”. They were keen for that to be the end of the matter. However, Flo users reacted strongly to this perceived betrayal of trust and it was far from the end of the matter. According to the FTC complaint, Flo received hundreds of complaints and requests that their details be deleted. Customers were “outraged,” “incredibly upset,” “disturbed,” “appalled,” and “very angry.” Indeed, they felt “victimized” and “violated”.
A class action lawsuit followed, brought by a group of women in California against the parties in the FTC complaint - Flo, Meta, Google, AppsFlyer, Flurry and Fabric - claiming the sharing of their personal data had contravened the California Invasion of Privacy Act and California’s Confidentiality of Medical Information Act during the same period of 2016-2019.
“We don’t sell your data”
You may have heard this from various companies, but I’m afraid it’s a bit more complicated than that. It’s not about a sale, it’s about a trade, and this is where the class action lawsuit added more detail on top of the FTC complaint.
In this case, it’s all about what happened deep in the workings of the app, in the software development kits (SDKs), and this is where it starts to get murky. SDKs are packages of software tools that contain everything a developer needs in order to build apps. They can include components like debuggers or APIs, but also analytics tools, offered for free by many companies as part of the services they provide.
The class action complaint explains,
“For instance, Facebook’s SDK can be incorporated into an app to share user data between an app and Facebook. By using the Facebook SDK, developers can gain access to Facebook’s data analytics and use Facebook tools to assist with mobile ads, among other things. Flo Health incorporated Facebook’s SDK so that it could use Facebook’s analytics tools to identify which of its users would be prime targets for advertisements keyed off the data they entered into the App.”
Flo also incorporated SDKs from Google, AppsFlyer, Flurry and Fabric into the app, and the case claims that this allowed those companies (the “Non-Flo Defendants”) to access data in return for the analytics they provided. The complaint continues,
“In exchange for using the Non-Flo Defendants’ SDKs, Flo Health transmitted intimate health data entered into the Flo App to the Non-Flo Defendants—in direct contravention of Flo Health’s assurances to users that this information would not be disclosed—including when a user indicated that they were on their period or intended to get pregnant.”
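To make that concrete, here is a minimal sketch of what this kind of integration looks like from the developer’s side, assuming an Android app written in Kotlin using Meta’s App Events API (the `AppEventsLogger` class is real; everything around it is illustrative, not Flo’s actual code). The event name mirrors one cited in the FTC complaint.

```kotlin
// A minimal, hypothetical sketch of logging a custom app event via the
// Facebook (Meta) SDK on Android. The event name below mirrors one cited
// in the FTC complaint; this is not Flo's actual implementation.
import android.content.Context
import com.facebook.appevents.AppEventsLogger

fun reportPregnancyWeek(context: Context) {
    // One line of "free analytics" code: the SDK batches this event and
    // transmits it to Meta's servers together with device identifiers
    // (such as the advertising ID the WSJ reporting described), which
    // can link the event back to a specific person.
    val logger = AppEventsLogger.newLogger(context)
    logger.logEvent("R_PREGNANCY_WEEK_CHOSEN")
}
```

The point is how little friction there is: from the developer’s perspective it is a single analytics call, but the data is already on its way to a third party’s servers.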
A trade, not a sale
As a result of this trade, these other companies, whose business it is to generate revenue from advertising, could target that exact person, or people like them, with adverts elsewhere on the web, whether through Facebook, Instagram, Google, or other platforms and apps where the third parties did business. The class action complaint provides some real world examples of what is known as “cycle-based advertising”,
“For instance, armed with knowledge that a Flo App user is pregnant or attempting to get pregnant, the Non-Flo Defendant can specifically target that user with ads for prenatal vitamins, breast pumps, or fertility treatments, among other things.”
“As another example, if a user indicated that she experienced oily skin during her menstruation cycle, the Non-Flo Defendants could use this information to target that user…with advertisements for certain skin care products around this time period.”
It appeared there would be an interesting test case afoot for both the digital advertising and wellness industries, but before the details could be tested at trial, Flo and Google settled. (AppsFlyer was dismissed from the case in 2022; Flurry settled for $3.5 million and went out of business.)
So the focus switched to Meta, and they were prepared to go to a jury trial to decide whether there had been an invasion of privacy under the California Invasion of Privacy Act. Meta denied it received users’ sensitive health information through SDKs, claiming that when users interacted with the app, Meta only received the custom app event and the value “known” or “unknown.”
Let’s unpack that.
A custom app event is part of Meta’s advertising package. Meta’s client, in this case Flo, creates custom app events to record information that can be used in further advertising and marketing campaigns. According to the FTC, Flo created 12 custom app events, one of which was “R_PREGNANCY_WEEK_CHOSEN”, triggered when a user entered how many weeks pregnant they were. If, for example, the user entered “12 weeks”, Meta would receive the value “known”. While Meta wouldn’t receive the exact number of weeks, the information regarding the user’s pregnancy status is crystal clear.
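For illustration, here is a hypothetical sketch (in Kotlin, with invented names) of the reduction Meta described: the exact number of weeks is stripped out, yet the event still discloses the thing that matters.

```kotlin
// Hypothetical sketch of the "known"/"unknown" reduction described at
// trial. The exact week is discarded, but paired with an event named
// R_PREGNANCY_WEEK_CHOSEN, the value "known" still reveals that the
// user is pregnant.
fun pregnancyEventValue(weeksEntered: Int?): String =
    if (weeksEntered != null) "known" else "unknown"

fun main() {
    // User entered "12 weeks" in the app's survey:
    println(pregnancyEventValue(12))   // prints "known"
    // User skipped the question:
    println(pregnancyEventValue(null)) // prints "unknown"
}
```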

All this is happening amid a global backlash against women’s rights, with reproductive rights being rolled back. In this climate of fear for many women, some of the defence seems a little tone deaf to the fear that sensitive information, at risk of being weaponised against women, is in the hands of who knows whom.
In order to gain the full picture of what happens to their intimate data, Flo users would have to understand that Flo integrated the SDKs of a variety of companies, including Meta, into the app, understand what an SDK is, know that survey results were being shared as custom app events, and understand what a custom app event is. At the same time, users believed, from Flo’s own policies, that their information would not be shared. It is nigh on impossible for the average user to navigate this conflicting and complex ecosystem, and the jury agreed.
The verdict and where we go from here
On August 7th, a jury found Meta liable for violating the California Invasion of Privacy Act. As the lawsuit covered all Flo users across the U.S. from 2016 to 2019, some estimates put the damages at $190 billion, with a potential 38 million users receiving $5,000 each.
Meta will likely appeal and maintain that any transmission of sensitive health data is due to a failure of developers (like Flo) to comply with its terms of use. You wouldn’t know any of this from Flo’s comms. They may have settled this case, but other cases are pending against Flo: a class action in Canada and a consumer rights/GDPR test case ongoing in Portugal. These cases could have implications for the use of sensitive data for advertising, and I truly hope they do.
As I outlined in The Prompt last week, users are pushing back on T&Cs that claim to provide legal cover for all sorts of things.
What is significant about these cases is that they tell us quite a bit about user expectations, especially women’s expectations around data relating to their bodies and lives. The class action and data protection cases are brought by users, users who had ‘accepted’ T&Cs. The verdict in the Meta case came from a jury, not a judge. Companies might look for cracks to slip through between data, wellness and health, and hide behind T&Cs as legal protection, but users, especially women, just aren’t standing for it.
As this case demonstrates, it’s a bit more complicated than “We don’t sell your data”, and women are prepared to stand up, yet again, to protect and defend their choices.