Back when Roe was overturned, I helped some of my friends evaluate their period-tracking apps for privacy concerns.
The iOS version of Flo has the following data privacy report card:
Data Used to Track You
The following data may be used to track you across apps and websites owned by other companies:
Purchases
Identifiers
Location
Usage Data
Data Linked to You
The following data may be collected and linked to your identity:
Health & Fitness
Location
User Content
Identifiers
Sensitive Info
Purchases
Contact Info
Search History
Usage Data
Diagnostics
These have been in place since at least 2022, because I was able to go back in my chats and find a screenshot of it from back then.
This is not to take anything away from how awful Meta is, but Flo wasn’t exactly a paragon of virtue here either. They were massively overreaching and playing fast and loose with their users’ data in an app that is already meant to collect highly sensitive information (all the more so now in a post-Roe world). A period-tracking app shouldn’t need your search history or contact info, nor should it share your location data with third parties.
Meta sucks here, but Flo absolutely does too.
By the way, for anyone looking for a private period-tracking app, some of my friends switched to Euki which has a privacy report card that looks like this:
Data Not Collected
The developer does not collect any data from this app.
There’s also Drip which is open source and doesn’t collect any data.
Some others chose to go offline with their tracking and moved to using journals/calendars, which is undoubtedly the safest option, albeit not the most convenient.
I have no sympathy for Facebook and their many, many, many privacy violations. I actually never touch anything Meta, not even WhatsApp, which is encrypted (if you take Meta and their track record at their word).
Anyway, the attorneys do have a point though. It’s the app developers that decide what events are sent to FB (and other telemetry providers), not FB. As I understand it, the developers recorded events about the users’ cycles and sent that as the payload for specific events to FB. So yeah, idk. Maybe punishing FB is the right move, because they’ll maybe try to find out whether people are abusing the SDK, but it seems odd that the people who designed the app to work this way aren’t defendants too.
While I do agree with you, if I make it financially appealing to, say, have Colombian Drug Lords send me packages every month I can't then turn around and be like "Of course I had to sell it, blame them for sending it to me!"
If Facebook didn't want to be receiving sensitive health information, they could've told the app developers "stop sending me that stuff."
Fair point, I didn’t appreciate that Facebook was then selling this data. In my mind it was more like they provide an analytics service where the devs can send whatever events and that’s it. Like when it’s done for feature usage, that data is then used for product development only, and that’s the service. But this case is different and I hadn’t realized it; thanks for pointing it out.
To clarify the point made by sodliddesu: Facebook isn't selling the data itself. They are selling targeted advertising using the data. So, as one example, women who haven't menstruated for the past several months will start seeing targeted ads for baby products.
I also found that confusing. Here's an article about the settlement with the app: https://www.courthousenews.com/flo-settles-data-privacy-suit-over-menstrual-tracking-app/
In the larger scheme of things, I think it's good that Facebook was held responsible, I hope it leads to AdTech vampires being more careful with what data they use for advertising.
In this particular case though, I can't find where or how the data was actually used, when it was shared to third parties, how it was tied to users, or who profited from it. If anyone has more info please add it. From what I'm reading, I think the Flo app itself is mainly to blame for sharing sensitive data without consent.
Shit like this is why free open source software should be the default.
If I understand the article correctly, they settled their part of the case.
I'm shocked that Facebook would violate user privacy. This is unprecedented!
Came here to basically post this. Many of their users probably don't care but I'm sure many just don't know or understand. I would like to see cigarette pack warnings on websites. If you profit off stolen user data you get a black bar affixed to the top of the page that says: "Here are all the ways we've abused your data in the last 12 months..."
Any aggregator of personal health information should be legally bound to the same level of confidentiality as a doctor or nurse or health researcher at a university. If it's illegal for a person to share, using an algorithm to do it doesn't and shouldn't make it legal. Personal health information is intrinsically sensitive and should be protected.
I would like to see cigarette pack warnings on websites. If you profit off stolen user data you get a black bar affixed to the top of the page that says: "Here are all the ways we've abused your data in the last 12 months
I’m just curious here, but could I ask you to give me a link to the cigarette pack warnings you’ve got in mind? I know that different jurisdictions will have different levels of regulation about that kind of thing, and I’m picturing Australia’s packaging laws, which might be some of the strongest in the world, but I don’t think that’s quite what you’re suggesting.
For context: there’s zero branding allowed, the package must be the most unpleasant vomit-yellow colour the government could come up with, the name and variety of cigarettes (e.g. “Winfield Blue 20pk”) is printed in the exact same bland font and small size on every pack regardless of brand, and there’s always a big gross picture of some horrible disease caused by smoking, for example:
examples hidden because they’re gross. Contents: simple descriptions and a link to an image which shows a handful of these cigarette packs including the gross pictures
a gangrenous foot with missing toes, or a cancer-riddled mouth, or a close-up of a tracheotomy
Here’s an example but be warned it’s not pretty:
https://www.ft.com/__origami/service/image/v2/images/raw/http://prod-upp-image-read.ft.com/0414511a-11ec-11e6-91da-096d89bd2173?source=next-opengraph&fit=scale-down&width=900
Yeah you got it. I'm just borrowing the general idea of a government forcing a private business to plaster their product with high visibility messaging about the potential damages and downsides of their product.
It’s… it’s almost like their whole business model relies on violating people’s privacy? I’m SHOCKED!
They'll stop and become ethical any day now.
Haha yes I’m sure they will. As soon as ethics becomes profitable!
This article is hot slop.
During the trial, plaintiffs focused on software development kits or SDKs — prewritten bits of code that developers use to build apps and track analytics.
They argued that Flo's SDKs effectively worked like recording devices. “In 2025, recording devices come in all shapes and sizes,” Canty said.
Plaintiffs argued Meta intentionally used SDKs to record women’s communications through 12 “custom app events” with names like “R_SELECT_LAST_PERIOD_DATE” and “R_SELECT_CYCLE_LENGTH.” They claimed Meta received event data for each survey question users filled out and used the data for advertising.
Ok, whose SDKs are they then, Flo's or Meta's? How and why would Meta even use some third-party developer's SDKs? That makes no sense.
I'm guessing they're talking about app events used for user analytics and audience targeting (see reference).
There are standard events like:
EVENT_NAME_AD_CLICK
EVENT_NAME_COMPLETED_REGISTRATION
EVENT_NAME_SEARCHED
EVENT_NAME_VIEWED_CONTENT
Then third-party developers can define their own custom events. That's what Flo did.
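For anyone curious what that looks like on the developer's side, here is a minimal sketch assuming the Facebook Android SDK's AppEventsLogger API. The event names are the ones cited at trial; the helper function and parameter names are made up for illustration:

```kotlin
import android.content.Context
import android.os.Bundle
import com.facebook.appevents.AppEventsConstants
import com.facebook.appevents.AppEventsLogger

// Hypothetical helper inside a period-tracking app (illustrative only).
fun logOnboardingAnswers(context: Context, lastPeriodDate: String, cycleLength: Int) {
    val logger = AppEventsLogger.newLogger(context)

    // A standard event whose name Meta defines:
    logger.logEvent(AppEventsConstants.EVENT_NAME_COMPLETED_REGISTRATION)

    // Custom events are just strings the developer chooses, optionally with a
    // Bundle of parameters. Nothing in the SDK knows or cares that these
    // particular strings describe someone's menstrual cycle.
    logger.logEvent("R_SELECT_LAST_PERIOD_DATE", Bundle().apply {
        putString("value", lastPeriodDate)
    })
    logger.logEvent("R_SELECT_CYCLE_LENGTH", Bundle().apply {
        putInt("value", cycleLength)
    })
}
```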
Meta "collected it, recorded it, used it, exploited it, profited from it," Canty said. "Ladies and gentlemen of the jury, that is intent."
Both standard and custom events are used by Facebook to fine-tune user embeddings. Collected? Yes. Recorded and used it? Not directly. Facebook isn't being like, "this user had a period last week, target menstrual product ads at them!" Rather, it's a tagging system where semantic understanding is implied through tag co-occurrence.
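To illustrate what I mean by co-occurrence, here's a toy sketch (not Meta's actual pipeline; the function and sample data are made up): tags that fire together across many users end up carrying meaning for audience building, even if no one ever interprets an individual tag name.

```kotlin
// Count how often pairs of event tags show up for the same user.
fun cooccurrence(eventsByUser: List<Set<String>>): Map<Pair<String, String>, Int> {
    val counts = mutableMapOf<Pair<String, String>, Int>()
    for (events in eventsByUser) {
        val tags = events.sorted()
        for (i in tags.indices) {
            for (j in i + 1 until tags.size) {
                val pair = tags[i] to tags[j]
                counts[pair] = (counts[pair] ?: 0) + 1
            }
        }
    }
    return counts
}

fun main() {
    val users = listOf(
        setOf("R_SELECT_LAST_PERIOD_DATE", "EVENT_NAME_VIEWED_CONTENT"),
        setOf("R_SELECT_LAST_PERIOD_DATE", "EVENT_NAME_SEARCHED"),
        setOf("EVENT_NAME_SEARCHED", "EVENT_NAME_VIEWED_CONTENT")
    )
    // Frequently co-occurring pairs become targeting signal without anyone
    // ever "reading" the custom tag.
    println(cooccurrence(users))
}
```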
As for intent, I'm skeptical that Facebook has engineered its whole system specifically to get people's menstrual data. I'm extremely skeptical that Facebook reached out to Flo and was like, "hey, we want you to send us your users' sexual health data" because the payoff is so little and the liability and optics are so bad, it's irrational.
If anything, Facebook actively restricts tags for sensitive data and will even ban developers. Here's a Reddit thread by a developer trying to circumvent being flagged by the platform's Health Restrictions.
The max length for an app event parameter value is 100 characters (see the Limits section of the App Events docs: "Are there limits on the number of characters used in event names or parameters?"), so it's not really feasible to transmit conversations (unless you chop them up into tiny fragments, which would render them semantically useless), let alone embeddings of conversations.
“Does a reasonable person think they were consenting to what happened here?” Canty said. “Show me in the [Meta] data policy where it says you give us permission to secretly record your private conversations with the Flo app.”
A jury found Friday that Meta violated the California Invasion of Privacy Act when it intentionally recorded the sensitive health information of millions of women through the period tracking app Flo.
Mind this is only the jury's verdict.
CIPA prohibits unauthorized interception, recording, or eavesdropping of a private conversation or communication by telephone, electronic device, or other means.
The definition of an electronic conversation or communication is iffy.
As demonstrated above, it's not possible using the Facebook SDK or the Facebook App Events API to transmit a full text or voice conversation or an embedding thereof, unless Flo was somehow reducing conversations down to a string of keywords all under 100 characters and then submitting those.
So is the user clicking buttons or selecting numbers in the app considered a conversation or communication? No higher state court in California has spoken on the matter yet.
Recorded and used it? Not directly. Facebook isn't being like, "this user had a period last week, target menstrual product ads at them!"
Sure, they never tell tampons.store that EgoEimi had their period recently, because selling the actual user data would destroy their whole advertising model.
But they sure as heck will tell tampons.store that “hey we know when users do or don’t have their periods so if you pay us to advertise, we’ll make sure we put the ads in front of someone who is having their period” and then they absolutely will add EgoEimi to their list of “people to advertise period products to” this week in their targeted advertising. And maybe neither of these things will happen as verbatim or obviously as I’ve described here, because plausible deniability is great at dodging lawsuits when actually you totally still did the thing, just abstracted it enough to make it seem like not the thing you did.
It’s still just as deliberate, and it’s still taking advantage of sensitive medical information that they shouldn’t even have, but certainly shouldn’t be using for advertising and profit!
Facebook isn't being like, "this user had a period last week, target menstrual product ads at them!" Rather, it's a tagging system where semantic understanding is implied through tag co-occurence.
This seems to me like the same result (Facebook collecting and profiting from menstrual data about its users) but with software in the middle to obfuscate things. That is, maybe one company innocently put data in and Meta innocently applied it in a way that predicted menstrual behaviors without anyone intentionally designing it to, but the end result is the same isn't it?
When we design software clever enough to produce emergent behaviors it wasn't designed for, behaviors which are undesirable to many but profitable to business, where does culpability go?