Any EU-based users of reddit should immediately file a complaint under the GDPR with their supervisory authority over the sale of their data to Google to train its LLMs
In its recent IPO news and filings, reddit is telling future investors that it is currently selling, and looking to sell, its users' data to companies that want to train LLMs, including Google.
This is a direct violation of the GDPR for any EU-based users.
Legal Basis?
Under Art. 6 GDPR, reddit can only really rely on p1 (f) (p meaning paragraph): "processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child."
All other options are impossible, as they have neither consent nor contracts with their user base that would allow for this. Art. 6 p1 (f) is a touchy subject and clearly requires extra provisions being made in case the data subject is a child. Reddit has tons of children (meaning anyone under 18) using the site daily. See for example: https://www.reddit.com/r/teenagers/
What's being processed
Due to the nature of reddit, they are also processing huge amounts of special category data under Article 9, such as data on sexual orientation, health, ethnicity, trade union membership, etc. (basically everything in Article 9 can easily be found on reddit):
(note: users have not given explicit consent meeting the requirements for consent under Art. 7 and 8, which would otherwise allow such processing under Art. 9 p2 (a))
These are obviously just a tiny selection of the hundreds of subreddits that are concerned with these types of data, not to mention the unencrypted "private" messages, chats, etc.
My lord, is this legal?
Article 9 p2 (e) states "processing relates to personal data which are manifestly made public by the data subject", so they're out of the woods, right? After all, users posted this stuff and it was made public! Sadly, this doesn't work for processing children's data, especially with 9 p4 allowing member states to introduce additional limitations and conditions on further processing.
The real kicker for 9 p2 (e), though, comes with Article 5 p1 (b). 5 p1 (b) requires that personal data be: "collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall, in accordance with Article 89(1), not be considered to be incompatible with the initial purposes (‘purpose limitation’);"
Yeeeah... People have posted their stuff publicly, BUT with a clear understanding that the processing of the data ends there. Reddit may process the data insofar as they serve it to the public. That's it. Turning around and selling the data now is a crystal-clear violation of Article 5 p1 (b). There are no two ways about it. And as per Article 5 p2, reddit needs to be able to prove they are in compliance with 5 p1.
They're also processing data under Art. 10 relating to criminal convictions and offences
Processing of such data "shall be carried out only under the control of official authority or where processing is authorised by Union or Member State law". Ruh-roh. I'll admit that this one might be reaching a bit, as it could easily be read as applying only to "official" types of data here, rather than just criminal talk overall, but fuck it. Throw it on the pile.
Your rights and how they're being violated (not in a kinky fun way)
Now let's look at the Rights of the data subject! Those are always fun :)
Art. 12 p1 "The controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 and any communication under Articles 15 to 22 and 34 relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child."
This is my favorite. Articles 13 and 14 are the provisions on informing the data subject when data is obtained directly from them or not directly from them. For reddit, Article 13 mostly applies, but since people also talk about people they know, 14 applies too.
Let's check in there real quick. We'll keep it to Article 13 for brevity. Reddit needs to:
give info on the contact details of the controller and the controller's rep (in the case of selling data to Google for LLM training, that's info for Google, not reddit; anyone got that info via DM maybe? No? Oh shit)
contact details of their data protection officer (both reddit's and Google's; was anyone informed about that for the LLM stuff? No?)
purposes of processing including the legal basis. Love this one. Anyone know that for the sale of their data to Google to train their LLM? No? Shucks.
Since they're likely hinging on Art. 6 p1 (f), they need to tell you what the legitimate interest is - in our case MAD MONEY; not sure if that'll hold up.
recipients or categories of recipients -> so "big evil data churning, election influencing, minority silencing, union busting, mega corp" Sweet.
Transfer to third countries -> likely doesn't apply, as reddit's servers are in the US (I believe, no idea if that's true); if not, weeeeeeell...
right to lodge a complaint with supervisory authority (anyone got that notice?)
whether the data is provided due to a contractual or statutory requirement -> doesn't apply, users give their data "freely"
existence of automated decision-making, INCLUDING PROFILING (as per Art. 22 -> none of the exceptions in that article apply to the current situation) - I'm sure no LLM will ever be used by the largest ad company on the planet to help profile users, noooooooo, that's craaazy!
Article 13 p3 clearly states in relation to Article 5 p1 (b) that the data subject must be informed about the data that is being collected for further processing BEFORE such processing occurs, including all the info I just listed above from Article 13 p2.
Article 13 p4 states: "Paragraphs 1, 2 and 3 shall not apply where and insofar as the data subject already has the information." Yeeaaah... No reddit user knew their data was being actively sold for LLM training (sure, LLMs might have scraped it, but that's an entirely different can of worms - one reddit has a hand in too, since under the GDPR they're supposed to establish safeguards against such things - but reddit directly selling now without any upfront info... tut, tut..)
Another element of Article 12 p1 is this bit: "in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child." I'm sure they'll find a cool and hip way of explaining to all the teens on reddit what an LLM is and how it's using their data.
Send reddit a little e-mail
For added fun, I urge anyone who still has a reddit account and is an EU citizen to contact reddit and make use of their rights under the GDPR to be specifically excluded from any use for LLM training, etc. This is your RIGHT under Article 12 p2, specifically the Article 21 right to object. You can contact them via "dpo@reddit.com".
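If you want to save yourself some typing, here's a quick throwaway sketch that drafts such an objection for you to paste into your mail client. The wording is only my example (not legal advice), and the username and e-mail are placeholders you'd fill in yourself:

```python
# Quick sketch: drafts an Article 21 objection to paste into your mail client.
# USERNAME and EMAIL are placeholders; the wording is only an example, not legal advice.

USERNAME = "your_reddit_username"
EMAIL = "you@example.com"  # the address tied to your reddit account

objection = f"""To: dpo@reddit.com
Subject: Objection under Article 21 GDPR (account: {USERNAME})

Dear Data Protection Officer,

Pursuant to Article 21(1) GDPR, I object to any processing of my personal data
based on Article 6(1)(f), in particular the disclosure or sale of my posts,
comments and messages to third parties for the training of large language
models or similar systems.

Please confirm in writing, within the time limit of Article 12(3) GDPR, that
my data has been excluded from such processing.

Account: {USERNAME}
Contact e-mail: {EMAIL}
"""

print(objection)
```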
Let's be really petty and assume that reddit and Google are shit at what they do, so they're likely not even engaged in the Data Processing Agreement required under Article 28 p3 - and if they are, I'd love for my supervisory authority to take a look at that one.
Then there's Article 35 p1 on data protection impact assessments: "Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address a set of similar processing operations that present similar high risks."
New technologies you say? Likely to result in a high risk, you say? Remember how most chatbots and AIs turn super racist, super quick? Or AIs being easily triggered into revealing their training data 1:1? Oh I'm sure there's nooooo such risk with LLMs run by evil mega corp known for exploiting the shit out of exactly this kind of info for well over a decade now.
But we needn't even argue that point. Article 35 p3 (b) clearly states:
"processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10;"
Oh nooo. Remember my list from the start? LLMs trained on reddit data sets are 100% definitely processing all of that on a large scale.
Any assessment carried out would clearly indicate risk, and thus Article 36 would apply, under which reddit has to consult the supervisory authorities in the EU BEFORE starting this. Knowing reddit, if they ever even did such an assessment, they came out with "low risk, nothing to see".
Then there's Article 32 of the GDPR: Security of processing. p1 (b): "the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;" yeeeaaah, good luck with that on an LLM there, buddies.
Cool, what now?
Here's what you do to exercise your rights and defend your data against the highway robbery and continuous violation by US Tech-Bros:
Find your supervisory authority (just use Google, for added irony) by searching for "Data Protection supervisory authority [the state you live in]".
Find their contact info; they usually have a ready-made complaint form.
Give them the company info applicable to your state; I've gone ahead and fished that out for you (see the end of this post).
Now I don't want to defend reddit here, but AFAIK most comments are not subject to the GDPR as long as you don't know they contain personal data and they have been detached from other personal data fields (like the username).
So by removing personal data fields, they most likely become "anonymized".
Of course that's not the end of it; you have to consider the technology available to de-anonymize this data before it can legally be called anonymized.
But I don't think there has been any case where this was challenged before... and I bet most supervisory authorities would discard such complaints as "too hard to follow through". (I got that reply from the Netherlands authority when I asked them to check a website's newsletter opt-in.)
And I certainly don't think reddit or any operator will be forced to delete comments because they could be de-anonymized depending on the content the user wrote, when most comments probably cannot be de-anonymized.
Having to check everything for potentially identifiable data in that regard would be ridiculous for website operators.
Maybe some light checks, sure, but not as deep as would be required to truly anonymize everything a user could have written that identifies them.
A lot of that information becomes fragments as soon as you unlink it from the user. E.g. 12 people in a post wrote "I am gay", great. But if you can't link that back to other comments by the same users somewhere else, it's not identifiable, just text.
Nope, your username and email are required and linked to your data, so it's entirely personal information. True anonymization is impossible with open text fields, as it's always possible that people reference other users within their posts, etc.
Of course, what the DPAs do with it, is another matter. Doesn't hurt to try.
Of course they are linked, but removing the username from the comments means they are mostly anonymized as far as GDPR is concerned.
It is perfectly fine to unlink data and keep processing it, as long as it's considered anonymized under GDPR.
Your post content here is also not considered personal data; it only shows up on a lookup request because it's currently linked. If I crawl the page and don't save the username, the resulting data can most likely be considered anonymized under the GDPR as far as the current interpretation is concerned.
It only becomes a problem as soon as I become aware that the content did in fact contain personal data, or probably also if I could have expected it to with high probability.
And I'd have to make sure to remove obvious ways to re-link the content to your account (e.g. mentions of your username in comments).
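To make concrete what kind of scrubbing that would be, here's a minimal sketch, assuming the only obvious re-link vector is a direct u/username or /u/username mention in the comment body (real comments obviously have far subtler ones):

```python
import re

# Toy sketch of the scrubbing I mean: strip direct u/username mentions from a
# comment body before treating it as "unlinked" text. This only catches the
# low-hanging fruit; it is nowhere near real anonymization.
MENTION = re.compile(r"(?<!\w)/?u/[A-Za-z0-9_-]{3,20}")

def scrub_mentions(body: str) -> str:
    return MENTION.sub("[user]", body)

print(scrub_mentions("thanks u/some_redditor, see /u/other_user's post"))
# -> thanks [user], see [user]'s post
```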
Anything else would require precedent on ways to re-identify someone based on their posts on a platform, weighed against the user's freedoms and the difficulty of such re-identification.
Recital 26 discusses when something can be considered anonymous (or rather when the GDPR would apply at all, and what it means to have anonymous data).
That is not quite correct. As long as it is possible to identify the user, it is personal data. True anonymization under GDPR is nearly impossible without destroying the data set.
Reddit would have to fully delete it; otherwise, simply searching Google for the exact text of any comment together with site:reddit.com immediately reveals who the author is.
It doesn't matter whether the dataset in use allows for identification on its own, as long as identification remains possible at all.
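To illustrate how little effort that takes, here's a rough sketch of the lookup I mean. The example snippet is made up, and whether this counts as "means reasonably likely to be used" in the sense of Recital 26 is exactly the question:

```python
from urllib.parse import quote_plus

# Sketch of the re-identification step: build the exact-phrase site:reddit.com
# search for a supposedly anonymized comment. The snippet below is made up;
# any real comment of a sentence or two is usually unique enough that the
# first hit leads straight back to the original thread, username and all.
def reidentification_query(comment_text: str) -> str:
    return "https://www.google.com/search?q=" + quote_plus(f'site:reddit.com "{comment_text}"')

snippet = "I am gay and my union rep knows about my diagnosis"
print(reidentification_query(snippet))
```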
mhh.. you might be correct.
I haven't considered how easy it actually is to search for a comment and find the exact post.
The question is whether searching indexers like public search engines is enough to call the data easily re-identifiable.
Or whether this use of personal data is covered some other way, e.g. legitimate interest weighed against the freedoms of the data subjects, as you have listed above already.
I'd argue it is, but that's where the judgement of the DPAs comes in. It's definitely possible that some, if not all, of them reject this as "it's fine". But unless eyes are put on it, any shenanigans will simply go unchallenged.
I don't know how it might go, but giving it a try is basically free.
Also, I appreciate your consideration of my perspective!
You have to give one while signing up (just checked), unless you go through Apple or Google ID services. Either way, they still log your IP and other metadata, not to mention your username does exist.