Meta, the maker of Facebook and Instagram, introduced a new privacy setting Thursday that lets you ask, pretty please, for the company not to use your data to train its AI models.
Buried in the nether regions of Facebook’s Help Center—a part of the website most people probably never visit—you’ll find an entry called Generative AI Data Subject Rights. “This form is where you can submit requests related to your third-party information being used for generative AI model training,” Facebook tells the weary travellers who’ve managed to stumble onto the page.
Here, you find three options. You can tell Facebook you want to access, download or correct any personal information, say you want to delete that personal information, or fill out a blank text box if you “have a different issue.”
The form then asks you for your name, email address, and country of residence. When you hit submit, the website tells you, “Thanks for contacting Facebook. You should receive an email response shortly.” At this point, you’ll probably want to do some kind of occult ritual to ensure the data gods hear your plea.
The leaders of the tech industry say that AI will soon destroy our world. But if you’re truly concerned about your data being swept up to train artificial intelligence, there are a lot of reasons to think this new Facebook form might be a waste of your time.
As Facebook explains, models like the ones Meta is building analyse pieces of data from a variety of sources. Some of that data comes from what you type into Meta’s own apps, including Facebook and Instagram. This form won’t help you with that. Did you think that was your data? There are other ways to delete some of the data you’ve handed Meta, but there’s no way to object to the company using it for AI. Meta has built an untold number of algorithms and AI tools on your information, though the company says its LLaMA 2 language model wasn’t built on user data.
This form only relates to the “third-party data” that Meta scraped, purchased, or licensed from outside sources. What sources, exactly? You may never find out.
When you put your name and your email into this form, it’s hard to know what Meta does next. Presumably, the company has some kind of automated search that looks through the training data for its generative AI models to find exact matches for your name and email. Even if we assume Meta makes a rigorous effort in its search, it’s ridiculous to think that the only data that might refer to you will include your full name or email address.
Perhaps, if there’s information about you that doesn’t identify you by name, there’s nothing to worry about. But many people have a quasi-moral objection to giant corporations sucking up data about them, churning it through some kind of opaque machine, and then unleashing it through a robot that behaves in unpredictable ways. Does that moral objection give you any legal rights? Only in a few places with laws that specifically govern artificial intelligence and privacy.
That’s probably why the form asks for your country of residence. It seems that Meta is granting some people limited rights over their data based on where they live. In some places, the company has a regulatory obligation to do so. “Data Subject Rights” is a standard legal term for the rights that you have, as the subject of data collection, to delete, access, or alter that information under certain local laws. In the UK and Canada, for example, there are rules about scraping consumer data. Not so in the United States, at least at the federal level. If Meta is asking where you live, that may mean the company isn’t going to grant your request if you’re in a country where it doesn’t have to.
Gizmodo asked Meta about the process it uses to grant these AI data requests, and whether users in the United States and other parts of the world can expect to have their requests honored. The company did not immediately respond; we’ll update this article if we hear back.