Tinder has a proven track record of providing a dating platform to some less-than-stellar men who have been accused of raping, and in one grisly case dismembering, women they’ve met through the service. But even when the company does something right, there are still privacy trade-offs to consider.
While the company still seems to lack some basic safety measures, like preemptively screening for known sexual offenders, Tinder did announce on Thursday its latest effort to curb the reputation it’s earned over the years: a “panic button” that connects each user with emergency responders. With the help of a company called Noonlight, Tinder users will be able to share the details of their date, and their location, in the event that law enforcement needs to get involved.
On one hand, the announcement is a positive step as the company tries to wrangle the worst corners of its user base. On the other hand, as Tinder confirmed in an email to Gizmodo, users will need to download the separate, free Noonlight app to enable these safety features within Tinder’s app. And as we’ve seen time and time (and time and time) again, free apps, by design, aren’t very good at keeping user data quiet, even if that data concerns something as sensitive as sexual assault.
Unsurprisingly, Noonlight’s app is no exception. By downloading the app and monitoring the network traffic sent back to its servers, Gizmodo found a handful of major names in the ad tech space, including Facebook and Google-owned YouTube, receiving details about the app every minute.
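Gizmodo doesn’t detail its exact tooling, but this kind of analysis is typically done by routing a test phone’s traffic through an intercepting proxy. A minimal sketch, assuming mitmproxy and a device configured to trust the proxy’s certificate, might flag outbound requests to ad-tech endpoints like this (the host list is purely illustrative, not a record of what Gizmodo observed):

```python
# flag_trackers.py -- run with: mitmdump -s flag_trackers.py
# Logs any request the instrumented phone makes to a known ad-tech host.
# The host list below is illustrative, not Gizmodo's actual methodology.
from mitmproxy import http

AD_TECH_HOSTS = (
    "graph.facebook.com",      # Facebook Graph API
    "youtube.googleapis.com",  # YouTube Data API
    "api2.branch.io",          # Branch
    "sdk.iad-01.braze.com",    # Braze (formerly Appboy)
    "control.kochava.com",     # Kochava
)

def request(flow: http.HTTPFlow) -> None:
    host = flow.request.pretty_host
    if any(host.endswith(tracker) for tracker in AD_TECH_HOSTS):
        print(f"[tracker] {host} {flow.request.path[:80]}")
```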
“You know, it’s my job to be cynical about this stuff—and I still kinda got fooled,” said Bennett Cyphers, an Electronic Frontier Foundation technologist who focuses on the privacy implications of ad tech. “They’re marketing themselves as a ‘safety’ tool—‘Smart is now safe’ are the first words that greet you on their website,” he went on. “The whole website is designed to make you feel like you’re gonna have someone looking out for you, that you can trust.”
In Noonlight’s defence, there’s actually a whole slew of trustworthy third parties that, understandably, should have access to data from the app. As the company’s privacy policy lays out, your precise location, name, phone number, and even health-related intel supposedly come in handy when someone on the law enforcement side is trying to save you from a dicey situation.
What’s less clear is which “unnamed” third parties the company reserves the right to work with. As that same policy states:
When you use our Service, you are authorizing us to share information with relevant Emergency Responders. In addition, we may share information […] with our third-party business partners, vendors, and consultants who perform services on our behalf or who help us provide our Services, such as accounting, managerial, technical, marketing, or analytic services.
When Gizmodo reached out to Noonlight asking about these “third-party business partners,” a spokesperson mentioned some of the partnerships between the company and major brands, like its 2018 integration with Fossil smartwatches. When asked about the company’s marketing partners specifically, the spokesperson—and the company’s cofounders, according to the spokesperson—initially denied that the company worked with any at all.
From Gizmodo’s own analysis of Noonlight, we counted no fewer than five partners gleaning some sort of information from the app, including Facebook and YouTube. Two others, Branch and Appboy (since renamed Braze), specialise in connecting a given user’s behaviour across all of their devices for retargeting purposes. A fifth, Kochava, is a major hub for all sorts of audience data gleaned from an untold number of apps.
After Gizmodo told Noonlight that we had analysed the app’s network traffic, and that the traffic showed data flowing to third parties, Noonlight cofounder Nick Droege offered the following via email, roughly four hours after the company had vehemently denied the existence of any such partnerships:
Noonlight uses third parties like Branch and Kochava only for understanding standard user attribution and improving internal in-app messaging. The information that a third party receives does not include any personally identifiable data. We do not sell user data to any third parties for marketing or advertising purposes. Noonlight’s mission has always been to keep our millions of users safe.
Let’s untangle this a bit, shall we? Whether apps actually “sell” user data to these third parties is a thorny debate, one that was being battled in boardrooms, newsrooms, and courtrooms even before the California Consumer Privacy Act, or CCPA, went into effect in January of this year.
What is clear, in this particular case, is that even if the data isn’t “sold,” it is changing hands with the third parties involved. Branch, for example, received some basic specs on the phone’s operating system and display, along with the fact that a user downloaded the app to begin with. The company also provided the phone with a unique “fingerprint” that could be used to link the user across each of their devices.
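To illustrate why a few “basic specs” are enough, here’s a rough sketch of how a cross-device fingerprint can be derived from exactly this kind of non-PII. The field names are hypothetical, not Branch’s actual schema:

```python
import hashlib

def device_fingerprint(specs: dict) -> str:
    """Hash a handful of 'anonymous' device traits into a stable ID.

    None of these fields is PII on its own, but together they are
    distinctive enough to recognise the same phone across apps.
    (Field names are hypothetical, not Branch's actual schema.)
    """
    keys = ("os", "os_version", "model", "screen_resolution",
            "locale", "timezone", "carrier")
    raw = "|".join(str(specs.get(k, "")) for k in keys)
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

print(device_fingerprint({
    "os": "Android", "os_version": "10", "model": "Pixel 3",
    "screen_resolution": "1080x2160", "locale": "en_US",
    "timezone": "America/Chicago", "carrier": "T-Mobile",
}))
```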
Facebook, meanwhile, was sent similarly basic data about device specs and download status via its Graph API, as was Google through its YouTube Data API. But even then, because we’re talking about, well, Facebook and Google, it’s hard to tell what will ultimately be milked from even those basic data points.
It should be pointed out that Tinder, even without Noonlight integration, has historically shared data with Facebook and otherwise collects troves of data about you.
As for the cofounder’s claim that the information being transmitted isn’t “personally identifiable” information—things like full names, Social Security numbers, bank account numbers, and so on, which are collectively known as PII—that appears to be technically accurate, considering how basic the specs we observed being passed around actually are. But PII isn’t as essential to ad targeting as some people might think; non-PII data points can still be cross-referenced to build person-specific profiles, especially when companies like Facebook are involved.
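As a toy illustration of that cross-referencing, here’s what joining two “anonymous” datasets on a shared device fingerprint looks like; every record below is invented:

```python
# Two 'anonymous' datasets keyed on the same device fingerprint.
# All values are invented for illustration.
ad_network_log = {
    "a3f9c1d2e4b5a6c7": {"apps_installed": ["Tinder", "Noonlight"]},
}
data_broker_segment = {
    "a3f9c1d2e4b5a6c7": {"inferred_gender": "female", "age_band": "25-34"},
}

# Merge the two sources into one per-device profile.
profiles = {
    fp: {**ad_network_log.get(fp, {}), **data_broker_segment.get(fp, {})}
    for fp in set(ad_network_log) | set(data_broker_segment)
}
print(profiles)
# {'a3f9c1d2e4b5a6c7': {'apps_installed': ['Tinder', 'Noonlight'],
#                       'inferred_gender': 'female', 'age_band': '25-34'}}
```

No single field here names anyone, but the combined profile is plenty specific to target ads against.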
At the bare minimum, each of these companies was hoovering up data about the app’s installation and the phone it was installed onto. For readers who are accustomed to everything from their medical history to their sexuality being turned over into marketers’ hands for profit, this might seem relatively benign, especially considering how Noonlight also requires location tracking to be turned on at all times.
But that’s ultimately beside the point, as Cyphers pointed out.
“Looking at it like ‘the more partners you share with, the worse’ isn’t really correct,” he explained. “Once it gets outside the app and into the hands of one marketer who wants to monetise from it—it could be anywhere, and it might as well be everywhere.”
It’s something to think about when looking at partners like Kochava—which, while collecting similarly basic intel about your phone’s OS, is a company that readily boasts its “hundreds of ad network and publisher partners.” And because the advertising chain of command is more than a little opaque, it’s entirely possible for some percentage of those hundreds to get their hands on data from an app targeting a very specific (and very vulnerable) population—even if they aren’t supposed to.
In other words, the sheer fact that someone downloaded this app is, at the very least, a tipoff that they’re probably a woman, and probably scared of becoming another statistic. Somewhere down the line, this basic data could be used to target the people who download this particular app with ads for some sort of self-defence keychain. Or counselling services. Or a gun. Because hey, who knows, they might need these things, right?
As Cyphers put it, “The kinds of people that are gonna be coerced into downloading it are exactly the kind of people that are put most at risk by the data that they’re sharing,” which is absolutely true—and that goes for data on their entire digital life, including the apps they download.
Every person—and every trauma, every fear, every painful encounter—plugged into Noonlight will likely eventually be flattened into a single bucket of “people who downloaded” this particular app, and that bucket will be a blip among the rest of the targetable data points floating through the digital ad ecosystem. Ultimately though, it’s not what goes into this particular blip, or the magnitude of this blip, that’s indefensible—it’s that the blip exists at all.