Facebook, Cambridge Analytica, and Grindr: Frank Pasquale Talks About Big Data and HIV Disclosure
The explosion of social media has changed how people connect with like-minded people all over the world. It has certainly helped people living with HIV, or people interested in or using pre-exposure prophylaxis (PrEP), to connect with each other and discuss their experiences, helping to break stigma and isolation.
But two recent scandals have raised a lot of questions about privacy and the unintended consequences for people with HIV, people using PrEP, or anyone simply searching for information on those issues on social media: Facebook's allowing Cambridge Analytica to harvest the data of an estimated 87 million users (data that was then used for political interference), and gay-dating app Grindr's mea culpa for sharing its users' HIV status with outside companies.
I spoke to Frank Pasquale, J.D., M.Phil., professor of law at the University of Maryland, about these issues. Pasquale is an expert on the law of big data, predictive analytics, artificial intelligence, and algorithms. He is the author of the 2015 book The Black Box Society: The Secret Algorithms That Control Money and Information and co-host of the podcast This Week in Health Law.
Kenyon Farrow: This conversation has been in the back of my head almost since I started at TheBody, but now it just obviously seems much more prescient, thinking about questions of data and surveillance as they relate to health issues.
You've been looking at some of these issues for a long time, and it feels as if your moment has arrived.
Frank Pasquale: Oh, yes. Absolutely.
KF: Probably in some scary ways. In the last couple of months, there have been a number of scandals about how data is being used by companies, even in some ways that might actually be legal, but in ways that the public didn't know about.
Your most recent book, The Black Box Society, which was released in 2015, seems to have been written for everything that we're experiencing right now. I first want to ask you to talk about the premise of your book and how it relates to the issues we're seeing right now.
FP: I'm really glad that you bring it back to the book. I was thinking about a lot of these issues of runaway data from about 2009 onward. And I started actually by looking at the surveillance state. One of my first articles was on these things called "fusion centers" that were combining data from police departments, government agencies, private-sector entities, etc.
But then, the deeper I got into it, and the more I learned from groups like Patient Privacy Rights or the Electronic Privacy Information Center, I saw how remarkable the data profiling that big companies were doing about people really was.
In the health area, in particular, I was impressed by a book by Mary Ebeling, Healthcare and Big Data: Digital Specters and Phantom Objects, about how she was marketed to on the basis of a child she had miscarried. The marketers had assumed that she had had the child, and they kept marketing to her as if that child were growing up.
KF: And you know, a lot of this has come into public concern, particularly in the press, since the 2016 election and the revelations that Russian actors were involved in various forms of disinformation campaigns, through fake news websites, Twitter bots, and fake Facebook groups. And then, obviously, there's the recent revelation of Cambridge Analytica's role in that entire scandal.
But I think there's been a slower recognition of the more mundane ways and reach of data surveillance, and the whole industry behind it, through various aspects of life, including health-related information.
Are there specific concerns you have about how a person's health status -- and for our readers, people who are HIV positive or people who are on PrEP, even -- can be gleaned by how you use social media?
FP: Yes. Well, I'll start from the beginning, which is to say that a lot of us think that HIPAA [the Health Insurance Portability and Accountability Act of 1996] covers health data, but it doesn't. It covers entities: doctors, hospitals, insurers, and their business associates. But if a data broker or a social media site or anything else gets ahold of your status, or learns it from something that's not a covered entity, HIPAA doesn't cover any of that.
So, if your doctor wanted to tell somebody that you had HIV, be it a marketer or any type of firm, they'd have to ask your permission. But anybody else who has that data can assert, under the First Amendment, a right to say, "We get to share it with anybody."
Now, there are some interesting ways in which that could be challenged. But it's really not been challenged in the U.S. at all. That's the first thing.
The second is, there was that article in The New York Times about the guy who was denied life insurance? Did you see that one?
KF: Yeah, yeah, I did.
FP: And you know, I think that's a really interesting area, where people don't expect that life insurers and some other service plans are going to be looking at everything. And a huge weakness of the American privacy regime is that they can look at everything. And lots of the lists out there are not accurate.
And you would think you could sue them for defamation over, say, a list of people with diabetes. But then these firms say: "Well, we're not calling them diabetics. We're saying they're diabetes-concerned households. And that's an opinion. So, you can't sue us."
So, this whole landscape is incredibly complicated and largely unregulated. And especially with the revelations about Facebook and Grindr, you see that entities that you may have thought would have some fiduciary role, or obligation, or some moral or ethical code to keep your data private are not doing a very good job of it.
KF: You're raising a couple of points that I was going to get to, such as the fact that HIPAA, as you describe, applies to covered entities. And we've seen a couple of incidents where breaches of those laws by covered entities have had legal consequences. Aetna last year had to pay out $17 million to people living with HIV whose status it had exposed just by sending them a letter; you could see through the envelope that it said HIV on it. And so, they had to pay folks based on a kind of HIPAA violation.
And then, recently, there hasn't been a settlement, but there is a pending lawsuit in Cincinnati alleging that the CVS chain in that region did a similar thing: a mailing that was not put together well and revealed the HIV-positive status of about 6,000 people.
So, we see the way the law can work for entities covered by HIPAA.
But I want to ask you; you kind of jumped to this question about Grindr. Given that that story really just came out [at the time of this interview], what are your thoughts about that, specifically?
FP: I'm really surprised and disappointed that they did not ask for consent. I think their European users may be very interested in looking at it under European law. There's a controversy over whether there was some form of consent in the terms of service. But I feel this is a situation where everybody should realize that you ought to have affirmative, express consent for sharing data like that. That would be my bottom line, in terms of why I feel disappointed in them for doing that.
KF: In the last 24 hours, I've been looking at debates on Facebook, and people have been making the argument, "Well, because I'm on Grindr and I disclose in my profile that I'm positive and undetectable, or I'm on PrEP or whatever, that is essentially public information at that point and so therefore not a privacy concern, because I chose to disclose it." What would you say to people who are making that argument?
FP: Well, you know, I think that the issue for a lot of users is that you don't connect it to your name, and you don't connect it to a really durable social media profile. Or you keep some degree of distance between those things and your Grindr profile.
And I think, by the same logic, there are ways in which this is particular information, offered for a particular purpose [on Grindr], and it's important to people that the stigma be as low as possible, and the consequences be as low as possible. Otherwise, you're going to be encouraging people to be secretive about it, and that's not a good thing. You want to create a space where it's safe for people to be totally honest about their status.
So, I think that people need to understand: It's not like Tinder, where maybe your profile is up for everybody all the time. It's a particularly sensitive, privacy-protecting site. Or at least it seems like one.
KF: Right. What are the things you think we need to change, in terms of law and policy, regarding these data disclosures, particularly around health issues or entities not covered under HIPAA or HITECH [the Health Information Technology for Economic and Clinical Health Act]?
FP: I think we need something like the European privacy law, the General Data Protection Regulation, in the U.S., and we need to bring that case to U.S. leaders. The first step is for people to understand what data every company has on them and who they're sharing it with. And there are particular categories of data -- for instance, about health -- where people need to be able to give direct consent before it is shared.
I think that we just have to change the law to make certain categories of information really sensitive, where signing the terms of service once doesn't govern your transactions for the rest of your life; you actually have to give consent for those types of data sharing.
KF: What do you think that individuals -- people with HIV, people on PrEP, or people with other sensitive and highly stigmatized (and, in the case of HIV, criminalized) conditions -- should be doing personally to protect themselves?
FP: That is a very hard question, because the problem is that every time someone offers a personal self-defense, a privacy-enhancing technology, we usually find out within three months to a year that there are ways it can be overcome.
So, right now, it's very difficult to say that there's a way you could do that. I think this is something that has to be solved at a broad-based social level rather than by individuals.
This transcript has been lightly edited for clarity.