Mike Wasserman, Amanda Stanhaus, Robert Miller, George Matthew and Sophie Baum on ‘How New Health Paradigm Impacts Rights’ at Converge2Xcelerate | ESG News – Boston, MA
Mike Wasserman, Amanda Stanhaus, Robert Miller, George Matthew, and Sophie Baum at Converge2Xcelerate Conference (Boston, MA)
- HIPAA addresses the use and disclosure of individuals' health records and how they are protected
- The US spends nearly 20% of its GDP on healthcare
- Sophisticated cyber-attacks impact millions of individuals annually
INTERVIEW TRANSCRIPTS: Mike Wasserman, Executive Director of Boston Debate League, Amanda Stanhaus, Identity Solutions at Consensys, Robert Miller, Senior Consultant at Consensys, George Matthew, CMO at DXC Technology, and Sophie Baum, Associate at Hogan Lovells
Mike Wasserman – Executive Director, Boston Debate League: 00:00
Good afternoon everybody. Thank you. I know that we are the last presentation before lunch, but thankfully this is going to be highly engaging and interactive. So we have here a small debate for you guys. We're going to just pack it in. We don't have that much time. Before we dive into the debate, though, just some quick introductions. I want to give everyone a chance to introduce themselves, but just so you know you're in the right room: we are going to be debating and talking about the different sides of health and data paradigms. Really talking about whether HIPAA as it exists today still meets the needs of this much more data-rich world that we're living in, or whether there's a new standard that really needs to be developed to be more comprehensive of the amount of data and information that's out there.
Mike Wasserman – Executive Director, Boston Debate League: 00:54
So we have four very wonderful and knowledgeable experts. I will say, I think it's probably up there on the slide somewhere, that we have assigned sides to this debate. So really the purpose of this is to explore the different facets of each side, but it's not necessarily the viewpoint of each person up here or the companies that they are representing. So just quickly, by way of introduction, so you know who's doing this talking: my name is Mike Wasserman. I'm the executive director of the Boston Debate League. I'm in no way a healthcare expert, which is why you're not hearing any kind of content from me. I run a nonprofit organization here in the greater Boston area where we bring relevant policy topics like this to middle and high school students, both in their schools, on college campuses, and in local companies, and we have young people wrestle with these ideas. But I'm happy to take a step away from that for an afternoon to have some deep experts do a similar activity. So I'll just go down the line and have folks quickly introduce themselves.
Amanda Stanhaus – Identity Solutions, Consensys: 01:57
Great. Hi, my name's Amanda Stanhaus. I'm currently working on identity solutions at Consensys while also finishing up a PhD at the University of Michigan, focusing on data privacy and genetics issues.
Robert Miller – Senior Consultant, Consensys: 02:13
Hey everybody. I'm Robert Miller. I do advisory and product work at Consensys Health, using emerging technologies to build products for healthcare and life sciences. A lot of my work focuses on privacy and value-based care. I run a newsletter in this space, on blockchain and healthcare, if you're interested in that too. Excited for the conversation. Thanks for being here.
George Matthew – CMO, DXC Technology: 02:32
Hey everyone. I'm George Matthew. I'm the chief medical officer for a company called DXC Technology. We're a $22 billion top-line-revenue company that does a lot of IT vendor work, $3.5 billion of that in healthcare. I look over the Americas, and we also look at consent and data monetization as a strategy for many of the Medicaid clients that we're working with now.
Sophie Baum – Associate, Hogan Lovells: 03:00
Hi everyone. My name is Sophie Baum. I'm filling in this afternoon for Melissa, who had to run. I'm a law clerk at Hogan Lovells in Washington, DC, focused on data privacy and cybersecurity law.
Mike Wasserman – Executive Director, Boston Debate League: 03:07
Wonderful. So here's how we're structuring this debate. Basically, we've got Amanda and Robert on the affirmative side. They're going to be arguing that we need a new, more comprehensive standard to replace HIPAA to address the more complex world we're living in. And then we've got George and Sophie arguing that no, the status quo of HIPAA works well and is what we should be sticking with. The way that we're going to run this is that each team gets to give an opening remark, laying out their case for five minutes. After each opening remark, the other team gets a chance to do some cross examination, and then each team gives a closing, crystallizing statement. And then here's the participatory part: at the end of the round, you all are going to decide who wins this debate.
Mike Wasserman – Executive Director, Boston Debate League: 04:00
And obviously everyone here who works in this industry probably has an opinion going into it. So I'm not going to ask you who you agree with, right? That doesn't necessarily speak to the results of the debate. But I'm going to do a hand raise at the end for folks who are either in the middle right now, or who can set aside their personal views and say which side they're more persuaded by, or, if you started out on one side and you've moved in the other direction by the end of the debate, then you vote for the side that moved you. And so that'll just be our quick way to crown a completely honorary winner, with no financial or specific prize, in this debate. And, as I do for our middle and high school students, I will be keeping time, because there are many ideas that can be shared, but brevity is one of the beauties of debate. So unless anyone has any questions or clarifications, we're going to dive right in. We have five minutes here for the opening statement from the affirmative side.
Robert Miller – Senior Consultant, Consensys: 05:28
Okay. So I'm giving the affirmative intro statement. The world that HIPAA was made for is radically different than the world that we live in today. It was a world that didn't know the smartphone or the app economy that increasingly powers and enables our healthcare. It didn't know the direct-to-consumer digital health services that add value to our lives almost every day now, nor the internet of things. And it's because we're in this radically new world, this new health paradigm, that I think we need updated legal protections. My argument today is going to focus on how HIPAA protections do not extend to non-covered entities, on the broad usage of de-identified data, and on an inappropriate model of consent. HIPAA only protects information within covered entities: your traditional healthcare system providers, plans, clearinghouses, and their business associates. But many businesses analyze, collect, and disclose sensitive information about your health, the majority of which are not covered entities, and as such they are not held to the standard of protection that HIPAA gives information within covered entities.
Robert Miller – Senior Consultant, Consensys: 06:41
For example, most things that collect information directly from consumers, like fitness apps, some PHRs, and a lot of direct-to-consumer genomics, are not afforded this higher standard of protection, and that leads to flagrant privacy violations. There was a Wall Street Journal investigation into how fitness apps and women's health apps were sharing extremely sensitive data with Facebook without the knowledge of their users. This is in part possible because HIPAA does not extend its protections to non-covered entities, and this mismatch between the protections for information within covered entities and non-covered entities can be confusing for consumers, who might have a limited sense of when their data is protected by HIPAA and erroneously consent to their data being used in a way that they didn't anticipate. And once data is created or obtained within a covered entity, that covered entity can do whatever it would like with it.
Robert Miller – Senior Consultant, Consensys: 07:38
HIPAA provides for the broad usage and sharing of de-identified data, and this creates perverse incentives. How are you supposed to know, when your health system is selling your de-identified data for large streams of revenue, whether that vial of blood they took from you was really necessary or whether they took it to sell your data? The premise of de-identified data is that you're able to protect against the harms of privacy violations by removing someone's identity. But increasingly, in the age of big data analytics, there are scholars who question whether de-identification is ever really possible. The University of Chicago medical center shared de-identified health records with Google, which could then pair them with Google Maps information to see when folks were visiting the University of Chicago medical center and re-identify those records. There's a large question of whether de-identification is ever possible in the age of big tech and big data.
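The linkage risk Miller describes can be sketched in a few lines of Python. Everything below is a hypothetical illustration, not the actual University of Chicago or Google data: "de-identified" records that still carry quasi-identifiers (ZIP code, birth year, sex) can be joined against an auxiliary dataset that does carry identity.

```python
# Sketch: re-identifying "de-identified" records by joining on quasi-identifiers.
# All records, field names, and people here are hypothetical illustrations.

deidentified_visits = [  # hospital data with direct identifiers removed
    {"zip": "02139", "birth_year": 1984, "sex": "F", "diagnosis": "diabetes"},
    {"zip": "02139", "birth_year": 1990, "sex": "M", "diagnosis": "asthma"},
]

auxiliary_data = [  # e.g. location or app data that DOES carry identity
    {"name": "Alice", "zip": "02139", "birth_year": 1984, "sex": "F"},
    {"name": "Bob", "zip": "02139", "birth_year": 1990, "sex": "M"},
]

def reidentify(visits, aux):
    """Join the two datasets on shared quasi-identifiers."""
    matches = []
    for v in visits:
        candidates = [a for a in aux
                      if (a["zip"], a["birth_year"], a["sex"])
                      == (v["zip"], v["birth_year"], v["sex"])]
        if len(candidates) == 1:  # unique match: identity recovered
            matches.append((candidates[0]["name"], v["diagnosis"]))
    return matches

print(reidentify(deidentified_visits, auxiliary_data))
# With these toy records, both "anonymous" rows match a unique person.
```

The point of the sketch is that no field in the hospital data names anyone, yet combining a handful of ordinary attributes is enough to single people out once a second dataset is available.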
Robert Miller – Senior Consultant, Consensys: 08:37
Moreover, it undermines the just distribution of the benefits of data mining and personalized medicine. You know, personalized medicine largely runs on de-identified data, but we're not able to share the valuable insights that we derive from big data and personalized medicine with the very people who helped to produce those insights, because we're using de-identified data so broadly. And again, I've been talking about covered entities and de-identification, but there are no protections for how non-covered entities are sharing your data. Moreover, both of these work streams, non-covered entities and covered entities, rely on a model of consent that is outdated. Our legal frameworks and ethical oversight create really high transaction costs and put undue burdens on patients to manage their privacy. There's empirical and social science research demonstrating that individuals face cognitive problems in making informed, rational choices about the tradeoffs in sharing their data and consenting to particular usages.
Robert Miller – Senior Consultant, Consensys: 09:42
And part of this is simply the scale. It's not feasible to manage your privacy separately for all the entities that we interact with in this new data economy. Moreover, oftentimes privacy harms come not from individual transactions, an individual consenting to a use, but when data is aggregated from a variety of different sources. This has proved to be a very difficult thing to conceptualize, and we need to move towards new models of consent if we are to adequately protect patients' interests in this new data economy. So it's for these reasons, that HIPAA's protections don't extend to non-covered entities, our broad reliance on de-identified data, as well as our erroneous model of consent, that I would advocate that we increase legal standards in our new health paradigm.
Mike Wasserman – Executive Director, Boston Debate League: 10:31
All right, thank you for that. Within three seconds of five minutes. So now I’m going to give the negative team a chance to ask a few cross-examination questions.
Sophie Baum – Associate, Hogan Lovells: 10:52
So there are a few things that I wanted to hit on, and there are two statements in particular you made that I thought were really interesting, because they're wrong. You said companies can do whatever they want with data, companies that are not HIPAA-covered, so not clearinghouses, not traditional medical facilities, that kind of thing. And you also said there are no protections for these non-covered entities. My question, I guess, is how you fit into your new paradigm all of the other protections that do exist. So, for example, there are protections laid out by the FDA, and the FTC has standards for protecting health information. And there is a patchwork of state laws as well. Illinois, for example, has a biometric privacy law. California has a confidentiality of medical information law. So I guess my question to you is, how do you fit all of those laws into your statement?
Robert Miller – Senior Consultant, Consensys: 11:53
I think that while the FTC does have some generic oversight over consumers' privacy and the regulation of their information, it doesn't go to the full extent that HIPAA provides for information within covered entities. And I would advocate that we extend that to information in non-covered entities as well. And I think that really, to unlock innovation and help entrepreneurs and patients navigate our system, I would want a national federal privacy framework as opposed to fragmented, state-by-state regulation. If states are interested in going further and beyond, that's okay too. But I think ideally we have a national privacy framework.
Sophie Baum – Associate, Hogan Lovells: 12:31
I think that's great. And something that you just mentioned there kind of leads me to my next question, about balancing innovation and consumer protection. You said we need to make sure that there's room for innovation here, but if we do add these further regulations and these laws, I mean, that has a large potential to impact some of the income of businesses and even to squash some new innovators in this space. So how would you balance that?
Robert Miller – Senior Consultant, Consensys: 12:52
I would reject the premise of the question. I don't think that we have to necessarily choose between innovation and privacy. I think that we can have both, but we need guardrails to ensure that the interests of people are met and rules of engagement exist, or else we'll have privacy violations like the few that I laid out earlier. I think we need to experiment with different models as well. One that I'm particularly fond of is the idea that patients' interests could be represented by a data union, which negotiates for the usage of their data on their behalf and provides some sort of oversight function. And I think my colleague, in our closing statement, is going to share some of the new models for consent as well. So I think we can have both. I reject the premise that I either have to have innovation and no privacy, or privacy and no innovation.
Mike Wasserman – Executive Director, Boston Debate League: 13:36
Thank you. That's your time. Okay. So now we'll turn it over to George to give the opening statement for the other view.
George Matthew – CMO, DXC Technology: 13:42
And I should point out that some of the points that I was going to bring up have been discussed, I think, in the cross. When we think about 2019 and healthcare data and analytics and computing power, you know, we've got more now than we've ever seen, and it's because of the free access to a lot of that data that we've seen innovation in ways that we never even conceived. You know, there's the traditional healthcare data that's out there, such as your blood pressure or your vitals or whatnot, but being able to track where you walk with your cell phone and use that as a way to outwardly determine if you are sick or not? That's something people didn't even conceive of 20 years ago. And a lot of that is possible because the way that HIPAA and other laws are structured currently allows that flexibility for innovation to happen, because small startups that are two people in the hypothetical garage can afford to get access to that data and build some of these new products and services that can help people.
George Matthew – CMO, DXC Technology: 14:41
It's very difficult to come back and say we're going to put more regulation on top of it and increase the cost for you to enter that market, because that may be denying us some of these lifesaving innovations that people are coming up with. Rather than disrupting the status quo, I think we also have to have another discussion, when you talk about telling people, "Hey, I'm going to consent you," about what data literacy really means. Right? Do you fully understand what's happening with your data? Do you fully comprehend it, and can you actually make an intellectual choice in the span of a second? I've heard the complaints; you know, obviously, you think about your typical app with its end user license agreement, or walking into a hospital and signing a consent form. But people get care. And people are willing to actually trade their data for access to that app and get the value from it.
George Matthew – CMO, DXC Technology: 15:35
In Pew studies that have been done as recently as three or four years ago, when they ask people, especially younger people, "Are you willing to trade access to your data, healthcare or not, for the benefit of this app or service?", the answer is almost always a majority "yes," because they realize the value they get out of it is greater than the potential threat of something happening to their data. So, you know, my two points are basically trying to keep the door open for innovation, as we've discussed, but also really figuring out how you communicate that. How do you educate folks so they can make a decision? Because right now, if you open the doors and you just say, "Hey, we're going to use your data because we might need it for a tertiary endpoint in this clinical trial that might impact something 20 years from now," most people throw their hands up and say, "That doesn't affect me now. How will that affect me paying my bills or getting better?" And it doesn't. So that's it. Thank you.
Mike Wasserman – Executive Director, Boston Debate League: 16:38
Great. Now I'll give the affirmative team a chance to ask some questions.
Amanda Stanhaus – Identity Solutions, Consensys: 16:39
Cross examination. All right. So, to your two points, about stifling innovation versus promoting it, and then also regarding consent and whether trading data for value is a fair trade. To start with innovation: I feel like we've already touched on the points about the variation in state law, and it sounded like we both agree on looking at federal reform. But something that I'm concerned about, and would love to hear your thoughts on, is that if there were a federal regulation, whether it's health-specific, so updating HIPAA, or a more GDPR-like solution, it's my understanding that there's a risk of a federal law acting as a floor, which could then actually cause more variation as states build upon it. And that's seen in the health information exchange space, where Julia Adler-Milstein has found that really the main barrier to health information exchange is the state variation that's blocking interstate exchange. And then regarding consent, and the studies that you cited about whether, especially in the younger population, a young person wants to trade their data and whether that's a fair trade: there's a lot of behavioral economics research on actual decision-making versus hypothetical. In those surveys, where it is a hypothetical situation, it's difficult, again, for the survey respondent to be fully rational and to really understand what is at stake. And then, again, to the point of aggregation: the choice you make individually is, you know, aggregated and compounded by other people's choices as well.
Amanda Stanhaus – Identity Solutions, Consensys: 18:51
And so I think those citations aren't necessarily fair. And what's interesting is there has also been research, still again survey data, where people are shown different categories of data and different recipients, and there's a pretty common sense, across age groups, across demographic categories, etc., that in general people know what is and is not an appropriate use of data. And so I think that we can all agree here that we would like to see, you know, substantive boundaries at a societal level on how data is used. Because at the end of the day, I do think that, instead of playing consent tricks on folks, it's more that we all really know what's right and wrong for the most part.
George Matthew – CMO, DXC Technology: 19:49
I'll have to blitz it, and I'm going to hand over for that one point you brought up as well. For the first point, about having a national architecture: as you know, there are laws being put out now that are supposed to, I don't want to say contradict, but definitely override some of the state laws as well. Whether they'll be stronger or weaker, we don't know. But again, GDPR comes to mind. And now at my company, since we work in the EU as well as in the States, it is like having another compliance arm. It is very cost-prohibitive to maintain that. I'm not saying it can't be done, but it is going to prevent a lot of business from moving forward. And then the second thing, in terms of, I think it was, people knowing what's right and what's not: I think it's a fair point, but I think it's also fair to say that when you ask people what they want, you can basically offer a $5 gift card and they sometimes jump at that because of the immediacy of the reward. So it's hard to tease it out and say that one is necessarily right or wrong. I do think, though, you're right that clearer choices would be better. You wanted to make a point as well.
Sophie Baum – Associate, Hogan Lovells: 20:59
I think you mostly covered it. Just in terms of the floor of a federal law: there are ways for federal laws to also be ceilings, and so it's really up to the legislature to preempt other state laws. They can make a value judgment and say, this is what we think is good for the entire country. And so it doesn't necessarily need to be a floor.
Mike Wasserman – Executive Director, Boston Debate League: 21:19
Great. So now we have our closing statements. Do you want any prep time? Okay. So four minutes for the final statement of the affirmative team.
Amanda Stanhaus – Identity Solutions, Consensys: 21:34
So, just to reiterate Robert's points: you know, it is a different world now that we live in, and HIPAA isn't appropriate, given its distinction between non-covered and covered entities, its broad use of de-identified data, as well as its reliance on consent. And so, you know, there's more data than ever, again, coming from what we understand to be our traditional healthcare system that HIPAA addresses, but then you also have your wearables, your social media data, your genetic data, which is an interest of mine, as well as more context than ever. So not only is it a distinction between covered entities and not, but once you're in a, you know, hospital system that's technically a covered entity, and you've signed your very broad-based consent form, where's that data going? And there are a lot of revenue streams coming from the selling of, you know, de-identified data that's initially obtained in a traditional health context.
Amanda Stanhaus – Identity Solutions, Consensys: 22:46
As well as, then again, inappropriately relying on informed consent, getting to these cognitive, bounded-rationality issues as well as structural issues regarding aggregation. And so, a path forward, to focus on consent: you know, there's a false promise and false confidence that comes with consent. One of the behavioral economists who studies data privacy, Alessandro Acquisti, has found that a perceived increase in control over consent actually leads to an increase in disclosure. So it's, you know, a false confidence, and there are limits to transparency as well; the main researcher in this area is Kate Crawford. Even if you know something and know how your data is being used, it's not necessarily actionable; you can't take it back, etc. And so, thinking about paths forward, especially regarding consent: there are some solutions from a more behavioral economics, libertarian perspective of just increasing informed consent, increasing the reliance on an individual throughout the life cycle of data.
Amanda Stanhaus – Identity Solutions, Consensys: 23:55
So instead of just having it at the beginning of the life cycle, it would run throughout the life cycle. I personally think that that will be pretty annoying, and we'll still be plagued by the same issues that Solove cites regarding privacy self-management. But then you also have, on the other side, substantive boundaries, which I believe we've discussed, but that's also reliant on regulation, whether set at the state or federal level. There's a lawyer over there, so I should make a distinction. And so I'm personally interested in what I'm thinking about as a consent request manager, a hybrid of these two solutions, where you would bring consent throughout the life cycle of data, but you would not necessarily handle those requests yourself. There would be some sort of software solution where you would set your intentions more broadly for how you want your data to be used.
Amanda Stanhaus – Identity Solutions, Consensys: 24:54
And those requests would basically be dealt with by this manager according to your intentions, whether based off of, you know, the data sender, the recipient, whether the data is de-identified or not, etc. And I think that the defaults of this sort of system would be able to act as the implicit substantive boundaries that we would like to see, as well as letting you set, you know, preferences within the larger social norms to address unique preferences, but also facilitating a new world where we can, you know, consent to all the different ways that our data is being used.
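The consent request manager Stanhaus describes could be sketched roughly as follows. This is a minimal illustration, not an actual product design: the policy fields (allowed purposes, blocked recipients, whether identified data may be shared) and the request shape are all hypothetical assumptions.

```python
# Sketch of a "consent request manager": the user states intentions once,
# and software evaluates each data-use request against them.
# Policy fields, request shape, and names below are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Intention:
    allowed_purposes: set            # e.g. {"treatment", "research"}
    allow_identified: bool           # may recipients receive identified data?
    blocked_recipients: set = field(default_factory=set)

@dataclass
class Request:
    recipient: str
    purpose: str
    deidentified: bool

def decide(intention: Intention, request: Request) -> bool:
    """Approve a request only if it fits the user's stated intentions."""
    if request.recipient in intention.blocked_recipients:
        return False                 # recipient refused outright
    if request.purpose not in intention.allowed_purposes:
        return False                 # purpose not consented to
    if not request.deidentified and not intention.allow_identified:
        return False                 # identified data not permitted
    return True

my_intentions = Intention(
    allowed_purposes={"treatment", "research"},
    allow_identified=False,
    blocked_recipients={"ad_network"},
)

print(decide(my_intentions, Request("hospital", "treatment", deidentified=True)))   # True
print(decide(my_intentions, Request("ad_network", "research", deidentified=True)))  # False
```

The defaults baked into such a policy are what would act as the "implicit substantive boundaries" mentioned above: requests are handled continuously throughout the data life cycle without the user adjudicating each one.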
Sophie Baum – Associate, Hogan Lovells: 25:39
The first thing I'll say is that that was really, really good. The second thing I'll say, just in closing, is that I've been listening to what all the panelists are saying, and there are four key things that I think we've really been talking about that I want to highlight. And I think, at the end of the day, they really go to maintaining the HIPAA paradigm that we have right now. The first thing is that HIPAA has worked for a really long time, since 1996. It's protected healthcare information across a variety of entities. There are privacy and security standards that are designed to update and evolve with changing technologies and designed to keep companies kind of on the straight and narrow, and there are enforcement mechanisms as well.
Sophie Baum – Associate, Hogan Lovells: 26:31
There are ways to back up those regulations in case folks aren't complying. The second thing I'll say, about notice and choice, is that it's also there: people have the ability to consent to the use of their data under several legal regimes. They also have the ability to withdraw that consent. There are things like just-in-time notifications, you know, that pop up on your phone. For example, when you have an app open and it says, "This app is using your geolocation data. Is that okay with you?" Kind of constant reminders for folks. And a lot of those we see in the medical space as well. And so having this constant dialogue and this constant communication for people that are interacting with entities that may or may not be covered under HIPAA is also something that's very much in place.
Sophie Baum – Associate, Hogan Lovells: 27:19
The third thing I'll cover is just the burden of more regulation. It is hard to comply. I'm sure many of you come from organizations that have had to implement solutions to deal with GDPR. We have the California Consumer Privacy Act coming into effect very shortly, and I know a lot of companies have been dealing with compliance in that space. And so once you start layering these regulations, whether they be state or federal, the compliance burden becomes very difficult, and it's hard for folks to innovate when they have all of these regulations kind of weighing them down in some sense. And then finally, I guess I would just encourage you all to think about the benefits of all of this data. If all of these things were regulated, we might not be able to do a lot of the very cool things that we heard about, you know, on panels and in talks this morning: things like enrolling folks in clinical trials, getting lots of data to innovate, having more home care, that kind of thing. And so, you know, assuming, and of course expecting, ethical use of data is very important, but you don't want to impede the learnings and the takeaways and all the great work that all of you guys do by burdening it even more.
Mike Wasserman – Executive Director, Boston Debate League: 28:36
Wonderful. All right. Give a round of applause to our debaters here. So, the big moment of truth. I'm going to ask folks to raise their hand for whichever side really pulled them, persuaded them with their arguments. So if you feel like the affirmative team has really swayed you and you're moving in that direction, raise your hands now. Okay. And now the negative team, if you feel persuaded or moved? In a vote of four to six, the negative team wins. Congratulations. Well, thank you all.