— Manny / Diabetes (@askmanny) March 14, 2015
This weekend, I had the fortune to speak at (and attend) the SXSW Interactive Festival for the first time. I wanted to take the opportunity to share here some of the perspectives I brought to the panel titled “Why HIPAA Won’t Save You: Protecting Data Privacy” with my colleagues Amanda Sheldon (Medtronic Diabetes), Jane Sarasohn-Kahn (THINK-Health and Health Populi blog) and Marc Monseau (Mint Collective LLC).
Recent data breaches have made the American public more aware of the issue of privacy and the potential for their health information to end up in unwanted hands.
What can be done to make this process easier on the patient?
- People with visual disabilities may struggle to read text below certain font sizes (I am one of them!), and legal documents are frequently referred to as “the fine print” for a reason. It would make sense to use a bigger font for the most sensitive parts of the document.
- People learn differently: some are visual learners, some are auditory learners, etc., and even for those who can become informed by reading alone, lengthy documents are ineffective. A one-page summary or an infographic with the key elements of the policy would be a tremendous advance on this front.
- People’s literacy level (health literacy, ability to understand legalese) can also get in the way. Even if the information is there and it’s accessible to the patient, they may not be able to understand what it means.
- Last, even if you understand what the information means, you may not be able to understand the mid-to-long term implications. This is the hardest item to address, because it involves implications that may not be apparent today.
As my friend, Health Policy Attorney, and Patient Advocate, Erin Gilmer told me:
HIPAA is meant to give patient rights and ensure patients can trust that the system will keep their PHI safe – only used for treatment and payment purposes. [But] it has not yet evolved to tackle the sharing of data via social media or apps or websites or forums.
Makers of apps, devices, etc. where health data is being collected and/or shared need to do the right thing for the patient. That will go a long way towards protecting their privacy:
- Almost every piece of data that is collected about someone using your product could have health implications: GPS data, mobile device usage, etc., so it needs to be treated with the same care as an A1c, weight, or cholesterol value.
- Only data that is needed to do the job should be collected. Just because you can ask for a piece of data doesn’t mean that you should, especially if you are not going to use it.
- If you need to collect a piece of data, default to collecting it in an anonymized or aggregated fashion, if that can give you the information you need.
- If you are going to use the piece of data that is personally identifiable information, remember the Mom Test that Jane talks about: would you trust the system with your own mother’s data?
- Don’t hide behind regulation or lack of guidance to justify the small print, the cryptic language, or the obscure links. KISS = Keep It Simple, Stupid!
- People will not necessarily resent having their data sold as part of their use of the product. But two things need to happen for this to work out: (1) they need to get a compelling benefit from the experience (and often it’s not a monetary incentive, but rather a sense of being part of something bigger, advancing science, etc.); (2) they need to CLEARLY understand that their data could be sold, and probably be told so again as it’s about to happen (always remember they are in the driver’s seat). A great example of how this has been done well is PatientsLikeMe.
- Life happens and health changes with it, so products we interact with should allow us to update our privacy preferences easily at any point. Reminders about what is being done with your data, or what parts of your data are being used in ways that could affect your privacy build an environment of trust that is essential for a healthy ongoing relationship with your customers. “When data is your primary currency, trust is fundamental for your business.”
- Every maker should provide users with a way to access their data easily in a truly portable format. A PDF is not a portable format (no matter what the acronym may stand for). Data siloed behind a company’s walls is not portable. Remember: the patient is in the driver’s seat and they OWN their data. Just because a company collects data doesn’t mean they own it. They are mere stewards of our data.
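To make the “aggregate by default” and “truly portable export” points above concrete, here is a minimal Python sketch. The field names (`user_id`, `zip`, `glucose_mgdl`) and the `export_user_data` helper are hypothetical, purely for illustration; a real product would need proper de-identification (coarser geography, k-anonymity checks, etc.), not just dropping an ID column.

```python
import json
from statistics import mean

# Hypothetical raw readings; field names are illustrative, not from any real product.
readings = [
    {"user_id": "u1", "zip": "78701", "glucose_mgdl": 110},
    {"user_id": "u2", "zip": "78701", "glucose_mgdl": 145},
    {"user_id": "u3", "zip": "78702", "glucose_mgdl": 98},
]

# Aggregate by area and drop identifiers: the product keeps only what it
# needs (an average per ZIP code), never the per-person values.
by_zip = {}
for r in readings:
    by_zip.setdefault(r["zip"], []).append(r["glucose_mgdl"])
aggregated = {z: round(mean(values), 1) for z, values in by_zip.items()}

# Portable export: plain JSON the patient can take anywhere, rather than
# a PDF or data siloed behind a company's walls.
def export_user_data(user_id):
    rows = [r for r in readings if r["user_id"] == user_id]
    return json.dumps(rows, indent=2)

print(aggregated)              # per-ZIP averages, no user identifiers
print(export_user_data("u1"))  # that one patient's own rows, as JSON
```

The design choice here mirrors the list above: the analytics side never touches identifiable rows once the aggregate exists, while the export side hands the patient their own data in a format any other tool can read.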
At the end of the day, it’s about placing the interests of the patient first and doing so with transparency and openness. While it is perfectly fine for companies to protect themselves, it cannot be done at the expense of the interests and the privacy of the consumer. This is something where everyone (patients, companies, and regulating entities) have a role to play.
Thank you to the following people, whose incredible insights helped make my participation (hopefully) more useful to those in attendance:
- Erin Gilmer – Health Policy Attorney, and Patient Advocate (@GilmerHealthLaw)
- Dana Lewis – Moderator for #hcsm, Patient Advocate (@danamlewis)
- Melissa Lee – Exec. Director for Diabetes Hands Foundation, Patient Advocate (@sweetlyvoiced)
- Brian Cohen – Lead Administrator for TuDiabetes.org, Patient Advocate
- Other generous members of TuDiabetes.org
It’s pretty outrageous to watch Facebook defend something which is obviously unethical. I’m talking about the company’s “Instant Personalization” program, which Facebook forces users into whether they like it or not. Despite the ongoing public criticism of this service and a number of other products, Facebook is standing strong, arguing that users “love” what Facebook is doing.
On Facebook’s “Privacy” page where you can control this (the one shown in the screenshot above), they introduce it as:
Facebook’s instant personalization pilot program helps you connect more easily with your friends on select partner sites.
Doesn’t sound that bad, huh? The problem with this is that you are OPTED IN automatically, which is intrinsically wrong. At least <start_sarcasm>we are fortunate that we can opt out!<end_sarcasm>
A few years ago, it was Microsoft… then it was Google… now it’s Facebook… the big gorilla in the room always seems to make decisions that they are willing to defend no matter what, in spite of not necessarily being in the best interest of anyone but themselves. 🙁