Saturday, September 12, 2015

Privacy, Security & Consent - an unholy trinity we need to address

So, you have credit cards, maybe a mortgage or auto loan, health insurance, auto insurance, and more.  You are probably using Facebook or some other social network.  You also have a desktop, a tablet, and/or a smartphone, and maybe other devices like a Fitbit.  Do you know who knows what about you?  Does the avalanche of privacy statements from all of the institutions that have your data make sense?  Does it make you feel that your privacy is being protected?  If you actually feel comfortable in the current situation, this blog is not for you.

Last fall I attended a conference that was focused on different aspects of privacy and security.  One of the speakers claimed that a presentation by IBM a few years earlier stated that every year approximately 2.5 quintillion bytes of data are created.  That is a 2.5 followed by 18 zeros, and the number is growing very quickly.  I am still looking for the actual source of those numbers, but I'm sure they are a good approximation.  These numbers are way too big for a human to grasp, but nonetheless, we are confronted by them in our daily lives.  This is because we have a global network, the internet, that links all of this data together in myriad ways and makes enormous amounts of it available to us and to the institutions of which we are members (voluntarily or not).

Managing this data is hard to conceptualize.  Ensuring that the data each of us considers private remains private -- that is, accessible only to people or institutions we approve -- seems to be impossible.  Yet that is what is needed if the public is ever to be comfortable with sharing data.  And healthcare and human services will remain stuck in the incredible inefficiencies of a 20th-century data infrastructure, facing a disbelieving public, if this doesn't change.

Currently, the feds (primarily SAMHSA, the Substance Abuse and Mental Health Services Administration) have proposed giving the public a way to control a small subset of healthcare data -- a method that, to put it politely, has not caught on.  The scheme (designed primarily to protect substance abuse and mental health data, though there is a desire to expand it to all healthcare data) allows the patient to determine which data he or she will allow a physician to share.  This is called segmentation.  It is a very bad idea for many reasons:

  1. So far, this only applies to particular forms of the electronic medical record, namely the C32 and CCDA.  Not all commercial medical record vendors support either standard, and those that do generally support only a subset of it.
  2. The majority of clinical data sharing takes place via messages that conform to the HL7 or Direct standards; neither supports segmentation, nor does it look like either ever will.
  3. "Clinical Swiss Cheese" (thanks to Mark Chudzinski for coining this term).  That is, with segmentation, a physician may never know whether all of the relevant data they need to serve a patient is available to them.  This is a strong disincentive for physicians to participate in data sharing, since inconsistent access to data raises large liability concerns, let alone the concern of helping to heal a patient.
  4. Hidden conditions may be deduced from other data that has been shared.  For example (this has been cooked up to keep it simple), let's say you have a heart condition that you don't want anyone, including say an orthopedic surgeon, to know about.  But the surgeon needs to know the meds you are on and sees you are taking aspirin once a day.  That alone would be enough to make it clear you have a heart condition.
  5. Patients may not feel comfortable deciding what data to share, and even if they do, they may not make wise choices.
  6. Physicians already operate under the Health Insurance Portability and Accountability Act (HIPAA), which makes it illegal for a physician to share data outside the needs of treatment, payment, or operations (TPO).  Allowing a patient to decide what information a physician can or can't see would appear to tell the public that you can't trust your physician with your data.  I fail to see how this can help improve our healthcare system.
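The inference problem in point 4 can be made concrete with a small sketch.  Everything below (the record, the rule, the function names) is invented for illustration; real clinical inference is far richer, which only makes the leak worse:

```python
# Illustrative sketch (invented data and rules): even when a condition is
# segmented out of a shared record, the remaining data can reveal it.

# The patient withholds the cardiac problem entry from the shared record,
# but the medication list is still shared with the orthopedic surgeon.
shared_record = {
    "problems": ["osteoarthritis"],          # heart condition segmented out
    "medications": ["aspirin 81 mg daily"],  # med list shared
}

def infer_hidden_conditions(record):
    """Apply a trivial rule: daily low-dose aspirin strongly suggests
    cardiovascular prophylaxis, i.e., a heart condition."""
    inferred = []
    for med in record.get("medications", []):
        if "aspirin" in med and "daily" in med:
            inferred.append("likely cardiovascular condition")
    return inferred

print(infer_hidden_conditions(shared_record))
```

A single hand-written rule recovers the "hidden" diagnosis from the data the patient chose to share, which is exactly why segmentation gives a false sense of control.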
So, if segmentation is a bad idea, what is a good idea?  How do we allow our data to be shared for our benefit without losing our privacy?  Great questions, see the next post for an answer.

Introducing Insightamation

What is the relationship between data ("big data" is a stupid marketing term imho), business, politics, and economics?  This blog will plunge into this topic in an attempt to create discussion, wake people up to the reality of the continuing data tsunami in which we live, and advocate for policies and services that serve humans no matter their ethnicity, race, gender, class, history, or educational level.

My background is apropos for this discussion.  I am a mathematician by training (still working on my thesis topic in Algebraic Topology after more than 30 years), have been a computerphreak since 1963, and was the Chief Information Officer (CIO) of the State of Illinois Medicaid agency, Healthcare and Family Services, from 2005 until 2011.  I also served in a similar capacity from 2011 until June of this year as the CIO of the Illinois Health Information Exchange (actually in two different related agencies, but those details are unimportant).

Some topics will be technical and I will do the best I can to educate readers in the background they will need to understand the issue.

I have chosen the title Insightamation for the blog; it is also the name I consult under.  The name reflects the need for both human insight and automation to guide our path forward in this more and more data-centric world.  Insight is inherently human from my point of view.  It represents our ability to reflect on our experiences and apply that reflection to any topic, whether already experienced or not.  It is an area that still defies automation, though maybe not forever (look for a future post on the topic of automation and participation in human culture).  I believe that automation needs to be guided by the needs of people, not by large corporations, and not by brilliant science fiction fantasies of "singularities" or other such trip outs.  There is a beautiful book called Computer Power and Human Reason by Joseph Weizenbaum (one of the early pioneers of automation at MIT) that lays out many of these issues very clearly and is still relevant today, even 40 years after it was published.  Many of my ideas have been formed in response to this book.

One more thing.  I intend to create a national (or hopefully global at some point) infrastructure for sharing data intended to be used in services that people need.  Services such as health care, social services, educational services, legal, financial, military, wellness and more.  All of the items related to the creation, somehow, of this infrastructure will eventually make their way to this blog.

Thanks for reading this introduction.  I look forward to feedback.  I have no problem with criticism, but if you think that insults and wild screeds are good ways to communicate, find somewhere else to post.  I will delete posts that add nothing to the discussion including disrespectful posts.