9. Openness and privacy are not in conflict when the right kinds of identity, consent, and data control mechanisms are in place.
Openness and privacy are not well-defined terms, so let's start with an attempt at a working definition. Privacy is mostly about control: a person has rights to control their personal information, with some restrictions that depend on the context in which the information is collected and used. Privacy is a direct benefit to oneself. Openness, on the other hand, is mostly about sharing for the benefit of others and typically provides only an indirect benefit to the person herself. The benefit of openness is similar to charity and, as with charity, it helps to have openness acknowledged and occasionally rewarded. When I give money to my alma mater, I receive a thank-you card, maybe a listing somewhere, and occasionally a nice news magazine. Openness and privacy need not be in conflict as long as the benefits of openness outweigh the risks.
The risks associated with openness can be understood and mitigated through a combination of social and technical means. Effective risk mitigation makes everyone better off by making more information useful. Typical benefits include teaching, research, quality measures, and reputation mechanisms that protect everyone from bad actors. The social means of mitigating the risk of openness include trust and reputation in the system as a whole. Repressive regimes and dangerous circumstances, for example, will drive people to limit openness. The technical means of mitigating the risk of openness include identity, consent, and data control. Each one of these is a complex discipline in its own right and each is strongly influenced by the decreasing cost and rapid expansion of information technology.
Identity mitigates the risks of openness when de-identification makes it unlikely that personal information can be linked to a particular individual. Unfortunately, de-identification is becoming increasingly difficult. As computers and networks get faster and cheaper, the amount of information associated with any given encounter a person has with a health care service is growing exponentially. Although the information from any one source may seem to lack a link to a specific person, the ease of searching for and cross-matching multiple sources allows computers to effectively re-identify the individual.
Identity, when misused, reduces the effectiveness of consent and of data control as risk control mechanisms. The result damages trust and the social drivers of openness. A common example is the use of de-identified data for marketing or research without patient consent. A high-profile legal case at a major New York hospital resulted from making a reality TV show out of one de-identified patient's emergency department experience.
Properly used, identity can promote both openness and privacy by introducing transparency that builds trust and drives accountability. The key is to provide notice to the patient when the data is used. HIPAA calls this Accounting for Disclosures, and it could be as easy as the email you get when you buy a $0.99 song on iTunes. A digital, real-time accounting for disclosures would provide a powerful deterrent to snooping and other privacy invasions that today are discovered only through whistleblowers. It would also add security, as it already does with bank accounts, and it would reduce medical fraud, because perpetrators would presume that many eyes are watching. Accounting for Disclosures could be applied regardless of whether consent was required to share data under HIPAA. This would help build trust and drive openness in the overall system. Best of all, Accounting for Disclosures is already the law; all it needs is the willingness to implement it in a patient-centered way.
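The mechanics of a real-time accounting for disclosures can be sketched in a few lines. This is a hypothetical illustration, not an existing system or API: every name here (`Disclosure`, `DisclosureLog`, `record`) is made up for the example. Each disclosure event is appended to a log the patient can review, and generates an immediate plain-language receipt, much like the iTunes purchase email.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Hypothetical sketch of a patient-visible disclosure log in the spirit
# of HIPAA's Accounting for Disclosures. All class and function names
# are illustrative, not an existing standard or API.

@dataclass
class Disclosure:
    recipient: str      # who received the data
    purpose: str        # why it was shared
    data_summary: str   # what was shared, in plain language
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

@dataclass
class DisclosureLog:
    patient_id: str
    entries: List[Disclosure] = field(default_factory=list)

    def record(self, disclosure: Disclosure) -> str:
        """Append the event and return a real-time receipt for the patient."""
        self.entries.append(disclosure)
        return (f"Notice to patient {self.patient_id}: your record "
                f"({disclosure.data_summary}) was sent to "
                f"{disclosure.recipient} for {disclosure.purpose}.")

log = DisclosureLog(patient_id="p-001")
receipt = log.record(Disclosure("State Registry", "public health reporting",
                                "immunization history"))
print(receipt)
```

The design point is that the notice is generated at the moment of disclosure, not compiled on request months later, which is what makes it a deterrent to snooping.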
Prior consent as implemented today is very difficult to understand because it's abstract and because it's requested, or demanded, at a time when patients are most vulnerable. An informed and dignified consent practice would ask for consent in the context of a specific data use request, at a time when the patient can see what data is about to be shared and why. This, again, is common practice today outside of health care. When software crashes on your computer, the system puts up a consent dialog asking to notify the developers, states whether the data is anonymized, and offers to display the data to be sent. If we can do this for free consumer apps, why can't we do it for health data worth hundreds of dollars per transaction?
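Consent at the time of use, modeled on the crash-report dialog, can also be sketched briefly. This is an illustrative assumption, not a real standard: the names (`DataUseRequest`, `share_if_consented`) are invented for the example. The request states who is asking, why, and whether the data is de-identified, lets the patient inspect the exact payload, and releases nothing without approval.

```python
from dataclasses import dataclass

# Hypothetical sketch of consent requested at the time of a concrete
# data use, like a crash-report dialog. All names are illustrative.

@dataclass
class DataUseRequest:
    requester: str
    purpose: str
    de_identified: bool
    payload: dict  # the exact data that would be shared

    def describe(self) -> str:
        """Plain-language summary shown to the patient before deciding."""
        anon = "de-identified" if self.de_identified else "identifiable"
        return (f"{self.requester} requests your {anon} data for "
                f"{self.purpose}. Fields: {', '.join(self.payload)}.")

def share_if_consented(request: DataUseRequest, approved: bool):
    """Release the payload only if the patient approved this request."""
    return request.payload if approved else None

req = DataUseRequest("Research Network", "a diabetes outcomes study",
                     de_identified=True,
                     payload={"a1c": 6.8, "age_band": "40-49"})
print(req.describe())
shared = share_if_consented(req, approved=True)
```

The contrast with prior consent is that the decision is concrete: the patient sees the actual fields and the actual recipient, not an abstract blanket authorization signed at intake.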
Taking data that patients can't see and can't use is hardly a path to trust and voluntary openness. Today, most flows of clinical data are invisible to the patient and carried out without consent. HIPAA was amended in 2002 to remove the patient's right to control their own data and, in this way, removed the essence of privacy and, some would say, human dignity. As personal data becomes ever more detailed and easier to store forever, the risks of just taking data grow immensely. Two examples of such detailed personal data are the genome, increasingly likely to be collected as part of clinical care, and detailed pediatric records, which often include mental health notes and youthful transgressions. School records, by contrast, are typically put under the control of the subject and then destroyed a few years after graduation. Why would we not ask for the same degree of control over our health records?
Data control also means eliminating the hundreds of hidden data brokers that collect and sell our medical data without our knowledge or control. Hidden data brokers are an essential component of the secret contracts that keep US health care pricing high and quality measures a mystery. In finance, the data brokers, a.k.a. credit bureaus, are regulated, forced to provide free credit reports, and designed to help lower the cost of credit for the consumer. What service are hidden health data brokers providing to the consumer?
Data control also allows patients to steer their clinical data to the causes and uses they support. People should have a choice to open their data for public research instead of having it go into privatized data silos. Without control, much of the detailed laboratory and outcomes data collected as part of modern cancer treatment ends up behind for-profit firewalls, to be sold to the next patient. This drives up everyone's costs and makes developing the next generation of medicine a secretive and non-collaborative process.
Data control by the patient is unfortunately not a priority of HIPAA, and general data privacy rights tend to exclude health data because that's supposed to be covered by HIPAA. Catch-22. The only recourse we have is to opt out of all health information exchanges and research consents that don't voluntarily follow the transparency, consent, and data control principles across both clinical and research uses. For now, exercising our right to opt out whenever we can might drive the system toward greater openness and privacy.
This article has outlined the essential elements of data openness and privacy. The minimum is transparency, achieved through non-coercive, visible identity management and real-time accounting for disclosures. Voluntary labeling of health services that provide transparency, consent at the time of data use, and control over the research uses of data will allow patients to decide where to seek care and reward open and publicly accessible data practices. One current effort to define and standardize this kind of technology is HEAlth Relationship Trust, or HEART.
Find out more and consider joining our work here.