65. Individuals have a right to health data privacy. Rights to sharing must be established with the individual it originates from, or their legal agent, in advance of sharing.

Adrian Gropper, MD, CTO Patient Privacy Rights

This is part of a series of essays on the Health Rosetta’s Principles.

A 2011 report by the World Economic Forum states: “Personal data is becoming a new economic ‘asset class’, a valuable resource for the 21st century that will touch all aspects of society. This report finds that, to unlock the full potential of personal data, a balanced ecosystem with increased trust between individuals, government and the private sector is necessary.”

This is particularly true of health data, which underpins about $3 trillion of spending, roughly 20% of the US economy. Personal health data directly and indirectly determines the cost of healthcare and drives discrepancies and discrimination that average more than $10,000 per person per year in the US.

Recent National Institute of Standards and Technology (NIST) guidance on Privacy Risk Management lists the following “Catalog of Problems for Individuals”:


Loss of Self-Determination

Loss of autonomy: Loss of autonomy includes needless changes in behavior, including self-imposed restrictions on freedom of expression or assembly.
Exclusion: Exclusion is the lack of knowledge about or access to personal information. When individuals do not know what information an entity collects or can make use of, or they do not have the opportunity to participate in such decision-making, it diminishes accountability as to whether the information is appropriate for the entity to possess or whether the information will be used in a fair or equitable manner.
Loss of Liberty: Improper exposure to arrest or detainment. Even in democratic societies, incomplete or inaccurate information can lead to arrest, or improper exposure or use of information can contribute to instances of abuse of governmental power. More life-threatening situations can arise in non-democratic societies.
Physical Harm: Actual physical harm to a person.

Discrimination

Stigmatization: Personal information is linked to an actual identity in such a way as to create a stigma that can cause embarrassment, emotional distress or discrimination. For example, sensitive information such as health data or criminal records, or merely accessing certain services such as food stamps or unemployment benefits, may attach to individuals, creating inferences about them.
Power Imbalance: Acquisition of personal information that creates an inappropriate power imbalance, or takes unfair advantage of or abuses a power imbalance between acquirer and the individual. For example, collection of attributes or analysis of behavior or transactions about individuals can lead to various forms of discrimination or disparate impact, including differential pricing or redlining.

Loss of Trust

Loss of trust is the breach of implicit or explicit expectations or agreements about the handling of personal information. For example, the disclosure of personal or other sensitive data to an entity is accompanied by a number of expectations for how that data is used, secured, transmitted, shared, etc. Breaches can leave individuals reluctant to engage in further transactions.


Economic Loss

Economic loss can range from direct financial losses as the result of identity theft to the failure to receive fair value in a transaction involving personal information.”

Protection from these “problems” faced by all of us as individuals requires both the right and the means to control sharing of our health information. Unfortunately, our individual right becomes a service institution’s obligation, and this is the source of considerable tension with regard to electronic health records, data brokers, insurance agents, research databases, and pharmaceutical marketing. Each of these institutional systems benefits from sharing our personal data and therefore employs various strategies to profit from it. This, in effect, is an unintended transfer of our personal data from an asset of the individual to an asset of the institution.

The strategies for institutional data taking fall into three broad categories: prior consent, anonymization, and secrecy. Prior consent is the tactic of demanding rights to share data beyond the institution in advance of providing a service that may not require any data sharing. As compared to asking for authorization to share data for a specific purpose at the time when such sharing is required, prior consent seeks a release in advance. A common example of prior consent is for research uses. Although patients have shown great willingness to provide data for research, it is also clear that patients would want to know what the research is and whether the result will be publicly available or sold for profit.

Anonymization is the taking of personal data that has been de-identified for the benefit of the institution. De-identification can reduce some of the risks associated with the “problems” listed by NIST, but it does not eliminate them entirely. First, as an asset, personal health data could benefit the individual economically or emotionally, as in a charitable donation, if the individual had to be asked for access. More importantly, anonymization is used to eliminate accountability for how personal data is used. For example, the common practice of selling prescription data is hidden by anonymization, but the result is higher pharmaceutical costs for all of us.
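Why de-identification does not eliminate risk can be shown with a toy example: records stripped of names can often be re-linked through quasi-identifiers such as ZIP code, birth date, and sex when a second dataset carries names alongside the same fields. Everything below, the records, names, and field layout, is invented for illustration.

```python
# Toy illustration: "anonymized" health records re-identified by joining
# on quasi-identifiers (ZIP code, birth date, sex). All data is invented.

deidentified_records = [
    {"zip": "02138", "dob": "1950-07-21", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1962-03-02", "sex": "M", "diagnosis": "diabetes"},
]

# A public dataset (e.g. a voter roll) that carries names alongside
# the same quasi-identifiers.
public_records = [
    {"name": "Jane Roe", "zip": "02138", "dob": "1950-07-21", "sex": "F"},
    {"name": "John Doe", "zip": "02139", "dob": "1962-03-02", "sex": "M"},
]

def reidentify(deid, public):
    """Link de-identified rows to names via matching quasi-identifiers."""
    keys = ("zip", "dob", "sex")
    index = {tuple(p[k] for k in keys): p["name"] for p in public}
    return [
        {"name": index[tuple(r[k] for k in keys)], "diagnosis": r["diagnosis"]}
        for r in deid
        if tuple(r[k] for k in keys) in index
    ]

for match in reidentify(deidentified_records, public_records):
    print(match["name"], "->", match["diagnosis"])
```

When the quasi-identifiers are unique enough, as they often are in small ZIP codes, the join recovers exactly the identities that de-identification was supposed to hide.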

Widespread secrecy and lack of accountability are arguably the most expensive outcome of the taking of personal data. The vast majority of personal health data is shared, without any consent, under the HIPAA “treatment, payment, and operations” exclusion. This makes it difficult for patients to get timely second opinions, clear cost estimates, and to verify the accuracy of medical bills, because the institutions that are sharing our personal information can do so with impunity.

Modern computer and communications technology makes our personal data asset more valuable than ever. Ubiquitous connectivity, low-cost secure storage, and sophisticated communities of support make it possible for each patient, or their designated legal agent, to be notified when health data is shared, the same way we are notified when our photo is tagged, our tweet is favorited, or our bank makes an auto-payment. The need for prior consent and secrecy should now become the exception rather than the rule. The use of anonymization to protect privacy remains, but it is re-interpreted as part of a specific request to use data rather than an excuse to take it without notice.
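The notification model described above can be sketched as a simple disclosure ledger: every share of a record emits an immediate notice to the patient, analogous to a photo-tag alert. The class and method names below are illustrative assumptions, not any real health-IT API.

```python
# Minimal sketch of share-notification: every disclosure of health data
# is logged and the patient is notified at once. All names here are
# illustrative, not a real API.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ShareLedger:
    patient: str
    notices: list = field(default_factory=list)

    def record_share(self, sender: str, recipient: str, purpose: str) -> dict:
        """Log a disclosure and notify the patient immediately."""
        notice = {
            "when": datetime.now(timezone.utc).isoformat(),
            "from": sender,
            "to": recipient,
            "purpose": purpose,
        }
        self.notices.append(notice)
        # In practice this would be a push notification or email.
        print(f"Notice to {self.patient}: {sender} shared data "
              f"with {recipient} for {purpose}")
        return notice

ledger = ShareLedger(patient="pat@example.com")
ledger.record_share("General Hospital", "Acme Analytics", "billing review")
```

The point of the sketch is that the ledger makes sharing visible by default; secrecy would require an explicit exception rather than being the baseline.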

This does not mean that a hospital needs to ask us to sign in to some portal each time a piece of our data is to be shared. Technology such as User-Managed Access, or UMA, enables each of us to specify a trusted online authorization service of our choosing, independent of any particular hospital, clinic, or insurance company. Each institution can come to our “authorization server” before sharing data with another on our behalf. Not only does this introduce critical transparency and accountability into the most important aspects of our lives, but it dramatically reduces the risks of hacking and security breaches by making breaches detectable in minutes instead of months.
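As a rough sketch of this pattern (the class, policy structure, and defaults are simplified assumptions, not the actual UMA protocol, which uses OAuth-based permission tickets and tokens): each institution consults the patient's chosen authorization server before sharing, decisions default to deny, and every request, granted or not, leaves an auditable trail.

```python
# Simplified model of the UMA idea: institutions ask the patient's
# chosen authorization server before sharing, and every request is
# recorded for accountability. This sketches the pattern only; it is
# not the real UMA protocol.

class AuthorizationServer:
    def __init__(self, patient: str):
        self.patient = patient
        self.policies = {}   # (requester, purpose) -> allowed?
        self.audit_log = []  # every decision is recorded

    def set_policy(self, requester: str, purpose: str, allowed: bool):
        """The patient (or their legal agent) sets sharing policy once."""
        self.policies[(requester, purpose)] = allowed

    def authorize(self, requester: str, purpose: str) -> bool:
        """Decide a sharing request: default-deny, and always audit."""
        allowed = self.policies.get((requester, purpose), False)
        self.audit_log.append(
            {"requester": requester, "purpose": purpose, "allowed": allowed}
        )
        return allowed

# The patient configures one server, independent of any hospital.
auth = AuthorizationServer("pat@example.com")
auth.set_policy("University Research Lab", "public study", True)

# Institutions check before sharing.
print(auth.authorize("University Research Lab", "public study"))  # True
print(auth.authorize("Data Broker Inc", "marketing"))             # False
```

The audit log is what makes breaches detectable quickly: an unexpected entry is visible to the patient as soon as it appears, rather than months after a disclosure.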
