
HIPAA Versus 2021: Patients’ Rights in Track and Trace Applications

In the age of COVID-19, technology developers and healthcare providers are working together to find solutions. This kind of partnership is not new, however. In the years leading up to 2020, the healthcare industry was already embracing technology as a means to increase the quality, efficiency, and availability of healthcare for better patient outcomes. So it is not surprising that public health experts are now looking to technologies and applications to help solve COVID-related issues, collecting and leveraging diagnosis data to inform individual and community strategies. This could be an important step in addressing the ongoing pandemic, but it also raises longer-term questions about patient rights and the adequacy of the language in the Health Insurance Portability and Accountability Act (HIPAA) to address new scenarios created by rapidly developing technology.

HIPAA vs. 2021 Issues

Created in 1996 to address patient privacy concerns, HIPAA holds healthcare organizations accountable for how they collect, store, process, share, and, most importantly, protect protected health information (PHI). Written in a very different technology and threat landscape, however, HIPAA pre-dates the current healthcare evolution into digital health by 24 years, and amendments such as HITECH and the Omnibus Rule are still insufficient for dealing with today’s continuously evolving threats and technology. Nor does the act consider the needs of a national health crisis like the one we are facing, including the kinds of organizations that might be leveraging patient data, how they are using it, and the potential impact on individual patients if that data is compromised.

Track and Trace Apps

Two of the largest data collection organizations in the world, Google and Apple, have worked together over the last year to develop a track and trace platform that promises to help combat the pandemic at a personal and regional level. Track and trace applications are designed to identify parties with whom a COVID-19-infected person had contact, drawing on information about the location of a person’s mobile phone and its proximity to other devices. At face value, this may be a worthy endeavor, helping people reduce the risk of exposing others by encouraging testing and self-quarantine during asymptomatic stages of infection, when carriers are more likely to be spreading the virus. Mandatory track and trace apps are currently in use in China, India, and Turkey, and use of these apps is voluntary in Korea, Japan, and many European countries. In the absence of a nationalized response, the US has not required individuals to download and use a track and trace application. Some states have begun adopting these applications, however, and that raises its own security concerns.

The Issue of Consent

It should also be noted that reliance on “consent” is illusory in this case: even though downloading and using the application is “voluntary” and not government-mandated, its use can become effectively required through economic and social barriers. This would be another form of “justifying monitoring through the fiction of consent”, like social media terms-of-use agreements or the “notice and choice” policies frequently used to justify monitoring of employee email.

To bring this home, applications built on Google and Apple’s privacy-optimized Exposure Notification framework have begun arriving in the US. They work by sharing “anonymous” Bluetooth beacons with nearby devices running the same software, tagging those that suggest the kind of extended, close contact associated with coronavirus spread, and saving the last 14 days of these records. In participating states, a positive COVID-19 test comes with a code the patient enters into the app to upload its close-contact records to a health-authority server, which then makes this anonymized data available to all of these apps at their daily check-ins. If the app sees one of these reports match its saved list of close contacts, it warns of possible exposure and advises testing and quarantine.
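To make that flow concrete, here is a minimal sketch of the on-device matching step. It is not Google and Apple’s actual implementation: the real framework derives rotating proximity identifiers from daily keys with AES-based key derivation, a hash function stands in for that here, and all names are illustrative.

```python
# Simplified sketch of the decentralized matching step described above.
# NOT the real Exposure Notification code: the actual framework derives
# rotating proximity identifiers from daily keys with AES/HKDF; a hash
# stands in for that derivation here.
import hashlib

def rolling_ids_for_key(daily_key: bytes, intervals: int = 144) -> set:
    """Recreate the rotating identifiers a phone would have broadcast that
    day (roughly one per 10-minute interval)."""
    return {
        hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
        for i in range(intervals)
    }

def check_exposure(published_daily_keys, locally_observed_ids) -> bool:
    """Compare keys published by the health authority at the daily check-in
    against the identifiers this phone overheard in the last 14 days.
    Matching happens on the device, so the server never learns who was
    near whom."""
    return any(
        rolling_ids_for_key(key) & locally_observed_ids
        for key in published_daily_keys
    )
```

The design point this illustrates is that the health-authority server only ever publishes keys from confirmed-positive users; deciding whether an exposure occurred stays on the phone.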

Security and Privacy Concerns

Google and Apple assure users that their private data is secure, chiefly because the app is not allowed to use GPS. That assurance obfuscates the issue, however, because it isn’t the data within the application that is in question. It is the fact that the app sits on a mobile phone, which is by definition a portable tracking device, and relies on Bluetooth technology, which can easily be used to pinpoint a user’s location. Bluetooth is inherently insecure and lacks the controls necessary to actually guarantee the security, confidentiality, and privacy of health data relating to COVID exposure, diagnosis, and even possible vaccination. Additionally, Bluetooth LE and DP-3T are incredibly noisy, broadcasting their existence to any and all Bluetooth-enabled devices nearby, regardless of what those devices are, as the sketch below illustrates. (DP-3T is a decentralized, Bluetooth-based tracing protocol in which an individual phone’s contact logs are stored only locally, so no central authority can know who has been exposed.)
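As a rough illustration of how observable these broadcasts are, the sketch below passively listens for exposure-notification advertisements using the third-party bleak library. The 0xFD6F service UUID and the callback details are assumptions about the broadcast format, not anything taken from the apps themselves.

```python
# Rough illustration (assumptions noted above): any nearby device can
# passively log exposure-notification beacons, along with a timestamp and
# signal strength, using an ordinary BLE scanner.
import asyncio
from bleak import BleakScanner  # third-party: pip install bleak

# 16-bit UUID 0xFD6F, expanded to 128 bits, assumed here to mark
# Exposure Notification broadcasts.
EN_SERVICE_UUID = "0000fd6f-0000-1000-8000-00805f9b34fb"

def on_advertisement(device, adv_data):
    payload = adv_data.service_data.get(EN_SERVICE_UUID)
    if payload is not None:
        # Rotating identifier plus RSSI: enough for a passive listener to
        # build a log of who was broadcasting, where, and when.
        print(f"{device.address}  rssi={adv_data.rssi}  payload={payload.hex()}")

async def main():
    async with BleakScanner(on_advertisement):
        await asyncio.sleep(30)  # listen passively for 30 seconds

asyncio.run(main())
```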

Bluetooth’s Inherent Insecurity

The vulnerability here has to do with the way Bluetooth-enabled devices pair with each other. In that relationship, one device serves as the central connection and the other plays a peripheral role. The peripheral device sends out a signal that contains a unique address (similar to an IP address) and data about the connection. Most devices produce a randomized address that automatically reconfigures periodically. That is meant to protect users’ privacy, but Boston University (BU) researchers found that, using an open-source “sniffer” algorithm, they could identify Bluetooth connections even when their addresses changed.
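The sketch below is a conceptual illustration of that linking idea, not the researchers’ actual tool: if an identifying payload token rotates out of sync with the randomized address, a passive sniffer can chain sightings together across address changes. All field names here are illustrative.

```python
# Conceptual illustration of linking sightings across address rotations
# (not the BU tool): chain sightings whose randomized address OR payload
# token stayed constant, so a device can be followed over time.
from dataclasses import dataclass

@dataclass
class Sighting:
    timestamp: float
    address: str        # randomized Bluetooth address, rotates periodically
    payload_token: str  # identifying bytes from the advertisement payload

def link_sightings(sightings):
    """Group sightings into chains that likely belong to one device."""
    chains = []
    for s in sorted(sightings, key=lambda x: x.timestamp):
        for chain in chains:
            last = chain[-1]
            if s.address == last.address or s.payload_token == last.payload_token:
                chain.append(s)
                break
        else:
            chains.append([s])  # no overlap with any existing chain
    return chains
```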

The reality is that true anonymity in the age of data collection is a misleading promise. An iPhone or Android phone is tied to a phone number and to an Apple ID or other email address, which can in turn be tied to a Facebook account, the track and trace app the user downloaded, and so on, and it takes as little as two consistent, corresponding data points to tie a user to their data, as the example below illustrates.
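As a toy illustration of how little it takes, this example joins a hypothetical “anonymized” exposure log against a hypothetical account dataset on just two shared fields; all of the data and field names are invented.

```python
# Toy re-identification example; all records and field names are invented.
app_records = [  # "anonymized" exposure log
    {"home_zip": "74103", "device_model": "Pixel 5", "exposure_date": "2021-01-12"},
]
account_records = [  # data a platform already holds about named users
    {"name": "J. Doe", "email": "jdoe@example.com",
     "home_zip": "74103", "device_model": "Pixel 5"},
]

def reidentify(app_rows, account_rows, keys=("home_zip", "device_model")):
    """Match rows that agree on every shared key; two consistent fields are
    often enough to tie an 'anonymous' record back to a named account."""
    return [
        (a, b)
        for a in app_rows
        for b in account_rows
        if all(a[k] == b[k] for k in keys)
    ]

print(reidentify(app_records, account_records))
```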

Lack of Patient Protections

The implication of this technology (and the data sources it taps) is that the existing wording of HIPAA, HITECH, and the Omnibus Rule is insufficient to address these issues from a patient privacy and human rights standpoint. If the data in use were being collected from healthcare providers, there would be a framework for protection, and all software vendors would be required to sign a Business Associate Agreement (BAA). However, since the patient data being collected originates with public health authorities (PHAs), which are not included alongside other healthcare organizations as “covered entities” as defined by HIPAA, neither the PHAs nor the applications are under any obligation to protect PHI under HIPAA, HITECH, or the Omnibus Rule.

Once the practice of collecting patient data from PHAs and using it in private software applications is established as standard, what happens when the pandemic no longer requires these apps? What is the exit plan? What is the deprecation date? What is the deletion plan for patient data? Will patients ever have the right to ask for their data back? What obligations will Apple and Google be under? Who is overseeing them, and what is the enforcement mechanism? Do we really think Google and Apple don’t plan to monetize this data in the future?

Google’s History With Health Data

Google has already completed its acquisition of the health-tracking device manufacturer Fitbit. Roughly 80% of our health information is digital, and Google has been actively collecting health records in an undertaking it calls Project Nightingale, in partnership with Ascension Health. During this partnership, in which patient names, dates of birth, and even lab results from 2,600 healthcare centers and hospitals were funneled to Google, neither patients nor doctors were notified. No consent was asked for or given, and that is a serious issue for anyone who is an advocate of patients’ rights to privacy. The problem here is not the collection of data; it is the collection of data without an individual’s consent to have his or her information released. The collection of patient data by private companies whose business model is built on monetizing personal data presents a serious moral dilemma. What do lawmakers believe will happen when citizens’ location, internet search, purchasing, exercise, and personal health data are connected in a for-profit company’s queryable data sets?

Inadequacy of Current US Laws

Through this lens, you can see how the current wording of these laws is wholly inadequate to address privacy and consumers’ rights in this particular subset of developing health technology. Then there is the next hurdle: how to adequately amend the legislation. Legislation is written by lawmakers and academics, who generally have legal and political backgrounds, not by technologists. Even after a law has been written and passed, it must be interpreted by courts, which are likewise filled with judges educated in the law, not technology. Experts have been saying for years that the US court system and our legal oversight agencies are notoriously out of step with the rate of technology development, making accountability through prosecution and/or civil litigation very difficult. The point is that without a strong technical understanding of the infrastructure on which these applications are built, writing laws that create proper accountability is problematic. This is not impossible to overcome, but it is an issue that demands thought and intention as HIPAA is brought up to speed for the realities of 2021.

Responsible Health Technology Development

For any and all technology and application developers doing the important work of helping us leverage everything we have to fight a deadly pandemic, this is something to keep top of mind. They should think about where the market and the law can and will ultimately go, and do themselves the favor of collecting, storing, using, and protecting PHI from any source, covered entity or not, in such a way that they won’t have to abandon their business models later. If done correctly, their technology will also maintain enough security controls and validations of both security and compliance along the way that a startup’s investors can rest easy at night, knowing their investments are secure.

Everyone wants the pandemic to end, and we all want to use every tool available to us in fighting COVID-19, especially technology. But privacy- and security-minded professionals also need to come alongside the developers and owners of those tools to support accountability for the collection and use of PHI.

If you would like to talk with someone about your organization’s potential risk, a HIPAA Risk Assessment, or how to better protect yourself against malware, you can request a consultation with one of our TRUE professionals.
