How to Navigate Mental Health Apps That May Share Your Data | News & Commentary

If you’ve ever assumed that information shared on a mental health app was confidential, you’re in good company: many people assume that sensitive medical information is always protected. But this is not true, and it is important to understand why.

Many of us are familiar with some kind of digital health application or are active users. Whether it’s nutrition, fitness, sleep tracking, or mindfulness, the arena for apps that can help us track aspects of our health has never been bigger. Similarly, platforms that help us reach healthcare providers and receive virtual care have become more available, and often necessary, during the pandemic. Online therapy in particular has grown over the years and has become a critical resource for many people during quarantine and remote living.

Making health resources and care more accessible is vitally important, and the appeal of reaching those resources right from your phone is evident.

But among the many severe effects of the overturning of Roe v. Wade are a number of digital privacy concerns. Lately there has been a significant focus on period-tracking and fertility apps as well as location information, and rightly so. On July 8, the House Oversight Committee sent letters to data brokers and healthcare companies “requesting information and documents regarding the collection and sale of personal reproductive health information.”

Less discussed was the huge gap in legal protections for all types of medical information shared across digital platforms, all of which should be subject to regulation and better oversight.

The U.S. Department of Health and Human Services (HHS) recently released updated guidance on cell phones, health information, and HIPAA, confirming that the HIPAA Privacy Rule does not apply to most health apps because they are not “covered entities” under the law. The Health Insurance Portability and Accountability Act (HIPAA) is a federal law that creates a privacy rule for our “medical records” and “other individually identifiable health information” during the flow of certain health care transactions. Most apps chosen individually by the user are not covered. Only platforms used by or developed for traditional healthcare providers are (e.g., a clinic’s digital patient portal where it sends you messages or test results).

Mental health apps are a telling example. Like other digital health apps, they are generally not bound by the privacy laws that apply to traditional healthcare providers. This is particularly worrying because people often turn to mental health platforms specifically to discuss difficult or traumatic experiences with sensitive implications. HIPAA and state laws on the subject would need to be amended to specifically include digital app-based platforms as covered entities. In California, for example, a bill is currently pending that would bring mental health apps within the purview of the state’s Confidentiality of Medical Information Act.

It’s important to note that even HIPAA has exceptions for law enforcement, so including these apps within the scope of HIPAA would still not prevent government requests for this data. It would be more useful to regulate information shared with data brokers and companies like Facebook and Google.

An example of information that is shared is what is collected during the “intake questionnaire” that popular services like Talkspace and BetterHelp require you to fill out in order to be matched with a provider. The questions cover highly sensitive information: gender identity, age, sexual orientation, psychological history (including details such as when or whether you have had thoughts of suicide, panic attacks, or phobias), sleeping habits, medications, current symptoms, and so on. Jezebel found that BetterHelp shared these intake responses with an analytics company, along with the user’s approximate location and device information.

Another type is “metadata” (i.e., data about the data) concerning your use of the app, and Consumer Reports has noted that this may include the mere fact that you are a user of a mental health app. Other information shared may include how long you spend in the app, how long your sessions with your therapist last, how long you message in the app, when you log in, when you message or talk to your therapist, your approximate location, how often you open the app, and so on. Recipients of this information from Talkspace and BetterHelp have been identified as including data brokers, Facebook, and Google. Apps regularly justify sharing information about users on the grounds that the data is “anonymized,” but anonymized data can easily be linked back to you when combined with other information.

Beyond how these apps collect and share data, how they store it is incredibly opaque. Some of these apps don’t have clear policies on how long they keep your data, and there’s no rule obliging them to. HIPAA doesn’t create record-retention requirements; those are governed by state laws, which likely don’t cover health apps. For example, New York State requires licensed psychiatrists to keep records for at least six years, but the app itself is not a practitioner or licensee. If you request deletion of your account or data, not everything may be removed, and there is no way of knowing what remains. It’s also unclear how long the sensitive information these apps collect and store about you might remain available to law enforcement at a later date.

Accordingly, when navigating health apps that may share your data, keep the following in mind:

The access to care that these types of apps have created is crucial, and everyone should seek the care they need, including through these platforms if they are the best option (and they are for many people). It is important that you be as informed as possible when using them and take the steps available to you to maximize your privacy.
