
Do Mental Health Apps Sell Your Client’s Sensitive Data to Data Brokers?



A February 2023 report by researcher Joanne Kim outlines the results of a two-month study of how data brokers sell sensitive mental health data collected from mHealth apps. The paper offers an eye-opening look at the privacy abuses that well-meaning practitioners can unwittingly perpetuate when they select apps and encourage their clients to use them without being appropriately informed. In the last two months alone, the industry has been rocked by reports of scrutiny faced by several of the largest online mental health companies for sharing sensitive patient data for profit. These companies include BetterHelp, Cerebral, GoodRx, Monument, and Workit Health. Dozens of other companies have been exposed by researchers who reported the possible sharing of sensitive patient data with large marketing companies, including Google, Facebook, Pinterest, TikTok, and many more, which fall outside HIPAA's reach because they are not covered entities.

Kim Study: Sharing of Sensitive Mental Health Information via mHealth Apps

The Kim study exposes another potentially critical gap in HIPAA and other laws, which allows mHealth app companies to sell sensitive client and patient information to data brokers, some of whom openly advertise that they are willing and able to sell Americans’ highly sensitive mental health information. The research paper is published on the Duke University website. It is summarized below to provide a brief outline of the issue and to alert practitioners who wish to prevent the potential exploitation of their clients and patients.

The Mental Health Policy Gap Exploited by Data Brokers

The COVID-related surge in mental health disorders and the limitations on accessing in-person therapy ignited a rapid shift toward telehealth and mHealth apps, whose downloads increased by 200% between 2019 and 2020.1 In response, many telehealth companies have expanded their platforms to serve consumers through mental health apps, reducing barriers to entry for people who lack the bandwidth or computer systems to interact fully with websites. This shift has made mHealth apps popular among people experiencing mental health disorders and among marginalized communities, such as the Latinx community. A 2019 study found that Latinx smartphone users were 20% more likely than Caucasian users to use a health app.2

While mHealth apps may have improved access to needed care, some have secretly collected and sold sensitive mental health information to other companies.3 The most alarming aspect of this situation is that most mHealth apps are not yet regulated by the Health Insurance Portability and Accountability Act (HIPAA), which applies only to certain covered entities. HIPAA, in some cases, may not legally obligate privately held companies to keep collected information private. As a consequence, wearables, social media platforms, and other technology companies, including mHealth apps, can often share, license, and sell their users’ health and other data to third parties without the users’ consent or knowledge.4

What Are Data Brokers and How Can They Hurt Your Clients?

Data brokers collect consumers’ personal information and resell or share it with other entities. Largely unregulated, data broker companies sell billions of data records for purposes ranging from profiling individuals for targeted marketing to assessing health costs and associated risks, to buyers that include law enforcement, health insurance providers, and, possibly, scammers.5 The industry’s lack of regulation exposes vulnerable people seeking help through health apps to privacy violations, power imbalances, embarrassment, racial profiling, and economic exploitation.

The Kim report on the activities of data brokers states:

Previous research by Duke University has identified data brokers advertising highly sensitive data on hundreds of millions of Americans, including their sensitive demographic information, political preferences and beliefs, and whereabouts and real-time GPS locations, as well as data on students, first responders, government employees, and current and former members of the U.S. military.

The actual number of these companies, and the details of what they do and how they do it, are largely unknown, despite investigative reporting and various types of research.6,7,8,9,10,11

The Kim study involved contacting data brokers to collect data on their sales inquiry processes and examining the privacy policies of 10 data brokers. Thirty-four data brokers were found through Google searches using terms such as “healthcare data providers,” “mental health data brokers,” “health information for sale,” “mental health data for sale,” and “data brokers who sell mental health data.”

Of the data collected, the variable cost of purchasing mental health patients’ data was among the most concerning. At one end of the continuum, a firm that Kim investigated charged “$275 per 1,000 individuals’ aggregated records, with a minimum order of 5,000 records,” with the cost decreasing as the number of records increased. Another firm’s “cost-per-record was $0.20 per record for a total of 10,000 aggregated records and a minimum expenditure of $2,000. Further, the firm’s cost-per-record also decreased as the volume of requested records increased. For example, for 435,780 records, the cost per record was $0.06.”
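Taken at face value, those quoted rates imply that the 10,000-record order at $0.20 per record works out to exactly the $2,000 minimum expenditure, and that a 435,780-record order at $0.06 per record would cost roughly $26,000 for aggregated mental health records on nearly half a million people.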

Kim’s report also states that the highest fees quoted by a data broker were “based on the service and product, quoting $30,000 for a product that provided counts on how many times a specific medication had been filled in a specific area and upwards of $100,000 for additional demographic data.” Some firms omitted names, while others included names, addresses, phone numbers, and email addresses.

Kim explains the following:

The unregulated collection, aggregation, sharing, and sale of data on individuals’ mental health conditions puts vulnerable populations at greater risk of discrimination, social isolation, and health complications. Health insurance providers—which already buy individuals’ race, education level, net worth, marital status, and other data without their knowledge or full consent to predict healthcare costs—could buy mental health data to discriminately charge individuals for care or discriminately target vulnerable populations with advertisements. Scammers could purchase mental health data from data brokers to exploit and steal from individuals living with mental health conditions, as scammers have done to steal from payday loan applicants.

What Concerned Practitioners Can Do

Given the number of recent articles exposing telemental health companies for profiting from sharing sensitive client data, therapists may want to advocate for their clients and their professions. It is time for clinicians to take action.

  1. Practitioners may want to inform their elected officials of their concerns about the legal sharing of information by mental health apps. Outline the problem, include a link to this blog post if you think it will help, and explain how it may impact one or more of your clients (without using identifying information). Look for your elected officials’ websites in your state, and use the “contact” link to send them your email letter. Ten to fifteen minutes is all it should take. Please act today, and send a copy of your email to your colleagues so they can do the same.

While not all healthcare associations require advocacy, kudos are in order to the American Counseling Association for its long-standing ethical code section requiring that all counselors advocate for their profession. The ACA’s 2014 Code of Ethics, Section A.7. (Roles and Relationships at Individual, Group, Institutional, and Societal Levels), A.7.a. Advocacy, reads: “When appropriate, counselors advocate at individual, group, institutional, and societal levels to address potential barriers and obstacles that inhibit access and/or the growth and development of clients.”

Given the recently reported crackdowns, most responsible practitioners will understand the critical nature of advocacy by professionals concerned about protecting patient and client privacy in these challenging times.

  2. Professionals may also need to know the following:
  • Where and how to research apps currently being used in their practices. 
  • Who developed the mHealth app and what is being done with any data collected.
  • Where to find clinically helpful apps that can be trusted. 
  • Patient education skills, such as toggling off potential data transmission to developers. 

When clinicians lack information, they may want to participate in reputable professional development training to ask questions and learn to protect the privacy of those who entrust them with their care.

References

Martinez-Martin, Nicole, Ishan Dasgupta, Adrian Carter, Jennifer A Chandler, Philipp Kellmeyer, Karola Kreitmair, Anthony Weiss, and Laura Y Cabrera. “Ethics of Digital Mental Health During COVID-19: Crisis and Opportunities.” JMIR Mental Health 7, no. 12 (December 22, 2020): e23776. https://doi.org/10.2196/23776.

Martinez-Martin et al., “Ethics of Digital Mental Health During COVID-19: Crisis and Opportunities.”

Marbury, Donna. “3 Reasons Why Wearables Bring New Complications for HIPAA Compliance.” HealthTech Magazine, September 23, 2020. https://healthtechmagazine.net/article/2020/09/3-reasons-why-wearables-bring-new-complications-hipaa-compliance.

Huckvale, Kit, José Tomás Prieto, Myra Tilney, Pierre-Jean Benghozi, and Josip Car. “Unaddressed Privacy Risks in Accredited Health and Wellness Apps: A Cross-Sectional Systematic Assessment.” BMC Medicine (September 7, 2015): 214. https://doi.org/10.1186/s12916-015-0444-y.

Ramirez, Edith. “California Legislature Passes Nation’s Second ‘Data Broker Registration’ Law,” 2014.
https://www.troutman.com/insights/california-legislature-passes-nations-second-data-broker-registrationlaw.html.

Grauer, Yael. “What Are ‘Data Brokers,’ and Why Are They Scooping Up Information About You?” VICE, March 27, 2018.
https://www.vice.com/en/article/bjpx3w/what-are-data-brokers-and-how-to-stop-my-private-data-collection.

Martinez-Martin et al., “Ethics of Digital Mental Health During COVID-19: Crisis and Opportunities.”

Marbury, “3 Reasons Why Wearables Bring New Complications for HIPAA Compliance.”

Privacy International. “What Is Privacy?” PrivacyInternational.org, October 23, 2017.
http://privacyinternational.org/explainer/56/what-privacy.

Nass, Sharyl J., Laura A. Levit, Lawrence O. Gostin, and Institute of Medicine (US) Committee on Health Research and the Privacy of Health Information: The HIPAA Privacy Rule. Beyond the HIPAA Privacy Rule: Enhancing Privacy, Improving Health Through Research. National Academies Press (US), 2009.

Sherman, Justin. “Data Brokers Are a Threat to Democracy.” WIRED, April 13, 2021.
https://www.wired.com/story/opinion-data-brokers-are-a-threat-to-democracy/.

Disclaimer: Telehealth.org offers information as educational material designed to inform you of issues, products, or services potentially of interest. We cannot and do not accept liability for your decisions regarding any information offered. Please conduct your due diligence before taking action. Also, the views and opinions expressed are not intended to malign any organization, company, or individual. Product names, logos, brands, and other trademarks or images are the property of their respective trademark holders. There is no affiliation, sponsorship, or partnership suggested by using these brands unless contained in an ad. We do not and cannot offer legal, ethical, billing, technical, medical, or therapeutic advice. Use of this site constitutes your agreement to the Telehealth.org Privacy Policy and Terms and Conditions.
