Data brokers are now selling your mental health data

One company advertised the names and addresses of people suffering from depression, anxiety, post-traumatic stress disorder or bipolar disorder. Another sold a database of thousands of aggregated mental health records for $275 per 1,000 "patient contacts."

The pandemic-era proliferation of telehealth and therapy apps has fueled this controversial product line: Americans' mental health data. And selling it is entirely legal in the United States, even without a person's knowledge or consent.
In a study published Monday, a research team at Duke University’s Sanford School of Public Policy outlines just how vast the market for people’s health data has become.
After contacting data brokers and asking what mental health information she could buy, researcher Joanne Kim reports that she ultimately found 11 companies willing to sell data packages that included information about which antidepressants people were taking, whether they struggled with insomnia or attention deficit disorder, and details of other medical conditions, including Alzheimer's disease and bladder-control difficulties.
Some of the data was offered in aggregate form, which would have let a buyer learn, for example, roughly how many people in a particular Zip code might be depressed.
But other brokers offered personally identifiable information that included names, addresses and incomes, and one data broker sales representative pointed to lists of “anxiety patients” and “consumers with clinical depression in the United States.” Some even offered a sample spreadsheet.
It was like “a tasting menu for buying people’s health data,” said Justin Sherman, a senior fellow at Duke who led the research team. “Health data is some of the most sensitive data out there, and most of us have no idea how much of it is for sale, often for just a few hundred dollars.”
The Health Insurance Portability and Accountability Act, known as HIPAA, restricts how hospitals, doctors' offices and other "covered entities" share Americans' health information.
But the law doesn’t protect the same information when it’s sent elsewhere, allowing app makers and other companies to legally share or sell the data as they see fit.
Some data brokers offered formal customer complaint procedures and opt-out forms, Kim said. But because the companies often didn't say where their data came from, she wrote, many people likely never realized the brokers were collecting their data in the first place. It was also unclear whether the apps or websites that originally gathered the information allowed users to opt out of sharing it; many companies reserve the right in their privacy policies to share data with advertisers or other third-party "partners."
Privacy advocates have warned for years about unregulated data trading, saying the information could be exploited by advertisers or misused for predatory purposes. Health insurance companies and federal law enforcement have used data brokers to scrutinize people's health-care costs and pursue undocumented immigrants.
According to Sherman, mental health data needs to be handled with particular care because it often pertains to people in vulnerable situations and, if shared publicly or recorded inaccurately, can lead to devastating consequences.
In 2013, Pam Dixon, founder and executive director of the research and advocacy group World Privacy Forum, testified at a Senate hearing that an Illinois pharmaceutical marketing firm had advertised a list of purported "violence sufferers" starting at $79 per 1,000 names. The company removed the listing shortly after her testimony.
Now, a decade later, she worries that the health-data problem has in some ways grown worse, largely because companies have become more sophisticated at collecting and sharing people's personal information: not just through static lists but through regularly updated search tools and machine-learning analytics.
"It's a disgusting practice, and they're still doing it. Our health data is part of somebody's business model," Dixon said. "They build inferences, scores and categorizations from the patterns of your life: your actions, where you go, what you eat. And what are we supposed to do, not live?"
The number of places where people share their health information has grown, thanks to the proliferation of online pharmacies, therapy apps and telehealth services that Americans use to seek and receive medical care from home. Many mental health apps have questionable privacy practices, according to Jen Caltrider, a researcher at Mozilla whose team analyzed more than two dozen such apps last year and found that "most" were "exceptionally appalling."
Federal regulators have recently shown interest in scrutinizing more aggressively how companies handle people's health data. The Federal Trade Commission said this month that it was seeking a $1.5 million civil penalty from the online prescription drug provider GoodRx, alleging the company compiled lists of users who had purchased certain medications, including for heart disease and blood pressure, and used that information to better target its Facebook ads.
An FTC representative said in a statement that "digital health companies and mobile apps should not cash in on consumers' highly sensitive and personally identifiable health information." GoodRx said in a statement that the case involved a "long-standing issue" related to a software practice called tracking pixels that, it said, allowed the company to "serve ads that we believe are compliant."
After the Supreme Court overturned Roe v. Wade last summer, opening the door to new state abortion bans, some data brokers stopped selling location data that could be used to track who visited abortion clinics.
Several senators, including Elizabeth Warren (D-Mass.), Ron Wyden (D-Ore.) and Bernie Sanders (I-Vt.), have sponsored a bill that would strengthen state and federal authority to act against misuse of health data and limit how reproductive-health data can be collected and shared by technology companies.
However, the data brokerage industry remains unregulated at the federal level, and the US lacks a comprehensive federal privacy law that sets rules for how apps and websites handle people’s data more broadly.
Two states, California and Vermont, require data brokers to register with a state registry. California's registry lists more than 400 companies, some of which say they specialize in health or medical data.
Dixon, who was not involved in the Duke research, said she hopes the findings and the Supreme Court ruling will serve as a wake-up call about how this data can lead to real risks.
“There are literally millions of women for whom the exchange, trade and sale of their health information could have criminal consequences,” she said. “This is not theoretical. It’s here, now.”