That mental health app might share your data without telling you
‘Do I trust the person who made the app, and do I understand
where this data is going?’
Free apps marketed to people with depression or who want to quit
smoking are hemorrhaging user data to third parties like Facebook and Google —
but often don’t admit it in their privacy policies, a new study reports. This
study is the latest to highlight the potential risks of entrusting sensitive
health information to our phones.
Though most of the easily found depression and smoking cessation
apps in the Android and iOS stores share data, only a fraction of them actually
disclose this. The findings add to a string of worrying revelations about what
apps are doing with the health information we entrust to them. For instance,
a Wall Street Journal investigation recently revealed the period
tracking app Flo shared users’ period dates and pregnancy plans with
Facebook. And previous studies have reported health apps with security flaws or
that shared data with advertisers and analytics companies.
In this new study, published Friday in the journal JAMA Network Open, researchers
searched for apps using the keywords “depression” and “smoking cessation.” Then
they downloaded the apps and checked to see whether the data put into them was
shared by intercepting the app’s traffic. Much of the data the apps shared
didn’t immediately identify the user, and much of it wasn’t strictly medical. But 33 of
the 36 apps shared information that could give advertisers or data analytics
companies insights into people’s digital behavior. And a few shared very
sensitive information, like health diary entries, self-reports about substance
use, and usernames.
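The researchers’ basic check — capture an app’s outgoing traffic and see where it goes — can be sketched in a few lines. This is a minimal illustration, not the study’s actual tooling: the captured URLs, the helper name, and the list of third-party domains here are all hypothetical, and in practice the capture itself would come from a man-in-the-middle proxy sitting between the phone and the network.

```python
from urllib.parse import urlparse

# Hypothetical watchlist of third-party marketing/analytics hosts.
THIRD_PARTY_DOMAINS = {
    "graph.facebook.com",
    "google-analytics.com",
    "crashlytics.com",
}

def flag_third_party_requests(captured_urls):
    """Return the captured request URLs whose host matches (or is a
    subdomain of) a known third-party service."""
    flagged = []
    for url in captured_urls:
        host = urlparse(url).netloc.lower()
        if any(host == d or host.endswith("." + d) for d in THIRD_PARTY_DOMAINS):
            flagged.append(url)
    return flagged

# Example: two of these three captured requests leave for third parties.
captured = [
    "https://graph.facebook.com/v3.2/activities",
    "https://api.example-health-app.com/diary",
    "https://www.google-analytics.com/collect",
]
print(flag_third_party_requests(captured))
```

The harder part of the real study was intercepting encrypted traffic at all; once the requests are visible, matching destinations against known trackers is the easy step sketched above.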
“IT’S IMPORTANT
TO TRUST BUT VERIFY.”
Those kinds of details, plus the name or type of app, could give
third parties information about someone’s mental health that the person might
want to keep private. “Even knowing that a user has a mental health or smoking
cessation app downloaded on their phone is valuable ‘health-related’
data,” Quinn
Grundy, an assistant professor at the University of Toronto who
studies corporate influences on health and was not involved in the study,
tells The Verge in an email.
The fact that people might not know how their apps are sharing
their data worried John Torous, director of digital psychiatry
at Beth Israel Deaconess Medical Center and a co-author on the new study. “It’s
really hard to make an informed decision about using an app if you don’t even
know who’s going to get access to some information about you,” he says. That’s
why he and a team at the University of New South Wales in Sydney ran this
study. “It’s important to trust but verify — to say where is your healthcare
data going,” Torous says.
“THEY’RE
BASICALLY LYING.”
By intercepting the data transmissions, they discovered that 92
percent of the 36 apps shared the data with at least one third party — mostly
Facebook- and Google-run services that help with marketing, advertising, or
data analytics. (Facebook and Google did not immediately respond to requests
for comment.) But about half of those apps didn’t disclose that third-party
data sharing, for a few different reasons: nine apps didn’t have a privacy
policy at all; five apps did but didn’t say the data would be
shared this way; and three apps actively said that this kind of data sharing
wouldn’t happen. Those last three are the ones that stood out to Steven Chan, a physician
at Veterans
Affairs Palo Alto Health Care System, who has collaborated with
Torous in the past but wasn’t involved in the new study. “They’re basically
lying,” he says of the apps.
The researchers don’t know what these third-party sites were
doing with this user data. “We live in an age where, with enough breadcrumbs,
it’s possible to reidentify people,” Torous says. It’s also possible the
breadcrumbs just sit there, he says — but for now, they just don’t know. “What
happens to this digital data is kind of a mystery.” But Chan worries about the
potential, invisible risks. “Potentially advertisers could use this to
compromise someone’s privacy and sway their treatment decisions,” he says. For
example, what if an advertiser discovers someone is trying to quit smoking?
“Maybe if someone is interested in smoking, would they be interested in
electronic cigarettes?” Chan says. “Or could they potentially introduce them to
other similar products, like alcohol?”
“WHAT HAPPENS
TO THIS DIGITAL DATA IS KIND OF A MYSTERY.”
Part of the problem is the business model for free apps, the
study authors write: since insurance might not pay for an app that helps users
quit smoking, for example, the only ways for free app developer to stay afloat
is to either sell subscriptions or sell data. And if that app is branded as a
wellness tool, the developers can skirt laws intended to keep medical information private.
So Torous recommends caution before sharing sensitive
information with an app. The potential for mental health apps to help people is
exciting, Torous says. “But I think it does mean you want to pause twice and
say, ‘Do I trust the person who made the app, and do I understand where this
data is going?’” A few quick gut checks could include making sure that the app
has a privacy policy, that it’s been updated recently, and that the app comes
from a trustworthy source like a medical center or the government. “None of
those questions are going to guarantee you a good result, but they’re going to
probably help you screen,” he says.
Long-term, one way to protect people who want to use health and
wellness apps could be to form a group that can give a stamp of approval to
responsible mental health apps, Chan says. “Kind of like having the FDA’s
approval on things, or the FAA certifying a particular aircraft for safety,” he
says. But for now, it’s app-user beware. “When there are no such institutions
or the institutions themselves aren’t doing a good job, it means we need to
invest more as a public good.”