Mozilla: Lack of Security Protections in Mental-Health Apps Is ‘Creepy’

May 03, 2022 - 12:42 p.m.
Elizabeth Montalbano
threatpost.com

Though intended to foster mental health and spiritual wellness, the majority of mental-health and prayer apps can harm their users by exposing personal and intimate data, thanks to a severe lack of security and privacy protections, researchers from Mozilla have found.

Of 32 mental-health and prayer mobile apps investigated by the open-source organization, 28 were found to be inherently insecure and slapped with a “Privacy Not Included” label, according to a report of the same name published online this week. Moreover, 25 apps failed to meet Mozilla’s Minimum Security Standards, such as requiring strong passwords and managing security updates and vulnerabilities, researchers said.

“They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data,” said Jen Caltrider, Mozilla’s Privacy Not Included lead. “Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”

Overall, Mozilla researchers spent 255 hours, or about eight hours per product, digging into the security and privacy practices of a variety of mental-health and prayer apps.

The apps that they investigated have functionality such as connecting users with therapists and offering AI chatbots, community support pages, and prayers. They also offer mood journals and well-being assessments, among other features that require collecting sensitive data about users.

Some of the offensive behaviors of the apps include sharing users’ intimate data, allowing weak passwords, targeting vulnerable users with personalized ads, and featuring vague and poorly written privacy policies, according to the post.

For example, at least eight of the apps reviewed allowed weak passwords ranging from “1” to “11111111,” while one, a mental-health app called Moodfit, required only a single letter or digit as a password, “which is concerning for an app that collects mood and symptom data,” researchers noted in the post.

“Despite dealing with incredibly sensitive information, some apps’ security practices are akin to a flimsy lock on a diary,” they said.
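To make that finding concrete, below is a minimal, hypothetical sketch in Python of the kind of baseline password check that would have rejected entries such as “1” or “11111111.” It is an illustration only, not code from Mozilla’s report or from any of the apps named; the eight-character minimum and the letter-plus-digit rule are assumptions rather than requirements Mozilla spells out.

import re

MIN_LENGTH = 8  # assumed minimum; the report does not prescribe a specific number

def is_acceptable_password(password: str) -> bool:
    """Reject trivially weak passwords such as '1' or '11111111'."""
    if len(password) < MIN_LENGTH:
        return False
    if len(set(password)) == 1:
        # A single repeated character (e.g. '11111111') is not a real password.
        return False
    # Require at least one letter and one digit.
    if not re.search(r"[A-Za-z]", password) or not re.search(r"\d", password):
        return False
    return True

# The two passwords cited above would both be rejected:
assert not is_acceptable_password("1")
assert not is_acceptable_password("11111111")
assert is_acceptable_password("correct-h0rse-battery")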

Worst Privacy Offenders

Among the apps investigated, six were designated with the dubious distinction of being the “worst offenders” of user privacy: Better Help, Youper, Woebot, Better Stop Suicide, Pray.com and Talkspace.

Two of those apps, Better Help (a popular app that connects users with therapists) and Better Stop Suicide (a suicide-prevention app), have “vague and messy” privacy policies that provide little to no detail about how the apps protect user data and what users can do if they have concerns, researchers reported.

Three others—Youper, a digital mental health service for treating anxiety and depression; Pray.com, which encourages a daily prayer practice; and Woebot, an AI chat bot to foster better mental health—go even further by sharing personal information from the apps with third parties.

Woebot, for example, collects personal info such as a user’s name, email, phone number and IP address, as well as all of the sensitive info users share in conversations with the bot. It also obtains user info “from other sources, including through third-party services and organizations to supplement information provided by you,” according to its privacy policy, researchers noted.

“So Woebot can collect a good deal of personal information, [and] add to the information you give them with even more information gathered from third parties,” researchers noted in the report. “Then they say they can share some of this information with third parties, including insurance companies and a seemingly broad category they call ‘external advisors.'”

The other top offender, Talkspace, an online therapy app with celebrity sponsors such as champion swimmer Michael Phelps and musical artist Demi Lovato, collects a significant amount of personal information on users, including name, email, address, phone number, gender, relationship status, employer, geolocation information, chat transcripts and more.

Talkspace even goes so far as to ask for users’ written permission to use their health info and therapy notes for marketing purposes, which Mozilla researchers said is “bad form” for any app, especially one dedicated to mental health.