USA: Many apps claim to help people with mental health issues, but how many of them are secure? According to a recent Mozilla study, several well-known mental health apps are failing to protect their users' privacy. Some apps meant to support people with mental health conditions collect a significant amount of personal information, yet their privacy policies are dubious.
Last year, Mozilla tested 27 apps focused on mental health, meditation, and prayer, including Calm and Headspace. This year, the researchers re-examined the same apps along with five new ones requested by the public. Of the 32 apps in total, 22 received Mozilla's "privacy not included" warning label, which it assigns to applications that raise the greatest concerns about privacy and personal data.
Mozilla reported that roughly 17 of the 27 apps reviewed last year and re-examined this year still performed poorly on privacy and security.
Some apps fail to honour their promises to protect user data. One of them, according to Mozilla, is BetterHelp, which provides counselling sessions online. The platform has come under fire for sharing private user data with Facebook and Snapchat, and in March the business agreed to pay the Federal Trade Commission $7.8 million over those practices.
Replika: My AI Friend, a "virtual friendship" chatbot, was one of the most heavily scrutinised apps in this year's study. Mozilla researchers called it "possibly the worst app we've ever reviewed," saying serious security flaws meant it failed to meet even minimum security requirements. The app is banned in Italy, where it is alleged to have broken European data privacy laws.
Thankfully, some of the apps from last year's list have improved. Youper, the most improved overall, has updated its data collection policies and, most notably, strengthened its password requirements. Wysa and PTSD Coach also stood out; according to the researchers, they were "head and shoulders above the other apps in terms of privacy and security."
Moodfit, Calm, Woebot, and Modern Health also showed notable improvements. On the other hand, mental health apps such as Pride Counselling (owned by BetterHelp), Shine, Talkspace, and Headspace still have subpar privacy policies.