Not all mental health apps are helpful. Experts explain the risks, and how to choose one wisely
<p><em><a href="https://theconversation.com/profiles/jeannie-marie-paterson-6367">Jeannie Marie Paterson</a>, <a href="https://theconversation.com/institutions/the-university-of-melbourne-722">The University of Melbourne</a>; <a href="https://theconversation.com/profiles/nicholas-t-van-dam-389879">Nicholas T. Van Dam</a>, <a href="https://theconversation.com/institutions/the-university-of-melbourne-722">The University of Melbourne</a>, and <a href="https://theconversation.com/profiles/piers-gooding-207492">Piers Gooding</a>, <a href="https://theconversation.com/institutions/the-university-of-melbourne-722">The University of Melbourne</a></em></p>
<p>There are thousands of mental health apps available in app stores, offering services such as meditation, mood tracking and counselling. You would think such “health” and “wellbeing” apps – which often present as solutions for conditions such as <a href="https://www.headspace.com/">anxiety</a> and <a href="https://www.calm.com">sleeplessness</a> – would have been rigorously tested and verified. But this isn’t necessarily the case.</p>
<p>In fact, many may be taking your money and data in return for a service that does nothing for your mental health – at least, not in a way that’s backed by scientific evidence.</p>
<h2>Bringing AI to mental health apps</h2>
<p>Although some mental health apps connect users with a <a href="https://www.betterhelp.com/get-started/">registered therapist</a>, most provide a fully automated service that bypasses the human element. This means they’re not subject to the same standards of care and confidentiality as a registered mental health professional. Some aren’t even designed by mental health professionals.</p>
<p>These apps also increasingly claim to be incorporating artificial intelligence into their design to make personalised recommendations (such as for meditation or mindfulness) to users. However, they give little detail about this process. It’s possible the recommendations are based on a user’s previous activities, similar to Netflix’s <a href="https://help.netflix.com/en/node/100639">recommendation algorithm</a>.</p>
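<p>Apps rarely disclose how such “personalised” recommendations are generated. As a purely hypothetical sketch (written in Python for illustration; no app publishes its actual logic), the approach may be as simple as suggesting whatever the user has done most often before:</p>
<pre><code>from collections import Counter

# Hypothetical sketch: recommend the exercise category a user has
# engaged with most often. No real app's logic is shown here.

def recommend(activity_log: list[str]) -> str:
    """Return the category the user has engaged with most often."""
    if not activity_log:
        return "breathing"  # arbitrary default for a new user
    return Counter(activity_log).most_common(1)[0][0]

print(recommend(["meditation", "sleep", "meditation"]))  # meditation
</code></pre>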
<p>Some apps such as <a href="https://legal.wysa.io/privacy-policy#aiChatbot">Wysa</a>, <a href="https://www.youper.ai/">Youper</a> and <a href="https://woebothealth.com/">Woebot</a> use AI-driven chatbots to deliver support, or even established therapeutic interventions such as cognitive behavioural therapy. But these apps usually don’t reveal what kinds of algorithms they use.</p>
<p>It’s likely most of these AI chatbots use <a href="https://www.techtarget.com/searchenterpriseai/feature/How-to-choose-between-a-rules-based-vs-machine-learning-system">rules-based systems</a> that respond to users in accordance with predetermined rules (rather than learning on the go as adaptive models do). These rules would ideally prevent the unexpected (and often <a href="https://www.vice.com/en/article/pkadgm/man-dies-by-suicide-after-talking-with-ai-chatbot-widow-says">harmful and inappropriate</a>) outputs AI chatbots have become known for – but there’s no guarantee.</p>
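<p>To make the distinction concrete, here is a minimal sketch of a rules-based chatbot (the rules and replies are invented for illustration, not drawn from any real app). Keywords in the user’s message trigger predetermined responses; nothing is learned, and any message the rules don’t cover falls through to a generic reply. That fallback is exactly where a poorly designed system can fail someone in distress.</p>
<pre><code># Hypothetical rules-based chatbot: predetermined keyword rules
# map user input to canned responses. There is no learning involved.

RULES = [
    ({"anxious", "anxiety", "worried"},
     "It sounds like you're feeling anxious. Try a slow breathing exercise."),
    ({"sleep", "insomnia", "tired"},
     "Sleep trouble is common. A wind-down routine before bed can help."),
]

FALLBACK = "I'm not sure I understand. Could you tell me more?"

def respond(message: str) -> str:
    words = set(message.lower().split())
    for keywords, reply in RULES:
        if words & keywords:  # any keyword match triggers the rule
            return reply
    return FALLBACK  # everything else gets this generic reply

print(respond("I feel anxious at night"))
</code></pre>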
<p>The use of AI in this context comes with risks of biased, discriminatory or completely inapplicable information being provided to users. And these risks haven’t been adequately investigated.</p>
<h2>Misleading marketing and a lack of supporting evidence</h2>
<p>Mental health apps might be able to provide certain benefits to users <em>if</em> they are well designed and properly vetted and deployed. But even then they can’t be considered a substitute for professional therapy targeted towards conditions such as anxiety or depression.</p>
<p>The <a href="https://theconversation.com/pixels-are-not-people-mental-health-apps-are-increasingly-popular-but-human-connection-is-still-key-192247">clinical value</a> of automated mental health and mindfulness apps is <a href="https://www.sciencedirect.com/science/article/abs/pii/S1077722918300233">still being assessed</a>. Evidence of their efficacy is generally <a href="https://journals.plos.org/digitalhealth/article?id=10.1371/journal.pdig.0000002">lacking</a>.</p>
<p>Some apps make ambitious claims regarding their effectiveness and refer to studies that supposedly support their benefits. In many cases these claims are based on less-than-robust findings. For instance, they may be based on:</p>
<ul>
<li><a href="https://sensa.health/">user testimonials</a></li>
<li>short-term studies with narrow <a href="https://www.wired.co.uk/article/mental-health-chatbots">or homogeneous cohorts</a></li>
<li><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9533203/">studies involving</a> researchers or funding from the very group <a href="https://www.theguardian.com/us-news/2022/apr/13/chatbots-robot-therapists-youth-mental-health-crisis">promoting the app</a></li>
<li>or evidence of the benefits of a <a href="https://www.headspace.com/meditation/anxiety">practice delivered face to face</a> (rather than via an app).</li>
</ul>
<p>Moreover, any claims about reducing symptoms of poor mental health aren’t carried through into the apps’ contract terms. The fine print will typically state the app does not claim to provide any physical, therapeutic or medical benefit (along with a host of other disclaimers). In other words, the app isn’t obliged to successfully provide the service it promotes.</p>
<p>For some users, mental health apps may even cause harm, leading to increases in the very <a href="https://pubmed.ncbi.nlm.nih.gov/34074221/">symptoms</a> people so often use them to address. This may happen, in part, because the apps raise awareness of problems without providing the tools needed to address them.</p>
<p>For most mental health apps, research on their effectiveness won’t have considered <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9505389/">individual differences</a> such as socioeconomic status, age and other factors that can influence engagement. Most apps also don’t indicate whether they’re an inclusive space for marginalised people, such as those from culturally and linguistically diverse, LGBTQ+ or neurodiverse communities.</p>
<h2>Inadequate privacy protections</h2>
<p>Mental health apps are subject to standard consumer protection and privacy laws. While data protection and <a href="https://cybersecuritycrc.org.au/sites/default/files/2021-07/2915_cscrc_casestudies_mentalhealthapps_1.pdf">cybersecurity</a> practices vary between apps, an investigation by research foundation Mozilla <a href="https://foundation.mozilla.org/en/privacynotincluded/articles/are-mental-health-apps-better-or-worse-at-privacy-in-2023">concluded that</a> most rank poorly.</p>
<p>For example, the mindfulness app <a href="https://www.headspace.com/privacy-policy">Headspace</a> collects data about users from a <a href="https://foundation.mozilla.org/en/privacynotincluded/headspace/">range of sources</a>, and uses those data to advertise to users. Chatbot-based apps also commonly repurpose conversations to predict <a href="https://legal.wysa.io/privacy-policy">users’ moods</a>, and use anonymised user data to train the language models <a href="https://www.youper.ai/policy/privacy-policy">underpinning the bots</a>.</p>
<p>Many apps share so-called <a href="https://theconversation.com/popular-fertility-apps-are-engaging-in-widespread-misuse-of-data-including-on-sex-periods-and-pregnancy-202127">anonymised</a> data with <a href="https://www.wysa.com/">third parties</a>, such as <a href="https://www.headspace.com/privacy-policy">employers</a>, that sponsor their use. Re-identification of <a href="https://www.unimelb.edu.au/newsroom/news/2017/december/research-reveals-de-identified-patient-data-can-be-re-identified">these data</a> can be relatively easy in some cases.</p>
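<p>The mechanics of re-identification can be surprisingly simple. In this hypothetical sketch (all names and records are invented), an “anonymised” record is matched against a public dataset on quasi-identifiers such as postcode, age and gender; a unique match re-identifies the person:</p>
<pre><code># Hypothetical linkage attack: match "anonymised" records against a
# public dataset on quasi-identifiers. All data here is invented.

anonymised = [
    {"postcode": "3000", "age": 34, "gender": "F", "mood_score": 2},
]

public_register = [
    {"name": "A. Citizen", "postcode": "3000", "age": 34, "gender": "F"},
    {"name": "B. Resident", "postcode": "3052", "age": 51, "gender": "M"},
]

KEYS = ("postcode", "age", "gender")

for record in anonymised:
    matches = [p for p in public_register
               if all(p[k] == record[k] for k in KEYS)]
    if len(matches) == 1:  # a unique match re-identifies the record
        print(matches[0]["name"], "->", record["mood_score"])
</code></pre>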
<p>Australia’s Therapeutic Goods Administration (TGA) doesn’t require most mental health and wellbeing apps to go through the same testing and monitoring as other medical products. In most cases, they are lightly regulated as <a href="https://www.tga.gov.au/how-we-regulate/manufacturing/medical-devices/manufacturer-guidance-specific-types-medical-devices/regulation-software-based-medical-devices">health and lifestyle</a> products or tools for <a href="https://www.tga.gov.au/sites/default/files/digital-mental-health-software-based-medical-devices.pdf">managing mental health</a> that are excluded from TGA regulations (provided they meet certain criteria).</p>
<h2>How can you choose an app?</h2>
<p>Although consumers can access third-party rankings for various mental health apps, these often focus on just a few elements, such as <a href="https://onemindpsyberguide.org/apps/">usability</a> or <a href="https://foundation.mozilla.org/en/privacynotincluded/categories/mental-health-apps/">privacy</a>. Different guides may also be inconsistent with each other.</p>
<p>Nonetheless, there are some steps you can take to figure out whether a particular mental health or mindfulness app might be useful for you.</p>
<ol>
<li>
<p>consult your doctor, as they may have a better understanding of the efficacy of particular apps and/or how they might benefit you as an individual</p>
</li>
<li>
<p>check whether a mental health professional or trusted institution was involved in developing the app</p>
</li>
<li>
<p>check if the app has been rated by a third party, and compare different ratings</p>
</li>
<li>
<p>make use of free trials, but be careful of trials that automatically convert to paid subscriptions, and be wary of those that require payment information upfront</p>
</li>
<li>
<p>stop using the app if you experience any adverse effects.</p>
</li>
</ol>
<p>Overall, and most importantly, remember that an app is never a substitute for real help from a human professional.</p>
<p><em><a href="https://theconversation.com/profiles/jeannie-marie-paterson-6367">Jeannie Marie Paterson</a>, Professor of Law, <a href="https://theconversation.com/institutions/the-university-of-melbourne-722">The University of Melbourne</a>; <a href="https://theconversation.com/profiles/nicholas-t-van-dam-389879">Nicholas T. Van Dam</a>, Associate Professor, School of Psychological Sciences, <a href="https://theconversation.com/institutions/the-university-of-melbourne-722">The University of Melbourne</a>, and <a href="https://theconversation.com/profiles/piers-gooding-207492">Piers Gooding</a>, Postdoctoral Research Fellow, Disability Research Initiative, <a href="https://theconversation.com/institutions/the-university-of-melbourne-722">The University of Melbourne</a></em></p>
<p><em>Image credits: Shutterstock</em></p>
<p><em>This article is republished from <a href="https://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/not-all-mental-health-apps-are-helpful-experts-explain-the-risks-and-how-to-choose-one-wisely-211513">original article</a>.</em></p>