Webwatch: Call to protect children with depression from accessing unsuitable apps
Google and Apple need to do more to protect children with depression from accessing unsuitable apps on their marketplaces, according to a study.
The review, led by researchers from Lancaster University and Trinity College Dublin, found that two of the 29 highest-rated apps returned when searching for “depression” – all rated as suitable for children – contain negative emotional content.
Researchers believe there is a mismatch between age ratings used by app stores, which were created to target material such as violent content in video games, and the need to regulate health-related apps.
Lancaster University research associate Chengcheng Qu said: “Surprisingly these apps with potentially disturbing content are rated as PEGI 3 or PEGI 12 on the app marketplaces, which indicates to potential users, or parents, that the apps’ content merely includes bad language.
“Prior studies have shown that adolescents’ exposure to negative content may trigger negative behaviour, such as self-harm. There is therefore a clear need to look at how to protect vulnerable app users, such as those at risk of self-harm or suicide.”
Although many of the apps studied were free to download, nearly a third contained advertisements, and 80% of these stated in their privacy policies that users’ information would be captured and shared with third parties, including advertisers.
However, this information was not included in the app descriptions on the marketplaces, potentially leading parents to download inappropriate apps for their children.
Researchers also found that evidence for the science underpinning many mental health apps was often lacking.
While most apps in the study claimed to be informed by evidence-based treatments, only two provided direct peer-reviewed evidence of their effectiveness in reducing the symptoms of depression.
Many of the apps stated they were not replacements for clinical treatments, but in many cases these disclaimers were buried in hard-to-find terms of use policies, the review also found.
Professor Corina Sas, from Lancaster University, said: “The potential of these types of apps is promising, especially for reaching groups of people such as adolescents who are less likely to seek professional support offline. However, there is a real and urgent need for Google and Apple to regulate their marketplaces to safeguard users and ensure these mental health apps have a positive impact.
“Greater regulation and transparency would help mitigate ethical risks around missing, inadequate or inconsistent privacy policies, sharing data with advertisers, child data protection and the safeguarding of vulnerable users as well as providing clarity about the level of scientific validation behind individual apps.”
Dr Gavin Doherty, from Trinity College Dublin, added: “Introducing a more appropriate set of age ratings that takes into account the sensitivity of the content and data handled by health apps would be a relatively straightforward and helpful step to take, and would give clarity to app developers.”
Copyright (c) PA Media Ltd. 2020, All Rights Reserved. Picture (c) PA Wire.