Mental health apps (MHAs) have surged in popularity in India owing to the rising prevalence of mental health concerns, an increased emphasis on mental well-being and widespread smartphone and internet access. These apps cater to various needs, from stress reduction to treatment for anxiety, depression and post-traumatic stress disorder, and offer features such as mood tracking, sleep tools and virtual therapy sessions. While MHAs can complement traditional therapy, they are not substitutes for professional care, and users need to exercise caution and consider the evidence behind each app they choose.1 This paper aims to examine India's current regulatory landscape for MHAs, compare it with international approaches and propose a tiered regulatory framework. We analyse the existing regulations and their challenges, categorise different types of MHAs according to their regulatory needs and recommend steps to establish a centralised regulatory body. In doing so, we seek to provide a comprehensive analysis of MHA regulation in India and offer concrete suggestions for improvement. The paper discusses the current regulatory situation, analyses various MHA types, compares international approaches, proposes a new framework and concludes with implications for future regulation.
Current regulatory landscape and challenges
Navigating the murky waters: challenges of app review and regulation
The Central Drugs Standard Control Organization (CDSCO), under the Ministry of Health and Family Welfare, is the primary regulatory body for medical devices in India. The Medical Devices Rules 2017 classify medical devices into categories based on their risk levels and regulate them accordingly.2 Software integral to the functioning of a medical device falls under this regulatory framework; this includes software used for diagnostic, monitoring or therapeutic purposes, which must meet specific safety, performance and quality standards. At present, however, standalone software applications not associated with any physical medical device are not regulated by the CDSCO. These include health and wellness apps, mobile applications that track fitness or general health metrics and other software that monitors health status and illness through active and passive digital phenotyping. Standalone health and mental health apps therefore remain unregulated.
While reviews are essential for assessing MHAs’ effectiveness, user satisfaction and adherence to ethical guidelines, navigating India's evolving regulatory landscape adds complexity for users and developers alike. The effectiveness of these apps in addressing mental health issues has not been extensively researched.3 There is limited information on quality assurance and validation processes. Unlike medical devices and pharmaceutical products, which undergo rigorous evaluation by regulatory bodies, health apps often do not face such stringent requirements, raising concerns about data privacy violations, inaccurate information and unverified therapeutic claims.4
India lacks a centralised regulatory body to oversee MHAs and ensure their safety, efficacy and adherence to ethical standards. Without such a body, consumers are left to navigate the vast landscape of available apps on their own, without clear guidance or assurance of effectiveness or safety. Furthermore, this shifts the onus onto mental health professionals (MHPs) to independently assess the suitability of these apps and recommend them to their clients or patients.5 Table 1 depicts MHA regulations in selected countries; a similar comparison can be used to frame regulation in India on the basis of the country's existing mental health laws.6,7
FDA, Food and Drug Administration; FTC, Federal Trade Commission; HHS, Department of Health and Human Services; HIPAA, Health Insurance Portability and Accountability Act of 1996; MHRA, Medicines and Healthcare products Regulatory Agency; ORCHA, Organisation for the Review of Care and Health Applications; NHS, National Health Service; NICE, National Institute for Health and Care Excellence; GDPR, General Data Protection Regulation; TGA, Therapeutic Goods Administration; AHPRA, Australian Health Practitioner Regulation Agency; RANZCP, Royal Australian and New Zealand College of Psychiatrists; PMDA, Pharmaceuticals and Medical Devices Agency; SaMD, software as a medical device; PMD, Pharmaceuticals and Medical Devices; APPI, Act on the Protection of Personal Information; BfArM, Bundesinstitut für Arzneimittel und Medizinprodukte [Federal Institute for Drugs and Medical Devices]; hih, health innovation hub; EU MDR, European Union Medical Device Regulation; DiGA, Digitale Gesundheitsanwendungen [digital health applications] directory; HSA, Health Sciences Authority; PDPA, Personal Data Protection Act.
Legislation overlaying the evolving mental health services in India: a patchwork of regulations
Major mental health legislation in India focuses on ethics and on the registration and recognition of mental health establishments (MHEs) and MHPs. While MHAs cater to both general and clinical populations, current laws mainly cover conventional in-person services. The Mental Healthcare Act (MHCA) 2017 emphasises patient rights, professional qualifications and the registration of MHEs and MHPs. However, its applicability to MHAs is limited, necessitating specific regulations for these apps.8
Recent guidelines, such as India's Telemedicine Practice Guidelines (TPG) 2020 and Telepsychiatry Operational Guidelines, recognise telepsychiatry and address issues of informed consent, e-prescriptions and data security. Additionally, the Information Technology Act 2000, Section 43A, mandates ‘reasonable security practices’ for entities handling sensitive personal data and imposes penalties for wrongful loss or gain caused by negligence.9
MHAs, dealing with highly sensitive data, would need to adhere to these standards. The National Medical Commission’s (NMC) Code of Medical Ethics10 also underscores the importance of patient privacy. The lack of specificity relating to MHAs reveals the need for a regulatory framework that builds upon the NMC’s ethical principles. The Digital Personal Data Protection (DPDP) Act12 likewise emphasises patient control over health data, including rights to access, correct and potentially erase data, along with the ability to restrict its use in the absence of explicit consent, object to automated decision-making and file complaints about violations. However, the act does not restrict the transfer of health data outside India. Although this enables international research collaboration and the use of foreign-based cloud services, it raises concerns and invites further debate on legal jurisdiction across borders.11
Providing a broader purview, the Drugs, Medical Devices and Cosmetics Bill (DMDCB) 2022 defines ‘medical device’ in terms broad enough to encompass MHAs functioning in a diagnostic or therapeutic capacity.12 Additionally, the DMDCB emphasises evidence-based validation of therapeutic claims.12 The Indian Council of Medical Research (ICMR) has established guidelines for the ethical use of artificial intelligence in healthcare, which are particularly important for developers of MHAs powered by artificial intelligence; these guidelines emphasise patient safety, data privacy, transparency and the mitigation of bias in artificial intelligence algorithms. Taken together, these developments suggest that the country is moving towards laws and guidelines regulating digital mental health, which would provide greater clarity on the role of MHAs and their utility and efficacy in intervention, diagnosis and treatment.13 At present, however, regulatory coverage remains only a patchwork overlaying the legislation described above.
Types of MHAs and regulatory needs
MHAs encompass a diverse range of tools with varying purposes and regulatory implications, from wellness and self-help tools requiring minimal oversight to artificial intelligence-powered therapeutic platforms demanding rigorous regulation. Wellness apps focus on general mental well-being, while symptom-tracking and mood apps necessitate moderate regulation to ensure data security and algorithm accuracy.
Psychoeducation apps provide mental health information, requiring verification of content accuracy. Therapy support apps complement professional care, demanding integration with healthcare systems. Digital therapeutics (DTx) deliver evidence-based interventions, necessitating clinical trials and potential classification as medical devices. Teletherapy platforms connect users with professionals, requiring stringent regulation of provider credentials and crisis management protocols.
The most complex category, artificial intelligence-powered therapeutic apps, requires the highest level of regulation, including extensive testing and ongoing monitoring. This diverse landscape, as detailed in Table 2, underscores the need for a tiered regulatory approach in India, where oversight is calibrated to the potential risks and benefits associated with each type of MHA.14
PHI, Patient Health Information.
Implications and conclusions
To safeguard users’ interests and promote responsible app development, it is high time that a centralised regulatory body was established to set guidelines and standards for MHAs in the country. This would create a more consistent and robust approach to authentication, which should include verifying the credibility and qualifications of app developers, ensuring evidence-based content and practices, checking compliance with ethical standards and scrutinising data security measures. India can also draw valuable insights from international frameworks (Table 1), in which only higher-risk mental health treatment apps undergo strict medical device regulation, by working alongside local expert organisations such as the Indian Psychiatric Society as well as technology and legal agencies. Additionally, the regulatory body should offer support to smaller developers and establish mechanisms for user feedback. By implementing these recommendations, India can become a leader in responsible MHA regulation, safeguarding users, promoting innovation and ultimately improving access to effective digital mental health services.
Data availability
Data availability is not applicable to this article as no new data were created or analysed in this study.
Acknowledgement
The authors used generative artificial intelligence, specifically Google's Gemini Advanced model, on 4 June 2024 to enhance the language and readability of this article. However, the authors take full responsibility for the content and accuracy of the information presented in this manuscript.
Author contributions
M.I.S.S.: conceptualisation, data curation, formal analysis, visualisation and writing of the original draft. R.C.K.: writing – review and editing, visualisation and data curation. N.M.: writing – review and editing, methodology and data curation. C.N.K.: writing – review and editing, methodology and data curation. S.B.M.: writing – review and editing, methodology, project administration, conceptualisation and data curation.
Funding
This research received no specific grant from any funding agency, commercial or not-for-profit sectors.
Declaration of interest
None.