Mental health apps collect extraordinarily sensitive data: therapy session notes, mood patterns, medication history, and psychological assessments. Understanding how this data is stored, shared, and analyzed by AI is critical before entrusting an app with your most vulnerable moments.
As a category, mental health apps score poorly on privacy despite handling some of the most sensitive personal information possible. These platforms routinely collect mood data, therapy transcripts, and psychological assessments. Many use AI to analyze emotional states and share aggregated (and sometimes individual) data with employers, insurers, and advertisers. The FTC has taken enforcement action against multiple platforms in this space for deceptive data practices.
Meditation app with relatively restrained data collection. No therapy data, but still tracks sleep patterns, mood logs, and usage behaviors.
Mindfulness app with extensive corporate partnerships. Employer-provided accounts share engagement data with the sponsoring organization.
Online psychiatry platform cited by the FTC for sharing mental health data with advertisers. Collects prescription history and therapy notes.
Employer-sponsored mental health platform that applies extensive AI analysis to therapy sessions and provides detailed reporting to employers.