PoopCheck Privacy Scandal: AI Health App's User Data Allegedly Being Sold Despite Privacy-First Claims
Key Takeaways
- A database of 150,000+ labeled stool images from approximately 25,000 PoopCheck users was offered for sale on Reddit, contradicting the app's public privacy-first messaging
- PoopCheck's App Store listing claims "no data collection," even as the creator apparently monetizes a commercial database of user data for AI training
- Users who contribute images through the app's community feature receive no clear disclosure that their sensitive biological data will be collected, annotated, and sold
Summary
A database of 150,000+ labeled stool images from approximately 25,000 PoopCheck users has surfaced in Reddit's data trading community, where the app's creator is seeking to sell access to the dataset. PoopCheck is an AI-powered gut health analyzer that uses pattern recognition to classify stool images according to the Bristol Stool Scale and provide users with daily health insights. The discovery reveals a stark contradiction between the app's public privacy commitments and its apparent data monetization practices.
PoopCheck's marketing materials and privacy disclosures emphasize privacy-first principles: the App Store listing states "The developer does not collect any data from this app," and the company website lists "Privacy First" as a core value. Yet the creator was actively marketing a highly sensitive, annotated database to potential buyers on Reddit, explicitly touting its value for AI training and medical research. Users who shared images through the app's community feature were never clearly informed that their biological data would be collected, labeled, and packaged for commercial purposes.
The incident exposes a broader transparency crisis in AI health applications. While users are encouraged to contribute sensitive biological data through community features and analysis tools, the actual terms governing data collection, annotation, and commercialization remain obscure or misleading. This case raises critical questions about informed consent in health apps and whether privacy disclosures adequately convey how user data will be repurposed for AI training datasets.
The scandal illustrates a wider industry problem: health AI apps market privacy while deriving commercial value from user-generated training datasets.
Editorial Opinion
The PoopCheck data sale exposes a dangerous disconnect between privacy marketing and AI data monetization in health applications. Companies increasingly tout privacy-first principles to win user trust while burying commercial value extraction behind their terms of service. Regulators must mandate explicit, affirmative consent for any commercialization of health data used in AI training, and platforms should face real accountability when privacy claims contradict actual data practices.