BotBeat

Soft All Things
INDUSTRY REPORT · 2026-05-14

Health App PoopCheck Creator Attempts to Sell 150K User Stool Images Database

Key Takeaways

  • A database containing 150,000+ stool images from 25,000 PoopCheck users was offered for sale on Reddit, representing a major privacy violation of consumer health data
  • Users uploaded intimate biometric images for personal health insights, not for third-party access or AI model training purposes
  • This case demonstrates a critical gap between user expectations for data privacy in health apps and actual data handling and monetization practices
Source: Hacker News — https://www.404media.co/ai-poop-analysis-app-offered-to-sell-me-access-to-its-users-poops/

Summary

An investigation has revealed that the creator of PoopCheck, a health app that uses AI to analyze stool images, has been attempting to sell a database containing over 150,000 classified images of stool samples collected from approximately 25,000 users. The database was discovered being offered for sale on Reddit's r/DHExchange forum, with the seller seeking to monetize what he described as "rare" and "valuable" training data for machine learning and medical research applications.

PoopCheck, made by Soft All Things, uses AI technology based on the Bristol Stool Scale to analyze users' feces and provide daily digestive health scores. Users uploaded these intimate biometric images for personal health analysis—not with the expectation that their data would be resold to third parties. The attempted database sale represents a significant privacy breach and raises serious ethical concerns about how consumer health apps handle and monetize sensitive user-generated data without explicit consent or disclosure of secondary uses.

This incident exemplifies a critical gap between user expectations for data privacy and actual practices in the app ecosystem. It shows how consumer health applications can accumulate vast quantities of intimate biometric data, and how that information might be repurposed for commercial AI training without adequate transparency or consent. The case raises important questions about data governance, user protection, and the ethical obligations of app developers to safeguard personal information.

Editorial Opinion

This troubling exposé reveals a critical failure in data governance and user protection within the consumer health app space. While the PoopCheck creator's attempt to monetize the database is egregious, it likely represents a symptom of a broader industry problem where app developers collect intimate biometric data with minimal transparency about potential secondary uses. Companies must establish clear, enforceable data handling policies and obtain explicit informed consent before collecting sensitive health information. Without stronger regulatory frameworks and corporate accountability measures, users will continue discovering their most private data being treated as a commodity for commercial AI training.

Machine Learning · Healthcare · Ethics & Bias · Privacy & Data
