Shuttering Startups Now Selling Employee Slack Chats and Emails to Train AI Models
Key Takeaways
- Failed startups are selling internal communications data (Slack messages, email, documents) to AI companies for $10,000–$100,000 per deal as a final source of cash
- Specialized wind-down firms like SimpleClosure have entered the market, facilitating data sales while claiming to strip personally identifiable information
- The demand reflects a shift from public internet training data to proprietary workplace datasets needed for agentic AI and reinforcement learning environments
- Privacy advocates warn that employee communications are identifiable personal data and urge FTC oversight, despite startups' financial incentive to sell
Summary
As AI companies increasingly require complex, real-world workplace data to train agentic models, failed startups are selling their internal communications, including Slack messages and emails, to AI developers. Specialized startup wind-down firms like SimpleClosure have launched services to help founders extract value from their company data, processing nearly 100 such deals over the past year with payouts ranging from $10,000 to $100,000. The data is particularly valuable for building reinforcement learning environments in which AI agents practice workplace tasks; industry insiders report that some AI leaders have discussed spending up to $1 billion on such training infrastructure. Privacy advocates, however, warn that selling employee communications poses substantial risks to worker privacy and data rights.
Editorial Opinion
While the emergence of a secondary market for startup data reflects genuine demand for realistic workplace training datasets, the practice raises serious ethical questions about worker consent and privacy. Employees typically have no visibility into, or control over, whether communications exchanged in what they reasonably expect to be internal company channels are being monetized for AI training. Without stronger regulatory frameworks and clearer disclosure requirements, this trend risks turning workers into unwitting contributors to AI development while their data rights remain murky and unprotected.