Vercel Updates Data Policy: Hobby Plan Users Opted Into AI Training by Default
Key Takeaways
- Hobby plan users are automatically opted into AI training by default, while Pro and Enterprise customers are opted out by default
- Code, agent chats, build telemetry, and traffic data may be used to train Vercel's AI models and shared with third-party AI providers if opted in
- Users have until March 31, 2026 to opt out before their data is used; opt-outs after that date prevent future data usage but don't retroactively restrict past usage
Summary
Vercel has updated its Terms of Service and Privacy Policy to reflect how it uses developer data to support new agentic features and improve its platform. Under the new policy, Hobby plan users are automatically opted into allowing Vercel to use their code and agent chats for AI model training, while Pro and Enterprise customers are opted out by default. Developers can opt out at any time through Team and Project Settings; opting out before March 31, 2026 prevents their data from being used for training, while later opt-outs apply only to future usage.
The policy change aims to enable Vercel's autonomous infrastructure capabilities, including proactive incident investigation, performance analysis, and cost optimization suggestions. Vercel assures users that sensitive information such as API keys, environment variables, and account details will be anonymized and redacted before any data is used for training or shared with third-party AI model providers. The company emphasizes that participation in the AI training program is optional and easy to manage through self-serve settings.
Editorial Opinion
Vercel's decision to default Hobby plan users into AI training represents a significant shift in how developer platforms balance feature innovation with user privacy. While the opt-out mechanism and data anonymization are welcome safeguards, the approach raises questions about whether Hobby users—often students, startups, and individual developers with less negotiating power—are truly making informed choices about their data. The three-month opt-out window, combined with tiered defaults that favor data collection for free users, suggests a pragmatic but ethically complex trade-off between funding AI development and respecting developer autonomy.