BotBeat

Vercel
POLICY & REGULATION · 2026-03-19

Vercel Updates Data Policy: Hobby Plan Users Opted Into AI Training by Default

Key Takeaways

  • Hobby plan users are opted into AI training by default, while Pro and Enterprise customers are opted out by default
  • Code, agent chats, build telemetry, and traffic data may be used to train Vercel's AI models and shared with third-party AI providers if opted in
  • Users have until March 31, 2026 to opt out before their data is used; opt-outs after that date prevent future data usage but don't retroactively restrict past usage
Source: Hacker News (https://vercel.com/changelog/updates-to-terms-of-service-march-2026)

Summary

Vercel has updated its Terms of Service and Privacy Policy to reflect how it uses developer data to support new agentic features and improve its platform. Under the new policy, Hobby plan users are automatically opted into allowing Vercel to use their code and agent chats for AI model training, while Pro and Enterprise customers are opted out by default. Developers can opt out at any time through Team and Project Settings, with a deadline of March 31, 2026 to prevent data from being used for training.

The policy change aims to enable Vercel's autonomous infrastructure capabilities, including proactive incident investigation, performance analysis, and cost optimization suggestions. Vercel assures users that sensitive information such as API keys, environment variables, and account details will be anonymized and redacted before any data is used for training or shared with third-party AI model providers. The company emphasizes that participation in the AI training program is optional and easy to manage through self-serve settings.
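Vercel's changelog does not describe how the anonymization and redaction are actually performed. As a purely illustrative sketch of what scrubbing credentials from text might look like (the patterns, placeholder, and `redact` function are all hypothetical, not Vercel's implementation):

```python
import re

# Hypothetical patterns for common credential shapes. A real redaction
# pipeline would be far more thorough (entropy checks, known key formats,
# structured-data awareness); this only shows the general idea.
PATTERNS = [
    # KEY=value style environment-variable assignments
    (re.compile(r'\b([A-Z][A-Z0-9_]*(?:KEY|TOKEN|SECRET|PASSWORD))=\S+'),
     r'\1=[REDACTED]'),
    # Bearer tokens in HTTP Authorization headers
    (re.compile(r'(Bearer\s+)[A-Za-z0-9._\-]+'), r'\1[REDACTED]'),
]

def redact(text: str) -> str:
    """Replace likely credentials with a placeholder before the text
    leaves the user's boundary."""
    for pattern, replacement in PATTERNS:
        text = pattern.sub(replacement, text)
    return text

print(redact("API_KEY=sk_live_abc123 sent with Bearer eyJhbGci.xyz"))
# → API_KEY=[REDACTED] sent with Bearer [REDACTED]
```

Whether Vercel's redaction operates on raw text, structured logs, or model inputs is not stated in the announcement.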

  • Vercel is developing agentic infrastructure capabilities including automated incident investigation, performance analysis, and cost optimization
  • All personally identifiable information and sensitive credentials are anonymized and redacted before any data sharing or model training

Editorial Opinion

Vercel's decision to default Hobby plan users into AI training represents a significant shift in how developer platforms balance feature innovation with user privacy. While the opt-out mechanism and data anonymization are welcome safeguards, the approach raises questions about whether Hobby users (often students, startups, and individual developers with less negotiating power) are truly making informed choices about their data. The short opt-out window before the March 31, 2026 deadline, combined with tiered defaults that favor data collection for free users, suggests a pragmatic but ethically complex trade-off between funding AI development and respecting developer autonomy.

AI Agents · Regulation & Policy · Privacy & Data

© 2026 BotBeat