AI Could Break Network Effects: LLMs as 'Rented Cold Start' for Recommendation Systems
Key Takeaways
- Traditional platform network effects rely on observing user behavior at scale, creating correlation-based recommendations but lacking deeper understanding of 'why'
- LLMs could provide cross-domain inference (like connecting moving-related purchases to insurance needs) that single-platform data cannot
- Recommendation intelligence might become rentable via API, potentially lowering barriers to entry and disrupting traditional network effect moats
Summary
Technology analyst Benedict Evans argues that large language models represent a fundamental shift in how consumer internet platforms solve the "cold start" problem—the challenge of providing personalized recommendations before accumulating user data. Traditional platforms like Amazon, Google, and TikTok built network effects by observing user behavior at massive scale, creating barriers to entry for competitors. These systems excel at correlation ("people who bought X also bought Y") but lack deeper understanding of why users make choices or what products fundamentally are.
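The correlation-only mechanic Evans describes ("people who bought X also bought Y") can be sketched as simple co-occurrence counting. The basket data and function name below are illustrative, not any platform's actual system:

```python
from collections import Counter

def cooccurrence_recommend(baskets, item, top_n=3):
    """Recommend items that most often appear in the same basket as `item`.

    Pure correlation: the system learns that items co-occur, but has no
    notion of *why* they go together.
    """
    counts = Counter()
    for basket in baskets:
        if item in basket:
            counts.update(other for other in basket if other != item)
    return [other for other, _ in counts.most_common(top_n)]

# Illustrative purchase history: co-occurrence surfaces "bubble wrap"
# alongside "packing tape", but nothing here can infer that the customer
# is moving house and might want home insurance or broadband -- the gap
# Evans argues LLMs could fill.
baskets = [
    {"packing tape", "bubble wrap", "boxes"},
    {"packing tape", "bubble wrap"},
    {"packing tape", "marker pen"},
]
print(cooccurrence_recommend(baskets, "packing tape"))
```

The limitation is visible in the output: the recommender can only surface items already observed together at this platform, which is exactly why scale of behavioral data became a moat.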
Evans suggests LLMs could change this dynamic by providing a "step change in automated understanding" that goes beyond simple correlation. An LLM could potentially infer that a customer buying packing tape and bubble wrap is moving house and might need home insurance or broadband—connections that pure behavioral data wouldn't reveal. More significantly, this capability might become "just an API call from a world model," allowing new entrants to rent sophisticated recommendation capabilities rather than building their own user base first.
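What "just an API call from a world model" would look like in practice is exactly the open question; as a purely hypothetical sketch, a new entrant with no behavioral data of its own might hand a first-time customer's basket to a general-purpose model and ask for inferred intent. Every name below is invented for illustration, and the model client is deliberately left as a pluggable parameter:

```python
def build_intent_prompt(basket):
    """Turn a cold-start basket into a question for a general-purpose model.

    No user history is required: the 'understanding' is rented from the
    model rather than learned from this platform's own data.
    """
    items = ", ".join(sorted(basket))
    return (
        f"A first-time customer just bought: {items}. "
        "What life event, if any, does this suggest, and what adjacent "
        "products or services might they need next?"
    )

def rented_recommendations(basket, llm_call):
    # `llm_call` stands in for any hosted LLM API; this sketch makes no
    # assumption about which provider or endpoint a real system would use.
    return llm_call(build_intent_prompt(basket))

print(build_intent_prompt({"packing tape", "bubble wrap"}))
```

The point of the sketch is the shape of the dependency, not the prompt itself: the scarce asset shifts from a proprietary behavioral dataset to access to a capable general-purpose model.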
The analysis also explores how AI assistants could become another "blind man feeling the elephant" of user understanding, potentially aggregating insights across platforms through agentic behavior—making purchases, browsing, using wearables. Evans acknowledges uncertainty about how this will unfold, comparing the current moment to the web in 1997 or mobile in 2007: clearly transformative but with unclear implementation paths. The core tension remains the same problem he's written about extensively: the internet created infinite choice with inadequate discovery mechanisms, and AI represents the latest attempt to solve that fundamental challenge.
- AI assistants operating across platforms could aggregate partial user understanding from multiple sources, creating new forms of personalization
- The current AI moment parallels web (1997) and mobile (2007) transitions—clearly significant but with uncertain implementation pathways
Editorial Opinion
Evans makes a compelling case that LLMs could commoditize what was previously a defensible moat, but the analysis may underestimate the continued importance of proprietary behavioral data and real-time feedback loops. While general-purpose models can provide baseline understanding, platforms like Amazon and TikTok continuously refine recommendations through live A/B testing and immediate user response—a dynamic advantage that's harder to rent. The "rented cold start" thesis also assumes model capabilities will generalize sufficiently across domains, which remains unproven at the level of specificity that drives actual conversions and engagement.