All Case Studies
Technology

Notion

Prompt caching with Claude cuts AI costs 90% as Notion crosses $600M revenue and 100M users

AI Pilot Program · AI Transformation Program · Team Training
90%
API Cost Reduction
10-20% → 50%+
AI Adoption
$600M ARR
Revenue Growth

The Challenge

Notion, the collaborative workspace platform serving over 100 million users including more than 50% of Fortune 500 companies, faced a strategic inflection point as generative AI redefined expectations for knowledge work tools. Customers demanded AI that could synthesise information across documents, generate content in context, and assist with complex workflows — not just autocomplete.

Notion's existing architecture was optimised for fast, reliable document collaboration, but integrating AI at the depth required meant significant technical investment. The diversity of content types — structured databases, kanban boards, long-form prose, and embedded media — demanded AI models capable of understanding context across heterogeneous data representations. Enterprise customers also required strict data isolation guarantees that their proprietary content would not be used to train models accessible to other organisations.

The Approach

Notion pursued deep technical partnerships with Anthropic and OpenAI to integrate state-of-the-art language models directly into its platform. Rather than treating AI as a bolt-on feature, Notion rebuilt core infrastructure to support agentic AI capabilities with full workspace context awareness.

A critical architectural decision was implementing prompt caching with Claude, which reduced API costs by 90% and latency by up to 85% for AI-assisted features. This optimisation allowed Notion to offer AI capabilities at scale without prohibitive cost structures. Custom indexing and retrieval systems let AI models pull relevant context from across a user's entire workspace while maintaining enterprise-grade data isolation, with no cross-tenant leakage.
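To illustrate the mechanism behind those savings: Anthropic's prompt caching lets a caller mark a large, stable portion of the prompt (such as workspace context) with a `cache_control` breakpoint, so subsequent requests reuse the cached prefix instead of reprocessing it. The sketch below shows the request shape only; the model name, system text, and `WORKSPACE_CONTEXT` placeholder are illustrative assumptions, not Notion's actual implementation.

```python
# Sketch of a Messages API request using prompt caching.
# The large, rarely-changing context block carries a cache_control
# breakpoint; only the short user question varies per request.

# Hypothetical stand-in for retrieved workspace context (in practice,
# thousands of tokens of documents, database rows, etc.).
WORKSPACE_CONTEXT = "Project notes, meeting summaries, task boards..."

def build_request(question: str) -> dict:
    """Build a request body whose context prefix is cacheable."""
    return {
        "model": "claude-sonnet-4-5",  # illustrative model name
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": "You are a workspace assistant.",
            },
            {
                "type": "text",
                "text": WORKSPACE_CONTEXT,
                # Cache breakpoint: everything up to and including this
                # block is cached server-side and reused across requests,
                # which is where the cost and latency savings come from.
                "cache_control": {"type": "ephemeral"},
            },
        ],
        "messages": [{"role": "user", "content": question}],
    }

request_body = build_request("Summarise this week's project updates.")
```

Because the cached prefix is billed at a steep discount on cache hits, the economics improve most when many requests share the same long context, which matches the workspace-assistant access pattern described above.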

In September 2025, Notion launched version 3.0 with autonomous AI Agents and expanded MCP connectors, positioning its Business and Enterprise tiers around AI-driven automations. Notion bundled AI capabilities exclusively into these higher tiers, increasing average revenue per user across its 4 million paying customers.

Results

90%
API Cost Reduction
Prompt caching with Claude reduced AI inference costs by 90% and latency by up to 85%, enabling AI features at scale
10-20% → 50%+
AI Adoption
AI feature adoption among customers rose from 10-20% to over 50% within a year of deep integration
$600M ARR
Revenue Growth
Revenue grew from $400M in 2024 to $600M by December 2025, driven partly by AI-bundled Business and Enterprise tier upgrades

This is an industry case study based on publicly available information. Notion is not a Pertama Partners client.

Want results like these for your organisation?

We help enterprises across Southeast Asia design and deliver AI transformation programs. Let’s talk about what’s possible for your team.