The retention chart that shows what your aggregate dashboard hides — whether new signups today behave better, worse, or the same as the ones from six months ago.
Research-based overview. This article synthesizes documentation from PostHog, Mixpanel, Amplitude, and standard cohort-analysis methodology used in product analytics.
The simplest cohort analysis looks like this. The rows are signup month, the columns are months-since-signup, and the cells are the percentage of that cohort still active.
Signup cohort   | M0    M1    M2    M3    M4    M5
----------------|------------------------------------
2025-11 (n=84)  | 100%  62%   48%   41%   38%   36%
2025-12 (n=97)  | 100%  64%   51%   45%   40%   ----
2026-01 (n=120) | 100%  58%   46%   39%   ----  ----
2026-02 (n=143) | 100%  55%   42%   ----  ----  ----
2026-03 (n=167) | 100%  51%   ----  ----  ----  ----
2026-04 (n=181) | 100%  ----  ----  ----  ----  ----
Read the table left-to-right and you see how each cohort decays over time. Read top-to-bottom in any column and you see whether new cohorts retain better or worse than older ones at the same age. In this example, M1 retention has fallen from 64% (Dec) to 51% (Mar). Something broke between December and March that's hurting the new-user experience — likely an onboarding regression, a marketing channel change that's pulling lower-quality signups, or a pricing/positioning shift. The aggregate retention number for the whole product would have masked this; the cohort table puts it on the screen.
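A table like this needs no analytics vendor to produce. Here is a minimal sketch in pure Python, assuming a hypothetical input of per-user records where each user carries a signup month and the set of months-since-signup in which they were active (the record shape and sample data are illustrative, not from the article):

```python
from collections import defaultdict

# Hypothetical input: (user_id, signup_month, months-since-signup active in).
# Month 0 is the signup month itself, so it is always present.
users = [
    ("u1", "2025-11", {0, 1, 2}),
    ("u2", "2025-11", {0, 1}),
    ("u3", "2025-11", {0}),
    ("u4", "2025-12", {0, 1, 2}),
    ("u5", "2025-12", {0, 2}),
]

def retention_table(users, max_age=3):
    """Percent of each signup cohort active at each age in months."""
    cohorts = defaultdict(list)
    for _, signup_month, active_ages in users:
        cohorts[signup_month].append(active_ages)
    table = {}
    for month, members in sorted(cohorts.items()):
        n = len(members)
        table[month] = [
            round(100 * sum(age in a for a in members) / n)
            for age in range(max_age)
        ]
    return table

print(retention_table(users))
# e.g. {'2025-11': [100, 67, 33], '2025-12': [100, 50, 100]}
```

Reading down a column of the returned dict compares cohorts at the same age, exactly as with the table above.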
Aggregate metrics — total active users, monthly churn rate, ARR — mix together customers who signed up six months ago with customers who signed up last week. The math hides a lot. If your old, sticky cohorts mask a deteriorating new-cohort experience, your top-line retention can stay constant while the underlying business rots.
Three patterns that aggregate reporting hides but cohort analysis surfaces:
This is the case Andrew Chen made years ago in "The power user curve" and what every product analytics tool now bakes in: any time-series metric is more honest as a cohort analysis than as an aggregate. The tradeoff is that cohorts need more data to be readable — you need enough users per cohort that the percentages are meaningful.
This is the table above. Rows = signup cohort, columns = months since signup, cells = retention %. Define "retained" clearly: it might mean "logged in at least once," "performed the core action," or "is still paying." For SaaS, "is still on a paid plan" is usually the metric that matters most. For free or freemium products, "performed core action in this month" works better.
Same table shape, but cells contain MRR per surviving customer rather than retention percentage. This shows whether each cohort is expanding or contracting in spend. A healthy SaaS sees revenue per cohort flat or rising over time (expansion offsets some churn). A contracting cohort means downgrades and churn outpace expansion. Combining this with retention gives you a full picture of what each cohort is worth at any age — the LTV by cohort. Our piece on customer lifetime value covers how to extend this into LTV calculations.
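The revenue variant is the same aggregation with a different cell formula. A sketch under the assumption that you can pull each customer's MRR by months-since-signup (a missing month meaning the customer was no longer paying); the data shape is hypothetical:

```python
from collections import defaultdict

# Hypothetical input: (signup_month, {age: mrr_that_month}).
# An absent age means the customer was not paying at that age.
mrr_rows = [
    ("2025-11", {0: 50, 1: 50, 2: 70}),  # expanded at month 2
    ("2025-11", {0: 50, 1: 50}),         # churned after month 1
    ("2025-12", {0: 30, 1: 30, 2: 30}),  # flat
]

def revenue_per_survivor(rows, max_age=3):
    """Average MRR among customers still paying at each cohort age."""
    cohorts = defaultdict(list)
    for month, by_age in rows:
        cohorts[month].append(by_age)
    out = {}
    for month, members in sorted(cohorts.items()):
        cells = []
        for age in range(max_age):
            paying = [m[age] for m in members if age in m]
            cells.append(round(sum(paying) / len(paying), 2) if paying else None)
        out[month] = cells
    return out

print(revenue_per_survivor(mrr_rows))
```

A rising row means expansion among survivors; pairing this with the retention table separates "who stayed" from "what the stayers are worth."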
Define your activation event — the action that predicts retention. For Slack, it was sending the first 2,000 messages. For Dropbox, it was syncing one file across two devices. Then chart the percentage of each cohort that hit the activation event in their first session, first day, first week. If activation drops for a recent cohort, you have a window to reproduce the problem before it becomes retention damage 30 days later.
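The first-session / first-day / first-week breakdown is one function with a sliding window. A minimal sketch, assuming hypothetical signup records that carry hours-to-activation (or None if the user never activated):

```python
from collections import defaultdict

# Hypothetical input: (user_id, signup_month, hours until activation event,
# or None if the user never hit it).
signups = [
    ("u1", "2026-03", 2),     # activated within the first day
    ("u2", "2026-03", 30),    # activated within the first week
    ("u3", "2026-03", None),  # never activated
    ("u4", "2026-04", None),
    ("u5", "2026-04", 1),
]

def activation_by_cohort(rows, window_hours=24):
    """Percent of each signup cohort activating within the window."""
    cohorts = defaultdict(list)
    for _, month, hours in rows:
        cohorts[month].append(hours is not None and hours <= window_hours)
    return {m: round(100 * sum(v) / len(v)) for m, v in sorted(cohorts.items())}

print(activation_by_cohort(signups))          # first-day activation
print(activation_by_cohort(signups, 24 * 7))  # first-week activation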
Cohort analysis breaks down at small N. If a cohort has 12 people, one churn moves the percentage by 8 points and the noise dominates. The rule of thumb in product analytics: you need at least 50 users per cohort for monthly retention numbers to be readable, and 100+ for confident decisions.
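The arithmetic behind the rule of thumb checks out directly: one churn in a cohort of n moves the percentage by 100/n points, and the binomial standard error of a retention estimate, sqrt(p(1-p)/n), only shrinks with the square root of cohort size. A quick sketch:

```python
import math

def one_churn_swing(n):
    """Percentage points that one customer's churn moves the cohort metric."""
    return 100 / n

def retention_stderr(p, n):
    """Binomial standard error of a retention estimate, in points."""
    return 100 * math.sqrt(p * (1 - p) / n)

# Worst-case noise (p = 0.5) at various cohort sizes.
for n in (12, 50, 100):
    print(n, round(one_churn_swing(n), 1), round(retention_stderr(0.5, n), 1))
```

At n=12 the swing is 8.3 points and the standard error around 14 points, so month-to-month movement is mostly noise; at n=100 both drop to the point where a 5-point shift is worth investigating.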
If you're below that threshold, options include:
Once you're past 50 paying customers per cohort — usually around $3K–$5K MRR with monthly billing — cohort retention becomes your most useful retention metric. Below that, treat the chart as directional and rely heavily on qualitative input.
You don't need a data warehouse to run cohort analysis. The three most-used tools by indie SaaS founders:
The real bottleneck is rarely the tool. It's sending the right events. If you're not already tracking signup, activation, and subscription_paid, the analytics tool is empty. Wire these up first; pick the cohort tool second.
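What "wiring up events" amounts to is small. A vendor-neutral sketch of the three events named above; the function, property names, and values are illustrative, not any specific tool's API:

```python
import json
import time

def track(user_id, event, **props):
    """Record one analytics event; in production, send to your tool of choice."""
    record = {
        "user_id": user_id,
        "event": event,
        "timestamp": time.time(),
        "properties": props,
    }
    print(json.dumps(record))  # stand-in for the vendor SDK call
    return record

# The three events the article says to wire up first.
track("u42", "signup", plan="trial", channel="organic")
track("u42", "activation", core_action="first_report_created")
track("u42", "subscription_paid", mrr=29, plan="starter")
```

Once these three fire reliably, every chart in this article is a grouping and a division away, in any of the tools mentioned.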
The point of cohort analysis is to make decisions, not to admire the chart. The most common reads and the actions they map to:
The cohort chart is a diagnostic instrument. It tells you where the problem is. The fix usually lives in onboarding (bad activation), pricing (no expansion mechanism), positioning (wrong-fit customers), or channel mix (acquisition drift). For tactical interventions specifically on the retention side, our SaaS churn reduction playbook walks through the highest-leverage moves.
Cohort analysis is the antidote to misleading aggregate metrics. By grouping users by signup period and tracking each group over time, you see whether your business is structurally improving, deteriorating, or flat — signals that single-number dashboards miss. For solo founders, the practical setup is: retention by signup month is the default chart, revenue per cohort is the second, activation per cohort is the third. PostHog or Mixpanel give you the chart in minutes; SQL on your own database works at any scale. Below 50 paying customers per cohort, the chart is too noisy to act on, so combine it with qualitative churn interviews. Above that, it becomes the most useful retention chart you have.