This is an intermediate course for PMs who can already read a dashboard — and want to learn how to read between the dashboards.

I teach it at Scaler School of Business over 9 sessions. The students come in with 2-5 years of product experience. Most of them have access to data. Many of them are using that data to answer the wrong questions.

The core distinction

The most important concept in the course is the difference between convergent and divergent data analysis.

Convergent analysis starts with a hypothesis and looks for evidence to confirm or refute it. It's the mode most analysts default to. It's efficient and legible. It's also how you systematically miss the most important signals in your data — the ones you didn't know to look for.

Divergent analysis starts with the data and asks: what's strange here? What doesn't fit my model? What would I have to believe for this pattern to make sense? It's slower, messier, and often produces nothing useful. When it works, it finds things that would never emerge from a convergent analysis.

Great product analytics requires both. The course teaches when to use which, and how to move between them.

Session structure

Sessions 1-3: The RCA Toolkit. Root cause analysis for product problems. Structured approaches to diagnosing why a metric moved, isolating confounds, and avoiding the classic traps (mistaking correlation for causation, survivorship bias, Simpson's Paradox in practice). We use the MakeMyTrip NPS case as the primary example.
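Simpson's Paradox is the trap students find hardest to believe until they compute it themselves: a variant can win inside every segment and still lose overall when the segment mix differs between groups. A minimal sketch with invented numbers (the segments and figures below are illustrative, not from any case in the course):

```python
# Hypothetical conversion data illustrating Simpson's Paradox:
# variant B beats variant A within every segment, yet A wins overall
# because the two variants have very different segment mixes.
segments = {
    # segment: (A_conversions, A_users, B_conversions, B_users)
    "power_users":  (90, 100, 28, 30),    # A: 90%,  B: ~93%
    "casual_users": (10, 50, 120, 500),   # A: 20%,  B: 24%
}

for name, (a_conv, a_n, b_conv, b_n) in segments.items():
    print(f"{name}: A={a_conv/a_n:.0%}  B={b_conv/b_n:.0%}")

# Pooled rates flip the conclusion: A's traffic skews toward the
# high-converting segment, B's toward the low-converting one.
a_total = sum(s[0] for s in segments.values()) / sum(s[1] for s in segments.values())
b_total = sum(s[2] for s in segments.values()) / sum(s[3] for s in segments.values())
print(f"overall: A={a_total:.0%}  B={b_total:.0%}")  # A wins overall
```

The fix is never "trust the aggregate" or "trust the segments" by rule; it's asking which comparison matches the decision you're actually making.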

Sessions 4-5: A/B Experimentation Design. How to design an experiment you can actually learn from. Sample size calculation. How to handle novelty effects. When not to run an A/B test. What to do when your experiment produces null results. The Spotify personalization case study.
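For sample size calculation we use the standard two-proportion z-test approximation. A minimal sketch of that textbook formula (function name and defaults are illustrative; real experimentation platforms wrap the same math):

```python
import math
from statistics import NormalDist

def sample_size_per_arm(p_baseline, mde_abs, alpha=0.05, power=0.8):
    """Approximate users needed per arm for a two-proportion z-test.

    p_baseline: current conversion rate (e.g. 0.10 for 10%)
    mde_abs:    minimum detectable effect, absolute (0.01 = 1pp lift)
    """
    p_treatment = p_baseline + mde_abs
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    variance = p_baseline * (1 - p_baseline) + p_treatment * (1 - p_treatment)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / mde_abs ** 2)

# Detecting a 1pp lift on a 10% baseline takes ~15k users per arm;
# halving the detectable effect roughly quadruples the requirement.
n_1pp = sample_size_per_arm(0.10, 0.01)
n_2pp = sample_size_per_arm(0.10, 0.02)
print(n_1pp, n_2pp)
```

The quadratic cost of smaller effects is the practical punchline: it's why "just run a test" is often the wrong answer for low-traffic features.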

Sessions 6-7: Data Storytelling. This is the one students underestimate. Knowing what the data says isn't enough. You have to translate analysis into decisions, and decisions require communication. We work through the Zynga metrics collapse case — a company that had too much data and couldn't act on any of it — and the Sharechat user retention case.

Sessions 8-9: The Full Case. Uber Events — a feature that Uber tested, launched, and quietly killed. We go through the full analytics lifecycle: what metrics they used, what the data showed, what they missed, and what a better analysis would have looked like.

What students leave with

A sharper instinct for the question behind the question. When someone shows you a metric, the first thing you should ask is: what's this metric not measuring? The course tries to make that reflex automatic.