
Heatmaps + Session Replay: The Complete UX Analysis Stack

Heatmaps show you where the problem is. Session replay shows you why it's happening. Here's how to use them together to fix UX issues faster and with more confidence.

Grain Team

Grain Analytics · 10 min read

A heatmap tells you that 40% of users click a non-interactive element on your pricing page. That's useful. But it doesn't tell you what those users were trying to do, what happened after they clicked, or whether they eventually found their way to the right action.

A session replay shows you one user struggling with your checkout form for three minutes. That's vivid. But it doesn't tell you whether that user's experience is typical or an outlier, and watching recordings one by one doesn't scale.

Neither tool alone gives you the full picture. Together, they do.

What each tool is actually for

Heatmaps: the aggregate view

Heatmaps compress thousands of sessions into a single visual. They answer questions about patterns across your entire user base:

  • Where do most users click?
  • How far does the average visitor scroll?
  • What elements attract the most attention?
  • Where does engagement drop off?

The strength of heatmaps is statistical. They show you what's true for the population, not for an individual. When a click map shows a hot spot on a non-clickable image, that's not one confused user — it's hundreds of users running into the same false affordance.
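Conceptually, a click map is just spatial binning: raw click coordinates are bucketed into grid cells, and each cell's count becomes its "heat." A minimal sketch of that idea (the coordinates, cell size, and `click_map` helper are illustrative, not any particular tool's API):

```python
from collections import Counter

def click_map(clicks, cell_px=50):
    """Bin raw (x, y) click coordinates into a grid of cell_px
    squares; each cell's count is its 'heat'."""
    grid = Counter()
    for x, y in clicks:
        grid[(x // cell_px, y // cell_px)] += 1
    return grid

# Hundreds of users clicking the same non-interactive image
# pile up in one cell -- the hot spot the click map surfaces.
clicks = [(420, 310), (431, 305), (418, 322), (900, 40)]
heat = click_map(clicks)
print(heat.most_common(1))  # hottest cell and its click count
```

The aggregation is exactly why context disappears: once clicks become counts per cell, the order and outcome of each individual click is gone.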

The limitation is that heatmaps strip away context. You can see the "what" and the "where" but not the "why" or the "what happened next."

Session replay: the individual view

Session replays give you the full story of a single user's experience. You see their cursor movements, their clicks, their scroll behavior, their hesitations, and their frustrations in sequence. Context is preserved — you know what happened before, during, and after each action.

The strength of session replay is narrative. You can see the exact sequence of events that led to abandonment, the moment of confusion, the workaround a user invented when the intended path didn't work.

The limitation is scale. Watching ten replays gives you anecdotes. You need heatmaps to know whether those anecdotes represent a pattern worth fixing.

The workflow: heatmap first, replay second

The most effective UX analysis follows a consistent sequence: spot the pattern in the heatmap, then explain the pattern with session replays.

Step 1: Identify the anomaly in your heatmap

Open your heatmap and look for something unexpected. Not just "hot" or "cold" — specifically something that doesn't match what the page was designed to do.

Good anomaly signals:

  • A CTA that's cold while a non-interactive element is hot
  • A sharp scroll depth drop at an unexpected point
  • Click concentration in an area with no meaningful content
  • High cursor activity on an element that users rarely click

Each of these is a question that the heatmap raises but cannot answer.

Step 2: Filter your session replays to the anomaly

Don't watch random replays. Filter to sessions where users interacted with the anomalous element or reached the problem area of the page.

If your heatmap shows rage clicks on a product image, filter replays to sessions that include clicks on that image. If your scroll map shows a cliff at 45% page depth, filter to sessions where users reached approximately that point and then left.
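Most replay tools expose this as a filter in their UI, but the underlying logic is simple: keep only the sessions whose event stream touches the anomalous element. A sketch under assumed data shapes (the session/event dictionaries, CSS selectors, and the `matching_sessions` helper are hypothetical, not a specific vendor's API):

```python
def matching_sessions(sessions, selector):
    """Return sessions whose event stream includes a click on the
    anomalous element, identified here by a CSS selector."""
    return [
        s for s in sessions
        if any(e["type"] == "click" and e["target"] == selector
               for e in s["events"])
    ]

# Hypothetical session records: only session "a" clicked the image
# the heatmap flagged, so only it lands on the watch list.
sessions = [
    {"id": "a", "events": [{"type": "click", "target": ".product-img"}]},
    {"id": "b", "events": [{"type": "click", "target": ".buy-cta"}]},
]
to_watch = matching_sessions(sessions, ".product-img")
```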

This is where the two tools connect. The heatmap told you where to look. Now the replays show you what actually happened there.

Step 3: Watch for the story behind the data

In 10-15 filtered replays, you'll start seeing the narrative that the heatmap couldn't tell you. Users click the product image because they expect it to open a demo. Users stop scrolling at 45% because that's where a horizontal data table appears and it's unreadable on mobile. Users rage-click the pricing tier name because it looks like a toggle between plan options.

The heatmap gave you the "what." The replays give you the "why." Now you have enough to act.
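Frustration signals like the rage clicks above are usually detected with a simple heuristic: several clicks in a small area within a short time window. A minimal sketch with illustrative thresholds (the window, radius, and minimum click count are assumptions, not an industry standard):

```python
def rage_clicks(clicks, window_ms=1000, radius_px=30, min_clicks=3):
    """Flag bursts of rapid clicks in a small area.

    `clicks` is a time-ordered list of (t_ms, x, y) tuples.
    """
    bursts = []
    for i, (t0, x0, y0) in enumerate(clicks):
        burst = [(t0, x0, y0)]
        for t, x, y in clicks[i + 1:]:
            if t - t0 > window_ms:
                break
            if abs(x - x0) <= radius_px and abs(y - y0) <= radius_px:
                burst.append((t, x, y))
        if len(burst) >= min_clicks:
            bursts.append(burst)
    return bursts

# Three clicks on one spot within 600 ms reads as frustration;
# the same three clicks spread over ten seconds does not.
frustrated = rage_clicks([(0, 100, 100), (300, 104, 98), (600, 99, 103)])
calm = rage_clicks([(0, 100, 100), (5000, 104, 98), (10000, 99, 103)])
```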

Scenario 1: Form abandonment

What the form heatmap shows

Your click map on the signup form page shows heavy clicks on the first three fields (name, email, company) and almost no clicks on fields four through seven. Your scroll map confirms that users see the entire form — they scroll far enough. They just stop filling it out.

What you don't know yet

Why do users stop at field four? Is the field confusing? Is it asking for information they don't have? Is there a validation error? Is the form just too long and field four is where patience runs out?

What form replays reveal

You filter replays to sessions where users interacted with field three but not field four. In twelve replays, you see a clear pattern: field four asks for "Company Size" with a dropdown menu. Users open the dropdown, scan the options (1-10, 11-50, 51-200, 201-1000, 1001+), and then pause. Several users close the dropdown without selecting. Two users type in the field as if it were a text input. One user selects "1-10" and then immediately deletes it.

The insight: the Company Size field feels intrusive at this stage. Users aren't sure why you need this information, and the options feel like they'll sort users into sales tiers. It's a trust problem, not a UX problem.

The form fix

Make Company Size optional, or move it to a post-signup onboarding flow where users understand why the information matters. The heatmap showed you the drop-off. The replays showed you it was about trust, not usability. Without both signals, you might have redesigned the dropdown when the real fix was removing it.

Scenario 2: Checkout friction

What the checkout heatmap shows

Your click map on the checkout page shows an unusual hot spot on the "Back" button or browser back navigation — much hotter than you'd expect on a page where users are about to complete a purchase. Scroll depth is high (users see the whole page), but the "Complete Purchase" button gets fewer clicks than the page's traffic would predict.

What checkout replays reveal

You watch replays of users who reached checkout but clicked "Back" or left without purchasing. The pattern emerges quickly: users scroll down to review their order, see the shipping cost for the first time, and hesitate. Some click back to the product page — perhaps checking whether free shipping is mentioned. Others scroll up and down the checkout page repeatedly, looking for a promo code field that doesn't exist. Two users go idle for a long stretch, most likely off in another tab searching for discount codes.

The checkout fix

The issue isn't the checkout UX. It's shipping cost surprise. Users expected free shipping or at least wanted to know the cost before reaching checkout. The heatmap showed you that users were bouncing from a page they should have been completing. The replays showed you the precise moment of friction: the first appearance of the shipping line item.

Solutions: show shipping estimates on the product page, offer a free shipping threshold, or add a promo code field so users who search for codes can apply them without leaving. These are business decisions informed by behavioral evidence, not guesswork.

Scenario 3: Feature discovery

What the dashboard heatmap shows

Your move map on the product dashboard shows that cursor attention concentrates on two of your five main features. The other three features sit in attention cold zones — users barely look at them, let alone click them. Your click map confirms: the three cold features get negligible interaction.

What onboarding replays reveal

You watch replays of users during their first few sessions to understand onboarding behavior. You notice that users follow a predictable path: they explore the two visible features (which are above the fold and prominently styled), accomplish their initial task, and then leave. They don't scroll down. They don't explore the sidebar where the other features are listed. They don't know those features exist.

In five replays, you see users manually doing something that one of the undiscovered features automates. One user exports data to a spreadsheet to build a chart that your built-in reporting feature already provides. Another user screenshots the dashboard to share with a colleague — unaware of the share button tucked in a menu.

The discovery fix

This isn't a feature quality problem. It's a discovery problem. The heatmap told you which features are invisible. The replays showed you that users would benefit from those features if they knew they existed.

Possible solutions: a guided onboarding flow that introduces all five features, contextual prompts ("Did you know you can share this dashboard?"), or a layout change that gives equal visual weight to all core features. The approach depends on your product philosophy, but the diagnosis is clear.

When to use each tool alone

Not every question needs both tools. Some situations call for one or the other.

Heatmaps alone work well when:

  • You need to validate CTA placement across a large sample
  • You're comparing behavior before and after a design change
  • You want a quick health check on page engagement
  • You're deciding where to position new content on an existing page

Session replay alone works well when:

  • A specific user reported a bug and you need to see what happened
  • You're doing a pre-launch UX review with a small group
  • You need to understand a complex multi-step workflow
  • You want to watch how power users navigate differently from new users

Use both when:

  • You've found a conversion problem and need to diagnose the cause
  • A heatmap shows an anomaly that doesn't make obvious sense
  • You're preparing a case for a significant UX redesign
  • You want to move from "something is wrong" to "here's specifically what's wrong and why"

For a deeper look at structuring your heatmap analysis, see our guide on heatmaps that drive product decisions. For session replay methodology, session replay best practices covers the filtering and review workflow in detail.

Building the habit

The teams that get the most from this combination don't treat it as a special investigation technique. They build it into their regular workflow.

Weekly pattern review. Spend 20 minutes reviewing heatmaps on your key pages. Flag any anomalies. For each anomaly, queue up 10 filtered session replays for the following day. Write down the finding in one sentence.

Post-launch check. Every time you ship a change to a high-traffic page, run a fresh heatmap after collecting sufficient data (usually a few days). Compare it to the previous heatmap. If behavior shifted unexpectedly, pull up replays to understand why.

Quarterly deep dive. Pick your three highest-traffic pages and run the full analysis: click map, scroll map, move map, plus 15-20 targeted replays per page. Document findings, prioritize fixes, and track whether the next quarter's heatmaps show improvement.
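The post-launch comparison lends itself to automation: express each heatmap as a cell-to-share mapping and flag cells whose share of clicks shifted past a threshold. A minimal sketch (the cell coordinates, shares, and threshold are illustrative):

```python
def heatmap_diff(before, after, min_shift=0.05):
    """Compare two click maps expressed as cell -> share of total
    clicks; return cells whose share shifted past min_shift.

    Flagged cells are the ones worth pulling replays for.
    """
    cells = set(before) | set(after)
    return {
        c: round(after.get(c, 0.0) - before.get(c, 0.0), 4)
        for c in cells
        if abs(after.get(c, 0.0) - before.get(c, 0.0)) > min_shift
    }

# After a redesign, the old CTA cell cooled off and a new cell
# heated up; the small shift elsewhere stays below the threshold.
before = {(8, 6): 0.40, (2, 1): 0.10}
after = {(8, 6): 0.15, (2, 1): 0.12, (5, 9): 0.30}
shifted = heatmap_diff(before, after)
```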

The cadence matters less than the consistency. A team that reviews heatmaps and replays every week will find and fix more UX problems in a month than a team that runs a big analysis once a quarter.


Heatmaps give you the map. Session replay gives you the ground truth. The map tells you where to look. The ground truth tells you what's actually happening there. Skip either one and you're working with half the information you need.

If you're already using one of these tools, add the other. If you're using neither, start with heatmaps on your most important page, find the first anomaly, and watch the replays that explain it. That first insight will make the case for the workflow better than any argument.
