
700 Users Tried to Fill Out a Form. Zero Succeeded. Nobody Knew.

How an AI-powered deep investigation uncovered a broken consultation form, 1,600 dead clicks, and a Facebook campaign burning budget on a page nobody scrolled — all in one weekly report.

Grain Team

Grain Analytics · 8 min read

A fast-growing online education platform in Turkey tripled their signup conversion rate in a single week. Traffic was up 56%. Their Instagram campaigns were driving record numbers to a free consultation landing page. By every standard dashboard metric, things were going well.

Underneath the surface, three things were quietly failing — and none of them showed up in the numbers the team was watching.

What the dashboard showed

The weekly traffic report looked strong. User volume had grown from about 5,800 to over 9,000 unique visitors. Signup events jumped from 48 to 218 — a 354% increase. The consultation landing page was getting nearly 1,900 visitors, more than double the prior week.

If you stopped there, you'd think everything was working. Most teams stop there.

What the deep investigation found

Grain's AI investigation — Kai — runs a weekly analysis that goes beyond traffic and conversion summaries. It cross-references behavioral data (clicks, scroll depth, rage clicks, dead clicks, form interactions) with funnel progression, traffic source performance, and page-level engagement. The goal is to surface problems that aggregate metrics hide.

This week, it surfaced three.

A consultation form that captured zero leads

The consultation landing page had 1,888 visitors. Over 700 of them interacted with the form — selecting options from dropdowns, filling in fields, actively trying to submit. The "Select Department" field alone had 701 clicks. The "Select Class" field had 643.

Total successful form submissions: zero.

Not low. Zero.

The form looked fine visually. It loaded correctly. The fields responded to input. But the submit button wasn't completing the submission — likely a JavaScript handler issue specific to mobile in-app browsers. The majority of traffic to this page came from Instagram, which opens links in its own embedded browser. That browser handles form submissions and redirects differently from Chrome or Safari, and the form broke silently in that environment.

The team had no error alerts because there was no server-side error. The form didn't crash — it just didn't do anything when you tapped "Submit." The only way to detect this was to notice the gap between field-level interactions (hundreds) and completed submissions (zero). That's what Kai flagged.
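That gap check is easy to express once field-level and submission events land in the same stream. A minimal sketch, assuming a hypothetical flat event export with `type` and `form` fields (the names are illustrative, not Grain's actual schema):

```javascript
// Flag forms where users interact heavily but submissions never complete.
// `events` is a hypothetical flat export of behavioral events, e.g.
// { type: "field_interaction" | "form_submit_success", form: "consultation" }
function findSilentFormFailures(events, minInteractions = 100) {
  const stats = new Map();
  for (const e of events) {
    if (!stats.has(e.form)) stats.set(e.form, { interactions: 0, submissions: 0 });
    const s = stats.get(e.form);
    if (e.type === "field_interaction") s.interactions += 1;
    if (e.type === "form_submit_success") s.submissions += 1;
  }
  // A form with hundreds of interactions and zero successes is almost
  // certainly broken, not unpopular.
  return [...stats.entries()]
    .filter(([, s]) => s.interactions >= minInteractions && s.submissions === 0)
    .map(([form, s]) => ({ form, ...s }));
}
```

The threshold matters: a form with three interactions and no submissions is noise, while one with 700 is a defect. Setting `minInteractions` well above normal abandonment levels keeps the check quiet until something is genuinely wrong.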

The revenue impact: every one of those 700+ form interactions represented a potential consultation booking. At the platform's typical consultation-to-enrollment rate, even a conservative conversion estimate puts the weekly loss in the hundreds of leads — during the highest-traffic week the page had ever seen.

Catch what dashboards miss

Kai investigates your behavioral data weekly — cross-referencing form interactions, dead clicks, scroll depth, and funnel drops to surface the problems that aggregate metrics hide.

See what Kai finds

1,651 dead clicks on one page

The platform's instructor listing page — one of its highest-converting pages — was generating massive frustration. Users were clicking on instructor photos (642 clicks), instructor card containers (981 clicks), and "Quota Full" labels (127 clicks, plus 13 rage clicks). None of these elements did anything.

The instructor photos looked clickable — they had the visual styling of linked elements. Users expected to tap a photo and see the instructor's full profile. Instead, nothing happened. The card containers had rounded corners, shadows, and hover-like styling that implied the entire card was interactive. Only a small "View Profile" button inside each card actually worked.

And the "Quota Full" labels were dead ends. Users were clicking them repeatedly — some hard enough to trigger rage click detection — because there was no alternative path. No waitlist. No "Notify me when available." Just a label that told you "no" without offering a next step.
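Both signals can be derived from a raw click stream. A rough sketch of the classification, assuming each click records a target selector, a timestamp, and whether the target had any link or handler (field names are illustrative):

```javascript
// Classify a click stream into dead clicks and rage-click targets.
// A dead click hits an element with no link or handler; a rage click is
// a burst of repeated clicks on the same element in a short window.
function classifyClicks(clicks, { burstSize = 3, burstWindowMs = 2000 } = {}) {
  const deadCount = clicks.filter((c) => !c.interactive).length;

  // Group clicks by target so bursts can be detected per element.
  const byTarget = new Map();
  for (const c of clicks) {
    const list = byTarget.get(c.target) ?? [];
    list.push(c);
    byTarget.set(c.target, list);
  }

  const rageTargets = [];
  for (const [target, list] of byTarget) {
    list.sort((a, b) => a.ts - b.ts);
    for (let i = 0; i + burstSize - 1 < list.length; i++) {
      // burstSize clicks inside burstWindowMs on one element = rage click.
      if (list[i + burstSize - 1].ts - list[i].ts <= burstWindowMs) {
        rageTargets.push(target);
        break;
      }
    }
  }
  return { deadCount, rageTargets };
}
```

The "Quota Full" labels would surface in both buckets: every tap is a dead click, and the repeated taps from frustrated users cross the burst threshold.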

This page had seen a 16.5% decline in traffic compared to the prior week. The investigation's hypothesis: users who encountered this friction once were less likely to come back. The page that should have been the platform's strongest conversion path was actively pushing people away.

A Facebook campaign nobody scrolled

The platform was running a Facebook campaign driving users to a dedicated landing page for a specific educational program. The campaign delivered 4,232 users that week. The signup rate from that traffic: 0.07%.

Kai identified the root cause by looking at scroll depth data: 98% of users who landed on this page from Facebook never scrolled past the initial viewport. They saw whatever was above the fold and left. The page also recorded 142 dead clicks on non-interactive text elements — users were tapping on headlines and promotional text expecting them to be links.
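The scroll finding reduces to one number per session: the deepest point each visitor reached. A minimal sketch of the aggregation, assuming a hypothetical per-session summary with a `maxScrollPct` field (0–100, share of page height reached):

```javascript
// Share of sessions whose maximum scroll depth never passed the fold.
// `sessions` is a hypothetical per-session summary; foldPct approximates
// how much of the page the initial viewport covers.
function noScrollShare(sessions, foldPct = 25) {
  if (sessions.length === 0) return 0;
  const stuck = sessions.filter((s) => s.maxScrollPct <= foldPct).length;
  return stuck / sessions.length;
}
```

A value like 0.98 on a landing page is not a traffic problem; it is an above-the-fold content problem, which is exactly how the finding redirected the fix.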

For comparison, referral traffic landing on the platform's instructor and pricing pages — without any paid promotion — was converting at 10%. That's a 142x gap in conversion efficiency between the paid and organic paths.

The landing page wasn't broken in a technical sense. It loaded fast. It rendered correctly. But the content above the fold wasn't compelling enough to make Facebook users (who arrive with low intent and short attention spans from a social feed) invest even a single scroll gesture. And the elements they did try to interact with weren't interactive.

What happened next

The investigation generated a prioritized action list. The team fixed the consultation form's mobile submission handler within two days. They made instructor cards fully clickable and replaced "Quota Full" dead ends with a waitlist capture form. They redirected the Facebook campaign to the instructor page — which was already converting at 10% from other traffic sources — instead of the underperforming landing page.

The form fix alone started capturing leads from the next Instagram push. The dead click fixes on the instructor page reduced frustration events by over 80% in the following week. The Facebook redirect is still being measured, but early signals show significantly higher engagement than the previous landing page.

None of these fixes required major engineering effort. The consultation form was a few lines of JavaScript. The clickable cards were a CSS and link-wrapping change. The Facebook redirect was a URL swap in the ad manager. The hard part wasn't the fix — it was knowing the problem existed.
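The card fix is the kind of change a report can hand straight to a developer. One common pattern, sketched here with illustrative class names and a hypothetical URL, is to wrap the whole card in the link the "View Profile" button already points to:

```html
<!-- Before: only the inner button is interactive -->
<div class="instructor-card">
  <img class="instructor-photo" src="..." alt="Instructor">
  <a class="view-profile" href="/instructors/123">View Profile</a>
</div>

<!-- After: the entire card is one link, so taps on the photo
     and the card body go to the profile too -->
<a class="instructor-card" href="/instructors/123">
  <img class="instructor-photo" src="..." alt="Instructor">
  <span class="view-profile">View Profile</span>
</a>
```

The styling that made the card look clickable stays the same; the markup just starts honoring the expectation it was already creating.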

Find out what your site is hiding

Grain's weekly AI investigation analyzes every page, every click, and every form interaction to surface the problems your dashboard doesn't show. Most teams find something in the first report.

Start your first investigation

Why weekly dashboards miss these problems

Each of these three issues — the broken form, the dead clicks, the non-scrolling landing page — shared a common trait: the aggregate metrics looked acceptable.

The consultation page had high traffic. Dashboard says: great. But traffic without conversions is just server load. You need to look at the gap between form field interactions and submissions to see the break, and no standard dashboard widget shows that.

The instructor page had decent click-through on the "View Profile" button. Dashboard says: engagement is fine. But the 1,651 dead clicks on non-functional elements aren't visible unless you're tracking frustration signals specifically. Standard heatmaps show where clicks happen — dead click detection shows where clicks happen and fail.

The Facebook campaign drove thousands of users. Dashboard says: campaign is delivering volume. But volume without scroll depth is bounced attention. You need behavioral data — not just pageviews — to see that 98% of those users never engaged with the content.

The pattern is the same in all three cases: the top-level metric (traffic, clicks, impressions) looks healthy, but the behavioral layer underneath tells a different story. The gap between those two layers is where revenue quietly disappears.

The investigation loop

This isn't about running one investigation and fixing everything. The platform ships changes every week. New campaigns launch. Pages get updated. Mobile browsers release new versions that break things that used to work. The problems Kai found this week didn't exist two weeks ago — they emerged from a combination of a traffic surge, a campaign change, and a mobile browser quirk.

The value isn't in any single finding. It's in the rhythm: every week, the investigation runs. Every week, it cross-references behavioral signals against funnel data. Every week, it surfaces the two or three things that need attention before they compound.

The alternative is discovering these problems during the monthly review, after weeks of lost conversions. Or worse, not discovering them at all — just watching the numbers gradually decline and attributing it to market conditions or seasonality.

Most analytics tools are built to answer questions you already know to ask. The problems that cost the most are the ones you don't think to look for.
