These 4 Search Console Views Explain Your Lead Slump

14 min read · Mar 22, 2026

When SEO starts to feel vague, I go back to Google Search Console because it brings things back to earth. It shows me what people typed into Google, which pages earned visibility, which URLs got ignored, and whether Google has flagged a real problem. For a B2B service company, that is not trivia. It helps me explain why leads slowed down, why a service page stopped showing up, or why traffic rose without improving pipeline quality. When polished reporting answers nothing, this is the grounded place I start.

Google Search Console reports

If I had to narrow Search Console down to the reports that matter most, I would focus on the Search results report, Page indexing report, Links report, and Manual actions report. Those four cover demand, visibility, crawl health, authority flow, and outright risk. For most B2B service firms, that is the core set. Search results shows where clicks and missed clicks live. Page indexing shows whether Google can use the pages I spent time building. Links shows which pages get attention from other sites and from my own site. Manual actions is the emergency screen. If something appears there, it needs attention fast.

In the left menu, I find these reports under Performance, Indexing, Links, and Security & Manual Actions. If I check only one place each week, I start with Search results. If I have launched new pages, changed templates, migrated a site, or updated canonicals, I open Page indexing next.

| Report | Main question it answers | How often I check | What usually matters most for B2B service firms |
| --- | --- | --- | --- |
| Search results report | What searches and pages are winning or slipping? | Weekly | High-impression terms, low-CTR terms, page declines, non-branded queries |
| Page indexing report | Can Google index the pages that should drive leads? | Monthly, plus after site changes | Service pages, location pages, case studies, comparison pages |
| Links report | Which pages are getting authority, and which are ignored? | Monthly | Internal links to money pages, external links to proof pages |
| Manual actions report | Has Google flagged spam or policy issues? | Every login, plus after risky changes | Immediate review and cleanup if anything appears |

That sounds simple, maybe too simple. I still think simple is good here. In my experience, most teams do not need more reports. They need clearer reads on the reports they already have.

Search results report

The Search results report gets most of my time because it answers the question every CEO eventually asks: what are we showing up for, and what is that visibility actually doing? It shows queries, pages, countries, devices, dates, and search appearance. In plain English, that lets me see what people searched, which page got shown, where the searcher was, what device they used, when it happened, and whether the result appeared in a special format.

When I review this report, three patterns usually matter most. First, I look for high-impression, low-CTR terms. These are searches where Google is showing the site often, but searchers are not choosing it. That can point to weak titles, weak meta descriptions, a mismatch between query intent and page copy, or a results page crowded with ads and other search features. I sort by impressions, then look for rows with plenty of visibility but weak click-through rate. I do not judge CTR in a vacuum, though. A page in position 9 should not be held to the same CTR standard as a page in position 2.
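To make that first pass concrete, here is a minimal sketch of the filter I just described, run against made-up rows from a performance export. The queries, numbers, and thresholds are illustration-only assumptions, not official benchmarks, and real cutoffs should flex with position.

```python
# Flag high-impression, low-CTR queries from a Search Console
# performance export. All rows below are made-up illustration data.

rows = [
    # (query, impressions, clicks, avg_position)
    ("managed it support pricing", 4200, 21, 6.4),
    ("acme consulting", 900, 310, 1.1),
    ("fractional cfo services", 2800, 14, 8.9),
    ("it support near me", 350, 12, 4.0),
]

def ctr(clicks, impressions):
    return clicks / impressions if impressions else 0.0

# Judgment-call thresholds: plenty of visibility,
# but fewer than 1 click per 100 impressions.
MIN_IMPRESSIONS = 1000
MAX_CTR = 0.01

flagged = [
    (q, imp, round(ctr(c, imp), 4), pos)
    for q, imp, c, pos in rows
    if imp >= MIN_IMPRESSIONS and ctr(c, imp) < MAX_CTR
]

# Keep position in the output so a page sitting at 9 is not judged
# by the same CTR standard as a page sitting at 2.
for q, imp, r, pos in flagged:
    print(f"{q}: {imp} impressions, CTR {r:.2%}, position {pos}")
```

In this sample, the two category terms with thousands of impressions and sub-1% CTR get flagged, while the branded query and the low-visibility query do not.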

Second, I use date comparison to spot declining pages. I compare the last 28 days or last 3 months with the prior period, then switch from Queries to Pages. If a page that used to bring steady clicks is slipping, that usually matters more than a few random keyword drops. A service page may have lost rank after a site update, a case study may have fallen out of the index, or buyer demand may have shifted. The report helps me separate those possibilities instead of guessing. In a lot of cases, the real issue is slow content decay, not one dramatic event.
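The same period comparison can be sketched in a few lines. The page paths and click counts below are hypothetical exports for the current and prior 28-day windows, and the 25% cutoff is an arbitrary working threshold, not a rule.

```python
# Compare clicks per page across two date ranges to surface slipping
# pages. Both dictionaries are made-up illustration data from a
# "Pages" export for the current and prior 28-day windows.

current = {"/services/managed-it": 180, "/case-studies/acme": 35, "/blog/cfo-checklist": 90}
prior   = {"/services/managed-it": 310, "/case-studies/acme": 38, "/blog/cfo-checklist": 60}

declines = []
for page, before in prior.items():
    after = current.get(page, 0)  # pages that vanished count as 0 clicks
    change = after - before
    if before > 0 and change / before <= -0.25:  # 25%+ drop, an arbitrary cutoff
        declines.append((page, before, after, round(change / before, 2)))

# Biggest absolute click losses first, so review starts where
# the pipeline impact is largest.
declines.sort(key=lambda row: row[2] - row[1])

for page, before, after, pct in declines:
    print(f"{page}: {before} -> {after} clicks ({pct:+.0%})")
```

Here only the service page clears the threshold; the small wobble on the case study and the growing blog post stay out of the review queue, which matches the idea that steady page-level slippage matters more than random keyword noise.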

Third, I filter for non-branded opportunities. Brand queries are useful, but they can hide what is happening with new demand. When I remove the company name and branded variants, I get a better read on whether Google sees the site for category terms such as “managed IT support,” “fractional CFO services,” or “Salesforce consulting for healthcare.” For B2B firms, that matters because non-branded search is often where net-new pipeline starts. I also watch seasonality. In many service sectors, budget planning periods can lift impressions before clicks move. And newer SERP features, including AI overviews, can pull attention away from standard listings, so a CTR drop does not automatically mean the copy got worse.
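A brand filter can be as simple as a regular expression applied to an exported query list. The brand pattern and queries below are illustration data; a real filter usually needs common misspellings and abbreviations too.

```python
import re

# Strip branded queries before reading category demand.
# The brand pattern and query list are made-up illustration data.
BRAND = re.compile(r"\betavrian\b", re.IGNORECASE)

queries = [
    "etavrian reviews",
    "managed it support",
    "fractional cfo services",
    "Etavrian pricing",
    "salesforce consulting for healthcare",
]

non_branded = [q for q in queries if not BRAND.search(q)]
print(non_branded)
```

What remains is the non-branded set that gives a cleaner read on category demand.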

Page indexing report

The Page indexing report answers a less glamorous question, but one that can quietly damage pipeline: which pages are indexed, and which are not? Indexed pages can appear in Google Search. Not indexed pages cannot. That sounds obvious, yet many B2B sites have service pages, city pages, comparison pages, or case studies that were published months ago and still are not properly indexed.

When I open this report, I start with the split between Indexed and Not indexed. Then I look at the status groups that most often affect lead pages first:

  • Crawled - currently not indexed usually means Google found the page but chose not to keep it in the index. That often points to thin copy, weak internal links, duplicated themes, or pages that feel too similar to stronger URLs.
  • Duplicate without user-selected canonical means Google sees multiple versions of the same page or near-identical pages and picked a different URL than the one I intended.
  • Excluded by noindex can be fine if it is intentional, but it can also reveal a template or CMS mistake that blocks good pages.
  • Soft 404 means the page exists, yet Google thinks it is too weak, too empty, or too far off topic to count as a real page.

The trap here is fixing by page count. I do not think that is the right order. I fix by revenue impact. A single service page that drives discovery calls matters more than 80 thin tag pages. A comparison page that supports late-stage buyers matters more than a random archive URL. A location page tied to a strong region matters more than a forgotten press release. So I usually work in this order: service-line pages first, then pages with historical clicks or links, then case studies and comparison pages, then the rest. Not every excluded page is a fire. Some should stay out of the index. But if core pages are sitting in “Crawled - currently not indexed,” that is not a small issue.
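That ordering can be expressed as a tiny priority function. The URL prefixes, tier numbers, and sample not-indexed list below are hypothetical; in practice the tiers come from knowing which pages actually drive discovery calls.

```python
# Order not-indexed URLs by assumed revenue impact rather than by
# count. Tier map and URL list are hypothetical illustration data.

TIERS = [
    ("/services/", 0),      # service-line pages first
    ("/case-studies/", 2),
    ("/compare/", 2),
    ("/locations/", 3),
]

def priority(url, had_clicks=False):
    for prefix, tier in TIERS:
        if url.startswith(prefix):
            # pages with historical clicks or links jump the queue
            return min(tier, 1) if had_clicks else tier
    return 9  # tag pages, archives, forgotten press releases last

not_indexed = [
    ("/tag/cloud", False),
    ("/services/managed-it", False),
    ("/locations/austin", True),
    ("/compare/acme-vs-us", False),
]

for url, clicked in sorted(not_indexed, key=lambda u: priority(*u)):
    print(url)
```

The service page comes out first, the location page with historical clicks jumps ahead of the comparison page, and the tag page drops to the bottom, mirroring the order described above.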

Links report

The Links report is less flashy than Search results, and that is fine. Boring can be useful. It shows external links, internal links, top linked pages, and linking sites. I use it to see where authority is flowing and where it is not. For B2B SEO, that matters because service pages often do not attract links on their own. Thought-leadership pieces, research, and case studies usually do. So the job is not just getting links. It is moving value from pages that earn attention to pages that sell the work.

I start with Top linked pages and compare that list with lead pages. If the homepage and blog posts attract most of the links while service pages barely appear, I am usually looking at an internal-linking problem. That is common. A site can have decent authority overall and still fail to pass enough support to the pages that need it most. The internal links view helps here. If a service page has only a handful of internal links even though it is meant to rank for a high-value term, that is a clear miss.

This is where service page clusters help. I think in terms of a core service page, several case studies, a few point-of-view articles, and a comparison page linked in a clear pattern. The Links report will not tell me everything about link quality, but it does show whether important pages are isolated. For firms that sell high-trust services, that matters. Stronger internal links can help Google understand which pages are central, which proof pages support them, and how the site fits together.

Manual actions report

Most sites will never see anything in the Manual actions report, which is exactly what I want. But if something appears there, I treat it as a real business issue, not a technical footnote. A manual action means Google has applied a human review and found a policy problem. That can hit rankings, indexation, or visibility in a serious way.

If there is a manual action, my next move is immediate triage. I read the notice closely, review the sample URLs or examples Google provides, and check whether the issue is tied to spammy pages, misleading structured data, user-generated spam, or unnatural links. Then I clean up the problem, document what changed, and submit a reconsideration request. I keep that explanation factual. No drama, no long story, just what happened and what I fixed.

Security issues sit nearby and deserve a mention too. Hacked pages, malware, or injected spam can destroy trust quickly. If Search Console flags a security problem, I deal with that before I obsess over titles and CTR.

Data availability

Search Console data is useful, but it is not live minute by minute. There is usually a reporting lag of about 1 to 3 days, so I use it for weekly and monthly decisions, not same-day panic. That matters when a stakeholder sees one bad day and assumes traffic fell off a cliff. Sometimes it did. More often, the data is still settling.

Search Console also keeps 16 months of data in the interface. That is enough for trend work, year-over-year comparisons, and seasonal checks, but not enough for a long history unless I export it elsewhere, such as Google Sheets or BigQuery.

If I use the Search Console integration inside GA4, the [GA4] Queries report can add helpful context, but the start date still depends on setup timing. If the web data stream was created first and site verification happened later, data becomes available from the verification date. If site verification came first and the web stream was created later, data starts from the stream creation date. In both cases, the properties also need to collect data for the same pages.

I also do not expect Search Console numbers to match GA4 or third-party SEO tools exactly. That is normal. Search Console counts clicks and impressions from Google Search. GA4 counts sessions and user behavior after the click, and those can be affected by consent settings, tracking gaps, redirects, or session rules. Third-party platforms estimate ranking and traffic from their own models, so they can be useful for direction without serving as the source of record.

When the numbers do not match, the reason is usually methodological rather than alarming. Search Console measures Google Search clicks, not sessions. Analytics platforms may miss visits if tagging breaks. Time zones can differ between tools. Search Console may group data by canonical URL while analytics reports show the landing URL. And short-term changes can look noisy because the data is still being processed.

So when is the data fresh enough to act on? Weekly patterns are usually solid. Monthly trends are stronger still. One-day swings, especially on lower-traffic pages, are often noise unless I also see a sharp sitewide drop or a technical change that lines up with the timing.

Dimensions in the report

Dimensions are the lenses I use to ask better questions. In the Search results report, the main dimensions are query, page, country, device, date, and search appearance. Each one answers a different question, and the value comes from combining them well.

| Dimension | What it reveals | Best use case |
| --- | --- | --- |
| Query | What search phrase triggered the impression or click | Find non-branded opportunities, weak CTR terms, and intent mismatch |
| Page | Which URL appeared in search | Spot winning pages, slipping pages, and index candidates |
| Country | Where the search came from | Check geo fit for service areas or target markets |
| Device | Whether the user searched on desktop, mobile, or tablet | Find mobile CTR gaps or mobile ranking issues |
| Date | When the activity happened | Compare periods, spot seasonality, check post-launch changes |
| Search appearance | Whether results showed in a special format | See if rich results or other features affect clicks |

In practice, I use query plus page when I want to know whether the right page ranks for the right term. I use page plus device when a service page performs well on desktop but underwhelms on mobile. I use country plus query when a business serves more than one market and demand may look different in the US than it does in the UK. And I use date plus page after a migration, major content update, or template change.

A quick note on the GA4 integration: Search Console metrics work cleanly with Search Console-style dimensions. When I start mixing them with unrelated GA4 dimensions, the report can get thin or confusing. That is less a flaw than a data-model mismatch.

Metrics in the report

The four main metrics in the Search results report are clicks, impressions, CTR, and average position. Clicks are visits from Google Search to the site. Impressions count how often a result was shown. CTR is click-through rate: clicks divided by impressions. Average position is the average of the topmost position at which the site's result appeared for a given query.

These sound neat and tidy. They are not always neat and tidy in practice. Average position can mislead when I pull it away from query context. A page can rank in position 3 for a few branded searches and in position 18 for a long list of non-branded searches, then show an average that looks respectable while hiding the real story. CTR has a similar issue. A blended CTR can look healthy if brand queries are carrying the page. That is why I always separate brand from non-brand before I celebrate or panic.

Here is a simple B2B example. Say a cybersecurity consulting page shows 8,000 impressions, 96 clicks, a 1.2% CTR, and a 7.8 average position. At first glance, it is easy to assume the title tag is the problem. Maybe it is. But once I split the data, the picture can change. Branded queries may show a 17% CTR in position 1, while non-branded queries show a 0.7% CTR with average positions between 9 and 14, mostly on mobile. At that point, the page is not mainly losing because of copy. It is losing because it is not yet ranking high enough for the non-branded terms that drive new pipeline, and the mobile result is less attractive than it should be. The fix may include a stronger title and meta description, but it may also require better internal links, tighter section copy, stronger proof, and more support content around the service line.
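The arithmetic behind that split is worth seeing once. The segment numbers below are a hypothetical decomposition chosen to be consistent with the totals in the example (8,000 impressions, 96 clicks, 1.2% CTR); the actual split on any real page would come from filtering the report by brand terms.

```python
# Rebuild the blended numbers from the cybersecurity-page example to
# show how a small branded segment props up the overall CTR. This
# segment split is hypothetical but consistent with the totals above.

branded     = {"impressions": 245,  "clicks": 42}   # ~17% CTR, position 1
non_branded = {"impressions": 7755, "clicks": 54}   # ~0.7% CTR, positions 9-14

impressions = branded["impressions"] + non_branded["impressions"]
clicks = branded["clicks"] + non_branded["clicks"]

print(f"blended CTR: {clicks / impressions:.1%}")  # looks like a title problem
print(f"non-brand CTR: {non_branded['clicks'] / non_branded['impressions']:.2%}")
```

A branded segment of only about 3% of impressions is enough to lift the blended CTR well above what non-branded searchers actually do, which is why I split before diagnosing.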

That is why I use metrics in pairs instead of alone. Impressions plus CTR show missed click opportunity. Clicks plus page show which URLs matter right now. Average position plus query shows whether ranking is truly improving. That one move clears up a lot of confusion.

A practical recap

If I open only one report first, I open Search results. That is usually where the fastest SEO decisions come from. If clicks are flat but impressions are rising, I look at CTR and the search-results context. If service pages are missing from search, I open Page indexing and check whether Google is excluding them. If strong pages are not getting enough authority, I open Links and see whether internal support is weak. If a manual action exists, I stop routine SEO work and fix the flagged issue first.

When I need a short operating rhythm, I keep this in mind:

  • Start with Search results for weekly decisions.
  • Watch high-impression, low-CTR queries for click gains.
  • Compare date ranges at the page level to spot declines that matter.
  • Review Page indexing after launches, migrations, or template changes.
  • Treat revenue pages first, not the longest list of excluded URLs.
  • Use Links to strengthen service pages, case studies, and comparison pages.
  • Escalate Manual actions and Security issues right away.

That rhythm tends to keep SEO reporting tied to business reality rather than vanity charts.

Andrii Daniv
Andrii Daniv is the founder and owner of Etavrian, a performance-driven agency specializing in PPC and SEO services for B2B and e‑commerce businesses.