This is the first post from GMM Lab, a space where we’re going to document what we’re seeing in the real world, what we’re testing, and what I think is changing in search. Not in a “we’ve got all the answers” way, more in an “I want to make sense of this properly” way.

The aim with Lab is to share the practical implementation layer that most SEO advice skips: the stuff that often determines whether performance is stable or constantly drifting.

The Pattern We Keep Seeing

I’ve been thinking about this a lot recently, because the same situation keeps appearing across different clients.

A team is doing “good SEO”. The content is genuinely helpful. The basics are covered: technical issues get fixed, the site is fast enough, links come in naturally. And still, performance feels messy. Rankings move around in ways that feel random. Traffic plateaus or declines, and it gets harder to predict what will happen next.

I don’t think the usual SEO advice is wrong; it just feels like it only explains part of what’s going on.

The way I’m currently framing it (and I could be wrong here) is that SEO has gone through three big shifts in how search engines understand websites. Not in the “here’s a timeline of algorithm updates” way, but more in the practical sense of what the system rewards, and what it struggles with.

This is just my opinion, but it’s helping me make sense of why SEO feels so different now.

Phase 1: When SEO Was Mostly About “The Algorithm”

In the early days before I started my career in SEO, search engines weren’t very good at understanding what a page actually meant. They needed simple signals they could measure.

So people optimised for those signals: keywords in certain places, lots of links, lots of pages targeting tiny variations. It worked because the system was basic. If it measured something, you could easily improve it. But over time, that approach created a lot of low-quality results. People were making content to rank for the sake of ranking, which meant search engines had to improve or the product would get worse.

Phase 2: When SEO Became “Write for Users”

This is the phase most marketing teams are familiar with, and I think it’s still correct.

Search engines started getting better at rewarding pages that actually help people. Content that answers the question properly. Pages that match intent. Websites that feel trustworthy. Better mobile experiences.

The industry message became: write useful content, build authority, fix technical issues.

And in general, that approach works.

But here’s what I’ve noticed: you can do all of that and still get stuck. Not because your content isn’t good, but because your website might not be “clear” enough for the system to understand at scale.

Where the Usual Advice Starts to Fall Short

This is where I think a lot of teams get frustrated, because nothing is obviously broken. It’s more like a slow loss of clarity.

You accidentally create internal competition

You have a category page, a product page, a buying guide, and supporting blogs all circling the same topic. Each piece is fine on its own, but the search engine can’t always tell which one is meant to be the main answer. Instead of reinforcing each other, they compete. Rankings can end up switching between pages, which makes performance feel unstable.

Example: We worked with an eCommerce client selling outdoor furniture who had a “garden chairs” category page, a “best garden chairs” buying guide, a “how to choose garden chairs” blog post, and seasonal collections that also featured chairs. All solid content. But over six months, we watched the ranking for “garden chairs” bounce between four different URLs. Traffic dropped 23% even though their average position only moved from 3.2 to 3.8. The instability meant Google showed different pages to different users, and the buying guide (which ranked most often) had a 40% lower CTR than the category page.
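
If you want to spot this pattern in your own data, one rough way is to count how many URLs earn impressions for each query in a Search Console export. Here’s a minimal sketch in Python; the file name, column names, and thresholds are all placeholder assumptions to adapt:

```python
import pandas as pd

# Assumes a Search Console export with one row per (query, page) and
# columns named "query", "page", "impressions". Adjust to your export.
df = pd.read_csv("gsc_export.csv")

# For each query: how many distinct URLs earned impressions, and total volume.
by_query = (
    df.groupby("query")
      .agg(urls=("page", "nunique"), impressions=("impressions", "sum"))
)

# Share of each query's impressions captured by its single strongest URL.
top_share = (
    df.groupby(["query", "page"])["impressions"].sum()
      .groupby(level="query").max()
    / by_query["impressions"]
)
by_query["top_url_share"] = top_share

# Likely cannibalisation: several URLs ranking, none clearly winning.
flagged = by_query[(by_query["urls"] >= 3) & (by_query["top_url_share"] < 0.6)]
print(flagged.sort_values("impressions", ascending=False).head(20))
```

Queries that surface here aren’t automatically problems, but they’re the first candidates for consolidation or clearer internal-linking signals.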

Your site looks repetitive at scale

Templates are normal, but when thousands of pages end up looking almost the same in search results, you lose the thing that makes someone choose your result. Even if rankings don’t change much, clicks can drop because the listings feel generic.

Example: A SaaS SEO client had 180 location pages for their appointment booking software (“Appointment Booking Software in Manchester”, “Appointment Booking Software in Leeds”, etc.). Each page ranked reasonably well. Most sat between positions 4-8. But the title tags and meta descriptions were almost identical except for the city name. Their average CTR was 1.8%, while the expected CTR for positions 4-8 is closer to 4-5%. They weren’t losing rankings. They were losing clicks because users couldn’t tell the pages apart in the SERP.
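
One way to catch this before it costs clicks is to compare title tags pairwise and flag near-duplicates. A minimal sketch, with made-up URLs and titles standing in for a real crawl export:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Placeholder (url, title) pairs; in practice, load these from a crawl.
titles = [
    ("/booking-software-manchester", "Appointment Booking Software in Manchester | Acme"),
    ("/booking-software-leeds", "Appointment Booking Software in Leeds | Acme"),
    ("/pricing", "Pricing | Acme"),
]

# Flag title pairs that are nearly identical. Note this is O(n^2); at
# template scale, bucket pages by template first and compare within buckets.
THRESHOLD = 0.9
for (url_a, title_a), (url_b, title_b) in combinations(titles, 2):
    ratio = SequenceMatcher(None, title_a.lower(), title_b.lower()).ratio()
    if ratio >= THRESHOLD:
        print(f"{ratio:.2f}  {url_a}  <->  {url_b}")
```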

Great content gets buried

You can publish an amazing guide, but if it’s hard to find on your site, it often underperforms. If a page sits deep with few internal links, the system reads that as “not that important”, even if the content is strong.

Example: A B2B client published a comprehensive 4,000-word guide on compliance requirements for their industry. Really well-researched, genuinely useful. But it sat three clicks from the homepage with only two internal links pointing to it. After 12 weeks, it was ranking on page 3 for its target keyword. We added it to their main navigation under “Resources”, linked it from five related service pages, and featured it in their blog sidebar. Within eight weeks it moved to position 6, and within four months it hit position 2. Same content. Different signals about importance.
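
Both signals in that story, click depth and internal links, are easy to compute once you have an internal-link graph from a crawl. A toy sketch, with an invented link structure:

```python
from collections import deque

# Toy internal-link graph: page -> pages it links to.
# In practice this comes from a crawler export; these URLs are invented.
links = {
    "/": ["/services", "/blog"],
    "/services": ["/services/compliance"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/guides/compliance-requirements"],
    "/services/compliance": [],
    "/guides/compliance-requirements": [],
}

# Breadth-first search from the homepage gives each page's click depth.
depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

# Inlink counts: how many pages link to each URL.
inlinks = {page: 0 for page in links}
for targets in links.values():
    for target in targets:
        inlinks[target] = inlinks.get(target, 0) + 1

# Deep pages with few inlinks are the "buried guide" candidates.
for page in sorted(links, key=lambda p: depth.get(p, 99), reverse=True):
    print(f"depth={depth.get(page)}  inlinks={inlinks[page]}  {page}")
```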

Phase 3: What I Think Is Happening Now (SEO Is Becoming “Systems-Based”)

Search results aren’t just a list of links anymore. Google increasingly answers questions directly through Featured Snippets, People Also Ask boxes, and AI Overviews. Bing has Chat. ChatGPT now searches the web. So even when you rank well, you might get less traffic than you would have a few years ago.

The data we’re seeing: Across 12 of our clients, pages holding positions 1-3 are getting 15-30% less traffic than they did in 2022, even when rankings haven’t changed. The traffic hasn’t disappeared. It’s just being answered in the SERP itself or extracted into AI summaries.

If that’s the direction we’re in, then it’s not just about ranking pages. It’s also about whether your website is easy for the system to confidently pull information from and understand at scale.

Search Systems SEO means designing your website so search engines can confidently understand, extract, and trust your content at scale, not just optimising individual pages to rank.

What “clearly structured” actually means

When I talk about structure, I mean the system can quickly answer:

What is this page about? (Topic clarity through headings, schema, clear H1s)

Which page is the main answer for this topic? (No competing URLs with overlapping intent; see the sketch after this list)

How does this page relate to others on the site? (Logical hierarchy, internal linking that makes sense)

Can this information be extracted reliably? (Structured data, clear formatting, scannable content)

Is this page important? (Internal link equity, position in site architecture)
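
For the second question especially, the check can be as simple as keeping an explicit topic-to-URL map and flagging any topic claimed by more than one page. A minimal sketch, using hypothetical assignments:

```python
from collections import defaultdict

# Hypothetical keyword-to-page assignments from a content plan.
# The same topic mapped to two URLs means two pages are competing for it.
assignments = [
    ("garden chairs", "/garden-chairs/"),
    ("garden chairs", "/blog/best-garden-chairs/"),  # conflict
    ("rattan sofas", "/rattan-sofas/"),
]

urls_per_topic = defaultdict(set)
for topic, url in assignments:
    urls_per_topic[topic].add(url)

for topic, urls in sorted(urls_per_topic.items()):
    if len(urls) > 1:
        print(f"CONFLICT: '{topic}' claimed by {len(urls)} URLs: {sorted(urls)}")
```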

Search engines are getting better at understanding entities and relationships. They know “garden chairs” connects to “outdoor furniture” and “patio seating”. They can see when multiple pages target the same concept. They’re building a map of your site’s expertise and authority.

When that map is clear and consistent, the system has confidence. When it’s messy or contradictory, confidence drops. And when confidence drops, performance gets more unpredictable.

Example: We audited a financial services client who had 340 indexed pages about “mortgage advice”. Some were service pages, some were blog posts, some were old landing pages from past campaigns. Google couldn’t figure out which page to rank for high-value terms like “mortgage advice UK”. We consolidated to 12 core pages, used canonical tags on the rest, and restructured internal linking to signal clear hierarchy. Within three months, their main “mortgage advice” service page moved from position 12 to position 4. More importantly, it stayed there. The volatility stopped.
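
If you run a consolidation like this, it’s worth verifying that the canonical tags actually point where you intended, since they’re easy to break in a CMS. A rough sketch using requests and BeautifulSoup; the URL mapping here is hypothetical:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical mapping: duplicate/legacy URLs -> the core page each
# one should canonicalise to after consolidation.
expected = {
    "https://example.com/mortgage-advice-2021/": "https://example.com/mortgage-advice/",
    "https://example.com/blog/mortgage-tips/": "https://example.com/mortgage-advice/",
}

for url, target in expected.items():
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else None
    status = "OK" if canonical == target else "MISMATCH"
    print(f"{status}  {url} -> {canonical}")
```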

When your site operates as a clear system, rankings become more stable, traffic becomes more predictable, and you can actually explain why performance changed, not just guess.

The Simplest Way I Explain It

People read pages like humans. Search engines read websites like systems.

A person judges a page based on whether it’s helpful and trustworthy. A search engine has to work out what the page is about, how it relates to other pages, which page should win for a topic, and whether the site is consistent and reliable.

When your site is clear, the system has confidence; when it’s messy or overlapping, that confidence erodes, and performance drifts with it.

What We Mean by “Search Systems” at GMM Lab

When we say “search systems”, we’re not trying to sound fancy or technical. We just mean: the setup behind your SEO results.

Not just the content, but how everything connects:

  • How pages are organised (so there’s a clear “main page” for each topic)
  • How internal links guide both people and search engines
  • How templates affect hundreds or thousands of pages at once
  • What gets indexed and what shouldn’t (a quick spot-check is sketched after this list)
  • How you spot issues early, before they become traffic drops
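
On that indexation point, even a crude spot-check goes a long way. A minimal sketch that fetches a sample of URLs and reports the meta robots directive and the X-Robots-Tag header for each; the URLs are placeholders:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs: a page that should be indexed and a parameterised
# variant that probably shouldn't be.
urls = [
    "https://example.com/garden-chairs/",
    "https://example.com/garden-chairs/?sort=price",
]

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    directive = meta.get("content") if meta else "(none)"
    header = resp.headers.get("X-Robots-Tag", "(none)")
    print(f"{url}\n  meta robots: {directive}\n  X-Robots-Tag: {header}")
```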

I’ve started to think SEO success comes less from perfecting individual pages and more from building a website that stays clear even as search changes around it.

A Few Questions I Keep Coming Back To

If you’re a marketing manager trying to sanity-check whether your SEO setup is strong, these are the kinds of questions I think are useful:

  • If someone asked “which page should rank for this keyword?” could you answer quickly and confidently?
  • Do your pages support each other, or are they competing?
  • Are your most important pages easy to find internally, or are they buried?
  • If a template changes, would you notice if it quietly reduced clicks across hundreds of pages?
  • Are you controlling what gets indexed, or is the site slowly filling up with low-value URLs?

If those are hard to answer, it usually means you’re doing page-level SEO, but you don’t yet have the system-level view.

Why GMM Lab Exists

Most SEO content explains what to do. Very little shows how to actually do it.

Teams end up with audit reports full of recommendations they never implement. Strategy decks that sound good but don’t translate into action. A backlog of “SEO tasks” that never quite get prioritised because no one’s confident they’ll work.

The problem isn’t that people don’t care about SEO. It’s that most advice stops at the strategy layer and never gets to the implementation layer: the scripts, workflows, and systems that make SEO measurable and repeatable.

That’s why we built GMM Lab.

GMM Lab documents the systems thinking behind our client work. Everything we share here reflects the actual frameworks, scripts, and workflows we use to deliver reliable SEO performance. Not theory. Not “10 quick wins”. Not generic advice that sounds good but doesn’t help.

We’re sharing the practical layer: the repeatable processes that turn messy signals into clear outputs, the validation checks that catch problems early, and the workflows that make SEO less guesswork and more engineering.

Lab is for technical marketers, growth teams, and SEOs working at scale who want implementation-level clarity. If you’re looking for SEO basics or quick hacks, this probably isn’t the right place. But if you’re dealing with the kind of problems I’ve described (good content that underperforms, rankings that feel unstable, performance that’s hard to predict) then you’re exactly who we’re writing this for.

What’s Coming Next

Before we get into scripts, workflows, and implementation details, I need to explain why modern SEO works the way it does, and why traditional “optimise the page” thinking breaks down at scale.

In our next post, I’m going to reframe SEO through a lens that makes the complexity manageable: data engineering.

If that sounds technical, don’t worry. It’s not about code. It’s about understanding that search systems don’t evaluate pages in isolation. They interpret patterns, validate signals, and extract meaning from structure. Once you see SEO as a data system (inputs, transformation, validation, consumption), a lot of the confusion disappears.

After that, I’ll break down what we mean by “search systems” in practical terms: how pages connect, why internal linking is signal flow, and why templates multiply problems at scale.

Then we’ll get tactical. We’ll share the exact scripts, frameworks, and workflows we use to:

  • Map topics to pages and eliminate cannibalisation
  • Audit internal linking and prioritise fixes by impact
  • Monitor template changes before they tank CTR across hundreds of pages
  • Control indexation and keep low-value pages out of Google’s index
  • Make SEO measurable and less reliant on guesswork

If these questions resonated, or if you’re dealing with traffic that feels unpredictable despite doing “all the right things”, then the rest of this series should be useful.

GMM Lab is our space for documenting what we’re testing, what we’re seeing in client accounts, and what’s actually working in practice. Not every test succeeds. Not every theory holds up. But I think the industry needs more people sharing the implementation layer: the messy, practical stuff that determines whether SEO performance is stable or constantly drifting.
