Tom Meredith

Everyone's optimizing for the wrong end of AI search

I spent a few weeks reading everything I could find on AEO and GEO.

That’s Answer Engine Optimization and Generative Engine Optimization, in case you’ve been blissfully offline.

Every tweet, blog, Reddit post, and YouTube video said basically the same thing. Write clear answers. Structure your content well. Think about how AI will present you in a summary.

Good advice. None of it is “wrong.”

But, they’re all describing the output side.

How AI presents the answer. What the results look like. Which format gets featured. You hear variations on…

“use more tables”

“Make sure to answer questions that users might ask ChatGPT” (as if this wasn’t the right way to add value in the first place)

“you must have llms.txt”, “no, you need a schema.js file”

What those all sound like to me is work that an agency can show you they did.

Nobody’s really asking what happens before that. How AI actually finds and selects content in the first place.

That’s the part that changes everything.

LLMs don’t search with keywords.

They search for “meaning”

Ok, this is about to get a bit technical… AI systems like ChatGPT, Claude, Gemini, Grok, and Perplexity all work through embeddings… chunks of text are given meaning. Basically, the systems encode the text as vectors in high-dimensional space.

Quick math recap…

A vector is a set of coordinates. You’re probably familiar with x, y coordinates. Maybe z as well. That’s two and three dimensions respectively. Well, with LLMs, embeddings use up to 3,072 dimensions (that’s OpenAI’s latest embedding model… most use somewhere between 768 and 3,072), and those coordinates actually encode the meaning.

It’s weird… I’m not totally sure how it works either. But the foundational research is Google’s Word2Vec paper from 2013 (Mikolov et al.). They showed that vector math on words actually works. King minus man plus woman equals queen. Seriously. The vectors captured meaning well enough to do algebra on concepts.
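You can see that algebra in a toy version. The four-dimensional vectors below are made up purely for illustration (real embedding models use hundreds to thousands of dimensions, and the numbers come from training, not by hand), but the arithmetic is the same idea the Word2Vec paper demonstrated:

```python
import numpy as np

# Toy 4-dimensional "embeddings" -- invented for illustration only.
# Real models learn these from data and use 768-3,072 dimensions.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.6]),
    "man":   np.array([0.1, 0.8, 0.1, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9, 0.1]),
    "queen": np.array([0.9, 0.1, 0.9, 0.6]),
    "apple": np.array([0.0, 0.2, 0.1, 0.9]),
}

def cosine(a, b):
    """Similarity by the angle between two vectors (1.0 = same direction)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# king - man + woman: does the result land closest to queen?
target = vectors["king"] - vectors["man"] + vectors["woman"]
best = max(
    (w for w in vectors if w not in ("king", "man", "woman")),
    key=lambda w: cosine(target, vectors[w]),
)
print(best)  # queen
```

The math works because the dimensions (loosely) encode traits… royalty, maleness, femaleness… so subtracting “man” and adding “woman” moves you from the king region of the space to the queen region.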

Now, when a model retrieves content, it’s finding proximity… what’s semantically closest to the query. Not what literally matches the words. What matches the meaning.

This is a completely different mechanism than keyword search.
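Here’s a rough sketch of that mechanism. The documents and their vectors are invented for the example, but the ranking step is the real difference: score by cosine similarity to the query vector, not by keyword overlap.

```python
import numpy as np

# Minimal sketch of semantic retrieval. Vectors are made up for
# illustration; a real system gets them from an embedding model.
docs = {
    "cheap flights to paris":   np.array([0.90, 0.10, 0.20]),
    "budget airfare to france": np.array([0.85, 0.15, 0.25]),  # zero shared words, same meaning
    "paris hilton biography":   np.array([0.10, 0.90, 0.30]),  # shares "paris", different meaning
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend embedding of the query "inexpensive tickets to paris"
query_vec = np.array([0.88, 0.12, 0.22])

ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
print(ranked)
```

Notice that “budget airfare to france” shares not a single word with the query, yet it outranks the page that literally contains “paris.” That’s the whole shift in one example.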

And it means most SEO thinking is the wrong mental model for AI retrieval.

I started calling this MEO… Meaning Engine Optimization. Not because I love coining things (even though I do… ™ is literally my initials), but because the concept needed a name. Nobody had claimed it yet. So here we are.

The distinction is simple.

AEO and GEO are output-focused. They ask: how do I show up well once AI has already found me?

MEO is input-focused. It asks: how does AI find and select me in the first place?

One layer deeper. Many layers more meaningful.

The clearest proof I’ve seen is Exa.ai. Exa is a search engine built on this concept and trained on link prediction. Not keyword matching. It retrieves pages based on meaning and context. You search for a concept, it finds pages that mean that thing… not pages that just say that thing.

Use it for a week and you’ll notice Google feels manipulated afterward.

Keyword-optimized content often ranks lower in Exa. Meaning-dense content, where a clear point of view runs through the whole piece, performs better.

LLMs learn the same way. They’re trained on massive amounts of text and build internal maps of how concepts relate to each other. The content that lands closest to what someone means when they ask a question… that’s what gets retrieved.

GEO and AEO tactics are fine. They’ll help at the margins. But, they’re modifications of the old model. You’re polishing the presentation of a result you’re not even being retrieved for… or won’t be for long if you’re thinking in terms of keywords.

The mechanism of the future is meaning. The unit of optimization is meaning.

And that’s what I’m calling MEO. I’ll go deeper on how this actually works and what you can do about it in the next few posts.
