Jungle Gym

Generative Engine Optimisation: facts, fictions, and what actually works

A practitioner's read on Generative Engine Optimisation — what's real, what's hype, and the five moves worth running this quarter.

10 minute read

Last update 13.05.26

Generative Engine Optimisation, or GEO, is the practice of making your content more likely to get cited when someone asks an AI assistant a question. It overlaps with traditional SEO rather than sitting separate from it. Most of what you'll read about it is wrong. Wrong about what it replaces. Wrong about what works.

I've run this work over the last year, and had to sort through all the fake "tactics" to settle on what I think actually works.

Why this matters now

Four signals say this isn't a passing trend any more; it's something to plan for:

  • AI Overviews peaked at around 25% of tracked Google searches mid-2025 and have since settled closer to 15-16% (Semrush, late 2025).
  • When an AI Overview is shown, position 1 click-through rate drops 35-60% depending on query type (Ahrefs and Amsive, 2025).
  • ChatGPT holds around 80% of the AI chatbot market by search share, processes roughly 2.5 billion prompts a day across 900 million weekly users (First Page Sage, January 2026).
  • OpenAI launched ChatGPT advertising in Australia, New Zealand, and Canada on 17 April 2026, the platform's first multi-country ad surface.

OpenAI doesn't open ad markets in places that don't matter commercially. NZ and Australia are now on the same shortlist as Canada.

Around half of Australian adults have used a generative AI tool in the past year, and ChatGPT is consistently the most-named tool in national adoption surveys (Deloitte AI at Work / YouGov AU, 2025). NZ data is thinner, but adoption rarely splits across the Tasman.

So if someone might ask an AI about what you sell, you have a new place you need to show up. The question is what to do about it.

GEO or AEO?

You'll see two acronyms used for this work:

  • GEO (Generative Engine Optimisation): the term coined in a 2023 research paper (Aggarwal et al).
  • AEO (Answer Engine Optimisation): the older, slightly broader term.

Agencies adopted GEO because a new acronym sells a new service line. AEO describes the actual work.

I prefer AEO, and I'll use it from here.

The reason is technical. "Generative engine" describes how the engine produces text, but the thing that's actually new for businesses is the user behaviour, not the model architecture. People are asking questions and getting direct answers, instead of getting a list of links and clicking through to find their own answer. That's an answer engine.

ChatGPT, Perplexity, Google AI Overviews, Bing Copilot, all of them are answer engines. Some generate, some retrieve and summarise. From the business's point of view, the job is the same in either case: be in the answer.

How AI actually picks what to cite

The mechanics matter. This is where most bad advice breaks.

When you ask ChatGPT or Perplexity a complex question, the system doesn't run one search. It breaks the question into smaller sub-queries, runs all of them, reads the top results from each, then writes a single answer with citations spread across the sources.

Google calls this query fan-out. ChatGPT and Perplexity do the same thing under a different label. Two implications:

  1. You don't need to be the single best source for the whole answer. You need to be the best source for one slice of it.
  2. Your traditional search ranking still matters. AI assistants pull from search indexes. If you're not in the top results across Google and Bing, the AI never sees you, and no amount of "GEO optimisation" will fix that.

So the foundation is still SEO. AEO is the layer on top, not the replacement.
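The fan-out mechanic can be sketched in a few lines. Everything here is illustrative: the sub-queries, the mock index, and the domains are hypothetical stand-ins, not a real retrieval pipeline.

```python
# Illustrative sketch of query fan-out: the engine answers one question
# by searching several narrower sub-queries and pooling the top sources.
# Sub-queries and the mock index below are hypothetical examples.

MOCK_INDEX = {
    "wine export labelling nz": ["mpi.govt.nz", "nzwine.com"],
    "wine act 2003 requirements": ["legislation.govt.nz", "mpi.govt.nz"],
    "nz wine export markets": ["nzte.govt.nz", "stats.govt.nz"],
}

def fan_out(question: str) -> list:
    # In a real system an LLM generates these; here they're fixed.
    return list(MOCK_INDEX)

def answer_sources(question: str, top_k: int = 2) -> set:
    """Pool the top results from every sub-query into one citation set."""
    cited = set()
    for sub_query in fan_out(question):
        cited.update(MOCK_INDEX[sub_query][:top_k])
    return cited

sources = answer_sources("How do I label wine for export from NZ?")
# A site only needs to win ONE sub-query to appear in the final answer.
```

The point of the sketch is the union at the end: a page that ranks for a single narrow sub-query still lands in the citation set for the broad question.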

The fictions

Here's the stuff worth ignoring.

Fiction 1: SEO is dead, GEO replaces it

It doesn't. Every credible bit of research and every practitioner I trust says the same thing: AI systems retrieve from search indexes (Google, Bing) and ground their answers in pages that already rank well. Strong SEO is the prerequisite for being cited by AI, not an alternative to it.

If your foundations are weak, fixing them does double duty. It improves your rankings AND your AI citation likelihood at the same time.

Fiction 2: GEO is a new discipline that needs specialised tools and vendors

Mostly false. About 80% of AEO is just doing SEO well. Clear, accurate, well-structured content. Authoritative sources. Technical health. The new 20% includes some genuinely different thinking (query fan-out, structuring content for extraction, monitoring citations), but it doesn't need a separate stack of tools.

If you've been pitched by an agency or vendor selling a "GEO platform" as a brand-new discipline, ask one question: what specifically are you doing here that I don't already need to do for SEO? The honest answer is usually "not much, but the dashboard is nice." If a pitch opens with "SEO is obsolete", they're selling the acronym, not the work.

Fiction 3: AI-generated content is enough to win

False. Google's helpful content updates penalise scaled, low-effort AI content. AI assistants prefer to cite sources that look authoritative, with named authors, real expertise, and verifiable claims. If your content reads like it was generated, both search engines and AI systems are going to deprioritise it.

You can use AI to draft. You can't use it to think for you.

Fiction 4: AI search is going to drive massive traffic

Not yet. News sites and large content publishers (Wired, the BBC, and similar) report under 1% of traffic from AI referrals. AI citations aren't clicks. People read the summary and stop.

So the way to think about AI visibility is closer to a brand and PR play than a traffic-acquisition play. You're being mentioned in front of an audience that has chosen to ask the question. That's valuable, but it's not the same as inbound clicks.

The facts

Now the useful part.

What the research found

The 2023 research paper that introduced GEO (Aggarwal et al) tested nine specific strategies for boosting visibility in AI-generated responses. Three of them worked across every domain they tested:

  • Adding citations to credible sources within the content
  • Adding direct quotations from relevant experts
  • Adding statistics instead of qualitative descriptions

Each of these delivered up to a 40% visibility boost in the paper's controlled tests. Keyword stuffing, the strategy most analogous to old-school SEO tactics, did nothing or made things worse.

The paper has its limits. The lifts came from a synthetic benchmark, not live retrieval through current ChatGPT or Perplexity. Treat the directional finding (specific, attributed evidence helps) as solid. Treat the exact percentages as not generalisable.

But the underlying signal is clear:

AI systems extract and cite content that contains specific, verifiable, attributed evidence.

Vague writing doesn't get cited. Generic writing doesn't get cited. Anonymous writing doesn't get cited.

What this means for your content

The bit most articles skip is simpler: be non-commodity. Google's helpful content guidance describes the same thing as content with "substantial value compared to other pages".

Most pages get ignored by AI for the same reason most pages get ignored by readers. They're rewritten Wikipedia. They restate what's already on 50 other sites. There's nothing in them that's only on this page.

What gets cited:

  • Original data from your own work
  • Named experts saying specific things
  • Industry-specific context that isn't easily findable elsewhere
  • Declarative points of view that other sources don't have
  • Concrete examples and worked scenarios

If your top pages could be merged with three competitors' pages and lose nothing, the brief is the problem. No amount of structural optimisation fixes that.

What works structurally

Assuming your content is worth citing, a few moves help AI extract from it cleanly:

  • A clear summary in the first 200 words. Roughly 44% of LLM citations come from the first 30% of a page (Kevin Indig's analysis of 1.2M ChatGPT answers, 2025). Don't bury the answer.
  • Declarative sentences. Compare:
    • Citable: "AEO requires content with named authors, statistics, and verifiable claims."
    • Not citable: "Some experts believe AEO might involve various credibility signals."
  • Named entities. Specific products, organisations, places, dates, figures. "Wine export labelling in NZ under the Wine Act 2003" is extractable. "Wine export rules" is not.
  • Author attribution. Named authors with real credentials. Anonymous content gets deprioritised.
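One common way to make that author attribution machine-readable is schema.org Article markup. A minimal sketch, emitted from Python for illustration; the headline, name, title, and date are placeholders, not this article's real metadata:

```python
# Sketch: author attribution as schema.org Article JSON-LD.
# All values below are hypothetical placeholders.
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Generative Engine Optimisation: facts and fictions",
    "author": {
        "@type": "Person",
        "name": "Jane Example",   # hypothetical author
        "jobTitle": "Head of SEO",
    },
    "datePublished": "2026-05-13",
}

json_ld = json.dumps(article_schema, indent=2)
# Embed in the page head as:
# <script type="application/ld+json"> ... </script>
```

Structured data doesn't make weak content citable, but it does make a named author unambiguous to whatever is parsing the page.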

A caveat before the playbook

Things are changing fast in this space, and any specific tactic you read about today might be useless in 12 months. So focus on the principles that will hold:

  • Be useful. Solve the actual question, not the generic version of it.
  • Be specific. Real numbers, real examples, real names.
  • Be cited. Reference your sources clearly, give people something to verify.
  • Be the canonical source. If your business has expert knowledge, write the page that should be the reference.

Treat the extraction layer the way you'd have treated featured snippets a few years back. Optimise for it, don't build a business on it. If a tactic works only because it games extraction, expect it to get neutralised in the next model update. If it works because the content is genuinely better and harder to find elsewhere, it keeps working.

That's the bar.

What I'd actually do this quarter

Five moves, in order. Each one builds on the one before.

1. Audit your current AI citation rate

Run the questions your customers actually ask through ChatGPT, Perplexity, and Google's AI Mode. Note where you're cited, where competitors are cited, and where nobody from your category is cited. That last bucket is your opportunity.

2. Audit your top 10 pages for commodity content

For each one, ask one question: is anyone else saying this, in this way, with this evidence? If the answer is yes for more than half of them, the editorial brief is the problem. The structure can be fixed in a week. The content thinking takes longer.

3. Publish original data or evidence

You almost certainly have something nobody else has. Client outcomes, an internal benchmark, an industry observation from working with a particular kind of business. One defensible number changes citation likelihood more than any tactic. The research backs this up: statistics, citations, and quotations were the three winners in the Aggarwal paper.

4. Build your brand presence off-domain

AI systems retrieve from Reddit, YouTube, LinkedIn, podcasts, and industry publications, not just from your website. Citations follow presence. If your business never shows up on those surfaces, you're invisible to a meaningful slice of AI retrieval.

This doesn't mean spreading thin across every channel. Pick two or three where your audience actually spends time, and show up consistently.

5. Set a measurement loop

Track your citation rate monthly. Run the same prompts through the same AI systems and log what changes. A spreadsheet works, or a tool like Profound if you want it automated. Cadence matters more than format.
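If you go the spreadsheet route, the loop is just an append-only log. A minimal sketch, assuming you paste in the citations each assistant returned for each tracked prompt; the prompts and domain are placeholders for your own:

```python
# Minimal sketch of a monthly citation log. Prompts and the domain
# are hypothetical placeholders; citations are pasted in by hand.
import csv
import datetime

TRACKED_PROMPTS = [
    "best accounting software for nz small business",
    "how to export wine from new zealand",
]
OUR_DOMAIN = "example.co.nz"  # hypothetical

def log_run(path, prompt, engine, cited_domains):
    """Append one row per prompt/engine run; cited=1 if our domain appears."""
    cited = int(any(OUR_DOMAIN in d for d in cited_domains))
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.date.today().isoformat(), engine, prompt, cited]
        )
```

Run the same prompts through the same engines each month and the cited column becomes a trend line, which is the whole point of the loop.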

Pull AI bot hits from your access logs too (GPTBot, OAI-SearchBot, PerplexityBot, ClaudeBot). Closest thing to a leading indicator you'll get.
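Pulling those hits out of a standard access log is a one-function job. A rough sketch; the user-agent substrings are the bot names listed above, but check each vendor's current crawler documentation before relying on them:

```python
# Rough sketch: tally AI crawler hits from raw access-log lines by
# matching published bot names in the user-agent string.
from collections import Counter

AI_BOTS = ["GPTBot", "OAI-SearchBot", "PerplexityBot", "ClaudeBot"]

def count_ai_bot_hits(log_lines):
    """Count hits per AI bot across an iterable of access-log lines."""
    hits = Counter()
    for line in log_lines:
        for bot in AI_BOTS:
            if bot in line:
                hits[bot] += 1
                break
    return hits
```

Feed it the file directly (`count_ai_bot_hits(open("access.log"))`) and chart the monthly totals alongside your citation log.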

Ignore llms.txt for now. It's a proposal with near-zero adoption signal from the major model providers.

Where this is heading

The businesses that come out ahead are the ones running both layers at once. Strong SEO foundations, so the search engines (and the AI assistants pulling from them) can actually find you. And specific, attributed content that's worth citing when those AI assistants generate an answer.

This isn't a new discipline. It's the same job you've been doing, with the AI extraction layer added to the brief.


If you want help auditing where your business shows up in AI answers, or working out what to publish next, that's the SEO and GEO service at Jungle Gym.

Enter the jungle

Business consultants who don't understand digital tech or channels. AI firms that lack business experience. Agencies that don't understand international markets. We bridge all three.

Read more about how we work