AI Features and Your Website: What Google Now Explicitly Documents (Inclusion, Exclusion, and Technical Gotchas)


If you’ve noticed Google answering questions before anyone clicks through, you’re not alone. Google has finally clarified how AI-driven results choose what to cite, what controls you actually have, and why small technical missteps can still cost you visibility.

What Google actually counts as “AI features”

Google now groups things like AI Overviews and AI Mode under AI features, and it’s worth treating them as a new presentation layer rather than a whole new search engine. These results can pull together multiple sources at once, and Google says it may run related searches in the background (they call it “query fan-out”) to build a response and surface supporting links.

Inclusion basics: no special badge, no secret markup

The big takeaway for AI Overviews is oddly reassuring: Google says the usual SEO playbook still applies, and there aren’t extra hoops you need to jump through. If your page can be indexed and shown with a normal snippet in Search, it’s eligible to appear as a supporting link in these AI results. Google also spells out that there’s no special schema.org type or “AI file” you’re meant to create.

The boring technical stuff still decides the outcome

A lot of “why aren’t we showing up?” problems still come back to technical requirements you’ve probably heard a hundred times, but they hit harder now because AI results are picky about pulling clean, readable information. Google lists the usual suspects: don’t block crawling in robots.txt, make key content available as text (not trapped in images), keep internal links sensible, and make sure structured data matches what users can see on the page.
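Structured data parity is the easiest of these to get wrong. Here's a minimal sketch (the product name and price are illustrative) of JSON-LD that mirrors what the user actually sees on the page, rather than contradicting it:

```html
<!-- Sketch: the markup should describe the same content users can see. -->
<!-- Product name and price below are illustrative. -->
<h1>Acme Standing Desk</h1>
<p>Price: $499</p>

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Standing Desk",
  "offers": {
    "@type": "Offer",
    "price": "499",
    "priceCurrency": "USD"
  }
}
</script>
```

If the visible price changes but the JSON-LD doesn't, you've created exactly the mismatch Google warns about.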

Measuring performance: it’s not a separate report

If you’re hunting for a neat dashboard called “AI traffic”, Search Console isn’t going to hand it to you. Google says clicks and impressions from AI Overviews and AI Mode roll into the normal Performance reporting under the “Web” search type. So you’re stuck doing a bit of detective work: compare periods, watch for queries where impressions stay steady but clicks slide, and keep an eye on conversions rather than clicks alone.

Exclusion and limits: you can restrict snippets, not just AI

The controls Google points to live under preview controls, and they’re blunt instruments by design. Want less of your text reused in Search (including AI features)? Google recommends using nosnippet, max-snippet, data-nosnippet, or noindex. That last one removes the page from Search altogether, so it’s more like cutting off your arm to stop a paper cut. Google also notes that robots.txt is the core way site owners manage crawling for Search, and mentions Google-Extended separately for limiting use in some other Google systems.
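If your goal is to stay fully visible in Search while opting out of Google-Extended, the robots.txt shape is straightforward. A sketch (the rules shown are illustrative, not a recommendation for every site):

```txt
# robots.txt sketch: normal Search crawling stays open,
# but Google-Extended is disallowed sitewide.
User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
```

Note this doesn't touch how your pages appear in Search results; snippet behaviour is controlled by the meta tag rules above, not by robots.txt.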

The robots meta tag isn’t just for developers

Most people meet the robots meta tag through a CMS toggle that says “Discourage search engines”. That’s fine until you need precision. Google’s documentation makes it clear you can apply rules at page level with <meta name="robots" ...> or target Google specifically with <meta name="googlebot" ...>. It also supports text-level control using data-nosnippet on specific HTML elements, which is handy when you want the page indexed but you don’t want one particular section quoted back at people.
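Put together, the page-level and text-level controls look something like this (the content values and the paragraph text are illustrative):

```html
<!-- Page-level rule for all crawlers -->
<meta name="robots" content="max-snippet:160">

<!-- Page-level rule targeting Google specifically -->
<meta name="googlebot" content="max-image-preview:large">

<!-- Text-level control: the page stays indexed,
     but this element won't be used in snippets -->
<p data-nosnippet>Internal pricing notes we'd rather not see quoted in results.</p>
```

The data-nosnippet attribute is the one to reach for when "index the page, just don't quote this bit" is the actual requirement.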

Conflicts happen, and the strictest rule wins

If you’ve ever stacked tags and hoped for the best, nosnippet is the one that tends to steamroll everything else. Google explicitly says that when tags conflict, the more restrictive instruction applies. A page with max-snippet:50 and nosnippet won’t show a 50-character snippet, it’ll show none. That matters when you’re trying to “tone down” visibility without killing it, because one heavy-handed setting can quietly flatten your search appearance sitewide.
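The conflict described above, in one line (values are illustrative):

```html
<!-- Both rules are present; the more restrictive one (nosnippet) wins.
     Result: no snippet at all, not a 50-character one. -->
<meta name="robots" content="max-snippet:50, nosnippet">
```

If you only wanted the shorter snippet, the fix is to delete nosnippet, not to add more directives on top.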

Non-HTML pages have their own trap door

PDFs, images, and other non-HTML files can't carry a robots meta tag at all, which is where X-Robots-Tag comes in. Google notes you can control indexing and snippet behaviour for non-HTML resources via an HTTP response header, and it's often the cleanest way to handle files you can't easily edit (like automatically generated PDFs). If you publish guides as PDFs and wonder why they keep showing up in odd places, this is one of the first switches to check.
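On Apache, for example, the header can be set per file type. A sketch, assuming mod_headers is enabled (the pattern and directives are illustrative):

```apacheconf
# Sketch: send X-Robots-Tag on every PDF response,
# keeping them out of the index entirely.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nosnippet"
</FilesMatch>
```

Nginx and most CDNs offer an equivalent way to attach response headers; the principle is the same regardless of the server.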

Don’t rely on JavaScript to “fix it later”

JavaScript can break your controls if you inject or modify meta tags after the page loads. Google recommends avoiding JavaScript for injecting or changing meta tags whenever possible, and testing carefully if you must. In practice, that means your page might look correct in the browser while Googlebot saw something else entirely. Not fun when you're trying to reduce snippet length or exclude a section from being quoted.
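To make the pitfall concrete, here's the anti-pattern next to the safer alternative (the snippet value is illustrative):

```html
<!-- Anti-pattern sketch: this tag doesn't exist in the HTML Google
     fetches initially; it only appears after script execution. -->
<script>
  const meta = document.createElement('meta');
  meta.name = 'robots';
  meta.content = 'max-snippet:50';
  document.head.appendChild(meta);
</script>

<!-- Safer: ship the rule in the server-rendered HTML itself -->
<meta name="robots" content="max-snippet:50">
```

If a tag controls crawling or snippets, treat it as part of the static HTML, not something the front end assembles later.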

Paywalls need a clear signal (or you risk mixed messages)

If you publish gated articles, paywalled content can confuse crawlers unless you label it properly. Google provides a structured data approach using isAccessibleForFree and hasPart with a CSS selector, and it frames this as the way to separate legitimate paywalls from cloaking (which violates spam policies). There are also practical rules: don't nest content sections, use .class selectors for cssSelector, and choose a paywall setup that doesn't expose restricted text to the browser unless you intend it to.
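A minimal sketch of that markup, following Google's paywalled-content pattern (the headline and the class name are illustrative):

```html
<!-- Sketch: JSON-LD flags the gated section via a .class selector. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example gated article",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".paywalled-section"
  }
}
</script>

<!-- The selector above must match the gated element in the page -->
<div class="paywalled-section">
  Subscriber-only text lives here.
</div>
```

The cssSelector value has to point at a class, and the element it matches should be the actual gated content, not a wrapper that also contains free text.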

Next Move

Want your website to show up (and be quoted accurately) in AI-driven search results?
Myoho Marketing’s Generative Engine Optimisation (GEO) helps businesses improve how they’re surfaced across AI features like Google AI Overviews and AI Mode, so your content is easier for Google to interpret, harder to misrepresent, and more likely to earn visibility when users stop scrolling and start trusting summaries.

Book a free GEO consultation and we’ll identify the technical gaps holding your AI visibility back, plus the highest-impact fixes to make next.

About Author : Darshin Desai is the Founder and Managing Director of Myoho Marketing, where he helps small and mid-sized businesses grow through Generative Engine Optimisation (GEO), Search Engine Optimisation (SEO) and performance advertising. With 10+ years in digital marketing, he works with brands across Australia, New Zealand, the USA and the UK to improve visibility in search engines and AI platforms like ChatGPT and Gemini. He writes about search strategy, AI in marketing and sustainable digital growth.
