Google SGE AI Answers Showing Content Blocked By Google-Extended Robots Directive

Google-Extended Robots Directive Does Not Work For Search Generative Experience

A couple of weeks ago, Google released a new robots.txt directive, Google-Extended, that tells Google not to use your content for Bard or other Google AI projects. However, Google told me that the Search Generative Experience does not currently honor Google-Extended. That means Search Generative Experience's AI-generated answers can and will continue to show your content unless you block Googlebot fully.

Google wrote that the Google-Extended directive tells Google not to use your content to improve "Bard and Vertex AI generative APIs, including future generations of models that power those products." I initially assumed it also applied to the AI-generated snapshots in the Google Search Generative Experience, but it does not.
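For publishers who do want to opt out of Bard and Vertex AI training, the directive goes in robots.txt like any other user-agent rule. A minimal sketch (Google-Extended is the token Google documented; the blanket Disallow is illustrative and can be scoped to specific paths):

    # robots.txt: opt this site's content out of Bard and Vertex AI
    User-agent: Google-Extended
    Disallow: /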

A Google spokesperson told me, "Search Generative Experience is a Search experiment so website administrators should continue to use the Googlebot user agent through robots.txt and the NOINDEX meta tag to manage their content in search results, including experiments like Search Generative Experience."
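In other words, the only levers that currently affect SGE are the standard Search controls the spokesperson named. A sketch of both, with an illustrative path: a robots.txt rule blocking Googlebot from crawling a section, and a noindex robots meta tag keeping an individual page out of results:

    # robots.txt: block Googlebot from an illustrative section
    User-agent: Googlebot
    Disallow: /members-only/

    <!-- in the page's <head>: keep this page out of Google's index -->
    <meta name="robots" content="noindex">

The trade-off is the one implied above: blocking Googlebot or adding noindex removes your content from regular search results too, not just from SGE's AI-generated answers.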

For example, here is an AI-generated answer from Search Generative Experience that includes a card from The Rolling Stones website:

[Screenshot: SGE AI-generated answer showing a card from The Rolling Stones website]

If you look at their robots.txt file, Google-Extended is listed there:

[Screenshot: The Rolling Stones robots.txt file with Google-Extended listed]

Glenn Gabe shared another example, but by the time I wrote this piece, Search Generative Experience was no longer showing VentureBeat for that query.

Since Search Generative Experience is built into Search, Google seems to believe that web publishers are okay with Google-Extended not applying to the AI snapshots in Search Generative Experience. "The context is that AI is built into Search, not bolted on, and integral to how Search functions, which is why robots.txt is the control to give web publishers the option to manage access to how their sites are crawled. As you know, we’ve used AI and Large Language Models in Search for many years to not only drastically improve the quality of our results, but also introduce unique ways to search, like Lens and multisearch. These efforts have continued to enhance our ability to connect people with more relevant web pages and send valuable traffic to the ecosystem," a Google spokesperson added.


Content Source: Seroundtable.com
