
Google begins testing AI-only search results

I think when Google starts seeing its advertising revenue from legitimate websites drop in proportion to the rise of AI-scraped results, it may change its tune. Why pay for keywords when the only visitor is Google's AI pre-answering client queries? And why pay for ad impressions on websites served by Google when nobody sees them anymore?
It's a race right now, and there is an existential threat to Google: if someone develops a search portal that delivers answers easier and faster, Google will ultimately lose its web dominance.

As a user, I would objectively prefer to have answers presented to me (as long as they're accurate, of course). Having to click through ten websites to read, investigate, and search is objectively harder.
 
I can understand that. But personally I prefer to go through ten websites rather than be fooled by AI that is maybe 60% accurate.

I have been playing around with Google's AI on search queries. It takes a properly strung-together query, plus one more click on that very tiny icon, just to see three search results, and then you get to click on one of them. And none of them seem to be anywhere near what I am asking.

OR...

I prefer to review a page or two of results to see how likely it is that one of the websites will attract me. If none does, I rework my query and try again.

But that's just me. I hate the way AI uses your data and then obfuscates the links that you have to go find. Just give me the danged SERPs, and quit feeding me non-attributed content scraped from who knows where. (Yes, I have a personal issue with all that, lol.)
 
In no way am I comfortable with AI scraping our intellectual property and hallucinating answers back, so AI has other serious issues. But on the sole topic of asking a question and getting an immediate answer, AI Overviews and AI-summarized results can be powerful.
 
That might be so, but if you read through 10 websites, what are the chances they are all the same and 100% accurate? If I need something quickly, I end up asking my question to ChatGPT. Time is money.
 
You can block Google’s crawlers from accessing your content via robots.txt, which will prevent them from scraping and summarizing your intellectual property. However, keep in mind that blocking crawlers this way will also prevent your entire site from being indexed and appearing in search results, which could affect your site’s visibility.

The choice and options are there. It does suck, but we have to adapt to survive in this new era of search, or we'll lose out.


One major question is whether an organization should block their site in some way to prevent their site content from showing in the SGE answer.

Answer: To put it bluntly, there is no easy way to do this without harming your site. To remove your content from AI Overviews, you need to block Googlebot itself (not just Google-Extended), which would result in your site no longer ranking and losing all of your organic traffic from Google.
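To illustrate the distinction, here is a sketch of a Google-Extended block: Google-Extended is a separate robots.txt token that only governs whether content is used for Gemini model training, so disallowing it does not remove pages from AI Overviews, which are fed by Googlebot's regular crawl.

```
# Blocks use of content for Gemini model training only;
# pages still appear in Search and in AI Overviews.
User-agent: Google-Extended
Disallow: /
```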

If there is content that you don’t want in AI Overviews, and you don’t want it to rank either, we recommend blocking those individual pages. If you have several pages you wish to block, create a subfolder containing all of them; this folder can easily be blocked in robots.txt without impacting the entire site.
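A minimal robots.txt sketch of that approach (the /private-docs/ folder name is hypothetical): disallow the one subfolder for Googlebot while the rest of the site stays crawlable.

```
# Hypothetical subfolder containing every page to keep
# out of both rankings and AI Overviews.
User-agent: Googlebot
Disallow: /private-docs/
```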

To clarify, let’s look at where Google gets the information it uses to generate AI Overview answers.

How do Google’s AI Overviews get information?

AI Overviews don’t use a special user agent to crawl and fetch data; they rely on Googlebot. Given that Google already has all of this information thanks to Googlebot’s continual crawling of sites, that makes sense.

Additionally, relying on Googlebot limits the control websites have over how the overviews obtain information. If site owners could block Google from using their content, its AI tools would quickly be hamstrung and couldn’t work as well as Google would like. By relying on Googlebot, Google can acquire as much information as possible without fear of missing out, since blocking Googlebot entirely would cost your site its organic traffic and rankings.

This ultimately means that AI Overviews can surface any information about your site that Googlebot can normally access. The only way to hide information is to block Googlebot from accessing those pages via a robots.txt rule.

What if we have proprietary information on our site that we don't want available to the general public?​

Proprietary information would not typically be available on a company’s public website. AI Overviews are powered by the information that Googlebot crawls. The only information it factors in is what you put on your website and allow to be crawled (plus anything others write about your brand, but we’re not addressing that here).

Again, if you have content on specific pages you don’t want to be seen by Google, you can block Googlebot from accessing those specific pages through robots.txt rules that prevent crawling.
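As a sanity check before deploying such a rule, you can test your robots.txt logic locally. This sketch uses Python's standard-library `urllib.robotparser` (the rules and URLs are hypothetical) to confirm which pages Googlebot would be allowed to fetch:

```python
# Sketch: verify robots.txt rules offline with the stdlib parser.
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring a single-subfolder block.
robots_txt = """\
User-agent: Googlebot
Disallow: /private-docs/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blocked: inside the disallowed subfolder.
print(rp.can_fetch("Googlebot", "https://example.com/private-docs/report.html"))  # False
# Allowed: everything else stays crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post.html"))  # True
```

Running this before editing the live file catches typos in paths that would otherwise silently block (or expose) the wrong pages.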

Be very careful with this, though. Double-check to make sure those pages aren’t indexed—if they are, and they are ranking, you need to put a noindex tag on them first and wait for Google to remove them. Once that’s done, you’re safe to put in the robots.txt block. Remember, this will only apply to those specific pages, and you don’t want to do this to your whole site or your important pages.
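The ordering matters because Googlebot can only see a noindex directive on pages it is still allowed to crawl. A minimal sketch of the tag to add first, before any robots.txt block:

```html
<!-- In each page's <head>; Google drops the page from its index
     on the next recrawl. Only then add the robots.txt Disallow. -->
<meta name="robots" content="noindex">
```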
Edit: Google is now working on implementing more links to sites in its AI Overviews!
 