6 votes

Anyone tried the IndexNow feature yet? Does it benefit SEO at all?

Topic removed by site admin

5 comments

  1. DrStone
    (edited)

    Is it highly talked about? I’m finding practically no current discussion.

    • The linked article is from 2022.
    • I’m not finding much newer than 2023, and it’s all blog posts (and mostly SEO-spammy blogs); nothing on Wikipedia, MDN, or other popular industry publications.
    • There are only 25 questions on Stack Overflow tagged “indexnow”, with just two from 2024.
    • The full search engine implementation list is only: Bing, Naver, Seznam.cz, Yandex, and Yep.
    • Anecdotally, I’ve been in the web industry for well over a decade and have never heard it mentioned by my peers in person or online.
    6 votes
  2. unkz

    This is only going to be useful for an extremely small subset of absolutely massive websites, e.g. eBay and Amazon. Otherwise, as the article mentions, sitemaps are going to do everything perfectly fine.

    3 votes
    1. pyeri

      This is only going to be useful for an extremely small subset of absolutely massive websites,

      At a broad, collective level, though, the search engines' work would be drastically reduced if folks started submitting updated pages to crawlers themselves rather than the other way around, i.e. the crawlers doing the crawling.

      Also, from the search engine's perspective, crawling is just a daily errand; it shouldn't matter whether they crawl a million pages of Amazon/eBay or a million site/blog pages from some John Doe. The churn and grunt work is essentially the same either way, and it could be reduced with this approach.
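
      For reference, the submission side is tiny. A rough sketch of an IndexNow ping in Python (endpoint and field names as I understand them from the IndexNow docs; the host, key, and URLs are made up):

      ```python
      # Rough sketch of an IndexNow ping (field names per the IndexNow spec
      # as I understand it; host, key, and URLs below are made up).
      import json
      import urllib.request

      payload = {
          "host": "example.com",
          "key": "0123456789abcdef",  # the key you also host as a plain .txt file on the site
          "keyLocation": "https://example.com/0123456789abcdef.txt",
          "urlList": [
              "https://example.com/blog/new-post",
              "https://example.com/blog/updated-post",
          ],
      }

      req = urllib.request.Request(
          "https://api.indexnow.org/indexnow",  # shared endpoint; participating engines share submissions
          data=json.dumps(payload).encode("utf-8"),
          headers={"Content-Type": "application/json; charset=utf-8"},
          method="POST",
      )
      with urllib.request.urlopen(req) as resp:
          print(resp.status)  # 200 or 202 means the URL list was accepted
      ```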

      1. unkz

        if folks started submitting updated pages to crawlers themselves

        People already do submit updated pages — via the sitemap protocol. Google also has an API endpoint for submitting new sitemaps. If you check your server logs, you will see Google pull your sitemap automatically fairly frequently too.

        Also, from the search engine's perspective, crawling is just a daily errand; it shouldn't matter whether they crawl a million pages of Amazon/eBay or a million site/blog pages from some John Doe.

        Crawl budget optimization is a whole topic on its own. Suffice to say, they really do care who they crawl and how much they crawl, and convincing them to change their mind is a lot of work.
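
        And to make the sitemap point above concrete: the sitemap protocol is just a static XML file the crawlers fetch on their own schedule. A minimal sketch of generating one (the domain and paths are made up; the schema is the sitemaps.org 0.9 one):

        ```python
        # Minimal sketch: write a sitemap.xml per the sitemaps.org 0.9 protocol
        # (domain and paths below are made up for illustration).
        from xml.sax.saxutils import escape

        pages = [
            ("https://example.com/", "2024-05-01"),
            ("https://example.com/blog/new-post", "2024-05-02"),
        ]

        entries = "\n".join(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{lastmod}</lastmod>\n"
            "  </url>"
            for url, lastmod in pages
        )

        sitemap = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>\n"
        )

        # Serve this at https://example.com/sitemap.xml and list it in robots.txt
        # ("Sitemap: https://example.com/sitemap.xml"); crawlers then pull it on
        # their own schedule, which is the behaviour described above.
        with open("sitemap.xml", "w", encoding="utf-8") as f:
            f.write(sitemap)
        ```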

        4 votes
  3. post_below

    DuckDuckGo is about to make up its mind

    DDG's primary source of results is still Bing, so in a sense their mind is made for them. :)

    1 vote