What's the difference between Google's "submit URL" tool and just having a URL in your submitted XML sitemap?

by Kjenkinsss   Last Updated September 20, 2019

I've used Google's "submit URL" feature in hopes of getting certain URLs indexed (or updated in the index), but I'm curious whether submitting the URLs you want indexed via the XML sitemap is sufficient. I see the "submit URL" feature as a next step if a URL in my XML sitemap is still not being indexed -- is this correct? Or do the two have totally separate functions?

1 Answer

There is generally no reason to submit URLs to Google at all. Googlebot is perfectly capable of finding your pages by following links from other pages. Google will also come back and re-crawl pages at regular intervals, so when you make changes, all you need to do is wait.

Sitemaps are widely misunderstood. It is a common belief that they help get pages crawled and ranked better. If Google can't find a page through links, then including its URL in the sitemap may get Google to crawl and index it. However, without any links pointing to a page, it is unlikely to rank for anything with any competition. Sitemaps just don't help with rankings at all, and most sites don't need them. At best, they give you more stats about your pages in Google Search Console. See The Sitemap Paradox.

Google's submit URL feature is similarly useless. The only time I can imagine that it should be used is if you create a single page that isn't linked from anywhere and don't have it in a sitemap. If you have that page in a sitemap, there is no need to submit it separately.

Neither sitemaps nor URL submission will prompt Google to re-crawl pages it has already indexed; those features mostly help with discovering new pages. Even though sitemaps have "priority" and "lastmod" fields, Google says that it ignores them because webmasters don't usually fill them in accurately. See Sitemap update lastmod and ping, what is the advantage?
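For reference, those are the fields defined by the sitemaps.org protocol. A minimal sitemap entry looks like this (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/some-page.html</loc>
    <!-- Optional fields that Google says it generally ignores: -->
    <lastmod>2018-06-07</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints, and per the answer above, filling them in won't make Google crawl any faster.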

If you want Googlebot to come back and re-crawl a page right away, the only tool that Google gives you is the Fetch as Google tool. Using it, you can get Googlebot to crawl the page immediately and update its content in the Google index. However, if a page isn't indexed, fetching it is unlikely to get Google to start indexing it. The tool is limited to 10 fetches per day, so it won't work if you update a large number of pages. For more info see Need Google to recrawl the pages.

If you have a page that isn't being indexed, that is usually because it isn't linked well, it doesn't cover topics for which people are searching, or it is a duplicate of something else. See Why aren't search engines indexing my content? If you want to get more pages indexed, you need to make your content unique, cover topics for which users search, and get more inbound links to your website. It is almost never an issue of getting Googlebot to come take another look.

Stephen Ostermiller
June 07, 2018 11:56 AM
