What Is a Google Sitemap?

The concept of website Sitemaps is not new. In the early days of the Internet, having a
Sitemap was as important as it is today, but back then Sitemaps were primarily
intended for human visitors; the fact that they also helped search engine crawlers was
just a side benefit. An HTML Sitemap is an organized collection of site links and their
associated descriptions. The use of HTML Sitemaps was and still is one of the “nuts and
bolts” of SEO, and they remain the most popular Sitemap type.
Over the years, search engines realized the benefit of Sitemaps. Google embraced the
concept in 2005 with the creation of its own Google Sitemaps Protocol. Shortly after,
Yahoo!, Microsoft, Ask, and IBM followed suit. In 2006, the Google Sitemaps Protocol
was renamed the XML Sitemap Protocol, to acknowledge its “universal” acceptance.
The work of these joint efforts is now under the auspices of Sitemaps.org.
The premise of the XML Sitemap Protocol was that it would help search engines
index content faster while providing ways to improve their existing crawling algorithms.
Using the XML Sitemap Protocol does not guarantee anything in terms of better
page rankings. Furthermore, use of the protocol is not mandatory for all
sites. In other words, a website will not be penalized for not using XML Sitemaps.
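At its core, an XML Sitemap is just a UTF-8 text file listing a site’s URLs under the Sitemaps.org namespace. A minimal sketch follows; the domain and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element of url -->
    <loc>http://www.example.com/</loc>
    <!-- lastmod is optional; dates use the W3C (YYYY-MM-DD) format -->
    <lastmod>2017-01-15</lastmod>
  </url>
</urlset>
```

Each additional page gets its own url element inside the single urlset container.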

Understanding Sitemaps

Sitemaps are divided into two broad categories: those created for human users and
those specifically created for search engine crawlers. Ultimately, both of these categories
are equally important.

Why Use Sitemaps?

It is important to use Sitemaps because they help your visitors quickly get to the information
they need, and they help web spiders find your site’s links.
There is no universal Sitemap rule that you can apply to every site. Understanding
different Sitemap options should help you identify the right type for each situation. The
following subsections discuss some of the reasons for using Sitemaps.

Crawl augmentation

Although web spiders are continuously improving, they are far from perfect. Search
engines have no problem admitting this. Here is what Google says about crawl augmentation:

Submitting a Sitemap helps you make sure Google knows about the URLs on your site.
It can be especially helpful if your content is not easily discoverable by our crawler (such
as pages accessible only through a form). It is not, however, a guarantee that those URLs
will be crawled or indexed. We use information from Sitemaps to augment our usual
crawl and discovery processes.

Poor linking site structure

Not all sites are created equal. Sites with poor linking structures tend to index poorly.
Orphan pages, deep links, and search engine traps are culprits of poor site indexing.
The use of Sitemaps can alleviate these situations, at least temporarily, to give you
enough time to fix the root of the problem.

Crawling frequency

One of the biggest benefits of using Sitemaps is in timely crawls or recrawls of your site
(or just specific pages). XML Sitemap documents let you tell crawlers how often they
should read each page.
Sites using Sitemaps tend to be crawled faster on Yahoo! and Google. It takes Google
and Yahoo! minutes to respond to Sitemap submissions or resubmissions. This can be
very helpful for news sites, e-commerce sites, blogs, and any other sites that are constantly
updating or adding new content.
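The crawl-frequency hint is expressed with the optional changefreq element, whose valid values are always, hourly, daily, weekly, monthly, yearly, and never. A sketch for a frequently updated news page (the URL is a placeholder):

```xml
<url>
  <loc>http://www.example.com/news/</loc>
  <lastmod>2017-01-15</lastmod>
  <!-- hint to crawlers: this page changes roughly once an hour -->
  <changefreq>hourly</changefreq>
</url>
```

Note that changefreq is a hint, not a command; crawlers may visit more or less often than stated.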

Content ownership

Many malicious web scraper sites are lurking around the Internet. Having search engines
index your content as soon as it is posted can be an important way to ensure that
search engines are aware of the original content owner. In this way, a copycat site does
not get the credit for your content. Granted, it is still possible for search engines to
confuse the origins of a content source.

Page priority

The XML Sitemap Protocol allows webmasters to assign
a specific priority value to each URL in the XML Sitemap file. Giving search engines
suggestions about the importance of each page is empowering, depending on how
each search engine treats this value.
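Per the protocol, the optional priority element takes a value from 0.0 to 1.0, with 0.5 as the default, and is relative only to other URLs on your own site. A sketch (the URL is a placeholder):

```xml
<url>
  <loc>http://www.example.com/products/</loc>
  <!-- relative importance within this site; 0.5 is the default -->
  <priority>0.8</priority>
</url>
```

Raising every page to 1.0 defeats the purpose, since the value only ranks your pages against one another.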

Large sites

Using Sitemaps for large sites is important. Sites carrying tens of thousands (or millions)
of pages typically suffer in indexing due to deep linking problems. Sites with this many
documents use multiple Sitemaps to break up the different categories of content.
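The protocol supports this through a Sitemap index file: a separate XML file that lists the individual Sitemap files (each individual Sitemap is limited to 50,000 URLs). A sketch with placeholder filenames:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-articles.xml</loc>
  </sitemap>
</sitemapindex>
```

You then submit only the index file; crawlers follow it to each category Sitemap.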

History of changes

If all of your links are contained in your Sitemaps, the Sitemaps can serve as a history
of your site’s links. This is the case if you store your Sitemaps in version
control. This information can help you analyze changes in page rankings,
the size of indexed documents, and more.
