A sitemap is a crucial component of website optimization. It helps search engine bots crawl and index your site so that it can be listed in search engine results.
Sitemaps not only help Google index your site; they also provide valuable tidbits of information such as:
- The number of times a page is updated
- The last time that you changed a page
- The relationship of one page to another
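For reference, this information is carried by tags in the sitemap file itself. A minimal entry might look like the following sketch (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <!-- The last time the page was changed -->
    <lastmod>2024-01-15</lastmod>
    <!-- How often the page is likely to change -->
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```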
A sitemap is especially important if you have a lot of archived content, external links, and several hundred pages. We will assume that you already have a sitemap and look at how to submit it to Google and tweak it for optimal performance.
Tips on How to Submit a Sitemap to Google Search Console
1) Start off by submitting the sitemap to Google
Use Google Search Console to submit your sitemap.
From the dashboard, go to Crawl > Sitemaps > Add/Test Sitemap.
This will allow you to test your sitemap, view and analyse the results, before you finally click on Submit Sitemap. The Test Sitemap option allows you to find errors that may prevent your key pages from being indexed.
Submitting a sitemap is not a guarantee that all pages will be indexed. It serves to inform Google of the important pages on your website.
Google will learn the layout of your site, and you will see which errors to correct so that your pages are indexed properly.
2) Highlight high-quality pages
If your sitemap sends the bots to low-quality pages, then Google will assume that your entire website is low quality and will not send visitors to your site. From an SEO standpoint, this is the death knell for your website.
Use the sitemap to send bots to highly optimized pages, with images and video. The pages should also have unique content and lots of user engagement from comments and reviews.
A sitemap full of poor-quality pages can drag down even your simplest pages, such as a login page, which is otherwise one of the easiest pages to design and optimize.
3) Identify “problem pages”
Google Search Console can be unhelpful when it comes to telling you which pages have not been indexed.
You may submit 10,000 pages, and only 8,000 will be indexed. GSC will not inform you which 2,000 pages are problem pages and have not been indexed.
Ideally, you should split each product or service page into a different XML file and then run a test to see if they will be indexed.
Sometimes the problem may be as simple as pages lacking images or having copy that is not unique. Isolate these problem pages and correct them so that your site is indexed properly.
4) Robots Meta Tag vs. Robots.txt
The “noindex, follow” tag is useful when you don’t want Google to index certain pages. There are pages on your site that don’t need to appear in search engine results, and this tag helps achieve that goal.
On the other hand, use robots.txt to block crawling when your crawl budget is being depleted by unnecessary pages.
This is a great tool when you find Google crawling product pages ahead of core business pages. It helps people find the most relevant pages of your website first, from which they can reach the other pages.
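Concretely, the two mechanisms look like this (the paths are illustrative):

```html
<!-- In the <head> of a page you want crawled but kept out of search results -->
<meta name="robots" content="noindex, follow">
```

And in a robots.txt file at the site root:

```text
# Block crawling of sections that waste crawl budget
User-agent: *
Disallow: /search/
Disallow: /cart/
```

Note the difference: robots.txt stops crawling, while the meta tag stops indexing. A page must remain crawlable for Google to see its noindex tag at all.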
5) Exclude “no index” URLs from your sitemap
When you include “noindex” URLs in your sitemap, you are not only wasting your crawl budget but also creating a contradictory statement.
When you include a URL in your sitemap, you are asking Google to index the page, by virtue of it being on the sitemap. Since it has a “noindex” tag, you are at the same time telling Google not to index the page. That is a contradiction and a waste of crawl resources.
This inconsistency is a common mistake, so take care to exclude such URLs from your sitemap to keep it effective and efficient.
6) Combine XML sitemaps with RSS/Atom feeds
Google learns when you update your pages or add fresh content through RSS/Atom feeds. Using RSS/Atom feeds in conjunction with your sitemap helps Google understand which pages on your site are most important, as well as which ones you have updated, so that the updates can be reflected in search results.
This practice helps people know when you have put in fresh content and they can revisit the pages to see what else you have to offer.
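As a sketch, a minimal Atom feed entry announcing an update could look like this (titles, URLs, and dates are placeholders):

```xml
<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Blog</title>
  <link href="https://www.example.com/"/>
  <id>https://www.example.com/</id>
  <updated>2024-01-15T09:30:00Z</updated>
  <entry>
    <title>Updated product guide</title>
    <link href="https://www.example.com/guide"/>
    <id>https://www.example.com/guide</id>
    <!-- The timestamp tells crawlers the content changed -->
    <updated>2024-01-15T09:30:00Z</updated>
  </entry>
</feed>
```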
7) Keep your sitemap size small
A small sitemap uses fewer resources on your server and enhances its performance. In 2016, Google and Bing increased the maximum file size from 10MB to 50MB. Even then, you should keep your sitemap small and ensure that only priority pages are included.
It is not wise to bloat your sitemap with unnecessary pages simply because you want to increase your visibility. Let quality trump quantity by including only the most important pages in the sitemap.
8) Create multiple sitemaps for large websites
Google allows up to 50,000 URLs on a single sitemap. This is more than enough for a standard website, but some sites require more than one sitemap. For example, large e-commerce sites may need one sitemap for the product pages and another for their blogs, articles, etc.
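The standard way to tie multiple sitemaps together is a sitemap index file that lists each child sitemap. The filenames below are just examples:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
    <lastmod>2024-01-10</lastmod>
  </sitemap>
</sitemapindex>
```

You then submit the index file itself, and search engines discover the child sitemaps from it.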
9) Don’t update modification times until you make credible modifications
Some people try to trick the search engines into re-indexing their sites by updating modification times without making substantial changes.
This is a risky SEO tactic and when Google notices this practice, your date stamps may be deleted. Always provide new value for your visitors before you update your modification times.
10) Use dynamic XML for sitemaps of large websites
Keeping up with meta robots tags on a huge website is a challenging task. To avoid headaches, set up rule logic that determines which pages should be added to your sitemap. Luckily, there are tools that help create dynamic sitemaps; look for some of the best and use them if you have a very large website.
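As a rough sketch of such rule logic, the following Python script builds a sitemap from a list of page records, skipping anything marked noindex (see tip 5). The page data and the `noindex` flag are hypothetical stand-ins for whatever your CMS or database exposes:

```python
from datetime import date
from xml.etree import ElementTree as ET

# Hypothetical page records: in practice these would come from your CMS or database.
PAGES = [
    {"url": "https://www.example.com/", "lastmod": date(2024, 1, 15), "noindex": False},
    {"url": "https://www.example.com/guide", "lastmod": date(2024, 1, 10), "noindex": False},
    {"url": "https://www.example.com/login", "lastmod": date(2023, 6, 1), "noindex": True},
]

def include_in_sitemap(page):
    # Rule logic: keep noindex pages out of the sitemap to avoid
    # contradicting the meta robots tag.
    return not page["noindex"]

def build_sitemap(pages):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in pages:
        if not include_in_sitemap(page):
            continue
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["url"]
        ET.SubElement(url, "lastmod").text = page["lastmod"].isoformat()
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    print(build_sitemap(PAGES))
```

Adapt `include_in_sitemap` to your own rules (quality score, page type, freshness) and regenerate the file whenever content changes.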
Sitemaps are a crucial part of your SEO. If you are facing sitemap-related issues, consider consulting an SEO expert. Take note of the points listed above, and potential visitors will be able to find you easily on the search engines.