If your site is not getting crawled, you are missing out. Crawling and indexing are how Google adds your website to its search index and sends you organic traffic from searches. A site that fails to get indexed is lost amid millions of other websites: there is no audience for your content, because it never appears in Google’s search results.

Before you can fix an indexing problem, you need to diagnose it. Below are some of the most common reasons a site does not get indexed:

  1. Your Site is indexed under a www or Non-www Domain

It is important to note that www is a subdomain, which is why http://website.com is different from http://www.website.com. To make sure Google handles both versions correctly, add both to your Google Webmaster Tools account and verify each of them. You can then set a preferred domain.

Domain Name
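The preferred-domain choice is usually enforced with a server-side 301 redirect from the non-preferred host to the preferred one. Below is a minimal Python sketch of the URL rewriting involved, assuming the www version is preferred; `www.website.com` and the `canonical_url` helper are illustrative placeholders, not part of any Google tool.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url, preferred_host="www.website.com"):
    """Rewrite a URL onto the preferred (www or non-www) host.

    `preferred_host` is a placeholder -- substitute your own domain.
    Issuing a 301 redirect to the returned URL tells Google which
    version of the site to index.
    """
    parts = urlsplit(url)
    if parts.netloc == preferred_host:
        return url  # already canonical, no redirect needed
    return urlunsplit((parts.scheme, preferred_host, parts.path,
                       parts.query, parts.fragment))

print(canonical_url("http://website.com/about?ref=home"))
# -> http://www.website.com/about?ref=home
```

Serving this redirect keeps link equity consolidated on one host instead of splitting it across two.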

  2. Google can’t Find your Site

New websites often experience this. You can wait a few days, but if Google still does not index your site, check that your sitemap is uploaded correctly and working properly. You can also request that Google fetch and crawl your site. Here is what Google suggests:

 

  • On the Webmaster Tools Home page, click the site you want.
  • Under the Crawl option, select Fetch as Google.
  • Provide the path to the page you want checked.
  • Select Desktop from the drop-down list.
  • Click Fetch, and Google will fetch the requested URL.
  • Once the Fetch status shows “Successful,” click Submit to Index, then choose the indexing type that fits your requirement.

 

Site Map
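Before submitting the sitemap, it is worth confirming that the file itself is well-formed XML, since a malformed sitemap is a common reason Google cannot find a site's pages. Here is a small sketch using Python's standard library; the sample sitemap contents and the `sitemap_urls` helper are hypothetical.

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Parse sitemap XML and return the listed URLs.

    Raises xml.etree.ElementTree.ParseError if the XML is malformed,
    which is exactly the kind of problem to fix before submitting.
    """
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.website.com/</loc></url>
  <url><loc>http://www.website.com/about</loc></url>
</urlset>"""

print(sitemap_urls(sample))
```

If parsing succeeds and every page you care about appears in the list, the sitemap side is in order and the Fetch as Google steps above can proceed.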

  3. Robots.txt is blocking your Site or Webpage

One of the most common issues arises when a developer or editor has left the website blocked through robots.txt. No worries: the fix in this case is easy. Just remove that entry from robots.txt, and your website will reappear in the index.

Robots.txt
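You can check whether a robots.txt file blocks Googlebot without waiting for a recrawl, using Python's standard `urllib.robotparser`. The robots.txt contents below are an example of the kind of blanket block a developer might leave behind; the `googlebot_allowed` helper and domain are made-up for illustration.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt left over from development that blocks Googlebot entirely.
blocked = """User-agent: Googlebot
Disallow: /
"""

# The fixed version: an empty Disallow rule allows everything.
fixed = """User-agent: Googlebot
Disallow:
"""

def googlebot_allowed(robots_txt, url="http://www.website.com/"):
    """Return True if the given robots.txt lets Googlebot fetch the URL."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", url)

print(googlebot_allowed(blocked))  # False: the site cannot be crawled
print(googlebot_allowed(fixed))    # True: Googlebot may crawl again
```

Running this against your live robots.txt contents tells you immediately whether the blocking entry is still in place.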

  4. Oops! Your Site Got De-indexed

If this is the case, you are in serious trouble.

If your site was manually penalized and removed from the index, you would have been notified about it. If your website has a shady past, it may be on the verge of a penalty, which could be the reason for de-indexation.

If this is the reason for your website getting de-indexed, you have to gear up for some extra hard work.

  5. Duplicate Content

The golden rule for any website is to stay away from duplicate content. If your website is loaded with duplicate content, you confuse search engines and may lead them to de-index your site. When multiple URLs point to the same material, the search engine cannot tell which one to rank. You can fix this by picking one canonical page and 301-redirecting the rest to it.

Avoid Duplicate Content
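The “pick one page and 301-redirect the rest” fix can be sketched as a simple redirect map. The paths and the `resolve` helper below are made-up examples, not the API of any specific web framework:

```python
# Map duplicate URLs to the single canonical page they should
# permanently redirect to. These paths are illustrative placeholders.
REDIRECTS = {
    "/products/shoes?sort=price": "/products/shoes",
    "/products/shoes/index.html": "/products/shoes",
}

def resolve(path):
    """Return (HTTP status, location) for a requested path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect to the canonical URL
    return 200, path                 # serve the page normally

print(resolve("/products/shoes/index.html"))  # (301, '/products/shoes')
```

Because the redirect is a 301 (permanent), search engines consolidate the duplicates onto the canonical URL instead of indexing each variant separately.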

Proper crawling and indexing by Google are critical for gaining more viewership, which leads to more conversions. Keep the pointers above in mind while diagnosing your indexing problems.
