5 Reasons Your Page Is Not Indexed On Google Search

There might be several reasons why your website is not being indexed on Google Search. Here are the top five worth mentioning.


Why does my website not show up in Google Search?

It is a good idea to give Google at least a couple of days, up to a week, to pick up new content. If it has been more than a week, or even a month, and your website pages are still not there, then here is a list of items to check to see what is stopping Google from indexing your content or website:

Have you checked your robots?

Robots.txt is the first place that Googlebot visits on a website; it tells crawlers which parts of the site they are allowed to crawl and which are off limits.

You can check your robots.txt by going to your website and appending the path below to your domain:

https://YOURDOMAIN/robots.txt
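
As a point of reference, a basic robots.txt is only a few lines long. The example below is purely illustrative: it lets every crawler in, apart from one private folder whose name is just a placeholder:

User-agent: *
Disallow: /private-folder/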

When looking at individual pages, do you have the tag below in your HTML head section? Unless this is intentional, Google will not index a page containing this code:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

You’ve blocked Google from indexing your page using a noindex tag

The robots noindex tag is handy for making sure that a certain page will not be indexed, and therefore not listed on Google Search.

This method is commonly used when a page is still under construction; once the page is completed, the tag should be removed.

However, because of its page-specific nature, the tag may be removed on one page but missed on another. With the tag still applying NOINDEX, your page will not be indexed by Google and therefore will not appear in the search results.


If you want to remove the noindex tag from a page and have it reindexed, here's how to do that:

  • Remove the noindex tag from your page
  • Make sure your robots.txt file isn’t blocking Googlebot from crawling your site
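
Keep in mind that noindex does not have to sit in the HTML at all: it can also be sent as an X-Robots-Tag HTTP header, which is why the summary at the end mentions checking it. As a rough sketch, assuming an Apache server with mod_headers enabled, a leftover rule like the one below would keep every PDF on the site out of the index:

<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>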

There are many SEO tools that show you issues with the meta robots tag or the robots.txt file. Have a look at our article on free SEO tools that might help you to solve the problem here.

Are you pointing Googlebot to a redirect chain?

Googlebot is in most cases a patient bot: Google tries to follow every link it can, do its best to read the HTML, pass the page on for indexing, and finally show it to the general public in the Google search engine.

However, if you set up a long, winding chain of redirects, or the page is simply unreachable, Googlebot will stop looking. It will literally stop crawling, sabotaging any chance of your page being indexed. Not being indexed means not being listed on Google Search.

Another thing: do not mix up 301 and 302. Is the page moved permanently or moved temporarily?

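To make the difference concrete, here is a rough sketch of how the two look in an Apache .htaccess file; the paths and domain are placeholders, and your server or CMS may handle redirects differently. The important part is to point old URLs straight at their final destination in a single hop, rather than chaining /old-page to /newer-page to /final-page:

# 301: the page has moved permanently to its final destination
Redirect 301 /old-page https://YOURDOMAIN/final-page
# 302: the move is only temporary and the original URL will return
Redirect 302 /summer-sale https://YOURDOMAIN/offers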

Have you set the canonical link correctly?

A canonical tag is used in the HTML head to tell Googlebot which is the preferred, canonical page in the case of duplicated content.

For example, say you have a page that is translated into Polish. In that case, you'd want to point the canonical of that page back to your default English version.

<link rel="canonical" href="https://ekyos.co.uk/en">

Every page should, as standard, have a canonical tag.

Either link it back to itself where the content is unique, or link it to the preferred page if the content is duplicated. Which raises the question: is your canonical link correct?
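
For the self-referencing case, the tag simply points at the page's own address. A minimal sketch, with a placeholder URL, placed in the head of the page itself:

<link rel="canonical" href="https://YOURDOMAIN/unique-article/">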

In the case of a canonical page and its duplicates, only the canonical page will appear on Google Search. Google uses the canonical tag as an output filter for search. This means the canonical version will be given priority in the ranking.

Check if you have exceeded your crawl budget

Google has thousands of machines to run bots, but there are millions of websites out there waiting to be crawled.

Therefore, every bot arrives at your website with a budget: a limit on how many resources it can spend on you.

To check if you have exceeded your crawl budget, navigate to Google Search Console and look at the Crawl Stats report.

For example, if you have 500 pages on your website and Google is only crawling 10 pages per day, it will take around 50 days before every page has been checked.

There are many ways to improve your crawl budget. First, make sure the content of your website is unique and informative; Google will reward this type of website and restrict those with low-quality content.

One important thing that I have already mentioned above, and that also applies to optimizing your crawl budget, is fixing those redirect chains. They are not only inefficient, they also eat up your crawl budget.

You can also specify which pages to skip when crawling, which saves part of the assigned daily budget. A couple of lines in robots.txt will do it:

User-agent: Googlebot
Disallow: /Example-folder/

Is your page an orphan?

An orphan page is a page that has no internal links pointing to it. Perhaps a link is faulty, making the page unreachable, or a link was accidentally removed during a website migration. Check for broken links with the free SEO tools mentioned above.

An orphan page can’t be crawled because there is no path leading to it; it is not linked from anywhere else on your website. That’s why interlinking is so important: it acts as a bridge for the crawlers from one page of your content to another.
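
Building that bridge can be as simple as adding an ordinary link from a related, already-indexed article to the orphaned page; the URL and anchor text below are only placeholders:

<a href="https://YOURDOMAIN/orphaned-article/">Read our related guide</a>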

Is everything correct on your website?

Is everything working fine, and nothing seems incorrect? Well, the issue might be on Google's side. Some websites might not be on Google's priority list, so you may have to wait until things change.

You can try to speed up indexing by taking the following steps:

  • Update your sitemap in Google Search Console (see the robots.txt tip after this list)
  • Submit your URL directly to Google for indexing
  • Use the URL Inspection tool in Search Console to request indexing (the successor to the old Fetch & Submit tool)
  • Build high domain authority (DA)
  • Check the speed and stats of your website. The faster it is, the better results you will get from Google's bots.
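
On the sitemap step above, you can also reference the sitemap from robots.txt so crawlers find it without a manual submission. A minimal sketch, assuming your sitemap is generated at the site root (adjust the path if your SEO plugin puts it elsewhere):

Sitemap: https://YOURDOMAIN/sitemap.xml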

Summary:

  • Check your robots tag and X-Robots-Tag.
  • Reduce redirect chains.
  • Check whether your canonical link is correct.
  • Optimize your website for better crawl budget allocation.
  • Interlink your web pages to avoid an orphan page situation.
  • Update your sitemap in the search console.
  • Request indexing for your page URL with the URL Inspection tool in Search Console.
  • Build your website to get higher domain authority.
  • Make sure that your website loads fast.
