
15 Reasons Why Your Site is Not on Google

It is essential to make sure your website appears in Google’s search listings. Google processes over 40,000 queries a second, which equates to around 3.5 billion searches per day. If your website is not listed, growing your online visibility will be much harder.

If you can’t see your site in the search results, you can check whether it has been indexed by searching for your domain with the ‘site:’ search operator:

site:yourdomain.co.uk

For your website to appear in the search results, Google must first index it. No-one can visit your website organically if they can’t find it.

Indexing a new website on Google

If you’ve recently published a website, it can be anywhere between a few days and a few months before you see your website in Google. This is because Google has to crawl your site and add it to the indexing queue.

There are many factors that influence how quickly your website is crawled and indexed, but there are a few things you can do to try and speed the process up: submit an XML sitemap through Google Search Console, request indexing of your key URLs, and link to the new site from pages that are already indexed.

Technical reasons why you’re not indexed

If your website was once indexed and now isn’t, or you’ve published a new website and followed the above steps, you need to check for technical reasons why your website or pages are not indexed.

Are you telling Google not to index the pages, or telling it to index another one instead? Below is a checklist of what to look for when investigating the issue.

1. You’re using “noindex” tags

The “noindex” tag is an HTML directive placed in the code of a webpage to tell Google not to add that page to its index. All webpages are treated as “index” by default, so if a page is set to “noindex”, it is easy to spot.

You can check for it manually in Chrome’s developer tools, or use a crawling tool that will report noindex directives for you.
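To illustrate what you’re looking for (generic examples, not tied to any particular CMS), a noindex directive normally appears either as a meta tag in the page’s head or as an HTTP response header.

In the HTML head:

<meta name="robots" content="noindex">

Or as an HTTP response header:

X-Robots-Tag: noindex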

If you’re using a CMS such as WordPress or Magento, there are several extensions that let you manage the noindex tag through a user-friendly UI, such as a checkbox. If your website uses one of these tools, check that the setting has not been switched on accidentally.

2. You’re using the robots.txt file wrong

It may be that not every page on your site needs to be indexed; some pages may exist purely for user experience, yet have a negative influence on your organic performance. For this reason, every website can choose to use a file called robots.txt. This, amongst other features, allows you to tell Google which URLs you don’t want it to crawl. It is worth noting that this is a guideline rather than a rule – crawlers can choose to ignore the file, and spambots often do.

You can easily check to see if you have a robots.txt file by typing the following in your browser bar:

https://www.yourdomain.co.uk/robots.txt

To see whether your robots.txt file is blocking Google, you can run it through Google’s robots.txt tester.

Alt text: Use Google’s robots.txt tester to see if it’s blocking Googlebot

One common mistake made with robots.txt files is applying a disallow rule too broadly. Something as simple as the two lines below will block your whole site from being crawled:

User-agent: *
Disallow: /
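
If you only want to keep crawlers out of specific sections rather than the whole site, a more targeted set of rules looks like the sketch below (the /checkout/ path and sitemap URL are placeholders for your own):

User-agent: *
Disallow: /checkout/

Sitemap: https://www.yourdomain.co.uk/sitemap.xml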

3. Your website has been hacked

Google is committed to providing the best user experience by returning the most accurate and relevant results for a query. If your website has been hacked, it becomes insecure and a danger to internet users, so consequently it can be de-indexed.

Sometimes a hack can be obvious, in that the website is visibly manipulated, such as the Vogue website being hacked to display dinosaurs.

More malicious hacks that can significantly affect your organic performance are often much harder to spot. These include injecting code, such as links, into your website, prompting users to download harmful files, or creating landing pages for spam emails that get you blacklisted by email clients.

If Google has detected a takeover, there will be a message in the search listings saying: “This site may be hacked”. Users will then be presented with a warning when trying to access your site. Another sign of being hacked is finding strange URLs from crawl reports, webserver log files or your Google Search Console Crawl Error Report.

Fix the problem and you will see your performance start to improve.

4. Google can’t find your page

These are known as orphaned pages. Googlebot is a spider that crawls the World Wide Web by following links to discover URLs, so if there are no links pointing to a page, Google cannot crawl and index it.

If you have a webpage that is not indexed, use a crawling tool to see whether it is linked to from within the website and included in your XML sitemap.
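
For reference, a page you want indexed should be listed in your XML sitemap with an entry along these lines (a minimal sketch using the placeholder domain):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourdomain.co.uk/example-page/</loc>
  </url>
</urlset>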

If there are no inlinks or backlinks, then this is classed as an orphaned page. There are situations where you may intentionally create an orphaned page, such as creating landing pages for email marketing, but you would not want these pages indexed anyway, so they wouldn’t be an issue.

If you find orphaned pages, link to them through content that is relevant to the topic of the page. This will allow users and search engines to discover more information that is useful to them.

5. You are cloaking your pages

Cloaking is the term used to describe the act of showing search engines different content from what you show users. If this is done in a malicious way, such as an attempt to trick search engines, your page will be penalised.

As web technology advances, there are more legitimate reasons to vary content based on a user’s IP address, such as personalising product recommendations or localising content by country. This is why penalisation only occurs when cloaking is used to manipulate search engine rankings.

6. You have dodgy backlinks

At one point, the sheer number of websites linking back to yours could significantly improve organic performance, regardless of how irrelevant or toxic those websites were. Because of this, many webmasters bought links in bulk.

The algorithms have since been updated, and those websites drastically lost organic visibility through penalisation.

If you’ve paid for links recently or historically, then it is likely that Google has penalised you. In Google’s eyes, if your content is useful, people should want to link to you without being paid.

You may not have paid for links, but you might still have a large number of backlinks from irrelevant, low-value websites, signalling to Google that you could be part of a link scheme.

You cannot always control who links to you, which is why you can use a disavow file to tell Google not to consider these external domains when ranking your website.

Conduct a backlink audit to highlight potentially harmful referring domains and populate your disavow list with them.
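
The disavow file itself is just a plain text file uploaded through Google’s disavow links tool, with one domain or URL per line; a minimal sketch with made-up examples:

# Low-value domains identified in the backlink audit
domain:spammy-directory-example.com
domain:paid-links-example.net
https://spun-articles-example.org/post-linking-to-us/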

7. You have spammy structured data

Structured data is code that can be used to provide search engines with more information about the content on the webpage.

It can result in rich snippets in the Google listings, such as review stars, image thumbnails and stock availability, which improve your visibility.
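
To illustrate, structured data is most commonly added as a JSON-LD block in the page’s HTML; the product name and rating figures below are made-up placeholders:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>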

If you mark your website up in a deceptive manner, for example by adding product-specific ratings to a category page just to get stars in the listing, you will receive a penalty and the page can be de-indexed.

Check your Google Search Console account for a manual action message from the team at Google. Within a few days of receiving a manual action, you should see the reason for it in your account. Fix the issue and submit a reconsideration request to get the penalty lifted.

Alt text: Check Google Search Console for manual actions

If you do not have a manual action message, you may still have an algorithmic penalty. Check that your structured data follows Google’s guidelines.

8. You have too many ads

Many websites use adverts to monetise their content, in much the same way TV channels do.

More ads equals more money, right? Maybe in the short term, but not in the long term. Not only do ads slow pages down, they can also make for a bad user experience. If the ads are intrusive and make the website difficult or uncomfortable to use, engagement decreases. That drop in engagement is likely to damage rankings in the search listings, as it does not fit with Google’s core values.

If you’re using an ad template, especially one that shows above the fold, think about redesigning your ad layout to be cleaner, keeping user experience in mind.

9. Intrusive pop-ups and interstitials are damaging rankings

As part of Google’s long-running project to improve the web experience for mobile users, websites using pop-ups and interstitials that make it difficult to read the content on mobile will see a decrease in rankings.

These tactics can be very useful for increasing awareness of services you provide, offers, newsletters and even surveys for collecting user behaviour data.

If you’re using these techniques, make sure that your users can easily close the box and access the content on a mobile device.

10. Duplicate content confuses Google

If more than one URL has the same content, Google may not know which one to rank; as a result, both pages may rank poorly or not be indexed at all.

Using a crawling tool such as Screaming Frog, you will be able to find pages that are exactly the same and investigate why.

Many CMSs, Magento in particular, create duplicate pages through their directory and category structures. There are several ways to resolve duplicate content, and the right one will depend on the limitations of your CMS, your development knowledge and your budget.
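
One of the most common fixes, where the duplicate URLs need to stay live for users, is to point each one at a single preferred version with a canonical tag in the page’s head (the URL below is a placeholder):

<link rel="canonical" href="https://www.yourdomain.co.uk/preferred-page/">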

If there is a lot of duplicate content on the website, not only will this affect the indexation of those pages, it will lower the perceived quality of your whole site. Accessing the same content from different locations is not a good user experience.

11. Automating content through article spinning

It is against Google’s guidelines to automatically generate content, because it often does not read naturally or well.

Article spinning is classed as a black hat SEO technique. It is the act of re-writing existing content just enough to avoid duplication, uploading it onto other domains and then linking back to yourself.

Taking one unique post and using a computer program to re-organise the words to make a seemingly new, unique post is the act of spinning.

This tactic clogs up the internet, flooding the web with the same information, which is why Google has invested a lot of time in detecting this activity and penalising the websites involved.

If you are automating content for backlinks and your rankings have crashed, it is likely because of this; remove the offending pages and look at safer ways to build your link profile.

12. Redirecting users to a different page from the one Google sees

This is another black hat tactic, used to manipulate rankings by showing search engines different content from what users see.

A typical doorway page uses plain HTML to target key terms and climb the rankings, but when a user clicks through to the site, they are redirected to a different, more content-rich page.

This is an old technique that used to work, but it can now result in a penalty. If this is a technique you use, remove the doorway pages, optimise the page users actually land on, and you should see your organic performance improve.

13. Stuffing content with the same keywords

In Google’s infancy, keywords were one of the few signals driving rankings: if a webpage used a term people were searching for a lot, it would rank highly.

As the algorithms have evolved, this tactic can now cause a webpage to fall out of the index. Content stuffed with the same key terms reads unnaturally and is not useful for the user; it creates an uncomfortable experience and damages trust in the brand.

Analyse the content on the pages that are not indexed and see whether it repeats itself unnaturally. If it does, re-write the content and you should see organic performance improve.

14. Your website was overloaded

If your website was down the last few times Google tried to crawl it, it is likely that it has been taken out of the index or dropped in the rankings.

There are many reasons why a website might be down, but common causes are maintenance and server overload.

If there are too many people and bots visiting the website at the same time and your server cannot handle it, the site may go down, which is bad for all involved.

Everything may appear fine when you visit the website off-peak, so you may never spot the issue yourself.

Check your webserver log files to find out what HTTP responses are being served to Googlebot. This will give you a better understanding of what Google is experiencing.
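
As a made-up illustration, assuming the common Apache/Nginx combined log format, an entry like the one below would show Googlebot receiving a 503 ‘service unavailable’ response:

66.249.66.1 - - [12/Mar/2024:10:15:32 +0000] "GET /category/widgets/ HTTP/1.1" 503 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"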

If you do not have server administrators maintaining your site and alerting you to issues, there are tools such as StatusCake that will email you when your website is down.

If you find your website is getting overloaded, you need to look into upgrading your server or hosting plan.

15. No reason, just not indexed

Sometimes there is no obvious reason. When Google crawls your site, it does not index it at the same time. Once a page is discovered, it is added to the indexing queue, where URLs are prioritised and then indexed. This is why no-one can say exactly when a URL will be indexed.

In the new version of Google Search Console, there is a report where you can see discovered URLs that have not yet been indexed.

Alt Text: Google Search Console feature reporting crawled but not indexed URLs

Hopefully you can find the issue with your website by checking these 15 areas. Every website is different and comes with its own issues, so get an SEO audit of your website to uncover why you’re not ranking, along with actionable insights to improve organic performance.

If the ordeal of sorting out an SEO audit sounds like a lot of faff, give us a call! Our experienced technical SEO team is on hand to do all the hard work for you.