Why is it that the first thing we think of when traffic goes down is that we’ve been penalised by Google?
Penalties are relatively rare and apply only to a tiny fraction of the web. When a penalty does strike, however, there is no doubt about it: the effects are devastating. No wonder so many webmasters panic when their rankings or traffic drop dramatically. The best thing to do, however, is not to panic. Instead, examine all the likely technical causes of your website’s poor performance.
Early Detection of Traffic Fluctuations
I am an analytics addict and check traffic of a number of sites daily – even in real time. I would assume a busy business owner would not have the luxury of doing that as a part of their daily routine so the best thing you can do is tell your analytics package to alert you when there are big changes in traffic.
The List of Common False Positives
This is an ever-growing list of frequently overlooked causes of traffic drops, often attributed to a Google penalty. Some of these may cause only a slight dip in the rankings (yes, people still think it’s a penalty), while others may cause the entire site to be de-indexed.
Looking good, but are you accessible?
Several things can go wrong when launching a new website. It’s not uncommon for your web designer to leave noindex switched on in the CMS settings, when it was only intended to stay there during development of the site.
Oh, and by the way, is your new website built entirely in Flash, or using images to display text? It had better not be!
Let’s take a dive into specific issues.
Did you actually ask search engines not to index your website?
Check your meta tags and robots.txt file to make sure you’re not blocking search engines yourself. This can happen even on well-established websites. All it takes is the wrong file uploaded to the wrong directory, or a wrong line of code added to the template or through your CMS settings.
Example: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
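If you’d rather verify your robots.txt programmatically, Python’s standard library can parse it for you. A minimal sketch (the robots.txt content and URL are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt that accidentally blocks the whole site
robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches the wildcard rule, so the homepage is blocked
print(rp.can_fetch("Googlebot", "https://example.com/"))  # False
```

In a real check you would point `set_url()` at your live robots.txt and call `read()` instead of parsing a string.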
Change of URL Structure
It’s like driving around with an outdated map.
Other technical issues (which can also appear as a result of a website re-design) are associated with a new URL structure. If you used to have https://dejanmarketing.com/index.php and now you have https://dejanmarketing.com/index.html, you need to tell search engines this has changed. The easiest way to do so is by redirecting the old page to the new one with a 301 redirect. This will also help your users, as they will be taken to the relevant page instead of hitting a 404 error.
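On an Apache server, for example, a permanent redirect can be added to your .htaccess file (the paths mirror the example above):

```apache
# Permanently (301) redirect the old URL to the new one
Redirect 301 /index.php https://dejanmarketing.com/index.html
```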
Broken Links & 404 Error Pages
Are you pouring your link juice into a hollow pot?
The reason for your ranking troubles could be a loss of valuable inbound links from websites which used to link to your old content. Now that you’ve updated your website and/or changed the URL format, those links are taking both users and search engines to 404 pages. Search engines do not count links pointing to pages which no longer exist on your website, and this could be the cause of your drop in the rankings.
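Your server logs will show you which dead pages inbound links and crawlers are still hitting. A minimal sketch that counts 404 responses per path (the log lines are illustrative Apache-style entries):

```python
from collections import Counter

# Illustrative access log lines
log_lines = [
    '1.2.3.4 - - [10/Oct/2012:10:00:00] "GET /old-page.php HTTP/1.1" 404 512',
    '1.2.3.4 - - [10/Oct/2012:10:00:05] "GET /index.html HTTP/1.1" 200 4096',
    '5.6.7.8 - - [10/Oct/2012:10:01:00] "GET /old-page.php HTTP/1.1" 404 512',
]

# Count 404 responses per requested path to find the dead pages
# that users and search engines are still being sent to
not_found = Counter()
for line in log_lines:
    _, request, rest = line.split('"')   # request, e.g. 'GET /old-page.php HTTP/1.1'
    status = rest.split()[0]             # response code, e.g. '404'
    if status == "404":
        path = request.split()[1]
        not_found[path] += 1

print(not_found.most_common())  # [('/old-page.php', 2)]
```

The paths at the top of that list are prime candidates for 301 redirects.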
Too Many Redirects
Search engines may not like your 301 daisy chains.
Have you changed your URLs more than three or four times? Search engines are pretty good at following redirects, but at some point they may find it a bit ‘weird’ and decide not to follow any further. There are also hints that part of the link signal is lost with each 301, so it’s always better to get the webmaster linking to you to update their link rather than creating redirect chains on your own website. If you’re unable to do that, then at least clean up your redirects to avoid chains, and redirect traffic and crawlers from all old URLs to the latest version.
Bad (a daisy chain):
- A > B > C > D > E > F

Good (every old URL redirects straight to the final destination):
- A > F
- B > F
- C > F
- D > F
- E > F
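Flattening a chain is mechanical once you know where every old URL currently points. A small sketch (the redirect map is illustrative):

```python
# Illustrative redirect map: each old URL and where it currently points
redirects = {"A": "B", "B": "C", "C": "D", "D": "E", "E": "F"}

def final_target(url, redirects):
    """Follow a chain of redirects to its final destination."""
    seen = set()
    while url in redirects and url not in seen:
        seen.add(url)  # guard against redirect loops
        url = redirects[url]
    return url

# Flatten: point every old URL straight at the final destination
flattened = {old: final_target(old, redirects) for old in redirects}
print(flattened)  # {'A': 'F', 'B': 'F', 'C': 'F', 'D': 'F', 'E': 'F'}
```

The flattened map is what your redirect rules should express: one hop from any old URL to the current one.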
Analytics Code Integration
It looks like you’ve dropped, but you’re fine.
This has got to be one of my favourites: webmasters going into panic mode, then realising that their analytics tracking code was somehow corrupted or removed. It’s a good idea to check your statistics using Google Webmaster Tools, which does not rely on tracking code, or simply use your server log files or a statistics package such as Webalizer or AWStats.
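A crude first check is simply whether the tracking snippet is still present in your pages at all. A sketch (the HTML and the ‘UA-XXXXX-1’ account ID are placeholders, and the snippet shown is the classic asynchronous Google Analytics code):

```python
# Illustrative saved copy of a homepage carrying the async GA snippet;
# 'UA-XXXXX-1' is a placeholder account ID
html = """<html><head>
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-1']);
  _gaq.push(['_trackPageview']);
</script>
</head><body>Welcome</body></html>"""

# Does the page still reference the tracking code at all?
has_tracking = "_gaq" in html or "google-analytics.com" in html
print(has_tracking)  # True
```

Run a check like this across your templates after any redesign; a missing snippet on one template can look exactly like a traffic drop.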
Hosting and DNS Issues
Because speed matters.
It could be any number of reasons, but if your website is slow or consistently inaccessible to users and crawlers, search engines may choose to display other websites in their results, hoping to provide a better user experience. Site speed is now a confirmed ranking signal.
Devalued Links
When you’re suspicious, but not suspicious enough.
Search engines know when they see a pattern of unnatural links. What they don’t know in every instance is whether it was your intention or not. For this reason Google have been ignoring the borderline cases where links look suspicious but they cannot be certain whether the website is in clear violation of their guidelines, or whether it was hackers, spammers or automated scripts generating links to your website.
If your website relied on unnatural linking techniques and those links suddenly don’t count any more, you may see a drop in your rankings. Again, this does not mean that your website is penalised, but that some of your links simply don’t have any effect any more. Google’s advice is to clean those up anyway, if you know what they are, then submit a reconsideration request and inform their webspam team of your clean-up efforts.
Lost Links
Now you see them, now you don’t!
Smaller websites don’t have thousands of inbound links, and it can often happen that you lose one or two very significant links and experience a drop in rankings because of that. This could be a business partner, a DMOZ listing, or an expired article from a big newspaper which used to link to you.
It’s a good idea to monitor your links, keep a good inventory, and check to see why any have disappeared. Some websites which no longer link to you may have simply forgotten to do so, or decided to use your logo without linking to you. Another case could be a CMS which by default makes all outgoing links ‘nofollow’. A simple call or an email to your colleague is likely to restore the missing links, and your rankings with them.
Hacked Websites: Spammy Links
Are you linking to “Download Adobe Serial Numbers”, “Payday Loans” and “Viagra”?
Yes, it can happen. Google’s trust in your website can be compromised if your website is compromised by hackers. What they do is inject hidden links into your content, which are often not visible on simple inspection. Often the spammy links on your website cannot even be detected in the source code. How’s that? Spammers ensure that the links are only served when Googlebot hits the site, making them very difficult to detect.
One easy way to check is to go to a cached version of your page and view it in text-only mode, or inspect the source code of the cached page itself.
Another great way to detect unusual content on your website is by examining the ‘Fetch as Googlebot’ feature in Google Webmaster Tools, which shows you the page exactly as the crawler receives it.
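Links cloaked for Googlebot can only be caught by comparing a crawler fetch against a normal one, but links hidden with CSS can be flagged locally. A rough sketch (the HTML is illustrative, and a real audit would use a proper HTML parser rather than a regex):

```python
import re

# Illustrative page with a spammy link hidden via CSS
html = ('<p>Welcome to our store.</p>'
        '<div style="display:none">'
        '<a href="http://spam.example/payday-loans">payday loans</a>'
        '</div>')

# Flag any <a href> that follows a display:none style declaration
hidden = re.findall(r'display:\s*none.*?<a\s+href="([^"]+)"', html, re.S)
print(hidden)  # ['http://spam.example/payday-loans']
```

Any URL this turns up that you don’t recognise is a strong sign your site has been compromised.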
Hacked Websites: Malware
Is your website serving trojan horses?
Hackers will find vulnerabilities in your web server or a poorly secured script and find a way to distribute malware to your visitors. This will no doubt affect your ability to rank, but it will not penalise you. Instead, your users will see a warning message in search results and avoid your website like it’s diseased.
Luckily, Google Webmaster Tools are equipped with detection mechanisms which are able to alert webmasters when their websites are compromised by malware.
Also you can use the Safe Browsing Diagnostics Tool to check your page quickly.
Scraped Content
Content is your top asset. Guard it fiercely!
Your website was down for a few weeks while you had your “coming soon” page displayed (bad idea, by the way). In the meantime, scrapers or other websites which picked up your content now rank higher for it than you do. You always have the option to ask these webmasters to remove the copied content, or to provide link attribution, credit or a reference to you as the original publisher. If this doesn’t work, there’s always the DMCA (Digital Millennium Copyright Act).
Duplicate Sites, Duplicate Content
How many sites do you have again…?
Google only wants to show its users unique, compelling content. If you’ve copied content from somewhere else, or use the stock-standard product descriptions supplied by your distributor, Google may decide to filter your website out in favour of others which offer more value to the user.
Sometimes webmasters cause content duplication across their own web properties by cloning one website onto multiple domains. There are various reasons why this happens; some are rooted in ‘SEO myths’, others are legitimate.
An example of an SEO myth is that you’re better off creating 5, 10 or 20 exact-match domain names for specific keywords instead of having one strong website. The problem with this technique is that you’re spreading your efforts too thin: instead of focusing on the content, links and user experience of one website, you now have a whole bunch of them to take care of.
A legitimate reason for website cloning would be if you operate in different countries and have different TLDs:
Ensure that you’ve sufficiently customised each domain to reflect the local address, currency, date format, phone numbers and other signals, and that your lang tags are set correctly.
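For example, the head of each country version can declare its alternates with hreflang tags so Google knows the sites are deliberate regional variants rather than duplicates (the domains are illustrative):

```html
<link rel="alternate" hreflang="en-au" href="https://example.com.au/" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```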
Faceted Navigation & Canonicalisation
Ah, so there’s this thing called Panda, yeah? But it’s not a penalty like many believe; it’s a filter. Panda doesn’t like thin, repetitive and useless content, so try not to make any. I see a lot of websites (particularly online stores) repeat the same mistake over and over again: allowing crawlers to index their search results, including:
- Faceted navigation
- Browsable searches
If you have 500 products, you should not have 50,000 pages in Google’s index. I hope this makes sense. Google is a search engine; they are the ones providing results, and they don’t want to show results made up of more results.
In fact, if you bloat their index with your ‘fluff’, they will take action not only against the thin pages but also against your good content. Again, this is not a penalty but a filter, and it can be felt in both rankings and traffic.
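One way to keep internal search results and faceted pages out of the index is a robots.txt rule set along these lines (the paths and parameter names are illustrative, so adjust them to your own URL patterns):

```
User-agent: *
Disallow: /search
Disallow: /*?sort=
Disallow: /*?filter=
```

For faceted pages you do want crawled but not indexed separately, a noindex meta tag or a canonical link pointing at the main category page does the same job.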
To be continued…
I reached out to my fellow SEOs and asked what they thought were the common reasons people think they got penalised and got an amazing amount of information. I will go over these in the next few days and add them to the existing list with descriptions and recommendations.
If you think of one yourself, please let me know.
Advice From Matt Cutts
Matt Cutts from Google talks about steps he would take to diagnose a drop in traffic/rankings: