Do 404 Errors Hurt My Rankings?

For years, SEO professionals have argued both sides of this question. But that was then, and this is now.

Over the past few years, Google has focused more and more on what it calls the user experience. In fact, Google has said on more than one occasion that user experience is at the very core of its ranking AI. Remember this.

Maybe you know, maybe you don't, or maybe you just don't care, but I've been called a "technical SEO" by John Mueller. John is Google's main Webmaster Trends Analyst and the person most SEOs interact with when they have questions about Google.

And no, I'm not tooting my own horn; I mention it only so you understand a little more about how I see things.

At TRESEO we have a clear focus on helping real estate agents rank better in Google. As a result, we've seen the work of many different web design companies that build websites for realtors and agencies. Not only that, but we've also worked hand in hand with developers of plugins that consume the various IDX/RETS and CREA feeds from Canada and the US.

Without calling out any particular design company, we've begun to see a pattern. An ugly, menacing pattern.

Many of the major web design companies and plugin developers have skated by on the assumption that SEO from 2008 will suffice, and that because Google once said 404 error pages were harmless for ranking, they are acceptable.

Trouble is…what type of 404 error are we talking about here?

That's because there is more than one type of 404 error. There are three, actually. An external 404 occurs when another site links to a page on your website that doesn't exist.

Another is the outgoing 404, where you link to a page on another site that doesn't exist. This is what we call leaking authority, but we'll talk about that another time.

Lastly, we have the internal 404. This is when pages within your own site structure are still being indexed but no longer exist.
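To make the three-way split concrete, here's a minimal Python sketch that classifies a broken link by where it sits relative to your own domain. The function name and signature are my own invention for illustration, not part of any SEO tool:

```python
from urllib.parse import urlparse

def classify_404(own_domain, referring_url, broken_url):
    """Classify a 404 relative to your site.

    own_domain    -- your hostname, e.g. "abcrealestatecompany.com"
    referring_url -- the page that contains the link
    broken_url    -- the URL that returns a 404
    """
    ref_host = urlparse(referring_url).hostname
    broken_host = urlparse(broken_url).hostname

    if ref_host == own_domain and broken_host == own_domain:
        return "internal"   # a dead page inside your own site structure
    if ref_host == own_domain:
        return "outgoing"   # you link out to a dead page elsewhere
    return "external"       # someone else links to a dead page on your site
```

Only the "internal" case, where both the referring page and the dead page live on your own domain, is the one this article is worried about.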

So the first two are, in our opinion, pretty harmless. Although in an age where authority rules all, having some über powerful links pointing to a page that returns a 404 isn't great either. By and large, though, there is little chance the first two types of 404 (external and outgoing) can affect your rankings.

That leaves us with the internal 404 error.

Remember when I told you to remember how important user experience is to Google? Yeah…about that…

Over the past decade I've worked on large client sites like Sony Ericsson, GE, EA and more, and in each case I believe that removing internal 404s had a positive impact on rankings.

Of course, that's hard to prove given everything else going on with a site, its competitors, and Google's algorithm. However, all things being equal, eliminating internal 404s seems to be a powerful piece of the puzzle.

If you look inside your Google Search Console (formerly Google Webmaster Tools) and see "404 page not found" errors in large quantities, wouldn't you think this might make search engines believe your site is incomplete or under construction? Isn't it also plausible that, as a result, they may determine the site isn't worthy of strong search engine rankings?

Google does not want users having a poor experience. Period. And if they feel there is a high probability that the user could end up hitting a dead end on your site, might they not limit you in search?

The image below is a screenshot of one such case, where we saw a massive spike in internal 404 errors and, immediately after, a big rank drop across all keywords. Now, I get correlation vs. causation, and I hear it all the time. But at TRESEO we've seen it far too often: whenever we clean up the internal 404 pages, we see a noticeable jump in rankings.
[Screenshot: spike in internal 404 errors followed by a rank drop]

Ok, so in the case of realtors, what causes this? Simple: imported data feeds. I'm talking about IDX/RETS and CREA here. The others likely handle listings the same way, but for this article we're talking about listing feeds in Canada and the US.

Major plugin providers and web design companies that focus on servicing the real estate vertical are designing their products without any thought whatsoever for SEO in 2017. I say that because if you search back to 2008 or so, you'll see search optimization professionals from all over arguing over the impact, if any, that 404 errors have on ranking. But it's 2017 now, and things have changed.

You see, the problem lies in how the data feeds are displayed. Most make listings appear as though they are hosted locally, so you'll have URLs like http://abcrealestatecompany.com/listing/5555-North-Ridge-Road-Victoria-British-Columbia-V2A-A2V

So what happens is that when the listing is removed, because it's been sold or whatever, the next time the user or Google visits the page, they get a 404 error. And just as the screenshot above shows, you'll often see large spikes in internal 404 errors on real estate sites that display listings from national feeds this way.
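You can spot this pattern in your own server logs before Search Console ever reports it. Here's a small sketch, assuming listing pages live under a `/listing/` path as in the URL pattern above and a common/combined-format access log; the function name and path prefix are illustrative:

```python
import re
from collections import Counter

# Matches the request and status fields of common/combined-format log lines.
LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def count_listing_404s(log_lines):
    """Count 404 hits per /listing/ path in an access log."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "404" and m.group("path").startswith("/listing/"):
            hits[m.group("path")] += 1
    return hits
```

A sudden jump in the totals this returns, day over day, is exactly the kind of spike shown in the screenshot.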

What's the fix? Well, the hardest part is convincing the developers of the websites and plugins that it really is a problem. If we can do that, then maybe it's not unimaginable they would respond with a simple line of code. What is it?

noindex,follow

Told you it was simple. Look, there really is no need to have listings that may only last a week in the search results anyway. The risk over reward here simply isn't worth it. Maybe you're thinking about fresh content. Yeah... not so much. Listings are notorious for having little to no content, and what content they do have is likely spread across hundreds of other realtor sites too, making it a duplicate content issue as well.
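For the developers: in practice that line lives in a robots meta tag in the listing template's `<head>` (or in an `X-Robots-Tag` response header). Here's a hedged sketch of the idea in Python; a real plugin would emit the tag from its template rather than post-processing HTML, and the function name is my own:

```python
# "noindex,follow" tells Google not to index the listing page while still
# following (and passing authority through) the links on it.
ROBOTS_TAG = '<meta name="robots" content="noindex,follow">'

def noindex_listing_page(html):
    """Insert a noindex,follow robots meta tag into a listing page's <head>."""
    if 'name="robots"' in html:
        return html  # page already carries a robots directive; leave it alone
    return html.replace("<head>", "<head>\n" + ROBOTS_TAG, 1)
```

With listing pages kept out of the index, their eventual disappearance never shows up as a pile of internal 404s in Search Console.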

So join me, realtors of the world: stand up, demand change, and improve your rankings in the process.