A month ago Google introduced its Panda 4.0 update. Over the last few weeks we’ve been able to “fix” a couple of sites that were hit by it. Both sites lost more than 50% of their search traffic in that update. When they returned, their previous positions in the search results came back. Sounds too good to be true, right? Read on. It was actually very easy.
Last week Peter – an old industry friend who runs a company called BigSpark – came by the Yoast office. BigSpark owns a website called iPhoned.nl, which had been hit by the ever so friendly Google Panda. Now, iPhoned.nl has been investing in high quality content about (you guessed it) iPhones for a few years, and in the last year they’ve stepped it up a notch. They push out lots of news every day with a strong focus on quality, and their site looks great. Which is why I was surprised to see them hit. You just don’t want your Searchmetrics graph to look like this:
Notice the initial dip, then the return and the second dip, leaving them at a third of the SEO visibility they were “used to”. I dove into their Google Webmaster Tools and other data to see what I could find.
Fetch as Google’s relation to Google Panda
In Google Webmaster Tools, Google recently introduced a new feature on the Fetch as Google page: fetch and render. Coincidence? I think not. They introduced this a week after they rolled out Google Panda. This is what it showed when we asked it to fetch and render iPhoned’s iPhone 6 page:
Even for fans of minimalism, this is too much.
Now, iPhoned makes money from ads. It doesn’t have a ridiculous number of them, but because it uses an ad network, a fair number of scripts and tracking pixels get loaded. My hypothesis was: if Google is unable to render the CSS and JS, it can’t determine where the ads on your page are. In iPhoned’s case, it couldn’t render the CSS and JS because they were accidentally blocked in the site’s robots.txt after a server migration.
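I won’t reproduce their exact file here, but the pattern is common enough: a rule meant for something else sweeps up the theme and script directories. A hypothetical example of the kind of robots.txt that causes this (the paths are made up for illustration):

```
# Hypothetical post-migration robots.txt: these catch-all rules
# also keep Googlebot away from the CSS and JS it needs to
# render the page properly.
User-agent: *
Disallow: /wp-content/themes/
Disallow: /wp-includes/
```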
Google runs so-called page layout algorithms to determine how many ads you have. In particular, it checks how many ads you have above the fold. If you have too many, that’s not a good thing, and it can seriously hurt your rankings.
In the past, blocking your CSS was touted by some as an “easy” way of getting around issues like this, rather than solving the actual problem. Which is why I immediately connected the dots: fetch and render and a Google Panda update? Coincidences like that just don’t happen. So I asked Peter whether we could remove the block, which we did on the spot. I was once again thankful for the robots.txt editor I built into our WordPress SEO plugin.
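The fix is simply to stop disallowing those paths. If you want to be explicit about it, Google’s crawler also understands Allow directives and wildcard patterns, so something like the sketch below (not necessarily the exact lines we used) makes the intent unmistakable. In Google’s implementation the most specific matching rule wins, so these Allow lines override a broader Disallow:

```
# Explicitly allow Googlebot to fetch stylesheets and scripts.
# Google supports the Allow directive and the * and $ wildcards.
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```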
Remarkable resurrection
The result was surprising, even more so because of the speed with which it worked. It’s now been a week since we removed that block, and their Searchmetrics graph looks like this:
They’ve returned on almost all of their important keywords, just by unblocking Google from spidering their CSS and JS.
When we saw this, we went and looked at some of our recent website review clients and found the exact same pattern. One of them turned out to have the same problem and already looks to be recovering too.
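If you want to run the same check on your own site, you don’t even need Webmaster Tools for a first pass. Here’s a minimal sketch using only Python’s standard library; the site and asset URLs are placeholders, so swap in your own. One caveat: urllib.robotparser implements the original robots.txt spec, so Google-specific wildcard rules (* and $) are not evaluated here.

```python
# Quick check: does robots.txt let Googlebot fetch your CSS and JS?
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

# Placeholder asset URLs: point these at your own stylesheets/scripts.
assets = [
    "https://www.example.com/wp-content/themes/mytheme/style.css",
    "https://www.example.com/wp-includes/js/jquery/jquery.js",
]

for url in assets:
    status = "OK" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status:8} {url}")
```

If anything prints BLOCKED, fetch and render in Google Webmaster Tools will show you what that actually costs you visually.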
Confirmation from Google: don’t block your CSS & JS
Now, I don’t usually post my “SEO theories” on the web, mostly because I think that’s more hurtful than helpful in many, many cases, as they’re just theories. So I didn’t want to write this up without confirmation from Google that this was really the cause of the issue here. But then I read this live blog from last week’s SMX, and more specifically, this quote from Maile Ohye embedded in it:
“We recommend making sure Googlebot can access any embedded resource that meaningfully contributes to your site’s visible content or its layout”
That basically confirms our theory, which had already been proven in practice too, so I went ahead and wrote this post. I’d love to hear if you’ve seen similar issues with the Google Panda 4.0 update, or (even better) if a week from now you’re ranking again because you read this and acted!