Weekly SEO Recap: antitrust ranking factors

I write this knowing that next week I’ll be in Munich with many friends from the SEO industry for SEOktoberfest, a conference unlike any other. We’ll talk shop with some of the best and brightest minds in our industry, and to say I’m excited is an extreme understatement.

Google loses antitrust case in Russia

Headlines like these are appearing more and more often. This week, Google lost an antitrust case in Russia. This follows similar cases in India, Europe and many other countries throughout the world. It’s not all that surprising, given Google’s power, that governments everywhere are becoming nervous. We’ll see what all these cases together end up doing to how Google organizes itself and how it behaves. Google’s recent change to its corporate structure, creating Alphabet Inc, should probably be seen in light of these issues too.

In other legal news, Google was the plaintiff in a case against another firm that was impersonating Google in its robo-calls.

Structured markup and HTTPS as a ranking factor

We wrote about this not too long ago, saying structured markup is not a ranking factor. Obviously, Google reserves the right to start using it for ranking, and I would say that too if I were Google and wanted more people to add that kind of markup to their pages. We’ll believe it when we see the first research confirming that they’re using it for ranking. This piece of user research did show that having rich snippets in position 2 beats being in the first spot. What I did find curious was that they incorporated an author image in the research, something that basically doesn’t exist anymore in the normal search results.
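For readers wondering what “structured markup” looks like in practice, here’s a minimal, hypothetical example of schema.org markup in JSON-LD (the product name and rating values are made up for illustration). It’s annotations like these that can earn rich snippets, such as the review stars mentioned above:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
```

You’d embed this in a `<script type="application/ld+json">` tag on the page; search engines can then read the rating data without it affecting your visible HTML.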

In a similar vein, Google said that HTTPS acts as a “tie breaker”. This is what I’d describe as a “weak ranking factor” at this point. Even if it is only a tie breaker, you should probably still do this for a myriad of other reasons: consumer trust, more reliable analytics data, etc. My thoughts on this are essentially the same as in January 2014.

Over-optimization a common issue?

In a recent hangout, Google’s Gary Illyes once again stressed the importance of quality content:

… what I see often is people try to rank for queries they don’t have high quality and great value content for, and that’s the problem. Sooner or later the algorithms will catch that, you don’t have to overthink it, it’s simple content analysis and they will adjust the rank for the site, and that’s it.

Of course, this is much in line with our own thinking on the topic. If you’re in doubt about your content strategy, our book Content SEO is a must-read. In that same hangout, Gary also discussed the slowness of the Panda rollout, though we’ve seen reports that Google rolled back parts of Panda as well; we discussed those last week.

That’s it, see you next week!


This post first appeared as Weekly SEO Recap: antitrust ranking factors on Yoast. Whoopity Doo!

Weekly SEO Recap: Panda U-Turns, Yandex penalizes and Bing launches new tools

New iPhones! Search news? Oh right… So… About that search news: Panda’s making U-turns, Yandex is penalizing sites for selling links, Bing is offering new keyword tools, and more!

What happened to Panda?

Google announced a Panda 4.2 update that would “very slowly” roll out a while back; we covered that. According to this post by Barry Schwartz at SearchEngineLand, it now seems the update has been reversed.

It could be that Google’s continuous user testing started giving lower ratings for their search results, indicating that Panda 4.2 was, in fact, not an improvement. Google didn’t comment, but it seems a plausible hypothesis. Looking at some sites myself in SearchMetrics, I definitely see weird behavior: visibility going up and coming down again, which could support this hypothesis.

Yandex penalizing link sellers

Yandex has announced that they’ve penalized a bunch of sites for selling links. I’ll spare you a link to the Russian blog post (oops you got it anyway) and give you a link to SearchEngineLand instead. It makes a ton of sense to make these kinds of adjustments, but it’s funny to see them come out of Yandex, who actually announced with some fanfare at the end of 2013 that they would stop using links (for certain commercial queries).

I think it shows how important links are to ranking algorithms and how hard they are to replace. So much that it’s worth cleaning up the signal by making adjustments like this.

Bing launched a new keyword planner

We write a lot about keyword research, as it is, truly, the basis of SEO. So when new tools arrive that help you do keyword research, I always get excited. Even more so when they’re created by Bing: even though Bing drives far less search traffic, its keyword research tools have historically been among the best in the business.

Bing announced a new keyword planner on Wednesday, available in your Bing Ads account. It’s worth creating a Bing Ads account just for that. There are historical statistics in there, suggestions for more keywords, and so on. Of course, the best tool for keyword research is still your brain, but tools like these can help you find new ideas.

Index count in Google Search Console

One of the things you might be keeping an eye on while optimizing a site is the Index Status in Google’s Search Console. It shows the number of indexed pages for a site, and it will usually remain static if you’re not doing too much. So it’s weird when that graph spikes or dips suddenly. Recently, it did; this is the graph for yoast.com:

[Graph: Index Status in Google Search Console for yoast.com]

Turns out that Google had a bug (the dip) and then changed how they calculate this number (the spike). There’s an “explanation” by Google here, but it doesn’t make much sense; they simply changed the calculation. I’m hoping it’ll stay stable at the “new normal”, as that way it’s the most reliable.

That’s it, see you next week!



Weekly SEO Recap: Google Panda 4.2

First things first: I’m typing this as I get ready for a holiday, so the next weekly SEO recap will be on August 21st. Luckily, Google released Panda this week, so I can cover it now. And there’s more, including a statement by Google about the new top level domains. Let’s dive in:

Google Panda 4.2

Google has rolled out Panda 4.2. There are several posts out there covering it, like the ones on SearchEngineLand, TheSEMPost and Search Engine Roundtable. Let me try to explain the most important details about this update in layman’s terms.

Panda 4.2 is not an update, but a refresh

Technically, Panda 4.2 is not an update: Google didn’t introduce new signals, it just reapplied the same signals on new data. This needs some explaining for most people, so let me try: Google Panda is the result of a very deep analysis of Google’s index. One that it can’t run continuously, like it does its normal ranking, but a calculation that takes months. So this is what we call a “data refresh”: it has run the analysis on a new set of data.

Because Panda needs to be “refreshed”, it has a very negative side effect, especially as these updates don’t exactly run often. The previous update was 10 months ago. If you were hit then and have been improving your site since, this was your first chance to “get out” of Panda. If you think that’s harsh, you’re not alone. Many SEOs out there take issue with this but I’m guessing that’s not going to help them. If you get hit now, you should be aware that recovery is going to take several months, probably up to 10 or 12, even if you get it right the first time.

The fact that Panda needs refreshes also means that making changes now won’t do you any good in terms of staying out of Panda. It has a cut-off date and it won’t see anything after that. That being said, now is always as good a time as any to start improving your site.

Panda 4.2 is a slow Panda

The quote from Google’s spokesperson says it all:

“This past weekend we began a Panda update that will rollout over the coming months”

You read that right. This Panda rollout will not take hours. Not days. Not even weeks. It will take months. This is probably why nobody noticed the update as it began rolling out. This slow roll-out will also make it virtually impossible to correctly assess a win or loss as a definite Panda issue.

If you want to read more, I think Jen’s coverage over at theSEMpost was the most extensive.

Don’t want to be hit by the Panda? Don’t be bamboo!

If you’re afraid of being hit by Panda, and want us to make sure you’re not going to be a candidate, order a website review. We’ve seen many Panda victims over the years and we know we can help, whether you’ve already been hit or we think you’re bamboo (also known as: a likely victim).

New TLDs and Google

Other things happened besides Panda this week, and a few warrant being mentioned. The most important thing for many (aspiring) domainers out there was this post by John Mueller on Google’s webmaster blog. It details how Google deals with new top level domains. In short: like it would with any other domain. This bit is very important:

Keywords in a TLD do not give any advantage or disadvantage in search.

Another important bit is whether Google would treat new domains like .london and .amsterdam as local TLDs or as “global” TLDs, i.e. domains that can rank anywhere in the world. The answer is clear:

Even if they look region-specific, we will treat them as gTLDs. This is consistent with our handling of regional TLDs like .eu and .asia.

Of course, Google wouldn’t be Google if they didn’t add an exception to that straight away:

There may be exceptions at some point down the line, as we see how they’re used in practice.

Sigh. So. They’re global, for now. Over time, they might become local.

My opinion on the new TLDs

You didn’t ask, but I’ll give you my opinion anyway. I like the concept. I would have liked 3-5 new TLDs: a number that would work and that people could maybe remember. The gigantic number of new TLDs we have now is pure nonsense in my opinion. Would I use one for a transactional site? Probably not for a while longer. If trust is one of your main issues, and let’s face it, with eCommerce it still is, using a TLD that some of your users might never have heard of is not a good idea. The same goes for getting links to domains like that: it’s going to be harder.

Another problem I see with the new TLDs is that they won’t work nicely as an email address for quite a while. You can receive email on them just fine, but email validation in forms will be broken for at least another decade or so, which means forms will tell you your new hipster email address is invalid when it isn’t.
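To illustrate the validation problem, here’s a hypothetical sketch in Python of the kind of overly strict email pattern that’s common in web forms (the pattern and function names are my own invention, not from any specific library). It hard-codes a 2-4 letter TLD, so any address on a newer, longer TLD gets rejected:

```python
import re

# A naive pattern of the sort often found in form validation:
# it assumes the TLD is 2 to 4 letters long.
NAIVE_EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)*\.[A-Za-z]{2,4}$")

def naive_is_valid(address):
    """Return True if the address matches the (too strict) pattern."""
    return bool(NAIVE_EMAIL_RE.match(address))

print(naive_is_valid("joost@example.com"))        # True
print(naive_is_valid("joost@example.amsterdam"))  # False: TLD is longer than 4 letters
```

A real .amsterdam address is perfectly valid, but this pattern rejects it, which is exactly the breakage described above.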

Overall, I think what the new TLDs do more than anything is strengthen the value of .com domains. If you have a nice, short, memorable .com, I think you’ll be stronger in the long run.

Featured snippets and how to get them

This post on SEL by Eric Enge should be required reading for anyone working in SEO. This quote, from the end of his article, explains best why you should know about this:

… getting a featured snippet for key pages on your site is a good thing. The business value depends on identifying common questions that a potential customer might ask related to your market space.

I’m thinking of some experiments for our own site right now, but they’ll have to wait. It’s time for my holiday first. Did you notice the banner with my sleek summer outfit? If not, check out the Yoast SEO Premium sale we have. If you’ve been pondering buying it, now’s the best time to do so.

That’s it, see you next month!



Phantom Quality Panda and Google Search Console

At Yoast, we’re all really busy preparing for YoastCon next Wednesday, as you will understand. When we SEO people are really busy focusing on other stuff, Google tends to ‘surprise’ the online community with breaking news, algorithm updates or other developments that really need our attention as well. Why is that, you might ask? Well, remember the Panda updates that immediately affected 5% of the internet? Those, for instance, also meant an increase in site review business for us. We want to be prepared :)

Quality Update, Phantom 2 or Reverse Panda?

Google did another one of these algorithm updates on the 3rd of May this year, just a couple of weeks ago. HubSpot mentions that it was mainly less contributing how-to sites, like HubPages and eHow, that lost rankings. By “less contributing” we mean sites that contribute little new, quality content to Google. Google claims it wasn’t Panda. Nevertheless, it seems that mainly sites that had already dealt with a Panda penalty dropped in rankings.

There are many names for this update, since Google didn’t give it one. As a side note: has anyone mentioned that Google has been less talkative about these updates since Matt left? The name we like least is ‘Quality Update’ or ‘Quality Algo’; that’s not a distinctive name whatsoever. For years, Google’s wish has been for webmasters to focus on quality. Panda is about quality, and so is Penguin; Google’s entire penalty system could be called a quality algo. Glenn Gabe called it the ‘Phantom 2’ update, it being the second large, unnamed algorithm change. I like that one a lot better, but I have to say that, as it is clearly Panda related, Reverse Panda is my personal favorite.

The update is said to positively influence rankings of websites that provide quality content, instead of Panda punishing sites that lack quality content. That would indeed make it a reverse Panda. Call a spade a spade, right?
The algorithm update isn’t targeting low quality sites, but a side effect of an update like this is of course a decline in rankings for those Panda candidates. Just stop being bamboo, as we often say at the office. It’s all about quality content. Google’s search result pages are stuffed with websites, and if your website simply isn’t contributing on a larger scale (e.g. in the search results), you’d better up your game fast.

Google Search Console!?

You probably won’t find notifications about this reverse Panda update in the Manual Spam Action section of Google Webmaster Tools, by the way. For two obvious reasons:

  1. It’s not a manual action, but it could be the start of even more real-time algorithm updates.
  2. It’s not Google Webmaster Tools anymore.

What was that all about, right? Since not all users of Webmaster Tools are webmasters, Google renamed it to Google Search Console, to target ‘more users’. Seriously. But hey, it brings in some extra traffic, right? That is why we decided to rename our WordPress SEO Premium plugin to Page Analysis And General SEO That Includes Redirect Options For Crawl Errors In Google Search Console Plugin. That just makes clear that it’s not just for people who want green bullets. Now excuse me, I’ll have to go and rewrite all the pages on this website about that plugin.


Google Panda 4, and blocking your CSS & JS

A month ago Google introduced its Panda 4.0 update. Over the last few weeks we’ve been able to “fix” a couple of sites that were hit by it. These sites both lost more than 50% of their search traffic in that update. When they recovered, their previous positions in the search results came back. Sounds too good to be true, right? Read on. It was actually very easy.

Last week Peter, an old industry friend who runs a company called BigSpark, came by the Yoast office. BigSpark owns a website called iPhoned.nl, and they’d been hit by the ever so friendly Google Panda. Now, iPhoned.nl has been investing in high quality content about (you guessed it) iPhones for a few years, and in the last year they’ve stepped it up a notch. They push out lots of news every day with a high focus on quality, and their site looks great. Which is why I was surprised they were hit. You just don’t want your Searchmetrics graph to look like this:

[Graph: iPhoned.nl SEO visibility in Searchmetrics]

Notice the initial dip, then the recovery, and then the second dip, leaving them at a third of the SEO visibility they were “used to”. I dove into their Google Webmaster Tools and other data to see what I could find.

Fetch as Google’s relation to Google Panda

In Google Webmaster Tools, Google recently introduced a new feature on the fetch as Google page: fetch and render. Coincidence? I think not. They introduced this a week after they rolled out Google Panda. This is what it showed when we asked it to fetch and render iPhoned’s iPhone 6 page:

[Screenshot: “Fetch and Render” result for iPhoned’s iPhone 6 page, rendered without CSS]

Even for fans of minimalism, this is too much.

Now, iPhoned makes money from ads. It doesn’t have a ridiculous amount of them, but because it uses an ad network a fair amount of scripts and pixels get loaded. My hypothesis was: if Google is unable to render the CSS and JS, it can’t determine where the ads on your page are. In iPhoned’s case, it couldn’t render the CSS and JS because they were accidentally blocked in their robots.txt after a server migration.
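To illustrate the kind of mistake involved (with hypothetical paths; the actual robots.txt wasn’t shared), an accidental block after a migration can be as simple as disallowing the directories that hold your stylesheets and scripts, and the fix is to stop disallowing them, or to explicitly allow those asset files:

```
# Before: asset directories blocked, so Googlebot can't render the page layout
User-agent: *
Disallow: /wp-content/themes/
Disallow: /assets/js/

# After: assets crawlable again (Googlebot supports wildcard Allow rules like these)
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

You can verify the effect yourself with the “fetch and render” feature described above: if the rendered preview looks like your actual page, Googlebot can see your layout.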

Google runs so-called page layout algorithms to determine how many ads you have. In particular, it checks how many ads you have above the fold. If you have too many, that’s not a good thing, and it can seriously hurt your rankings.

In the past blocking your CSS was touted by others as an “easy” way of getting away from issues like this, rather than solving the actual issue. Which is why I immediately connected the dots: fetch and render and a Google Panda update? Coincidences like that just don’t happen. So I asked Peter whether we could remove the block, which we did on the spot. I was once again thankful for the robots.txt editor I built into our WordPress SEO plugin.

Remarkable resurrection

The result was surprising, even more so because of the speed with which it worked. It’s now been a week since we removed that block, and their Searchmetrics graph looks like this:

[Graph: iPhoned.nl’s Searchmetrics visibility recovering after the robots.txt fix]

They’ve returned on almost all of their important keywords. Just by unblocking Google from spidering their CSS and JS.

When we saw this we went and looked at some of our recent website review clients and we found the exact same pattern. One of them turned out to have the same problem and already looks to be returning too.

Confirmation from Google: don’t block your CSS & JS

Now I don’t usually post my “SEO theories” on the web, mostly because I think that’s more hurtful than helpful in many, many cases, as they’re just theories. So I didn’t want to write this up without confirmation from Google that this was really the cause of the issue here. But then I read this live blog from last week’s SMX, and more specifically, this quote from Maile Ohye embedded in it:

“We recommend making sure Googlebot can access any embedded resource that meaningfully contributes to your site’s visible content or its layout”

That basically confirms our theory, which had already been proven in practice too, so I went ahead and wrote this post. I’d love to hear if you’ve seen similar issues with the Google Panda 4 update, or (even better) if a week from now you’re ranking again because you read this and acted!
