A brief history of Google’s algorithm updates

These days, the way we do SEO is quite different from how things were done about 10 years ago. There’s one important reason for that: search engines have continuously improved their algorithms to give searchers the best possible results. Over the last decade, Google, as the leading search engine, introduced several major updates, each of which has had a major impact on best practices for SEO. Here’s a — by no means exhaustive — list of Google’s important algorithm updates so far, as well as some of their implications for search and SEO.

2011 – Panda

Obviously, Google was around long before 2011. We’re starting with the Panda update because it was the first major update in the ‘modern SEO’ era. Google’s Panda update tried to deal with websites that were purely created to rank in the search engines, and mostly focused on on-page factors. In other words, it determined whether a website genuinely offered information about the search term visitors used. 

Two types of sites were hit especially hard by the Panda update:

  1. Affiliate sites (sites which mainly exist to link to other pages).
  2. Sites with very thin content.

Google periodically re-ran the Panda algorithm after its first release, and included it in the core algorithm in 2016. The Panda update has permanently affected how we do SEO, as site owners could no longer get away with building a site full of low-quality pages.

2012 – Venice

Venice was a noteworthy update, as it showed that Google understood that searchers are sometimes looking for results that are local to them. After Venice, Google’s search results included pages based on the location you set, or your IP address.

2012 – Penguin

Google’s Penguin update looked at the links websites got from other sites. It analyzed whether backlinks to a site were genuine, or whether they’d been bought to trick the search engines. In the past, lots of people paid for links as a shortcut to boosting their rankings. Google’s Penguin update tried to discourage buying, exchanging or otherwise artificially creating links. If it found artificial links, Google assigned a negative value to the site concerned, rather than the positive link value it would have previously received. The Penguin update has run several times since it first appeared, and Google added it to the core algorithm in 2016.

As you can imagine, websites with a lot of artificial links were hit hard by this update. They disappeared from the search results, as the low-quality links suddenly had a negative, rather than positive impact on their rankings. Penguin has permanently changed link building: it no longer suffices to get low-effort, paid backlinks. Instead, you have to work on building a successful link building strategy to get relevant links from valued sources.

2012 – Pirate

The Pirate update was introduced to combat the illegal spreading of copyrighted content. For the first time, it counted (many) DMCA (Digital Millennium Copyright Act) takedown requests against a website as a negative ranking factor.

2013 – Hummingbird

The Hummingbird update saw Google lay the groundwork for voice search, which was (and still is) becoming more and more important as more devices (Google Home, Alexa) use it. Hummingbird pays more attention to each word in a query, ensuring that the whole search phrase is taken into account, rather than just particular words. Why? To understand a user’s query better and to be able to give them the answer, instead of just a list of results.

The impact of the Hummingbird update wasn’t immediately clear, as it wasn’t directly intended to punish bad practice. In the end, it mostly enforced the view that SEO copy should be readable, use natural language, and shouldn’t be over-optimized for the same few words, but use synonyms instead. 

2014 – Pigeon

Another bird-related Google update followed in 2014 with Google Pigeon, which focused on local SEO. The Pigeon update affected both the results pages and Google Maps. It led to more accurate localization, giving preference to results near the user’s location. It also aimed to make local results more relevant and higher quality, taking organic ranking factors into account. 

2014 – HTTPS/SSL

To underline the importance of security, Google decided to give a small ranking boost to sites that correctly implemented HTTPS to make the connection between website and user secure. At the time, HTTPS was introduced as a lightweight ranking signal. But Google had already hinted at the possibility of making encryption more important, once webmasters had had the time to implement it. 
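For site owners, “correctly implemented” mostly meant serving every page over HTTPS and permanently redirecting the old HTTP URLs. Here’s a minimal sketch of such a setup, assuming an nginx server; the domain and certificate paths are illustrative:

```nginx
# Redirect all plain-HTTP traffic to the HTTPS version of the site.
# A 301 (permanent) redirect lets search engines carry ranking signals
# over to the HTTPS URLs.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com www.example.com;
    # Illustrative certificate paths:
    ssl_certificate     /etc/ssl/certs/example.com.pem;
    ssl_certificate_key /etc/ssl/private/example.com.key;
    # ... the rest of the regular site configuration ...
}
```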

2015 – Mobile Update

This update was dubbed ‘Mobilegeddon’ by the SEO industry, as it was thought that it would totally shake up the search results. By 2015, more than 50% of Google’s search queries were already coming from mobile devices, which probably led to this update. The Mobile Update gave mobile-friendly sites a ranking advantage in Google’s mobile search results. In spite of its dramatic nickname, the mobile update didn’t instantly mess up most people’s rankings. Nevertheless, it was an important shift that heralded the ever-increasing importance of mobile.

2015 – RankBrain

RankBrain is a state-of-the-art Google algorithm that employs machine learning to handle queries. It can make guesses about words it doesn’t know, finding words with similar meanings and then offering relevant results. To improve, the RankBrain algorithm analyzes past searches and determines which results worked best.

Its release marks another big step for Google in better deciphering the meaning behind searches and serving the best-matching results. In March 2016, Google revealed that RankBrain was one of its three most important ranking signals. Unlike other ranking factors, you can’t really optimize for RankBrain in the traditional sense, other than by writing quality content. Nevertheless, its impact on the results pages is undeniable.

2016 – Possum 

In September 2016, it was time for another local update. The Possum update applied several changes to Google’s local ranking filter to further improve local search. After Possum, local results became more varied, depending more on the physical location of the searcher and the phrasing of the query. Some businesses which had not been doing well in organic search found it easier to rank locally after this update, which indicated that the update made local search more independent of the organic results.

Read more: Near me searches: Is that a Possum near me? »

2018 – (Mobile) Speed Update

Acknowledging users’ need for fast delivery of information, Google implemented this update that made page speed a ranking factor for mobile searches, as was already the case for desktop searches. The update mostly affected sites with a particularly slow mobile version.

2018 – Medic

This broad core algorithm update caused quite a stir for those affected, leading to some shifts in ranking. While a relatively high number of medical sites were hit with lower rankings, the update wasn’t solely aimed at them and it’s unclear what its exact purpose was. It may have been an attempt to better match results to searchers’ intent, or perhaps it aimed to protect users’ wellbeing from (what Google decided was) disreputable information.

Keep reading: Google’s Medic update »

2019 – BERT

Google’s BERT update was announced as the “biggest change of the last five years”, one that would “impact one in ten searches.” It’s a machine learning algorithm, a neural network-based technique for natural language processing (NLP). The name BERT is short for Bidirectional Encoder Representations from Transformers.

BERT can figure out the full context of a word by looking at the words that come before and after it. In other words, it uses the context and relations of all the words in a sentence, rather than processing them one by one, in order. This means a big improvement in interpreting a search query and the intent behind it.

Read on: Google BERT: A better understanding of complex queries »

Expectations for future Google updates

As you can see, Google has become increasingly advanced since the early 2010s. Its early major updates in the decade focused on battling spammy results and sites trying to cheat the system. But as time progressed, updates focused more and more on giving desktop, mobile and local searchers exactly what they’re looking for. While the algorithm was advanced to begin with, the additions over the years, including machine learning and NLP, have made it absolutely state of the art.

With the recent focus on intent, it seems likely that Google Search will continue to perfect its interpretation of search queries and style the results pages accordingly. That seems to be its current focus as it works towards its mission “to organize the world’s information and make it universally accessible and useful.” But whatever direction it takes, being the best result and working on having an excellent site will always be the way to go!

Keep on reading: Should I follow every change Google makes? »

Feeling a bit overwhelmed by all the different names and years? Don’t worry! We made a handy infographic that shows when each Google update happened and briefly describes what the purpose was.

Google's algorithm updates 2011-2020


Weekly SEO recap: Google update galore

This week we started with a Google update that rolled out over the weekend. Then there was “some” more news that came out of Google. Ranging from Google Penguin to a Search Console homepage redesign, we’ve got quite a bit to cover, so let’s dive right in!

Joost's weekly SEO Recap

Major update over the weekend

I won’t go into much detail on this, as I already did that earlier this week; go read that post if you haven’t yet and want to know more. It looks like more of the brand terms update that I covered last week, too. Honestly, I wouldn’t be surprised if we got another one this weekend. It looks like Google is testing at a huge scale.

Google Search Console homepage redesign

From the “this isn’t such big news, but it’s fun nonetheless” department: Google redesigned their Search Console homepage. Not the inner pages with info about your site, but the homepage has a brand new look.

Google Webmaster hangouts

One of the things you’ll see on that new homepage is times for things like webmaster hangouts. Some people have the time to attend all of these, and I’m very grateful that they write up what comes out of them, as it’s sometimes very, very interesting, like these bits from The SEM Post.

There are also bits that are, at the very least, very annoying. John Mueller said, according to this post, that titles are not a “critical ranking signal”. Well… As my friend Philipp Klöckner said:

It’s as secondary as oxygen is to a human. It’s not the primary signal that defines your humanity, but it’s *&^%$ hard to do without it, right?

I honestly like John Mueller and don’t think he meant it in the way it was written down in the post linked above, but of course, several people have already commented here on yoast.com asking about it. Titles are important. Trust me on that one.

Google Penguin

Google Penguin is still weeks away, apparently. It will also, just like Google Panda recently, become part of Google’s core algorithm. What it means for something to be “part of the core algorithm” is explained in this post on SearchEngineLand. The bombshell sentence in that explanation is hidden in a quote:

“but essentially nothing changed”

More interesting is this part of the article, near the end:

Ammon Johns, in the hangout, then said, “Once they forgot how it works, it is core?” To which Andrey Lipattsev (of Google) replied, “That is exactly right.”

You might be surprised by this, I’m not anymore. It’s becoming increasingly clear that algorithms run and improve on their own and engineers don’t always know why something ranks somewhere anymore. Machine learning truly is the future of search.

That’s it, see you next week!

joost signature

Another weekend, another Google update

Hot on the heels of last week’s update, we had another Google update this weekend. Looking at the data, there are some outliers, probably due to individual issues with sites, but most of all it seems like Google is changing the branding results all the time now.


Google updates galore

I wrote 3 posts last week, all covering Google core algorithm related updates. If you haven’t read them and this is the first you’re hearing of all these changes, you might want to read them, in order:

  1. Real time Penguin? Or something else?
  2. Google Panda part of Google’s core algorithm
  3. Google core algorithm update: brand terms

As I covered in my brand terms post, this Google update mostly seems to be changing the results for specific brand-related terms. This weekend’s update was no different. What’s surprising is the enormous number of SERPs (search engine result pages) that were updated. The volatility in the SERPs is huge though, as evidenced by all the screenshots in Barry’s post. The update was dubbed “Burj Khalifa” by Dan Petrovic over at Dejan SEO, although I honestly think Petronas Towers would have been a better fit, given that the volatility the next day was just as high.

We’re still analyzing the patterns of this update. In part, it looks like last weekend added some news results back into the SERPs that had been taken out the weekend before. Some of the “winners” this week were losers last week, but none of them recovered their entire visibility. Results didn’t quite return in the same spots and there were definitely some other tweaks. It seems Google is searching for a balance of sorts.

Another core algorithm update

This was definitely not Penguin; you can see that based on what changed, but also because Googler Gary Illyes confirmed it.

From the type of changes SearchMetrics shows, it’s clear that it’s “more of the same” compared to last week’s update, with an emphasis on “more”.

Theories are forming

We’re seeing some first theories form around the web as to what this is, but I honestly have a hard time believing any of them, which is why I’m not linking to them yet. The only thing that is painfully clear when you look at the changed results is that URLs seem to be more important than I think they ought to be. But this could still just be correlation; I have no proof of causation yet.

Have you been hit by these changes? Are you seeing anything particularly weird? Let us know in the comments!

Read more: ‘Google core algorithm update’ »

Weekly SEO recap: it’s not Panda, not Penguin, it’s… Brands!

This week we had quite a bit of news. But… I’ve written about it all already. So I’m gonna be a lazy guy and point you straight at the three posts I wrote this week on Google changes:


The week started with us not knowing much yet, but it’s good to read this to get an idea of what everybody thought had happened:

Google update: Real time Penguin? Or something else?

Then we realized it wasn’t really Penguin, so we moved on. We got the news that Google had made Google Panda a part of its core algorithm:

Google Panda part of Google’s core algorithm

And finally, we figured out what had changed in Google and what this update was really about: brand terms. Read this post for the full view:

Google core algorithm update: brand terms

In all honesty, this is what we’d call, in Dutch “een storm in een glas water”, which translates as: “a storm in a glass of water”, basically: much ado about nothing.

That’s it, see you next week!


Google core algorithm update: brand terms

Over the weekend we saw an incredibly big shuffle in Google search results. I wrote about it earlier this week, as we were researching what happened. I’ll be honest: we’re still researching. But let me update you on what we know and don’t know about this Google core algorithm update.


What we know

We know a few things about this update now. Despite all the promises about a Google Penguin update early this year, this is not it. It’s also not Google Panda. But there’s news about Google Panda anyway, which I’ve written a separate post on:

Read more: ‘Google Panda part of Google’s core algorithm’ »

How do we know that this is not Penguin or Panda? Googler Gary Illyes said so. New Google star Zineb Ait tweeted that this update didn’t have a name but was “just one of the frequent quality updates” (in French).

What did this Google core algorithm update change?

So… What changed? We don’t know. The changes are… Weird. We’ve been using a couple of datasets to look at this update, but most of all we’re looking at SearchMetrics. They publish a list of winners and losers every week, and this week the changes seem to have happened mostly for news sites, specifically for brand terms. For instance, check this list of the keywords that the biggest loser in the US, The Atlantic, lost its position for:

Keywords the Atlantic lost traffic for

Almost all of these are brand terms.

Bartosz has written a good post (with tons of interesting screenshots, if you don’t have access to SearchMetrics) that touches on some of the things I had seen too. He calls it a “brand bidding update”, which I don’t think is right. I do agree with him that the change was in the type of results that Google shows for brand queries. The switch seems to have been from news articles to more “timeless” URLs.

Slugs and/or site structure?

You won’t believe this, and it’s a correlation only (so don’t say I’ve said this is true), but I’m seeing a high correlation between the keyword(s) being the first word(s) of the slug (the URL) and the ranking. It can’t be that simple though. It’s very possible it has to do with a better site structure for the winners versus the losers. Some of the biggest winners are category URLs on sites that have good optimization for their categories and good internal linking, like Mashable. So… This might be a good time to revisit your Category SEO tactics:

Read more: ‘Using category and tag pages for your site’s SEO’ »
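The slug correlation described above can be checked mechanically. Here’s a rough sketch, with made-up data and a hypothetical `keyword_leads_slug` helper, of how you might test whether a ranking URL’s slug starts with the query’s words:

```python
from urllib.parse import urlparse

def keyword_leads_slug(keyword: str, url: str) -> bool:
    """Return True if the keyword's words are the first words of the URL slug."""
    # Take the last non-empty path segment as the slug.
    path = urlparse(url).path.strip("/")
    slug = path.split("/")[-1] if path else ""
    slug_words = slug.split("-")
    kw_words = keyword.lower().split()
    return slug_words[: len(kw_words)] == kw_words

# Hypothetical ranking data: (keyword, ranking URL)
serps = [
    ("google update", "https://example.com/google-update-news/"),
    ("seo tips", "https://example.com/blog/12345/"),
]
matches = sum(keyword_leads_slug(kw, url) for kw, url in serps)
print(f"{matches} of {len(serps)} URLs start with their keyword")
```

Run over a real winners/losers dataset, a high match rate among the winners would support the correlation; it still wouldn’t prove causation.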

Visibility impacted, but traffic?

SearchMetrics (and many similar tools) calculate a score based on the amount of traffic for a specific term and the position you’re ranking on. The idea is that if you rank, for instance, #3 for a term, you’ll receive a certain proportion of the traffic for that term. This is a very valuable way of looking at a site as a site’s visibility score usually has a high correlation to a site’s traffic.

The problem with this visibility score is when searches are mostly navigational. For instance, we rank in the top 7 for [google analytics], but we get close to 0 traffic for that term. The reason is that 99.9% of people searching for [google analytics] actually want to go to Google Analytics.

This means that the actual changes in terms of traffic for this update, even though the changes in visibility are huge, will differ highly per term and will, very often, be negligible. This is, in my opinion, something in the SearchMetrics visibility score that has to be changed, and something I’ve discussed with my good friend (and SearchMetrics founder and CTO) Marcus Tober before.
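To make the visibility idea concrete, here is a small sketch of how such a score could be computed. The CTR curve and the keyword numbers are invented for illustration; SearchMetrics’ actual model is more sophisticated:

```python
# Assumed click-through rates per ranking position (illustrative only;
# every toolmaker uses its own curve).
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05, 6: 0.04, 7: 0.03}

def visibility(rankings):
    """Sum of (monthly search volume x assumed CTR) over all ranked keywords.

    `rankings` is a list of (keyword, position, monthly_volume) tuples.
    """
    return sum(
        volume * CTR_BY_POSITION.get(position, 0.0)
        for _keyword, position, volume in rankings
    )

# Hypothetical example: a navigational term inflates the score, because
# ranking #3 for [google analytics] yields almost no real clicks.
site = [
    ("google analytics", 3, 5_000_000),  # navigational: real traffic ~0
    ("seo basics", 1, 10_000),
]
print(visibility(site))  # ~503,000 "visible" visits, mostly phantom
```

The navigational term dominates the score while contributing almost no real traffic, which is exactly the distortion described above.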

Conclusion

The impact of this Google core algorithm update on search results and visibility was huge; the impact on actual traffic might not be as big. There are definitely things we’ll need to figure out over the coming weeks and months though, like how important site structure and URLs are. Interesting times!


Google update: Real time Penguin? Or something else?

We’re currently seeing one of the biggest and most dramatic changes in Google rankings we’ve seen in the last few years. At first we guessed it was the promised real-time Google Penguin rollout, but all the reports seem to say that it’s multi-faceted and not really tied purely to links, so we’re not 100% sure. In this post I’ll explain what we know right now and what we think about it.


Why everybody thinks it’s Google Penguin

There’s a simple reason why everyone thinks this is a Google Penguin roll out: Google has been talking about an upcoming real time Google Penguin update for months and has said, through several people, that it’d come early this year.

There’s another reason: there are few types of algorithm updates that can have such a dramatic impact on rankings. There are 200+ ranking factors in play (some even say thousands, but whatever). When you tweak one of those ranking factors, not all that much is bound to change, unless the factor you’re tweaking is one of the factors the original algorithm was built on. That’s exactly what Google Penguin does: it changes how links are evaluated.

Reports from around the web

We’re seeing quite a few reports from around the web saying things are in turmoil. Let me link a few:

Read those and you’ll see that none of them really know what’s happening yet. One of the things I would guess, based on looking at the data, is that a couple of other signals have been factored in.

It’s absolutely not certain yet that this is truly Google Penguin.

One post that came in while I was writing this is this one by Bartosz Góralewicz. He shows some interesting data from the spammier side of the net, but isn’t ready to draw conclusions yet.

In the Netherlands, we’ve also seen some changes come in on Friday and Saturday, and today the search results look the same as they did early last week. I’m waiting for the SearchMetrics reports to come in to see whether we can truly establish a trend.

What a real time Google Penguin could do

While looking at what this could be, I was thinking about what the impact of a “real time Google Penguin” would be. Links are the very essence of how Google started making the web searchable. Its link-based PageRank algorithm was what made it so good when it first started. Links have (largely because they were so important) become the Achilles heel of every search engine that followed that method. We’ve been writing about link building a bit recently, mostly because a lot of people seem to be doing it wrong.

Google Penguin, when it first came out, aimed to make links usable again as a metric, by punishing bad actors so severely that people were scared away from trying to game Google. It only rolled out every 8 to 12 months, so you could be punished for quite a while before your site started to properly rank again.

A real time Google Penguin, which would update constantly instead of every once in a while, could therefore, in my eyes, have quite the opposite effect of what Google wants it to do. If people can start testing what works and what doesn’t, and can get “out of jail” pretty easily, this might actually lead to more link spamming, instead of less. Time will tell.

Weekly SEO recap: Googlebot: Apples to…

A nice couple of small news bits this week, of course mostly about Google as usual, but also one about Apple’s spider! Let’s dive right in!


A Google phantom update

There was some discussion a few weeks back about whether an update had taken place. Our friends at SearchMetrics have shown that it was one without a doubt, and it looks a lot like the Quality Update we saw earlier this year. They connect it to the quality guidelines I wrote about 2 weeks ago, which you really should read if you haven’t.

Penguin won’t be coming this year

Remember when I said last week that Gary Illyes had been wrong before? I’ve rarely been proven right this quickly: Google Penguin won’t come until next year, even though Google has said on numerous occasions, mostly through Gary, that it would happen this year. It is what it is, but if you were hoping to get out of your Google Penguin-caused pain before the holidays, that hope is now completely gone.

Rel canonical everywhere?

We’ve done a lot of work with rel canonical here at Yoast. We were among the first to implement it and we still have the best rel canonical implementation in the business for WordPress. If you’re looking for a setting to enable it: no need. It’s always on.

In this week’s Webmaster hangout there were some questions as to whether it was wrong to have a canonical on all pages. We’ve known this to be the right solution for most sites for quite a while, but it’s always good to get some confirmation.
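In practice, “a canonical on all pages” usually means a self-referencing canonical: each page declares its own clean URL as the preferred one, so parameter-laden duplicates consolidate onto it. A minimal example (the URL is illustrative):

```html
<head>
  <!-- Self-referencing canonical: this page points at its own clean URL.
       Variants like ?utm_source=... or ?replytocom=... keep pointing here,
       so search engines consolidate signals onto one URL. -->
  <link rel="canonical" href="https://example.com/blog/seo-basics/" />
</head>
```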

No more changing your location in Google

If you were relying on Google’s filters to let you change the location you were searching from and see results for another country, tough luck: they’ve removed that feature altogether. I’m personally very sorry to see this go, as it was sometimes fairly useful, and I will now need to switch to using proxies more often.

Applebot, disguising as Googlebot

If you see Applebot behaving weirdly in your logs (most people never bother to look and probably shouldn’t, but the best SEOs out there do look at those logs regularly), there’s a reason: it uses your Googlebot rules if you don’t have Applebot lines in your robots.txt.

Makes sense, I think, though I don’t know whether I like it much as a precedent. If every new bot did this, it would become impossible to do things specifically for Googlebot in your robots.txt file.
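Given that fallback behavior, the fix is to give Applebot its own group in robots.txt. A sketch, with illustrative paths:

```text
# Rules aimed only at Googlebot:
User-agent: Googlebot
Disallow: /search-test/

# Without this group, Applebot would fall back to the Googlebot
# rules above; an empty Disallow allows everything.
User-agent: Applebot
Disallow:

# Everyone else:
User-agent: *
Disallow: /private/
```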

That’s it, see you next week!


Weekly SEO Recap: AMPing up to SSL all the Penguins

This was very much the “week after” for us. Yoast SEO 3.0.4 saw the light, but other than that, most of us are still recovering. It was also a week full of tiny little nuggets of newsy niceness. From SSL on Wikipedia to AMP to Penguin. Let’s dive in!


SSL all the things

This is a pet topic of mine and will be for a few more years to come. We all care about privacy, but not everyone realises that saying so means you also have to make some choices. In particular, we should all make the choice to use HTTPS for all our sites. I wrote about that in January 2014 and my thinking hasn’t really changed.

This week Jimmy Wales of Wikipedia gave an interview (via The SEM Post). Some of the key quotes highlight why HTTPS is so important: with HTTPS, a censor can’t block specific pages on a site, only the entire site. Jimmy Wales tells the story of Wikipedia in China:

Around the time of the Beijing Olympics Wikipedia was opened up, the Chinese had a period of liberalisation of the internet, and they opened up and they allowed access to almost all of Wikipedia. But they were filtering certain pages, they were filtering about the usual suspects: things that are sensitive issues in China. So the Tiananmen Square incident; the artist Ai Weiwei; there’s a religious cult called Falun Gong; anything to do with Taiwanese independence - these are the kinds of things they were filtering, just those pages.

Now that Wikipedia moved to HTTPS, that makes that impossible for the Chinese government:

With https, the only thing that the Chinese authorities can see today is if you’re talking to Wikipedia or not, they can’t see which pages you’re joining, which means they no longer have the ability to filter on a page-by-page basis, so they can’t block just Tiananmen Square. They now have a very stark choice: the entire country of China can do without Wikipedia, or they can accept all of Wikipedia.

Unfortunately, that means that right now, all of Wikipedia is blocked in China. But it also means that they can’t profile people based on which pages they’ve visited anymore. I hope that this situation improves in China, but I also hope you understand just one more very solid reason to move to an all HTTPS web.

Google AMPing up on AMP

Gary Illyes of Google has been talking about AMP (Accelerated Mobile Pages) a lot, and in a blog post, Google gave us something to chew on:

Google will begin sending traffic to your AMP pages in Google Search early next year, and we plan to share more concrete specifics on timing very soon.

What that means? Good question. I don’t know any more than you do at this point. I still don’t like AMP much, so I hope it won’t mean they’ll force us all into it.

In other mobile search news, Google has told us that you really shouldn’t use spammy (mobile) networks. What a surprise.

Google Penguin is coming

Google Penguin always seems to be “coming”. It’s almost like winter. But now we know that the next Google Penguin update will be a true update, not just a data refresh, and it’ll be real time. And it’s coming. Really. Apparently still in 2015 too, according to Google’s Gary Illyes. But he’s been wrong before.

That’s it, see you next week!



5 link building DON’Ts you didn’t know about

A lot of link building strategies can backfire, doing more damage than good. If you want to improve your rankings in the long term, use a holistic SEO strategy and avoid certain link building tactics. In this post, I’ll discuss some link building DON’Ts: tactics you should most definitely NOT use. Some of these you probably already know about; I’ll cover those first. Then we’ll move on to 5 less well known – but equally important – DON’Ts.

Before we dive in: if you want to learn more about link building strategies and other essential SEO skills, you should check out our All-around SEO training! It doesn’t just tell you about SEO: it makes sure you know how to put these skills into actual practice!

Risky link building tactics

In the old days, link building meant putting links on as many external pages as possible, often by buying or trading links. Since Google Penguin, these tactics have become a risky SEO strategy. If your link building tactics include spamming, your site risks a Google penalty and could be banned from Google’s results completely. Placing a lot of links may help the ranking of your site for a short while, but probably not in the long run.

Obvious link building DON’Ts

I suspect most of you will already know quite a few of these obvious link building tactics that one must definitely not apply:

  • You shouldn’t buy large amounts of links.
  • Don’t exchange links.
  • Avoid automated programs to get links.
  • Don’t do guest blogging with very thin and off-topic content.
  • Don’t comment on blogs or forums if your only purpose is to leave a link in the comment.
  • Over-optimizing your anchor text isn’t a good idea.
  • You shouldn’t have links that are unrelated to the topic of your website.
  • Avoid having links from sites that have no real content.
  • You shouldn’t have links from spammy sites whose only purpose is to advertise for gambling, viagra, or porn (unless your website is about gambling, viagra or porn).

So, now that we’ve covered these, let’s move on to some lesser known link building DON’Ts.

1. Linking only to the homepage? Don’t!

You should make sure to get links to different pages on your website and not solely to your homepage. If you only – or mainly – receive links to your homepage, your link building will look spammy. Of course, if someone is writing about your brand, a link to your homepage is appropriate. But if a website writes about products or about news from your company, they usually link to your product, news or blog pages. That is just the natural way people link to other pages.

You should make sure your link building strategy resembles the natural way people link to websites. Extra benefit: linking to a more specific page will probably lead to better conversion on your website! So work on getting links to important product pages, or to your cornerstone content pages. Get links to those pages where the deal is closed! It will get your website a trustworthy link profile and will increase conversions at the same time!
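If you want a quick sense of how skewed your link profile is, you can measure what share of your backlinks point at the bare homepage. A sketch with invented data and a hypothetical `homepage_link_ratio` helper:

```python
from urllib.parse import urlparse

def homepage_link_ratio(backlinks):
    """Fraction of backlink targets that are the bare homepage.

    `backlinks` is a list of target URLs on your own site.
    """
    if not backlinks:
        return 0.0
    home = sum(1 for url in backlinks if urlparse(url).path in ("", "/"))
    return home / len(backlinks)

# Hypothetical backlink targets pulled from a link index:
targets = [
    "https://example.com/",
    "https://example.com/",
    "https://example.com/products/widget/",
    "https://example.com/blog/widget-review/",
]
ratio = homepage_link_ratio(targets)
print(f"{ratio:.0%} of links hit the homepage")
```

There is no official threshold; the point is simply that a profile dominated by homepage links doesn’t resemble the natural linking pattern described above.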

2. Pay for specific links? Don’t!

Another link building DON’T is buying links. You probably all know that buying links in bulk from companies claiming to get you ranked fast is not something Yoast would recommend. But what about a single link from an individual company? From a high-quality website right in your niche? Is it wrong to buy one link from such a company? How will Google ever find out about that?

Google won’t know about one link you buy from one company. Still, we’d recommend against it. If this company has sold one link to you, they could sell more links to more people. And although one link won’t alarm Google, as the number of questionable links pointing to a website increases, the risk of getting hit by Penguin or a manual penalty rises as well.

3. Recycling your content on different sites? Don’t!

A way to get links is to write articles about your company or your products and try to get these articles published on other sites. Be careful not to publish the same content on different sites, though! An article in which some sentences and paragraphs are shuffled around and a few words are altered is still duplicate content, barely distinguishable from the original article. That’s especially true if you repeat this trick several times.

In short: recycling content is not creating new content. It is a link building trick and it could backfire. Write the articles for the audience of the website you’re sending your piece to. Yes, that is a lot of work. No one said link building is easy…

4. Forgetting about social media? Don’t!

If you’re building links, do not leave out social media! Social media should definitely be included in your link building strategy, even if it’s not totally clear to what extent links from social media actually help in your ranking. If you receive many links from other websites all of a sudden, it would be extremely weird if you didn’t get any links from social media sites as well. That doesn’t add up and could make you look spammy in Google’s eyes. So alongside your attempts to receive links from appropriate websites, invest in getting shares, tweets and likes on social media platforms as well.

5. Faking it? Don’t!

Your link building strategy shouldn’t just look natural, it should be natural! Make sure your link building isn’t fake. Links should be placed because they benefit the users of a website, and they should fit the content of the page they’re added to. Ideally, your link building strategy is part of a marketing strategy aimed at telling people about your company, your website or your products. It should never aim solely at getting as many links as possible.

Conclusion: links should always be useful

From a holistic SEO perspective, links should be useful for the user of a website. A link should be there because it means something; because the text in which the link is embedded refers to that specific page. If a link is merely there for Google and won’t receive any clicks, the link probably shouldn’t be there.

Links are meant to be clicked on. Link building should, therefore, be about creating links that are useful for the audience of a website, so they will click! Keep these DON’Ts in mind, and your link building strategy should be well under way.

Read more: 6 steps to a successful link building strategy »

The post 5 link building DON’Ts you didn’t know about appeared first on Yoast.

Weekly SEO Recap: RankBrain and Archive Search

This week in search we saw the following important events: Google (finally) got a brain; we’re getting search for the web’s history; Google said Penguin will come before the end of the year.


RankBrain

In a piece on Bloomberg, Google talked about using AI for its search results, something we’ve been speculating about for years. They call this AI “RankBrain”, and it seems to be somewhat related to Hummingbird. It took me a while to realize this, but there’s a reason this was revealed on Bloomberg, not at a search marketing conference or a technical conference: it has a profound impact on the business overall.

The “key phrase” of the whole piece, to me, was this one:

If RankBrain sees a word or phrase it isn’t familiar with, the machine can make a guess as to what words or phrases might have a similar meaning and filter the result accordingly, making it more effective at handling never-before-seen search queries.

This means it’s a true aim at semantic search. At relating topics. At finding entities and relating them to each other and thus providing you with results that are truly what you meant.

There is one post so far that does a good job of explaining what RankBrain could be (in my opinion), and it’s this post by Kristine Schachinger. This satirical piece – on Google getting a heart too – hit the ball home as well, though!

Wayback Search Machine

The Wayback Machine is a huge project that has been archiving pages on the web for ages. You can look up the history of any website on it, for instance yoast.com. This week they announced that they’re building a search engine, which would allow you to do keyword searches through the web’s history.

I think this will get a whole lot of people thinking about how to get rid of pages in those archives, as it will make your mistakes from the past painfully visible.

Google Penguin is coming

I wrote about Google Penguin extensively recently, and Google has now said that a new version will come before the end of the year. I expect a rollout just before the holiday season, as that’s something Google has done more of in recent years. This will force many people who spammed their way to the top of the search engines to scramble and possibly pay for ads. I’m personally curious how this update will affect the search results.

That’s it, see you next week!
