As we know, Google constantly tweaks its ranking algorithm to improve the relevancy of its search listings. The first iteration of one of the most dramatic updates in some time, known as the “Panda” or “Farmer” update, rolled out in the spring of 2011. The update’s focus was to eliminate low-quality content and spam from Google’s index. Since then, numerous tweaks to the Panda update have been implemented without major ramifications for most webmasters.

That is, until the most recent, Panda 3.3, which rolled out almost exactly a year after the original Panda update. No less controversial, this one targets “over-optimized” web sites: sites that use moderate to aggressive link building tactics, blog networks and black-hat tools to obtain links, “unnatural” or over-optimized anchor text, and so on.

Since then, Google has sent out signals indicating that SEO is going to matter less in the future and that quality content is going to be paramount. According to Matt Cutts, new changes are going to “level the playing field”, making it easier for folks who don’t focus on SEO to rank higher just by having a great, content-rich site. This has left some webmasters wondering whether this signals the “end of SEO.”

The short and quick answer is no, it doesn’t.

Why?

To answer that, we have to consider how any search engine determines relevancy and assesses the quality of the sites in its database. There are three ways…

1. What you say about you: Essentially, this is the content of your web pages, and the various related HTML elements (Title and Meta tags, H1, H2 and H3 tags, etc.).

2. What others “say” about you: These are the in-pointing links from authority web sites, guest posts, blog comment links, bookmarking and directory sites, and so on.

3. What the social web says about you: These are the “Tweets”, “Likes” and “Google +1’s” that your site generates. Simply put, these are another form of links.
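To make that three-signal model concrete, here is a toy Python sketch of how such signals might be blended into a single relevancy score. The weighted-sum approach and the weights themselves are illustrative assumptions for this article; no search engine publishes its actual formula.

```python
# Toy illustration of blending the three signal families into one
# relevancy score. The weights are invented for this example.

def relevancy_score(on_page: float, links: float, social: float) -> float:
    """Blend the three signal families (each normalized to 0..1)."""
    WEIGHTS = {"on_page": 0.3, "links": 0.6, "social": 0.1}  # assumption
    return (WEIGHTS["on_page"] * on_page
            + WEIGHTS["links"] * links
            + WEIGHTS["social"] * social)

# Example: strong content, decent links, little social activity.
print(relevancy_score(on_page=0.9, links=0.6, social=0.1))  # 0.64
```

Note how, with links weighted most heavily, a site with strong content but no links still scores modestly; that imbalance is the crux of the argument that follows.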

If we go back to the days before Google, when AltaVista, Fast, and Inktomi-powered engines ruled the ’Net, rankings were determined largely by on-page factors, or “what you say about you.” Since the vast majority of webmasters have a vested interest in presenting their web sites as the “best”, regardless of whether they are or not, this didn’t always lead to the highest quality results. Plus, the results were pretty easy to game: analyze the keyword densities of the pages that ranked, then alter your own pages to match them.
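That kind of density analysis is trivial to reproduce. Here is a minimal Python sketch of the keyword-density calculation those early engines effectively rewarded (the sample page text and keyword are, of course, made up):

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * Counter(words)[keyword.lower()] / len(words)

page = "Cheap widgets! Our cheap widgets beat all other widgets."
print(round(keyword_density(page, "widgets"), 1))  # 33.3
```

Anyone who could run a calculation like this on the top-ranked pages could reverse-engineer the “ideal” density and stuff their own pages accordingly, which is exactly what happened.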

Then Google came along.

Google changed the rules of the game, because it figured that “what others say about you” (in the form of in-pointing links, which in essence act like “votes” for the quality of your content) would be a better metric for determining search relevancy than relying purely on on-page metrics.
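That “links as votes” idea is the intuition behind PageRank. Here is a minimal power-iteration sketch over a toy link graph; the three-page graph and the 0.85 damping factor are standard textbook illustrations, not a description of Google’s production system:

```python
# Minimal PageRank power iteration over a toy link graph: pages whose
# in-pointing links come from important pages earn a higher score.

def pagerank(graph, damping=0.85, iters=50):
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        rank = {
            p: (1 - damping) / len(pages)
               + damping * sum(rank[q] / len(graph[q])
                               for q in pages if p in graph[q])
            for p in pages
        }
    return rank

# Toy graph: a and b both "vote" for c; c links back to a.
toy = {"a": ["c"], "b": ["c"], "c": ["a"]}
print(pagerank(toy))  # c accumulates the highest score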

And while there have been some glitches, this strategy has worked pretty well for Google. Despite the fact that Google’s results have been – up to this point at least – relatively easy to game using aggressive link building tactics, focusing on in-pointing links is still by far the best way to determine relevancy.

To suggest, therefore, that SEO or link building is no longer applicable in 2012 is counterintuitive.

Think about what makes content great for a moment.

Is it fantastic prose? A conversational style? Citations or documentation to support statements? A certain number of words? Links to authoritative resources? A specific ratio of nouns to verbs? It could be any or all of these things. Or perhaps even none of them. The point is, what makes content “great” is entirely subjective. Distinguishing great content from average content based entirely on on-page factors is, at the time of this writing at least, not something the search engines are particularly good at.

It still makes the most sense to use off-page ranking factors – that is, links from credible resources – to make that determination.

What about social signals?

These too are going to be important moving forward, but it’s unlikely that they can or will play a huge role in any ranking algorithm, for the simple reason that most sites don’t get more than a handful of “Likes”, “Tweets”, and so on. You also need an established traffic base to generate social signals, so older sites would gain a huge and probably insurmountable edge over newer or less popular sites, regardless of the quality of their content. That seems to go against what Cutts is saying.

So let’s go back to the “level the playing field” statement, and the assertion that Google is making it easier for sites that don’t focus on aggressive SEO to rank higher simply by having a great, content-rich site. How does this play out for you?

Sidebar: Let us ignore the obvious incongruity: that engaging in smart SEO somehow precludes quality content. Investing in SEO has always been a smart business strategy, and for many businesses it generates a decent return on investment, which is exactly why it is done. If Google really is interested in delivering the best possible results to its audience, it can’t simply penalize sites that have used link building tactics in the past if those sites really are the best possible option for that audience.

As anyone who has built a brand-new site in the last year or two will tell you, expecting Google to drive traffic to it without engaging in some sort of link building is akin to waiting on the winning lottery numbers. It just does not happen.

What Mr. Cutts is most likely saying, therefore, is this…

Sites that develop slow-building, natural link profiles – with links that constitute genuine “votes” for the quality of their content – are going to do just as well as, or better than, sites that obtain tons of links via aggressive link building for the sole purpose of manipulating search rankings.

Of course, it’s difficult for Google to assess whether a given link is genuine or not, and since the majority of small, content-rich sites don’t receive many (or any) links from high-quality authority sites, it simply can’t eliminate the value of all low-quality links without compromising the integrity of its database. In many ways, low-quality links are the most natural of links, and the sort most sites acquire (blog comment links, forum links and social bookmarking links are all prime examples).

So what’s the bottom line, moving forward?

Your link building has to appear as natural and “un-manipulated” as possible. Some ways to do this (see the sketch after this list for a rough way to audit a few of them):

1. Your anchor text needs to be varied, and a percentage of it should be “un-optimized” (e.g., “click here”, “for more information”, etc.).

2. Links should point to internal pages of the site and not just the home page.

3. Links should come from a wide variety of link types (e.g., press releases, guest posts, blog comments, videos, social bookmarks, etc.).

4. A significant percentage (perhaps as much as a quarter) of your links should be “nofollow.” Such links are part of a natural link profile.

5. The number of links you obtain should be in direct proportion to your site’s traffic. A site that receives 10 visits a day, for instance, is not going to obtain 5,000 social bookmarks in a month.

6. Stay completely “white hat.” Forget tools that offer instant backlinks, traffic, etc. If Google doesn’t know about them now, it will soon, and will take the appropriate action. You will lose whatever ranking benefit you obtained, and may possibly incur a penalty.
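Several of these guidelines are easy to check mechanically. Here is a rough Python sketch that audits a backlink profile against points 1, 2 and 4; the target anchor (“best cheap widgets”) and every threshold are invented for illustration, since Google publishes no such numbers:

```python
from urllib.parse import urlparse

# Rough audit of a backlink profile against the heuristics above.
# The anchor text and all thresholds are illustrative guesses only.

def audit_links(links):
    """links: dicts with 'anchor', 'url' and 'nofollow' keys."""
    warnings = []
    total = len(links)
    exact = sum(1 for l in links if l["anchor"] == "best cheap widgets")
    if exact / total > 0.5:  # too much money-keyword anchor text
        warnings.append("anchor text looks over-optimized")
    if sum(1 for l in links if l["nofollow"]) / total < 0.25:  # "a quarter"
        warnings.append("too few nofollow links for a natural profile")
    deep = sum(1 for l in links if urlparse(l["url"]).path not in ("", "/"))
    if deep / total < 0.3:  # nearly everything points at the home page
        warnings.append("links rarely point to internal pages")
    return warnings

profile = [
    {"anchor": "best cheap widgets", "url": "http://example.com/", "nofollow": False},
    {"anchor": "best cheap widgets", "url": "http://example.com/", "nofollow": False},
    {"anchor": "click here", "url": "http://example.com/blog/post", "nofollow": True},
]
print(audit_links(profile))  # ['anchor text looks over-optimized']
```

The exact cut-offs matter less than the habit of looking at your profile the way an algorithm might: lopsided anchor text, all-homepage targets or a zero-nofollow profile are the patterns that stand out.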

It goes without saying that your content should be great. This is something Google has always claimed to hold paramount, so it should not come as a surprise. However, the biggest take-home lesson from Panda 3.3 when it comes to building links is that if it “seems unreasonable”, then it probably is, and Google will act accordingly. Now more than ever, it seems, SEO is a race that goes to the tortoise, not the hare. Slow and steady should be your link building mantra, post-Panda 3.3.

Mike Clarke is a blogger, author, webmaster, SEO consultant and occasional contributor to School Grants Blog.com.