SEO Guide for 2011

Split Testing

Google is most definitely split testing search results, and has probably been doing so for the whole of 2010. I may have taken almost a year to write about it, but I’ve been watching intently as the “Google Dance” starts to make a lot of sense. In the old days the “Google Dance” was the result of different Google servers being out of sync, which may be why this split testing phenomenon has gone unnoticed.

The theory is this: Google already displays impression-to-click conversion rates for your pages in Google Webmaster Tools. They have the data, and now they are acting on it. I’ve started to notice that data being fed back into the SERPs, and small changes to page titles and meta descriptions are having a disproportionate effect on rankings.
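
To make the theory concrete, here is a minimal sketch in Python of the kind of comparison I suspect is going on behind the scenes. The titles, figures and field names are all made up for illustration and have nothing to do with Google’s actual export format; a real comparison would use your own Webmaster Tools data.

    # Hypothetical example: comparing click-through rates for two title/description
    # variants of the same page. The figures and field names are invented for
    # illustration; a real comparison would use your own Webmaster Tools export.

    variants = [
        {"title": "Car Leasing | Acme Leasing", "impressions": 1200, "clicks": 36},
        {"title": "Cheap Car Leasing Deals - Compare & Save", "impressions": 1150, "clicks": 61},
    ]

    for v in variants:
        v["ctr"] = v["clicks"] / v["impressions"]
        print(f"{v['title']}: {v['ctr']:.1%} CTR")

    # Keep whichever title/description sells the click best.
    best = max(variants, key=lambda v: v["ctr"])
    print("Winner:", best["title"])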

It makes complete sense: Google wants to display the most relevant result, so why not let users choose? Google has been dropping hints along the lines of “meta descriptions no longer matter for SEO, instead use them to sell the click”. This is a very positive move for SEO, as it takes attention away from spam and puts the focus back on real websites.

User Voting

In this SEO Guide for 2011, unlike last year, a humble user can favourite web pages directly in the search results, and I’ve found that even a small number of glowing yellow stars makes a difference to rankings. Again it makes complete sense; it’s the social search engine concept I’ve been harping on about for the past 2-3 years. Beyond on-page SEO and link building, the only other data Google can gather to serve better results is user generated. Just like on Digg, Facebook and YouTube, users vote/like/digg up (or down) a story, a video, an item and now a search result.

Yes – I have been automating voting on Google :-)

Synonym Match / Tense Match / Plural & Singular Match

Throughout 2009 and 2010 people bought up seemingly every keyword domain in the world; even the most obscure long tail keyword with little over 400 searches a month has no exact match domain left available. That’s because exact match domains have been so powerful.

But in 2011 (and for the sake of this pages SEO, I’m just going to write “SEO Guide for 2011” again) – Google is starting to share results for synonym matches. If you search for “Car Leases”, Google will highlight both the word “Car” and the word “Leasing” – which isn’t exactly what you searched for… Interesting huh?
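
As a rough illustration of why “Leases” and “Leasing” can be treated as the same query, here is a toy Python stemmer. It’s a crude suffix-stripper and nothing like Google’s real synonym handling, but it shows how plural and tense variants collapse onto the same stem:

    # Toy suffix-stripping stemmer, purely illustrative; Google's synonym and
    # stemming systems are obviously far more sophisticated than this.

    def crude_stem(word):
        word = word.lower()
        for suffix in ("ing", "es", "ed", "s"):
            if word.endswith(suffix) and len(word) - len(suffix) >= 3:
                word = word[: -len(suffix)]
                break
        return word.rstrip("e")  # fold "lease" and "leas" onto the same stem

    for term in ("lease", "leases", "leasing", "leased", "car", "cars"):
        print(term, "->", crude_stem(term))
    # All of the "lease" variants collapse to "leas"; "car" and "cars" to "car".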

I’m guessing the new domain name land rush in 2011 will be for synonym/plural/singular and tense match domain names.

Exact Match Domain Names

My prediction in this SEO guide for 2011 is that exact match domains are far less powerful than they once were, and that this trend will continue. It used to be that Joe Bloggs and his $25k could buy a nice big juicy keyword domain and up popped a new $1mil business overnight.

I theorise that beyond a certain level of competition, exact match domains offer a diminishing return. In a tiny market your exact match domain might get you to the top with 10 links against the competition’s 1,000, but in bigger markets you’ll need 9,010 links against the competition’s 10,000. In other words the bonus seems to be worth a roughly fixed number of link-equivalents, so its relative value shrinks as the market grows (see the sketch below).
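
Running my own numbers back through a quick Python sketch (back-of-the-envelope only, not measured data) shows what I mean: the head start is roughly constant in absolute terms, but shrinks as a share of the competition.

    # Back-of-the-envelope sketch using the figures above; not measured data.

    markets = [
        ("tiny market", 10, 1000),    # (name, your links, competitor's links)
        ("big market", 9010, 10000),
    ]

    for name, yours, theirs in markets:
        bonus = theirs - yours        # head start in link-equivalents
        share = bonus / theirs        # head start as a share of the competition
        print(f"{name}: ~{bonus} link-equivalents head start "
              f"({share:.0%} of the competition's profile)")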

It makes sense, since it is against Google’s best interests to make ranking that easy.

Other TLDs

I’m theorising this year that TLDs other than .com, .net etc. are just as good as any other TLD, except that they lose the keyword match domain bonuses. With that in mind, any old domain name, including .info and .biz, can be just as powerful with the right content and promotion. This theory is based on my experience with a few “bad” TLD domains, which turned out not to be all that bad after all.

Gurus

My SEO guide for 2011 is based on experience, and in my experience SEO “gurus” are so far behind the times it’s dangerous.

I’m going to pick on Leslie Rohde for a second – not because it’s anything personal, but because Leslie Rohde is the gurus’ guru. I heard an interview with him earlier this year where he said in one breath “blog commenting doesn’t work because of no-follow linking” and in the next breath “blog commenting is good because you get into RSS feeds”. Which is it, Leslie?

Blog commenting does work: firstly because no-follows between sites don’t mean anywhere near as much as they used to, and secondly because, yes, RSS feeds are powerful.

But the real reason I’ve singled Leslie Rohde out above anybody else is that, while researching this SEO guide for 2011, I had a look at his website. If you have the right plugins installed you’ll immediately notice that Leslie is linking internally with no-follow links… bad, bad, bad – more on this to follow. (Image below, no-follow links highlighted in pink.)

I’ve spoken to far too many gurus who aren’t implementing anything they harp on about – this really does my head in. SEO companies are just as bad; I had a similar experience with “the top SEO company in the USA”, which was destroying one of the UK’s top 50 websites with internal no-follow linking.

Gurus generally make too much money teaching and don’t implement for themselves, and in this fast-paced industry that just leaves them in the dust of the practitioners. SEO companies make too much money selling rubbish, so they continue selling rubbish because it makes them money. Can’t blame them really.

No-Follow Links

Any internal no-follow links are bad for your site. No-follow was created to mean “I don’t trust this”, so you aren’t sending out a very good message if you are “not trusting” pages on your own website. Stop trying to trick the search engines, because you are just shooting yourself in the foot. (Sorry to pick on you again, Leslie Rohde.)

External no-follow links (links to sites other than your own, or links from other sites to your own) are far, far more powerful than they used to be. The no-follow attribute has been relaxed substantially in the past few years for links to external sources.

There is a very firm distinction to make between internal no-follow links and external no-follow links. I wish more people had a brain cell and could tell the difference.
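
If you don’t have a highlighter plugin handy, a minimal Python sketch like this will flag internal no-follow links on a page so you can go and remove them. “https://example.com/” is a placeholder; point it at one of your own pages.

    # Minimal internal no-follow checker, a sketch only. It inspects a single
    # page, not a whole site.

    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    PAGE = "https://example.com/"      # placeholder: use your own URL
    SITE_HOST = urlparse(PAGE).netloc

    class NoFollowFinder(HTMLParser):
        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            rel = (attrs.get("rel") or "").lower()
            href = urljoin(PAGE, attrs.get("href") or "")
            # Flag only links that are both no-followed and point at your own host.
            if "nofollow" in rel and urlparse(href).netloc == SITE_HOST:
                print("Internal no-follow link:", href)

    html = urlopen(PAGE).read().decode("utf-8", errors="replace")
    NoFollowFinder().feed(html)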

Time

Time seems to be becoming an ever more powerful SEO factor. Some of my sites have huge backlink counts, indexed page counts, keyword densities, PR, etc… yet there will be one little website above mine with 70 backlinks, no PageRank, a few hundred indexed pages and not especially good on-page SEO. The only difference between them and me is time.

It seems 6 months is how long it takes to have your site taken seriously (without some big-time exposure), 12 months before you start cracking top positions, and depending on the market up to 2-5 years for major positions on major keywords.

I think it’s fair to say Google is mindful of the power it holds over companies and won’t just dump you out of the rankings if you’ve been there for a while and get good CTRs. This creates a new barrier to entry for new websites.

In general a site can have a lot of power, which will get you ranking for plenty of long tail keywords, but you still won’t rank for the top-level keywords regardless of the competition. All you can do is keep pushing and pushing; eventually you will crack it.

Is this because most SEOs only target 10-25 keywords at a time and Google is weighting against that pattern? Or maybe Google has a conscience and wants to give companies that are slow on the uptake a chance to protect themselves.

Google Mobile

Google Mobile is consistently showing different results to “normal” Google. With mobile browser usage growing rapidly this is a big deal. I’ve found that simple pages rank better in Google Mobile, especially if they have a specific mobile front end.
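
By “specific mobile front end” I don’t mean anything fancier than spotting a handful of mobile user agents and serving a lighter template. A crude Python sketch (the user-agent hints and template names are mine, purely for illustration):

    # Crude user-agent sniffing for serving a lighter mobile template. The
    # substrings and template names are illustrative only, not a standard list.

    MOBILE_HINTS = ("iphone", "android", "blackberry", "windows phone", "opera mini")

    def pick_template(user_agent):
        ua = (user_agent or "").lower()
        if any(hint in ua for hint in MOBILE_HINTS):
            return "mobile.html"   # stripped-down, fast-loading front end
        return "desktop.html"      # full site

    print(pick_template("Mozilla/5.0 (iPhone; CPU iPhone OS 4_2 like Mac OS X)"))
    print(pick_template("Mozilla/5.0 (Windows NT 6.1; rv:2.0) Gecko/20100101 Firefox/4.0"))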

Google Instant and Instant Previews

It’s becoming more important to rank for shorter keywords, because many people stop typing the complete keyword string as soon as they see what they are looking for. Further, Google jumps to conclusions more and more as we type, which may mean traffic shifts away from one-word keywords towards slightly longer ones.

As for instant previews, these are making it far more important to make your website visually appealing.

It feels strange saying this, but design is now an SEO factor… ewww, that does feel strange, and goes completely against what every decent SEO has been saying for the past 10 years.

Site Performance

Site performance has been touted as a big factor; however, in every single test I’ve done that hasn’t strictly been true.

Instead it seems there is a threshold. If your site takes longer than 5-10 seconds to load you may be penalised, but any quicker than that and it makes sweet Fanny Adams of a difference whether it loads in a hundredth of a second or in two seconds.
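
A quick way to see which side of that threshold you’re on is to time a plain fetch of your page. A rough Python sketch (again, “https://example.com/” is a placeholder, and the 5-second cut-off is my own rule of thumb, not a published Google figure):

    # Rough load-time check. This times a single HTML fetch, not a full page render.

    import time
    from urllib.request import urlopen

    URL = "https://example.com/"   # placeholder: use your own URL
    THRESHOLD_SECONDS = 5.0        # my own rule of thumb

    start = time.time()
    urlopen(URL).read()
    elapsed = time.time() - start

    print(f"Fetched {URL} in {elapsed:.2f}s")
    if elapsed > THRESHOLD_SECONDS:
        print("Over the threshold - worth investigating")
    else:
        print("Under the threshold - shaving further milliseconds probably won't move rankings")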

Speculative Theory

My speculative theory for 2010 was “Reverse Page Rank”. In this SEO guide for 2011 I’ve got a new theory.

Different algorithms for different SERP positions! For example, position 1 might go to the most powerful domain overall, position 2 to the best on-page SEO, position 3 to the strongest backlink profile, position 4 to a sub-page of a hugely powerful site, position 5 to the site with the most indexed pages, and so on.

To be honest I think this theory goes hand in hand with Google’s split testing, and I’m starting to see trends suggesting that websites which are weak in one area but strong in others have opportunities to take a top position.

It makes perfect sense in an age where some people are hammering SEO extremely hard in all sorts of ways, while genuine companies lose out on traffic because their SEO firm is rubbish or because they aren’t spending money on SEO at all.

This theory still needs more thought, but there is decent logic behind the idea: SERP results need to be accurate and varied enough to provide the answer for anybody looking.

2010 SEO Guide – In Review

What did I say last year? And how did it turn out?

Time – An even bigger factor in the SEO Guide for 2011.
User Experience – Again, this is becoming more important with split testing and Google Instant Previews.
Black Hat/Gray Hat/White Hat – It’s far more difficult to pull a penalty these days, probably because it used to be far too easy to kick a competitor in the balls.
Reverse Page Rank – The jury is still out on this one; I’ll have to wait for conclusive test data.
Variation – Extremely important still.
Content – Sheer quantity of content is now more important than ever.
Ramp – You still need to ramp up your SEO campaigns and build on them continually – stop watering your SEO plant and it will die.
PR Sculpting – Forget about it; stop trying to play games and just do things right.

Conclusion:

The SEO guide for 2011 is far more focused on user experience than 2010’s was. This makes a lot of sense considering we SEOs are spamming links and over-SEO’ing our pages, trying every trick in the book to rank higher.

But…

Exactly the same conclusions are drawn as last year: be resilient and focus on these core things:

1. User Experience
2. On-Page SEO and Internal Linking
3. Building Backlinks.

Merry Christmas – Happy New Year – Good luck with your SEO in 2011!

18 thoughts on “SEO Guide for 2011”

  1. Great post Vince. A lot to think about, and it shows you’re obviously knee-deep in this stuff on a daily basis. It’s a pity you’re too busy putting this stuff into action, because I’d like to see you posting a lot more of your secrets :)

  2. Oliver

    Great guide. Your reverse-page rank theory in particular is very interesting. When doing SEO it often feels like the top 2-3 positions become “hardened”. I have sites that constantly dropped off in the SERPs when I neglected them. Now some of them are in the top 1-3 positions, get heaps of traffic and stay there even if I neglect them for weeks in terms of content AND linking. Very interesting.

    I think User Experience has become even more important because of the site preview. I find myself not wanting to visit sites that have a butt-ugly preview, like sites that are all black and look like they are plastered with banners.

  3. Great article. Nice tips. I really do think that Google is going to pay more attention to users (clicks, mentions, social vibe) than to links.

  4. Vince,
    I like your take on the SEO gurus… and how many fail to actually practice what they preach… :)
    I also like your “Speculative Theory” (different algorithms for different SERP positions). This is the first time I’ve heard of this idea, and it makes sense, as it would (probably) give the searcher more variety instead of just the top ten most SEO’ed websites.
    BTW, regarding your speculative theory, it’s also nice to actually read something unique in an SEO blog post and not just re-hashed stuff. :)
    Thanks for sharing, and MERRY CHRISTMAS!
    Charles

  5. 1. User Experience
    2. On-Page SEO and Internal Linking
    3. Building Backlinks.

    Nothing has changed since Google began. These three issues are the only ones that count for anything.

    And yes, Vince, you are correct on many levels here, especially when you refer to so-called gurus who don’t even bother to present their own sites with correct on-page optimization.

    Happy New Year

  6. Great post. I just read Rand Fishkin’s post on the same topic; interesting comparison.

    Your thoughts on no-follow were thought provoking.

  7. But Vince,

    why do you write in your page code:

    div class=”clearfix”

    span class=”moderate”

    good SEO dictates:

    div class=”SEO clearfix”

    span class=”Google moderate”

    or even

    div class=”Search Engine”

    span class=”Google”

    two examples from many opportunities passed up

    or do you think Google just ignores these positions?

    maybe they are discounted but they are the pennies and cents of what we do – enough of them add up to dollars

    Google ignores nothing, every single letter of text is calculated, evaluated and indexed

    these two are legitimate keyword positions, adding real SEO value

    used by nobody

  8. Exact Match Domain Names: 10 links vs 1,000 links, but you never mentioned the relevance of the links. In any ranking situation 10 relevant links are easily worth 1,000 random links. Google, I believe, is now placing more weight on the relevance of links altogether, in order to deter automated linking. As they should.

  9. Richard – And yet, then again no… In none of my testing is there an extremely strong correlation between relevance and SEO strength. More so than Yahoo, but still not as severe as you suggest.

    10 relevant links might out-perform 15 irrelevant links. But in other situations this hasn’t proved to be the case.

    To clarify my definition of relevant, naturally the anchor text on both links is relevant. The differentiation is the relevance of the source site.

  10. Thanks for the insightful post Vince. The point about mobile is such a hot topic at the moment – whilst the Rand Fishkins of the SEO world disregard the theory, others such as yourself are certain about it.

    I’m very keen to see how it all plays out, which I’m certain will be within the next 2 years.

  11. Well thought out, and a good read…

    I just wanted to point out that the likely reason Leslie has got links for “read the full story” nofollowed is that it’s IRRELEVANT text, and any juice is best passed through the title tags to those posts (which I’ll bet a dollar are followed).

    Whether that’s even necessary is debatable, since Google only passes any PR or relevance through the first instance of a link found on a given page anyway (@randfish circa 2007).

    I have to disagree that internal nofollows are at all a “credibility” loss, and still use ‘em for unimportant pages and unimportant text. I think the more important reason not to overdo nofollow is that you can “evaporate” some of your own internal PR and sort of “shoot yourself in the foot”, as you put it, so it’s a matter of personal preference at that point, and I (and maybe Leslie?) would rather have most of my internal links be more relevant. (Or maybe he’s just not touched the site in years? ;)

  12. @Scott – Every single link burns up PR, nofollowed or not, so there is no excuse for internal no-follow. I also wouldn’t still subscribe to the “first anchor text rule” of 2007 – I can’t replicate that theory with any data in my testing.

    There are much better ways to do what Leslie is obviously trying to achieve. Using internal no-follow should be completely removed from any SEO strategy. It is categorically wrong. Image links with alt text would be much more effective at achieving the desired result.

  13. Great guide… but I still think keyword domains are great… I tested it :D Maybe they’re not as powerful as they once were, but they are still very important.

  14. Useful review, thanks for keeping this up and looking back at previous years. Can you expand on what you mean when mentioning time as a ranking factor – the age of the results / domain, or something else?

  15. Nice Post. Synonyms, different tenses, and word endings are still somewhat blindly ignored by many marketers/SEOs. Even in writing content people mess up their grammar just to drop the “s” on their keyword when it really does not matter.

    I’m not trying to call you out, but what’s your evidence for nofollow links getting stronger?

    Thanks!

  16. Very interesting for me as I’m just starting to look seriously at SEO for my site.
    A question about the time variable (which may seem a stupid question to those more experienced than me). What matters here? Is it time since the domain was registered? Time since the site was first indexed, or something else?
    Thanks
