Reading the Market Samurai SEO Competition Matrix

Market Samurai produces a very concise report on the SEO competition among the top 10 results for any key phrase. It allows an experienced observer to decide within seconds not just whether it's possible to compete in the market, but also how long it will take to achieve rankings.

There are two elements currently missing:

The first is the identification of Authority Listings – first-position search results that are given additional space and extra links to internal pages. This status is earned over an extended period of time by being the most authoritative website in that market. It's almost impossible to attain, and it's almost impossible to beat a website once it has attained that authority listing.


If you search for “Wikipedia”, you see the result above. The links along the bottom indicate an authority listing.

The second critical factor is the page rank of the domain name, whereas the “PR” column currently shows the page rank of the search result page. The reason we need to know the page rank of the home page (PRD) is to qualify the BLD value (the number of backlinks to the domain). Since not all links are created equal – their values vary vastly – we need to know the domain name's page rank in conjunction with the total number of domain backlinks in order to assess the overall power of the website as a whole.

Other schools of thought:

The highly respected “Thirty Day Challenge” currently teaches people to take two other factors into account when assessing niche competition. These two values are SEOC (the total number of results for a given search term) and PBR (the phrase-match to broad-match traffic ratio). I strongly disagree with using these two values when assessing competition, and here's why…

This just complicates an already complicated process…

PBR is simply a value used to determine traffic potential and nothing more. It has absolutely nothing to do with the SEO Competition.

SEOC (misnamed “SEO Competition” when it should be called “Competing Pages”) used to be one of the only metrics readily available for assessing a market. Back in those days (what feels like the SEO stone age) it was a reasonable indicator, but things have changed. The SEOC value doesn't have any bearing on the SEO Competition Matrix – but I will need to elaborate on why.

If a phrase has an SEOC value of 1,000 (in other words, if there are 1,000 competing pages for that search term) but the SEO Competition Matrix looks like this:

You just aren’t going to touch it, since you don’t have a chance in proverbial hell of ranking.

Conversely, if you get an SEOC value exceeding 2,000,000, but the SEO Competition Matrix looks like this:

… the SEOC value, again, doesn't mean anything, because this is happy days and plain sailing – you can rank for this term in a matter of days.

Two ways to read the SEO Competition Matrix:

New Acronyms:

First of all, let's define some new acronyms in addition to the Market Samurai ones:
PRP – Page rank of the page listed in the search result
PRD – Page rank of the domain homepage
CP – Competing Pages, formerly known as SEOC in Market Samurai.

Short Term:

When you are assessing a competition matrix for short-term rankings, such as the ones people go for within the Thirty Day Challenge, the two important factors are BLP (the number of backlinks to a specific page) and the on-page optimisation elements: Title, URL, Desc and Head. If there is a lot of green in these columns, happy days!

It’s also wise to look at the matrix from a barriers perspective – try to identify barriers to entry. If there is a single red PRP (Page PR of 5 or above) result, you need to consider that result to be a solid barrier. If there is a single page with more than 50 BLP (Backlinks to the page) this will also be a significant barrier for short-term rankings.
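The barrier check above can be sketched as a small helper. This is my own formalisation, not a Market Samurai feature, and the `prp`/`blp` field names are assumptions; the thresholds (page PR of 5+, more than 50 page backlinks) come straight from the text.

```python
# Flag short-term ranking barriers in a top-10 competition matrix.
# A page PR (PRP) of 5 or above, or more than 50 backlinks to the
# page (BLP), counts as a solid barrier per the text above.
def short_term_barriers(results):
    """results: one dict per listing, with 'prp' and 'blp' keys (assumed shape)."""
    barriers = []
    for position, r in enumerate(results, start=1):
        if r["prp"] >= 5:
            barriers.append((position, "red PRP"))
        if r["blp"] > 50:
            barriers.append((position, "high BLP"))
    return barriers
```

An empty list suggests the short-term factors – BLP plus plenty of green in the on-page columns – are the deciding ones.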

Long Term:

The only reason to care about your ability to rank in the short term is if you are testing a completely new niche which you aren't sure is worth any money. At the end of the day you should be thinking long term, and you shouldn't be entering – or even testing – a market if there is no obvious way to monetise it.

When considering long-term rankings, you can essentially ignore six columns. YAH, Title, URL, Desc, Head and CA… Here’s why:

• You can buy the yellow “Y” in the YAH column, so that's not a barrier.
• You can earn red yeses for Title, URL, Desc and Head yourself, so you need not worry too much about the competition in these areas.
• CA (Cache Age, the number of days since Google last cached the page) has very little to do with SEO competition, and you should expect your site to eventually achieve a steady value of less than 5 days. This metric can also be deceiving: a page may only be cached once every ten years, but that day just happened to be yesterday.

When I say “ignore”, what I really mean is: if these columns look all red, don't worry about it too much; but if they look green/orange with splotches of red, you can be a lot more lenient on the maximum values of the other factors, and markets which otherwise look too competitive can become viable targets.

Now that we have eliminated a lot of the potential confusion, let's take a look at the remaining columns in reverse order of importance.

DMZ (a dmoz.org listing – is the site in the DMOZ directory?) – Since it's just a single minor element and not a deal breaker, it's worth paying very little attention to. In fact, just ignore it…

BLEG (Backlinks from EDU and GOV sources) – If the domain has any do-follow links from an EDU (educational institution) or GOV (government) source, it is likely to have a high page rank and a high number of backlinks to the domain (BLD). Since the vast majority of these links will be no-follow, and since we can judge their effect in other ways, we can essentially ignore this column.

DA (Domain Age – the number of years the domain has been registered) – You need to heed the domain age across the entire list, looking for weak spots, and you need to read the DA in conjunction with the next value:

BLP (Number of backlinks to the page) – If the domain age is high but the BLP is low, you should be OK. If the domain age is low but the BLP is high, you should also be fine (remember, we are talking long term here). If both the DA and the BLP are high, that will pose a significant barrier for a new website.
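The DA/BLP interplay above amounts to a simple AND rule. A sketch – the cut-offs (5 years, 50 links) are my illustrative assumptions, since the text only says “high”:

```python
# A listing is a significant long-term barrier only when BOTH domain age
# and page backlinks are high; either one alone is survivable.
def is_long_term_barrier(domain_age_years, blp, age_cutoff=5, blp_cutoff=50):
    return domain_age_years >= age_cutoff and blp > blp_cutoff
```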

PR (Google Page Rank of the page which appears in the search results list – new acronym “PRP”) – You might be surprised to find that I've put the importance of the PRP below BLD and BLP; this approach flies in the face of what many “gurus” are teaching. Essentially, if you want to compete, getting a PR 2–3 is very easy within 6 months, and PR 4–5 within a year. This assumes doing lots of off-page SEO, but if you are serious it can definitely be achieved.

It's also an ambiguous number that really doesn't have a lot to do with where you rank. Also, the PR of the page (PRP) doesn't help us quantify the BLD quality, nor give us a complete picture of the BLP quality, since the PRP will likely be passed down from the domain PR (PRD); without knowing the PRD, we can't get a clear picture of the overall quality of the BLDs (backlinks to the domain). Of course, we can get the domain PR (PRD) manually, and I've suggested to the Market Samurai team that they include it in future updates of Market Samurai.

To conclude the advice on the PR (PRP) column: if there are PR 2–3 pages in there you've got a reasonable chance, but if they are all PR 4–6 and above, you've got an uphill hike and you'd better be wearing sturdy boots.

BLD (Backlinks to the domain) is quite an important value because it gives you part of the picture of the overall power of the website. I suggest you manually collect the home page PR (PRD) as well. If the BLD is above 1,000 and the PRD is 4+, then you need to heed the BLD value. Conversely, if the BLD is above 10,000 but the PRD is 2 or less, you can take the BLD value with a grain of salt. I have pages with 10,000 backlinks and PR 0, and pages with 3 backlinks and PR 4.
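Those rules of thumb can be written down directly (the numbers come from the text; the “inconclusive” fallback is my addition, since the text doesn't cover every case):

```python
# Weigh the BLD column in light of the manually collected domain PR (PRD).
def weigh_bld(bld, prd):
    if bld > 1000 and prd >= 4:
        return "heed"            # genuine domain power behind the links
    if bld > 10000 and prd <= 2:
        return "grain of salt"   # bulk links with little authority
    return "inconclusive"        # judge from the other columns
```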

BLP (Backlinks to the page) – the single most important value, but you need to read it hand in hand with the PRP. If the BLP is high but the PRP is low, chances are the links are low quality… HOWEVER, if the page has more than 50 backlinks you still have a bunch of work to do, as that is potentially a lot of SEO relevance in that competing page's favour. If the BLP is below 10, you can ignore the BLD numbers entirely, but you still need to take the PRP into account.
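Reading BLP hand in hand with PRP, per the paragraph above, could be sketched as follows (the wording of the returned notes is mine):

```python
# Interpret the page-backlink count (BLP) together with the page PR (PRP).
def read_blp(blp, prp):
    if blp < 10:
        return "weak page: ignore BLD entirely, but still check PRP"
    if blp > 50:
        return "significant work needed, even if a low PRP suggests low-quality links"
    return "middling: weigh against PRP and the other columns"
```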

Conclusion:

It takes a very experienced person to really make sense of competition matrices, and it's a skill learned over a long, dedicated period of time. Unless you've stared at these things every day for 12 months or more, you won't be able to get a completely clear picture of the SEO competition in your market.

The purpose of this post is to simplify reading the matrix AS MUCH AS POSSIBLE for inexperienced internet marketers, and even for some experienced internet marketers who just don't have the sheer hours of daily SEO competition analysis behind them.

Businesses shouldn't be attempting to do this themselves. If you need professional SEO Competition Analysis, you can contact me via the “Ask Vince” page and I can quote you for a report.

If you are an individual without the resources to procure professional help, the most basic formula I can give you is…

If there are plenty of results with BLP below 50, a few results with low BLD, and no high PRPs across the board, then you can be reasonably sure you will be able to compete for the top 10 positions in this market within a 6-month period. Below 6 months you are really at the whim of whatever Google feels like doing on that particular day, and you may get initial high rankings due to news algorithms, only to lose them a week later.
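That basic formula can be sketched as a single pass over the matrix. The 50-BLP and high-PRP thresholds come from the text; the “plenty”/“a few” counts and the low-BLD cut-off are my assumptions for illustration:

```python
# Apply the "most basic formula": plenty of low-BLP results, a few
# low-BLD results, and no wall of high PRPs across the board.
def looks_viable(results, low_bld=1000, high_prp=4):
    low_blp_count = sum(1 for r in results if r["blp"] < 50)
    low_bld_count = sum(1 for r in results if r["bld"] < low_bld)
    high_prp_count = sum(1 for r in results if r["prp"] >= high_prp)
    return low_blp_count >= 6 and low_bld_count >= 2 and high_prp_count < len(results)
```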

Exceptions:

There is always an exception…

Consider this matrix – the BLPs are low, the BLDs aren't excessive, and the PRPs are fairly shite… let's also pretend DA doesn't exist. The purpose of this example is to highlight that you need to look a little bit deeper. What is wrong with this picture?

All of the results except one are .gov domain names. This is a huge warning sign.

Check this out – this is the SEO Competition Matrix for one of my niches. The first result is a .gov page, the second result is mine.

Despite having a PRP of 4 compared to the .gov's PRP 0… despite having 180× more BLPs… despite being fully optimised on-page against an un-optimised page… I'm still being beaten, simply because that ratty page above mine is a .gov.

So if you see .govs in the results, take them seriously as formidable competitors for listing positions, regardless of the other elements.

18 thoughts on “Reading the Market Samurai SEO Competition Matrix”

  1. MikeB on

    Interesting, though I think it misses the point of SEOC, which is a pre-selector for sites to look at more deeply for strength of competition. Meaningless by itself, it helps to identify expressions that warrant deeper analysis, in which it is more likely (not certain) there will be lower difficulty. And it helps to save a lot of time looking for needles in haystacks.

    If you were to create a scattergram of general difficulty for SEO, being a mixture of Rank, Link quality etc you would see a general but not exclusive or perfect positive correlation between high numeric SEOC and difficulty.

    Which can be explained to a degree – the more commercial traffic in an expression the more serious marketers will go after it.

    The correlation is not perfect but it certainly exists.

    Amongst expressions that have an SEOC of 10,000,000, there will be far fewer easy nuts than amongst those with an SEOC of 10,000 – and that is the purpose of SEOC… Low SEOC increases the LIKELIHOOD, not the certainty, that you will find expressions which can easily be optimised, and that is all. The acid test, as you say, is rank, links, etc.

    Another issue is “commerciality” – Some expressions have mentions on a huge heap of sites – so High SEOC – amongst those are some that are easy to optimize. In that situation you have to be careful to ensure that traffic converts – it could be the lack of serious SEO competition is lack of buyers.

    A final issue is bound up in Porter's forces from the Harvard business model: markets that are easy for outsiders to enter (e.g. selling baby products) – low barriers to entry – will always attract a lot more serious marketers than niches which cost a fortune to enter – and amongst those will be some easy SEO targets.

    In the end it comes down to arbitrage…. If there is a vibrant adwords market, high competition and click value, and STILL low SEO difficulty and cost, there is the hidden gem…..

    In summary, looking only for expressions with low SEOC will kick out some real gems unnecessarily.

    However, if you have complete freedom of market choice it will generally save you a lot of time finding candidate niches worth analyising for SEO strength.

    Just my thoughts…

    There are discrepancies both ways.

    Commerciality is an issue – man

  2. Fantastic reply – thank you.

    As you've said, the “SEOC” value is a double-edged sword.

    SEOC is simply not an appropriate pre-selector for finding niches and keywords. Programmatically, I can understand why it's tempting for keyword-research automation applications to take this value into account as a quick and nasty metric – it avoids having to scrape a plethora of information in order to provide a metric worth having – but that makes it a half-arsed and useless metric.

    A more appropriate metric – or series of metrics – would require the software to essentially process the front page results more completely.

    From a coder's perspective, keeping technical restrictions in mind, a simple solution would be to scrape each of the top ten result pages and dump out an assessment of the on-page factors. Let's say there are ten results, each with 4 on-page factors: 4 metrics × 10 results = 40 total values from which to make a judgement.

    Numbers below 15 = an easy niche
    Numbers below 30 = a moderately easy niche
    Numbers above 30 = a competitive niche

    Now THAT would be a decent SEOC value to highlight keywords and markets as potentials worth further investigation.
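The proposed metric could be sketched like this (the boolean-dict input shape is my assumption; the 15/30 thresholds are the ones given above):

```python
# Score a niche by counting how many of the 4 on-page factors
# (Title, URL, Desc, Head) each of the top 10 results satisfies.
def onpage_score(results):
    """results: one dict per listing, mapping factor name -> bool (assumed shape)."""
    total = sum(sum(1 for hit in r.values() if hit) for r in results)
    if total < 15:
        return total, "easy"
    if total < 30:
        return total, "moderately easy"
    return total, "competitive"
```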

    From a software point of view, only a single Google page needs to be scraped, and then the client-side software can quickly and easily scrape the 10 result pages. The biggest restriction on automation software is Google's automated-requests trigger – hit Google too often and it will require you to enter a captcha – but this proposed solution wouldn't increase the load on Google.

    The original post is about assessing SEO competition, for which the SEOC value is entirely useless (and this sentence highlights the potential confusion and the need to change the SEOC acronym to something more appropriate, such as CP).

  3. Excellent post and discussion. I will be looking at this more deeply using my niches to see if I can put it to good use. I see where you are coming from about SEOC not being important if the matrix is good/easy, but, as Mike said, more good matrices will appear at lower SEOC because serious marketers have not got in there.

  4. BradStCroix on

    I completely agree with you here Vince. When I was brand new to internet marketing I based everything on rote learned market samurai statistic thresholds without knowing really what they meant.

    A lot of what many ‘gurus’ and IM course content creators regard as ‘super important’ thresholds and factors in competition analysis now makes hardly any logical sense to me. Glad to see you again with a no-bollox post that challenges the IM status quo. Loved your post on Positively Ignorant too. Keep it up, mate!

  5. The number of competing web pages is no longer a useful measure of keyword competition (was it ever, I wonder?)

    Someone should come out with a tool that lets you filter 1000's of keywords based on simple signals of intentional competition, so you can spot the low-hanging fruit and dig further.

    Perhaps a metric that shows the number of urls with the exact keyword within the title tag *and* at least one incoming link to the page.

    Wouldn’t that be great?

    ;)

    -Mike

  6. Yeah, but Mike, the sheer volume of data you would need to process a metric like that is beyond comprehension… You would need like… a TRILLION URLs!!!

    Amazing chatting to you last night – We are blasting up the motorway back to Manchester, I’ll reply to your emails when doing so won’t make me feel car sick ;-)

  7. Where on earth can we get a trillion URLs from, Vince? We'd have to, like, crawl the whole friggin' internet. It would take years. ;)

  8. Ok, I’ve seen two winky faces now… I smell something fishy… fishy and awesome!

    The last time Google announced how many pages they had indexed, they were around the 1 trillion mark… to have access to that amount of info, wouldn't you basically have to be Google?!

  9. Pingback: Inspirational SEO Results

  10. Prabhu Ram on

    I agree with Mike – MS doesn't give the ratio of keywords found in intitle to inanchor. The SERP result returned by this ratio would be the perfect output for measuring the competition for a keyword.

  11. I don't agree with your statement about authority listings.
    Getting an authority listing is not that difficult in a low-competition niche. I know – I have done it several times, targeting about 10 keywords with about 200 backlinks. Once you are there, it helps tremendously to suck up clicks. A double listing – one that has an indent – is easy to achieve if you know how to work it.

    Where you are correct is that once you have that spot, it is very difficult for a competitor to bump you unless you fail to keep promoting the site.

    Otherwise good article.

  12. Can I beat authority sites like Amazon, NexTag, etc. on long-tail keywords in the short term?

  13. Faheem on

    This is the kind of post I've been looking for – the best guide for MS users, in my opinion.
    Thanks a lot Vince.

  14. Thank you for sharing your expertise. MS is a great tool, but you need razor-sharp reading skills to make the most of it.

  15. This was the best resource for understanding the SEO Competition Matrix I've found so far. Some things out there are crazy, though: you see pages with PR 0 and a handful of backlinks beating sites with all indicators red. I would love to know their secrets… or maybe it is just Google being insane.

  16. Anthony on

    Why do you get different-coloured ‘Y’s in the matrix for things like Title? Either it has the keyword in the title or it doesn't?
