Here are a few important things you should follow to keep up with Google's algorithms, ranking factors, and future updates. These findings come from research by SEO experts.
Benefit of EMD for a Search term:
First of all, to clarify, this particular element is quite specific to commercial terms: product names, buying words, or generic service descriptions. It won’t come as a surprise that their influence in positive ranking has declined slightly. What may be a surprise is that the decline isn’t as great as some people have suggested over the last few weeks. Bear in mind that this data comes well after the so-called EMD slap.
It is worth mentioning that the influence of an exact match domain appears to be still in decline, in which case next quarter’s update should show a further decrease in the correlation.
Title Character length > 48 characters:
A strange one. This refers specifically to the canonical title of the page at whatever depth the URL sits below the root domain; it does not take the domain name into account, just the actual page title. As such, I’m surprised to find so many with a character length over 48, and not particularly surprised to find that this is not favorable.
The fact that in the last three months it has taken a little bit of a hit might raise the odd eyebrow, but overall it is not a statistic with any particular significance, nor one that is likely to affect many people.
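As a trivial sanity check, the 48-character threshold could be expressed like this. The function name and default are my own illustration; the threshold is the study's observed correlation, not an official Google figure:

```python
def title_too_long(title: str, limit: int = 48) -> bool:
    """Flag page titles longer than the 48-character threshold
    suggested by the study (an observed correlation, nothing
    Google has ever published)."""
    return len(title.strip()) > limit

# A 52-character title trips the check; a short one does not.
print(title_too_long("Best Budget Mobile Phones Reviewed and Compared 2013"))  # True
print(title_too_long("Short title"))  # False
```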
AdSense blocks in first 720 pixels:
Here’s one that came right out of left field. I had no idea this was even being measured. An AdSense block that begins in the first 720 vertical pixels of any given URL seems to have a dramatic effect on that page’s ability to rank for its chosen keywords. It has an unfavorable impact far greater than the exact match domain rebalancing, by a factor of two or three.
I can only speculate that, given that many exact match domains are made-for-AdSense sites and therefore have the commercial block prominent in the top half of the page, the reduction in ranking that some sites have reported and attributed to the EMD recalculation may actually have been misdiagnosed; it may in fact be the placement of the advertising block that has caused the issue.
As somebody who doesn’t use this form of advertising any more, it’s not my area of expertise, but given that it’s the biggest relative change in any performance metric over the last three months, it’s very surprising that it hasn’t been picked up. Given that AdSense is Google’s own network, it seems they are looking to protect its value (no surprise) and stay as white hat in terms of SEO as possible. At least in this regard.
Is This An Over Commercialization Penalty? Or Maybe A Specific MFA Site Slap?
I’m going to say no more about it and perhaps leave this for others who are far better equipped and understand Google advertising to look at this.
Keyword in Title and H1 Tag:
Two perennial favorites. Some of the deeper data shows that the hidden meta data that used to be so important in establishing a site’s credentials in a particular niche is now virtually worthless. Interestingly, on-page elements such as titles and header tags have picked up some of the slack here in terms of establishing content and context.
Not a big shift but one which sensibly puts the ability to establish these important metrics into the hands of the content creator rather than the technical website designer or HTML coder.
This points to content and context: the viability of any particular URL to compete within any particular niche is now a derived statistic rather than one that Google simply fetches from a site’s hidden meta data.
% External/Internal backlinks with stop word:
I bunch these two together here as well. The use of stop words as part of your anchor text on both incoming and inter-site links has had a positive effect recently. From where I’m sitting, it is almost certainly a consequence of people diluting their keyword-oriented anchor text to satisfy Penguin’s need to see a more diverse set of anchor texts.
Word count >130 <3000:
Keeping your word count above 130 and below 3000 bestows a small relative benefit. It might encourage those with very little content to pad it out to reach the minimum, although padding to get to 130 shouldn’t be much of an effort. It also tells those who produce huge amounts of content to perhaps split it up to avoid going over 3000 words.
There is a variable sweet spot that seems to relate these numbers to the commerciality of the niche the website is in. Obviously, product review sites can indeed get away with 130 words, although you often see many with far fewer words than that per page/product.
Whereas sites that carry reports of a technical nature, or discussion pieces with a slightly less commercial tone, should aim higher in the spectrum. Boy, am I glad that’s the case.
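A minimal sketch of that word-count band (the 130/3000 bounds come from the study; whether they are strict or inclusive is my own guess):

```python
def word_count_in_range(text: str, low: int = 130, high: int = 3000) -> bool:
    """True when the word count sits inside the suggested band
    (>130 and <3000, per the heading above)."""
    n = len(text.split())
    return low < n < high

# 500 words passes; a two-word snippet does not.
print(word_count_in_range("word " * 500))  # True
print(word_count_in_range("too short"))    # False
```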
Word count (Backlink anchors < 0.4% of total word count):
Another way of saying this is to have no more than one outbound link (even if it’s to your own site) per 250 words.
I would hazard a guess that this is to show that any particular article or page is designed to offer value to the visitor in its own right rather than just being a conduit to other URLs. So although this is a small shift, it’s probably best not to “over link” on any particular page.
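Since 0.4% of the word count works out to one link per 250 words, a quick check might look like this (the function name is my own; the 0.4% ceiling is the study's figure):

```python
def link_density_ok(word_count: int, link_count: int,
                    max_ratio: float = 0.004) -> bool:
    """True when links stay at or under ~0.4% of the word count,
    i.e. no more than one link per 250 words."""
    if word_count == 0:
        return link_count == 0
    return link_count / word_count <= max_ratio

# 4 links in 1000 words is exactly the ceiling; 3 links in 500 words is over it.
print(link_density_ok(1000, 4))  # True
print(link_density_ok(500, 3))   # False
```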
Image Count > 0:
Pretty simple really. Have a picture. Again, it’s not a huge benefit but it is a measurable one, and they all add up. An omission here is the inclusion of videos. There are some KPIs being looked at for video, but they are separated into self-hosted, professionally hosted, and those held by commercial video sites such as YouTube. The overall data regarding any benefit of having a video on the page is a little bit confused by these various ways of hosting them. Hopefully in a later post I’ll be able to sort out what it all means and perhaps give some insight. But for now the simple issue is that images seem good and video seems good. But we all knew that, didn’t we?
% Backlinks with keyword (>1 <33%):
Have at least one incoming backlink with the keyword you want to rank for. The thought that springs to my mind here is that one is probably nowhere near enough, but it does depend on your niche.
But importantly, don’t have more than 33% of your backlinks with the same keyword anchor anywhere in them.
Be aware of composite anchor text that includes many words: repeating the same word in several different anchor text variations means each one will count. It’s a mistake I see made time and time again when I’m asked to appraise people’s backlink profiles: they have variations where the same word occurs in every one. Then it’s time to say hello to Mr Penguin.
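To see how easily composite anchors push you over the 33% ceiling, here is a rough sketch. It matches the keyword anywhere in the anchor, as described above; the helper name and sample anchors are my own illustration:

```python
def keyword_anchor_share(anchors: list, keyword: str) -> float:
    """Fraction of backlink anchors that contain the keyword anywhere,
    including composite anchors such as 'buy cheap widgets online'."""
    if not anchors:
        return 0.0
    kw = keyword.lower()
    return sum(1 for a in anchors if kw in a.lower()) / len(anchors)

anchors = ["cheap widgets", "click here", "widgets",
           "my site", "buy widgets online"]
# 3 of 5 anchors contain "widgets": a 60% share, well over the 33% ceiling.
print(keyword_anchor_share(anchors, "widgets"))  # 0.6
```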
% Backlinks rel = Nofollow (>1 <=50%):
Having nofollow links confers a measurable benefit, and the benefit they confer has increased over the last three months. This is where I feel like putting my tin hat on and hiding in a bunker.
That’s not to say you can’t have 80%+ dofollow links; that would be fine, whereas 80% nofollow would not be.
A 50-50 split – or a split slightly in favor of *** – would seem to be absolutely best SEO practice.
I didn’t design these tests, it’s just the way it is, and it follows on from similar studies done recently. Nofollow links help: not in passing PageRank, but they certainly do as regards search engine ranking, and at the end of the day that’s all that really matters.
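The band in the heading (>1, <=50%) could be sketched as a simple ratio check. Treating the 50% bound as inclusive is my assumption, not the study's:

```python
def nofollow_ok(total_links: int, nofollow_links: int) -> bool:
    """At least one nofollow backlink in the profile, but no more
    than half of it (the >1 and <=50% band from the study)."""
    if total_links == 0:
        return False
    return nofollow_links >= 1 and nofollow_links / total_links <= 0.5

# 40% nofollow is inside the band; 80% nofollow or none at all is not.
print(nofollow_ok(100, 40))  # True
print(nofollow_ok(100, 80))  # False
print(nofollow_ok(100, 0))   # False
```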
Word Context (>95% language appropriate, >12% keyword context):
I would have preferred it if this indicator was split in two so we could see how the parts measure up against each other. The 95% language appropriateness means that if you’re writing in a given language, then Google or Bing will expect 95% of the words in any page or article to be recognized in its dictionary for that language. This means that using foreign words, new product or service names that Google has not indexed yet, and of course spelling mistakes may count against you if there are too many of them. This all helps with establishing good latent semantic indexing for your URLs as well.
Percentage keyword context is an interesting one and something I would like to see broken out as its own metric.
It seems search engines understand the blocks of words and types of phrases that should occur in any particular niche. They would expect to see a broad range of these within any article to establish that it is indeed an article specific to that niche. In other words, your other on-page content needs to establish context.
In practice this means that about one word per sentence, on average, should have something to do with the topic. So, for example, if you’re talking about mobile phones, Google may expect to see things like “recharging”, “3G”, “cell phone”, “Orange”, “Ice Cream Sandwich”, “iOS”, “Apple”, etc. included within the written work.
If these references (and this does not include stop words) fall below 12%, then you are failing to establish context, and the search engine might conclude that the written work is not specific enough for the targeted keyword to rank highly. Again, this is nothing particularly new.
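A rough sketch of that 12% context check, counting niche terms among non-stop words. The stop-word list and niche vocabulary here are toy examples of my own, not anything search engines publish:

```python
# Tiny illustrative stop-word list; a real one would be far larger.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it"}

def keyword_context_share(text: str, niche_terms: set) -> float:
    """Fraction of non-stop words that belong to the niche vocabulary;
    the study suggests staying above 12%."""
    words = [w.strip(".,!?\"'").lower() for w in text.split()]
    content = [w for w in words if w and w not in STOP_WORDS]
    if not content:
        return 0.0
    return sum(1 for w in content if w in niche_terms) / len(content)

niche = {"3g", "recharging", "ios", "apple"}
# "The Apple phone supports 3G and fast recharging" ->
# 3 niche hits among 6 content words = 0.5, comfortably above 12%.
print(keyword_context_share("The Apple phone supports 3G and fast recharging", niche))
```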
Twitter/Facebook/Bookmark (URL in Tweet, Retweet or “Like”):
This is a slightly wider category and includes all the major bookmarking, micro-blogging, and social media sites in one. It is the third biggest increase over the last three months, and it was already pretty large before that. Social noise is important and has become increasingly so as time goes on.
Authenticity (% unique 3-5 word phrases >60%):
I wish we understood this better. Maybe somebody who understands language and the way that Google works might be able to tell me exactly what this means. It has something to do with the uniqueness of the work and the way that Google parses text in 3-to-5-word blocks or phrases. Once again there are caveats regarding the inclusion or exclusion of stop words and the way in which these blocks are put together.
But I know there are some real experts out there who may well be able to give me and my readers a much better overview of what this actually means. Full disclosure: I’m not really sure myself.
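My best guess is that this refers to shingling: breaking text into overlapping word n-grams and measuring how many are unique to the page. A sketch of that interpretation (purely my speculation about the mechanism, not a confirmed description of how Google parses text):

```python
def shingles(text: str, n: int = 3) -> list:
    """Overlapping n-word shingles, a common way to fingerprint text
    for near-duplicate detection."""
    words = text.lower().split()
    return [tuple(words[i:i + n]) for i in range(max(0, len(words) - n + 1))]

def uniqueness_ratio(text: str, seen: set, n: int = 3) -> float:
    """Share of this text's shingles not already seen elsewhere;
    the study suggests aiming above 60%."""
    s = shingles(text, n)
    if not s:
        return 0.0
    return sum(1 for sh in s if sh not in seen) / len(s)

# "a b c d" has two 3-word shingles; if one already exists in the
# corpus, half of the text is "unique".
print(uniqueness_ratio("a b c d", {("a", "b", "c")}))  # 0.5
```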
Number of inbound backlinks:
Another surprise: the value of links has increased. Given that many links are now being ignored by the search engines, the relative value of those that remain has gone up. The idea of making hundreds of thousands of them is being devalued, whereas the idea of making a few hundred good ones has never been better.
The criteria alluded to in parentheses involve content quality, keyword density, context, and anchor text primarily. In other words, many of the other metrics measured above.
• Content continues to push hard and is increasingly important (though still secondary to links).
• Spammy links and over-linking have decreased in value, a trend that has been consistently downward for a couple of years.
• Quality links – especially links surrounded by good-quality, contextual content – are more important than ever.
• Add to the mix social noise and you’re probably onto a winner.
The main danger signal here seems to be that over-commercialization of sites has taken a real hit, and this has snuck in under the radar. The AdSense block reduction came as a complete surprise to me and, coupled with the decrease in the benefit of exact match domains, seems to point quite definitely at sites that offer little value but high commercialization.
So what does this mean for search engine optimization going into 2013? Any suggestions are welcome by way of comments here.