The 80/20 Rule for Google Search Engine Optimization


  • Author Mark Nenadic
  • Published October 21, 2006
  • Word count 717

The techniques used by internet experts for search engine optimization are always changing and evolving. This is necessary due to the ever-shifting considerations and guidelines used by the search engine spiders of the top search engines such as Google, Yahoo!, and MSN. This is especially the case with Google, which is now the most sophisticated and advanced search engine, its latest algorithm changes placing it at the head of the indexing game by leaps and bounds.

There are two primary types of search engine optimization: on-page optimization and off-page optimization. It is clear that optimizing a website for search engines today involves a great deal more than using meta tags or sprinkling a few keywords throughout your website. Though these are still factors, there is a great deal more to it if you actually want to see results.

Those techniques just mentioned, though, are what are considered on-page optimization techniques. Today, on-page optimization alone will achieve the desired results only about 20 percent of the time, and when it comes to keywords that are even mildly competitive, your odds drop right through the floor.

This leaves a whopping 80 percent that depends on off-page optimization for success. Off-page optimization is all about:

  • the number of inbound links to your site, that is, the number of links to your site that are not located on your own site;

  • the actual linking text (the anchor text) of these inbound links;

  • the quality of the pages where the inbound links are located.
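To make the three factors above concrete, here is a purely illustrative sketch, not any search engine's actual formula: a toy score that rewards a site for having more inbound links, keyword-rich anchor text, and links from higher-quality pages. The target phrase and the 0-to-1 quality ratings are assumptions invented for the example.

```python
# Toy illustration only: combines the article's three off-page factors
# (link count, anchor text, source-page quality) into one hypothetical score.
def link_score(inbound_links, target_phrase="search engine optimization"):
    """inbound_links: list of (anchor_text, source_quality) tuples,
    where source_quality is a made-up 0-1 rating of the linking page."""
    score = 0.0
    for anchor_text, source_quality in inbound_links:
        # Keyword-rich anchor text counts double generic text like "click here".
        relevance = 2.0 if target_phrase in anchor_text.lower() else 1.0
        score += relevance * source_quality
    return score

links = [("click here", 0.2),
         ("search engine optimization guide", 0.9)]
print(link_score(links))  # 0.2*1.0 + 0.9*2.0 = 2.0
```

Under this toy scoring, one relevant link from a strong page outweighs several generic links from weak ones, which is exactly the point the article is making.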

It is clear, therefore, that though on-page optimization is still important, the majority of webmasters will need to place a great deal more emphasis on their off-page optimization efforts if they ever intend to see results.

This is what is called the 80/20 rule.

The reasoning behind the 80/20 rule is actually quite logical. Though on-page search engine optimization does help search engines such as Google to index a site in detail, it is off-page optimization that allows the search engine to gauge the site's actual relevancy and quality. It is what allows the search engine to differentiate between spammers and sites that are filled with useful information or extremely relevant content with regard to the search word or phrase in question.
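The best-known example of gauging a site's quality from its inbound links is Google's original PageRank algorithm. The sketch below implements the published PageRank idea in miniature, as an illustration of link-based scoring in general, not of whatever Google runs today; the three-page link graph is invented for the example.

```python
# Minimal PageRank sketch: each page shares its score among the pages it
# links to, so pages with more (and better) inbound links score higher.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# Hypothetical graph: pages a and b both link to c, c links back to a only.
graph = {"a": ["c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
```

With this graph, c ends up with the highest score because it has two inbound links, and a outranks b because it is linked from the high-scoring c. Note that a webmaster controls only their own outbound links, which is the article's point about why off-page signals are harder to game.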

This works because on-page optimization is controlled entirely by the webmaster. It is therefore subject to abuse, manipulation of the search engines' indexing criteria, and other activities the search engines consider unscrupulous. Off-page optimization, on the other hand, is controlled entirely by other webmasters, not by the webmaster who owns and/or maintains the site itself. This makes it much harder for a given webmaster to manipulate the search engines' analysis. It is how Google and the other large search engines try to stop certain webmasters from gaining an unfair advantage in the results their searches display.

True enough, a webmaster who truly wishes to reach those top rankings can do everything possible to encourage other webmasters to link to his or her site. However, the decision rests entirely with those other, discriminating webmasters. The odds are, therefore, that only the more relevant websites will achieve the best off-page optimization results.

Because the search engines consider not only the number of links to a site, but also how many are inbound-only (that is, where a link points to the site but is not reciprocated), the quality of the site holding the inbound link, and the actual text used for the link, they can also discourage webmasters from gaining an unfair advantage by simply cross-linking among their own many sites using keyword-rich text. For one thing, a webmaster will only have so many sites that can be cross-linked, and for another, this would have to be done very carefully so that the links remain inbound and unreciprocated. The search engines are not as concerned with one or two inbound links as they are with the hundreds, or perhaps thousands, that the number one ranking spot in their searches requires.

Mark Nenadic is the director and face behind 15Degrees-North http://www.15dn.com where you will find articles and resources to help with Search Engine Optimisation, Internet Marketing and Web Design.

Article source: https://articlebiz.com
