


Self-Study Search Engine Optimization Training

The self-study SEO Course covers a wide range of search engine optimization topics including 'on-page'
and 'off-page' optimization factors, LSI, link building, PageRank and much more. Individual units
cover the following topics in order to lead you systematically through the entire SEO process, from
researching your website through to coding.

Each training module will be uploaded as an individual hub which you can work through at your own pace.

Course Index

01: SEO Course Outline
02: An Introduction to SEO
03: An Introduction to Search Engines
04: Search Engines and Latent Semantic Indexing
05: Search Engine Users
06: Keyword Research
07: Competitor Research
08: A Guide to PageRank
09: On Page SEO
10: What are Long-Tail Keywords?
11: Directory Submissions
12: Heat Map of a Website
13: Google Panda Update vs. Google Penguin Updates
14: SEO Check List for Google Ranking Factor
15: Google Sandbox
16: Free SEO Tools You Should Know About

Chapter 1.
SEO Course Outline
Over the last decade or so, the Internet has truly revolutionized the way we live. Thanks to rapid
developments in Information Technology, we are now able to communicate directly with billions of
people the world over. We can now come into contact with people who were hitherto separated from us
by location or culture to exchange views, information, or data via email, forums, chat rooms, multiplayer
games, peer to peer systems and so on.

One further benefit the Internet has brought is the ability to reach out to a vast audience simply by
maintaining a website. Whether we aim to impart information, sell a product, or offer services online,
we now have the potential to reach an audience that is truly global.

The Search Engine

However, the rapid expansion of the Internet over the last few years has brought its own problems for
the individual website. Quite simply, how does one's website get found amongst the myriad pages on
the World Wide Web? The answer to this lies in the technology offered by search engines. Current
search technology allows us to quickly search the billions of websites that make up the World Wide
Web and access pages that are relevant to our needs.

Thanks to the rise of the search engine as the main portal for accessing the World Wide Web, users are
now able to find pages suited to their needs without knowing the address of those pages in advance or
without 'surfing' links. A large proportion of the billions of people who use the Internet turn to
search engines to locate information, products or services. Yet while the search engine enables people
to actually locate your website, it does not automatically ensure that more people will visit your site.

Although your website may appear in search engine results pages, chances are that it appears amongst
thousands of listings for other pages, many of which offer similar products or services to your own.
There are literally billions of websites out there all competing for the attention, time or hard earned cash
of Internet users, so how can you help ensure that the right kind of users visit your website, consult the
information you offer, buy your product, or take up your services in the face of often stifling online
competition? Furthermore, how can your website reach its potential audience when that audience
experiences extreme difficulty in finding your site in search engine results in the first place?

Search Engine Optimization (SEO)

This is where Search Engine Optimization comes in. Search Engine Optimization, or SEO, refers to a set
of practices and methods aimed at improving your visibility in search engine results. It does this by
improving the ranking, or position, your website gains in search engine results in relation to other
pages that compete with your own.

If, for example, your site currently appears on the tenth page of about one million relevant results,
chances are that searchers are going to get bored before they get to that results page or are going to
visit websites that appear higher in the results. If, however, you have a high ranking in search engine
results for specific search terms or keywords, then searchers are more likely to visit your site.
Search Engine Optimization works, then, by optimizing your web pages in such a way that they gain a
higher ranking in search engine results and, from this, improved traffic and online sales.

The Search Engine Optimization process in a series of simple, easy to learn steps

Although Search Engine Optimization, or SEO, is a relatively new industry, it is a field that has
experienced rapid growth in recent years. There are thousands of SEO companies trading online today,
all of which will claim to be able to gain you top rankings for your website. In most cases, this is
true. The growth of SEO as an industry is founded on its success in achieving what it sets out to do -
that is, to improve search engine visibility - and this success is itself based on tried and tested
methods and practices that have proven extremely effective.

However, hiring a professional Search Engine Optimization firm can prove prohibitively expensive for
many website owners. Moreover, this expense can be avoided, as
many of the methods and practices employed by SEO firms can be easily learned. The following course
aims to introduce you to these methods.

We do so in the belief that SEO is not some form of esoteric knowledge known only to industry
professionals, but is rather a form of knowledge that you can quickly learn and apply yourself. With this
in mind, this course aims to give you all you need to know in order to either enable you to optimize your

own website or become proficient in SEO techniques. By completing the course, you will gain a good
insight into the fundamental methods of Search Engine optimization and will be able to carry out your
own SEO campaign.

We wrote this course in response to not only an emerging demand on the part of web-designers and site
owners to learn how to optimize their web pages, but also in response to the kind of SEO tutorials and
courses currently available to such individuals. There are numerous free articles available online that
claim to explain various aspects of the SEO process. However, the world of SEO changes rapidly,
meaning that such material is often out of date or erroneous, and what advice exists is often piecemeal,
meaning that you do not learn effective knowledge of Search Engine Optimization in a systematic or
holistic fashion.

Unfortunately, much the same can be said of some of the SEO books available for purchase online.
Although these books are often updated to reflect changes in search engine technology, we find that they
tend to speak to fellow SEO professionals rather than SEO beginners and they often take an „anecdotal‟
approach to the subject rather than presenting it in a simple, systematic and progressive manner.

This course, by contrast, aims to take you through the whole Search Engine Optimization process in a
series of simple, easy to learn steps, leading you from an understanding of search engines through to a
practical knowledge of how to optimize web pages.

This course has been written in close consultation with both SEO professionals and educationalists to
help ensure not only that it keeps abreast of current developments in SEO but also that such
developments are presented in an accessible and pedagogically effective manner. The course is designed
for self-study either at home or within a workplace and, as such, has been divided into sections that
are fairly short and easy to digest.

We understand that many people have jobs and other pressing commitments and that self-study can eat
into your precious time. With this in mind, we have not imposed the kind of deadlines for study that
might appear in a tutor-led course. You can read the course at your leisure, spending as much time on
each unit as you can spare per evening or week.

If this course places any demands on you, they are that you read slowly, taking just enough time to
understand the principles and concepts outlined in each unit, and that you move through the course in a
systematic fashion. While it is possible to use this course as a reference guide and to 'dip in' to consult
whatever section interests you, we recommend that you work through the entire course from beginning
to end, even if some material is already familiar to you. By this means, you will develop a fuller and
more complete understanding of search engine optimization than can be gained from a partial or
piecemeal approach to the subject.

As for working through each unit, we recommend that you read each unit at least twice. When learning
anything, one of the most effective practices for reading you can develop is to rapidly survey the
contents of the text to gain a sense of its main points and argument and then, on the second reading, to
actually read the text in depth, taking notes where appropriate, underlining key points and asking
questions about the material presented. This protocol for reading makes you active in the learning
process and far more likely to memorize and - more importantly - understand what is being taught.

With this in mind, each unit includes a section that offers you the opportunity to 'reflect' on the
principles it contains. Make sure that you are comfortable with the contents of one unit before you move
on to the next. In order to assist the active learning and comprehension (as opposed to 'passive'
memorization) of key principles, we have also included a number of practical tasks and exercises.

Resist the temptation to skip these tasks. These tasks have been specially devised not only for pedagogic
reasons - i.e. they help you learn the principles of Search Engine Optimization - but also for practical
reasons, in the sense that they allow you to put key methods into practice, to try things out for yourself,
and to apply some of the concepts and techniques outlined in the course to real-life situations. For
instance, the course will ask you to try things out on your own website at certain stages, thereby offering
you not only practical experience of SEO but also helping you to optimize your own site during and
throughout the learning process.

If some of this sounds daunting, we should reassure you that it shouldn't be. In fact, we think you will
find that search engine optimization is a satisfying, rewarding and often fun experience. Although this
fun often comes at the expense of your online competitors - from spotting the absurd mistakes they have
made on their web pages to outstripping them in search engine rankings! - we predict that you will
derive some satisfaction from watching your website rise in results pages and even from getting your
hands dirty delving into and tweaking the content of your own pages.

SEO is very much a 'hands on' process and this course should see you as much in front of the PC as
sitting studying.

Whom will this SEO course suit?

Of course, any form of instruction makes demands on students in the form of required or 'background'
knowledge. You may be an IT professional with considerable knowledge of coding, HTML, and so
forth, or, alternatively, you may be a relative newcomer to web-design and HTML. This course does not
require an in-depth knowledge of HTML as it will show you how to understand and use the relevant
code. All we expect of you is a very basic familiarity with HTML, in the sense that you know roughly
what an HTML page looks like and does (numerous guides to HTML are available online). Naturally, this

course also assumes that you currently maintain a website or plan to develop one in the near future! We
also expect you to have access to the following:

        An Internet connection.
        An Internet browser, e.g. Microsoft Internet Explorer, Netscape Navigator, Firefox.
        A program which you can use to edit HTML, e.g. Dreamweaver, Microsoft FrontPage,
           Windows Notepad or one of the many free editors that are available online.

In the long term, you will also require hosting for your website (the computer that acts as a server for
your website) and a domain name, as these will allow you to test and implement your Search Engine
Optimization efforts. If you already have these things or if the site you plan to optimize is already
live, you will have to make sure you have the necessary administrative rights and FTP access to your
hosting server. If you plan to optimize an existing site that has been developed by an external
web-design company, you should contact them to make sure that they do not retain the copyright to the
site and that it is okay for you to alter individual web pages.

This course, therefore, does not require a lot of background knowledge. Ultimately, this course is a
Basic SEO course aimed at individuals who are entirely new to search engine optimization. This course
is designed to suit a broad range of people, including the following:

        Webmasters
        Web-designers
        Website owners
        Students
        Those who are simply taking their first steps in moving towards a career in SEO.

While we cannot tailor this course to the specific needs of each of these individuals (even within this
range there will be a diversity of needs, abilities and experience), we have attempted to ensure that even
the novice will be able to understand the methods associated with SEO and put them into practice.

Experienced IT professionals may find some of the material familiar, but they are still encouraged to
work through this material as it is presented in a context that will most likely appear unfamiliar. For
example, an experienced web-designer will probably know what an HTML Title tag is and does but may
not understand its crucial significance for search engine optimization. Individuals with some prior
knowledge of Search Engine Optimization are encouraged to consider our more advanced tuition,
although the systematic approach of this course will still be of some benefit to them. Note: Although this
SEO course makes frequent reference to products, marketing, and so forth, we do not assume that you
are in some way involved in e-commerce. We realize that not all website owners and webmasters will be
involved in online selling. The principles outlined in this course are applicable to ALL websites even if
the 'product' you offer is simply a service or free information.

Course Content and Layout

The Basic course comprises sixteen units. These units are divided logically into four broader sections
devised to take you progressively through the fundamentals of SEO. Section one (Units 1 and 2)
introduces the course outline and the basics of search engine optimization. Section two takes a
slightly more advanced look at the search process, covering how search engines retrieve information (An
Introduction to Search Engines, Unit 3), the theory of Latent Semantic Indexing (Unit 4), and how the
average searcher actually searches (Search Engine Users, Unit 5). Section three covers the crucial
research phase of the SEO process, during which you learn how to find the right keywords for your web
pages (Keyword Research, Unit 6) and how to analyze your online competition (Competitor Research,
Unit 7). Section four allows you to put this research to use and actually start optimizing your
website: understanding PageRank (Unit 8), altering the on-page code and writing optimized copy (On
Page SEO, Unit 9), targeting long-tail keywords (Unit 10), making directory submissions (Unit 11), the
heat map of a website (Unit 12), the Google Panda and Penguin updates (Unit 13), an SEO checklist of
Google ranking factors (Unit 14), the Google Sandbox (Unit 15), and free SEO tools you should know
about (Unit 16).

Each unit consists of more or less the same layout, and includes various tasks and illustrations aimed at
helping you understand the material under discussion. Each unit also ends with a 'SUMMARY' of main
points and an opportunity to 'REFLECT' on the key concepts or terminology used in that part of the
course. These are included to help you 'pull together' the key SEO points of each unit and should be
used in conjunction with your own notes.

A Note on Conventions

Certain formatting conventions have been adhered to throughout this SEO course. Where technical
terms or common terminology is first introduced (e.g. 'latent semantic indexing') it appears in bold type.
Paragraphs that contain important points to note also appear in bold type, prefixed with the term 'Note:'
HTML code appears in a different font style in order to distinguish it from other prose, e.g.

<title>Page Title</title>

References to external sources and other websites are footnoted - students are not required to refer to
these sources, although some may wish to do so.

Further SEO Information and Support

Because SEO is a dynamic industry, the material in this course will necessarily be updated from time to
time. Students should periodically visit these pages to check for course updates (please note that we
cannot take responsibility for any use of information that has become obsolete after you have read or
downloaded the course).

If you have an individual question about any aspect of the Search Engine Optimization course, you can
ask relevant questions via the comments at the bottom of this page.

Our staff will be only too happy to assist you. We strive to make our course as beneficial, accessible,
and useful for the user as possible and also welcome any feedback you can provide on any aspect of
course content or layout. Finally, we would like to thank you for downloading this course and wish you
the best of success in your Search Engine Optimization efforts.

Chapter 2.
An Introduction to SEO
This unit introduces you to the basics of search engine optimization. By the end of this unit you should
be able to:

        Understand what search engine optimization is and does
        Understand the basic principles behind search engine optimization
        Distinguish between SEO and web design

This unit assumes that you have a basic working knowledge of HTML. Before reading the following
material, we also recommend you read our course outline in order to acquaint yourself with the structure
of this course and the things that are expected from you in order to derive full benefit from the course.

1.1 What is Search Engine Optimization?

Search engine optimization, or SEO for short, is the process of altering your web pages so that they
rank more highly in search engine results.

As Wikipedia puts it, search engine optimization is 'a set of methodologies aimed at improving the
visibility of a website in search engine listings. The term also refers to an industry of consultants that
carry out optimization projects on behalf of client sites.'

Search Engine Optimization is also sometimes referred to as search engine marketing (SEM), although
this usage is not strictly accurate as search engine marketing involves high-end marketing strategies that
encompass more than just optimizing web pages.

The purpose of search engine optimization is to make your site visible on the Internet. It does this by
optimizing web pages in such a way as to improve their visibility to search engines and ultimately their
search engine ranking.

There are other ways to get your site found on the Internet. One way is to get listed in various
directories. This usually involves some kind of fee. Another way to make your site visible on the
Internet is to pay for advertising. One of the most popular is the pay-per-click system, or PPC for short
(e.g. Google AdWords). Under this system, you pay the company (commonly a search engine) that
advertises your site a fee every time somebody clicks the link in your advert.

These systems can prove to be expensive and are, in fact, not entirely necessary. A much more cost-
effective way of making your site visible is to learn the basic techniques of search engine optimization.
Unlike PPC or directories, listings in search engine results are absolutely free.

Of course, hiring a professional SEO company to do the work of improving your visibility in listings is
far from free, but by following the techniques outlined in this course, you can learn how to do the work
of optimizing your site yourself, saving a lot of money in the process.

Just because search engine listings are free does not mean that they are unimportant. It is estimated
that somewhere in the region of 84% of Internet users have used search engines at some point. This
suggests that millions of people search the Internet for products and services every day. An optimized
website allows you to tap into this vast reservoir of potential visitors and customers by allowing your
site to be found in search engine listings.

Consider also that when Internet users actually go to the bother of typing a search term into a search
engine, they are already actively looking for specific products, services, or information. SEO helps
you to direct this qualified traffic to your site and thereby convert visits into sales.

1.2 How search engine optimization works

Search engine optimization works, then, by optimizing your site for search engines in such a way that it
brings targeted traffic to your site. By this, we mean Internet users who are actually looking for your
products or services (visits from people who are not actually looking for your products or services may
increase the number of hits your site gets but are not likely to achieve conversion, i.e. turn visits into
sales, or encourage repeat visits).

Therefore, search engine optimization attempts to bring the right kind of Internet users or potential
customers to your site. These users are individuals who have used a search engine like Google, Yahoo,
or MSN to find the specific kind of products or services that you offer.

In basic terms, SEO does this by attempting to match key aspects of your page content with the kinds of
phrases that people type into search engines when looking for something on the Internet.

1.2.1 A Basic Internet Search

Let us say, for example, that you are an Internet user who is looking to buy a used car online. Besides
going to an online auction site such as eBay, you are likely to start your search for a car by going to the
home page of a search engine and typing relevant terms into the search box. This is known as a search query.

Here are some of the phrases we might type into search engines when looking for a used car:

used cars
buy a used car online
second hand cars
second hand automobiles

These collections of words are known as search terms or keyword phrases or key phrases.

Note: A key phrase is a collection of words that people actually type into search engines when searching
for products or services. As Internet users are becoming more sophisticated, they now tend to string
keywords together into key phrases as this qualifies their searches and produces more specific results.

Of course, the above terms are not the only phrases that people are likely to use when searching for a car
online, but they do show you the kind of words people might actually type into search engines. Note that
searches are not case sensitive, so it doesn't matter if people use capitals in their search terms or not.

When we enter one of the above terms into a major search engine, we are presented with a list of results
that the search engine believes to be relevant to the term searched for. For example, if we type 'buy a
used car online' into Google, we are presented with the following page of results:

[Google results page for 'buy a used car online']

Google returns the above results as it thinks these are the most relevant web pages for the particular
search query entered. This is known as page relevancy. This does not mean that when you enter a search
query, Google visits every page on the web to find the most relevant results. This would take too long.
In fact, search engines refer to a collection of web pages that have been indexed prior to your query and
cached on their servers.

When you enter a search query, Google checks that query against its index and the textual content of the
web pages it has cached on its servers. It then returns what it believes are the most relevant results for
the search query entered.

We will explain how search engines index web pages in the next section of the course.

The query above shows the first page of search results returned by Google for the search 'buy a used car
online'. Of course, there are more pages of results that follow - in fact Google informs us that this page
only shows 'Results 1-10 of about 23,900,000'. The important thing about this page, however, is that it
shows us the pages that the search engine thinks are most relevant for our search term.

Note: search engine results are hierarchical, meaning that results are ranked according to their
perceived relevance, with the most relevant results appearing higher in the listings - i.e. on the first
page of results - and the less relevant lower down, i.e. on later pages.

We have all seen this kind of page before. Note however that Google has highlighted certain words for
every result it has returned. If we look at the page in first position for example, it highlights the
following words:

'Buy new & used cars online, research prices & dealers, sell your ... is your online source to
buy new and used cars. Sell your used car, or research car prices, reviews and more.'

(You may see slightly different results depending on your location; however, the principle is the same.)

Now compare these words with our original search term 'buy a used car online'. Note how the
individual keywords from our search phrase appear throughout this listing, i.e. 'buy', 'used', 'car(s)',
'online'. The same holds for all the other results returned on the first page. Because this page includes
all the terms we have used in our search, the search engine has deemed it a relevant page for our specific
search term.

This example allows us to outline a few important points.

   The search engine is trying to match the search query, or the key phrase that we have entered, with
   the textual content of individual web pages.

   This means that search engines read the textual content of web pages.

   Therefore, SEO involves optimizing the textual content of web pages so that search engines can read it.

   Finally, a major part of the SEO process involves trying to match the textual content of your site
   with the kinds of phrases people actually use when searching for the kinds of product or services you
   offer, so that the search engine thinks that your site is a relevant result for those phrases.

To put it another way, search engine optimization attempts to improve your search engine visibility by
improving your ranking for specific keywords or key phrases. It does this, in part, by optimizing key
parts of your web pages so that the text in these pages employs the key phrases people are likely to use
when searching for your products or services.
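The matching process described above can be illustrated with a short sketch. This is a deliberate toy simplification, not how real search engines rank pages (they weigh many more signals, as later units explain); it simply scores a page by the fraction of query keywords that appear in its text, using the used-car listing from the example above:

```python
import re

def relevance(query, page_text):
    # Toy score: fraction of query keywords found in the page's text.
    # Real search engines use far more sophisticated ranking signals.
    page_words = set(re.findall(r"[a-z]+", page_text.lower()))
    keywords = re.findall(r"[a-z]+", query.lower())
    return sum(1 for w in keywords if w in page_words) / len(keywords)

# The top listing from the example contains most of the query's keywords.
listing = ("Buy new and used cars online, research prices and dealers, "
           "sell your used car")
print(relevance("buy a used car online", listing))  # -> 0.8 (only 'a' is missing)
```

A page mentioning 'buy', 'used', 'car(s)' and 'online' scores highly for our search phrase, which mirrors why the search engine deemed that listing relevant.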

Note: this does not mean that web pages should simply use the same phrases again and again. This is
known as keyword spamming, and can result in your site being penalized by search engines. In later
parts of this course, we will show you how to research, select, and deploy keywords in a safer, more
effective and more ethical manner.


Our first task is a very simple one aimed at offering you a chance to try out the above principles.

    Navigate to the home page of a major search engine like Google, Yahoo, or Bing.
    Pretend that you are buying flowers for a partner online. Make a short list (no more than 3) of the
     key phrases you might use to search for this product.
    Enter each of your key phrases in turn into the search engine.
    For each key phrase you enter, look at the leading results, and note the frequency of words that
     correspond with the phrases you used.
    Ask yourself the following question: 'why did these sites rank for my key phrases?'

Note: There are other factors that affect the ranking of a site in search engine results, meaning that
some sites that have all the 'right' keywords will rank much lower than other sites which mention the
same keywords. We will explain these factors later on.

1.3 Optimizing web pages – an introduction

In order to introduce you to the methods explored later in this course, the following material is a basic
primer on optimizing your site. In later units of this course, we will take a more in-depth look at the
techniques used by SEO professionals.

For convenience, we will divide this primer into two parts. Firstly, we will look at what are known as
on-page factors, or the actual code and content of individual web pages. Secondly, we will look at
off-page factors, or factors that affect the ranking of individual web pages but are not determined by
the actual code of your web pages.

1.3.1 On Page Factors

Bearing in mind that search engines read the textual content of web pages, it is particularly important
that we optimize this content to ensure maximum visibility for our products and services.

Every web page has a number of 'hotspots', or points that are considered particularly important by
search engines when they try to determine the relevance of that page in relation to a search query.
These are as follows:

The Title tags

The text which appears between the HTML title tags, i.e.: <title>YOUR KEYPHRASES HERE</title>

The Meta tags

The text that appears between the tags:

<meta name="description" content="YOUR PAGE DESCRIPTION HERE" />

<meta name="keywords" content="YOUR KEYWORDS HERE" />


The Heading tags

The text that appears between your heading tags, particularly your first heading, i.e. <h1>YOUR
KEYPHRASES HERE</h1>

Page content

The actual text which appears on your web pages, or page copy. Your pages should contain a substantial
amount of text, and this text should be keyword rich without spamming the search engine by repeating
the same phrases over and over again. The secret of SEO copywriting is to make your page content
keyword rich while ensuring that it is still written in a simple, flowing and informative style that users
can understand, i.e. plain English.

When optimizing web pages, the text within all these areas should contain the keywords that you aim to
make your site rank for. This is the key to optimizing on-page factors.

In general, then, your pages should aim to distribute your key phrases throughout these key areas in
order to increase the perceived relevance of those pages for your key phrases.
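As a rough illustration, the hotspots above can be checked programmatically. The following Python sketch (standard library only; the sample page and key phrase are invented for illustration) pulls out the title, first heading, and meta description of a page and reports whether a key phrase appears in each:

```python
from html.parser import HTMLParser

class HotspotParser(HTMLParser):
    """Collects the text inside <title>, <h1>, and the meta description."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.hotspots = {"title": "", "h1": "", "meta description": ""}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.hotspots["meta description"] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.hotspots[self._current] += data

def check_keywords(html, phrase):
    """Report, hotspot by hotspot, whether the key phrase appears."""
    parser = HotspotParser()
    parser.feed(html)
    return {spot: phrase.lower() in text.lower()
            for spot, text in parser.hotspots.items()}

page = """<html><head>
<title>Buy Fresh Flowers Online</title>
<meta name="description" content="Order fresh flowers online for next-day delivery." />
</head><body><h1>Fresh Flowers Delivered to Your Door</h1></body></html>"""

print(check_keywords(page, "fresh flowers"))
# → {'title': True, 'h1': True, 'meta description': True}
```

Real SEO audit tools do far more than this, but the principle is the same: the key phrase should appear naturally in each of the hotspots.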

Note: this should not be confused with keyword density, which is simply a measure, as a percentage, of
how often your keywords appear in relation to the other text on your web page. The more your
keywords appear on your page, the higher the keyword density for that page.
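To make that definition concrete, here is a minimal Python sketch of a keyword-density calculation (the sample copy is invented, and real tools tokenize text more carefully than this):

```python
import re

def keyword_density(page_text, keyword):
    """Return the percentage of words on the page accounted for
    by occurrences of the keyword (a single word or a phrase)."""
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    kw_words = re.findall(r"[a-z0-9']+", keyword.lower())
    if not words or not kw_words:
        return 0.0
    n = len(kw_words)
    # Count non-overlapping-in-spirit matches by sliding a window over the words.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_words)
    return 100.0 * (hits * n) / len(words)

page_copy = "Buy fresh flowers online. Our fresh flowers are delivered daily."
print(round(keyword_density(page_copy, "fresh flowers"), 1))
# → 40.0 (two 2-word matches out of 10 words)
```

A density this high would itself look like keyword spamming on a real page; the point of the sketch is only to show what the percentage measures.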

We will show you more about researching and deploying key phrases later in this course.

1.3.2 Off-Page Factors

On-page factors are easy for the webmaster to manipulate, as all you have to do is edit the code of your
own web pages to make them appear more relevant for the key phrases you wish to rank for. One
unfortunate aspect of this, however, is that (if recent experience of the web is anything to go by) it can
easily tempt people to adopt unethical practices, such as keyword spamming, or attempts to make pages
rank for terms for which they are not really relevant.

As a way of getting round the problems caused by unethical manipulation of on-page factors, search
engines now take off-page factors into account when determining the ranking of web pages in search
engine results.

Off-page factors refer to factors that are, to a large extent, outside the control of the individual
webmaster. The most important of these are external links: links on a different domain or website that
point to your website. Because external links exist on other websites, we can't manipulate them as
easily; we don't have the access and administrative rights necessary to add links to someone else's
pages.

When search engines return results for a search query, they look not only at the relevance of your page
for a particular key phrase, but also at the importance of your page. Each major search engine has a
different way of measuring this importance, one of the most famous being Google's Page Rank
system. Although search engines differ in the manner in which they measure importance, they all take
the following into account when doing so:

Link Popularity

Link popularity is a measure of your site's importance based on the number of external links pointing
to your site from different domains. Think of a link from another site as a positive review of your site:
another webmaster has gone to the bother of linking to you, and so must have found your site valuable
or relevant for some reason or another. Search engines take this perception of value into account when
ranking sites and factor it into their measure of page importance. The more external links pointing to
your site, the more important it is perceived to be by the search engine and the higher it is likely to
rank.

The quality of external links coming to your site

As well as taking the number of external links into account when measuring the importance of a website,
search engines also look at the quality of external links. This 'quality' is proportionate to the perceived
importance of the site that links to you. Because all sites are ranked in terms of importance, as we noted
above, search engines give greater weight to external links coming from a site that already has a high
Page Rank or measure of importance.

Relevance of text links

Another factor that can favorably affect search engine ranking is ensuring that the anchor text in the
links pointing to your site contains the keywords that your site wishes to rank for. In this way, the
anchor text in external links passes added relevancy on to your page.

For example, if your site sold used cars online, the link to your site might read something like the
following:

<a href="">Buy Used Cars Online</a>

Here the text before the closing anchor tag </a> is the anchor text for your link. For a search query
such as 'buy used cars', this anchor text reinforces our keywords and passes relevancy on to our pages
for that search query.
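Anchor text is exactly what a spider can extract mechanically from a page. The following Python sketch (standard library only; the URL is a hypothetical example) collects each link's destination and anchor text, the raw material a search engine uses when assessing link relevance:

```python
from html.parser import HTMLParser

class AnchorTextParser(HTMLParser):
    """Collects (href, anchor text) pairs from a chunk of HTML."""
    def __init__(self):
        super().__init__()
        self._href = None   # href of the <a> tag currently open, if any
        self._text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = AnchorTextParser()
parser.feed('<p>See <a href="https://example.com/cars">Buy Used Cars Online</a> today.</p>')
print(parser.links)
# → [('https://example.com/cars', 'Buy Used Cars Online')]
```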

We will show you more about working with external links later in this course.

1.4 Web design and search engine optimization

If you are a webmaster rather than a web designer, you may be thinking that some of the above elements
relate to web-design and should really be a problem for whoever designed your website.

The truth, however, is that many professional web designers remain ignorant of what search engine
optimization is and what it does. For this reason, web designers often make the mistake of including
design elements that are not search engine friendly. Many of the elements that we tend to associate
with 'good' contemporary web design actually impede the ability of search engines to spider web
pages, as they do not contain a strong textual component that the search engine can read and index.
The following list shows a few common design elements that can have a negative impact on search
engine ranking:

1.4.1 Flash

Flash is a multimedia format devised by Macromedia that allows one to incorporate sound and visuals
in web pages. However, search engines find Flash files difficult to read, and it is only now that some
search engines are beginning to find ways to extract text from .swf files. Websites that rely heavily on
Flash for navigation and presentation can find that they do not rank very highly in search engine
results.

1.4.2 Java

Java is a high-level programming language that allows you to run scripts and applications, or applets,
within Java-enabled browsers. However, like Flash, search engine spiders still find it difficult to read
content generated by Java applets (the same applies to client-side JavaScript). Websites that use these
technologies for internal linking systems and drop-down menus can impede the ability of search
engine spiders to move around the site and locate every page.

1.4.3 Frames

Frames are a way of splitting your page into different parts, effectively placing each part within its own
window. However, search engine spiders can find it difficult to move between frames on a page. For
example, instead of quickly locating your main content, the spider may be presented with numerous
windows, each with the same apparent priority.

1.4.4 Graphic Navigation Systems

Remember that search engines read text, not graphics. If your site uses images such as .gifs or .jpegs
for its internal links, rather than links with keyword-rich anchor text, you could be missing out on an
opportunity to increase the relevancy of your pages.

Of course, there are benefits to having a well-designed site. Firstly, it may look impressive from a
user's point of view. Secondly, your site may be user friendly, with a good structure and a navigation
system that allows people to quickly locate the information they are looking for. However, these
factors are not much use if people can't find your site in the first place.

You may in fact spend a lot of time and money on web design only to find that your site ranks poorly
(or not at all) in search engine results. This means that the vast majority of Internet users will find it
difficult to locate your website. For many companies, this also means that they have to seek
professional SEO help even after having already spent a lot of money on professional web design.

In general, then, search engine friendly web-design should limit the use of the above design elements,
and ensure that web pages contain a strong textual element that search engines can read.


        Search Engine Optimization (SEO) improves the visibility of your pages in search engine
         results in order to increase the amount of targeted traffic coming to your site.
        Unlike other systems aimed at gaining greater exposure for your site, SEO works with a
         system of free listings that is commonly used by the majority of Internet users.
        At its most basic level, SEO involves optimizing your pages by making sure that they employ
         the search terms, keywords, or key phrases that people are most likely to use when searching
         for your products or services.
        Search engines read and index the textual content of web pages in order to return relevant
         results for search queries.
        In order to optimize your site, you should ensure that the title, Meta tags, heading, and page
         content of individual web pages are all rich with relevant, targeted keywords.
        As well as on-page factors, the ranking of web pages is also dependent on the number and
         quality of external links pointing to a site.
        Web-design and search engine optimization are not the same thing. Professional web-
         designers often include design elements that are not search engine friendly.
        Search engines find it difficult to read sites that rely too heavily on Flash, JavaScript, or
         frames. An optimized website should contain strong, keyword-rich textual content that
         search engines can understand.


By now you should have some idea of the ways in which search engine optimization can benefit your
website. By using the methods associated with search engine optimization you can utilize the most
popular system of free listings on the Internet, make your web pages more visible, and ultimately
bring more visitors and customers to your site - and all without having to spend a penny on further
web design or professional SEO help! In the next unit, we will take a further look at how search
engines work and at which search engines you should be targeting.

Chapter 3:
An Introduction to Search Engines
In the last unit, we explained why search engine visibility is important. In this unit we will take a closer
look at search engines. Because SEO is about improving the visibility of your web pages in search
engine results, we have to understand a bit about how search engines work. By the end of this unit you
should be able to:

        Understand what search engines do
        Understand which search engines to concentrate on when optimizing your site
        Understand how search engines rank results
        Measure the Page Rank of individual web pages
        Understand how to perform advanced searches

This unit assumes that you have read and understood the last part of the course and that you are
comfortable with the terms: keyword, key phrase and search engine optimization.

2.1 What is a search engine?

Wikipedia defines a search engine as: 'a program designed to help find information stored on a
computer system such as the World Wide Web, or a personal computer. The search engine allows one
to ask for content meeting specific criteria (typically those containing a given word or phrase) and
retrieving a list of references that match those criteria. Search engines use regularly updated indexes
to operate quickly and efficiently.'

In other words, a search engine is a sophisticated piece of software, accessed through a page on a
website, that allows you to search the web by entering search queries into a search box. The search
engine then attempts to match your search query with the content of web pages that it has stored, or
cached, and indexed on its powerful servers in advance of your search.

Note: many search engines allow you to search for things other than text: for example, images.
However, for the purpose of this course, we will focus on text-based searches. As we pointed out in the
last unit, SEO methods are largely (but not exclusively) centered upon text as they involve matching key
parts of the text in your web pages with the keywords or key phrases that people actually type into
search engines when looking for something on the internet.

There are two main types of search indexes we access when searching the web:

        Directories
        Crawler-based search engines


Directories

Unlike search engines, which use special software to locate and index sites, directories are compiled and
maintained by humans. Directories often consist of a categorized list of links to other sites to which you
can add your own site. Editors sometimes review your site to see if it is fit for inclusion in the directory.

Crawler-based search engines

Crawler-based search engines differ from directories in that they are not compiled and maintained by
humans. Instead, crawler-based search engines use sophisticated pieces of software called spiders or
robots to search and index web pages.

These spiders are constantly at work, crawling around the web, locating pages, and taking snapshots of
those pages to be cached or stored on the search engine‟s servers. They are so sophisticated that they can
follow links from one page to another and from one site to another.

Google is a prominent example of a crawler-based search engine.
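The crawling behavior just described can be sketched as a simple breadth-first traversal of links. In the toy example below, the 'site' is an in-memory link graph (the page names are invented); note that the spider can only discover pages that some other page links to:

```python
from collections import deque

def crawl(start, links):
    """Breadth-first 'spider' over an in-memory link graph:
    `links` maps each page to the pages it links to."""
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        page = queue.popleft()
        order.append(page)                    # 'index' the page
        for target in links.get(page, []):    # follow its links
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return order

site = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["widget-a", "widget-b"],
    "widget-a": [],
    "widget-b": ["products"],
    "orphan": [],   # no page links here, so the spider never finds it
}
print(crawl("home", site))
# → ['home', 'about', 'products', 'widget-a', 'widget-b']
```

The unreachable "orphan" page illustrates why good internal linking matters: a real spider, like this sketch, only finds pages it can reach by following links.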

Note: Some search systems are „hybrid‟ systems as they combine both forms of index. Yahoo, for
example, features both directories and search engines.

As we will see later in this course, the SEO process often involves optimizing your site in such a way
that it allows search engine spiders to locate every page on your site quickly and easily.

Spidering vs submitting your site manually

If you browse the web, you will notice that many companies will offer to submit your site to search
engines for inclusion in their listings. The services these companies offer are largely unnecessary and
can prove to be a waste of time and money.

It is important to remember that search engine spiders are constantly crawling the web, following links
and indexing pages. Because spiders automatically index your pages when they find them, there is
absolutely no need to submit your site manually to the major search engines.

Note, however, that the process of being found can take some time, and it can be weeks before the major
search engines index your site. SEO is a cost-effective way of making your site visible, but it can take
time, especially for new sites. However, there are ways to accelerate the indexing process, including
XML sitemaps and RSS. Both these topics will be covered in the next tutorial.

2.2 Which search engines to target?

In the last unit, we suggested that the vast majority of Internet users use search engines to locate
products or services. This free system of listings is a more popular method of locating sites than paid-for
advertising such as PPC and is thus a better way of improving the visibility of your website. But which
search engines do you want to be found by and which search engines should you target?

Although the majority of Internet users rely on search engines to find what they are looking for, they do
not all use the same search engines. There are, in fact, numerous search engines out there, all vying for a
share in the lucrative search engine market. Here are just a few of the search engines that we use when
looking for something on the Internet:

There are, then, numerous companies we can turn to when searching the Internet. Note, however, that
not all of these search engines use truly distinct search technology. AOL, for example, bases part of its
search results on Google. Teoma uses Ask Jeeves technology. Dogpile is a meta-crawler, which means
that it searches all the major search engines for you and compiles results from places like Google,
Yahoo, and Ask Jeeves.

This may seem like a bewildering array of search options and a formidable number of search engines to
optimize your site for. However, we only have to concentrate on the largest players in the search engine
market, as they have the most people using their search technology, and because they also act as search
providers, leasing out their search technology to other search engines.

Let's look at who the leading players are in the search engine market. The following chart, compiled
from data provided by Hitwise, shows the search engine market share for December 2009, November
2010 and December 2011.

As we can see, Google, Yahoo, and Bing are the big players in the search engine market, accounting for
just over 90% of the total market. This means that more people use their search technology to search for
products or services on the web than use any other search engine. For this reason, these are the search
engines you should primarily focus on when optimizing your site. Consequently, these are the search
engines we will focus on throughout this course:

        Google –
        Yahoo –
        MSN –

There are some important things to note about these search engines.

   1. each uses a different system to rank pages
   2. because different systems are used, a high ranking for specific keywords in one search engine
      does not automatically mean that your page will rank highly for the same keywords in another
      search engine
   3. nevertheless, each uses similar principles to determine the relevancy and importance of web pages
      in relation to search queries

2.3 Anatomy of a search

In the last unit of the course we began to show you how search engines work. For the sake of simplicity,
we can consider the search process to work something like the following:

   1.    The search engine spiders the web
   2.    The search engine caches the pages its spiders find on its servers
   3.    The user enters a search query
   4.    The search engine checks the search query against its index
   5.    The search engine returns what it believes to be the most relevant results for that query

Although the process is actually more complex than this, the above outline is useful in helping us to
visualize how searches work, and in reminding us that when we enter a search term, the search engine
does not actually rush off and check every page on the web. This would take far too long. Instead, it
checks your search term against an index that is stored on its servers. Spiders working their way
around the web constantly update this index.

Note: because pages are indexed in advance of searches, the results returned might be out of date. When
you click on the link for one of the results, for example, you may find that the page has been updated
since the search engine last spidered it, or even that the page you want has moved.

If I carry out a search for cheap web-hosting, the search engine checks its index to see which pages carry
the terms 'cheap', 'web' and 'hosting'. It then returns a results page containing what it believes are the
most relevant pages for these particular keywords.
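The lookup just described can be sketched with a toy inverted index in Python (the page addresses and text are invented for illustration; real indexes also store positions, weights, and much more):

```python
from collections import defaultdict

def build_index(pages):
    """Build a tiny inverted index: word -> set of page ids."""
    index = defaultdict(set)
    for page_id, text in pages.items():
        for word in text.lower().split():
            index[word].add(page_id)
    return index

def search(index, query):
    """Return the page ids containing every word in the query."""
    sets = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*sets) if sets else set()

pages = {
    "hostco.example/plans": "cheap web hosting plans",
    "flowershop.example": "buy fresh flowers online",
    "hostreview.example": "the best cheap hosting reviewed",
}
index = build_index(pages)
print(search(index, "cheap hosting"))
# → {'hostco.example/plans', 'hostreview.example'}
```

The key point mirrors the text above: the query is answered entirely from the pre-built index, without visiting any page at search time.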

Let‟s look at a typical search result page. This page shows the results for the above search in Google
(Illustration 1). The results page is set out as follows:

          Search box with our search query.
          The number of results Google returned for our search query (plus the time the search took).
          Sponsored links. This is paid-for advertising. For this results page, Google has selected
           adverts that are relevant to our search query.
          Search results. This section shows the pages that Google thinks are most relevant to our
           particular search terms. These listings are free.
          Link/Page title. The text is the exact text that appears between the title tags (<title></title>)
           on the page that the search result links to. Notice how keywords from our search query have
           been highlighted.
          Page description. This text is commonly the actual text that appears in the Meta description
           of the page that the search result links to. This is the text between the quotation marks in the
           HTML tag <META NAME="description" content="YOUR TEXT HERE">. Again, Google
           has matched this text with our search query.
          Domain. This is the address of the page linked to.
          Cached page link. Unlike the above link, which links to the domain that the page is on, this
           link takes us to the cached version of the page that Google has stored on its server.
          More results. Links to further pages of results

We will now look at some of the ways in which search engines rank pages when determining search
results.

2.4 Ranking Algorithms

Search providers use complex mathematical equations called algorithms to rank web pages. These
algorithms make calculations about the relevance of words on web pages in relation to search queries or
the perceived importance and link popularity of websites. They may also take other factors into account
when ranking results, such as the age of the domain your site is on, or whether the terms used in a search
query appear in the URLs of sites in the search engine‟s index.

You may be surprised to learn that SEO professionals are not entirely sure how these algorithms work.
In fact, search algorithms are a closely guarded trade secret. If they were made available to the public,
we would see a lot more websites trying to find ways to exploit them in order to gain better search
engine rankings.

Algorithms tend to be patented, and these patents can sometimes give SEO professionals a clue as to
how search engines rank the relevance and importance of web pages. Otherwise, SEO involves a fair
degree of trial and error, and most of the SEO process falls back upon tried and tested methods that
circulate amongst the SEO community and that have been shown to be effective in improving search
engine visibility (SEO websites and forums can be a good place to visit to see SEO professionals
discussing these methods and exchanging ideas).

2.4.1 Page Importance

There are two main factors that search engines use to determine the position that pages will gain in
search results:

       Keyword relevancy
       Page importance or link popularity

As we noted above, when you carry out a search query, the search engine tries to return relevant pages
for that query by returning pages that contain the keywords in your search query.

However, search engines also take the importance of the page into account when ranking pages. This
importance is based on the number of external links pointing to a page. The more links pointing to your
pages, the more important they are deemed to be by the search engine.

The best example of this system of ranking pages is Google‟s patented Page Rank.

2.5 Google Page Rank

Google's Page Rank is a system that rates the importance of pages in direct proportion to the number of
external links pointing to those pages.

Page Rank exploits the network of links on the web in order to determine the relative value of
individual web pages. It does this by counting the number of links pointing to one page from other
sites. As Google puts it, a link to one of your pages from another site is considered a 'vote' in favor of
that page. The more votes, the greater the value or perceived importance of the page.

However, Google also takes the importance of the page that links to your page into account when
determining the value of your page. If the page that links to you is already seen to have a high
importance – in other words, if it already has a high Page Rank – then the link it provides is 'weighted'
higher than a link coming from a page with a lower Page Rank or lesser importance.

Google then combines Page Rank with page relevance to ensure that the pages returned in results are not
only important in themselves but are also relevant to your search.
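Google's actual algorithm is far more elaborate (and secret), but the core idea of votes weighted by the voter's own importance can be sketched as a simple iterative calculation. The following is a textbook-style simplification, not Google's implementation; the damping factor of 0.85 is the value commonly cited for the original PageRank formulation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank: `links` maps each page to the pages it links to.
    Returns a score per page; scores sum to (roughly) 1."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page gets a small base score (the 'random surfer' teleport).
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:            # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:                       # each outlink carries an equal share
                for target in outlinks:
                    new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

# A is linked to by both B and C, so A should come out most 'important'.
links = {"A": ["B"], "B": ["A"], "C": ["A"]}
ranks = pagerank(links)
print(sorted(ranks, key=ranks.get, reverse=True))
# → ['A', 'B', 'C']
```

Note how C, which no page links to, ends up with the lowest score even though it casts a vote: importance flows in through incoming links, not outgoing ones.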

You can find Google‟s own explanation of Page Rank here:

Pages Vs Websites

Google PageRank applies to individual pages and not websites as a whole. Pages on the same site will
often have a different PageRank.

It is important to note this emphasis on individual pages rather than sites as a whole. Similarly, when we
carry out a search in a search engine, the results returned refer to individual pages rather than whole
sites. This makes sense from the point of view of both the search engine and the user. Some pages
within a website will usually be more important than others, e.g. the homepage. Also, individual pages
within websites are not always relevant to the same things and may cover topics that are unrelated to
the user's search query.

From an SEO point of view, you will be looking to optimize individual pages so that they rank for
different keywords. We will show you effective methods of achieving this later in the course.

Although Page Rank is specific to Google, most of the major search engines now use a similar system to
determine the position of pages in search results.

2.5.1 How to check PageRank

It is particularly important that you learn how to understand and measure Page Rank, as it will play a
significant part in your future SEO efforts. The ability to measure Page Rank will help you analyze
competitors' web pages and to keep track of how well your own web pages are faring once they are
optimized and online.

To measure Google Page Rank you must first install the free Google Toolbar into your browser.

2.5.2 Installing the Google Toolbar

To get the toolbar, navigate your browser to the following URL:

Different versions of the toolbar are available for different browsers like Internet Explorer and Firefox.
Google should automatically detect which browser you are using and offer a download for the
appropriate version.

2.5.3 Measuring PageRank

With the toolbar installed, try browsing the web. As you navigate from page to page, the little bar next
to 'Page Rank' will fill up green as the Page Rank for the current page increases and go down as the
Page Rank for the current page decreases.

To get a more accurate numeric measure of Page Rank, hover your mouse over the part of the Toolbar
that reads Page Rank. A small dialogue box should appear with the following message:

'Page Rank is Google's measure of the importance of this page (x/10)'

Here x is the actual value of the page out of a total of 10. The higher the number, the higher the Page
Rank for that page.

You can now measure the Page Rank of web pages. This has many SEO applications, including:

    The ability to measure the Page Rank of your own pages
    The ability to measure the Page Rank of competitors' web pages
    The ability to measure the Page Rank of potential link partners. Remember that the more
     important the site that links to you is, the more weight is given to that link, and hence the
     greater your perceived importance.


Let‟s try measuring the Page Rank of some web pages:

    Install the Google Toolbar into your browser, making sure
     that you enable advanced options.
    Once installed, try carrying out a search for the kind of
     products or services that your website offers.
    Visit all the pages returned on the first page of search
     engine results, and note down their Page Rank.
    Compare the Page Rank of these pages. Which pages have
     the highest measure of importance and which the lowest?


    Search engines allow us to search the web by entering search queries that the search engine
     compares against its index of web pages.
    The leading search engines are currently Google, Yahoo, and Bing.
    Crawler-based search engines use software called spiders to crawl the web and index web pages.
    Search engines use complex mathematical algorithms to rank web pages.

    Search engine ranking is based on a combination of page relevance and page importance.
    Page importance (or Page Rank) is based on the link popularity of a web page and the quantity
     and quality of external links pointing to that page.
    Page Rank is calculated on a per-page basis and does not apply to websites as a whole.
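The per-page calculation summarised above can be sketched in a few lines of code. This is a minimal illustration of the idea that importance flows along links; the four-page link graph and the damping factor are invented for illustration, and Google's actual algorithm and parameters are more sophisticated and not public.

```python
# A minimal sketch of the PageRank idea: importance flows along links.
# The link graph and damping factor below are illustrative assumptions,
# not Google's actual data or parameters.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

links = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home", "about"],
    "partner": ["home"],  # a hypothetical external page linking in
}
ranks = pagerank(links)
# "home" ends up with the highest rank: it has the most inbound links,
# while "partner", which nothing links to, ends up with the lowest.
```

Notice that the rank of each page depends only on the pages linking to it, which is why the calculation is per page rather than per site.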

2.6 Conclusion

Search engines are sophisticated tools that allow users to quickly locate products and services on the
Internet. Since SEO is aimed at improving your visibility in search engine results, it is essential that you
understand the criteria they use to rank web pages. In the next units of this course we will show how to
use search engines to help locate the right keywords for your products and help analyze the competition
you will face in search engine listings.


What do you understand by the following terms?

       Search Engine
       Search query
       Spider
       Page Importance
       Link Popularity
       Page Rank

Chapter 4.
Search Engines and Latent Semantic Indexing (LSI)
In the last unit of the course, we offered a basic overview of how search engines work. In this unit, we
are going to take a more in-depth look at search technology, explaining some of the innovations that
have been made to help return relevant results for search queries. In particular, we will be looking at
some of the factors involved in Latent Semantic Indexing (or LSI for short).

Because of its very nature, beginners may find the material in this unit quite advanced, so you are
encouraged to take your time and try to absorb the main points. We have also provided
footnotes and suggestions for further reading. You are not required to read this material, but more
advanced webmasters may find some of the sources mentioned useful.

By the end of this unit you should be able to:

    Understand the basics of Latent Semantic Indexing
    Understand how a search engine sees documents
    Understand how a search engine weights keywords

This unit assumes that you have read the previous parts of the course and are familiar with major search
engines such as Google.

3.1 Another look at search engines

SEO requires quite a lot of background knowledge if you are going to optimize your page in a manner
that is effective and does not actually damage the ranking of your website. Before you even get your
hands dirty altering the code on your web pages, you need to do quite a bit of research into the most
effective keywords to use for your product or services and into the competition you face in search
engine rankings. However, even prior to this, it is necessary that you understand a bit about:

    Search engines
    The individual searcher

In one sense, these can be considered the 'rocks' upon which effective SEO is founded. After all, SEO is
about improving your search engine visibility in order to bring targeted traffic to your site, and this
implicitly involves understanding the nature of both search engines and the searcher. Knowledge of
these areas will prove an immense help when you come to optimizing your own pages.

The two areas are inter-related: after all, the function of a search engine is not to search documents per
se, but to search documents in a way that satisfies the needs of the searcher. In order to maintain trust,
the search engine must continue to provide the user with reliable and relevant results for search queries.
In this context, any innovations made in search engine algorithms can be considered, first and foremost,
as refinements aimed at providing ever more relevant results for the searcher.

We will now look at some of the advanced techniques that search engines are beginning to employ in
order to satisfy the needs of the searcher.

3.2 Google and Latent Semantic Indexing (LSI)

In the last unit of the course, we showed you how a search engine attempts to find relevant documents
for a search query by locating pages in its index that match the search query - that is, pages that contain
the specific words we entered. However, the process is rather more complex than this, largely because of
an innovation on the part of the world's leading search provider, Google.

In order to return more relevant results for the user, Google has begun to employ a method
called 'Latent Semantic Indexing' when indexing documents on the web. Although this method is not
used universally by all search engines, it is likely that other search engines will begin to factor this (or a
similar) method into their algorithms in the future.

Note that Google does not rely entirely on LSI for finding relevant results. However, Google has been
using LSI for some time and has recently increased its weighting. This means that while traditional
keyword-based search queries are still relevant - i.e., Google still tries to retrieve documents that contain
the specific search terms or keywords you use - Google's search algorithm has begun to place more
importance on LSI when attempting to determine and retrieve relevant documents for a specific search
query.

So what is LSI and how does it differ from a standard keyword search? In essence, LSI is a method for
retrieving documents that are relevant to a search but that may not contain the specific keyword entered
by the user.

For example, in a traditional keyword-based search, if I enter the search phrase 'used cars' into the
search engine, it will only return documents that mention those actual terms somewhere on the page. It
will not return web pages that mention terms that we normally consider to be closely related to our
search query, e.g. 'second hand', 'vehicles', 'automobiles', and so forth (unless these pages also happen
to use the key phrase 'used cars').

When using LSI, on the other hand, the search engine finds a means to locate pages that contain related
terms as well as our specific key phrase. Therefore, our search might also return pages that only mention
'second-hand automobiles' as well as pages that specifically mention 'used cars'.

As you can see, then, LSI allows the search engine to return documents that are outside our specific
search phrase, but that are still relevant to our search. It begins to approximate how we actually use
language in real life, where we are aware of alternative terms and synonyms for words, and for this
reason it should prove more useful to the searcher than a standard keyword search.

3.3 Latent Semantic Analysis

LSI is based on a theory called Latent Semantic Analysis. This theory was devised in 1990 by Susan
Dumais, George Furnas, Scott Deerwester, Thomas Landauer, and Richard Harshman.

According to Landauer, Foltz and Laham, Latent Semantic Analysis, or LSA, is a theory and method for
extracting and representing the contextual-usage meaning of words by statistical computations applied
to a large corpus of text.

In other words, LSA is a statistical and mathematical method for finding the contextual meaning of
words in a large collection of documents. Such a collection could be something like the Internet, which
contains a vast corpus of text-based documents in the form of web pages.

If this begins to sound like advanced mathematics meets advanced linguistics, that's because it is! (LSA
even borders on cognitive science.) This method however has immediate applicability to search engines
because we are dealing with the problem of making a mathematical machine, or computer, 'understand',
or analyze, the meaning of words (semantics is the study of word meaning, hence Latent Semantic
Analysis).

Unlike most humans, who usually acquire the ability to use and understand language at an early age,
computers cannot understand what words mean. The same holds for search engines. Despite their
sophisticated mathematical algorithms, and despite the fact that these algorithms 'read' the text on web
pages to some extent, search engines are actually rather stupid and cannot form even the most basic
understanding of what words mean.

What is the 'contextual-usage meaning' of words? To explain this we have to look at two features of
everyday language which cause particular problems for computers and search engines.

       Synonymy
       Polysemy


A synonym is a word that roughly has the same meaning as another word. To find synonyms for words
you simply have to consult a Thesaurus, where you will find a list of alternative words that can be
interchanged with the original word.

I say 'roughly' because we can't just select any alternative listed in the Thesaurus to replace our original
word. In fact, some words only become synonymous with other words when used in the right context.

For example, if I consult my Thesaurus for a synonym for part of our earlier search for a car, 'used', I
am provided with the following list of possible alternatives:

       Cast-off
       Hand-me-down
       Nearly new
       Not new
       Reach-me-down
       Second-hand
       Shop-soiled
       Worn

If I were looking for second-hand clothes rather than used cars, I could use many of the above
synonyms, as we customarily refer to second-hand clothes as 'cast-offs' or 'hand-me-downs'. However,
we don't use such phrases as 'hand-me-down cars' or 'shop-soiled automobiles'. The context of our
original phrase, or the word 'cars', determines that only one of the above phrases - 'second-hand' - is an
appropriate alternative for 'used'.

In other words, we understand which words are synonyms according to the context in which they
appear.

Of course, it would be of great advantage to us as searchers if the search engine were to automatically
find commonly-used alternative terms for the search phrases we entered. While we could simply
construct a search engine with its own built-in Thesaurus, the above example shows us the problems we
would inevitably encounter if we did so.

If the search engine attempted to substitute our search terms with all the alternatives found in its
Thesaurus, it would produce some very strange search results. Without some understanding of
'contextual-usage meaning', or the context in which the term to be substituted appears, the search engine
would be unable to pick the 'right' synonyms.


'Polysemy' can roughly be translated as 'many meanings'. It refers to the fact that most words in any
given language have more than one meaning.

To see this you simply have to look in a dictionary, where you will find that most words have more than
one definition. If, for example, we take a term from our earlier search, 'vehicle', we can see that it could
have more than one meaning. According to the Oxford Concise Dictionary, a 'vehicle' could be a thing
for transporting people, a means of expressing something, or a film intended to display its leading
performer to best advantage!

How, then, do we decide which of these possible meanings is called into play? This is where 'contextual
usage' comes in. As language users, we know which meaning is being used according to the context in
which it appears. If, for example, I were to say that 'Top Hat was a vehicle for Fred Astaire and Ginger
Rogers', you would know that the word 'vehicle' in this context refers to a type of film and not a car. If,
on the other hand, I use the phrase 'second-hand vehicle', you are likely to know that I am referring to a
car.

Unfortunately, a computer has no way of distinguishing between the two, as it lacks the ability to
understand the context of statements and has no knowledge of the linguistic customs that give rise to
polysemy. This means that the search query 'second-hand vehicles' could potentially return any page
that happens to mention the two words, including pages that mention films or even 'vehicles' for
expression such as poems.

We clearly have a problem then, because computers can't understand the meaning of words according to
the context in which they appear. The search engine either has to stick with the terms given and ignore
all possible alternatives - which means that we could miss documents that are relevant to our search but
don't contain our key phrase - or include all possible alternatives - which means that numerous irrelevant
results could be returned.

LSA provides us with a means of getting round the problem of computers not being able to understand
contextual-usage meaning. It has been successfully applied to the process of information retrieval - that
is, the process of retrieving information from large databases and collections of documents (like the
Internet) - because it adequately gets round the two problems of synonymy and polysemy.

It does this by looking at the collection of documents as a whole and finding words that are commonly
closely related. For example, by looking at enough documents it could find that 'used cars' and 'second-
hand automobiles' are closely related terms simply because they customarily appear together on the
same pages. Let's have a closer look at how this works.

3.4 Latent Semantic Indexing in Action

The following material is based on the paper 'Patterns in Unstructured Data' by Clara Yu, John
Cuadrado, Maciej Ceglowski and J. Scott Payne. In this paper, the authors present what is probably the
best introduction to LSI and search engines currently available, and one that is popular amongst the SEO
community. A copy of the paper can be found here:


Advanced webmasters may wish to consult this paper themselves, although this is not an absolute
requirement as the following material provides a simplified version of the research it contains.

3.4.1 What people expect from a search engine

Yu et al. begin by pointing out some of the problems faced by current search technology. The internet is
growing at an exponential rate, to the extent that, as the authors point out, Google has over 8 billion web
pages in its index. This effectively means that more and more users have access to a vast collection of
information, and that search engines face the task of indexing and searching this vast reservoir of data to
return results that are both relevant to the individual searcher and simple enough for the average user to
understand.

The trouble is that, given the sheer size of the Internet and the current state of search engine technology,
any relevant information we find will still appear among a ton of irrelevant pages. Therefore search
engines today still face the task of coming up with ever better ways of finding relevant results for the
individual searcher.

According to Yu et al, there are three main things that people expect from a search engine (what they
call the 'Holy Trinity' of searching). These can be defined as follows:

    Recall
    Precision
    Ranking

'Recall' refers to the ability of the search engine to recall relevant information for a search. This relates
to our desire to obtain all the relevant information that exists on a topic when we search for it. 'Precision'
relates to the fact that we want the results returned to be precise, i.e. to contain more relevant than
irrelevant information. Finally, we expect the results returned to be presented to us in an ordered
manner. We expect them to be ranked in such a way that the search engine presents what it perceives to
be the most relevant results for our specific search first, and the least relevant results last.
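Recall and precision can be made concrete with a small worked example. The document IDs and the split between relevant and retrieved pages below are invented purely for illustration:

```python
# Recall and precision for a single search, using invented document IDs.
relevant = {"doc1", "doc2", "doc3", "doc4"}    # all pages truly about the topic
retrieved = {"doc1", "doc2", "doc5", "doc6"}   # what the search engine returned

true_hits = relevant & retrieved               # relevant pages actually returned

recall = len(true_hits) / len(relevant)        # how much relevant material was recalled
precision = len(true_hits) / len(retrieved)    # how much of the result set is relevant

print(recall, precision)  # 0.5 0.5
```

Here the engine found half of the relevant pages (recall 0.5), and half of what it returned was actually relevant (precision 0.5).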

We can use these criteria for judging the efficacy of any current search technology, e.g. Google. When
we use a search engine, we expect it not to omit information (recall), to return relevant results
(precision) and to arrange those results in ordered SERPs (ranking). Yu et al. envision that the ideal
search engine would be able to quickly search every document on the internet and return up-to-date
results while still satisfying these criteria. However, where it is relatively easy for a search engine to
increase its scope or speed up its searches, as this largely involves investing in additional resources, it is
still difficult for a search engine to improve upon the recall, precision and ranking of searches. This is
where latent semantic indexing comes in.

3.4.2 A Middle Ground?

Yu et al. outline two main ways in which we can search a collection of documents such as the Internet.
For the sake of simplicity we will call these methods:
    Human
    Mechanical


The first type of search is not likely to be exhaustive, as nobody has the time or resources to go through a
whole collection of documents word by word (can you imagine reading all the pages on the internet!).
Human beings are more likely to scan pages for relevant information than to read each page in full to
see if it contains the phrase or information they are looking for.

Although this kind of search is not exhaustive, it is based on a high-level understanding of context, in
the sense that human searchers usually know that certain parts of documents - e.g. page titles, headings
and indices - usually contain relevant information regarding what a page is about. Because this kind of
search is carried out with an understanding of context, it can also successfully uncover relevant
information in unexpected places, e.g. in articles which are not dedicated to the subject we were
originally looking for.


The second type of search is exhaustive in the sense that it works methodically and mechanically
through an entire collection of documents, noting down every single mention of the topic we are looking
for. Computers are particularly good at this kind of task.

Although this second type of search can find every single instance where a term is mentioned, it has no
understanding of context. Without this understanding of context, the computer cannot return documents
that are related to our search but that don't actually contain our search terms. Alternatively, the search
engine returns documents that mention our specific search terms but that use those terms in the wrong
context (see the problems of synonymy and polysemy outlined above).

In short, a human search understands context but remains inexhaustive or sorely incomplete, while a
mechanical search is exhaustive but has no understanding of context.

An obvious solution to the difficulty of searching a collection of documents like the Internet would be to
find a way to combine the two. That way we would have the best of both worlds, where an exhaustive
mechanical search would also display an understanding of context, thereby allowing 'synonymous' or
related material to be found while cutting out the irrelevant material caused by the problem of polysemy.

Yu et al point out that past attempts to combine mechanical searches with a human element have met
with only limited success. Attempts to supplement searches by providing a computer with a human-
compiled list of synonyms have not proved successful. Surprisingly, there would also be shortcomings
were we to employ a human 'taxonomy', or a system of classification such as those that have been used
by libraries for generations (e.g. the Library of Congress). Under such systems, documents are classified
according to different human-defined categories (e.g. a library book could belong to categories such as
science, natural philosophy, natural history, and so forth).

Even though traditional archivists successfully employ such systems, they might not work so well for
the Internet. How, for example, would one find the means and resources to go about placing the billions
of pages on the internet into little pigeonholes? What, moreover, would happen if many of these
documents were relevant to more than one category (as most inevitably will be), or if your average
Internet user didn't have knowledge of the category their search belongs to?

Latent Semantic Indexing can be said to be a solution to the above problems in that it appears to offer a
'middle ground' between the two methods outlined above. LSI offers an exhaustive search that still
understands context. Better still, LSI is entirely mechanical and doesn't require human intervention.

3.4.3 How LSI works

In the last unit of this course, we pointed out that a search engine attempts to find relevant results for a
search query by finding pages that contain the terms used in that query. For example, a search for
'mobile phone accessories' will return pages that actually mention the words 'mobile', 'phone', and
'accessories'.
This system is not ideal, as it deems all pages that don't contain our specific search term as irrelevant,
even if those pages potentially contain information that is relevant to our search.

As Yu et al suggest, LSI still takes account of the words a document contains, but it takes the extra step
of examining the document collection as a whole to see which other documents contain the same words.
If it finds other documents that contain the same words, it considers them to be 'semantically close'.
'Semantically distant' documents, by contrast, are documents that don't have many words in common.

The important thing to note here is that, by calculating the similarity values between documents, LSI can
actually find words that are semantically related. For example, if the terms 'cars', 'automobiles' and
'vehicles' appear together in enough documents on the Internet, LSI will consider them to be
semantically related. Therefore, a search engine that uses LSI in its index will return pages that mention
'vehicles' when you search for 'cars'.
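The co-occurrence idea just described can be sketched in a toy example. Each term gets a vector recording which documents it appears in; terms whose vectors point in the same direction are 'semantically close'. Real LSI additionally applies singular value decomposition to this term-document matrix, which is omitted here, and the four mini-documents are invented for illustration:

```python
import math

# Terms that occur in the same documents get similar "term vectors",
# so 'cars' and 'vehicles' come out as semantically close. The documents
# below are invented; real LSI also applies SVD to this matrix.

docs = [
    "used cars and vehicles for sale",
    "second hand cars and vehicles",
    "cheap vehicles and used cars",
    "fresh fruit and vegetables",
]

def term_vector(term):
    """1 if the term appears in a document, 0 otherwise, per document."""
    return [1 if term in doc.split() else 0 for doc in docs]

def cosine(u, v):
    """Cosine similarity between two vectors (1 = same direction, 0 = unrelated)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# 'cars' and 'vehicles' co-occur in three documents: similarity near 1.
print(cosine(term_vector("cars"), term_vector("vehicles")))
# 'cars' and 'fruit' never co-occur: similarity 0.
print(cosine(term_vector("cars"), term_vector("fruit")))  # 0.0
```

A search engine with this information could then expand a query for 'cars' to also retrieve pages that only mention 'vehicles'.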

In short, then, Latent Semantic Indexing enhances our searches by taking account of related terms. By
looking at enough documents on the Internet, it can find which words are related to other words, or
which words are synonymous with other words. A search engine that uses LSI can thereby return
documents that are relevant to, but outside of, our specific search query.

3.5 How search engines view web pages

As we noted above, this process does not require human intervention. There is nobody telling Google,
for example, that 'cars' and 'vehicles' are related terms. Instead LSI finds related terms all by itself,
simply by looking at enough documents.

LSI, in fact, is simply a statistical and mathematical computation that looks at word patterns across
documents. It is not an Artificial Intelligence program that gives Google a way to actually read
documents as humans would. In fact, the search engine that uses LSA to index pages remains as stupid
as ever in the sense that it cannot understand even the basic meaning of words.

But that is not to say that LSI doesn't focus on word meaning. Nor does it pay attention to every single
word on the page.

In every language, there are two different kinds of words:

    Content words - e.g. car, phone, liberty, celebrity, etc.
    Function words - e.g. and, but, to, the, etc.

In simple terms, the first kind of word has some kind of meaning for us (i.e., we can visualize what a car
is or understand the concept of liberty), while the second doesn't have the same kind of meaning (ask
yourself, what is the meaning of 'the'?). In other words, words can be divided into those that carry
meaning and those that do not.

LSI works by stripping documents of function words and extraneous terms to focus on terms with
semantic content. It is useful to know this, as it is what a search engine will be doing to the words on
your web pages when it reads them.

In fact, the search engine employs what is known as a stop list in order to strip web pages down to a
skeleton of content words. This stop list is a list of commonly used words - function words, prepositions,
common verbs, etc. - which it removes from the page in order to focus on the words that carry the main
meaning of the page. This greatly reduces the 'noise' on the page and helps the search engine determine
what the page is about.

This is all part of a process the search engine performs upon web pages in order to determine the
relevance of each page objectively. The process LSI performs upon web pages when indexing a
document is as follows:

The search engine removes all markup tags (i.e. code) from a page so that all its content is represented
as a series of characters. The search engine moves through the page systematically, working from top to
bottom and left to right, extracting content from tags as it finds it.

The search engine strips the page of formatting such as punctuation and capitalization.

The search engine applies a stop list to remove commonly used words from the document. This leaves
us with only content words.

The remaining content words are then 'stemmed'. That is to say that the remaining terms are reduced to
common word roots (e.g. 'techno' for 'technology', 'technologies', 'technological').

Finally, the remaining terms are weighted. Weighting is the process of determining how important a
term is in a document.
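The indexing steps above (excluding weighting, covered next) can be sketched as a small pipeline. The stop list and the crude suffix-stripping 'stemmer' below are simplified stand-ins for whatever a real search engine uses, and the sample page is invented:

```python
import re

# A sketch of the indexing steps just described. The stop list and the
# crude plural-stripping "stemmer" are simplified assumptions, not what
# any real search engine actually uses.

STOP_WORDS = {"a", "an", "and", "the", "to", "of", "in", "we", "our", "for"}

def preprocess(page):
    # Step 1: remove markup tags, keeping only the text content.
    text = re.sub(r"<[^>]+>", " ", page)
    # Step 2: strip capitalization and punctuation.
    text = re.sub(r"[^a-z\s]", " ", text.lower())
    # Step 3: apply the stop list, leaving only content words.
    words = [w for w in text.split() if w not in STOP_WORDS]
    # Step 4: "stem" by trimming a trailing plural 's' to a shared root.
    return [re.sub(r"s$", "", w) for w in words]

page = "<h1>Used Cars</h1><p>We sell used cars and vehicles.</p>"
print(preprocess(page))  # ['used', 'car', 'sell', 'used', 'car', 'vehicle']
```

What remains is the 'skeleton of content words' that the weighting step then operates on.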

3.5.1 Term Weighting

By 'term weighting' we mean the importance given to terms or words that appear in a document.

A search engine does not see all terms in a document as equally important (the use of a stop list, for
instance, shows that the search engine treats common words, function words and non-content words as
wholly unimportant). Similarly, the search engine does not treat the content words that remain after it
has filtered a document as if they are all equally important.

According to Yu et al, the weighting of terms by the search engine is based on two 'common sense
insights'. Firstly, content words that are repeated in a single page are likely to be more significant than
content words that appear only once. Secondly, words that are not used very often across the collection
are likely to be more significant than words that are used a lot.

For example, if the word 'aircraft' appears a number of times in a single page, it is likely to be fairly
significant to that document. Remember that a search engine can't read, so a recurrence of such terms at
least indicates roughly what that page is about.

However, if one takes a word that appears in lots of pages - say, a common content word - then it is
treated as less significant. It would not, for example, be much help in allowing the search engine to
distinguish between these pages in terms of their different content.

There are therefore three types of weighting employed by a search engine:

    Local weight
    Global weight
    Normalization

Normalization simply refers to the process by which documents of different lengths are made to appear
'equal'. If this did not occur, longer documents - which, of course, contain more keywords - would tend
to outweigh or subsume shorter documents.

Local weight refers to the number of times a term appears in a document. A word that features numerous
times in a single document will have a greater weight than a word that features only once. This is also
known as term frequency (tf).

Global weight is based on the number of documents in the collection that feature the term: the fewer
documents a term appears in, the greater its global weight. This is often referred to as inverse document
frequency (IDF).

Keyword weighting is calculated according to the following equation:

weight = tf × IDF

Where tf = term frequency and IDF = inverse document frequency.
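The tf × IDF calculation can be sketched with a few invented mini-documents. The logarithmic form of IDF used below is the textbook convention; real search engines use their own, unpublished variants:

```python
import math

# tf x IDF weighting sketched with invented mini-documents. The log-based
# IDF is the textbook formulation, an assumption about what engines use.
docs = [
    ["used", "cars", "cars", "sale"],
    ["used", "vehicles", "sale"],
    ["fresh", "fruit"],
]

def tf_idf(term, doc, docs):
    tf = doc.count(term)                    # local weight: occurrences in this document
    df = sum(1 for d in docs if term in d)  # documents containing the term
    idf = math.log(len(docs) / df)          # global weight: rarer terms weigh more
    return tf * idf

# 'cars' appears twice in the first document but in only one document
# overall, so it carries more weight there than the widespread 'sale'.
print(tf_idf("cars", docs[0], docs))  # ~2.197
print(tf_idf("sale", docs[0], docs))  # ~0.405
```

Notice how the two 'common sense insights' above fall out of the formula: repetition raises tf, and rarity across the collection raises IDF.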

3.5.2 Weighting and distribution vs. keyword density

Although this material is fairly complex, as it appears to involve advanced linguistics and complex
mathematical formulas, it is useful to have a basic grasp of it, as it does have ramifications for the
SEO process.

Traditionally, SEO professionals have focused on a measure called keyword density when dealing with
term weighting. Keyword density is the number of times your keywords appear on a page in relation to
the total number of words. For example, if the keyword 'cars' appeared three times in a document that
contained 100 words, the keyword density for that page would be 0.03 or 3% (3/100).
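The calculation itself is simple division, as this small sketch (with an invented sample page) shows:

```python
# Keyword density as described: occurrences divided by total words.
# The sample text is invented for illustration.
def keyword_density(text, keyword):
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words)

text = ("cars " * 3 + "word " * 97).strip()  # 3 mentions in a 100-word page
print(keyword_density(text, "cars"))         # 0.03, i.e. 3%
```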

Under the keyword density model, the more times a keyword appears on a single page, the more likely it
is that the search engine will find you relevant for that keyword. Under this system, optimizing your
page simply involves increasing its keyword density by mentioning your keywords as many times as
you can on a single page.

However, SEO professionals have come to realize that this is not how search engines work when they
look at keywords or determine the importance of terms to a page. Keyword density only refers to the
use of keywords on a per-page basis and not across the document collection as a whole. As Dr. E. Garcia
points out, modern search engines also have to take into account the following factors when dealing with
keywords:
    Proximity - the distance between keywords on a page
    Distribution - or where keywords appear on a page

These factors have a direct bearing on what a document is about. For example, if the keywords 'used'
and 'cars' have a close proximity, i.e. they appear on the page together as 'used cars', then that page is
more likely to be about used cars. The same goes when one looks at where the keywords appear on a
page (e.g. do they appear in titles and main headings, and so forth?).

The concept of keyword density, by contrast, does not take into account the position of keywords in
relation to each other on a page. If search engines actually used keyword density as a measure of the
relevance of a page, they could potentially return pages that mention 'used' and 'cars' enough times no
matter where they appeared on a page. For the sake of illustration, we could say that the following
phrase might make a page relevant for the keyword search 'used cars':

'I used to cycle to work a lot but most people drive their cars to get there.'

As you can see, this takes no account of the proximity or distribution of keywords, all of which will
have an impact upon what the page is about.
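To see why proximity matters, here is a simple, hypothetical check in Python that distinguishes a page
containing the exact phrase 'used cars' from a page that merely contains both words somewhere (as in
the cycling sentence above):

```python
def contains_phrase(phrase, text):
    """True only if the phrase's words appear adjacent and in order."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    return any(words[i:i + n] == target for i in range(len(words) - n + 1))

page_a = "quality used cars for sale"
page_b = "I used to cycle to work a lot but most people drive their cars"

print(contains_phrase("used cars", page_a))  # True: the keywords are adjacent
print(contains_phrase("used cars", page_b))  # False: both words, but far apart
```

A pure keyword density measure would score both pages on word counts alone; only a proximity-aware
check like this one can tell that the second page is not about used cars.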

Note: In future units of the course, we will occasionally refer to keyword density as it is a term still used
by SEO professionals, and, as a concept, it still works as a suggestive way to get SEO beginners to start
increasing the frequency with which they employ their keywords on web pages. However, it will pay to
remember that search engines use a different system than keyword density for determining the
importance of keywords on a page. Bear this in mind when we start showing you how to employ
keywords in your own pages.

3.6 Conclusion

In this unit, we have only touched upon the basics of search engines and LSI. There are many other
concepts that could be covered (including such things as singular value decomposition and the term-
document matrix!). The material we have covered so far, however, has been simplified to a number of
main points that have a bearing on the basic principles of SEO. These points should help you form a
basic understanding of:

The methods search engines are beginning to use for information retrieval
How search engines actually see web pages
How search engines weight keywords

By now you should have a more in-depth understanding of how search engines are beginning to index
documents. It is useful to understand the principles behind LSI as we will refer back to it later in the
course. As we will see later, LSI also has direct and practical implications for key areas of the SEO
process such as deploying keywords, writing page copy, and constructing anchor text for external links.


       In order to return more relevant results for the user, Search Engines like Google are beginning to
        use Latent Semantic Indexing to retrieve web pages.
       Latent Semantic Analysis is a statistical method for determining the contextual meaning of
        words within a body of text.
       Latent Semantic Indexing helps get round the problems of synonymy and polysemy encountered
        when people search the internet.
       Latent Semantic Indexing can return relevant pages that do not contain the actual terms of a
        keyword search.
       Search Engines do not read every word on a web page. Instead they focus on content words.
       Web pages are subjected to a complex process of linearization, tokenization, filtration, and
        stemming, whereby mark-up, punctuation, and a stop list of commonly used words are removed
        from web pages.
       The content words on web pages are weighted differently according to how frequently they
        appear on a page and in the collection of documents as a whole.
       Keyword density is not an accurate measure of the importance of keywords on a page. Search
        engines actually use methods that look at keyword weighting and distribution.


What do you understand by the following terms?

       Latent Semantic Analysis
       Polysemy and Synonymy
       Content Word
       Linearization
       Tokenization
       Filtration
       Stop List
       Stemming
       Weighting
       Normalization
       Keyword Density

Once you feel that you satisfactorily understand the above terms, move on to the next unit of the course.

Chapter 5.
Search Engine Users
In the last unit of the course, we looked at some of the advanced methods search engines are beginning
to use to index and retrieve information from web pages. In this unit of the course, we will look at
another crucial factor in the search process, the search engine user.

By the end of this unit you should have a better understanding of:

       The habits and patterns of behavior exhibited by the average searcher
       The implications these patterns have for the optimization of your site
       The perceived difference between 'organic' and sponsored listings
       The ideal position to gain in search engine results
       Aspects of your site that persuade search engine users to visit that site

This unit assumes that you have completed the previous units of the course and that you have a basic
understanding of the major search engines and the manner in which they work.

4.1 The importance of the searcher

In a certain sense, the search process involves two elements that come into contact every time a search is
performed. First, we have the search engine itself, a sophisticated piece of technology. Yet we also
have to take the individual searcher into account. This individual probably doesn't even understand how
search engines work, yet, in some respects, all the efforts of the search engine and of search engine
optimization are determined by the actions of this individual.

At the end of the day, search engines become more complex and sophisticated only in so far as they are
trying to satisfy the demands of the individual searcher and return results that are fast, relevant, and easy
enough for him or her to understand. As far as SEO goes, your entire effort boils down to getting the
right kind of searcher to your site, and much of the research for, and content of, your site will be
determined by what they type into search engines when looking for your products or services.

With this in mind, the individual searcher is an important factor to understand when optimizing your
site. By understanding something of the psychology of the searcher, we can gain important pointers
about the most effective way to plan and optimize our site.

4.1.1 Self-reflection

One of the easiest ways to get inside the mind of an average searcher is to reflect upon what you
yourself do when searching online. As an Internet user, there is a great likelihood that you already use
search engines on a regular basis to find information and purchase goods. The next time you look for
something using a search engine, pay attention to such things as the search engine(s) you use, the kind
of searches you typically perform, the amount of time you spend researching an item before you buy,
the number of search engine results you look at, and the amount of time you spend visiting pages that
appear on search engine results pages (or SERPs for short). This kind of reflection can give you clues as
to what the average searcher might do when looking for your products and allow some consideration of
what you should do to ensure that your website satisfies their needs, demands and expectations.


In this task we are going to start to get you to reflect upon what you do when searching for a product
online. Ask yourself the following questions:

   1. Which search engine do you use? Why?
   2. Do you use more than one search engine? If so, which one do you prefer and under what
      circumstances do you revert to another search engine?
   3. When searching for products in a search engine, do you start with a general search using broad
      key phrases, or do you start with specific terms (brand names, models, etc.) and broaden your
      search if this does not return the required results?
   4. How many pages of search engine results pages - or SERPs - do you normally consult after
      performing a search?
   5. How many results per page of SERPs would you normally click on?
   6. In SERPs, do you ever click on the advertising or sponsored links at the top of or to the right of
      the page? If not, why not? If you do, what circumstances lead you to click on paid-for links?
   7. Do you use search engines to research products or to research and buy products (in other words,
      do your visits to vendors' sites for information normally result in your visit being converted to a
      sale)?
   8. On average, how much time do you spend on each page linked to in a SERP?

By reflecting in this manner upon your own experience as a search engine user, you can begin to gain
valuable insight into numerous aspects of the SEO process, from writing Title tags through to finding an
effective way to market your product online. In order to explore some of these aspects, we will now take
a look at what is said to be the typical experience of other Internet users.

4.2 Into the Mind of the Searcher

The material in this section is based on a whitepaper produced by Enquire Search Solutions called 'Into
the Mind of the Searcher'. The full .pdf report can be found at the following address:

This report analyses the habits of search engine users during a typical search situation. Although the size
of the sample group consulted for the report was very small - consisting of a focus group of 24 people of
varying ages and backgrounds - it can nevertheless provide us with some insight into how the average
searcher operates.

In the material that follows we will summarize some of the more important findings of this research.
Taking them as indicative of general search trends (unfortunately, no wide-scale or comprehensive study
of searcher habits exists at the time of writing), we will then highlight some of the implications they
have for designing and optimizing your website.

4.2.1 Searchers and Search Engines

The first point we will look at deals with individual preference when it comes to choosing a search
engine. According to the report, approximately 70% of participants in the survey use Google, with MSN
and Yahoo being the next two main search engines. This accords with the material presented in unit 2 of
this course which, in a wider survey of search engine usage, showed Google, Yahoo and MSN as the
leading search engines, with Google as current king of the search engine market. In fact, some 16% of
the users who participated in the survey had set their home page to Google.

SEO Implications

This gives us extra impetus to optimize our sites for all the leading search engines. Although one might
expect a degree of loyalty as regards which search engines users prefer, 60% of users reported that they
will take recourse to another search engine if they are not happy with the results returned by their
preferred search provider.

4.2.2 Researching and buying

According to the whitepaper, more people are likely to use search engines when researching a product
rather than when they are buying. In fact, although the number of people who are prepared to use a
search engine to research a product is significantly high (at around 60%), there seems to be a distinct
drop in the number of users who will actually use a search engine to buy the product online (28%).

If the searcher is familiar with the product, they show a tendency to bypass search engines and navigate
directly to a site that sells the product or brand. If such familiarity does not exist, searchers tend to revert
to search engines to research products.

Users also appear to spend more time researching a product if the cost of the product is high. This
research usually involves more search sessions than would be the case with a relatively low-cost
product.

Although this suggests that Internet users spend most of their time on search engines researching a
product rather than actually buying, conversion - i.e. visits to your page being converted into sales - still
occurs. Conversion tends to be more likely on sites that offer information in a non-aggressive manner
rather than on sites that offer users the 'hard sell' on a product.

SEO Implications

The report therefore suggests that individual searchers exhibit a tendency to use search engines most
during the 'research or consideration phase of the buying cycle'. This would suggest that people use
search engines to locate information even when attempting to buy something. While there is the
possibility that the information users seek is simply the comparative price of goods, there is no reason
not to offer as much information about your product or services as is necessary. This not only supplies
an evident need for information on the part of the user; it also gives you the opportunity to place
keyword-rich text on your web page, thereby increasing your chances of being picked up by a search
engine for your chosen keywords (we will show you how to research keywords in the next unit of the
course).

With regards to the text that you place on your web page, the research further suggests that a site is more
likely to convert visits into sales if it doesn't actually try to push the product too much. To this end,
make sure that your page copy offers product information and other relevant details rather than
aggressive marketing.

4.2.3 Organic vs. sponsored listings

One of the most interesting points raised by research into searcher habits concerns the status of paid-for
advertising in relation to 'normal' search results. Sponsored listings are the adverts that appear in SERPs.
They usually operate on a Pay Per Click (PPC) system whereby the advertising company pays the search
engine every time a user clicks on the link in the SERP. How much the vendor pays per click is
determined by their position on the page. 'Organic' listings, by contrast, are the free listings that appear
in SERPs. These appear in the main area of the SERP.

The following illustration shows both organic and sponsored listings in Google:

Illustration 1: Organic and Sponsored Links in SERPs.


    1. Sponsored links in the top position of the results page. These are more expensive than adverts on
       the right-hand side of the page.
    2. More sponsored adverts in the right-hand position of the results page.
    3. This section holds the free 'organic' listings.

Research suggests that search engine users actually 'mentally divide' SERPs into different sections such
as the organic listings and sponsored links outlined above. It appears that a significant number of users
will actually ignore some of these sections. In particular, the majority of users exhibit a tendency to
ignore the sponsored links. Nearly 80% of searchers polled in the Enquire survey ignored such links
completely or only considered them after they had been through the organic listings.

There are numerous reasons why users might remain wary of paid-for listings. Users of Google appear
to be especially wary of sponsored links. This may be due in part to their perception that organic listings
on Google are less 'commercial' than those of the other search engines. Users also show a tendency to
ignore formatted advertising or anything on the right-hand side of SERPs, which is the traditional
position for advertising. As the Enquire report suggests, many searchers may actually have become
'conditioned' to ignore anything that looks like paid-for advertising.

Users also appear to be wary of clicking on sponsored adverts to other sites. This reluctance seems to be
motivated by a basic mistrust of sites that market aggressively and the problems of pop-up windows
associated with such sites. One can only interpret this as a 'legacy' from the early days of Internet
marketing when pop-up ads were the norm. It seems, then, that the techniques employed by the
'pioneers' of online business (if the continued popularity of pop-up blockers is anything to go by) created
an issue of trust that still affects online business today.

SEO Implications

Since many searchers appear to remain wary of sponsored links, this gives you even more reason to
focus on, and optimize for, the free system of listings offered by the major search engines. Even if you
have the budget to finance a PPC campaign, it may not be as effective as a high organic ranking in
SERPs.

4.2.4 Position in SERPs

How high should that organic ranking be? Well, research suggests that a high positioning in search
engine results can have a dramatic impact on the exposure one's site gains. Furthermore, your site's
ranking should be as close to the first page as possible. In fact, research suggests that 100% of searchers
will actually check the top three organic listings before clicking on links (a number of people actually
appear to assume that a top listing for a search term in SERPs immediately makes the site relevant for
that term). As far as scrolling down the page to later results goes (and here one might speculate about
whether Internet users in general find scrolling frustrating), searchers would tend to do so only if the
first three results did not return relevant sites. If a relevant site was found in the top results, only three
fifths of the participants in the Enquire survey scrolled down to look at other listings before clicking on
a link.

While the majority of users will actually examine the first few results on the first page of SERPs,
evidence suggests that there is a significant drop when it comes to looking at the second page. The
number of searchers who actually go to the second page of results may be as low as 20%. The remainder
tend to launch a new or refined search if they don't find what they are looking for on the first page.

SEO Implications

It is worrying that a significant portion of search engine users may not actually check results pages after
the first page, or may do so only if the first page does not return relevant results. With this in mind, your
aim should be to gain a first-page ranking for some of your main key phrases in one of the major search
engines, and preferably a position in the first few results.

4.2.5 Ever changing search fields

Research also suggests that an average search session is complex - in the sense that it involves multiple
search queries - and dynamic in the sense that it involves an ongoing alteration of search terms and
shifting of expectations in response to results returned, sites visited, and information received from
previous queries.

Interestingly, the typical search process involves an increasing refinement of search terms. During an
average search, users appear to spend some time refining their search terms, and, in general, this
refinement moves from general search terms to more specific terms. 70% of users start broad because it
seems easier to type in broad phrases, because doing so generates other options, and because they don't
want to exclude potentially relevant results that would be excluded by too specific an initial search.
Thereafter, these broad queries are refined using extra qualifying terms. Such qualifying terms may
include the introduction of a brand at various stages during the search session.

SEO Implications

The implications of this for SEO are twofold. Firstly, your site should be constructed in such a way that
it 'catches' or draws in these users at the different stages of their search, as they move from general
terms to more specific terms. Secondly, the individual pages that this navigation structure gives rise to
should be optimized to rank for the key phrases used at different stages of the search session.

To illustrate this principle, let's look at a search within a highly competitive field. Say that a user is
looking for a hands-free set for a Nokia 7250i mobile phone. A typical search might involve an
increasing refinement of search terms such as the following:

    Mobile phone accessories
    Mobile phone accessories UK
    Nokia mobile phone accessories
    Nokia 7250i phone accessories
    Buy 7250i hands-free set

As you can see, the user has started at a general level and increasingly become more specific in reaction
to the results gained. Note also the introduction of brand at key points during the search session.

Note: As a rule, the more general a search term is, or the fewer keywords it contains, the more results it
returns in SERPs. It seems likely that as search engine users become increasingly sophisticated, they
react to this on an experiential level (i.e. they know by experience that general terms return results that
are too broad or numerous) even if they do not reflect upon why this occurs.

Note that the above list does not take into account the possibility that a user could start at a specific level
(e.g. '7250i hands-free set'), not find the required information, and then move to a more general level
('Nokia phone accessories'). Nor does it exhaust the possible keywords a user could use when searching
for that specific product. Nevertheless, it does show us that a good way to bring targeted traffic to your
site would be to rank for the keywords the user employs at different points in their search, whether that
is during the general initial enquiry or at more specific 'qualifying' stages.

Now, it is highly unlikely that an Internet site dealing with mobile phone accessories would sell just one
product such as a 7250i hands-free set! So, let's assume that we are running a mobile phone accessories
site in India. One way to ensure that you are attracting traffic would be to base your page navigation and
optimization on the principle that searchers move from general to specific search terms.

Starting with your homepage, you could make it rank for the most general term. Because it is general,
the term will be more competitive, but it will also attract more searches than specific or specialized
terms. You then make other pages rank for other, less competitive key phrases, thereby ensuring that
you draw searchers into different parts of your website depending on the specificity of their search.

Our mobile phone accessories site, for example, could attempt to rank for the general and highly
competitive key phrase 'mobile phone accessories' on its homepage, and use other pages to rank for
more specific and less competitive terms. For instance, later pages could introduce key phrases related
to brand (e.g. Nokia, Sony Ericsson, and so forth), type of accessory (mobile phone covers, data cables,
spare batteries, etc.), and a combination of the two (Nokia mobile phone covers).

4.2.6 Why users click on links

Of course, gaining a good ranking in search engine results is of little use if people don't actually click
on the link to your site. Fortunately, the report gives us an indication of the kind of things that will
entice the searcher to visit a site during the average search session.

Some of the main reasons why searchers click on links in SERPs are as follows:

    If their exact search terms appear in the title and description
    The amount of information on products, including prices, features and reviews
    'Trust' - consumer reports, trusted brand names, or trusted URLs.

There are also additional factors said to persuade the customer to visit if they are buying online, such as:

    The opportunity to buy online
    Offers and discounts

SEO Implications:

The above list gives additional reasons why key phrases should be used in the Title and Meta
Description tags of your HTML pages. Title tags and Meta Description tags are not just there for the
benefit of search engines in determining the relevancy of a page in relation to a search query. The text
they contain is also the text that appears in search engine listings. The following illustration shows this:

Illustration 2. Search key phrases as they appear in Title and Description Tags


   1. Page Title as it appears in search engine listings.
   2. Page description as it appears in search engine listings.

1 is the actual text that appears in the Title tag of the HTML page the listing points to:

<title>TITLE GOES HERE</title>

2 is the actual text that appears in the Meta Description tag:

<meta name="description" content="TEXT GOES HERE" />

Illustration 2 shows a search for the key phrase 'mobile phone accessories'. Note how the actual
keywords from our search phrase - 'mobile', 'phone', and 'accessories' - appear highlighted in the page
title and description (we have further highlighted these in the first example by underlining them). A
search engine user looking for mobile phone accessories would immediately notice the extent to which
these results are relevant to their search.

As the first point of contact between the search engine user and your site, these on-page factors are also
instrumental in persuading searchers to visit your site (thereby moving them one step closer to
conversion). Relevancy for the user, or their ability to see an exact search phrase appear in listings, is
therefore an important factor and one you should take into account when optimizing your own site. We
will show you how to optimize your Title and Description tags for specific key phrases at a later stage in
the course.

The above list of reasons can also help you select the actual text that makes up your page description.
For example, as well as containing the exact key phrases that the searcher searched for, your description
can make it clear that they can buy or purchase the product online. If you offer price comparisons or
reviews of products, you can also mention this here (although if you do sell the products online, you
may want to make it clear to potential buyers that you are a vendor rather than a dedicated price
comparison or review site).
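As a rough illustration of this relevance principle, the following hypothetical Python sketch measures
how much of a search phrase is covered by the text of a title or description. Search engines do
something far more sophisticated when they highlight matching keywords in listings, but the idea is
similar: the more of the searcher's exact terms that appear, the more relevant the listing looks.

```python
def phrase_coverage(search_phrase, snippet):
    """Fraction of the search phrase's words that appear in a listing's
    title or description text (a rough stand-in for the keyword
    highlighting search engines apply in SERPs)."""
    words = set(snippet.lower().split())
    terms = search_phrase.lower().split()
    return sum(1 for t in terms if t in words) / len(terms)

title = "Mobile Phone Accessories UK - Covers, Batteries and Data Cables"
print(phrase_coverage("mobile phone accessories", title))  # 1.0 - full match
```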

4.2.7 Once searchers have visited your site

Of course, persuading searchers to visit your site is only half the battle. Once they have navigated to
your site, you will probably want them to do one or all of the following things:

       Stay for a while
       Buy something
       Find your site worthwhile
       Consider returning

This can sometimes be easier said than done. If the Enquire report truly reflects patterns of online
behavior, it appears that the average visitor quickly decides whether a site they have visited suits their
needs. This decision can be as fast as 10 to 18 seconds, so strategies have to be adopted to persuade
them to stay. We also have to take into account the possibility that more visitors will use your site to
research products than to actually buy them, so conversion can be difficult to achieve.

Fortunately, the report lists a number of things that can help in this matter. When searchers visit your
site, they are said to look for the following things, all of which can positively affect their decision to stay
and/or buy:

    Relevance, or their search phrases appearing in visible parts of the page
    Images of products
    Recognized brands
    A good product selection
    Product information, reviews, and comparisons between products
    Price listings
    A professional layout and user-friendly navigation system
    Online offers and discounts
    Simple and convenient methods of purchase
    Preservation of searcher 'anonymity' as far as possible during the buying process

SEO Implications

Enticing visitors to stay and buy is not strictly an SEO issue; the end of SEO is to improve your search
engine visibility in such a way that it brings more targeted traffic to your site. One must realize,
however, that this is itself a means to an end for many websites, and that the ultimate aim of bringing
visitors to a commercial site is to increase online sales and achieve a good rate of conversion. The above
list is included to reflect this. Ultimately, you should consider adding the above features to your site at
the same time as you work on optimizing it.


In this task we will try to synthesize some of the above material and give it real-life application by
applying it to your own website. Answer the following questions, trying as far as possible to base your
answers on the material presented in this unit of the course:

       When optimizing your site, which search engines should you aim to gain high rankings in?
       Why might organic links be more popular, and more important for your site, than sponsored
        links in SERPs?
       What position should you strive to attain in SERPs?
       True or false: search engine users tend to move from specific search queries to broader or more
        general search terms?
       What things might entice a searcher to click on your link in a SERP?
       What things might entice a searcher to stay on or buy from your site once they have visited?


    Understanding the habits of the average searcher can help you in devising your optimization
     strategy as these habits have significant implications for the optimization process
    The majority of search engine users pay attention to, and trust, the free 'organic' listings in
     SERPs rather than sponsored links
    More users consult search engines to research products than to buy.
    Typical search sessions are complex and dynamic, involve a continual refinement of search
     terms, and tend to move from general to more specific search queries

    Your position in SERPs can have a dramatic impact on the number of people who visit your site.
     The first 3 results are particularly important. Users are reluctant to scroll down, and are less
     likely to consult later pages of search results.
    Your site is more likely to appear relevant to a searcher if their exact search key phrases appear
     in the title and description areas of your listing in SERPs.
    Alongside perceived relevance, product information, reviews, prices and price comparisons,
     online discounts and brand recognition can encourage Internet users to visit, stay on, and buy
     from your site.

4.3 Conclusion

By now, you should be able to see how a basic understanding of the average searcher can assist you in
optimizing and designing your website as well as marketing your product or services. Although the data
presented above cannot and should not be taken as a fully accurate reflection of search habits, as it is
based on a fairly small sample of search engine users, it could be taken as indicative of wider or more
general search trends. In this respect (and in the absence of a wider survey), this data can be considered
a provisional model of how searchers search, one that possibly matches your own experience of using
search engines, and one that can be employed to gain ideas about the most effective manner in which to
optimize websites. It can, for example, underline the importance of employing key phrases in key areas
of your web pages such as Title tags, Meta Description tags, headings and page copy.

Of course, you cannot employ key phrases in your page titles and descriptions until you know the
keywords and terms Internet users are most likely to enter into search engines when searching for your
product or services. In the next unit we will show you how to research keywords for your web pages.
This takes us from outlining the basic background knowledge necessary to understanding search engine
optimization to the crucial research phase of the SEO process.


What do you understand by the following terms?

       SERPs
       Organic Listings
       Sponsored Links
       PPC
       Conversion

Chapter 6.
Keyword Research

This unit introduces you to keywords and keyword research. Keyword research is a fundamental part of
the search engine optimization process. After completing this unit, you should be able to:

       Understand how keywords work
       Understand the basics of keyword research and competition analysis
       Be able to compile a list of possible keywords for your own products or services
       Understand the basics of integrating keywords into your web pages

This unit assumes (but does not demand) that you have completed the previous units in this course and
that you are comfortable with the concept of how search engines index pages and rank them for relevance.

5.1 What is a keyword?

A keyword is the term that people type into search engines when looking for your product or
services. Consequently, these are the phrases that you want your web pages to rank for in search engine results.

Keywords are usually strung together to form larger keyword phrases, or key phrases, of up to 4 or 5
words. When we refer to keywords, we usually mean phrases of this kind:

e.g. 'search engine optimization'

Single keywords are either too general or highly competitive. If you type one word into a major search
engine, it might return millions of pages. By using a combination of keywords that are relevant and
specific to the product you are looking for, you lessen the number of results, and are more likely to
locate what you are looking for. Search engine users are aware of this, and tend to use phrases when
looking for something on the web.

e.g. If you are looking to buy a used car in the UK, entering the term 'cars' into Google will return too
many results that are not relevant to what you are looking for. If, on the other hand, you use the phrase
'used cars uk', you will return fewer results and more relevant pages.

5.2 Why keywords are important

In order to succeed in online marketing and gain a higher ranking for your product, it is essential that
you choose the right keywords for your products and services. By doing this, you

    Allow customers who are looking for your product to find your web pages in search engine results.
    Bring targeted traffic to your site.
    By selecting keywords that are both relevant and specific to your product, you are more likely to
     convert visits into sales, or attract the right kind of audience.

If, on the other hand, you target the wrong keywords for your product, you run the risk of:

    Not being found in search engine results when people search for your product.
    Bringing the 'wrong' kind of traffic to your site. By 'wrong', I mean search engine users who are
     not actually looking for your product. This is not likely to turn visits into sales or encourage
     repeat visits.

5.3 Researching Keywords

Keyword research involves determining the correct keywords for your product or services. These are the
words and phrases people are most likely to type into search engines when searching for that product.

5.3.1 When to research keywords?

Ideally, this should be done before you design your website. This is because the correct keywords have
to be deployed in the right places of your web pages in order to boost the relevancy of those pages for
the desired keywords. Your content, navigation and link structure all have to be built around the proper
keywords for your product.

If you design your site prior to working out a keyword strategy, your pages will eventually have to
undergo extensive redesign if you want them to rank higher in search engine results. The same goes if
you are planning to optimize an existing site.

It is not impossible to optimize your site for keyword relevancy after it has been constructed. After all,
this is one of the main tasks that Search Engine Optimization companies undertake when taking on
clients with existing sites. However, this process can be labor-intensive, costly, and may involve
undoing a lot of the work already done by your web-designers.

5.3.2 Devising a Keyword Strategy

Your keyword strategy involves finding the right keywords for the individual pages on your website and
taking account of the competitiveness of those keywords. We will look at keyword competitiveness
analysis in the next section. For now we will focus on beginning to find the right keywords for your product.

In order to determine the correct keywords for your products:

    Find terms that people are likely to type into search engines when looking for your product
    Use a keyword research tool
    Check the keywords and key phrases used by leading competitors‟ websites

Finding the keywords people use. To find the terms that people actually use, ask yourself what words
you would type into Google to find your product. Also ask friends and colleagues what terms they
would use when searching for your product online. This will allow you to begin compiling a list of
possible keywords for your web pages.

5.3.3 Keyword research tools

Keyword research tools are another way to research keywords for your site. Because they access
information on the search terms that people actually use, keyword tools can be an invaluable resource
when researching key phrases.

Some keyword research tools are free, while others are commercially available. Try typing 'keyword
research tool' or 'keyword suggestion tool' into Google for a list of available tools.

One of the more popular keyword research tools out there is Wordtracker. Wordtracker works with a
database of the search terms that people have actually used in search engine searches over the last 60
days. By subscribing to the service, you can search this database for key phrases and suggested
alternatives. You can also view the frequency of searches for particular terms over the last 60 days and
other data regarding the sites that use those terms.

Search Engine Optimization professionals frequently use tools like Wordtracker. The benefit of these
tools is that they give you a broader survey of the main and alternative phrases people search for than
you would gain by simply asking friends and colleagues.
While keyword research tools are an extremely useful resource, note that the results returned may not
always be one hundred percent relevant for your product. You will have to use your judgment, which
can be better informed by using keyword analysis tools in combination with the other techniques
outlined in this section of the course.


Let's put some of these principles into practice. Over the course of this unit, we will try to construct a
list of the possible keywords and phrases for your product/services, which we will add to and edit later
on. At this stage you should:

    Write down a list of the words that you would type into search engines when looking for your product.
    Ask friends, family and colleagues to do the same.
    Visit Wordtracker and sign up for their free trial. Try out some of the phrases you
     gathered above, check their popularity as search terms, and note down any suggested alternate
     search terms that you think might be relevant to your product.
    Retain this list for further reference and move on to Competition Analysis.

5.3.4 Competition Analysis

In order to employ keywords effectively, you must be aware of the sites you will be competing with for
the use of certain key phrases. Some key phrases are highly competitive, meaning that you will be less
likely to rank in the first page of search engine results for those terms even after optimizing your pages.

Your competitors may also have neglected to use a key phrase that people commonly use when
searching for a product. You might be able to exploit this 'gap' in the market by employing such a
keyword on your own pages.

Therefore, there are two things that help your keyword research:

    Competition analysis
    Competition gap research

We will return to the subject of competition in later units of the course. For now, let us look at the way
in which competition impacts upon your selection of keywords.

5.3.5 Checking Competitors' Websites

To help determine which keywords are right for your product, you should find a few websites that rank
highly for that product by searching for it in one of the leading search engines (Google, Yahoo, Bing).

Once you find some relevant sites, check the Meta information of their pages. This, as you are probably
aware, is located in the <head></head> of the document. It can be read by finding the menu in your
browser which allows you to view the 'source' code of the document (in Internet Explorer, click the
'View' menu, then 'Source' in the drop-down menu), or by saving the page locally and opening it in a
web-editing package like Dreamweaver.

Look at the Meta tag that deals with 'keywords'. If present, you will see here a list of the key phrases
for which the page is attempting to rank.

Compare the phrases in the 'keywords' Meta tag with the terms and phrases in the following areas:

    The 'Title' of the document
    The 'description' Meta tag
    The page copy itself, including headings
    The anchor text for internal links.

If the site is optimized for search engine ranking, then the same keywords or key phrases are likely to be
repeated in all these key areas of the page (we will discuss the process of making your keywords
synchronous in this manner at a later stage).
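This manual check can also be scripted. Below is a minimal sketch in Python using only the standard library's html.parser module; the sample page, its title, and its key phrases are invented purely for illustration, and a real page would be fetched rather than hard-coded:

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collects the <title> and the 'keywords'/'description' Meta tags of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.metas = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name", "").lower() in ("keywords", "description"):
            self.metas[a["name"].lower()] = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical competitor page source, for demonstration only.
page = """<html><head>
<title>Used Cars UK - Quality Second Hand Cars</title>
<meta name="keywords" content="used cars uk, second hand cars">
<meta name="description" content="Used cars for sale across the UK.">
</head><body></body></html>"""

parser = MetaExtractor()
parser.feed(page)
print(parser.title)
print(parser.metas["keywords"])
```

Comparing the extracted title against the 'keywords' and 'description' values is then a matter of simple string comparison, mirroring the by-hand method above.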

EXAMPLE 1: Here is the HTML code for part of an optimized web page. Assuming for now that this is
one of your competitors' sites, check it using the methods above and try to find the keywords that it is
attempting to rank for.

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<title>Search Engine Optimization Training Course - SEO Training Company</title>
<META name=description content="Search Engine Optimization Training Course - SEO Training Company">
<META name=keywords content="Search Engine Optimization Training Course - SEO Training Company">
<meta name="title" content="Search Engine Optimization Training Course - SEO Training Company">
<link rel=stylesheet href="includes/styles_home.css" type="text/css">
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<div id="contentWrap"><div id="main"><div id="content">
<h1>SEO Training Course - Search Engine Optimization Training Company </h1>
<p><strong>What is Search Engine Optimization</strong></p>
<p>Search Engine Optimization is a logical process involving four distinct stages</p>
<ul>
<li>Resolving technology issues.</li>
<li>Competitor and Gap Analysis. </li>
<li>Keyword Research and deployment. </li>
<li>Maintenance and Reporting. </li>
</ul>

<p>The aim is to make a website more visible to its target market via the natural listings of the major
search properties. Failing to correctly address all four stages of the SEO process will prejudice the
effectiveness of any SEO campaign.</p>
<p><em>A distinction should be made between Search Engine Optimization (SEO) and Search Engine
Marketing (SEM). Search Engine Marketing is an umbrella term that covers many disciplines including,
but not restricted to, Pay Per Click Management, Bid Management as well as Search Engine
Optimization.</em></p>
<p><strong>Search Engine Optimization Training</strong><strong>&nbsp; </strong></p>
<p>The objective of this SEO training course is to impart in the student a comprehensive understanding
of the principles and methodologies required </p>

Example 1: Search Optimization Training Web Page

By now, you should have spotted that the term 'Search Engine Optimization Training Course - SEO
Training Company' is repeated four times in the <head> of the example document. It appears in the title
tag and the META tags dealing with description, keywords, and title. The same phrase also appears in
the actual page content as the first heading (<h1>).

Now look at the page content. Notice the recurring use of the phrases 'search engine optimization',
'SEO' (an acronym for 'search engine optimization') and 'training'. This copy is therefore keyword rich.

What then are the keywords for this page? Well, we can safely assume that nobody is going to type the
whole phrase 'search engine optimization training course SEO training company' into Google! Instead,
here are some of the keywords that this site could rank for:

       Search engine optimization
       SEO
       Search engine optimization training
       SEO training
       SEO training course

Note that I have put these terms in lower case because searches are not case sensitive.

We'll return to this example again later to explain why these keywords have been chosen for this
particular page. For now, suffice to say that the keywords combine two highly competitive and
popularly searched terms ('Search Engine Optimization', 'SEO') with more refined terms that are more
relevant to our specific product ('Training', 'Course').

This is a fairly easy example in some respects, as this site has already been optimized. Now let's try
some of these methods on your competitors' sites. Note that this may be a little more difficult, as not all
sites that rank highly in search engine results have been optimized, or optimized in the same way.
However, by looking at the information in both the <head> and page content of their pages, you should
be able to gain some idea about (a) the keywords they are attempting to rank for, and (b) why they were
returned as first, second, or third in the search results for the phrase you searched for.


   1. In Google, Yahoo, and Bing, locate websites that offer the same product or services as your own.
   2. Check the keywords/key phrases they use, using the methods outlined above.
   3. Add these to the list of keywords you compiled earlier.

Once you have finished this, move on to the next section on how to edit a list of candidate keywords.

5.4.1 Editing and revising your keyword list

There are a few things to check when compiling and revising a list of possible keywords. Foremost
amongst these are:

    Linguistic factors
    Are the keywords specific enough?
    Level of competition

5.4.2 Linguistic factors

Search engines 'read' the keywords on your pages to a certain extent, but they cannot read in the same
way as adult humans can. Where we can generally understand what someone means when they spell
something incorrectly, use a synonym, or say something ambiguous, search engines are generally stuck
with the words on the page. Search engine algorithms can recognize words but they cannot truly
'understand' language or think about it in an abstract manner.

For this reason, there are linguistic factors to take into account when editing and revising your list of
keywords.

Jargon. Don't use jargon or 'specialist' terms that only people 'in the business' are likely to know
about, unless that is the market that you are specifically targeting. Instead, use terms that the wider
public is likely to use.

       Synonyms and Polysemes. Synonyms are two words that have approximately the same meaning
        within a certain context. For example, 'second hand motors' means roughly the same as 'used
        cars' in the context of automobile sales. You may want to add commonly used synonyms for
        your product (if any exist) to your list of key phrases. Polysemes, by contrast, are words with
        more than one meaning. If you use a keyword that could also refer to something entirely
        different from your product, you could direct the wrong kind of traffic to your site.
       Acronyms. Acronyms are abbreviated forms of words. If your product is commonly known by its
        initials only (e.g. 'SEO' for 'Search Engine Optimization'), you may want to add its acronym to
        your key phrases.
       Plural and singular forms. You will have to use both the plural and singular forms of words if
        people are likely to search for both (e.g. 'used car' as well as 'used cars').
       Spelling. Be aware of national differences. In Example 1 above, note that 'search engine
        optimization' uses the US spelling with a 'z', whereas the UK spelling would be 'search engine
        optimisation' with an 's'. You may have to take such differences into account if you plan to rank
        highly in both territories.

Note, however, that many major search engines now offer to correct (and 'Americanize') spelling in
search terms, so this is likely to become increasingly less important as search engines become more
sophisticated.
     Capitalization. Search engines do not carry out case-sensitive searches on your pages, so you do
      not need to have the same key phrases in both capitalized and lower-case form.
     Function words. Function words are words like 'and', 'the', and so forth. These are 'dead' words
      as far as search engines are concerned. Because they are used so often in searches, search
      engines simply ignore them. For this reason, they should not form part of your key phrases.
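The capitalization, function-word, and plural rules above can be applied mechanically when cleaning a keyword list. Here is a hypothetical sketch; the stop-word list and helper names are my own invention, not part of any standard tool:

```python
# Illustrative stop-word list; real search engines use their own, larger sets.
STOP_WORDS = {"and", "the", "a", "an", "of", "for", "in", "to"}

def normalize_phrase(phrase):
    """Lower-cases a key phrase and strips function words."""
    words = [w for w in phrase.lower().split() if w not in STOP_WORDS]
    return " ".join(words)

def expand_variants(phrase):
    """Adds a naive plural variant of the final word (e.g. 'used car' -> 'used cars')."""
    variants = {phrase}
    if not phrase.endswith("s"):
        variants.add(phrase + "s")
    return variants

print(normalize_phrase("Cars for the UK"))   # -> "cars uk"
print(sorted(expand_variants("used car")))   # -> ['used car', 'used cars']
```

A real keyword list would also need the synonym and spelling-variant checks described above, which require human judgment rather than simple string rules.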

5.4.3 Specificity

When researching keywords, you should ensure that the keywords you eventually select are specific
enough for your product. The more specific your key phrases are, the more likely you are to bring
targeted traffic to your site. If, on the other hand, your keywords are too vague or general, you may end
up attracting traffic that is not actually looking for your product or services.

There are ways of making your key phrase more specific, including:

    Refining your key phrase by adding another term.
    Adding location to the key phrase

Earlier on, we used the example of 'cars' as a keyword. Now, if you plan to sell used cars online, this
keyword is too vague and tells search engines nothing specific about your product. You can refine your
key phrase by adding 'used' to it. Other words can of course be added, such as synonyms like 'second
hand' or even the makes and models of cars that you sell.

However, if you trade within India, the phrase 'used cars' in itself is unlikely to bring the right
potential customers to your site. This is where location becomes useful in making your key phrases more
specific. You can refine your key phrase by adding 'India' to it. If you trade on a smaller level, you can
refine your key phrase even further by making the location a town or an area.

Remember, the larger the key phrase, the more specific the results returned in search engines.

5.4.4 Level of Competition

We showed you how to check the keywords on competitors' sites above. Some of the key phrases you
draw from this source (and some of the ones you come up with on your own) might be highly
competitive.

Highly competitive keywords return numerous pages of search engine results and are difficult to rank for.
For example, a competitive phrase like 'mobile phones' returns a staggering 30,000,000 or so results in
Google! Now, if you are an Indian trader who sells mobile phone accessories, you are not likely to rank
highly for this key phrase or gain the desired visibility through use of this phrase alone.

There are, however, ways of reducing the competition and ensuring some kind of visibility in search
engine results:

       Refine your keywords in the manner outlined above
       Exploit a gap in search engine marketing

If one of your key phrases is highly competitive, you may want to refine it by making it more specific to
your actual product or by 'localizing' it in the way outlined above.

For example, the term 'search engine optimization' is a highly competitive term, and will return many
pages of results from competitors in the field. To this we could add 'Search Engine Optimization
Scotland' to localize our services. In our case we have refined our key phrase to include 'Training', as it
is more specific to the actual service we provide. In this way we have combined a highly competitive
term with a less competitive one.

Another way to fight off keyword competition is to find a key phrase that people commonly use to
search for your product but that none of your leading competitors are using in their web pages.


By now you should have a list of possible keywords. Now edit and revise your list according to the
principles outlined above:

    Revise your list according to the linguistic factors outlined above, adding or removing keywords
     where necessary.
    Revise your list to make keywords specific to your product.
    Check the competitiveness of key phrases by searching for them on Google and by looking at the
     PageRank of leading sites for those terms, and edit your list accordingly.

5.5 How to use keywords in your web pages

By now you should have a list of potential keywords for your web site. We will now begin looking at
how to integrate them into individual pages on that site.

5.5.1 Keyword placement

You should try to place the key phrases you wish a page to rank for in the following areas of that page:

       Title
       Meta information, i.e. keywords and description (NOT case sensitive)
       Page heading
       Page content,
       Anchor text of links.

Look back at Example 1, our 'Search Engine Optimization Training Web Page'. In this example we
saw how the exact phrase 'Search Engine Optimization Training Course - SEO Training Company' was
repeated word for word in key areas of the page: the title, the Meta information, and the first heading.
This page is therefore synchronous: the key phrases in one crucial part of the page are synchronized
with, or mirror, the key phrases used in other parts of the page. This is boosted by the fact that the page
content is keyword rich, and repeats parts of our key phrases ('Search Engine Optimization', 'SEO',
'Training', etc.) throughout the first few paragraphs.

One thing this example does not show is that keywords can be used in the anchor text of links. Because
the anchor text in links passes relevancy on to the page being linked to, key phrases for certain pages
should also be used in the anchor text of links to those pages. Internal linking and anchor text are
covered more fully in later units on this course.

In particular, pay attention to the following 'hotspots':

   1. Page Heading. Your page heading should contain your main key phrases and mirror the exact
      phrase used in the page title and Meta information
   2. First Paragraph. Although your entire page copy should be keyword rich, try to place all your
      main key phrases in the first paragraph or two of your page copy, as it will be one of the first
      things to be read by search engines and visitors.

5.5.2 Keyword Spamming

A word of warning: the wrong deployment of keywords on your web pages can cause your site to be
penalized by search engines.

'Spamming' in this context means an attempt to trick search engines into believing that your page is
more relevant for a certain key phrase - usually by repeating that keyword over and over again.

In the early days of search engines, people tried to fool the search engines into believing that their pages
were more relevant for certain terms than they actually were. This was often done by cramming loads of
keywords into the Meta keywords tag of the page. Often these keywords were not strictly related to the
content of the site.

Search engines have now built in measures to counter such crude attempts at deception. Most search
engines will not rate the relevance of pages on the basis of on-page factors alone.

The general rule of thumb is that your page content should be keyword rich, but not to the extent that
you simply repeat the same phrase on the page a hundred times.
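To make the 'keyword rich but not over-repeated' rule of thumb concrete, here is a small illustrative Python function. The density formula (share of page words taken up by occurrences of the phrase) is one common convention, not an official search engine metric, and the sample copy is invented:

```python
import re

def keyword_density(copy, phrase):
    """Percentage of the page's words accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9]+", copy.lower())
    target = phrase.lower().split()
    n = len(target)
    # Count exact, in-order occurrences of the phrase in the word stream.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits * n / len(words) if words else 0.0

copy = "SEO training helps. Our SEO training course covers SEO basics."
print(keyword_density(copy, "SEO training"))  # -> 40.0
```

A page where one phrase accounts for a very large share of the copy is exactly the kind of repetition that reads as spam to both visitors and search engines.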

Besides running the risk of being penalized by search engines, keyword spam can put customers and
visitors off. Internet users are becoming increasingly sophisticated and generally know when they are
being 'spammed'. Instead of simply repeating a single phrase again and again, give visitors the
information they want. This is more likely to create trust and encourage repeat visits.

SEO copywriting is not about spamming. Rather, it involves finding a balance between keyword rich
content and the normal demands of internet copy, such as making your page content clear, informative,
'scan-able', and professional. Your content should contain the keywords you are targeting, but not at the
expense of plain English. Try to integrate your keywords into your page copy in a way that still allows
that copy to 'flow' and impart information in the normal manner.

5.5.3 Where and when to use your keywords

When you research and revise your list of potential keywords, you will probably end up with a fairly
long list. Here are some suggestions on how to deploy them.

Firstly, don‟t try to make your homepage rank for all your potential keywords (especially if you have
numerous potential key phrases).

If you look back to Example 1, you can see how that page targets four or so main key phrases. Note also
how even these few phrases allow for various combinations of keywords. This allows potential
relevance for searches like 'search engine optimization training', 'SEO training', 'SEO course', and so
forth. Try to integrate your key phrases in a manner that allows for these kinds of searcher
combinations.
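One hypothetical way to sanity-check which searcher combinations a set of key phrases can serve is to treat a query as 'covered' when every word of the query appears in the page's target phrase. This is a rough proxy for thinking about combinations, not a model of how ranking actually works:

```python
def covered(query, target_phrase):
    """True if every word of the query appears somewhere in the target phrase."""
    phrase_words = set(target_phrase.lower().split())
    return all(w in phrase_words for w in query.lower().split())

# The key-phrase string from Example 1, flattened to words.
target = "search engine optimization training course seo training company"

for q in ("seo training", "seo course", "search engine optimization training", "used cars"):
    print(q, covered(q, target))
```

Running this shows the Example 1 phrase covering 'seo training', 'seo course', and 'search engine optimization training', but not an unrelated query like 'used cars'.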

There are good reasons why you should not use all your keywords on your main page. Doing so will
lessen the keyword density of that page, as you will have to fit more key phrases into your page content.
Focus on the main key phrases for your product on your main page, and reserve other keywords for
other pages.
Later in this course we'll talk about reinforcing keywords and focusing PageRank on different pages
within your website. For now, we'll just point out that you should aim to get individual pages ranking
for different key phrases. Remember that search engines rank individual pages on your site rather than
your site as a whole.

A good rule is to place the main key phrases for your product on your main page and other less
important key phrases on your other pages. This can allow those pages to rank for a less competitive
term or a key phrase for a related or subsidiary product or service.

5.6 Conclusion

As in most areas of Search Engine Optimization, there is no magic formula that will automatically give
you the winning keywords for your product or services, and finding the right keywords takes time,
effort, and a bit of trial and error. However, by following the principles outlined above, you arm yourself
with the ability to research keywords in an effective manner and deploy them in a way that exploits their
full potential.
REFLECT: What do you understand by the following terms?

       Keywords and Key phrases
       Keyword Relevancy
       Keyword Competition Analysis
       Keyword Placement
       Keyword Density
       Spamming

Chapter 7.
A Brief Introduction to Competitor Research and Analysis

This unit covers competitive research and analysis. Competitive analysis is essential if you wish to carry
out effective search engine optimization of your web pages.

After completing this unit, you should be able to:

    Locate competitors' websites
    Determine the level of online competition for your product or services and whether there is a
     viable online market for your product.
    Determine whether competition is too fierce for a given key phrase (or not), and whether you
     have to revise your keyword strategy and target less competitive key phrases.
    Determine the quantity and quality of external links you need for your links campaign in order to
     compete with other leading sites.

This unit assumes (but does not demand) that you have read the last unit of this course on keyword
research and that you understand how to select and refine a list of keywords in response to search engine
competition.
8.1 Competition

In this context, the competitors you face are the people who rank highly in search engine results for the
same or similar products or services that you offer.

8.1.1 Why Competition Research is important

Competition analysis is an essential part of the search engine optimization process. In order to succeed
in online marketing and gain a higher ranking for your product in search engine results, you need to
assess the state of the online market for your product and gain some idea of who your leading
competitors are. By doing this, you

    Get a better idea of the market for, and the viability of, your product or services.
    Gain a better idea of the audience you should be targeting
    Gain a sense of the level of optimization required for your site.

If, on the other hand, you fail to assess your market correctly, you run the risk of:

    Being outstripped by your competitors in terms of search engine rankings
    Attempting to sell a product that is not viable
    Applying the wrong level of optimization to your site

8.2 Competition Analysis

In the last unit of the course, we looked at how competition impacts upon your choice of keywords.
Competition analysis can be considered a further essential stage of research that augments and
complements your keyword research. As well as giving you a better idea of the keywords you should be
targeting, analysis of your competitors' web pages gives you a much better idea about the condition of
the market you are entering and the level of optimization required for your own site.

8.2.1 When to Research Competition?

As in keyword research, the best time to analyze competition is before you actually design your web
site. This is because you will have to make decisions about site content and the level of optimization on
your pages in direct response to competitors' web pages. The main stages of this analysis are:

    Finding competitors' sites
    Refining your search
    Assessing PageRank
    Assessing level of optimization

8.3 Finding and Analyzing Competitors’ Web Pages

The most basic way of finding competitors' sites is to enter one of your targeted key phrases into one of
the major search engines. This will show you the leading sites that already rank highly for the key
phrases you are targeting.

The scope of any competitive analysis will vary according to industry and your specific goals. That said,
most online competitive analysis has some common themes. Below are a few that nearly always make
the competitive analysis list:

8.3.1 Search Engine Visibility:

Type your main keywords into Google, Yahoo, Bing, Ask and AOL. This gives you a good idea of who
your main competitors are, and comparing the number of top 30 rankings can give you a sense of your
keyword market share.

Note: At the time of writing, Yahoo and Bing show different results based on their own proprietary
ranking algorithms. It should be noted that at some point during 2010 Yahoo will switch to showing
results from Bing.
8.3.2 Domain Age:

The age of a domain is one of many factors that can skew the way search engines determine
authoritativeness and trust. However, there is much speculation in the SEO community that domain age
is calculated from the date the domain was first indexed rather than the date it was registered. In
essence, the longer a domain has been indexed and remained active without interruption, the more value
it receives from search engines.

There are a few online tools that will check the age of your competitors' domains; however, this is
calculated from when the domain was registered, not when it was first indexed.

Another resource that can shed some light on the age of a domain is the Wayback Machine. The
Wayback Machine lets you browse the content of billions of web pages as they appeared as far back as
1996. As well as historical snapshots that let you see how a website developed over time, it will also
give you a good idea of the date the domain was first indexed.

8.3.3 Site Traffic:

It is impossible to be precise when comparing site traffic unless you have direct access to your
competitors' analytics; however, services such as Compete.com provide some valuable insights into the
kind of online traffic competitors are receiving. Compete.com offers both paid and free services, but the
free service only offers limited data.

8.3.4 Inbound Link Quality:

Using Raven Tools you can determine the number and quality of back-links for a given domain or page
within that domain.

8.3.5 Anchor Text: (A Holistic Viewpoint)

The quality of an inbound link is primarily based on anchor text, i.e. does the anchor text contain the
target keywords, synonyms or semantically related terms of the page it points to. One should also take
into account the linking page as a whole and look at the page title, URL, inbound link quality, the
context of the page and the theme of the linking website as a whole.

Contextual links, that is links that are positioned within the copy of the page, carry more weight than
footer links. Similarly, links within a blog post carry much more weight than blog comments or forum
signatures.

There are a few free online tools that let you examine the anchor text of competitors' inbound links.
Although free, this type of tool is incapable of scouring the web and finding every link that points to
your competitors' pages, and therefore delivers only a partial picture of what is happening at the anchor
text level. However, for websites competing in markets or niches where competition is low this may be
all that is needed.

If you require more in-depth link and anchor text analysis, there are commercial services often used by
SEO professionals.

8.3.6 Meta Tags:

Although search engines no longer consider meta tags important, they are still useful as far as
competitive analysis is concerned. For example, many sites incorporate site search as a navigation aid,
and it is not uncommon for these tools to make use of the meta keywords tag for their own internal
search purposes.

Meta descriptions, when considered along with the page titles, can help you determine how competing
sites are attempting to differentiate their services, products and offers. The page title and page
description couplet can also bring to light effective calls to action designed to encourage click-through
from search engine results pages.

8.3.7 Index Saturation:

Index saturation refers to the number of web pages that a search engine has indexed from a given
website. Google provides the site: advanced operator, which lets you see how many of your pages, and
those of your competitors, have been indexed:

    (includes every page from the site and all sub-domains)
    (restricts results to the www sub-domain of the site)
    (restricts results to the news sub-domain of the site)

When checking competitor sites it is important to specify the correct URL; querying the www
sub-domain when you are interested in the whole domain, for example, will return the wrong results.

Note: Search engines treat the main domain and each sub-domain as different websites.
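To make the operator concrete, here is a sketch of the three query forms using the placeholder domain example.com (substitute your own or a competitor's domain):

```
site:example.com         every indexed page from the domain and all sub-domains
site:www.example.com     restricts results to the www sub-domain
site:news.example.com    restricts results to the news sub-domain
```

The number of results reported at the top of the results page is the index saturation figure for that query.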

If you are looking at a competitor who has thousands of pages indexed and you have only ten or so,
search engines are going to see the larger site as the better resource to send their users to. If the
competitor site uses your target keyword in the main navigation of every page, the page that those links
point to is already receiving a big boost as far as inbound links and relevant anchor text is concerned.

8.3.8 Analyzing Paid Search Campaigns:

There is a crop of online competitive intelligence applications that provide valuable data about
competitors' PPC keywords, ad creatives, daily budgets, bid prices, clicks per day, and other interesting
PPC facts and figures.

A word of caution: none of these tools reports ad spend accurately. That said, gaining access to PPC
data and information regarding the niches and keywords that your competitors are targeting (or testing)
is extremely useful.

8.4 Conclusion

Search engine optimization is not just about paying close attention to your own web pages; it is also about
understanding them in relation to other websites that are similar to yours in terms of the products and services
they provide. In order to optimize your site effectively, you have to understand that a high position in search
engine results is relative to the position of other websites, some of which may currently have better optimization,
more links, or higher PageRank than your own site. Your position in search engine results is also dependent on
other websites, which means that as well as on-page factors such as your HTML code, you also have to take
account of other off-page factors such as the links pointing to your site.

Chapter 8.
A Guide to Google PageRank
If you've spent time investigating search engine optimization then you will have noticed that the topic
of Google PageRank pops up with the predictable regularity of a well-engineered Swiss watch. So, in
line with tradition, this article provides a brief overview of PageRank: what it is, what it isn't, and some
useful guidelines to help ensure PageRank is maximized and flows throughout your website without
being wasted.

What Is PageRank?

PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an
indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a
vote, by page A, for page B. But Google looks at more than the sheer volume of votes, or links, a page
receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves
"important" weigh more heavily and help to make other pages "important." (Source: Google)

PageRank, or PR, is a measure of importance Google applies to every web page maintained in its index.
The higher the PageRank, the more important a page is. PageRank is calculated in the absence of a
search query and, contrary to what you might read elsewhere, has very little influence on search engine
results.

Importance, as far as PageRank is concerned, should not be confused with relevance, otherwise high
PageRank pages like Google's index page would top search engine results pages for every search
regardless of the original query. Nor is PageRank influenced by on-page elements and tags such as page
titles, heading tags, length of copy, semantics or anchor text.

PageRank values are completely governed by inbound and outbound links from and to pages within the
same domain and those of external domains.

Why Is PageRank Important?

PageRank on its own, like many other metrics that influence rankings, offers only minor benefits. To
gain top search engine rankings many different metrics have to be considered along with PageRank.
Because PageRank has nothing to do with relevancy it sort of stands out on its own. The following
calculation illustrates how PageRank influences rankings:

Ranking Score = Relevancy Score * PageRank Score

Even at first glance it is pretty easy to see how low relevancy high PR pages could outrank highly
relevant low PR pages. This would indeed lower the quality of Google‟s results so to combat this the
calculation probably looks more like this.

Ranking Score = Relevancy Score * (PageRank Score/100)

Even with a dampening factor applied when two pages with identical relevancy are scored the one with
the highest PageRank will come out on top and outrank the other in Google‟s SERPs.
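The interplay above can be sketched in a few lines of Python. This is purely illustrative: the function name, the 0-100 relevancy scale and the divisor of 100 are assumptions taken from the text, not Google's actual (unpublished) formula.

```python
def ranking_score(relevancy, pagerank):
    """Hypothetical ranking score: relevancy (0-100) scaled by
    a dampened PageRank value (0-10), as the text describes."""
    return relevancy * (pagerank / 100)

# A highly relevant PR2 page still beats a barely relevant PR8 page...
assert ranking_score(90, 2) > ranking_score(20, 8)
# ...but with identical relevancy, the higher-PR page comes out on top.
assert ranking_score(70, 6) > ranking_score(70, 4)
```

The demonstration lines show the behaviour the text argues for: dampening stops raw PageRank from swamping relevancy, while PageRank still breaks ties between equally relevant pages.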

PageRank forms a major part of link analysis. Link analysis is used by search engines to detect link
spam, hub sites, authority sites and neighborhoods; it is therefore important to ensure its effective
distribution through your website. Moreover, sculpting the way PR flows through a website can
encourage Google to spider your most important or regularly changing pages more frequently.

How Do You Check PageRank?

The simplest way to compare the relative PageRank of your pages with related pages or those of
competitors is to install the Google Toolbar. Once installed, the toolbar will show PageRank from the
Google data center supplying your results; however, the server Google chooses to deliver your search
results can change depending on load and geographic location. If you wish to inspect PageRank over
multiple data centers there are a few online tools that will allow you to do this.

There is much speculation that the option to view PageRank in the toolbar will be removed in the near
future. Within Google's Webmaster Tools, the PageRank distribution feature found in the crawl stats
section under "Diagnostics" has already gone. Google had this to say about its removal:

We've been telling people for a long time that they shouldn't focus on PageRank so much; many site
owners seem to think it's the most important metric for them to track, which is simply not true. We
removed it because we felt it was silly to tell people not to think about it, but then to show them the data,
implying that they should look at it. :-)

How Is PageRank Calculated?

The PageRank of any webpage is based on a calculation that takes into account the number of pages
that link to it and the PageRank of those pages. PageRank is calculated on a page-by-page basis,
therefore each page on the web has its own value which, when viewed in the Google Toolbar, can be
anything from 0 to 10.

Although Google Toolbar PageRank is only updated every 3 months or so, actual PageRank is
dynamic and is recalculated every time Google discovers a new link or one is dropped. Google
constantly spiders the web by following the links that connect web pages, and by doing so not only
maps out the web on a page-by-page basis but also how those pages are connected. In a very basic
sense, the most important pages (those with the highest number of links pointing to them) are awarded
the highest PageRank.

There is, however, more to PageRank than simply counting the number of links that point to a page,
although this aspect does reflect the democratic nature of the system where a link from Page A to Page
B counts as a vote for Page B. One also has to take into account the PageRank of the page casting the
vote and the number of links on that page. So a PR2 page with 5 links is going to channel more
PageRank through each link than a PR3 page with 10 links on it. In short, the amount of PR passed
through each link is based on the PageRank of the page divided by the number of links on the page.

If you were to add up all the PageRank available to a website prior to any inbound links being placed it
would be equal to the number of pages in the site * 1. This amount is decreased with every outbound
link to another website; remember PageRank is channeled through links so every outbound link draws
PageRank away from a site. Conversely, this number can be increased by gaining inbound links from
other sites.

Finally there is a damping factor; this part of the PageRank algorithm represents the likelihood that an
imaginary surfer engaged in randomly clicking links will eventually stop. The damping factor adjusts
derived PageRank values downward.
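The mechanics described above, PR divided among a page's outbound links plus a damping factor, can be sketched as a small Python power iteration. This is a simplified model of the published PageRank idea, not Google's production algorithm; the function name and the per-page form of the damping term are assumptions for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Sketch of the classic PageRank iteration.
    links maps each page to the list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    pr = {p: 1.0 for p in pages}  # total PR starts at (number of pages) * 1
    for _ in range(iterations):
        new = {}
        for p in pages:
            # each linking page passes its own PR divided by its link count
            share = sum(pr[q] / len(ts) for q, ts in links.items() if p in ts)
            new[p] = (1 - damping) + damping * share
        pr = new
    return pr

# Hypothetical three-page site: A and B link to each other, C links to A.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["A"]})
# A receives the most votes; C, with no inbound links, settles at the floor.
assert ranks["A"] > ranks["B"] > ranks["C"]
```

Note how C ends up with the minimum value (1 - damping): a page with no inbound links still has some PageRank, exactly as the "every page starts with a value of 1" description implies before links redistribute it.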

The PageRank calculation is rather complex, so I do not intend to go into the mathematics that lie
behind the system in any depth, nor do I think it is required knowledge as far as understanding the main
points of this document is concerned. I will, however, include links at the end of this document for all
the sadists with an interest in such things.

Dispelling a Long Standing PageRank Myth

There are many descriptions of PageRank online that suggest PR values can be increased through the
use of Meta tags and optimizing specific HTML tags and elements. These steps may well improve
rankings by improving relevancy but have nothing to do with PageRank.

The amount of PageRank available within the Google index is equal to the number of pages in the
index * 1; that is, billions of PageRank points to be distributed amongst billions of pages. The number is
dynamic in the sense that it constantly changes as new pages are added to the index and others dropped.
PageRank is not contextual in any way and cannot be increased by adding a few more keywords to your
pages. Nor is it influenced by anchor text; in fact a graphic link without any anchor text will pass the
same PR value as a link from the same page with well-targeted anchor text. The latter would be the
better link, however, because it passes both PageRank and keyword relevancy to the target page.

URL Conventions

Although variations of a URL such as the bare domain, the www version, and the same addresses
followed by index.html all refer to the same page, there is a danger that search engines may see them as
four different pages. This could result in PageRank being shared across these URLs and appearing
much lower than it actually is.

This can come about by using relative paths within the links that connect your pages. When someone
links to your home page from another website, they will use the full, absolute URL including the
domain. A link from a page on the same domain using a relative path, however, points at the page
without naming the domain at all. This could result in the PageRank from external links being
channeled to the absolute URL and PageRank from internal links being channeled to the relative URL.
The end result is that PageRank is lowered on both URLs.

To get round this problem and help ensure that PageRank is correctly channeled throughout your entire
site use absolute URLs on every link.
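As a sketch of the difference, using example.com as a stand-in for your own domain:

```html
<!-- Relative link: resolves against whatever URL the current page was
     reached on, so www/non-www and index.html variants can multiply -->
<a href="index.html">Home</a>

<!-- Absolute link: always channels PageRank to one canonical URL -->
<a href="http://example.com/index.html">Home</a>
```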


Search engines consider the www and non-www versions of a domain to be different websites. Try
visiting a well-known site's homepage using both the www and the non-www version of the URL.

Both URLs should take you to the site, but pay close attention to what happens in the address bar: many
well-run sites automatically redirect one version to the other so that only a single URL is ever used.

Now try it on your own website. It doesn't really matter whether you end up on the www version or the
non-www version; however, if your site can be accessed by both you have a problem. If your website
has inbound links using a mix of www and non-www URLs, you are effectively splitting the benefits of
PageRank and the relevancy of any anchor text.

Using a 301 redirect, which is essentially a "permanent" redirect, you can elect one URL and effectively
consolidate all of your link popularity to either the www or non-www version of your site. This
consolidation will ensure that every link counts towards the same site and increase your website's
chances of obtaining and maintaining top rankings.
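On an Apache server, one common way to set up such a redirect is an .htaccess rule. This is a hedged sketch, assuming mod_rewrite is enabled and using example.com as a placeholder domain; here the non-www version is elected as canonical:

```apache
RewriteEngine On
# Permanently redirect www.example.com/anything to example.com/anything
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

Swap the condition and target to elect the www version instead; the important thing is to pick one and redirect the other with a 301.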

Broken Links

Search engine robots crawl the web by following links. PageRank flows through links. Contextual
relevancy is defined by analyzing the anchor text associated to links. Essentially, Links map out the
structure of the entire web, identify neighborhoods and communities, define hubs and authorities, and
infer the popularity and relevancy of each and every page on the web.

Search engines want to send users to current, well-maintained information. A site that contains lots of
broken links is an indication that it is no longer maintained, and this can result in lower search engine
rankings. Broken links also starve all your pages of PageRank by channeling part of it towards pages
that don't exist.

There are lots of reasons why broken links come into existence: moving pages, renaming pages and
deleting outdated pages. When renaming or moving a page, remember to update all the links that point
to it. Also put a 301 redirect in place so that the new URL picks up the benefit of any inbound links
pointing to the old URL from external sites.
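For a single renamed page the same idea applies; a minimal Apache sketch (the file names and domain are hypothetical):

```apache
# 301 = permanent: inbound PageRank follows the link to the new URL
Redirect 301 /old-page.html http://example.com/new-page.html
```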

PageRank Devalued

PageRank does affect rankings, but not to the same extent that it once did. At one time, simply
exchanging links with similarly themed sites would quickly build PR and rankings would improve
considerably. Google took a stand against this SEO technique in two ways.

Firstly, the reciprocal links themselves were devalued to such an extent that they are now virtually
worthless. Reciprocal linking was for some time the single most effective way to improve search
engine rankings; unfortunately, Google viewed this practice as a means to game the PageRank
algorithm. Webmasters would simply email each other offering reciprocal link exchanges for the sole
purpose of improving rankings rather than exercising editorial control by linking to pages that would
enrich their visitors' experience.

Secondly, as a further consequence, PageRank itself was devalued and no longer figures as an
important factor in ranking web pages. PageRank was, and still is, a great way to identify quality pages;
however, considering PR a valuable asset when analyzing your backlinks or sourcing new links is of
little consequence where rankings are concerned.


The fact is, a single PR7 link from a completely unrelated site could promote a page to PR5 or even
PR6, although it is doubtful whether such a link would have any significant effect on rankings. There
is, however, a perception amongst web users that high PR is indicative of quality. So, although such a
link might not improve rankings per se, it can ease trust issues and make conversions easier.

Fake Rank

It is possible to fool a browser into showing falsified PageRank; this is a known practice amongst
unscrupulous link sellers who base their prices for the most part on Google's Toolbar PageRank. One
method of doing this is to wait for Googlebot to visit and sneakily redirect it to a high-PR page. The end
result is that the link seller's page adopts the PR of the page it was redirected to, and he can sell a
completely worthless link for a few hundred bucks or more. A simple way to detect this is to look at
Google's cached version of the page. If it's not the same website, something is afoot and you should be
very wary.

N.B. Not all link sellers are unscrupulous.

Chapter 9.
On-page Optimizations
It is now time to start optimizing your web pages. By this stage you should have a good idea about the
keywords you wish your site to rank for and some knowledge of who your main competitors are. This
unit will show you how to use this information when constructing web pages.

By the end of this unit, you should be able to:

    Optimize the page title of HTML pages
    Optimize the Meta tags of HTML pages
    Optimize your page headings
    Write keyword focused page content and copy

This unit assumes that you have read the last two units of the course, and that you know how to
construct a basic HTML web page.

10.1 On-page factors

As we explained in earlier units of the course, on-page factors relate to the code and content that actually
appears on your web pages. In SEO, on-page factors are usually distinguished from off-page factors
such as external links. Unlike external links, on-page factors are largely under your control as a
webmaster, web designer or site owner. This means that they are fairly easy to manipulate in order to
improve your search engine visibility.

In this section we will be looking at how to optimize such basic things as:

       Page titles
       Meta tags
       Headings
       Page copy
       The img alt attribute

In simple terms, the key to optimizing your web pages is to focus all of these areas around the main
keywords for your product or services - assuming that you have followed the instructions set out in
previous parts of this course and have already researched the key phrases that your website should be
targeting.

Note: focusing your pages around the wrong keywords can potentially have a detrimental effect on
search engine visibility, so make sure that you have researched your market and competitors prior to
attempting to optimize your site.

Before we cover the above topics, let's have a basic look at the areas of an HTML web page that concern us:

       Title tags
       Meta tags dealing with 'keywords' and 'description'
       Internal links (including the anchor text used in internal links)
       Headings (particularly the first heading <h1></h1>)
       Page copy

10.1.1 Page Title

The Page title is the title that appears at the top of your browser when you visit a web page. It is the first
thing that a search engine sees and is therefore a particularly important part of the page from an SEO
point of view.

It is one of the strange facts of Internet life that people often forget to include a title in their web pages.
No doubt you have come across one of the countless pages on the Internet that simply read 'Untitled
Document' when you view them in your browser. By failing to give your web pages a title you miss a
golden opportunity to communicate what they are about to search engines and Internet users.

The page title appears in title tags (<title></title>) in the header section of your HTML web page as follows:

<title>Put your page title here</title>

It is particularly important that the page title contains the main keywords for the product or service that
you offer on that page.

For example, the title of this page looks like this:

<title>On Page SEO Part 1, SEO Training Basics</title>

This title will appear in search engine results as an active link to the page:

Besides letting the user know what the page is about – which is an important function of the page title
and should not be neglected – the title tag also signals to the search engine that our homepage is relevant
for the following keywords:

       On
       Page
       SEO
       Part
       1
       SEO
       Training
       Basics

Note how these words can be combined in numerous ways to make more complex key phrases, e.g.:

    SEO Training
    On page SEO
    On page SEO Training
    Basic SEO Training

This is a good principle to follow as it will help your page appear relevant for a variety of search queries.

To optimize your page title, take the following steps:

    Follow the instructions for researching keywords outlined in the 'Keywords' section of the course.
    Make a list of keywords for the page that you are optimizing.
    Write a title for the page that contains all your keywords.
    Insert this title between the title tags of your HTML page.

Note: It is generally a good idea to limit the number of keywords/key phrases you wish to make a page
rank for to about a handful (ideally 3 or 4 closely related terms). Keep pages highly focused and use the
most relevant and most searched terms for the product first. If you are targeting more than 4
keywords/key phrases, consider making another page to target the additional terms so that each page is
focused and highly relevant.


We will now begin the practical optimization of web pages by looking at your homepage.

If you already have a homepage, it would be a good idea to make a copy of your site and save it locally
somewhere on your hard drive before you make any changes. Do not upload any changes made to your
site to your hosting server at this stage, as you will be making further alterations to your homepage in
later tasks.

If you do not have a homepage at this stage, construct a basic web page for your product using a HTML
editor like Dreamweaver or a basic text editor like Windows Notepad.

If you are using Notepad to construct your page, make sure that you save the document with its default
extension (.txt) for now, as we will be adding to it in later tasks. In order to view your finished page in a
browser as a web page, you will have to change its extension to .htm or .html. To save time, you can cut
and paste the following basic template into Notepad, then save it with the name 'index':

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<title>Untitled Document</title>
</head>
<body></body>
</html>


Now optimize the page title on your homepage by taking the following steps:

   1. Make a list of the commonly searched key phrases for the products or services that your
      homepage offers, and write a title that contains them.
   2. Insert your title between the title tags of your HTML page:
      <title>put your page title here</title>

Save your work locally for future editing. Do not upload it to your server.

Congratulations, you have just made your first practical attempt at optimizing your website!

10.1.2 Meta Tags - 'description'

Meta tags offer information that does not actually appear on the web page when viewed in a browser.
The prefix 'meta' comes from the Greek for 'after' or 'beyond', and in this context refers to
information that is 'beyond' our view and that we do not normally need to see. These tags include the
keywords and description tags, and even though these tags have been devalued somewhat there is still
good reason to use them.

Why include a Meta description tag?

Although the Meta description generally has no direct impact on search engines' primary results, you
might want your web pages to meet current accessibility standards; if so, it should be included. I say
generally because sites that are accessible will achieve higher rankings through Google Accessible
Search than those that are not. Accessibility and the way it impacts search results will be discussed in
future tutorials.

When you omit a description tag, search engines will create one for you by selecting snippets of text
from the page that are relevant to the user's search. You have no control over which snippets will be
used, which can result in descriptions stitched together from multiple snippets that appear incongruent
and confusing.

The meta description appears in the head of your HTML page just after the title tag:

<title>YOUR TITLE HERE</title>
<meta name="description" content="YOUR PAGE DESCRIPTION HERE">

Although Meta tags contain information that people don't normally see, the description Meta tag offers
information that is both visible to search engines and visible in some search engine results. Google, for
example, displays the actual text that appears in your Meta page description just below the link to your
page. Bear this in mind when you write your page description, as search engine users use it to learn
what your web page is about.
With this in mind, the following principles should guide you when optimizing the Meta description tag
on your web pages:

    The Meta description tag can, but does not need to, incorporate your main keywords; however, it
     should reinforce what you have already said in your title.
    Whether you use your main keywords or not, the description tag should clearly describe what the
     page is about.
    The page title and description should work together as a benefit-driven call to action designed to
     engage searchers and entice them to click through to your site.
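Putting those principles together, here is a hypothetical head fragment for a page selling christening gifts (the copy is invented for illustration; note how the description reinforces the title rather than repeating it):

```html
<title>Personalised Christening Gifts - Engraved Keepsakes</title>
<meta name="description" content="Unique personalised christening gifts,
engraved to order and gift-wrapped free. A keepsake they will treasure.">
```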


Now optimize the description Meta tag on your homepage. Open the local copy of the homepage you
saved after completing Task 1, then take the following steps:

   1. Write a page description that describes the page content and complements the title.
   2. Insert your description into the correct HTML tag.

Save your work locally for future editing. Do not upload it to your server.

10.1.3 Meta 'Keywords'

Many crawler based search engines, including Google, now ignore the meta keywords tag completely.
But again to meet accessibility standards one should be included.

The reason that many search engines ignore the tag is that too many sites 'spammed' it in the early days
of search engines. These sites attempted to rank for numerous phrases (sometimes phrases for which
they were not even relevant) by cramming every keyword they could think of into their Meta tags. Even
today, you will still encounter sites that have an incredibly long list of keywords in their Meta keywords
tag.

The first rule to remember when optimizing your keywords tag, therefore, is don't 'spam' it. Even those
few search engines that still index the tag will only read a limited number of characters. Try to limit the
number of characters (not words) to about 70 or 80, or even fewer if possible.

Despite the fact that a search engine giant like Google ignores the keywords tag, it is still a good idea to
use the tag in order to improve your chances of being found in those search engines that still read it. In
an HTML page, the keywords tag appears in the header section below the title tags as follows:

<title>YOUR TITLE HERE</title>
<meta name="description" content="YOUR PAGE DESCRIPTION HERE">
<meta name="keywords" content="YOUR KEYWORDS HERE">

    The keywords tag should employ the main key phrases for your product or services and should
     be consistent with your title tag and page description.
    Keywords should be limited to words that appear on the page.
    Each keyword or key phrase should be separated with a comma.
    You do not have to use capitals in your keywords, as search engines don't treat them as case
     sensitive.


Now follow the above steps and optimize the keywords Meta tag in the local copy of the homepage you
saved after completing Task 2A. Don't forget to save your work.

10.1.4 Internal Links

Internal links are, of course, the links on your site that point to other pages on your site. These are the
means by which people navigate around your website. We will cover links more fully in the next unit.
At this stage, the important thing to remember when optimizing internal links is to use the key phrases
for the page that you link to in the anchor text of the link. In this way, the link passes relevancy on to the
page that it links to.

For example, if your homepage sells gift products and includes a link to another page that aims to rank
for the term 'christening gifts', your anchor text might appear as follows:

<a href="">Christening Gifts</a>

10.1.5 Headings

In this context, your page headings are the words that appear between HTML heading tags. These tags
are usually run in numerical order starting with <h1></h1> and moving up through <h2></h2>,
<h3></h3>, and so on up to <h6></h6>.

The heading to pay particular attention to is your first heading. This is one of the first things that both
search engines and Internet users see.

Your first page heading should be placed in the body section of your HTML page, between <h1></h1>
tags just before your page copy. The head section of your page contains the tags you have already
optimized:

<title>PAGE TITLE</title>
<meta name="description" content="PAGE DESCRIPTION">
<meta name="keywords" content="PAGE KEYWORDS">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">

Then, in the body section:

<h1>FIRST PAGE HEADING HERE</h1>
<p>PAGE COPY HERE</p>

Here are the main points to follow when writing your page heading:

    Place your first heading in <h1></h1> tags before your page copy
    Make sure that this heading employs the keywords that you want your page to rank for and that it
     reinforces the keywords in the header section of your HTML page

The same principles can be applied to other page headings of course.

Note: Remember that, unlike the information that appears in your Meta tags, visitors can read your first
heading. If you have too many keywords, or if the use of all your key phrases in this heading makes it
appear too long or clumsy, then just write a heading that contains your main key phrase. You can
perhaps put your other key phrases in later headings.


Now open your saved homepage and add a first page heading which follows the principles outlined
above. Save your work, then move on to optimizing the page copy.

10.1.6 Page Copy

When we refer to copy, we are referring to the actual words and information that appear on screen when
a visitor reads your page.

In some respects, writing copy is the most difficult part of optimizing your web pages as it involves
knowing the principles of marketing, 'proper' English, and SEO. We can't possibly cover the basics of
effective web copywriting in this short section, so the next unit will take a more in-depth look at the
area. Nevertheless, since page copy is an on-page factor, here are some basic tips to help you write
optimized web copy:

- Keep it informative. People use the Internet to gain information as well as to purchase products.
  Give them what they want.
- Keep it short and to the point. People don't like to read a lot of scrolling text on screen.
- Make it 'scan-able'. People tend to 'scan' web pages rather than read their entire content. Break
  your copy into sections and put your most important points in the first sentence of each paragraph.
- Keep it plain. Grandiloquence, digression and circumlocution are superfluities that epitomize
  impenetrable online prose!
- Include a call to action. If you are selling a product or service, remember to include a 'call to
  action' somewhere on your page (e.g. 'buy now', 'call today', 'email us for a free consultation', etc.).
- Keep it professional. Always proofread your copy for spelling mistakes and grammatical errors.

One important thing to note, however, is that SEO copywriting differs slightly from normal Internet
copywriting. Whereas normal copywriting simply markets, or informs someone about, a product or
service, the aim of SEO copywriting is to make your page appear more relevant for your chosen
keywords by employing those keywords in your page copy in a way that improves the page's
relevance while remaining wholly self-descriptive.

As we noted earlier in this course, keyword density is a measurement by percentage of how many times
keywords appear on a page in relation to the other text. Search engines don't actually count keywords in
this way, but it is still important to get your keywords into the areas we have already discussed, and to
feature them prominently in the first paragraph, in bold text and in lists.
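To make the idea concrete, here is a rough sketch of the traditional keyword density calculation in Python. The `keyword_density` function and the sample copy are our own illustrations; as noted above, no search engine actually scores pages this way.

```python
import re

def keyword_density(text, phrase):
    """Percentage of the words in `text` accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    total, n = len(words), len(phrase_words)
    if total == 0 or n == 0:
        return 0.0
    # Count non-overlapping-style matches of the phrase at each word position.
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / total

copy = ("Christening gifts for every occasion. Our christening gifts "
        "are hand made and can be personalised.")
print(round(keyword_density(copy, "christening gifts"), 1))  # 26.7
```

A figure this high would normally be a sign of over-repetition; the point of the sketch is only to show what "density" measures.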

The key to writing search engine friendly copy is to ensure that you get your keywords into these areas
while still writing in plain, easy to understand English. That way, you satisfy both the demands of the
search engine and the needs of the individual Internet user. The main purpose of SEO copywriting is to
engage the reader from the first headline to the closing call to action. Along the way, explain 'what's in
it for me', i.e. clearly explain what benefits potential customers will enjoy by using or switching to your
products or services.

Here are the main principles you should follow when writing search engine friendly copy.

- You should have at least some text in your web page, as this will give the search engine
  something to latch on to when attempting to determine the relevance of your page for particular
  keywords.
- Your copy should employ the same keywords used in the page title, Meta information, and page
  headings.
- Although your text should include the keywords you are targeting, write naturally and do not
  spam keywords by repeating the same phrases over and over again.
- Make sure that your copy is still written in plain English and still satisfies the aim of informing
  the Internet user about your products and services.

Note: placement of text can be important. Aim to make your main page copy one of the first things a
search engine sees by placing it just under your first heading. If you have a table or CSS div to the left of
your main page copy, make sure it is not full of text that takes precedence over your main page copy.

Page copy usually appears in paragraph tags <p></p> on your HTML page, as in the following example:

<p>FIRST PARAGRAPH OF PAGE COPY HERE: Lorem ipsum dolor sit amet, consectetur
adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim
veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute
irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint
occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum. </p>
<p>SECOND PARAGRAPH, and so forth</p>


Your task is to write a short, two-paragraph description of your main products and services (no more
than 300 words for the purpose of this task, although your homepage can include a lot more text) for
your homepage in such a way that it uses the main keywords for that page. Keep your prose simple and
do not 'spam'. Once you have finished, place it in the relevant section of your HTML page and save
your work.

10.1.7 img alt attribute

The contents of an image alt attribute, often mistakenly called an alt tag, are no longer considered when
calculating the relevancy of a page by any of the major engines. However, improper use of the alt
attribute can adversely affect the rankings in the SERPs. As search engines evolve they continue to catch
up with SEO tactics that are intended to improve rankings by gaming their algorithms while damaging
the visitor experience.

The purpose of the alt attribute is to provide alternative text descriptions of your images. An example of
an alt attribute used for a company logo might look like this:

<img src="images/logo.jpg" width="100" height="78" alt="XYZ SEO Company" />

From an accessibility point of view, every <img> tag should have an alt attribute that describes the
image; if that provides an opportunity to use keywords appropriate to the page, so much the better.
Descriptive alt attributes, headings and the text close to the image can improve rankings within image
search. If you have a large catalogue of products, this can be an important and sizable source of traffic.
Rather than use the alt attribute as an excuse for keyword stuffing, use it to describe the content of the
image.

Note: the last thing that visually impaired visitors using screen readers want is to listen to a long list of
keywords stuffed into every image. Also, stuffing every keyword you can think of into a 1 pixel by 1
pixel invisible image is, and always has been, spam.

Images that are purely decorative require no alt text but still require an alt attribute. That is to say, the
required alt attribute should have a null value: alt="". Don't write alt=" "; the alt attribute should be an
empty string, not a space. If your layout uses invisible images to help with positioning, for example:

<img src="spacer.gif" width="1" height="10" />

Change it to:

<img src="spacer.gif" width="1" height="10" alt="" />

Other purely decorative images should be treated the same way. For example:

<img src="top-right.gif" width="40" height="20" />

Change it to:

<img src="top-right.gif" width="40" height="20" alt="" />

These changes might seem a bit pedantic, especially if you don't show product images. However, there
are accessibility issues to be concerned about, and one should also strive to make HTML code standards
compliant. Trying to rectify things after a site has gone live can cause a lot of work for webmasters,
especially if every page contains multiple HTML errors. How accessibility impacts SEO will be discussed
in future tutorials; all that needs to be said at this juncture is that accessible sites are generally more
search engine friendly and can be viewed on a wider selection of devices and browsers.
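As a practical aid, a short script can flag images that lack an alt attribute entirely. This is a minimal sketch using Python's standard html.parser module; the sample markup is invented for illustration, and a real audit would run over your saved pages.

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect the src of every <img> that has no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing_alt.append(attrs.get("src"))

page = ('<img src="logo.jpg" alt="XYZ SEO Company">'
        '<img src="spacer.gif">'
        '<img src="photo.jpg" alt="">')
audit = AltAudit()
audit.feed(page)
print(audit.missing_alt)  # ['spacer.gif']
```

Note that the third image passes the check: an empty alt="" is valid for decorative images, as discussed above; only a missing attribute is flagged.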

Congratulations, you have now carried out a basic optimization of your first web page. Repeat the above
steps to optimize your other web pages.

Remember that search engines rank individual pages and not websites as a whole. This means that you
should not use exactly the same keywords as your homepage on other web pages. You should use your
other pages to target different keywords related to your products or services.

You can now upload your saved page to your hosting server if you like.


- On-page factors refer to the content on your individual web pages.
- The easiest way to optimize your web page is to employ the keywords and key phrases you wish
  the page to rank for in key areas of that page.
- Your page title, Meta tags, page heading and page copy should all mention your key phrase
  without being overly repetitious.

10.2 Conclusion

Optimizing on-page factors is an essential step in the SEO process and one that you can easily perform
on your own web pages. By following the principles outlined in this unit you can now start optimizing
every page on your site.

In the next unit, we will look at links and the external factors that influence your search engine ranking.


What is your understanding of the SEO implications of the following?

- On-page factors
- Title tags
- Meta tags
- SEO copywriting
- In the previous tutorial we looked at some basic on-page factors, including the alt attribute. It
  was suggested that every <img> tag should have an alt attribute even if the image referred to was
  entirely decorative. These changes might at first seem a bit pedantic; however, they make for
  better accessibility and standards compliant HTML.
- Ensuring pages are accessible and standards compliant can cause a lot of work for webmasters
  trying to rectify things after a site has gone live, especially if every page contains multiple
  HTML errors. So is it worth all the bother? The simple fact is that accessible sites are generally
  more search engine friendly and can be viewed on a wider selection of devices and browsers.
- Making sure that every piece of HTML code on every page validates and meets current
  accessibility standards is a signal that a business cares about every single visitor to its website.
  Spammers using 'throwaway domains' are more likely to shy away from this type of work
  because of the labor, time and expense involved.

- Signals of quality are rarely about relevance. For example, it's easy to understand why allowing a
  page to go live as an 'untitled document' would harm relevancy; it's not so obvious why
  including a telephone number would increase search engine rankings.
- There is a distinct difference between quality and relevance, and search engines must necessarily
  balance both aspects in order to deliver the best results. The task of identifying quality is
  becoming increasingly important due to the amount of low-quality content that is being uploaded
  to the web every day.

Bayesian Filters
- Bayesian filtering is utilized by most modern mail clients as a means to weed out spam
  emails from legitimate emails. Search engines use it to categorize documents, and Google uses it
  to deliver relevant AdSense ads. How do Bayesian filters work? Initially, the process starts with a
  list of sites that have been classified as high quality and another list that has been classified as
  low quality. The filter looks at both and analyzes the characteristics common to each type of site.
- Once the filter has been seeded and the initial analysis completed, it can be used to analyze
  every page on the web. The clever thing about Bayesian filters is that they continue to spot new
  characteristics and get smarter over time. Before we delve into any great detail on how Bayesian
  filters work, here are a couple of quotes from Matt Cutts regarding signals of quality that clearly
  show Google is addressing the problems caused by low-quality, mass-generated content.
- "Within Google, we have seen a lot of feedback from people saying, Yeah, there's not as much
  web spam, but there is this sort of low-quality, mass-generated content . . . where it's a bunch of
  people being paid a very small amount of money. So we have started projects within the search
  quality group to sort of spot stuff that's higher quality and rank it higher, you know, and that's
  the flip side of having stuff that's lower-quality not rank as high."
- "You definitely want to write algorithms that will find the signals of good sites. You know, the
  sorts of things like original content rather than just scraping someone, or rephrasing what
  someone else has said. And if you can find enough of those signals—and there are definitely a lot
  of them out there—then you can say, OK, find the people who break the story, or who produce
  the original content, or who produce the impact on the Web, and try to rank those a little higher.
  . . ."
- There has been mention of signals of quality in Google patents, and some specifics have been
  discussed by Google engineers, so hopefully the days of article mills and article spinners are
  numbered.
How Bayesian Filtering Works

- Although it is known that search engines use Bayesian filtering, the exact algorithm is of course
  proprietary and unlikely to be made public; however, the actions of Bayesian filters are well
  understood. So let's start by looking at how Bayesian filtering works.
- To begin, a large sample or white list of known good documents (authoritative, highly trusted
  pages) and a large sample of known bad documents (pages from spam blogs, scraper sites etc.) are
  analyzed and the characteristics of each page compared. When a large corpus of documents is
  compared programmatically, patterns or 'signals' emerge that were hitherto invisible. These
  signals can then be used to provide a numeric value (or percentage likelihood) of whether the
  characteristics of other pages lean towards those from the original sample of good documents or
  those from the original sample of bad documents.
- A simple example of this would be to compare the words in the good documents to those in
  the bad documents. If it is discovered that many low-quality pages use terms like 'buy cheap
  Viagra' or have a section on each page for 'sponsored links', then other pages that do the same
  might be of low quality also. Conversely, if it is discovered that high-quality pages often contain a
  link to a privacy policy or display a contact telephone number, then other pages that do the same
  might also be high-quality pages.
- As the process continues, more signals are uncovered. In this way the filter learns to recognize
  other traits and whether they are good or bad. There are likely to be many signals of quality
  measured, each one adding to or subtracting from an overall score of a page's quality.
  This means that SEOs, web designers and webmasters need to adopt a holistic approach that
  takes into account information architecture, relevancy, accessibility, usability, quality, hosting
  and user experience.
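The seeding-and-scoring process described above can be sketched as a toy naive Bayes classifier in Python. The seed lists, signal words and scoring below are invented purely for illustration and bear no relation to any search engine's actual, proprietary filter.

```python
import math
from collections import Counter

def train(good_docs, bad_docs):
    """Count word frequencies in the good and bad seed sets."""
    good = Counter(w for d in good_docs for w in d.lower().split())
    bad = Counter(w for d in bad_docs for w in d.lower().split())
    return good, bad

def quality_score(doc, good, bad):
    """Log-odds that `doc` resembles the good seed set (Laplace smoothed)."""
    g_total, b_total = sum(good.values()), sum(bad.values())
    vocab = len(set(good) | set(bad))
    score = 0.0
    for w in doc.lower().split():
        p_good = (good[w] + 1) / (g_total + vocab)
        p_bad = (bad[w] + 1) / (b_total + vocab)
        score += math.log(p_good / p_bad)
    return score  # positive: leans good; negative: leans bad

# Invented seed lists standing in for the white list / black list described above.
good_seed = ["privacy policy contact telephone original research"]
bad_seed = ["buy cheap viagra sponsored links cheap cheap"]
good, bad = train(good_seed, bad_seed)
print(quality_score("contact us original content privacy policy", good, bad))  # positive
print(quality_score("buy cheap viagra now", good, bad))  # negative
```

The real insight of the approach is the feedback loop: each newly classified page can feed back into the counts, which is how such filters "get smarter over time".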

The Link Structure of the Web

- Although links will be covered in future tutorials, it makes sense to discuss some of the
  implications of recent changes in the link structure of the web now. Once upon a time, reciprocal
  links were all that were needed to achieve top search engine rankings. Because reciprocal links
  were easy to acquire, and made it easy to promote sites of lesser quality so that they outranked
  quality sites, search engines stepped in and devalued reciprocal links along with the PageRank
  they passed.
- One-way links were now the way to go, so a new market in selling one-way links emerged.
  Search engines again viewed this as a way to game the system, and paid links, if detected, were
  devalued so that they passed no value whatsoever. The nofollow attribute was implemented so
  that, amongst other reasons, links could be sold without penalty. The nofollow attribute has also
  been adopted for other reasons and is used on millions of blogs and some of the most popular
  social sites.
- URL shortening is also popular, and again is used by some of the most popular sites on the web.
  The upshot of all this is that although the web continues to grow, the ability of many millions of
  pages to link out and cast a vote for other pages has been removed. Of course, you still get the
  traffic, which can be substantial if you make the front page of Digg. Because the link graph of the
  entire web is essentially in recession, search engines are again reevaluating the way they calculate
  rankings, and quality has many discernible signals.

The Need to Discern Quality

According to a study carried out by Webmaster World, the top 15 doorway domains are a haven for
spam. The study analyzed popular search terms and discovered that more than 50% of the results on
these domains were spam. The following list shows the level of spam found on the top 15 doorway
domains:

Doorway Domain    % Spam
#1                100%
#2                100%
#3                99%
#4                95%
#5                95%
#6                93%
#7                91%
#8                85%
#9                84%
#10               81%
#11               78%
#12               77%
#13               77%
#14               74%
#15               52%

The study shows that on the keywords tested some of these blogs are used exclusively by spammers,
while others had a very high percentage. The reason for this is that these sites provide free blog space
which is a magnet for spammers who need to generate links to low quality blogs or scraper sites quickly.

The next list compares the percentage of spam sites by top-level domain (TLD):

.info 68%
.biz 53%
.net 12%
.org 11%
.com 4%

Chapter 10.
What are Long-Tail Keywords?
As we've discussed, the internet revolves around keywords and key phrases – the specific terms users
enter into search engines when they're searching for content. Finding competitive keywords to target on
your site is as much a science as it is an art, as you need to search for elusive keywords that aren't too
competitive, but still have significant search volume. As the internet becomes more and more
competitive, smart website owners are turning to long-tail keywords to generate traffic.

One common definition describes the long tail as "a socio-statistical theory which suggests that the
collective sales of products in low demand can exceed that of popular products and bestsellers." In terms
of keywords, this means that it's easier to succeed using longer, less competitive keyword phrases, like
"how to establish a home-based vending business", than it is with more popular keywords, such as "lose
weight" or "make money online." By using long-tail keywords, you're shooting for the small fish in the
pond, while the rest of the marketing community is trolling for the record-breaking bass.

The idea of the long tail is summarized in the following graphic:

[Graphic: the long-tail demand curve, with popular products at the far left and a long tail of more
obscure search phrases stretching to the right]

At the far left of the graph, we see the most popular products on the marketplace – those that receive the
bulk of the traffic and interest on the internet. The amount of traffic for these products quickly levels off
as we move right along the graph, but you'll also notice that the long tail section is much longer than
that of the popular product section. The long tail section represents all the traffic coming from longer,
more obscure search phrases.
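The arithmetic behind the long tail is easy to demonstrate. The monthly search volumes below are entirely hypothetical, but they show how a thousand small phrases can collectively outweigh a single head term:

```python
# Hypothetical monthly search volumes, for illustration only.
head_terms = {"lose weight": 90000}
tail_terms = {"long tail phrase %d" % i: 120 for i in range(1000)}

head_total = sum(head_terms.values())   # one very competitive term
tail_total = sum(tail_terms.values())   # many easy, low-volume terms
print(head_total, tail_total)
```

Even though no single tail phrase is worth chasing on its own, ranking for many of them at once can deliver more total traffic than the head term, and with far less competition per phrase.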
So what does all this mean for you as a site owner? When you're just getting started, it's much more
difficult to get ranked for popular search terms, given the amount of resources the existing sites have to
maintain their rankings. However, it's entirely within your reach to rank at the top of the search engines
for long tail keywords, using a combination of on-site and off-site search engine optimization strategies.

If you rank high for lots of long-tail keywords, then you will receive lots of free traffic from the search
engines.

Chapter 11.
Directory Submissions
Submitting your website to both free and paid directory sites is a great way to increase the backlinks to
your webpage. Directories are compilation sites that list links to all sorts of sites. To get a better feel for
how these sites are structured, check out the Open Directory Project – it's one of the best
known directory sites on the internet today.

Traditionally, directory sites are broken down into two different categories – free and paid – depending
on whether or not they charge you to include your site in their listings. Both types can have a place in
your search engine optimization campaign, depending on what type of budget you're working with.
However, whether paid or not, it's important to realize that not all directories are equal in their potential
to improve your site's ranking. You'll need to do your due diligence to be sure the directory you're
submitting your site to gives you the best return for your efforts.

Getting your site listed in targeted directories that relate to the theme of your website will likely be seen
as a premium by the search engines. In addition, you might even receive some traffic from them.

For example, if someone who's interested in dog training finds a great directory dedicated to gathering
links in their area, they'll likely bookmark the directory and come back time and time again. To find
these highly targeted directories, search for "your niche + directory" using your favorite search engine.

In addition to finding targeted directories to submit your site to, you also want to look at the quality
of the directory. Submitting your site to directories with higher traffic and higher PageRank will lead to
better SEO results for your web pages.

Another important thing to check before you submit your site to a directory (especially if you're
going to pay) is whether or not the directory makes use of the "nofollow" attribute. Basically, this
attribute tells the search engine robots that any outgoing links on the directory page shouldn't be
considered in the search engine rankings. It was intended to prevent spam links from gaining popularity
in the search engines, but if you submit your site to a directory that uses it, you'll lose the backlink
benefits of directory postings. If the directory doesn't explicitly state whether or not it uses the attribute,
you can find out by looking in the code for rel="nofollow" or by using a special plug-in for your
browser that marks these links in red.
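Checking for nofollow by hand can also be scripted. Here is a minimal sketch using Python's standard html.parser module that lists the links on a page carrying rel="nofollow"; the sample markup and URLs are placeholders, not real directory pages.

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collect the href of every <a> whose rel attribute contains 'nofollow'."""
    def __init__(self):
        super().__init__()
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        if "nofollow" in rel.split():
            self.nofollow_links.append(attrs.get("href"))

page = ('<a href="">Followed link</a>'
        '<a rel="nofollow" href="">Nofollow link</a>')
checker = NofollowChecker()
checker.feed(page)
print(checker.nofollow_links)  # ['']
```

If a directory's listing pages return any of your links this way, the listing will pass no ranking benefit, whatever else it may offer.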

When it comes to submitting your site to directories, you again have two different options. You can
manually submit your site to each directory you'd like to be included in, or you can purchase software or
services that will do the submissions for you. In fact, if you search for "directory submission" on any of
the major search engines, you'll see a number of different advertisements for bulk directory submission
services. In most cases, these services won't dramatically impact your site's search engine optimization,
since the directories they submit your site to may not be well optimized for your niche.

Submitting your site to directories by hand is more time-consuming, but it may be worth the money
saved if you're only going to be including your page in a select few directories. Alternatively, you can
save the time altogether by outsourcing this task.

Individual directories have their own rules for how a site should be submitted, but most include selecting
the appropriate category for your link and including details like your link, anchor text and a brief
description of your site. The amount of time it takes before your link is visible will also vary by
directory. However, if you're diligent in finding well-ranked, high-traffic directories, you should see
increased traffic to your site in a short period of time.

Article Directory Submissions
Submitting articles related to your site's content to article directories is another great way to build your
backlinks. Most article directories allow you to include links to your site in the resource box. This box
just gives readers the details on how they can get more information relating to the topic of the article.
Carefully choose the anchor text in the resource box in order to see an increase in search engine rankings
for these keywords and key phrases.

In addition to building backlinks, you'll also likely get traffic from the articles you submit to popular
article directories. Some of your articles may even be re-published on other sites, generating even more
traffic and possibly more backlinks. In theory, the better the article, the more likely it will be
re-published on other sites. This should lead to more backlinks to your site (which the search engines see
and reward you for) and more traffic from the links you include in your resource box.

If you are writing articles for SEO purposes, it's crucial to spend the time selecting a good anchor text
strategy for your resource box. In fact, I spend more time on this than anything else.

Once you've determined your article marketing strategy, you'll need to actually write the article or
outsource this task. Make sure that your article's title pulls readers in … something like "3 tips on how
to make money online with affiliate marketing" would do fairly well. People love easy-to-read content,
so breaking down your article into specific bullet points will encourage more traffic.

Of course, if you choose to use this type of article title, you will need to identify the three tips you're
going to cover. The good news is that doing this will help you to quickly structure and write out your
article. Articles published to article directories should be between 200 and 750 words. Most article
writing experts recommend starting out with an introduction to help draw the reader into the article
before they continue with the main points of the article.

If you're having trouble getting down to the business of actually writing the article, consider the
following tips. First, don't put too much thought into it – try to write naturally, as if you're writing the
way you'd talk about your target subject. Don't get bogged down on whether you should be using a
colon or a semicolon in a particular sentence – just get the words down on paper and you can come back
and edit them later. If writing still isn't coming easily to you, search for articles on similar subjects to
see how they're structured and what kind of voice they use. Don't plagiarize – use them only to get a
feel for how your own articles can be written.

The final element of your article is the resource box. The resource box is your chance to introduce
yourself and convince the reader to visit your website. Different article directories differ on whether
you're allowed to use anchor text links in your article text, but almost all of them will allow you to put a
link to your website in the resource box – so you can see what a powerful tool this section is.

Since most article directories will allow you to use anchor text in your resource box, you should take
advantage of that instead of just listing your URL. Adding in anchor text is crucial for SEO, so make sure
to select a good key phrase for this. I also suggest listing your URL as well, just in case the HTML link
doesn't work. In addition to an anchor text link plus the URL, you should try to make your resource box
interesting, to encourage readers to click through to your site.

Here's an example of a good resource box with anchor text for the key phrase 'make thousands of
dollars online'.

"Want to learn how I 'make thousands of dollars online', every month, working only a few hours each
week? Check out my FREE report!"

Including the anchor text with a link back to your webpage will boost the rankings of the page for the
key phrase 'make thousands of dollars online' in the search engines. Just make sure the anchor text you
use is something people are searching for in the search engines.

Once you have all your article elements ready, it's time to submit them to the directories. Google seems
to give the most weight to articles posted on the largest, most established article directories, so make
these directories the primary focus of your article marketing campaign. It's easy to sign up for a
free account at each of these sites, and you'll find full instructions on entering and submitting
articles on each site.

Press Release Submissions
If you've worked in the professional media and communications industry, you're probably already
familiar with press releases. These single-page documents have traditionally been used to inform media
personnel about upcoming newsworthy events for inclusion in newspapers and magazines. However,
press releases are finding new life in the field of search engine optimization, where site owners have
found that these tools can be used for generating backlinks as well as increasing traffic back to their
sites.
One of the biggest reasons for this trend is the increase in web-based press release distribution sites. In
the past, press releases were submitted directly to contacts at media institutions. Today, however, online
distribution services allow site owners to distribute their press releases to news organizations around the
world in a matter of seconds.

That means you can write a press release, and it may get published on hundreds or even thousands of
different websites. If you include links with anchor text in your release, you'll give the page you link
back to a boost for the anchor text phrase you used. Just as with writing and submitting articles, you
may see significant traffic from your release as well.

To get started with press release submissions, you'll need to have something noteworthy to publish.
Press releases on mundane topics, like the publication of a new article on your site, are unlikely to be
picked up by news sources, so if you're looking to generate traffic and have your press release published
on a number of sites, you'll have dismal results. On the other hand, if you're launching a new website or
product that provides an innovative new benefit, drafting and submitting a press release is a great way to
generate interest in your site.

Press releases tend to follow a prescribed format, including a headline, body text and boilerplate section.
The headline is the key to getting your press release picked up by news sources. It should pull the reader
in, while summarizing the main benefit of your site or product in a few words. Take a few minutes to
come up with a concise description (no more than 2-3 sentences) of how your site is different from the
others on the web. From this description, pull out a single descriptive phrase and use it in your heading.
For example, one recent top press release carried the headline "The Business Plan Is Dead – New Book
Shows How to Raise Capital in a New Way". This headline simultaneously describes the book's benefits
and engages the reader's interest.

The body of the press release comes next, constituting the majority of your press release's text. The
body should be anywhere from 2-4 paragraphs and should be written in the third person, simulating an
actual news story. In this section, you can describe your site's or product's benefits in more detail. This
is a great place to include specific examples of how people have used your product or any testimonial
quotes you've gathered regarding your site. You can also include a link to your site, along with
keyword-optimized anchor text.

Please note that many press release submission services charge a premium for putting anchor text in the
body of the release, but as long as you carefully research your keywords, it can be well worth the extra
cost.
Finally, the boilerplate section of your press release should include a few sentences that summarize your
business and contact information that news sources can use to reach you. The information in this section
should be fairly general – the term "boilerplate" indicates that the text can be used in any of your press
releases with equal relevancy. Depending on the interest your press release generates, members of the
news media may use the information included in this section to contact you regarding potential stories or
interviews, so it's important to include it in any press release you submit. Often, this is where you can
include further links back to your site. In many cases, you'll also be able to put anchor text links in here,
although you may have to pay extra. At the very least, you can put in your full URL.

Submitting your information to press release sites is easy, and you'll find that there are hundreds of
different sites that will distribute your release for you. Some of these sites offer free distribution, while
others charge fees to use their services. When you're just getting started with press release submission,
it's a good idea to try out a number of different services to see which generate the best results for your
site. Be sure to track how quickly your press releases are picked up by each service to further refine this
search engine optimization strategy.

Forum and Blog Commenting
Forum and blog commenting are extremely powerful tools for driving free, targeted traffic to your site,
in addition to improving your site's ranking with numerous quality backlinks. Forums are specially
designed websites that allow people with similar interests to chat by sharing messages.

Blogs, on the other hand, are typically operated by a single person, who posts messages relating to a
specific topic. In most cases, readers have the opportunity to share their opinions on specific posts
through comment sections.

Before you make any posts to forums or comments on blogs, make sure the links are 'dofollow' links.
This means that the site does not use the nofollow tag. You can check for nofollow links by reviewing
the HTML code or by using a special browser plug-in; Firefox has a couple you can choose from.
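If you'd rather not eyeball the HTML, a short script can do the check for you. The sketch below uses only Python's standard library; the sample HTML and URLs are placeholders, not real pages.

```python
from html.parser import HTMLParser

class LinkChecker(HTMLParser):
    """Collect every <a> tag and whether it carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, is_nofollow) tuples

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attr = dict(attrs)
            # rel can hold several space-separated tokens, e.g. "nofollow ugc"
            rel = (attr.get("rel") or "").lower().split()
            self.links.append((attr.get("href"), "nofollow" in rel))

# Sample markup standing in for a fetched blog page (hypothetical URLs)
sample = ('<p><a href="" rel="nofollow">comment link</a>'
          '<a href="">dofollow link</a></p>')

checker = LinkChecker()
checker.feed(sample)
for href, is_nofollow in checker.links:
    print(href, "nofollow" if is_nofollow else "dofollow")
```

In practice you would fetch the page source first and feed that to the parser; any link whose `rel` lacks the `nofollow` token is one the spiders will follow.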

First, let's look at forums and how to use them effectively to generate traffic to your site. To start,
you'll need to find a forum in your specific niche – it doesn't make sense to start a comment campaign
on a silkworm enthusiast forum when you're promoting a "make money online" product. Search Google
or any of the other major search engines for "make money online +forum" and you should be able to
identify several forums in your area of interest.

Once you've identified a few candidates, check them out to determine which forums are the most active.
This way, you can drive traffic while you increase your backlinks. If a forum has only a few members
who post once a month on average, you won't be able to drive as much traffic as with a heavily
trafficked forum with hundreds or thousands of active members. You should be able to find membership
statistics posted somewhere on the forum's homepage, or you can look at the threads to see how many
comments are made on average and how recently they've been updated.

Once you're ready to start posting to a forum community, you'll likely need to register for an account,
which should be as simple as entering a user name, password and valid email address. Once your
account is set up, you can enter additional information in your user profile and create a special
"signature file" that's displayed after every post you make in the forum. The signature file is where
you'll want to include a description of your site and a link to the page. That way, whenever you post to
a forum, people will see this advertisement for your site and be encouraged to click through.

A few words of caution about posting to forums, however. First, most forums have strict rules about
advertising, and may close your account if they think you're spamming the forum with messages about
your site. Therefore, when you post to a forum, be sure you're providing helpful information, either in
your own posts or in response to others. Not only will this convince the forum owners that you're a
valuable member of the community, you'll also be building up your reputation as an expert in the niche,
which will make other members more likely to click through to your site.

Blogs focus on one person's postings, instead of community conversations. However, active blogs may
have just as many visitors as popular forums, so they're still an important part of your traffic generation
campaign. To find blogs in your niche, run a similar Google search for "making money online +blog"
and check out several of the top results. Look for sites that are frequently updated with new posts and
that have several user comments following each blog post.

Typically, there are fewer restrictions on posting links in blog comments than there are on forum posts,
but that doesn't mean you should spam the blog by posting only your site's link in the comment
sections. Most blog owners retain the right to delete user comments, and your posts will quickly be
removed if the owner considers them spam. Instead, focus again on providing valuable content that
makes users want to click through to your site. With continued effort, you should see a tremendous
return in terms of traffic and improved site ranking on the major search engines.

To find the best dofollow blogs on your topic, use Inline SEO's online DoFollow Diver tool. This free
online tool lets you search over 700 blogs that do not use the nofollow tag, which means the search
engine spiders will see the link from their blog to yours. Type in the keywords related to your site and
find the best blogs to comment on. Remember not to spam; make sure to add useful content in your
comments. You can find this tool on the Inline SEO site by clicking on the SEO Tools link – you'll see
DoFollow Diver listed there.

Social Bookmarking
Social bookmarking is a relatively new phenomenon in the world of SEO, but it's an extremely
powerful tool for building backlinks and for driving traffic to your site. Basically, social bookmarking
sites enable users to share links to sites they find interesting. For example, a skateboarding enthusiast
might share the links he's collected of cool skateboarding sites with a friend, or a web surfer could visit
the homepage of one of these sites to see what's new and interesting on the internet.

However, these two examples demonstrate the power of social bookmarking in a more organic sense.
As site owners, we'll use the opportunities bookmarking sites provide to bookmark our own sites and
boost our link building campaigns. There are hundreds of different social bookmarking sites operating
today, but for now, we'll focus on a few of the most popular, including Digg and Stumble Upon.

Digg – Digg currently has an Alexa ranking of 202, indicating that it's the 202nd most popular site on
the internet. Thousands of web surfers visit this site every day to check out entertaining news and video
clips, making it a great way to drive traffic to your site. Digg allows users to submit sites in eight major
categories – technology, world & business, science, gaming, lifestyle, entertainment, sports and offbeat.
To submit your site, you'll need to sign up for a free account with a username, password and email
address. Once your account is set up, you'll be able to add links and descriptions of your webpages, and
create icons for your sites that will allow other readers to bookmark your site on Digg.

Another popular social bookmarking site owes part of its appeal to the ease with
which you can submit links. Unlike Digg, however, this site is much more freeform – focusing
specifically on tags instead of separate categories. In this case, tags are single words that describe the
content of the article – for example, "tattoo" in the case of an article titled "how to find tattoo designs".
Users can visit the site's homepage and search for specific tags that interest them, and the site will pull
up all the articles in its database that include that tag. Signing up for an account is similar to Digg – just
choose a username and password, enter your email address and you're ready to start adding links to
your site.

Stumble Upon – Stumble Upon is one of the newer additions to the social bookmarking field. The
site's biggest draw is its ability to recommend sites you may like, based on the recommendations of
other Stumble Upon members. Signing up for an account with this service is fairly similar to the process
described for the other sites. However, one of the major advantages of this site is the easy-to-use toolbar
you can download to your web browser, which makes submitting your links simple. Once the toolbar is
installed, you'll see a button that says "I like it" displayed at the top of your web browser. Navigate to
one of your site's pages, then press the button, and you'll see a pop-up window that enables you to add
your link to the Stumble Upon database.

In addition to these sites, there are hundreds of other social bookmarking sites, many of which focus on
specific niches, like the entertainment or gaming industries. Because tens of thousands of people visit
these social bookmarking sites every day, there's bound to be at least one visitor who's interested in
content from your specific niche. If you're lucky enough to have your article featured on the homepage
of any of these directories, you could see your site benefit from hundreds of new visitors in a single day.

But even if you don't make the homepage, by allowing visitors to easily bookmark your pages at the top
social bookmarking sites, you'll get links back to your pages. As mentioned previously, you can also
bookmark your own web pages at the bookmarking sites (but please don't spam) just to get things
rolling and start building backlinks.

Optimizing a Business and Domain Name
You might wonder what the name of your business has to do with search engine optimization (SEO). If
you choose the right business and domain name in the first place, the whole optimization process
becomes much easier.

As you'll discover later, links to a web site are very important – in fact essential – for high search engine
positioning. The text associated with those links (called anchor text) is one of the most important SEO
factors to get right when it comes to optimizing a site.

The Right Domain Name = The Right Anchor Text
The text (SEO Delhi Search Engine Optimization Services) below is the anchor text of a link to the
home page of this site; it's the copyright link at the base of every page:

<a href="/">SEO Delhi Search Engine Optimization Services</a>

Those links are under our control; we as webmasters can create the perfect anchor text for each page.
However, webmasters linking to your site will tend to do one of two things. The first is to use your
business/site name for the anchor text (for this site, SEO Delhi is most likely), and the second is to use
the actual URL ( If you have picked the business name and domain name
badly, it won't help your site in the search engines, since links to your site will not use the right anchor
text.

The right anchor text (keyword-rich anchor text) is essential to good search engine rankings.

Branding vs. Search Engine Optimization

Zeus Thrones may sound great for a company selling high quality toilet seats, but it's not going to help
potential online customers find your carefully crafted web site via the major search engines. Unless you
have a large advertising budget for branding purposes (as Amazon had), your potential customers won't
know your business or your web site even exists.

If you have a site about Search Engine Optimization (SEO), for example, a good business name (for
optimization reasons) would be Search Engine Optimization or SEO, and the domain name would be
search-engine-optimization.tld or seo.tld respectively (tld being com, net, etc.).

There are other considerations to take into account when choosing a business and domain name,
including branding and of course which domain names are available, so compromises have to be made.
For example, when deciding on a business and domain name for this web site, I knew I couldn't have
the ideal domain names (for optimization reasons) because others already owned them. After a little
keyword research at Wordtracker and checking various domains, I settled on
and the general site name SEO Delhi Search Engine Optimization Services (or SEO Delhi for short).

Why Hyphenated Domain Names?

Why (hyphenated) and not (non-hyphenated)? The current main
search engine (Google) doesn't recognize the individual words in a non-hyphenated domain name, so
such a name would not help future optimization plans. The hyphenated domain
is seen by Google as

www seo delhi com

since Google treats . (dots) and - (hyphens) as spaces. We say "Google can parse those keywords out
of the domain name". Note that a domain with a single keyword (i.e. www.keyword.tld) will have the
single keyword recognized, since Google sees the dots as spaces (it sees www keyword tld). So when
choosing a domain name, either go with a single keyword domain (like, if available) or
have one or more keywords separated by hyphens (like

Branding Revisited

If branding is a consideration, or you plan to tell people about your site on the phone or in person,
hyphenated domains are harder to verbalize. would be spoken as "www dot seo
hyphen delhi dot com", which isn't too bad, but imagine a domain with multiple hyphens – it doesn't
exactly roll off the tongue! Fortunately there is a solution: register a non-hyphenated version as well,
refer to it on company letterheads, traditional advertising media etc., and redirect the non-hyphenated
domain to the hyphenated domain via a 301 redirect.
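On an Apache server with mod_rewrite enabled, the redirect can be sketched roughly like this (the domain names follow this site's example; your hosting setup may differ, so treat this as an illustrative fragment rather than a drop-in config):

```apache
# .htaccess served for the non-hyphenated domain (Apache + mod_rewrite assumed)
RewriteEngine On
# If the request arrived for, with or without www...
RewriteCond %{HTTP_HOST} ^(www\.)?seodelhi\.com$ [NC]
# ...send a permanent (301) redirect to the same path on the hyphenated domain
RewriteRule ^(.*)$$1 [R=301,L]
```

The `R=301` flag is what tells search engines the move is permanent, so the hyphenated domain keeps the link equity.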

A word of warning regarding multiple domains: a simple 301 redirect will not be treated negatively by
the search engines, but having two identical sites (called mirror sites) might be. If you plan to have two
or more sites about the same subject, make the content different. Changing the background color or a
few images does not make a site different; you need unique text. If you can't create unique sites, don't
make mirror sites – you might get your main business site penalized!

If you MUST have two or more identical sites, give the duplicates canonical URLs pointing back to the
main site. A canonical URL tells Google and the other engines the preferred version of a webpage
(canonical URLs are page-based, not site-based). Every page of a duplicate site should have a
corresponding canonical URL pointing back to the matching page on the main site.
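The canonical declaration is a single tag in the page's head section; following this site's example domain, a duplicate page would carry something like:

```html
<!-- In the <head> of each page on the duplicate site,
     pointing at the matching page on the main site -->
<link rel="canonical" href="" />
```

The href must name the specific preferred page, not just the site's homepage, since canonicals are evaluated page by page.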

With my sites I currently only advertise online, so I have no need for a second, easier-to-verbalize non-
hyphenated domain name and haven't registered alternatives (as it happens, was already
taken anyway). However, I did plan to add an element of branding to this site long term, which is why I
went with the business/domain name SEO Delhi – it's memorable (so brandable) and contains at least
one keyword (SEO). I could have registered a name like "SEO Search Online Company", which would
be great in terms of keywords, since all four words are used by searchers looking for SEO services, but
it is completely unbrandable – no one will recall a generic name like that. If branding isn't important
(say, for an affiliate site), a generic keyword-rich domain name is advisable.

In essence, with we have created a web site that is both brandable and easy to
optimize.

Anchor Text Revisited

When a webmaster links to SEO Delhi they will tend to use one of the following pieces of code:

<a href="">SEO Delhi</a> (using the business name as anchor text)

<a href=""></a> (using the URL as anchor text)

In both cases the site benefits from the keywords Google can parse from the anchor text. In the first
example we have SEO Delhi (one important keyword out of two words). In the second example we have
www seo delhi com (one important keyword out of four words – not as good, but better than nothing).
So in both cases we benefit just by choosing the right business and domain name.

If we'd gone with the combined-words version (, Google wouldn't parse out
anything helpful – it would see www seodelhi com, which is unhelpful, since no one will search for
seodelhi! Webmasters who use the business name might also use Seodelhi as anchor text instead of the
preferred SEO Delhi!

The above also holds true for directory and file names, so when choosing directory and file names, use
hyphens as you see in the pages of this site. For example, the page you are reading has the URL

This will be parsed to "http www seo delhi com tutorial domain name choice" if a webmaster links to
this page using the entire URL. From this we get the important keywords "SEO Tutorial Domain Name
Choice".

Please note that underscores (_) within a file or directory name like domain_name_choice are not
treated as word separators the way hyphens are in domain-name-choice, so rather than parsing to
"domain name choice", Google sees one word, "domain_name_choice". Try a Google search for
domain_name_choice to see the problem.

You will find a lot of so-called SEO experts who believe underscores are treated as spaces; ask them for
the evidence. Evidence would be a page ranking for a SERP like "Unique Keywords Here" (include the
quotation marks in the Google search) with a filename like unique_keywords_here.html where the
words Unique, Keywords and Here are NOT used on the page or as anchor text within links to the page
– so the only use of those words is in the filename, separated by underscores.

The Right Domain Name helps with Search Engine Rankings

There is a direct search engine optimization benefit to using keyword-rich hyphenated domain names
(and directory/file names). Google and other search engines using Google's database (AOL, Netscape
etc.) will give every page of a site a fair boost for single and hyphenated words within the domain
name, even when the word isn't used in the code. Yahoo and Bing also give a SERPs boost, but it's
much smaller than the one seen with Google.

For example, one of our consultants has a home page with their ISP. The URL is (this page no longer exists,
but the concept is still true).

This URL parses to "http homepage ntlworld com ooar123 search engine optimization", with the last
three being important keywords. The 'words' homepage, ntlworld and ooar123 are not keywords
important to the content, and so they are not used in the content of any of the pages of the site. A search
for various combinations shows that even though those words are not used in the content, just being
part of the URL (domain name) is enough for Google to find those pages.

Try these searches in Google-

ooar123 – 1st place
homepage ooar123 – 1st place
homepage ntlworld ooar123 – 1st place

The above searches use words only within the URL (none of those words are used in the content).

The searches below include one word that's only within the URL and another word that is found in the
content:
ooar123 seo – 1st place
ooar123 search – 1st place
ntlworld seo – 1st place
homepage seo – 6th place out of over 2,000,000 pages!

The last search shows the importance of words within the URL. A search for just homepage in Google
reports over 200,000,000 (200 million) pages; that's a lot of pages using the word homepage. Of those
pages, 2 million (about 1%) also use the word SEO. Our consultant's home page is number 6 for the
combination, even though the word homepage is not used on the site – it's just part of the URL.

Having a keyword within the URL will by itself result in a boost for that keyword for every page of the
site, so your domain name should ideally contain your most important keyword.

Indirect SEO Benefits to Domain and Business Name Choice

Search engines rely on the words in the body text and other areas of code on a page. If your site and
domain name are highly related to the phrases you wish to rank for, adding those phrases to your site's
content will be much easier. Look at the number of times the word SEO has been added to this page just
through mentioning the business name SEO Delhi; if our business name were "Good Traffic Rankings",
for example, the word SEO would not be used as much.

The main words for this site are Search Engine Optimization, SEO, Search Engine Placement and so on.
Since the site is called SEO Delhi Search Engine Optimization Services, just by mentioning the name of
the site we've added five keywords to a page.

Chapter 12

How Heat Maps Can Help Your Ecommerce Website

Heat maps are one of the most powerful tools an ecommerce website owner can have at their fingertips. They
literally give you an insight into your customers' minds as they peruse your website. Here is a short introduction
to heat maps, along with tips for making them work hard for your ecommerce website.

What are heat maps?

A heat map is a visual picture of your website as it is viewed by visitors. This tool usually shows the
webpage in question as a color map, with red being the area most viewed and blue and violet as the least
viewed. This allows you to see exactly where consumers are looking when they open a given webpage,
which then allows you to place the most important elements in these places. While it may seem like a
gimmick, a heat map is actually a valuable and effective tool. There are many programs that will allow
you to heat map your website and see how it can be improved.

Heat maps can be developed both for eye tracking and for click tracking. Click tracking is similar to eye
tracking, except that it looks at where your visitors are clicking the most. This again can help you to
move around important page elements so that the ones that generate sales are given prime placement.
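Conceptually, click tracking just buckets recorded click coordinates into a grid so the busiest cells stand out. The toy sketch below illustrates that aggregation step with made-up click data and an assumed page size; real heat-map services do this at much finer resolution:

```python
# Toy click-tracking aggregation: bucket click coordinates into a coarse grid
# so the "hottest" cells stand out. All numbers here are illustrative.
PAGE_W, PAGE_H = 1000, 800   # assumed page dimensions in pixels
COLS, ROWS = 10, 8           # grid resolution for the heat map

# Hypothetical recorded clicks as (x, y) pixel positions
clicks = [(120, 80), (130, 90), (125, 85), (640, 400), (900, 750)]

grid = [[0] * COLS for _ in range(ROWS)]
for x, y in clicks:
    col = min(x * COLS // PAGE_W, COLS - 1)
    row = min(y * ROWS // PAGE_H, ROWS - 1)
    grid[row][col] += 1

# The hottest cell is the one that collected the most clicks
hottest = max(
    ((r, c) for r in range(ROWS) for c in range(COLS)),
    key=lambda rc: grid[rc[0]][rc[1]],
)
print("hottest cell:", hottest, "clicks:", grid[hottest[0]][hottest[1]])
```

Coloring each cell by its count (red for high, blue for low) is all that separates this table of numbers from the heat-map images the tools produce.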

Using Heat Maps

Heat maps are nifty, but they work best when you approach them in a scientific way. Here are a few tips
for getting the most out of this tool.

• Start with a specific question. There are several things you could look at, like: Do customers
  look at my call to action? Do they see the sidebar? Are they looking at my photographs? Do
  customers scroll below the fold? Each of these questions is easy to answer with heat maps, and
  all can contribute to the success of your company.
• Choose your panel wisely. Customer research is a huge investment and it is one that needs to be
  spent wisely. Make sure your panel is made up of people from your target audience, not just
  anybody within a broad demographic.
• Be specific. Your panel needs to have very specific instructions regarding what they are looking
  for and the goal of the study. In addition, you need to have a defined set of outcomes for your
  test and know ahead of time what they will indicate. Thinking about these factors ahead of time
  can help you to design a test that will give you the answers that you seek.
• Consider testing an alternate design. Sometimes the flaws in a design can be seen best when
  there is another one available for comparison. It may pay to develop a new and very different
  design to be tested side-by-side with your real one.

• Don't stop at the data. While looking at aggregate data is important because it gives you an
  overview of the test results, you need to look deeper in order to get the detailed information that
  you need. Analyze individual sessions as well as playback videos; often these have the answers
  that you really need.

Use a Heat Map to Boost Website Usability!

It's been shown that websites have specific areas where human eyes spend the most time, and heat
maps are the ultimate tool for determining exactly where most eyes look first.

With a heat map, webmasters like yourself can improve and tweak a website to put the most valuable
content in the most opportune spots. Read on to learn more about heat maps and how to implement one
on your site.

What's a Heat Map?
Growing out of research done with eye tracking software, a heat map is produced by a software
program that tracks the path a website user's mouse cursor takes as he or she navigates through the
website.

When this data is recorded for many users, a typical "pattern" can be seen. On a heat map, the most
frequented areas where mouse cursors have been are usually red, orange and yellow, and the lesser-
traveled areas are seen in green and blue. Areas not frequented at all are not colored.

Above: a classic heat map diagram for a web page. The darker red and orange regions are the "sweet
spots" for advertisements, since they have proven to be the regions where people's eyes instinctively
travel the most.

Using Heat Map Info for Ad Placement
By placing ads in the "hottest" regions of a heat map, you're giving them the most exposure -- it's a
simple formula and also a good rule of thumb. As results have shown, the top/center and upper-left
regions of a website get the most attention.

Beware of using too many ads above the fold -- a January 2012 statement from Google spoke of an
algorithmic update that punished websites with "too many ads above the fold." To be safe, use ads
sparingly -- a top banner ad, a small in-content ad, and be extremely conservative with anything else.

The "Golden Triangle"

The "Golden Triangle" is an industry term referring to the sweet spot on a search engine results page,
which happens to be shaped like a triangle with its most viewed point near the region where a website's
logo typically rests.

This particular heat map region is a benchmark not only for websites, but for articles and even search
engine results pages. It is also a major determinant for ad placement. In this example, when looking at
Google AdWords placement: the top/center region gets the most visibility, with the top of the right
column getting second string.

Heat Maps as a Marketing Analysis Tool
One of the most profound advantages of heat maps is that they give insight into visitor behavior. You'll
be able to see what most people, on average, are clicking or gravitating to. Here are a few other
important uses:
• With a heat map, you can see which part of a link is being clicked more. For instance, if you
  have a link such as "cars for sale," you might notice that most people are clicking on the word
  "cars" within that hyperlink, instead of the "for sale" part of it. Look for these kinds of behaviors
  as a whole, and you can more effectively fine-tune your site's navigation and enhance your
  linking strategy.
• You can reveal poorly performing areas of your site with a heat map. For instance, if many
  visitors are clicking a non-hyperlinked image, they're probably expecting something to happen.
  These kinds of actions are much easier to diagnose when your website has a heat map feature.
  Locate these spots, and hyperlink them accordingly to where visitors are most likely expecting
  them to point!
• As previously mentioned on this page, look for "high click traffic" spots on your page, and
  consider putting an ad there for maximum results. Also, pay attention to highly trafficked areas
  on your menu. Is there anything you can do to make it better or less cluttered?

Get a Heat Map for Your Site
Until Google Analytics includes a heat map feature (which is rumored to be coming sooner rather than
later), there are alternative programs you can use to see where the most activity occurs on your site.
These heat maps are based on the areas where your visitors' mouse cursors have been traveling.

Mouseflow

Mouse-tracking software that lets you actually view how your visitors navigate your site – a really cool
innovation in viewing visitor behavior on your sites. Plans range from free to $264/mo.


The free (but very basic) solution to getting a fully-functional heat map up on your site.

Crazy Egg

This full Analytics package includes a heat map feature, with prices ranging from $9/mo to $99/mo.

Click Density

Another web analytics package offering heat map technology to show where your visitors are looking
and clicking. Its pricing plans start at $5/month.

Click Tale

An enterprise solution for implementing a feature-rich heat map setup, with analytics and heat map
playback. Plans range from $99/mo to $990/mo.

Chapter 13

Google Panda Update vs. Google Penguin Updates
The SEO community has been abuzz this past week with the latest update from Google, named
Penguin. Penguin came down the pipeline last week, right on the tail of the latest Panda update. Since
most of the big updates in the past year have been focused on Panda, many site owners are left
wondering what the real differences between Panda and Penguin are. Here is a breakdown:

Google Panda Update Overview:
According to Google's official blog post when Panda launched:

This update is designed to reduce rankings for low-quality sites—sites which are low-value add for
users, copy content from other websites or sites that are just not very useful. At the same time, it will
provide better rankings for high-quality sites—sites with original content and information such as
research, in-depth reports, thoughtful analysis and so on.

Basically, Panda updates are designed to target pages that aren't necessarily spam but aren't great
quality. This was the first penalty that went after "thin content," and the sites hit hardest by the first
Panda update were content farms (which is why it was originally called the Farmer update), where users
could publish dozens of low-quality, keyword-stuffed articles that offered little to no real value for the
reader. Many publishers would submit the same article to a bunch of these content farms just to get
extra links.

Panda is a site-wide penalty, which means that if "enough" (there is no specific number) pages of your
site were flagged for having thin content, your entire site could be penalized. Panda was also intended to
stop scrapers (sites that republish other companies' content) from outranking the original author's
content.

Here is a breakdown of all the Panda updates and their release dates. If your site's traffic took a
major hit around one of these times, there is a good chance it was flagged by Panda.

1. Panda 1.0 (aka the Farmer Update) on February 24th, 2011
2. Panda 2.0 on April 11th, 2011 (Panda impacts all English-speaking countries)
3. Panda 2.1 on or around May 9th, 2011
4. Panda 2.2 on or around June 18th, 2011
5. Panda 2.3 on or around July 22nd, 2011
6. Panda 2.4 in August 2011 (Panda goes international)
7. Panda 2.5 on September 28th, 2011
8. Panda 2.5.1 on October 9th, 2011
9. Panda 2.5.2 on October 13th, 2011
10. Panda 2.5.3 on October 19/20th, 2011
11. Panda 3.1 on November 18th, 2011
12. Panda 3.2 on about January 15th, 2012
13. Panda 3.3 on about February 26th, 2012
14. Panda 3.4 on March 23rd, 2012
15. Panda 3.5 on April 19th, 2012

Search Engine Land recently created this great Google Panda update infographic to help walk site
owners through the many versions of the Google Panda updates.

Many site owners complained that even after they made changes to their sites in order to be more
"Panda friendly," their sites didn't automatically recover. Panda updates do not happen at regular
intervals, and Google doesn't re-index every site each time, so some site owners were forced to deal
with low traffic for several months until Google got around to re-crawling their website and taking note
of any positive changes.

Google Penguin Update Overview:
The Google Penguin Update launched on April 24th, 2012. According to the Google blog, Penguin is an
"important algorithm change targeted at webspam. The change will decrease rankings for sites that we
believe are violating Google's existing quality guidelines." Google mentions that typical black-hat SEO
tactics like keyword stuffing (long considered webspam) would get a site in trouble, but less obvious
tactics (like incorporating irrelevant outgoing links into a page of content) would also cause Penguin to
flag your site. Says Google,

Sites affected by this change might not be easily recognizable as spamming without deep analysis or
expertise, but the common thread is that these sites are doing much more than white hat SEO; we
believe they are engaging in webspam tactics to manipulate search engine rankings.

Site owners should be sure to check their Google Webmaster accounts for any messages from Google
warning about past spam activity and a potential penalty. Google says that Penguin has impacted
about 3.1% of queries (compared to Panda 1.0's 12%). If you saw major traffic losses between April
24th and April 25th, chances are Penguin is the culprit, even though Panda 3.5 came out around the
same time.

Unfortunately, Google has yet to outline exactly what signals Penguin is picking up on, so many site
owners that were negatively impacted are in the dark as to where they went wrong with their onsite
SEO. Many in the SEO community have speculated that some contributing factors to Penguin might be
things like:

1.     Aggressive exact-match anchor text
2.     Overuse of exact-match domains
3.     Low-quality article marketing & blog spam
4.     Keyword stuffing in internal/outbound links

It's important to remember that Panda is an algorithm update, not a manual penalty. A reconsideration
request to Google won't make much of a difference; you'll have to repair your site and wait for a refresh
before your site will recover. As always, do not panic if you are seeing a downturn in traffic: in the past,
sites have often rebounded after major Google updates like these. If you do think you have some
sort of SEO penalty as a result of either the Google Panda or Google Penguin updates, please contact
your SEO service provider for help or start troubleshooting.

SEO (search engine optimization) has been one of the most important buzz words for web publishers
over the past 10 years. Getting ranked in Google means free traffic for web publishers, so improving
and optimizing a given website for the search engines was essential. However, Google Panda is here to
stay, and it has forever changed the rules of SEO.

For the past 10+ years, Google utilized its PageRank methodology to rank websites. If you had a website
about basketball, and you got a link from – Google would notice that your site is a quality
site about basketball. It was similar to a voting system: when a relevant and high-authority site linked to
you, it would count as a vote, and the more votes your website received, the better your website would
rank. Of course relevance played a role, as Google gave more value to sites linking to you which were
about a similar topic as yours. In addition, authority mattered, so one link from may hold
more value than 5 links from sports-related blogs which weren't as popular.

Two other major factors that Google considered were unique content and the "Title Tag". Google
wanted content that was unique and not displayed on other web pages across the internet. If duplicate
content was found, Google would determine which site was the original author of the content, and it
would penalize the other sites which had scraped the content.

Google also factored in the "Title Tag", as it was a way for web publishers to tell users and Google what a
given webpage was about. This helped Google to organize and rank web pages for given keywords.

This methodology for ranking web pages worked, and Google utilized the above methods, in addition to
several others, to display highly relevant search results. For years, Google's results were of a higher
quality than all other search engines', which is why Google continued to command over a 65% market
share. However, over the past few years, other search engines such as Bing caught up, and Google
wasn't so special anymore. At the same time, web publishers became savvy and figured out ways
to sneak into Google ahead of more relevant results. For example, earlier this year, JC Penney was
accused of purchasing links on websites across the web to make Google think that these links were
natural and thus a vote for JC Penney's websites.

As more and more users complained about search results, Google realized it needed to shift, and in
came Google Panda. Google Panda is an entirely new way for Google to evaluate websites. And while
Google will still factor in many of the same criteria it has in the past, Google Panda adds an entirely new
element to Google's ranking methodology.

Panda wants better quality websites in its results. It is less concerned with signals that other websites
give it and more concerned with what the actual users think about the website. Think of Google Panda
as an automated way for Google to have users power its search results. The brilliant part is that it is user
powered without the user having to do anything different. Panda is not only genius, but it makes sense
as it should prevent lower quality sites from tricking Google into thinking they are of higher quality.

Panda factors in a wide variety of user signals to help Google determine the quality of a website. It
looks at "Time on Site" as a way to determine how good an experience the user is having on a given
site. It looks at the bounce rate, which measures the percentage of people that leave a site without
doing anything. It looks at social signals such as shares and +1's as a way to see if people are
recommending a given webpage. It looks at page views per visit as a way to see how people are
navigating through a given site.
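Metrics like these can be computed from raw visit logs. Here is a minimal sketch of how such "usage metrics" might be calculated; the session records and field names are made-up illustrations for the concepts above, not anything Google actually exposes:

```python
# Sketch: computing the "usage metrics" described above (bounce rate,
# time on site, pages per visit) from simple per-session records.
# The session structure and sample data are illustrative assumptions.

def usage_metrics(sessions):
    """Each session is a dict with 'pageviews' (int) and 'seconds' (int)."""
    total = len(sessions)
    bounces = sum(1 for s in sessions if s["pageviews"] <= 1)
    return {
        "bounce_rate": bounces / total,          # share of one-page visits
        "avg_time_on_site": sum(s["seconds"] for s in sessions) / total,
        "pages_per_visit": sum(s["pageviews"] for s in sessions) / total,
    }

sessions = [
    {"pageviews": 1, "seconds": 10},   # a bounce
    {"pageviews": 4, "seconds": 230},
    {"pageviews": 3, "seconds": 120},
]
m = usage_metrics(sessions)
print(m)
```

A high bounce rate combined with a low time on site is the kind of pattern the text suggests would signal a low-quality page.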

Google also looks at branded search traffic, which is the number of people specifically looking
for a given site. So, if your basketball site is called "Fun Basketball Dude" and Google notices that an
increasing number of people are searching for "Fun Basketball Dude" as a way to get to your website,
that is a way for Google to recognize that your site is enjoyed by users.

Overall, these are "usage metrics" and they signal to Google how users value a particular webpage or
website. In the old days, unique content was important, but Google Panda wants unique content that is
also high quality content. And the usage metrics Google has in place will help it to determine if the
content that the reader lands on is truly high in quality.

If you are trying to rank well in Google - I think you should listen to what Google is saying. Instead of
trying to trick Google with Black Hat techniques, utilize Google's tips which will essentially improve
your site while boosting your chance at increased referral traffic from Google.

Panda and Penguin
Google's recent changes to how it ranks websites have started to have an effect on many businesses, and
the amount and type of SEO required to achieve high rankings has changed.

Off-Page SEO Checklist:
Perhaps it is no surprise that Google gives increasing prominence to its own social platform, Google+. Whilst its
use is still small compared to Facebook's, its importance for SEO is growing, so you should set up an account,
populate it with relevant content and start building your circles.

Together with on-page SEO detailed in our earlier blog, actioning the above will help you achieve
good search engine rankings and give you that competitive edge.

1. Relevancy now matters most.
With the latest algorithm updates, Google now places more importance on the relevance of a site's
overall theme. Within a site, relevancy usually means similarity between pages and posts with related
keywords. That's the reason why niche sites have good SERPs and site authority. It is probably not a
good idea to write one post on entertainment and another on technology.

2. Do more guest posts, and forget about article directories, for
effective link building.
Back in the old days, the most effective link-building strategy used by webmasters was submitting
articles to article directories (like Ezine Articles). But the algorithm has changed, and 2012 Google is much
more advanced than 2000 Google, so the strategy needs to change too.

Due to the low-quality content in most article directories, those sites were among the first to be hit during
the recent Panda updates. So, if you are looking to build backlinks, article directories are no
longer as effective as before.

Rather, I would recommend guest posting on some higher-PageRank blogs.

3.      Social Marketing

Be active on Social Networking, Panda loves to be social
This is one of the major changes and will mean a regular commitment to engage with clients via social
sharing platforms such as Facebook, Twitter, LinkedIn, Google+ etc. You will need to link back to your
website and use keywords in titles, copy and links without overdoing it. It's likely you will need daily
interactions to build a following, and should aim to engage for 80% of the time and sell for just 20%.

Google places much importance on real-time information and social media presence. Only
Google knows the exact impact of Twitter, Facebook, and LinkedIn on sites in its search results, but the
Google team has confirmed that search results now incorporate real-time sharing.

     4. Add visual interest. Got some videos? Google loves them.
     5. Website visitors love non-verbal content, so images and video are important, but
        for maximum effectiveness any such content needs to be optimized, so place
        keywords carefully in URLs, tags, titles etc.

Sites like YouTube and Metacafe were the real winners in the recent Panda update. What was the
reason behind it? Videos! Search engines, mostly Google, are giving much weight to videos,
so if you have some, or are planning to, you had better be fast. It might help your site's PageRank and
can improve your SERP positions.

      6. Avoid Excessive Advertisements.

Though excessive advertising on a site has never been an ideal practice, and Google has said so
plenty of times, with the recent updates Google is now much stricter about this.

Excessive advertising creates problems both for users and search bots. It doesn't make sense
to have so many ads, nor is it user-friendly. Too many ads can make your site meaningless in the
eyes of the search bot.

So, I recommend writing quality content (if possible, including images, videos and graphs) and
using limited ads.

As a webmaster you should treat your site as your own baby that needs your care and attention.
The above points might sound like common issues, but remember there is no longer any scope for small
mistakes; Google no longer overlooks even a slight one.

7. Content Marketing
Google likes to see links to your site from blogs, directories, Wikis and other authoritative and relevant
web pages. Again quality is the key here – quality content on quality websites is the thing to aim for.

8 – Site Speed and Server Time
Google Caffeine made load times an SEO ranking factor, so make sure your website sits on a fast UK-
based server, and avoid any downtime.

9 – Link building
Links are still important for SEO, but the two factors to think about are quality and timeliness. You want
links from relevant and well-ranked websites, and you want these to be built over time. Google will
penalise links to poor-quality and/or unconnected sites, and links which are created very quickly may get
a 'slap' for looking artificial; this will look like a spamming activity. Bad practice here can lead to
Google removing your site from searches, so use an ethical internet marketing agency for any link
building activity. Avoid automated tools.

10 – Link Anchor text
Don't over-link a keyword; it should perhaps account for no more than 20-30% of your links. Include
generic links (e.g. 'read more' and 'visit http://www…') and naked links to create a variety.
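One way to sanity-check the 20-30% guideline is to tally the anchor texts in your backlink profile. A rough sketch; the anchor list, the GENERIC set and the 30% threshold are illustrative assumptions drawn from the text, not an official rule:

```python
from collections import Counter

# Sketch: flag anchor texts that dominate a link profile, per the
# 20-30% rule of thumb above. Data and threshold are illustrative.
GENERIC = {"read more", "click here", "visit site", "here"}

def anchor_report(anchors, threshold=0.30):
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    # Return non-generic anchors whose share exceeds the threshold.
    return {a: n / total for a, n in counts.items()
            if a not in GENERIC and n / total > threshold}

anchors = (["cheap widgets"] * 6 + ["read more"] * 2
           + ["www.example.com", "cheap widgets"])
print(anchor_report(anchors))  # 'cheap widgets' is 7 of 10 links
```

A result like this, with one keyword anchor at 70%, is exactly the kind of profile the text suggests Penguin might flag.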

Chapter 14

                           Google Ranking Factors -
                                SEO Checklist

    Google Ranking Factor Checklist

         1. Positive ON-Page SEO Factors

         2. Negative ON-Page SEO Factors

         3. Positive OFF-Page SEO Factors

         4. Negative OFF-Page SEO Factors

    Notes for the Above Factors · Brief Google Update List – Panda · The Sand Box · Sources

    There are "over 200 SEO factors" that Google uses to rank pages in the Google search
    results (SERPs). What are the search engine optimization rules?
    Here is the speculation - educated guesses by SEO webmasters on top webmaster forums.
    Should you wish to achieve a high ranking, the various confirmed and suspected Google
    Search Engine Optimization (SEO) Rules are listed below.

    The SEO Rules listed below are NOT listed by weight, and not by any presumed relevance - THAT exercise is left up to
    the reader!

                      1.                    Alleged
                                      POSITIVE ON-Page
                                SEO Google Ranking Factors (38)
                   (Keeping in mind the converse, of course, that when violated, some of these
                    factors immediately jump into the NEGATIVE On-Page Ranking Factors)

                           The term "Keyword" below refers to the "Keyword Phrase",
                                       which can be one word or more.
                        Green rows confirmed by Google patent of Aug. 10, 2006

Note -
Claim     Factor                 POSITIVE
  #         #           ON-Page SEO Factors                         Brief Note

                                                        Google patent - Topic extraction
    50         -             KEYWORDS                   For keyword selection,
                                                        try Google Ad Words - Google Trends

HOT            1    Keyword in URL                      First word is best, second is second best, etc.

HOT            2    Keyword in Domain name              Same as in page-name-with-hyphens

-                   Keywords - Header
                                                        Keyword in Title tag - close to beginning
HOT            3    Keyword in Title tag                Title tag 10 - 60 characters, no special characters

                                                        Shows theme - less than 200 chars.
-              4    Keyword in Description meta tag     Google no longer "relies" upon this tag, but
                                                        will often use it.

-              5    Keyword in Keyword metatag          Shows theme - less than 10 words.
                                                        Every word in this tag MUST appear
                                                        somewhere in the body text. If not, it can be
                                                        penalized for irrelevance.
                                                        No single word should appear more than twice;
                                                        if it does, it may be considered spam. Google
                                                        purportedly no longer uses this tag, but other engines may.
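The length guidelines in the rows above (title tag 10-60 characters, description meta tag under 200) are straightforward to audit with a script. A hedged sketch using Python's standard html.parser; the sample page is invented and the limits are taken from this table, not from Google documentation:

```python
from html.parser import HTMLParser

# Sketch: audit <title> and meta-description lengths against the
# guidelines in the table above. Sample HTML is illustrative.
class TagAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = ('<html><head><title>Pet Supplies - Dog Toys</title>'
        '<meta name="description" content="Quality dog toys at low prices.">'
        '</head></html>')
p = TagAudit()
p.feed(html)
print(10 <= len(p.title) <= 60)   # title length within the 10-60 guideline?
print(len(p.description) < 200)   # description under 200 characters?
```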

-                   Keywords - Body
-              6    Keyword density in body text        5 - 20% - (all keywords/ total words)
                                                        Some report topic sensitivity - the keyword
                                                        spamming threshold % varies with the topic.

-              7            Individual keyword density 1 - 6% - (each keyword/ total words)
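The density ranges in these two rows (5-20% for all keywords combined, 1-6% per keyword) can be computed directly as keyword occurrences divided by total words. A minimal sketch; the sample text is illustrative and the simple tokenization is an assumption:

```python
import re

# Sketch: keyword density as described in the rows above -
# occurrences of each keyword divided by total words.
def keyword_density(text, keywords):
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    return {kw: words.count(kw.lower()) / total for kw in keywords}

text = ("Basketball drills for beginners. These basketball drills "
        "build basketball skills.")
d = keyword_density(text, ["basketball", "drills"])
print(d)  # 'basketball' appears 3 times in 10 words
```

By the table's rule of thumb, a per-keyword density above roughly 6% would start to look like keyword stuffing.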

HOT            8    Keyword in H1, H2 and H3            Use Hx font style tags appropriately

                                                        "Strong is treated the same as bold, italic is
-              9    Keyword font size                   treated the same as emphasis" . . . Matt Cutts
                                                        July 2006

-              10   Keyword proximity (for 2+ keywords) Directly adjacent is best

                                                        Does word order in the page match word order
-              11   Keyword phrase order                in the query?
                                                        Try to anticipate query, and match word order.

-              12   Keyword prominence (how early in    Can be important at top of page, in bold, in

                         page/tag)                         large font

-                        Keywords - Other
-               13       Keyword in alt text               Should describe graphic - Do NOT fill with keywords
                                                           (Was part of Google Florida OOP - tripped a
                                                           threshold - may still be in effect to some
                                                           degree as a red flag, when summed with all
                                                           other on-page optimization - total page
                                                           optimization score - TPOS).

                         Keyword in links to site pages
-               14                                         Links out anchor text use keyword?
                         (anchor text)

-                        NAVIGATION - INTERNAL LINKS
SITE            15       To internal pages- keywords?      Link should contain keywords.
                                                           The filename linked to should contain the keyword.
                                                           Use hyphenated filenames, but not long ones -
                                                           two or three hyphens only.

                                                           Validate all links to all pages on site.
SITE            16       All Internal links valid?
                                                           Use a free link checker. I like this one.
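Validating internal links, as this row advises, is easy to automate. A sketch using only the Python standard library; it simply reports links that fail to resolve, and a real crawl would need politeness delays and broader error handling beyond this:

```python
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

# Sketch: report broken links from a list of URLs, as the row above
# advises. Any URLs passed in are the caller's; none are assumed real.
def check_links(urls, timeout=10):
    broken = []
    for url in urls:
        try:
            req = Request(url, method="HEAD")  # HEAD avoids downloading bodies
            urlopen(req, timeout=timeout).close()
        except HTTPError as e:    # 4xx/5xx response
            broken.append((url, e.code))
        except URLError as e:     # DNS failure, refused connection, etc.
            broken.append((url, str(e.reason)))
    return broken

# Example (requires a live site):
# print(check_links(["https://example.com/", "https://example.com/missing"]))
```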

                                                           TRY FOR two clicks to any page - no page
SITE            17       Efficient - tree-like structure
                                                           deeper than 4 clicks

SITE            18       Intra-site linking                Appropriate links between lower-level pages

    54          -        NAVIGATION - OUTGOING LINKS

    55               19 To external pages- keywords?       Google patent - Link only to good sites. Do
                                                           not link to link farms. CAREFUL - Links can
                                                           and do go bad, resulting in site demotion.
                                                           Unfortunately, you must devote the time
                                                           necessary to police your outgoing links - they
                                                           are your responsibility.

                                                           Google patent - Should be on topic,
    56               20 Outgoing link Anchor Text

                     21 Link stability over time           Google patent - Avoid "Link Churn"

-               22       All External links valid?         Validate all links periodically.

-               23       Less than 100 links out total     Google says limit to 100,

                                                           but readily accepts 2-3 times that number.

-              121   Linking to Authority                  Some say this gives a boost -
             (added)                                       Others say that is absurd. However, it certainly
                                                           is the opposite of linking to trash, which WILL
                                                           hurt you.

-                          OTHER ON-Page Factors
-                24        Domain Name Extension             .gov sites seem to be the highest status
                           Top Level Domain - TLD            .edu sites seem to be given a high status
                                                             .org sites seem to be given a high status
                                                             .com sites excel in encompassing all the
                                                             spam/ crud sites, resulting in the need for the
                                                             highest scrutiny/ action by Google.
                                                             Perhaps one would do well with the new
                                                             .info domain class. (Update: Nope.
                                                             Spammers jumped all over it - no safe haven
                                                             there. Not so much now, though - .info sites can
                                                             rank highly.)

-                25        File Size                         Try not to exceed 100K page size (however,
                                                             some subject matter, such as this page,
                                                             requires larger file sizes).
                                                             Smaller files (<40K) are preferred.

-                26        Hyphens in URL                    Preferred method for indicating a space,
                                                             where there can be no actual space
                                                             One or two= excellent for separating
                                                             keywords (i.e., pet-smart, pets-mart)
                                                             Four or more= BAD, starts to look spammy
                                                             Ten = Spammer for sure, demotion
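The hyphen rule of thumb above can be expressed as a tiny checker. A sketch; the thresholds mirror the table (one or two excellent, four or more spammy-looking, ten a sure sign of spam), while the verdict labels and example URLs are my own illustrations:

```python
from urllib.parse import urlparse

# Sketch: classify a URL slug by hyphen count, per the rule of thumb above.
def hyphen_verdict(url):
    slug = urlparse(url).path.rsplit("/", 1)[-1]  # last path segment
    n = slug.count("-")
    if n <= 2:
        return "excellent"
    if n < 4:
        return "acceptable"
    if n < 10:
        return "bad - starts to look spammy"
    return "spammer for sure"

print(hyphen_verdict("https://example.com/pet-smart"))
print(hyphen_verdict("https://example.com/buy-cheap-best-pet-food-now-online"))
```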

    6, 7              27 Freshness of Pages                  Google patent - Changes over time
    12,                                                      Newer the better - if news, retail or auction!
     13                                                      Google likes fresh pages. So do I.

                           Freshness - Amount of Content
    8, 9              28                                     New pages - Ratio of old pages to new pages

    27                29 Freshness of Links                  Google patent - May be good or bad.
                                                             Excellent for high-trust sites;
                                                             may not be so good for newer, low-trust sites.

                                                                 Frequent updates = frequent spidering =
-             30       Frequency of Updates
                                                                 newer cache

-             31       Page Theming                              Page exhibit theme? General consistency?

                                                                 Stem, stems, stemmed, stemmer,
-             32       Keyword stemming
                                                                 stemming, stemmist, stemification

-             33       Applied Semantics                         Synonyms, CIRCA white paper

-             34       LSI                                      Latent Semantic Indexing - speculation, no
                                                                 confirmation

                                                                 Keep it minimized - use somewhat less than
-             35       URL length                                the 2,000 characters allowed by IE - less
                                                                 than 100 is good, less is even better

-                      OTHER ON-SITE Factors

    5              36 Site Size - Google likes big sites         Larger sites are presumed to be better
                                                                 funded, better organized, better constructed,
                                                                 and therefore better sites. Google likes
                                                                 LARGE sites, for various reasons, not all
                                                                 positive. This has resulted in the advent of
                                                                 machine-generated 10,000-page spam sites -
                                                                 size for the sake of size. Google has caught
                                                                 on and dumped millions of pages, or made
                                                                 them supplemental.

    4              37 Site Age                                   Google patent - Old is best. Old is Golden.

                                                                 Age of page vs. age of other pages on site
    3              38 Age of page vs. age of site                Newer pages on an older site will get indexed faster.

          Note: For ALL the POSITIVE On-Page factors listed above,
          PAGE RANK can OVERRIDE them all. So can Google-Bombing.

                              2. Alleged
                          Negative ON-Page
                    SEO Google Ranking Factors (24)

         Factor             NEGATIVE
Note       #         ON-Page SEO Factors            Brief Note
BAD          39   Text presented in graphics form     Text represented graphically is invisible to
                  only - no ACTUAL body text on       search engines.
                  the page

BAD          40   Affiliate site?                   The Florida update went after affiliates with a
                                                    vengeance - flower and travel affiliates were
                                                    hit hard - cookie-cutter sites with massive
                                                    inter-linking, but little unique content.
                                                    Subsequent updates have also targeted affiliate sites.

BAD          41   Over optimization penalty (OOP)   Penalty for over-compliance with well-
                                                    established, accepted web optimization
                                                    practices. Too high keyword repetition
                                                    (keyword stuffing) may get you the OOP.
                                                    Overuse of H1 tags has been mentioned. Meta-
                                                    tag stuffing.

BAD          42   Link to a bad neighborhood        Don't link to link farms, FFAs (Free For All's)
                                                    Also, don't forget to check the Google status of
                                                    EVERYONE you link to periodically. A site
                                                    may go "bad", and you can end up being
                                                    penalized, even though you did nothing. For
                                                    instance, some failed real estate sites have been
                                                    switched to p0rn by unscrupulous webmasters,
                                                    for the traffic. This is not good for you, if you
                                                    are linking to the originally legitimate URL.

                                                    Don't immediately send your visitor to another
BAD          43   Redirect thru refresh metatags    page other than the one he/ she clicked on,
                                                    using meta refresh.

BAD          44   Vile language - ethnic slur       Including the George Carlin 7 bad words you
                                                    can't say on TV, plus the 150 or so that
                                                    followed. Don't shoot yourself right straight in
                                                    the foot. Also, avoid combinations of normal
                                                    words, which when used together, become

                                                         something else entirely - such as the word
                                                         juice, and the word l0ve. See why I wrote that
                                                         zero? I don't even want to get a proximity
                                                         penalty, either. Paranoia, or caution? You
                                                         decide. I always want to try to put my "best
                                                         foot forward".

BAD         45       Poison words                        The word "Links" in a title tag has been
                                                         suggested to be a bad idea. Here is my list
                                                         of Poison Words for Adsense. This penalty has
                                                         been loosened - many of these words now
                                                         appear in normal context, with no problems.
                                                         But watch your step.

BAD         46       Excessive cross-linking             - within the same C block
                                                         If you have many sites (>10, author's guess)
                                                         with the same web host, prolific cross-linking
                                                         can indicate more of a single entity, and less of
                                                         democratic web voting. Easy to spot, easy to
                                                         discount.
                                                         "This does not apply to a small number of
                                                         sites" (this author guesses the number is 10,
                                                         JAWG) "hosted on a local server" - Matt
                                                         Cutts, July 2006
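
The "C block" here is the third-octet group of an IPv4 address (a /24 network). As a minimal sketch of why this is "easy to spot" - with made-up site names and documentation IP addresses - sites sharing a block can be grouped mechanically:

```python
from collections import defaultdict

def c_block(ip):
    """Return the class-C block (/24) of a dotted-quad IPv4 address."""
    return ".".join(ip.split(".")[:3])

def cluster_by_c_block(site_ips):
    """Group sites that share the same /24 - heavy cross-linking inside
    one group can look like a single entity, not independent votes."""
    groups = defaultdict(list)
    for site, ip in site_ips.items():
        groups[c_block(ip)].append(site)
    return {block: names for block, names in groups.items() if len(names) > 1}

# Hypothetical example data (not real hosts):
sites = {
    "site-a.example": "203.0.113.10",
    "site-b.example": "203.0.113.25",
    "site-c.example": "198.51.100.7",
}
print(cluster_by_c_block(sites))  # {'203.0.113': ['site-a.example', 'site-b.example']}
```

Anyone - including a search engine - can resolve your domains' IPs and run exactly this kind of grouping, which is why many cross-linked sites on one block stand out.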

BAD         47       Stealing images/text blocks from    Copyright violation - Google responds strongly
                     another domain                      if you are reported (ref: egol).
                                                         File a Google DMCA complaint.

BAD         48       Keyword stuffing threshold          In body, meta tags, alt text, etc. = demotion
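
Nobody outside Google knows the actual stuffing threshold. As a rough self-check, keyword density can be computed like this (a sketch; the 5% warning level is an arbitrary illustrative number, not a known Google limit):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

body = "cheap widgets cheap widgets buy cheap widgets today"
d = keyword_density(body, "cheap")
print(f"{d:.1%}")          # 3 of 8 words -> 37.5%
if d > 0.05:               # arbitrary illustrative threshold, not Google's
    print("possible keyword stuffing")
```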

??          49       Keyword dilution                    Targeting too many unrelated keywords on a
                                                         page, which would detract from theming, and
                                                         reduce the importance of your REALLY
                                                         important keywords.

??               50 Page edit - can reduce consistency   Google patent -
                                                         Google is now switching between a "newer"
                                                         cache, and several "older" caches, frequently
                                                         drawing from BOTH at the same time.
                                                         This was possibly implemented to frustrate
                                                         SERP manipulators. Did your last edit
                                                         substantially alter your keywords, or theme?
                                                         Expect noticeable SERP bouncing.

6-7              51 Frequency of Content Change          Google patent - Too frequent = bad

-                    52 Freshness of Anchor Text             Google patent - Too frequent = bad

    ??           53       Dynamic Pages                      Problematic - know the pitfalls - shorten URLs,
                                                             reduce variables (". . no more than 2 or 3",
                                                             M.Cutts July 2006), lose the session IDs

    ??          54       Excessive Javascript                  Don't use for redirects, or hiding links

    ??          55       Flash page - NOT recommended          Most (all?) SE spiders can't read Flash content.
                                                               Provide an HTML alternative, or experience
                                                               lower SERP positioning.

    ??          56       Use of Frames                         Spidering Problems with Frames - STILL

-               57       Robot exclusion "no index" tag        Intentional self-exclusion
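
For reference, the self-exclusion in factor 57 is the robots meta tag. A toy checker for it (naive regex matching, assuming simple well-formed HTML - real crawlers parse properly):

```python
import re

def is_noindexed(html):
    """True if the page carries a robots meta tag containing 'noindex'."""
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE,
    )
    return bool(meta) and "noindex" in meta.group(1).lower()

page = '<head><meta name="robots" content="noindex, nofollow"></head>'
print(is_noindexed(page))  # True
```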

-               58       Single pixel links                    A red flag - one reason only - a sneaky link.

-               59       Invisible text                        OK - No penalty? - Google advises against
                                                               this. It is all over the place, but nothing is
                                                               ever done. (The text is the same color as the
                                                               background, and hence cannot be seen by the
                                                               viewer, but is visible to the search engine
                                                               spiders.) I believe Google does penalize
                                                               hidden text, since it is an attempt to
                                                               manipulate rank - although they don't catch
                                                               everyone.
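
In the simplest case - inline styles with named colors - hidden text is just text whose color equals its background color. A toy detector under exactly that assumption (real pages would need full CSS resolution):

```python
import re

def has_hidden_text(html):
    """Flag elements whose inline font color equals the inline background
    color. Toy heuristic - it ignores stylesheets, hex vs. named color
    equivalence, and off-screen positioning tricks."""
    pattern = re.compile(
        r'style="[^"]*color:\s*(?P<fg>#?\w+)[^"]*background(?:-color)?:\s*(?P<bg>#?\w+)',
        re.IGNORECASE,
    )
    return any(m.group("fg").lower() == m.group("bg").lower()
               for m in pattern.finditer(html))

snippet = '<p style="color: white; background-color: white">free free free</p>'
print(has_hidden_text(snippet))  # True
```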

-               60       Gateway, doorway page                 OK - No penalty - Google advises against
                         (I see changes here - not only does   this. Google used to reward these pages.
                         the doorway page disappear, but the   Multiple entrance pages in the top ten SERPs -
                         main page gets pushed down, as        I see it daily. There they are at #2, with their
                         well - this is a welcome fix.)        twin at #5 - for 6 months now. Reported
                                                               numerous times.

-               61       Duplicate content (YOURS)             OK - No penalty - Google advises against
                         Duplicate content (THEIRS) below      this.
                         (Hijack)                              Google picks one (usually the oldest), shoves
                                                               it to the top, and pushes the second choice
                                                               down. This has been a big issue with stolen
                                                               content - the thief usurps your former
                                                               position with YOUR OWN content.
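
How Google actually detects duplicates is unpublished; a standard textbook technique is w-shingling - comparing sets of overlapping word n-grams with Jaccard similarity. A sketch:

```python
def shingles(text, w=3):
    """Set of overlapping w-word shingles from `text`."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def similarity(a, b, w=3):
    """Jaccard similarity of the two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a, w), shingles(b, w)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original = "the quick brown fox jumps over the lazy dog"
scraped  = "the quick brown fox jumps over a lazy dog"
print(round(similarity(original, scraped), 2))  # 0.4
```

A near-copy with a couple of words changed still scores far above unrelated text, which is what makes shingling useful for spotting scraped content.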

-               62       HTML code violations                  Doesn't matter - Google advises against this.
                         (The big G does not even use          Unless of course, the page is totally FUBAR.
                         DOCTYPE declarations, required        Simple HTML validation is NOT required
                         for W3C validation.)                  (but advised, since it could contribute to your
                                                               page quality factor - PQF).

-     -        Since the above 4 items are so          IN GENERAL, this works pretty well to keep
               controversial, I would like to add      webmasters in line. The fallacy of this is that
               this comment:                           attentive webmasters can readily observe
               There are many things that Google       continuing, blatant exceptions to these official
               would LIKE to have webmasters do,       pronouncements.
               but that they simply cannot control,
               due to logistical considerations.       There are many anecdotes about Google
               Their only alternative is to foment     "taking care" of a problem. Google states that
               fear and doubt by implying that any     they do not provide hand-tweaked "boosts",
               violation of their "suggestions" will   but are silent about hand-tweaked demotions.
               result in swift and fierce demotion.    They occur, for sure. To believe otherwise is
               (This is somewhat dated - G is          naive. Wouldn't YOU swat the most obnoxious
               fixing these things.)                   flies? I would.

                                                       It is becoming easier to determine the best
                                                       thing to do. Try to avoid any Google penalties
                                                       or demotions.

-             Phrase-based ranking, filters,           Feb. 2007 - Google patent granted. Do not use
              penalties                                phrases that have been associated and
                                                       correlated with known spamming techniques,
                                                       or you will be penalized. What phrases? Ahh,
                                                       you tell me.

-       122   Poor spelling and grammar                Pages that are higher quality and more
      (added)                                          reputable (i.e. higher PageRank) tend to use
                                                       better spelling and grammar. Demotion for bad
                                                       spelling is highly logical.

                                   3. Alleged
                               POSITIVE OFF-Page
                         SEO Google Ranking Factors (43)

            Factor           POSITIVE
    Note      #       OFF-Page SEO Factors               Brief Note
-                    INCOMING LINKS :

HOT          63      Page Rank                           Based on the number and quality of links to you.
                                                         Google link reporting continues to display just
                                                         a SMALL fraction of your actual backlinks,
                                                         and they are NOT just greater than PR4 - they
                                                         are mixed.
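
Unlike most factors on this list, the original PageRank formula is public (Brin & Page, 1998): a page's score is divided among the pages it links to, with a damping factor (commonly 0.85). A minimal power-iteration sketch on a made-up 3-page web:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: {page: [pages it links to]}. Returns approximate PageRank scores."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outs in links.items():
            if not outs:                      # dangling page: spread rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:                             # split this page's rank among its links
                for out in outs:
                    new[out] += damping * rank[page] / len(outs)
        rank = new
    return rank

toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(toy_web)
print(max(ranks, key=ranks.get))  # c
```

Note that "c", which is linked by both other pages, collects the highest score - the number AND quality of incoming links is exactly what the iteration rewards.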

-            64      Total incoming links                Historically, FAST counted best.
                     ("backlinks")                       No more - Yahoo (its parent) broke it.

                                                         In Yahoo search, type in:

                                                         Try MSN -

                                                         Current TYPICAL Backlink Reporting
                                                         Ratios -
                                                         Google - 30 links
                                                         MSN - 1,000 links
                                                         Yahoo - 3,000 links

-            65      Incoming links from high-           In 2004, Google used to count (report) the
                     ranking pages                       links from all PR4+ pages that linked to you.
                                                         In 2005-2006, Google reported only a small
                                                         fraction of the links, in what seemed like an
                                                         almost random manner. In Feb. 2007, Google
                                                         markedly upgraded (increased) the number of
                                                         links that they report.

-              66 Acceleration of link popularity        Google patent
                  (". . . used to be a good thing" ...   Link acquisition speed boost - speculative
                  Martha)                                Too fast = artificial? Cause of -30 penalty?

                                                       Sandbox penalty imposed if new site?

-                  FOR EACH INCOMING LINK :

-           67     Page rank of the referring page     Based on the quality of links to you

HOT         68     Anchor text of                      Contains keyword, key phrase?
                   inbound link to you                 #1 result in SERP does NOT EVEN need to
                                                       have the keyword(s) on the page,
                                                       ANYWHERE!!! What does that tell you?
                                                       (Enables Google-bombing - search for
                                                       "miserable failure")

-             69 Age of link                           Google patent - Old = Good.

-             70   Frequency of change of anchor       Google patent - Not good. Why would you do
                   text                                that?

-           71     Popularity of referring page        Popularity = desirability, respect

-           72     # of outgoing links on referrer     Fewer is better - makes yours more important

-           73     Position of link on referrer page   Early in HTML is best

-           74     Keyword density on referring page   For search keyword(s)

-           75     HTML title of referrer page         Same subject/ theme?

    28        76 Link from "Expert" site?              Google patent - Big-time boost (Hilltop).
                                                       Recently reported to give a big boost!

-           77     Referrer page - Same theme          From the same or related theme? BETTER

-           78     Referrer page - Different theme     From different or unrelated theme? WORSE

-           79     Image map link?                     Problematic?

-           80     Javascript link?                    Problematic- attempt to hide link?

-                  DIRECTORIES :

-           81    Site listed in DMOZ Directory?      This is a tough one.
                                                      Google's directory comes STRAIGHT from
                  The "Secret Hand" DMOZ Issues:      the DMOZ directory. You should try to get
                  1. Legitimate sites CAN'T GET IN    into dmoz. But you can't.
                  2. No Accountability                Be careful whom you approach with the old
                  3. Corrupt Editors                  spondulix -
                  4. Competitive Sites Barred         Formal DMOZ Bribe Instructions.
                  5. Dirty Tricks Employed            It is almost impossible to get into DMOZ. This
                  6. Rude dmoz editors                site cannot get in, after waiting over 2
                                                      YEARS (33 months). Not even in the lowest,
                  Flawed concept - communism          most insignificant category, "Personal
                  doesn't work.                       Pages". I guess I just don't "measure up" to the
                  Free editing? Nothing is free.      other 20,000+ sites in the personal category.
                  DMOZ Sucks Discussions              I'm not the suck-up type - I kissed them off
                  DMOZ Problems Discussions           long ago. What a waste of time!

                  The Google Directory is             UPDATE: This page (not site) finally got
                  produced by an unknown,             indexed in June 2007, thanks to a legitimate
                  ungoverned, unpoliced, ill-         editor. No money was paid.
                  intentioned, retaliatory,
                  monopoly enterprise, consisting     Google needs to DO SOMETHING about
                  of profiteering power-ego editors   populating its own directory with the
                  feathering their own nests - the    skewed, incomplete, poorly determined
                  ODP. AOL is making millions,        results from the dysfunctional Open
                  and needs to police its run-amok
                  entity. Enough already!             Absolute Power Corrupts Absolutely

-           82    DMOZ category?                      Theme fit category?
                                                      General or geographic category? Both are
                                                      possible, and acceptable.

    HOT     83    Site listed in Yahoo Directory?     Big boost - You can get in by paying $299
                                                      each year.
                                                      Many swear it is worth it - many swear it isn't.

-           84    Site listed in LookSmart            Boost? Another great vote for your site.

            85    Site listed in Inktomi?             Inktomi has been absorbed internally by Yahoo.

-           86    Site listed in other directories    Directory listing boost (If other RESPECTED
                  (About, BOTW, etc.)                 directories link to you, this must be positive.)

-           87    Expert site?                        Large-sized site, quality incoming links
                  (Hilltop or Condensed Hilltop)

HOT           88 Site Age - Old shows                 Google patent -
                  stability                           Boost for long-established sites; new pages
                                                      indexed easily.
                                                      The opposite of the sandbox.

-           89    Site Age - Very New Boost           Temporary boost for very new sites - I estimate

                                                       that this boost lasts from 1 week to 3 weeks -
                                                       Yahoo does it too.

-             90     Site Directory - Tree Structure   Influences SERPs - logical, consistent
                                                       structure

-             91     Site Map and more site map        Complete - keywords in anchor text

-             92     Site Size                         Previously, many pages preferred - conferred
                                                       authority upon site, thus page. Bigger sites =
                                                       better SERPs
                                                       Now, fewer pages preferred, due to
                                                       proliferation of computer-generated pages.
                                                       Google has been dropping pages like crazy.

-             93     Site Theming                      Site exhibit theme? Use many related terms?
                                                       Have you used a keyword suggestion tool?
                                                       A thesaurus?

                     PAGE METRICS - USER               Currently implemented through the Google
                     BEHAVIOR :                        toolbar?

    34, 35      94 Page traffic                        Google patent - # of visitors, trend

15,16,21        95 Page Selection Rate - CTR           Google patent - How often is a page clicked

    36, 37      96 Time spent on page                  Google patent - Relatively long time
                                                       indicates a relevance hit

    45, 46      97 Did user Bookmark page?             Google patent - Bookmark = Good

     47         98 Bookmark add/removal                Google patent - Recent = Good?

-             99     How they left, where they went    Back button, link clicked, etc.

                     SITE METRICS - USER               Currently implemented through the Google
                     BEHAVIOR :                        toolbar?

    34, 35     100 Site Traffic                        Google patent - # of visitors, increasing trend
                                                       = good
                                                       = good

-             101    Referrer                          Authoritative referrer?

-             102    Keyword                           Keyword searches used to find you

-             103    Time spent on domain              Relatively long time = indicates relevance hit
                                                       Add brownie points.

     38               -      DOMAIN OWNER BEHAVIOR :

     40               104 Domain Registration Time              Google patent - Domain Expiration Date.
                                                                Register for 5 years, and Google knows you are
                                                                staying. Register for 1 year, and is it a
                                                                throw-away domain?

     39               105 Are associated sites legitimate?      Google patent - No spam, ownership, etc.

                                       4. Alleged
                                  NEGATIVE OFF-Page
                             SEO Google Ranking Factors (13)

            Factor                NEGATIVE
Note          #           OFF-Page SEO Factors               Brief Note
-               120 Traffic Buying                           Have you paid a company for web traffic? It is
            (added)                                          probably low-quality traffic, with a zero
                                                             conversion rate. Some providers of traffic for
                                                             traffic's sake may be considered "bad
                                                             neighborhoods". Can Google discount your traffic
                                                             (for true popularity), because they know it's mostly
                                                             junk? Have you read about Traffic Power?

    22-          106 Temporal Link Analysis                  In a nutshell, old links are valued, new links
    29                                                       are not.
                                                             This is intended to thwart rapid incoming-link
                                                             accumulation, accomplished through the tactic
                                                             of link buying.
                                                             Just one of the sandbox factors.

    18           107 Change of Meanings                      Query meaning changes over time, due to
                                                             current events.

BAD             108       Zero links to you                  You MUST have at least 1 (one) incoming link
                                                             (back link) from some website somewhere, that
                                                             Google is aware of, to REMAIN in the index.

BAD              109 Link-buying                             Google patent - Google hates link-buying,
                                                             because it corrupts their PR model in the worst
                          (Very good IF you don't get        way possible.
                          caught,                            1. Does your page have links it really doesn't
                          but don't do it -                  merit?
                          when caught, the penalty isn't     2. Did you get tons of links in a short time
                          worth it.)                         period?
                                                             3. Do you have links from high-PR, unrelated
                                                             pages?

                 110 Prior Site Ranking                 Google patent - High = Good

BAD             111   Cloaking                          Google promises to Ban! (Presenting one webpage
                                                        to the search engine spider, and another webpage
                                                        to everybody else.)

    ??          112   Links from bad                    Google says that incoming links from bad sites
                      neighborhoods, affiliates         can't hurt you, because you can't control them.
                                                        Ideally, this would be true.
                                                        However, some speculate otherwise, esp., when
                                                        other associated factors are thrown into the mix,
                                                        such as web rings.

BAD             113   Penalties - resulting from        Should result in IMPRISONMENT, forthwith!
                      Domain Hijacking                  Grand Theft, mandatory minimum sentence.
                      (work with Google to fix)         The criminal COPIES your entire website, and
                                                        HOSTS it elsewhere, with . . . a few changes.

-               114   Penalty - Google TOS violation    WebPosition Gold is the worst offender - it
                                                        gobbles up tons of Google server time for nervous
                                                        Nellie webmasters. Google even mentions it by
                                                        name. I think that Google will spank you when
                                                        you cross a threshold of, say, 100 queries per day
                                                        for the same term, from the same IP. Google can
                                                        block your IP. Get a Google API key.

    ??          115   Server Reliability - S/B >99.9%   What is your uptime? Ever notice a daily time
                                                        when your server is unavailable, like about 1:30
                                                        AM? How diligent must Googlebot be? This is the
                                                        worst reason to get dropped - you just aren't there!
                                                        An ISP maintenance interruption can cause the
                                                        same problem.
-               116   No more room                      The 2^32 problem - Google has hit the 4.3 billion
                      Pages being dropped from large    document address-space wall. Bull! Google now
                      sites                             indexes over 8 billion pages.
                                                        Thousands of pages are disappearing from various
                                                        huge websites, but I think that it is G just cleaning
                                                        house, by dumping computer-generated pages.

                117   Rank Manipulation by              Impossible, by Google's definition (except for a
                      Competitor Attack                 few nasty tricks, like making your competition
                                                        appear to be link spammers).
                (1. Content theft causing you to   Ideally, there SHOULD be nothing that your
                get a duplicate content penalty,   competition can do to directly hurt your rankings.
                even though your content is the
                original - Google has problems     However, an astute observer noticed that Google
                tracking original authorship.      changed their website to read :
                People are still stealing my       Old verbiage = "There is nothing a competitor can
                content, but nobody trumps me      do to harm your ranking ..."
                (in Google) with my own            New verbiage = "There is ALMOST nothing a
                content - hats off to Google.)     competitor can do ..."
                                                   An obvious concession that Google thinks that at
                Examples -                         least some dirty tricks work!
                Site-Wide Link Attack
                and                                Of course, there will always be new ones!
                302 Redirect Attack
                Hijacker Attack
-         118   Bouncing Ball Algorithm            At least 2, and often 3 identifiable Google Search
                                                   Algos are currently in use, alternating pseudo-
                                                   randomly through the data centers.
                                                   G has moved to a daily dance. Multiple
                                                   changing factors are applied daily. GOOD
                                                   LUCK NOW on trying to figure things out!

                                                   IN ADDITION, some of the above factors are being
                                                   "tweaked" daily. Not only are the "weights" of the
                                                   factors changed, but the formula itself changes.
                                                   Change is the only constant.

                                                   An algo change can boost or demote your site. I
                                                   put this in the negative factors section, because
                                                   your position is never secure, unless of course, you
                                                   are huge (PR=7 or greater). If you simply cannot
                                                   achieve top position, your only alternative for
                                                   first-page SERP exposure may be Google AdWords
                                                   (you pay for exposure).

                                                   Today, I searched for an extremely competitive
                                                   "2-word term", and I found that NOT ONE of the
                                                   top ten Google SERPs had even one of the words
                                                   on the page.
                                                   Today's theory - when it doesn't matter, anybody

                                                   can get #1 in a second, if they know the on-page
                                                   rules. BUT, after a certain "commercial
                                                   competitive level", the "semantic analysis" algo
                                                   kicks in, and less becomes more. The keyword
                                                   density rules are flipped upon their noggins. I
                                                   think that we are witnessing the evolution of
                                                   search engine anti-seo sophistication, right before
                                                   our very eyes. Fun stuff.

                          Notes to the Above 122 Google Ranking Factors

1 I have tried to summarize the best opinions of many webmaster forum posters.

2 There are no published rules - this is my continuously changing compilation of SEO chatter.
  This is my semi-annual, one-way technical Google ranking blog, if you will.

3 If your keywords are Rare and Unique, then Page Rank doesn't matter.

4 If your keywords are very Competitive, then Page Rank becomes very important.

5 The fewer incoming links that you have, the more important on-page factors are, for
  noncompetitive terms.

6 There are a million ifs, ors, buts . . . I am attempting a concise summary.
  Exceptions to EACH of the POSITIVE ON-Page factors are frequent and many.
  However, I feel that it is important to score highly on as many factors as possible, since factor
  weight and even factor consideration are changing constantly - CYA. Not to mention the other

7 A few words about the LANGUAGE used on the Google site -
  in a phrase - "soft spoken". We see it everywhere these days.
  I am referring to understatement, sometimes even to the point of confusion.
  "significant", "may", etc.
  For example, when Google states that maybe it might not be a good idea to do a particular thing,
  what they SOMETIMES really mean is "If you do it, you are history".
  Some Google suggestions are actually commands (STRONG HINTS) in disguise.
  At some point, you begin to realize this.
  Google just can't tell us everything, literally. Sooooooo, take the hints.

8 Become religious. Seek the light. It's there, but you gotta look.
  LISTEN UP! Read the rules. Read between the lines. Carefully.
  Differentiate. Project. Carefully analyze your own situation.

   Webmaster Guidelines

    How does Google rank pages?

    Google Facts and Fiction

    Search Engine Optimizers

                                 Historical Updates
                   The "Panda" Update of February 23, 2011
                        (AKA "Farmer" Update, "Farm" Update)

  Much is being said about the Google "Content Farm" update. Opinions and counter opinions are
  rampant. Here are some theories and conclusions that have been advanced, by webmasters at
                                       Webmaster World.

Site Problems
1. Determine which of your pages were hardest hit.
2. "Thin" content is being hit hard.
3. "Article sites" are being hit hard.
4. "Copied content" is being hit hard.
5. "Duplicate content" is being hit hard.
6. Sites with high ad-to-content ratios are being hit hard.
7. Sites using AdSense are NOT being hit as hard (eHow).

Suggested Solutions
1. Immediately drop all your "hardest hit" pages.
2. Add more original content to your pages.
3. Get out of the "article submission" business.
4. Eliminate all scraped, spun, and "copied content".
5. Use only "unique content" on EVERY page.
6. Reduce the number of ads, especially at the top of the page.
7. Use AdSense on all your pages - it certainly won't hurt.
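The ad-to-content ratio mentioned above can be estimated mechanically. Below is a minimal, hypothetical Python sketch; the 300-characters-per-ad weighting is an assumption for illustration, not anything Google has published:

```python
# Hypothetical sketch: estimate a page's ad-to-content ratio by weighing
# the number of ad slots against the amount of visible body text.
def ad_to_content_ratio(visible_text: str, ad_slot_count: int,
                        chars_per_ad: int = 300) -> float:
    """Rough heuristic: treat each ad slot as ~chars_per_ad characters of
    'ad weight' and divide by the page's visible text length."""
    text_len = max(len(visible_text.strip()), 1)  # avoid division by zero
    return (ad_slot_count * chars_per_ad) / text_len

page_text = "word " * 400          # ~2000 characters of body copy
ratio = ad_to_content_ratio(page_text, ad_slot_count=3)
print(round(ratio, 2))             # 0.45
```

A lower ratio means more content per ad; the Panda advice above amounts to pushing this number down, especially above the fold.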

            The Huge Dichotomy - Google Search vs. Google AdSense
                            Whatcha Gunna Do?

To RANK WELL (in the Panda update), Google Search wants to see VERY FEW ADS,
especially at the top of a page. USER EXPERIENCE is paramount.

To MAKE MONEY, Google AdSense wants you to use 3 ad units, 3 link units, and
2 search units on every page. REVENUE MAXIMIZATION is paramount.
Elevendy-seven ads aren't going to help you, if your page doesn't rank,
because it has TOO MANY ADS!

The other dichotomy - Google Webmaster Tools strongly advises speeding up page
load time, by eliminating javascript and "lookups". USER EXPERIENCE is
paramount.

Meanwhile, Google AdSense advises placing 3 ad units, 3 link units, and 2
search units on every page. Google also advises using Google Analytics. This
advice conflicts with Google Webmaster Tools. REVENUE MAXIMIZATION is
paramount.

     It remains to be seen, just how Google is going to resolve these profound conflicts of interest.
                             For every question, I always use the "logic test".

Question: Would it be logical for Google to rank websites near the top, if
those websites are serving AdSense ads? eHow, an AdSense-prominent content
farm, has not only escaped Panda, they have been … eHow is owned by Demand
Media, who have been accused of creating the ultimate MFA (Made-For-AdSense)
websites.

Answer: Are they doing it now? Most certainly, for eHow, which makes Google
millions. Will they do it for others in the future? If revenue maximization is
paramount, is this not inevitable, at some point? It would be "common sense",
if maximizing shareholder value is the primary objective. Is AdSense money
corrupting the Internet, just as lobbyist money has completely corrupted our
congress? Are we going to end up with only mega-MFA sites on page one of the
SERPs?

Question: Would it be logical for Google to rank websites near the top, if
those websites are NOT serving AdSense ads?

Answer: No. On the other hand, it would be logical to avoid this accusation,
which is easily measurable. BUT, there is no money in it. Is Google going to
populate the top of their SERPs with websites that run Bing ads exclusively?
Now, THAT is not logical. It would be dumb.

                               Historical Updates
                        Brief, Partial Google Algorithm Update List

2007
         Daily ranking.
         -30 penalty noticed, -350 penalty noticed, -950 penalty noticed

2006
         Unnatural links are anathema to Google SEO - minus 30 SERP positions penalty
         Everflux is the rule of the day - many small updates.
         Sandbox (aging delay) alive and well
         July - July 27 update, plus quality landing page update
         February - Big Daddy update

2005
         November 5 - Jagger 3 Update (Nov. 10 all done - settling out)

         October 26 - Jagger 2 Update
         More SERPs churning - OLD is highly valued.

         October 17 - Jagger 1 Update
         Recent Links, Recent Sites = SERP Turmoil
         Google has declared war on phony (low-grade) links
         Devaluation of reciprocal links from unrelated-subject pages
         Devaluation of links from "link houses"
         Devaluation of purchased links
         Big allinanchor changes
         Large companies thrust to the top of the SERPs
         Sandbox update
         Lowering of sandbox threshold - re-sandboxing
         Too-fast link accumulation penalized - links acquired too quickly
         Deeper sandbox (longer - over 6 months)
         Some suggest that one's site really had to be online before January 2004,
         in order to avoid all ramifications of the sandbox.

         September - the "False" Update (Sep. 22 - big update, many changes)
         May - Bourbon Update
         February - Allegra Update (Feb. 17, 2005) - some sites released from the
         sandbox, but many remain.

2004
         February - Brandy Update
         January - Austin Update

2003
         November - Florida Update
         June - Esmeralda Update
         May - Dominic Update
         April - Cassandra Update
         March - Boston Update

          Chapter 15

                                The Google Sandbox -
        The Single-Biggest SEO Ranking Factor for New Sites
   Google is clearly fighting spam by sacrificing SERP newness for higher SERP quality.

March 2004 (edited August 2006)

THE GOOGLE SANDBOX

The sandbox is alive and well. In March 2004, Google implemented a new filter,
now referred to as "The Sandbox". This new "effect" took months to notice and
quantify.

The sandbox is also referred to as an "aging delay". Two aging delays have been
suggested - one for link weight, and one for competitive-term ranking.

The sandbox only applies to highly COMPETITIVE terms, revolving around money,
such as the words attorney, loans, viagra, real estate, etc. The more lucrative
the keyword, the longer the wait.

Yahoo has a sandbox, as well. Opinion seems to indicate that the Yahoo aging
delay is not quite as long as the Google aging delay. Yahoo does seem to
provide an initial boost that will disappear after about 4 weeks.

MSN appears to have no sandbox. New sites with new pages, targeting competitive
terms, can rank well very quickly (weeks) for those terms.

HOW IT WORKS

If you subscribe to the spam reduction theory, Google's thinking was: NO NEW
SITES get good ranking, until they prove themselves. Spammers generate
thousands of new pages daily, along with millions of new links to go with them.
This penalty is new-site based. Long-standing sites have no trouble ranking new
pages.

Link Weight Aging Delay - Google WITHHOLDS "link juice" on new sites, by
deprecating the new links, for 2-8 months. If the domain and backlinks have
existed for a certain length of time (6 months?), then maybe you are OK, and
escape from the sandbox. Over time, the newly generated links are given weight,
and eventually the sandbox effect is lifted.

Competitive Term Aging Delay - Google WITHHOLDS high ranking ability on new
sites, for highly commercial keywords, such as loans, real estate, viagra, etc.
Eventually, the new site will rank well for the competitive keywords, and the
sandbox effect is lifted. Six months is mentioned most frequently.

SOLUTIONS

Two methods are currently being used to get around the Sandbox penalty for new
sites.

One method is to join the Google AdWords or AdSense program, in which case your
pages get spidered in MINUTES. Your site will be checked initially with an algo
or human "smell test". If you smell good, you're in. Good rankings will follow
(provided, of course, that you have good on-page SEO and a few good backlinks).
I speculate that if you are a Google partner ("approved"), then you are not
going to be penalized, unless you subsequently "go bad".

The second method is to buy an old domain, just for its longevity, and old
backlinks. Many have bought up old domain names for this purpose. This may work
right now, but the rules will soon change again. They always do. Good luck!

Chapter 16

          Free SEO Tools You Should Know About

If you want to do SEO for any website, you will need some free tools. By using
these tools, your company can enjoy unrivalled results by taking advantage of
cutting-edge search engine optimization and marketing expertise. Utilizing this
research and experience, Step miles will customize your SEO initiatives to meet
the specific needs of your company's website, industry, audience, goals and
budget.

With these free SEO tools, your company will receive a significant, measurable
increase in qualified traffic, which is then analyzed so you can convert those
new leads into sales. They are meant to serve startup companies looking to
establish a presence online. Free SEO services include website optimization,
keyword analysis, website traffic reports, and weekly website ranking reports.

According to Anand Saini, the President of SEOCERTIFICATION.ORG.IN, these
services provide a foundation for companies to build upon. He is also providing
free SEO services in hopes of building relationships with entrepreneurs and
aiding any company that is looking to compete online. "We are not doing this
for capital gain," said Mr. Anand Saini of SEOCERTIFICATION.ORG.IN. "We like to
focus on building relationships. Free SEO services enable us to make contact
with companies that may not be able to afford full services. This is a great
way to give back, and we are proud to be able to do it."

“Big or small, we want to help,” said ANAND SAINI. “We do not want small companies
or startups to feel outgunned. Bottom line: if a company has a passion for its product, and
a willingness to work with us, we want to help build its presence online.”

1) Traffic Travies

Free Market Research SEO and PC optimization tool

Traffic Travies is a free-to-download software package which covers many aspects of SEO. It's an all-in-one
SEO tool that combines market research tools, SEO and Pay-Per-Click campaign management, optimization tools
and competitor research.

2) Keyword Tumbler
Keyword Research Marketing Tool

Here's a freeware tool which takes an existing phrase and creates several new variations. It mixes the words in the
phrase and gives you all possible variations. A useful tool to get a list of possible search engine phrases.
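The word-mixing idea behind Keyword Tumbler is easy to sketch with the Python standard library; this is an illustration of the concept, not the tool's actual code:

```python
from itertools import permutations

def tumble(phrase: str) -> list[str]:
    """Return every ordering of the words in `phrase` - a minimal
    sketch of what a keyword 'tumbler' produces."""
    words = phrase.split()
    return [" ".join(p) for p in permutations(words)]

print(tumble("cheap seo tools"))  # 3 words -> 6 variations
```

Word counts above six or so quickly explode (n! variations), which is why such tools work best on short phrases.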

3) Keyword Pad
                                   Keyword List Generator

                                   A more advanced keyword generation tool with many useful options. With
                                   Keyword Pad you not only generate keywords list, but can also manipulate
                                   them--remove duplicate keywords, sort keywords, remove unwanted words
                                   and characters, find and replace a term, merge lists etc.
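The list-manipulation operations described above (dedupe, sort, strip unwanted words, find and replace) can be sketched in a few lines of Python; this is a generic illustration, not Keyword Pad's implementation:

```python
def clean_keyword_list(keywords, remove_words=(), find=None, replace=""):
    """Sketch of common keyword-list operations: remove unwanted words,
    find/replace a term, dedupe, and sort the result."""
    seen = set()
    out = []
    for kw in keywords:
        # drop unwanted words, normalizing case along the way
        kw = " ".join(w for w in kw.lower().split() if w not in remove_words)
        if find:
            kw = kw.replace(find, replace)
        kw = kw.strip()
        if kw and kw not in seen:   # dedupe
            seen.add(kw)
            out.append(kw)
    return sorted(out)

merged = clean_keyword_list(
    ["SEO tools", "seo tools", "best SEO tools", "free seo software"],
    remove_words={"best"},
)
print(merged)  # ['free seo software', 'seo tools']
```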

4) Primitive Word Counter
Keyword Density Calculator

Primitive Word Counter is a valuable SEO tool
which calculates the density of a keyword in a
given paragraph. After finding the density you can also save the result in .csv format. Completely free to use, no
installation required.
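Keyword density is simply occurrences of the keyword divided by total words, times 100. A small sketch of the calculation (single-word keywords only, as a simplifying assumption):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Keyword density = keyword occurrences / total words * 100.
    Handles single-word keywords only, for simplicity."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

sample = "seo tools help with seo and more seo work"
print(round(keyword_density(sample, "seo"), 1))  # 33.3
```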

5) Keyword Analyzer

Free tool for keyword gathering and result estimation

With this tool you can compare the competition against the demand of a certain niche. Keyword Analyzer takes
your main keyword, generates a list of 100 related keywords and then shows you the number of searches for that
keyword on popular Search Engines like Google, MSN and Yahoo.

6) SEO Surf
SEO Analysis Tools

This tool allows the users to shortlist, follow up and manage potential link partners, eases backward link
management and finds keywords related to the base keywords. Users can get time based reports and export them
to html. A useful tool for novices as well as experienced Search Engine Optimizers.

7) Xedant Keyword Harvester 1.1
Keyword Harvest Tool

This tool harvests information from the top 10 web pages which appear in a search query on Google against one
or many keywords. The information includes Keywords from Meta Keyword Tags, Web Page contents, Title Tag,
H1 to H5 Tags and Image Alt Tags etc.


8) Google Keyword Tool
Free Keyword Tool

Google Keyword Tool is a free tool from Google which gives ideas about new keywords. It takes a keyword,
phrase or a URL of a web page and gives out a list of related possible keywords.


9) Search-based Keyword Tool
Keyword Tool with additional Functionalities.

This tool has all the properties of above mentioned Google
Keyword Tool plus it generates keywords as well as landing page ideas highly appropriate and specific to the
given web page. It also helps you identify advertising opportunities that are still unused in the AdWords ad
campaigns of user.

10) Keyword Density Analyzer
Keyword Density and Word Depth Calculator

This free online tool helps you to determine a keyword's density
and depth measurement. Keyword Density Analyzer takes URLs, Keywords and Stop Words as input and outputs
the Word and Character Density as well as whole Keyword Density.

11) Niche Watch
Niche Keywords Research Service.

Here's a free competitor analysis tool which gives you output against
a keyword. This online tool gets a keyword or phrase and gives you an analysis of the top 20 competitors against
that keyword.

12) Google Suggest Keyword Suggestion Scraper Tool
Keywords Suggestion Tool

This tool gets keyword from users and grabs the top queries
from Google Suggest, Google search result counts, and links
them to the SEO Book keyword suggestion tool to give users a
list of Keyword Suggestions related to the original one.

13) Free Keyword Suggestion Tool
Keywords vs Search Volume Estimator

Free Keyword Suggestion Tool takes a keyword from users
and generates a list of Top 100 related keywords. Along with
the list, you'll get an estimate of the daily search volume against each of the keywords. It comes with an Adult
Filter feature with options of "Remove Offensive" and "Remove Dubious" keywords.

14) Keyword Suggestion Tool - Keyword Popularity Tool

Keyword Suggestion along Popularity.

Another helpful keyword suggestion tool, which also informs
you about the most popular combinations. Some of the features
include Search for related terms and common misspellings,
Seasonal Search Trends, KEI Analysis and Keyword Density Analysis.
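The KEI Analysis feature mentioned above is based on a simple formula: KEI (Keyword Effectiveness Index) is commonly computed as the square of search volume divided by the number of competing pages. Exact formulas vary by tool, so treat this as one common variant:

```python
def kei(monthly_searches: int, competing_pages: int) -> float:
    """Keyword Effectiveness Index, commonly computed as
    searches^2 / competition. Higher = more attractive keyword
    (high demand, low competition)."""
    if competing_pages == 0:
        return float("inf")   # no competition at all
    return monthly_searches ** 2 / competing_pages

print(kei(1000, 50_000))      # 20.0 - moderate competition
print(kei(1000, 5_000_000))   # 0.2  - same demand, heavy competition
```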

Browser Plugins

15) Google Semantics 2.2

Firefox Add-on

This Firefox add-on helps you get a synonym for a keyword during a search on Google. This is also known as
Latent Semantic Indexing (LSI) and is a vital element in Search Engine Optimization and article writing.

16) Alexa Toolbar

SEO Toolbar

A handy SEO toolbar for Internet Explorer users. This SEO toolbar has features that let you view the stats on any
site you visit: related links, traffic trends, Alexa traffic ranking and more. Definitely a tool to check out.

17) SearchStatus 1.29
Firefox Toolbar Extension

SearchStatus is a free extension for Firefox. Check how every website in the world is performing. For every site
users visit, you can view its Google PageRank, Google Category, ranking, Alexa incoming links,
backward links from Google and more.

18) MetaTags

Firefox Sidebar Add-on for SEO

As a free sidebar add-on, MetaTags conveniently displays the meta-
information of web pages like meta tags and links. This tool also
helps in finding duplicate keyword phrases and few other SEO jobs.

19) Web Developer
Firefox, Flock, Seamonkey Extension.

Designed for Firefox, Flock and Seamonkey, WebDeveloper adds a
menu and a toolbar to the browser with various web developer tools. It can work on several platforms including
Windows, Mac OS X and Linux.

Ranking Tools

20) Web Tools
Ranking and related Tools.
Web Tools is a compilation of links to free online SEO Resources. Get tools
related to Search Engine tasks (Backlink Checker, Cloaking Checker, Google
Datacenter Search etc), Domain Checkups (Alexa Traffic Rank, Broken Line Checker, Domain Lookup etc), and
HTML tools (HTML Optimizer, Link Extractor etc)

21) PageRank Lookup
Page Rank Checker

Here's a Google toolbar extension that allows you to check the PageRank for a website. Users can enter a list of
URLs and it'll return the PageRank value for each one. Find out which competing websites have a higher
PageRank value, a higher traffic rate and a higher search engine position.

22) Rank Checker

Search Engine Ranking Tool

Checking your search engine rank becomes easy with a dedicated ranking tool. Rank Checker
requires only that you create a free account before using the tool. It takes Web Pages and Keywords from users,
returns search engine rankings and then stores them for easy comparison later.
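However the result list is obtained, the core of a rank checker is just finding your domain's position in an ordered list of result URLs. A sketch (the URLs below are made up for illustration):

```python
def find_rank(result_urls, your_domain):
    """Return the 1-based position of the first result on `your_domain`
    in an ordered list of SERP URLs, or None if it is not present."""
    for position, url in enumerate(result_urls, start=1):
        if your_domain in url:
            return position
    return None

serp = [
    "https://example.org/guide",
    "https://mysite.com/seo-tips",
    "https://other.net/page",
]
print(find_rank(serp, "mysite.com"))  # 2
```

Storing these positions with a timestamp, as Rank Checker does, is what turns a one-off lookup into a comparison over time.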

23) Check Google Pagerank
Positioning Estimation Tool

Another easy-to-use, free tool for determining the rank of a page in a typical Google search. Check Google
Pagerank lets you check a page's rank without installing the Google toolbar in your browser.

24) Google Rankings

Ranking Tool for popular Search Engines.

There are a number of different search engines users can use. This tool
provides you with an easy and efficient way to check the Rank of a
website in one or all of the most popular search engines online including Google, Yahoo, MSN and Ask.

25) SEO Rank Tool
Ranking and Backlink Tool
This is a simple, yet powerful tool. Use the SEO Rank tool to easily check your website's page rank, back links
and much more.


26) Rank Checker
Ranking Checking Tool for Firefox

Rank Checker is a free extension for Firefox. It allows you
to use your browser to easily check your site's rankings in
Google (both US and international), Yahoo, and Microsoft Live search. According to the developers "This tool is
designed to be quick and easy to use, but to also give you the opportunity to gain the level of insight needed to
compete with other SEOs at the professional level".

27) goRank
Professional SEO Ranking Tool

goRank offers you free SEO tools including a keyword rank tracker, keyword density analyzer and link popularity
checker. The site also contains several useful articles related to SEO.

Monitoring Tools

28) Alexa Site Information

Website Monitoring Tools

This free site lookup tool allows you to enter the URL
of your website and get detailed information and insight on your site's performance. Find out your traffic stats,
other related site links, keywords and demographics.

29) Site Meter

Real Time Reporting Tool

Learn more about your audience. As a real time website tracking and counter tool, Site Meter is an easy to install
tool for users looking to better understand who is visiting their Websites, where users are coming from and where
they are going within the site.

30) Traffic Estimator
Google Traffic Estimation Tool

Here's a simple and powerful free tool from Google. Traffic Estimator takes
keyword lists, daily budgets, target language and locations, and gives you the
quick traffic estimates for new keywords without adding them to an account or using the AdWords sign-up process.

31) BackLink Watch

Backlink Checker

BackLink Watch is a powerful tool which takes
the URL of a website and returns complete,
detailed information about the quality and
quantity of backlinks pointing to it. It'll show
"anchor" text, PageRank, total outbound links on that page, and nofollow flags for each of the inbound links.

32) Quantcast
Website Evaluation Tool

Quantcast is a decent tool for understanding your website's value to advertisers or, for SEOs, taking a peek under
the hood of your competitor's site.

33) SEO for Firefox

Page Ranking Extension for Firefox

SEO for Firefox pulls in many useful marketing data points and makes it
easy to get a whole view of the competitive landscape of a market.

34) SEO Trail
Search Engine / Social Networking Performance Monitoring Tool

SEO Trail is an SEO Tool for Search Engine as well as Social Networking Tracking. It monitors how well a site
is performing across various indexing and ranking platforms.

35) GoingUp
Web Analytics and SEO Tool

With GoingUp, you can find out about your visitors, where they came from, what pages they visit and what
technologies they use. The GoingUp dashboard can be customized to show relevant information about a given
site's performance.

36) Pingdom GIGRIB
Website Monitoring Tool

Here's a website monitoring tool which checks for a website's up and down times. If this sounds like a tool you
can use, you'll need to be signed up with the GIGRIB monitoring network.

37) SEO Monitor
Website Stats Monitoring Tool

SEO Monitor is a stats system which updates user stats every 7 days and tracks changes in PageRank, pages
indexed and backlinks.

Website Optimization Tools

38) Social Poster
Social Bookmarking Tool

Social Poster is a tool which allows its users to automatically submit their prepared tagged articles with a link to
target resource to a large set of social bookmarking sites.

39) Google Analytics
A popular free tool from Google. It gives you important insight about
your own website traffic. Get informed on how users interact with your site and optimize on it.

40) Social Marker
Social Bookmarking and Promotion Tool

Social Marker is a free tool designed to quickly bookmark a website across 51 of the top social bookmarking sites
out there. You can use this tool to help promote your site. Social bookmarking sites include: Digg, Reddit,
Delicious, Twitter Spurl, Sphinn, Technorati, Fark, Google Bookmarks, Slashdot, and much more!

41) Website Optimizer

Website Testing and Optimization Tool

Website Optimizer is another free tool from Google which helps you test and optimize the content and design of
your site. You can increase revenue and ROI quickly and easily, whether you're new to marketing or not.

42) Web Page Analyzer
Website speed Test

Web Page Analyzer is a website speed analysis tool. Input the URL of your website to
calculate the page size, composition, and download time as well as the size of individual
elements. The tool sums up each type of web page component. It also offers advice on how to
improve page load time based on the results generated.
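The arithmetic behind such a report is straightforward: sum the component sizes, multiply by 8 to get bits, and divide by the connection speed. A sketch with assumed component sizes (the numbers are illustrative, not from any real page):

```python
def estimate_download_time(component_bytes, bandwidth_kbps=56):
    """Sum page components (HTML, images, scripts...) and estimate the
    download time in seconds at a given connection speed."""
    total_bytes = sum(component_bytes.values())
    seconds = (total_bytes * 8) / (bandwidth_kbps * 1000)
    return total_bytes, round(seconds, 2)

page = {"html": 25_000, "images": 180_000, "scripts": 60_000, "css": 15_000}
total, secs = estimate_download_time(page, bandwidth_kbps=1000)  # ~1 Mbps
print(total, secs)  # 280000 2.24
```

Real load time also depends on latency, parallel connections and caching, so tools like this treat the figure as a rough floor, not a measurement.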

43) Copyscape
Anti Plagiarism Tool

Copyscape is a free tool to check whether your
website contents are being copied. It makes it easy to find duplicate content on the Web that is ultimately stealing
your audience. All you have to do is type in the URL of one of your webpages and Copyscape will find sites that
have copied your content.
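Copyscape's internals are proprietary, but duplicate-content detection is commonly done by comparing word n-gram "shingles" between two documents. A generic sketch of that technique, not Copyscape's actual algorithm:

```python
def shingle_overlap(text_a, text_b, n=5):
    """Compare word n-gram ('shingle') sets of two texts and return
    their Jaccard overlap, from 0.0 (no shared phrases) to 1.0."""
    def shingles(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "the quick brown fox jumps over the lazy dog near the river"
copied   = "the quick brown fox jumps over the lazy dog near the river"
print(shingle_overlap(original, copied))  # identical text -> 1.0
```

A high overlap on 5-word shingles is strong evidence of copying, since long exact phrase matches rarely occur by chance.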

44) Search Engine Spider Simulator

Spider Simulation Tool
Here's a free and simple tool which helps you to simulate a
Search Engine by displaying the contents of a webpage
exactly how a Search Engine would see it. It also displays the
hyperlinks that are crawled by a Search Engine when it visits the particular webpage.
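A spider simulator of this kind boils down to parsing the HTML, keeping the visible text, and collecting href targets. A minimal sketch using Python's standard html.parser (an illustration of the concept, not the tool's implementation):

```python
from html.parser import HTMLParser

class SpiderSimulator(HTMLParser):
    """Collect visible text and crawlable hyperlinks from HTML,
    skipping <script>/<style> content the way a crawler would."""
    def __init__(self):
        super().__init__()
        self.text_parts = []
        self.links = []
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = max(0, self._skip - 1)

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text_parts.append(data.strip())

sim = SpiderSimulator()
sim.feed('<html><body><h1>Hello</h1><a href="/about">About</a>'
         '<script>var x=1;</script></body></html>')
print(sim.text_parts, sim.links)  # ['Hello', 'About'] ['/about']
```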

45) Web Page Analyzer
This simple tool reads the page you specify, and puts it
through a few basic tests to see if it qualifies as 'search-
engine-friendly' for the phrase specified. It then gives a
report on the good, and the bad points of the page in
SEO terms.

46) BuiltWith

Website Optimizer

Optimize, Track, Compete. BuiltWith is an optimization and
tracking tool which speeds up your website SEO. It monitors the website for problems which could potentially
cause you search engine optimization issues.

Sitemap Tools

47) XML-Sitemaps

Sitemap Generator

This site helps users to create their XML, ROR, Text
and HTML Site Maps Online. These sitemaps can be submitted to Google, Yahoo and other search engines to
help them crawl your website better.
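Generating such a sitemap yourself is simple: the sitemaps.org protocol is an XML urlset of url/loc entries. A minimal sketch with placeholder URLs:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Generate a minimal XML sitemap (sitemaps.org protocol) for the
    given page URLs; optional tags like lastmod are omitted here."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = Element("urlset", xmlns=ns)
    for page in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = page
    return tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml)
```

The resulting file is what you would upload to your server and submit to the search engines' webmaster consoles.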

48) SitemapDoc

Google Sitemap Generator and Editor

To work with this tool all you need to do is type in your URL and the SitemapDoc tool will create a map of your
site. Edit Last Modified, Change Frequency and Priority options for individual pages that can be reviewed as well.

49) AutoMapIt
Sitemap Creation Service

AutoMapIt is a collection of webmaster tools to
help your web site do better in the search engine rankings. It produces site maps for websites including static
pages, dynamic pages, blogs, forums, online product catalogs. AutoMapit makes sure that the website ranks
quickly and is clear of any issues that push the search engines away.

50) GWebCrawler & Google Sitemap Creator

Source Code Web Indexing Engine

GWebCrawler & Google Sitemap Creator is a free site Indexing tool with must-have features: Include only if
URL contains field, Exclude URLs containing field, URL variables to parse out, Create Google Sitemap XML
file, and See the Queue list while processing etc.


51) Free Online Sitemap Generator
Sitemap Generator

This tool gets your website URL and optional parameters and
generates a sitemap file, free of cost.

52) Free site map creator
Sitemap Creator Tool

Free site map creator is a desktop
application which is free and easy to install. It
exports your directory structure to a file which can then be used to create a sitemap of a website, or to print the
contents of a folder.

53) XML Sitemap Generator

XML Sitemap Generator

This is a professional XML sitemap generator tool which allows
you to automatically generate XML sitemaps for a website. With
XML Sitemap Generator, you can keep up-to-date, edit your sitemaps,
upload them to a web server and submit them to all search engines that support XML sitemaps.

In this category, you'll find a number of SEO related research tools that can help you, as a website
owner, with the research needed to optimize your web content. Track your competitors, find the most
popular links, and get updated with emerging trends. The following tools will make it easier.

Link Tools

54) Back Link Analyzer

Link Popularity Tool

Back Link Analyzer is a free tool to check link popularity which
also helps in link analysis. It gives you reports explaining what anchor text is linking into a page or site.

55) Yahoo Site Explorer
Back Link Checker

This free and powerful tool from Yahoo helps users to explore all the web pages indexed by Yahoo! Search. You
can also view the most popular pages from any site, dive into a comprehensive site map, and find pages that link
to that site or any page.

56) Quarkbase
Website Information Tool

Quarkbase is a database which stores information about
websites. On this site, you can share your user experience, express your views, connect to websites and
find up-to-date information. Good for tracking your own and your competitors' websites.

57) Link Popularity
SEO Link Tool

This is a free Back Link Checking tool which generates a
complete site analysis for the user. Get information on the
amount of back links, indexed sites, and Google Page Rank
for a site.

58) Link Popularity
Back Link Checker

Website popularity is becoming increasingly important, as many search engines are using this information as a
ranking criterion. This Link Popularity tool helps you by informing you about the number of websites that link
to your own.

59) BackLinks Analyzer Tool

Back Link Checker

This is another free Back Link Checking Tool. It takes an
inputted URL, Link Type and Query Length and gives
you useful information: Number of Links to a Domain, Number of Links to a Homepage, Pages Indexed, Deep
Link Percentage, All unique linking domains and their IP addresses & unique C-block addresses.

60) SEO Link Cloaker Tool
Link Cloaking Tool

This SEO Cloaking Tool is free to download and allows you to cloak your links. According to its developers,
"you can boost your click through rate by at least 200%"

Trend Analysis Tools

61) Trend Report
URL Comparison Tool

This free tool allows you to enter up to 100 URLs and then compare them on various rankings. You can also
compare specific pages, in addition to the primary domain.

62) Trifecta
URL Strength Checker

Trifecta is a free tool which takes a website's URL and gives
you information about the number of sites/pages that link to the site, the number of times their brand is mentioned
on the web and how much traffic their site receives.

63) Search Engine Saturation

Indexed Web Pages Finder

As a simple yet efficient tool, Search Engine Saturation
takes a few website URLs and compares them against each
other. The information provided back includes total number of pages in each search engines index, which of their
pages are showing up in the search engines and how their competitors are performing in search engine results.

64) Webmaster Tools
SEO Tools for Webmasters

Another set of free tools from Google
which helps you see how your site is
performing in Google's search results.
You can troubleshoot potential site problems, build Google-friendly sites, ensure that Google can find all your
web pages, drive traffic to your site, and remove pages from Google's search results if needed.

65) Marketleap Search Engine Marketing Tools
SEM Tools

Keep up to date with new web trends, key words
and popular links. Here are three free and
helpful tools offered by Marketleap: Link
Popularity Check, Search Engine Saturation and
Keyword Verification Tools.

66) Mint
Web Site Analytics Tool

Mint is an excellent and easy to install tool.
Highlighted features include breaking down a
specified site's activity from the past day, week, month and year as total page views and unique visitors. Check
out the site and find out what else Mint can analyze about trends and websites.

67) Domain Stats Tool
Competitor Statistics

This tool is free and easy to use. It helps
you track statistics about your competitor's
domains. Statistics include Alexa Traffic
Rank, Age of the domains, Yahoo
WebRank, Dmoz listings, count of back links and number of pages indexed in Search Engines like Google,
Yahoo and MSN.

68) Keyword Forecast
Keywords Demographics Tool

This is a free-to-use tool from Microsoft adCenter Labs. It
takes keywords separated by semicolons and predicts the
impression count and demographics for those keywords.

69) Hot Trends

Web Trends Analyzer

Hot Trends is a free tool from Google Labs that can be added
to your iGoogle page. Find out what's on the public's collective mind:
the day's top 100 fastest-rising search queries in the U.S., and what the search activity looked like over the course
of a particular day.

Competitor Research Tools

If you need tools to help you research your competitors more efficiently, read on to find just the right one.

70) Web CEO
Search Engine Marketing and Optimization Tool

Web CEO is a powerful tool to use in the SEO
industry. It helps you retrieve a list of keywords and
key phrases that will bring the most targeted visitors
to your site, optimize your web pages for top-10 positions, and manage pay-per-click campaigns from a single
unified workspace.

71) Meta Search
Meta Search Tool

Myriad is an ad-free meta search tool. It allows users to combine the results of Google, Yahoo, MSN Search and Ask
Jeeves in one set of results, which can be conveniently exported as a CSV file.

72) SEO Digger
Keywords Number Definition Tool

SEO Digger is a free tool which allows users to
study which keywords can be used to find a given domain in Google or MSN. It also lets you analyze any site and
its documents, both your own and your competitors'.

73) Thumbshots Ranking

Search Engine Rankings Analyzer

This free application allows you to compare a
keyword's rankings from one search engine to
another. You can also calculate the overlapping of
links between the two sets of results.
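The overlap calculation such a tool performs can be sketched in a few lines. The result lists below are placeholders standing in for two engines' top results for the same keyword:

```python
def ranking_overlap(results_a, results_b):
    """Percentage of URLs appearing in both ranked result sets,
    measured relative to the smaller set."""
    set_a, set_b = set(results_a), set(results_b)
    shared = set_a & set_b
    return 100.0 * len(shared) / min(len(set_a), len(set_b))

# Placeholder top-results lists for two engines on one keyword.
engine_one = ["example.com/a", "example.com/b", "example.com/c"]
engine_two = ["example.com/b", "example.com/c", "example.com/d"]
print(round(ranking_overlap(engine_one, engine_two), 1))  # 66.7
```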

74) SpyFu
Competitors Info Tool

As a free spying tool, SpyFu helps you determine which keywords your competitors are
buying and which ones they're optimizing their site for. You can also compare websites
and spot strategies too important to pass up.

75) Compete

Analytics and Comparison Tool

Compete is a web analytics and comparison services provider.
It helps you build and optimize search marketing campaigns that create
brand awareness, drive site traffic and increase sales. Its Compete Search Analytics tool helps you come up with
new keywords to drive traffic to any domain, identify gaps in your search strategy and track your
performance against competitors and peers.

76) Google Insights for Search
Search Volume Comparison Tool

With Google Insights for Search, you can compare
search volume patterns across specific regions,
categories, time frames and properties.

77) Xinu
Page Rank and Back Link Analyzer

Get assistance with finding the PageRank and backlinks of
your website. Xinu also has additional options for showing or
hiding syndication, validations, social bookmarks, ranking,
backlinks, indexed pages, diagnosis, domain and screenshot information.

78) Internet Archive

Collections of texts, audio, moving images, software and web pages. Internet Archive is a
non-profit organization which was founded to build an Internet library. Among other
things, Internet Archive is a huge collection of archived web pages. Find out how old a
website is, what it looked like in the past, and get ideas for your own site.

In the end, articles, blogs and news about SEO tools, techniques and trends will help you to better
understand the concepts of Search Engine Optimization. To that end, we have also included some of the
top-notch providers of SEO services for your review.

79) Search Engine Roundtable

SEM Blog

Search Engine Roundtable reports on the most
interesting threads taking place at the Search Engine
Marketing forums. Its contributors include some of the most recognized names at those forums, which is how
Roundtable is able to report on these threads.

80) SEMLaguna
SEM and SEO Services

SEMLaguna provides Internet Marketing solutions and
services and guarantees to deliver significant benefits to your business. Users can tap into SEMLaguna's on-page
and off-page Internet Marketing experience to help brand their company and/or product.

81) SEOmozBlog

SEO and SEM Blog

SEOmozBlog keeps its visitors up to date with the
most current News, Tips & Highlights from the search marketing industry with the daily SEO Blog.

82) Matt Cutts: Gadgets, Google, and SEO
Personal blog of Matt Cutts

Matt Cutts joined Google as a software engineer in January 2000
and is currently the head of Google's Webspam team. That is how Matt Cutts introduces himself, and it is where
his experience comes from. This is his personal blog, with an emphasis on Google's role in the SEO industry.

83) Michael Gray - Graywolf's SEO Blog
Personal Blog of Michael Gray

This is the personal blog of Michael Gray. His experience in SEO includes his position as
webmaster for a major specialty retailer, where he has been involved in web development and website management
since 1998. On his blog you'll find out about search engines, SEO, SEM, the internet, business
and the media.

84) Marketing Pilgrim
Internet Marketing News and Views

This site brings the latest news, rumors and reviews of all
things related to internet marketing and online advertising.

85) Pronet Advertising
Personal Experiences Blog

Here is another SEO voice. This time the blog is about
the personal experiences of Neil Patel in successful online marketing.

86) Online Marketing Blog

Internet Marketing Blog

This is TopRank's internet marketing blog with highlights on digital PR,
social and search engine marketing.

87) Search Engine Journal

SEO News Blog

Designed and maintained by Chris Pearson, this blog
covers the latest news in the SEO and SEM industries.

Resource Sites
88) Webmaster and SEO

SEO Resources

If you're looking for resources with a personal voice, here's a blog about several useful and important resources
for web masters and SEO optimizers.

89) Beginner's Guide to Search Engine Optimization

SEO Guide

This is a guide providing an overview of the many
processes, techniques and strategies used by
professional search engine optimization specialists.

90) Search Engine Land
SEO News

If you're looking for information and news
on search engine marketing, searching issues
and the search engine industry, check out Search Engine Land.

91) Search Engine Watch
Search Engines Tips

Search Engine Watch presents tips and
information about searching the web and
analysis of the search engine industry, and
also helps site owners trying to improve their
ability to be found in search engines.

92) PromotionWorld

PromotionWorld is a site for free search
engine promotion, full of tips and tricks to
help webmasters promote their sites for free!

93) Traffick
Web Portals Guide and Blog

Traffick charts the rise of search engines with a
special eye toward the business side. Find out
how search engine marketing is changing
and how everything and anything is sold and
marketed using the internet.

94) High Rankings
SEO Forum

Here's an SEO forum with tips and tricks, techniques, resources,
advice, and design and usability articles.

95) Search Engine Forums
Search Engine Marketing forum

On this site, you'll find first-hand discussion of
Search Engine Optimization techniques in the
Search Engine Marketing industry forums at JimWorld.

SEO Services/Blog

96) SEOmoz
SEO Services

The official site of SEOmoz, which provides
companies around the world with consulting, Internet marketing and search engine optimization services.

97) SEO Design
SEO Services

The site of SEO Design, which specializes in organic
search engine optimization services for Google,
Yahoo, and MSN.

98) SEO and Social Media Marketing
SEO and SEM Service Providers

This is a website resource of SEO and SEM service
providers. They claim to provide
clients with clear Internet marketing strategies that
increase traffic, link popularity, and website conversions.

99) Harvest SEO
Outsource Link Builders

Harvest SEO is a service provider for SEOs. They offer
their services to everyone from very small businesses to companies
outsourcing non-core business services.

100) SEO services by Outsourcing4u
Outsource SEO Services

Outsourcing4u is a UK-based SEO organization that
provides users with the best search engine optimization
services and web-development work at a fraction of the
price that they would normally pay in the UK, US &

101) SheerSEO
SEO Reporting Tool

SheerSEO automates tedious SEO tasks, including data
collection and report creation, and provides a clear
visual picture of the overall SEO performance of any
website. SheerSEO is a free tool that requires you to sign up. Once an account is created, your SEO
indicators are monitored and alerts are sent when changes occur.

Select the most suitable tools for you, build your knowledge level, and remain informed about the latest search
engine trends and techniques. These tools will help you work your way towards the top of the search engine
ranks. Get your website out there!

This tool diagnoses common causes and effects of
duplicate content penalties.

(1) It tries to diagnose the common www vs. non-www
duplicate content issue by checking
the headers returned by both versions of the URL,
the current cache in Google, and possible PageRank dispersion.

(2) It checks for the common default page error where both the / and /index.html (or other default page) return
200/OK headers.

(3) It checks for incorrect 404 pages which deliver a 200/OK Header and,

(4) It checks for supplemental pages in the Google index.
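Check (1) comes down to comparing the status codes the two host variants return, and checks (2) and (3) are the same idea applied to other URL pairs. The sketch below mirrors that description rather than the tool's actual implementation; `example.com` is a placeholder domain and the network fetch is only shown in a commented usage line.

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Surface 301/302 responses instead of silently following them,
    # since the raw header each variant returns is what we diagnose.
    def redirect_request(self, *args, **kwargs):
        return None

def fetch_status(url):
    """Raw status code of a URL without following redirects.
    (Requires network; pass in your own site's URLs.)"""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code

def diagnose_www(non_www_status, www_status):
    """Interpret the pair of status codes per check (1)."""
    if non_www_status == 200 and www_status == 200:
        return "duplicate: both www and non-www serve 200/OK"
    if 301 in (non_www_status, www_status):
        return "ok: one variant permanently redirects to the other"
    return "inspect manually"

# Usage against a live site (example.com is a placeholder):
# print(diagnose_www(fetch_status("http://example.com/"),
#                    fetch_status("http://www.example.com/")))
print(diagnose_www(200, 200))
print(diagnose_www(301, 200))
```

The same `fetch_status` helper also covers check (3): request a path that should not exist and flag a "soft 404" whenever it answers 200 instead of 404.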

WZ-30-A, Bhagwan Das Nagar, Punjabi Bagh Metro Station Opp. Piller-67, East
Punjabi Bagh, Delhi – 110026
Contact: 011-28316148, 3203571, 30538061, 8010298388
E-mail:, Visit:

