Google Fast Indexing Formula

Everyone in the online world knows that the most desirable traffic to a
site comes from a Google search. Roughly 80% of Internet searches are
conducted on Google.

In theory it is simple: if you have something interesting to offer
someone else, that is, if you run a website with the honest-to-goodness
goal of building something useful for someone other than yourself, Google
will find you. After all, the makers of Google describe their primary
mission as, more or less, collecting the information of our planet and
helping people find the things they will find useful.

In practice, it is not so simple. It is not so simple because there are
thousands, even millions, of websites like yours; because you may be a
very honest online business selling some very useful products, but
without unheard-of, exceptionally great content. If your website is
listed on page 265 of a set of search results, rest assured no visitors
will come that way.

Unlike Yahoo and others, which rely on human intervention, Google does it
all through automation. Websites are indexed (or crawled, or spidered;
all these terms refer to the same process) by indexing software called
Googlebot. Googlebot visits web pages daily and decides, by the rules
built into the program, which pages enter the main Google index and which
do not. Once your site is in the index, whether a human submitted it or a
robot simply stumbled upon it, Google ranks your pages so that it knows
where to place your site for the words and phrases people search for, if
your site makes it into the search results at all.

The Googlebot is very smart and works very well. Remember, though, that
it is just a piece of software: a very sophisticated one, but still just
a computer program. As such, it has a set of algorithms (rules) it uses
to index a site's content (information), a set of skills (as I said,
Googlebot is really smart) and a set of limitations. That means there is
an impressive number of ways to trip Googlebot up and make it impossible
for it to index your content. Alternatively, you can make it easy for
Googlebot to index your website, so that people will find it when
searching for the words it contains.

This article will try to lay all the foundations needed to ensure a
consistent and persistent presence in Google, starting with the most
basic step: preparing your site for Googlebot, the crawler that feeds the
Google index.

1. Read Google's own Webmaster Guidelines

The people behind Google have mastered two things. First, most of their
algorithms (rules) are so secret that all any of us can do is speculate;
even Google employees do. Second, their guidelines are very simple,
direct and sharp. Following their instructions will not harm your site's
ranking; defying their guidelines can, and probably will, hurt it in the
long term. So go to http://www.google.com/webmasters/guidelines.html and
read what Google has to say for itself.

2. Have text links.

Make each page of your site reachable via a plain text link, as opposed
to JavaScript, Flash, DHTML (Dynamic HTML) navigation and the like.
Googlebot is partial to text.

Google says: "Make a site with a clear hierarchy and text links. Every
page should be reachable from at least one static text link."

This is probably the single most important key to your site's presence in
Google. Googlebot is essentially a robot wrapped around a text browser,
based on the venerable Lynx browser. The reasoning behind this approach
is that it keeps designers as close as possible to plain navigation that
a human can truly understand. Therefore, by downloading Lynx
(http://lynx.isc.org) and viewing your site through it, you will see more
or less exactly which information Googlebot can read and index, and which
links Googlebot can follow. You will also spot errors in your HTML pages,
and places where a robot could get stuck and fail to reach the rest of
your site.

I know it is very unfair to those of us who love the beauty and potential
of sites built entirely in Flash or other engines. However, until the
nice people at Google figure out a good way to explore the inside of a
Flash file and extract the relevant information, we are stuck with
standard HTML.

This does not mean you cannot make your site really attractive and fill
it with JavaScript and Flash eye candy. But you must also have regular
text and standard text links. In general, you can achieve the desired
effect by adding extra navigation menus made of standard text links.
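
For example, a minimal block of plain text links (the page names here are
invented for illustration) is enough for Googlebot to reach every
section, even when your fancy menu means nothing to it:

    <!-- Plain text links: a robot can follow these even when the
         Flash or JavaScript menu above is invisible to it -->
    <p>
      <a href="index.html">Home</a> |
      <a href="products.html">Products</a> |
      <a href="about.html">About Us</a> |
      <a href="contact.html">Contact</a>
    </p>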

3. Avoid frames.

Avoid frames at any price. If you must use them (for example, to display
someone else's page so it looks like part of your site), at least do not
use them on your home page.

Frames are like the plague: they sneak up on you. It is incredibly easy
to lose Googlebot over a badly formatted frameset. You might hear that
some robots, including Google's Googlebot and Yahoo's Slurp, are quickly
gaining the ability to navigate frames correctly. Still, my philosophy
for any iffy feature is: if you're not sure, leave it in the closet.
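
If you truly cannot avoid a frameset, one standard hedge is the
<noframes> element, which gives text-only browsers and robots something
to read and follow. A minimal sketch (the file names are invented):

    <frameset cols="25%,75%">
      <frame src="menu.html">
      <frame src="content.html">
      <noframes>
        <body>
          <!-- Content for browsers and robots that do not render frames -->
          <p><a href="menu.html">Menu</a> |
             <a href="content.html">Main content</a></p>
        </body>
      </noframes>
    </frameset>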

4. Keep the number of links on a page under 100.

This advice comes straight from the Google Webmaster Guidelines: "Keep
the links on a given page to a reasonable number (fewer than 100)."

It's more of a suggestion, and I'm not 100% sure whether you get punished
in any way or whether Googlebot simply stops reading your links after the
100th. I can say from personal experience that I tried a page with 700
links, and it seemed fine. Then one day I tried to view that unwieldy
page on my BlackBerry PDA, and I got a strange error: my page came out
mis-formatted. After I split the page into several pages of about 80
links each, it worked on the PDA as well.

Who cares about the BlackBerry? Well, if you are reading this, your goal
is to get visitors, and your main concern should be not turning anyone
away. Remember, now more than ever, people use all kinds of devices and
software to access the Internet. Every visitor is a potential customer.
Every employee at major U.S. law firms, and many other business people,
use a BlackBerry.

Finally, ask yourself: why would you need that many links on one page
anyway? Suppose, for example, that you specialize in promotional
products: corporate gifts branded with a logo, such as pens, caps, coins
and other items (sometimes called "premiums"). Your name is John Doe, and
you have named your company JDPromos (not very imaginative, but it will
do for our examples). You would want every item in your catalog to appear
as a text link, so that each item is indexed along with its link text and
keywords. Likewise, those who run forums, ezines and blogs should prefer
standard links to their articles, because the software they use could
otherwise create dynamic links that are invisible to certain robots.

5. Give each page a meaningful title.

Give every single page on your site a complete and meaningful title. This
comes directly from Google's webmaster guidelines; see rule #1.

Incidentally, for those fascinated by the debates over the death of meta
tags: the
<title></title>
tag is not a meta tag, but a required element of every page.

The title tag is supported by every web building tool out there, and
belongs in the head of a web page (between <head> and </head>).

Google offers the "allintitle:" syntax, which lets users search only the
text that appears in page titles, and many websites embed a Google search
box, so people effectively search by title all the time. For the record,
a title search for "Untitled Document" returns more than 29 million
results.

Most of us, myself included, copy and paste template pages for
convenience rather than building the design elements from scratch. If you
do, remember to change the title.

Make sure your title is not just a list of keywords, and that it matches
the actual content of the page. Google can and will check it against the
relevancy of your page.
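
As a sketch, using our fictitious JDPromos site, a good title is short,
specific and honest about the page it heads:

    <head>
      <!-- Page-specific and descriptive: not "Untitled Document",
           not a list of keywords -->
      <title>JDPromos - Branded Coffee Mugs and Corporate Gifts</title>
    </head>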

6. Do not place important text in pictures.

Google says: "Try to use text instead of images to display important
names, content or links. The Google crawler does not recognize text
contained in images."

It is very tempting to create images with text in them, for the simple
reason that those of us who are not designers get very little control
over type beyond the basic choices HTML allows. In addition, different
browsers tend to render things differently, so it is much easier to make
a picture of the text, which looks consistent everywhere, than to worry
about styles, operating systems, and so on.
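
Still, real text with a little inline styling goes a long way, and
Googlebot can read it. A minimal sketch of replacing an image heading
with styled text:

    <!-- Instead of <img src="heading.gif">, use real text -->
    <h1 style="font-family: Georgia, serif; color: #336699;">
      Promotional Products by JDPromos
    </h1>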

7. Use descriptive ALT tags.

The "Alt" tag is an alternative text (use the links used in the name) for
images and video and was designed to show text browsers (like Lynx) do
not d them simply''a generic image for each link image that you have. If
all your links say, 'Image' can understand how a potential visitor what
they do?

Make sure the description is appropriate and accurate. Take our
promotional products business as an example. Say the site shows a picture
of a trade show display, as an example of a service offered beyond the
usual engraved mint boxes, calculators and key rings. If the "alt" tag
merely says "display", that is all Googlebot will see and index. If it
says something like "sample branded trade show display", that is
certainly more useful and friendlier, both to users and to Googlebot.

Please note that although the "alt" tag counts, and Google seems to put a
high value on it, important text still belongs in the clear.
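
Continuing the trade show example, the difference is a single attribute
(the file name is invented):

    <!-- Poor: tells Googlebot and text browsers nothing -->
    <img src="display1.jpg" alt="display">

    <!-- Better: descriptive and accurate -->
    <img src="display1.jpg" alt="Sample branded trade show display">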

8. Use meaningful link descriptions

At the risk of sounding like a scratched CD, I'll say it again: whether
you use image links or text links, make the link text meaningful, so that
Googlebot can associate that text with the link's href.

In other words, suppose again that we are designing the website for our
fictitious promotional items company, JDPromos. If you put up a link to a
set of sample branded coffee mugs, say something like "samples of
JDPromos branded coffee mugs", not just "coffee mugs", or worse, "click
here for photos". Never use "read this", "go here", "download", "button",
"here" or "click here" as link text; you get the picture, I hope.

Do not try to fool Googlebot with hidden links, duplicate content, or
pages stuffed with irrelevant words like "sex" and "hot chicks".
Googlebot will not play along, and you will be punished one way or
another in the long run.

9. Use a "description" tag for each page

Include a

<meta name="description" content="[insert your site's description here]">

tag in the header of each page to summarize it. Use a meaningful
description of a sentence or two; do not spam keywords.

Better still, include descriptive text on the first page of your website,
where users can actually read it. That text will often appear as the
description of your site in Google. More importantly, Google weighs the
text on a page by its position: text at the bottom of a page is
considered less important, or less "relevant", to use Google's own word.
So put the text that best categorizes your site's content near the top.

10. Keep query strings short

Avoid URLs with query strings, if at all possible. Query strings mean
dynamic pages; you can normally identify a dynamic page by the presence
of the "?" character in its URL. Note that the shorter the query string's
list of parameters, the better. Be aware that not every search engine
robot scans dynamic pages as readily as static pages. So it helps to keep
the parameters short and their number small.
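
For example (both URLs are invented), the first of these is far
friendlier to a crawler than the second:

    http://www.jdpromos.com/catalog.php?cat=mugs
    http://www.jdpromos.com/catalog.php?cat=mugs&session=8f3a2c&sort=price&view=grid&page=2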

11. Use "& id = parameter

If you must use query strings, or dynamic pages, never "& id =" parameter
in the chain.

I know how ridiculous "do not use the "&id=" parameter" may sound; how
could avoiding it be difficult or impossible? But only if you are a
programmer can you change the variable names and replace "id" with
something else. Otherwise, Googlebot will simply skip those pages.

Google says: "Do not use" & id = "as a parameter in your URLs, as we are
not these pages in our index.
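
For example (invented URLs), renaming the parameter is all it takes:

    http://www.jdpromos.com/catalog.php?cat=5&id=123     (Google may skip this page)
    http://www.jdpromos.com/catalog.php?cat=5&item=123   (safe)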

12. Use robots.txt

Use a robots.txt file to steer Googlebot around your site. This old and
very clever mechanism tells standards-compliant robots such as Googlebot
which places they are not allowed to visit, whether for reasons of
confidentiality or to avoid Google penalties. You can turn robots away
from your cgi-bin and from any other places you would rather keep out of
reach of the entire searching population of the planet. Remember, though,
that this is a guideline, not a barrier: robots that are not programmed
to comply will ignore it. Bottom line: use the "robots.txt" file to
direct Googlebot, but do not mistake it for a security measure.

Google says: "Use the robots.txt file on your webserver. This file tells
crawlers which directories can or can not be discovered."
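
A minimal robots.txt, assuming you want to keep your cgi-bin and a
private directory out of the index (the paths are illustrative):

    # robots.txt - place it in the root directory of your site
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/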

13. Create a site map

A site map is simply one page of your website where you walk your users
through the structure of your site. In its basic form, the site map is a
page that lists all of your pages, each with a brief description and a
link; all in text, of course. When you build the site map, follow all the
rules above, and do not forget that the purpose of the site map is to
guide your human visitors.

Google says: "Offer a site map to your users with links that point to the
important parts of your site. If the site map has more than 100 links,
you may want to break it into separate pages."
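
A sketch of such a page for our fictitious JDPromos site: all plain text
links, each with a brief description:

    <h1>JDPromos Site Map</h1>
    <ul>
      <li><a href="catalog.html">Catalog</a> - all our branded products</li>
      <li><a href="mugs.html">Coffee mugs</a> - printed mug samples</li>
      <li><a href="contact.html">Contact</a> - how to reach us</li>
    </ul>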

14. Use the Google Sitemaps project

At the time of writing, the fastest, best and most accurate way to ensure
that your website is properly crawled and indexed by Googlebot is to
participate in the Google Sitemaps project.

In short, you build a site map as an XML page and submit it directly to
Google. Google then sends Googlebot to index your site. On top of being
free and fast, the service also gives you a good amount of statistics and
the ability to correct potential errors on your site.

Please note that the XML sitemap of the Google Sitemaps project is
designed specifically for Googlebot, while the site map described in the
previous rule is meant for human users.

And before you shrink back from even that: a Google Sitemap can also be a
very simple text file. You will find all the necessary information and
directions at https://www.google.com/webmasters/sitemaps
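
For reference, a minimal XML sitemap in the standard sitemaps.org format
looks like this (the URLs and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.jdpromos.com/</loc>
        <lastmod>2010-02-17</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <url>
        <loc>http://www.jdpromos.com/catalog.html</loc>
      </url>
    </urlset>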

Good luck!
