Wednesday, October 8, 2008

Create Lots of Pages

Websites with many pages generally rank better than sites with just a few pages, all
other things being equal. It is better to have a 50-page site with short pages than a
5-page site with long, flowing pages. Each page should, however, contain a
minimum of about 200 visible words of text to maximize its relevance with
Google.

Short pages are also indexed faster and download faster. Studies show you lose
10% of your visitors for every second it takes your page to download and display in
their browser; much beyond 5 seconds and you might as well forget it, because
people will click elsewhere.

Also, you need pages with real content. Don’t just create a lot of “fluff” pages that
are standard fare anyway, such as the About Us and Contact Us pages.

Keep your web pages simple from a coding standpoint. Try to avoid gratuitous
animations, junk graphics, large imagemaps, JavaScript, or anything else that may
get in the way of Google or, more importantly, of your customers getting the
message you are trying to get across on your site.

Also, be sure to break up your pages using <h1>, <h2>, and <h3> headings, and
include your keywords in these headings. Not only will this help visitors read your
pages more quickly by providing visual separators on the page, it will also give your
pages more relevance with Google.
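
For example, a page targeting the phrase “organic dog food” (a placeholder phrase,
not from this guide) might be structured like this:

    <h1>Organic Dog Food</h1>
    <p>Opening paragraph introducing the topic...</p>
    <h2>Why Choose Organic Dog Food?</h2>
    <p>Supporting content...</p>
    <h3>Common Organic Dog Food Ingredients</h3>
    <p>More detail...</p>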

Strive to have only one topic per page, and then optimize that page for that
particular topic (keyword phrase). Write content by hand; don’t be lured into using
software programs that generate web pages from “templates.” In general, such
pages will look cookie-cutter, and Google may consider them duplicate pages.


Caution: Don’t create pages that are all identical or nearly so. Google may
consider them to be spam or duplicates, and your page (or site) may be penalized.
Pages full of high-quality, unique, keyword-rich content are a must. Be careful if you
publish both HTML and PDF versions of the same content, because Google will index both.

To prevent this, create a robots.txt file and place it in the main (root) directory on
your server. A robots.txt file specifies which directories and file types to exclude from
crawling. If your PDF files are duplicates of your HTML files, put all the PDF files in a
separate directory and specify that this directory be excluded from crawling. For more
information on creating a robots.txt file, see
http://www.searchengineworld.com/robots/robots_tutorial.htm.
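
For example, if you placed your duplicate PDF files in a directory named /pdf/ (the
directory name here is just an example), your robots.txt file could look like this:

    # Keep all crawlers out of the duplicate PDF directory
    User-agent: *
    Disallow: /pdf/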

Here is a sample website with pages you should consider for your site:

• Home page
• Your main product, service, or content pages (this is the meat of your site)
• FAQ page(s) (Frequently Asked Questions)
• Sitemap page (links to each page on your site; see the sketch after this list)
• About Us page
• Contact Us page
• Related Links page(s) (discussed later)
• Link to Us page (discussed later)
• Testimonials page
• Copyright, Disclaimers, Privacy Policy page
• Ordering page
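
As a rough sketch, a sitemap page is simply an ordinary HTML page that links to
every other page on the site (the file names below are placeholders, not from this
guide):

    <h1>Sitemap</h1>
    <ul>
      <li><a href="index.html">Home</a></li>
      <li><a href="products.html">Products</a></li>
      <li><a href="faq.html">FAQ</a></li>
      <li><a href="links.html">Related Links</a></li>
      <li><a href="about.html">About Us</a></li>
      <li><a href="contact.html">Contact Us</a></li>
    </ul>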

Lastly, adding more pages to your site is one of two ways of increasing your site’s
total PageRank (PR) value. PR is assigned on a per-page basis, but it can be
channeled or distributed among the pages of your site. This important concept will be
discussed later on.
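
As a preview of why PR can be channeled this way, here is the simplified PageRank
formula from the original Google paper, where T1 through Tn are the pages linking
to page A, C(T) is the number of outbound links on page T, and d is a damping
factor (commonly cited as 0.85):

    PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)

Each page you add both carries its own PR and passes a share of it along through
the links you give it.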
