
Increase PageRank by Creating More Pages


Sites with many pages generally rank better than sites with only a few pages, all other things being equal.

It is better to have a 55-page site made of short pages than a 4-page site made of long ones.
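
To see why page count matters, it helps to recall the originally published PageRank formula. For a page A that is linked to by pages T1 through Tn:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here d is a damping factor (commonly quoted as 0.85) and C(T) is the number of links going out of page T. Because of the (1 - d) term, every indexed page earns a small baseline of PageRank just by existing, so a site with more pages has more total PageRank to channel among its pages. (Google's actual ranking today uses far more signals, so treat this as a simplified model.)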

Creating more, shorter pages also makes your site easier for most search engines to crawl.

Each page should, however, contain a minimum of about 200 visible words of text to maximize its relevance with Google.

Short pages are also indexed faster and download faster. Studies show you lose 10% of your visitors for every second it takes your page to download and display in their browser.

Also, your pages need real content; don't just create a lot of filler pages that say nothing of substance.

Here are some example pages you should consider for your website:

  • F.A.Q. (Frequently Asked Questions) page
  • Home page, plus main product, service, or content pages
  • About Me page
  • Affiliate page
  • Contact Us page
  • Related Links page(s)
  • Link to Us page
  • Testimonials page
  • Sitemap page (links to each page on your site; see the sketch after this list)
  • Register, login, thank-you, and checkout pages
  • Copyright, disclaimer, privacy policy, and ordering pages
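
For the sitemap page in particular, something as simple as a plain list of links works - the point is that every page on the site is reachable from it. A minimal sketch (all file names here are hypothetical):

    <HTML>
    <HEAD><TITLE>Sitemap</TITLE></HEAD>
    <BODY>
      <H1>Sitemap</H1>
      <!-- One plain link per page gives the crawler a path to everything -->
      <UL>
        <LI><A HREF="index.html">Home</A></LI>
        <LI><A HREF="faq.html">F.A.Q.</A></LI>
        <LI><A HREF="products.html">Products</A></LI>
        <LI><A HREF="about.html">About Me</A></LI>
        <LI><A HREF="contact.html">Contact Us</A></LI>
        <LI><A HREF="links.html">Related Links</A></LI>
        <LI><A HREF="testimonials.html">Testimonials</A></LI>
      </UL>
    </BODY>
    </HTML>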

Make Your Pages Better

When Google crawls your site, it typically begins at the home page and then follows each link on that page to reach all your other pages.

Google finds your home page in the first place by following a link on another site that points to yours.


As a rule, Google appears to attach more significance to files that are closer to the root folder on your server - the folder on your Web server where the home page file is located.

Remember, however, that some Web designers create many folders on the server simply to make it easier to organize large numbers of files.
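
For example, all other things being equal, the first of these two (hypothetical) URLs may be treated as more significant than the second, simply because the file sits closer to the root:

    http://www.example.com/widgets.html
    http://www.example.com/files/archive/misc/widgets.html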

  • Break up your pages with <H1> and <H2> headings, and include your keywords in those headings. Not only do headings help visitors read your pages more quickly by providing visual separators, they also give your pages more relevance with Google (see the sketch after this list).
  • Keep your web pages simple from a coding standpoint. Avoid gratuitous animations, junk graphics, large imagemaps, JavaScript, or anything else that may get in the way of Google or, more importantly, of your customers getting the message you are trying to get across on your site.
  • Strive to have only one topic per page, and then optimize that page for that particular topic (keyword phrase).
  • Write content by hand; don't be lured into using software programs that generate web pages from "templates". Such pages tend to look cookie-cutter, and Google may treat them as duplicates.
  • Lastly, adding more pages to your site is one of two ways of increasing your site's total PageRank (PR) value. PR is assigned on a per-page basis, but can be channeled or distributed among the pages of your site.
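
Here is a minimal sketch of the page structure described in the first point above - one topic per page, with the keyword phrase (the hypothetical "blue widgets") worked into the headings:

    <HTML>
    <HEAD><TITLE>Blue Widgets - Example Widget Shop</TITLE></HEAD>
    <BODY>
      <!-- One H1 carrying the page's single topic (keyword phrase) -->
      <H1>Blue Widgets</H1>
      <P>At least 200 visible words of real, unique content about blue widgets...</P>
      <!-- H2 subheadings break up the text and repeat the keyword naturally -->
      <H2>Choosing the Right Blue Widget</H2>
      <P>More content, still on this one topic...</P>
      <H2>Caring for Your Blue Widget</H2>
      <P>...</P>
    </BODY>
    </HTML>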

Don’t Nest or Bloat Your Pages With Code

Google generally has a time limit that it sets to crawl sites. If you have a large site, Google may not have time to crawl all pages during the first or second passes.

This problem can be minimized if you keep the code of your Web pages lean and clean.
This also makes your pages download faster, which improves the visitor experience; as noted above, every extra second of load time costs you visitors.

After about 5 seconds, you might as well forget it - most people will have left your site. Remember that a large percentage of people, particularly outside of the US, still use dial-up modems. This will not change anytime soon.

This means you should try not to have more code than visible content (text) on your page. Web pages are frequently composed of 80% or even 90% JavaScript and style code.

Right-click a web page and then click View Source - you will be amazed at the amount of code present.

Although Google ignores such code, it still takes time for the crawler to wade through it to find your content.
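
One practical way to keep the code-to-content ratio down is to move style and script code into external files, so the page the crawler fetches is mostly visible text. A rough before-and-after sketch (the file names are hypothetical):

    <!-- Before: hundreds of lines of code inside the page itself -->
    <STYLE>
      /* ...hundreds of lines of CSS... */
    </STYLE>
    <SCRIPT>
      // ...hundreds of lines of JavaScript...
    </SCRIPT>

    <!-- After: two short references; the bulk of the code lives in external files -->
    <LINK REL="stylesheet" TYPE="text/css" HREF="styles.css">
    <SCRIPT SRC="scripts.js"></SCRIPT>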

Caution: don't create pages that are all identical or nearly so - Google may treat them as duplicate content and filter them out. Pages full of high-quality, unique, keyword-rich content are a must. Be careful if you publish both HTML and PDF versions of the same content, because Google will index both.

To prevent this, create a robots.txt file and place it in the main (root) directory on your server. A robots.txt file specifies which directories and file types to exclude from crawling.

If your PDF files are duplicates of your HTML files, put all the PDF files in a separate directory and specify that this directory be excluded from crawling.
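
As a sketch, if the duplicate PDFs all live in a folder called /pdf/ (a hypothetical name), a robots.txt file like this in your root directory asks compliant crawlers to skip that folder:

    # robots.txt - place in the root (main) directory of your site
    # Applies to all crawlers
    User-agent: *
    # Skip the folder that holds the duplicate PDF versions
    Disallow: /pdf/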
