
Tuesday, July 5, 2011

On Page Optimization

What is On Page Optimization?
On-page factors relate directly to the content and structure of the website. This normally means pages written in Hypertext Markup Language (HTML), but it also applies to other document formats that search engines index, such as Microsoft Word or PDF files. On-page optimization involves adjusting keyword frequency in the URL, title, headings, hypertext links and body text. It may also involve reducing redundant HTML code produced by Web page authoring tools and restructuring the site to produce better-linked and more focused page content. In short, on-page optimization is the process of structuring the various elements of an individual web page so that search engines can find that page for specific keywords or keyword phrases.

1. Meta Tag Optimization:-
A meta tag is an HTML tag that provides information about a Web page's content, such as which HTML specification the page follows or a description of what the page contains. Meta tags do not affect how a Web page is displayed in a browser window. They are placed in the HEAD section of your page's HTML code, usually just below the title tag.
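As an illustration, a basic HEAD section with meta tags might look like the sketch below (the domain and description text are placeholders, not taken from a real site):

```html
<head>
  <!-- The title tag: shown in the browser tab and in search results -->
  <title>Blue Widgets | Widgets Inc.</title>
  <!-- The description is often used as the snippet shown in search results -->
  <meta name="description" content="Buy durable blue widgets with free shipping.">
  <!-- Declares the character encoding of the page -->
  <meta charset="utf-8">
</head>
```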

2. Robots.txt for Search Engines:-
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. It works like this: a robot wants to visit a Web site URL, say http://www.example.com/welcome.html. Before it does so, it first checks for http://www.example.com/robots.txt, and finds:

User-agent: *
Disallow: /
The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site. There are two important considerations when using /robots.txt:

• Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email-address harvesters used by spammers, will pay no attention to it.

• The /robots.txt file is publicly available. Anyone can see which sections of your server you don't want robots to visit.
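A more common setup than blocking the whole site is to block just a few areas while leaving the rest open. A sketch (the directory names are only illustrations):

```
User-agent: *
# Keep robots out of scripts and private pages, allow everything else
Disallow: /cgi-bin/
Disallow: /private/
```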

6. URL Rewriting:-
URL rewriting gives you the ability to transparently redirect one URL to another, without the user's knowledge. This opens up all sorts of possibilities, from simply redirecting old URLs to new addresses, to cleaning up the 'dirty' URLs produced by a poor publishing system, giving you URLs that are friendlier to both readers and search engines. Static URLs are generally considered better than dynamic URLs for a number of reasons:

1. Static URLs typically rank better in search engines.

2. Search engines are known to index the content of dynamic pages more slowly than that of static pages.

3. Static URLs are friendlier looking to end users.

Example of a dynamic URL:
http://www.widgets.com/product.php?categoryid=1&productid=10

A URL-rewriting tool, or rewrite rules on the web server, can convert dynamic URLs like this into static-looking HTML URLs.

Example of the above dynamic URL rewritten:
http://www.widgets.com/product-categoryid-1-productid-10.htm
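On an Apache server this kind of rewrite is typically done with the mod_rewrite module, for example in an .htaccess file. A minimal sketch, assuming the URL patterns shown above:

```apache
RewriteEngine On
# Map the static-looking URL back to the real dynamic script:
#   /product-categoryid-1-productid-10.htm  ->  /product.php?categoryid=1&productid=10
RewriteRule ^product-categoryid-([0-9]+)-productid-([0-9]+)\.htm$ product.php?categoryid=$1&productid=$2 [L]
```

The site then links only to the static-looking .htm form; visitors and search engines never see the query string.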

7.  Anchor Text Optimization:-
Anchor text is the visible, clickable text of a link. In a well-built site, anchor text is used to indicate the subject and title of the page it links to. Search engines place a great deal of importance on anchor text: keywords in anchor text enhance the relevance of the target page for those keywords. A mistake some newcomers make is to put their website name into the anchor text. That is fine if your domain contains your keywords, but if not, it wastes a perfectly good link; many websites no longer rank well because they use only their official company name as anchor text. Vary the text across your links, and use descriptive anchor text for links within your own site as well.
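For example (reusing the illustrative widgets.com domain from the URL-rewriting section):

```html
<!-- Descriptive, keyword-rich anchor text that tells both users and
     search engines what the target page is about -->
<a href="http://www.widgets.com/blue-widgets.htm">blue widgets</a>

<!-- Weaker: the anchor text carries no information about the target page -->
<a href="http://www.widgets.com/blue-widgets.htm">click here</a>
```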

8.  Image Optimization:-
Optimizing images for the web is a tricky business: you have to strike the right balance between file size and picture quality. It is an essential step, though. Look at almost any webpage and you will see that most of its load time comes from images; your website will be needlessly slow if you don't reduce the size of these files. There are three key areas where bytes can be shaved off your graphics: bit depth (the number of colors), resolution, and dimension. The focus here is on the practicalities of web graphics optimization rather than the technicalities. You don't need an expensive graphics editor to compress your images; there are plenty of free utilities and shareware that will do the job for you.
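On the dimension front, one simple practice is to resize the image file itself to the size at which it is displayed, rather than letting the browser scale a large file down. A sketch (the file name and sizes are illustrative):

```html
<!-- The file was resized to 200x100 before upload, so no oversized download
     is wasted; explicit width/height let the browser lay out the page
     before the image finishes loading -->
<img src="product-photo-200x100.jpg" width="200" height="100"
     alt="Blue widget product photo">
```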

9.  Heading Optimization:-
H1 & H2 headings, or header tags as they are often called, are a fairly heavily weighted element of on-page content, and if leveraged properly they can be effective in search engine optimization. The header tags (H1, H2, H3 and so on) mark the beginning of a new section or area of a page's content and alert search engine spiders to the relevancy of the content that follows the heading tag. Heading tags should reside in the body of a document and should precede the relevant content segments. More is not better, so don't overuse them: limit each page to one H1 and one or two H2s, then use H3s. Include your exact target phrases within heading tags, but use variations to avoid obvious repetition.
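The structure described above can be sketched like this (the phrases are placeholders):

```html
<body>
  <h1>Blue Widgets</h1>            <!-- one H1: the page's main topic -->
  <p>Introductory content about blue widgets...</p>

  <h2>Choosing a Blue Widget</h2>  <!-- a variation on the phrase, not a repeat -->
  <p>Content for this section...</p>

  <h3>Sizes and Colours</h3>       <!-- deeper subsections use H3 -->
  <p>More detailed content...</p>
</body>
```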

10. Title & Content Analysis:-
The title tag is displayed at the top of your browser and is also used as the title of your page in the SERPs. You should therefore take your time and think about the best title for each page. Give every page of your website its own title; do not use one title for the whole site. A title should contain no more than around 10-12 words. Think of it as short ad copy, because that is essentially what users see on the SERPs.
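For example, each page gets its own concise, descriptive title (the titles and company name below are illustrative):

```html
<!-- Product page -->
<title>Blue Widgets - Durable Widgets with Free Shipping | Widgets Inc.</title>

<!-- Contact page: its own title, not the site-wide one -->
<title>Contact Widgets Inc. - Phone, Email and Address</title>
```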
