Saturday, June 27, 2009

Competitor Analysis


Competitor analysis is used to find out what your competitors did to get a higher ranking in Google and the other major search engines. Competitor analysis is a very important task. In competitor analysis we study the top 5 or 10 competitors and find out what they did in their title tags and meta tags.
Steps to find out what your competitor did:-

  1. Find all related competitors.
  2. Find out how important these websites are.
  3. Find their link popularity.
  4. Find the number of indexed pages.
  5. Select a website that ranks #1 in Google.
  6. Right-click on their homepage. In the small popup menu that appears, click "View Page Source".
  7. A new window opens that contains all the source code of your competitor's site.
  8. Find the title tag and meta tags and see what your competitor did.
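The last two steps can be sketched in Python. This is a minimal example using only the standard library's `html.parser`; the sample HTML below is hypothetical, standing in for the source you would copy from your competitor's "View Page Source" window.

```python
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Collects the <title> text and every <meta> tag from an HTML page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.metas = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            self.metas.append(dict(attrs))

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical competitor page source
sample = """<html><head>
<title>Daily News - Latest Headlines</title>
<meta name="description" content="Breaking news and headlines.">
<meta name="keywords" content="news, headlines">
</head><body></body></html>"""

parser = TitleMetaParser()
parser.feed(sample)
print("Title:", parser.title)
for m in parser.metas:
    print(m.get("name"), "=>", m.get("content"))
```

Running this against each top-ranked competitor lets you compare their titles and meta descriptions side by side.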

ONPAGE OPTIMIZATION

Onpage optimization is a critical part of SEO. It covers every element of your site that affects your ranking. Onpage optimization includes these entities:

  1. Page structure and layout = Page structure covers how the page looks and feels. The page must be user friendly.
  2. Title tag and meta tags = Many people say meta tags are passé, but that is not so: meta tags are still an important aspect, so do not ignore them. A meta tag is a tag that gives information about information, so it gives the search engines brief information about your page or website. The title tag must match the page's content and give a brief idea of what the page contains.
  3. H1 and H2 tags = Headings are also very important in onpage optimization. The headings must be related to the content of your page.
  4. Good content = As everyone knows, content is king. Your content must be relevant to your site's theme and also original. Sites with duplicate content are banned by Google, so remember to make your content useful and original.
  5. Proper file names = These are the names you see in the address bar. The file names on your site must be related to the page content. For example, a page about news should have a file name like news.html.
  6. Image alt tags = Google doesn't understand images or image content, so it is very important to put an alt attribute on every image.
  7. Create an HTML sitemap = After doing all the work on the programming side, create an HTML sitemap for users. The sitemap must include hyperlinks to all the pages of your site.
  8. Create an XML sitemap = Create an XML sitemap for the Google crawler. The XML file hands all the pages and their information to the Google crawler with ease. To create a sitemap, visit http://www.xml-sitemaps.com/
  9. Use Google Analytics - http://www.google.com/analytics/
  10. Use Google Webmasters - http://www.google.co.in/webmasters/start/
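If you prefer to generate the XML sitemap from step 8 yourself rather than using a generator site, the format is simple enough to build with Python's standard library. This is a sketch; the example.com URLs are placeholders for your own pages.

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Builds a sitemap.xml document listing every page of the site."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical site pages
pages = ["http://www.example.com/", "http://www.example.com/news.html"]
print(build_sitemap(pages))
```

Save the output as sitemap.xml in your site's root and submit it through Google Webmasters so the crawler can discover every page.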

Friday, June 26, 2009

Website Security Models

Do you have a website, or are you ready to create one? The first thing that comes to mind may be where to find a developer to build your site. But think again: the first and foremost thing you need to do is chalk out a security plan and policy model for your site. A security model helps you work out what type of site you can create that is secure in terms of privacy.
Here I am going to describe several security models; you can apply whichever one suits your site.


  1. The Open House: In this case, the front door and all the rooms are unlocked. Visitors are free to move around anywhere, from any room to any room. This resembles an unprotected site where users do not require any special authentication to view the information.
  2. The Owner: A case where the front door is locked but all the rooms are unlocked. The owner lives in the house and locks the front door to keep the neighbours out, but once anyone gets into the house, they are free to go into all the other rooms. This is a useful security model if a company gets a lot of outsiders passing through but only wants its employees to access the site.
  3. The Garden Party: An excellent case in which the front door is unlocked but certain rooms inside the house are locked. The owner may wish to allow people to help themselves to the bar on the front lawn and use the washrooms, but not necessarily enter the bedrooms where he has kept all of his personal things.
  4. The Paying Guest: This is a more stringent measure than the above, in which the front door is locked and certain rooms are locked as well. The guest has a key to enter the house and can get into his own room only; the other rooms are off limits. This model verifies whether or not a user is allowed to enter the site at all. Once the user is authenticated, he or she may move freely through the other rooms, as long as s/he has access to them.
  5. The Fort: A locked massive iron gate with barbed wire, the front door locked, all rooms locked too, and a watchman guarding the house. Simple: unless users have the proper credentials, certificates, or entry passes, they will not be allowed in.
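The five models above can be sketched as a small access-control check. This is purely an illustration, not a real security implementation; the model names, room names, and key flags are all hypothetical.

```python
# Each "house" model: is the front door locked, and which rooms are locked?
MODELS = {
    "open_house":   {"door_locked": False, "locked_rooms": set()},
    "owner":        {"door_locked": True,  "locked_rooms": set()},
    "garden_party": {"door_locked": False, "locked_rooms": {"bedroom"}},
    "paying_guest": {"door_locked": True,  "locked_rooms": {"bedroom", "study"}},
    "fort":         {"door_locked": True,  "locked_rooms": {"bar", "bedroom", "study"}},
}

def can_enter(model, room, has_door_key=False, room_keys=frozenset()):
    """Returns True if a visitor may reach `room` under the given model."""
    m = MODELS[model]
    if m["door_locked"] and not has_door_key:
        return False          # stopped at the front door (site login)
    if room in m["locked_rooms"] and room not in room_keys:
        return False          # the room itself requires its own permission
    return True

print(can_enter("open_house", "bar"))        # anyone may wander in
print(can_enter("owner", "bedroom"))         # blocked at the front door
print(can_enter("garden_party", "bedroom"))  # door open, bedroom still locked
```

In site terms, the front-door key is the login, and the room keys are per-page permissions; the Fort would add certificates or entry passes on top of both.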

Saturday, June 13, 2009

SEARCH ENGINE


A search engine is a tool designed to search for information on the web. The information may consist of web pages, images, and other types of files. A search engine gives you information related to your query: it runs a search for your query, calculates the relevancy of each result, and retrieves the ones you want. It is a very good tool for reducing the effort of searching for information on the web. Below I describe how search engines work.

How a search engine works:-

  1. Crawling - First of all, the search engine crawls the web for information. This task is done by a piece of software called a crawler or spider. The spider follows links from one page to another and indexes everything it finds on its way. It crawls the whole website, each and every page, and its content. After crawling, it saves the site's content in a database from where it can later be retrieved.
  2. Indexing - After a page is crawled, it is time to index its content. Essentially, indexing is the process of identifying the words and expressions that best describe the page and assigning the page to particular keywords.
  3. Processing - When a search request comes in, the search engine processes it: it compares the search string in the request with the indexed pages.
  4. Calculating relevancy - Since it is likely that more than one page contains the search string, the search engine calculates the relevancy of each page in its index to the search string and orders the results accordingly.
  5. Retrieving - After the comparison, the search engine retrieves the web pages relevant to the user's search. The retrieval process takes less than a second, and once the user has the information, the search engine's work is finished.
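The five steps above can be sketched as a toy pipeline: a few pretend "crawled" pages, a word index built from them, and a relevancy score that simply counts matching query words. Real search engines are vastly more complex; the URLs and scoring here are made up for illustration.

```python
from collections import Counter, defaultdict

# 1. Crawling (simulated): url -> page text, as if saved by the spider
crawled = {
    "http://example.com/tea":    "green tea and black tea benefits",
    "http://example.com/coffee": "coffee beans and coffee roasting",
    "http://example.com/both":   "tea and coffee compared",
}

# 2. Indexing: map each word to the set of pages containing it
index = defaultdict(set)
for url, text in crawled.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    # 3-4. Processing and relevancy: score pages by how many query words match
    scores = Counter()
    for word in query.split():
        for url in index.get(word, ()):
            scores[url] += 1
    # 5. Retrieving: most relevant pages first
    return [url for url, _ in scores.most_common()]

print(search("tea coffee"))
```

The page mentioning both query words comes back first, which mirrors the relevancy ordering described in steps 4 and 5.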