Sunday, May 11, 2008

Search Engine Basics

Search engines are what ultimately bring your website to the public: the people searching for something you have or can provide. It is therefore to your advantage to know how these engines actually work and how they present information to the customer who initiates a search.

Search engines use spiders to index websites. When you submit your website pages to a search engine by completing its required submission page, the search engine's spider will index your entire site. A 'spider' is an automated program run by the search engine system. "Spidey" visits a web site, reads the content on the actual pages, notes the site's meta tags, and follows the links the site connects to. The spider then returns all of that information to a central repository, where the data is indexed. Spidey will also visit each link you have on your website and index those sites as well. Some spiders will only index a certain number of pages on your site, so don't create a website with 500 pages!
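To make that crawl loop concrete, here is a minimal sketch in Python using only the standard library. It fetches one page, notes its meta tags, collects the text, and gathers the links a spider would follow next. The start URL is a placeholder, and a real spider would also respect robots.txt, throttle its requests, and track pages it has already visited.

```python
# Minimal spider sketch: fetch a page, note meta tags, collect links.
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.parse import urljoin

class SpiderParser(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.meta = {}      # meta-tag name -> content
        self.links = []     # outbound links to crawl next
        self.text = []      # visible page content

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")
        elif tag == "a" and "href" in attrs:
            self.links.append(urljoin(self.base_url, attrs["href"]))

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

url = "http://example.com/"   # placeholder start page
parser = SpiderParser(url)
parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
print(parser.meta)    # what the spider "notes" about the page
print(parser.links)   # where it goes next
```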

The spider will periodically return to the sites to check for any information that has changed. The frequency with which this happens is determined by the moderators of the search engine.

The index a spider builds is almost like a book: it contains a table of contents, the actual content, and the links and references for all the websites found during the crawl. A single spider may crawl and index up to a million pages a day.
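The usual data structure behind that "book" is an inverted index, which maps each word to the pages containing it. Here is a toy version; the two sample pages are made up purely for illustration.

```python
# Toy inverted index: word -> set of pages containing that word.
from collections import defaultdict

pages = {
    "page1.html": "search engines use spiders to index websites",
    "page2.html": "spiders follow links between websites",
}

index = defaultdict(set)
for url, content in pages.items():
    for word in content.split():
        index[word].add(url)

print(index["spiders"])   # {'page1.html', 'page2.html'}
```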

When you ask Google to locate information, it actually searches through the index it has created rather than the Web itself, and it will tell you how long the search took. Different search engines produce different rankings because not every engine uses the same algorithm to search its indices. Google appears to be the fastest, which is one reason it is the most popular.
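This is why a query can be answered, and timed, in a fraction of a second: it is a set lookup against the prepared index, not a fresh scan of the Web. The sketch below assumes a tiny hand-built index standing in for the one the spider produced.

```python
# Answering a query = intersecting index entries, not re-reading the Web.
import time

index = {
    "spiders":  {"page1.html", "page2.html"},
    "websites": {"page1.html", "page2.html"},
    "links":    {"page2.html"},
}

def search(query):
    words = query.lower().split()
    return set.intersection(*(index.get(w, set()) for w in words))

start = time.perf_counter()
hits = search("spiders links")
print(hits, f"found in {time.perf_counter() - start:.6f} seconds")
```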

One of the things a search engine algorithm scans for is the frequency and location of keywords on a web page, but it can also detect artificial keyword stuffing, or spamdexing. The algorithms then analyze the way pages link to other pages on the Web. By checking how pages link to each other, an engine can both determine what a page is about and gauge whether the keywords of the linked pages are similar to the keywords on the original page.
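To illustrate the link-analysis half of this, here is a heavily simplified PageRank-style sketch: pages linked to by other well-linked pages score higher. The three-page link graph and the 0.85 damping factor are illustrative only; real engines combine this kind of signal with many others.

```python
# Simplified PageRank: a page's score flows from the pages linking to it.
links = {
    "a.html": ["b.html", "c.html"],
    "b.html": ["c.html"],
    "c.html": ["a.html"],
}

rank = {page: 1.0 / len(links) for page in links}
damping = 0.85   # standard illustrative value

for _ in range(20):   # iterate until the scores settle
    new_rank = {}
    for page in links:
        incoming = sum(rank[p] / len(out)
                       for p, out in links.items() if page in out)
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank

print(rank)   # c.html ranks highest: two pages link to it
```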

Obsidian Gray
