Robot Spider Basics
-
What is a Robot?
-
A computer program that adapts to the conditions it encounters as it runs
-
Other programs require known input (a list of zip codes, for example).
Robots don't know in advance what they'll find.
-
What is a Spider?
-
A web robot that follows links
-
Given a page, it will read the HTML and extract links to other pages
-
It repeats this process, reading pages and following links, until it
runs out of new pages (a minimal sketch follows)
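-
A minimal sketch of that crawl loop, in Python. The starting URL, the
page limit, and the use of the requests and BeautifulSoup libraries are
assumptions for illustration, not part of the original presentation.

    # Crawl sketch: fetch a page, extract its links, repeat until the
    # frontier is empty or a page limit is reached.
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    def crawl(start_url, max_pages=50):
        """Fetch pages, following links until there are no new pages left."""
        to_visit = [start_url]   # links found but not yet fetched
        visited = set()          # pages already fetched
        pages = {}               # url -> HTML text

        while to_visit and len(visited) < max_pages:
            url = to_visit.pop(0)
            if url in visited:
                continue
            visited.add(url)
            try:
                response = requests.get(url, timeout=10)
            except requests.RequestException:
                continue         # skip pages that fail to load
            pages[url] = response.text

            # Read the HTML and extract links to other pages
            soup = BeautifulSoup(response.text, "html.parser")
            for anchor in soup.find_all("a", href=True):
                link = urljoin(url, anchor["href"])  # resolve relative links
                if link.startswith("http") and link not in visited:
                    to_visit.append(link)

        return pages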
-
How do Search Engines use Robot Spiders?
-
Search Engines store words from pages in their indexes
-
Robot Spiders find those pages and retrieve the text (a sketch of simple
indexing follows)
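-
A minimal sketch of the indexing step, assuming the pages dictionary
returned by the crawl sketch above. The names build_index and search are
illustrative, not any particular search engine's API.

    import re
    from collections import defaultdict

    def build_index(pages):
        """Map each word to the set of URLs whose text contains it."""
        index = defaultdict(set)
        for url, html in pages.items():
            text = re.sub(r"<[^>]+>", " ", html)        # crude tag removal
            for word in re.findall(r"[a-z0-9]+", text.lower()):
                index[word].add(url)
        return index

    def search(index, word):
        """Return the pages that contain the query word."""
        return index.get(word.lower(), set())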
Search Engine Strategies
Dallas, November 9, 2000
Avi Rappoport, Search Tools Consulting
www.searchtools.com