This survey has been on the SearchTools
site since December 1998. The following report summarizes the responses
rating search tools that received many responses (6 or more entries) as of
October 31, 2000. The respondents are a self-selected group, and the results
are probably not statistically significant, but they are interesting.
This page summarizes the ratings for search engines: for information
on web site size, audience, server location, etc., see the Survey
Results page.
Conclusions
Search engine requirements are more complex and idiosyncratic than
they appear at first. A web site may have dynamic pages, or be missing
page descriptions, or change often, requiring a flexible indexer to
adjust to these conditions. An intranet may have power searchers and
complex frames containing binary data recognized by special client modules,
while a topical portal may have customers who perform many single-word
searches. No one search engine is best for everyone, but some have consistently
happy users while others are rated very badly. If you read the comments,
you can learn a great deal from the experiences of our survey takers!
Methodology
This survey is not at all rigorous: because the respondents are
self-selected, the results are not statistically significant, but they
are interesting. While the ratings vary for each engine, many respondents
provided enlightening comments. When reading the disadvantages section,
please check the date entered: many of these search engines have been
updated recently.
In the survey, we included a section about the site search tools that
web administrators are using. The questions asked for the name of
the product and allowed respondents to rate their tool, using this measure:
We also asked respondents how long they had been using the engine --
those who are new to a product may not be the most reliable evaluators.
And we provided fields to describe what they liked and disliked about
the product.
Disclosure: The SearchTools.com site is provided as a free service to the web
development community and is not sponsored by any advertisers. However, Search
Tools Consulting has provided analysis and information to search engine developers
including Atomz, AltaVista, Siderean, Google, Inktomi, Maxum (Phantom) and Mondosoft
(MondoSearch). We do not give them site visitor or survey personal information
or allow our relationships with any vendors to change any product review or analysis.
Twelve search engines received 6 or more survey responses. Comments from
users indicate that the highly-rated products provide solid indexing
and search functions, and are flexible and easy to administer. Those with
lower ratings tend to be older packages which don't have configurable
indexing, adjustable results rankings, or good administration interfaces.
These search tools had five or fewer responses, so we didn't even try
to average the ratings:
Alkaline, AltaVista Search, Apple e.g., FreeFind, Fluid Dynamics, Harvest,
Hummingbird (Fulcrum), hyperseek, iCat, ICE, iFilter, Inquery, Intelligent
Miner for Text, Inmagic DB/TextWorks, Lycos Site Spider, OfficeVision
400, Oracle ConText, Perlfect Search, Phantom, PicoSearch, NetCreations
PinPoint, PLWeb, SiteMiner, SiteServer, WAIS & wais-sf, WebIndex/WebFind,
WebSTAR Search, WhatUSeek.
While each individual engine had few responses, the comments
in the responses are quite enlightening.
This report also covers custom-written and home-made search engines. Of
these 21 entries, only a few respondents were satisfied with their search
engines. Complaints included problems with coverage, duplicate results,
relevance ranking, and speed. Given these problems, and the number of solid
search engines available, whether as commercial products, remote services or
open-source code, we recommend against custom search engines.