What is a spider?
Web crawlers (also called web spiders or web robots; in the FOAF community they are more often called web chasers) are programs or scripts that automatically fetch information from the World Wide Web according to certain rules. Less common names include ants, automatic indexers, emulators, and worms.
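As a rough illustration of "fetching pages according to certain rules", here is a minimal crawler sketch using only the Python standard library; the start URL, page limit, and breadth-first strategy are illustrative choices, not part of any specific tool described here:

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=5):
    """Fetch pages breadth-first, following the links found on each page."""
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable or non-HTML pages
        parser = LinkParser()
        parser.feed(html)
        # resolve relative links against the current page's URL
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen
```

A real crawler would also respect robots.txt, rate-limit its requests, and identify itself with a User-Agent string, which is exactly what lets site owners observe spiders in their access logs.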
Why observe spider records?
Introduction to Spider Analyzer plug-in
Spider Analyzer plug-in download