What are Robots and Crawlers?


Web robots (also known as web wanderers, crawlers, or spiders) are programs that automatically navigate the internet, visiting websites and retrieving information. Search engines such as Google, Yahoo, and Bing use them to index website content. Spammers and others use them to scan for other things, such as email addresses.

The first question people ask is, "Can I stop a robot/spider from scanning my site?" The answer is yes, you can, provided the robot respects the robots.txt file in your site's root folder.
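As a sketch of how this works, a robots.txt file lists rules that well-behaved crawlers check before fetching a page. Python's standard-library `urllib.robotparser` can read such rules; the example below parses a hypothetical two-line robots.txt that blocks a `/private/` directory (the paths and domain are illustrative, not from any real site):

```python
import urllib.robotparser

# A minimal robots.txt that asks all robots ("User-agent: *")
# to stay out of the /private/ directory.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A polite crawler checks can_fetch() before requesting a URL.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False: disallowed
print(rp.can_fetch("*", "https://example.com/index.html"))         # True: allowed
```

Note that robots.txt is purely advisory: the "not so friendly" robots described below simply ignore it.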

The not-so-friendly ones crawl your website looking for forms and then probe those forms for holes that would let them into your database. That is why we ask you to verify you are not a robot when logging in and doing other things that access the database.
