This page is intended to give known robots access to selected pages on the site without their having to crawl to them via normal navigation. Robots, which are expected to read "robots.txt", are blocked from pages other than the following list:

/yggdrasil/Projects/WorkHorse/whtop.jsp
/yggdrasil/Projects/Ragnarok/ragtop.jsp
/yggdrasil/Projects/Trailscraps/index.jsp
/yggdrasil/Projects/BLIMP/top.jsp
/yggdrasil/Projects/index.jsp
/yggdrasil/Articles
/yggdrasil/Bob/articles.jsp
/yggdrasil/Bob/resume.jsp
/yggdrasil/Hiking/personalhikes.jsp
/yggdrasil/Hiking/personalparks.jsp
/yggdrasil/Hiking/index.jsp
/yggdrasil/Stocks/countries.jsp
/yggdrasil/Stocks/indices.jsp
/yggdrasil/Stocks/watchlist.jsp
/yggdrasil/Stocks/stockguest.jsp
/yggdrasil/robotlinks.jsp
/yggdrasil/index.jsp
/yggdrasil/welcome.jsp

But I'm NOT A ROBOT!!!

If you are actually a real person and wound up here, most likely you are not allowing cookies. See Cookie Use. Just about every web crawler ignores cookie settings, which results in it creating a new session on each request. To detect ill-mannered bots that disguise themselves as browsers, I keep track of how many sessions a particular IP address creates in a short window. If an IP creates multiple sessions while making very few requests that reuse an existing session, I start treating that IP as a robot until the pattern stops. If you don't allow cookies, you will trigger this after a few requests. Allow cookies, at least for the browser session.

You can also trigger the "robot detection" if you DO start multiple new browser sessions on your IP without making multiple requests to this site within them. The detection window is short (currently 15 minutes). You can either wait, or just poke around the site as a robot until enough requests from your IP have arrived without creating new sessions. That will unflag your robot status.
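For the curious, the detection described above can be sketched roughly as follows. This is a minimal illustration, not the site's actual code: the class name, the threshold of 5 new sessions, and the unflagging rule are all assumptions; only the 15-minute window comes from the text above.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of session-based robot detection. All names and
// thresholds here are illustrative assumptions, not the site's real values.
public class RobotDetector {
    private static final long WINDOW_MS = 15 * 60 * 1000; // 15-minute window (from the text)
    private static final int MAX_NEW_SESSIONS = 5;        // assumed threshold

    // per-IP timestamps of requests that created a new session
    private final Map<String, Deque<Long>> newSessionTimes = new HashMap<>();
    // per-IP count of recent requests that reused an existing session
    private final Map<String, Integer> reuseCounts = new HashMap<>();

    /** Record one request; returns true if the IP should be treated as a robot. */
    public boolean recordRequest(String ip, boolean createdNewSession, long nowMs) {
        Deque<Long> times = newSessionTimes.computeIfAbsent(ip, k -> new ArrayDeque<>());
        // drop new-session events that fell outside the detection window
        while (!times.isEmpty() && nowMs - times.peekFirst() > WINDOW_MS) {
            times.pollFirst();
        }
        if (createdNewSession) {
            times.addLast(nowMs);
        } else {
            // requests that reuse an existing session eventually clear the flag
            int reused = reuseCounts.merge(ip, 1, Integer::sum);
            if (reused >= MAX_NEW_SESSIONS && !times.isEmpty()) {
                times.clear();          // enough well-behaved requests: unflag
                reuseCounts.put(ip, 0);
            }
        }
        return times.size() >= MAX_NEW_SESSIONS;
    }
}
```

A browser with cookies disabled sends no session cookie back, so every request looks like a new session and the counter climbs quickly; requests that do carry the session cookie work the flag back down.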

There are some other, less likely triggers. They are not described here, to avoid giving further tips on circumventing robot detection to robot developers who don't wish to obey the robots.txt rules, including the crawl delay.

© 2005-2022, Robert L. McQueer