WASHINGTON DC -- Websites draw two kinds of visitors: humans, who peer around, look at the graphics, weigh the links and click slowly; and spiders, the automated scanners sent by search engines like Google or, more ominously, by malicious attackers, competing businesses and spammers harvesting e-mail addresses.
Fortunately, it has always been fairly easy to tell the two apart in server logs and to block unwanted or anti-social crawlers. But research presented at the ShmooCon hacker conference here Friday may change that.
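The traditional detection the research undermines works roughly like this: crawlers tend to announce themselves in the User-Agent field of each logged request, and well-behaved ones fetch /robots.txt before anything else. A minimal sketch of that kind of heuristic (the bot names and log entries here are illustrative, not from Hoffman's research):

```python
# Sketch of traditional log-based crawler detection -- the sort of
# forensics a browser-impersonating crawler can defeat. The agent
# strings and paths below are illustrative examples.

KNOWN_BOT_AGENTS = ("googlebot", "bingbot", "slurp", "wget", "curl")

def looks_like_crawler(user_agent: str, path: str) -> bool:
    """Heuristic check based on telltale signs in a server log entry."""
    agent = user_agent.lower()
    if any(bot in agent for bot in KNOWN_BOT_AGENTS):
        return True
    if path == "/robots.txt":  # polite bots request this file first
        return True
    return False

# Illustrative (user_agent, requested_path) pairs from a server log
requests = [
    ("Mozilla/5.0 (Windows NT 10.0) Gecko/20100101 Firefox/115.0", "/index.html"),
    ("Googlebot/2.1 (+http://www.google.com/bot.html)", "/robots.txt"),
]

for agent, path in requests:
    label = "crawler" if looks_like_crawler(agent, path) else "human"
    print(path, "->", label)
```

A crawler that sends a mainstream browser's User-Agent string, skips robots.txt and paces its requests like a person slips past checks of this kind, which is precisely the gap Hoffman's tool exploits.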
Billy Hoffman, an engineer at the Atlanta company SPI Dynamics, unveiled a new, smarter web-crawling application that behaves like a person using a browser rather than like a computer program. "Basically this nullifies any traditional form of forensics," says Hoffman.