Search robots are programs that index web documents on the Internet.
In 1993-94 it became apparent that search robots often indexed documents against the wishes of website owners. Robots sometimes interfered with ordinary users, and the same files were indexed multiple times. In some cases robots indexed the wrong documents: deep virtual directories, temporary information, or CGI scripts. The Robots Exclusion Standard was designed to solve these problems.
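As a minimal sketch of how the standard addresses such cases, a site owner can place a plain-text file named robots.txt at the root of the site (the directory names below are hypothetical examples):

    # robots.txt at the site root, e.g. http://example.com/robots.txt
    User-agent: *        # the rules below apply to all robots
    Disallow: /cgi-bin/  # keep robots away from CGI scripts
    Disallow: /tmp/      # skip temporary information

A robot that honors the standard requests this file before indexing and avoids any path matching a Disallow rule for its User-agent.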