The Free On-line Dictionary of Computing (30 December 2018):
standard for robot exclusion
robot exclusion standard
robots.txt
    A proposal to prevent the havoc wreaked by many of the
   early web robots when they retrieved documents too rapidly or
   retrieved documents that had side effects (such as voting).
   The proposed standard for robot exclusion addresses these
   problems with a file called "robots.txt" placed in the
   document root of the website.
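
   For example, a site wishing to ask all robots to stay out of
   one directory (here the hypothetical /private/) could serve
   the following robots.txt; this is a minimal sketch of the
   format, not a complete policy:

	User-agent: *
	Disallow: /private/
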
   The convention is not a W3C standard but is described in a
   note in the W3C HTML 4 specification
   (http://w3.org/TR/html4/appendix/notes.html#h-B.4.1.1).
   (2006-10-17)