The ACAP (Automated Content Access Protocol) proposal was designed by a consortium of publishers seeking greater control over how search engines index their content.
An excerpt from AP:
The Automated Content Access Protocol proposal, unveiled Thursday by a consortium of publishers at the global headquarters of The Associated Press, seeks to have those extra commands — and more — apply across the board.
With the ACAP commands, sites could try to limit how long search engines retain copies in their indexes, for instance, or tell the crawler not to follow any of the links that appear within a Web page.
Web administrators use the robots.txt file to tell search engine crawlers and indexers which portions of a Web site are available for indexing. This unofficial standard was developed in 1994, and most engines comply with it.
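A minimal sketch of how a crawler honors this convention, using Python's standard `urllib.robotparser` module; the robots.txt content and URLs below are hypothetical examples:

```python
# Sketch: checking crawl permission against a robots.txt file,
# using Python's standard-library urllib.robotparser.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a site might serve at /robots.txt:
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler checks each URL before fetching it.
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
```

ACAP, by contrast, aimed to layer richer commands on top of this simple allow/disallow model.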
The problem is that the new proposal is not backed by any major search engine, and compliance with it is entirely voluntary.