This is a failed proposal. Consensus for its implementation was not established within a reasonable period of time. If you want to revive discussion, please use the talk page or initiate a thread at the village pump.
Search engines such as Google and Bing deliver search results by using computer programs called web crawlers to 'surf' the internet, looking for new pages to add to their search indexes and for updates to previously crawled pages. These potentially intrusive programs are governed by a set of standards (the Robots Exclusion Protocol, commonly implemented via a site's robots.txt file) that lets website owners control which pages crawlers may visit and which links they may follow to reach new pages. In the context of Wikipedia, this means that we have the ability to control which pages are accessible to web crawlers, and hence which pages are returned by search engines such as Google.
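As an illustration of the standard described above, the sketch below uses Python's standard-library `urllib.robotparser` to evaluate a small, hypothetical set of robots.txt rules (the `User-agent`, `Disallow`, and `Allow` lines and the `example.org` URLs are illustrative assumptions, not actual Wikipedia configuration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: '*' applies to all crawlers;
# Disallow blocks a path prefix, Allow permits one.
rules = """\
User-agent: *
Disallow: /wiki/Special:
Allow: /wiki/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# An ordinary article page is crawlable...
print(parser.can_fetch("Googlebot", "https://example.org/wiki/Main_Page"))       # True
# ...but a page under the disallowed prefix is not.
print(parser.can_fetch("Googlebot", "https://example.org/wiki/Special:Search"))  # False
```

A crawler that honours the protocol checks these rules before fetching a page, which is why adjusting them changes what ultimately appears in search engine indexes.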