robots.txt to disallow all pages except one? Do they override and cascade?

春和景丽 2021-02-05 00:17

I want one page of my site to be crawled and no others.

Also, if it's any different than the answer above, I would also like to know the syntax for disallowing everything and allowing only one page.

4 answers
  •  甜味超标
    2021-02-05 01:02

    http://en.wikipedia.org/wiki/Robots.txt#Allow_directive

    The order is only important to robots that follow the standard; in the case of the Google or Bing bots, the order is not important.
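    For example, a minimal robots.txt along these lines (the path /onlythis.html is a placeholder for the single page you want crawled) allows that one page and disallows everything else for crawlers that support the Allow directive:

        User-agent: *
        Allow: /onlythis.html
        Disallow: /

    Google and Bing pick the most specific matching rule, so the order of the lines does not matter to them; robots that follow the original standard apply the first matching rule, so listing Allow before Disallow is the safer ordering.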
