Can I prevent search engines from indexing an entire directory on my website?

Submitted by 徘徊边缘 on 2019-11-29 01:29:19

The robots.txt standard is meant for exactly this. Example:

User-agent: *
Disallow: /protected-directory/

Search engines will obey this, but the content is of course still publicly reachable (and arguably easier to discover, since the URL now appears in robots.txt), so password protection via .htaccess is an option, too.
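If you want to double-check that a deployed rule does what you expect, Python's standard-library urllib.robotparser can evaluate a live robots.txt. A minimal sketch, assuming your site is at https://example.com (a placeholder):

    # Check which URLs the robots.txt above blocks (Python stdlib only).
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetch and parse the live robots.txt

    # False: compliant crawlers should skip anything under the directory
    print(rp.can_fetch("*", "https://example.com/protected-directory/page.html"))
    # True: the rest of the site stays crawlable
    print(rp.can_fetch("*", "https://example.com/some-public-page.html"))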

What you want is a robots.txt file

The file should be in your server root, and the content should be something like:

User-agent: *
Disallow: /mybetasite/

This politely asks search indexing services not to index the pages under that directory, and all well-behaved search engines will respect it.

Indeed, robots.txt at the site root is the way to go. To add multiple entries (as the OP suggests), do as follows:

User-agent: *
Disallow: /test_directory_aaa/
Disallow: /test_directory_bbb/
Disallow: /test_directory_ccc/

Or, to take the .htpasswd route:

Add an .htaccess file to each directory you want to protect. Note that Apache honours only one AuthUserFile per context, so rather than stacking three AuthUserFile lines in a single file, put a separate .htaccess in each of /test_directory_aaa/, /test_directory_bbb/ and /test_directory_ccc/, each pointing at its own password file (the path must be an absolute filesystem path):

AuthType Basic
AuthName "Marty's test directory"
AuthUserFile /test_directory_aaa/.htpasswd
require valid-user

In each .htpasswd, add entries like:

username1:s0M3md5H4sh1
username2:s0M3md5H4sh2
username3:s0M3md5H4sh3
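The hashes above are placeholders; entries are normally created with Apache's htpasswd tool, e.g. htpasswd -c /test_directory_aaa/.htpasswd username1. If you would rather generate an entry programmatically, here is a minimal stdlib Python sketch using the {SHA} scheme that Apache also accepts (the helper name is mine; bcrypt via the htpasswd tool is the stronger choice):

    import base64
    import hashlib

    def htpasswd_sha_line(username: str, password: str) -> str:
        # Apache's {SHA} format: base64-encoded SHA-1 digest of the password.
        # SHA-1 is weak by modern standards; prefer bcrypt where available.
        digest = hashlib.sha1(password.encode("utf-8")).digest()
        return f"{username}:{{SHA}}" + base64.b64encode(digest).decode("ascii")

    print(htpasswd_sha_line("username1", "s3cret"))  # append this line to .htpasswd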

Put the following in robots.txt (which must live in the root directory) to keep your entire site out of the index:

User-agent: *
Disallow: /

Create a file called robots.txt in your public_html directory (the name must be lowercase).

Put the following code in it:

    User-agent: * 
    Disallow: /foldername/

where foldername is the name of the directory you wish to block.

Blocking a specific file type: to match the end of a URL, use $. For instance, to block any URL that ends with .xls:

User-agent: *
Disallow: /*.xls$
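Note that * and $ are extensions popularized by Google rather than part of the original robots.txt spec, so not every parser honours them. As a rough illustration of the matching semantics only, a stdlib Python sketch (the rule_matches helper and its pattern translation are mine, a simplification of what real crawlers do):

    import re

    def rule_matches(pattern: str, path: str) -> bool:
        # Translate a Disallow pattern: * matches any run of characters,
        # and a trailing $ anchors the match at the end of the URL path.
        regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
        if regex.endswith(re.escape("$")):
            regex = regex[: -len(re.escape("$"))] + "$"
        # re.match already anchors at the start, mirroring robots.txt rules
        return re.match(regex, path) is not None

    print(rule_matches("/*.xls$", "/reports/q3.xls"))   # True: blocked
    print(rule_matches("/*.xls$", "/reports/q3.xlsx"))  # False: not blocked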

Ref: http://antezeta.com/news/avoid-search-engine-indexing

http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449&topic=1724262&ctx=topic
