django serving robots.txt efficiently

Submitted by 隐身守侯 on 2019-12-02 21:50:34
HankMoody

Yes, robots.txt should not be served by Django if the file is static. Try something like this in your Nginx config file:

location = /robots.txt {
    # '=' makes this an exact match, so Nginx serves the file
    # directly from disk and the request never reaches Django
    alias /path/to/static/robots.txt;
}

See here for more info: http://wiki.nginx.org/HttpCoreModule#alias

The same approach applies to the favicon.ico file, if you have one.
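For example, a minimal Nginx sketch under the same assumed static path (adjust /path/to/static/favicon.ico to wherever the file actually lives):

location = /favicon.ico {
    # serve the icon directly from disk, bypassing Django
    alias /path/to/static/favicon.ico;
    access_log off;  # optional: skip logging favicon requests
}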

The equivalent directive for an Apache config is:

Alias /robots.txt /path/to/static/robots.txt
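Note that on Apache 2.4, if /path/to/static sits outside your DocumentRoot, you may also need to grant access to that directory. A minimal sketch, assuming that path:

<Directory "/path/to/static">
    Require all granted
</Directory>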

I know this is a late reply, but I was looking for a similar solution when I didn't have access to the web server config. For anyone else in the same situation, I found this page: http://www.techstricks.com/adding-robots-txt-to-your-django-project/

which suggests adding this to your project urls.py:

from django.conf.urls import url
from django.http import HttpResponse

urlpatterns = [
    #.... your project urls
    # escape the dot and anchor the pattern so only /robots.txt matches
    url(r'^robots\.txt$', lambda request: HttpResponse("User-agent: *\nDisallow:", content_type="text/plain"), name="robots_file"),
]

which I think should be slightly more efficient than using a template file, although it could make your URL rules untidy if you need multiple 'Disallow:' options.
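If the rules do outgrow a one-liner, a template-backed view keeps urls.py tidier. A minimal sketch, assuming a robots.txt file in your templates directory:

from django.conf.urls import url
from django.views.generic import TemplateView

urlpatterns = [
    #.... your project urls
    # render templates/robots.txt as plain text (the template name is an assumption)
    url(r'^robots\.txt$', TemplateView.as_view(template_name="robots.txt", content_type="text/plain"), name="robots_file"),
]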
