Multiple robots.txt for subdomains in rails

轻奢々 2021-01-31 23:21

I have a site with multiple subdomains, and I want the named subdomains' robots.txt files to be different from the www one.

I tried to use .htaccess, but FastCGI doesn't look at it.

6 Answers
灰色年华 2021-01-31 23:53

    Actually, you probably want to register a MIME type in config/initializers/mime_types.rb and render inside a respond_to block, so the response isn't returned as 'text/html':

    Mime::Type.register "text/plain", :txt
    

    Then, your routes would look like this:

    map.robots '/robots.txt', :controller => 'robots', :action => 'robots'
    

    For Rails 3 (note that in Rails 4 and later, `match` requires a `:via` option, so use `get` instead):

    match '/robots.txt' => 'robots#robots'
    

    and the controller something like this (put the file(s) where ever you like):

    class RobotsController < ApplicationController
      def robots
        # Take the first subdomain from the request and strip anything that
        # isn't filename-safe, so a crafted Host header can't escape config/
        subdomain = request.subdomains.first.to_s.gsub(/[^a-z0-9-]/i, '')
        robots = File.read(Rails.root.join("config", "robots.#{subdomain}.txt"))
        respond_to do |format|
          format.txt { render :text => robots, :layout => false }
        end
      end
    end
    

    at the risk of overengineering it, I might even be tempted to cache the file read operation...
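    That caching could be sketched as a simple in-process memoization keyed by subdomain. `ROBOTS_CACHE` and `robots_for` are illustrative names, not from the answer, and a plain directory path stands in for `Rails.root` so the sketch runs outside Rails:

    ```ruby
    # Hypothetical helper: read each subdomain's robots file from disk once,
    # then serve the memoized body on every later request.
    ROBOTS_CACHE = {}

    def robots_for(subdomain, root)
      ROBOTS_CACHE[subdomain] ||= File.read(File.join(root, "robots.#{subdomain}.txt"))
    end
    ```

    In a real app you'd likely reach for Rails.cache instead, which also survives across processes; the hash above only caches per worker.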

    Oh, yeah, you'll almost certainly have to remove/move the existing 'public/robots.txt' file.

    Astute readers will notice that you can easily substitute RAILS_ENV for subdomain...
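    The RAILS_ENV substitution amounts to selecting the file by environment instead of subdomain, e.g. config/robots.production.txt versus config/robots.staging.txt. A minimal sketch, where `env` stands in for Rails.env and `robots_for_env` is a hypothetical helper:

    ```ruby
    # Pick the robots file by environment rather than subdomain; useful for
    # disallowing all crawlers on staging while allowing them in production.
    def robots_for_env(env, root = "config")
      File.read(File.join(root, "robots.#{env}.txt"))
    end
    ```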
