How to disable Rails sessions for web crawlers?


Question


It used to be that a line like this in the application controller would disable sessions entirely for a request:

session :off, :if => Proc.new {|req| req.user_agent =~ BOT_REGEX}

With Rails 3.x, this is either deprecated or no longer works. I realize that the new approach is that sessions are lazily loaded, but the execution flow through the app still touches the session even when the request comes from a web bot.
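
For example, one common path that touches the session (assuming the default CSRF protection is enabled):

class ApplicationController < ActionController::Base
  # Rendering any form in a view calls form_authenticity_token, which
  # writes session[:_csrf_token] -- so a session is created even when
  # the request comes from a crawler that will never submit the form.
  protect_from_forgery
end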

So is there some new mechanism that could be used to disable sessions on a per-request basis?


Answer 1:


There doesn't appear to be a built-in way to do this in Rails 3, but you can monkey patch SessionHash to get a similar result:

class ActionDispatch::Session::AbstractStore::SessionHash
  private

  # Skip the lazy load when the request comes from a bot, so the
  # session is never marked as loaded and never written back.
  def load_for_write!
    load! unless loaded? || (@env['HTTP_USER_AGENT'] =~ BOT_REGEX)
  end
end

This will prevent the session store object from being created. You can still assign into the session hash and read that data back later in the same request, but because the session is never marked as loaded, nothing is persisted when the request finishes.
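
One quick way to verify the patch is an integration test asserting that no session cookie is issued for a bot user agent. A rough sketch, assuming a root route exists and that no other middleware sets cookies of its own:

require 'test_helper'

class BotSessionTest < ActionDispatch::IntegrationTest
  test "no session cookie is set for bot user agents" do
    # The user agent string here is an illustrative assumption.
    get "/", {}, { "HTTP_USER_AGENT" => "Googlebot/2.1 (+http://www.google.com/bot.html)" }
    assert_nil response.headers["Set-Cookie"]
  end
end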




Answer 2:


I'm assuming @jordoh's answer works for the default Rails cookie_store, but it didn't work for me because I store my sessions in the database with active_record_store.

If you're using the active_record_store and want to stop sessions from being created for bots, this works:

# Save this file as config/initializers/session_store_ext.rb 
# and don't forget to define BOT_REGEX

class ActiveRecord::SessionStore
  # Capture an unbound reference to the original method so the
  # patched version can delegate to it for non-bot requests.
  _set_session = instance_method :set_session

  define_method :set_session do |env, sid, session_data, options|
    unless env['HTTP_USER_AGENT'] =~ BOT_REGEX
      _set_session.bind(self).call env, sid, session_data, options
    end

    # set_session is expected to return the session id whether or
    # not anything was saved.
    sid
  end

  private :set_session
end
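
For reference, BOT_REGEX could be defined in the same initializer; the pattern below is only an illustrative assumption, not an exhaustive crawler list:

# Illustrative only -- extend it for the crawlers you actually see in your logs.
BOT_REGEX = /bot|crawl|spider|slurp|mediapartners/i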

I've written a blog post detailing this - Conditionally Disabling Database Sessions in Ruby on Rails 3



Source: https://stackoverflow.com/questions/9987797/how-to-disable-rails-sessions-for-web-crawlers
