[Rails] Disabling sessions for web crawlers
I'm upgrading our Rails app from 2.1 to 2.3.11. Yes, this is
difficult. :-)
Our application controller contains the following line:
session :off, :if => Proc.new {|req| req.user_agent =~ BOT_REGEX}
The purpose of this line is to prevent the creation of sessions for
Googlebot and other crawlers. Since the majority of our traffic comes
from crawlers, this is a significant performance saving for us.
Now, with Rails 2.3, sessions are lazy-loaded, and I'm getting the
"Disabling sessions for a single controller has been deprecated"
warning.
It appears that the way to avoid creating sessions is now to simply
never access them. However, our application references the session all
over the place, and it seems easiest to turn them off completely up
front, guaranteeing that we won't accidentally create one.
Is such a thing still possible? Can I disable sessions completely for
a request, such that a lazy load cannot occur?
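The closest I've come up with so far is a sketch like the following
(untested against 2.3.11, and it assumes every session access in our code
goes through the controller's session helper rather than request.session
or the Rack env directly; the BOT_REGEX value here is just a placeholder
for the real pattern we already use):

    class ApplicationController < ActionController::Base
      # Placeholder pattern; our real BOT_REGEX is defined elsewhere.
      BOT_REGEX = /Googlebot|Slurp|bingbot/i unless defined?(BOT_REGEX)

      # Keep the override from ever being routable as an action.
      hide_action :session

      # For crawler requests, hand back a throwaway in-memory hash instead
      # of the real session. The lazy session object in the Rack env is then
      # never touched, so -- as far as I can tell -- the session middleware
      # has nothing to load or persist and no session cookie gets set.
      def session
        if request.user_agent =~ BOT_REGEX
          @bot_session ||= {}
        else
          request.session
        end
      end
    end

Flash data would land in the throwaway hash too (harmless for bots, I
think), but anything that calls request.session directly would bypass
this. Does that seem sound, or is there a supported way to skip the
session middleware entirely for a request?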
Thanks,
Robin