Monday, May 7, 2012

[Rails] Re: out of memory generating huge csv from active record

On Monday, May 7, 2012 6:10:43 PM UTC-3, Jedrin wrote:

 I am trying to generate really large CSV files from Active Record
(this is actually an end-case test). My test set is 700,000 records,
which is very large. The kind of find() below is supposed to work in
pages rather than holding all the records in memory, but I get an
out-of-memory error (see stack dump below), and the print of the
count never appears.


No, your code loads *all* records:

find(:all).each does exactly that.

Switch to find_in_batches or find_each instead.

See documentation:

http://api.rubyonrails.org/classes/ActiveRecord/Batches.html
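A minimal sketch of the batching idea behind find_each, using plain Ruby and the stdlib CSV so it runs without a database. The fetch_batch method below is a hypothetical stand-in for a paged SQL query (SELECT ... LIMIT batch_size OFFSET offset); in a real Rails app you would just call Model.find_each and let Active Record do the paging:

```ruby
require 'csv'

# Fake data set standing in for database rows.
RECORDS = (1..10).map { |i| { id: i, name: "user#{i}" } }

# Stand-in for a paged query: returns one slice of the data set.
def fetch_batch(offset, batch_size)
  RECORDS[offset, batch_size] || []
end

# Yields records one at a time, but only ever holds one batch
# in memory -- the same principle find_each uses.
def each_record(batch_size: 3)
  offset = 0
  loop do
    batch = fetch_batch(offset, batch_size)
    break if batch.empty?
    batch.each { |record| yield record }
    offset += batch_size
  end
end

csv = CSV.generate do |out|
  out << %w[id name]                           # header row
  each_record { |r| out << [r[:id], r[:name]] }
end

puts csv
```

With find_each the memory footprint stays roughly constant regardless of table size, because each batch (1000 records by default) is discarded before the next one is loaded.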

--
Luis Lavena

--
You received this message because you are subscribed to the Google Groups "Ruby on Rails: Talk" group.
To view this discussion on the web visit https://groups.google.com/d/msg/rubyonrails-talk/-/3L1VofbN8F0J.
