Sunday, November 7, 2010

[Rails] Optimizing huge XML generation?

Hi all,

Currently I'm developing a Rails app that does a lot of heavy XML
generation for a RESTful web service. The XML representation uses the
nokogiri gem to produce output matching the format the client expects.
The problem is that the data set is quite big: around 50,000 records
pulled out of a table with millions of rows. Testing on my local
machine, it takes about 20 minutes to get a response from the request.
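
To give a rough idea of the pattern (a simplified placeholder, not my
real code; the Product model and column names here are made up), the
generation is basically "load everything, then build the document node
by node with Nokogiri":

require 'nokogiri'

# Placeholder sketch: load all ~50,000 rows as ActiveRecord objects at
# once, then build the whole XML document in memory with Nokogiri's builder.
products = Product.all

builder = Nokogiri::XML::Builder.new do |xml|
  xml.products do
    products.each do |p|
      xml.product do
        xml.sku   p.sku
        xml.title p.title
        xml.price p.price
      end
    end
  end
end

xml_payload = builder.to_xml   # sent back to the client as the response body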

Do you have any ideas on how to optimize this? In particular, I'm not
sure whether dropping ActiveRecord and using plain SQL statements to
pull out the data for the XML would make the performance dramatically
faster or not.
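
For example, I was thinking of something along these lines (again just
a rough, unbenchmarked sketch with made-up table and column names):
either keep ActiveRecord but walk the table in batches, or skip
ActiveRecord object instantiation entirely and read raw rows:

require 'nokogiri'

# Option 1: keep ActiveRecord, but fetch in batches instead of loading
# all 50,000 objects into memory at once.
builder = Nokogiri::XML::Builder.new do |xml|
  xml.products do
    Product.find_each(:batch_size => 1000) do |p|
      xml.product do
        xml.sku   p.sku
        xml.title p.title
        xml.price p.price
      end
    end
  end
end

# Option 2: plain SQL, no ActiveRecord objects -- each row comes back as a Hash.
rows = ActiveRecord::Base.connection.select_all(
  "SELECT sku, title, price FROM products"
)
builder = Nokogiri::XML::Builder.new do |xml|
  xml.products do
    rows.each do |row|
      xml.product do
        xml.sku   row['sku']
        xml.title row['title']
        xml.price row['price']
      end
    end
  end
end

I haven't measured either of these yet, so I don't know how much
difference they would make in practice.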

Thanks,
Samnang

