On Jan 10, 7:18 pm, tvw <t...@s4r.de> wrote:
> Hi,
>
> today I ran into a problem when I had to iterate over a big result
> set. ActiveRecord produces a huge array of model instances, which
> consumed 1GB of memory on my machine. A comparable Perl DBI script
> used only a few KB to iterate over the same result set.
How many records was this? 1GB is pretty crazy.
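
If all you need is the raw row data, one way to get closer to the
Perl script's footprint is to skip model instantiation entirely. A
minimal sketch (the table and column names here are made up for
illustration):

    # select_all returns plain hashes instead of full ActiveRecord
    # objects, which is much closer to what DBI's fetch loop does.
    rows = ActiveRecord::Base.connection.select_all(
      "SELECT id, name FROM legacy_widgets"
    )
    rows.each do |row|
      # row is a plain Hash, e.g. {"id" => 1, "name" => "foo"}
      puts row["name"]
    end

This still pulls the whole result set into Ruby at once, but plain
hashes are far smaller than model instances, which is where most of
that 1GB goes.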
>
> Then it was suggested that I use batched finds (#find_each) for the
> problem. But batching basically does the same thing: it splits the
> query into several queries that each return 1000 model instances in
> an array, which still uses more memory than necessary, just not as
> much. The problem: it was much slower on my legacy database (it
> took 25 minutes, while the version without batching took 90 seconds
> and the Perl script took only 40 seconds).
This sounds like your legacy DB is somehow not indexing the ID column
- after all, instantiating all those objects only takes 90 seconds.
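
For reference, #find_each pages through the table with keyset
queries on the primary key, so an unindexed id column turns every
batch into a full table scan. A rough sketch of the fix, assuming
the legacy table really does lack the index (table name made up for
illustration):

    # What find_each(:batch_size => 1000) sends, roughly:
    #   SELECT * FROM legacy_widgets WHERE (id >= 0)
    #     ORDER BY id LIMIT 1000
    #   SELECT * FROM legacy_widgets WHERE (id > 1000)
    #     ORDER BY id LIMIT 1000
    #   ...
    # Without an index on id, each of those queries scans the whole
    # table, so total work grows with rows * batches, which would
    # explain the 25 minutes.
    class AddIdIndexToLegacyWidgets < ActiveRecord::Migration
      def up
        add_index :legacy_widgets, :id, :unique => true
      end

      def down
        remove_index :legacy_widgets, :id
      end
    end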
--Matt Jones