Standup 2010-08-20: Database Woes and Eval Nuances

Help!

A Pivot is working on a project where 1,500 records in a CSV are iterated over (loaded into ActiveRecord objects) and inserted into a Postgres database… or at least that was the intention. The database just crashed on him in a repeatable way. Has anyone seen anything like this before?
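For reference, a minimal sketch of the pattern being described, assuming a hypothetical records.csv file and Widget model (not the project's actual names):

```ruby
require 'fastercsv'

# Build an ActiveRecord object for each CSV row and save it --
# roughly 1,500 individual INSERTs in total.
FasterCSV.foreach('records.csv', :headers => true) do |row|
  Widget.create!(row.to_hash)
end
```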

On another project, there’s a has_many association on a table with 40,000 records in MySQL. Pulling these records in through the standard ActiveRecord association does not appear to release memory (you can watch the memory grow as it executes). The workaround was to use ActiveRecord::Base.connection.select_all(), but it would be nice to know why GC didn’t kick in and clean house. Maybe AR is holding on to object references somewhere outside of scope?
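Roughly, the two access patterns look like this (a sketch with hypothetical Account/Transaction models and processing methods, not the project's actual code):

```ruby
# Standard association access: instantiates 40,000 ActiveRecord objects,
# and memory was observed to keep growing while this ran.
account.transactions.each do |transaction|
  process(transaction)
end

# Workaround: fetch raw rows as hashes and skip AR instantiation entirely.
rows = ActiveRecord::Base.connection.select_all(
  "SELECT * FROM transactions WHERE account_id = #{account.id}"
)
rows.each do |row|
  process_row(row)  # row is a Hash of column name => value
end
```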

Interesting Things

Speaking of scope, there’s a subtle difference in Ruby 1.9.2 that a Pivot ran into: previously you could set a variable inside a call to eval() and have that variable defined in the same scope as the call. That is no longer the case in 1.9.2, where eval gets its own local-variable scope inside the call.
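A quick way to see the difference is to assign in one eval and read the variable back in another with the default binding:

```ruby
eval "x = 42"

begin
  puts eval("x")        # Ruby 1.8.x: prints 42
rescue NameError => e   # Ruby 1.9.2: undefined local variable or method `x'
  puts "1.9.2: #{e.message}"
end
```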

And lastly, on this day in 19..(mumble mumble), NASA launched Voyager 2. Later in its career, Voyager was given its own drama series and a female captain to command the vessel.

Comments
  1. I’ve been reliably loading ~50k records from CSV into PG for a while on a project.

    Using FasterCSV in combination with AR-Extensions’ ActiveRecord::Base#import.

    http://fastercsv.rubyforge.org/

    http://www.continuousthinking.com/are/import

    AR#import does the insert without having to instantiate the AR objects, so it uses less memory and is also much, much faster (see the sketch after this comment).

    -Nick Gauthier
    SmartLogic Solutions
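
Here’s a rough sketch of the combination Nick describes, assuming hypothetical people.csv / Person names and that the ar-extensions gem is available (the exact require may differ by setup):

```ruby
require 'fastercsv'
require 'ar-extensions'   # provides ActiveRecord::Base.import

columns = [:name, :email]
rows    = []

# Collect plain value arrays -- no ActiveRecord objects are built per row.
FasterCSV.foreach('people.csv', :headers => true) do |row|
  rows << [row['name'], row['email']]
end

# One bulk INSERT instead of one INSERT per record.
Person.import(columns, rows, :validate => false)
```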
