PostgreSQL 8.4: Commit Inside a Function on a 3-Million-Row Table
I have a function that needs to update two columns in a table with 3 million rows. I thought a cursor would break the job into chunks and, most importantly, COMMIT each chunk's UPDATE at the end of the loop.
Instead, the function commits everything at once, all 3 million rows. I know this because I can start the function, count the NULLs in the column the cursor should be updating, come back 30 minutes later, and the NULL count hasn't changed. If I test the whole function with a LIMIT, it works as specified. So it's updating everything in a single transaction, or something like that.
I just don't have the memory to do a single update across 3 million rows, and when the table grows to 30 million rows it's definitely not going to work.
The following simplified pseudocode is what I've got now:
I declare a cursor cursor_name for a query something like select gid, (col1 * 2), (col2 * 4) from big_giant_table
I open the cursor
I FETCH cursor_name, dump the gid, col1, and col2 values into variables, and then LOOP
IF logic breaks the LOOP when it gets to the last record
I run update big_giant_table set my_first_calc = col1_sum, my_next_calc = col2_sum where gid = gid_val
-- I tried a COMMIT here, and under PostgreSQL 8.4 the function doesn't compile.
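The steps above look roughly like this as actual PL/pgSQL (table and column names are from my description above; the function name and the numeric variable types are made up):

```sql
-- Sketch of the loop described above. On PostgreSQL 8.4 the whole
-- function body runs in ONE transaction; a COMMIT inside it does
-- not compile.
CREATE OR REPLACE FUNCTION update_big_giant_table() RETURNS void AS $$
DECLARE
    cursor_name CURSOR FOR
        SELECT gid, (col1 * 2) AS col1_sum, (col2 * 4) AS col2_sum
        FROM big_giant_table;
    gid_val  big_giant_table.gid%TYPE;
    col1_sum numeric;
    col2_sum numeric;
BEGIN
    OPEN cursor_name;
    LOOP
        FETCH cursor_name INTO gid_val, col1_sum, col2_sum;
        EXIT WHEN NOT FOUND;  -- break the loop after the last record
        UPDATE big_giant_table
           SET my_first_calc = col1_sum,
               my_next_calc  = col2_sum
         WHERE gid = gid_val;
        -- COMMIT;  -- this is the line that fails to compile on 8.4
    END LOOP;
    CLOSE cursor_name;
END;
$$ LANGUAGE plpgsql;
```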
Do I need to start another cursor for the update to force the commit?
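For reference, what I'm effectively after is behavior like running batches as separate top-level statements from psql, where each one commits on its own (the gid ranges here are just made-up examples):

```sql
-- Each statement is its own transaction when run individually.
UPDATE big_giant_table
   SET my_first_calc = col1 * 2,
       my_next_calc  = col2 * 4
 WHERE gid BETWEEN 1 AND 100000
   AND my_first_calc IS NULL;

-- ...next batch: WHERE gid BETWEEN 100001 AND 200000, and so on.
```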
Last edited by mpapet; 08-26-2011 at 01:23 AM.