Chunking of Updates
By Jean-Pierre Dijcks on Nov 04, 2009
While I was looking at Edition-Based Redefinition I was puzzled by, or rather thinking about, the performance of moving large data volumes across editions. So I started wondering how to get that cross-edition trigger to do its work in parallel. Turns out I was thinking upside down... or inside out is maybe better.
In 11g Release 2 a new package called DBMS_PARALLEL_EXECUTE is introduced, and its RUN_TASK procedure allows me to actually fire a cross-edition trigger. There is my answer to getting the redefinition of the data to happen in parallel. Eureka... we already figured this out, it just took me a while to make the connection...
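As a rough sketch of what I mean (table and trigger names here are hypothetical, and you should check the RUN_TASK parameter list in your own 11.2 documentation), the cross-edition trigger can be applied while the task runs in parallel:

```sql
BEGIN
  -- Hypothetical task doing the "upgrade" update that the
  -- forward cross-edition trigger propagates into the new edition.
  DBMS_PARALLEL_EXECUTE.RUN_TASK(
    task_name                 => 'xedition_upgrade',
    sql_stmt                  => 'UPDATE customers SET phone = phone ' ||
                                 'WHERE rowid BETWEEN :start_id AND :end_id',
    language_flag             => DBMS_SQL.NATIVE,
    apply_crossedition_trigger => 'customers_fwd_xed_trg',
    parallel_level            => 8);
END;
/
```

The key bit is the apply_crossedition_trigger parameter, which tells RUN_TASK which forward cross-edition trigger to fire as each chunk is processed.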
If you are simply interested in avoiding large rollback segments, or looking for a way to do updates in small, controlled batches, have a look at the documentation for DBMS_PARALLEL_EXECUTE. The nice thing about the chunking is that you get restartability and control over the update in smaller chunks without having to build all of that infrastructure yourself.
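A minimal sketch of the chunked-update workflow (schema, table, and task names are made up for illustration) looks like this: create a task, carve the table into rowid chunks, and let RUN_TASK process and commit each chunk independently.

```sql
BEGIN
  -- 1. Create a named task to track the chunks.
  DBMS_PARALLEL_EXECUTE.CREATE_TASK(task_name => 'bulk_update');

  -- 2. Split the target table into rowid ranges of ~10,000 rows each.
  DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_ROWID(
    task_name   => 'bulk_update',
    table_owner => 'SCOTT',
    table_name  => 'BIG_TABLE',
    by_row      => TRUE,
    chunk_size  => 10000);

  -- 3. Run the update per chunk; :start_id/:end_id are bound by the
  --    package to each chunk's rowid range. Each chunk commits on its own,
  --    so undo stays small and failed chunks can be retried.
  DBMS_PARALLEL_EXECUTE.RUN_TASK(
    task_name      => 'bulk_update',
    sql_stmt       => 'UPDATE big_table SET status = ''PROCESSED'' ' ||
                      'WHERE rowid BETWEEN :start_id AND :end_id',
    language_flag  => DBMS_SQL.NATIVE,
    parallel_level => 4);

  -- 4. Clean up the task metadata when done.
  DBMS_PARALLEL_EXECUTE.DROP_TASK(task_name => 'bulk_update');
END;
/
```

Chunk status is kept in the task's metadata, which is what gives you the restartability: you can query which chunks failed and resume just those instead of redoing the whole update.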