Chunking of Updates

While I was looking at Edition-Based Redefinition, I was puzzled by the performance of moving large data volumes across editions. So I started thinking about how to get the cross-edition trigger to do its work in parallel. Turns out I was thinking upside down... or maybe inside out.

Oracle Database 11g Release 2 introduces a new package called DBMS_PARALLEL_EXECUTE, and its RUN_TASK procedure lets me apply a cross-edition trigger while the update runs in chunks across parallel jobs. There is my answer to how to get the redefinition of the data to happen in parallel. Eureka... we had already figured this out; it just took me a while to see the connection...
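For the curious, here is a minimal sketch of that workflow. The table SCOTT.STAFF, the trigger name STAFF_FWD_XED, and the chunk size and parallel level are all hypothetical; the identity update (SET last_name = last_name) is just a way to touch every row so the forward cross-edition trigger transforms it:

    BEGIN
      -- Create a named task and carve the table into ROWID-range chunks
      DBMS_PARALLEL_EXECUTE.CREATE_TASK (task_name => 'xed_upgrade');
      DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_ROWID (
        task_name   => 'xed_upgrade',
        table_owner => 'SCOTT',
        table_name  => 'STAFF',
        by_row      => TRUE,
        chunk_size  => 10000);

      -- Run the identity update chunk by chunk; APPLY_CROSSEDITION_TRIGGER
      -- names the forward cross-edition trigger to apply, so the row
      -- transformation happens inside the parallel chunk jobs
      DBMS_PARALLEL_EXECUTE.RUN_TASK (
        task_name                  => 'xed_upgrade',
        sql_stmt                   => 'UPDATE scott.staff
                                          SET last_name = last_name
                                        WHERE rowid BETWEEN :start_id AND :end_id',
        language_flag              => DBMS_SQL.NATIVE,
        apply_crossedition_trigger => 'STAFF_FWD_XED',
        parallel_level             => 4);
    END;
    /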

Even if you are simply interested in avoiding large rollback segments, or looking for a way to apply updates in small, controlled batches, have a look at the documentation for DBMS_PARALLEL_EXECUTE. The nice thing about chunking is that you get restartability and fine-grained control over the update without having to build all of that infrastructure yourself.
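Here is a sketch of the restartability side, reusing the hypothetical task name from above: a task whose chunks partly failed can simply be resumed, and only the failed chunks are re-processed.

    DECLARE
      l_status NUMBER;
    BEGIN
      l_status := DBMS_PARALLEL_EXECUTE.TASK_STATUS('xed_upgrade');

      -- If any chunk failed (or the session crashed), re-run only
      -- the chunks that did not complete successfully
      IF l_status IN (DBMS_PARALLEL_EXECUTE.FINISHED_WITH_ERROR,
                      DBMS_PARALLEL_EXECUTE.CRASHED) THEN
        DBMS_PARALLEL_EXECUTE.RESUME_TASK('xed_upgrade');
      END IF;

      -- Clean up the task metadata once everything is done
      IF DBMS_PARALLEL_EXECUTE.TASK_STATUS('xed_upgrade')
           = DBMS_PARALLEL_EXECUTE.FINISHED THEN
        DBMS_PARALLEL_EXECUTE.DROP_TASK('xed_upgrade');
      END IF;
    END;
    /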

