Upgrading Versions

When running the CC&B (and OUAF) framework upgrade scripts, it is not uncommon to encounter "duplicate record" errors. These are usually caused by custom records inserted with values that do not conform to the standard naming convention (custom additions are recommended to be prefixed with CM), or by instances where accelerator-related fields have been bundled into the base product. When these errors are encountered, the application architects should review the offending records and either change the existing values to an unused code, or delete the records altogether, before rerunning the script step (the script should pause once these errors are listed, allowing the review and correction to occur).
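
As an illustration of the kind of review query involved, the sketch below assumes the clash is in the lookup value table (ci_lookup_val, with an owner_flg column; both names should be verified against your installed release) and lists customer-owned codes that do not carry the recommended CM prefix:

    -- Illustrative only: the table, columns and prefix check are assumptions and
    -- should be adjusted to the table named in the duplicate-record error.
    SELECT field_name, field_value
      FROM ci_lookup_val
     WHERE owner_flg = 'CM'              -- customer-owned rows
       AND field_value NOT LIKE 'CM%'    -- codes that may collide with new base values
     ORDER BY field_name, field_value;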

These errors are typically encountered during the initial Package Release / Config Master upgrade, so any rollbacks required are usually fast, since these instances hold very little transactional data.

It is vitally important to recognise that all data errors must be analysed before taking any corrective action, as the risk of corrupting the installed instance is very high, and any direct data fixes circumvent the business-object-based validation rules built into the product. Ideally, all deletions should be performed via the CC&B front-end (which will require a database restore back to a known version that aligns with the application version) to ensure that referential integrity is maintained.

I have also encountered issues with these scripts when running against full-volume database instances, specifically where the upgrade script attempts to add or change column definitions on tables with large row counts. In these cases there are really only a few methods available to speed up the upgrade:

1. Make use of Oracle 11g features which speed up column changes (compared with the older Oracle 10g method of applying these changes).

2. Do a full table extract, change the table definition manually, and then reload the data back into the table.

3. Turn on parallelism for DDL operations (ALTER SESSION ENABLE PARALLEL DDL); see the sketch below. Note that this may not be appropriate depending on the number of partitions, processors and disk channels available to you.
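
A minimal sketch of option 3 follows; the degree of parallelism (8 here) is an assumption and should be sized to the partitions, CPUs and disk channels available, and the index name is hypothetical:

    -- Session-level settings; the upgrade scripts then issue the actual DDL.
    ALTER SESSION ENABLE PARALLEL DDL;

    -- Optionally force a specific degree for all parallel-capable DDL in the session:
    ALTER SESSION FORCE PARALLEL DDL PARALLEL 8;

    -- Parallel DDL benefits statements such as index rebuilds and table moves, e.g.:
    ALTER INDEX cm_t123_p0 REBUILD PARALLEL 8 NOLOGGING;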

I tend to recommend option 2, as it is the easiest to implement across all installations and is guaranteed to have the desired effect, provided the changes applied manually align with what the upgrade script expects; otherwise the script will simply attempt to realign the table definition, negating all of the manual effort expended.
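
As a rough illustration of option 2, the sketch below uses a hypothetical large table and column change; the actual table, column and datatype must be taken from the failing upgrade step, and the manual definition must match exactly what the script expects:

    -- Illustrative only: table, column and datatype are assumptions.
    -- 1. Extract the data to a staging copy (NOLOGGING/parallel to reduce elapsed time).
    CREATE TABLE cm_bseg_calc_ln_bkp NOLOGGING PARALLEL 8
    AS SELECT /*+ PARALLEL(t 8) */ * FROM ci_bseg_calc_ln t;

    -- 2. Empty the table and apply the column change manually.
    TRUNCATE TABLE ci_bseg_calc_ln;
    ALTER TABLE ci_bseg_calc_ln MODIFY (calc_amt NUMBER(18,2));

    -- 3. Reload the data with a direct-path, parallel insert.
    ALTER SESSION ENABLE PARALLEL DML;
    INSERT /*+ APPEND PARALLEL(t 8) */ INTO ci_bseg_calc_ln t
    SELECT /*+ PARALLEL(s 8) */ * FROM cm_bseg_calc_ln_bkp s;
    COMMIT;

    -- 4. Drop the staging copy once the reload has been verified.
    DROP TABLE cm_bseg_calc_ln_bkp PURGE;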
