Sunday Nov 23, 2008

Foreign Key Characteristic Values

We have encountered issues where Conversion runs in CC&B 2.2.0 (and possibly earlier versions) result in foreign key characteristics being populated with values that contain trailing spaces. Whilst CC&B handles these correctly, it appears that the majority of reporting steps defined as part of the reconciliation processes do not trim the values, resulting in no match being found.

I recommend that all reconciliation reporting of foreign key values includes the necessary 'TRIM' function calls in the selection criteria, along the lines of the sketch below.
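As an illustration, the trimming can be applied to both sides of the reconciliation comparison. The staging extract table below is hypothetical, and CI_PREM_CHAR / CHAR_VAL_FK1 are simply one example of an entity characteristic table and foreign key value column; substitute whatever tables your reconciliation reports actually read:

Select stg.old_premise_id, chr.char_val_fk1
From   stg_premise_extract stg,
       ci_prem_char chr
Where  TRIM(chr.char_val_fk1) = TRIM(stg.old_premise_id)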

Sunday Oct 19, 2008

Running Keygen for Partial Conversion Runs

Hint: Whilst the majority of the CC&B Conversion process can be run as a subset of steps dependent on the type of conversion run being executed (i.e. a Person-only conversion will only require that the Validation and Production steps specific to the Person entity be executed), it is important to note that once a project moves onto a conversion run involving a more complete 'V' structure, running all KeyGen steps becomes mandatory.

KeyGen will attempt to allocate a new CC&B key value for each record on the associated parent table, but it also creates a single 'blank' key value which is used by all Production Insert steps to link to optional key values. A prime example of this is CI_ACCT.MAILING_PREM_ID: this field is optional on the CI_ACCT records, but unless the associated CI_PREM KeyGen has been executed to create a ' ' entry on the CK_PREM table, none of the CI_ACCT records will be transferred from Staging to Production.
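To illustrate the dependency (the actual insert SQL is generated by the toolkit, and the CK_PREM column names here are assumptions for the sketch), the Production Insert for CI_ACCT effectively resolves MAILING_PREM_ID through the key table, so a blank mailing premise only survives the join if KeyGen has created the ' ' row:

-- old_prem_id / prem_id are assumed column names on CK_PREM
Select acct.acct_id, ck.prem_id As new_prem_id
From   ci_acct acct,
       ck_prem ck
Where  ck.old_prem_id = acct.mailing_prem_id   -- a ' ' value only matches if the blank CK_PREM row exists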

As a result, I recommend that once a site reaches the point where they are considering moving data from Staging to Production, they include the full suite of KeyGen steps in the Conversion run schedule (regardless of how many of the CC&B data structures they are converting over). Given that these runs are being executed against a set of empty source tables, it is not anticipated that the overall run times will be extended to an unmanageable level.

Monday Sep 01, 2008

Threading...

A common question that I get asked is.. how do we force multi-threaded conversion jobs to split the key ranges evenly across each thread instance?

Unfortunately I don't have an easy answer. The base conversion jobs take a simplistic approach of using the low and high override values (if supplied, otherwise they use values of 0000000000 and 9999999999) and then dividing this range by the number of threads to determine the range for each instance.

But.. we have found that we can force a predefined range of values and simulate threading by having separate batch controls defined for each thread, with predefined high/low values, and these can then be run concurrently through standard batch submission processes. For example, if we want to run VAL-ACCT in 4 threads and ensure that a known range of source keys between 0000000001 and 0000100000 is split between the 4 threads, we clone the existing batch control into
VAL-ACCT1, with OVRD-LOW-ID = 0000000001 and OVRD-HIGH-ID = 0000025000
VAL-ACCT2, with OVRD-LOW-ID = 0000025001 and OVRD-HIGH-ID = 0000050000
VAL-ACCT3, with OVRD-LOW-ID = 0000050001 and OVRD-HIGH-ID = 0000075000
VAL-ACCT4, with OVRD-LOW-ID = 0000075001 and OVRD-HIGH-ID = 0000100000

These four jobs can then be submitted alongside each other with Thread Number and Thread Count both set to 1, with no chance of them being rejected by the framework (which normally restricts the same batch control from running more than once at a time). The actual number of instances and the thread ranges of each will need to be adjusted in the same way as we normally do for our optimisation tasks; it is just a little harder since we have to adjust the batch control parameters rather than just the thread counts.
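If you would rather derive the override values from the actual key distribution than split the range arithmetically, something along the following lines will do it; it assumes the staging copy of CI_ACCT holds the source account keys, so adjust the table and column names to suit:

Select thread_no,
       MIN(acct_id) As ovrd_low_id,
       MAX(acct_id) As ovrd_high_id
From  (Select acct_id, NTILE(4) Over (Order By acct_id) As thread_no
       From   ci_acct)
Group By thread_no
Order By thread_no

The resulting low/high pairs can then be keyed straight into the OVRD-LOW-ID and OVRD-HIGH-ID parameters of the cloned batch controls.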

Tuesday Jul 29, 2008

Tidy Balances.. I don't really hate it

I have been asked why I tend to avoid running the Tidy Balances Conversion process, and the short answer is that 'I have never found a need to..'.

The Tidy Balances process has been designed to accept an input file of Accounts and balance 'buckets' and will produce adjustments of a specific type to ensure that the account has aged balances of these values. While this could be handy in situations where minimal transaction data is being converted into CC&B via the traditional extract-transform-and-load means (i.e. convert all source data into CC&B Adjustments and Financial Transactions and run the required Validation and Keygen processes), it does not fit well with my experience of bringing 2-5 years of history over (which seems to be a common requirement on CC&B projects) and needing to create an opening balance for the Service Agreements which is essentially just a placeholder transaction that is no longer subject to any aging or collection processing.

The effort required to create the Tidy Balances input table (i.e. summarise all historic transactions older than x days/months and generate a record on the work file for each account) can easily be serviced via the ConversionCentral data sources and some careful SQL, resulting in less effort required of the extraction team. The selection criteria for extracted data included in the standard mapping into Adjustments is normally in the format:

Select Source_Account, Adjustment_Type, Adjustment_Amount, ...
From   Source_System_Tables
Where  Transaction_Date >= Conversion_Cutoff_Date


My approach to ensuring that all transactional data is processed in the run is to include a separate ControlCentral data source which generates a single 'Opening Balance' Adjustment record as:

Select Source_Account, 'Opening Balance Adjustment Code', SUM(Adjustment_Amount), ...
From   Source_System_Tables
Where  Transaction_Date < Conversion_Cutoff_Date
Group By Source_Account, ...

With a little manipulation to attach this adjustment to the earliest converted bill, we ensure that these transactions are not reprocessed by the first CC&B bill, whilst the Account and SA balances still conform to expected values.

Given that these adjustments are older than 120 days, they are normally not subject to traditional aging and as such this solution proves to be easier (and quicker) than running a full Tidy Balances process.
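For reference, the manipulation mentioned above can be as simple as stamping the generated opening balance record with each account's earliest converted bill. All of the table, column and adjustment type names below are placeholders for the sketch, not actual toolkit structures:

Update stg_adjustment adj
Set    adj.bill_id = (Select MIN(b.bill_id) KEEP (DENSE_RANK FIRST Order By b.bill_dt)
                      From   stg_bill b
                      Where  b.source_account = adj.source_account)
Where  adj.adj_type_cd = 'OPEN-BAL'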

Tuesday Jul 15, 2008

A picture is worth a thousand words...

CC&B Conversion Process
The attached image provides an overview of how the CC&B Conversion process hangs together. Hopefully this proves useful to you in understanding the various components of the process. For in-depth detail of the Server-side conversion modules, check the CC&B online help.



Thursday Jun 19, 2008

Speeding up Conversion (or slowing down time)

Hint: One of the questions that I tend to get asked a lot is... how do we speed up conversion?

I tend to focus on a number of factors in an attempt to speed things up, but my top 11 is:

  1. Are the Staging and Production schemas on the same database/tablespace?
  2. Are the disks experiencing hotspots as a result of this shared implementation?
  3. If the two schemas are not on the same database, what is the theoretical maximum capacity of the link between the two instances?
  4. Is the machine CPU or Memory constrained?
  5. Can multithreaded runs be threaded further?
  6. Can Indexes be dropped for specific conversion steps? In most instances it is cheaper to drop the indexes before any mass insert steps and rebuild them afterwards than to wear the cost of multiple I/Os for each insert (see the sketch after this list).
  7. In instances where the indexes cannot be dropped, have these been created on their own database tablespace, with a dedicated I/O channel?
  8. For multi-threaded jobs, are the thread ranges reasonable, or can better results be achieved by running a number of single-threaded artificial high/low limit jobs? (CC&B threading does not dynamically assign high/low limits; it basically takes the max key range and divides by the number of threads, which can result in all of the work falling to the first couple of threads.)
  9. How regularly are database statistics being gathered? Massive numbers of inserts can skew the optimum access path, so regular stats collections are essential.
  10. Can complex data manipulation be better served with the creation of a temporary table on the Staging or Input schemas?
  11. Consider whether all documented Conversion steps are required. e.g. I tend to avoid running Tidy Balances and prefer to build all required components and associated values into my data extracts and mapping rules.
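As a sketch of points 6 and 9, the index handling and statistics refresh can be scripted around the big insert steps. The index, table and schema names below (ci_ft_idx1, CI_FT, CISADM) are placeholders, and whether an index can safely be dropped depends on what the surrounding validation steps read:

Drop Index ci_ft_idx1;
-- ... run the mass insert / conversion step ...
Create Index ci_ft_idx1 On ci_ft (bill_id);
-- refresh optimizer statistics once the inserts are complete
Begin
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'CISADM', tabname => 'CI_FT', cascade => TRUE);
End;
/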

Wednesday Sep 12, 2007

Problems connecting to distributed databases using Oracle10g.

Question Received: We have encountered issues connecting to distributed databases using the internal CC&B subsystems (and TNSping), resulting in the following errors:
***********************************************************************
Fatal NI connect error 12557, connecting to:
(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=CCBTEST)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=CCBCONVT)(CID=(PROGRAM=C:\spl\CCBCONVT\runtime\CLBDSYNC.exe)(HOST=CCBTEST)(USER=s_cissys))))
VERSION INFORMATION:
TNS for 32-bit Windows: Version 10.2.0.2.0 - Production
Time: 04-SEP-2007 16:26:20
Tracing not turned on.
Tns error struct:
ns main err code: 12557
TNS-12557: TNS:protocol adapter not loadable
ns secondary err code: 12560
nt main err code: 527
TNS-00527: Protocol Adapter not loadable
nt secondary err code: 0
nt OS err code: 0
and
--------------------------------------------------------------------------
--Executing Data Synchronization Process at 09-05-2007 08:50:51
--------------------------------------------------------------------------

--Connecting to the Database
--User Name : CISUSER
--Database Name : CCBCONVT

Couldn't connect to the database Oracle CCBCONVT CISUSER - ORA-12557: TNS:protocol adapter not loadable (DBD ERROR: OCIServerAttach)

Answer: After ensuring that all of the Oracle connections were making use of TNS to perform connections (via the Oracle netca command), we investigated the PATH settings and discovered that the original PATH setting
Path=D:\oracle\agent10g\jlib;D:\oracle\agent10g\bin;C:\spl\3rdparty\Perl64\site\bin;C:\spl\3rdparty\Perl64\bin;D:\oracle\product\10.2.0\db_1\perl\site\5.8.3\lib\MSWin32-X64-multi-thread\auto\DBD\Oracle;D:\oracle\product\10.2.0\db_1\bin;D:\oracle\product\10.2.0\db_1\NETWORK\ADMIN;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem
caused tnsping to return
TNS-12557: TNS:protocol adapter not loadable
but changing the path by moving the $ORACLE_HOME\bin directory to the beginning (as below)

Path=D:\oracle\product\10.2.0\db_1\bin;D:\oracle\agent10g\jlib;D:\oracle\agent10g\bin;C:\spl\3rdparty\Perl64\site\bin;C:\spl\3rdparty\Perl64\bin;D:\oracle\product\10.2.0\db_1\perl\site\5.8.3\lib\MSWin32-X64-multi-thread\auto\DBD\Oracle;D:\oracle\product\10.2.0\db_1\NETWORK\ADMIN;%SystemRoot%\system32;%SystemRoot%;%SystemRoot%\System32\Wbem
corrected this problem.
Further experimentation highlighted the fact that D:\oracle\agent10g\bin was the problematic directory in the PATH. Examining the contents of this directory revealed older versions of tnsping.exe and oranjni10.dll than those present in the D:\oracle\product\10.2.0\db_1\bin directory, as shown below.
D:\oracle\agent10g\bin
  tnsping.exe   10.2.0.2.0
  oranjni10.dll 10.2.0.1.0
D:\oracle\product\10.2.0\db_1\bin
  tnsping.exe   10.2.0.3.0
  oranjni10.dll 10.2.0.3.0
As a result we have changed our path settings and restarted our CC&B instances to resolve our connection problems.

Sunday Sep 02, 2007

Iterative Conversion

Question Received: I am toying with the idea of migrating the current information first and the remainder of the history at a later date. I have heard that the conversion tool copes with this, but haven't found any information on how it does.

Answer: The Toolkit will support iterative conversions as long as the original master data key tables (the CK_* tables) are not cleared down from Staging (the already converted transactional data would need to be cleared down) and the Production instance being migrated into is actually Production. (We have migrated into a pre-prod instance in the past and then unloaded this and loaded it into the real PROD instance, but this will not work for your situation; you need to be migrating directly into your intended environment.) In this case the migration tool will still know all about the original keys and the generated keys for the primary objects (Account, SA, etc.), and as such it will be able to link the data converted as part of a second pass onto these entities.
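In practical terms the clear-down between passes should look something like the following; the staging table names are illustrative, and the key point is that the CK_* tables are left alone:

-- remove only the already-converted transactional staging data
Truncate Table ci_adj;
Truncate Table ci_ft;
-- ...but leave CK_ACCT, CK_SA, CK_PREM, etc. untouched so the second pass
-- can still resolve the original source keys to the generated CC&B keys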

It should be noted that this may result in the original opening balances being displayed with an incorrect value (if we are talking about Financial Transactions), and also that care will have to be taken to ensure that all related objects are aligned (e.g. a Bill must have a set of bill segments, meter reads and financial transactions, and these entities cannot exist independently).

It should also be noted that subsequent runs of the conversion tool would need to be 'trimmed' to ensure that they are only doing work on the objects affected. You would not want to revalidate and migrate all Person, Account, SA, SA/SP, SP and Premise details since this information has already been processed, but you would definitely want to run the affected transactional record validation and keygen processes.

There is no real "hard-and-fast" rule around this processing since it is specific to each implementation's needs, but the majority of the effort required should be detailed in the Conversion Tool section of the online help (under Administration / The Conversion Tool). The major rule is to ensure that you only run the steps and validation/keygen steps that you need and do not do a complete rerun for your subsequent conversion.

About

Stuart Ramage

I am a Consulting Technical Manager for Oracle Corporation, and a member of the OU Black Belt Team, based in Hobart, Tasmania.
I have worked in the Utility arena since 1999 on the Oracle UGBU product line, in a variety of roles including Conversion, Technical and Functional Architect.

Contact me on:
