The New World of Risk Management and Data Excellence: Avoid These Data Modeling Mistakes Upfront

In my previous blog on June 12th, I discussed what a solid foundation for Risk and Finance initiatives might look like. It is not just about the data model, and that is the first point I would like to address today.

Many vendors striving to deliver a warehouse designed to serve more than a handful of users within a business department tout an extensive, comprehensive logical data model. The role of a logical model is important and uncontested, but its ability to act as a true project accelerator is greatly over-valued and over-sold.

There is not a single application in financial services that can run off a third normal form (3NF) logical model. All applications need a physicalized model, and physicalizing a logical model – no matter how detailed, extensive and comprehensive – is no small or easy task.
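
To make that gap concrete, here is a minimal sketch in Python using pandas (all table and column names are invented for illustration, not drawn from any vendor model) of what even a toy physicalization involves: the normalized, 3NF-style entities have to be joined and flattened into the wide, reporting-ready shape that applications and reports actually consume.

```python
import pandas as pd

# Toy "logical" (3NF-style) entities -- names are illustrative only.
counterparty = pd.DataFrame({
    "counterparty_id": [1, 2],
    "counterparty_name": ["Acme Corp", "Globex Ltd"],
    "segment": ["Corporate", "SME"],
})
account = pd.DataFrame({
    "account_id": [10, 11, 12],
    "counterparty_id": [1, 1, 2],
    "product": ["Term Loan", "Revolver", "Term Loan"],
})
balance = pd.DataFrame({
    "account_id": [10, 11, 12],
    "as_of_date": ["2013-06-30"] * 3,
    "exposure": [1_000_000.0, 250_000.0, 400_000.0],
})

# "Physicalizing": flatten the normalized entities into one wide,
# analysis-ready table -- the shape reports and engines actually consume.
exposure_fact = (
    balance
    .merge(account, on="account_id")
    .merge(counterparty, on="counterparty_id")
)

# A business-style view: exposure by segment and product.
print(exposure_fact.pivot_table(index="segment", columns="product",
                                values="exposure", aggfunc="sum"))
```

Even in this toy case, someone has to decide which joins, grains and aggregations the physical shape should bake in – and that is before any real-world volume, history or lineage requirements enter the picture.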

A typical iteration cycle of a data-centric project is:

  • Capture business requirements
  • Map and gap them with the logical model 
  • Create a semantic model mapping to the logical model 
  • Confirm with business users the models are correct 
  • Adjust for changes to specifications from the business 
  • Adjust for changes to the application landscape (and hence the semantic model and logical model gapping) while gathering the requirements 
  • Re-confirm the specifications with the business users 
  • Adjust for changes in the business user personnel 
  • Re-confirm 
  • And so on... 

Multiply this by every line of business, major application and group of key stakeholders. Where does the budget go?

So we’re back to asking: how well does the technology truly reflect your business? What good is all the data in the world in the fastest processing environment if you have to pull it out and put it into a data mart to run your reports?

Therefore, as accelerators go, the one to value most is a well-conceived, business-relevant, wide-ranging physical model designed to drive analytics and reporting in the manner in which business users typically operate. Requirements are locked down much faster, have a far greater probability of being right the first time, and can be implemented before major changes take place. Users will see value far sooner and more meaningfully than with any other alternative.

Now, as I mentioned, it is not just about the data model, and yet many architecture discussions around supporting the needs of Risk and Finance stop right there.

In effect, does it not make sense to use this fantastic repository you have just created as the Golden Source of data for a single processing environment? What I mean by a processing environment is very simple. Consider a data warehouse already in place: in most instances, the bank will use it as the source for calculations executed in separate risk engines, and will extract data from the DW to populate a given data mart. FTP, Regulatory Capital, Credit Economic Capital, Channel Performance and so on are all analytical applications that use that data for business decisions, and each effectively has its own decision, calculation and data processing engine.

Fundamentally, whether you are producing the result of a cost allocation or the calculation of regulatory capital, you are simply ‘doing stuff’ to data. Some of that ‘stuff’ is simple and prescriptive in nature, and some is very complex, involving, for instance, stochastic calculus (think Economic Capital).
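
As a rough illustration of that spectrum (the figures, probabilities and formulas below are invented for the example and do not represent any regulatory or product methodology), a cost allocation can be a one-line pro-rata split, while an economic capital figure may call for a Monte Carlo simulation – yet both are just transformations applied to the same underlying data.

```python
import numpy as np
import pandas as pd

portfolio = pd.DataFrame({
    "line_of_business": ["Corporate", "SME", "Retail"],
    "exposure": [1_250_000.0, 400_000.0, 600_000.0],
    "default_prob": [0.01, 0.03, 0.02],          # illustrative figures only
    "loss_given_default": [0.45, 0.50, 0.40],
})

# Simple, prescriptive "stuff": allocate a shared cost pro rata to exposure.
shared_cost = 90_000.0
portfolio["allocated_cost"] = (
    shared_cost * portfolio["exposure"] / portfolio["exposure"].sum()
)

# Complex "stuff": a crude Monte Carlo sketch of economic capital --
# simulate independent defaults and take a high loss quantile.
rng = np.random.default_rng(42)
n_sims = 100_000
defaults = rng.random((n_sims, len(portfolio))) < portfolio["default_prob"].values
losses = defaults @ (portfolio["exposure"] * portfolio["loss_given_default"]).values
economic_capital = np.quantile(losses, 0.999) - losses.mean()

print(portfolio[["line_of_business", "allocated_cost"]])
print(f"Illustrative economic capital (99.9% quantile less expected loss): "
      f"{economic_capital:,.0f}")
```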

Since each of these engines is essentially doing its own calculation, why not take the synergy a step further and have a single processing layer where all the calculations and transformations can be executed in a simple, consistent manner, with only one definition of dimensionality, security, auditability and so on?
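
Below is a minimal, hypothetical sketch of what such a shared processing layer could look like: one golden-source dataset, one definition of the conformed dimensions, one audit trail, and the individual ‘engines’ reduced to plain calculations run against it. The class and function names are my own invention, purely to illustrate the idea, not an actual product design.

```python
import datetime
import pandas as pd

class ProcessingLayer:
    """Sketch of a shared processing layer: one golden-source dataset,
    one set of conformed dimensions, one audit trail."""

    def __init__(self, golden_source, dimensions):
        self.data = golden_source
        self.dimensions = dimensions       # single definition of dimensionality
        self.audit_log = []                # single point of auditability

    def run(self, name, calculation):
        """Execute any calculation against the shared data and record it."""
        result = calculation(self.data, self.dimensions)
        self.audit_log.append((datetime.datetime.now(), name))
        return result

# Two very different "engines" expressed as plain functions over the same data.
def cost_allocation(df, dims):
    return df.groupby(dims)["cost"].sum()

def regulatory_capital(df, dims):
    # Illustrative only: a flat 8% of risk-weighted exposure.
    return (df.assign(capital=df["exposure"] * df["risk_weight"] * 0.08)
              .groupby(dims)["capital"].sum())

golden = pd.DataFrame({
    "line_of_business": ["Corporate", "SME", "Retail"],
    "exposure": [1_250_000.0, 400_000.0, 600_000.0],
    "risk_weight": [1.0, 0.75, 0.35],
    "cost": [30_000.0, 20_000.0, 40_000.0],
})

layer = ProcessingLayer(golden, dimensions=["line_of_business"])
print(layer.run("cost_allocation", cost_allocation))
print(layer.run("regulatory_capital", regulatory_capital))
```

The design point is not the calculations themselves but the fact that they share one copy of the data, one dimensional definition and one audit record, instead of each maintaining its own.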

In our next and final blog entry, we will look at some examples of the benefits seen by customers who have embarked on this journey.
I hope to see you in Shanghai for the Oracle Financial Services Summit and to have a fruitful exchange of ideas.

John Foulley is Director for Financial Services Analytical Applications at Oracle. He can be reached at john.foulley AT oracle.com.

Comments:

Hi Jenna,

To answer your question: "How does technology truly reflect our business?"

I think the problem is not with technology - technology is not part of this equation. The problem is with the current project management model that we're following - it creates a lot of overhead and never represents our business 100%.

I know from my own business that if I wanted to apply rigid and strict project management principles to it, we would never finish any project on time. Project management needs to be adapted to the business, and the stakeholders must be more lenient with the project and the PM. We usually don't take projects where we feel we will go to the gallows if we're late - mistakes happen, inaccuracies happen, and we're not expected to know everything upfront.

On a closing note, a project schedule is an estimate - something that many stakeholders out there don't seem to understand.

Posted by PM Hut on July 17, 2013 at 02:09 PM EDT #

Thank you for commenting, PM Hut. Project management is very important to any project’s success. However, I feel that success does not depend on project management alone, but on many other moving parts, including the architecture, data governance (as 80% of the problems are data-related), training and people, and clear mandates and ownership. All of these pieces are required for the successful delivery of a transformation initiative.

Posted by guest on July 22, 2013 at 03:18 PM EDT #
