Risk Data Aggregation & Risk Reporting: Will You Be Compliant With BCBS 239 Principles In Time?

Jenna Danko
Product Marketing

Given the phenomenal growth of the financial services industry over the past decade, and the far greater demands it placed on data provisioning, sizeable investment in risk data aggregation frameworks was imperative to support a bank’s profitable business model. The rewards would have been immeasurable during the recent crisis, simply because a complete view of the risk being run against each exposure, counterparty, customer, product, instrument, entity etc. could have been established in minutes rather than days. A case in point was the delay in establishing a bank’s total group exposure and risk to Lehman Brothers at the height of the crisis. Not having this information at hand inevitably led to some sub-optimal decision making on how best to weather the impending storm and the fallout shortly afterwards.

As everyone is probably aware, the Basel Committee, as part of its detailed ongoing review of what went wrong during the global financial crisis and what should be done to negate the impact of a future one, released BCBS 239, “Principles for effective risk data aggregation and risk reporting,” back in January 2013. It was no surprise that the primary focus was on how banks, overall, had been less than agile in producing a complete, granular, transparent, aggregated view of the risks they faced.

Whether this was due to a less than robust risk data governance infrastructure, limited capabilities for risk data aggregation or opaque reporting practices, it is clear that considerable resources and efforts are required by banks to ensure they are fully compliant with all 11 principles of BCBS 239 by January 1st, 2016.

Although that deadline seems manageable, the Basel Committee sought to get a feel for the shape the industry was in. Its Working Group on SIB Supervision (WGSS) was tasked with putting together a questionnaire (87 questions in total), which was sent to approximately 30 global systemically important banks (SIBs) to gauge their overall level of preparedness against each of the BCBS 239 principles. The results of that survey have been digested and conclusions drawn in the paper “Progress in adopting the principles for effective risk data aggregation and risk reporting” (December 2013), and they make for interesting reading.

So where are the banks today in meeting that January 2016 deadline? From a numbers point of view, and considering the assessment was done on a ‘best effort basis’, the average rating across all 11 principles was 2.8 (4 = Fully Compliant, 1 = Non-Compliant), with individual principles ranging from 2.5 to 3.2. So is that good or bad? Well, it is a halfway house between ‘largely compliant’ and ‘materially non-compliant’. Rather than dwell on qualifying the intricacies between the two, what is revealing is the anomaly, or should I say contradiction, in some of the responses and ratings given.

Looking at Principles 2, 3 and 6, ‘Data Architecture/IT Governance’, ‘Accuracy/Integrity’ and ‘Adaptability’, all of these scored the lowest, 2.5 to 2.6. It is slightly concerning that half of the respondents indicated they were far from being compliant. However, when compared with Principles 8, 9 and 11, ‘Comprehensiveness’, ‘Clarity/Usefulness’ and ‘Report Distribution’, where banks gave themselves top marks, there appears to be a disconnect between the data provision and assurance of the former and the usage and distribution of the latter. How does one reconcile the two? According to the WGSS, it does not know exactly what to make of it, since the principles of BCBS 239 are interdependent, thus making overall compliance less likely.

Given such an observation, how should one interpret the findings of the survey? It is fair to say that banks’ business units and departments still operate in their data silos, and the questions were perhaps viewed in isolation, as per the status quo. On the plus side, the survey and the expected follow-up by supervisors will provide greater insight into the existing processes, procedures and controls pertaining to the risk data cycle. And that can only be beneficial: 20% of the banks surveyed accepted that they were ‘materially non-compliant’ against half of the principles, and almost one-third did not expect to be compliant at all with at least one principle.

One would hope that banks are fully aware of the problems they face (despite the survey results) and their order of magnitude. The challenge is whether to approach and resolve the problems piecemeal or under the umbrella of an integrated solution. Turning again to the survey, it sheds light on where the primary weaknesses lie, namely:

  • the continued reliance on manual workarounds
  • a lack of a consolidated view of risk
  • fragile risk systems
  • less than satisfactory risk data governance
  • opaque definitions of data ownership
  • weak controls around data quality assurance
  • insufficient documented policies and procedures around risk data aggregation
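To make the first and sixth weaknesses concrete, here is a minimal sketch of the kind of automated data-quality control that can replace a manual workaround: each risk record is validated against completeness and type rules before it enters the aggregation layer. All field names and rules below are hypothetical, chosen purely for illustration.

```python
# Illustrative data-quality gate for incoming risk records.
# Field names and rules are hypothetical, not from BCBS 239 itself.

REQUIRED_FIELDS = {"counterparty", "legal_entity", "product", "exposure"}

def validate_record(record):
    """Return a list of data-quality issues found in a single risk record."""
    issues = []
    # Completeness check: every required field must be present.
    for field in sorted(REQUIRED_FIELDS - record.keys()):
        issues.append(f"missing field: {field}")
    # Integrity check: exposure must be a non-negative number.
    exposure = record.get("exposure")
    if exposure is not None and (not isinstance(exposure, (int, float)) or exposure < 0):
        issues.append("exposure must be a non-negative number")
    return issues

good = {"counterparty": "CP-1", "legal_entity": "Bank AG",
        "product": "IRS", "exposure": 5_000_000}
bad = {"counterparty": "CP-2", "exposure": -10}

print(validate_record(good))  # []
print(validate_record(bad))
```

The point is not the specific rules but the workflow: a record that fails is flagged and routed for remediation, rather than silently patched in a spreadsheet, which is where manual workarounds tend to live.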

There is nothing new in the list above, in the sense that one would be hard pressed to disagree with the findings and recommendations of the latest BIS survey. These are the ingredients of a robust risk data aggregation framework, one that, viewed objectively, is operationally conducive to maximizing the bank’s overall risk-adjusted return while simultaneously safeguarding its capital and liquidity during a highly protracted stress scenario.

The BCBS 239 paper goes into extensive detail on each of the 11 guiding principles, but it should be noted that high-frequency risk data aggregation and reporting simply won’t flow out of a set of disparate processes and applications. Rather, there needs to be a clear understanding of what data is used where, for what purpose, at what frequency, and by whom, i.e. the entire risk data cycle should be transparent, detailed and well documented, to the point where, for example, a Risk Management team could run ad-hoc risk analysis at whatever level of granularity or consolidation during a Board Committee meeting.
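As a rough sketch of what “whatever level of granularity or consolidation” implies in practice, consider rolling a flat store of exposure records up along any combination of dimensions in a single pass. The record layout and names below are purely hypothetical:

```python
from collections import defaultdict

# Hypothetical flat store of exposure records; in a real bank this would be
# a governed, reconciled data layer rather than an in-memory list.
exposures = [
    {"counterparty": "Lehman", "legal_entity": "Bank AG",  "product": "Repo", "amount": 120.0},
    {"counterparty": "Lehman", "legal_entity": "Bank Ltd", "product": "CDS",  "amount": 80.0},
    {"counterparty": "Acme",   "legal_entity": "Bank AG",  "product": "Repo", "amount": 50.0},
]

def aggregate(records, *dimensions):
    """Roll exposure amounts up to any requested combination of dimensions."""
    totals = defaultdict(float)
    for rec in records:
        key = tuple(rec[d] for d in dimensions)
        totals[key] += rec["amount"]
    return dict(totals)

# Group-wide exposure to each counterparty, answerable in one pass:
print(aggregate(exposures, "counterparty"))
# {('Lehman',): 200.0, ('Acme',): 50.0}

# More granular: the same data, cut by counterparty and legal entity.
print(aggregate(exposures, "counterparty", "legal_entity"))
```

The Lehman question from the opening paragraph is, in essence, the first call: total group exposure to one counterparty. The lesson of BCBS 239 is that this answer should be available in minutes from a single, well-governed data layer, not assembled over days from silos.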

Can the industry take the necessary steps to ensure it is better prepared when the next crisis hits? Tentatively, yes, but the path to compliance will be progressive rather than immediate. The Basel Committee recognizes this and will be formally monitoring banks’ programs for meeting the compliance deadline, delving further to unearth the specific reasons for potential non-compliance and assessing a bank’s ability to rapidly aggregate risk data and generate appropriate risk output under a stressed environment.

Undoubtedly, this will further stretch a bank’s resources, which are currently submerged in a plethora of activities related to Basel III, but getting BCBS 239 right will deliver far broader and more tangible benefits than initially envisaged.

Ziauddin Ishaq is the Global Solutions Lead for Liquidity Risk at Oracle.  The views expressed on this blog are his own and do not necessarily reflect the views of Oracle. He can be reached at ziauddin.ishaq AT oracle.com.
