Friday Jan 20, 2012

Gain the Customer-Service Advantage with Agile Order-cycle Processes

Communications Service Providers (CSPs) face declining voice revenues, hyper-competition from a growing number of IP-network-based providers, and customer demand for integrated telephony, mobile, TV and internet services. While “triple play” or “quadruple play” offerings have become the norm, CSPs are experiencing increasing customer churn and revenue leakage arising from errors and delays in order management, across both order capture and order provisioning.

Let me outline the operational challenge that limits the pursuit of streamlined and integrated order orchestration. Legacy OSS/BSS environments have stove-piped order management systems and processes aligned to customer segments, service types or service bundles. This has the following implications:

  • increased complexity in on-demand and converged services provisioning
  • poor estimation of actual demand (in terms of orders being processed in the BSS systems) and thus poor network capacity and inventory planning
  • stop-gap workarounds for order orchestration unable to cope with increasing order volume and converged service offering complexity

So what are the business implications of a fragmented order-cycle for a CSP?

Firstly, this leads to revenue leakage stemming from order fallout due to the following factors:

  • delays in orchestrating orders such as service bundles across OSS/BSS silos
  • errors in provisioning converged services, e.g. "triple play", due to a fragmented view of orders

Secondly, CSPs experience customer churn, with dissatisfaction resulting from

  • inflexibility in changing in-flight orders without undoing them or incurring a penalty
  • expectation-reality gap with untimely and error-prone provisioning belying CSP promises

One possible solution lies in building a lightweight BPM layer between customer-interaction channels and CRM systems, as well as between BSS order management systems and the OSS silos (in some cases the OSS SDP layer). The benefits of a single order pipe and a centralized order orchestration layer are as follows:

  1. Improved tracking and status reporting of multi-part orders, e.g. converged services
  2. Timely and accurate provisioning with "in-flight" order customization and lower error-rates of multi-part orders
  3. Lower order fallout and abandonment resulting from reduction in order processing error-rates
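To make benefit 1 concrete, here is a minimal sketch of how a centralized orchestration layer might derive a single aggregate status for a multi-part order from per-silo sub-orders. The silo names and status values are hypothetical:

```python
from dataclasses import dataclass, field

# Hypothetical status values for illustration.
PENDING, PROVISIONED, FAILED = "pending", "provisioned", "failed"

@dataclass
class MultiPartOrder:
    """A converged-services order decomposed into per-silo sub-orders."""
    order_id: str
    sub_orders: dict = field(default_factory=dict)  # silo name -> status

    def update(self, silo, status):
        self.sub_orders[silo] = status

    def status(self):
        """Aggregate status across silos, giving one order-tracking view."""
        states = set(self.sub_orders.values())
        if FAILED in states:
            return FAILED
        if states == {PROVISIONED}:
            return PROVISIONED
        return PENDING

order = MultiPartOrder("ORD-1001", {"broadband": PENDING, "tv": PENDING, "voice": PENDING})
order.update("broadband", PROVISIONED)
order.update("tv", PROVISIONED)
print(order.status())  # still "pending" until the voice silo completes
```

In a real deployment the per-silo statuses would be updated by provisioning events published by the OSS/BSS systems rather than by direct calls.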

Below is a concept architecture for overcoming the organizational, technological and converged services complexity to achieve order-cycle agility.

While Service Delivery Platforms can and have unified the order flow within individual OSS, BSS and organizational silos, the frontier in customer-service advantage lies in unifying order flows at the edge to deliver a consistent experience to external stakeholders, especially customers. Moving to next-generation services creates an opportunity to differentiate through timely and accurate provisioning of converged services. Hence CSPs need to harmonize order-cycle processes, across traditional OSS/BSS ‘silos’ and multiple business units, to deliver on the promise of offering single-point-of-contact (SPOC) converged communication services for businesses and consumers alike.

Thursday Jan 12, 2012

Stuck in Cement: Turn to BPM for edge applications

[Note: Cross-posted from]

"Stuck in Cement: When Packaged Apps Create Barriers to Innovation", reads the title of a recent Forrester research paper. The author, Craig Le Clair, laments that packaged applications create inertia that makes it harder for organizations to embrace change from an execution perspective. As per the report, there is widespread frustration with the ability of packaged applications to let businesses break free from operational silos and embrace change. So does that mean packaged applications are the root of all organizational inertia and should be dispensed with? Certainly not!

Vertical and horizontal packaged applications were intended to provide scale to business operations in terms of capacity (i.e. volume), performance (i.e. Straight Through Processing (STP)) and compliance (with standards and/or regulation), while mitigating the time, effort and comprehensive skill sets, both technical and functional, required to develop custom applications. The same rationale and value of packaged applications holds true, even more so today, when time-to-value (lead-to-cash and trouble-to-resolve) and time-to-market (concept-to-market and time-to-compliance) pressures are greater than ever. While technology innovations such as cloud accelerate initial set-up time and effort, to a large extent cloud-based applications merely apportion the up-front and on-going costs of packaged applications over their lifetime. It would be a stretch to claim that cloud-based applications will solve the agility issues faced with on-premise applications. In fact, the integration challenge would remain largely the same, if not get more complicated, especially given security, privacy and data-synchronization concerns.

The problem of responding to change from a packaged-applications perspective has been incorrectly conflated with the eradication of business silos. Organizational and IT-system stove-pipes have been berated as the cause of dysfunction in responding to change. But are organizational silos really bad? If so, why do they develop in the first place? Organization and IT-system silos are a consequence and concomitant of natural evolution as an organization grows in the depth and breadth of its offerings, geographic reach, vertical specialization and markets (i.e. customer segments). To respond to business priorities, that is revenue growth, margins, profitability or market share, organizations will continue to become more complicated. Matrix organizational structures are giving way to mesh-like (i.e. network) organizational structures where the boundaries between internal lines of business and external stakeholders (including customers, partners and suppliers) are blurring. Shouldn't businesses then be making more investments in packaged applications that are purpose-fit for specific customer niches, geographies and industries? Clearly, the flexibility of changing existing packaged applications to meet new business needs is overrated in today's business environment.

The solution lies in providing a consistent experience across external interfaces while continuing to invest in internal applications (packaged or custom). After all, specialized, purpose-fit applications deliver competitive advantage. This is where edge applications built on BPM shine in overcoming the change inertia plaguing businesses. For instance, consider a local retailer contemplating entry into an overseas market. What if the retailer's existing CRM system does not fit the requirements of rapid entry into the target market? What choices does the retailer have?

One choice could be to customize the existing CRM system through custom development effort. Another could be to rip and replace the existing CRM system with a new on-premise or cloud-based CRM system. The latter approach may appear tempting in vendor pitches but is not for the faint-hearted in practice. To quote Carl von Clausewitz, "Everything in strategy is very simple, but that does not mean that everything is easy!" In reality, neither of the above approaches scales in the long term.

Yet another alternative, one that businesses typically resort to, is to deploy a new CRM system that is purpose-fit for the requirements of the overseas market. In this case, the business is faced with the time and effort of re-coding business rules and compliance policies in the new CRM system. Though this approach makes sense, it becomes harder to scale when future needs complicate integration effort and the consistent enforcement of business rules and compliance policies across the stove-piped CRM systems. However, businesses can circumvent these issues if they build an intermediate layer that interfaces with the customer channels and orchestrates orders across the different front-end CRM systems. In this manner, businesses get the performance and capability benefits of purpose-fit packaged applications while being able to apply business rules and compliance policies consistently across them, thereby providing a uniform customer experience across the external channels.
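The intermediate-layer idea can be sketched as follows: shared rules run once at the edge, before an order is dispatched to whichever front-end CRM system serves its market. The region names, rule thresholds and in-memory "backends" are all illustrative stand-ins:

```python
# A minimal sketch of an "edge" orchestration layer: shared business rules
# are enforced uniformly before an order is routed to a region-specific CRM
# backend. All names and thresholds below are hypothetical.

def compliance_check(order):
    """Rules applied consistently, regardless of which CRM serves the order."""
    if not order.get("customer_id"):
        return "rejected: missing customer id"
    if order.get("amount", 0) > 10_000:
        return "held: manual review required"
    return "accepted"

CRM_BACKENDS = {"domestic": [], "overseas": []}  # stand-ins for real CRM systems

def route_order(order):
    verdict = compliance_check(order)
    if verdict == "accepted":
        CRM_BACKENDS[order["region"]].append(order)  # dispatch to purpose-fit CRM
    return verdict

print(route_order({"customer_id": "C1", "amount": 250, "region": "overseas"}))
```

The design point is that each CRM system stays purpose-fit for its market while rules and policies live in exactly one place.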

The future is here today, and BPM addresses the long-standing strategy-execution gap by serving as a platform for building edge applications.

Thursday Dec 29, 2011

Getting Social with BPM in 2012

Over the last few months I have come across numerous perspectives in blogs (from leading research firms such as Forrester, Gartner etc.) and marketing collateral from vendors (including mega-vendors and pure-play BPM vendors) on the promise of Social BPM. Analysts and BPM vendors are undivided in heralding Social BPM as a key trend for BPM in 2012. Just about every BPM vendor is claiming to deliver Social capability in its BPM offering. After all, why wouldn't they? Social is the new buzzword, and by some measure every technology (software or hardware) that has any human interaction seems like a suitable candidate for being labeled Social.

Most vendors define Social BPM as the use of Web 2.0 technologies to drive BPM efforts in terms of discovering, developing and fine-tuning business processes with their BPM tools. However, this view is rather narrow given the amplification, or should I say buzz, the term Social BPM is conjured to generate. I believe there is more to Social BPM than just integrating Web 2.0 technologies such as portals, IM etc. to aid the usage of BPM tool sets with improved collaboration. Suspend disbelief in my claim for just a little bit longer and allow me to build my case on the business value of BPM.

Business priorities are centered largely on three things: revenue, cost and customer satisfaction. The fundamental value of BPM lies in empowering businesses to achieve efficiency, agility (in terms of responsiveness to change) and visibility (in terms of business-operations insight) through process standardization and streamlining. BPM enables revenue uplift from better lead-to-cash processes, cost rationalization through streamlined back-office and operational processes, and improved customer satisfaction through conformance to customer/partner SLAs. However, the promise of Social BPM as envisioned today, applied to process discovery, design and development by incorporating multiple voices through Web 2.0 technologies, does little to drive business value in terms of the above.

Beyond leveraging Social capabilities in BPM tools through Web 2.0 technologies, a more business-centric use-case for Social BPM could be leveraging external and internal social networks to augment existing business processes. For instance, HR processes for talent acquisition could be integrated with professional social networks such as LinkedIn to source candidates and perform background checks. Similarly, sales processes such as lead generation could be integrated with personal social networks such as Facebook to drive advertising leads into the sales funnel. Yet another example could be linking customer feedback and reviews from recommendation networks such as Yelp into product-planning processes to provide a constant stream of customer intelligence, in terms of needs and satisfaction levels.
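As a rough sketch of the lead-generation example, a process might filter a social feed against campaign keywords and turn matches into work items. The feed structure and keyword rule here are hypothetical; a real integration would use the network's actual API and the BPM suite's task service:

```python
# Sketch: feeding matching items from an external social feed into a sales
# funnel. The post format and the in-memory queue are illustrative stand-ins.

def to_lead(post):
    """Map a matching social post to a sales lead work item."""
    return {"source": post["network"], "contact": post["author"],
            "interest": post["text"]}

def harvest_leads(feed, keywords):
    queue = []
    for post in feed:
        if any(k in post["text"].lower() for k in keywords):
            queue.append(to_lead(post))  # would become a BPM task in practice
    return queue

feed = [
    {"network": "facebook", "author": "alice", "text": "Looking for a new broadband plan"},
    {"network": "facebook", "author": "bob", "text": "Great weather today"},
]
leads = harvest_leads(feed, ["broadband", "mobile plan"])
print(len(leads))  # 1
```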

Clearly there is more to Social BPM, and I believe we are still scratching the surface.

Look forward to hearing your thoughts on how you are envisioning Social BPM in your business.

Monday Dec 19, 2011

Improving Visibility of Payments Value-Chain part 2 / 2

In my earlier post I discussed the business imperative for improving visibility in payments processes and the factors (demand, organizational and technology) creating impedance in realizing the vision of Straight-Through Processing (STP). In this post, I will outline some approaches to improving STP in payments processes, and thereby the visibility into them.

Visibility in payments processing is fundamentally about knowing the status of payments as they flow from front-office initiation channels, across mid-office fraud, risk and compliance systems, through the payments applications (i.e. AR/AP), and finally through external interfaces with clearing providers and financial-network providers (e.g. SWIFT, ACH etc.). From a technology standpoint, payments straight-through processing requires streamlining the integration across all these inbound channels, outbound interfaces and ancillary systems.

One approach is to consolidate all of the existing payments applications into a payments hub. Payments hubs are packaged applications that consolidate interfaces, both inbound and external, and offer full-fledged payments-processing capabilities including AR, AP, treasury, exception management (i.e. payments repair) and reporting. In principle, centralizing payments processing in a payments hub provides real-time cash-flow visibility by eliminating manual or semi-automated reconciliation across multiple payment-application stove-pipes, and enables single-source-of-truth audit reporting and streamlined governance. It is important to note, though, that such application consolidation requires serious investment in time and effort, is potentially a multi-year undertaking, and may be cost-prohibitive depending on the comprehensiveness of the payments hub in terms of payment-processing capability.

An alternative to application consolidation is interface consolidation with a payments gateway. Payments gateways streamline the number of integration points for payments applications by serving as a common pipe through which all inbound and outbound payment traffic flows. Unlike payments hubs, payments gateways do not replace the core payments processing undertaken by the payments applications. They offer connectivity to internal applications and external interfaces through pre-built adapters. This approach improves visibility by offering an aggregated view of the payments traffic, in terms of volume and type of payments. Sophisticated gateways offer the ability to introspect the payments traffic, which could comprise different message formats, thereby offering a real-time snapshot of payments inflow, payments outflow and exceptions.

A hybrid approach is to use BPM in conjunction with the above technologies. Here, a lightweight abstract process is created to represent the end-to-end journey from the time a payment enters the organizational boundary to the time it leaves it. This enables a chronological view of the payments workloads, thereby giving insight for improving business operations by eliminating bottlenecks for a specific line of business, initiation channel or payments application. Such abstract processes could be modeled as event-driven processes that are triggered/invoked by different systems as payments flow through them. This BPM layer could also serve as the single source for tracking payments. Below is a graphical depiction of how a BPM layer can be used to gain end-to-end visibility of payments.
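A minimal sketch of such an abstract, event-driven tracking process might look like this. The milestone names are illustrative, and in practice each system would publish its event to the BPM layer rather than call it directly:

```python
from collections import defaultdict

# Sketch of the hybrid approach: an abstract process records milestone
# events as a payment flows through systems it does not replace.
MILESTONES = ["initiated", "screened", "cleared", "settled"]  # illustrative journey

journey = defaultdict(list)  # payment_id -> ordered list of milestones seen

def on_event(payment_id, milestone):
    journey[payment_id].append(milestone)

def current_stage(payment_id):
    """Single source of truth for where a payment is in its journey."""
    seen = journey[payment_id]
    return seen[-1] if seen else "unknown"

def bottleneck_report():
    """Count in-flight payments per stage to surface processing bottlenecks."""
    counts = defaultdict(int)
    for events in journey.values():
        counts[events[-1]] += 1
    return dict(counts)

on_event("PAY-1", "initiated"); on_event("PAY-1", "screened")
on_event("PAY-2", "initiated")
print(current_stage("PAY-1"), bottleneck_report())
```

Because every system reports into the same journey record, status reporting and bottleneck analysis need no cross-system reconciliation.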

Wednesday Dec 14, 2011

Improving Visibility of Payments Value-Chain part 1 / 2

Payments processing is a central activity for financial institutions, especially retail banks, and for intermediaries that provide clearing and settlement services. Visibility of payments processing is essentially the ability to track payments and handle payment exceptions as payments flow from initiation to settlement. The business imperative for financial institutions, especially retail banks, to improve visibility of their payments processes stems largely from the following:

  • Lowering the time and cost of fraud detection, risk management and compliance by applying these efforts in a centralized manner across lines of business, payment types and payment channels
  • Gaining real-time visibility of their cash flows to optimize working capital, by improving efficiency in borrowing and lending and by negotiating appropriate SLAs with intermediaries such as clearing houses and payment-channel providers such as credit-card providers

While automation has improved the capacity of existing payments systems to cope with the ever-increasing volume of payments traffic, there remain several hurdles to improving visibility of payments processes. Payments processing is a complex business for largely the following reasons:

  1. Large and growing number of channels for payments initiation, including non-electronic channels such as in-person and post, and electronic channels such as ATM, kiosks, Point-of-Sale (PoS), online, mobile etc.
  2. Multiple payment types including cash, check/draft, card (credit / debit), Electronic Funds Transfer (EFT), Wire transfer etc.
  3. Payments initiated as one type could be cleared and settled as another. For instance, a customer may pay a merchant using a check, and the merchant's bank may scan the check and send it as an electronic payment through a clearing counter-party.
  4. Varying governance requirements across payment type, clearing intermediaries, governments and industry standards
  5. Loss of control due to separation of the payment-processing function across different entities, e.g. initiation of card payments is handled by a retail function in a bank whereas clearing of card payments could be handled by a card provider such as VISA

In addition to the above, a retail bank faces the following operational hurdles:

  1. Outdated payment systems that rely on batch processing, thereby making it incredibly difficult to report the status of individual payments
  2. Multiple payment systems, each with their own fraud, compliance and risk systems, that are not integrated thereby increasing time, cost and complexity of fraud detection, compliance and risk management
  3. Multiple external interfaces to clearing intermediaries e.g. SWIFT, ACH, FedWire, Card providers, each with their unique security and message exchange requirements
  4. Structural silos, internal to a bank, aligned to payment types and systems thereby hampering enterprise-wide view of payment activities

In the next post, I will explore some approaches to achieving STP and improving visibility in payments processes that span front-office initiation channels, ancillary back-office systems and external interfaces.

Saturday Dec 10, 2011

Harnessing Business Events for Predictive Decision Making - part 3 / 3

The previous posts on this topic discussed the need for brain-like decision systems, the key attributes of such systems and the enabling technology components. This post drills down into some of the common use-cases where the opportunity cost of split-second "sense-and-respond" is overwhelming and intelligent BPM systems, or iBPMS (a term coined by Gartner in a recent research report), are gathering momentum.

  • Financial Services - Payments processing is the bloodstream of financial-services institutions. Banks and network providers (e.g. card issuers, clearing houses etc.) are experiencing phenomenal growth in payment volumes, driven by the emergence of newer payment channels (e.g. NFC contact-less mobile), a greater variety of payment types and an aggressive drive to reach the unbanked population. Hence there is ever greater regulatory and commercial pressure to prevent fraudulent activities such as identity theft, terrorist financing and money laundering. It is no longer sufficient to rely on existing risk and governance systems to do retrospective analysis to detect and identify the source of breaches. After all, millions of dollars can be siphoned off in a split second, and the perpetrators may be impossible to trace if the crime was committed with a mobile phone. What's needed is the ability to look at all of the payment transactions as they flow across payments processes, identify a rogue transaction (based on business rules or as an exception) and trigger an alert process to intervene in a likely act of fraud. Clearly there will be a few false triggers, but over time, just like our brains, such systems will be able to predict with greater confidence.

  • Healthcare - Patient Monitoring Systems (PMS), especially life-support systems, are meant to keep patients alive in medical situations where one or more critical organs have failed or are likely to fail. Clearly, speed of emergency response is highly mission-critical, if not life-critical, in these systems. The premise of such systems is to monitor vital life statistics continuously and trigger alerts when critical thresholds are reached. However, in some instances it is too late for any remedial action even when doctors respond without delay to an alert. In such instances precious lives could be saved if the PMS were able to predict a likely organ failure or threshold breach just a little bit beforehand, and not after the fact. Intelligent dashboards that integrate such real-time data feeds from multiple PMSs would allow centralized monitoring and pro-active response, thereby increasing critical-care success rates.
  • Public Sector - Governments are expressing increased concern around cyber-security to safeguard national interests. As more and more government workloads and data shift to the internet and inter-linked systems, the vulnerability to and cost of breaches increase manifold. "Terabytes of data are flying out the door, and billions of dollars are lost in remediation costs and reputational harm, government and private security experts said in interviews" (Source: Reuters, June 16, 2011). The cyber-security problem has been tackled largely in a reactive manner to date, with security vendors rushing to offer fixes after breaches have occurred or vulnerabilities are disclosed by software vendors. For governments such an approach is simply unacceptable: by the time a security fix is offered, the damage could already be done, for instance if US nuclear documents fell into the wrong hands. Next-generation cyber-security systems monitor both external (i.e. web) and internal (i.e. business-process) activity in real time and correlate seemingly isolated data points to detect suspicious activity.
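The sense-and-respond pattern in the financial-services example can be sketched as a simple in-stream screen. The rules and thresholds below are toy illustrations of what a real rules or CEP engine would evaluate:

```python
# A toy sense-and-respond screen: each payment event is checked against
# simple rules as it streams past, and an alert is raised inline rather
# than in retrospective batch analysis. Thresholds are illustrative only.

ALERTS = []

def screen(txn, recent_countries):
    """Flag a transaction the moment it arrives, before settlement."""
    if txn["amount"] > 50_000:
        ALERTS.append((txn["id"], "amount threshold"))
    elif txn["country"] not in recent_countries:
        ALERTS.append((txn["id"], "unusual geography"))

stream = [
    {"id": "T1", "amount": 120, "country": "US"},
    {"id": "T2", "amount": 75_000, "country": "US"},
    {"id": "T3", "amount": 300, "country": "RU"},
]
for txn in stream:
    screen(txn, recent_countries={"US", "CA"})
print(ALERTS)  # T2 and T3 are flagged in-flight
```

In an iBPMS, each alert would trigger an intervention process; over time the rule set would be tuned as false triggers are reviewed.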

In addition to the above industries, we are seeing applications of such iBPM systems in telecommunications, retail and transportation. My colleague, Dan Tortorici, has written an interesting whitepaper on how Oracle BPM and Oracle Complex Event Processing (CEP) technologies are collectively enabling intelligent process automation, continuous process improvement and business transformation.

Friday Dec 02, 2011

Harnessing Business Events for Predictive Decision Making - part 2 / 3

In my earlier post I discussed the workings of the human brain to illustrate the capabilities desired of next-generation decision systems in order to harness business events for real-time predictive decision making.

Below is a graphical depiction of the attributes of a "brain-like" decision system, its benefits and the underlying technology enablers.

Achieving near real-time predictive intelligence places special demands on the technology components, hardware in particular, in terms of scalability, fault-tolerance and capacity (both compute and storage). After all, performance is of the essence in building decision systems that operate at brain-like speeds. A remarkable capability of the human brain is that data and instructions are physically part of the same component, the neurons. This is what allows the brain to store exabytes (1 exabyte = 1 million terabytes) of data and process an equally vast number of real-time events in a flash. In fact, more than 99% of the data storage, retrieval and processing in the brain happens without conscious thought. In contrast, memory and processing are handled by separate components in computers. This is why it takes supercomputers that consume megawatts of energy (for comparison's sake, the human brain consumes around 12 watts at peak performance) to simulate human brain activity. Now, building a supercomputer for predictive decision making is certainly an over-ambitious, if not audacious, endeavor for most businesses.

Given enough time, skills and financial resources it is certainly possible to build such hardware systems. However, this is neither a forte of most businesses nor a capability they should strive for. After all, businesses have more pressing concerns around realizing returns on IT investments in ever-shrinking time-scales than experimenting with their technology infrastructures. A possible solution lies in integrated systems that are engineered from the ground up and not merely assembled from multi-vendor components. The rationale here is that optimizing individual parts is unlikely to optimize the whole. Hence simply assembling a hardware system by self-integrating servers, storage, inter-connects and operating systems from multiple vendors is likely to create bottlenecks at the slowest links in the chain, thereby negating any benefits arising from the performance claims of individual component vendors.

Oracle has a unique capability in delivering engineered systems, comprising middleware components and a hardware substrate, that let you get up and running with such decision systems while rationalizing costs and mitigating execution risk. If you are contemplating such event-driven decision systems, feel free to drop me a note.

Sunday Nov 27, 2011

Harnessing Business Events for Predictive Decision Making - part 1 / 3

Businesses have long relied on data mining to elicit patterns and forecast future demand and supply trends. Improvements in computing hardware, specifically storage and compute capacity, have significantly enhanced the ability to store and analyze mountains of data in ever shrinking time-frames. Nevertheless, the reality is that data growth is outpacing storage capacity by a factor of two and computing power is still very much bounded by Moore's Law, doubling only every 18 months.

Faced with this data explosion, businesses are exploring means to develop human brain-like capabilities in their decision systems (including BI and analytics) to make sense of the data storm, in other words business events, in real time and respond pro-actively rather than re-actively. The idea is that having a little bit of the right information just a little beforehand is worth more than having all of the right information after the fact (the premise of the book "The Two Second Advantage"). To appreciate this thought better, let's first understand the workings of the human brain.

Neuroscience research has revealed that the human brain is predictive in nature and that talent is nothing more than exceptional predictive ability. The cerebral cortex, the part of the human brain responsible for cognition, thought, language etc., comprises six layers. The lowest layer in the hierarchy is responsible for sensory perception, i.e. discrete, detail-oriented tasks, whereas each layer above it is increasingly focused on assembling higher-order conceptual models. Information flows both up and down this layered memory hierarchy, which allows the conceptual mental models to be refined over time through experience and repetition. Secondly, and more importantly, the top layers are able to prime the lower layers to anticipate certain events based on the existing mental models, thereby giving the brain a predictive ability. In a way the human brain develops a "memory of the future", a sort of anticipatory thinking that lets it predict based on the occurrence of events in real time. A higher order of predictive ability stems from being able to recognize the lack of certain events. For instance, it is one thing to recognize the beats in a music track and another to detect beats that were missed, which involves a higher-order predictive ability.
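The missed-beat example hints at what "detecting the lack of events" means computationally. Here is a small sketch, under the simplifying assumption of a known fixed beat interval, that flags the times at which an expected beat failed to occur (analogous to a CEP engine alerting when an expected business event does not arrive):

```python
# Detect events that should have occurred but did not: given beat
# timestamps and an expected interval, flag the gaps where beats
# went missing. The fixed-interval assumption is a simplification.

def missed_beats(timestamps, interval, tolerance=0.25):
    """Return the times at which an expected beat did not occur."""
    missing = []
    for a, b in zip(timestamps, timestamps[1:]):
        gap = b - a
        n_expected = round(gap / interval)
        if n_expected > 1 and abs(gap - n_expected * interval) <= tolerance * interval:
            # one or more beats were skipped inside this gap
            missing.extend(a + interval * k for k in range(1, n_expected))
    return missing

beats = [0.0, 0.5, 1.0, 2.0, 2.5]  # the beat at t=1.5 is absent
print(missed_beats(beats, interval=0.5))  # [1.5]
```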

Existing decision systems analyze historical data to identify patterns and use statistical forecasting techniques to drive planning. They are similar to the human brain in that they employ business rules, very much like mental models, to chunk and classify information. However, unlike the human brain, existing decision systems are unable to evolve these rules automatically (AI is still best suited to highly specific tasks) or to predict the future based on real-time business events. Make no mistake, existing decision systems remain vital to driving long-term and broader business planning. For instance, a telco will still rely on BI and analytics software to plan promotions and optimize inventory, but will tap into event-enabled predictive insight to identify specifically which customers are likely to churn and engage with them pro-actively.

In the next post, I will depict the technology components that enable businesses to harness real-time events and drive predictive decision making.

Friday Nov 18, 2011

Gauging Maturity of your BPM Strategy - part 1 / 2

In this post I will discuss the essence of maturity assessment and the business imperative for undertaking one in the context of BPM. Social psychology posits that an individual progresses from beginner to expert in a given activity or task along four stages of self-awareness:

  1. Unconscious Incompetence where the individual does not understand or know how to do something and does not necessarily recognize the deficit and may even deny the usefulness of the skill.
  2. Conscious Incompetence where the individual recognizes the deficit, as well as the value of a new skill in addressing the deficit.
  3. Conscious Competence where the individual understands or knows how to do something but demonstrating the skill requires explicit concentration.
  4. Unconscious Competence where the individual has had so much practice with a skill that it has become "second nature" and serves as a basis of developing other complementary skills.

We can extend the above thinking to an organization as a whole by measuring an organization's level of competence in a specific area or capability as an aggregate of the competence levels of the individuals it comprises. After all, organizations too, like individuals, evolve through experience, developing "memory" and capabilities that are shaped through a constant cycle of learning, un-learning and re-learning. Hence the key to organizational success lies in developing these capabilities to enable execution of its strategy in line with the external environment, i.e. demand, competition, economy etc. However, developing a capability merits establishing a baseline in order to:
  • Assess the magnitude of improvement from past investments
  • Identify gaps and shortcomings
  • Prioritize future investments in the right areas

A maturity assessment is essentially an organizational self-awareness check aimed at depicting the "as-is" snapshot of an existing capability, in order to guide future investments that develop the capability in line with business goals.
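To make the notion of a baseline concrete, here is a deliberately simplistic numeric sketch: individual competence is scored on the four stages above (1 = unconscious incompetence through 4 = unconscious competence) and aggregated per capability. Real maturity models of course weigh far more dimensions than a plain average; the capability names and scores here are invented:

```python
# Toy baseline: average individual competence scores per capability;
# the lowest-scoring capabilities are the gaps to prioritize.

def baseline(scores_by_capability):
    """Aggregate individual scores and rank capabilities weakest-first."""
    agg = {cap: sum(s) / len(s) for cap, s in scores_by_capability.items()}
    gaps = sorted(agg, key=agg.get)  # weakest capability first
    return agg, gaps

scores = {
    "process modeling": [3, 4, 3, 2],
    "process monitoring": [1, 2, 2, 1],
    "governance": [2, 3, 2, 2],
}
agg, gaps = baseline(scores)
print(gaps[0])  # the weakest capability, here "process monitoring"
```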

Organizational capabilities stem from an organization's architecture, routines, culture and intellectual resources, which are implicitly and explicitly embedded in its business processes. The fact that business processes underpin the realization of organizational capabilities is precisely what has prompted business transformation and process management efforts. Thus, the BPM capability of an organization needs to be measured on an on-going basis to ensure delivery of its planned benefits.

In my next post I will describe Oracle’s BPM Maturity assessment methodology.

Thursday Nov 17, 2011

Managing Operational Risk of Financial Services Processes – part 2/2

In my earlier blog post, I described the factors that lead to compliance complexity of financial-services processes. In this post, I will outline the business implications of increasing process-compliance complexity and the specific role of BPM in addressing the operational-risk-reduction objectives of regulatory compliance.

First, let’s look at the business implications of increasing complexity of process compliance for financial institutions:

· Increased time and cost of compliance, due to duplicated effort in conforming to regulatory requirements as processes change in response to evolving regulatory mandates, shifting business priorities or internal/external audit requirements

· Delays in audit reporting, due to quality issues in reconciling non-standard process KPIs and integrity concerns arising from reliance on multiple data sources for a given process

Next, let’s consider some approaches to managing the operational risk of business processes. Financial institutions seeking to reduce the operational risk of their processes generally have two choices:

· Rip and replace existing applications with new off-the-shelf applications.

· Extend the capabilities of existing applications by modeling their data and process interactions with other applications or user channels outside the application boundary, using BPM.

The benefit of the first approach is that compliance with new regulatory requirements is embedded within the boundaries of these applications. However, the pre-built compliance of any packaged or custom-built application should not be mistaken for a one-shot fix for future compliance needs: business needs and regulatory requirements inevitably outgrow the end-to-end capabilities of even the most comprehensive packaged or custom-built business application.

Thus, processes that originally resided within the application will eventually spill outside the application boundary. It is precisely at such hand-offs between applications, or between overlapping processes, that vulnerabilities to unknown and accidental faults arise, potentially resulting in errors and leading to partial or total failure.

The gist of the above argument is that processes which reside outside application boundaries, in other words span multiple applications, constitute a latent operational risk across the end-to-end value chain. For instance, distortion of data flowing from an account-opening application to a credit-rating system, if left unchecked, renders compliance with “KYC” policies void even when the “KYC” checklist was enforced at the time of data capture by the account-opening application.
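One simple way to guard such a hand-off, sketched below, is for the downstream system to verify a fingerprint of the data captured upstream before acting on it. The field names and the hash-based scheme here are my own illustrative assumptions, not part of any specific KYC product or of Oracle BPM.

```python
# Hypothetical sketch of a hand-off integrity check between two systems
# (e.g., account opening -> credit rating). A mismatch signals that the
# data was distorted in transit and should trigger fallout handling
# rather than silently voiding KYC compliance downstream.
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Canonical hash of the KYC fields captured at account opening."""
    canonical = json.dumps(record, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

def verify_handoff(sent: dict, received: dict) -> bool:
    """Downstream system confirms the payload matches what was captured."""
    return fingerprint(sent) == fingerprint(received)

captured = {"name": "A. Customer", "dob": "1980-01-01", "id_doc": "P1234567"}
intact = verify_handoff(captured, dict(captured))                      # True
distorted = verify_handoff(captured, {**captured, "dob": "1980-01-02"})  # False
```

In a BPM layer this kind of check would sit at the process boundary, so the integrity of the hand-off is monitored end-to-end rather than trusted implicitly.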

Oracle Business Process Management is enabling financial institutions to lower the operational risk of such process “gaps” for Financial Services processes including “Customer On-boarding”, “Quote-to-Contract”, “Deposit/Loan Origination”, “Trade Exceptions” and “Interest Claim Tracking”. If you are faced with a similar challenge and need guidance, feel free to drop me a note.

Tuesday Nov 15, 2011

Managing Operational Risk of Financial Services Processes – part 1/2

Financial institutions view compliance as a regulatory burden that incurs a high initial capital outlay and recurring costs. By its very nature, regulation takes a prescriptive, common-for-all approach to managing financial and non-financial risk. Needless to say, mere compliance with regulation will no longer lead to sustainable differentiation. Genuine competitive advantage will stem from being able to meet the innovation demands of the present economic environment while complying with regulatory mandates in a faster and more cost-efficient manner.

Let’s first take a look at the key factors that are limiting the pursuit of the above goal.

Regulatory requirements are growing, driven in part by revisions to existing mandates in line with the cross-border, pan-geographic nature of today’s financial value chains, and even more so by the frequent systemic failures that have destabilized financial markets and the global economy over the last decade. In addition to the increase in regulation, financial institutions face pressures of regulatory overlap and regulatory conflict.

Regulatory overlap arises primarily from two things: the blurring of boundaries between lines of business within complex organizational structures, and the varying requirements of jurisdictional directives across geographic boundaries. For example, a securities firm with operations in the US and the EU would be subject to different “Know-Your-Customer” (KYC) requirements under the PATRIOT Act in the US and MiFID in the EU.

A related consequence of regulatory change is regulatory conflict, which likewise arises primarily from two things: the diametrically opposed priorities of different lines of business, and the tension that regulatory requirements create between shareholders’ interest in tighter due diligence and customers’ concerns about privacy. For instance, Customer Due Diligence (CDD) under KYC requires eliciting detailed information from customers to prevent illegal activities such as money laundering, terrorist financing or identity theft. While new customers are more likely to comply with such stringent background checks at the time of account opening, existing customers baulk at such practices as a breach of trust and privacy.

As mentioned earlier, regulatory compliance addresses both financial and non-financial risks. Operational risk is a non-financial risk that stems from business execution and spans people, processes, systems and information. Operational risk arising from financial processes, in particular, outweighs other sources of such risk. Let’s look at the factors underpinning the operational risk of financial processes.

The rapid pace of innovation and the geographic expansion of financial institutions have resulted in the proliferation and ad-hoc evolution of back-office, mid-office and front-office processes. This has two serious implications that increase the operational risk of financial processes:

· Inconsistency of processes across lines of business, customer channels and product/service offerings, which makes it harder for the risk function to enforce a standardized risk methodology and, in turn, makes breaches harder to detect.

· Proliferation of processes, coupled with increasingly frequent change cycles, which has resulted in accidental breaches and increased vulnerability to regulatory inadequacies.

In summary, regulatory growth (including overlap and conflict), coupled with process proliferation and inconsistency, is driving process compliance complexity.

In my next post I will address the implications of this process complexity on financial institutions and outline the role of BPM in lowering specific aspects of operational risk of financial processes.



Sanjeev Sharma
Twitter: @sanjeevio

