Monday Oct 15, 2012

Oracle Exalogic Customer Momentum @ OOW'12

[Adapted from here.]

At Oracle Open World 2012, I sat down with some of the Oracle Exalogic early adopters to discuss the business benefits they were realizing by embracing the engineered-systems approach to data-center modernization and application consolidation. Below is an overview of the four companies that won the Oracle Fusion Middleware Innovation Award for Oracle Exalogic this year.

Company: Netshoes
About: Leading online retailer of sporting goods in Latin America.

  • Rapid business growth resulted in frequent outages and poor response times for the online storefront
  • A conventional, ad-hoc approach to horizontal scaling resulted in high CAPEX and OPEX
  • Poor performance and unavailability of the online storefront resulted in revenue loss from purchase abandonment

Consolidated ATG Commerce and Oracle WebLogic running on Oracle Exalogic.
Business Impact:
Reduced abandonment rates resulting in a double-digit increase in online conversion rates, translating directly into revenue uplift

Company: Claro
About: Leading communications services provider in Latin America.

  • Support business growth over the next 3-5 years while maximizing re-use of existing middleware and application investments with minimal effort and risk

Consolidated Oracle Fusion Middleware components (Oracle WebLogic, Oracle SOA Suite, Oracle Tuxedo) and Java applications onto Oracle Exalogic and Oracle Exadata.
Business Impact:
Improved partner SLAs 7x while improving throughput 5x and response time 35x for Java applications

Company: UL
About: The world's leading safety testing and certification organization.

  • Transition from a non-profit to a for-profit enterprise and grow from $1B to $5B in annual revenues over the next 5 years
  • Undertake a massive business transformation by aligning change strategy with execution

Consolidated Oracle Applications (E-Business Suite, Siebel, BI, Hyperion) and Oracle Fusion Middleware (AIA, SOA Suite) on Oracle Exalogic and Oracle Exadata
Business Impact:
Reduced financial and operating risk in re-architecting IT services to support new business capabilities serving 87,000 manufacturers

Company: Ingersoll Rand
About: Leading manufacturer of industrial, climate, residential and security solutions.

  • Business continuity risks due to complexity in enforcing consistent operational and financial controls
  • Reactive business decisions reduced the ability to differentiate and compete

Consolidated Oracle E-Business Suite on Oracle Exalogic and Oracle Exadata
Business Impact:
Service differentiation with faster order provisioning and a shorter lead-to-cash cycle translating into higher customer satisfaction and quicker cash-conversion

Check out the winners of the Oracle Fusion Middleware Innovation awards in other categories here.

Wednesday Sep 26, 2012

Supercharging the Performance of Your Front-Office Applications @ OOW'12

[Re-posted from here.]

You can increase customer satisfaction, brand equity, and ultimately top-line revenue by deploying Oracle ATG Web Commerce, Oracle WebCenter Sites, Oracle Endeca applications, Oracle’s Siebel applications, and other front-office applications on Oracle Exalogic, Oracle’s combination of hardware and software for applications and middleware.

Join me (Sanjeev Sharma) and my colleague, Kelly Goetsch, at the following conference session at Oracle Open World to find out how Customer Experience can be transformed with Oracle Exalogic:

Session:  CON9421 - Supercharging the Performance of Your Front-Office Applications with Oracle Exalogic
Date: Wednesday, 3 Oct, 2012
Time: 10:15 am - 11:15 am (PST)
Venue: Moscone South (309)

Saturday Jul 14, 2012

Oracle Exalogic in Higher-Education: Virtual Learning Environments

In the quest to become leading education institutions of choice and draw world-class academic and student talent, forward-thinking universities continue to embrace and evolve ICT to further their agendas in learning, teaching and research. However, the global and domestic financial and operating environments impacting universities have grown increasingly challenging, applying pressure in two ways.

  1. From a revenue perspective, the potential of a second global financial crisis looms large, with the potential to trigger yet another global recession. While the education sector has been less affected by economic cycles in the past, the unprecedented level of economic turmoil that exists today makes it difficult to anticipate the revenue ramifications.
  2. From a cost perspective, further globalisation has greatly increased the competitive nature of the higher-education sector, especially given the boom in demand for education services and the proliferation of education providers in emerging markets.
So how are leading universities preparing themselves to respond to this challenge and what sort of transformation are they hoping to realize?

Integral to achieving their goal of attracting top student talent is being able to provide outstanding student experience. While enhancing the campus based experience is still very important, universities are increasingly looking to augment physical campus based learning with virtual, online delivery of educational programs and services.

To offer students an engaging, stimulating and fun environment for learning, universities have invested in a range of Information and Communication Technologies (ICT), at the core of which are the student portal, the learning management system and student self-services such as the IT help-desk. Periodically, universities need to undertake a major refresh of these applications to deliver the next-generation, collaborative and mobile learning experience. In addition, back-office university information systems must support seamless and cost-effective access to information for decision-making and transactional services. Universities increasingly want to deliver shared services in collaboration with other institutions. As such, universities are refreshing their back-office finance and resource-planning applications to ensure they can drive efficiency in their critical budget planning and operations processes.

Now, there are many other applications universities rely on to manage their infrastructure, administrative services, alumni services and so on. A key challenge facing universities in their large-scale application-modernization efforts is that upgrading to modern applications places further demands on data-centre infrastructure, including storage, compute nodes and networking gear. Not to mention, refreshing the data-centre infrastructure entails integration risk due to multi-vendor procurement, testing, tuning and optimization.

An approach to IT that worked well in the past centered on plugging gaps in desired capability, driven by ad-hoc requests. However, the sustainability of that approach has become a real impediment to optimizing CAPEX and OPEX budgetary controls, given the woes of infrastructure fragmentation. As such, universities are now standardizing their infrastructure and consolidating core applications on an open-standards-based environment. This is what is shifting their thinking towards engineered systems from Oracle, including Oracle Exadata for the data tier and Oracle Exalogic for the middle tier.

Broadly speaking, there are three main reasons why Oracle Exalogic has become the logical choice for running business applications at higher-education institutions:

Firstly, Oracle Exalogic has enabled universities to accelerate the go-live time for application modernization by providing a pre-integrated, pre-tested, pre-optimized and pre-tuned infrastructure that enables end-to-end apps-to-disk management.

Secondly, Oracle Exalogic has allowed universities to consolidate application workloads, thus reducing the number of physical servers and further improving data-centre density through virtualization. This has brought cost savings in terms of software licenses, maintenance and energy consumption.

Thirdly, Oracle Exalogic and Oracle Exadata have allowed universities to shift towards a private cloud platform model for metering and charging computing resources as multi-tenant services, effectively transforming their IT from a cost-center to a profit-center.

Find out more at the upcoming Exalogic Elastic Cloud 2.0 launch.

Thursday Jul 12, 2012

Oracle Exalogic in Public Sector: Law Enforcement

The mandate for government law enforcement agencies is to safeguard the public against instruments of terror and enforce legislation. Ensuring authorized entry or exit of people, keeping unauthorized and dangerous immigrants in check and monitoring large-scale goods movement constitutes one of the most critical and demanding operating environments. Processing thousands of people and vehicles places extreme demands on the underlying IT infrastructure. To support their mission-critical operations, governments rely on massive data-center facilities, typically dispersed across multiple locations. Such IT operations are managed by several thousand personnel.

When it is a matter of national security, failure is not an option under any circumstances. Uninterrupted 24x7x365 operation is a necessity, since an hour of downtime can back up thousands of people at the borders during peak traffic times. Needless to say, the IT solutions law enforcement agencies require must offer utmost reliability, scalability, and security. Given the mission-critical nature of their operations, homeland security agencies are looking to evolve their IT infrastructure in a sustainable manner so as to cater to both existing and future workloads.

The overarching priority for law enforcement agencies is to reduce data center costs through application and infrastructure consolidation.

The challenge with the traditional approach is that:

  • Parts are not guaranteed to work together.
  • There are too many possible variations and a lack of standardization.
  • The final product is not optimized for best performance or maintenance.
  • There is no overall warranty.

The Engineered Systems approach to data centre operations is a paradigm shift from the traditional approach of assembling disparate layers of storage, networking, compute nodes, operating systems and so on.

Broadly speaking, there are three main reasons why Oracle Exalogic has become the logical choice for running business applications at law enforcement agencies:

Firstly, by virtualizing the middle-tier infrastructure, Oracle Exalogic enables law enforcement agencies to eliminate large-scale, legacy systems. Moreover, consolidating multiple platforms onto a smaller number of machines requires very little or no modification of the application code. The benefit of physical and virtual consolidation is a manifold reduction in data center footprints and accompanying energy savings (in terms of cooling and consumption).

Secondly, the simplicity of managing a distributed private-cloud middle-tier infrastructure is another important deciding factor. Distributed systems often require separate teams to manage the various solution components. As a result, root-cause analysis can be subject to undue complexity as multiple vendors claim that others are responsible for the issues at hand. With Oracle Exalogic, there is a single touch-point for any support issue involving Oracle Applications, middleware, compute nodes, networking and storage.

Thirdly, Exalogic has native integration with Exadata, Oracle’s engineered system for OLTP workloads. Having a standardized data tier on Exadata creates unmatched synergies in running the middle tier on Oracle Exalogic. Find out more at the upcoming Exalogic Elastic Cloud 2.0 launch.

Wednesday May 02, 2012

BPM Patterns & Practices in Industry

If you have missed any of my industry blog posts in the past, the following 30-minute webcast brings the entire effort to closure.

Sunday Apr 29, 2012

Day 2: Highlights @ Gartner BPM Summit, Baltimore, 2012

Below is a graphical approach to depicting a Business Transformation (in other words value of BPM) that was shared during one of the sessions at the Gartner BPM summit.

I reckon this is really useful for IT decision-makers in articulating the value of BPM to business leaders and showing progress along a time horizon, i.e. the transformation roadmap.

Note: The color scheme could indicate priority, magnitude of investment, or some combination of both.

Tuesday Apr 03, 2012

ARTS Reference Model for Retail

Consider a hypothetical scenario where you have been tasked with setting up retail operations for electronic goods, daily consumables, or a luxury brand. It is very likely you will be faced with the following questions:

  1. What are the essential business capabilities that you must have in place?
  2. What are the essential business activities underpinning each of the business capabilities identified in Step 1?
  3. What steps do you need to perform to execute each of the business activities identified in Step 2?

Answers to the above will drive your investments in software and hardware to enable core retail operations. More importantly, the choices you make in responding to these questions will have several implications in the short run and the long run. In the short term, you will incur the time and cost of defining your technology requirements, procuring the software/hardware components and getting them up and running. In the long term, as your operations grow organically or through M&A, partnerships and franchise business models, you will invariably need to make more technology investments to manage the greater complexity (scale and scope) of business operations.

"As new software applications, such as time & attendance, labor scheduling, and POS transactions, just to mention a few, are introduced into the store environment, it takes a disproportionate amount of time and effort to integrate them with existing store applications. These integration projects can add up to 50 percent to the time needed to implement a new software application and contribute significantly to the cost of the overall project, particularly if a systems integrator is called in. This has been the reality that all retailers have had to live with over the last two decades. The effect of the environment has not only been to increase costs, but also to limit retailers' ability to implement change and the speed with which they can do so." (excerpt taken from here)

Now, one would think a lot of retailers have already gone through the pain of finding answers to these questions, so why re-invent the wheel? Precisely for this reason, a major effort began almost 17 years ago in the retail industry to make it less expensive and less difficult to deploy new technology in stores and at the retail enterprise level. This effort is called the Association for Retail Technology Standards (ARTS). Without standards such as those defined by ARTS, you would very likely end up experiencing the following:

  • Increased Time and Cost due to resource wastage arising from
    • re-inventing the wheel, i.e. re-creating vanilla processes from scratch, and
    • incurring otherwise-avoidable mistakes and errors by ignoring the experience of others
  • Sub-optimal Process Efficiency due to a narrow, isolated view of processes, thereby
    • ignoring process inter-dependencies, i.e. optimizing the parts but not the whole, and
    • creating a lack of transparency and inter-departmental finger-pointing

Embracing ARTS standards as a blueprint for establishing, managing or streamlining your retail operations can benefit you in the following ways:

  • Improved Time-to-Market from parity with industry best-practice processes such as ARTS, thus
    • avoiding “reinventing the wheel” for common retail processes and focusing more on customizing processes for differentiation, and
    • lowering integration complexity and risk with a standardized vocabulary for exchange between internal and external (i.e. partner) systems
  • Lower Operating Costs by embracing the ARTS enterprise-wide process reference model for
    • developing and streamlining retail operations holistically instead of from a narrow, silo-ed view, and
    • procuring IT systems in compliance with ARTS, thus avoiding IT budget marginalization

While parity with industry standards such as the ARTS business process model does not by itself create differentiation, it does provide a higher starting point for bridging the strategy-execution gap in setting up and improving retail operations.

Friday Feb 24, 2012

Mitigating Cyber-Security Risks to Smart-Grid AMI

Today, utilities are committing to capital-intensive investments in upgrading to smart-grid/metering infrastructure, driven by government regulation and consumer awareness. However, in the process of becoming increasingly “smarter”, the digitization of the grid is blurring the lines between operational, information and communication technologies. This inadvertently makes the grid highly vulnerable to cyber-attacks, physical sabotage and equipment malfunction in the “last mile”, i.e. the Advanced Metering Infrastructure (AMI). Key components of the AMI are:

  • Smart-meter
  • Head-end Systems (HES)
  • Meter Reading-Control Systems (MRC)
  • Meter Data Management Systems (MDMS)
  • Load Control Devices (LC)

The graphic below depicts the logical architecture of AMI, although in practice capabilities of multiple components (other than meters) could be encapsulated in the same application.


The table below describes the cyber-security requirements for AMI identified by the Cyber Security Working Group (CSWG) of the Smart Grid Interoperability Panel (SGIP), based on the NIST IR 7628 standard for smart-grid security.


BPM can drive improvements in securing the “last mile” by enforcing smart-grid security guidelines such as NIST IR 7628. The benefits of greater control and visibility of the AMI processes listed in the table above are as follows:

  • Secure the “Last Mile” of energy distribution from cyber-threats and physical attacks with predictive and real-time Failure Mode and Effects Analysis (FMEA) for AMI processes
  • Lower Regulatory Risk in terms of consumer litigation and disputes, with an automated audit trail of data flows across the AMI to limit, and respond promptly to, data-confidentiality breaches
  • Improve Customer Satisfaction by preventing unwarranted disconnections arising from meter malfunction, human error and data corruption

Clearly, there is enormous promise in terms of energy conservation and a lower carbon footprint in the transformation of the electric grid. However, the infusion of information and communication technologies into the conventional electro-mechanical grid does create unknown vulnerabilities that can bring the grid, if not society, to a halt. Securing the “last mile”, i.e. the AMI, will be crucial to realizing the promise of the smart-grid future.

Friday Jan 27, 2012

Driving Operational Efficiency with eTOM

The distinction between network-operator Communications Service Providers (CSPs) and virtual CSPs (e.g. MVNOs) is decreasing by the day, driven by industry deregulation and the proliferation of IP-based networks that have lowered barriers to entry. This hyper-competition is creating continuous pressure on CSPs to shorten time-to-market cycles for PLM and FAB (fulfillment, assurance and billing) processes in order to differentiate.

The pursuit of this goal, though, is hampered by complex organizational structures with silos aligned to customer segments, service types or service bundles. Consequently, line-of-business priorities take precedence over end-to-end business goals, resulting in

  • Frequent IT refresh cycles with “fire-fighting” IT investment approach
  • Further fragmentation of the OSS/BSS with heterogeneous non-interoperable multi-vendor IT systems

So, why do such IT-driven business transformation efforts repeatedly under-perform, resulting in time and budget over-runs? Assuming the issue is not the capability of the IT systems or the people implementing them, that leaves us with only one other lever: processes. Non-compliance with standards-based ITSM frameworks drives IT investments poorly aligned with end-to-end business priorities, resulting in

  • Ad-hoc or silo-driven IT procurement, limiting the ability to realize the vendor-selection and contract-negotiation benefits of firm-wide procurement
  • The integration complexity of attaining an end-to-end process view across non-standard processes proliferating across multiple systems

A popular approach to achieving operational efficiency in running a service-provider business in the communications industry is to strive for conformance with TM Forum's Business Process Framework (eTOM).


The benefits of doing so are as follows:

  • Improved Time-to-Market with business-IT alignment driven by a comprehensive catalog of CSP business processes (i.e. eTOM), thus lowering integration risk by procuring IT systems pre-validated against industry standards and optimizing service-provider operations by embracing industry best-practice processes
  • Lower Procurement Costs by applying a consistent methodology to guide IT purchase decisions in vendor selection across RFI/RFPs, negotiating IT contracts with an enterprise view, and avoiding IT misspend on non-essential or low-priority systems and overspend on redundant systems

While parity with industry standards such as eTOM does not by itself create differentiation in the end-user service offering, it does enable improved utilization of IT budgets and narrows the strategy-execution gap that plagues business transformation programs. Business Process Management helps CSPs embrace eTOM to drive operational efficiency by enforcing a process-aligned methodology for IT procurement and service delivery.

Thursday Jan 12, 2012

Stuck in Cement: Turn to BPM for edge applications

[Note: Cross-posted from]

"Stuck in Cement: When Packaged Apps Create Barriers to Innovation", reads the title of a recent Forrester research paper. The author, Craig Le Clair, laments that packaged applications create inertia that makes it harder for organizations to embrace change from an execution perspective. As per the report, there is widespread frustration with regards to ability to packaged applications to allow businesses to break free from operational silos and embrace change. So does that mean packaged applications are the root of all organizational inertia and should be dispensed with? Certainly not!

Vertical and horizontal packaged applications were intended to provide scale to business operations in terms of capacity (i.e. volume), performance (i.e. Straight-Through Processing (STP)) and compliance (with standards and/or regulation), while mitigating the time, effort and comprehensive skill-set requirements, both technical and functional, of developing custom applications. The same rationale and value of packaged applications holds true, even more so today, when time-to-value (lead-to-cash and trouble-to-resolve) and time-to-market (concept-to-market and time-to-compliance) pressures are greater than ever. While technology innovations such as cloud accelerate initial set-up time and effort, to a large extent cloud-based applications simply apportion the up-front and on-going costs of packaged applications over their lifetime. It would be sacrilegious to claim that cloud-based applications will solve the agility issues faced with on-premise applications. In fact, the integration challenge would remain largely the same, if not get more complicated, especially given the security, privacy and data-synchronization concerns.

The problem of responding to change from a packaged-applications perspective has been incorrectly associated with the eradication of business silos. Organizational and IT-system stove-pipes have been berated as the cause of dysfunction in responding to change. But are organizational silos really bad? If so, why do they develop in the first place? Organization and IT-system silos are a consequence and concomitant of natural evolution as the organization grows in the depth and breadth of its offerings, geographic reach, vertical specialization and markets (i.e. customer segments). To respond to business priorities, that is revenue growth, margins, profitability or market share, organizations will continue to become more complicated. Matrix organizational structures are giving way to mesh-like (i.e. network) organizational structures where the boundaries between internal lines of business and external stakeholders (including customers, partners and suppliers) are blurring. Shouldn't businesses then be making more investments in packaged applications that are purpose-fit for specific customer niches, geographies and industries? Clearly, the flexibility of changing existing packaged applications to meet new business needs is overrated in today's business environment.

The solution lies in providing a consistent experience across external interfaces while continuing to make investments in internal applications (packaged or custom). After all, specialized, purpose-fit applications will deliver a competitive advantage. This is where edge applications built on BPM shine in overcoming the change inertia plaguing businesses. For instance, let's consider a local retailer contemplating entry into an overseas market. What if the retailer's existing CRM system does not fit the requirements of rapid entry into the target market? What choices does the retailer have?

One choice could be to customize the existing CRM system through custom development effort. Another could be to rip and replace the existing CRM system with a new on-premise or cloud-based CRM system. The latter approach may appear tempting in vendor pitches but is not for the faint-hearted in practice. To quote Carl von Clausewitz, "Everything in strategy is very simple, but that does not mean that everything is easy!" In reality, neither of the above approaches scales in the long term.

Yet another alternative, one that businesses typically resort to, is to deploy a new CRM system that is purpose-fit for the requirements of the overseas market. In this case, the business is faced with the time and effort of re-coding business rules and compliance policies in the new CRM system. Though this approach makes sense, it becomes harder to scale when future needs complicate the integration effort and the consistent enforcement of business rules and compliance policies across the stove-piped CRM systems. However, businesses can circumvent these issues if they build an intermediate layer that interfaces with the customer channels and orchestrates orders across the different front-end CRM systems. In this manner, businesses get the performance and capability benefits of purpose-fit packaged applications while being able to apply business rules and compliance policies consistently across them, thereby providing a uniform customer experience across the external channels.
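To make the idea of such an intermediate orchestration layer concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the back-end names, the policy function and the API are illustrative stand-ins, not any real CRM or BPM product.

```python
class CRMBackend:
    """Stand-in for a region-specific CRM system (illustrative only)."""
    def __init__(self, name):
        self.name = name
        self.orders = []

    def create_order(self, order):
        self.orders.append(order)
        return f"{self.name}:{len(self.orders)}"   # back-end order reference


def require_amount(order):
    """A shared compliance policy, enforced once at the edge."""
    if "amount" not in order:
        raise ValueError("amount required")


class EdgeOrchestrator:
    """Single entry point for all channels; routes to purpose-fit back-ends."""
    def __init__(self):
        self.backends = {}
        self.policies = []

    def register(self, region, backend):
        self.backends[region] = backend

    def add_policy(self, check):
        self.policies.append(check)

    def submit(self, order):
        for check in self.policies:        # consistent rules across all CRMs
            check(order)
        return self.backends[order["region"]].create_order(order)


edge = EdgeOrchestrator()
edge.register("US", CRMBackend("domestic-crm"))
edge.register("EU", CRMBackend("overseas-crm"))
edge.add_policy(require_amount)

ref = edge.submit({"region": "EU", "amount": 120.0})
print(ref)  # overseas-crm:1
```

The point of the sketch is the placement of the policy loop: business rules live once, in the edge layer, so adding a third regional CRM later means registering one more back-end, not re-coding the rules.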

The future is here today, and BPM addresses the long-standing challenge of the strategy-execution gap by serving as a platform for building edge applications.

Monday Dec 19, 2011

Improving Visibility of Payments Value-Chain part 2 / 2

In my earlier post I discussed the business imperative for improving visibility in payments processes and the factors (demand, organizational and technology) creating impedance in realizing the vision of Straight-Through Processing. In this post, I will outline some approaches to improving Straight-Through Processing in payments processes in order to improve their visibility.

Visibility in payments processing is fundamentally about knowing the status of payments as they flow from front-office initiation channels, across mid-office fraud, risk and compliance systems, through the payments application (i.e. AR/AP), and finally through external interfaces with clearing providers and financial-network providers (e.g. SWIFT, ACH). From a technology standpoint, payments straight-through processing requires streamlining the integration across all these inbound channels, outbound interfaces and ancillary systems.

One approach is to consolidate all of the existing payments applications onto a payments hub. Payments hubs are packaged applications that consolidate interfaces, both inbound and external, and offer full-fledged payments-processing capabilities including AR, AP, treasury, exception management (i.e. payments repair) and reporting. In principle, centralizing payments processing on a payments hub provides real-time cash-flow visibility by eliminating manual or semi-automated reconciliation across multiple payment-application stove-pipes, enables single-source-of-truth audit reporting, and streamlines governance. It is important to note, though, that such application consolidation does require serious investment in time and effort, potentially over multiple years, and may be cost-prohibitive depending on the comprehensiveness of the payments hub's payment-processing capability.

An alternative to application consolidation is interface consolidation with a payments gateway. Payments gateways streamline the number of integration points for payments applications by serving as a common pipe through which all inbound and outbound payment traffic flows. Unlike payments hubs, payments gateways do not replace the core payments processing undertaken by the payments applications. They offer connectivity to internal applications and external interfaces through pre-built adapters. This approach improves visibility by offering an aggregated view of the payments traffic in terms of volume and type of payments. Sophisticated gateways can introspect the payments traffic, which may comprise different message formats, thereby offering a real-time snapshot of payments inflow, payments outflow and exceptions.
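The introspection idea above can be sketched in a few lines of Python: a gateway normalizes each wire format through a pluggable parser and keeps a running snapshot of traffic by direction and payment type. The format name, parser and field names are assumptions for illustration, not any real gateway's API.

```python
from collections import Counter

class PaymentsGateway:
    """Common pipe for payment traffic; counts flows by direction and type."""
    def __init__(self, parsers):
        self.parsers = parsers        # one parser per wire format, e.g. {"csv": ...}
        self.snapshot = Counter()     # real-time aggregate view of traffic

    def route(self, fmt, raw):
        payment = self.parsers[fmt](raw)            # normalize the raw message
        if payment.get("error"):
            self.snapshot["exceptions"] += 1        # track payment exceptions
        else:
            self.snapshot[(payment["direction"], payment["type"])] += 1
        return payment


def parse_csv(raw):
    """Toy parser for a hypothetical 'direction,type,amount' format."""
    direction, ptype, amount = raw.split(",")
    return {"direction": direction, "type": ptype, "amount": float(amount)}


gw = PaymentsGateway({"csv": parse_csv})
gw.route("csv", "inbound,ACH,100.00")
gw.route("csv", "inbound,wire,2500.00")
gw.route("csv", "outbound,ACH,75.00")
print(gw.snapshot[("inbound", "ACH")])  # 1
```

Because every payment transits the same pipe, the snapshot gives the aggregated inflow/outflow view described above without touching the core payments applications themselves.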

A hybrid approach is to use BPM in conjunction with the above technologies. Here, a lightweight abstract process is created to represent the end-to-end journey from the time a payment enters the organizational boundary to the time it leaves it. This enables a chronological view of the payments workloads, giving insight for improving business operations by eliminating bottlenecks for a specific line of business, initiation channel or payments application. Such abstract processes could be modeled as event-driven processes that are triggered/invoked by different systems as payments flow through them. This BPM layer could also serve as the single source for tracking payments. Below is a graphical depiction of how a BPM layer can be used to gain end-to-end visibility of payments.
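Such an event-driven abstract process can be sketched as a simple tracker that records timestamped milestone events as each system fires them. The milestone names and the API are hypothetical illustrations, not a real BPM engine.

```python
from datetime import datetime, timezone

# Hypothetical milestones along the end-to-end payment journey
MILESTONES = ["initiated", "screened", "processed", "cleared"]

class PaymentTracker:
    """Single source of truth for where each payment is in its journey."""
    def __init__(self):
        self.events = {}   # payment_id -> list of (milestone, timestamp)

    def record(self, payment_id, milestone):
        if milestone not in MILESTONES:
            raise ValueError(f"unknown milestone: {milestone}")
        self.events.setdefault(payment_id, []).append(
            (milestone, datetime.now(timezone.utc)))

    def status(self, payment_id):
        history = self.events.get(payment_id, [])
        return history[-1][0] if history else "unknown"


tracker = PaymentTracker()
tracker.record("PAY-001", "initiated")   # fired by the initiation channel
tracker.record("PAY-001", "screened")    # fired by the fraud/risk system
print(tracker.status("PAY-001"))  # screened
```

The timestamps attached to each milestone are what make the chronological, bottleneck-finding view possible: comparing gaps between milestones across many payments reveals where a line of business or channel is slow.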

Wednesday Dec 14, 2011

Improving Visibility of Payments Value-Chain part 1 / 2

Payments processing is a central activity for financial institutions, especially retail banks, and for intermediaries that provide clearing and settlement services. Visibility of payments processing is essentially the ability to track payments and handle payment exceptions as payments flow from initiation to settlement. The business imperative for financial institutions, especially retail banks, to improve the visibility of their payments processes stems largely from the following:

  • Lowering the time and cost of fraud detection, risk management and compliance by applying these efforts centrally across lines of business, payment types and payment channels
  • Gaining real-time visibility of cash flows to optimize working capital by improving efficiency in borrowing and lending and by negotiating appropriate SLAs with intermediaries such as clearing houses and payment-channel providers such as credit-card providers

While automation has improved the capacity of existing payments systems to cope with the ever-increasing volume of payments traffic, several hurdles to improving visibility of payments processes remain. Payments processing is a complex business for largely the following reasons:

  1. A large and growing number of channels for payments initiation, including non-electronic channels such as in-person and post, and electronic channels such as ATMs, kiosks, Point-of-Sale (PoS), online, mobile, etc.
  2. Multiple payment types, including cash, check/draft, card (credit/debit), Electronic Funds Transfer (EFT), wire transfer, etc.
  3. Payments initiated as one type may be cleared and settled as another. For instance, a customer may pay a merchant using a check, and the merchant's bank may scan the check and send it as an electronic payment through a clearing counter-party.
  4. Varying governance requirements across payment types, clearing intermediaries, governments and industry standards
  5. Loss of control due to the separation of the payment-processing function across different entities, e.g. initiation of card payments is handled by a retail function in a bank whereas clearing of card payments could be handled by a card provider such as VISA

In addition to the above there are the following operational hurdles faced by a retail bank:

  1. Outdated payment systems that rely on batch processing, making it incredibly difficult to report the status of individual payments
  2. Multiple payment systems, each with its own fraud, compliance and risk systems, that are not integrated, increasing the time, cost and complexity of fraud detection, compliance and risk management
  3. Multiple external interfaces to clearing intermediaries (e.g. SWIFT, ACH, Fedwire, card providers), each with unique security and message-exchange requirements
  4. Structural silos internal to a bank, aligned to payment types and systems, hampering an enterprise-wide view of payment activities

In the next post, I will explore some approaches to achieving STP and improving visibility in payments processes that span front-office initiation channels, ancillary back-office systems and external interfaces.


A business-centric perspective on Private Cloud, Data-center Modernization and EAI.

Sanjeev Sharma
Twitter: @sanjeevio

