
Gain Insights into Current Trends and Challenges Impacting the Financial Services Industry

Recent Posts

Bank Revenue Management: driven by data, or going on guesstimates?

Blog Authored by Akshaya Kapoor, Senior Director, Product Strategy, Oracle

Are billing and pricing mechanisms falling behind the rest of banks’ digital stack? To fully realize the benefits of their investments in the digital customer experience, financial institutions must also merge customer data with a more fluid, forward-looking approach to how they manage revenue and price their products. By doing so, they can not only ensure more stable and resilient revenues in the long term, but also personalize what is arguably the most critical element of the customer experience: how much customers pay, and when.

Dynamic pricing for dynamic businesses

Despite broad moves towards customization across the financial services industry, few institutions have so far adopted an equally flexible approach to billing and pricing. In most cases, fixed fees and commissions are still set based on the bank’s general assessment of what customers may be willing to pay, or on the market rate set by its competitors. These traditional pricing models don’t usually account for the huge amounts of customer data that are already beginning to personalize almost every other area of the corporate banking experience – potentially causing banks to miss out on opportunities to better serve those customers and seize greater revenue and market share in the process.

Banks should consider applying the same data and analytics technologies they already use elsewhere to how they bill and price products. Doing so would allow them to more accurately define and optimize pricing, achieving the highest possible level of profit with each customer. As businesses grow, their financial service requirements also evolve – sometimes extremely rapidly, as in the case of everything from fast-growing start-ups to larger enterprises undergoing sudden mergers or acquisitions.
A more dynamic, data-driven approach to pricing can identify not only the right products for those changing needs, but the right price based on the customer’s financial history and current situation. That should ultimately translate into more products and services sold, in better alignment with customers’ unique and ever-changing requirements – a win-win for both banks and the businesses they serve.

Centralize to customize

For this to happen, financial institutions will need to begin centralizing their billing, pricing, and general revenue management systems – moving away from traditional departmental silos as quickly as possible. Only by converging their pricing and billing functions, and removing the redundancy that exists across different departments, can banks hope to both gain a full picture of their customers and coordinate pricing efforts for the best possible experience. When they do so, banks will also find it increasingly straightforward to calculate and forecast revenues from each customer; a centralized revenue management platform also paves the way for further innovations, such as real-time billing and automated, rule-based pricing, that can simultaneously enhance banking revenues and customer satisfaction.

Traditional “guesstimates” of the optimal price point can no longer function effectively – not least because market and individual customer demands shift so quickly as to render any such guesses almost immediately obsolete. By extending their data and analytics capabilities from the front-end customer experience into the realm of billing and pricing, banks can take the guesswork out of their revenues and better align their profit models with real, often surprising trends in demand. Ultimately, such dynamic pricing systems should seek to strengthen the relationships between banks and their customers on which long-term profits and growth – for both sides – are founded.
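To make the idea of automated, rule-based pricing concrete, here is a minimal sketch of a pricing rule engine in Python. The rules, fee levels, and thresholds are entirely illustrative assumptions, not any bank’s actual pricing model: each rule inspects a customer profile and adjusts a running fee.

```python
BASE_FEE = 25.00  # hypothetical monthly account fee, for illustration only

def volume_discount(profile, fee):
    """Reduce the fee for high-volume customers (illustrative threshold)."""
    if profile.get("monthly_transactions", 0) > 1000:
        return fee * 0.80  # 20% discount
    return fee

def relationship_discount(profile, fee):
    """Reward long-standing customer relationships."""
    if profile.get("tenure_years", 0) >= 5:
        return fee * 0.90  # 10% loyalty discount
    return fee

def risk_premium(profile, fee):
    """Add a premium when the customer's credit risk score is elevated."""
    if profile.get("risk_score", 0) > 0.7:
        return fee * 1.15  # 15% risk premium
    return fee

# Rules are applied in sequence; adding or removing a rule changes pricing
# behavior without touching the billing pipeline itself.
PRICING_RULES = [volume_discount, relationship_discount, risk_premium]

def price_for(profile, base_fee=BASE_FEE):
    """Apply every pricing rule to the base fee for one customer profile."""
    fee = base_fee
    for rule in PRICING_RULES:
        fee = rule(profile, fee)
    return round(fee, 2)
```

The design point the post makes maps onto the rule list: because each rule is data-driven and independent, a bank can tune prices per customer segment rather than fixing one fee for everyone.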
Discover how banks are extracting greater value across pricing, billing, and collections by implementing a complete platform for end-to-end revenue management. Find out how at www.oracle.com/goto/sibos


Banking

Could Latency Determine your Bank’s Rating?

Blog Authored by Shriyanka Hore, Director Product Strategy, Oracle

As the global financial industry moves towards real-time payments, banks will lose their relevance unless they can meet customers’ need for speed. Almost all major financial markets have taken steps to make real-time payments the default for both consumer and corporate customers, driven by new legislation like SEPA Instant in the Eurozone and the New Payments Platform (NPP) in Australia. According to Forrester’s latest findings, 90% of banks worldwide are building payments APIs with the explicit goal of improving visibility and agility – responding to demand for ever faster, more convenient payments. Some reports suggest that more than 1 in 2 corporate banking clients will choose their principal bank based on which institution provides real-time payments that match the speed of business.

As real-time becomes the default for payments processing, banks will find it increasingly difficult to justify fees or premiums for offering such capabilities. Soon, banks could even be ranked or rated according to their speed of payments handling – something made increasingly possible and transparent by new industry standards like SWIFT GPI. Corporate banking clients today expect the same speed of transaction that they experience as consumers, if not faster. They demand immediate, real-time access to funds and instantaneous responsiveness as part of how they gain and maintain a competitive edge. Banks need to go beyond traditional batch processing and embrace automated, even AI-enabled models if they want to survive these pressures from customers.

Staying secure, at speed

The challenge, for many banks, will be this: how can they maintain the fidelity of their business offerings at increasingly taxing speeds? Maintaining compliance and good governance of corporate transactions already poses a sizable challenge; how much more so when dealing with a much higher volume of payments at steadily rising velocity?
Forrester’s research suggests that 75% of banks worldwide see lengthy compliance and risk evaluations as the primary barrier to payments transformation – even more than the 70% struggling with legacy infrastructure and processes. To break the speed barrier, and maintain or even augment their reputation amongst customers, banks will need to accelerate not only payments but the way in which they secure and govern them.

Automation may be the key to doing so. Banks which automate their payments processes stand to reduce handling time by up to 40%, even as they radically improve the accuracy of those payments. By automating similarly repetitive processes such as Know Your Customer (KYC) or anti-money laundering checks, banks could significantly speed up their compliance routines – to the point of delivering real-time compliance that matches the ever-increasing speed of payments themselves. Applying analytics to their growing wealth of compliance and regulatory data could also help banks profile customer risk and flag suspicious transactions with much greater precision – simplifying the payments value chain and driving up operational efficiencies in a way that supports, if not enables, far broader digital transformation.

At the same time, banks would do well to centralize their payments handling. At present, many banks still operate multiple concurrent payments systems in siloed models where cross-system integration is rare. Bridging those silos, and drawing all payments handling under a single hub, should eliminate duplicate processes while also consolidating all payments data in a single place – making it much easier to develop new APIs, gateways, or other services based on analysis of that data and customers’ underlying needs. Some real-time payments standards, like Australia’s NPP, already demand higher levels of real-time service integration beyond just payments themselves.
Banks would do well to shift gears ahead of regulation and centralize their systems to lay solid foundations for further innovation and platform-building.

Ultimately, banks will only thrive if they can continue increasing the speed of payments as part of broader improvements to customer service. Adopting real-time payments is just the start: banks must fundamentally streamline their compliance, security, and even data sharing processes if they are to match the speed at which their customers seek to operate.

What opportunities do you see for banks to turn customers’ need for speed into a competitive advantage? Join us at SIBOS 2018 to learn more about what it means to deliver lightning-fast payments and how to meet customers’ need for speed. www.oracle.com/goto/sibos
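The shift the post describes – from batch processing to automated, in-line compliance checks – can be sketched in a few lines. This is a toy model, not a real screening system: the watchlist entries, the approval threshold, and the field names are invented for illustration. The point is structural: each payment is screened as it flows through, rather than queued for a later batch review.

```python
# Hypothetical sanctions/watchlist entries, purely for illustration.
WATCHLIST = {"ACME SHELL CO", "DUBIOUS HOLDINGS LTD"}

def screen_payment(payment):
    """Screen a single payment in-line; return (approved, reason)."""
    beneficiary = payment["beneficiary"].strip().upper()
    if beneficiary in WATCHLIST:
        return False, "beneficiary on watchlist"
    if payment["amount"] > 1_000_000:  # illustrative auto-approval limit
        return False, "amount above auto-approval threshold"
    return True, "auto-approved"

def process_batch(payments):
    """Split payments into immediately released and held-for-review queues."""
    released, held = [], []
    for p in payments:
        ok, reason = screen_payment(p)
        (released if ok else held).append({**p, "reason": reason})
    return released, held
```

In this shape, only the exceptions reach a human investigator; the clean majority of payments clears in real time, which is the efficiency gain the post attributes to automated compliance.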


Analytics

Financial Compliance: the foundation of banks’ customer experience?

Blog Authored By Sunil Mathew, Vice President, FCCM and Big Data, Oracle

The strength of banks’ compliance offerings will increasingly correlate with the quality of their customer experience. By investing in faster, more accurate compliance systems, banks can not only reduce their liability in the case of financial crimes or money laundering – which has, in some cases, stretched into billions of dollars of punitive measures – but also improve the fundamentals of customer service and engagement.

From bottleneck to accelerator

Compliance often acts as a speedbump to real-time payments, data sharing across services, and other core elements of the open banking environment that banks must now operate in. Banks face rapidly rising levels of regulatory complexity both locally and globally, with traditional methodologies of risk assessment under increasing pressure to provide intelligence of greater accuracy and timeliness than ever before. That typically translates into more and more time spent analyzing larger and larger sets of data – which in turn threatens to throttle the agility of the bank’s services and overall customer experience.

How can banks achieve even more rigorous standards of governance without compromising that agility or the reputation they hold amongst their customers? They may consider adopting new anti-fraud or AML platforms which combine real-time monitoring with increasingly sophisticated analytics, data visualization, and AI or machine learning capabilities. Doing so will help them automate the detection of, and streamline the response to, potentially criminal or noncompliant events with a speed and accuracy that constantly improves over time. For this to happen, banks should focus on centralizing or otherwise consolidating data from their core banking systems: the deeper the data sets available for these systems to analyze, the more precisely they can identify potential risk factors and refine the criteria they search for.
Banks cannot achieve an optimal experience for their customers without first tackling the speed and accuracy of their compliance solutions. Businesses will inevitably question their trust in any financial institution without robust compliance and governance controls – something traditional banks still hold as an advantage over their less-established fintech rivals. In fact, banks would do well to treat compliance and governance as the foundation of their customer experience – one which, when improved, delivers significant flow-on benefits across the organization.

Compliance as the cornerstone

The effects of digitizing compliance will be felt almost immediately by all of a bank’s customers. Apart from faster transactions – essential for services like cash flow management or credit approval – customers should also find that their regulatory and reporting burdens steadily decline. Smarter, AI-enabled platforms can significantly lower the incidence of false positives when comparing names or other identifiers, minimizing the disruption caused to businesses by unnecessary audits or money laundering investigations. When data from those platforms is securely shared across the bank, customers should also encounter far fewer governance checks and balances, of the sort normally demanded by KYC standards or other mandates on customer identification.

The greatest impact of digitized, AI-enabled compliance on customer experience is also the least visible. While most businesses may never see the cases of fraud averted or financial crimes stopped, they will almost certainly notice the benefits of faster transactions and service levels with less onerous manual reporting and governance requirements. For now, relatively few financial institutions – including fintechs – have managed to strike the right balance between robust governance and speed of processing.
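The false-positive problem in name comparison is easy to demonstrate. The sketch below uses Python’s standard-library `difflib` as a rough stand-in for the far more sophisticated ML-based matchers the post describes; the names and the 0.85 threshold are illustrative assumptions. A crude token-overlap check flags every shared surname, while a whole-string similarity ratio does not.

```python
from difflib import SequenceMatcher

def naive_match(name, watchlist_name):
    """Crude check: flag if the names share any token (e.g. a common surname)."""
    return bool(set(name.lower().split()) & set(watchlist_name.lower().split()))

def similarity_match(name, watchlist_name, threshold=0.85):
    """Flag only when overall string similarity crosses a threshold."""
    ratio = SequenceMatcher(None, name.lower(), watchlist_name.lower()).ratio()
    return ratio >= threshold
```

Against a watchlist entry “Anna Smith”, the naive check flags the unrelated customer “John Smith” (a false positive triggering an unnecessary investigation), while the similarity check does not; yet the similarity check still catches the near-variant “Jon Smith”. Production systems add phonetic encodings, transliteration, and learned models on top of this basic idea.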
Those who can accelerate their compliance solutions with data, analytics, and new technologies like machine learning will find themselves increasingly sought after by businesses looking to benefit from that enhanced customer experience – and even to apply such capabilities to their own offerings.

Join us at SIBOS 2018 for an in-depth discussion of the role of data and AI in the fight against money laundering and financial crime – and the opportunities for banks as providers of this valuable compliance expertise. Take part in our upcoming executive luncheon roundtable session “Crack the Code” to understand ways to streamline and protect your organisation by using machine learning and advanced analytics to discover patterns and turn raw data into a critical source of business intelligence. View more at www.oracle.com/goto/sibos


Banking

Pricing and Billing in the Cloud: Putting it into practice

Blog Authored by Akshaya Kapoor, Senior Director, Product Strategy, Oracle

Cloud pricing and billing has come a long way in the last few years, gradually evolving from a disruptive innovation to a pervasive reality. We are beginning to see many financial institutions thinking about it, experimenting with it, and gaining a deeper understanding of how it is shaping up to be a major paradigm shift in revenue management.

Why is this happening now? The shift to recurring subscription revenue models and increased customer demand for customized pricing and billing options are converging to set the stage for this opportunity. To focus on recurring customer relationships, the very foundation from which an enterprise operates not only needs to be re-established but also “modernized”.

Pricing and billing in the cloud – catchphrase or practical approach?

Historically, pricing and billing application implementations have been lengthy, costly, and to some extent unpredictable, with extensive customizations and expensive consulting engagements. The cloud changes everything. Pricing and billing in the cloud gives you all the benefits that a cloud solution promises: faster deployment, high availability, quick scalability, and business agility.

Is transitioning pricing and billing to the cloud only about cost optimization? For many business leaders, the phrase “cloud transformation” is synonymous with “cost savings”, and that’s understandable. A recent analysis found that cloud application projects deliver 3.2 times the ROI of on-premise ones – a staggering number. But beyond cost savings, the benefits that come with cloud pricing and billing adoption are profound.
Enterprises can:

- Make more precise and more profitable pricing decisions with built-in analytics: today’s cloud pricing and billing applications have built-in analytics that help financial institutions monitor and act on key events during the customer life cycle
- Improve customer relationships by providing customers with timely, accurate, and transparent pricing and invoicing: cloud pricing and billing eliminates manual processing and non-value work, reducing the likelihood of errors and saving time
- Gain real-time insight into revenue and profit margin estimates: using cloud-based pricing and billing, CFOs and line-level managers can analyze the up-to-the-minute financial situation and performance obligations, allowing for better-informed, strategic decision making
- Tailor the solution to their exact needs: compared to the ‘one size fits all’ design of on-premise systems, financial institutions can change processes, rules, and other conditions easily and rapidly in the cloud
- Offer personalization with innovative, flexible pricing and billing: intelligent billing and charging models that reflect consumption behavior

Strategic values to exploit

Take the case of a top consumer and wholesale bank. Once its pricing and billing is in the cloud, the bank discovers these value additions:

- Using a pre-configured application, the bank gets off the ground quickly. It takes just six months to go into operation, helping the bank take services to existing and new markets that much faster
- There are fewer disruptions, as upgrades are delivered regularly and automatically, which means near-zero downtime
- It has the freedom to scale up and down as needed – and fast – making it easier to quickly pilot new services on a limited scale and progressively scale to millions of transactions as necessary, thereby accelerating innovation
- With many compliance and security measures baked into the cloud, the bank no longer needs to maintain extensive internal security systems, nor does it have to spend as much time and effort on compliance procedures
- The bank has expanded its operations internationally and quickly entered new markets with built-in localizations that address local business requirements
- With seamless sharing of enterprise information and direct integration between other cloud-based and on-premise source systems, the bank has reduced the inefficiency caused by siloed systems. It has also gained a single source of truth that cascades across all operational functions, and is able to make business decisions more accurately and quickly

Financial institutions that embrace cloud pricing and billing today are more likely to see a greater payoff tomorrow. By moving pricing and billing to the cloud, financial institutions can achieve a sustainable competitive advantage by transforming how they operate internally and how they deliver value to customers.

See how Oracle can help you create Tomorrow’s Pricing and Billing, Today at www.oracle.com/goto/sibos and join us for a roundtable discussion on Oct 23 in Sydney.
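The “intelligent billing and charging models that reflect consumption behavior” mentioned above usually mean metered, tiered pricing. Here is a minimal sketch of that idea; the tier boundaries and per-unit rates are invented for illustration and do not reflect any real product’s price schedule.

```python
# Illustrative tier schedule: (units covered by this tier, rate per unit).
TIERS = [
    (1_000, 0.10),          # first 1,000 transactions at $0.10 each
    (9_000, 0.05),          # next 9,000 at $0.05 each
    (float("inf"), 0.02),   # everything beyond 10,000 at $0.02 each
]

def tiered_charge(usage):
    """Compute the bill for `usage` metered units under the tier schedule."""
    total, remaining = 0.0, usage
    for size, rate in TIERS:
        used = min(remaining, size)  # units consumed within this tier
        total += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return round(total, 2)
```

For example, 5,000 transactions bill as 1,000 × $0.10 plus 4,000 × $0.05. Because the schedule is just data, a cloud billing platform can swap in a different schedule per customer segment without redeploying anything, which is the flexibility argument the post is making.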


Big doesn’t have to mean slow for Open Banking

Blog Authored by Shriyanka Hore, Director Product Strategy, Oracle

Do smaller, nimbler fintechs have an inherent advantage over traditional financial institutions when it comes to open banking? It may certainly seem so, but that isn’t necessarily the case. In fact, the expertise that larger banks bring to bear – particularly in the areas of B2B transactions and corporate banking – may give them the critical mass to develop much broader, deeper offerings, and even establish themselves as platforms of choice in the open banking ecosystem. But they will need to accelerate their adoption of more open data standards if they want to build the necessary momentum and avoid ending up at the back of the race for customers’ loyalty and attention.

Bigger isn’t better…or is it?

The scale of traditional banks has often been held up as a roadblock to their adoption of open banking – and it’s true that smaller financial institutions and fintechs can typically adjust their infrastructure and processes much faster than their larger competitors. But traditional banks have something these smaller players often struggle to achieve: breadth and depth of service offerings, especially in B2B and corporate disciplines which require more specialized knowledge or technical mastery. Couple this with the faster, more streamlined data sharing of open banking standards, and traditional banks find themselves with an opportunity to own a much more comprehensive swathe of the banking services pool than fintechs or smaller institutions – who typically focus on just one core offering – may ever be able to. As banks expand their services and grow transaction volumes, they also gain more and more data to offer to other service providers in the open banking environment.
That has the potential to create a virtuous cycle which eventually establishes the bank as a platform of choice – both for businesses looking for a seamless, fully-integrated corporate banking experience, and for other providers looking to gain new customers or enter new markets. In an open banking environment, platform providers hold a significant and defensible advantage, and the critical mass of traditional banks – both in providing core services like payments and cross-border transactions, and in strategic value-adds like Know Your Customer (KYC) and anti-fraud protection – puts them in good stead to take up that position.

Riding the regulatory wave

To take advantage of these opportunities, however, traditional banks will need to adopt as proactive a stance as possible towards open banking – particularly when faced with its rising tide of regulation. New standards and directives like PSD2, for example, can be treated as impetus for innovation in how banks connect, cross-sell, and integrate products varying from faster payments to cross-bank handling of cash management. Banks which can develop the solutions and platforms for these new open services – like multi-bank cash management solutions for SMEs – stand to generate tens of millions of dollars in additional revenue.

For large banks to claim their advantage over smaller competitors, they will need to view these new directives – and open banking more generally – as a matter of customer experience rather than forced adaptation. The movement away from closed-off, proprietary services into open APIs, and the forging of new partnerships in the cloud, should ultimately be guided by what the bank’s customers want and need, and how the data from those customers might be used to further improve the experience across the entire financial services ecosystem.
Banks who move first will gain a valuable head-start in the race to become platforms of choice: Citi’s proactive move to enroll with the UK’s Open Banking Directory sends a clear message that it – and its data – is literally open for business, and committed to delivering as streamlined and cohesive a banking experience for customers as possible. Banks must stay at the front edge of the regulation wave, whether by adopting new core systems or preparing their data for third-party use, if they want their size to be a source of momentum rather than something that slows them down.

Join us at SIBOS 2018 to learn more about what open banking means for today’s financial institutions and how to turn impending regulation into a catalyst for change. www.oracle.com/goto/sibos


Analytics

Six ways to shift gears and ramp up your innovation quotient at SIBOS 2018

Blog Authored by Parvez Ahmad, Director Marketing, Oracle

Oracle’s innovation in financial services will be showcased at SIBOS 2018, where you can discover how technology is shaping the future of banking into a faster, more efficient and secure digital economy. From one-on-one meet-ups with our solution experts to discovery sessions and demos, we’re looking forward to sharing best practices, connecting, and collaborating on modern solutions for the banking world. At Oracle, we’re gearing up for the transition into a platform-centric, open banking environment – one which has major implications for the speed and customer engagement of corporate banks worldwide. Here’s what you can experience with us at SIBOS:

Arrive in style at the ICC with a free Tesla ride

This year, Oracle is offering SIBOS delegates free rides to the conference – and what better way to arrive than in an electric vehicle that leaves no carbon footprint behind? Simply tweet us at @OracleFreeRides to schedule a ride in a Tesla Model X – a more frictionless, efficient way of going places.

Start your day at the 5km fun run

Start your Wednesday morning fresh with a 6am fun run along the Pyrmont waterfront. Sponsored by Oracle, the sunrise route will start and end at the picturesque Metcalfe Park, giving you an opportunity to see the city from a different perspective before a busy conference day begins. The five-kilometer course is a great chance to get outdoors and network with other active delegates, so bring your running shoes and strap in for an early start. Sign up here.

Reach for new horizons at the exclusive Dinner in the Sky

Dine with us atop the iconic Sydney Tower, 300 meters above the heart of the central business district with 360-degree views of the city skyline.
At this exclusive invitation-only event, you’ll meet and hear from Scott Farrell, senior partner at King & Wood Mallesons, who has over 20 years’ experience in financial law and government regulation across Australia and Asia. You’ll also get the opportunity to hear from Australian track cycling Olympian Anna Meares about the secrets to reaching peak performance and how to inspire great leadership under even the toughest conditions. Register your interest here.

Meet with Oracle experts one-to-one at Booth I11

Engage in one-on-one meetings with Oracle experts at our exhibition booth for a discovery session and brainstorm on solutions and the technology enablers for corporate and business banking. Experience demos of our cloud-based platforms built on modern, open, and data-rich technology. Demos will cover a range of major flashpoints for today’s banks, including virtual account management, real-time liquidity management, real-time payments, revenue and billing management, and risk and compliance in today’s heightened-security environment (including KYC and AML solutions).

Keep up to date with industry trends

Learn how to navigate the digital platform economy and the latest financial sector changes with our Power Up and Open Theatre sessions. The Power Up Sessions are quick 20-minute TEDx-style presentations, while the Open Theatre will feature guest speaker Laura Misenheirmer from Wells Fargo, speaking on Payments Modernization: Responding to Market Events and Regulation Faster. These sessions will inform, educate, and prepare the financial industry for moving to analytics-driven platforms run on cloud technology and artificial intelligence.

Shape the future of banking together with us

Join us for executive luncheons and roundtable discussions on how adopting best-of-breed technology can help the financial industry overcome previously-intractable problems.
Ask questions, network with industry leaders, and discuss the role of data science and graph analytics in tackling financial crime. Our session “Crack the Code” will help you understand ways to streamline and protect your organisation by using machine learning and advanced analytics to discover patterns and turn raw data into a critical source of business intelligence. Another session will cover how banks can drive pricing and revenue assurance now, the case for change in a digital and real-time world, and how moving to the cloud can accelerate ROI and lead to competitive advantage. These sessions will help industry players innovate and respond to market dynamics using fast, agile, and secure digital platforms. Register your interest here.

Say hello to us at Booth I11 on Level 1 at SIBOS 2018, and discover our comprehensive and modern solutions for your business in the digital economy. www.oracle.com/goto/sibos


Financial Services

Oracle and B-Hive Europe Join Forces to Accelerate and Monetize Fintech Innovation

Banks, insurers, and other companies in the financial services sector have invested heavily in innovation through various fintech initiatives for the past few years. Firms have become good at two-day hackathons, four-day proofs of concept, and press releases touting their cutting-edge services. But putting new capabilities in place and generating revenue from them has proven more challenging. Since innovation without monetization was never the point of the exercise, many are now questioning whether they’re getting enough return on that investment.

Oracle believes European financial services companies can realize new revenue streams and higher returns from their fintech investments. All they need is a better ecosystem – one that matches fintech companies with the firms that could benefit from their technologies. It would include an innovation accelerator and world-class infrastructure, and it would curate fintechs around innovation themes such as trust, open banking, and talent. Plus, it would be based in a central location with a long banking history – a location like Brussels.

To help create that ideal ecosystem, Oracle recently rolled out its Fintech Innovation Program to Europe in conjunction with Brussels-based B-Hive Europe. This collaborative fintech platform brings together major banks, insurers, and market infrastructure players to work on common innovation challenges. The goal is to build bridges to the startup and scale-up community, and to create a market where banks can offer capabilities in a modular way without having to own every individual component that makes up a service offering.

As B-Hive Europe’s first strategic cloud infrastructure collaborator, Oracle brings expertise to help fintech companies achieve higher levels of enterprise readiness. Oracle will be collaborating with B-Hive Europe both to measure the performance of fintechs and to help them improve their performance scores.
Through collaboration, testing, mentoring, and a formal certification program, Oracle will help fintechs become more secure, scalable, and resilient – and help them prove it to tier-one financial services companies. Fintechs that meet Oracle’s standards will gain mentoring and insider access to Oracle’s base of financial services industry customers, with Oracle providing referrals and invitations to show-and-tell days and webinars.

Oracle believes monetization will flow when financial services companies become customers of fintechs and fintechs become channels for those financial services firms’ capabilities. As an example, when Barclays used Oracle’s Fintech Innovation Program to publish an open-API catalog for its PrecisionPay services – opening its virtual credit card services to the fintech ecosystem – it dramatically expanded its channel. It became an easy default choice for companies building a new service offering with Oracle enterprise software and looking to include virtual credit card services as part of that offering. In another example, Oracle added open APIs for TAS Group services – including card issuing, network gateway, and card management – to the Oracle cloud-based Digital Innovation Platform. Oracle customers can now easily select the specific TAS Group services they need, accelerating TAS Group’s innovation and expanding the company’s customer base.

Collaboration is key to B-Hive Europe’s success. At a recent panel discussion between representatives from Oracle, B-Hive Europe, and several fintechs, two fintech representatives, Mantica and TAS Group, met and have since begun collaborating to build joint solutions on the Oracle Cloud. When big tech meets fintech in an international banking center like Europe, things start happening.
Fintechs and financial services companies alike can build on Oracle’s Infrastructure-as-a-Service and Platform-as-a-Service cloud offerings to speed delivery of new services while mitigating innovation risk—and monetize joint innovations in a matter of weeks instead of years. In the near future, expect to hear more about new Oracle fintech initiatives in Europe in collaboration with B-Hive Europe.


Analytics

Network-Based Surveillance: A Solution to Transaction Monitoring

Blog By: Garima Chaudhary, Oracle Financial Services Financial Crime and Compliance Management Specialist. Financial institutions rely on deterministic rules to sift through transactions and pick out potentially suspicious ones as part of their Anti-Money Laundering (AML) and Anti-Terrorist Financing (ATF) programs. When a transaction is flagged, a notice (case) is generated and a procedure for resolving the red flag is enforced. Generally, these rules generate cases for resolution (investigation) as soon as a rule is hit. Each case focuses on one type of behavior, which may or may not include information on multiple related parties (customers, accounts, external entities). Challenges with the Traditional Monitoring Method: The traditional way of surveillance not only generates a massive number of cases, but also has some fundamental issues, such as: Lacks Holistic Surveillance: A specific behavior can be an indicator of suspicious activity and therefore should be assessed in conjunction with other indicators, not in a silo. When cases are created for an entity as soon as a suspicious rule hits, the surveillance process does not factor in the behaviors that occurred before and after that specific activity. This means the surveillance process lacks a holistic view, which makes the detection process ineffective to some degree. Siloed Investigation: For flagged transactions, AML staff investigate the specific circumstances surrounding the transaction. High-risk products, areas of operation, business lines, and basic customer information can influence the amount of transaction testing. During the investigation process, users do their best to manually include any related cases they find, primarily based on customer and account. Although this helps investigators include previous cases for that customer, it doesn't factor in other related, loosely related, or hidden suspicious behaviors.
These manually linked cases may provide some additional information about the investigated entity; however, they may not quantify overall risk. Too Much Information: Data is collected during the transaction testing process and during the follow-up investigations. Manual linkage of related cases adds a significant amount of data, which investigators must study as part of their investigation. This can be very hard to make sense of in the absence of a proper network view of all the involved parties. In summary, the traditional way of pattern detection leads to an enormous number of cases, which then require analysis of several other systems for comprehensive investigation. Overall, this leads to longer investigation periods and makes the entire process highly inefficient. Solution: Network-Based Surveillance The purpose of network-based surveillance is to leverage an optimization layer across all risk indicators (events) to apply a risk-based assessment. This in turn allows a comprehensive, entity-focused case to be created for investigation. Below are the broad-level steps of a network-based surveillance process: Step 1 - Ingestion & Enrichment: The first stage is to feed in all events from various sources, automated or manual. Events may not carry all the information required for effective detection, so event information should be enriched, and this enrichment should be extended to customers, accounts, external entities, and other relevant data sources. At this point, duplicate events should be identified and prevented from being reprocessed. Step 2 - Consolidation: Once ingested and enriched, events should be consolidated based on the primary entities (Customer, Account, Tax ID, Address), matched event data, or relevant attributes (Line of Business, Geography, Jurisdiction) of the focal entity associated with the event.
Consolidation rules can be further segregated to factor in various case types, such as AML Monitoring, Sanctions, etc. Step 3 - Scoring & Correlation: The event optimization layer should maintain scores at three levels: an event score (assigned at the point of event creation), a pre-case score (recomputed on every batch run), and an entity score (recomputed on every batch run). The pre-case score is compared against the case creation threshold. If a new event is generated for an entity that already has an open, extendable case, the event can be tagged directly to that case; the case statuses that allow new events to be added should be configurable. Step 4 - Correlation Scoring & Case Creation: In the traditional approach, every event generated by the transaction monitoring system either creates a case or is consolidated into an existing case. In this final step, by contrast, new events sit in a pre-case layer and are promoted to a case only once the score breaches the configurable case creation threshold. Benefits of Network-Based Surveillance: Let's discuss some key benefits of network-based pattern detection. Increased Coverage: Instead of investigating each risk indicator (event) individually, network-based pattern detection allows risk events to be prioritized, which increases monitoring coverage. Identify Hidden Relationships: Party relationships can be defined based on tightly or loosely related links. This helps identify hidden relationships at the surveillance layer itself, relationships that might otherwise be missed during investigation. Risk-Based Scoring & Prioritization: The network-based, multi-layer correlation process allows risk scoring not just at the case level, but at the individual event and entity levels too. Holistic Investigation: Since correlated entities and events are linked and presented as part of the case information, the case can be investigated from any entity's perspective.
Enhanced Network Visualization: Now that relationships are identified and enriched using both internal and external data, much more advanced network visualization can be used to pinpoint bad entities. While this new way of monitoring makes for a much more efficient Anti-Money Laundering and Anti-Terrorist Financing program, organizations should be careful about the depth of network links used for correlation. If not thought through, this can lead to much more complex cases and might overwhelm investigators. Appropriate training, the future need for delinking, and information sharing between analytics teams and Financial Investigation Units should all be considered when adopting this new program. Lastly, the subsequent phase would be to apply machine learning to identify new hidden relationships, statistical techniques for scoring, and case promotion thresholds determined from historical information.
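The four steps above can be sketched as a small event-correlation pipeline. This is an illustrative sketch only, not Oracle's implementation; all names (`Event`, `CASE_THRESHOLD`, the simple sum-based scoring) are hypothetical simplifications of the enrichment, consolidation, and threshold-based promotion described in the text.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical case-creation threshold (Step 4); configurable in a real system.
CASE_THRESHOLD = 100

@dataclass
class Event:
    event_id: str
    entity_id: str   # primary entity: customer, account, tax ID, address, ...
    score: int       # event-level risk score assigned at creation (Step 3)

def deduplicate(events):
    """Step 1: drop duplicate events so they are not reprocessed."""
    seen, unique = set(), []
    for e in events:
        if e.event_id not in seen:
            seen.add(e.event_id)
            unique.append(e)
    return unique

def consolidate(events):
    """Step 2: group events into pre-cases keyed by their primary entity."""
    pre_cases = defaultdict(list)
    for e in events:
        pre_cases[e.entity_id].append(e)
    return pre_cases

def promote_cases(pre_cases, threshold=CASE_THRESHOLD):
    """Steps 3-4: score each pre-case and promote it to a case only when
    the aggregate score breaches the configurable threshold."""
    cases = {}
    for entity, evts in pre_cases.items():
        total = sum(e.score for e in evts)  # naive aggregate; real scoring is richer
        if total >= threshold:
            cases[entity] = {"events": evts, "score": total}
    return cases

events = [
    Event("e1", "cust-42", 60),
    Event("e2", "cust-42", 55),  # same customer: consolidated into one pre-case
    Event("e2", "cust-42", 55),  # duplicate event id: dropped in Step 1
    Event("e3", "cust-77", 40),  # below threshold: stays in the pre-case layer
]
cases = promote_cases(consolidate(deduplicate(events)))
```

The key contrast with traditional monitoring is visible in the last step: `cust-77` generates an event but no case, whereas the two correlated events for `cust-42` jointly breach the threshold and produce a single entity-focused case.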


Financial Services

Leaders and Laggards: Get Ahead or Get Left Behind

By Mark Smedley, Vice President of Financial Services Pressure on wealth and asset management firms to innovate is coming from all directions. Faced with the choice to innovate or lose out to fintechs and other faster-moving competitors, firms need to identify the key areas where emerging technologies can best be adopted to help them create a truly client-centric experience. In a recent report, Wealth and Asset Management 2022: The Path to Digital Leadership [i], Econsult Solutions explores some of the most promising areas to focus on, as well as the costs of not moving fast enough. The firms that are leading in these areas are seeing impressive results: an average 8.6% increase in revenue, an 11.3% increase in productivity, and a 6.3% increase in market share. For the laggards, the cost is high—nearly $80 million per $1 billion of revenue. For the industry leaders, cloud, artificial intelligence (AI), and blockchain are the three technologies showing the most impressive results. Innovation Must Be Built on Solid Ground—in the Cloud The Econsult Solutions report identifies the foundational elements of a strong technology strategy. Of these, cloud is critical because a cloud-based platform facilitates the use of APIs, AI, blockchain, and robotics. Coupled with a data warehouse or data lake environment, these technologies make it possible to deliver innovative products and services, speed time to market for new offerings, reduce operating costs, scale easily, and deliver an extremely personalized client experience. But cloud is just the beginning. The Smart Money Is on AI AI may be the biggest game-changer in this ongoing digital transformation. Digital leaders are using it to increase advisor productivity, improve portfolio management, detect and anticipate cybersecurity risks, and more. Charles Schwab is using natural language processing to create personalized experiences for clients and to arm its advisors with information that helps investors.
AI can also drive internal change. The Econsult Solutions report predicts that, by 2022, robotics and AI will increasingly take over routine activities, including many areas of trading and back-office operations. By taking away these more mundane tasks, firms will be able to direct employees' efforts toward more creative work that adds client value. AI has already made inroads in replacing humans in high-frequency trading and some areas of portfolio management. AI is certainly a key factor in humanizing the client experience, but so is another technology that is still misunderstood by many: blockchain. Blockchain Helps Create the Frictionless Client Experience Blockchain has tremendous potential to streamline and simplify clients' interactions with their advisors. Because it's a distributed public ledger that gives every stakeholder in a transaction a "single view of the truth," blockchain can enable "smart contracts," greater transparency, faster transaction processing, and better data security. Since the blockchain ledger is encrypted, it can potentially replace many third-party intermediaries such as banks, brokers, or custodians for many different kinds of transactions or identity management—revolutionizing the whole financial services industry. Blockchain can also streamline the client onboarding process: if your identity is embedded in a block, then firms have immediate access to information that can verify your identity and credentials for fast processing. Other applications where blockchain has huge potential include reducing counterparty, market exposure, and operational risk, and replacing the internal book of record for transactions and portfolio positions. It's Not Either/Or—It's Both/And Thankfully, technology can't completely replace advisors. In fact, these new technologies offer them the opportunity to serve their clients better. Automated processes, customer insights, and market intelligence can enhance the human touch.
Technology will also automate mundane tasks for advisors and other employees, which is both humanizing and liberating. But the digital transformation will also require a major culture change at organizations: unless firms are willing to become extremely agile and foster innovation, they won't be able to succeed. Digital Is Not an Option We can't underestimate the risk of not adopting these promising technologies, or even of waiting too long to make the move. The Econsult Solutions report estimates a "laggard penalty" of up to $1.5 billion for large firms. The key to becoming a leader is staying ahead of the technology curve, and every wealth and asset management firm is taking steps to incorporate digital technology. The question is which ones will take the lead, and which ones will lag behind. [i] Econsult Solutions, Wealth and Asset Management 2022: The Path to Digital Leadership, 2017.


Financial Services

Welcome the Blockchain Generation

By: Sanjay Mathew, Sr. Director, FS Industry Solutions   While the Carolina Fintech Hub Generation Blockchain Challenge has come to a close, the opportunity to impact the future of multiple industries is far from over. Students participating in this hackathon were tested with real-world challenges: learning a complex technology on the fly, collaborating and making difficult choices within a group, and planning for contingencies when things didn’t go as expected. And yet these students rose to the challenge. For the 30 students on the 10 teams that submitted projects, the Blockchain Challenge was a chance to connect with industry executives and stakeholders in the North and South Carolina area. Participants took on a multitude of challenges, exhibiting qualities that are found in every successful entrepreneur, including the patience and perseverance to keep moving ahead and the discipline to keep collaborating and problem solving with their banking mentors and their Oracle support team. With access to the enterprise-grade Oracle Blockchain Cloud Service, student teams were able to take their ideas from business case to commercially relevant proof of concept with only two days of training. In addition to the experience, the connections, and the chance for a monetary reward, this generation’s entrepreneurs also learned valuable lessons from being hands-on with the process of applying blockchain technology. But the students weren’t the only ones learning. For those on the other side of this challenge—the sponsors who participated in mentoring student teams and judging the final results—this event provided an opportunity to give back to the community and foster emerging talent. But it was also a chance for the mentors to see what “thinking outside the box” looks like for the next generation and to reassess their organization’s own plans for transformation. 
For these companies, the Blockchain Challenge modeled a smarter way to innovate—a practical approach of engaging the academic community and partnering with technology leaders that together can jumpstart the next generation of opportunities. Blockchain mentors from Oracle also enjoyed the fresh ideas and persistence exhibited by the students, who continually questioned the status quo—something that "mature" businesses sometimes forget to do. The young entrepreneurs didn't take "no" for an answer and refused to be denied. Rather than being defined by project constraints, these future leaders broke through those constraints—often by the sheer brute force of hard work, long hours, and a refusal to accept anything but success. Financial services, and so much more. The concept of blockchain as a distributed ledger for making and recording transactions is straightforward. It's the unlimited scope for applications that's causing heart palpitations in boardrooms around the world. It's a technology that could affect—literally—everyone, but acceptance in the financial services industry has been understandably measured. In fact, it may be that the acceptance of blockchain in other, less heavily regulated industries is key to accelerating its acceptance in the financial services arena. "It will take a change in thought leadership within the industry to allow some of this technology to take hold," observed one mentor from a leading national bank. It's that kind of change that's been seen in the Blockchain Challenge. While sponsors of the first CFH hackathon were mainly from the financial services arena, the challenge itself was open to cross-industry submissions and produced a significant number of forward-thinking, viable business opportunities in areas such as energy, real estate, and healthcare, as well as financial services.
Sudhakar Pyndi, Director, Enterprise Architecture at Ally Bank and a mentor and judge in the CFH Generation Blockchain Challenge, was impressed by the level of hard work and dedication shown by the student competitors and sees a connection between the variety of applications and his own financial services industry. "It was good to learn about those kinds of applications as well. We're learning a lot that could easily apply to our industry." One mentor, impressed with both the variety of the applications and the level of excitement and commitment from the student entrepreneurs, noted, "There's not just a showcase of talent there, but there's a showcase of opportunity as well." Most executives believe that blockchain will revolutionize the financial industry over the long run, but many believe it will take time for the technology to gain market acceptance. That's consistent with an Oracle survey showing that investment providers plan to more than double their use of blockchain (from 15% to 31%) over the next five years. But the Blockchain Challenge made one thing clear to competitors and sponsors alike: there's no going back; there's only moving ahead—either as a leader in the process of accepting and implementing this cutting-edge technology or as a beneficiary of the results. The next generation is already there. The Generation Blockchain Challenge Hackathon shows that a future with blockchain is here. Now. And based on the results of this challenge and the capabilities of these young entrepreneurs, it's time for the rest of us to lead, follow, or get out of the way. ----------------- Learn more about the Carolina Fintech Hub Blockchain Challenge contestants, here.


Blockchain Technology and Financial Crime: A New Future?

Blog By: Julien Mansourian, Strategy and Transformation Executive. Blockchain is a sequential, distributed, and encrypted database first found in cryptocurrencies derived from Bitcoin. It has become a tool for maintaining transparent and distributed ledgers that can verify transactions, including financial ones, with minimal third-party involvement. Blockchain is distributed in the sense that the ledger is not held in a central location but rather spread across a network of computers, or nodes. And it is transparent in the sense that every transaction is made public for all to see. In a Blockchain environment, historical transactions can't be changed, which reduces the possibility of data tampering and maintains a high level of data integrity. Blockchain has tremendous potential to become one of the most sought-after technologies across all industry sectors over the coming years. Experts say Blockchain will cause a revolution similar to the one the Internet provoked. A recent article from Let's Talk Payments lists 26 separate banks currently exploring the use of Blockchain technology for payments processing. In parallel, 42 banks are considering a common set of standards and best practices with a view to creating commercial applications using a Blockchain. On the anti-money laundering (AML) side, financial institutions spend about US$10 billion per year developing AML measures, yet money laundering still takes place on a large scale. The objective of money laundering is to disguise the source of illegally obtained money so that its origins become untraceable. Money laundering processes are quite extensive. Generally speaking, money is laundered whenever a person or business deals in any way with another person's benefit from crime. That can occur in countless diverse ways.
Traditionally, money laundering has been described as a process that takes place in three distinct stages: Placement, the stage at which criminally derived funds are introduced into the financial system. Layering, the substantive stage of the process in which the property is 'washed' and its ownership and source are disguised. Integration, the final stage at which the 'laundered' property is reintroduced into the legitimate economy. This three-stage definition of money laundering is highly simplistic. The reality is that the so-called stages often overlap, and in some cases, for example in cases of financial crimes, there is no requirement for the proceeds of crime to be 'placed' at all. As financial criminals are after the vast economic data that banks and similar financial organizations handle, it is paramount that these institutions first keep their data safe. By implementing Blockchain, banks can increase the level of data traceability and integrity, which will consequently improve the quality of their transaction monitoring activities. In parallel, Blockchain can significantly reinforce the effectiveness of internal controls (COBIT 5 – Data Governance Requirements) to secure data and meet the underlying principles outlined by ISACA: reduce complexity and increase cost-effectiveness; increase user satisfaction with information security arrangements and outcomes; improve integration of information security; inform risk decisions and risk awareness; reduce information security incidents; and enhance support for innovation and competitiveness. Blockchain alone may not become the future AML platform, but because it is immutable and holds client information, financial institutions can directly leverage it to source data for Know Your Customer (KYC) or AML activities. The data irreversibility of Blockchain provides a single source of truth and therefore reduces the risk of duplications or errors.
Blockchain can be used to considerably streamline client onboarding and KYC processes, but it is necessary to focus on data privacy and cybersecurity. Cryptography is another critical element that has to be in place to ensure partitioning of data. The number of cases of fraud, hacking, and unauthorized personnel accessing data that should be secure poses a significant risk for all businesses. Utilizing Blockchain could overhaul and improve security and highlight where upgrades need to be made after a leak or hack has occurred. Blockchain technology presents considerable advantages to business, but there are some challenges too. Shifting to a decentralized network will require educating end users and operators and integrating with current working processes to have the biggest and best impact.
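To make the immutability claim above concrete, here is a minimal, illustrative hash-chain sketch, not a production blockchain (there is no consensus, networking, or encryption here). Each block commits to the SHA-256 hash of its predecessor, so altering any historical record breaks every subsequent link, which is the property that gives auditors and AML teams a tamper-evident single source of truth. All function names are hypothetical.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's full contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})
    return chain

def verify(chain):
    """Re-derive every link; any altered historical block breaks the chain."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

ledger = []
add_block(ledger, {"from": "A", "to": "B", "amount": 100})
add_block(ledger, {"from": "B", "to": "C", "amount": 40})
assert verify(ledger)

# Tampering with a historical transaction is immediately detectable,
# because the stored prev_hash of the next block no longer matches.
ledger[0]["data"]["amount"] = 1_000_000
assert not verify(ledger)
```

Real blockchains add distributed consensus on top of this linking, so no single party can rewrite history even if it controls one node, which is the basis for the data-integrity benefits discussed above.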


Banking

How will Transfer Agency continue to be relevant in the current Age of Disruption?

With the profound changes happening across Financial Services due to rapid technological shifts, and regulatory reactions to various scandals, will Transfer Agency still be a function that is relevant to the industry? Transfer Agency has long been thought of as a cost of providing Fund Administration, either in-house or through Third Party Administration, and the advances that are prevalent today lead us to question whether the process can be done more efficiently and cheaply than it is at present. Banks and Asset Managers feel that the process is cumbersome, a 'must do' activity rather than a 'want to do' activity. This view is further entrenched within Third Party Administrators, who see Fund Accounting and Custody as profit centres and who are 'forced' to do Transfer Agency to win mandates from various Asset Managers. Some question whether it is a function that will cease to exist[1], whilst others may look to see if it can be reduced or merged into another line of business. What will the future Transfer Agent look like as technology continues to advance and people look to disintermediate traditional ways of doing business? Since the start of the new century the world has gone through an unprecedented amount of technological change, which has affected us in ways we did not foresee. What were once seen as stalwarts of various industries are now no longer in existence. New companies have taken technology and applied it in new and imaginative ways to disrupt and even derail physical industries. At the end of 2000 we approached the end of the dot-com bubble as people rushed to all things Internet-related. Any company with an Internet address was hot property, irrespective of its business model and prospects. As history taught us, many of these companies failed, as the new technology was nascent and had still to find its true calling with entrepreneurs.
Fast forward 18 years and now we see the benefits of the Internet; but most of us would not have anticipated that we would be consuming it through a 'telephone.' The advent of the smart device has enabled wholesale and rapid change in the way we engage with each other (social media), acquire goods and services (Amazon, Alibaba, etc.), and get information and entertain ourselves. The result has been profound, with the world entering a Digital Revolution akin to the Industrial Revolution in its multi-faceted impact. The Age of Disruption is part of the Digital Revolution, as entrepreneurs and others look to apply digital technology to all areas of our lives. But what does the technological shift mean for Transfer Agency? Will it remain a function that is required? Or will the pace of change bring about the end of the function as we move on to more efficient processes? It is important to remember that, in and of itself, Transfer Agency remains a bookkeeping process, wherein Investors' contact details, trades, and holdings are captured and maintained. Over time, as Investors' and Asset Managers' needs changed and regulatory requirements increased, the process became more complex and nuanced. Given its increasing maturity and presence, Transfer Agency has become another layer of intermediation between the Investor and the Fund Manager. Whilst within the Industry we see each function (Custody, Fund Accounting, etc.) as a value-add service, it must be remembered that to the Investor these functions extract costs from the overall Investment. To maintain relevance and cost effectiveness, Transfer Agency must address the many significant factors that contribute to its reputation as an unloved cost centre. These include the speed of and willingness to change; the product lines and investor types serviced; the underlying technology and ecosystem; and the opportunity to provide an enhanced customer experience.
By addressing these factors and demonstrating the value the function can bring to its various stakeholders, Transfer Agency can thrive and mature further. It is important to remember that it is the one place an Investor gets to engage with the Asset Manager, and so it offers an unparalleled opportunity to set the tone of engagement and develop the Investor relationship. The Transfer Agent that can show itself as a true partner to the Asset Manager, by providing a leading customer experience with the ability to deliver on all reporting and KYC/AML requirements, will soon be seen as essential to the Asset Manager. Effective and efficient processing, with the agility to respond to Investor, Asset Manager, and Regulatory demands, will ensure that Transfer Agency maintains its relevance. However, standing still and maintaining relevance is something that Transfer Agency has been trying to do for the last number of years. Now is the time to realise that new, bold actions should be taken so that Transfer Agency is not consigned to the dustbin of history. As millennials and digital natives become more engaged with the Asset Management industry and recognise the need to 'save for a rainy day', the Transfer Agency / Asset Management partnership that offers the best customer experience will continue to be relevant. The upcoming generation of Investors, be they Retail or Corporate, will expect a dynamic, flexible, and 'enjoyable' investing experience! This presents a challenge to a function that moves at a glacial pace and frequently pays lip service to transformation. Now is the time for Transfer Agency to grab the opportunity and show its willingness for quick and effective change. Many in the industry are consciously investigating new ways of doing business, looking at how they can advance themselves in the current market and showcase their innovation.
However, Transfer Agency as a whole needs to rethink how it operates to take advantage of today's technology offerings so it can adapt and survive. Only the bravest and boldest will then thrive. Nevertheless, this cannot be done on legacy systems that have, over time, spawned whole sub-ecosystems as tactical solutions were applied to maintain the BAU state. Putting a shiny Digital Front End onto this type of environment, for example, will not provide a strategic solution, since multiple touch points will be required behind the scenes, slowing the experience and potentially disengaging the very audience the Front End was meant to attract. One area where Transfer Agency can and should lead the way is in the servicing of Products and Investors. The forward-looking, dynamic Transfer Agent will realise that it can provide its services to a diverse range of Asset Managers and Investors across a broad range of Products. Retail investors should be embraced (it should be remembered that they are, in effect, the investors underneath the Institutional Investor) with a multi-channel offering on an agile platform. As millennials come to the fore with their future savings, a staid Transfer Agent will find it hard to engage them and justify its fees if it cannot service this new investor type. The Transfer Agent that can take its experience in fund servicing and apply it to new product ranges (e.g. Pensions, ETFs) and additional services (such as acting as a Distribution channel/Sub TA) will find opportunities in the market. With the capability to service Mutual Funds, Alternative Investments, and Pension & Insurance offerings through a single gateway, the next-generation Transfer Agent will continue to be relevant if the engagement process is effective. To do this, Transfer Agents can evolve by moving away from legacy systems and infrastructure to a microservice architecture in a more streamlined environment.
They can then truly transform themselves into a Digital Transfer Agent with the capability to service a wider range of products and investors at lower cost. As a function, Transfer Agency will still be performed, as it is necessary to keep a record of all Investors in a Fund. The modern, forward-looking Transfer Agent will not necessarily be what we see and are comfortable with today. To maintain relevance now and into the future, Transfer Agency has to embrace change and the transformative technology that is still in its infancy. In addition, the 'sticky plaster' approach to systems needs to be rethought, and appropriate investment in modern IT systems must become the new normal. A cost-efficient function that can provide a streamlined, efficient service which fully engages the end Investor will always be seen as relevant. However, if we continue to wait for others to lead, we will soon find ourselves looking back at history, trying to understand why Transfer Agency is no longer required. This is an abridged version of an article that was first published in the Journal of Securities Operations & Custody, Volume 10 Number 2, by the same author (see https://www.henrystewartpublications.com/jsoc/v10) [1] "Why transfer agents should embrace an infrastructural future"; Dominic Hobson; http://cooconnect.com/dominic-hobson/why-transfer-agents-should-embrace-an-infrastructural-future


The Blockchain Generation: Creating the Future of Banking

By: Rochelle Brocks-Smith, Director, FS Industry Solutions, and Sanjay Mathew, Sr. Director, FS Industry Solutions. It's one thing to be the master of your own destiny in your 20s. It's quite another to tackle the challenge of redefining the future of financial services by mastering a cutting-edge technology like blockchain. Yet that is exactly what students and recent graduates from the Carolinas, in partnership with leading banks (Bank of America, Wells Fargo, BB&T and Ally Bank), Oracle SIA partners and Aurablocks, are doing as part of the Carolina Fintech Hub Generation Blockchain Challenge. Why the Generation Blockchain Challenge? The Carolina Fintech Hub, the sponsor of this blockchain event and an organization focused on accelerating the merging worlds of financial services and digital technology, hopes to inspire the next generation of leaders by giving challenge participants an opportunity to develop commercially viable and industry-relevant blockchain projects. The challenge for students is to "imagine a disruptive use for blockchain technology" and present a fully working blockchain solution. Entries could win participants one of three monetary prizes and a chance to see their ideas brought to fruition, possibly impacting the course of the industry. The Challenge was officially launched on January 22, and 30 teams submitted their business cases before the February 25 deadline. On February 28, 10 teams—a total of 22 students—were selected to move forward into phase two, developing a proof of concept (POC). Students can explore any blockchain topic they wish as long as they are solving real industry problems that have commercial relevance to the sponsor companies.
Some of the final top 10 projects chosen for the challenge include clinical trials record keeping, title and deed transfers, public health reporting, prescription fulfillment optimization, Know-Your-Customer/Anti-Money Laundering, trading of energy and oil, managing gift points, and contract arbitration. The range of projects covers a great mix of blockchain project types, from provenance tracking and marketplace disintermediation of middlemen to complex process optimization, records keeping and identity management. You might ask why bank mentors are interested in such a vast array of cross-industry projects. The answer lies in the realization that to learn and innovate, one must explore and learn from industries beyond one's own. That is when real innovation ignites. The top three winners will be announced on April 20th by a jury of industry experts. While close to 12 universities submitted applications for the challenge, the top 10 student teams selected to compete represent the University of North Carolina at Charlotte and North Carolina State University. Teams take advantage of training on cutting-edge technology and real-world experience. To help the 10 final teams speed development time and focus on their applications, they have been given access to the Oracle Digital Innovation Platform for Open Banking, along with extensive training from Oracle and Oracle implementation partner Aurablocks, who conducted blockchain training in Raleigh and Charlotte for over 150 students. The Oracle open banking platform is a cloud-based, open-API solution that can help accelerate the implementation of open banking ecosystems, connecting legacy core banking platforms with a variety of Fintechs and industry partners and enabling rapid digitization of legacy customer-facing processes.
Teams will be able to use a full API stack, pre-integrated with fintech solutions on Oracle's open banking platform (e.g., payments, machine learning, cognitive tools, biometrics, bank account connectivity). The platform also includes a mobile development platform, UI tools, chat bots, development tools, and open source tools support, with Oracle's enterprise-grade blockchain platform, Oracle Blockchain Cloud Services, built in. Access to this innovative platform will help the students marry their ideas with the latest Oracle and fintech technology—technology that they would never have access to without the Generation Blockchain Challenge. They will also have the help of industry leaders from CFH supporting organizations, including Ally, BB&T, Bank of America, Wells Fargo, Oracle and SIA partners, who have volunteered to serve as team mentors and panel judges. Blockchain: The art of the possible. Blockchain is a game-changing technology that promises to revolutionize financial services. In a recent Roubini ThoughtLab report, "Wealth and Asset Management 2022: The Path to Digital Leadership," 15 percent of industry leaders surveyed said they are currently using blockchain technology, and another 31 percent expect to adopt it in the next five years—a 110 percent growth rate. For the students involved in the challenge, this is a fantastic opportunity to learn more about blockchain technology and to develop their business and technical skills with the help of their mentors and access to Oracle's advanced enterprise-grade blockchain cloud technology. At Oracle, we hope to support these students in their endeavors to become successful entrepreneurs and develop commercial banking and non-banking applications which may be applicable to millions of customers across industries globally. For more information read: Join the Blockchain Revolution; Why the Future of Banking is Open


Analytics

The Emerging Generation of Machine Learning and Artificial Intelligence: Solving AML Challenges

Blog By: Garima Chaudhary, Oracle Financial Services Financial Crime and Compliance Management Specialist. Some time ago, in my Top 3 Trends Transforming AML Programs post, I discussed how Machine Learning would be one of the top three trends transforming an Anti-Money Laundering (AML) program. Extending the same thread, in this post I will discuss current key challenges and how to tackle them with Machine Learning and Artificial Intelligence (AI). It has always been a challenge to evaluate the total amount of money going through the laundering cycle due to the concealed nature of money laundering. The estimated amount of money laundered globally in one year is 2-5% of global GDP, or $800 billion to $2 trillion in current US dollars. It is therefore no surprise that regulatory scrutiny and penalties continue to rise, which in turn drives compliance cost (see my post on Robust AML Program vs. Penalties: AML & Sanction Fines in 2017). Rapid developments in financial information, technology and communication allow money to move anywhere in the world with speed and ease, making the task of combating money laundering more urgent than ever. The deeper "dirty money" gets into the international banking system, the more difficult it is to identify its origin. Recent products, such as virtual currency, make it even more complex. Current Challenges: The AML compliance program has come a long way since its inception; however, below are the key challenges currently faced by the industry when it comes to detection and investigation. Traditional Rules Generate False Positives; Expectation to Monitor More: To monitor suspicious money-laundering behaviors, financial institutions have relied on traditional rules to generate alerts. Static rules are applied to a fixed, segmented customer population, and most financial institutions still follow this unsophisticated segmentation approach, which means the total number of false positives is very high.
Additionally, since these traditional rules are fixed rather than dynamic, growth in transaction volumes increases the number of false alerts. Adjusting the parameters to generate too few alerts means that the firm has increased its overall risk and is now exposed to regulators. Unknown Patterns: Based on industry knowledge, both financial institutions and system providers offer good coverage of monitoring patterns, which are applied based on the financial institution's AML risk profile. Ongoing tuning – Above the Line and Below the Line – is performed to make sure similar behaviors are not missed for the customer population and to help understand the performance of specific suspicious behaviors. However, financial institutions do not have a mechanism to identify the suspicious monitoring patterns that are completely missed and unidentified. This might mean that a financial institution's exposure to money-laundering activity goes unnoticed. Acquiring More Man Power is Not Enough: Alerts and patterns detected by the system are not always conclusive proof of money laundering. While the system highlights them, it is ultimately dependent on investigators to reach a conclusion. Below are the key areas where having more human resources is not making the overall investigation procedure efficient. Looking for a Needle in a Haystack: For the last few years, as regulatory scrutiny increased, firms were expected to monitor even more. It was tough to convince regulators to accept a reduction of monitoring thresholds, which would have helped firms reduce the total number of suspicious alerts and false positives. The quick resolution was to hire more investigative resources to clear the pile of alerts these firms were generating; the current situation is like looking for a needle in a haystack. Manual Linking is Not Reliable: Because the monitoring scope has increased significantly, firms are expected to monitor suspicious behavior from different viewpoints (e.g.
suspicious behavior focused on customers, involved external parties, other institutions, etc.). This has resulted in duplicate alerts, alerts that look unrelated up front but are related, and entities (customers, accounts, external parties, etc.) that look unlinked up front but are linked. Social network activity and the use of modern devices make it almost impossible for investigators to link related alerts or entities manually. As a result, current investigation procedures lack a 360-degree customer viewpoint. How Machine Learning and Artificial Intelligence Can Help: While machine learning algorithms have been around for quite some time, the ability to automatically apply complex mathematical calculations to big data is a recent development. Artificial Intelligence technology provides the ability for systems to learn from past human interaction and apply that learning later. Let us explore how machine learning and AI can be leveraged in the AML space to resolve some of the challenges discussed above. Sophisticated Monitoring: Typically, monitoring behaviors are applied to a fixed, segmented customer base and generate suspicious alerts when matched. Some more advanced financial institutions apply more comprehensive segmentation logic based on customer type, product, transactional activity, etc. However, most apply fixed rules to their customer population. The first step toward advanced monitoring is to apply more sophisticated, intelligent segmentation that uses customer data to group customers with similar characteristics, so that the appropriate behaviors can be used to determine suspicious patterns. The second step is to leverage machine learning to write expressions of complex rules that can detect a combination of multiple money-laundering patterns, as compared to the traditional approach of a single rule at a time.
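As an illustration of the first step, a minimal clustering sketch might group customers by transactional behavior instead of fixed segments. The feature names are hypothetical, and a plain k-means is written from scratch here only to keep the sketch self-contained; a production system would use a dedicated analytics library.

```python
import random
from math import dist  # Python 3.8+

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: returns (centroids, clusters) for a list of feature tuples."""
    rnd = random.Random(seed)
    centroids = rnd.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each customer to the nearest centroid
            clusters[min(range(k), key=lambda i: dist(p, centroids[i]))].append(p)
        new = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:  # converged
            break
        centroids = new
    return centroids, clusters

# Hypothetical features: (monthly transaction count, average cash deposit in $k)
customers = [(5, 1.0), (6, 1.2), (4, 0.9),            # low-activity retail profile
             (120, 45.0), (130, 52.0), (110, 48.0)]   # high-volume profile
_, segments = kmeans(customers, k=2)
```

Each resulting segment would then receive its own monitoring thresholds, rather than one fixed rule being applied to the whole customer population.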
Intelligent segmentation combined with sophisticated machine-learning patterns can significantly reduce the number of false positives and enhance the overall detection coverage of previously overlooked suspicious transactions. Pattern Discovery: Machine learning relies on algorithms that read through data and automate pattern detection. It can identify hidden patterns that a user would not necessarily have noticed, and is a more comprehensive approach to making sure unknown suspicious patterns are identified and detected. Networks can be formed and correlated at multiple levels, based on common attributes found on the events themselves, the reference data related to the events, and/or any other internal or external data (such as social network activity). Network Analytics, which works like social media in analyzing connections between people, can provide a visual of how a financial institution's customers are connected via their transactions. Network Analytics allows both automated pattern detection and the expression of explicit (and complex) pattern queries. Correlation and Prioritization: The Network Analytics described above can be further used for correlation, allowing firms to apply logic to group similar alerts, thus enabling individuals to spend less time sifting through alerts and to focus on correlated cases. The correlation process should examine alerts and the entities associated with those alerts, scoring them individually and as part of a correlated network. Alert scoring that looks at historical alert dispositions can then be applied using machine learning models. Users should be able to select models, tune parameters, compare model results and deploy the most productive model to score new alerts. Conclusion: When strategically implemented, the methods described above can help financial institutions improve operational efficiency, reduce risk and focus employee man-hours on a smaller, higher-risk caseload.
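A minimal sketch of the correlation step can make this concrete. The data model here is illustrative (each alert carries the set of entity keys it references); alerts that share any entity are merged into one case via union-find, which is one simple way to compute the connected components of the alert network:

```python
def correlate_alerts(alerts):
    """Group alert IDs into cases: alerts sharing any entity end up in one case.

    `alerts` maps alert ID -> set of entity keys (customer, account,
    external party). Uses union-find to build the connected components.
    """
    parent = {aid: aid for aid in alerts}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    seen = {}  # entity -> first alert that referenced it
    for aid, entities in alerts.items():
        for entity in entities:
            if entity in seen:
                parent[find(aid)] = find(seen[entity])  # merge the two cases
            else:
                seen[entity] = aid

    cases = {}
    for aid in alerts:
        cases.setdefault(find(aid), set()).add(aid)
    return list(cases.values())

# Hypothetical alerts: A1 and A2 share an account, so they form one case
example = {
    "A1": {"cust:X", "acct:1"},
    "A2": {"acct:1", "ext:payee9"},
    "A3": {"cust:Y"},
}
```

An investigator would then review the correlated case as a whole, with scoring applied at both the alert and case level, instead of sifting through the alerts one by one.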
Lastly, no matter what area Machine Learning or AI is applied to, "explainability" is key from a regulatory standpoint. The focus should therefore be on producing more explainable models while maintaining a high level of learning performance (prediction accuracy), enabling human users to understand, appropriately trust, and effectively manage this emerging generation of artificially intelligent partners.


Banking

Fintech & Digital Disruption on Transfer Agency

In today's technology and business landscape, Digital Disruption is and has been a buzz phrase for many years. In particular, we have seen its impact on Financial Services, as well as on many other industries. Nevertheless, in some of the more established markets and segments of Financial Services there has been a mixed outcome to the new digital age. Along with Digital Disruption there has also been much discussion over the past few years on the impact of FinTech within Financial Services. Core banking seems to be at the fore of this focus for both FinTech and Digital Disruption, such that the Transfer Agency space has not seen the full impact of the revolution. Within Transfer Agency we have seen many companies choose to extend the lifespan of current, legacy systems. However, as Fund investors have become more 'tech savvy', and the demand from Regulation continues to grow, many organizations now find that they cannot move forward as the market and their clients demand (see Fund Servicing in the Millennial Age). FinTech companies can offer alternatives, and we have seen impacts on established market participants. Nevertheless, many FinTechs are not focused on the fundamental aspects of Transfer Agency and therefore do not always cater to the needs of the key players. The same is true for those offering a transition to the digital market: in the rush to be digitally ready, many companies have not received the expected Return On Investment. To be digitally ready you first need to be on a stable core platform that offers a global foundation for current and future needs. This requirement is one of the strategic elements that any next-generation system for a Transfer Agent should provide. We only have to look at the various banking groups across the globe who have struggled to add digital to their legacy core banking systems (see Outage for Bank following refresh) to see the impact of adding digital to legacy.
Those groups with a more robust and scalable platform have been more successful in their digital rollout. So, will Digital Disruption impact us here in the Transfer Agency segment, and if so, how? Will legacy systems be robust enough to handle the new digital requirement? The TA industry is at a crossroads. When we look at the role TA plays in the funds industry, it is often seen as an activity that must be performed, but one that is rarely done with eagerness due to the perceived costs involved. We know that regulators are placing significant challenges on all participants, but is this preventing a move to digital? Within Financial Services and Transfer Agency (TA), many commentators have shown themselves to be doomsayers, prophets or participants in the current vogue of technology speak. Is Fintech the way forward, or can incumbents use their current providers to move into the digital age? For many, moving forward with digital adds further risk; is there a regulatory hindrance inhibiting the move to digital in established markets? Is regulatory compliance a smokescreen, and do today's Transfer Agents have more fundamental issues preventing a move to digital? To date we have not seen the expected Digital Disruption in the industry, as incumbents and Fintechs have focused their attention elsewhere. However, I now expect this to change, and the TA with the best technology provider will evolve to meet the market need for digital. With a robust and scalable platform behind it, a modern TA has opportunities to improve revenue streams and take the market lead with a best-of-breed offering. However, as noted above, legacy systems are not necessarily capable of supporting an additional channel without a risk of further stability issues. These platforms have grown their own eco-systems to such an extent that they require ongoing maintenance to ensure a BAU state for the business.
The addition of the digital channel may, or may not, be the 'straw that breaks the camel's back'. So, is FinTech going to be the silver bullet for digitisation in TA? When researching what 'Fintech' means for TA, I was amazed to find that the search engines were returning results from 2008. Can it really be that, if we type "Transfer Agency Fintech", the industry has not progressed in over eight years, or has there simply been a period of stagnation? Whilst this is the case in Europe and the US in particular, there is evidence that Fintechs and incumbents are successfully moving into digital in other regions (see Thai Bank plans for 2016). I, for one, feel that here in Europe we have been slow to grasp the opportunity that digital presents. Consequently, as incumbent Fintechs and other providers have been slow to move to digital, who will act as our disruptor? Will there be 'true' disruption from a new source (be it a Fintech or a new entrant such as Apple), or will it be the use of existing technologies in a new way? The area that can provide our first glimpse of this disruption is a technology that is already established and more than 20 years old, namely distributed ledgers. The advent of Bitcoin (and the associated Blockchain technology) has given this old concept a new lease of life. The idea we can take from Blockchain is that it offers the end investor a simplified experience when engaging with their service provider. In turn, the service provider will need to support this evolving technology. To move forward, then, we need to look beyond our existing legacy platform and identify where simplification of the eco-system and/or platform can take place. We have seen that many applications have spawned their own eco-systems and customisations to meet the demands of a dynamic market and energetic regulators. As such, these legacy systems are not in a place where they have the agility and ability to update quickly.
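For readers unfamiliar with the distributed-ledger idea mentioned above, its core mechanism can be sketched in a few lines: each record embeds the hash of the previous one, so tampering with any earlier entry invalidates the whole chain. This is a conceptual toy (the record fields are invented for illustration), not a sketch of any particular ledger product:

```python
import hashlib
import json

def _digest(payload):
    """Deterministic SHA-256 of a block's content."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    """Append a record whose hash covers the data and the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = {"index": len(chain), "data": data, "prev_hash": prev_hash}
    chain.append({**payload, "hash": _digest(payload)})
    return chain

def verify_chain(chain):
    """True only if every hash is intact and every link points at its predecessor."""
    for i, block in enumerate(chain):
        payload = {"index": block["index"], "data": block["data"],
                   "prev_hash": block["prev_hash"]}
        if block["hash"] != _digest(payload):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Toy investor register: two share-transfer records
ledger = []
append_block(ledger, {"investor": "A", "fund": "F1", "units": 100})
append_block(ledger, {"investor": "B", "fund": "F1", "units": 50})
```

Here `verify_chain(ledger)` holds for the untouched register, and editing any earlier record breaks verification, which is what makes a shared register of investor records tamper-evident.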
There is the need for support (often critical life support) and costly deliverables to allow an upgraded version to be delivered to the marketplace. Is this suitable for those who want to move into the digital space? With a simplified, enterprise-wide system, it is much easier for the platform provider to react where needed, to service its partners by engaging with their consumers, and to drive efficiencies within the business to keep costs down. At Oracle, we have seen examples of the success our customers now enjoy after simplifying to a single global application.


Analytics

The Effectiveness of Suspicious Activity Reports

Blog By: Julien Mansourian, Strategy and Transformation Executive. The purpose of an Anti-Money Laundering application is primarily to enable the reporting entities (finance-related industries) to detect and disclose any suspicious transactions above a predefined threshold (depending on jurisdiction) by compiling and submitting a SAR (Suspicious Activity Report) to FIUs (Financial Intelligence Units). The report includes information about known or suspected violations of law or suspicious activity observed by financial institutions. The number of SAR filings made to FIUs is considered by regulators to be a key performance indicator of the overall AML regime and its efficiency. As per statistics issued by FinCEN, reporting entities filed more than 3 million SARs in 2017, against only 150,000 in 1996. In Europe, 1.5 million SARs were filed in 2017 across the 28 EU Member States, almost double the number received in 2006. The UK, the Netherlands, Italy, Latvia and Poland are the top five SAR issuers in Europe. Today we are really talking about an explosion of SAR submissions, and many financial organisations are currently questioning the reliability and objectivity of SAR filing. The explosion of SAR filings reflects a combination of increased regulatory requirements, a broader range of institutions subject to BSA/AML regulations, stronger compliance programs at financial institutions and, possibly, weak detection systems generating too many alerts (false negatives and false positives) that are analysed, promoted to cases and disclosed to the regulators. Although SARs have been instrumental in enabling law enforcement to initiate or supplement major money laundering or terrorist financing investigations and other criminal cases, they still represent a significant burden on budgets and earnings for financial institutions.
Filing too many SARs does not mean there is a stronger detection process in place, and it considerably increases the workload of compliance functions as they must deal with a combination of too many alerts, cases and, consequently, SARs. In fact, many financial institutions are still missing the real criminal activity if we look at the number of fines over the last seven years. According to Boston Consulting Group, banks globally paid $321 billion in fines from 2009 to 2016 for an abundance of regulatory failings, from money laundering to market manipulation and terrorist financing. Although bank penalties have decreased from 2015 onwards, the volume of SARs has considerably increased (3 million in 2017 against 1.9 million in 2015). FinCEN and other FIUs worldwide have never disclosed what percentage of SARs result in successful prosecution. It looks like SAR filing is a black box, and nobody wants to talk about the accuracy and consistency of the filed SARs. In some organisations, filing a manual SAR takes more than two hours; this is a real burden for compliance functions and must immediately be automated through digitisation of the template and generalisation of E-Filing via governmental portals. Moreover, with the growing number of cyber-attacks, regulators are considering the introduction of a new set of requirements when it comes to filing SARs, and there will be a Cyber SAR increasing the current workload of the compliance office. In 2016, FinCEN issued advisory notes requesting reporting entities to include cyber-related information in their SAR templates. Additional events should be included in the form:
- Description and magnitude of the event
- Known or suspected time, location, and characteristics or signatures of the event
- Indicators of compromise
- Relevant IP addresses and their timestamps
- Device identifiers
- Methodologies used
- Other information the institution believes is relevant
For more information about the note, please click here.
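To illustrate what digitising the template could look like, here is a minimal sketch that validates a case record and serialises it for e-filing. The field names are purely illustrative and are NOT the actual FinCEN SAR or BSA E-Filing schema; the point is only that generating the filing from the case-management record replaces hours of manual form-filling with a reviewed, automated submission.

```python
import json

# Illustrative field names only -- NOT the actual FinCEN SAR or e-filing schema.
REQUIRED = ("filer", "subject", "activity_type", "amount_usd", "narrative")
CYBER = ("event_description", "ip_addresses", "device_identifiers")

def build_sar(record, cyber_event=None):
    """Validate a case record and serialise it as a JSON e-filing payload.

    Raises ValueError when mandatory fields are missing, so incomplete
    filings are caught before submission rather than rejected by the FIU.
    """
    missing = [f for f in REQUIRED if f not in record]
    if missing:
        raise ValueError(f"incomplete SAR, missing fields: {missing}")
    payload = dict(record)
    if cyber_event is not None:
        # Attach a cyber-related section of the kind the 2016 advisory requests.
        payload["cyber_event"] = {f: cyber_event.get(f) for f in CYBER}
    return json.dumps(payload, sort_keys=True)
```

A real implementation would of course target the FIU's published schema and portal, but the validate-then-serialise shape would be the same.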
Europol has recently published a paper outlining some key findings on the effectiveness of the FIUs in Europe:
- The structure of Financial Intelligence Units (FIUs), their activities, working practices, and methods of recording and analysing information vary considerably across the EU. There is limited harmonisation among EU Member States (MS) beyond the obligation to establish an FIU. This makes any comparison of the implementation and effectiveness of the EU anti-money laundering directives and the effectiveness of suspicious transaction reporting difficult, if not impossible.
- Just 10% of suspicious transaction reports (STRs) are further investigated after collection, a figure that is unchanged since 2006.
- Over 65% of reports are received by just two Member States - the UK and the Netherlands.
- SAR volumes are likely only to increase, in particular as virtual currency providers come into regulatory scope and services using distributed ledger technology (DLT) enter the mainstream.
- Between 0.7-1.28% of annual EU GDP is detected as being involved in suspect financial activity.
- Together, banks and MSBs are the source of the majority of STRs sent to the FIUs. Certain sectors are noted for their low levels of reporting, in particular high-value goods dealers and bureaux de change.
- Reporting on terrorist financing accounted for less than 1% of reports received by FIUs in 2013-14.
- The use of cash is the primary reason triggering reporting entities to report suspicion; however in Luxembourg, where cash issuance is almost double its GDP, the use of cash is not a common reason for reporting.
- The 'symmetrical' exchange of information between FIUs may prevent crucial information contained in STRs reaching authorities tasked with criminal investigations.
- New technology presents challenges to the current anti-money laundering framework.
The increasing digitalisation of financial services results in growing volumes of transactions and extremely large data sets requiring computational analysis to reveal patterns, trends, and associations. The use of analytics is therefore becoming essential for both reporting entities and FIUs to cope with this information and fully exploit its potential. The growing demand for online services and related internet payment systems poses considerable challenges to EU policies concerning money laundering and terrorist financing. The development of borderless virtual environments calls for reflection on how to adapt policies which are meant to be supervised only at the national level, while the underlying business is already transnational and globalised in its own nature: there is an urgent need for a supranational overview. The integration of the FIU.net project at Europol presents an opportunity for greater operational cooperation between FIUs and law enforcement. Conclusion: Financial organisations have put in place the required processes, workforce and systems to prevent, detect and report suspicious activities. They have indeed increased the number of SAR filings over the last decade, pushing their overall cost of compliance through the roof. In parallel, regulators have ramped up enforcement efforts concomitant with the rise in SAR filings and have fined many reporting entities. Banks have a general sentiment of being trapped in an ocean of heavy and frequent regulation, coupled with an increasing cost of compliance and an impressively high cost of ownership caused by complex system implementations. The continuing rise in SARs is an indication of the current workload in AML compliance. The majority of SARs are produced within banking; however, there are emerging trends across the industry, in particular among credit institutions. Over-reporting to mitigate regulatory risk is undermining the SAR regime and is resource-intensive for firms.
Compliance Officers should focus on improving SAR reporting quality and its automation, to eliminate the existing ocean of unproductive SARs. To support the reporting entities in their SAR optimization journey, FIUs should design a global SAR template covering all jurisdictions with a common set of data requirements.


Banking

Fund Servicing in the Millennial Age

Millennials comprise about a third of the global population and are massively impacting the way investments function. Having grown up as digital natives, they are certainly more inclined towards digital products, services, and lifestyles. They believe more in indulgence and direct control over things: no wonder, then, that trends like peer-to-peer lending and crowdfunding find greater resonance with this generation. The new modes of social interaction, comfort with the latest technologies and confident vision towards investments that millennials present have been key drivers of recent innovations in the fund industry. Millennials are also more responsible and visionary investors when compared with their parents. They are twice as active in sustainable investment as the average investor, and they not only invest based on their financial goals and requirements but also try to fit their investments into the larger scheme of things in terms of social, political and environmental implications. Impact investing is one of their favorite ways to manage money matters. The Shift in the Industry: While millennials slide into the driving seat of the financial world, baby boomers are accessing their savings and investments to support their retirement plans. Regulations, on the other hand, are affecting all aspects of the industry, from fee structures to product designs to business models themselves. These developments have been driving a massive digital shift in the fund management industry. This also means the way fund managers deal with customer interactions, loyalty and long-term relationships will change to a completely digitized, less time-consuming and hyper-connected modus operandi. Several new-age investment firms have started reflecting millennial requirements in their products and solutions. Online fund selection, portfolio creation in specific buckets, and maintenance are now nothing new.
But in the future, fund managers will need a truly digital approach to anything that is even distantly related to fund management. According to research by PwC, technology will be one of the key influencers in the fund servicing market, and global investable assets for the asset management industry will increase to more than $100 trillion by 2020. This means that a large volume of fund transactions will rely on new-age technology, innovative and connected systems, and smart front-to-back integrations. A Digital-Heavy Future Awaits Us: According to research by PwC, most fund management companies will have hired a chief digital officer by 2020. While currently most asset managers are not much involved digitally apart from maintaining a website, by 2020 most of them will be engaged digitally through social media, mobile phones and other digital media. Big data will in future play an important role in helping asset managers understand the requirements of their customers and offer products and services on the right channels at the right time. It will also help them price products in a more relevant manner and offer more coherent servicing. In an industry that has been oversupplied, differentiation leveraging technological advancements can make a huge difference. It is no coincidence, then, that the world's largest asset management firms have been shopping for robo-advisor firms. One of the latest trends in this market is the entrance of new players from other industries who are rich in customer data. Since there is a huge gap between consumer requirements and the way the incumbents service them, there is a huge opportunity for new players in this market. The use of self-service portals with great customer experience, and the use of chatbots and automated advisors, are some of the leading examples of how a digital shift is a prerequisite in the industry.
This will not only increase the speed of servicing but also free up precious advisor time for high-priority customer interactions that need personal attention.

Asset Management Systems of Tomorrow

To gain cost efficiency and better revenues in this context, fund servicing firms need to choose their technology provider after a thorough requirements-to-capabilities mapping. Requirements vary with an asset management firm's size, market, and business model, but the asset managers of the next generation will certainly need technology platforms that support both their short-term and long-term business objectives. They will need:
- Segmentation of customers based on social-platform behavior, past records, and future career and personal goals
- Digitally enabled prioritization of clientele by net worth and lifetime value, with resources optimized accordingly
- Channel-agnostic marketing and communication that reaches the segments of focus in the most timely and relevant manner
- An automated fund servicing system that gives fund managers scope to learn from the past and grow fast
- A truly digital customer experience layer and a digitalized back end, ensuring seamless servicing and a great customer relationship lifecycle
Fulfilling these requirements will take not just a scalable solution but also the latest technologies, such as artificial intelligence and machine learning. A global vendor with localized focus and an eye to the future can prove a good partner on this journey to digitization. Is your technology vendor supporting your journey to true digitization? My colleague Shalu Upadhyay and I co-authored this blog. We would love to hear your views. We are reachable at tushar dot chitra at Oracle dot com and shalu dot upadhyay at Oracle dot com
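The prioritization requirement above, ranking clients by lifetime value, can be made concrete with a toy sketch. The formula is a standard simplified customer-lifetime-value model (discounted recurring revenue under a constant retention rate), and the client data is invented; neither comes from this post.

```python
def lifetime_value(annual_revenue, retention_rate, discount_rate=0.10):
    """Textbook simplified CLV: revenue * r / (1 + d - r).
    Assumes constant annual revenue and a constant retention rate."""
    return annual_revenue * retention_rate / (1 + discount_rate - retention_rate)

# Hypothetical clients: (name, annual fee revenue, retention rate).
clients = [
    ("Alpha Pension", 120_000, 0.95),
    ("Beta Family Office", 40_000, 0.80),
    ("Gamma Startup", 15_000, 0.90),
]

# Prioritize servicing resources by estimated lifetime value.
ranked = sorted(clients, key=lambda c: lifetime_value(c[1], c[2]), reverse=True)
print([name for name, _, _ in ranked])
# ['Alpha Pension', 'Beta Family Office', 'Gamma Startup']
```

Note how retention dominates: a client retained at 95% is worth far more per revenue dollar than one retained at 80%, which is why behavioral segmentation feeds directly into prioritization.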



The Next Generation Risk & Finance Transformation

Blog By: Julien Mansourian, Strategy and Transformation Executive. In the 1990s, we talked about Enterprise Resource Planning (ERP): integrated business process management software designed and built to centralize front-end and back-end business functions such as SRM, PLM, SCM, CRM, HCM, and accounting. Companies were eager to implement ERPs to replace or consolidate their distributed, fragmented application landscapes and end-user computing tools, gaining efficiency and effectiveness in managing their operations. It was the trendy thing to do at the time, and automation was already a hot topic for most organizations. SAP, Oracle, Baan, Sun Systems, JDE, and Lawson competed in the market. On top of the ERPs, applications such as Cognos, Comshare FDC, Hyperion, and Cartesis were broadly used to manage planning, reporting, and consolidation, while Crystal Reports, Qlik, MicroStrategy, and Business Objects covered MIS requirements. While implementing ERPs and associated applications, companies also initiated a cycle of process streamlining and lean management: transformations such as F&A shared service center implementations, outsourcing, and offshoring, coupled with Kaizen and Six Sigma, emerged in support of the system implementations. Clearly, from 1990 to 2000, enterprise finance and performance management dominated transformation agendas. In Europe, firms also went through the adoption of the euro and the International Accounting Standards (IAS), which were complex journeys.
From 2001 to 2002, the collapse of large corporations such as Enron, WorldCom, and Arthur Andersen, together with the 9/11 terrorist attacks, pushed governmental bodies to issue a whole slew of regulatory and compliance obligations (the Sarbanes-Oxley Act and the USA PATRIOT Act) to prevent and detect financial crime and to strengthen internal control over the preparation and disclosure of financial statements, protecting investors from fraudulent corporate activity. Amplifying the migraine of C-level executives already juggling a massive number of projects, financial scandals such as Lehman Brothers and Madoff, culminating in the 2008 credit crunch, multiplied market abuse regulations across all continents. The SEC, FINMA, FINRA, FinCEN, and many others revisited their directives and released a large body of new regulation aimed specifically at financial services. Evidently, from 2002 to 2010, corporations enhanced their enterprise risk and compliance functions to keep up with never-ending regulatory requirements rather than to drive continuous improvement. Since 2010, with the considerable increase in digitization and big data, financial services have been all but swimming in regulations. The list is very long: MAD/MAR, FATCA, CRS, Dodd-Frank, BCBS 239, Basel III, CTF, IFRS 9, 16 and 17, Solvency, MiFID, FinRep, CoRep, AnaCredit, and so on, with GDPR and others to follow. Looking back in time and seeing the trends in today's market, the traditional finance transformation of 30 years ago has completely changed its face, evolving from a pure enhancement of the CFO function into a broader journey spanning risk, compliance, and operating model initiatives.
Today, companies struggle because much of the tactical technology in place was never designed to adapt and excel in an era of digital disruption, cloud computing, machine learning, and AI while simultaneously complying with an ocean of regulations. Within many organizations, transformations were planned and executed without consistent strategic visioning or alignment with a target operating model. C-level executives have been under so much pressure to deliver quick stop-gap solutions that they have had no time or bandwidth to formulate a compelling strategic business model amid the uncertainty and volatility of markets and regulations. The major headache for today's C-level executives is to reformulate a long-term strategy and redesign both IT and business architectures in light of data analytics and ongoing changes in technology and regulation. A placebo effect does not last long and is very costly. The best way to start a transformation is to elaborate a compelling, consistent transformation strategy and strategic roadmap before embarking the organization on a long journey. Today's "finance transformation" is a convoluted term bundling many objectives that can become misleading and difficult to understand. The next generation risk and finance transformation is about delivering growth and managing complexity while sustaining a vision, complying with regulations, and extracting the most valuable insights from digital assets. Finance transformation is not only a CFO agenda; it requires heavy, continuous sponsorship from CIOs, CSOs, CDOs, CCOs, and CROs. Together they must streamline their thinking and achieve some key imperatives:
- Aligning strategy at all levels to ensure best-in-class execution of the transformation
- Maintaining and updating the enterprise operating models (ITOM and BOM)
- Identifying transformation KPIs to monitor business performance
- Producing a compelling digital transformation strategy and execution plan
- Developing advanced business capabilities across all functions
- Investing in innovation: hiring a Chief Innovation Officer and launching a team charged with constantly monitoring emerging technologies to keep business strategies and operating models current
- Building a single source of truth via data lakes, coupled with a robust enterprise data governance model
Going forward, organizations must put in place flexible, scalable technology platforms while complying with regulations and managing their petabytes of data. Deploying technology merely to tick a compliance box is not a sustainable strategy, and reworking such solutions can be drastically costly and risky in today's business context. CFOs are under the gun, facing the difficult challenge of managing the exponential growth of regulatory controls while staying focused on business performance. Digital and innovation journeys are heavily data-centric, and a single, centralized data lake playing a "data as a service" role is the best way to capture, integrate, easily access, and analyze high-quality data before serving the risk, finance, and compliance functions.



Robust AML Program vs. Penalties: AML & Sanction Fines in 2017

Blog By: Garima Chaudhary, Oracle Financial Services Financial Crime and Compliance Management Specialist. In recent years, law enforcement agencies have shifted their emphasis toward tracking and prosecuting money laundering that occurs through financial institutions, rather than focusing only on the underlying criminal violations, such as fraud, drug crimes, tax evasion, or corruption. Federal prosecutors are looking instead at weaknesses in the internal procedures financial institutions employ to prevent laundering. This makes regulatory compliance the leading driver of anti-money laundering (AML) initiatives, with compliance costs running to nearly $16 million annually for financial institutions with more than $100 billion in assets. AML compliance costs are anticipated to rise as the regulatory landscape continues to toughen. In 2017, several guidelines were added across the globe, such as the EU's 4th Money Laundering Directive, the Cayman Islands AML Regulations 2017, and MAS Notice 637 (Amendment) 2017. Additionally, regulators continued to make their point by penalizing financial institutions, through consent orders and regulatory fines, for the lack of rigorous AML programs. In 2017 alone there were 25 AML fines, totaling more than $2 billion, imposed by 14 regulatory bodies around the world. This blog post summarizes the AML fines of 2017.

Enforcer Countries

In 2017, Australia had the largest AML fines, followed by the United States and the United Kingdom. A single enforcement action, the Commonwealth Bank fine, made up the entire $1 billion in Australia.
Country            Sum of Amount     Count
Australia          $1,000,000,000    1
United States      $763,385,000      10
United Kingdom     $163,000,000      1
France             $96,800,000       2
India              $20,000,000       1
Ireland            $7,868,915        4
Singapore          $2,496,630        3
Bermuda            $1,500,000        1
Italy              $640,000          1
Taiwan             $33,200           1
Grand Total        $2,055,723,745    25

In Ireland, the Central Bank of Ireland imposed penalties on four occasions: Bank of Ireland, Intesa Sanpaolo Life dac, Allied Irish Banks, and Drimnagh Credit Union Limited.

Regulatory Agencies in Charge

The New York State Department of Financial Services (NYDFS) imposed the second-highest total penalties of any regulatory body in the world, confirming its reputation as one of the toughest. One of the NYDFS's most notable civil penalties in 2017 was against Habib Bank Limited and its New York branch, for nearly $630 million, based on allegations of persistent Bank Secrecy Act, AML, and sanctions compliance failures; Pakistan's Habib Bank ultimately agreed to pay $225 million to settle.

Regulatory Body                                    Sum of Amount
AUSTRAC                                            $1,000,000,000
New York State Department of Financial Services    $471,000,000
FinCEN                                             $163,000,000
Latvian Regulators                                 $91,000,000
TARP                                               $35,000,000
Reserve Bank of India                              $20,000,000
Central Bank of Ireland                            $7,868,915
ACPR                                               $5,800,000
Monetary Authority of Singapore                    $2,496,630
Bermuda Monetary Authority                         $1,500,000
CONSOB                                             $640,000
FINRA                                              $135,000
Financial Supervisory Commission                   $33,200
Grand Total                                        $2,055,723,745

Financial Institutions in the Spotlight

The largest fine went to Commonwealth Bank, followed by Deutsche Bank, which was penalized twice in 2017. On one occasion, Deutsche Bank AG agreed to pay $41 million to settle allegations from the Federal Reserve that its U.S. operations failed to maintain adequate protections against money laundering, the latest in a string of fines that have cost the German lender billions of dollars.
On another occasion, regulators found that Deutsche Bank had failed to stop a stock trading scheme that enabled some of its clients in Russia to improperly move huge sums of money out of the country and into offshore accounts. In 2017, the virtual currency industry continued to expand. BTC-e, an internet-based currency exchange and one of the largest virtual currency exchanges by volume in the world, was penalized $110 million for facilitating transactions involving ransomware, computer hacking, identity theft, tax refund fraud schemes, public corruption, and drug trafficking.

Financial Institution Size and Severity of the Violation

Most large institutions were penalized for lapses across the high, medium, and low severity categories. Very small institutions (less than $10 billion in assets) were heavily penalized mainly for high severity failures; two of them, Lone Star National Bank and Merchants Bank of California, were fined $2 million and $7 million respectively.

Largest Individual AML Fine to Date

The MoneyGram penalty of $250,000 was one of the largest fines ever imposed by FinCEN on an individual. Thomas E. Haider, former Chief Compliance Officer of MoneyGram International, Inc., reached a settlement with FinCEN and was held personally responsible for his company's anti-money laundering failures. Haider agreed to pay a $250,000 penalty and to be barred from working as a compliance officer for any money transmitter for three years.

Conclusion

In 2018, the AML compliance landscape is expected to be more stringent than ever before. Key trends for the coming year include:
- Actions against the individuals responsible for an institution's AML compliance
- A growing role for tax evasion, following the Panama and Paradise Papers, the biggest tax evasion data leaks in history
- A continued central role for sanctions, especially in the U.S.
government's response to geopolitical events
The message from regulatory agencies around the globe is loud and clear: increase AML compliance or face the prospect of civil and/or criminal enforcement. In light of these actions, and the increased scrutiny from other regulatory bodies, it is time for institutions to re-examine the quality of their programs and ensure, among other things, not only that their AML and compliance systems meet the highest standards, but also that their key personnel understand the importance of AML compliance.
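As a quick sanity check on the country table above, the reported figures can be totalled programmatically. A minimal sketch, using exactly the amounts and counts quoted in this post:

```python
# AML fines by country in 2017, as reported above: (USD amount, count).
fines_by_country = {
    "Australia":      (1_000_000_000, 1),
    "United States":  (763_385_000, 10),
    "United Kingdom": (163_000_000, 1),
    "France":         (96_800_000, 2),
    "India":          (20_000_000, 1),
    "Ireland":        (7_868_915, 4),
    "Singapore":      (2_496_630, 3),
    "Bermuda":        (1_500_000, 1),
    "Italy":          (640_000, 1),
    "Taiwan":         (33_200, 1),
}

# Re-derive the grand totals from the per-country rows.
total_amount = sum(amount for amount, _ in fines_by_country.values())
total_count = sum(count for _, count in fines_by_country.values())
print(f"${total_amount:,} across {total_count} fines")
# $2,055,723,745 across 25 fines
```

The per-country rows do reproduce the stated grand total of $2,055,723,745 across 25 enforcement actions.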



The Time to Explore Machine Learning is Now

Over the years, we have become increasingly dependent on technological advances. Almost every aspect of our lives, the way we communicate, the way we shop, and the way we bank, is shaped by them. Innovation is constantly underway to raise standards and build consumer-friendly products: consider how the mobile phone industry has evolved over the last 10 years, making communication simpler through products like WhatsApp, real-time video calls, and games. Similarly, artificial intelligence (AI) is a technological advance now making its mark on the banking industry. Many organizations are already exploring ways to stay ahead of the competition and strengthen their standing as innovative enterprises. If banks are to remain competitive, they need to take into account the consumer lifestyle and behavioral changes defining today's value propositions, and design around artificial intelligence to deliver relevant services. According to Efma, "AI presents a huge number of opportunities for the banking industry, who, when able to exploit their growing data repositories, can better meet regulations, increase their bottom line, improve customer experience and more." In broad terms, artificial intelligence is the ability of a machine to act like a human being: gathering facts about a situation through sensors, comparing them with stored data, and making decisions based on what they signify. Voice and visual recognition, followed by machine learning, are the most commonly used AI solutions across industries. Movie recommendations by Netflix, product recommendations by e-commerce retailers like Amazon and Alibaba, and Facebook's ability to spot our friends' faces are all early examples of machine learning, and Google's self-driving car is becoming a classic case study.
A self-driving car is not programmed to drive; it learns by driving millions of miles on its own and observing how people drive. Machine learning is especially useful with large, dynamic data sets, such as those that track consumer behavior. When behaviors change, it can detect subtle shifts in the underlying data and revise its algorithms accordingly. Machine learning can even identify data variances and treat them as directed, considerably improving predictability. These capabilities make it relevant to a broad range of banking applications. Banks can use machine learning across the front, middle, and back office, in functions ranging from customer service to sales and marketing, fraud detection, and securities settlement. In the middle office, they can identify or even prevent fraud by deploying machine learning to examine patterns in payment transaction data, such as credit card transactions, and spot anomalies or inconsistencies. There are many other areas of the banking ecosystem where machine learning can add substantial value:
- Creating new sales opportunities: machine learning opens doors for cross-selling and up-selling by developing deeper insights from customers' needs and usage patterns. Organizations using it can increase existing-customer revenue by 10-15%.
- Not letting customers go: because machine learning monitors customer behavior deeply, it can forecast the risk of losing a customer and enable banks to act quickly to retain them. Media sentiment, demographics, and site behavior play a very important role in such predictions. Organizations that use machine learning can reduce customer attrition by around 25%.
- Automating customer service: cognitive machine learning helps organizations automate their customer service centers, lowering servicing costs, enhancing system performance, improving customer experience, enabling faster responses, and reducing risk.
Today there are multiple virtual assistants that use deep learning and natural language processing to understand and interact the way humans do.
- Reducing bad debt: machine learning can build dynamic models that segment delinquent borrowers and identify self-cure customers, enabling better collection practices. Organizations using machine learning have already reduced their bad debts by 35-40%.
- Reducing price leakage: machine learning can help eliminate price leakage and billing errors by applying advanced analytics. Improved customer segmentation and appropriate pricing models can help organizations achieve 10-15% more revenue.
Technology is transforming the way consumers behave, and as seen above this is most evident in the banking industry. Machine learning can revolutionize payment operations by drawing insights from large datasets, surfacing patterns and correlations, and informing decisions in real time. Reduced operational cost, improved compliance, and better productivity, which in turn lead to higher revenue, are a few of the benefits banks can derive from machine learning. Despite these benefits, organizations do not yet exhibit complete trust in machine learning. It is worth noting that 60 years ago, when calculators were introduced, they were not trusted either. And when organizations do take up machine learning, they sometimes expect instant ROI. This is not realistic: you need the right talent and the right tools to develop the right approach and harness the potential of machine learning. It is evident that machine learning is here to stay and is impacting a large number of industries, with banking an early adopter. This trend is expected to propagate exponentially in the future. However, it is important for organizations to establish a clear vision and strategy, and embrace this trend, if they are to be winners over the next ten years. My colleague Parul Jain and I co-authored this blog.
We would love to hear your views.  We are reachable at tushar dot chitra at Oracle dot com and parul dot jain at Oracle dot com
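The fraud-detection idea described above, scanning payment data for anomalies, can be illustrated with a deliberately simple sketch: a z-score outlier check on invented card transaction amounts. This is a toy stand-in, not a production fraud model; real systems use far richer features and learned models.

```python
import statistics

def flag_anomalies(amounts, threshold=2.5):
    """Return indices of transactions whose amount lies more than
    `threshold` population standard deviations from the mean.
    (With only n points a z-score cannot exceed sqrt(n - 1), so the
    threshold is kept modest for this tiny sample.)"""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    if stdev == 0:  # all amounts identical: nothing stands out
        return []
    return [i for i, a in enumerate(amounts) if abs(a - mean) / stdev > threshold]

# Hypothetical card transactions: mostly small purchases, one outlier.
txns = [23.5, 41.0, 18.2, 36.9, 29.4, 5000.0, 33.1, 27.8]
print(flag_anomalies(txns))  # [5]
```

A flagged index would feed a review queue rather than an automatic block; the point of machine learning here is to refine what counts as "anomalous" as spending behavior shifts.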



Exerting a Gravitational Pull - Singapore Fintech Festival 2017

By: Mark Smedley, VP FS Industry Solutions, Oracle, and Sharon Tan, Strategic Marketing Lead, Oracle Financial Services. The recent Singapore Fintech Festival 2017 revealed the power of gravitational pull, where the strength of the force increases exponentially as objects move closer together. The festival successfully pulled market resources into close proximity, mobilizing the community: financial institutions and start-ups alike, with a mandate to solve a series of 100 banking problem statements. Here is our take on what is generating the gravitational pull: four value propositions and takeaways dominating the FinTech and innovation landscape.

Consumers in Orbit: Omnipresent Service Triggered by Open APIs

Gravity is a force of attraction that pulls objects into its orbit. We have observed the model shifting from push to pull as banks look to a strong API strategy to avoid being ousted from the consumer ecosystem, or orbit. The 'digital switch' to omnipresent service experiences has been triggered by the leap forward in Application Programming Interfaces (APIs), open banking, and the rise of FinTechs. As part of the broader Smart Nation agenda, Singapore is taking the lead in the Fintech 2.0 journey and moving at full speed toward an open API economy. Banks are building strong API strategies to integrate and deliver services and data to the digital economy on the right devices and user interfaces, enabling them to retain or regain relevance with their customers. Local banks are taking open architecture seriously, with the likes of DBS Bank launching an API developer platform with more than 170 APIs and over 50 successful collaborations. At Oracle, we have built our Open Banking Platform with 600+ Open APIs from Oracle, in addition to a growing list of FinTech and FSI APIs, which Oracle ecosystem participants can use to collaborate and deliver innovation at scale.
Force Generators & Multipliers: FinTechs & TechFins

Non-banks are leading as force generators and multipliers. Besides FinTech start-ups, TechFin giants like Tencent, Ant Financial, and GAFA (Google, Apple, Facebook, Amazon) are all vying to provide payments, lending, and other financial services. FinTechs at the festival, such as CreditEase, Flywire, Blocko, and Trunomi, are actively filling the gaps in underserved market segments, adopting the latest in blockchain, data analytics, artificial intelligence, and cognitive learning to solve front-end customer problems. We also see new business models emerging around the needs of millennials, overseas workers, small businesses, and others. Our Oracle team had the privilege of mentoring two of the Top 20 finalists of the Global FinTech Hackcelerator: ERNIT, the world's first smart piggy bank, and SnapCheck, with its blockchain-enabled digital checking platform. Both leveraged our Open Banking and API Platform to build and present market-ready solutions for the financial industry. In the new age of collaboration, banks increasingly realize that to succeed they must adopt the habits and culture of digitally native companies: opening their APIs to the FinTech community, pursuing agile development, and hosting hackathons to foster intensive digital collaboration.

The Force Awakens: Artificial Intelligence, Machine Learning & Data Analytics

On the back of MAS launching a S$27 million Artificial Intelligence & Data Analytics Grant (part of the S$225 million Financial Sector Technology & Innovation Scheme), the industry is turning its focus to follow the 'smart' money. Real-world use cases for solving demonstrable problems with artificial intelligence and machine learning were fleshed out among the banks, FinTechs, and technology leaders present.
This represents an inevitable progression as FinTech capabilities mature and traditional banks and financial services companies seek to commercialize the best innovations available. In our view, using AI to properly harvest data in direct support of business strategies creates a data-driven culture that can readily operationalize customer data. The key is to bring digital and data together to build customer experiences and sustained digital interactions, yielding customer insights that foster superior acquisition and loyalty.

Exerting Influence: Collaboration Is the Key

Gravity exerts influence even at a distance. Collaboration will be the gravitational force driving partnerships that solve key problems such as financial inclusion, RegTech, and cyberfraud. We see headway in industry-led initiatives around cross-border payments, such as Project Ubin, a move to share blockchain source code to spur innovation, and the linking of PayNow in Singapore with PromptPay in Thailand. Another great initiative that will deepen financial inclusion and digital financial services is the launch of an industry sandbox, the ASEAN Financial Innovation Network (AFIN): a platform for ASEAN banks, microfinance institutions, non-banks, and FinTechs to partner. The benefits of collaborative influence will come when banks start to create the right affiliations, cross-sell opportunities, sub-brands, or alliances to knit together an ecosystem that captures revenue and valuable data. We are in a perfect storm of innovation and digital-driven transformation in financial services. The festival was a tremendous opportunity for all participants to share leading ideas, and for many players to unveil some of their latest, most exciting projects to capture the imagination of the industry.
As we wrapped up a very busy week at the Singapore Fintech Festival, we have never been more excited about Oracle's direction and potential to support the commercialization of technologies that help financial institutions achieve their digital strategies, and help leading FinTechs gain relevance. Follow us on Twitter @OracleFS and LinkedIn to find out more. For more information, please visit: https://cloud.oracle.com/financial-services https://www.oracle.com/pressrelease/oracle-open-api-banking-solution-101617.html https://www.oracle.com/sg/startup/index.html



Is Blockchain a disruptor or enabler for Transfer Agency?

Over the past several years, those of us in Financial Services have been awash with talk of the next big technology that will change the face of the industry, and Transfer Agency has been no different. In 2017 it is the turn of distributed consensus ledger technology (also known as 'Blockchain') to set pulses racing and generate reams of papers from various parties. Whatever happened to Big Data, the theme of 2016? The principles behind Blockchain are decades old, but with the increase in computing power and the technology's use behind Bitcoin, the industry is now awash with articles and Proofs of Concept purporting to show its benefits. There is no doubt that the Financial Services industry suffers from many inefficiencies, and some of these can be addressed by Blockchain. Transfer Agency can be seen as a line of business that would greatly benefit from Blockchain's disruptive capabilities. This is particularly salient as Transfer Agents and Asset Managers still pay lip service to the idea of moving forward with best-of-breed technology while remaining inhibited by legacy systems. The impact and cost of legacy should not be underestimated: the advent of FinTechs and their nimbleness means existing players must decide whether to adapt and compete on their core service offering or wither and die. Transfer Agents need to determine whether their existing environment can support growth and provide the flexibility for the changes that are coming. The opportunity to seize the moment and offer a truly value-added service to Asset Managers and Investors is now. This can only happen, though, for those Transfer Agents who are not weighed down by legacy, who take a holistic view of their approach, and whose technology is modern, flexible, and reliable. With a modern platform at a Transfer Agency's core, innovation is possible, ensuring that the modern Transfer Agent stays relevant to the market.
Accepting legacy is not an option, as the next generation will wonder why the industry exists if the stock answer is "we always did it that way!" Does this latest, and possibly greatest, fad in Blockchain now provide the impetus to change our approach to TA and the platform that underpins it? Or will Transfer Agents continue down the well-worn tactical path and bolt another 'patch' onto an already complex and overstretched ecosystem? So, what are the pain points that Blockchain can truly eliminate? Will it address challenges in a single process, such as reconciliations, or can it be disruptive enough to negate an entire function such as Transfer Agency? Further, will the disruptive use of this technology come from within the industry or from a new source, namely FinTech? Can we in Transfer Agency, as an industry, really disrupt ourselves, or will our own vested, legacy interests win out? There is no doubt that a Blockchain can and will have an impact on the industry and the potential number of players within it. Discussions continue around some of the main challenges with Blockchain, since everyone will need to agree on a standard or common protocol. Consensus is fundamental to the technology underpinning Blockchain. Given the industry's traditional inability to agree locally, regionally, and globally on the same protocols, I expect several variants of Blockchain to come into existence over the next few years. It is not likely that there will be a single global standard Blockchain that negates the need for Transfer Agency, as some are suggesting. So how does Transfer Agency look forward and get ready for Blockchain disruption?
This is where the likes of the Hyperledger consortium can help, thanks to its approach to Blockchain using open source collaboration, modular architecture, horizontal cross-industry technology support, and support for enterprise needs (see Oracle joins Hyperledger). Many firms are engaged in Proof of Concept tests to understand which processes Blockchain can improve. These Proofs of Concept have shown benefits under small-scale lab conditions, where environments are controlled. One example of a successful Blockchain use case in a lab environment utilising Hyperledger is the ability to send and receive instructions on the Oracle TA platform via Hyperledger, thereby managing the single source record. Questions are arising, though, as to whether these tests can scale to real-life scenarios. The data volumes and processing power needed to provide a reliable, robust solution are not currently within reach, but they may not be so far off that the industry can ignore them. (It is also worth noting that there have been stability issues within Bitcoin, with outages of up to several hours; how would this impact Asset Servicing in a real-world scenario?) As noted at the start of this blog, the Asset Servicing arena has been inundated with articles, papers, and announcements of Blockchain breakthroughs, but like the wider Financial Services industry we have yet to see consensus on the way forward. Indeed, we have seen banks join and leave various Blockchain consortia over the past couple of years as they individually seek to identify the best way forward. Beyond the consensus issue, the industry must also decide which processes to apply Blockchain to. Numerous commentators have highlighted the benefits that Blockchain will bring, and two processes often seen as ripe for disruption are Settlements and KYC / Client On-Boarding.
The ability for an Investor to open an account in a fraction of the time it currently takes has to be a welcome development: it improves the experience of engaging with a Financial Services provider, let alone reduces the costs involved for the Transfer Agent, which for account opening can be significant.

This brings us back to Transfer Agency, as being a Transfer Agent is seen as a necessary but unloved part of the Mutual Fund cost chain.  Asset Servicers and Asset Managers get little value out of the function and at best see it as a cost centre that must be endured.  TAs do provide a number of key procedures; however, their true value is not recognised in the overall package that is offered. The cost of legacy systems within Transfer Agency means that the ability to respond to the latest developments in Financial Services is exceptionally slow, reinforcing the view of an industry that does not add value.  This can have its benefits (fads can pass by without any effort or effect), but when a true disruptor arrives Transfer Agency will not recognise it, and so TA itself will be passed by.

To remain relevant, Transfer Agents should look at themselves and determine whether they have a best-of-breed system based on an open architecture.  With this bedrock they are in a position to assess and react to the current dynamics in technology.  Does the latest development mean the Transfer Agent potentially loses its relevance (Blockchain), or can it ride the storm because legacy gives it a buffer (Big Data)? At this point in time we know Transfer Agency is inefficient, with previous technological changes not fully embraced to make a difference.  As Blockchain looms large, is now the time for Transfer Agency to recognise the disruptor and look within itself to see whether TA can survive on its legacy environment?  Either way, Blockchain will impact us all; the how and when will only be confirmed by history.


Banking

Harnessing Oracle Digital Experience Solution for Customer Delight in Lending and Leasing

Digital originations are evolving

According to research*, the average number of connected devices owned by a typical digital consumer is 3.64. Digital consumption of products and services is clearly here to stay, and it is only going to get more sophisticated in future. As automation becomes imminent across lending products, a digital lending system becomes an obvious choice for the aspiring leaders in the industry. Since the origination process is the first touch point with the customer, lending institutions need to evolve out of their silos into an integrated system that offers a personalized origination experience, taking into account the customer relationship as a whole rather than one product or service at a time. This means an intuitive process with data-based underwriting and centralized documentation of customer details. All of these features also need to be reflected in a truly friendly user interface. Here are the key areas which, if digitized, can change the lending and leasing businesses for good:

Social integration: 64% of consumers aged between 18 and 34 claim to use social login because they dislike spending time filling out registration forms. Consumers of this age clearly prefer integration with their social sites, which helps them save time and complete mundane tasks conveniently. A truly digital origination system needs to recognize this need and integrate the popular social media options into its interaction layer.

Customer data centralization: Digitally savvy lending consumers expect their financial institutions to know them well. This means that re-entering information already shared is a big source of frustration. Lenders need to let data flow freely between departments and across stakeholders such as dealers and brokers so that consumers are not hassled.

Workflow automation: In order to achieve maximum productivity, lending systems need to offer the flexibility of workflow customization.
They should be able to use a default origination workflow or customize the workflow in line with the lending institution’s existing processes.

Localization: Political, social and economic changes in each geography have a ripple effect on the way people consume financial products. From regulatory requirements to consumer rights protection acts, adherence to local standards makes up a substantial part of business operations. Digital channels and delivery modes need to be designed in accordance with local requirements.

Integration across entities: Lending institutions work in tandem with consumers, dealers, brokers and other third-party agencies. In order to offer a great customer experience, the flow of information needs to be seamless across all these entities.

The Oracle Advantage

Oracle Financial Services Lending and Leasing integration with Oracle Banking Digital Experience makes it easy for lending and leasing institutions to connect customers to their lending applications in a truly digital manner. The option to log in via social media accounts such as Facebook and LinkedIn provides speed and flexibility to end consumers and simplifies the login process. The digital experience layer also offers ways to fetch basic customer information from previous records of existing users to prepopulate the loan application. The Payday Loans application tracker makes it easy to keep track of uploaded documents and proofs, as well as the application status. Applicants can fill in and submit their application at their convenience with the ‘Save Application’ feature; all necessary information beyond the prepopulated fields is collected at the applicant’s convenience. Once origination workflows have been finalized, they can be applied to products depending on which workflow best suits each product. Through this integration, financial institution administrators get the choice to customize the origination workflow to suit their unique requirements.
This can further be customized based upon country, nature of product and other specific information needed.

The pre-integration of Oracle Banking Digital Experience with Oracle Financial Services Lending and Leasing provides a futuristic approach to lending and leasing. It not only bestows a digital skin on the origination process but also makes the entire process less cumbersome and more consumer-friendly.

My colleagues Shalu Upadhyay, Unmesh Pai and I co-authored this blog. We would love to hear your views. We are reachable at tushar dot chitra at oracle dot com, shalu dot upadhyay at oracle dot com and unmesh dot pai at oracle dot com.

Sources: *Globalwebindex


Analytics

Relationship Pricing - The Key to Valuing Your Customers?

Blog By: Akshaya Kapoor, Senior Director, Product Strategy, Oracle Financial Services

This is the third and final blog in a series relating to a joint study between Oracle and Efma, which led to the production of a White Paper: ‘Making Innovation Pay’. An important conclusion of the study was that a move towards revenue management is essential - and that as part of this, relationship pricing is one of the key elements in achieving sustainable, long-term profitability and revenue assurance from innovation projects. The report comments: “For many banks, profit margins are still below pre-crisis levels. The solution lies in the development of innovative, personalised offers combined with other revenue-enhancing initiatives such as relationship pricing.”

Pricing is one of the ‘Four Ps’ that Oracle believes are essential for building a digital enterprise (and which equally apply to successful innovation) - Product, Price, People and Place. As we discussed in the last blog, it’s all about offering the right product, in the right place, at the right price and at the right time. McKinsey says: “Successful banks can adopt a sophisticated pricing strategy that can add 6 to 15 percent to the bank’s revenue, deepen relationships with valuable corporate clients, and encourage performance improvements throughout the organization.”

Adopting an effective pricing strategy

Traditionally, banks have opted for a product-based pricing approach. Pricing strategy should cater to everyone from one-time buyers, with a “click-buy-bye” offer, to regular buyers, with dynamic pricing that helps and incentivizes customers through a “click-buy-buy” offer. In recent years, banks have been gradually moving towards a more relationship-based pricing approach, along with subscription models, discounts and product bundles - but progress in this area is still slow.
Another recent White Paper by Oracle and Efma, ‘Responding to change – how are banks using information and pricing strategies to boost profitability?’, suggests that only 36% of the banks surveyed have adopted true relationship pricing. The paper goes on to say that a poor pricing strategy will ultimately lead to lower profits and that, ideally, relationship-based pricing should go hand-in-hand with innovation. This was confirmed by the current study, which observes that a relationship-based pricing strategy can make a real difference to a bank’s ability to respond to change.

Relationship pricing involves offering customers more favourable prices on a particular service because of the customer’s purchase of other solutions from the bank. For example, a customer with a bank account might be given a more favourable mortgage rate. Setting these more personalised prices can be made easier by a further strategy – the aggregation of accounts. This enables the bank to see the total customer relationship, which in turn enables it to offer the customer discounted prices as a reward for their loyalty in different areas.

Applying a relationship-based pricing strategy

One of the difficulties of adopting a relationship-based approach to pricing is that it can be difficult to assess the true value of the relationship with the customer. The key to effective relationship pricing is that it should be applied holistically – and this in turn means that banks need a better understanding of their customers. To understand a customer fully, a bank needs to be able to look at different aspects of customer behaviour so that it can recognise, anticipate and influence that behaviour through personalised pricing. Banks have a huge amount of customer information – but they need to choose and use the right data so that they can run analytical models that will enable them to develop the right pricing strategies.
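To make the mechanics concrete, here is a minimal sketch of relationship pricing built on account aggregation: the bank sums a customer's balances across products and rewards both the total relationship value and the breadth of products held with a rate discount. The tier thresholds and basis-point discounts are invented for illustration; they are not Oracle's or any bank's actual pricing rules.

```python
# Hypothetical relationship-pricing sketch: tiers and discounts are
# illustrative assumptions, not real pricing rules.

def relationship_discount_bps(accounts):
    """Aggregate a customer's balances across products and map the
    total relationship value to a discount in basis points."""
    total = sum(acct["balance"] for acct in accounts)
    products = {acct["product"] for acct in accounts}
    # Base discount grows with the aggregated relationship value...
    if total >= 500_000:
        discount = 40
    elif total >= 100_000:
        discount = 25
    elif total >= 25_000:
        discount = 10
    else:
        discount = 0
    # ...and deepens when the customer holds multiple product lines.
    if len(products) >= 3:
        discount += 15
    return discount

def priced_mortgage_rate(base_rate_pct, accounts):
    """Apply the relationship discount to a standard mortgage rate."""
    return base_rate_pct - relationship_discount_bps(accounts) / 100

accounts = [
    {"product": "current", "balance": 30_000},
    {"product": "savings", "balance": 90_000},
    {"product": "brokerage", "balance": 15_000},
]
rate = priced_mortgage_rate(5.00, accounts)  # 25 bps tier + 15 bps multi-product
```

The point of the aggregation step is that none of the three accounts alone would reach the 100,000 tier; only the total relationship view unlocks the better rate.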
Finally, they also need to use dynamic pricing and agile strategies that are aligned to critical customer events and milestones (such as buying a house or car, or getting married). Unfortunately, most banks that are moving towards a relationship-based pricing approach only apply this strategy to specific products or occasions. This type of piecemeal approach is unlikely to have a lasting effect. One example quoted in the report is that of a person who is both a retail customer and a small business owner. If the bank is able to view the total relationship, it can start to set prices that reflect the true value of the customer.

The report sums this up by saying: “One reason why some of the innovative programmes don’t pay off is because they are designed without paying attention to relationship pricing. The programme is designed for the experience but not for the pricing. So it might involve a good idea but it doesn’t make money because the bank hasn’t identified how to price it effectively.”

The White Paper concludes that pricing represents a missed opportunity for banks – but there is still time to grasp this opportunity and develop a relationship-based pricing strategy that is also aligned with their innovation strategy. In this way, they will be able to make good progress on the arduous but worthwhile road to sustainable profitability. To see the full report on Making Innovation Pay, please visit


Analytics

Revenue Management – The Secret Behind Successful Innovation

Blog By: Akshaya Kapoor, Senior Director, Product Strategy, Oracle Financial Services In a previous blog, (‘Using innovation to achieve sustainable profitability’), we looked at a recent joint study between Oracle and Efma, resulting in a White Paper entitled ‘Making Innovation Pay’. One of the key recommendations that emerged from the study was the need for banks to take the topic of revenue management far more seriously. This is an area that is often overlooked, but can be a critical element in achieving sustainable revenue assurance and profitability from innovation programmes. As mentioned in the previous blog, banks recognise the importance of innovation and yet fail to take real advantage of new ideas and technologies and therefore don’t achieve sustainable profitability from their innovation programmes. As McKinsey says “Relationship managers could take up to two days to perform pricing calculations that could be completed in just 20 minutes with technological support”. An effective revenue management strategy can make a significant difference in this area. Revenue management is a topic that is taken very seriously in most other industries – indeed, many have whole departments and senior executives that focus specifically on this subject. This doesn’t appear to be the case in the financial services sector – and could be one reason why banks are lagging behind other industries in terms of making innovation pay. How Revenue Management Works So, what are the key principles behind revenue management? It’s a discipline that enables a company to see its revenue flow as an end-to-end process, with the aim of achieving the optimum customer lifetime value. It involves the application of analytics and operational and systematic controls. It aims to understand the behaviour of customers and their perception of value – leading to product prices that are more closely aligned with individual customer segments. 
Ultimately, effective revenue management will ensure that the right product is sold to the right customer at the right time, via the right channel, and for the right price. In terms of innovation projects, it also means that banks need to have a clear idea of the expected impact of a new project before they embark upon it. Oracle believes that “organisations that have a centralised discipline of revenue management with their innovation programmes typically enjoy a much higher and more sustainable pay-off.” In other words, revenue management is an essential element of being able to make innovation pay. One of the key aspects of revenue management is relationship pricing, a topic that will be covered by the next blog in this series.

Developing a revenue management strategy

So, how can banks start to develop more effective revenue management strategies? The White Paper identifies seven vital factors:

- The need for banks to stop talking about customer centricity and to start putting it into practice.
- The need to improve the customer experience by starting to look at things from the customer’s viewpoint and by understanding their needs.
- The use of new techniques, such as behavioural economics and gamification, to delve deeper into how and why people interact and engage with the bank.
- The use of customer-based relationship pricing capabilities that are both adaptable and scalable.
- The use of relationship-based, personalised offers for driving cross-selling.
- The active involvement of managers and employees in optimising both top- and bottom-line metrics.
- Embedding, testing, learning and developing the system.

This last point is particularly important – banks need to experiment and learn from those experiments if they’re going to grow. There is plenty of opportunity for innovation – in developing the right products for the right customers.
As mentioned earlier, one of the key areas of innovation that can make a real difference is pricing – and this aspect will be covered by the final blog in this series. Meanwhile, Oracle has been busy developing a range of solutions that it hopes will play an important role in the future of more effective revenue management. To see the full report on Making Innovation Pay, please visit 


Banking

Using Innovation To Achieve Sustainable Profitability

Blog By: Akshaya Kapoor, Senior Director, Product Strategy, Oracle Financial Services

It’s no secret that profit margins for most banks are still at or below the levels they had reached before the financial crisis hit some ten years ago. At the same time, banks recognise the importance that innovation will play in a rapidly changing world. This will require banks to think differently; it will help them introduce the cultural change needed to be effective players in the new banking ecosystem, and to turn these challenges into additional streams of revenue. However, they are failing to use innovation effectively in a way that will achieve lasting revenue assurance and profitability. When it comes to innovation, banks need to build an appetite for “Return-on-Failure” (ROF), rather than unilaterally measuring each innovation idea against “Return-on-Investment” (ROI).

As a result, Oracle has conducted a study in conjunction with Efma, the independent non-profit association that provides information and networking resources for financial institutions. The study involved online discussions between members of a Steering Committee of senior financial executives, chaired by Oracle; a few in-depth interviews with some of these members; and a detailed questionnaire sent out to select Efma members. The results of the study are available in a new joint report: ‘Making Innovation Pay’. As one of our esteemed Efma Steering Committee members said, “Banks know what to do, but don’t know how to do”.

Are banks taking innovation seriously?

This was one of the key questions that the study sought to answer. Firstly, why are banks failing to use innovation to boost their profit margins in a sustainable way? The results of the survey underlined the fact that they feel innovation is important: 90% of those taking part said that they were likely to increase their investment in this area in the future.
However, other results suggest that this might sometimes be a half-hearted approach, as only half of the banks surveyed currently have dedicated Innovation teams. The financial services sector often lags behind other industries in terms of a real commitment to innovation. Reasons for this include the historical emphasis on product silos; complex legacy systems; a lack of integration between channels; and dispersed and encoded business rules. One particularly important aspect is the high level of risk aversion in many banks. The study found that over 80% of banks are either moderately or very risk-averse. Despite this, all of the banks that took part in the survey said that they would probably consider unconventional ideas and over half would follow these through in some way. Current innovation programmes are focused mainly on issues such as increasing customer engagement; building brand value; and developing incremental improvements. Customer engagement was the area where there was the greatest gap between the focus of the programme and its resulting effectiveness. The types of innovation covered include payment enablement, chat-bots, artificial intelligence, biometrics and blockchain. The study looked at innovation measurement – an area where banks again lag far behind other industries. It can be difficult to measure the success of an innovation programme, but this is vital if a bank is to know how a new project has affected aspects such as the customer experience and the brand value. These measurements need to be built into new innovation programmes before they start. Again, in terms of measuring the revenues and profits derived from innovation, banks find this hard and often avoid it. However, 63% of banks said that innovation programmes had led to increased revenues - but only 42% said that this was reflected in higher profit margins. 
Perhaps one of the most important elements that should be measured is the customer lifetime value, although some banks have difficulty in understanding this concept. Fintechs – friends or foes? At least some of the innovation taking place in banks is a direct result of the perceived threats from fintechs or other new entrants. But should fintechs be seen as ‘the enemy’ or could they in fact become a useful partner in future innovation strategies? Although very few of the survey participants had an established programme involving fintechs, over 85% saw the value of developing partnerships with them and were at least exploring the possibilities. A few have already been working on some joint innovation projects and, although many banks are still wary of such partnerships, it seems that this could herald a new focus for banks in the future. Oracle has been working with both sides to try and help them to align their objectives. Banks can benefit because fintechs have a lot to contribute, as they are both more creative and more agile than banks. They have more new ideas, cutting-edge analytics and are good at online acquisition. Meanwhile, the fintechs can benefit from the far larger customer base and depth of experience of the banks. The study shows that banks need to take innovation seriously. It must become an integral aspect of normal banking life. It needs to draw in people with different types of expertise and different ideas. However, support from senior executives – including the CEO – is crucial to its eventual success. Two other critical factors that emerged from the study (and which will be explored in future blogs) are revenue management and relationship pricing. To see the full report on Making Innovation Pay, please visit 


Analytics

Artificial Intelligence or Rules-Based AML/Sanctions?

Blog post by Julien Mansourian, Strategy and Transformation Executive, Oracle Financial Services Analytical Applications.

FinTech is going through significant innovation, which is impacting the overall banking operating model and, consequently, the current techniques used to comply with regulations. The exploration of sophisticated, innovative and cost-effective technology platforms to meet regulatory compliance requirements is in motion, and banks will certainly take the Artificial Intelligence (AI) path going forward. One good example is Deutsche Bank, which is leveraging AI to reduce the workforce across the board: "The rise of artificial intelligence (AI) and automated technology is set to have a huge impact on the future workforce of Deutsche Bank, with a "big number" of staff inevitably impacted because they already resemble robots," chief executive John Cryan has said.  Link to article. The second example is HSBC, which is partnering with an AI startup to automate its AML detection flow.  Link to article. So, it certainly makes sense for banks to implement an AI-centric AML/Sanctions system now, rather than implementing a rules-based financial crime application and having to rework or replace it later.

AI has been fuelled by the breathtaking expansion of digital data that FinTech has generated over the last 20 years. This heavy data digitization pushed former data analysts to become data scientists, focusing more on developing new techniques to retain, categorize and analyze information. In parallel, the evolution of Big Data, coupled with exponential growth in Cloud Computing - including PaaS, SaaS, IaaS, hardware stacks and information processing platforms - has enabled engineers to design algorithms that are no longer bound by the parameters in their code. Moreover, real-time decision making has never been as important as it is today in our day-to-day business. So, in a nutshell, AI is a branch of computer science that aims to create intelligent machines.
It has become an essential part of the technology industry. AI is the simulation of human intelligence processes by machines. These processes include learning (the acquisition of information and rules for using the information), reasoning (using the rules to reach approximate or definite conclusions), and self-correction. Particular applications of AI include expert systems, speech recognition and machine vision. Today, many RegTech companies are improving their regulatory solutions to include innovations such as Cloud, AI, Machine Learning, Automation and Cognitive computing to follow market trends.

Until now, most banks have used a rules-based detection approach to AML, and they are gradually moving from that rules-based method to an AI approach. An AI-driven system that can gather data instantly and be programmed to make decisions when given a set of simple facts will no doubt improve the AML layer. A rules-based approach to AML would, for example, flag cash transactions over a certain currency amount, block transactions to certain countries, use customer data to select accounts for additional monitoring, and categorize merchant accounts based on prior transactions. These types of systems are already widely used, but they require a significant amount of bank resources to review the transactions that are flagged or blocked and weed out false positives. In addition, a rules-based approach to AML is unable to adapt to changes in criminal behavior designed to evade detection.

An AI approach to AML, by contrast, does not require developers to establish rules that identify potentially criminal transactions. Instead, the system is trained to identify such transactions over time by analyzing a staggering array of factors.
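The rules-based screening described above can be sketched in a few lines. The threshold, the country codes and the transaction fields below are invented placeholders for illustration; a production AML engine would hold hundreds of such rules, each one generating alerts that staff must review by hand.

```python
# Illustrative rules-based AML screen; the threshold and country list
# are made-up examples, not any regulator's or bank's actual rules.

CASH_REPORT_THRESHOLD = 10_000       # flag large cash transactions
BLOCKED_COUNTRIES = {"XX", "YY"}     # placeholder sanctioned jurisdictions

def screen_transaction(txn):
    """Return the list of rule names this transaction triggers."""
    hits = []
    if txn["type"] == "cash" and txn["amount"] > CASH_REPORT_THRESHOLD:
        hits.append("large_cash")
    if txn["country"] in BLOCKED_COUNTRIES:
        hits.append("blocked_country")
    if txn.get("account_watchlisted"):
        hits.append("watchlist_account")
    return hits

txns = [
    {"type": "cash", "amount": 12_500, "country": "GB"},
    {"type": "wire", "amount": 900, "country": "XX"},
    {"type": "wire", "amount": 50, "country": "GB"},
]
alerts = [t for t in txns if screen_transaction(t)]  # first two are flagged
```

Every alert here needs a human to weed out false positives, and a criminal who learns the threshold can simply split a deposit into two transactions of 9,000 - which is exactly the rigidity the AI approach below is meant to address.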
These could eventually come to include where a customer opens an account relative to their home address, what time of day an account was opened, the duration between transactions, patterns among the merchants where a customer makes transactions, relationships between other customers of those same merchants, whether a customer uses a mobile telephone, what communication channel a customer uses to contact the bank and even changes in a customer's social media presence. The factors that AI can evaluate are limited only by the available data. AI can identify patterns and connections among the data that humans cannot hope to recognize. Using this information, the AI would then monitor every transaction processed by a bank and predict whether each one is or is not criminal. The accuracy of such a system would be significantly higher, and the resources needed to monitor the output significantly lower, than with a rules-based system. Importantly, an AI system would continually improve its accuracy automatically.

For OFAC compliance, as with AML systems, AI would drastically improve detection. This is particularly important because banks are strictly liable for any transaction involving an entity on the OFAC sanctions lists. It has long been routine for bank systems to block attempts by persons on these lists to open accounts, or by existing account holders to initiate a transaction with such people. But it is far more difficult for a bank to detect when OFAC controls are intentionally circumvented or the counterparties of transactions are obscured. An AI-based system, however, would be able to catch these transactions precisely because it does not rely on defined rules.

We are living in an amazing period of constant technology innovation, but how far can Artificial Intelligence go? Our brain is made up of 100 billion neurons, and each neuron has 1,000 to 10,000 synapses that send electrochemical signals to 1,000 to 10,000 other neurons at speeds of up to 200 mph.
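The shift from fixed thresholds to learned behaviour can be illustrated with a deliberately tiny statistical baseline: instead of asking "is this amount over 10,000?", the system asks "how unusual is this amount for this customer?". Real AML models learn from far richer features (merchants, timing, counterparties) with proper ML tooling; this z-score sketch, with invented numbers, only shows the principle.

```python
# Toy behavioural baseline: score each new transaction against the
# customer's own history rather than a fixed global rule. Purely
# illustrative - not a production AML model.

from statistics import mean, stdev

def anomaly_score(history_amounts, new_amount):
    """How many standard deviations the new amount sits from the
    customer's historical mean; higher means more unusual."""
    mu = mean(history_amounts)
    sigma = stdev(history_amounts) or 1.0  # guard against a flat history
    return abs(new_amount - mu) / sigma

history = [120, 80, 150, 95, 110, 130]     # a customer's usual amounts
routine = anomaly_score(history, 105)      # close to the baseline
unusual = anomaly_score(history, 5_000)    # far outside normal behaviour
```

Because the baseline is re-estimated as history grows, the detector adapts to each customer automatically, which is the property the rules-based approach lacks.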
That is a lot of processing. David Deutsch predicts that with the advent of the quantum computer, which measures data in 'qubits' rather than bits, processing speed alone will be unimaginable compared to today's computers. Artificial Intelligence is definitely the next logical step of the technology revolution and will probably surpass our intelligence.


Banking

Are you Ready for Digital Lending and Leasing ?

Blog By: Tushar Chitra, Senior Director, Product Marketing, Oracle Financial Services

They share their cars with strangers, communicate every emotion through selfies, and their followers on social media define their stardom. They are radically different from previous generations. According to research, nearly 40% of American millennials interact more with their smartphones than they do with any human being. This generational shift is driving a monumental change in the way products and services are consumed across industries. Cab service providers are now digital wallet providers, and telecom companies are functioning as banks that don’t expect their consumers to visit their branches - in fact, most of them don’t even have physical branches. What do these socio-technological patterns mean for businesses? For one, they need to move away from perceiving gadgets as mere ‘channels’ and look at them as quintessential elements of their consumers’ lifestyles.

Addressing the elephant in the room

Since millennials are the largest of all generations in terms of size and embrace digital technologies as a second skin, digital platforms cannot just be one of many outlets for lending solutions. Digitalization needs to become intrinsic to the way lenders interact with their audience. So how do they do it? What do their consumers want, and how can lenders offer exactly that?

I want everything here and now!

In 2015, the New York Times reported a surge in same-day delivery orders. The digital native population is more comfortable paying extra for same-day delivery than waiting longer for their orders to arrive. Speed is naturally one of the biggest drivers of their buying decisions. Let’s say a millennial wants to apply for a loan. He can always use a P2P lending app or a marketplace site to find out what the best deal is.
A regular lender with a traditional setup is not even in the picture, since that involves visiting a branch or calling a representative in order to negotiate: there are chatbots to do that work. Instant gratification is one of the most important factors when millennials buy a credit product. Loan origination systems need to talk to consumers in their own language. This means that from credit decision-making to application processing, the entire procedure needs to be lightning quick - at least quicker than the closest competition. This is only possible if the underlying technology facilitates fast processing with smart business insights and real-time reciprocation of consumer choices. The later stages in the lending lifecycle, such as updates and payback, follow the same rules of the game: waiting is not appreciated and instant servicing is the biggest differentiator.

If it was not shared, it never happened

Sharing platforms such as Snapchat, Facebook and Instagram are the primary touch points millennials use to connect with the rest of the world. In fact, the application of social platforms goes beyond just connecting: millennials now make their buying decisions based on reviews by others on these platforms. 68% of them say they won’t make a major decision until they have discussed it with trustworthy people on their social platforms. Lending Club, the peer-to-peer lending platform, connects prospective buyers and sellers of credit products. All the borrowers need to do is post their requirements on the app, and suitable investors lend them money. This platform-based social approach to lending is a more comfortable arrangement for digitally savvy consumers. Investors on such platforms have a similar digital outlook to borrowers and hence are less intimidating than conventional lenders. It also makes more social sense, because borrowers are able to share their experiences and resources with others.
Josh Dykstra, in his blog on Fast Company, notes that owning something does not mean much now unless it defines what people can do with it, what they can tell others about it, and what having it says about them. Millennials have grown up in a sharing economy where everything is less private than it was for the previous generation. Their lending solutions also need to reflect this concept through their digital systems. Ownership does not mean much, experiences do. They watched their parents go through the global financial meltdown and hence believe things are momentary while experiences are forever. Three in four millennials prefer paying for an experience over a product. They believe social and experiential value for money matter more than belonging to things or things belonging to them. This essentially means that a product sold is not a guarantee of a happy customer. Their satisfaction index spans the overall journey as a lifecycle rather than a transaction. For lenders, this is an opportunity if they are able to convert every interaction with consumers into a memorable experience. Providing a variety of options is always one of the best ways to serve them. Whether it’s the frequency of installments, the channel of communication or the mode of repayment, the larger the number of choices, the happier the millennial consumer. According to research, roughly three quarters of Americans now own smartphones, and about the same proportion have broadband connections at home. Only lenders with the capability to service loan consumers on digital platforms can monetize this situation. They not only need to offer the right mix of products and services at the right time, but also keep customers informed about the entire process on their channels of choice while making the procedure interactive. This should be topped with the best prices based on the customer relationship and previous records. 
Servicing customers via digital channels will not only help lenders bring down costs drastically but will also position them as forerunners of the new digital brigade of lenders. You ought to know about me, everything, already. Traditional lenders rule out millennials as consumers because they lack a proper credit history. Fintechs, however, are poaching these consumers early on in their lives by looking at other credentials, such as an education from a reputed institute and a promising career ahead. ZestFinance, for instance, uses non-conventional methods of underwriting. It employs machine learning to analyze thousands of variables and gain insights into the vast amounts of data lenders already have in-house, such as customer support data, payment histories, and purchase transactions. The platform can also add nontraditional credit variables, such as how customers fill out forms, how they navigate a lender’s site, and more. The effect is that many big banks and lenders are losing prospective customers to such fintechs simply because the fintechs’ understanding of those customers is context-based. Fintechs are making the most of the fact that they know their customers well! Digital technologies can also help minimize delinquencies through better business intelligence and insights from consumer data gathered over the course of the relationship with the lender. Psychographic Conversion by AES Technologies, for example, is a solution that leverages user experience and behavioral psychology to make loan collections more intuitive. The solution recognizes that loan collections is a crucial part of the lending and leasing business where knowledge of the customer base is extremely important: a badly managed call can ruin the relationship with a customer forever, while a simplified transaction can build it for life. 
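To make the nontraditional underwriting idea above concrete, here is a minimal sketch. This is not ZestFinance’s actual model: the feature names and weights are hypothetical, and a simple logistic function stands in for the thousands-of-variables machine-learning models the post describes.

```python
# Hypothetical sketch: scoring a thin-file applicant on nontraditional
# variables (form-filling behavior, site navigation, education) instead
# of relying solely on bureau credit history.
import math

# Illustrative weights only -- a real model would be learned from data.
WEIGHTS = {
    "form_fill_seconds": -0.002,   # very fast fills can signal carelessness or fraud
    "pages_viewed":       0.05,    # reading terms pages correlates with intent
    "has_degree":         0.60,
    "years_employed":     0.15,
}
BIAS = -1.0

def repayment_score(applicant: dict) -> float:
    """Return a 0-1 'likelihood of repayment' via a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * applicant.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

applicant = {"form_fill_seconds": 240, "pages_viewed": 12,
             "has_degree": 1, "years_employed": 3}
score = repayment_score(applicant)
print(f"repayment score: {score:.2f}")  # approve if above a policy threshold
```

In practice the score would feed a policy threshold tuned to the lender’s risk appetite, with the behavioral variables captured from the digital application journey itself.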
The right use of customer information should occur at the right time, and this is only possible with the digitalization of processes. Any kind of negotiation, resolution or payback can happen with proper bucketing of customer data. This can potentially change a process that many perceive as painful and uncomfortable into a memorable brand experience that increases the net promoter score for lenders. If the complexity of the solution shows up on the surface, it’s not good enough Interactivity, intuitiveness and customization are the topmost criteria for most customers today. You cannot expect a millennial to walk to a branch when he has the option of clicking a button on a smartphone and downloading the statement. In fact, millennials now have access to apps that even tell them where their money could have been better spent than on buying a credit product. The obvious conclusion from these developments is the need for a robust servicing engine backed by future-ready technology. There is no room for lengthy processes and cumbersome offline protocols. Lenders need to accept, process and decision credit applications in a paperless mode, with a single data-entry process. Lending and leasing institutions should be able to provide seamless channel integration to ensure an application can be started and closed on the different channels of the customer’s choice. Although a great lending platform is one that makes use of relevant data to cater to the evolving requirements of consumers, all of this should be done in a manner that lets the consumer still see things as just one touch away. Take the lead, the race is on We are inevitably entering an age when there are apps to tell us not to use apps while driving. 
Millennials are so addicted to digital technologies that a new social recommendation is surfacing in the marketplace: the ‘digital diet’, which essentially means a monitored abstinence from all things digital in order to bring people back to at least a minimum level of human interaction. When such a generation is a major part of your target audience, it becomes imperative to conform your products and services to their sense of consumption. A sturdy lending and leasing engine with a modern architecture can go a long way in this context. My colleague Shalu Upadhyay and I co-authored this blog. We would love to hear your views. We are reachable at tushar dot chitra at Oracle dot com and shalu dot upadhyay at Oracle dot com



The Reality of Upgrading: Spinning Learnings into Opportunities

Blog by Garima Chaudhary, Oracle Financial Services Financial Crime and Compliance Management Specialist. In my previous post, I discussed the key drivers behind an upgrade and how to determine whether upgrading with the existing system provider should be considered a fresh install or a true upgrade. Despite budget increases, including additional personnel, anti-money laundering (AML) efforts remain suboptimal in many organizations. Often, AML initiatives and systems are deficient from the outset. Among the many issues, institutions fail to incorporate lessons learned from past errors into their new Transaction Monitoring (TM) programs in an appropriate or timely manner. Hence, an upgrade project can be a good opportunity to learn from previous challenges, factoring those learnings into the new system selection process. This is not meant to be a comprehensive list of all factors that should be considered while choosing a system provider; rather, these are specific points encountered with previous TM systems. Data: Developing a comprehensive set of data requirements to support an effective TM system will help identify data quality and architecture issues. Addressing these issues at the source system by changing operational procedures, implementing application controls (e.g., edit checks, validations) and dealing with system gaps can enhance operational efficiencies, allowing for better management reporting and improving the overall AML program. This leads to some key questions specific to data: Does the new system provide better data provisioning ability, such as standard tools, to gather data from various sources into the AML system? Are data quality checks available as part of the standard offering that would help reduce data-related risk? How robust is the data model, if one is available out of the box? Does the data model provide enterprise coverage for various products and business units? 
What is the vendor’s strategy for backward compatibility, which will be critical in the next upgrade? Does the architecture help you provide the must-have data lineage? In the case of Big Data, can the vendor integrate with your data lake to gather data, which would reduce ETL effort tremendously? Enhanced Functions: In addition to key features, the upgrade project provides an opportunity to consider all prior wish-list items. It is important to have a clear understanding of whether those wish-list features are available as part of the standard offering or require customization. Customizing those features will introduce additional project and IT risks, and the customizations may not carry over to the next upgrade. The goal should be to have these enhanced features as part of the standard offering as much as possible. Process Automation: An optimized investigation process can reduce compliance cost tremendously; therefore, this upgrade project could be a great opportunity to align investigation processes with your latest AML program as much as possible. The key automation areas that should be considered here are: The first step of the investigation process is the assignment of cases to investigators after the cases have been generated. At this point, a suitable scoring methodology can be incorporated to score cases based on the financial institution’s risk portfolio and policies. Factors such as the number and score of previous cases, associated products, suspicious activity type, etc. can be part of the scoring methodology. Cases can then be auto-assigned to individual investigators or teams. Scores can also be used to derive case due dates. To get a complete view of an investigation, it is useful to group related cases together. Criteria such as customer IDs, other customer profile attributes (phone number, address, etc.), related accounts, etc. 
are some examples of attributes by which cases can be grouped automatically to avoid manual linking, enhancing the investigation process. Much of an investigator’s time goes into collecting data from various 3rd-party systems; depending on the financial institution, the number of 3rd-party systems required can vary from approximately 2 to 10. Such information-gathering stages can be automated as well, with relevant information automatically linked to the investigated case. This automation can also make information gathering more accurate compared to manual search and linking. The last stage of the process is regulatory reporting, which mandates that financial institutions e-file detected suspicious activity. This can be another key area to automate, allowing e-filing of suspicious activity much more quickly (and perhaps more accurately). Platform Unification: Depending on your current state, different AML functions may be supported through various silo systems, which increase data, regulatory, and IT risks. The upgrade project allows for a more unified compliance platform, bringing the various functions of the AML program (such as transaction monitoring, case management, regulatory reporting, MIS reporting, model validation, etc.) together on one platform. Unification of the financial crime and compliance platform can occur in a number of phases, and an upgrade can be a good opportunity to move towards this. Upcoming Regulatory Trends: All known and expected regulatory trends should be considered as part of any upgrade project as well. Service providers committed to broader AML compliance areas and to keeping up with regulatory changes will provide an edge when the time comes to cater to those requirements. Some recent examples are the Beneficial Ownership guidelines and Currency Transaction Reporting changes. 
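The case scoring, due-date derivation, and auto-grouping steps described above can be sketched in a few lines. The factor names, weights, and due-date bands below are hypothetical; a real institution would calibrate them to its own risk portfolio and policies.

```python
from collections import defaultdict

def score_case(case: dict) -> int:
    """Score a generated case using illustrative risk factors and weights."""
    score = 10 * case.get("prior_case_count", 0)          # repeat subjects score higher
    score += {"structuring": 30, "wire_anomaly": 20}.get(
        case.get("activity_type"), 10)                     # activity-type risk weight
    score += 15 if case.get("high_risk_product") else 0
    return score

def due_in_days(score: int) -> int:
    """Derive a case due date band from its score (illustrative bands)."""
    return 5 if score >= 50 else 15 if score >= 25 else 30

def group_cases(cases: list) -> dict:
    """Auto-group cases that share a customer ID, avoiding manual linking."""
    groups = defaultdict(list)
    for c in cases:
        groups[c["customer_id"]].append(c["case_id"])
    return dict(groups)

cases = [
    {"case_id": 1, "customer_id": "C-9", "prior_case_count": 2,
     "activity_type": "structuring", "high_risk_product": True},
    {"case_id": 2, "customer_id": "C-9", "activity_type": "wire_anomaly"},
]
s = score_case(cases[0])          # 10*2 + 30 + 15 = 65
print(s, due_in_days(s))
print(group_cases(cases))
```

In a production system the grouping key would extend to phone numbers, addresses, and related accounts, and the score would also drive routing to individual investigators or teams.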
Reduced Maintenance Costs: Compared to the current system, the new system should provide more flexibility in terms of onboarding new suspicious-activity detection rules, workflow configuration, MIS reporting, etc. From a technology perspective, the candidate system should have better batch maintenance procedures with the needed batch-tracking capability, onboarding of new data and easy-to-maintain 3rd-party system integration. Ease of onboarding new business units in the future should also be considered. To summarize, there are plenty of areas that can allow for an improved AML platform and should be considered as part of the upgrade project, rather than assuming the existing system provider is the most effective choice for you. In addition, all areas should be closely evaluated to understand the real benefit to you. It would be great to hear your experiences and perspectives on the key areas that should be leveraged as learnings from previous systems while working on an upgrade project. To learn more, visit us at ACAMS.


The Reality of Upgrading: Fresh Installs vs. True Upgrades

Blog by Garima Chaudhary, Oracle Financial Services Financial Crime and Compliance Management Specialist. A well-built transaction monitoring system is a critical component of an effective anti-money laundering (AML) compliance program. It supports efforts to combat money laundering and terrorist financing by helping financial institutions identify unusual or suspicious activity that must be reported to regulatory authorities, and it aids law enforcement in tracking and prosecuting criminals involved in money laundering and terrorist financing. Due to intensifying regulatory scrutiny, the increasing sophistication of criminals and the rapid pace of technological advances, financial institutions must update their AML efforts constantly; therefore, it’s not surprising that these organizations are increasing their investment in transaction monitoring (TM) systems. Maintenance, risk management and other costs related to AML compliance have climbed more than 50 percent in the last few years. With continued regulatory change and rapid implementation requirements, costs are predicted to remain high. In addition, institutions are upgrading their existing AML systems every few years, which requires a massive amount of manpower, money and time. What is driving the upgrade? Outdated system: when the current system cannot keep up with compliance needs, replacing the outdated system is a gigantic investment. Clearly, if companies fail to embrace new technology opportunities, they’ll face increasing compliance costs while simultaneously decreasing their productivity, capacity and efficiency. Regulatory pressure: when regulators mandate additional work, and the best way to handle that work is to upgrade to the latest technology. 
It’s sink or swim for regulated institutions; either they adapt and introduce new technologies that allow them to be more nimble, flexible and productive, or they continue to use labor-intensive processes that will become increasingly cumbersome and untenable. De-supported by the system provider: when the current system provider is planning to de-support the version in use and the only way to get support is to upgrade to the latest version. Most commonly, financial institutions upgrade due to a combination, or all, of the drivers described above. Fresh Install vs. “True” Upgrade Whatever the driver for the upgrade might be, there are two options: 1) go with the existing system provider and upgrade to the latest version, or 2) explore other industry system providers. In most cases, the obvious choice is to upgrade with the existing system provider due to familiarity with the system and vendor. There is also a myth that contributes hugely to this choice: the idea that upgrading with the existing vendor is somewhat easier than installing a completely new system. It can be complicated to determine whether an existing-vendor upgrade is really equivalent to a fresh install. Below are some high-level points that can be analyzed to get better clarity on such projects. Data Provisioning: Data provisioning should be available in the existing system and leveraged as part of the upgrade. This means the ETL rules and the data available should be reusable for upgrades. This is a very important question to ask, because data is the base for the entire system, and if the entire data structure (data model) is going to change, then most everything beyond it - data quality, detection rules, etc. - will require significant work. Therefore, the first question is whether the data model of the upgraded system resembles the existing one. 
In addition, all data provisioning customizations, such as ETL, data quality, and most importantly the data model, should also be investigated in the version you are planning to upgrade to. If these changes were made without the participation of the solution provider, or if there were no standard guidelines for you to follow, then it is probable that such changes are not included in the new version. Behavior Detection Rules, User Interface and Workflow: Once the data model has been assessed, the next stage is to understand the rest of the system’s components, such as the user interface, AML detection rules and workflows. The question that should be asked is whether the AML detection rules and workflows from the current program are available as part of the upgraded system, or whether they require additional custom work. The most important component is the AML detection rules, because if those are expected to go through a rewrite, that will require a great deal of additional effort from model risk management - that is, model validation and audit review, followed by further review by regulators. Earlier Customizations: This requires two main stages of analysis: 1) Understand the overall percentage of customization vs. standard features, segregating the entire AML system into two categories - custom-built vs. standard product features. All of the organization’s explicit customizations (specifically those driven by product gaps in expected standard compliance requirements) made in the previous AML system should be analyzed to determine whether they are available as standard features in the upgraded system or whether additional work is required to achieve them. It is also a good time to go through those customizations and see if they are still applicable in the present setting. 
2) All currently used standard features should be re-reviewed to make sure they are still supported by the solution provider, because there is a possibility that those crucial features are unavailable in the latest version. 3rd-Party System Integrations: Another important aspect of upgrading that should be examined carefully is integration with the various 3rd-party systems (both internal and external) leveraged by the AML program for data acquisition and consolidation, such as customer screening, negative news lists, adverse media scans, credit reports, etc. The analysis should determine how much of the existing integration capability will be available as part of the upgrade. Is it available via the solution provider, or will it be custom work? After considering the above points, you should know whether the next AML system is available as a standard product from the present solution provider or not. If the examination points towards significant onsite work as compared to a prepackaged solution, then it is safe to assume that upgrading to a new version with the existing solution provider is more of a fresh installation than a true upgrade. The plan and risk associated with a new install vs. an upgrade are completely different; therefore, it is critical to have a clear understanding of the project. On occasion, organizations ignore the above points, lack a clear understanding of fresh install vs. true upgrade, and continue using the existing system. However, it is worthwhile to venture outside your comfort zone and explore the best options available to suit your organization’s compliance needs. Many firms have undertaken large-scale remediation programs and look-back efforts while trying to upgrade existing systems to achieve AML compliance. A carefully designed, ongoing program can help assess capabilities against industry practices, identify strengths and weaknesses, and pinpoint needed improvements. 
To learn more, visit us at ACAMS.        


CECL Challenges: The Changing Face of Qualitative and Environment Factors

Blog by Geetika Chopra, Senior Product Manager, Oracle Financial Services Analytical Applications. In my last post, on the introduction of the CECL standard, I gave some recommendations on how to turn the challenges into business benefits. Now, with the CECL standard coming into effect soon, I’d like to address a specific key challenge that practitioners are gearing up to face: the inclusion of predictive, forward-looking adjustments in reserve calculations. Though the standard does not prescribe any specific set of factors, if we look into its details, some important points to note include: Historical loss experience is generally the basis for the assessment of expected credit losses. This historical loss experience can be internal, external or both. Such historical loss experience should be adjusted for differences in current asset-specific risk characteristics within a pool at a reporting date, such as underwriting standards, portfolio mix, and asset term. Adjustments should be made to reflect the current contractual term of the assets, or group of assets, when it differs from the historical experience. Adjustments should also be made when current conditions and reasonable and supportable forecasts differ from the conditions prevalent during the period over which historical information was evaluated, such as unemployment rates, property values, commodity values, and delinquencies. Such reasonable and supportable forecasts need not cover the entire contractual period of the asset or group of assets, and the use of historical information beyond such a period is permitted. One important note: the stress on containing the forecast within a reasonable period, though prudent, still leaves a lot of room to decide what a reasonable period is. Different factors can come with their own underlying assumptions about a reasonable period, and for consistency it would be necessary to follow a standard reasonable period across the various factors. 
Though the Financial Accounting Standards Board (FASB) allows the use of external historical loss experience, we expect its use would be limited to Probability of Default (PD) models from external rating agencies, mostly for corporate portfolios. The use of external data for retail portfolios would be limited, as these models are black boxes and not open enough for auditor scrutiny. The 2006 Interagency policy on the Allowance for Loan and Lease Losses expects an institution to adjust the historical loss experience of its loan portfolio (assessed collectively under FAS 5) for changes in trends, conditions and other relevant factors that affect the repayment of loans as of the evaluation date. The qualitative and environmental factors that may cause the estimated ALL to differ from historical loss experience are: changes in lending policies and procedures, including underwriting standards and collection, charge-off and recovery practices; changes in international, national, regional, and local economic and business conditions and developments that affect the collectability of the portfolio, including the condition of various market segments; changes in the nature and volume of the portfolio and in the terms of loans; changes in the experience, ability, and depth of lending management and other relevant staff; changes in the volume and severity of past-due loans, the volume of non-accrual loans, and the volume and severity of adversely classified or graded loans; changes in the quality of the institution’s loan review system; changes in the value of underlying collateral for collateral-dependent loans; the existence and effect of any concentrations of credit, and changes in the level of such concentrations; and the effect of other external factors, such as competition and legal and regulatory requirements, on the level of estimated credit losses in the institution’s existing portfolio. 
The interagency statement issued in June 2016 states, “Estimating allowance levels, including assessments of qualitative adjustments to historical lifetime loss experience, involves a high degree of management judgment, is inevitably imprecise, and results in a range of estimated expected credit losses. For these reasons, institutions are encouraged to build strong processes and controls over their allowance methodology.” Hence, our understanding is that even though the parameters for qualitative adjustments remain the same, the process will change to include reasonable, supportable forecasts and will come under higher scrutiny from statutory auditors. Factors that are easier to predict include the scale and volume of business, credit concentrations, changes in the terms of agreements, valuations of underlying collateral, changes in the regulatory system and macroeconomic factors. Analytical models and trend analysis can be reasonable evidence in support of such adjustments; but factors like changes in lending policies and practices, changes in the quality of an organization’s loan review system, changes in the experience, ability and depth of lending management, and external factors such as competition are far more difficult to ascertain with reasonable forecasts and involve a higher degree of subjectivity. Unless the system for making these adjustments is itself highly transparent, subject to internal controls and managed with adequate internal reporting, auditors and other stakeholders will question the process. The Board has a much greater role to play here in ensuring that earnings management is ruled out for such adjustments. With greater shareholder activism, tougher whistle-blower laws, and an environment of evidence and fact-finding, this needs to be addressed by creating adequate internal controls and processes, reporting and transparency around the various assumptions, and consistency in assumptions across reporting periods. 
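Mechanically, qualitative and environmental adjustments are layered on top of the historical loss rate before the allowance is computed. The sketch below illustrates that arithmetic only; the factor names and basis-point values are purely illustrative and are not prescribed by the standard.

```python
# Hypothetical Q-factor overlay on a historical lifetime loss rate.
HISTORICAL_LOSS_RATE = 0.012          # 1.2% lifetime loss experience for the pool

# Illustrative adjustments in basis points (+/-), one per qualitative factor.
Q_FACTORS_BPS = {
    "underwriting_standards": -10,    # tightened underwriting lowers expected loss
    "unemployment_forecast":  +25,    # deteriorating macro outlook raises it
    "collateral_values":       -5,
    "portfolio_mix_shift":    +15,
}

adjusted_rate = HISTORICAL_LOSS_RATE + sum(Q_FACTORS_BPS.values()) / 10_000
pool_balance = 50_000_000
allowance = pool_balance * adjusted_rate
print(f"adjusted rate: {adjusted_rate:.4%}, allowance: ${allowance:,.0f}")
```

The governance point in the text maps directly onto this structure: each entry in the factor table is a management judgment that must be documented, forecast-supported, and applied consistently across reporting periods.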
As with any other accounting estimate, this would need to be reasonably supported through audit evidence and to stand the test of time against any future challenge by a stakeholder. Most likely, best business practices and auditing standards will emerge over time, considering that the same auditors audit various institutions. Industry forums will develop where finance professionals and auditors exchange notes, leading to more rigorous audit tests for these accounting estimates. What are you finding to be the most challenging element of CECL? I’d love to hear from you on this subject. I’ll be attending the Center for Financial Professionals’ CECL 2017 Conference and would love to meet you to discuss further.



Measuring Up: Five key metrics to help banks measure their “Digital Maturity”

Blog by: Mark D. Atherton, Group Vice President, Oracle Corporation Digital disruption, fintech and emerging technologies have upset the status quo in banking. The conservative approach that served banks so well for so long could prove their Achilles heel in the new environment. Bank boardrooms are abuzz as they try to chart a way forward. And change will only accelerate as more institutions adopt emerging technologies such as artificial intelligence (AI) and blockchain distributed ledgers. The founder of the World Economic Forum, Klaus Schwab, coined the idea of the “fourth industrial revolution,” which is defined in part by its speed and its reach. The short version? It’ll happen faster and change more than the first three. And banks are among the most vulnerable institutions. If they don’t have a plan to measure, iterate and improve on their digital progress, they could lose revenue faster than they realize. A Common Measure of Digital Progress Here’s the rub: banks need new metrics because innovation is happening at such a tremendous pace. True digital maturity isn’t just about technology; it’s about transforming businesses so they can consistently keep pace with the changing market. Part of this process is coming up with meaningful criteria for measuring progress. After all, it’s difficult to improve if you don’t know where you stand to begin with. Of all institutions, banks should understand this. Yearly profit and revenue figures aren’t enough. Banks and other financial institutions need ways of measuring how well they’re adapting to digital disruption. The best banks are already benchmarking their progress. JP Morgan Chase measures the impact of digitizing each aspect of its business in its shareholder reports. The progress of digital engagement is now front and center on its agenda. One of our customers in Australia, Suncorp, measures digital engagement through a fine-grained analysis of logins across digital platforms. 
They measure the digital impact on the bank’s brand and progress across the digital ecosystem, and they report to investors on both profits and digital metrics. And the numbers are good: 63 million logins to its digital platforms, 5 million registered accounts and 55 million visits to its sites across brands. The Starting Line for Measurement – Key Metrics for Digital Progress Digital Engagement & Reach – Banks need to measure the extent and reach of digital engagement across their products and services. It is critical to measure progress based on customers’ ability and willingness to use self-service channels. Digitally engaged banks offer mortgage services, insurance and investment accounts entirely on digital channels, with a completely digital process, cutting processing times from weeks to hours. To drive customer loyalty, banks need to innovate constantly. Savvy banks gamify their services so that people will treat them like a financial Fitbit that lets users track their wealth. Pace of Adoption – Another measure is an institution’s ability to add a new customer entirely online and originate products digitally across the customer’s lifecycle. JP Morgan Chase, for example, now acquires 77% of its credit card customers digitally. The pace of, and readiness for, adopting digital models shows a bank’s ability to cope with changing business models and disruptive market entrants. Depth of Data Analytics and Customer Insight – Banks need to measure and benchmark themselves very differently in the age of AI and machine learning. Customer data management and analytics need to be at the core of a bank’s processes; banks should measure how real-time their decision making is and how data is shared internally – and ensure these systems are managed for optimum efficiency. Digital Enablement of the Workforce – A digital enterprise needs to be matched with a digital workforce. Technical skills are increasingly a core competency for workers in the finance sector. 
A digitally mature organisation understands that progress is a continual process of improving the organisation's capacity. Financial institutions need a clear gauge of their technical capacity, which means removing inflexible processes and building a collaborative culture.

Cross-industry Comparisons – Banks can also deploy traditional measures such as Net Promoter Scores to create cross-industry comparisons and see where they stand against leaders in telco, consumer tech and other sectors.

A Measured Digital Response Can Have Valuable Pay-offs

While banks may seem to lag behind fintech start-ups on innovation and digital models, they also have advantages to build on: they can partner with fintechs or buy them, and they can capitalise on their already large customer base. The peer-to-peer payments market in the US is a perfect example. Zelle, which is backed by dozens of US banks, recently launched. It is a strong product and, building on the major banks' customer base, is expected to quickly overtake the market leader, PayPal's Venmo. The rewards of measuring digital responses can be extraordinary. Research by Capgemini and MIT Sloan found that companies who understand the value of digital transformation are on average 26% more profitable than their competitors and have valuations that are 12% higher. Digital leadership can be measured and tracked, and ultimately a bank's level of 'digital maturity' is a critical gauge of sustainable growth.

Blog by: Mark D. Atherton, Group Vice President, Oracle Corporation Digital disruption, fintech and emerging technologies have upset the status quo in banking. The conservative approach that...

Analytics

AnaCredit - Is it One regulation or Many? A look at the impact of National variations.

Blog By: Saloni Ramakrishna, author and senior director, Oracle Financial Services Analytical Applications. It depends on who is answering that question. Really!! It is a single uniform regulation at the ECB (European Central Bank) level with clearly spelt out coverage. To the ECB's credit, they have given comprehensive implementation information through the detailed AnaCredit Reporting Manual, parts 1, 2 and 3. The information that they expect from all the participating NCBs (National Central Banks) is uniform, and that is what is detailed in the manual. They add, however, that there is room for some national discretion ("National Arrangements", as the ECB refers to them) with the NCBs on how they collect that information from their constituents. National Central Banks that need to adhere to the AnaCredit regulation have, or will have, a single uniform regulation for their constituents. Interestingly, however, if you ask the banks and credit institutions that need to provide the required information in the required format and frequency, there would be two different answers. Domestic banks and credit institutions (those incorporated and operating in a single country and regulated/supervised by one NCB) will say it is one regulation, since they need to comply with only one set of guidelines as mandated by their NCB. Regional and global banks will have a different answer: for them it is multiple variants of AnaCredit implementation, as they will need to follow the mandates of the different NCBs that they report into. The critical point to keep in context while discussing AnaCredit is that it is a two-step process, as I noted in my previous blog. Step one is banks and credit institutions submitting the credit data sets across the 10 subject areas to their National Central Banks.
Step two is NCBs submitting the data to the ECB as required by the AnaCredit regulation (the ECB refers to this as the "Baseline Scenario"). Note that I have not said "the data NCBs receive from their constituents" – and therein lies the plurality that regional and global banks will need to address. The scope of the National Arrangement areas, per Part 1 of the AnaCredit Manual issued by the ECB (AnaCredit Reporting Manual Part 1, dated 9th November 2016), is as follows:

- NCBs may collect the information to be transmitted to the AnaCredit database as part of a broader national reporting framework.
- NCBs may extend the reporting of granular credit and credit risk data beyond the scope outlined in the AnaCredit Regulation, for their own statutory purposes, in line with relevant national law – what some of my banking friends refer to as "AnaCredit ++".
- NCBs decide on the reporting format and the timeliness in which they receive the data from reporting agents.
- NCBs may exempt reporting agents from reporting counterparty reference data to the relevant NCB, when such information can be obtained from reliable alternative sources.
- NCBs may grant derogations to small reporting agents, if the total outstanding amount of loans granted to derogated entities does not exceed 2% of the total national outstanding amount of loans.

Whether NCBs decide to deviate from the "baseline requirements", and to what extent, will be communicated by them. The moot question is: what does all of this mean for banks and credit institutions that have global or even regional operations? The short answer is that it impacts all three aspects – content, timelines and format of reporting – and that is saying a lot. The countries that have already published their implementation guidelines, such as Germany, the Netherlands, Belgium, France and Ireland, point to the reality of national variations in the execution of the AnaCredit regulation across geographies.

Content
- Can become part of a larger data set, as in the case of Belgium (AnaCredit ++?)
- A reduced set of attributes may be asked for if the National Central Bank can source some of the data (e.g. counterparty reference data) from other sources – e.g. DNB, Netherlands
- Resident foreign branches may be exempt or not exempt from reporting – e.g. Germany

Submission
- Different timelines: actual submission dates could be earlier than the ECB timelines
- Different frequencies of submission – e.g. daily for changes, monthly, quarterly…
- Different submission formats – XBRL, XML, CSV etc.

All of the above truly make the implementation of AnaCredit much more nuanced, requiring a well-thought-out, inclusive functional and information policy and architecture. In my next blog, I will explore the "Regulatory Hub" approach that banks with a multi-country presence might want to consider, so they can have a "Unity in Diversity" construct that can be leveraged not just for AnaCredit implementation but to truly "use" and "reuse" their biggest asset – DATA – to create business value.
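To make the impact of these variations concrete, here is a minimal, purely illustrative Python sketch of how a multi-country reporting agent might model per-NCB differences as configuration. Everything here – the `NcbProfile` class, the attribute names and every profile value – is a hypothetical simplification for illustration, not part of the ECB specification or any product.

```python
from dataclasses import dataclass

@dataclass
class NcbProfile:
    country: str
    file_format: str          # e.g. "xml", "xbrl", "csv" - varies by NCB
    frequency: str            # e.g. "monthly", "daily-deltas"
    extra_attributes: tuple   # national extensions beyond the ECB baseline ("AnaCredit ++")
    counterparty_ref_exempt: bool  # True if the NCB sources reference data elsewhere

# Illustrative stand-in for the ECB baseline attribute set (the real datasets are far richer).
BASELINE_ATTRIBUTES = ("instrument", "counterparty", "protection")

def attributes_for(ncb: NcbProfile) -> tuple:
    """Resolve the attribute set a reporting agent owes this particular NCB."""
    attrs = [a for a in BASELINE_ATTRIBUTES
             if not (ncb.counterparty_ref_exempt and a == "counterparty")]
    return tuple(attrs) + ncb.extra_attributes

# Two invented profiles echoing the variations discussed above.
nl = NcbProfile("NL", "xml", "monthly", (), counterparty_ref_exempt=True)
be = NcbProfile("BE", "xbrl", "monthly", ("national_risk_code",), counterparty_ref_exempt=False)

print(attributes_for(nl))   # counterparty reference data sourced by the NCB itself
print(attributes_for(be))   # baseline plus a national extension
```

The point of the sketch is the design choice: keeping each NCB's content, format and frequency in one declarative profile is a first step towards the kind of "regulatory hub" a regional or global bank would need.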


Analytics

AnaCredit - the Nuanced regulation

Blog By: Saloni Ramakrishna, author and senior director, Oracle Financial Services Analytical Applications. I am quite fascinated with the European Central Bank's (ECB's) AnaCredit Regulation that came into effect on 18th May 2016 – a regulation that requires granular credit data sets from the national central banks (NCBs) of the member countries, based on the data they collect from their constituents: banks and credit institutions. The project was initiated in 2011 and made a regulation in May 2016. Data collection from the NCBs is scheduled to start in September 2018. The ECB's objective in asking for granular credit data is to integrate and harmonize the national credit registers, enabling the ECB to understand the credit risks being taken by banks at both the individual borrower and corporate level. As the details of the regulation are released, it is becoming clear that AnaCredit is a nuanced regulation and requires a closer look as well as a well-thought-out plan for execution. A simple comparison with the other major regulatory reporting requirement, FinRep/CoRep, brings out three critical differences in the construct and design of the regulations (not speaking of the content here). First, AnaCredit is a two-step submission process: banks and credit institutions submit the relevant loan-by-loan data to their national central banks, who in turn, after processing and enriching where relevant, submit it to the ECB – unlike FinRep/CoRep reporting, which is a direct submission by the banks to the regional authority. Second, it is granular data at a loan-by-loan level, not aggregated reporting as in the latter's case. Third, it is an evolving regulation – most regulations are, one might say – but the difference, perhaps, is that the implementation guidelines are being rolled out even as banks and credit institutions prepare for implementation, given the timelines.
Important to remember: the September 2018 timeline is for the NCBs to submit to the ECB. The submission timelines for the reporting agents (banks and credit institutions) will be much earlier, around December 2017 or early 2018, so as to give the NCBs sufficient time to collate, process, enrich and submit to the ECB. AnaCredit submission by the NCBs to the ECB is harmonized, but the NCBs have been given some flexibility in how they implement the regulation, in terms of what information is collected from their constituents within their jurisdictions and how. Post the promulgation of the regulation, the ECB has released, or is in the process of releasing, a three-part implementation manual*:

Part 1 (released November 2016) – explains the general AnaCredit methodology and provides information about the reporting population and setting up the reporting, including a general description of the underlying data model. This part spells out the scope of national arrangements and national discretions, amongst other things.

Part 2 (released February 2017) – describes all datasets and data attributes of the AnaCredit data collection in detail and provides specific reporting instructions.

Part 3 (yet to be released) – will present various case studies and, in particular, cover special scenarios that require more in-depth explanation. Expected to be issued around May 2017.

As the NCBs detail the AnaCredit implementation approach for their constituents, it is becoming clear that implementation will not be uniform across the board – there will be geographical variations and local flavors. While the ECB's manual is the umbrella guidance and baseline requirement, it is imperative for banks and credit institutions to read it in consonance with the national directives of their NCBs.
For example, DNB (De Nederlandsche Bank), the Dutch NCB, clearly spells this out when it says: "The Manual does not take into account the specific implementation choices made by the individual Member States. So the Manual should be read taking into account the Dutch implementation and the choices made by DNB." Therein lies one of the most important nuances in the implementation of the AnaCredit regulation: that of national variations. Relevant information on whether or not the NCBs decide to deviate, as well as on national extensions and the timeliness or format of the reporting, will be provided by the relevant NCBs. The implementation at the reporting agents' (banks' and credit institutions') level will need to be in accordance with their NCB's directives and guidance. In the next blog, I will discuss the areas of national discretion in some detail and the challenges they pose to regional and global banks.


Introducing Banking to Children and Young Adults in a Cashless World

Blog By: Tushar Chitra, Senior Director, Product Marketing, Oracle Financial Services. The 'future' – everyone has an opinion on where technology is heading! What will mobility, communication and financial services be like in the future? While every business focuses its energies on being future-proof, they need to pay attention to one key demographic who will ultimately consume all of these services in the future. This demographic is very unique, unpredictable and has a strong tendency to demand the unimagined. We are talking about children. Children and teenagers constitute roughly 32% of the total world population1. The future belongs to them, and soon they are the ones who will call the shots on which car to drive, which products and services to use and which houses to live in. It is evident that the next generation has inherited a greater affinity for technology. The likelihood of children adopting a new device and adapting to it almost instinctively cannot be underestimated. How do you introduce children to the world of banking and finance as cash takes digital form at lightning speed and the world loses its physical connection with money? How do children spend most of their time these days, and how do we gain their attention? Tiny tots love watching videos. It is safe to say video streaming websites are the new kids' TV channels. Children-specific video apps, with content made specifically for toddlers, are on the rise. If you are wondering why we are discussing what videos children watch, I would stress that this is where it all begins: as toddlers grow and near their teens, they have already built an opinion about a certain brand based on what they have been seeing around their ecosystem, and this has a huge influence on the kind of buying behavior they develop. Believe me, if toddlers make up their mind on what they want to buy, they do not stop until they get it.
Also, do not forget the impact of a peer group – it is rightly said that children learn more from peers than parents. Hence these channels could have a multiplying effect on strengthening a brand's identity. Are banks really building their brand to attract the attention of these tiny tots? There is no money to be made by doing so, but it definitely is a long-term investment. Banks can start early by partnering with various content developers (videos, games, music and literature). They can also create child-friendly play areas in branches. Creating a mascot and displaying mascot-derived imagery on ATMs, sponsored playgrounds and merchandise makes the bank omnipresent. All of this can go a long way in developing an image of the bank in the eyes of a child. There is a need which parents often find difficult to fulfill: explaining the meaning or value of money to a child. A helping hand from the custodian of all things money will not go unnoticed and will surely be rewarded by grateful parents. By the time children are old enough to attend playschool, they begin to develop a vague understanding of the value of money; this is where the piggy bank enters the picture. But what is the use of a piggy bank when the use of physical cash is steadily declining? Banks have a unique opportunity to hand their customers a digital piggy bank. Yes, a digital piggy bank: one that is internet-enabled, biometrically authenticated, accepts NFC payments and has its own QR code display for people to scan and transfer cash. Now doesn't that sound like a big win for IoT devices! The digital piggy bank can have multiple themes for deposits – candy money, gaming money, toy money, and also a college or school fund – letting the adult transferring the money make the right choice for the child. Of course, that alone does not sound like much fun for the child.
To keep all parties happy, the display on the piggy bank can show the child articles that can be bought with the funds collected and allow them to place an order directly. Doing this indirectly teaches the child how money works. Banks have an opportunity to create a platform by partnering with vendors of products and services – educational institutions, amusement parks, toy manufacturers – and display several products. The inclusion of children into the financial ecosystem begins with their parents. Banks are sitting on a treasure trove of data. A simple run through an account statement will identify customers who have children. Once banks get the parents' consent to participate in a program that introduces money and finance to young children, a piggy bank in the full livery of the bank's mascot can be placed in the hands of the child. Children will outgrow their piggy bank sooner than you realize and will be ready to actively participate in an adult environment. The average age at which a child gets their first smartphone is 10.3 years 2; this is also the best time to introduce children to the world of money and finance. Fintechs have already identified this gap and have introduced solutions like mobile apps linked to children's debit cards. These apps are tools for parents to transfer pocket money to their children, monitor spending and set spending limits. Ignoring a 10-year-old's need for digital payments is definitely an opportunity lost; their needs range from downloading or streaming music and video, mobile apps and in-app purchases to gaming-console-related purchases; the possibilities are limitless. Empowering children to make their own digital payments not only frees up parents' time, it also gives the child a sense of responsibility and self-belief. It also helps parents avoid a situation where their child buys a dinosaur in a Jurassic Park game for $5,900 3.
Many parents employ a points system to get their children to complete daily chores, do homework, be obedient and so on. This system can be digitized and presented to kids in a gamified avatar, linking tasks completed with rewards as prescribed by the parents. Around the age of 10 is also when children are mesmerized by the world of gaming. The worldwide gaming industry hit 91 billion USD in revenues for 2016 4. Games like Minecraft have young children building worlds of their own; the game is hugely popular, has been termed the "Lego" of the digital world, and already has banks where children deposit Minecraft money. Similarly, educational games like the TiViTz College Savings Game, which help children earn money for their college education, are becoming increasingly popular; these games require setting up a savings account. Barclays has already built an interactive game that teaches basic money management and financial life skills to young consumers. This is just the tip of the iceberg when it comes to the world of gaming for young children. Given the massive scope for introducing banking to the younger generation through gaming and gamification, banks need to develop a strategy to embrace this channel. Banks may be funding game developers to build games for the younger generation, but banks also need to help game developers reach their existing customer base. Exposing banking APIs to game developers, creating a seamless experience for children within the bank's customer base, is a good start. It will also lead to customer stickiness from the network effect the customer gains. Every game has its own form of digital currency, which needs to be earned or purchased with real-world currency; imagine a scenario where banks offer children a repository for their game money which can be used across different games. A move like this would attract gamers to banks the same way toddlers are drawn to video streaming websites, creating a new revenue stream.
All game and no play makes Jack an 'unhealthy' boy. Banks today have begun incentivizing healthy lifestyles by providing customers with better rates, offers and added benefits if they lead a healthy lifestyle. Fitness-tracking wearable devices paired with a bank's app track customers' footsteps and the miles they run, cycle and swim, in addition to monitoring their heart rate, calorie counts and overall stress levels. The same offerings can be extended to children who are the bank's customers and linked to the children's banking app we spoke of before. The success of Pokémon GO demonstrates how gamification can bring children, and some adults, back out into the open spaces. Devices are not only used to play and pay; they are at the forefront of education as well, and edtech is seen as the next fintech. Close to 7.3 billion USD was invested in edtech in 2016 alone 5, and edtech is still considered to be in its infancy. Edtech startups have popped up all across the world to cater to the growing demand for education on a digital platform. At the ground level, there is a growing demand to make financial education mandatory. Banks have the opportunity to play a key role as an enabler, given their subject-matter expertise; they also need to extend their infrastructure to act as sandboxes that help edtech startups in the financial domain provide their users a near-real experience. This in turn will introduce banks to this untapped demographic of youngsters taking their first steps towards financial literacy. Banks have already started collaborating with fintechs through their innovation accelerator programs, and gamification is one of the larger themes of research in many accelerators. Gamification also happens to be at the forefront of edtech, helping children have fun as they learn. A partnership between financial institutions and edtech firms has high potential to blossom into a very fruitful collaboration, with dividends to be earned for many years into the future.
When banks become part of the digital learning ecosystem, students need not look very far when they require any sort of financial assistance for their studies, travels and entrepreneurial ventures. In conclusion, what we have been emphasizing is that the entire digital banking ecosystem needs to focus specifically on the younger demographic and provide them a banking experience from a very young age, so that they stay on your banking platform throughout their lifetime. The current set of banking offerings for youngsters is designed to attract their parents more than the children themselves; it's time for banks to join children on the playground and become their BFF (Best Friend Forever). My colleague Roger Lobo and I co-authored this blog. We would love to hear your views. We are reachable at tushar dot chitra at Oracle dot com and roger dot lobo at Oracle dot com.

1 http://www.worldometers.info/world-population/world-population-gender-age.php
2 https://techcrunch.com/2016/05/19/the-average-age-for-a-child-getting-their-first-smartphone-is-now-10-3-years/
3 http://in.pcmag.com/apps/98800/news/kid-racks-up-5900-bill-on-dads-ipad-playing-jurassic-world
4 http://venturebeat.com/2016/12/21/worldwide-game-industry-hits-91-billion-in-revenues-in-2016-with-mobile-the-clear-leader/
5 http://www.metaari.com/assets/Metaari_s-Analysis-of-the-2016-Global-Learning-Technology-Investment-Pat25875.pdf


Analytics

Japan's FSA launches Money Laundering Inspections across local Financial Institutions

Blog post written by Ido Nir, Oracle Financial Services Analytical Applications, Financial Crime and Compliance Management - Asia Pacific. Enhanced enforcement is to be carried out in light of the upcoming Mutual Evaluation by the Financial Action Task Force (FATF) planned for October/November 2019. Tokyo-based financial institutions have established a longstanding ecosystem to address anti-money laundering. This ecosystem has served for many years as the standard dictating operational best practices across Japan, as well as the standard for customer due diligence procedures and the detection of suspicious activity. In recent years, we have witnessed constant change in the direction of global regulation, focusing mostly on proper KYC processes, identification of the ultimate beneficiary, Trade Based Money Laundering (TBML), and other areas. As a result, we see a distinct shift in the focus of global banks and a re-evaluation of best practices for transaction monitoring, customer due diligence, and screening. The Japanese government has acknowledged this shift and has recently introduced changes to the Act on Prevention of Transfer of Criminal Proceeds (effective October 2016), which set new standards for both banks and other financial institutions operating in Japan for customer on-boarding and monitoring (KYC, CDD, and ECDD), the verification of foreign correspondent bank relationships, monitoring and detection of suspicious transactions, as well as implementation of additional in-house AML measures. In previous years, the FATF criticized the Japanese government for inadequately addressing AML issues. The last official statement was given in 2014, when the FATF contemplated adding Japan to its "grey list" of high-risk jurisdictions. The 2016 amendments take Japanese AML legislation one step closer to fully addressing the FATF's criticisms and establishing a more robust AML regime as seen in other developed nations.
As a direct result of the new legislation, and in light of the upcoming FATF mutual evaluation planned for October or November 2019, the Financial Services Agency (FSA) began a nationwide evaluation of multiple banks, securities firms, and other financial institutions, surveying the measures they are taking to prevent money laundering.  The aim of this survey is to better tackle the growing problem of money laundering and financial crime among both global and regional financial institutions. At each of the financial institutions surveyed, the FSA is examining the precise steps being taken to prevent money laundering across a multitude of risk areas, and whether these policies have been properly adopted by individual branches and subsidiaries.  Furthermore, bank-wide knowledge and understanding of AML as well as management's understanding of related risks and exposure is also being evaluated. It is clear that the FSA will continue its effort to bring proper AML controls to the forefront and will vigorously require both global and regional financial institutions to adopt proper measures. Financial institutions are expected to update their current AML measures and policies to ensure compliance with the recently updated regulatory framework.  This change in policies, operations, and procedures should be applied with minimal impact to the existing business process - ideally it should enhance the existing process.  Financial institutions should set a goal to establish a holistic compliance program that handles the identification of customers, screening of customers and transactions, as well as monitoring of suspicious activity in a cohesive, effective and most importantly in a unified way, with minimal impact to business processes. 
By arming themselves with the right tools and technology, Japanese financial institutions can implement the level of scrutiny that regulators demand today, while ensuring their ability to meet both the business and regulatory challenges of tomorrow. Financial institutions must comply with and observe changes in the standards imposed by regulators and, as indicated above, they are regularly audited to verify their compliance. Financial institutions that are found in breach of their duties face legal consequences as severe as being stripped of their banking license, and the threat of hefty fines potentially rising into the hundreds of millions of dollars. Good examples include the case of 1Malaysia Development Berhad and Deutsche Bank's USD 41M fine for AML lapses, imposed as part of its May 2017 settlement with the U.S. Federal Reserve. There are numerous other examples where fines exceed USD 1B, which clearly reflects regulators' intentions. To avoid hefty fines, financial institutions must look at the banking products they offer, the markets in which they operate, and the regulations that apply in those markets to further understand their risks. Then they must implement controls and solutions, such as detection scenarios. Many financial institutions have followed a "rule-based technology approach." This approach has worked well for many years, but while it triggers alerts that catch bad actors, it also produces false-positive detections of legitimate activity. False positives are a known problem across the industry: they routinely require analysts to spend time reviewing more alerts, and they impose a larger workload and commensurate costs on AML operations. Machine learning algorithms are promising tools for reducing such false positives.
Algorithms can be developed using training data and then fine-tuned on customer-specific data, resulting in higher detection accuracy and increased performance. However, when it comes to compliance, showing results is not enough for regulators: banks must be able to explain how they arrived at their detection results. This is one of the key challenges with models trained using machine learning techniques – the more advanced and sophisticated algorithms are essentially black boxes, and the inability to explain a black-box algorithm has been a major roadblock to industry adoption of this technology. However, the stakes are too high to give up on machine learning. To learn more about this topic and what Oracle is doing to achieve delivery and regulator acceptance of machine learning techniques in AML applications, come see us at ACAMS and SIBOS.
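As a toy illustration of the contrast described above – and nothing more – the following Python sketch pairs a fixed-threshold rule with a tiny logistic model. Every threshold, feature and training record here is invented for the example; real AML systems use far richer features and governed model pipelines. The idea shown is simply that alerts the rule raises can be ranked by a learned risk score so analysts review the likeliest cases first.

```python
import math

def rule_alert(txn):
    # Classic fixed rule: flag any transaction above a hard threshold (invented value).
    return txn["amount"] > 9000

# Made-up historical alerts: (amount scaled to [0,1], is_new_customer, was_truly_suspicious)
history = [(0.95, 1, 1), (0.91, 1, 1), (0.92, 0, 0), (0.99, 0, 0), (0.94, 1, 1), (0.93, 0, 0)]

# Train a tiny logistic-regression model by plain gradient descent.
w = [0.0, 0.0]; b = 0.0
for _ in range(2000):
    for amt, new, y in history:
        p = 1 / (1 + math.exp(-(w[0] * amt + w[1] * new + b)))
        g = p - y                      # gradient of log-loss w.r.t. the logit
        w[0] -= 0.5 * g * amt; w[1] -= 0.5 * g * new; b -= 0.5 * g

def risk_score(txn):
    x = [txn["amount"] / 10000, txn["new_customer"]]
    return 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))

# The rule fires on both transactions; the model then ranks them for triage.
alerts = [t for t in ({"amount": 9500, "new_customer": 0},
                      {"amount": 9400, "new_customer": 1}) if rule_alert(t)]
ranked = sorted(alerts, key=risk_score, reverse=True)
print([a["new_customer"] for a in ranked])  # → [1, 0]
```

Note that this two-line model is also trivially explainable (its weights can be inspected directly), which is exactly the property the deeper black-box models discussed above lack.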


Mind the Gap: From Liquidity Management to Treasury Partnership

Blog By: Anand Ramachandran, Senior Director, Global Banking Solutions Practice, Oracle. Today, multiple macroeconomic and regulatory forces are reshaping corporate liquidity management. Multinational corporations operate across currencies and regions, so both global and regional banks serve them. Banks continue to grow their corporate client business, but regulation of liquidity-management practices keeps tightening (for example, restrictions on holding non-operating deposits). This puts a sharper spotlight on the lack of a single source of truth for liquidity management caused by multiple DDA (demand deposit account) systems. With the rapid growth of digitalization and e-commerce, and falling barriers to regional and global expansion, companies now build their business models as regional and global players from inception, operating across multiple jurisdictions. This trend drives multi-currency cash and liquidity pools across markets and time zones and makes efficient liquidity management ever more necessary. At the same time, national regulatory restrictions and a company's specific tax and legal environment can trap liquidity when it operates in multiple jurisdictions.

Meanwhile, the effects of Basel III implementation are becoming evident: bank credit keeps tightening, and banks are no longer willing to accept non-operating deposits. These factors mean corporate treasurers can no longer park cash with banks as freely as before the global financial crisis. Money market fund reform does not help either: US money market funds, for example, no longer guarantee yields, so treasurers must balance the yield-versus-efficiency trade-off more carefully. Negative interest rates, which emerged in 2016, further compound the challenge; once considered a purely hypothetical scenario, they are now a reality in Japan and several European countries. In addition, corporate treasurers increasingly recognize the need to centralize treasury and liquidity functions to keep pace with fast-moving technology changes such as digitalization, faster payments and open APIs (application programming interfaces).

Why does liquidity management matter to corporates? Liquidity management is an integral part of daily corporate operations. Whether cash is abundant or tight, before seeking external funding from the market or from banks, companies need to mobilize and optimize their cash flows and maximize the return on their own cash reserves. Indeed, treasurers worldwide recognize that most of their work involves cash or liquidity management. A 2016 Rabobank survey showed that cash pooling (69%), working capital optimization (52%) and cash concentration (47%) were treasurers' top three areas of focus. Nordea's 2017 survey, "The Future of Corporate Treasury", indicated that centralizing the group's cash and liquidity management was the top priority for corporate treasurers through the end of 2017. More and more companies recognize that they must improve internal liquidity management and effectively control regional and global cash and liquidity positions. Demands on corporate treasurers therefore center on delivering better regional and cross-regional liquidity solutions, which have become a core source of value across the business.

Global and multinational corporations are pursuing three key liquidity-management outcomes: visibility and control, ease of replenishment, and yield optimization. To achieve them, corporate treasuries are building global liquidity-management architectures on top of strong regional capabilities. The benefit is that treasury can handle both centralized and decentralized processing requirements while strengthening information flows from subsidiaries' operations to treasury headquarters. Technological progress is driving centralization and the broader evolution of the treasury function. In EMEA, treasurers must consider the impact and potential risks of Brexit on liquidity-management practices: will the effective treasury structures that have served the region for decades become more complicated? The UK has long been a major hub for global funds, hosting the header accounts of many cash pools (physical and notional), treasury centers, and shared service centers for banks and corporates. The question now is whether the EU will create a favorable ecosystem to help companies adapt to the new reality. Given these dynamics, corporate treasurers and core liquidity providers such as banks are being forced to revisit locations, decision criteria and the challenges now emerging. In short, effective liquidity management helps companies stay stable amid evolving market and regulatory challenges, improve business agility and achieve their desired liquidity outcomes.

How should banks respond?
Corporate clients face a volatile, uncertain, complex and ambiguous (VUCA) environment, and banks can position themselves as the primary provider of liquidity services, helping clients navigate the market. Recent surveys of corporate treasurers show that smaller companies are now working with a growing number of banks to expand their business and interests in one or more regions, while large corporates continue to streamline and rationalize their banking relationships. Liquidity services were once considered the exclusive domain of large global banks; that is no longer the case, and demand is broadening. Many small and medium-sized enterprises (SMEs) position themselves as regional and global companies from inception. This creates an opportunity for the regional and national banks serving SME clients to position themselves as liquidity-solution providers. As the floodgates open and banks of every type begin to offer liquidity services, the market will benefit from healthy competition among global, regional and national banks, which in turn can help the global liquidity market move from today's linear growth to larger, exponential growth. Banks that want to make a significant impact, however, should consider offering advisory services to corporate clients and strive to become their preferred treasury partner.

How does Oracle's liquidity management solution help banks? Oracle offers a simple, flexible solution that banks can implement on their own timeline and priorities, without a complex and expensive big-bang replacement. Oracle's global liquidity solution is built on a modern technology architecture and supports first-class domestic, regional and international liquidity-management capabilities for banks that provide liquidity-management services to their corporate clients. The solution gives banks the flexibility to integrate with one or more existing core systems and infrastructure, reducing interoperability challenges and eliminating the drag of legacy technology. Banks gain:

- The ability to handle complex account structures. Most sophisticated corporate clients hold multiple accounts across markets, currencies and time zones. Oracle's solution fully supports both traditional and modern liquidity-management needs.
- Flexible support for innovative structures. Meeting regulatory requirements across multiple jurisdictions, addressing trapped liquidity and improving yield require a flexible solution. The solution is customer-centric and easy to use: its visualization tools help manage and maintain complex liquidity structures and improve operational efficiency for bank staff handling global and regional requirements.
- Seamless integration of multi-bank transactions. Corporate clients often test services from several banks before choosing a primary liquidity provider. Oracle's solution helps banks offer flexible multi-bank cash concentration and gradually build strategic partnerships with corporate clients.
- Personalized corporate solutions. Banks can use the system to understand corporate clients in depth, deliver better service, give prospective clients visibility into their liquidity position and improve operational efficiency.
- Advisory services. Comprehensive solution capabilities, including easy-to-use tools such as dashboards, liquidity heat maps and simulation, let banks focus on advisory services and grow fee income.

Oracle's liquidity-management solution includes an advanced simulation engine that lets banks help their clients anticipate possible market changes so that all parties can make informed decisions – a potential game changer. Oracle's commitment to financial services, substantial R&D investment and lightweight solutions enable banks to become treasury partners to their clients and advance liquidity-management best practice.

Blog By: Anand Ramachandran, Senior Director, Global Banking Solutions Practice, Oracle 如今,多股宏观经济和监管力量正对公司流动性管理产生影响。跨国公司涉及多货币和多地区业务,因此全球和区域银行都在与其打交道。 银行在公司客户业务中不断成长。但是,有关流动性管理实践的监管规定一直在收紧(例如,存放非经营性存款的限制...

Analytics

Top 3 Trends Transforming AML Programs

Blog by Garima Chaudhary, Oracle Financial Services Financial Crime and Compliance Management Specialist. On the timeline of financial services development, the anti-money laundering discipline does not have a long history. It started around 1985 and has evolved in more recent years beyond an anti-crime, anti-drug orientation to include an anti-terrorist financing element post 9/11. To have an effective anti-money laundering program, global banks and international bodies must deal with the confluence of data that emanates from the financial, social, and economic aspects of life. With recent technological advancements, tax evaders, terrorists, and cyber criminals hide behind, and leave their trail in, encrypted messages, social media, the cloud, big data, and the "Internet of Things." The focus for the past few years has been the unification of the financial crime and compliance platform, bringing the various components of an anti-money laundering program (transaction monitoring, case management, suspicious transaction reporting, analytics, and so on) into a single, unified compliance platform. Several financial institutions have already jumped in and are now moving towards the next step of this journey. In addition to unifying the compliance platform, another major focus is on data: making sure there is quality data for detection and investigation in an effective program. While the industry continues to work on these aspects, there is a next level of items that financial institutions will focus on over the next few years: Machine Learning in Suspicious Pattern Detection: Traditional query-based detection has been successful for suspicious transaction monitoring, allowing financial institutions to adjust various monitoring parameters for better accuracy. 
Query-based detection monitoring requires regular testing and updating, which in turn demands a massive quantity of human resources, technology, and money. Furthermore, query-based detection lacks the ability to apply better detection logic on its own based on prior behaviors. There is always a possibility of missing suspicious behaviors due to delays in updating traditional detection. That is why traditional query-based detection will evolve towards smarter systems that can learn by themselves and keep up without human intervention. At a high level, the learning will draw on historical customer behaviors and analyst conclusions to apply updated detection logic. If it is possible to identify such repeated analyst conclusions and factor them into suspicious transaction monitoring, the number of false positives can also be dramatically reduced. This does not eliminate the need for regular verification of the detection configuration; however, it will certainly provide a smarter suspicious transaction detection capability.   Robotics for Investigation: The cost of compliance in anti-money laundering is increasing, driven by huge investigation teams. Some recent fines have resulted not from the inability to detect suspicious transactions, but from the lack of investigation. Banamex USA acknowledged that it conducted fewer than 10 investigations and filed only 9 so-called suspicious activity reports — even though its monitoring system identified more than 18,000 transactions as “potentially suspicious” during that period (detailed article here). Robotics is a general term that can refer to a few different uses of digital robots to automate work. In most cases, robotics refers to Robotic Process Automation (RPA), which automates an entire process from start to finish. The other type, discussed less often, is Robotic Desktop Automation (RDA), which combines human and robot. 
This technology allows an organization to automate many stages of an investigation process, allowing for more efficiency, consistency, and effectiveness. Financial institutions will move towards automating well-known repeated tasks, such as searches on external websites or data providers. This way you aren’t adding “headcount”; you are enhancing your analysts and allowing them to focus on gathering more information and making better decisions, overall making them more efficient.   Service-Based Solutions (Cloud): Cost and risk are perennial concerns for executives and managers, especially when manpower and money are committed to on-premise deployments of enterprise software with no guarantee of success. In addition, it is a time-consuming process to on-board a new system or upgrade an existing system to keep up with the latest regulatory changes. Therefore, the future of technology lies in service-based cloud solutions, mainly Software as a Service: a software licensing and delivery model in which software is licensed on a subscription basis and centrally hosted by the vendor. Service-based solutions will also allow faster on-boarding of a system for a financial institution and will reduce overall cost, allowing compliance teams to focus more on what they care about most: compliance. (The worldwide public cloud services market is projected to grow 18% in 2017 to total $246.8 billion, up from $209.2 billion in 2016, per Gartner, Inc.) This change in trend will surely put an additional burden on both financial institutions and technology vendors. On one side, technology vendors will need to evolve to accommodate these upcoming demands; on the other, financial institutions will face the bigger challenge of convincing regulators, in addition to training their staff and updating policies. Like any other change, this phase will take its own lead time and will require a few years to reach maturity. 
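The machine-learning direction described above, learning from repeated analyst conclusions to down-rank alerts that were consistently closed as false positives, can be illustrated with a toy sketch. Everything here is invented for illustration: the feature flags, the smoothing, and the averaging are not a production AML model.

```python
# Toy sketch: use historical analyst dispositions to score new alerts,
# so patterns analysts consistently dismissed get lower priority.
from collections import defaultdict

def train(history):
    """history: list of (features, escalated) pairs; features is a set of
    categorical flags, escalated is True when the analyst escalated the alert."""
    pos, total = defaultdict(int), defaultdict(int)
    for features, escalated in history:
        for f in features:
            total[f] += 1
            if escalated:
                pos[f] += 1
    # Per-feature escalation rate with Laplace smoothing.
    return {f: (pos[f] + 1) / (total[f] + 2) for f in total}

def score(rates, features):
    # Average the historical escalation rates of the alert's features;
    # unseen features get a neutral 0.5 prior.
    return sum(rates.get(f, 0.5) for f in features) / len(features)

history = [
    ({"cash_intensive", "new_customer"}, True),
    ({"cash_intensive", "new_customer"}, True),
    ({"round_amount"}, False),
    ({"round_amount"}, False),
    ({"round_amount"}, False),
]
rates = train(history)
hot = score(rates, {"cash_intensive", "new_customer"})   # historically escalated pattern
cold = score(rates, {"round_amount"})                    # historically dismissed pattern
```

In this sketch, alerts matching the historically dismissed pattern score well below the escalated one, which is the false-positive reduction the post describes; a real system would still keep humans verifying the detection configuration.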
It will be interesting to hear your perspective on what other trends you are seeing in this space, and on the challenges and effects we should anticipate as the industry moves forward to adopt them.


Banking

Mind The Gap: Stepping Up From Liquidity Management To Being A Treasury Partner

Blog By: Anand Ramachandran, Senior Director, Global Banking Solutions Practice, Oracle There are multiple macro-economic and regulatory forces influencing corporate liquidity management today. Both global and regional banks are dealing with multinational corporations whose businesses span multi-currency and multi-location requirements. Banks have been growing their corporate customer businesses. However, regulations governing liquidity management practices have been tightening, such as the limits on placing non-operational deposits. This places greater emphasis on the problem of having ‘no single source of truth of liquidity’, brought about by multiple DDA (Demand Deposit Account) systems. With the digital and e-commerce explosion, and as the entry barriers for corporations to expand regionally and globally have become progressively lower, corporations are structuring their business models as regional or global players right from the start and are dealing with business operations across multiple jurisdictions. This trend has led to multi-currency cash and liquidity pools across markets and time zones, making effective liquidity management a necessity. Country-level regulatory restrictions, along with a company’s specific tax and legal situation, also lead to trapped liquidity for companies operating across multiple jurisdictions. Meanwhile, the effects of Basel III are kicking in, with credit tightening at one end and pushback from banks on non-operational deposits at the other; this has deterred treasurers from placing cash at will with their bankers as they did in the pre-GFC (Global Financial Crisis) world. In addition, money market fund reform initiatives have not helped the cause. For instance, U.S. money market funds no longer guarantee yields, so corporate treasurers must effectively manage the yield-efficiency trade-off. 
The challenge of achieving effective liquidity management was further amplified by the arrival of negative interest rates in 2016. Once thought of as only a hypothetical scenario, negative interest rates have become a reality across Japan and several European countries. There is also a greater realization among the corporate treasurer community of the need to centralize treasury and liquidity functions to help them deal with fast-evolving technology such as digitalization, faster payments and API (application program interface) initiatives.  Why is Liquidity Management Important for Corporates? As a core function of corporations, liquidity must be managed on a day-to-day basis. Whether cash-rich or cash-strapped, corporations still need to mobilize and optimize their cash and ensure maximum return on their cash reserves before seeking external funding from the market or the banks. In fact, treasurers spend most of their time on cash or liquidity management, as the corporate treasury community worldwide acknowledges. In a 2016 survey by Rabobank, cash pooling (69%), working capital optimization (52%) and cash concentration (47%) were the top three areas that treasurers focus on. Nordea Bank’s 2017 survey, ‘The future of the corporate treasury’, found that centralizing the group’s cash and liquidity is the top priority for treasuries between now and the end of 2017. Increasingly, there is a growing impetus among corporations to improve their internal working capital management and to manage regional and global cash or liquidity positions. Thus, corporate treasurers are focused on achieving optimized regional and cross-regional liquidity solutions. Liquidity solutions have now become fundamental value enablers in the entire business process. Global and multinational corporations are striving to achieve three key outcomes: visibility and control, access to liquidity, and yield optimization. 
To achieve these key outcomes, corporate treasuries are overlaying strong regional capabilities with a global liquidity structure. This approach enables them to support both centralized and decentralized treasury operations and enhance the information flow from subsidiary operations to treasury headquarters. Technological advancements are facilitating centralisation as well as a wider evolution of the treasury function. In the Europe, Middle East and Africa (EMEA) region, corporate treasurers need to factor in the impact of Brexit on liquidity management practices and the potential risks. Will this fallout add further complexity to the efficient treasury structures that have been in play in the region for the last couple of decades? The UK has been a major hub for treasury, housing many cash pool headers (physical and notional), treasury centres and shared service centres for both banks and corporates. A question now is whether the EU will create a conducive ecosystem for corporations to live with the new reality. Given these dynamics, corporate treasurers and core liquidity service providers such as banks are forced to revisit their locations and decision criteria and to understand the emerging challenges. In summary, effective liquidity management has helped corporations remain stable in the face of evolving market and regulatory challenges, increase business agility and achieve the desired liquidity outcomes.  How Can Banks Respond?   With corporations facing a volatile, uncertain, complex and ambiguous (VUCA) environment, banks can position themselves as a prime bank providing liquidity services to help their clients navigate the marketplace. Recent surveys of corporate treasurers confirm that while smaller companies are working with a growing number of banks as they expand their footprint and interests in one or more locations, the largest corporations are still rationalising their banking relationships. 
Liquidity services were once thought of as the exclusive domain of large global banks, but the demand has gone much further than that. Small and medium enterprises (SMEs) are positioning themselves as regional and global players from inception. This opens an opportunity for regional and country banks that deal with SME customers to position themselves as liquidity solution providers. With the floodgates open for banks of all types to offer liquidity services, the marketplace will enjoy healthy competition between global, regional and country banks. This in turn helps promote global liquidity market growth from its current linear state to more exponential growth. However, banks looking to make a significant impact on the market need to look at advisory services and become a treasury partner of choice.  How does Oracle’s Liquidity Management Solution Help Banks?  Oracle offers simple and flexible solutions that banks can implement on their own timelines and based on their priorities, instead of having to adopt complex and expensive transformation programs all in one go. Oracle’s global liquidity solution is built on a modern architecture with best-in-class functional capabilities delivered in-country, with regional and global liquidity techniques for banks to offer liquidity services to various corporations. Oracle’s solution offers banks the flexibility to bolt this best-in-class solution onto one or more existing core banking systems and infrastructure, thus easing interoperability challenges and eliminating technology obsolescence. Banks will be equipped to:  Deal with complex account structures. Most established corporations tend to have multiple accounts that span markets, currencies and time zones. Oracle’s solutions provide comprehensive support for both traditional and advanced liquidity management techniques.  Have the agility to support innovative structures. 
An agile approach is necessary to support regulations across multiple jurisdictions, address trapped liquidity and increase yield. The solution is very user-friendly: the visual tools help manage and maintain complex liquidity structures and enhance the operational efficiency of bankers as they deal with the requirements of global and regional businesses.  Incorporate multi-bank transactions seamlessly. Corporates tend to test out different banks’ services before deciding on a primary liquidity service provider. Oracle’s solutions help banks provide flexible multi-bank cash concentration services, which facilitate the transition between banks and their corporate customers to a strategic relationship over time. Propose personalized corporate solutions. With a system that offers an in-depth overview of their corporate customers, banks are able to provide better customer service, giving corporate customers better visibility into their liquidity and driving operational efficiencies.  Provide advisory services. Comprehensive solution capabilities, including customer-friendly tools such as dashboards, liquidity heat maps and simulation techniques, allow banks to focus on advisory services and enhance fee-based income.  The advanced simulation engine in Oracle’s liquidity management solutions is a game-changer, as it allows banks to give their clients better visibility into likely market scenarios so that all parties can make an informed decision moving forward.  Oracle’s commitment to the financial sector, its R&D investment and its lightweight solution allow banks to step up as a treasury partner and drive best-practice liquidity management.
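As a concrete illustration of the cash concentration techniques this post discusses, here is a minimal sketch of a physical "zero-balancing" sweep, in which end-of-day participant balances are moved to a header account. The account names and balances are invented for illustration, not any bank's actual product logic.

```python
# Minimal sketch of a physical cash-concentration (zero-balancing) sweep.
def zero_balance_sweep(header_balance, participants):
    """Sweep each participant balance to the header account; negative
    balances are funded from the header. Returns the new header balance,
    the zeroed participant accounts, and the per-account transfers."""
    transfers = {}
    for name, balance in participants.items():
        transfers[name] = balance        # positive: sweep up; negative: fund down
        header_balance += balance
    zeroed = {name: 0.0 for name in participants}
    return header_balance, zeroed, transfers

header, accounts, moves = zero_balance_sweep(
    0.0,
    {"ops_usd": 120_000.0, "payroll_usd": -20_000.0, "tax_usd": 5_000.0},
)
```

The single header balance after the sweep is the "single source of truth of liquidity" the post refers to; notional pooling achieves a similar effect by offsetting balances for interest purposes without actually moving funds.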


Banking

Faster Payments: How Banks are Reclaiming Lost Ground

Blog By: Tushar Chitra, Senior Director, Product Marketing, Oracle Financial Services The last couple of years have witnessed innovations in payments dominated largely by Fintechs. According to McKinsey, payments represent 43% of FinTech startups, with payment companies securing 40% of the $36 billion in total global FinTech funding in 2016. The appeal of payments Fintechs lies in their personalized experience, differentiated business models and the power of digital end-to-end business value. Customers are seeking cheaper, faster, and more transparent payment options, and lean Fintech companies threaten to disrupt the marketplace. After the initial slumber following the onslaught of the payments Fintechs, incumbent banks are gradually stirring into action - slowly but surely. The progress is less dramatic, but full of promise. The SWIFT global payments innovation (gpi) and SEPA instant payments are a welcome change in an otherwise staid payments ecosystem in the banking industry. The recent addition of US payment market infrastructures to support SWIFT gpi adds to an already growing list of 56 SWIFT-connected market infrastructures, including EURO1 and TARGET2. SWIFT gpi is now live, with 12 major global transaction banks exchanging tens of thousands of gpi payments. Nearly 100 banks have signed up to SWIFT gpi, and numerous additional banks are set to go live in the coming months. The advantage of SWIFT gpi over a fintech start-up doing a proof of concept is that the former is being piloted by banks using their production systems. That’s a huge leap. The recent shifts in service expectations around accelerated payment processing triggered the idea of instant payments within the European banking industry. 
To paraphrase the Euro Retail Payments Board’s definition of SEPA instant payments: an electronic retail payment solution available 24/7/365, resulting in the immediate or close-to-immediate crediting of the payee’s account, with confirmation to the payer, all within 10 seconds of payment initiation. This is irrespective of the underlying payment instrument used or of the underlying clearing and settlement ecosystem. It is planned to go live in November 2017 at 08:00 CET. In a huge boost to community banks and credit unions, Same Day ACH (SDA) for credit entries was made available in September 2016, marking the beginning of faster payments in US markets. Starting September 15, 2017, Same Day ACH will be available for debit entries, enabling the same-day processing of virtually any ACH payment. By March 2018, Receiving Depository Financial Institutions (RDFIs) will be mandated to make funds available from same-day ACH credits. With nearly $5 billion across 3.8 million Same Day ACH transactions in October alone, it makes for a profound statement.  These innocuous and subtle technological innovations could prompt a perceptible shift in scales in favor of the incumbent banks. How Banks Have the Advantage Banks have a captive customer base and offer multiple products - payments is just one among many business offerings - while payments Fintechs are more transactional in nature, with a core revenue model that thrives on float and transaction fees. The challenges of the long, winding path to profitability will outlast the initial euphoria over the Fintechs' technological prowess. And these innovations are not making it any easier. As banks realize the effects of faster, cheaper, and more transparent payments, they can expect to grow their international business, enhance supplier relationships and increase treasury efficiencies, while taking on payment Fintechs on their own turf. 
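The 10-second confirmation window in the SEPA instant payments definition above can be expressed as a simple service-level check. This is an illustrative sketch with simulated timestamps, not any scheme's actual rulebook logic.

```python
# Toy sketch of the instant-payment service level described above:
# confirmation must reach the payer within 10 seconds of initiation,
# around the clock, 365 days a year.
from datetime import datetime, timedelta

SLA = timedelta(seconds=10)  # confirmation window from payment initiation

def within_sla(initiated_at, confirmed_at, sla=SLA):
    """True if the payee-credit confirmation arrived within the SLA window."""
    elapsed = confirmed_at - initiated_at
    return timedelta(0) <= elapsed <= sla

t0 = datetime(2017, 11, 30, 8, 0, 0)                # simulated initiation time
ok = within_sla(t0, t0 + timedelta(seconds=7))      # confirmed in time
late = within_sla(t0, t0 + timedelta(seconds=12))   # missed the window
```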
New solutions could be built across different customer segments that lower the total cost of ownership, maintain regulatory compliance, and provide a faster, more efficient, and more transparent end-to-end customer experience. Banks can now gain the ability to evolve and meet the contemporary challenges of an evolving payments landscape. And the threat to Fintechs is beginning to show. In India, as per RBI data, bank wallets showed growth of 57% (173 million in December 2016) and 20% (208.5 million in January 2017), while e-wallets showed a declining market share, from 34% of total PPI transactions in December to 33% in January and 29% in February. These are interesting times in an already vibrant payments industry, and the consumers are not complaining! My colleague Mridul Nath and I co-authored this blog. We would love to hear your views.  We are reachable at tushar dot chitra at Oracle dot com and mridul dot nath at Oracle dot com.  


Analytics

Fighting Financial Crime with Cutting-Edge Techniques

As you know, the pressure placed on financial organizations to combat crime and money laundering is always on the rise—and there’s good reason, too. It’s estimated that US$1.6 trillion in criminal proceeds pass through the financial system every year, or roughly 2 to 5 percent of the world’s GDP.1 If that sounds like a lot, it’s worth bearing in mind that this isn’t just a number. This is money that funds organized crime—that’s terrorist groups, large-scale corruption, the drugs trade, human trafficking, and counterfeiting—and those of us in the finance industry are in a unique position to fight it all. This is why anti-money laundering (AML), risk, compliance, and Bank Secrecy Act officials are constantly tasked with improving practices. But with more data flowing through the system than ever before, and with increasingly sophisticated threats, maintaining compliance and keeping crime at bay are tasks that grow in complexity. Right now, a lot of banks are still using traditional methods to uncover suspicious behavior. But these methods are costly—and rife with opportunities for failure. The time has come for a new approach.  Fighting Fires Before They're Fires There’s currently a real desire on the part of regulators to change the industry’s culture and approach to compliance, making it central for all operations. For example, a much bigger emphasis is currently being placed on due diligence and knowing your customers, with regulators really starting to scrutinize how banks have chosen their clients in the past. There’s good reason for this, too. This initial stage of interaction between an institution and potential customer is the easiest time to prevent fraudulent activity. Quite often, by the time customers have been on-boarded, the damage has been done. But vetting customers before this point—before they even engage with the institution—is by far the most successful method of prevention. It’s all about knowing who your customer is. How do you do this? 
Well, it’s clear that siloed, piecemeal solutions are no longer suitable. What the modern bank needs is a way of automating the on-boarding and risk-scoring processes to ensure compliance, while engaging in ongoing due diligence (ODD) and enhanced due diligence (EDD). Placing Big Data Analytics at the Heart of Your Defense Many organizations are already updating AML systems and Financial Crime and Compliance Management (FCCM) solutions to automate and standardize these on-boarding processes, gain a single version of the truth, increase transparency, and improve overall performance. Several are also starting to acknowledge the importance of data-driven analytics, business intelligence tools, and the techniques commonly seen in other areas of big data analysis to prevent financial crime. The fact is that although a lot of institutions feel limited to the data available in siloed databases, or to that which customers provide, many have the potential to access a much bigger pool of data in various formats residing in data lakes. With the right investments, this data can provide much better insight into who clients are and, ultimately, a better assessment of their risk to the institution. Guesswork and manual review are not appropriate strategies for protecting your bank, so using big data in the investigative and due-diligence process to find early correlations between clients and crime is the best way to stay ahead of potential threats. The main benefit is that where traditional SQL warehousing is slow and limited to structured data, big data analytics lets you work with multiple data types like voice, chat, email, and machine logs, as well as transaction data. This means you can develop a more complete picture of compliance, track insider misconduct risks, and reduce instances of false positives. 
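As a rough illustration of the automated risk scoring at on-boarding described above, the sketch below routes an applicant to standard or enhanced due diligence (EDD) based on weighted risk factors. The factor names, weights, and threshold are invented for illustration and are not a real compliance model.

```python
# Illustrative on-boarding risk scoring: weighted flags decide whether an
# applicant gets standard treatment or enhanced due diligence (EDD).
# All weights and the threshold below are invented for this sketch.
RISK_WEIGHTS = {
    "high_risk_jurisdiction": 40,
    "politically_exposed": 30,
    "adverse_media_hit": 25,
    "cash_intensive_business": 20,
}
EDD_THRESHOLD = 50

def onboarding_risk(flags):
    """Sum the weights of an applicant's risk flags and pick a route."""
    score = sum(RISK_WEIGHTS.get(f, 0) for f in flags)
    return score, ("EDD" if score >= EDD_THRESHOLD else "standard")

score_a, route_a = onboarding_risk({"politically_exposed", "adverse_media_hit"})
score_b, route_b = onboarding_risk({"cash_intensive_business"})
```

In practice, the inputs to such a score would come from the bigger pool of data the post describes, external registers, adverse media, and data-lake sources, rather than from a hand-maintained table.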
All-importantly, you can do all of this in real time, turning a responsive approach to protection into a proactive one and effortlessly putting compliance at the heart of your day-to-day operations. Right now, the use of big data analytics in financial compliance programs is becoming more mainstream. But in order to take full advantage, you need the right solutions in place to ensure your approach is effective, cost-efficient, and future-proofed—so no matter what changes are made to legislation, you’re able to keep up. A Solution Fit for the Modern Age So the measures you take to protect your bank need to do a couple of things: take the cost out of the onboarding process, and minimize risk. But what technology can help you do this? As we mentioned earlier, it all comes down to knowing your customer. There are plenty of tools available that can help you to get a more complete picture of your prospective clients from day one—making it far easier to monitor, detect, and investigate suspicious activity, and notify regulators of potential problem areas. These same know-your-customer tools can also automate and reduce the cost of on-boarding—which makes finding new customers that much safer, and maintaining compliance that much cheaper. Sound Good? To learn more about the current state of financial crime and compliance, and the solutions available to help you face your challenges, visit our resource center. Alternatively, if you want to experience the benefits of a modern approach right away, trial Oracle Financial Crime and Compliance Management today. 1 UNODC, "Illicit Money: How Much Is Out There?"


Analytics

Fundamental Review of the Trading Book (FRTB): How do you start on this journey?

Blog By: Rohit Verma The Fundamental Review of the Trading Book is painful, onerous and expensive to implement, and unfortunately it’s not going away. It’s an additional limb on an overflowing tree of regulatory requirements. With an estimated 50% increase in capital requirements assuming full IMA approval, and an estimated $200M cost of implementing FRTB for each Tier 1 bank, this is certainly no laughing matter, though some may laugh until they cry! The 2019 deadline seems a way off at this time, but the steps required to ensure compliance aren’t easy, and now is the time to start. I presented at the Marcus Evans Conference on FRTB back in February, and the consensus among attendees was fairly consistent: where do we start? There is a lot to do, with multiple challenges at multiple levels, and the next 18 months may not be enough time. This is a capital adequacy regulation that also impacts the way trading desks are organized and the way lines of business operate within the bank. So how do you start on this journey? I see four key challenges in addressing this regulation: methodology, data management, technology, and the intersection point of all three. But as with any regulatory guideline, there is a great opportunity to turn the regulatory burden into a strategic business initiative that streamlines data and processes for the trading book. Let me take you through each of them. Methodology FRTB introduces multiple computational challenges for banks. While the calculation of Expected Shortfall is likely to be a minor extension of the current VaR model in most banks, the bigger challenges are in the areas of model validation and risk factor identification and modeling. FRTB introduces the need for continuous monitoring of model validation results for each trading desk. In addition, there is the need to classify risk factors as model-able or non-model-able and to identify historical stress periods based on a restricted set of risk factors. 
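The point that Expected Shortfall is a minor extension of an existing VaR engine can be made concrete with a historical-simulation sketch: VaR reads the loss at the chosen quantile, while ES averages the losses beyond it. The P&L figures and the quantile-selection details below are invented for illustration.

```python
# Historical-simulation VaR and Expected Shortfall from the same P&L vector.
def var_es(pnl, confidence=0.90):
    """Return (VaR, ES) as positive loss numbers.
    pnl: daily profit/loss figures, losses negative."""
    losses = sorted((-x for x in pnl), reverse=True)     # largest losses first
    n_tail = max(1, round(len(losses) * (1 - confidence)))
    tail = losses[:n_tail]
    var = tail[-1]                 # loss at the quantile boundary
    es = sum(tail) / len(tail)     # average loss beyond the boundary
    return var, es

pnl = [-10, -4, 2, 3, -1, 5, -7, 4, -2, 6, 1, -3, 2, -5, 3, 0, -6, 4, -8, 2]
var, es = var_es(pnl, confidence=0.90)   # ES >= VaR by construction
```

Since ES reuses the same loss distribution a VaR engine already produces, the incremental computation is small, which is why the harder FRTB work sits in model validation and risk-factor classification rather than in the ES formula itself.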
There is also a need to compute capital using the standardized approach for trading desks that fail model validation tests. Banks have to build new models and methodologies, and in some cases tweak existing ones, to comply with FRTB. Data Management Data is the crux of any regulatory requirement these days, no matter which department it affects within the bank. Without the right data, you cannot accurately report on any regulatory requirement; beyond that, you cannot take the organization forward. According to a survey from Oracle and Deloitte, 67 percent of financial services institutions have a comprehensive data strategy, but two-thirds say it is better suited to complying with regulatory requirements than to driving the business forward. The FRTB requirements are forcing firms to take a careful and thorough look at their data architectures. A single source of data is no longer optional; it’s a mandatory requirement for the development and execution of a wide variety of models across trading desks. The absence of a strong data management strategy will introduce inconsistencies across different metrics that will eventually result in higher capital requirements. Technology FRTB brings new technology requirements; the reality is that most existing systems will not suffice for what’s required here. The calculations are complex, but more importantly they have to be done on a regular basis, some even daily. FRTB requires frequent validation of models and decisions based on the model validation results; the entire task of managing the process is very complex, and a flexible technology solution is needed to help accommodate changes early in the process. FRTB is unlike any other regulation in the sense that most others can be set up to run on their own. For example, you can set up your data and analytical engines to run the regulatory requirements needed for Basel III. 
The process is a smooth one that goes from sourcing data to regulatory reporting in a series of steps without much human interaction required. However, FRTB is complex and requires multiple sets of eyes reviewing the process along the way and deciding on next steps. The right technology solution in place to support this will alleviate some of the burden. Intersection point So how do you bring it all together? First, you have to recognize that all three are imperative. You cannot have two out of the three and still be successful. For example, you can have the best models and the best technology in place, but if your data is insufficient you won’t be compliant; this is a challenge that is often overlooked. When decisions are made in silos, the result is missed regulatory deadlines, delays and higher expenses, opportunity costs, and re-work and reduced employee morale. It’s best to avoid additional challenges when attacking something as complex as FRTB, and that is easily done when you have the proper setup for your methodology, data management and technology solutions, and for where they all intersect with one another. I’d love to hear from you on how your organization is addressing FRTB.   Rohit Verma is Senior Director for Risk Analytics Strategy with Oracle Financial Services Analytical Applications and can be reached at Rohit.r.verma AT oracle.com


Analytics

Live from OIC17 - Continuing the Journey of Capitalizing on Artificial Intelligence

Blog By: Arjun Ray Chaudhuri I am live at Oracle Industry Connect, where I just had the pleasure of sharing a demo with attendees during Sonny Singh's keynote address on machine learning applied to the context of the next best offer. For those of you who were at Oracle OpenWorld back in September, you may recall we introduced Jenny to you. Jenny was applying for a car loan with her bank; she was able to apply and receive an offer to buy the car right on her mobile device. Now, at Oracle Industry Connect, Jenny is back, her life fast-forwarded three years, and she is ready to pay off her car. Jenny is posed with a decision: does she take the monthly amount she was paying on her car and buy a new one, or does she put that money into something else? At the same time, her bank is evaluating its relationship with Jenny and realizes she is about to pay off her loan. Mobile banking remains Jenny's preferred banking channel, and the bank has been tracking her activities on its mobile website and app and comparing them with her dynamic peer segments for many months. For the bank, it is the right moment to pitch a marketing offer and deepen the relationship with Jenny, given that one of her products is about to be closed. After Jenny has made the last car payment on her bank's app, a marketing communication is displayed asking her if she would like to open 529 plans for her children. But how did the bank know that 529 accounts were the right offer for Jenny? Why didn't they make the offer earlier? Thanks to machine learning, financial institutions are better armed to analyze vast amounts of data, be it transaction-level data or the customer's online activity data in the form of weblogs and app logs.
With this capability, banks are able to serve their customers with data-driven marketing offers before those customers have even had a chance to think about taking that car payment to another institution or making some other purchase. The future is now, and organizations that sit back and wait to decide how to integrate these technologies into their operations will fall behind and start losing customers like Jenny. As the McKinsey article stated: "Now is the time to grapple with these issues, because the competitive significance of business models turbocharged by machine learning is poised to surge." We all know banks need strong, quality intelligence, and for a variety of reasons: customer retention, cross-sell and upsell, regulatory requirements, risk management, and the list goes on. But how can machine learning take financial institutions to the next level? Applied to big data, it offers clear benefits over traditional statistical models. According to the "Innovation in Retail Banking" report from Efma and Infosys Finacle, financial institutions understand the potential impact and benefits of AI, but they are still hesitant to act. The hesitation has a number of causes, with legacy technology environments the biggest hurdle and the lack of a unified vision for digital across the enterprise a close second. The priorities are there; the institutions know how they would leverage the technology if they had it. 78% of organizations say creating a customer-centric organization is a priority, 74% say enhancing channels to give an omnichannel digital experience is key, and 68% say maximizing the use of digital technologies such as mobile and social is important. Our example with Jenny leverages each of these. So how can banks bridge the gap between priorities and barriers? The trick is recognizing the strategic business opportunities that exist here.
Machine learning helps optimize the customer experience by personalizing marketing and engaging customers with relevant offer recommendations, processing vast amounts of information more accurately. As with any organizational change or regulatory requirement, there are cost implications and change management to monitor, but the extended benefits outweigh the concerns. You don't want to be an organization that falls behind in leveraging advanced technologies and loses customers along the way. I hope you join me next time as I continue with you on this journey of capitalizing on artificial intelligence. Arjun Ray Chaudhuri is a Product Manager with Oracle Financial Services Analytical Applications and can be reached at arjun.ray.chaudhuri AT oracle.com
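The final step of the Jenny scenario, picking the offer to surface, can be sketched very simply once a model has produced per-product propensity scores. The scores, product names and holdings below are hypothetical, and the real work (the learned scoring model) is assumed to happen upstream:

```python
# A minimal next-best-offer selection sketch. The propensity scores would
# come from a trained model; the values and product names here are made up.

def next_best_offer(propensities, held_products):
    """Pick the highest-propensity product the customer does not already hold."""
    candidates = {p: s for p, s in propensities.items() if p not in held_products}
    return max(candidates, key=candidates.get) if candidates else None

jenny_scores = {
    "auto_loan": 0.10,    # just paid off -- low repeat propensity
    "529_plan": 0.72,     # young children plus freed-up monthly cash flow
    "credit_card": 0.35,
}
print(next_best_offer(jenny_scores, held_products={"checking", "auto_loan"}))
```

The interesting part of the problem, of course, is producing scores like these from transaction logs, weblogs and peer-segment comparisons; the selection itself is trivial by design.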


Banking

The World Beyond Passwords - Biometrics & Banking

Blog By: Tushar Chitra, Senior Director, Product Marketing, Oracle Financial Services Managing passwords is a big issue for today's connected customers in a digital world. Customer expectations are changing at a fast pace and are putting tremendous pressure on banks. Security and convenience have a major impact on customer choice, and banks need to explore biometric alternatives for authenticating tech-savvy users that can balance both factors. Due to outdated processes and an increase in cybercrime, banks have experienced incredible amounts of friction and increased costs. Banks have realized that one of the best features of biometrics is that it presents an exact, unique form of identity. Customer trust in biometrics has also risen considerably in the last two years because of the convenience and the huge security benefits it has to offer. According to a study by Visa in the U.K., 85 percent of respondents showed interest in using biometric authentication to confirm their identity to banks, and 81 percent would trust payment networks. Biometrics is a priority for banks right now, and several facial, eye, voice, vein, and fingerprint recognition solutions are being implemented. Over the coming years, the global biometric authentication and identification market is projected to undergo significant growth: according to Grand View Research, the market size for biometrics is expected to reach $24.59 billion in the next six years. The other inflection point in this field has been the rapid development of biometric sensors and the rapid reduction in their costs. Many mobile manufacturers are introducing biometric sensors in their new launches, which help collect data that can be used by service providers. Every day, hundreds of biometric apps pop up in the app stores; according to research by The Globe and Mail, around 770 million biometric authentication apps will have been downloaded worldwide by 2019.
Even on the software front, the ability to convert a standard camera into a biometric device has seen an upswing. With the rising usage of smartphones and biometric authentication, banks will need to work closely not only with biometrics technology providers but also with biometrics developers to create seamless, easy-to-use solutions for their customers. Customers are looking for really simple and convenient ways of logging in. Social login, using existing accounts on platforms like Facebook, Twitter, Google+ or LinkedIn, is used extensively for this purpose. When a customer visits a site that offers social login, they have the option to register, log in with a regular ID and password, or use a widget or plug-in that connects the site to their choice of social platform. Although it is easy and convenient and helps block spam mail and fake users, it is not very secure for identification and authentication purposes. Here are some of the areas where biometrics is being explored. Banks can use biometrics beyond authentication, for example for emotion and mood detection, and accordingly cross-sell and tailor their products. A bank in New Zealand is using facial recognition software to measure its customers' behavior in different financial scenarios: if a customer has to book a flight at the last minute, the software records their muscle movements through the webcam and decodes their micro-expressions. Because the system captures such emotions better than humans do, it offers enhanced guidance on financial decisions. Recently, an Israel-based bank became the first in the world to embrace behavioral biometrics as a password replacement on mobile devices. Shopping has been revolutionized by advances in technology like wearables, mobile payments, wallets, and contactless cards.
Retailers are taking full advantage of this, creating lucrative offers and deals to give young shoppers the best experiences. Alibaba, a major player in the mobile payments market, recently launched 'Smile to Pay', or 'Pay with Selfie': just click the 'buy' button in the app and take a smiling selfie to place your order. Mastercard recently rolled out its 'Selfie Pay' in Europe and will open it up to the rest of the world by the end of this year; other major technology vendors like Apple and Google will likely follow within a couple of years. The market will soon start to see solutions that leverage biometrics for improved security and user experience as the technology becomes more advanced and the sensors on devices (fingerprint, camera, and microphone) improve. Bank customers are showing increased interest in iris-scanning authentication, facial recognition is being used for mobile authentication by companies like Mastercard, and banks have already embraced voice recognition for their call centers. Apple is also coming up with a new biometric sensor that will use a customer's heart rate, read from a wearable device, for identification. Until now, banks have implemented biometrics mainly as a replacement for mobile banking passwords, but new areas are being explored, like multi-factor authentication for high-risk activities such as payments and money transfers, cardless cash withdrawal, and so on. At the basic level, security rests on something you know, something you have, and, with biometrics, something you are, which together make it far harder to forge or tamper with financial transactions. Bank of America, HSBC, Charles Schwab, and Citi are already using multi-factor authentication for their retail customers. While banks are deploying biometrics in a big way and customers are also embracing the shift, there remains a high potential risk of cyber attacks, online fraud, and identity theft.
Banks and service providers are therefore investing heavily in cyber security measures, and banks are working closely with Fintechs to help curb cyber fraud. Fraudulent transactions can occur through multiple channels, as customers have the flexibility to access their account from any channel they wish. However, banks with advanced systems that give them a view of all of a customer's transactions will be able to prevent fraud based on their knowledge of the customer's previous transaction behavior. Alternative modes like fingerprint scanning, facial recognition, and voice recognition, which are nearly impossible to forge due to their uniqueness, also offer a convenient way to prevent fraudulent activities. Though banks are pushing hard for a cashless, branchless future, it will ultimately be customers who accept or reject the new tech-heavy banking system. My colleague Parul Jain and I co-authored this blog. We would love to hear your views. We are reachable at tushar dot chitra at Oracle dot com and parul dot j dot jain at Oracle dot com.
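The know/have/are framing of multi-factor authentication mentioned above can be sketched in a few lines. This is an illustrative toy under assumed rules (any two independent factors suffice, a made-up biometric match threshold of 0.9); real banking deployments use risk-based policies far more nuanced than this:

```python
# Illustrative multi-factor check: "something you know" (a PIN),
# "something you have" (a registered device), and "something you are"
# (a biometric match score). The two-of-three rule and the 0.9 threshold
# are assumptions for the sketch, not any bank's actual policy.

def authenticated(pin_ok, device_ok, biometric_score, threshold=0.9):
    factors = [pin_ok, device_ok, biometric_score >= threshold]
    return sum(factors) >= 2  # require any two independent factors

print(authenticated(True, False, 0.95))   # PIN plus strong biometric match
print(authenticated(False, False, 0.50))  # no factor satisfied
```

For high-risk activities such as payments, a real policy would typically demand specific factors rather than any two, which is the kind of flexibility the regulations on strong customer authentication anticipate.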


Analytics

CECL Compliance – Evolving for the future

The Financial Accounting Standards Board (FASB) published an accounting standard update in June 2016 for the estimation of credit losses for the purposes of financial reporting. The key highlights of the update include:

The earlier incurred-loss approach to providing for credit losses was considered too restrictive: it delayed recognition of a loss provision until it was probable that a loss had been incurred
The new standard requires all organizations to provide for expected credit losses based on historical experience, current conditions and reasonable and supportable forecasts
No single method is prescribed; the FASB expects the complexity of the model to be aligned with the complexity of the organization's business model and processes
Estimates have to be based on the expected life of the asset and should be assessed collectively based on similar risk parameters
Additional disclosures are required for credit quality indicators by vintage (year of origination)
A new impairment model applies to AFS debt securities
PCD assets are valued at initial recognition

Challenges Lie Ahead

Not only are these changes expected to alter the month-end and quarter-end book closure processes, they will also affect the profitability numbers published by organizations (mostly financial institutions) and will bring higher scrutiny of the assumptions taken into account in estimating the Allowance for Loan and Lease Losses (ALLL). Combine this with intersecting regulatory requirements from CCAR, DFAST and BIS, and a financial institution's understanding of the credit risk it faces becomes even more complicated. There can be multiple interpretations of how the bank views credit risk for different regulations and, even worse, how credit risk is understood and computed can differ across divisions of the same bank.

When Life Hands you Lemons… Make lemonade...or so the saying goes!
But looking at this from a different perspective, I see the changes as an opportunity to help banks, and the entire financial services industry, change and evolve for the better. Some of the benefits I see:

The standard forces the Risk and Finance teams within the organization to start aligning with each other and communicating in a language understood by both, leading to data convergence (how data is captured, processed and reported), model convergence (how risk parameters are derived and consumed), and changes in organization structures to support both
It will necessitate that an auditor investigate the assumptions made for current conditions and reasonable, supportable forecasts and compare them with similar assumptions made at peer organizations, leading to more transparency within peer groups and the unearthing of black-box models
Most internal ratings-based (IRB) banks would prefer to extend and adjust their existing credit risk models to cover CECL expectations, which exposes those models to the auditor, given that the assumptions must pass the test of reasonableness. To stay relevant in these conditions, the models and their associated technology applications also need to be transparent and auditable

Considering there is no single prescriptive model for CECL, Tier 2 institutions are more likely to enhance the incurred-loss models they use today and adjust them with reasonable, supportable forecasts. This can work in the interim, but it ultimately leads the organization toward tactical solutions that may not pass an auditor's test or scale up to future needs. Eventually most institutions will need to work towards a more strategic solution that solves the Risk and Finance intersections more holistically. Institutions that are already on the path of an integrated risk and finance model stand to benefit the most.
With the issuance of a discussion paper by BIS on the regulatory treatment of accounting provisions, other regulators can also look at options to converge these models and assess credit risk with similar objectives, arriving at a standard way of computing these provisions for either financial or regulatory reporting. The cost of compliance with tighter regulations is weighing down the financial services industry, and continued divergence and regulatory change will add to the burden; but it can ultimately lead to the standardization of data, models, processes and systems, benefiting the entire financial institution. All parties involved, from the regulators to the banks and even the vendors that support financial institutions, need to evolve, think outside the box about what the future will likely hold, and work towards holistic solutions for it. Beyond what I've outlined above there can be more reasons for evolution, such as convergence with treasury operations for valuations and hedge accounting, given that the standards there are still evolving. Geetika Chopra is a product manager with Oracle Financial Services Analytical Applications leading the solution for IFRS Compliance. She can be reached at geetika.x.chopra AT oracle.com.
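One common, though not prescribed, way to estimate a lifetime expected credit loss is to sum per-period probability of default times loss given default times exposure, discounted to present value. The sketch below uses made-up inputs purely for illustration; CECL deliberately leaves the choice of method to the institution:

```python
# Illustrative lifetime expected-credit-loss (ECL) calculation in the spirit
# of CECL: per-period PD x LGD x exposure, discounted to present value.
# All numbers are hypothetical sample inputs, not a prescribed methodology.

def lifetime_ecl(exposures, pds, lgd, discount_rate):
    """Sum discounted expected losses over the asset's expected life."""
    ecl = 0.0
    for t, (ead, pd) in enumerate(zip(exposures, pds), start=1):
        ecl += ead * pd * lgd / (1 + discount_rate) ** t
    return ecl

exposures = [100_000, 80_000, 60_000]   # projected exposure at default per year
pds = [0.02, 0.025, 0.03]               # marginal probability of default per year
print(round(lifetime_ecl(exposures, pds, lgd=0.45, discount_rate=0.05), 2))
```

Under the new standard, the auditor's attention falls on where numbers like these PDs come from: historical experience, current conditions and reasonable, supportable forecasts, which is exactly why model transparency matters.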


Banking

PSD2: Threat or Opportunity for Incumbent Banks

Blog By: Tushar Chitra, Senior Director, Product Marketing, Oracle Financial Services The Payment Services Directive (PSD), adopted in 2007, aimed to create a single market for payments within the European Union. However, limitations, such as each country having different regulations around third-party access to customer accounts, led to services being localized by country; banks were also not obligated to grant third-party providers access to their customers' accounts. In light of these limitations, the revised Payment Services Directive (PSD2 – EU Directive 2015/2366) was proposed by the European Commission in 2013. The objective behind PSD2 is to create greater convenience and choice for customers in the European Union, integrate and improve the payment process, create a level playing field for payment service providers, and foster innovation and competition.

What does PSD2 propose?

An unconditional right of refund for direct debits under the SEPA CORE scheme
A much stronger customer authentication system
A ban on surcharges for card payments
Enhanced customer protection for payments made outside of the EU or in non-EU currencies
Introduction of third-party payment service providers (TPPs) to the EU financial landscape: Payment Initiation Service Providers (PISPs), who can initiate a payment on behalf of the customer, and Account Information Service Providers (AISPs), who have access to the account information of bank customers. There will also be Account Servicing Payment Service Providers (ASPSPs) – the banks themselves, which have to provide TPPs access to customer information through APIs

How does PSD2 change the financial landscape for the European Union? According to a PwC Strategy& study on PSD2 in 2016, 88% of consumers use third-party providers for online payments, indicating a large, ready base of customers for other digital banking services like payments, financial planning and more.
PSD2 is supposed to go live in 2018 and, according to industry experts, will end banks' monopoly on their customers' information. Under PSD2, banks will be obligated to share customer information with third parties through open APIs, and these third-party providers can then build their services using the bank's data. Banks will also have to bear the increased costs of providing the security infrastructure around the APIs they expose. The result will be a dramatic increase in competition in the financial sector, with banks no longer competing only with other banks but also with non-banks and Fintechs, which will have easier access to the market. By some projections, as much as 9% of retail payments revenues will be lost to PISP services by 2020, just two years after PSD2 goes live.

How can banks respond?

The banking sector is one of the biggest spenders on technology, but the majority of this spend goes to maintenance. What has prevented traditional banks from being innovative is not the cost of acquiring new technology but hesitance to cede control and organizational inertia. This can be a very risky approach in light of both the regulator's push to level the playing field by allowing new players into the banking sector and changing customer expectations. According to the Fujitsu European Financial Services Survey of 2016, 37% of European consumers say they would change their bank if it did not offer them up-to-date technology [1]. Banks can either look at PSD2 as just another compliance requirement or turn it into an opportunity to develop new business models while delivering the services that new-age customers want. Bank as a Platform: banks can open their APIs, which external developers can use to extend the platform's functionality at a technology level, while third-party vendors use the platform to create value for consumers at a business level.
Incumbent banks should take the lead by industrializing their APIs and building their own digital ecosystems, and/or being part of an external ecosystem. This approach will help banks be more agile and create new opportunities in product creation and distribution while opening new revenue streams. There are two approaches to adopting a "Bank as a Platform" strategy. The first is to build a digital banking platform based on apps from third parties, covering products like current accounts, credit and debit cards, and instant and contextual personal loans. Effectively, the bank builds an application marketplace, like an e-commerce platform, where third-party players, FinTechs and even other smaller banks can list their financial services or products. Consumers then use the banking marketplace to consume products and services much as they would from, say, Amazon. The revenue for such a digital bank comes not from fees that end consumers pay, but from the providers that list their apps on the platform. Banks can also expand their services by adding some of these third-party modules to their own core offerings and finding new cross-sell opportunities. The second is to become part of a third-party API marketplace, like the Berlin-based Open Bank Project (OBP), with access to a community of developers who can quickly create new products and services for them. These third-party API marketplaces also offer white-labeled apps and sandboxes for banks and other third-party providers to build solutions that offer a better experience to their customers. This reduces development cost and time for banks, as they do not have to invest in building products and APIs from scratch but can use the services of the developers as and when required. Offer PISP services: banks can use this approach to extend their transaction banking offerings with low-cost, much faster API-based P2P payment solutions.
One prime example is Denmark's Saxo Bank, which opened up its APIs in September 2015 [2]. Saxo Payments uses Oracle solutions to build the Saxo Payments Banking Marketplace, which empowers Fintech businesses to deliver instant, low-cost cross-border payments capabilities to their merchant clients. Another example is Capital One, which now enables affiliates to benefit through its APIs [3]. Monetize customer data and insights: banks have traditionally collected and aggregated customer information that is of tremendous value to new entrants. Banks can use advanced analytics to draw insights from transactional data, helping new entrants better target their customers. Using open APIs, banks can act as providers of this rich data to third parties and thus create new revenue opportunities. Banks can also act as Account Information Service Providers to their customers: Barclays, for example, has a subscription service, the SmartBusiness application, for SMEs to monitor their finances in comparison to their 500,000+ peers. New AIS entrants aiming to provide similar services are limited by the number of customers who sign up, while banks already have a huge base of existing customers, a tremendous first-mover advantage. Tushar Chitra is the Senior Director for Product Marketing at Oracle Financial Services. He can be reached at Tushar dot Chitra at oracle dot com Co-author: Abhishek Shukla is a Principal Product Manager at Oracle Financial Services. He can be reached at abhishek dot as dot shukla at oracle dot com Sources: 1. Fujitsu (2016): The Fujitsu European Financial Services Survey 2016 2. Saxo Bank (September 2015): Saxo opens access to its trading infrastructure with the launch of OpenAPI 3. Capital One (March 2016): Let's Start Something Together.
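At the heart of the ASPSP obligation described above is a consent check: the bank serves a TPP's request only if the customer has granted that TPP that access. The sketch below illustrates the idea; PSD2 mandates outcomes, not APIs, so the data model, identifiers and scope names here are all hypothetical:

```python
# A minimal sketch of the consent check a bank (ASPSP) might apply before
# serving a TPP's request under PSD2. Field names, IDs and scopes are
# invented for illustration; the directive does not define an API.

consents = {
    # (customer_id, tpp_id) -> set of scopes the customer granted
    ("cust-42", "aisp-acme"): {"accounts:read", "balances:read"},
}

def authorize(customer_id, tpp_id, scope):
    """Allow the request only if this customer granted this TPP this scope."""
    return scope in consents.get((customer_id, tpp_id), set())

print(authorize("cust-42", "aisp-acme", "balances:read"))   # granted
print(authorize("cust-42", "aisp-acme", "payments:write"))  # never granted
```

In practice this check sits behind strong customer authentication and TPP certificate validation, the security infrastructure whose cost the post notes banks must bear.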


Banking

Banks Must Dive In and Ride the Digital Wave

Blog By: Tushar Chitra, Senior Director, Product Marketing, Oracle Financial Services Evidence of banking transformation is mounting every day. One need look no further than FinTechs and their pace of growth. Global financing of FinTechs is estimated to have increased sevenfold [1] in the three years between 2012 and 2015. This momentum did not abate in 2016, and FinTechs are likely to continue their bullish run through 2017. Without doubt, high growth will be the new normal for FinTechs. In tandem, digital will continue to spread across every aspect of consumer life, and FinTechs are set to benefit further from this rising digital adoption. Consider the success of mobile taxi apps. Pioneering mobile app innovation, they have successfully redefined the cab landscape, and they even created a new group of customers for banks: by moving their newly registered cab drivers into the banking ecosystem, they shared part of their success with banks. Banks happily welcomed them, adding a large number of new accounts in a matter of days; left to themselves, banks would have taken a few years to add the same number of new customers. Such is the impact created by disruptors, whose effects transcend sectors and industries. 2016 also recorded a historic feat, largely unnoticed: for the first time, the number of card payments surpassed that of cash payments [2]. Consumer behavior and technology innovation outside of banking is now cascading into banking. Evaluating the changing landscape and its outcomes is absolutely necessary to a bank's survival. Let us continue with the example of the taxi app, where roping in and retaining drivers is fundamental to success. Many cab drivers opened bank accounts only because of these apps and thus became part of the financial ecosystem.
Now the taxi apps can provide incentives to their drivers through a variety of value-added banking services to increase driver loyalty, and the driver network can benefit from new banking services, offered through the app, that these drivers might otherwise not have access to. For example, if a cab driver were looking to upgrade his vehicle, he would need a new car loan. With full knowledge of that driver's history, the taxi app would be better placed than the partner bank to originate this loan; however, it would still be the bank that offers the loan. Banks benefit immensely by selling more products and services to new customers acquired through such a partnership. By simply participating in such an ecosystem, banks can chart a new path to success. Disruptors such as mobile apps and other FinTechs cannot really act as a bank, and therefore they look to partner with the best banks, those that are both agile and innovative. In a fast-changing world, banks must recognize that success can come in different ways, forms and places. Going forward, it will be a world of collaboration and partnerships, where a bank will likely play a new but crucial role. Globally, regulators and governments are supporting innovation in the financial sector; Europe's PSD2 and the UK's Open Banking Standard are important initiatives in this context. For example, PSD2 will enable third-party providers such as retailers to compete with traditional payment providers: with the consent of consumers, retailers can directly seek information from their bank accounts to complete payment transactions without the need for intermediaries. A new group of information aggregators, who keep track of a consumer's multiple bank accounts and financial history, is also emerging. Such aggregators will help the users of their platforms engage in ways that benefit consumers.
Consumers can reap the benefits of this arrangement, but only if they are willing to share their financial history with a trusted partner. By amending policies in the financial sector, regulators hope to create an environment that fosters innovation, increases efficiency, boosts competition and provides a level playing field. However, consumers' consent will be mandated in order to use their data, and compliance requirements for data security and consumer privacy are set to become more stringent as banks begin to open up as mandated by regulators. Beyond the drive from regulatory authorities, we also see fundamental shifts in technology affecting the banking landscape. State-of-the-art applications of the internet of things, blockchain and artificial intelligence, such as machine learning, have the potential to change how banks operate and deliver products and services. Unleashing these technologies early in the adoption cycle inside a banking setup can yield good results, provided banking executives are receptive to innovation; robo-advisory, for example, is catching on. As we move into an internet-of-things world, we can further expand our earlier taxi example. Consider a scenario where the cabs are part of an IoT ecosystem: the vehicle generates a continuous flow of information about its usage, condition, driving style and maintenance records. Considering this level of detail for a car loan would certainly help determine a fair value for the old car and also help reward good drivers – at least until cabs become driverless. One can safely assume that most banks today are not in a position to use this level of information from IoT to make decisions. Inertia in responding to technology advancements, and this unpreparedness, will significantly hamper banks' growth. The amount of regulatory and compliance requirements will also increase with new technologies.
As regulations compound and volumes of data explode, the challenge for banks will be to deal with this complexity. If it is not addressed appropriately, banks can become vulnerable to data theft and hacking. At the same time, this presents an opportunity for an emerging group of technology companies who specialize in securing banks with solutions for the complexities resulting from the combined explosion of digital, data and regulation. We can expect steady growth for these companies, called 'RegTechs', short for 'regulatory technology'. As banks get ready to face the incoming tide, here is a list of key criteria fundamental to any strategy. Know Your Customer: banks have been gathering customer data and financial histories for many years, and this data resides in the many systems banks have been running during this time. Despite the wealth of customer information residing within, many banks are unsure how much they really know about their customers. The number of customers who transact using or with FinTechs and third-party players will continue to rise. These newer players, who sit in front of the customer, will start enriching their own databases with valuable customer transaction information. They are usually smaller and nimbler, without legacy baggage, and operate modern systems designed for scalability. Accumulating customer transaction data over time, the newer players will be able to generate invaluable customer insights, very similar to what banks are aiming to do today. Without doubt, banks at the moment have an enormous advantage over the newer players. However, to stay relevant and essential, banks must consolidate all the customer data already residing within their many systems and synchronize all the customer data flowing through their different channels today.
Build Partnerships – but start early to influence and innovate: Partnering for growth will be a key element of a bank's strategy. Banks can establish an ecosystem by collaborating with third-party players, retailers, FinTechs, RegTechs and others, allowing for innovative methods of product manufacturing, delivery and customer onboarding. The real test is in their ability to influence and build their partner network. It is important that banks start early and exhibit leadership in developing their ecosystem rather than becoming late participants; otherwise, they risk losing the ability to offer their own infrastructure on their own terms and with their own certification requirements. They should constantly engage with IT users, partners and developers to nurture innovation and stay relevant in the evolving landscape. The significance of APIs and an open API ecosystem is growing, largely because of their ability to establish such partnerships. To participate effectively in this API ecosystem, banks should first simplify how information is exchanged between systems internally and reduce the difficulties arising from legacy systems. While building partnerships, they need to enforce policies that ensure security and protect privacy.

Make Digitization a Top Priority: Digitization is crucial to banking processes, with customers initiating transactions anytime and from anywhere, through FinTechs or other players. Customers expect instant gratification, without any hassles, 24x7. Banking partnerships can only be built on the ability to fulfill customer needs. A growing partner network will further increase the need for concurrent usage and reconciliation. Beyond responding in real time, banks should be able to use the data generated by customers to engage them contextually, either directly or through their partners. It is therefore imperative for banks to increase automation and improve systems integration.
Digitization is vital to improving productivity, operational efficiency and security, and to lowering risk.

Get Serious About Cloud: Only the cloud can provide the ability to handle massive volumes of data without hampering the performance of systems crucial to banking. Cloud will improve a bank's agility and scalability. Non-critical functions and non-sensitive information are easy starting points for a move to the cloud; private and hybrid clouds are options for more sensitive and core business functions. Using cloud, banks can offer more services at lower cost – the economics clearly favor cloud over on-premise options. With increasing partnerships, globalization and newer technologies like IoT set to take off, the volumes of data to be handled will multiply exponentially. Cloud will move from being an option available to banks to something fundamental to their existence.

In the evolving banking landscape, the above four criteria can be considered essential for banks to break the mould, but this is by no means a complete list.

Tushar Chitra is the Senior Director for Product Marketing at Oracle Financial Services. He can be reached at Tushar dot Chitra at oracle dot com


Analytics

Technology: The Core of a Successful Compliance Program

In this video blog, Saloni Ramakrishna, author and senior director at OFSAA, discusses the unified platform approach to compliance and managing regulatory demands. In the financial services space, there is no way to manually manage the “Multi” dimension of regulations – Multiple regulations, Multiple regulators, and Multiple geographies. Technology is THE business enabler that harmoniously weaves these “multiple aspects” of regulatory demands together. Watch this video blog to learn why building for compliance alone is sub-optimal, and how building for business excellence naturally subsumes compliance needs while addressing larger business goals. Ms. Saloni Ramakrishna, author of Enterprise Compliance Management - An Essential Toolkit for Banks & Financial Institutions, is a financial services industry practitioner with nearly three decades of experience. She brings to the table rich hands-on knowledge and real-world perspectives in the Risk, Compliance and Performance areas. In her role she interacts with senior management of banks, consulting professionals and regulators across multiple countries. She is invited to share her views on industry trends by national and international finance forums such as GARP, Ops Risk Asia, RiskMinds and Asian Banker. Her ideas have appeared as articles and quotes in print and online media and in television interviews.



What price a better pricing strategy for banks?

Blog By: Arjun Ray Chaudhuri The world of financial services is changing and banks must be ready to respond to these changes. This is the thinking behind a new Oracle report, “Responding to change – how are banks using information and pricing strategies to boost profitability?” The report explores and explains the results of a recent Oracle and Efma survey and also includes some observations from two Think Tank sessions hosted by Efma and led by Oracle. Two previous blogs in this series have looked at a couple of inter-related key elements of a response to changes that affect the financial services sector – the effective use of customer information by banks and the increasing potential for using big data to enhance customer engagement and profitability. This blog will focus on another key issue - the importance of powerful pricing strategies. Trends in pricing strategies A bank’s approach to pricing can have a significant impact upon its profitability. If a pricing strategy is right, it will enhance the customer experience and will help to boost profits. If it’s wrong, the results for a bank can be disastrous. Pricing has now become even more important, as customer loyalty can no longer be taken for granted. In the past, customers were sensitive to fees but were also very loyal to their banks and would still tend to buy products from them even if the price was slightly higher than from other banks. Product-based pricing was prevalent throughout the industry. Although this is still the primary strategy adopted by many financial institutions, others have since moved on to risk-based pricing. The results from our survey show this trend. Relationship-based pricing A few of the more progressive banks have gone even further and now rely on relationship-based pricing. This still takes risk into account but also includes different approaches to rewarding customers for their loyalty to the bank. 
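To make the distinction between these approaches concrete, here is a toy sketch of relationship-based pricing, with entirely illustrative weights that are not taken from the report or any Oracle product: the fee starts from a product base price, adds a risk-based premium, and then discounts for the breadth and tenure of the customer relationship.

```python
def relationship_price(base_fee, risk_score, relationship):
    """Toy pricing model: product base fee, plus a risk premium,
    minus a loyalty discount for the relationship. All weights
    are illustrative."""
    risk_premium = base_fee * 0.10 * risk_score          # risk-based component
    loyalty_discount = base_fee * min(
        0.05 * relationship["products_held"]
        + 0.01 * relationship["tenure_years"], 0.30)     # discount capped at 30%
    return round(base_fee + risk_premium - loyalty_discount, 2)

# Two customers with the same product and risk, but different relationships.
new_customer = {"products_held": 1, "tenure_years": 0}
loyal_customer = {"products_held": 4, "tenure_years": 12}

print(relationship_price(100.0, 0.5, new_customer))    # 100 + 5 - 5  = 100.0
print(relationship_price(100.0, 0.5, loyal_customer))  # 100 + 5 - 30 = 75.0
```

Even this crude model shows why relationship-based pricing demands a unified customer view: the inputs (products held, tenure) typically live in different silos, which is exactly the barrier the survey respondents describe.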
Banks are also starting to differentiate in terms of the customer experience provided. This approach is still under debate within the industry, and legacy systems and bank silos are delaying its adoption in many areas. Our survey suggests that Western European banks in particular are still reluctant to move to a relationship-based pricing strategy. To explore this issue a little further, in-depth interviews were conducted with five banks from different geographical regions. For example, a respondent from Russia said that his bank was using all three of the key approaches to pricing (product-based, risk-based and relationship-based). However, it is focusing mostly on relationship-based pricing and intends to develop a seamless approach that enables all of its business lines to access the same information. Meanwhile, a bank in East Asia, although currently using product-based pricing, is also trying to develop an approach based on the customer relationship; the bank is planning to roll out a special customer value tool to help with this. In contrast, a financial institution in Central and Eastern Europe is still focusing mainly on a risk-based approach, although an increasing number of banks in the region are now starting to develop a more customer-centric strategy. Finally, two other banks (based in the Czech Republic and the Middle East) aren't really using relationship-based pricing at all. One of them has developed some very simple pricing strategies but wants to start moving towards a basic type of relationship pricing. The other is still focusing on product-based pricing but is also moving towards a relationship-based approach, as it is now starting to work on a model centered around Customer Lifetime Value. Moving forward There is little doubt that price is once more becoming a major factor when customers are deciding which bank to use.
Relationship-based pricing is the way forward for banks, even though it might mean a major cultural change – including the elimination of silos. As banks move over to this approach and start rewarding their customers for their loyalty, this is likely to be reflected in greater success in both acquiring and retaining customers. Banks that fail to grasp the nettle of a progressive pricing strategy might find themselves 'stung' in other ways – by a loss of customers and a loss of business. So, banks can't afford to be complacent and 'keep doing things the way we've always done them'. Times are changing and banks need to change with them – or risk being overtaken by events. To read the full report on Responding to change, please visit:

Arjun Ray Chaudhuri is a Senior Principal Product Manager for Oracle Financial Services. He can be reached at arjun.ray.chaudhuri AT oracle.com.



A Video Blog Series #3: Building an ecosystem to future-proof banks in a fast-changing regulatory reporting landscape

In the second video blog of the 3-part series, Saloni Ramakrishna, author and senior director at OFSAA, discussed AnaCredit, a new regulation from the EBA that is part of the evolving European regulatory reporting framework. She detailed the nuances of AnaCredit, emphasizing that there is more to the regulation than meets the eye. In the last video blog of this series, Ms. Ramakrishna shares her thoughts on how to build a future-proofing construct into the overall ecosystem we create for regulatory reporting.



Big data – a vast pool of potentially invaluable information

Blog By: Arjun Ray Chaudhuri Businesses - including banks - are constantly bombarded with a colossal amount of information, thanks to the ever-growing influence of the Internet. But how can financial institutions harness this information and start using it for their own purposes? Even finding and identifying the right information can seem like a headache – a bit like finding a stone in a raging torrent.  You need to know how to look and where to look. And then, when you have the information, you need to know how to use it to boost your profitability. This is an important area that Oracle’s been exploring in a recent report, “Responding to change – how are banks using information and pricing strategies to boost profitability?” The details of this joint Oracle and Efma publication can be found in my previous blog, “Are you making the best use of information?”, which looked at how banks are using their existing customer information. The potential of big data However, there is a much wider pool of information that is now available and can provide some fascinating insights into customer behavior. So-called ‘big data’ includes both structured and unstructured data from many different sources – including the many social networks that have sprung up across the Internet in recent years. Our study showed that much of this information remains untouched and unused by banks – and that relatively few senior executives have really grasped its potential for transforming the profitability of their organizations. The amount of data being produced continues to increase exponentially – and the gap between this amount and its usage by banks is also continuing to grow. This is partly because the data needs to be properly identified and prepared first, so that it can then be analyzed effectively - and also because the skills and resources required are still scarce. Banks find it hard to keep pace with the volume of data being produced and are unsure how to use it effectively. 
Using big data effectively Our survey therefore looked at how banks are progressing in meeting the challenge of using big data. On the positive side, it showed that most banks are now beginning to understand the importance of big data and the need to make it a priority in their future planning strategies. Banks take very different approaches to their big data journeys; the vast majority, however, are still in the planning stages or are only just beginning to find ways of using big data. This finding was confirmed by in-depth interviews with banks from different geographical regions. For instance, a financial institution in Russia said that its big data journey was still in the very early stages. However, it hopes to develop its capabilities in the future – indeed, big data is high on the agenda for most of the banks in the region. Two banks (one in the Czech Republic and one in the Middle East) said that they were also starting off on their big data journeys – one with the idea of using data from non-banking partners, and the other by developing analytical models. In contrast, a bank in East Asia said that it doesn't really have time to focus on this topic at the moment – and another bank in Central and Eastern Europe commented that it didn't have a sufficiently critical mass of information. We also looked at the types of big data that banks are using and how they are using it. Most seem to be augmenting their existing data with structured data, although there is a greater emphasis on unstructured data in the US. At the moment, big data is mainly being used for 'quick wins', such as improving decision-making and enhancing the customer experience. Other areas – such as using the data to reduce fraud or to improve risk assessments – have been largely overlooked, even though these could deliver impressive returns for the banks. No gain without pain Unfortunately, the whole process of collecting, analyzing and using big data is an expensive business.
However, it’s one that banks can’t afford to avoid. As a starting point, targeting can be a relatively easy and effective way of leveraging big data. This might mean exploring customer interactions on channels such as social media in more depth, and using geolocation and other data to make the right offer to the right customer in the right location. For instance, offers can be sent to the customer based upon the retail outlets within their vicinity. This will also help the customer to engage more with the bank, and enhances their perception of the value of the bank’s services. Ultimately, big data could open up a host of new opportunities for banks. The main question is whether they are both willing and able to take up the challenge. The key to success will lie in each bank’s readiness to invest more time and money in the big data journey. If the willingness is there, this could lead to great rewards in the future. To read the full report on Responding to change, please visit Arjun Ray Chaudhuri is a Senior Principal Product Manager for Oracle Financial Services. He can be reached at arjun.ray.chaudhuri AT oracle.com.        



Are you making the best use of information?

Blog By: Arjun Ray Chaudhuri Information is the lifeblood of any business, and the financial services sector is awash with useful information – both valuable customer details and a wealth of 'big data', some of which will be useful and some of which won't. So, are banks making optimum use of the information that's available to them? That was one of the key questions Oracle has been addressing in a new report, published jointly with Paris-based Efma. Efma, for those of you who don't know this not-for-profit association, brings together senior executives from financial institutions across Europe and beyond. The report, entitled “Responding to change – how are banks using information and pricing strategies to boost profitability?”, is based on the results of a survey conducted by Efma, along with two Think Tank sessions organized by the association and led by Oracle. Senior executives from financial institutions worldwide were questioned during the study. Collecting the information The use of big data and pricing strategies will be covered in future blogs. But what can our survey and the Think Tanks tell us about the use of basic customer information? The report starts by looking at how banks gather and use information so that they can have a 360-degree view of their customers. This is vital for increasing customer engagement and retention and for optimizing profitability. It was immediately clear that although the vast majority of banks use information from their customers' financial transactions, relatively few make use of the wide array of information available on social media, either directly or indirectly – even though this can give valuable insights into customer attitudes and behavior. This was confirmed by in-depth interviews with five leading banks in different geographical regions. Some of the banks are using financial and other details obtained from their customers' accounts.
Although banks generally seem to agree that a 360-degree customer view is desirable, many are hampered by legacy systems, where information is stored in many different places and in different formats. This reflects the pressing need for a central source of information in many banks throughout the world. Analyzing and using the information In terms of analyzing the customer information that already exists within their banks, financial institutions are again missing some golden opportunities. Over half of the organizations questioned in our survey don’t use real-time analytics and only a small number use these on a daily basis. The in-depth interviews showed a varying picture across the different regions. For instance, a Russian bank has a team of analysts and employs various analytical approaches, although it’s not yet really employing real-time analytics. Two banks from East Asia and the Middle East said that they currently lack the capabilities needed for real-time analytics. Meanwhile, a financial institution in the Czech Republic is using machine-learning algorithms – and a Central and Eastern European bank is exploring the use of different types of data in real-time analytics. This bank is also using customer segmentation to help it to generate different campaigns for different types of customers. Once banks have collected and analyzed customer data, the results are invariably used for refining customer-focused marketing campaigns. However, few are taking this to the next level by using the results from the data to increase customer engagement throughout the campaign. An emphasis on customer engagement In a changing world, where banks face increasing competition from fintechs and new entrants from outside the financial services sector, they need to start making much better use of the wealth of customer information that they’ve accrued. 
This will involve gaining a greater insight into individual customers; identifying those that are most valuable; and finding ways of enhancing service levels and increasing customer engagement. Some progress is already being made in this area. However, for a few banks, the customer experience is still a relatively new concept. A bank in East Asia is now starting to grapple with this and wants to develop more seamless processes. In Central and Eastern Europe, a bank is exploring the use of customer behavior patterns to target its campaigns more accurately. And in the Middle East, a financial institution has recently completed a comprehensive customer-centricity training program. As a result of these changing demands, financial institutions need to explore the customer relationship and look at customers' needs, their pressure points and their relative profitability. This new emphasis on customer engagement will be a challenge for most banks, but it's one that they must face and overcome if they want to survive in the difficult times that lie ahead. To read the full report on Responding to change, please visit:

Arjun Ray Chaudhuri is a Senior Principal Product Manager for Oracle Financial Services. He can be reached at arjun.ray.chaudhuri AT oracle.com.



A Video Blog Series #2: Paradigm Shifts in the European Regulatory Reporting Framework with focus on AnaCredit, the new requirement from EBA

In the first video blog of the 3-part series, Saloni Ramakrishna, author and senior director at OFSAA, discussed the overall context and background of European regulatory reporting. The second video focuses on AnaCredit, a new regulation from the EBA that is part of the evolving European regulatory reporting framework. She details the nuances of AnaCredit, emphasizing that there is more to the regulation than meets the eye. The last video blog of this series will detail building an ecosystem to future-proof banks in a fast-changing regulatory reporting landscape.



A Video Blog Series: Paradigm Shifts in the European Regulatory Reporting Framework with focus on AnaCredit, the new requirement from EBA

In the first video blog of the 3-part series, Saloni Ramakrishna, Author and Senior Director for Oracle Financial Services Analytical Applications, discusses the overall context and background of European regulatory reporting. She reviews the four pillars on which regulators base their decisions and interventions to keep the market stable: Harmonization, Integration, Coherence, and Consistency. The following two video blogs of this series will detail AnaCredit, the challenges the evolving regulatory landscape presents, how to address them, and how banks can future-proof themselves.



Aligning Risk and Data Governance Across your Financial Institution

There is a data deluge in the financial services sector. New regulations, requirements, and sophisticated analytical tools have conspired to drive huge demand for richer, more abundant, and more readily available data. For you, the Chief Risk Officer, this is proving to be both a blessing and a curse. In risk, data is knowledge, and knowledge is insight. These growing volumes of rich data can help you learn more than ever before about business practices and make it far simpler to identify potential areas of risk. On the flipside, however, more data is a greater risk in and of itself. All of that extra data is valuable, and if it's not being managed properly between different silos and lines of business, it's not just being underutilized – it's vulnerable. It can make your life much easier or significantly harder depending on how it's managed, and if you want to consistently turn rich data to your advantage, you must align accounting, risk, and financial processes to ensure all data is collected and managed in the best way possible. The key to great risk data The difference between “acceptable” and truly great Risk Data Aggregation and Reporting (RDAR) lies in how the data is collected and managed. If data is siloed or unavailable to key stakeholders and risk-assessment staff, it loses most of its analytical value. But if you can break down the silos and collect and collate data under a single pane of glass, you can simplify reporting and put yourself in a better position to spot areas of risk and profit opportunities. Overcoming the barriers to risk reporting excellence With that single view and unified platform for enterprise data in place, the next step is rolling out new governance policies to ensure that everybody across all departments handles, captures, and stores data correctly using the new system. To do this, you'll have to work closely with the Chief Data Officer.
They have the final say on data policy and governance, and only by discussing with them what you want to get from enterprise data will you be able to ensure that the right practices are enforced enterprise-wide. The CDO will also be able to help you integrate your risk data aggregation and reporting systems into your core data platform, so that they can effectively utilize fresh, timely data from all departments and lines of business. Surprisingly, despite its clear benefits, few FIs have managed this yet. A recent study from Chartis on Risk Data Aggregation & Reporting Solutions found that only 7% of surveyed organizations had integrated their Risk Data Aggregation and Reporting systems into other business areas. Building lasting partnerships with top data stewards As Chartis summarizes in its report: “The trinity of the CRO, CDO and Big Data should be at the heart of how FIs approach business, and they should all be closely aligned.” Building a strong relationship with the CDO and working with them to improve governance and data management standards across all departments – especially accounting and finance – is your key to better data-driven risk management. During this process, it may also be beneficial to meet with the different lines of business to discuss why you've put a new data management platform in place and created new governance policies to support it. By helping them understand your motivations, and how better risk management will benefit their departments in the long term, you can get the kind of enterprise-wide support and alignment that you need for new initiatives like these to succeed. Good governance demands good technology Aligning essential data stakeholders and processes across your organization is just one piece of the puzzle.
Another is making sure your data and analytics technologies can handle the breadth of data generated by today’s FIs, and that you have the capabilities in place to effectively unify the data everyone is generating. Take a look at how Oracle Financial Services Analytical Applications can help ensure your FI is ready to deliver consolidated, consistent and authoritative data across the enterprise, and take the first step towards eliminating risk with better visibility and reporting today.
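As a closing illustration of the "single view" this post describes – a toy sketch with hypothetical line-of-business records, not an OFSAA feature – exposures held in separate silos can be collated into one report keyed by counterparty:

```python
from collections import defaultdict

def aggregate_exposures(silos):
    """Combine per-silo exposure records into a single view keyed by
    counterparty, so risk can be reported across lines of business."""
    totals = defaultdict(float)
    for silo_name, records in silos.items():
        for rec in records:
            totals[rec["counterparty"]] += rec["exposure"]
    return dict(totals)

# Hypothetical exposures held in three separate LOB systems.
silos = {
    "retail":    [{"counterparty": "ACME Corp", "exposure": 1_000_000}],
    "corporate": [{"counterparty": "ACME Corp", "exposure": 4_500_000}],
    "treasury":  [{"counterparty": "Globex", "exposure": 2_000_000}],
}
print(aggregate_exposures(silos)["ACME Corp"])  # 5500000.0
```

The point of the sketch is the governance implication: the total exposure to ACME Corp is only visible once the retail and corporate silos feed one aggregation layer, which is exactly what siloed RDAR misses.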



How the CFO can Prepare for the Future of Financial Regulations

The rules and regulations placed on financial institutions are constantly changing, and trying to adapt and ensure you're compliant with them all is far from a simple task. Preparing for changes in advance is difficult, as nobody can truly predict what's coming. But if we look at what's happening to the world's largest banks, we can gain some valuable insight into what the future might hold for all financial institutions. BCBS 239 and beyond For the GSIBs at the very top of the financial services industry, a new set of regulations and requirements has brought major changes to the way they manage data governance and look at their enterprise data infrastructure. Under BCBS 239, GSIBs have been asked to comply with a number of strict new regulatory requirements that go beyond the data itself and instead look at the company's data management infrastructure, data lineage, and governance practices. The motivation for this is clear: to fully ensure data consistency, you have to look beyond how specific data is being handled – you have to address issues with the big picture, the entire data infrastructure, and the impact it has on the quality and security of an institution's data. What does it mean for the rest of the world's banks? Leading with these new principles makes sense, as it enables regulators to eliminate data issues at their source instead of just fighting fires as and when problems occur. It's a bit like the “teach a man to fish” proverb: regulate a company's data and it becomes compliant today; regulate its data infrastructure and it becomes compliant for a lifetime. With that in mind, it would also make sense if we saw those principles applied to all banks, in one form or another. If the world's biggest banks successfully adapt to these new regulations, the regulatory environment could very quickly shift everybody's focus towards data infrastructure and governance.
It’s not just a matter of compliance

While only a handful of banks are currently required to comply with the infrastructure and governance requirements outlined in BCBS 239, many others may want to consider meeting those standards anyway. Those that have been pushed to comply are now beginning to see the real business benefits of a strong, integrated enterprise data environment, and the benefits go far beyond compliance. It’s making reporting, analytics, and risk far simpler to manage, and becoming a real source of competitive advantage. For some other large banks that fall just outside of the GSIB catchment, adopting the same principles is becoming a necessary requirement if they want to keep pace with their competition.

So what can the CFO do about it all?

The CFO is highly experienced in dealing with regulatory requirements and ensuring that their company is compliant, and if they want to prepare for potential future changes, they need to start thinking proactively about what they can do to improve their data infrastructure today. There are many ways to improve data management across the entire enterprise, but for many FIs, the first step is standardization. To maximize data quality and ensure compliance with new regulations, data needs to be stored in a unified, integrated way that allows it to be accessed, analyzed, and shared with key stakeholders across the entire enterprise—and often outside of it. This means:

- Standardizing the way you collect, handle and store data
- Standardizing the platforms used to manage and store data
- Standardizing across different LOBs and data silos
- Standardizing the way data lineage is tracked and maintained

By doing this, you can create a single source of truth where you and your team can find clean, consistent and verified data that you can rely on.
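To make the idea concrete, here is a minimal sketch of what building that single source of truth can look like in code. This is purely illustrative: the silo field names, units, and the `CanonicalRecord` schema are all invented for the example, not drawn from any particular product.

```python
from dataclasses import dataclass

# Hypothetical illustration: two LOB silos report the same customer's
# balances with different field names and units; a thin normalization
# layer maps both into one canonical record so downstream reporting
# reads from a single source of truth.

@dataclass(frozen=True)
class CanonicalRecord:
    customer_id: str
    lob: str            # line of business the record came from
    balance_usd: float  # always stored in a single unit/currency

def from_mortgage_silo(row: dict) -> CanonicalRecord:
    # The mortgage system keys balances by "outstanding_principal" (dollars)
    return CanonicalRecord(row["cust"], "mortgage", float(row["outstanding_principal"]))

def from_cards_silo(row: dict) -> CanonicalRecord:
    # The card system reports balances in cents under "bal_cents"
    return CanonicalRecord(row["cust_id"], "cards", row["bal_cents"] / 100.0)

def unify(mortgage_rows, card_rows):
    """Normalize every silo's rows into the shared canonical schema."""
    records = [from_mortgage_silo(r) for r in mortgage_rows]
    records += [from_cards_silo(r) for r in card_rows]
    return records

records = unify(
    [{"cust": "C1", "outstanding_principal": "250000"}],
    [{"cust_id": "C1", "bal_cents": 787900}],
)
total = sum(r.balance_usd for r in records)  # 257879.0 across both LOBs
```

The point of the sketch is the shape, not the fields: once every silo is forced through one canonical schema at capture time, enterprise-wide totals and lineage questions become simple queries instead of reconciliation projects.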
As the CFO, it’s up to you to work closely with the CDO and the CRO to think “architecture first” and consider what changes you need to make to your data architecture in order to achieve governance and data quality excellence.


Analytics

Three Steps to Optimized Financial Data Governance

Over the past decade, the role of meaningful, actionable data has completely changed within the modern Financial Institution (FI). While clean and consistent data was once a luxury that could help you get ahead and improve your practices, today it’s a “must have”, and as Chief Data Officer (CDO), your role has been created to ensure that it’s in place. It’s no secret that different institutions have different motivations for improving data quality, control, and governance. For some, it’s just a regulatory requirement—a box-ticking exercise that they’re legally obliged to complete. But for many others, it’s becoming a key competitive differentiator. Forward-thinking institutions recognize that their new Chief Data Officer isn’t just someone who’s going to help them ensure that they’re ticking all the right boxes—they’re also a powerful new strategic force that can lead the company to success. If you want to make a real difference as CDO and empower your entire organization with the high-quality, highly available data it needs to get ahead, your first task is achieving excellence in data governance and meeting the strict new principles outlined in BCBS 239. Creating the kind of controlled environment laid out in these principles is no simple task, but we’ve identified three key steps that can help you adapt and start changing the way you manage enterprise data today.

1. Standardize how data is stored, accessed and reconciled

When data is siloed, incompatible with key systems, or difficult for your people to access, it very quickly loses its value—and can seriously slow down managerial and regulatory reporting. Standardizing your data requires a careful blend of technology and processes to ensure all data is extracted, transformed, and stored in a way that keeps it timely, relevant and accurate. More often than not, it’s an FI’s technology that lets it down.
Not all data platforms are created equal, and many solutions are unable to apply governance rules consistently, or early enough in the capture process. Today’s FIs need a solution that can provide them with a unified staging area where all data can be treated, consolidated, and standardized as soon as it’s captured—ensuring consistent data quality across the enterprise.

2. Establish a single source of truth for all enterprise data

Once data is captured, you also want to make sure it is highly visible, and available to any person, group, or application that may want to use it. If you want to unify all of your data, you have two main options:

- Integrating a variety of point solutions to tie your risk, performance, compliance and customer data together
- Using a unified platform that can help improve performance beyond just integration

The former is in place in many enterprises already, and while it seems like the simpler way to “bolt on” data transparency, it can introduce further complexities in the long run. Acquiring all the necessary integrations can be costly, complex, and in some cases can even create break points that bring your data unification efforts to a screeching halt. Though it may cost more upfront, a platform designed with data unification in mind can help simplify data management, enable use of additional analytics, and further simplify compliance with regulations such as Basel III and the BCBS 239 principles.

3. Create a cultural change around data quality

For data and analytics technologies to truly shine, the right processes need to be in place across the organization to route data to the right systems and people in the right way. But dedicated data teams can’t do all the heavy lifting alone. Data comes from all LOBs in the modern FI, and HR, Finance, Sales, Management and IT all have a responsibility to uphold data quality. Getting all these people on board requires a complete cultural shift for some FIs.
Achieving this cultural change requires rigorous processes to be applied throughout the organization, combined with the relevant education to ensure that they are followed correctly. By changing behaviors in all LOBs, you can make sure data capture and processing remains consistent across the enterprise—simplifying your regulatory reporting operations. Great infrastructures power great data governance Following these steps can help you drive your FI towards data excellence, but to get there, you need a data platform that offers all of the high-quality data, and all of the right management, analysis, and reporting tools you need to succeed, in one place. Oracle is already delivering solutions that can help standardize data capture and distribution, establish a single source of truth for all data, and help you go above and beyond what’s expected of the CDO and drive real strategic change across your enterprise. Discover how Oracle Financial Services Analytical Applications can help your FI take control of your data environment once and for all.


Analytics

Business Case for Positive and Active Compliance Management (PAC-M)

In her sixth video blog of the series, Practical Solutions to Vexing Compliance Challenges, Saloni Ramakrishna touched on how some of the practical solutions to vexing compliance challenges may not be in formal rule books and require “out of the box” thinking. In her seventh video blog, Business Case for Positive and Active Compliance Management (PAC-M), she discusses how the compliance funding conversation is limited by viewing compliance as just a cost function, when in reality it is as much a revenue generation and revenue preservation function. Ms. Ramakrishna flips the challenge into a benefit statement and touches on the three major categories of benefits created by active compliance: direct benefits, costs saved, and intangible benefits. Watch Saloni Ramakrishna’s seventh video blog of the Compliance Risk Management Series, where she lucidly articulates a strong business case for a positive and active compliance program. The next and final video blog in the Enterprise Compliance Risk Management series is Technology at the Core of a Successful Compliance Program. Ms. Saloni Ramakrishna, author of Enterprise Compliance Management - An Essential Toolkit for Banks & Financial Institutions, is a financial services industry practitioner with nearly three decades of experience. She brings to the table rich hands-on knowledge with real-world perspectives in the Risk, Compliance and Performance areas. In her role she interacts with senior management of banks, consulting professionals and regulators across multiple countries. Ms. Ramakrishna is invited to share her views on industry trends by national and international finance forums such as GARP, Ops Risk Asia, RiskMinds and Asian Banker, among others. Her ideas have appeared as articles and quotes in print and online media and in television interviews.


Analytics

Regulatory Burden May Lead to the Emergence of “Risk Utilities”

New regulations forcing businesses to provide consistent and timely evidence of accountability are hitting processes and operating costs hard, and are leading to the emergence of third-party, specialist “financial crime and compliance risk utilities”. Matthew Long of Oracle looks at the pros and cons of outsourcing compliance and risk to third parties. The dust may have settled following the crash of 2008, but the financial-services sector is still feeling the reverberations to this day in terms of risk and compliance regulations. The fallout from the Panama offshore financial-services leak may result in further action, and governments and regulators are responding to calls to prevent such an event happening again and are overseeing reforms designed to change practices and behavior. These reforms mean that there has never been a tougher—or more rewarding—time to work in financial-services compliance and financial-crime risk departments. While the fact that it’s a fast-growing and increasingly important function may make it an attractive career choice, it is also fraught with high levels of personal risk, especially in senior or management positions, where accountability is high. If the organization is seen to be in breach of regulation, it can be the compliance and risk executives in the firing line. Most recently, we’ve seen the UK’s Financial Conduct Authority (FCA) and the Prudential Regulation Authority (PRA) reveal new measures that can result in hefty fines or up to seven years in jail for individuals. Personal and corporate risk can be mitigated somewhat by the effective use of specialist technology and personnel; however, the conditions are also ripe for the rise of “financial crime and compliance risk utilities”. Used most effectively, these third parties can potentially mitigate financial crime and compliance risks as well as lower operating costs. Risk utilities look increasingly attractive to financial institutions.
Outsourcing data-intensive tasks can help identify operational inefficiencies that increase non-compliance risks and overall compliance costs. For example, functions such as alert optimization, initial triage, detection scenario development, and testing and risk assessment could all potentially be outsourced. This could help tackle today’s tactical financial-crime risk and compliance problems while bringing greater predictability to compliance-related spending. Additionally, it moves the personal-risk burden away from the compliance and risk officers. Outsourcing companies are also more likely to have dedicated, up-to-date technology, because it is their business rather than a cost of doing business. Many financial organisations today rely on “best-of-breed” technology systems from different suppliers to deal with very specific aspects of financial crime and compliance management (FCCM), such as transaction monitoring, data-quality management or risk assessment. While these systems may have been cutting edge at the time of purchase, many may not be well-suited to the rapidly changing regulatory landscape. Financial institutions are already using third parties in other parts of their businesses—such as payments processing and auditing—so it’s not too much of a jump to see that this could move to risk, too. There are, of course, some structural issues that would need to be overcome for risk utilities to thrive—not to mention the creation of very robust contracts and service-level agreements. Banks have traditionally been unwilling to place sensitive compliance and financial-crime data outside their four walls, but recent reports from the likes of Ernst & Young suggest a growing interest in broader financial crime and compliance business-process outsourcing. Additionally, more financial institutions are growing comfortable with storing information in the cloud and outside their four walls.
Of course, risk utilities would need a rock-solid reputation for handling sensitive data securely and would need to be trusted from day one. This is not an opportunity for startups. Already established players such as technology partners and management consultancies would have a natural advantage. There are, of course, benefits to keeping this function in-house. Handing over sensitive data about business processes to a third party brings up legitimate questions about security and compliance that need to be satisfactorily answered. Choosing an outsourcing partner may mitigate risk, but it does not absolve the institution of responsibility. Other considerations before outsourcing include the likely loss of in-house expertise and dependence on the supplier. This could prove an issue if the organisation needs or wants to bring the work back in-house at any point. However, as tighter regulations continue to be implemented, it is likely financial institutions will increasingly look to risk utilities. Outsourcing provides a rigorous approach to monitoring and surveillance activity that generates meaningful alerts, enables efficient investigation and analysis, and streamlines ongoing management and reporting. This is key to meeting more stringent regulatory expectations and achieving an operating environment that ultimately protects the institution’s reputation and its customers. Matthew Long is a Global Solution Lead for Financial Crime, Governance, Risk and Compliance Management at Oracle Financial Services. Based in Luxembourg, Matthew has close to 20 years’ experience working with international clients in the financial services sector, including leading roles within risk and compliance management and transformation programmes. Within Oracle Financial Services, Matthew, a Certified Financial Crime Specialist, is part of a global team helping financial services companies meet their operational risk, compliance and financial crime requirements through improved analytics, systems and processes.


Banking

A customer-oriented collections strategy boosts collections success and reduces operating costs

In the aftermath of the financial crisis and the accompanying recession, consumers retreated heavily from debt. Growth rates in debt began to accelerate around 2012, and today about 90 percent of new debt has come from auto and student loans, according to Schlagenhauf and Ricketts 1. CardHub calculates that the average American has about $7,879 in credit card debt 2. These rising non-performing assets (NPAs) are a concern not only for banks but for regulators and federal auditors as well. Banks today find that their access to capital is constrained, with weary shareholders and depressed markets. With limited borrowing, the opportunity to increase earnings on their loan portfolios is extremely low. They have had to move away from business models that relied on high liquidity and low-cost access to capital, and are pursuing unpaid loans through debt recovery activities. Saddled with inflexible and aging IT investments and little process improvement, this assignment is extremely difficult for collection and recovery operators. Adding complexity to the situation is an intricate business environment where customers have multiple debts across multiple products, and increased regulatory focus on debt collections. Collections operators need to invest in IT systems that enhance productivity and provide capabilities for omni-channel initiatives, self-cure, self-service options and data management. Such a system would leverage today’s flexible and responsive technologies and help bridge the gap between how financial institutions collect on a borrower’s delinquent account and the borrower’s experience throughout the debt collection process. Here is a brief analysis of what an ideal collections system should look like for financial institutions to build more proactive and effective collections strategies.

Seamless integration and data consolidation

The collection system should have the capability of integrating with applications of external entities such as agents, attorneys, and data providers.
This allows for seamless flow of data and avoids data loss, inconsistencies and duplication of effort. Most banks have their channels – Web, Branch, ATM and Call Center – operating in silos; integration of applications allows banks and agencies to gain a complete view of a customer’s accounts and relationship.

Address multiple products and provide support through the entire delinquency life-cycle with a single system

In a majority of banks, collections operations are based on the lines of business – mortgage, credit card, personal loans, auto loans, etc. This means customers are contacted several times for different delinquent loans, resulting in unhappy and frustrated customers. This approach also prevents lenders from getting a 360-degree view of all the customer’s accounts and relationship with the bank, inhibiting the bank from providing options that may be most suitable to cure customers’ delinquent debts. Additionally, integration between internal and external applications of the agencies and other parties is poor or does not exist. Recovery data is sent to agencies in spreadsheets, often missing key documents and other critical information related to the customer’s collections history. Agencies have to, therefore, start afresh rather than continue from the point where the bank ended its recovery efforts. This results in unnecessary delays and increases turnaround time. A collection system that is designed to service all consumer products throughout the entire delinquency life-cycle will enhance debt recovery efforts, eliminate duplicate processes and optimize operational costs.

Automated workflows

Over the past couple of years the banking industry has undergone significant technological transformation, but debt collection still largely involves manual processes, bloating the cost and turnaround time of recovering debt.
Done in isolation by various entities, both internal and external, manual processes cause data inconsistencies and unnecessary delays that erode both efficiency and the ability to derive maximum value from the debt recovery process. An end-to-end automated system that sits on top of a sophisticated business process management capability can streamline the entire collection process and enhance operational efficiency. Automating workflows reduces effort and time, helping employees focus on value-oriented tasks that contribute to improving business results. Real-time reporting capabilities can help monitor agencies and employees effectively.

Borrower-centric intelligent segmentation

Several financial institutions and collection agencies still use historical experience and expertise to prioritize or rank delinquent accounts for collections. There is very little analytical intelligence used while segmenting customers. There are several drawbacks to this approach: it lacks sophistication and is limited by a small data set; strategic decisions are based on personal experience and limited to structured data that rarely changes or is up-to-date; and prioritization of debt collection is done based on accounts, not customers. Borrower centricity is an approach where lenders and agencies employ a system that provides a 360-degree view of all the accounts of the customer, segments customers based on certain parameters, and calculates appropriate priorities for payment collections for each customer segment. The borrower-centric approach effectively identifies a customer’s ability to pay off delinquent debt.

Effective payment options

Once customers are segmented, what bankers and agencies need is an intelligent system that is able to provide the best possible payment options for customers to cure their delinquent debts.
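The borrower-centric segmentation described above can be sketched in a few lines. This is a hypothetical illustration only: the segment names, days-past-due thresholds, and rules are invented to show the shape of the approach (roll accounts up to the customer, then segment on the combined picture), not an actual scoring model.

```python
from collections import defaultdict

def segment_borrowers(accounts):
    """accounts: list of dicts with customer_id, balance, days_past_due.

    Groups accounts by customer rather than ranking accounts in
    isolation, then assigns an illustrative segment per customer.
    """
    by_customer = defaultdict(list)
    for acct in accounts:
        by_customer[acct["customer_id"]].append(acct)

    segments = {}
    for cust, accts in by_customer.items():
        total_delinquent = sum(a["balance"] for a in accts if a["days_past_due"] > 0)
        worst_dpd = max(a["days_past_due"] for a in accts)
        in_good_standing = any(a["days_past_due"] == 0 for a in accts)

        # Early-stage customers with other accounts in good standing
        # often self-cure; late-stage customers get highest priority.
        if worst_dpd == 0:
            seg = "current"
        elif worst_dpd <= 30 and in_good_standing:
            seg = "early_stage_likely_self_cure"
        elif worst_dpd <= 90:
            seg = "mid_stage"
        else:
            seg = "late_stage"

        segments[cust] = {"segment": seg, "exposure": total_delinquent}
    return segments

result = segment_borrowers([
    {"customer_id": "C1", "balance": 5000,   "days_past_due": 15},
    {"customer_id": "C1", "balance": 200000, "days_past_due": 0},
    {"customer_id": "C2", "balance": 7879,   "days_past_due": 120},
])
# C1 -> early_stage_likely_self_cure, C2 -> late_stage
```

Note how an account-level ranking would treat C1’s $5,000 delinquency in isolation, while the customer-level view sees the $200,000 account in good standing and routes C1 to a lighter-touch, self-cure strategy.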
In conclusion, the growing complexity in the debt collection industry requires lenders and debt collection operators to make significant changes to their existing processes. A fully automated and integrated collections system that is able to service all consumer products throughout the entire delinquency life-cycle will be able to inject efficiency through improved workflows, enhanced internal and external communication, easier documentation, agency management and reporting. All of these can increase recovery rates and optimize operational costs. Tushar Chitra is the Senior Director for Product Marketing at Oracle Financial Services. He can be reached at Tushar.chitra AT oracle.com. Sources: 1. http://www.cnbc.com/2016/03/08/student-debt-load-growing-so-are-delinquencies.html 2. http://www.fool.com/investing/general/2016/03/20/the-average-american-household-carries-this-much-i.aspx http://www.bizjournals.com/prnewswire/press_releases/2015/02/10/MN28416


Banking

Risks vs Rewards – The Dynamics of Debt Collection

A joint study conducted by the Urban Institute and Encore Capital Group's Consumer Credit Research Institute showed that about 77 million Americans currently have a debt in collections, which amounts to 35% of consumers with credit files or data reported to a major credit bureau 1. The Consumer Financial Protection Bureau (CFPB or Bureau) also reports that US consumers have submitted more complaints about debt collection than about any other product or service 2. Meanwhile, the rising cost of collections, the mandate for higher provisions against loan losses, and a flat economy are threatening lenders’ profitability. IT budgets are strained even as mobility, analytics, and new technology trends hold the promise of streamlining processes and simplifying debt collection operations. In light of these facts, it is clear that financial institutions simply cannot afford to write off bad debts, nor can they ignore customer experience. Even a fractional reduction in loss rates for large consumer portfolios can result in a significant and recurring reduction in credit losses. Maximizing return on investment by minimizing unpaid loans, and managing traditional credit risk as well as profitability factors such as customer retention and resources, are all key components of a financial institution’s Collections and Recovery process. Here are a few trends and challenges that are currently impacting debt collection operations: Collections functions are increasingly the focus of regulators. One of the most impactful regulations comes from the Office of the Comptroller of the Currency (OCC) -- the OCC’s 2013 vendor management rule requires banks to audit, monitor and mitigate the risks of third-party debt collection agencies 3. Banks (state and federal) are expected to oversee and control every operation that affects a customer.
For the agencies this is a significant rise in costs, by way of connecting to the banks’ monitoring systems and reporting solutions, and increased time and effort in responding to audit queries and training staff in compliance procedures. Furthermore, debt collection agencies now fall under the CFPB regulations, either directly or indirectly because of their relationships with banks. Accordingly, these companies must be compliant with CFPB standards and guidelines and provide assurance of such compliance to their bank counterparties. The New York Department of Financial Services (DFS) has issued its own debt collection rules. What this means for collectors is that they will have to make changes to meet evolving regulatory norms and expectations. Saddled with complex and rigid applications that cannot be configured at a business level, they are likely to find themselves involved in time-consuming, cumbersome and expensive IT improvements, or face the consequences of lapses in compliance and governance. Current debt collection systems are not very effective and involve several disparate applications with very little integration between them. Debt collectors do not have consolidated data portals or centralized operational control, leading to data inconsistencies, loss of information, duplication of effort and high operating costs. Collectors incur the added expense of maintaining these systems and training IT staff to manage this complex infrastructure. Debt collection agencies do not segment customers efficiently or provide appropriately flexible payment arrangements. Even when some form of customer segmentation is done and payment arrangements are arrived at, they are not effective, because they are based on the expertise and experience of the debt collection agent and not on intelligent segmentation using vast amounts of historical and current customer information.
In many cases collection agencies try to close the debt by settling for a lower payment or foreclosing the loan by reclaiming an asset instead of applying appropriate collection strategies. Collectors currently do not have the right tools at their disposal to improve delinquency rates and retain borrowers as customers. Delinquent borrowers are, first and foremost, valued customers. It is not uncommon for customers, especially those in early-stage collections, to quickly cure following a temporary hardship or to have other accounts in good standing. Without a consolidated, comprehensive view of the customer’s relationship with the bank, collection personnel are unable to make quick decisions, offer customers a best-fit solution, and clear delinquent accounts. Lacking a consolidated borrower-centric approach, debt collection officers unwittingly authorize different agents to communicate with the same customer. Repeated aggressive calls by different agents leave customers frustrated, and the likelihood of having customer service rated poorly and broadcast over social media to the larger public is quite high. And there is a distinct possibility of the customer switching loyalties in search of a better customer experience. Multi-product, multi-channel and multi-debt obligations are characteristics of today’s debtors. They expect instant, seamless, frictionless access to products and services. They are turned off by interruptive calls and frustrated by repeated contact. They prefer to talk to an agent when they are ready. Furthermore, the Fair Debt Collection Practices Act (FDCPA) prohibits the use of threatening or repeated phone calls to individual borrowers. Considering the behavior and preferences of customers, and the fact that staffing cost is one of the biggest expenses for lenders, there is a growing demand for self-service options.
The Bureau of Labor Statistics anticipates that between 2015 and 2016 the debt collection industry will experience a 23% rate of growth 4, much faster than the average for all industries. The time is ripe for financial institutions to take a strategic look at their collections operations. They need to examine what additional changes can be made to better align collections with the achievement of the organization’s overall business strategies and objectives, including increased profitability, improved customer experience and regulatory compliance. An integrated, customer-centric approach can be applied to the management of delinquencies. Improving the robustness of systems and operational controls around the collections process will not only improve recovery rates but also promote fair and consistent treatment of customers. Debt collections officers must make full use of today’s flexible, responsive operational and IT systems to deal with new, emerging risks in the debt market. Tushar Chitra is the Senior Director for Product Marketing at Oracle Financial Services. He can be reached at Tushar.chitra AT oracle.com. Sources: 1: Waterloo Region Record. 30 July 2014. Avention. 2: Monthly Complaint Report, January 2016, CFPB, http://1.usa.gov/1PW7D7e 3: http://www.occ.gov/news-issuances/bulletins/2013/bulletin-2013-29.html 4: “Debt Collection Statistics,” debtcollectionanswers.com/Debt-Collection-Statistics.html


Analytics

A new perspective on the potential of data

I’m sure most people will agree that data as an asset is not a new idea. However, I can make a strong argument that regulation as an asset is. This is quite a simple idea: data is the new currency. Regulation has encouraged us to record, archive and interpret data so closely that we can now gather enough information to create a valuable asset. By forcing organizations to refine data in great detail, regulation has turned the underlying asset of risk management into a tool for wider business management. Data is now more accessible than ever before. We can now use risk insights, such as our understanding of financial crime, operational risk and credit risk, as business intelligence. This change can provide a better view of your business, informing business decisions. It’s an opportunity to take a strategic approach to delivering value. Data isn’t static. Because we’re talking about a resource that grows every day, this potential could be endless. Its potential to deliver a return could create seismic changes in business intelligence. Let’s not let this resource go to waste. At a time of cost cutting, investing data into the wider business is a route to growth. The clearer and more complete a vision management has of what is known and unknown, the better its control over all areas of the business. Get your stakeholders on board to create systems to do this. Howard Mather is a Principal Sales Consultant for Oracle Financial Services. He can be reached at howard.mather AT oracle.com.


Analytics

Negative Interest Rates: A Path into the Unknown

When historians look back at what unfolded during the global financial crisis of 2007 onwards, and the actions taken to save the global economy from the precipice, it would be a fair assessment that governments and central banks used a wide range of tools, including “throwing in the kitchen sink”, to try and get the wheels of economic growth moving again. Conventional measures of slashing interest rates, printing money and asset purchase programs have all, to a large extent, not had the desired impact the authorities would have hoped for. Policy makers are faced with the dilemma that after almost nine years of prescribing a variety of measures, the global economy still remains weak and faces major downside risks. So where to next? Well, with most options now exhausted, negative interest rates are seriously being considered as a viable policy alternative. In actual fact, negative nominal policy rates have been at work for a year or so in Switzerland, Denmark, Sweden and the Eurozone, and more recently were introduced in Japan. Interestingly, not too long ago the US 3mth T-Bill rate fell into negative territory, highlighting the bleak outlook for the global economy. In light of the strong headwinds, the Federal Reserve has instructed banks to assess the implications of an acute stress scenario (as part of the annual stress tests) in which there is a deep global recession, punctuated by negative yields on short-term US Treasuries, unemployment rapidly rising to 10%, corporate distress and a prolonged period where the US 3mth T-Bill rate remains negative at -0.5% up until Q1 2019. By compelling banks to incorporate a negative interest rate outlook into the scenario, does this mean the Fed could entertain cutting the Fed Rate below zero, at some future point in time, if the macroeconomic environment deteriorates further? A tricky, albeit hypothetical, question to answer, but the immediate response from the Fed has been a firm ‘no’.
It is worth noting, however, that negative rates have been on the table before, and the jury is still out on the longer-term impact of going down that road, although the preliminary evidence from the European countries mentioned above may not be as bad as originally anticipated. So at ground level, what has been the cumulative impact of very weak or negative interest rates on banks' profitability? The simple answer is 'not good'. A recent S&P study estimates that Japanese lenders and banks will see profitability fall by circa 15% and 8% respectively. What does this bode for Tier 1 banks in other regions? Judging by the results thus far, banks are reorganizing their global operations and, in parallel, embarking on major cost-cutting exercises in order to remain competitive. Where and how the fallout of negative interest rates has been or will be felt is not so clear, and herein lies the challenge: how to adjust the business model to produce healthy, stable shareholder returns in the face of an unfavourable interest rate outlook combined with a heavier regulatory burden.

Central bank policy rates are the barometer for all other financial market, corporate and retail customer interest rates, as well as exchange rates, bond yields and other asset valuations. If policy rates turn negative, should a bank simultaneously drop the rates on its key deposits below zero? According to the textbooks, customers would not be happy to be charged interest on their deposits, and if they were, they would immediately withdraw their funds and place them elsewhere. It is the risk of these major outflows that partly explains why banks would be reluctant to take such draconian pricing action, alongside the potential damage to the franchise and loss of market share.
The upshot is that banks feel obligated to pass the effects of lower central bank rates on to their loan customers but not to their depositors, resulting in a major squeeze on the bank's net interest margin (NIM). Absorbing these costs is painful and, in the longer term, unsustainable, unless a bank can find innovative ways to mitigate the NIM compression by re-aligning or redesigning its product offerings, or by better hedging the risk in the market. In some respects, customers are already being charged an implied negative interest rate, by virtue of the charges and levies for managing their accounts. This is deemed acceptable so long as the advertised interest rate is zero or slightly above; otherwise the consequences could be dire.

But is that what would really happen, or, put another way, is it just what we feel instinctively could happen? If we look at the characteristics of the deposits that receive favourable outflow treatment under the LCR regime, we can conclude that these deposits are large in volume, offer low interest rates on average, and are relatively inelastic to rate movements. Such attributes are representative of a stable, 'sticky' deposit base and are reflected in many a deposit model to help distinguish between a bank's core and volatile deposits. An argument could be made that if these core, LCR/NSFR-friendly deposits (which receive a negligible interest rate) were now charged a rate just below zero, how many customers would actually withdraw their funds? Yes, there would be some outflows, but would they distort the 'core' deposit base in a major way? It could be argued that a rational customer would instead choose to hold physical cash, but this in itself does not come without costs and risks.
So long as the costs and risks of hoarding cash are greater than the negative market interest rates, there is a possibility that the outflows may not be as severe as expected. This may be a simplistic view of the problem, but I am convinced it is a major agenda item at ALCO meetings. Further analysis needs to be undertaken and presented to evaluate deposit outflow scenarios under a range of negative interest rates. Perhaps the analysis should consider a tiered approach to charging core deposits, similar to the way banks are charged on the excess reserves they park at the central bank. If the evidence points to a minimal impact on the 'core' deposit base, then the deposits that remain could be viewed as 'super sticky', and it could be argued they should attract an outflow rate of between 0% and 2% under the LCR regime.

On the flip side, the escalating costs of managing a compliant LCR have seen banks become less accommodating towards LCR-unfriendly deposits. A good example is short-term non-operational corporate deposits, where there is no real financial benefit for the bank in taking the deposit because of the penalties under the LCR regime and the prohibitive cost of placing the funds at the central bank. Turning down these large deposits appears somewhat illogical, but the reality is that a bank can ill afford to carry the costs in an already challenging market environment.

Turning to the investor side, the lack of yield across the curve is somewhat depressing. According to Bloomberg, by February this year there were approximately USD 7 trillion of global government bonds offering yields below zero. Under these conditions, investors will be forced to look elsewhere for returns, which may well mean moving down the curve into riskier asset classes. If negative interest rates become more widespread and persist for longer, expect a review of risk appetites and a rewrite of investment policies.
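The tiered approach mentioned above can be sketched as a simple calculation. In the snippet below, the tier thresholds and the rates charged on each slice are invented for illustration, loosely mirroring the tiering central banks apply to excess reserves:

```python
# Hypothetical tiered charge on a 'core' deposit balance: an exempt
# allowance at 0%, then progressively more negative rates on the excess.
# Thresholds and rates are made-up assumptions, not any bank's schedule.
TIERS = [                   # (upper bound of tier, annual rate)
    (250_000, 0.000),       # exempt allowance
    (1_000_000, -0.0025),   # mildly negative
    (float("inf"), -0.005), # most negative on the largest balances
]

def annual_charge(balance):
    """Sum each tier slice times its rate; a negative result is the charge."""
    charge, lower = 0.0, 0.0
    for upper, rate in TIERS:
        slice_amount = max(0.0, min(balance, upper) - lower)
        charge += slice_amount * rate
        lower = upper
    return charge

print(annual_charge(2_000_000))  # 750k at -0.25% plus 1m at -0.5%
```

A depositor with 2,000,000 pays nothing on the first 250,000, so the effective blended rate stays well above the marginal one, which is precisely the property that may keep 'super sticky' balances in place.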
Let’s assume for now that negative interest rates are here to stay for a while. Operationally, do financial institutions have the right processes, procedures and systems, alongside appropriate legal frameworks, to handle all the ramifications of negative interest rates? For floating rate assets and exposures, there is a real risk that historical pre-crisis transactions do not take into account policy rates falling below zero. In certain instances this will create a bizarre situation where a bond investor ends up paying the issuer, or a bank finds it has to pay a customer who took out a negative-spread, base-rate-linked mortgage before 2007. In each of these examples, it is not clear how the anomaly would be resolved. Furthermore, banks need to update their model assumptions for pricing certain non-vanilla exposures and outline to their respective committees the impact on valuations and the overall risk profile. One example is whether the bank can still efficiently price interest rate options when the risk-free rate is negative.

This brings us to a bank's booking and analytical systems and whether they can readily adapt to a scenario where the bank applies a negative interest rate to some of its assets and liabilities. Questions around pricing, payments, interest accruals, cash flows, margins, amortization schedules, funds transfer pricing, collateral agreements, regulatory reporting, risk reporting and so on all have to be explored and discussed. If for some reason the applications are incapable of reflecting negative interest rates, what is the workaround, and is it sustainable in the medium term? In summary, from a strategic standpoint, what type of negative interest rate analysis should ALCOs be demanding as part of their monthly ALCO report? I expect all of the following will already be visible to the ALCO, but it is worth reiterating their importance.
The list is by no means exhaustive, and it is recommended that the Treasury, ALM, Risk and Legal teams collaborate closely to ensure any other existential threats are identified and understood.

- Balance sheet forecasts capturing a range of negative interest rate scenarios and their impact on NIM and NII
- Forecasts performed on both a static and a dynamic basis, where the former reflects a balance sheet that remains fixed and in run-off mode and the latter assumes certain balance sheet transactions will be undertaken during the forecast period
- Where negative interest rates are applied to certain products or exposures, those transactions separated out and explicitly highlighted in liquidity, re-pricing gap, market value, sensitivity and VaR reports
- LCR assessed under mild and severe negative interest rate scenarios of varying duration; of specific interest are the cash-flow implications where both deposit and loan rates are below zero: is there an acceleration in outflows?
- A review of core deposit models to determine whether a 'super sticky' pool of deposits exists
- The cost of continually funding the SHQLA under a negative interest rate scenario
- How long the bank can absorb the NIM squeeze, and the break-even point at which it would be compelled to apply negative interest rates to its key retail deposits
- Quantification of the ongoing cost of hedging interest rate risk
- Additional costs of managing collateral market relationships for both on- and off-balance-sheet transactions
- Capital implications, in particular whether any new capital needs to be raised

Negative interest rates go against convention and are not something either retail or financial markets are familiar with, but it is unlikely that we will see a rapid return to normalized interest rates anytime soon.
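On the operational question raised above, whether a bank can still price interest rate options when the risk-free rate is negative, the usual answer is to switch from Black's lognormal model (which assumes positive forwards) to a normal, or Bachelier, model, which remains well defined for negative forwards and strikes. A minimal sketch of the Bachelier call formula, my own illustration rather than a production pricer:

```python
# Bachelier (normal) model for a call on a forward rate: unlike the
# Black lognormal model, it stays well defined when F or K is below zero.
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def bachelier_call(forward, strike, vol_normal, expiry, discount=1.0):
    """Discounted E[max(F - K, 0)] under F ~ N(forward, vol_normal^2 * T)."""
    if expiry <= 0:
        return discount * max(forward - strike, 0.0)
    s = vol_normal * math.sqrt(expiry)     # normal std dev to expiry
    d = (forward - strike) / s
    return discount * ((forward - strike) * norm_cdf(d) + s * norm_pdf(d))

# Prices an at-the-money option on a -0.5% forward rate, where a
# lognormal model would be undefined:
print(bachelier_call(forward=-0.005, strike=-0.005, vol_normal=0.006, expiry=1.0))
```

The normal volatility parameter here is quoted in absolute rate terms (e.g. 60 basis points), which is why the model tolerates rates of either sign.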
With that in mind, banks will be charting unfamiliar territory and will need to navigate a delicate path to avoid the unknown risks that lurk below.  Ziauddin Ishaq is a Global Solutions Lead for Treasury & Liquidity Risk at Oracle Financial Services. He can be reached at ziauddin.ishaq AT oracle.com.


Analytics

What’s the difference between see-through and look-through?

Sometimes working with data can seem like an existentialist exercise. Are we thinking about data, or are we thinking about the human condition? Sometimes it’s a little of both. Donald Rumsfeld, US Secretary of Defense from 2001 to 2006, had this to say about discovery and self-improvement: “There are known unknowns... things that we know we don’t know. But there are also unknown unknowns… things we don’t know we don’t know.” This is where see-through and look-through come in. Sometimes you can’t avoid the unknown, but being sure of what is known and unknown gives management control. A single set of data can do this by giving management the capacity to look through assets. It can also provide the insight to see through into specific trading desks at investment banks.

There has never been more clarity of data. Managers can get the detailed information they need, but also an overview showing what information is available and what is not. Timeliness is important, too. A single set of data must be managed with dedicated, task-specific tools as it is moved and analysed across the enterprise. That requires a set of outputs that deliver the insights needed in as close to real time as possible. Remember, we need that detailed information for regulators and managers who must look through the data at a granular level, which means the tools must be able to handle massive historical data sets for in-depth analysis. To use this information to grow on the back of the investment driven by regulation, senior management needs to:

- Engage with stakeholders to get buy-in on the vision.
- Establish which functions can see the greatest return from an integrated approach.
- Use those functions to gain acceptance of the model within the wider organisation.

Howard Mather is a Principal Sales Consultant for Oracle Financial Services. He can be reached at howard.mather AT oracle.com.


Analytics

Centralized Questionnaire Management for a Greater KYC Program

The Know Your Customer (KYC) provision has been around for several years and yet still challenges banks. This financial regulatory rule was mandated by the Bank Secrecy Act and the USA PATRIOT Act of 2001. Section 312 of the USA PATRIOT Act requires U.S. financial institutions to perform due diligence and, in some cases, enhanced due diligence with regard to:

- Correspondent accounts established or maintained for foreign financial institutions
- Private banking accounts established or maintained for non-U.S. persons

Financial institutions capture a plethora of information as part of the client on-boarding process to comply with this regulatory requirement. Customers are asked a variety of questions during on-boarding and, once available, responses are shared with compliance teams as part of Customer Due Diligence for risk assessment. Client on-boarding and account management are critical functions for any financial institution, especially before embarking on a relationship with the client. The initial verification, risk assessment and profiling of the client all assist in creating a relationship between the two firms. However, it is this critical function that very often falls victim to redundant processes and bad data inputs. A recent Thomson Reuters survey shone a spotlight on the KYC challenges faced daily by market participants during client on-boarding, and separate research found that UK banks are losing around 40% of their applicants to lengthy and tedious on-boarding processes. Many organizations maintain these questionnaires in silos across a number of client on-boarding systems, and managing these silos tends to require a great deal of time and money whenever changes are needed.
The foremost issue with this silo approach is a non-unified questionnaire library. Keeping an on-boarding questionnaire in different systems, or embedded within a single on-boarding system, can lead to inconsistent definitions of the same information, which in turn requires significant funding and effort for data manipulation before the data can be fed into a KYC system for record keeping and risk assessment. Another problem with the silo approach is the longer time needed to accommodate future changes: if a questionnaire is maintained in multiple systems, all of those systems must be updated whenever it changes. More systems = higher costs.

One way to tackle these problems is a Centralized Questionnaire Management Platform. Rather than maintaining KYC on-boarding questionnaires in multiple systems, it is easier and more cost-efficient to maintain them on a centralized platform integrated with all of the institution’s on-boarding systems. At a broad level, the Centralized Questionnaire Management Platform should have the following capabilities:

- The ability to define various types of questions, such as free text, number, lookup and radio button, so that financial institutions can define what information they would like to capture.
- The ability to define various types of questionnaires, such as a basic questionnaire or a score-based questionnaire that produces an overall score from the responses.
- Configurable workflow to accommodate organization-specific processes.
- A flexible architecture that integrates with various on-boarding systems.

The main advantage of this approach is a unified definition of the information and data that financial institutions capture as part of their KYC process.
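A centralized library along these lines might represent typed questions and a score-based questionnaire roughly as follows. This is a minimal sketch; the field names, question types and scoring thresholds are all invented for illustration:

```python
# Minimal sketch of a centralized questionnaire library: typed questions,
# plus a score-based questionnaire whose total drives a risk rating.
# All identifiers, scores and the risk threshold are illustrative.
from dataclasses import dataclass, field

@dataclass
class Question:
    qid: str
    text: str
    qtype: str                                   # "free_text", "number", "lookup", "radio"
    scores: dict = field(default_factory=dict)   # answer -> score (score-based only)

@dataclass
class Questionnaire:
    name: str
    questions: list

    def score(self, answers):
        """Sum the configured score of each answer; unscored answers count 0."""
        return sum(q.scores.get(answers.get(q.qid), 0) for q in self.questions)

kyc = Questionnaire("KYC on-boarding", [
    Question("country", "Country of incorporation?", "lookup",
             scores={"US": 0, "KY": 5}),
    Question("pep", "Is any beneficial owner a PEP?", "radio",
             scores={"yes": 10, "no": 0}),
    Question("notes", "Relationship notes", "free_text"),
])

total = kyc.score({"country": "KY", "pep": "yes", "notes": "new client"})
print(total, "-> high risk" if total >= 10 else "-> standard")
```

Because every on-boarding system reads from the same question definitions, a change such as a new score weighting is made once, in one place, rather than in each silo.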
This can provide better management reporting around customers and a 360-degree view of the business. Investment in data manipulation caused by differing definitions of information can also be avoided. With a unified platform, integration with systems such as client on-boarding, Know Your Customer and CIP becomes easier to maintain, and better integration allows financial institutions to move towards a better customer experience and seamless client on-boarding. A centralized, unified platform for questionnaire management will also accelerate the response to any future regulatory changes. Do you have additional thoughts on this topic? Please leave your comments below; I’d love to read them. Garima Chaudhary is a Senior Sales Consultant for Oracle Financial Services Analytical Applications. She can be reached at garima.chaudhary AT oracle.com.


Analytics

From FATCA to GATCA: The Making of a Global Tax Network (Part 2)

I hope you liked the first part of the Global Tax series, and welcome to the final instalment, where I’ll build on the context set up in the first post. In this post, we’ll see how GATCA differs from FATCA and how banks and other FIs can implement GATCA without breaking the bank (or their heads)!

What is GATCA? Under the aegis of the OECD, the reporting and administrative structure is standardized and a legal canopy is put in place to simplify the reporting and reciprocal obligations of all signatories. There is now no need to enter into bilateral agreements between countries, as all signatory countries of GATCA implicitly agree to share and report such information among themselves automatically and periodically. Such an internationalized FATCA is now a reality for many international banks. The full version of the new global standard was released in July 2014, with the main objective of combating offshore tax evasion by foreign citizens and foreign companies through measures that increase transparency and enhance reporting. The standard consists of two main parts:

Competent Authority Agreements (CAA): The Model CAA links the CRS to the legal basis for the exchange (such as the Convention or a bilateral tax treaty), allowing financial account information to be exchanged. It also contains representations on confidentiality, safeguards and the existence of the necessary infrastructure for an effective exchange relationship.

Common Reporting Standard (CRS): The CRS contains the reporting and due diligence standard that underpins the automatic exchange of financial account information. A jurisdiction implementing the CRS must have rules in place that require financial institutions to report information consistent with the scope of reporting and to follow due diligence procedures consistent with the standard.
See Figure 1 below for the two major components of the CRS. Figure 1: CRS components (source: www.slideshare.net) Over 94 jurisdictions have committed to swiftly implement the GATCA CRS. Of those, more than half are “early adopters” required to adhere to the timelines in Figure 2 below. Figure 2: Aggressive timelines for GATCA CRS implementation (source: www.tcs.com)

GATCA vs. FATCA: Transition or Co-Existence? Although GATCA uses FATCA as a template to build upon, there are some notable differences; see Figure 3 below for a high-level comparison of the two regimes. Figure 3: GATCA vs. FATCA (source: www.goldmansachs.com) Differences between FATCA and the CRS mean that financial institutions may not be able to use the same due diligence and reporting systems for both standards. The CRS relies heavily on local AML/KYC requirements and self-certification by account holders. The U.S. has in principle agreed to the GATCA standards, raising expectations that GATCA will ultimately replace FATCA as the global standard in the long run. However, given that the U.S. is neither an early GATCA adopter nor making any moves to repeal FATCA, many financial institutions will be required to comply with two similar, yet different, reporting standards. Table 1 below summarizes the major points of difference between IRS FATCA and the OECD CRS. Table 1: GATCA vs. FATCA

The Challenge: Roadmap to GATCA As financial institutions continue to upgrade their internal infrastructure and processes to comply with FATCA, it is important they do not lose sight of the bigger picture. Solutions adopted today need to be robust enough to meet not only the requirements at hand, but also the ones yet to come.
The major challenges foreseen in implementing the CRS are:

- Understanding the requirements from a business perspective
- Data collection and integration
- IT systems
- Other human and non-human resources

Financial institutions should understand the components of, and roadmap for, report preparation, and how effectively they can leverage their existing AML, KYC and FATCA processes and IT systems. Figure 4 below captures the processes and components of the compliance ecosystem. Figure 4: Implementation of CRS compliance (source: https://home.kpmg.com) More importantly, financial institutions need to realize that the OECD CRS is more than just an enhanced version of FATCA; GATCA is much wider in scope. Institutions that took a tactical approach to their FATCA solution, either by creating temporary manual processes or by excluding US persons, cannot simply upgrade their FATCA systems. Instead, they may have to invest in flexible information technology (IT) architecture that can adapt to evolving regulations and to new countries coming on board. Some low-risk financial institutions that do not fall under the purview of FATCA are covered by the CRS, and the number of customer accounts that must be reviewed under the CRS is also larger, due to its multilateral approach and lack of threshold limits. Table 2 below summarizes the roadmap financial institutions should follow for effective implementation of the GATCA CRS. Table 2: Roadmap to GATCA implementation The current capabilities of FATCA compliance programs, and the ability to use existing systems and processes to meet CRS requirements, are major drivers of the challenges, cost and complexity involved in CRS compliance. While existing FATCA programs can be leveraged to some extent, differences between the two standards necessitate system enhancements to ensure the capture of the additional reportable account information required.
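To make the difference in scope concrete, here is a toy sketch of CRS-style reporting logic as contrasted with FATCA's single US-person test. The jurisdiction lists, account data and the `reportable_jurisdictions` helper are all invented for illustration; real CRS due diligence also covers entity classification, controlling persons and indicia curing:

```python
# Toy sketch of CRS-style reporting: an account is reportable to every
# participating jurisdiction in which its holder self-certifies tax
# residency, not just one country as under FATCA's US-person test.
# Jurisdiction codes and accounts below are hypothetical.
PARTICIPATING = {"GB", "DE", "FR", "JP", "KY"}
HOME = "GB"  # the reporting institution's own jurisdiction

def reportable_jurisdictions(account):
    """Return the jurisdictions this account must be reported to."""
    return sorted(j for j in account["tax_residencies"]
                  if j in PARTICIPATING and j != HOME)

accounts = [
    {"id": "A1", "tax_residencies": {"GB"}},        # domestic only: not reported
    {"id": "A2", "tax_residencies": {"DE", "JP"}},  # reported to both DE and JP
    {"id": "A3", "tax_residencies": {"US"}},        # US is not an early CRS adopter
]
for acct in accounts:
    print(acct["id"], reportable_jurisdictions(acct))
```

Even this toy version shows why CRS review populations are larger: any non-domestic residency in the participating set triggers a report, with no threshold limits to filter accounts out.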
Conclusion

Under the CRS, data handling involves collection, analysis and report generation from a huge volume of data, whereas under FATCA, identification and reporting are restricted to US persons. Financial institutions need to implement well-planned CRS programs. While upgrading internal systems and processes, they must focus on the bigger picture and ensure scalability to meet future requirements. Financial institutions that adopted a strategic solution based on the FATCA Model 1 IGA will gain a competitive edge in CRS compliance over those that followed a tactical approach involving temporary manual processes or the exclusion of US persons. Gaurav Kesarwani is a Consultant with Oracle Financial Services Analytical Applications. He can be reached at Gaurav.Kesarwani AT oracle.com.


Analytics

A single regime for data: Management of risk becomes management of business

When I’m asked how risk and financial data can become an asset, I find it easiest to put this into financial terms: once you have data assets, you can value them. It makes sense that as data becomes more useful, it becomes more valuable. Consolidating control into a single view of actionable data can help businesses handle both risk and business management. This provides business intelligence, which in turn provides great value. Organisations can view the same data differently, depending on the level of detail needed, and different angles of interpretation are also important. When you run your core data set in real time under a single regime, it can be used in multiple ways. This can be a starting point for overcoming problems as well as creating new opportunities. It can:

- underpin understanding of financial crime, operational risk and credit risk;
- enable well-managed governance, risk and compliance to guard against exposure to money laundering, internal or external fraud and weakness in risk controls;
- scale without creating duplication or inefficiency;
- provide customer insight.

The better this data set is, the less need there is for added precautions such as collateral buffers, saving even more money and adding even more value. Business continuity is also highly valuable when we speak about productivity and customer confidence. Crucially, by virtualising and consolidating applications and simplifying structures, we can deploy this technology without any disruption to the back office. Howard Mather is a Principal Sales Consultant for Oracle Financial Services. He can be reached at howard.mather AT oracle.com.


Analytics

Practical Solutions to Vexing Compliance Challenges

In her fifth video blog of the series, The Umbilical Cord Between Business Model and Compliance, Saloni Ramakrishna discussed the fundamental connection between business model and compliance, and how compliance has remained in a reactive stance due to a lack of appreciation of this connection. In her sixth video blog, she touches on how some practical solutions to vexing compliance challenges may not be found in formal rule books and require “out of the box” thinking. Implementable ideas from industry peers can be a rich source for organizations that want to embrace compliance as a value multiplier instead of being forced into it. Watch Saloni Ramakrishna’s sixth video blog of the Compliance Risk Management series, in which she argues that just as non-compliance is punished, compliance needs to be rewarded, not necessarily with monetary bonuses, but through acknowledgement, appreciation certificates or additional points in annual assessments and the like as positive incentives. The next video blog in the Enterprise Compliance Risk Management series is Business Case for Positive & Active Compliance Management (PAC-M). Ms. Saloni Ramakrishna, author of Enterprise Compliance Management - An Essential Toolkit for Banks & Financial Institutions, is a financial services industry practitioner with nearly three decades of experience. She brings to the table rich hands-on knowledge and real-world perspectives in the risk, compliance and performance areas. In her role she interacts with the senior management of banks, consulting professionals and regulators across multiple countries. She is invited to share her views on industry trends by national and international finance forums such as GARP, Ops Risk Asia, RiskMinds and Asian Banker. Her ideas have appeared as articles and quotes in print and online media and in television interviews.


Banking

No More Dithering Over Digital Transformation!

Blog By: Aubrey Hawes There’s little doubt that one of the main factors currently affecting the financial services industry, and one likely to go on affecting it for years to come, is the move towards digitization. Digital disruption in itself isn’t new; it’s been happening in different industries over the past 20 or 30 years. Now it’s the turn of financial services. A new report, ‘Digital Transformation – The challenges and opportunities facing banks’, looks at the current and future impacts of digitization. This emerged from a joint ‘Think Tank’ venture between Oracle Financial Services and Efma (the not-for-profit association that provides a knowledge base, resources and networking opportunities for financial institutions). I had the opportunity to present the output from the Think Tanks at the Efma Distribution Summit in London in April, which was attended by some 300 senior executives from financial institutions across the world. (To find out more, visit https://www.efma.com/index.php/events/conferences/overview/EN/2/513/1-1JOVMD) The Think Tank sessions were very useful and highlighted banks’ concerns about digital transformation. Many are reluctant to make significant changes that could alter the whole direction of their banks, whilst also recognizing that change is both necessary and inevitable. The report starts by examining the impact of digital disruption over the years, and most recently the role of fintechs; there is still a question mark over whether they are likely to be partners of banks or competitors in the future. The report then explores four key digital strategies that banks have been using with varying success. These are: Launching a digital brand – creating engaging digital experiences for customers. This might involve positioning a new brand differently from the existing one, or developing a set of processes that enable the new digital brand to compete in a different way.
Digitizing processes – addressing the challenge from non-traditional players that are creating their own new business processes rather than reusing old analog ones. At the moment, banks risk being left behind in the race towards digitization; they need to simplify their processes as they go digital. Modernizing the digital experience – using tools that enable banks to engage more effectively with customers. The development of a digital enterprise is based around the four ‘Ps’: Product, Price, People and Place. Place (which can be physical or virtual) refers to the need to enhance the digital experience at the point where the customer is. This can also include delivering APIs, but we will discuss this more in a later blog. Launching new digital capability – such as money-movement apps, mobile wallets and the use of data as ‘currency’. Banks need to start thinking more seriously about how they can take advantage of innovations such as virtual reality, Fitbit and the Internet of Things, as all of these will play an integral role in the future lives of their customers. They also need to re-examine and transform the role of the branch, so that it becomes more focused on customer service and advice rather than transactions. Time and tide… Ultimately, banks can’t really avoid the need for digital transformation. If they try to, it’s unlikely they will survive the next 10 or 20 years. Even if they do embrace the need for change, there’s still a risk that they will go into it half-heartedly or without proper planning. That isn’t likely to help any bank seeking the holy grail of an enhanced customer experience! Yes, there are many challenges ahead, and many pitfalls and potholes that could cause banks to stumble, but there are numerous resources and sources of advice available, and the benefits of transformation far outweigh the difficulties involved.
Whatever they decide to do, banks need to start doing it now, if they haven’t already. Time isn’t on their side. But Oracle is always on hand to help… To see the full report on Digital Transformation, please visit http://www.oracle.com/us/dm/seo100518687-emea-gb-wh-de1-ev-2950962.html. Aubrey Hawes is a Senior Director for Product Marketing at Oracle Financial Services. He can be reached at Aubrey.Hawes AT oracle.com. The views expressed herein are the views of the author and not necessarily those of the employer.


Analytics

Greater visibility boosts profitability and performance

We’re all watching our pennies these days, especially in banking. The cost of capital is increasing while regulatory requirements become more demanding, so every decision about the viability of a line of business needs to be an informed one. An organisation can improve its offering, putting its pennies to work, by evaluating the costs and returns associated with each counterparty and client. It’s important that this information is visible so that a financial institution can put it into context. As the Financial Conduct Authority put it in ‘Investment and corporate banking market study: Terms of reference’ (May 2015), “We will want to understand if there is significant cross-subsidisation between primary market activities and related activities or within primary market activities. We will also want to understand whether …bundling or cross-subsidisation extends … to secondary market activities and whether [it] has adverse effects on competition and clients." But how do we keep track of it all? To do this, and to give information a wider value, it’s important to establish a set of applications that can organise and analyse that information. With a single source, or aggregation point, for data, this information is easier to access, and organisations can create a fertile field in which a greater understanding of the business can take root and grow. The granularity required by regulators and investors can create enormously detailed pictures of business areas, allowing their profitability, and the dynamics that drive profitability, to be scrutinised. Regulators want the answer to this question: “Which business do customers support?”, while clients want to know the risk and value carried by individual desks or funds. With access to this information, insights can be taken to the strategic level of the business, so that delivering value becomes a constant, achievable goal – boosting profitability and client performance and enabling organisations to look to the future with confidence. 
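To make the idea of a single aggregation point concrete, here is a minimal sketch in Python. The client names and the record layout are invented for illustration and are not any real bank schema; the point is simply that once costs and returns live in one place, per-counterparty profitability falls out of a straightforward roll-up:

```python
from collections import defaultdict

# Illustrative records drawn from a single aggregation point;
# the field names and figures are invented for this example.
transactions = [
    {"client": "Acme Corp", "revenue": 120_000, "cost": 85_000},
    {"client": "Acme Corp", "revenue": 40_000,  "cost": 15_000},
    {"client": "Globex",    "revenue": 60_000,  "cost": 70_000},
]

def profitability_by_client(records):
    """Aggregate revenue and cost per counterparty, returning net profit."""
    totals = defaultdict(lambda: {"revenue": 0, "cost": 0})
    for r in records:
        totals[r["client"]]["revenue"] += r["revenue"]
        totals[r["client"]]["cost"] += r["cost"]
    return {c: t["revenue"] - t["cost"] for c, t in totals.items()}

print(profitability_by_client(transactions))
# {'Acme Corp': 60000, 'Globex': -10000}
```

A negative figure, as for the hypothetical ‘Globex’ here, is exactly the kind of cross-subsidisation the FCA study asks firms to be able to see.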
Howard Mather is a Principal Sales Consultant for Oracle Financial Services. He can be reached at howard.mather AT oracle.com.


Analytics

IFRS 9 (Impairment) - Its Positive Effects on the Financial Services Industry

In a nutshell, the International Financial Reporting Standards (IFRS) are a set of accounting standards developed by the International Accounting Standards Board (IASB) that are becoming the de facto global standard for financial reporting. More specifically, IFRS 9 is the IASB’s response to the financial crisis; the package of improvements introduced by IFRS 9 includes a logical model for classification and measurement, a single, forward-looking ‘expected loss’ impairment model and a substantially reformed approach to hedge accounting. OK, we’ve got the background out of the way. Now onto the more important question: how is IFRS 9 benefiting banks? With the inundation of regulatory requirements for financial institutions, we have to find the silver lining somewhere! While all of the requirements are painstaking, they can, and should, have valid business benefits. The IFRS 9 impairment guidelines pose a lot of practical implementation challenges to financial services institutions, but they also have a number of positive effects that cannot be overlooked. Here are what I find to be the top three reasons why IFRS 9 is a good thing for financial institutions. #1 Credit appraisal and pre-sanction processes Going forward, financial institutions are expected to tighten their credit appraisal processes. Time and again, improper and inadequate credit appraisal processes have been observed as the main internal cause of loan accounts turning bad, alongside external causes such as economic slowdown, willful defaults and so on. Given the requirement to recognise forward-looking expected credit losses, financial institutions are likely to further strengthen their underwriting standards and credit appraisal processes. #2 Closely monitor assets Financial institutions are expected to closely monitor the “watch-list” category of assets that are likely to slip into the “under-performing or credit impaired” category, to avoid assets being classified into stage 2 or 3.  
In addition, they are expected to make every effort to ensure that assets classified into stage 1 as of the last reporting date do not migrate to stage 2 or 3, owing to the lifetime ECL impact. Needless to say, these efforts will be initiated proactively in order to safeguard an institution’s distributable profits and brand value, and to keep NPA (Non-Performing Assets) levels under control. All of this is expected to further strengthen existing post-disbursal follow-up processes and streamline collection efforts. #3 Capital & business planning IFRS 9 impairment guidelines have a direct effect on retained earnings and thus on the bank’s capital. Given the capital constraints, it is quite likely that each line of business within an institution will be mandated to efficiently utilize the scarce capital resources allocated to it and maximize profits. Other positive effects Being a crucial part of an institution’s risk and finance transformation journey, these guidelines will also bring about strong coordination between risk and finance teams. IFRS 9 guidelines are expected to indirectly drive sound documentation of internal credit risk management policies and procedures, and to bring better transparency to investors, the public and other stakeholders through enhanced market disclosures. In conclusion, IFRS 9, with its forward-looking impairment model, is expected to further augment the efficiency of the banking system. This impairment model will have served its main purpose if another financial crisis is averted and the interests of shareholders and the general public are thereby safeguarded. Only time can prove the effectiveness of the revised set of guidelines. Although IFRS 9 is an accounting standard, it is expected to play its part in strengthening financial institutions’ credit risk management systems in particular, and in bringing about a sound banking system at large. 
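As an illustration of how the staging rules translate into provisions, here is a minimal sketch assuming the commonly used ECL = PD × LGD × EAD decomposition, with hypothetical figures. A real implementation would use PD term structures, multiple scenarios and discounting, all omitted here:

```python
from dataclasses import dataclass

@dataclass
class Exposure:
    ead: float          # exposure at default
    lgd: float          # loss given default, as a fraction of EAD
    pd_12m: float       # 12-month probability of default
    pd_lifetime: float  # lifetime probability of default
    stage: int          # IFRS 9 stage: 1, 2 or 3

def expected_credit_loss(e: Exposure) -> float:
    """Stage 1 books a 12-month ECL; stages 2 and 3 book a lifetime ECL."""
    pd = e.pd_12m if e.stage == 1 else e.pd_lifetime
    return e.ead * e.lgd * pd

loan = Exposure(ead=1_000_000, lgd=0.45, pd_12m=0.02, pd_lifetime=0.08, stage=1)
print(round(expected_credit_loss(loan), 2))  # 9000.0 (stage 1, 12-month ECL)

loan.stage = 2  # the same loan migrating to stage 2
print(round(expected_credit_loss(loan), 2))  # 36000.0 (lifetime ECL)
```

The jump from 9,000 to 36,000 on the same loan shows why institutions watch the stage 1 to stage 2 boundary so closely: migration alone, with no change in the exposure, multiplies the provision.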
Krishnamurthy Venkatraman is a Senior Principal Product Manager for Oracle Financial Services Analytical Applications. For more information, contact Geetika Chopra. The views expressed herein are the views of the author and not necessarily the views of the employer.


Analytics

From FATCA to GATCA: The Making of Global Tax Network (Part 1)

"If you've never heard of FATCA, I don't blame you. Few people have, and even fewer fully grasp what it really means… Now that FATCA has become a fait accompli, the foundation has been laid for GATCA. In the end, this means a permanent record of every penny you have ever earned, saved, borrowed, or spent anywhere in the world will be available in an instant to be analyzed and scrutinized." - Casey Research, International Man You must have heard about the recent Panama Paper leaks which by far is the largest leak of its kind, releasing more than 11 million documents containing transactions for the last 40 years. The Panama based company, Mossack Fonseca, helped the global elite in establishing more than 214,000 shell companies (See Figure 1) to evade tax and lauder money. However, this massive leak exposed only a small part of the tax evasion industry (which is estimated between $21-$32 trillion) as obviously this firm is just one of many firms in dozens of international tax havens providing their services to the global select few. Figure 1: Offshore shell companies created by Mossack Fonseca (source: http://www.financialmail.co.za/) So the bigger question going forward is, “How are we going to check tax evasion – through a well orchestrated process or through leaks?” How an international process can deter these transgressions, or better still, how can they catch the rogues? And how the FIs are going to catch up with all the action going on? And what does this mean for the IT and business, which is already crushing under the regulatory pressures? What’s the Roadmap any bank should follow to be on the top of this? All these questions and the complete landscape will be addressed in this and the upcoming post! Keep tuned in. By the way, like every cloud has a silver lining, these leaks brought GATCA into public consciousness so much so that in few days after the release of the Panama papers, Panama finally agreed to comply with the standard! 
Well, before we proceed, it would be useful to give a brief overview of FATCA and of how and why GATCA came into the picture (not to be confused with ‘Gotcha’ – though the intention is similar!). FATCA, a foot in the door? It is estimated that the U.S. Treasury loses as much as US$100 billion annually to offshore tax non-compliance. FATCA is a US law designed to prevent tax evasion by US citizens (and residents) through the use of offshore accounts. The FATCA provisions were signed into US law in March 2010. The law is designed to tackle the non-disclosure by US citizens of taxable income and assets held in foreign accounts. FATCA has had a far-reaching impact on US-based companies as well as foreign companies with US assets or clients. Under the new provisions, a Foreign Financial Institution (FFI) may enter into an agreement with the IRS requiring it to report information on the FFI's US accounts. An FFI that enters into such an agreement becomes a Participating FFI. If an FFI does not enter into an agreement with the IRS, all relevant US-sourced payments (such as dividends and interest paid by US corporations) will be subject to a 30% withholding tax. The same 30% withholding tax will also apply to gross proceeds from the sale of relevant US property or stock investments. All FFIs must comply with FATCA or be subject to withholding (effectively cutting the institution off from any profitable US investment opportunities). Under FATCA, the United States is not required to give anything in return. FFIs were all too willing to provide the information in order to avoid the 30% withholding tax, but unfortunately most could not do so without being in violation of their home country’s data protection and privacy laws: it would be illegal to send client information to a third party or government. Violating the home country’s privacy laws could mean the institution being shut down in the very jurisdiction where it was located.  
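A toy calculation, with invented figures, shows how much is at stake for a non-participating FFI under the 30% withholding described above:

```python
# Illustrative only: the 30% FATCA withholding applied to US-sourced
# payments made to a non-participating FFI. All figures are invented.
WITHHOLDING_RATE = 0.30

def fatca_withholding(payment: float, participating_ffi: bool) -> float:
    """Tax withheld on a US-sourced payment (dividends, interest,
    or gross sale proceeds); participating FFIs are not withheld on."""
    return 0.0 if participating_ffi else payment * WITHHOLDING_RATE

# A non-participating FFI receiving a $50,000 dividend forfeits $15,000:
print(round(fatca_withholding(50_000, participating_ffi=False), 2))  # 15000.0
print(round(fatca_withholding(50_000, participating_ffi=True), 2))   # 0.0
```

At this rate, the economics are stark enough that registering was, for most FFIs, never really a choice – which is exactly the leverage the law was designed to create.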
Thus, this unilateral approach to FATCA was not workable, and the US had to come up with a more palatable plan. The US had to convince the governments of other countries to get on board and help ease FATCA’s implementation. The fruit of this plan took the form of the so-called IGA. Intergovernmental Agreement (IGA) The FATCA agreements with other countries are based on two Model IGAs. These agreements form the legal canopy under which participating FFIs can pass the required information on to the IRS without violating their home country’s data protection and privacy laws. The first agreement, known as the Model 1 IGA, requires FFIs to report all FATCA-related information to their own local governmental agencies, which then pass it on to the IRS. Such an FFI is treated as “deemed-compliant” with FATCA, and will generally not be subject to the 30% withholding. Most of the Model 1 IGAs are reciprocal (to an extent), requiring the U.S. to provide certain information about residents of the Model 1 country in exchange for the information that country provides to the U.S. An FFI covered by a Model 1 IGA does not need to sign an FFI agreement, but it does need to register on the IRS’s FATCA Registration Portal or file Form 8957. The Model 2 IGA requires FFIs to report information directly to the IRS. Under such an IGA, FFIs need to register with the IRS and sign an agreement modified to reflect the IGA. To date, only 14 countries have signed a Model 2 IGA with the IRS, the major ones being Japan, Hong Kong and Switzerland. Figure 2: FATCA Implementation Timeline (source: http://themarketmogul.com/) How big is FATCA? To date, a total of 113 jurisdictions have signed FATCA agreements, covering pretty much all major countries and jurisdictions of the world. 
Around 77,000 banks and other FFIs have already registered with the US and received a Global Intermediary Identification Number (GIIN) to comply with FATCA. Most of the agreements are based on the Model 1 IGA. FATCA requires almost all of the banks in the world to register with the IRS if they want to continue the privilege of earning, moving, or even touching U.S.-source income. FATCA seeks to create an international banking and financial database, whose purpose is to enable the respective FATCA-friendly countries to track the compliance of their own citizens. Now comes the Big Brother – GATCA! As the world becomes increasingly globalised, it is becoming easier for taxpayers to make, hold and manage investments through financial institutions outside their country of residence. Vast amounts of money are kept offshore and go untaxed to the extent that taxpayers fail to comply with tax obligations in their home jurisdiction. Offshore tax evasion is a serious problem for jurisdictions all over the world, and tax administrations need to work together to ensure that taxpayers pay the right amount of tax to the right jurisdiction. An open international architecture, where taxpayers operate cross-border but tax administrations remain confined to their national borders, can only be sustained where tax administrations co-operate. After all, Americans are not the only tax evaders! So, for example, if India wishes to have a FATCA-like treaty with Singapore, under which both countries report to each other the income and capital gains of Indians and Singaporeans respectively, there needs to be a taxation treaty between the two countries. Now imagine the permutations and combinations of every country in the world signing treaties with every other country. That is why, in February 2014, the OECD presented a common global standard for the automatic exchange of tax information between countries. 
The standard is officially called the Automatic Exchange of Information (AEOI), and is informally referred to as GATCA (Global Account Tax Compliance Act). So you are right to guess that a new era of global tax reporting is coming, in which it will become very difficult to evade tax either in your own country or abroad (at least in theory, for now). Get excited for what’s to come next! In the next and final blog of this series, we’ll see what GATCA (CRS) is and how it is set up to be the winner! Gaurav Kesarwani is a Consultant with Oracle Financial Services Analytical Applications. He can be reached at Gaurav.Kesarwani AT oracle.com.

"If you've never heard of FATCA, I don't blame you. Few people have, and even fewer fully grasp what it really means… Now that FATCA has become a fait accompli, the foundation has been laid for GATCA....

Banking

Can ‘IoT’ Shake Banks?

Blog By: Tushar Chitra, Senior Director, Product Marketing, Oracle Financial Services As disruptive technologies continue to fuel innovation, banks are not likely to remain unaffected by the emergence of the Internet of Things (IoT). While IoT has been around for longer than the surrounding hype, it’s time to acknowledge that IoT is real and growing tremendously. Research* suggests that around 6.5 billion things will be connected in 2016, growing to 20.8 billion by 2020 – creating a vast network of appliances, cars, smartphones, manufacturing equipment and wearables with the potential to reshape our lives. Are banks prepared to seize the new opportunities presented by IoT? Here is a three-dimensional perspective on how banks can gear up if IoT does indeed reach a tipping point. 1. Get ready to Connect Banks need to constantly re-design customer experiences to reflect customers’ lifestyle trends and to stay ahead of niche players offering innovative services. We have seen a transformation in the way customers interact with a bank, with touch points ranging from telephone, web and mobile to the now-popular apps and wallets. IoT is set to further accelerate this proliferation of customer touch points, requiring banks to take a fresh look at how they establish the customer connection. For instance, we saw how the Amazon Dash Button enables a consumer to place a direct order for a refill and make the payment with a single push of the button. Consider a scenario where a car acting as a wallet can make payments on the go. Biometric identification on the car’s ignition start button can help a fuel dispenser instantly authenticate the driver and authorize the car for refueling. Or, when driving into a parking lot, the lot can anticipate the car’s arrival, direct it to a vacant spot and deduct payment as it leaves – all without the driver having to stop or look around. 
And when the car is close to a Best Buy store, the bank can make an attractive loan offer on a television set that the consumer has been browsing online and on social media for days. What if the insurance company could gain insight into the car’s health and the driver’s behavior? Banks and financial services institutions can leverage the IoT ecosystem to connect and act as trusted advisors, using information from the networks built around the life of the customer – from cars, kitchens, toasters, coffee makers, washing machines, refrigerators, medical records, healthcare providers, doctors, retailers, weather, traffic signals, the web, social media, and so on. 2. Design for Scale and Agility As IoT innovations cut across industries, they will enable banks to collaborate with their customers and businesses in new and unprecedented ways. Undoubtedly, there will be a deluge of data stemming from IoT, creating an impact unseen since the birth of the internet in the 90s. We will see ‘cloud’ applications and IoT produce a virtuous circle of growth. This data will be up for grabs, and organizations across industries will benefit based on how well they can leverage big data insights in real time. Banks can draw insights from the available data to help bank personnel in functional roles – say, credit officers identifying risks and evaluating creditworthiness, or collection agents using the data for debt recovery. A bank’s origination process is likely to be disrupted, considering the vast network of devices, people and data and the resulting hyper-connectivity. Loans can potentially originate from failing washing machines, refrigerators and car service stations. Banks must be able to adapt to these changes with improved operational strategies and by designing for agility and scale within the bank. IoT in banking has the potential to redefine a number of functional roles, business processes and products. 
IoT is capable of creating an impact on the banking industry similar to what Ford’s Model T created for the automotive industry in the early 20th century. Looking back, the adoption of assembly lines in the manufacturing of the Model T created massive efficiencies, scalability and cost reduction, which led to its success and to widespread car usage. Similarly, IoT will raise the bar for operational agility within the bank. In order to stay relevant, banks will be forced to readjust their technology requirements. Banks should look at ‘business technology’ as an approach to prepare for IoT. In this approach, customer interests are identified first and then technology requirements are designed to meet those needs – different from traditional information technology, which is highly focused on internal operations. To manage both the complexity and the level of uncertainty, banks must build a modern banking architecture and an application landscape that is flexible and supports real-time processing capabilities synchronized across channels. Such increasing levels of sophistication in the front end, driven by IoT, will push banks to move from an omni-channel world to a world of ‘intertwined’ channels. Transaction volumes will see massive growth with an increase in the number of customer touch points. Banks that are capable of scaling up to the demands of new banking methods stand a better chance of reaping the benefits of IoT. Conversely, a traditional bank operating a complicated legacy system runs the risk of losing its competitive edge to more agile banks. The chance of cost escalation while trying to cope with a sophisticated front end cannot be ignored in a legacy system. Traditional banks have the choice of running their operations on an updated legacy system or moving to a banking platform that is modern, nimble and scalable. 3. 
Bake in Security On one hand, IoT builds security through more monitoring devices and sensors; on the other, it increases vulnerabilities because of the number of entry points available to hackers. With more and more unsecured devices connecting to the internet, banks run the risk of becoming easy targets for hackers. We have witnessed enough data breaches across organizations in the past to know why banks should be more vigilant. The current level of security within the banking landscape is simply inadequate for an IoT ecosystem. Banks must have a strategy in place to mitigate potential risks by building sufficient safeguards against data siphoning. For example, suppose a printer on the IoT network is running low on ink and the cartridge needs replacement. The printer can place a direct order for a new cartridge with an online retailer by itself, using your banking credentials linked to your retailer account – as authorized by you, of course. If a hacker gains access to your printer, he may also be able to gain access to your retailer account and place orders for a host of things that he may want to buy! Consider, too, that your printer and the retailer account could be linked to your accounts in other applications and on a host of other devices such as laptops, mobiles and tablets. The security risks have just multiplied exponentially! To step up security in an IoT world, banks must add increased levels of authentication and authorization, which can include biometric measures and geo-location/contextual verification. A holistic approach to security will encompass people, data, networks, applications and processes. We can also expect core banking regulatory and compliance mandates to get increasingly stringent, and mechanisms for guarding customer information and privacy to be tightened. 
Corporate CEOs and CIOs will be uncompromising when evaluating their choices for a banking partner, and security will be a critical factor influencing their decision. Implications With IoT, the opportunities for multi-device banking will continue to rise, bringing in several new customer touch-point apps and increasing the need for more real-time data integration and reconciliation within a bank’s application landscape. Banks may be required to expose many banking functions as API services to these customer touch-point applications. This emphasizes the significance of a common customer data hub and a service-oriented architecture. Banks will need these two elements to form a strong foundation that can house and serve a constantly evolving application landscape. While these are key ingredients, getting the right design and level of granularity will determine success. It also drives home the point that banks will have to gain a deep understanding of their customers to derive maximum benefit from the new and emerging opportunities. However, gaining insights that enable real-time contextual customer engagement requires dealing with a large amount of external unstructured data as well as structured data from within the bank’s application landscape. Early adopters who are ready with an infrastructure and intelligent strategies will be better poised to gain a competitive advantage and win new customers, especially millennials. Finally, going back to the question of whether IoT can shake the banks: the answer may lie in the answer to another question – how ready is your bank? * http://www.gartner.com/newsroom/id/3165317   Tushar Chitra is the Senior Director for Product Marketing at Oracle Financial Services. He can be reached at Tushar.chitra AT oracle.com.


Analytics

You have the Data, Now take Control of Business

We all know the age-old saying: “Knowledge is power.” It’s still true, perhaps more now than ever. I’ve noticed that the more data regulatory bodies require us to create and archive, the more access we have to new information about business, clients and assets. It’s time to make use of this vast repository of data to create a seismic shift in the way we work. Data becomes more valuable when we organise it in a structure that includes: Innate transparency – giving every level of management lateral and downward visibility. Increased flexibility – virtualising and/or consolidating applications and simplifying structures. Reduced complexity – homogenising the data architecture to reduce cost and complexity. Wise banks and other financial organisations are offsetting the cost of compliance with virtualisation and consolidation of applications, and putting the savings back into the business, creating even more opportunities. According to EY's Global Banking & Capital Markets report, John Gerspach, CFO of Citi, said: “Of the US$2.9 billion expense savings that we’ve gotten through our efficiency efforts, approximately 50% are being consumed by additional investments that we’re making in regulatory and compliance activities.” We can now pinpoint the profitability of business lines and clients in ways that were not previously possible, but two levels of support are needed to make this work in the wider business. Financial support recognizes that investment can provide a tangible return beyond regulatory goals. Managerial support relies on a company-wide understanding of how a company can grow through better data. A well-defined approach that matches the needs of individual business lines, as well as the wider business, can be supported with accessible data used to identify the profitability of business lines and clients. Howard Mather is a Principal Sales Consultant for Oracle Financial Services. He can be reached at howard.mather AT oracle.com.


Banking

Off the Beaten Path – Exploring Banking Solutions for Women Owned SMEs

Women make up more than half of the world’s population, and studies predict that within a decade their impact on the global economy as entrepreneurs, employees, producers and consumers will be as significant as that of India or China. In 2012, an estimated 126 million women were starting or running new businesses in 67 economies around the world. Despite the growth of women-owned businesses, there is a tremendous gap in their access to finance. It is estimated that over 70% of women-led SMEs are either unserved or underserved financially, which translates to a roughly $300 billion market opportunity for financial service providers1. Clearly, financial institutions have not yet realized the business opportunity of meeting the specific financing needs of women entrepreneurs as a distinct customer group. The numbers alone make a strong business case. There are multiple financial and non-financial reasons why women-owned SMEs are reluctant to approach a bank for their financial needs: Social and cultural pressures – In some societies, running a business is often seen as a male venture, and women are traditionally associated with home and hearth. This heavily impacts their perception of and path to self-employment, but more importantly impacts their decision-making process. Limited networks – Women have fewer professional connections than men, which adversely impacts their access to business opportunities, information and contacts. They also have fewer role models and mentors, which adversely impacts their growth. Attitude to risk – Women worry about whether they will be able to pay back loans, and the ‘what if’ question creates trouble for them. They lack confidence in their ability to make up any investment losses through future earnings, and see themselves as savers rather than investors. If given the opportunity, through innovative solutions to the constraints stated above, women can thrive and be a profitable segment for financial institutions. 
Banks, on the other hand, hesitate to cater to the unique needs of women-owned SMEs due to complex internal financing policies and stringent risk assessment mandates. A few other reasons are cited below: Perception and risk evaluation – Financial institutions perceive women entrepreneurs as high risk and low return due to the size of their enterprises2. This negatively influences their appetite to lend, whether the perception of higher risk is based on actual facts, experience or conjecture. Financial institutions tend to offer credit only to those whom they know well, and in many cases tracing credit histories is difficult. Collateral ownership – Banks usually require collateral of much higher value than the loan amount, which is unavailable to many women. And even when they do hold land titles in their own name, women are reluctant to provide these as collateral, since losing possession of these documents would adversely affect them as well as other family members. Financial literacy – This is a crucial factor, as women tend to have less financial knowledge and more limited access to formal financial products than men. Even when they have access to information on the financial services and market opportunities available to them, women may be less equipped to process it, possibly due to prevalent gender inequality and discriminatory regulations, laws and customs. In a changing global financial landscape, where alternative non-bank sources of SME finance are becoming more prominent, it is vital for banks to reposition themselves and adapt to an emerging client base. Enabling women entrepreneurs to gain access to credit not only opens up growth opportunities for their businesses but also marks an entry point for the consumption of other banking services. Here are a few recommendations that banks can follow: A ‘know your customer’ strategy should be applied in an effort to understand the size, sector, and performance patterns of women-owned SMEs in a bank’s portfolio. 
Another area of profitability for banks lies in increasing women’s financial literacy by offering educational sessions and easy-to-understand products. In today’s banking world of robo-advisors and mobile-only banks, there is a sea of opportunity to serve women digitally, be it for inclusion, customized offerings or contextual marketing. By leveraging new technologies such as biometrics, online community investment, and data-driven individualized products, banks can overcome the gender gap and increase their customer base. Unconventional credit scoring models can also be developed using data derived from mobile phone usage, or by utilizing alternative methods designed for women who do not have access to traditional credit assets or a credit history. FIs can make digital services available to women by working in partnership with fintechs and governments to boost their use of banking services while lowering transaction costs by building a comprehensive digital ecosystem. Banks must think beyond providing financial assistance in supporting this segment: launching leadership programs and mentoring women can help bridge the gap. Finally, when banks provide a holistic collection of products and services for women, at both the personal and business level, they make it easier for them to manage their finances in one place. Banks then stand a higher chance of becoming ‘the primary bank’ of their women customers. FIs that have adopted the above-mentioned practices have an early-bird advantage and are already enhancing their profits, but this market is growing fast and has plenty of business to offer – and banks that ignore this segment may lose out entirely in the race for market share. 1, 2: IFC Study, women world bank - 2014. Tushar Chitra is the Senior Director for Product Marketing at Oracle Financial Services. He can be reached at Tushar.chitra AT oracle.com.
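As a purely hypothetical sketch of the ‘unconventional credit scoring’ idea above: a logistic score over alternative-data features such as airtime top-ups and utility payments. The feature names and weights here are invented for illustration; in practice the weights would be fitted on actual repayment outcomes rather than set by hand:

```python
import math

# Hypothetical feature weights for an alternative-data credit score.
# These are illustrative values, not a calibrated model.
WEIGHTS = {
    "months_of_airtime_topups": 0.08,     # regular top-ups suggest stable cash flow
    "mobile_money_txns_per_month": 0.05,  # activity in mobile-money channels
    "utility_bills_paid_on_time": 0.30,   # on-time payment history
}
BIAS = -3.0

def score(applicant: dict) -> float:
    """Logistic score in (0, 1): higher means lower estimated default risk."""
    z = BIAS + sum(WEIGHTS[k] * applicant.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

applicant = {"months_of_airtime_topups": 24,
             "mobile_money_txns_per_month": 15,
             "utility_bills_paid_on_time": 6}
print(round(score(applicant), 3))  # 0.813
```

The point of the sketch is that an applicant with no formal credit file can still produce a usable risk signal, provided the institution can source and govern the alternative data responsibly.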


Analytics

It's time to make use of your new asset

I’ve been thinking a lot about regulation lately. We often see it as a hoop to jump through, but I’d like to pose an alternate view: what if the outcome of regulation could be used as a business asset, a tool to create more opportunities? That is exactly what is happening with financial risk data. The consistency and completeness of data required by regulation has become a valuable asset. According to this paper, it offers an opportunity to add real business value to savvy banks and other financial institutions. We now have access to vast quantities of business intelligence. What’s most interesting is the way it gives us the opportunity to analyse and interpret information to create new opportunities and ways of working. This change can be revolutionary. It can provide an understanding of your business, as well as its cash flows and interdependencies, at a level never before possible. It provides an opportunity to take a strategic approach so that delivering value becomes a constant, achievable goal. But how can we do this? It’s easier than it seems. First, show your stakeholders how access to better data can help the company grow; this builds an organisation-wide appreciation for the systems that make data accessible. Secondly, ensure that the applications holding the data are structured simply enough to make it accessible and understandable to all levels of management. This is where a more transparent and malleable data architecture is key. A clear view of what is known and unknown gives management control over the business in all areas: there has never been more clarity about the assets held, or more insight into specific trading desks at investment banks. At a time of cost cutting, putting this data to work across the wider business is a route to growth-boosting productivity. We can now look to the future with confidence. Howard Mather is a Principal Sales Consultant for Oracle Financial Services.
He can be reached at howard.mather AT oracle.com.


Analytics

The Umbilical Cord Between Business Model and Compliance

In her fourth video blog of the series, Lessons NOT Learned, Saloni Ramakrishna discussed the lessons that the financial industry has not learned from failures, collapses, takeovers, bailouts and fines that grow more astronomical by the day. In her fifth video blog, Ms. Ramakrishna discusses the fundamental connection between business model and compliance, and how compliance has remained in a reactive stance due to a lack of appreciation of this connection. She speaks of the two components of the Business Model (BM), the Target Strategy Model (TSM) and the Target Operating Model (TOM), and explains how compliance sits at the core of the Target Operating Model. Weaving compliance into the TOM creates the foundation for positive, active compliance that is a value multiplier. Watch Saloni Ramakrishna’s fifth video blog of the Compliance Risk Management series, where she explains why compliance needs to be a way of doing business, rather than a side function, to enhance the bottom line. The next installment in the Enterprise Compliance Risk Management video blog series is Practical Solutions to Compliance Challenges. Ms. Saloni Ramakrishna, author of Enterprise Compliance Management - An Essential Toolkit for Banks & Financial Institutions, is a financial services industry practitioner with nearly three decades of experience. She brings to the table rich hands-on knowledge and real-world perspectives in the risk, compliance and performance areas. In her role she interacts with senior management of banks, consulting professionals and regulators across multiple countries. She is invited to share her views on industry trends by national and international finance forums such as GARP, Ops Risk Asia, RiskMinds and Asian Banker, and her ideas have appeared as articles and quotes in print and online media and in television interviews.


Analytics

Big Data: From Hype to Insight - Part 4 Banking on Big Data

I hope you liked the third post of the Big Data series, and welcome to the fourth and final part. In the preceding posts we covered the A-Z of Big Data technology, so the natural progression is to understand how banks can incorporate the insights delivered by analytics into their business models: not to gain a competitive advantage, but simply to keep up with the competition and stay relevant in today's market! I'm taking banking as the use case because it has a very compelling story to tell: not only do we deal with banks on a daily basis, but banks are also being digitally transformed by new emerging technologies on one hand and ever more demanding customers and regulators on the other. These challenges need to be met while banks differentiate their offerings in a commodity banking business. When it comes to how banks can use Big Data analytics to their benefit, the business drivers can be summed up in two broad categories. First, customer data analytics: how banks can monetize Big Data to increase customer wallet share and relationship intimacy. Second, how banks can check and predict financial fraud and operational losses, which can save millions of dollars while managing reputational risk. As you can see in Figure 1 below, all aspects of the banking ecosystem, from customer acquisition to engagement to retention, can be predicted and forecasted with the numerous Big Data analytics and machine learning models available today for commercial use: Figure 1: Big Data Banking Opportunity Landscape (source: http://www.Mcr.ae/) 1. Customer Acquisition, Engagement and Retention: Until recently, customer acquisition, engagement and retention strategies were mostly built on historical information or gut instinct. To keep a competitive edge, financial institutions must leverage big data analytics to master these strategies.
Once they harness the power of data and adopt a more proactive and personalized marketing strategy, the results will follow. Bank customers generate massive amounts of data every single day: purchase history, profile data, browsing history, product usage patterns and social media behavior. Used wisely, this explosion of data can be harnessed to personalize marketing efforts tailored to customers' interests, adjust product strategy based on usage patterns, and preemptively predict which customers are likely to leave. To successfully acquire, engage and retain customers, and ultimately gain a competitive advantage with Big Data analytics, banks should consider the analytical strategies below (see Figure 2). Figure 2: Predictive Analytics fueled by Bank Digitization (source: www.mahindra.com) Customer Acquisition: an analytics goldmine, if you can tap it! Acquiring customers is probably the topmost priority for banks; however, it is getting more and more difficult as banking has become a commodity business. Banks need to differentiate themselves not only on their offerings but also on service. This customer intimacy is set in motion by knowing your target customers better than the competition does. Marketers face the challenge of making the right offer to the right customer at the right time, which ensures a superior customer experience and in turn leads to increased wallet share. The key to delivering this is knowing as much as possible about the potential customer in the shortest time frame. Data analytics can play a massive role in solving this problem by integrating data from both online and offline channels and providing a unified view of the customer. This integrated data feeds the bank's CRM solution, supplying the call center with more relevant leads and personalization, and lets the bank study customers' likes, dislikes, sentiment and behavior patterns in order to offer personalized services.
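The integration step described above, combining offline and online records into a unified customer view, can be sketched minimally with plain Python dictionaries. The field names (`segment`, `branch`, `last_login`, `pages_viewed`) are hypothetical, not a real bank schema.

```python
# Toy sketch: merge offline (branch/CRM) and online (web/app) records
# into a unified customer view keyed by customer ID.
# Field names are illustrative, not a real bank schema.

def unified_view(offline, online):
    merged = {}
    for source in (offline, online):
        for rec in source:
            cust = merged.setdefault(rec["customer_id"], {})
            # Later sources overwrite earlier values for the same field.
            cust.update({k: v for k, v in rec.items() if k != "customer_id"})
    return merged

offline = [{"customer_id": "C1", "segment": "SME", "branch": "Mumbai"}]
online = [{"customer_id": "C1", "last_login": "2017-03-01", "pages_viewed": 14},
          {"customer_id": "C2", "last_login": "2017-03-02", "pages_viewed": 3}]

view = unified_view(offline, online)
print(view["C1"])  # combined offline + online attributes for one customer
```

A production system would of course do this in a data warehouse or CRM platform with entity resolution, but the principle of keying every source to one customer identifier is the same.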
Analytics can process huge volumes of data from different sources and run scoring and segmentation models that help define which products and solutions will best fit a prospective customer. Figure 3 below captures various big data analytics capabilities and how they are helping banks: Figure 3: Big Data impact on Banking (source: www.celent.com) Customer Engagement: the path to profitability! By looking inward to analyze customer usage data, companies can fine-tune product strategies and quickly deploy product improvements to keep customers engaged and satisfied. Historically, product management teams have spent a majority of their time managing reporting processes rather than understanding how end users interact with the product. To be competitive, they must employ data analysis solutions that allow them to quickly correlate changes in user traffic patterns and perform cohort analyses in the context of events such as new releases, A/B tests and outages, to identify what is working and what is not. Armed with insights from this analysis, a bank can focus on increasing value by accelerating new product models. In the same context, 'next best offer' analytics can unlock opportunities to drive revenue and engagement from cross-sell and up-sell strategies. The insights gleaned from big data analytics allow banks to make more accurate decisions and target specific micro customer segments by combining data points such as past buying behavior, demographics and sentiment analysis from social media with CRM data. This helps improve customer engagement, experience and loyalty, ultimately leading to increased sales and profitability. According to one Capgemini study, predictive analytics alone can improve conversion rates seven-fold and top-line growth ten-fold. See Figure 4 below for the analytics journey from unstructured data to customer insights.
Figure 4: Customer Analytics from Data Insights (source: www.slideshare.net) Customer Retention: reduce customer churn with behavioral analytics! Not only can big data analysis help financial institutions acquire new customers and engage current ones, it can also mitigate attrition and help retain customers who may be on the brink of leaving. With big data analysis, organizations can track specific behaviors leading up to a transfer or withdrawal and preemptively identify and engage with the customer to address outstanding concerns or needs. To do so, they need to track activities such as a customer calling in with an outside financial consultant on the line, a change in employment or power of attorney, or recent browsing on the company site for information about transferring funds. By correlating this data, they can determine the statistical relevance of each activity, or combination of activities, that preceded a withdrawal or transfer. By acting on the insights gained, for example by offering relevant promotions, adjusting interest rates, or extending other incentives, companies can significantly increase the percentage of funds they retain that would otherwise have been transferred to a competitor. Needless to say, this drives top-line growth and builds on the effort already expended to onboard those customers. 2. Fraud and Operational Losses Big data analytics is well suited to fraud prediction, detection and prevention. Presently, many organizations remain vulnerable to fraud and financial crime because they are not taking advantage of new capabilities to fight fraud and new threats. These capabilities rely heavily on new and improved big data and analytic technologies that are now available. With these technologies, banks can manage and analyze terabytes of historical and third-party data, far more than they ever could before.
The ability to analyze massive data volumes enables banks to create highly accurate predictive models for recognizing and preventing future fraud. Organizations can also utilize Big Data capabilities for analyzing streaming data in real time using the Spark platform. With it, banks can analyze transactions as they occur, detect fraud as it is happening, and stop it before it causes serious damage. Fraud management is understandably a very attractive application of predictive analytics, given the cost of fraud and AML to banks and FIs, as shown below in Figure 5. Figure 5: Fraud and Anti Money Laundering Global Cost (source: https://theknowlegegroup.org/) Here is an example of how analytics can be utilized against financial crime. As a first step, overall enterprise data can be scrutinized for out-of-limit events, and these compared to relevant KPIs. Analytics capabilities operate both in real-time detection mode (working with live data streams) and in background pattern-building and analysis mode. Big data analytics can detect anomalies such as the simultaneous presence of a SIM card in two locations on the network, or continuous calling to destinations or toll numbers not previously contacted by the particular individual. Fraud patterns can be generated and, using the associated predictors, fraud profiles and fraud propensity models can be built. Enhancing fraud detection and cyber-security using new data sources: many banks are using unstructured data to improve fraud detection and cyber-security, particularly with respect to online banking. Here, web session data is used to create a profile of a user's typical activity patterns. This profile is used alongside other anti-fraud approaches to help identify actions that may be the result of fraudulent access, allowing the bank to impose additional security validation checks in such cases.
This approach uses machine-generated log data and message transmissions from web sessions (i.e., clickstream data) to understand the paths users take through websites. This data contains unstructured elements, such as user and page request details, and is very difficult to analyze with SQL methods given the significant preparatory work required to link the data together. Path analysis running on the Spark compute engine on Hadoop in real time allows checks to be applied while online activity is still in progress, meaning institutions can implement measures to prevent fraud losses rather than reacting post-event. By using a data lake approach, it is possible to create a central Data-as-a-Service platform that stores all relevant data in Hadoop, along with an analytical workflow orchestration layer that manages data quality, processing, model execution and reporting. As we come to the end of this scintillating Big Data journey, I genuinely hope you got a good perspective on Big Data technologies, architecture and analytics, and on how Big Data insights can be a game changer for all industries, domains and organizations, especially banking. As we wrap up this series, please share any comments or questions. Gaurav Kesarwani is a Consultant with Oracle Financial Services Analytical Applications. He can be reached at gaurav.kesarwani AT oracle.com.
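The anomaly-detection idea discussed in this post can be illustrated with a toy per-customer z-score check. In practice such logic would run as a model over live streams (for example on Spark); the threshold and transaction amounts here are purely illustrative assumptions.

```python
# Toy per-customer transaction anomaly flag using a z-score over history.
# Real systems would run richer models over live streams (e.g., on Spark);
# the threshold and data here are illustrative only.
from statistics import mean, stdev

def is_anomalous(history, amount, threshold=3.0):
    """Flag `amount` if it lies more than `threshold` std devs from the mean."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > threshold

history = [120.0, 95.0, 130.0, 110.0, 105.0]
print(is_anomalous(history, 118.0))   # a typical amount, should not be flagged
print(is_anomalous(history, 5000.0))  # far outside the usual range, flagged
```

The same shape of check (score an event against a learned profile, flag outliers, escalate for review) underlies the SIM-location and clickstream examples above, just with far richer features.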


Analytics

The Data Governance Journey: Adjusting to Cultural Change, A follow up conversation from FIMA US

I attended the FIMA US Conference last week and had a great experience meeting with many of you there. The event was a good mix of financial institutions, with attendees from banks of all sizes. The attendee profile consisted of a wide variety of individuals focused on data management and governance, risk management, and the technologies surrounding this space, including several CDOs. This year's agenda was driven by data governance and management; no matter where you sit in the bank, this is at the top of the priority list! I had the privilege of hosting the pre-conference roundtables on the topic of data governance, and there was a lot of feedback. It was discussed that data governance, and governance within the organization as a whole, is definitely not one-size-fits-all. Management is fully aware of the need for governance and the benefits surrounding it, but how it can be fully achieved is not clear to everyone. Funding for data management and data governance as a holistic solution is always an issue; budget typically gets allotted to tactical spending on regulatory requirements. Management is unsure how to solve the problem: it is a long journey, and though the benefits are known, they are hesitant to venture out on this path. Organizations are at various stages of the journey, some just in the planning phase and others already executing. Here are some of the steps that organizations are starting with:

Start small
When looking at the number of data sources a bank has among its various lines of business and departments, it is easy to get overwhelmed and not know where to start. Within the bank's overall data strategy, we recommend that banks pick one line of business or project to begin with and move on from there. For example, Risk and Finance transformation is a good starting point from a data management and governance perspective, and this needs to be the case with the cultural shift as well.
The Financial Conduct Authority has stated that British retail banks have "got it" and are making the cultural shift, understanding the need to guard against misconduct. One main point discussed in this article is that it doesn't happen overnight, and to the outside world it may not appear that change is happening at all.

Take a cue from others
Many changes within an organization can be planned for by looking at your peers. Why reinvent the wheel? Deutsche Bank has established an entire department for corporate culture and values, recognizing that responsibility after the economic crisis has to be its main focus. Other organizations are following suit: The Royal Bank of Scotland has a mission of restoring RBS to being "a really good bank for customers, shareholders and society as a whole." Citi is changing its culture to bring more accountability and discipline through the use of scorecards for top executives, and Barclays is leading its 'Transform' program, intended to make Barclays the 'Go-To' bank.

It can't be just the executives
Within the bank structure, executives are not in front of customers, and there is a long line of employees between them and the front line. It is the responsibility of the bank to ensure front-line employees are aware of the culture changes; however, a study by the New City Agenda think-tank and Cass Business School concludes that messages from top banking executives about cultural change are still not reaching many front-line staff. This is detrimental not only from a brand and customer satisfaction perspective, but also from a data and regulatory perspective. For example, when a front-office employee is working with a customer to open an account, it is their responsibility to ensure all data fields are entered accurately. If a zip code is skipped, this can ultimately affect a slew of other data requirements and reports.
The CDO of a Tier 1 global bank shared why front-office education is needed, so that staff have a true understanding of why this is happening. As part of their awareness campaign, they are promoting stories about bad data to grab employees' attention. One example was a personal account that had been open for over 100 years: even if the customer were over 100 years old, it is highly unlikely they had held the account that long. This is a basic example, but a good way to show how something like this can have a long-lasting effect. While working on shifting the organizational culture, it was discussed how technology and data platforms can help the journey. Every financial institution expects a unified data platform, an operational governance layer, data integration abilities and regulatory reporting facilitation. Please join me next week as I discuss how technology and data platforms can benefit your data governance journey. Lovell Mathews is the Product Manager for Data Governance at Oracle. He can be reached at lovell.mathews AT oracle.com.


Analytics

Big Data: From Hype to Insight - Part 3 Analytics and Visualization

I hope you liked the second part of the Big Data series, and welcome to part 3. Building on the context set up in the first two posts, it's time now to look into the exciting world of Big Data analytics! This is the filtered outcome of the entire infrastructure we discussed in the second post, and it delivers the business insights that guide strategic decision making. I believe that what you do with data is far more important than how big or small it is. By 2020 the jargon "Big Data" may no longer exist, since there won't be anything like "small data", but we will still be talking about the all-important data analytics and the juicy insights we get from it. What is Big Data Analytics? Big Data analytics is a class of software products that support predictive and prescriptive analytics applications running on big data computing platforms such as Hadoop and NoSQL databases. These tools provide frameworks for mining and analyzing data, discovering patterns in real time, proposing analytical models to recognize and react to identified patterns, and then enhancing the performance of business processes by embedding the analytical models within the corresponding operational applications. Business Intelligence vs. Business Analytics Why don't business intelligence tools work for big data? The data sets are so large, fast and complex that they require advanced data analysis tools, referred to collectively as business analytics; the process of applying them is known as data science (and the people who do it are known as data scientists, for lack of better terminology). Banks, to date, deal with analytics capabilities designed for structured data, such as basic queries, predictive modeling, optimization and simulations. However, structured data analytics cannot support web mining, spatial-temporal analysis and data visualization, to name a few, hence the need for big data analytics. See Figure 1 below for the point-by-point differences between BI and BA.
Figure 1: Business Intelligence vs. Business Analytics Self-Service BI, Any Takers? Traditional BI has always been about building a centralized warehouse and taking a consolidated approach to reporting and dashboarding. The mantra was centralization, with IT as the producer and business as the consumer. Recently, however, the rules of the game changed, and many new entrants started selling directly to the business, circumventing IT. This happened as CIOs started to see analytics as a business phenomenon rather than something driven by IT. It disrupted the model, and it has been accelerating ever since. This is what is known as self-service BI: BI that is bought, run, managed and owned by the business rather than IT. According to the Gartner Hype Cycle for business analytics and intelligence (Figure 2 below), the next few years will focus on the idea that business users should be able to seek answers to typical data-driven questions on their own. This is also termed "data discovery": connecting data discovery software to your data sources and allowing business users to discover enlightening answers that help them in their work. The Hype Cycle can also be broken down to understand what is passé, what is expected now, and what customers will expect a couple of years down the line. To better understand this shift in expectation and delivery in the Business Intelligence and Analytics (BI&A) market, we can divide it into versions, BI&A 1.0, BI&A 2.0 and BI&A 3.0, which makes it clear who is going to win this war of data analytics and who is going to be left behind.
Figure 2: Hype cycle for Business Intelligence and Analytics, 2015 BI&A 1.0 (or Business Intelligence and Reporting) The BI&A technologies and applications currently adopted in the industry can be considered BI&A 1.0, where data is mostly structured, collected by companies through various legacy systems, and often stored in commercial relational database management systems (RDBMS), with reporting largely done using OLAP cubes. According to the Gartner BI Hype Cycle analysis, the following eight tools/technologies can be considered part of BI&A 1.0: reporting and dashboards, ad hoc query, search-based BI, OLAP, interactive visualization, scorecards, predictive modeling, and data mining. BI&A 2.0 (or Big Data Analytics) Since the early 2000s, the Internet and the Web began to offer unique data collection and analytical research and development opportunities. IP-specific user search and interaction logs, collected seamlessly through cookies and server logs, have become a new gold mine for understanding customers' needs and identifying new business opportunities. Web intelligence, web analytics, and the user-generated content collected through Web 2.0-based social and crowd-sourcing systems have ushered in a new and exciting era of BI&A 2.0 research, centered on text and web analytics for unstructured web content. According to the Gartner BI Hype Cycle analysis, the following tools/technologies can be considered part of BI&A 2.0: opinion mining, question answering, web mining, social network analysis, spatial-temporal analysis, information semantic services, natural language processing, and content/text analytics. An immense amount of company, industry, product, and customer information can be gathered from the web and organized and visualized through various text and web mining techniques.
By analyzing customer clickstream data logs, web analytics tools such as Google Analytics can provide a trail of a user's online activities and reveal their browsing and purchasing patterns. Web site design, product placement optimization, customer transaction analysis, market structure analysis, and product recommendations can all be accomplished through web analytics. What's next? BI&A 3.0 is still the subject of research, but it has started showing up in various forms: mobile interface visualization and human-computer interaction, location- and context-aware techniques, and mobile BI, to name a few, all leading toward Internet of Things (IoT) technologies and analytics. Putting things together, emerging analytics research opportunities can be classified into five critical areas (data analytics, text analytics, web analytics, network analytics, and mobile analytics), all of which contribute to BI&A 1.0, 2.0, and 3.0 (see Figure 3 below). Figure 3: BI&A Research Framework: Foundational Technologies and Emerging Research in Analytics What is Big Data Visualization? Analytics combined with data visualization gives banks the power to tap into the value of Big Data, boost overall effectiveness, and realize a greater return on investment. Enterprises that use data visualization can be assured they are getting the best answers from the Big Data they collect, limiting missed business opportunities and helping them focus on strategic growth. Data visualization is a general term describing any effort to help people understand the significance of data by placing it in a visual context. Visualizations, done right, have the ability to tell a story through images, directing users toward conclusions about the data and empowering them to make better decisions. Patterns, trends and correlations that might go undetected in text-based data can be exposed and recognized more easily with data visualization software.
Today's data visualization tools go beyond standard charts and graphs, displaying data in more sophisticated forms such as infographics, dials and gauges, geographic maps, sparklines and heat maps. The images may include interactive capabilities, enabling users to manipulate them or drill into the data for querying and analysis. Indicators designed to alert users when data has been updated or predefined conditions occur can also be included. What enterprises need are tools to help them easily and effectively understand and analyze Big Data. Employees who aren't data scientists or analysts should be able to ask questions of the data based on their own business expertise, and quickly and easily find patterns, spot inconsistencies, and even get answers to questions they haven't yet thought to ask. Otherwise, the effort and expense that companies invest in collecting and mining Big Data may fail to yield significant actionable results, and companies run the risk of missing important business opportunities if they can't find the answers that are likely stored in their own data. Need for and types of data visualization tools There are many motivations for using visualization tools, depending on the need, the maturity of the audience, and the message that needs to be communicated. Broadly speaking, there are eight types of quantitative messages that users may need to communicate:

Time-series: a single variable is captured over a period of time, such as the unemployment rate over a 10-year period. Example: line chart.

Ranking: categorical subdivisions are ranked in ascending or descending order, such as a ranking of sales performance. Example: bar chart.

Part-to-whole: categorical subdivisions are measured as a ratio to the whole. Example: pie chart.

Deviation: categorical subdivisions are compared against a reference, such as a comparison of actual vs. budget expenses for several departments of a business over a given time period.
Example: Pareto chart.

Frequency distribution: shows the number of observations of a particular variable over given intervals, such as the number of years in which the stock market return falls within certain ranges. Example: box plot.

Correlation: a comparison between observations represented by two variables (X, Y) to determine whether they tend to move in the same or opposite directions; for example, plotting unemployment (X) against inflation (Y) for a sample of months. Example: scatterplot.

Nominal comparison: comparing categorical subdivisions in no particular order, such as sales volume by product code. Example: heat map.

Geographic or geospatial: comparison of a variable across a map or layout, such as the unemployment rate by state. Example: cartogram.

Features of a visualization tool Data visualization has become an integral part of business intelligence systems, making reports more informative and visually appealing and helping to fuel the development of user-friendly executive dashboards and mobile BI applications. But successful visualization efforts hinge partly on selecting the right tools for the job at hand, and that requires understanding the important 'must-have' features of any visualization tool. Benefits of data visualization tools One of the major benefits of a visualization tool, when effectively combined with a self-service BI application, is that it enables business users to quickly and easily explore data. This means that employees don't have to be well versed in analytics in order to work with Big Data; line-of-business users can rely on their own expertise, such as marketing, finance, or supply-chain operations, to ask informed, specific questions of the data, gain insight from the answers, and put those answers to use to improve the business. Secondly, with visual analytics, business users can drill down into data to confirm a hunch, spot patterns, understand trends, or figure out where a process went wrong.
And because these tools convey results visually, they are significantly easier to work with, and to derive value from, than traditional analysis tools. They also give users new perspectives for data analysis, allowing them to look at more options and make more precise decisions. Please see Figure 4 below for other benefits of Big Data visualization tools:

Figure 4: Benefits of Data Visualization Tools

Now that we have seen the analytics evolution and how it copes with the demanding real-time analytics requirements of the 21st century, we need to put things together and understand how these analytics can help banks and financial institutions cope with fraud, and help banks monetize their data and their investments, which I'll talk about in the final post of this series shortly. I'm excited for the next post and hope you are also plugged in and enjoying the journey! I hope you find this post informative. Kindly drop any questions or comments you have. Gaurav Kesarwani is a Consultant with Oracle Financial Services Analytical Applications. He can be reached at gaurav.kesarwani AT oracle.com.


Know Your Customer: Moving Beyond Regulatory Compliance Part 2

Thank you for continuing to join me on the Know Your Customer journey! We can definitely say it's always an exciting journey, albeit one with its challenges. In my last post I discussed the key challenges around a KYC program; I will now outline the best strategy for moving beyond just checking the regulatory box and ensuring your KYC program is a sound strategic business initiative.

Adoption of a common KYC process across multiple business lines: To adopt common KYC processes across multiple business lines, financial institutions should focus on the following key drivers:

- Centralization of name matching (watch-list scanning).
- Deployment of centralized watch lists, which eliminates the risks associated with different lines of business using different versions of watch lists.
- Consistency and coherence in arriving at the risk score (based on KYC, product used and transactions) across the various lines of business.
- A consistent KYC process applied at the enterprise level, in line with the severity of the products/accounts and transactions involved.
- A single view of customer transaction patterns, behavior deviation, linked customers, and the ratio of alerts vs. transactions.
- A single point of access to the history of investigations on existing customers.
- Improved correlation capabilities that publish analytics of customer risk spanning various lines of business in a single window.

A Straight-Through Process (STP) should be implemented at the enterprise level to address key pain areas such as capturing and storing required KYC documents, integration with the document management system, and integration with the transaction processing system to block/freeze transactions that appear to be fraudulent in nature (based on the KYC risk severity of the customer, validated by further enhanced due diligence); exceptions can be flagged for manual review.
STP ensures that all relevant KYC data, transaction history and investigation data are captured and reviewed at each stage, reducing the risk of incomplete KYC data.

Computation of dynamic risk severity: Deploy an intuitive algorithm to detect the risk severity of a customer (Very High, High, Medium, Low or Very Low) based on KYC data, product usage and deviation in transaction behavior; based on that severity, automated enhanced due diligence needs to be carried out within a stipulated timeframe. Listed below are some of the drivers financial institutions should adhere to while computing dynamic risk severity:

- The algorithm should be customizable, and should accommodate assigning risk severity based on customer demographic data, geographic data, transaction behavior, the channels used for carrying out transactions, the instruments used, previous history of suspicious activities, etc.
- Based on the risk severity, the mandated CDD and EDD activities (data collection, re-profiling, re-computation of risk severity, documentation, third-party opinions, etc.) should be validated within a stipulated timeframe.

Enhanced workflow: An enhanced workflow should support centralized, decentralized and hybrid modes of operation to accommodate quick resolution of KYC red flags. Listed below are some of the key drivers for implementing an enhanced workflow:

- Deployment of the workflow should be rule-based.
- Workflow rules should automatically prioritize KYC red flags based on attributes such as the risk severity of the customer, product usage, deviation in behavior, and frequent movement of the customer across different risk bands.
- KYC processes should be configurable, including risk evaluation, drivers for requirements by KYC type, and entity data gathering.
- Dynamic configuration of the KYC workflow, such that financial institutions can apply due diligence based on several KYC conditions (such as "Applies When," "Required When," "Complete When") driven by regulatory requirements or the institution's own policy.

KYC graphical investigation tools: Graphical investigation tools should enable financial institutions to make the right decision quickly by looking at a 360-degree view of the customer's details. Rich graphical investigation tools, as listed below, play a very important role in assessing customer risk severity:

- The 360-degree customer view should publish the movement of customer risk over time, along with a graphical representation of product usage, static and dynamic link maps, and a drill-down facility to view the transactions pertaining to every instrument, channel and geography involved, all in a single window.
- Dashboards should publish the landscape of customer risk severity at the enterprise level, with user-definable periods and attributes (such as profile, risk severity, customer type, geography, etc.).
- GIS maps (integrated with Google Maps) should publish the enterprise-wide customer risk exposure, with a built-in capability to drill down and view the 360-degree KYC risk exposure of a customer in a single window.
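The dynamic risk-severity computation described above can be sketched as a weighted scoring function. This is a hypothetical illustration: the factor names, weights and band thresholds below are assumptions for teaching purposes, not anyone's actual scoring model or a regulatory prescription.

```python
# Hypothetical sketch of dynamic risk-severity computation: per-factor
# scores in [0, 1] are combined with weights and bucketed into bands.
WEIGHTS = {
    "geography": 0.30,           # e.g. exposure to high-risk jurisdictions
    "product": 0.20,             # e.g. private banking, correspondent accounts
    "behavior_deviation": 0.35,  # deviation from expected transaction pattern
    "history": 0.15,             # prior suspicious-activity findings
}

# Thresholds checked from highest band downward.
BANDS = [(0.8, "Very High"), (0.6, "High"), (0.4, "Medium"), (0.2, "Low")]

def risk_severity(scores: dict) -> str:
    """Combine per-factor scores into one of the five severity bands."""
    total = sum(WEIGHTS[f] * scores.get(f, 0.0) for f in WEIGHTS)
    for threshold, band in BANDS:
        if total >= threshold:
            return band
    return "Very Low"

print(risk_severity({"geography": 1.0, "behavior_deviation": 0.9}))  # High
```

In a real system the factor scores themselves would be re-derived as new transactions arrive, which is what makes the severity "dynamic" and triggers re-profiling or EDD when a customer moves between bands.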
Reusable KYC data models: Financial institutions should look at constructing rule-based KYC data models which:

- Organize KYC data into reusable and extensible case and folder structures that maintain all KYC information for an entity's profile
- Support the creation of references to various KYC document requirements, with integration to third-party content management systems
- Integrate easily with legacy client information and onboarding systems
- Extend easily to "Know Your Employee" and other "Know Your ______" requirements
- Are tested on a test bed whereby large volumes of customer data are considered

Audit trails: Financial institutions should generate and view audit trails pertaining to the KYC process in a WHO-DID-WHAT-WHEN-WHERE format, such that:

- A time-stamped audit history of manual and automated steps taken during evaluation, or at any other stage, including related documentation, is recognized at the enterprise level
- An audit trail of all system changes, including rules, reconfiguration of KYC data models and workflow, can be viewed
- The history of KYC cases is stored in the profile for future reference and reporting

KYC service level agreements: Define KYC service level agreements (SLAs) ensuring that the required KYC processes are managed within regulatory and internal-control timeframes. If an SLA is not met, provision should be available to auto-generate and drive cases for re-evaluation of due diligence, documentation and review.

Conclusion

Know your customer is what the contemporary financial industry is all about. Well-developed KYC processes enable financial institutions to manage their risk exposure within the framework of regulatory compliance and the controls set by policy. With the right program and policy in place, a financial institution can experience improved customer service, increased profitability and a good relationship with regulators.
What lessons has your organization learned while establishing a KYC program? I would love to hear your thoughts! Gururaja Prasanna is a Principal Sales Consultant at Oracle. He can be reached at gururaja.prasanna AT oracle.com.


Big Data: From Hype to Insight - Part 2 Infrastructure and Technology

I hope you liked the first part of the Big Data Ecosystem series, and welcome to the second part. Now it's time to peep into the nitty-gritty of what makes Big Data click! Big Data requires more than a change in mindset: it requires a change in the technologies used to deal with the different types of data in play, and in the way they need to be handled to maximize the benefits. When we talk of infrastructure and technology, it is the sum total of many things and means different things to different people, which is why we need to define and limit our discussion to the pointers below:

- Platform and framework
- Storage and database
- Hadoop applications in the ecosystem
- Data policy, governance and security
- Big Data talent

Nevertheless, I'll touch on the complete ecosystem, from technology to talent: all the things which can affect the successful implementation of Big Data projects. So, let's get started!

1. Platform and Framework

Hadoop is a platform for distributed storage and processing of data sets on computer clusters built from commodity hardware. A computer cluster consists of a set of loosely or tightly connected computers (also called nodes) that work together so that, in many respects, they can be viewed as a single system. Each node is set to perform the same task, controlled and scheduled by software. The Hadoop framework is built on the premise that moving computation is cheaper than moving data, and thus allows data to be processed on the local nodes where it resides; fair enough, this architecture processes data faster and more efficiently than it would on a supercomputer where computation and data are distributed via high-speed networking. See Figure 1 below for the complete Hadoop ecosystem. It's interesting to note that the complete Hadoop ecosystem is free and open-source software (FOSS), incubated and developed by the Apache Software Foundation.
Apache is funded and managed by many large corporations and individuals who voluntarily offer their services, time and money to the cause.

Figure 1: Hadoop 2.0 Ecosystem (multi-use data platform: batch, online, streaming, interactive, etc.)

Hadoop 2.0 is a framework for processing, storing, and analyzing massive amounts of distributed, unstructured data, and is composed of the following core modules:

- Hadoop Common: libraries and utilities needed by the other Hadoop modules
- Hadoop Distributed File System (HDFS): a distributed file system that stores data on commodity machines, providing very high aggregate bandwidth across the cluster (more on this later)
- Hadoop YARN: Yet Another Resource Negotiator, often called the operating system of Hadoop because it is responsible for managing and monitoring workloads, maintaining a multi-tenant environment, implementing security controls, and managing the high-availability features of Hadoop
- Hadoop MapReduce: a programming model and associated implementation for processing and generating large data sets with a parallel, distributed algorithm on a cluster (more on this later)

The term Hadoop has come to refer not just to the core modules above, but also to the collection of additional software packages that can be installed on top of core Hadoop, such as Pig, Hive and HBase. We'll discuss these components subsequently. Right now, it's important to address MapReduce and how it works. Hadoop manages the distribution of work across many servers in a divide-and-conquer methodology known as MapReduce. Since each server houses a subset of your overall data set, MapReduce lets you move the processing close to the data, minimizing the network accesses that would slow the task down. The MapReduce model, introduced by Google, consists of two important tasks, Map and Reduce. Map takes a set of data and breaks the individual elements into tuples.
Then the Reduce task takes the output of a Map function as input and combines those data tuples into a smaller set of tuples. The Reduce task is always performed after the Map job. An example helps to show how MapReduce actually works: Figure 2 below captures how words are counted under this framework, a task that scales poorly if done on a single machine.

Figure 2: Example of MapReduce

2. Storage and Database

It is important to understand how HDFS stores data. At its most basic level, a Hadoop implementation creates four node types for cataloging, tracking, and managing data throughout the infrastructure:

- Data node: the repositories for the data, consisting of multiple smaller database infrastructures horizontally scaled across compute and storage resources.
- Client node: the user interface to the Big Data implementation and query engine. The client could be a server or a PC with a traditional user interface.
- Name node: the equivalent of an address router for the distributed infrastructure. This node maintains the index and location of every data node.
- Job tracker: the software job-tracking mechanism that distributes and aggregates search queries across multiple nodes for client analysis.

Figure 3 below captures how these nodes work in tandem to store, retrieve, and process data by taking advantage of data locality and the built-in fault tolerance of the HDFS architecture.

Figure 3: HDFS Architecture - How Storage Works Under Hadoop

Now, talking of database technologies, you would agree that relational databases do not lend themselves intuitively to unstructured or semi-structured data, and hence a new type of database, NoSQL (Not only SQL), came into the picture. Broadly speaking, there are four types of NoSQL database in use right now, each with different runtime rules and different trade-offs.
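Before turning to database choices in more detail, the word-count flow of Figure 2 can be sketched in plain Python. This is a teaching sketch of the Map and Reduce phases, not the Hadoop API: in real Hadoop the map outputs are shuffled across the cluster before reduction.

```python
from collections import defaultdict
from itertools import chain

def map_phase(line: str):
    """Map: break a line of text into (word, 1) tuples."""
    return [(word, 1) for word in line.split()]

def reduce_phase(pairs):
    """Reduce: combine tuples sharing a key into per-word totals."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Each "line" stands in for a data split held on a different node.
lines = ["deer bear river", "car car river", "deer car bear"]
result = reduce_phase(chain.from_iterable(map_phase(l) for l in lines))
print(result)  # {'deer': 2, 'bear': 2, 'river': 2, 'car': 3}
```

Because each `map_phase` call touches only its own line, the map work can run independently on the node holding that split, which is exactly the data-locality advantage described earlier.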
The complexity of the data and the scalability requirements of the system decide which database to use when. Figure 4 below depicts the different types of databases and their usage.

Figure 4: NoSQL Database Types

3. Applications in the Hadoop Ecosystem

As shown in Figure 5, the Hadoop ecosystem is well supported by many useful applications which abstract away complexity and give business users a platform to access the Hadoop architecture in an efficient and productive manner. The applications below are discussed briefly to convey their utility and applicability within the ecosystem; readers are encouraged to consult external sources for more detail.

Hive - Hive is a "SQL-like" bridge that allows conventional BI applications to run queries against a Hadoop cluster. It's a higher-level abstraction of the Hadoop framework that allows anyone to make queries against data stored in a Hadoop cluster just as if they were manipulating a conventional data store. It amplifies the reach of Hadoop, making it more familiar to BI users. Hive allows SQL developers to write Hive Query Language (HQL) statements, similar to standard SQL statements, that the Hive service breaks down into MapReduce jobs and executes across a Hadoop cluster.

Pig - Pig was initially developed to let analysts focus on analyzing large data sets and spend less time writing mapper and reducer programs. Pig is made up of two components: the first is the language itself, called Pig Latin, and the second is a runtime environment where Pig Latin programs are executed. It's a high-level scripting language that enables data workers to write complex data transformations without knowing Java.

Figure 5: Business Abstraction for Hadoop

Sqoop - Sqoop efficiently transfers bulk data between Hadoop and relational databases. Sqoop helps offload certain tasks (such as ETL processing) from the EDW to Hadoop for efficient execution at a much lower cost.
Sqoop can also be used to extract data from Hadoop and export it into an external RDBMS; it works with almost all major RDBMSs on the market.

Ambari - Ambari is a web-based platform for provisioning, managing, monitoring and securing Apache Hadoop clusters. Ambari takes the guesswork out of operating Hadoop, making its management simpler by providing a consistent, secure platform for operational control. Ambari provides an intuitive web UI as well as a robust REST API, which is particularly useful for automating cluster operations.

HBase - HBase is a column-oriented database management system that runs on top of HDFS. It is well suited to sparse data sets, which are common in many Big Data use cases. Unlike relational database systems, HBase does not support a structured query language like SQL.

Flume - Flume is a distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of streaming data into HDFS. It has a simple and flexible architecture based on streaming data flows, and is robust and fault-tolerant, with tunable reliability mechanisms for failover and recovery. YARN coordinates data ingest from Flume agents that deliver raw data into the Hadoop cluster.

Storm - Storm is a system for processing streaming data in real time. Storm makes it easy to reliably process unbounded streams of data, doing for real-time processing what Hadoop did for batch processing. Storm on YARN is powerful for scenarios requiring real-time analytics, machine learning and continuous monitoring of operations.

ZooKeeper - ZooKeeper is a centralized service for maintaining configuration information and naming, and for providing distributed synchronization and group services. All of these kinds of services are used in some form or another by distributed applications.
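The streaming idea behind Flume and Storm, processing an unbounded flow of events one at a time rather than as a finished batch, can be made concrete with a toy in-process sketch. This is purely illustrative (it is neither the Flume nor the Storm API); the event names and window size are made up.

```python
from collections import Counter, deque

class SlidingWindowCounter:
    """Toy stream processor: counts events over a sliding window."""

    def __init__(self, window_size: int):
        # A deque with maxlen silently drops the oldest event when full,
        # so state never grows even though the stream is unbounded.
        self.window = deque(maxlen=window_size)

    def process(self, event: str) -> Counter:
        """Ingest one event and return counts over the current window."""
        self.window.append(event)
        return Counter(self.window)

stream = ["login", "payment", "login", "transfer", "payment"]
counter = SlidingWindowCounter(window_size=3)
for event in stream:
    snapshot = counter.process(event)  # updated result after EVERY event

print(snapshot)  # counts over the last three events only
```

The key contrast with the earlier MapReduce word count is that a result is available after every event, which is what makes this model suitable for continuous monitoring rather than periodic batch reporting.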
Oozie - Oozie is a workflow processing system that lets users define a series of jobs written in multiple languages, such as MapReduce, Pig and Hive, and then intelligently link them to one another. Oozie allows users to specify, for example, that a particular query is only to be initiated after the previous jobs on which it relies for data have completed.

Mahout - Mahout is a data mining library that takes the most popular data mining algorithms for clustering, regression testing and statistical modeling and implements them using the MapReduce model. It also produces free implementations of distributed or otherwise scalable machine learning algorithms, focused primarily on collaborative filtering, clustering and classification.

Kafka - Kafka is a publish-subscribe distributed messaging service. A single Kafka broker can handle hundreds of megabytes of reads and writes per second from thousands of clients. Kafka is designed to allow a single cluster to serve as the central data backbone for a large organization, and it can be expanded elastically and transparently without downtime. Other messaging variants include RabbitMQ and NSQ.

Spark - Spark is an up-and-coming cluster computing framework like Hadoop, albeit with much faster processing than MapReduce (up to 100x). However, Spark does not provide its own distributed storage system; for this reason, many Big Data projects involve installing Spark on top of Hadoop, where Spark's advanced analytics applications can make use of data stored in HDFS. Spark handles most of its operations in memory, copying data from distributed physical storage into RAM. Spark arranges data into what are known as Resilient Distributed Datasets (RDDs), which can be recovered following a failure.

4. Security of the Hadoop Ecosystem

"The larger the concentration of sensitive personal data, the more attractive a database is to criminals, both inside and outside a firm.
The risk of consumer injury increases as the volume and sensitivity of the data grows." – Edith Ramirez, Chairwoman, U.S. Federal Trade Commission.

In the age of Big Data, data security is a deal-maker. Organizations need to adopt a data-centric approach to security, and now need Big Data environments that include enterprise-grade authentication and authorization (LDAP or the Apache Sentry project). Securing the Big Data life cycle requires the following security controls:

- Authentication and authorization of users, applications, and databases
- Privileged user access and administration
- Encryption of data at rest and in motion
- Data redaction and masking for both production and non-production environments
- Separation of responsibilities and roles
- Implementation of least privilege
- Transport security
- API security
- Monitoring, auditing, alerting, and reporting

Organizations can achieve all the benefits that Big Data has to offer while providing a comprehensive, inside-out security approach that ensures the right people, internal and external, receive access to the appropriate data at the right time and place. In addition, organizations must be able to address regulatory compliance and extend existing governance policies across their Big Data platforms.

5. Big Data Talent

The discussion of Big Data infrastructure can't be complete without talking about the new expertise and human resources required to manage and analyze this huge volume of data. A new breed of tools and programming languages is evolving, very much in demand to cater to the growing needs of the industry. As you can see below, the mix of talent and competency will drive the demand for, and availability of, the new breed of data scientists and data owners. Needless to say, right now there is a huge gap between the demand for and availability of the core skills required for data science and Big Data talent.

Figure 6: Big Data skill requirement

I hope you find this post informative.
Kindly drop any questions or comments you have in the comments section below. In the upcoming post, we'll talk about Big Data analytics, the area most talked about in the field right now. I'm excited for the next post and hope you are also plugged in and enjoying the journey! Gaurav Kesarwani is a Consultant with Oracle Financial Services Analytical Applications. He can be reached at gaurav.kesarwani AT oracle.com.


Lessons NOT Learned: A Video Blog

In her third video blog of the series, Gray Areas and Myths of Compliance, Saloni Ramakrishna discussed the myths and conflicts around compliance that move the compliance conversation into the real world. In her fourth video blog, in line with the theme of real-world issues, Ms. Ramakrishna discusses the lessons that the financial industry has NOT learned from failures, collapses, takeovers, bailouts and astronomical fines that grow more astronomical by the day. History holds many examples of firms that had been in existence for a century or more disappearing almost overnight, and that is a reality to which financial services cannot afford to be blind. She breaks the root causes of failure down into four major classes:

- Lack of internal controls
- Disregard for regulations and codes of conduct
- Taking undue advantage of gullible customers
- Slips (conscious or otherwise) in the financial crimes space

Watch Saloni Ramakrishna's fourth video blog of the Compliance Risk Management series, where she explains how the unlearned lessons of history will repeat themselves if not addressed proactively. Ignorance is neither bliss nor a defense any longer in the compliance space. Stay tuned for the next post in the Enterprise Compliance Risk Management video blog series: The Umbilical Cord between Business Model and Compliance.

Ms. Saloni Ramakrishna, author of Enterprise Compliance Management - An Essential Toolkit for Banks & Financial Institutions, is a financial services industry practitioner with nearly three decades of experience. She brings to the table rich hands-on knowledge and real-world perspectives in the risk, compliance and performance areas. In her role she interacts with the senior management of banks, consulting professionals and regulators across multiple countries. Ms. Saloni Ramakrishna is invited to share her views on industry trends by national and international finance forums such as GARP, Ops Risk Asia, RiskMinds and Asian Banker, amongst others. Her ideas have appeared as articles and quotes in print and online media and in television interviews.


Successful Data Management: A Marathon, Not a Sprint

The ongoing data deluge in financial services has been quite apparent, and as a result the criticality of data management and governance needs no further vindication. Regulators have emphasized and enforced the principles of data management with sufficient rigor. Even outside the regulatory pressure, organizations have realized the necessity of, and the new-found benefits they would gain from, an effective data management program and, more importantly, a technology to implement it. Many financial organizations have embarked on this data management journey (I would rather call it an expedition). All of them are at different levels of maturity and are progressing cautiously. The reason for the unhurried progress is that it is a huge transformation: it involves changes to people, processes and technology, and it is a cultural change in the organization. The EDM Council has recently developed the Data Management Capability Assessment Model (DCAM) to help organizations establish, enable, and sustain a data management program. The model was jointly created by data management experts in the banking industry along with EDM Council members. This assessment model was widely adopted and is being considered a guideline for the implementation of a data management program. The DCAM methodology defines 8 primary components of a successful data management program: strategy, funding, the data management program itself, data governance rules, data architecture, technology, data quality and the control environment. The methodology does not stop at that high level; it gets specific by breaking these 8 categories into 36 capabilities, and those capabilities into 112 sub-capabilities. The sub-capabilities and their objectives together define a very sound methodology to assist an organization in its data management initiatives. The council went further than just prescribing a methodology for data management.
It also laid down a way to help evaluate the maturity of organizations. The assessment methodology defines a set of standard criteria used to measure data management functions and processes, with a set of 21 questions spanning the concepts laid down in the DCAM capabilities and sub-capabilities. The EDM Council then conducted a benchmark with institutions from the banking, asset management and insurance sectors, receiving responses from 128 different organizations, 74% of them tier 1. There was active participation from the senior management of the participating organizations; respondents included CDOs, SVPs, VPs, MDs, heads of departments, enterprise data architects and so on. The survey resulted in a score for 7 key areas of data management, on a scale of 1 to 6 where 1 means not initiated and 6 means enhanced capability in that area. A quick summary of the industry average and the control-group average is provided in the table below. (The control group includes EDM Council members whose firms have been active in data management; they were hand-selected by the Council to offer a comparison against the industry, and include many of the G-SIBs along with a good representation of large asset managers and global custodians.) A full, detailed report of the assessment and its results can be found on the Council's website. Some of the key observations from the EDM Council are:

- The whole governance process and framework is still being established in organizations. Classifying data as a key asset and assigning accountability and stewardship is a cultural change, and it is difficult.
- Establishing lineage, mapping complex data flows, and building unified, conformed and reconciled repositories is a daunting task, but remains a priority for organizations.
- Data quality procedures and controls need to become operational and provide confidence in the data.
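As a toy illustration of the 1-to-6 scoring described above, per-respondent scores for each assessed area can simply be averaged and compared against a benchmark. The area names and numbers below are invented for illustration; they are not figures from the EDM Council report.

```python
# Hypothetical survey responses: each area maps to the 1-6 maturity
# scores given by individual respondents (1 = not initiated, 6 = enhanced).
responses = {
    "data governance": [2, 3, 3, 4],
    "data quality":    [1, 2, 2, 3],
}

def average_scores(survey: dict) -> dict:
    """Average each area's scores, rounded to two decimal places."""
    return {area: round(sum(s) / len(s), 2) for area, s in survey.items()}

print(average_scores(responses))  # {'data governance': 3.0, 'data quality': 2.0}
```

A benchmark like the Council's is essentially this computation repeated per sector, with the control group's averages reported alongside the industry's.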
Here at OFSAA (Oracle Financial Services Analytical Applications), we are seeing banks around the globe leverage a comprehensive data management solution that goes beyond regulatory needs to provide a holistic data management framework. There is a need for a strategic data platform that provides a single source of truth for Risk, Finance, Treasury, Compliance and Customer Insight. Organizations leveraging a comprehensive data management solution are starting the journey ahead of their competitors. Please join OFSAA at this year's FIMA conference, where my colleagues will be hosting the roundtable Achieving Consistent Data Governance Across the Organization on the pre-conference day. Learn why Chartis RiskTech100® ranked Oracle #4 and listed it as Category Winner for Core Technology, Risk Data Aggregation & Reporting, and the Americas geographical sector. In addition, Oracle has earned the following accolades:

- Operational Risk & Regulation: Best Regulatory Reporting Platform, 2015
- Risk Tech Ranking from risk.net: #1 vendor in the Risk Data Repository and Data Management category under Enterprise Operational Risk Management
- Chartis Research Category Leader in Risk Data Aggregation and Reporting 2015: Top Vendor

Lovell Mathews is a Product Manager at Oracle Financial Services Analytical Applications. He can be reached at lovell.mathews AT oracle.com.


Know Your Customer: Moving Beyond Regulatory Compliance

Know Your Customer (KYC) and Enhanced Due Diligence (EDD) refer to operational tasks carried out by financial institutions in order to mitigate their risk exposure and to comply with the anti-money laundering (AML) and counter-terrorism financing laws and regulations promulgated by the various financial regulatory agencies. KYC is a continuous process, not one limited to new or prospective customers seeking to open an account. The customer identification process (CIP) is aimed at establishing the customer's identity. Based on the initial risk profile, product usage pattern (disclosed) and transaction pattern (undisclosed), the customer is subjected to customer due diligence (CDD). Then, based on dynamic risk computation, the customer's profile group, product usage pattern, and transaction pattern (value, volume, velocity, geography, channel, etc.), the required enhanced due diligence (EDD) procedures are validated as per regulatory requirements and internal policy. Due to frequent changes in regulatory requirements and internal policy, financial institutions are under great pressure to accommodate the due diligence required to assess a customer's risk. As mergers and acquisitions take place within the financial industry, customers, especially those with differing risk profiles, need to be revalidated, because the combined entity may carry higher risk than the previously independent entities; this creates major challenges. And the challenges are not unique to banks of a particular size: Tier 1 banks such as HSBC and smaller organizations such as Brickell Bank, based in Miami, Florida, have both been hit with large fines over the last couple of years.

Key Challenges

Despite the ideas of AML, KYC and EDD being around for many years, banks still face significant challenges with them. I have outlined some of these below, followed by suggested solutions for how banks can minimize the challenges.
Financial institutions across business lines, including retail, wholesale, private and investment banking, securities and capital markets, and insurance, are trying to meet complex global KYC requirements that are geography-, customer-, product- and transaction-specific. Manual and diluted KYC processes increase the time to board a customer and to apply due diligence over customers and transactions, resulting in non-conformity with global KYC standards. Diluted manual processes also force repeated documentation of both existing and prospective customers, and the time this takes leads to non-compliance. The inability of financial institutions to tailor risk rating and the KYC process by customer type, geography, product and transaction has significantly increased non-compliance and exposes them to regulatory, constituent and competitive risk. Institutions also find it difficult to review the severity of a customer's associated risk, which keeps changing with demographic, geographic and transaction patterns, further resulting in non-compliance. A few institutions have automated KYC processes covering customer, product and transactions; however, the implemented systems have minimal capability to compute the severity of customer risk exposure at an enterprise level and thus to apply enhanced due diligence. Note: the term customer includes account holders, joint-account holders, power of attorney holders, beneficial owners, founders and occasional clients; these actors can be natural persons or legal persons.
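The dynamic risk computation described above can be sketched in broad strokes. The sketch below is illustrative only: the factor names, weights and EDD threshold are hypothetical assumptions, not any institution's actual scoring model or policy.

```python
# Illustrative enterprise-level KYC risk scoring. All weights, factor
# values and thresholds below are hypothetical, for demonstration only.

# Hypothetical risk weights per disclosed factor value (higher = riskier).
CUSTOMER_TYPE_RISK = {"retail": 1, "corporate": 2, "trust": 4, "pep": 5}
GEOGRAPHY_RISK = {"domestic": 1, "eu": 2, "high_risk_jurisdiction": 5}
PRODUCT_RISK = {"savings": 1, "wire_transfer": 3, "private_banking": 4}

def transaction_pattern_risk(value, volume, velocity_per_day):
    """Score the undisclosed transaction pattern on value, volume, velocity."""
    score = 0
    if value > 10_000:          # large single transactions
        score += 2
    if volume > 100:            # unusually many transactions
        score += 1
    if velocity_per_day > 20:   # rapid movement of funds
        score += 2
    return score

def customer_risk_score(customer):
    """Combine disclosed and undisclosed factors into one severity score."""
    return (CUSTOMER_TYPE_RISK[customer["type"]]
            + GEOGRAPHY_RISK[customer["geography"]]
            + PRODUCT_RISK[customer["product"]]
            + transaction_pattern_risk(customer["txn_value"],
                                       customer["txn_volume"],
                                       customer["txn_velocity"]))

def due_diligence_level(score):
    """Map the dynamic risk score to CDD or EDD, per a hypothetical policy."""
    return "EDD" if score >= 8 else "CDD"

customer = {"type": "corporate", "geography": "high_risk_jurisdiction",
            "product": "wire_transfer", "txn_value": 25_000,
            "txn_volume": 150, "txn_velocity": 30}
score = customer_risk_score(customer)
print(score, due_diligence_level(score))
```

In a real deployment the weights would come from policy and the transaction pattern from continuous monitoring; the point here is only that severity is recomputed from both disclosed and undisclosed inputs, then mapped to a due diligence level.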
Building Blocks - Moving KYC Beyond Regulatory Compliance The G7 leading nations created the Financial Action Task Force (FATF) to fight money laundering; financial institutions are also supported by contributions from organizations such as the International Monetary Fund (IMF), the United Nations (UN) and the Basel Committee on Banking Supervision on development and economic co-operation. Global KYC standards address concerns over integrity and the direct and indirect losses that financial institutions may incur if they do not adhere to the key KYC due diligence processes. The primary role of a financial institution is to make a profit so that it can better serve and retain its customers. To do so, the institution needs to know its customers better than before, as a business objective rather than merely a regulatory compliance requirement. A key driver for building a profitable financial institution is a strong KYC process, policy and set of guidelines. Revision of Existing KYC Process KYC processes within the bank need to be evaluated regularly to ensure there are no defects; it is best for the organization to fix the process before the regulator catches the issue(s). Listed below are the criteria to validate before reviewing the existing KYC process: Should the revision effort be at the enterprise level or the business-unit level? Which source or existing systems hold the KYC data to be collected? What KYC data and information are required to complete the revision? Which missing KYC data must be collected under law versus internal policy? Should information be gathered from a specific group or category of customers, or from all customers? Processing KYC data available in the current system vs.
data captured from public sources. In most financial institutions, a high proportion of customer data is stored in electronic format, so a gap analysis can be carried out to find the missing KYC data across all the accounts the customer holds with the institution. If a customer's multiple account relationships are stored in multiple systems, the customer relationships need to be cleaned and interlinked across the enterprise. Because a customer's risk severity depends on customer type, product usage, transaction pattern and the geography in which the customer operates, it is imperative for the institution to prioritize obtaining information from customers deemed to pose higher risk; the institution can also gain insight into a customer's risk severity from the customer's transaction patterns. Effective prioritization logic helps the institution maximize KYC compliance for high-risk customers at an optimal cost. While the institution is reviewing KYC data, the customer should not be inconvenienced, or discover that the institution is seeking information to rectify an issue that should have been addressed during onboarding. The institution needs to assess how to collect the missing information: from its own records, from public records, or through investigative and verification agencies. It is preferable for a relationship manager to reach out to high-net-worth customers, both to continue the personalized service and to gather the missing KYC information. Financial institutions should also clearly define response management so that the following activities are handled efficiently: Training – the various people who contact customers need to be trained on the script, on how to assess the collected data, and on the further action required.
Scalability – all the required hardware, software and infrastructure need to be tested for performance, speed and the capacity to hold a huge volume of revised KYC data. Response Metrics – metrics need to be formulated at the enterprise level, such as the number of customers from whom all information has been received, the number from whom information has been received in part, and the number from whom no response has been received; these measures should also consider attributes such as customer risk severity, the medium used to reach the customer, product usage, transaction patterns, geography and branch. Please join me next week as I continue to outline the best strategy for moving beyond just checking the regulatory box and making your KYC program a sound strategic business initiative. Gururaja Prasanna is a Principal Sales Consultant at Oracle. He can be reached at gururaja.prasanna AT oracle.com.
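As a rough illustration of the response metrics described above, the sketch below counts responses by risk severity and flags high-risk customers for follow-up. The field names, categories and sample records are hypothetical assumptions, not a real remediation data model.

```python
# Illustrative KYC remediation response metrics; all data is hypothetical.
from collections import Counter

remediation_cases = [
    {"customer": "C1", "risk": "high",   "response": "full"},
    {"customer": "C2", "risk": "high",   "response": "partial"},
    {"customer": "C3", "risk": "medium", "response": "none"},
    {"customer": "C4", "risk": "low",    "response": "full"},
    {"customer": "C5", "risk": "high",   "response": "none"},
]

# Enterprise-level counts per (risk severity, response status) pair.
metrics = Counter((c["risk"], c["response"]) for c in remediation_cases)

# Prioritization: high-risk customers without a full response come first.
follow_up = [c["customer"] for c in remediation_cases
             if c["risk"] == "high" and c["response"] != "full"]

print(dict(metrics))
print(follow_up)
```

A production version would segment the same counts further by channel, product usage, geography and branch, as the article suggests; the structure of the tally stays the same.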


Analytics

Big Data: From Hype to Insight - Part 1 Landscape and Architecture

Before I start, let me give you the context and structure of the complete blog series, Big Data: From Hype to Insight, so that you can stay tuned and know what to expect next! You must have heard or read about big data in different contexts and at different times; however, the complete and coherent picture is often missing – and that is the objective of this four-part series: addressing the complete ecosystem and putting the pieces of the puzzle together, so that everyone from beginner to enterprise architect to 'C-suite' management sees the same picture and talks the same language. The topic will be addressed not only from the strategic standpoint but also from the tactical and operational perspective, which is where the real value resides. I am sure everybody, whether technical or functional, will take something home from this series. As you see in Figure 1 below, there are four distinct layers of big data information systems, running from data ingestion to business insight. This reflects the complete big data ecosystem, and this blog series will address each of these layers in a corresponding post: BIG DATA Landscape & Architecture; BIG DATA Infrastructure & Technology; BIG DATA Analytics & Visualizations; BIG DATA Actionable Insights: Financial Fraud and Data Monetization. Figure 1: Big Data Landscape (source: http://www.wired.com) For the Sake of Introduction So, let us start this wonderful journey by striking the right 'data' chord. In 2006, big data was new jargon and much of the discussion was limited to the technical arena; by 2016, it is no longer a novelty and is talked about more and more in mainstream media. Nevertheless, to get on the same page, here is the wiki definition of big data: Big data is a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools.
The challenges include capture, curation, storage, search, sharing, analysis, and visualization. -- Wikipedia In other words, big data refers to the sheer mass of data produced daily at a pace exceeding the capacity of current databases. It is not only the amount of information, but also its variety and complexity, as well as the speed at which the data has to be analyzed or delivered. The value-add depends on rapid analytics, with answers provided in seconds. Examples are Facebook, Google and Amazon, which analyze user statuses or search terms to trigger targeted advertising on user pages. Now, let's address the more interesting question of just how BIG and FAST this really is. The amount of data is growing exponentially: today, our best estimates suggest that at least 2.5 billion gigabytes (2.5 exabytes) of data are produced every day! Figure 2 shows the amount of data being created and replicated every 60 seconds across the world – everything from Google queries to streaming Netflix movies, WhatsApp calls and messages, Facebook likes and comments, tweets, YouTube videos, and the list goes on… and on. Figure 2: How BIG and Fast is Data Creation and Replication? (source: www.alphr.com) Before we proceed, let's take a deep breath and ask ourselves why we need to do this. I don't need to convince you here, as the picture below captures it succinctly: Figure 3: Benefits for Investing in Big Data Today! (source: http://www.i-scoop.eu) The big data phenomenon is not limited to banks or financial services; it is all-encompassing – from retail to healthcare, from utilities to telecom to manufacturing – with immense benefits for all industries, and no area as we know it today will go untouched by this tidal wave!
Next, let's touch upon the various sources of big data and their constituents, which multiply complexity (and challenges): the data are unstructured and span wide-ranging velocity, variety and volume dimensions. Figure 4: Big Data Challenges (source: http://www.slideshare.net) By now, I hope you are convinced (and excited!) by the opportunities big data offers – so roll up your sleeves and let's dive deep into big data architecture. Big Data Architecture More regulation, faster decision cycles, new interactions with customers, and a legacy IT architecture: these are just a few of the business challenges and technology issues facing financial services firms. Modern CIOs face two major challenges in unifying the increasingly disparate aspects of the enterprise data architecture. The first is aligning the existing architecture with the information needs of analysts and data scientists. The second is assimilating the continuous stream of innovative data management capabilities (such as Hadoop or NoSQL) into the enterprise. On top of this, the data needs to be integrated and accessed while reducing overall systemic complexity. Two points need to be made immediately. First, big data does not mean a single technology or a single use case, and there is no single path to start or expand a big data architecture. Second, the relational enterprise data warehouse (RDBMS) still has a place in the new BI architecture – at least for the foreseeable future. Over the past two decades, many data warehouses and business intelligence systems have been developed. These systems used to be relatively simple, or at least straightforward: they let organizations run regulatory and compliance reports that show up-to-date overviews of revenue and detail what customers have bought recently. All of that information is very valuable, but it's not enough anymore. In the big data era, things aren't so simple.
New technologies, such as Hadoop, HBase, stream-processing systems and NoSQL databases, have entered the picture. Earlier technologies have reappeared in new avatars, such as columnar databases and in-memory processing tools, spurred partly by big data use cases. (Figure 6 captures the information architecture components.) Figure 5: Relational Data Warehouse vs. Big Data Architecture - Logical View (source: http://www.oracle.com) Unified Architecture Road Map Given the complicated information management architecture that exists today, the key question you should ask is: how can big data technologies be applied to create additional business value or reduce the cost of delivering information management? After all, simply adding big data technology to your existing estate will do nothing in and of itself, other than add cost and complexity to an already costly and complex environment. Only by decommissioning some of your existing estate, or by enabling insights that would hitherto not have been possible, do you actually add value to the business. To this end, the Unified Architecture has been seen to produce the best results: the big data infrastructure and technology is integrated with the existing BI architecture under an incremental, agile development methodology. In short, big data technology alone is NOT an architecture; it is merely the latest aspect of a comprehensive and, hopefully, integrated enterprise-class information management capability. As you can see below, the RED components are additions on top of the existing BI architecture (blue). Figure 6: Relational Data Warehouse vs. Big Data Architecture - Logical View (source: http://www.oracle.com) The Unified Architecture offers a way to organize your components: the layers simply provide an approach to organizing components that perform specific functions.
A big data solution typically comprises these logical architectural components (see Figure 7 below): Big Data Sources: think in terms of all the data available for analysis, coming in from all channels. Decide and clarify what data is required to perform the kinds of analyses you need; the data will vary in format and origin. Data Massaging and Store Layer: this layer is responsible for acquiring data from the data sources and, if necessary, converting it to a format suited to how the data is to be analyzed. Compliance regulations and governance policies dictate the appropriate storage for different types of data. Analysis Layer: the analysis layer reads the data digested by the data massaging and store layer; in some cases, it accesses the data directly from the data source. Designing the analysis layer requires careful forethought and planning: decisions must be made about how to manage the tasks needed to produce the desired analytics. Consumption Layer: this layer consumes the output of the analysis layer. The consumers can be visualization applications, human beings, business processes, or services. Aspects that affect all components of the logical layers are covered by the vertical layers: Information Integration: big data applications acquire data from various origins, providers, and data sources, and store it in distributed data storage systems. This vertical layer is used by various components (data acquisition, data digest, model management and transaction interceptor, for example) and is responsible for connecting to the various data sources. Integrating information across data sources with varying characteristics (protocols and connectivity) requires quality connectors and adapters. Big Data Governance: data governance is about defining guidelines that help enterprises make the right decisions about their data.
Big data governance helps in dealing with the complexity, volume and variety of data that is within the enterprise or coming in from external sources. Strong guidelines and processes are required to monitor, structure, store and secure the data from the time it enters the enterprise through being processed, stored, analyzed, and purged or archived. Systems Management: systems management is critical for big data because it involves many systems across clusters and the boundaries of the enterprise. Quality of Service Layer: this layer is responsible for defining data quality, policies around privacy and security, frequency of data, size per fetch, and data filters. Figure 7: Architecture of Big Data Solution (source: www.ibm.com) Gaurav Kesarwani is a Consultant with Oracle Financial Services Analytical Applications. He can be reached at gaurav.kesarwani AT oracle.com.
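The four logical layers described in this post can be sketched end to end in a few lines. This is a conceptual toy, assuming invented record formats and channel names; it is not a reference to any product in the architecture diagrams above.

```python
# Conceptual walk-through of the four logical layers of a big data solution.
import json

# Big data sources layer: raw events arriving in varying formats.
raw_events = [
    '{"user": "alice", "amount": "120.50"}',   # JSON-like channel
    "bob,75.00",                               # CSV-like channel
]

def massage(event):
    """Data massaging and store layer: normalize each event to one format."""
    if event.startswith("{"):
        record = json.loads(event)
        return {"user": record["user"], "amount": float(record["amount"])}
    user, amount = event.split(",")
    return {"user": user, "amount": float(amount)}

def analyze(records):
    """Analysis layer: aggregate spend per user from the digested data."""
    totals = {}
    for r in records:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

def consume(insight):
    """Consumption layer: render the insight for a downstream consumer."""
    return [f"{user}: {total:.2f}" for user, total in sorted(insight.items())]

stored = [massage(e) for e in raw_events]
print(consume(analyze(stored)))
```

The same shape holds at scale: only the implementations change (distributed storage for the massaging layer, a cluster engine for the analysis layer, dashboards or services for consumption), which is exactly why the layering is useful as an organizing principle.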


Financial Services

A 2020 vision for today's Transfer Agent

As we move to 2020 and look forward to the opportunities this can bring to the Transfer Agency landscape, it is worth remembering that many companies are still not sure of the direction to take. Today's Transfer Agent is caught up in a myriad of operational and IT challenges following the market and regulatory changes since 2008. Many in the market are looking at their current complex IT and operational model and thinking about next steps – steps that needed to be taken but were deferred, for a variety of reasons, over the past number of years. As such, the year 2020 offers many Transfer Agents the chance to streamline their organisations, in particular their operational and IT bases, and the coming years offer the potential to maximise revenue growth with minimal spend on supporting software and hardware. Transfer Agents will need to identify a core Transfer Agency platform that delivers comprehensive functionality for all business processes and manages complex workflows involving core processing and fund distribution on a common platform. With the correct software, a Transfer Agent will then have the opportunity to maximise revenue streams and leverage the value of the platform. This can be done by expanding the Transfer Agent's offering and by addressing diverse market segments, including retail Transfer Agency and distribution platforms (Sub-TA). The need to change has been much discussed (see Emerging trends white paper), but many in the industry, both asset managers and service providers, are still using sub-optimal legacy applications. An opportunity now exists to look forward with confidence and invest in the next-generation software needed to boost revenue. It is worth remembering that there have been developments in the past 18-24 months that have impacted the Transfer Agent.
These include: FinTech and digital disruption; regulation; end-of-life software and systems; the search for a consistent global/regional IT and operational model; the Cloud; and market dynamics. There has been much discussion over the past few years about the impact of FinTech, and more recently of digital disruption, within financial services. The Transfer Agency market has not seen the full impact yet, as many companies chose to extend the lifespan of current legacy systems. However, as fund investors have become 'tech savvy' and the demands of regulation continue to grow, many organizations now find that they cannot move forward as the market and their clients demand. FinTech companies can offer alternatives, and we have seen impacts on established market participants. Nevertheless, many FinTechs are not focused on the fundamental aspects of Transfer Agency and therefore do not always cater to the needs of the key players. The same is true of those offering a transition to the digital market: in the rush to be digitally ready, many companies have not received the expected return on investment. To be digitally ready you first need to be on a stable core platform that offers a global foundation for current and future needs. This requirement is one of the strategic elements that any next-generation Transfer Agency platform should provide. The regulatory burden is again a topic that has been well covered but still draws attention, given the burdens placed on all those in, and who use, financial services (see Why expat Americans are giving up their passports). The modern Transfer Agency system needs to be robust enough to meet current demands and requirements yet flexible enough to accommodate future changes in policy and regulation at the local, regional and global level.
To this end, Transfer Agents should consider platforms with features allowing the product to be extended (or adapted) through code 'hooks' via an extensibility tool. This gives the Transfer Agent the ability to adapt quickly to any regulatory or client demand without waiting for a vendor to complete the work. A rapid time to market means those with an extensible, next-generation application will have a market lead and best-in-class service for their clients. Any such extensibility change should be owned by the Transfer Agent, and is therefore controlled and implemented in house, ensuring proper processes and oversight can be followed. Further, once this change is promoted to the client's installation, it should be possible to apply it to future versions of the core Transfer Agency platform without issue. A core Transfer Agency platform with this feature helps the Transfer Agent strive towards the Holy Grail of 'consistency', from both an IT and an operations perspective. This consistency must also be built on a platform that is truly multi-region, multi-fund-structure, multi-currency, multi-time-zone and multi-lingual. In addition, the Transfer Agent should ensure that the software can be deployed as a single instance on a global basis. This gives the opportunity to have a global operating model that can be fine-tuned on a regional and/or local basis, if required. Transfer Agents with a standard global operating model can ensure that like-for-like steady-state costs decline, rather than being exaggerated by bespoke models in various locations. Further, with tools like extensibility, IT teams will not need a myriad of macros and other tactical solutions bolted onto existing applications to maintain a BAU state. IT teams can further benefit by making use of best-of-breed Cloud solutions for their core Transfer Agency platform.
It is now possible to leverage managed Cloud solutions, thereby reducing the CapEx requirement of running the Transfer Agency business. As service providers and asset managers within the UK and EU markets look to redefine themselves at the dawn of this next stage of evolution, we have seen marquee names retreat from market segments, along with over-commitment and under-delivery by others. This new market evolution will ensure that those Transfer Agents with the ability to adapt and expand quickly will thrive, placing those with an agile and flexible system at the forefront. The change in the market is seen in an independent report from PwC which claims that by 2020 we will see the emergence of a new breed of global managers. The report (see PwC 2020 a brave new world) states that the Transfer Agent will be "one that will have highly streamlined platforms, targeted solutions for the customer and a stronger and more trusted brand." Following this new focus, where the fund 'consumer' becomes the centre for the Transfer Agent and for asset managers, we have seen recent developments in the marketplace where a technology partner is key to progressing in this new environment (for example, Maitland chooses Oracle). With the need to move to a next-generation technology solution, many Transfer Agents are seeing Oracle FLEXCUBE Investor Servicing as the fit for their 2020 vision. As it is based on the Oracle 'Apps to Disk' stack, it is possible to utilise Oracle FLEXCUBE Investor Servicing within the Oracle Integrated Cloud solution. This makes Oracle the partner of choice when looking for the next-generation solution in the dynamic world of Transfer Agency. Richard Clarkson is the Principal Consultant for FLEXCUBE Investor Servicing at Oracle. He can be reached at richard.clarkson AT oracle.com or on Twitter @RMClarkson.


Analytics

Predictive Analysis In Fraud Risk

by Deepali Dharwadker After spending over a couple of decades in data and analytics, I am often asked, "What kind of analytics is most suitable for my organization?" There are many kinds of analytics. In this article I will focus on predictive analytics in the fraud risk domain, which constitutes the major play in the analytics world in this segment. Industry surveys state that over 70% of BFSI executives believe big data can play a key role in fraud prevention and detection, provided they embrace the technology for statistical and algorithmic techniques along with continuous transaction-level data monitoring. "BI delivers insight; predictive analytics delivers action." These days there is a shift in the concerns of the Chief Risk Officer (CRO): in addition to concerns about regulatory risk and business continuity, they now seek solutions for RADAR (Risk Assessment, Data Aggregation & Reporting), i.e.: having an integrated picture of risk across the enterprise; extending risk coverage and information dissemination to the larger business community to help strategize; and predicting risk. Fraud Risk: Assuming we are all familiar with RADAR, let's focus on the most popularly used predictive analytic techniques for fraud risk. Predictive analytics thrives on models. A model is a mathematical formula or equation that takes in data and produces a calculation, such as a score; it applies itself to data as a set of instructions to deliver a particular kind of result. The result is a score, a numerical value generated by the model when applied to a data set; however, not all models generate scores. Predictive models are often embedded in operational processes and activated during live transactions. They analyze historical and transactional data to isolate patterns, e.g.: What does a fraudulent transaction line entry look like? Look for identical or repetitive patterns in the transaction details, e.g. location, threshold amount, business unit, and dates such as month end or weekends. What does a risky customer look like? These analyses draw relationships between hundreds of data elements to isolate each customer's risk or potential, which guides the action on that customer. This way, customers may be tagged GREEN (good customer) or RED (bad customer, potential fraud) with varying scores. Below are some popularly used predictive analytics techniques in fraud risk. 1. Neural Networks: In environments with heavy data traffic, huge transaction volumes and abnormal data patterns, neural networks can help. However, they work best with pre-transformed, smooth data and hence are potentially viable for use in a RADAR ecosystem. The hidden layer is the mathematical core of a neural net: it selects the combinations of inputs (e.g., dollar amount, transaction type) that are most predictive of the output – e.g., when your credit card is used or a claim is processed for payment. 2. Clustering: Clustering models use demographic data and other customer information to find groups, or "clusters", of customers with similar behavior, background or interests. For example, one step might be to vary the transaction-monitoring threshold for different clusters of customers: using a cluster, you could build a subset of varying thresholds to monitor a customer's actions closely and accordingly produce amber data for caution. As a result, clustering can be used as a precursor to predictive modeling. 3. Risk Maps: Risk maps are very popular with firms. They provide a list of potential risks, the probability of each risk occurring, and the size of its impact, so that high-impact risks can be closely monitored. However, this model does not capture the correlation of one risk with another, and hence every risk is an isolated finding.
For instance, if we take two risks, 'fraudulent valuation of an asset' and 'lender of the asset', as separate risk items, then the identification of risk is isolated, whereas ideally both need to be monitored in conjunction for fraud risk. The best way to use fraud-related risk maps is through the collaborative effort of different business units across varying functions within an organization, to provide the business linkages to potential fraud risk. 4. Simulations: Simulation techniques are typically used to learn how something will behave without actually testing it in real life. They work on the following principle: input (uncertain numbers/values) * intermediate calculations = output (uncertain numbers/values). So how do these uncertain values help fraud risk? The concept here is to run a 'what-if' simulation of the data: each model is executed thousands of times with varying data inputs to determine the probability of the respective outputs, e.g. Monte Carlo simulation. Unlike the risk map, the Monte Carlo simulation factors in correlations between the variables. Conclusion: While predictive analytics can be very useful and provides an objective view of risk-related data along with mitigation ideas, it is fundamentally important to tie the model appropriately to the business case, the business unit and the relevant metrics, along with behavioral and economic data. Deepali Dharwadker is a Consulting Practice Director - Business Intelligence for Oracle Financial Services Consulting. She can be reached at deepali.dharwadker AT oracle.com. The views expressed herein are the views of the author and not necessarily the views of the employer.
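The Monte Carlo 'what-if' technique described in this post can be sketched in a few lines. The loss model, coefficients and threshold below are entirely hypothetical; the point is the repeated execution with correlated random inputs, the very correlation that the risk-map approach misses.

```python
# Hypothetical Monte Carlo fraud-loss simulation with correlated inputs.
import random

def simulate_loss(n_trials=100_000, correlation=0.7, seed=42):
    """Estimate P(loss > threshold) over many trials of correlated drivers."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_trials):
        # Two correlated risk drivers (e.g. asset valuation error and
        # lender exposure) built from a shared random component.
        shared = rng.gauss(0, 1)
        independent_weight = (1 - correlation ** 2) ** 0.5
        valuation_risk = correlation * shared + independent_weight * rng.gauss(0, 1)
        lender_risk = correlation * shared + independent_weight * rng.gauss(0, 1)
        # Hypothetical loss as a weighted combination of the two drivers.
        loss = 1_000 * valuation_risk + 800 * lender_risk
        if loss > 2_500:  # hypothetical loss threshold
            exceed += 1
    return exceed / n_trials

print(f"P(loss > threshold) ~= {simulate_loss():.3f}")
```

Varying `correlation` shows why the two risks must be monitored in conjunction: the higher the correlation, the fatter the tail of the combined loss, and the more often the threshold is exceeded.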


10 Reasons why Banks should move to Exadata

by Reshmi R (Kurup) Oracle Exadata Database Machine has been around since 2008, and many financial institutions are already reaping the benefits of moving to Exadata. If your organization has still not moved, here are some reasons why you should. Business agility: The financial industry demands responding quickly to changing customer requirements by adding products and services quickly, and timely information for business applications such as ERP, SCM, HCM and CRM is just as important for supporting business decisions. Oracle business applications can run on Exadata, quickly improving business operation times and employee productivity. Improve user experience and response time quickly: Would you like substantial performance improvement for existing applications running on Oracle Database, with no changes to code? Moving to Exadata can help here. Exadata's Storage Server Software performs Smart Scan – offloading query execution onto the storage servers, closer to the data, and passing only the desired results back to the user. Smart Scan works without any changes to an existing Oracle database, and banks have seen 10x to 100x performance improvements. More transactions per minute, or more customers serviced in the same time: Exadata's Smart Flash Cache feature accelerates Oracle Database processing and speeds I/O operations, allowing more transactions to be performed. The unified network – a single network for both server-to-server and server-to-storage communication – is based on InfiniBand, one of the fastest network technologies on the market, running at 40 gigabits per second. In addition, database-optimized flash logging algorithms make ultra-fast transactions possible. This can help scale the application and reduce overall business process cycle times. Adopt the Cloud and its benefits with ease: Exadata Cloud Service lets you get started with Exadata with minimal investment through its pay-per-use model.
Oracle experts manage the infrastructure, and the environment is 100% compatible with on-premise databases, enabling easy migration and hybrid deployments. Exadata also makes it easy to implement database as a service in your enterprise and gain all the benefits of cloud: self-service, faster provisioning, and an agile environment. It saves IT a lot of effort too.

Elastic configurations for your specific requirements: Exadata comes in various configurations based on your needs: DB In-Memory Machine, Extreme Flash OLTP Machine, and Data Warehousing Machine. Say you are looking for a high-performing DB in-memory machine and your enterprise data is already primarily on Oracle Database. Moving to in-memory on Exadata is as simple as flipping a switch and choosing which data you want in memory.

You want to start small but have big plans: Exadata is pre-configured, saving you months of configuration and performance-tuning effort, and can deliver instant ROI regardless of the size of your enterprise. You can get a quarter rack if you want to start small and still get all the benefits, then scale to half, full, and multi-racks as your enterprise and requirements grow.

Improve reliability and availability: Your bank needs to manage, store, and protect terabytes or petabytes of customer data reliably. When availability and reliability of customer data are a top priority, consider Exadata to protect your business from planned and unplanned outages. Fewer or no outages mean fewer IT support calls and more satisfied customers.

Ensure quality of service by optimizing resource usage: Do you want your customer-facing applications to be the most responsive and given extra resources?
Exadata helps you manage and prioritize workloads against available resources, from CPU to network to storage. It allows the business to define how resources should be used by applications. With Exadata workload prioritization, organizations have been able to support almost four times more databases on the same hardware.

Compress large data warehouse and archived data: If you are a large bank with data warehouse or archival data growing to unmanageable sizes, Exadata can help reduce storage requirements and also increase performance by reducing I/O on compressed data with its Hybrid Columnar Compression feature. One bank recently deployed Oracle Exadata Database Machine to reduce the size of its global trading data warehouse from 40 terabytes to less than 8 terabytes.

Supportability from Oracle and partners: Because Exadata is pre-configured and deployed Exadata systems look alike, Oracle has an incomparable ability to monitor, maintain, and support Exadata. This enables Oracle to offer Platinum Support. Oracle Advanced Customer Services, Oracle Consulting Services, and Oracle Financial Services PrimeSourcing help move customers from Oracle and non-Oracle databases to Exadata reliably. And because it all comes from Oracle, from the hardware all the way to the software, you get full end-to-end support, so there is no finger-pointing between vendors with the customer stuck in the middle.

Exadata performance manifests itself in many ways, delivering clear benefits to various lines of business within a bank: from providing business agility to simplifying IT, to allowing customers to work faster and smarter with real-time business intelligence and highly available, responsive transactional applications. When are you moving to Exadata?

Reshmi R (Kurup) is a Solution Architect for Oracle Financial Services Consulting. She can be reached at reshmi.kurup AT oracle.com.
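The storage savings described above come largely from columnar organization: storing each column's values contiguously groups similar values together, which makes them far more compressible than row-interleaved data. A toy sketch of that effect using Python's standard zlib (illustrative only; Hybrid Columnar Compression's actual on-disk format and codecs are Oracle's own):

```python
import zlib

# 100,000 synthetic trade rows: (trade_id, status, currency).
rows = [(i, "SETTLED" if i % 10 else "PENDING", "USD" if i % 3 else "EUR")
        for i in range(100_000)]

# Row-major layout: values from different columns are interleaved,
# so the compressor keeps alternating between unlike data.
row_major = "\n".join(f"{i},{s},{c}" for i, s, c in rows).encode()

# Column-major layout: each column stored contiguously, so long runs of
# identical values ("SETTLED", "USD", ...) sit next to each other.
col_major = ("\n".join(str(i) for i, _, _ in rows)
             + "\n".join(s for _, s, _ in rows)
             + "\n".join(c for _, _, c in rows)).encode()

row_size = len(zlib.compress(row_major))
col_size = len(zlib.compress(col_major))
print(f"row-major compressed: {row_size} bytes; "
      f"column-major compressed: {col_size} bytes")
assert col_size < row_size  # columnar layout compresses better here
```

Reorganizing by column shrinks the compressed size because the status and currency columns collapse to almost nothing, which is the same mechanism behind results like a 40 TB warehouse fitting in under 8 TB.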



Gray Areas and Myths of Compliance: A Video Blog

In her second video blog of the series, Drivers of Compliance in the 21st Century, Saloni Ramakrishna discussed the two classes of compliance drivers, direct and indirect, and highlighted the fact that while both are industry- and environment-driven, what adds real value to the sustainable growth of the organization, and can be driven internally, is an active compliance management approach. In her third video blog of the series, Ms. Ramakrishna discusses the myths and conflicts around compliance, moving the compliance conversation into the real world. In the blog she discusses:

- The myths surrounding compliance versus the realities
- The overlap areas that hinder smooth functioning unless the boundaries are clearly understood
- The conflict areas that negatively affect organizational performance
- Emerging risks and patterns that add to the complexity of the compliance landscape

Watch Saloni Ramakrishna's third video blog of the Compliance Risk Management Series, where she delves into real-world issues that need to be acknowledged and addressed in earnest to enable an active compliance program and the resultant healthy growth of the organization. The next post in the video blog series on Enterprise Compliance Risk Management will be "Lessons NOT Learned."

Ms. Saloni Ramakrishna, author of Enterprise Compliance Management - An Essential Toolkit for Banks & Financial Institutions, is a financial services industry practitioner with nearly three decades of experience. She brings to the table rich hands-on knowledge and real-world perspectives in the risk, compliance, and performance areas. In her role she interacts with senior management of banks, consulting professionals, and regulators across multiple countries. Ms. Ramakrishna is invited to share her views on industry trends by national and international finance forums such as GARP, Ops Risk Asia, RiskMinds, and Asian Banker, among others. Her ideas have appeared as articles and quotes in print and online media and in television interviews.



Oracle Ranked as Category Leader in 2016 Chartis RiskTech 100®

Chartis, a leading provider of research covering the global market for risk management technology, recently released the 2016 Chartis RiskTech 100®. Now in its 10th year, this report is globally recognized as the most comprehensive study of the world's most significant risk and compliance technology companies. Oracle ranked as a category winner in the following categories:

- Core Technology
- Risk Data Aggregation & Reporting
- Geographical sector: Americas

In addition, Oracle moved up to #4 in the overall ranking from #6 in the last two years. The report, conducted by a leading team of analysts and advisors from the risk management and financial services industry, surveyed a record number of vendor respondents globally. The RiskTech 100® companies are drawn from a range of risk technology specialisms, meeting the needs of both financial and non-financial organizations. However, they share a number of qualities that rank them among the top 100 risk technology providers in the world. The rankings are drawn up based on a set of classifications (Source: Chartis RiskTech100®, December 2015).

The study evaluated Oracle Financial Services risk management solutions, including credit risk management, operational risk management, liquidity risk management, stress testing, anti-money laundering and fraud, and model risk management, as well as those around risk data aggregation, Basel regulatory capital, IFRS 9, and risk-based compliance platform and technology. The findings revealed the strength of Oracle's technology: the OFSAA portfolio of applications. These rankings prove Oracle's commitment to being the market leader and optimal technology partner for financial services institutions needing solutions to solve specific risk and performance management, regulatory, and compliance issues. We are pleased to be recognized by Chartis as one of the top providers of financial services analytical solutions in the market today. Click here to learn more about our financial services solutions and offerings.
In addition to the ranking, the report highlights key trends for the coming years. Chartis estimates global expenditure on risk and compliance technology to be in excess of $100bn in 2016 and growing rapidly. With this rapid growth come new challenges and new opportunities in remaining compliant and minimizing the organization's overall risk. Over the last year, the financial services industry has seen trends around enterprise-wide risk management shift slightly from the way it has been approached in the past. Financial institutions need to be more agile and have the flexibility to reduce the time and cost of compliance. Automation and simplification have resounded over recent months. Key trends moving into 2016 and beyond include:

- Enterprise governance, compliance, and operational risk
- Risk data aggregation and reporting
- Enterprise stress testing and model risk management
- Financial crime risk management and cyber security
- IFRS 9

Financial institutions and their executives are under increased pressure on many fronts: more stringent regulatory requirements that lead to top- and bottom-line challenges; changing customer requirements, with demand for more individualized relationships and offerings; challenger institutions and FinTechs eager to make their mark and erode bank market share; and financial crime and fraud that continue to rise in frequency, complexity, and scope. Not to mention that each executive has their own set of challenges, yet all are working towards the same goals: compliance, profitability, and customer satisfaction. Partnering with a vendor that offers a unified data foundation, platform, and applications for integrated risk, performance, customer insight, compliance, and reporting, with full management and regulatory reporting, will help your organization accomplish the laundry list of requirements for each role while eliminating data silos, poor data quality, and painful processes for regulatory submissions.
Is your organization moving towards a unified platform? Do you agree with the report's findings? I would love to hear your thoughts. Ambreesh Khanna is the Vice President of Oracle Financial Services Analytical Applications. He can be reached at ambreesh.khanna AT oracle.com.



Drivers of Compliance in the 21st Century: A Video Blog

In her first video blog post of the Compliance Series, Interesting Shifts in the Compliance Space, Saloni Ramakrishna presented a bird's-eye view of the shifts in the compliance landscape. In her second video blog of the series, Ms. Ramakrishna discusses what is driving compliance in the 21st century. The classes of compliance drivers she discusses are:

- Direct drivers, such as the complexity of the financial services industry, regulatory rigor, and intrusive supervision
- Indirect drivers, such as media scrutiny, globalization, and growing customer awareness

She rounds off the discussion by highlighting the fact that while these are industry- and environment-driven, what adds real value to the sustainable growth of the organization, and can be driven internally, is an active compliance management approach. Do watch Saloni Ramakrishna's second video blog of the Compliance Risk Management Series to understand the drivers that are shaping the contours of compliance in financial services. The next post in the video blog series is "Gray Areas, Myths, and Conflict Zones of Compliance."

Ms. Saloni Ramakrishna, author of Enterprise Compliance Management - An Essential Toolkit for Banks & Financial Institutions, is a financial services industry practitioner with nearly three decades of experience. She brings to the table rich hands-on knowledge and real-world perspectives in the risk, compliance, and performance areas. In her role she interacts with senior management of banks, consulting professionals, and regulators across multiple countries. Ms. Ramakrishna is invited to share her views on industry trends by national and international finance forums such as GARP, Ops Risk Asia, RiskMinds, and Asian Banker, among others. Her ideas have appeared as articles and quotes in print and online media and in television interviews.

