Monday Oct 15, 2012

Oracle Exalogic Customer Momentum @ OOW'12

[Adapted from here.]

At Oracle Open World 2012, I sat down with some of the Oracle Exalogic early adopters to discuss the business benefits they were realizing by embracing the engineered-systems approach to data-center modernization and application consolidation. Below is an overview of the four businesses that won the Oracle Fusion Middleware Innovation Award for Oracle Exalogic this year.

Company: Netshoes
About: Leading online retailer of sporting goods in Latin America.

  • Rapid business growth resulted in frequent outages and poor response times for the online storefront
  • A conventional, ad-hoc approach to horizontal scaling resulted in high CAPEX and OPEX
  • Poor performance and unavailability of the online storefront resulted in revenue loss from purchase abandonment

Consolidated ATG Commerce and Oracle WebLogic running on Oracle Exalogic.
Business Impact:
Reduced abandonment rates resulting in a double-digit increase in online conversion rates, translating directly into revenue uplift

Company: Claro
About: Leading communications services provider in Latin America.

  • Support business growth over the next 3-5 years while maximizing re-use of existing middleware and application investments with minimal effort and risk

Consolidated Oracle Fusion Middleware components (Oracle WebLogic, Oracle SOA Suite, Oracle Tuxedo) and Java applications onto Oracle Exalogic and Oracle Exadata.
Business Impact:
Improved partner SLAs 7x while improving throughput 5x and response time 35x for Java applications

Company: UL
About: The world's leading safety testing and certification organization.

  • Transition from a non-profit to a profit-oriented enterprise and grow from $1B to $5B in annual revenue in the next 5 years
  • Undertake a massive business transformation by aligning change strategy with execution

Consolidated Oracle Applications (E-Business Suite, Siebel, BI, Hyperion) and Oracle Fusion Middleware (AIA, SOA Suite) on Oracle Exalogic and Oracle Exadata
Business Impact:
Reduced financial and operating risk in re-architecting IT services to support new business capabilities supporting 87,000 manufacturers

Company: Ingersoll Rand
About: Leading manufacturer of industrial, climate, residential and security solutions.

  • Business continuity risks due to complexity in enforcing consistent operational and financial controls
  • Reactive business decisions reduced the ability to differentiate and compete

Consolidated Oracle E-Business Suite on Oracle Exalogic and Oracle Exadata
Business Impact:
Service differentiation with faster order provisioning and a shorter lead-to-cash cycle translating into higher customer satisfaction and quicker cash-conversion

Check out the winners of the Oracle Fusion Middleware Innovation awards in other categories here.

Wednesday Sep 26, 2012

Supercharging the Performance of Your Front-Office Applications @ OOW'12

[Re-posted from here.]

You can increase customer satisfaction, brand equity, and ultimately top-line revenue by deploying Oracle ATG Web Commerce, Oracle WebCenter Sites, Oracle Endeca applications, Oracle's Siebel applications, and other front-office applications on Oracle Exalogic, Oracle's combination of hardware and software for applications and middleware.

Join me (Sanjeev Sharma) and my colleague, Kelly Goetsch, at the following conference session at Oracle Open World to find out how Customer Experience can be transformed with Oracle Exalogic:

Session:  CON9421 - Supercharging the Performance of Your Front-Office Applications with Oracle Exalogic
Date: Wednesday, 3 Oct, 2012
Time: 10:15 am - 11:15 am (PST)
Venue: Moscone South (309)

Saturday Jul 14, 2012

Oracle Exalogic in Higher-Education: Virtual Learning Environments

In the quest to become leading education institutions of choice and to draw world-class academic and student talent, forward-thinking universities continue to embrace and evolve ICT to further their agenda in learning, teaching and research. However, the global and domestic financial and operating environments impacting universities have grown increasingly challenging, applying pressure in two ways.

  1. From a revenue perspective, the potential of a second global financial crisis looms large, with the potential to trigger yet another global recession. While the education sector has been less affected by economic cycles in the past, the unprecedented level of economic turmoil that exists today makes it difficult to anticipate the revenue ramifications.
  2. From a cost perspective, further globalisation has greatly increased the competitive nature of the higher-education sector, especially given the boom in demand for education services and the proliferation of education providers in emerging markets.

So how are leading universities preparing themselves to respond to this challenge, and what sort of transformation are they hoping to realize?

Integral to achieving their goal of attracting top student talent is being able to provide an outstanding student experience. While enhancing the campus-based experience is still very important, universities are increasingly looking to augment physical, campus-based learning with virtual, online delivery of educational programs and services.

To offer students an engaging, stimulating and fun environment for learning, universities have invested in a range of Information and Communication Technologies (ICT), at the core of which are the student portal, the learning management system and student self-services such as the IT help-desk. Periodically, universities need to undertake a major refresh of these applications to deliver the next-generation, collaborative and mobile learning experience. In addition, back-office university information systems must support seamless and cost-effective access to information for decision making and transactional services. Universities increasingly want to deliver shared services in collaboration with other institutions. As such, universities are refreshing their back-office finance and resource-planning applications to ensure they can drive efficiency in their critical budget-planning and operations processes.

There are many other applications universities rely on to manage their infrastructure, administrative services, alumni services and so on. A key challenge facing universities in their large-scale application-modernization efforts is that upgrading to modern applications places further demands on data-centre infrastructure, including storage, compute nodes and networking gear. Not to mention that refreshing the data-centre infrastructure entails integration risk due to multi-vendor procurement, testing, tuning and optimization.

An approach to IT that worked well in the past centered on plugging gaps in desired capability, driven by ad-hoc requests. However, the sustainability of that approach is becoming a real impediment to optimizing CAPEX and OPEX budgetary controls, given the woes of infrastructure fragmentation. As such, universities are now standardizing their infrastructure and consolidating core applications on an open-standards-based environment. This is what is shifting their thinking towards engineered systems from Oracle, including Oracle Exadata for the data tier and Oracle Exalogic for the middle tier.

Broadly speaking, there are three primary reasons why Oracle Exalogic has become the logical choice for running business applications at higher-education institutions:

Firstly, Oracle Exalogic has enabled universities to accelerate the go-live time for application modernization by providing a pre-integrated, pre-tested, pre-optimized and pre-tuned infrastructure that enables end-to-end apps-to-disk management.

Secondly, Oracle Exalogic has allowed universities to consolidate application workloads, thus reducing the number of physical servers and further improving data-centre density through virtualization. This has brought cost savings in terms of software licenses, maintenance and energy consumption.

Thirdly, Oracle Exalogic and Oracle Exadata have allowed universities to shift towards a private cloud platform model for metering and charging computing resources as multi-tenant services, effectively transforming their IT from a cost-center to a profit-center.

Find out more, at the upcoming Exalogic Elastic Cloud 2.0 Launch.

Thursday Jul 12, 2012

Oracle Exalogic in Public Sector: Law Enforcement

The mandate for government law enforcement agencies is to safeguard the public against instruments of terror and to enforce legislation. Ensuring authorized entry or exit of people, keeping unauthorized and dangerous immigrants in check and monitoring large-scale movement of goods is clearly one of the most critical and demanding operating environments. Processing thousands of people and vehicles places extreme demands on the underlying IT infrastructure. To support their mission-critical operations, governments rely on massive data-center facilities, typically dispersed across multiple locations. Such IT operations are managed by several thousand personnel.

When it is a matter of national security, failure is not an option under any circumstances. Uninterrupted 24x7x365 operation is a necessity, since an hour of downtime can back up thousands of people at the borders during peak traffic times. Needless to say, the IT solutions law enforcement agencies require must offer utmost reliability, scalability, and security. Given the mission-critical nature of their operations, homeland security agencies are looking to evolve their IT infrastructure in a sustainable manner so as to cater to both existing and future workloads.

The overarching priority for law enforcement agencies is to reduce data center costs through application and infrastructure consolidation.

The challenge with the traditional approach is that:

  • Parts are not guaranteed to work together.
  • There are too many possible variations and a lack of standardization.
  • The final product is not optimized for best performance or maintenance.
  • There is no overall warranty.

The Engineered Systems approach to data centre operations is a paradigm shift from the traditional approach of assembling disparate layers of storage, networking, compute nodes, operating systems and so on.

Broadly speaking, there are three primary reasons why Oracle Exalogic has become the logical choice for running business applications at law enforcement agencies:

Firstly, by virtualizing the middle-tier infrastructure, Oracle Exalogic enables law enforcement agencies to eliminate large-scale, legacy systems. Moreover, consolidating multiple platforms onto a smaller number of machines requires very little or no modification of the application code. The benefit of physical and virtual consolidation is a manifold reduction in data center footprints and accompanying energy savings (in terms of cooling and consumption).

Secondly, the simplicity of managing a distributed private-cloud middle-tier infrastructure is another important deciding factor. Distributed systems often require separate teams to manage the various solution components. As a result, root-cause analysis can be subject to undue complexity as multiple vendors claim that others are responsible for the issues at hand. With Oracle Exalogic, there is a single touch-point for any support issue involving Oracle Applications, middleware, compute nodes, networking and storage.

Thirdly, Exalogic has native integration with Exadata, Oracle’s engineered system for OLTP workloads. Having a standardized data-tier on Exadata creates unmatched synergies in running the middle-tier on Oracle Exalogic. Find out more, at the upcoming Exalogic Elastic Cloud 2.0 Launch.

Wednesday May 23, 2012

Performance - What? How? Why? - part 1 / 3

Performance has been the lingua franca of IT vendors for some decades now. When asked about the merits of a product, be it hardware, software or even a framework, the instinctive response of vendors is to proclaim the superiority of their product in terms of Performance: "5x performance gains", "micro latency", "2ms response time", "10 million transactions/sec", and so on; the list is endless. With due respect to the R&D efforts behind developing exceptional software and hardware products, I find it bewildering and beguiling when sales, marketing (and sometimes technical people) draw a simplistic connection between Performance and business value. For instance, how do you connect the dots when someone says "5x performance improvement in XYZ leads to faster time-to-market"? Surely there must be a connection, but I find the leap of imagination delusional.

Motivated by my own inadequacy to comprehend the business value of Performance, I decided to get back to fundamentals and develop my thinking based on the notion of Wait-time from Queuing Theory. The theory states that the average Wait-time in a single queue is a function of the system's Capacity, Utilization and Efficiency, i.e. Wait-time = f(Capacity, Utilization, Efficiency), where Wait-time is inversely related to Capacity and Efficiency and directly related to Utilization. Two things are worth noting in this model: firstly, Capacity is costly and as such needs to be sized economically; secondly, for a given Capacity there is an optimal Utilization rate beyond which Wait-time increases exponentially.

Let me illustrate this with an example of capacity planning for a dine-in restaurant. If there are too many chairs, the wait-time for guests arriving at the restaurant will be very short, but this incurs the overhead of over-capacity. On the contrary, if there are too few chairs, the wait-time for guests will be very high, in the worst case long enough to cause the queue of waiting people to churn. In theory this would mean a Utilization of 100%! As mentioned earlier, Wait-time increases exponentially once Utilization crosses a certain threshold, and Efficiency is what influences when that threshold is reached. Hence capacity must be planned with long-term and peak traffic at the restaurant in mind, while targeting a Utilization rate that offers the optimal balance between idle-time overhead and queue wait-time.
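To make the restaurant intuition concrete, the classic M/M/1 single-queue model gives an average queue wait of Wq = ρ / (μ - λ), where λ is the arrival rate, μ is the service rate (Capacity) and ρ = λ/μ is the Utilization. A minimal sketch (the rates below are illustrative numbers, not from the post):

```python
# Average wait time in an M/M/1 queue: Wq = rho / (mu - lam),
# where rho = lam / mu is the Utilization.
# Note how Wq blows up as Utilization approaches 100%.

def mm1_wait(lam: float, mu: float) -> float:
    """Average time a customer waits in the queue (excluding service)."""
    rho = lam / mu
    if rho >= 1:
        return float("inf")   # queue grows without bound
    return rho / (mu - lam)

# Fixed Capacity (mu = 10 customers/hour), rising arrival rate:
for lam in (5, 9, 9.9):
    print(f"utilization {lam / 10:.0%}: wait {mm1_wait(lam, 10):.2f} h")
```

Going from 90% to 99% Utilization makes the wait more than ten times longer, which is exactly the non-linear threshold behavior the restaurant example describes.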

Performance of computers too is analogous to the Queuing Theory model. The graphic below depicts the dimensions of performance, the primary technological approaches that drive improvement in those dimensions and finally how those approaches map to the notion of Capacity, Utilization and Efficiency.

In the next post, I will delve deeper into articulating how performance is accomplished in terms of Capacity, Utilization and Efficiency through engineered systems.

Sunday Apr 29, 2012

Day 2: Highlights @ Gartner BPM Summit, Baltimore, 2012

Below is a graphical approach to depicting a Business Transformation (in other words value of BPM) that was shared during one of the sessions at the Gartner BPM summit.

I reckon this is really useful for IT decision makers in articulating the value of BPM to business leaders and showing progress along a time horizon, i.e. the transformation roadmap.

Note: The color scheme could indicate priority, magnitude of investment or some combination of both

Wednesday Apr 25, 2012

Day 1: Highlights @ Gartner BPM Summit, Baltimore, 2012

Social BPM

  • Social Media
    1. is an online environment for mass collaboration
    2. comprises the following
      • Social Creation e.g. YouTube, Flickr, Blogger, Zoho
      • Social Networking e.g. LinkedIn, Facebook
      • Social Publishing e.g. RSS, Twitter
      • Social Feedback e.g. Digg, Delicious, Like, +1
    3. mere access to social media does not equal social media success
      • ditch the “provide and pray” approach
      • social media success is about mobilizing communities i.e. mass collaboration around a shared purpose that lies at the intersection of individual, organizational and community interests
      • Develop Purpose maps to understand what drives each stakeholder e.g. drill down with “Why or So What?” to 3 or 4 levels to get to the root of the motivational driver
  • Mass Collaboration

Embrace social processes as enablers of business processes by enabling mass collaboration. Mass collaboration has the following aspects:

    1. Collective Intelligence
      • Wisdom of the crowds enables a continuous cycle of Contribute -> Feedback -> Collective Judgment/Intelligence -> Change
    2. Expertise Location
      • Overcoming limited/scarce/missing intellectual capital resources with a global, distributed pool of talent e.g. To develop its video recommendation engine Netflix ran a public challenge with a million dollar reward for the winning team
    3. Interest Cultivation
      • Ability to extend business capabilities through the community e.g. community “staffed” support forums vs traditional help-desk
    4. Relationship Leverage
      • Level the playing field e.g. ability of executive management to get direct insights from the field thereby flattening bureaucratic hierarchy
    5. Flash Coordination
      • Ability to self-align and rapidly mobilize large numbers of people: massively scalable sense-respond systems, e.g. rail-service outage management, disaster and emergency response
    6. Emergent Structures
      • Enabling increased participation of customers and partners to gain a value-chain advantage
  • Process candidates for Social BPM
    • Human-centric processes
    • Processes that are broken
    • Processes with a high frequency of change

Intelligent Business Operations (IBO aka iBPMS)

  • Dimensions of Context
    • Process
      • Workflow Predictability i.e. variability of a process across LOB, channels, geographies, customer segments etc.
      • Activity Standardization i.e. degree of re-use of an activity across process variants
    • Information
      • Criticality i.e. impact of information in flow on stakeholders or dependent task
      • Diversity i.e. structured, semi-structured and un-structured content
    • People
      • Complexity i.e. structure of teams, organization etc.
      • Diversity i.e. # of job roles, job levels, skill levels etc.
  • Dimensions of IBO
    1. Situational Awareness
      • ability to incorporate real-time context to make predictive (probabilistic) path selection
    2. Decision Management
      • ability to incorporate transactional context to make predictive (probabilistic) path selection
    3. Process Management
      • ability to self-tune / automate process optimization
  • IBO technology capabilities (horizontal axis) vis-à-vis IBO goal

Event Management          Decision Management         Flow Management
----------------          -------------------         ---------------
Event analysis            Transaction-log analysis    Process orchestration
Time / causal patterns    Statistical patterns        Service orchestration
Statistical patterns
Event Model               Decision Model              Process Model

Note: the above table is incomplete as I am missing some information and will update it shortly.

  • For more, refer to the following public Gartner research articles:
    • G00213721
    • G00219274
    • G00214729
    • G00227539

PS: Thanks to my colleague Vikas Grover for useful edits to this post.

Thursday Jan 12, 2012

Stuck in Cement: Turn to BPM for edge applications

[Note: Cross-posted from]

"Stuck in Cement: When Packaged Apps Create Barriers to Innovation", reads the title of a recent Forrester research paper. The author, Craig Le Clair, laments that packaged applications create inertia that makes it harder for organizations to embrace change from an execution perspective. As per the report, there is widespread frustration with the ability of packaged applications to allow businesses to break free from operational silos and embrace change. So does that mean packaged applications are the root of all organizational inertia and should be dispensed with? Certainly not!

Whether vertical or horizontal, packaged applications were intended to provide scale to business operations in terms of Capacity (i.e. volume), Performance (i.e. Straight-Through Processing (STP)) and Compliance (with standards and/or regulation), while mitigating the time, effort and comprehensive skill-set requirements, both technical and functional, of developing custom applications. The same rationale and value of packaged applications holds true, even more so today, when time-to-value (lead-to-cash and trouble-to-resolve) and time-to-market (concept-to-market and time-to-compliance) pressures are greater than ever. While technology innovations such as Cloud accelerate initial set-up time and effort, to a large extent cloud-based applications merely apportion the up-front and on-going costs of packaged applications over their lifetime. It would be sacrilegious to claim that cloud-based applications will solve the agility issues faced with on-premise applications. In fact, the integration challenge would remain largely the same, if not get more complicated, especially given the security, privacy and data-synchronization concerns.

The problem of responding to change from a packaged-applications perspective has been incorrectly associated with the eradication of business silos. Organizational and IT-system stove-pipes have been berated as the cause of dysfunction in responding to change. But are organizational silos really bad? If so, why do they develop in the first place? Organization and IT-system silos are a consequence and concomitant of natural evolution as the organization grows in the depth and breadth of its offerings, geographic reach, vertical specialization and market (i.e. customer segments). To respond to business priorities, that is revenue growth, margins, profitability or market share, organizations will continue to become more complicated. Matrix organizational structures are giving way to mesh-like (i.e. network) organizational structures where the boundaries between internal lines of business and external stakeholders (including customers, partners and suppliers) are blurring. Shouldn't businesses then be making more investments in packaged applications that are purpose-fit for specific customer niches, geographies and industries? Clearly, the flexibility of changing existing packaged applications to meet new business needs is overrated in today's business environment.

The solution lies in providing a consistent experience across external interfaces while continuing to make investments in internal applications (packaged or custom). After all specialized, purpose-fit, applications will deliver a competitive advantage. This is where edge applications built on BPM shine in overcoming the change inertia plaguing businesses. For instance, let's consider a local retailer contemplating entry in an overseas market. What if the retailer's existing CRM system does not fit the requirements of rapid-entry into the target market? What choices does the retailer have?

One choice could be to customize the existing CRM system through custom development effort. Another could be to rip and replace the existing CRM system with a new on-premise or cloud-based CRM system. The latter approach may appear tempting in vendor pitches but is not for the faint-hearted in practice. To quote Carl von Clausewitz, "Everything in strategy is very simple, but that does not mean that everything is easy!" In reality, neither of the above approaches scales in the long term.

Yet another alternative, one that businesses typically resort to, is to deploy a new CRM system that is purpose-fit for the requirements of the overseas market. In this case, the business is faced with the time and effort of re-coding business rules and compliance policies in the new CRM system. Though this approach makes sense, it becomes harder to scale when future needs complicate the integration effort and the consistent enforcement of business rules and compliance policies across the stove-piped CRM systems. However, businesses can circumvent these issues if they build an intermediate layer that interfaces with the customer channels and orchestrates orders across the different front-end CRM systems. In this manner, businesses get the performance and capability benefits of purpose-fit packaged applications while being able to apply business rules and compliance policies consistently across them, thereby providing a uniform customer experience across the external channels.
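A toy sketch of such an intermediate orchestration layer might look like the following; the CRM back ends, field names and rules are invented for illustration and do not correspond to any real product API:

```python
# Hypothetical "edge" orchestration layer: shared business rules are applied
# once, then orders are routed to region-specific CRM back ends.

def crm_us(order):   return f"US-CRM accepted {order['id']}"
def crm_emea(order): return f"EMEA-CRM accepted {order['id']}"

CRM_BY_REGION = {"US": crm_us, "EMEA": crm_emea}

def shared_compliance(order):
    # One set of rules enforced consistently across all back-end CRMs.
    if order["amount"] <= 0:
        raise ValueError("amount must be positive")
    if "customer" not in order:
        raise ValueError("customer is required")

def place_order(order):
    shared_compliance(order)
    return CRM_BY_REGION[order["region"]](order)

print(place_order({"id": "o1", "region": "US", "amount": 100, "customer": "c9"}))
```

The point of the design is that the compliance rules live once, in the edge layer, while each purpose-fit CRM system stays untouched.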

The future is here today and BPM addresses the long-standing challenge of strategy-execution gap by serving as a platform for building edge applications.

Saturday Dec 10, 2011

Harnessing Business Events for Predictive Decision Making - part 3 / 3

The previous posts on this topic discussed the need for brain-like decision systems, the key attributes of such systems and the enabling technology components. This post drills down into some of the common use-cases where the opportunity cost of split-second "sense-and-respond" is overwhelming and intelligent BPM systems, or iBPMS (a term coined by Gartner in a recent research report), are gathering momentum.

  • Financial Services - Payments processing is the bloodstream of financial-services institutions. Banks and network providers (e.g. card issuers, clearing houses etc.) are experiencing phenomenal growth in the volume of payments, driven by the emergence of newer payment channels (e.g. NFC contact-less mobile), a greater variety of payment types and an aggressive drive to reach the un-banked population. Hence there is ever greater regulatory and commercial pressure to prevent fraudulent activities such as identity theft, terrorist financing and money laundering. It is no longer sufficient to rely on existing risk and governance systems to do retrospective analysis to detect and identify the source of breaches. After all, millions of dollars can be siphoned off in a split second, and the perpetrators may be impossible to trace if the crime was committed with a mobile phone. What's needed is the ability to look at all of the payment transactions as they flow across payment processes, identify a rogue transaction (based on business rules or as an exception) and trigger an alert process to intervene in a likely act of fraud. Clearly there will be a few false triggers, but over time, just like our brains, such systems will be able to predict with greater confidence.

  • Healthcare - Patient Monitoring Systems (PMS), especially life-support systems, are meant to keep patients alive in medical situations where one or more critical organs have failed or are likely to fail. Clearly, the speed of emergency response is highly mission-critical, if not life-critical, in these systems. The premise of such systems is to monitor vital life statistics continuously and trigger alerts when critical thresholds are reached. However, in some instances it is too late for any remedial action even when doctors respond without delay to an alert. In such instances precious lives could be saved if the PMS was able to predict a likely organ failure or a threshold breach just a little bit beforehand, and not after the fact. Intelligent dashboards that integrate such real-time data feeds from multiple PMSs would allow centralized monitoring and pro-active response, thereby increasing critical-care success rates.
  • Public Sector - Governments are expressing increased concern around cyber-security to safeguard national interests. As more and more government workloads and data shift to the internet and inter-linked systems, the vulnerability to and cost of breaches increases manifold. "Terabytes of data are flying out the door, and billions of dollars are lost in remediation costs and reputational harm, government and private security experts said in interviews" (Source: Reuters, June 16, 2011). The cyber-security problem has been tackled largely in a reactive manner to date, where security vendors rush to offer fixes after breaches have occurred or vulnerabilities are disclosed by software vendors. For governments such an approach is simply unacceptable. For instance, by the time a security fix is offered the damage could already be done if US nuclear documents fell into the wrong hands. Next-generation cyber-security systems monitor both external (i.e. web) and internal (i.e. business process) activity in real-time and correlate seemingly isolated data points to detect suspicious activity.
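The "sense-and-respond" pattern in the financial-services example above can be sketched as a simple rule-based filter over a payment stream. The field names and thresholds below are hypothetical illustrations, not part of any Oracle product:

```python
# Hypothetical sketch of rule-based rogue-transaction detection on a payment
# stream. Account IDs, thresholds and rules are invented for illustration.
from dataclasses import dataclass

@dataclass
class Payment:
    account: str
    amount: float
    channel: str    # e.g. "mobile", "branch", "online"
    country: str

def is_rogue(p: Payment, home_country: str = "US",
             mobile_limit: float = 10_000.0) -> bool:
    """Flag a transaction for the alert process based on simple business rules."""
    if p.channel == "mobile" and p.amount > mobile_limit:
        return True                      # unusually large mobile payment
    if p.country != home_country and p.amount > 50_000.0:
        return True                      # large cross-border transfer
    return False

stream = [
    Payment("A1", 250.0, "mobile", "US"),
    Payment("A2", 75_000.0, "online", "KY"),
    Payment("A3", 12_000.0, "mobile", "US"),
]
alerts = [p.account for p in stream if is_rogue(p)]
```

In a production iBPMS the rules would be evaluated by a CEP engine over live event streams rather than a Python list, but the shape of the logic, match each in-flight transaction against business rules and trigger an alert process on a hit, is the same.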

In addition to the above industries, we are seeing application of such iBPM systems, in telecommunications, retail and transportation. My colleague, Dan Tortorici, has written an interesting whitepaper on how Oracle BPM and Oracle Complex Event Processing (CEP) technologies are collectively enabling intelligent process automation, continuous process improvement and business transformation.

Friday Dec 02, 2011

Harnessing Business Events for Predictive Decision Making - part 2 / 3

In my earlier post I discussed the workings of the human brain to illustrate the capabilities desired of next-generation decision systems in order to harness business events for real-time predictive decision making.

Below is a graphical depiction of the attributes of a "brain-like" decision system, its benefits and the underlying technology enablers.

Achieving near real-time predictive intelligence places special demands on the technology components, hardware in particular, in terms of scalability, fault-tolerance and capacity (both compute and storage). After all, performance is of the essence in building decision systems that operate at brain-like speeds. A remarkable capability of the human brain is that data and instructions are physically part of the same component, called neurons. This is what allows the brain to store exabytes (1 exabyte = 1 million terabytes) of data and process an equally vast number of real-time events in a flash. In fact, more than 99% of the data storage, retrieval and processing in the brain happens without conscious thought. In contrast, memory and processing are handled by separate components in computers. This is why it takes supercomputers that consume megawatts of energy (for comparison's sake, the human brain consumes around 12 watts of energy at peak performance) to simulate human brain activity. Now, building a supercomputer for predictive decision making is certainly an over-ambitious, if not audacious, endeavor for most businesses.

Given enough time, skills and financial resources, it is certainly possible to build such hardware systems. However, this is neither a forte of nor a desirable capability for most businesses. After all, businesses have more pressing concerns around realizing returns on IT investments in ever-shrinking time-scales than experimenting with their technology infrastructures. A possible solution lies in integrated systems that are engineered from the ground up and not merely assembled from multi-vendor components. The rationale here is that optimizing the individual parts is unlikely to optimize the whole. Hence, simply assembling a hardware system by self-integrating servers, storage, inter-connects and operating systems from multiple vendors is likely to create bottlenecks at the slowest links in the chain, thereby negating any benefits arising from the performance claims of individual component vendors.

Oracle has a unique capability in delivering engineered systems comprising middleware components and hardware substrata that allow you to get up and running with such decision systems while rationalizing costs and mitigating execution risk. If you are contemplating such event-driven decision systems, feel free to drop me a note.

Sunday Nov 27, 2011

Harnessing Business Events for Predictive Decision Making - part 1 / 3

Businesses have long relied on data mining to elicit patterns and forecast future demand and supply trends. Improvements in computing hardware, specifically storage and compute capacity, have significantly enhanced the ability to store and analyze mountains of data in ever shrinking time-frames. Nevertheless, the reality is that data growth is outpacing storage capacity by a factor of two and computing power is still very much bounded by Moore's Law, doubling only every 18 months.

Faced with this data explosion, businesses are exploring means to develop human brain-like capabilities in their decision systems (including BI and Analytics) to make sense of the data storm, in other words business events, in real-time and respond proactively rather than reactively. It is more like having a little bit of the right information just a little bit beforehand than having all of the right information after the fact (the premise of the book "The Two Second Advantage"). To appreciate this thought better, let's first understand the workings of the human brain.

Neuroscience research has revealed that the human brain is predictive in nature and that talent is nothing more than exceptional predictive ability. The cerebral cortex, the part of the human brain responsible for cognition, thought, language etc., comprises six layers. The lowest layer in the hierarchy is responsible for sensory perception, i.e. discrete, detail-oriented tasks, whereas each layer above is increasingly focused on assembling higher-order conceptual models. Information flows both up and down the layered memory hierarchy. This allows the conceptual mental models to be refined over time through experience and repetition. Secondly, and more importantly, the top layers are able to prime the lower layers to anticipate certain events based on the existing mental models, thereby giving the brain a predictive ability. In a way, the human brain develops a "memory of the future", a sort of anticipatory thinking which lets it predict based on the occurrence of events in real-time. A higher order of predictive ability stems from being able to recognize the lack of certain events. For instance, it is one thing to recognize the beats in a music track and another to detect beats that were missed, which involves a higher-order predictive ability.

Existing decision systems analyze historical data to identify patterns and use statistical forecasting techniques to drive planning. They are similar to the human brain in that they employ business rules, very much like mental models, to chunk and classify information. However, unlike the human brain, existing decision systems are unable to evolve these rules automatically (AI is still best suited for highly specific tasks) or predict the future based on real-time business events. Make no mistake: existing decision systems remain vital to driving long-term and broader business planning. For instance, a telco will still rely on BI and Analytics software to plan promotions and optimize inventory, but will tap into business-event-enabled predictive insight to identify specifically which customers are likely to churn and engage with them proactively.

In the next post, I will depict the technology components that enable businesses to harness real-time events and drive predictive decision making.


A business centric perspective on Private Cloud, Data-center Modernization and EAI.

Sanjeev Sharma
Twitter: @sanjeevio

