
The Integration blog covers the latest in product updates, best practices, customer stories, and more.

Recent Posts

Oracle Integration Day: Winter and Spring 2019

Oracle Integration Day kicked off last summer with stops in Tampa, New York, Boston, Toronto, and more. We are excited to announce our winter and spring Integration Day stops coming up over the next few months. Oracle Integration Day brings the experts on application integration, data integration, APIs, and process automation to you. Join us to hear real-world stories about how Oracle customers are able to adopt new digital business models and accelerate innovation through integration of their cloud, SaaS, on-premises applications and databases, and Big Data systems. Learn about Oracle's support for emerging trends such as Blockchain, Visual Application Development, and Self-Service Integration to deliver competitive advantage.

With interactive sessions, deep-dive demos, and hands-on labs, Oracle Integration Day will help you to:

- Understand Oracle's industry-leading use of Machine Learning/AI in its Integration Platform and how it can significantly increase speed and improve delivery of IT projects
- Quickly create integrations using Oracle's simple but powerful Integration Platform as a Service (iPaaS)
- Secure, manage, govern, and grow your APIs using Oracle API Platform Cloud Service
- Understand how to leverage and integrate with Oracle's new Blockchain Cloud Service for building new value chains and partner networks
- Understand how Oracle's Data Integration Platform Cloud (DIPC) can help derive business value from enterprise data by getting data to the right place at the right time, reliably, and ensuring high availability

Learn more about our upcoming stops and register here.

- Atlanta, GA - January 23, 2019
- Dallas, TX - January 30, 2019
- Reston, VA - February 6, 2019
- Santa Clara, CA - February 20, 2019
- Columbus, OH - March 5, 2019
- Philadelphia, PA - March 12, 2019
- Houston, TX - March 19, 2019
- Seattle, WA - April 4, 2019
- Minneapolis, MN - April 25, 2019


New Podcast Now Available: Integration and Blockchain - Heart of the Digital Economy

Authored by Steve Quan, Principal Product Marketing Director, Oracle

Digital transformation is inevitable if you are looking to thrive in today's economy. In the first three podcasts of the series "Integration: Heart of the Digital Economy," we heard about how application and data integration play a central role in business digital transformations.

In this new podcast, you'll learn how the combination of Blockchain technologies, APIs, and application and data integration extends enterprise backend systems for B2B or B2C business models. A blockchain is a chain of blocks, each recording secured transactions between parties; linked together, the blocks form a distributed ledger that can store any information used in a business transaction. So as your company starts to develop blockchain-specific applications, you'll need to carefully plan and link your blockchain strategy with your integration strategy. This ensures that you can push corporate data into, and pull it from, your blockchains.

Learn how different industries use Blockchain technology. Listen to the podcast Blockchain Integration: Powering the Smart Economy.

Learn more about Oracle's Application Integration Solution here. Learn more about Oracle's Data Integration Solution here. Dive into Oracle Cloud with a free trial available here.

Oracle Cloud Café Podcast Channel


Integration: Heart of the Digital Economy – New Podcasts Now Available

Authored by Steve Quan, Principal Product Marketing Director, Oracle

Mobile devices and AI technologies are rapidly changing the way customers interact with businesses. Some organizations are quickly assembling ad-hoc solutions to meet these challenges by writing custom code to hard-wire systems together. Without modern application integration, organizations end up building systems that look like a tangled mess of spaghetti on a plate, creating solutions that are costly to maintain and update. Two new podcasts in our six-part series, Integration: Heart of the Digital Economy, are now available in the Oracle Cloud Café.

Integration: Fuel for AI-Enabled Digital Assistants – Learn how National Pharmacies used modern, cloud application integration tools to build AI-enabled digital assistants. These chatbots gave shoppers a seamless experience when shopping in the cloud and in the store, enabling the company to grow sales in a matter of weeks.

Creating Modern Applications with API-First Integration – Success in today's economy requires companies to adopt cloud applications or become extinct like some brick-and-mortar businesses - remember the Blockbuster video rental stores? These companies need a new class of applications that remain synchronized to a digital heartbeat, combining internal and external software and services that are glued together through APIs. Listen to the third podcast in this series to learn how API Management helped one company create new solutions quickly to compete successfully in the digital age.

Learn more about Oracle's Application Integration Solution here. Learn more about Oracle's Data Integration Solution here. Dive into Oracle Cloud with a free trial available here.


My Monitor is Wider Than it is Tall - New Layouts in Integration Cloud

New Layouts in Integration Cloud

I am sure most of you have noticed that your monitor is wider than it is tall. To take advantage of this, we have new formats available in Integration Cloud to change the way we view orchestrations.

Vertical & Horizontal Layout

The first new feature is the ability to switch between vertical and horizontal layouts. If we have an orchestration, we can change it between horizontal and vertical layout using the Layout button at the top of the canvas as shown below. Choosing Horizontal will switch a vertical down-the-screen layout to a horizontal across-the-screen layout as shown below.

New Views in Integration Cloud

In addition to the ability to switch between horizontal and vertical layouts, we now also support additional views of an orchestration. The view above is the traditional canvas view of the orchestration, but by selecting the different icons on the left at the top of the canvas we can switch to other views. The second icon from the left is the "Pseudo View", which adds pseudo code to the canvas to help identify what each step in the orchestration is doing. Note that the invoke tells us the connection and connection type being used. The third icon from the left provides us with a "Code" view. This is not editable but allows us to see the actual orchestration code, which can be helpful at times to understand unexpected behaviors.

Summary

Integration Cloud now makes it easy to switch between horizontal and vertical layouts of an orchestration on the canvas. It also allows us to see annotations on the canvas using the "Pseudo View", which helps us understand what individual activities are doing. Both the "Canvas View" and the "Pseudo View" are editable. The "Code View" is not editable.


What is the Value of Robotic Process Automation in the Process Automation Space?

This blog originally appeared on LinkedIn; written by Eduardo Chiocconi.

During the last two decades, much of the Process Automation effort concentrated on using Business Process Management Systems (BPMS) as a means to document and digitize business processes. This technology wave helped the Process Automation space take a significant step forward. BPMS tools armed with integration capabilities allowed organizations (and their business and IT stakeholders) to visualize the processes they wanted to automate. From this initial business process documentation phase, it was possible to create a manageable digital asset to help "orchestrate" all business process steps regardless of their nature (people and systems).

Without risk of exaggeration, most Process Automation (or Business Process Management) practitioners would agree that one of the hardest implementation areas is integrating with the systems of information that the business process needs to transact with. BPMS vendors offered a wide array of application integration capabilities, usually in the form of application adapters, to integrate with these enterprise and productivity applications. The more systems that needed to be integrated from the business process, the harder the implementation phase became. As much as we would like applications to enable all transactions via publicly available APIs, this is not the case, and that limits what integration services can do to integrate in an automated and headless manner.

Simplification in the integration space helps! Newer enterprise and productivity applications have started to invest early in Application Programming Interfaces (APIs). REST-based web services as an implementation mechanism and an API-first approach to offering application functionality certainly allow simpler consumption of application functionality, and by extension they simplify the "hardest" last mile of Process Automation implementation projects: integration.

Integration vendors can leverage these APIs and offer a direct and easy way to transact against these applications. But is this not good enough? Well... if your business processes create logic around new SaaS applications, you may be lucky. But for many organizations (especially those that have gone down the path of mergers and acquisitions) it is not. Whether we like it or not, there are still many systems that are very hard to transact or interact with. This category of applications includes mainframe systems and homegrown enterprise applications, but also any application that has undergone some kind of customization where the resulting functionality is only available through the application user interface (UI).

Robotic Process Automation (RPA): The new kid on the block! What exactly is Robotic Process Automation? This question may have many different answers. But to me, RPA offers a new mechanism to integrate and transact against applications using the same UI that their users use. Via this non-intrusive approach, it is possible to interact with the application as if a person were doing the clicks and entering the data, but rather than a person, it is an automated application that we call a robot. Period!

Why do we talk about RPA in the context of Process Automation? My first observation is that these two technologies are not the same. Second, if you combine them to work together, it is possible to take Process Automation to the next level, as RPA offers new ways to integrate with systems of record that could not be integrated before. The simplicity of the way in which it transacts with applications also offers a first step of automation while a more robust and throughput-optimal adapter or API approach is developed. But let's drill down one level and review two important use cases. From a Process Automation top-down point of view, we can sum them up as follows:

Use Case #1: Use robots to replace repetitive, non-value-added human interactions. This use case aims to reduce unnecessary human touch points. In this scenario, it is possible to streamline the business process, since robots can execute these tasks without errors, following the same procedure over and over again. Moreover, robots will use the input data as given and avoid any "fat finger" issue that comes from humans accidentally mistyping it. It is worth applying some caution with this use case, as robots cannot replace necessary human decision intelligence and know-how. In that scenario, we are better off relying on human discretion and criteria, as it makes the process better. In the end, not all process steps can be fully automated without human touch points!

Use Case #2: Use robots to prototype integration, as well as to integrate with applications when there is no headless integration approach available (for example, an API or adapter). Leveraging RPA as "another" integration mechanism offers new ways to transact against applications besides the ones known to the market to date.

How do we bring more value by combining orchestration with Robotic Process Automation? As described throughout this blog, RPA offers "another" way to integrate with systems of record, complementing the existing adapter and API mechanisms offered by integration platforms. If we agree that integration is one of the hardest Process Automation implementation tasks to nail, then having another tool in our toolset definitely helps! While RPA may not be a silver bullet, it does make Process Automation better and offers a way to better digitize and automate your business processes. If you are using RPA in the context of Process Automation efforts, I would like to hear your thoughts.


Make Orchestration Better with RPA

This article originally appeared on LinkedIn; written by Eduardo Chiocconi.

Nobody can deny that, when used correctly, RPA has the potential to provide a great ROI, especially when we are automating manual, non-value-added tasks, or using it as a mechanism to integrate with systems of information that offer no headless way to interact with them (for example, no APIs or adapters if you are using an integration broker tool).

I would like to start this article with a simple example. Imagine for a second an approval business process where a Statement of Work (SOW) needs to be approved by several individuals within an organization (a consulting manager to properly staff the project, a finance manager to make sure the project is viable). Once the approvals are done, the SOW should be uploaded and associated to an opportunity in this company's CRM application (where all customer information is centrally located). At the core of this business process there is an orchestration that coordinates people approvals and should also integrate with the CRM application to upload the SOW to the customer opportunity. The diagram below illustrates the happy path of this orchestration using BPMN as the modeling notation to map this business process (screenshot from Oracle Integration Cloud - Process).

Process Automation tools can easily manage the human factor of these orchestrations. Different tools manage integration to applications differently. Depending on the integrated system, the task of transacting against it may be simple, complex, or at times not possible at all. If we take a closer look at the step in which we need to upload the SOW document to the opportunity, we have the following options:

Option a) If the CRM application has an API that allows uploading a document and linking it directly to an opportunity, then this transaction can be invoked from the orchestrating business process and automated in a headless manner. When available, this is the preferred way, as it is more scalable and does not come with the overhead of transacting via the application user interface.

Option b) If the CRM application does not have an API (or a headless way to transact with it), the chances of automation are at stake. The immediate option is to ask a human (such as an admin) to do the work. The orchestrating business process can route a task to the admin, and this person can get the SOW file, connect to the CRM application via its user interface, find the opportunity, and then upload the SOW document to it. Not only is this highly manual, repetitive, and, to be honest, of no value to the organization, but it is also at the mercy of the admin having the bandwidth to perform this task (and hopefully not attaching the document to the wrong opportunity).

But are these the only two options? Is there a middle ground between option a) and option b)? As a matter of fact, YES! And the answer is Robotic Process Automation. The work that the admin performs can be captured within an RPA process and, via the RPA vendor's APIs, be invoked when the flow reaches that step in the process (Upload SOW to Opportunity). Now a robot will perform the admin's work (which was not really needed in the first place, as it was only requested due to the lack of integration alternatives). More importantly, it will be done the same way over and over again and at any time (even after working hours). Because the robot is configured and scripted to do certain work, it is not necessary to train people to perform this transaction against the CRM application. This automation via RPA allows the consulting company to close and share the SOW faster with their customers. While the RPA process may need to interact with the application via its user interface, and the RPA script may be sensitive to UI changes, it is definitely a better option than waiting for a person to do the work manually!

The screenshot below outlines who now performs the different steps of the process. Great! As we combine people, robots, and services, we are creating a digital workforce that performs business processes optimally. But wait! Can RPA automate this process end to end? Well, in reality it CANNOT! And this takes me to the second part of this write-up. I would like to expose a handful of points where I make a case to always have an orchestration coordinate the work of people, robots, and service calls to systems.

Important people's decisions cannot be automated: While in this example it is possible to look for conditions under which the consulting and finance managers may not need to approve the SOW, there will always be cases in which a person's decision and discretion are needed. This reason alone makes the case for an orchestration tool with human task interactions to be in the loop, as RPA solutions do not manage the workflow and human element.

Orchestration helps better recover from discrete action failures: One of the main functions of an orchestrator is to coordinate when to move on to the next step in the orchestrated flow. This happens ONLY when a step has been successfully completed. If it fails, the flow stays there until the step can be performed without problems. Orchestration tools are built from the ground up with these capabilities - exception handling, failure management, and retry logic - so that the orchestration developer does not need to deal with these details when you get off the beaten happy path. RPA scripting cannot be considered an orchestration technology. For robots to be resilient, all the exception handling logic would need to be coded within the RPA process script itself, likely leading to spaghetti code that is hard to maintain and understand. Bottom line (and the case I am making): you will be better off coordinating small, discrete RPA process executions through an orchestration technology. RPA processes should be simple and discrete in what they do. If they fail, let the problem and error management logic be handled by the orchestration layer. The RPA process can always be retried, and if it keeps failing, be delegated to a person who will be warned via a central monitoring location along with the rest of the integration and orchestration services.

Orchestration with RPA makes orchestration better. RPA with orchestration makes RPA better. One leading orchestration tool is Oracle Integration Cloud. If you are looking to scale your orchestration or RPA efforts, I hope you find this example and these lessons learned useful.
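The retry-then-escalate pattern argued for above can be sketched in a few lines. This is an illustrative Python model of orchestration-layer error handling, not Oracle Integration code; the robot invocation and escalation hook are hypothetical stand-ins:

```python
def run_rpa_step_with_retry(invoke_robot, max_attempts=3, escalate=None):
    """Orchestration-layer error handling around a discrete RPA step:
    retry the robot a few times; if it keeps failing, delegate to a person
    via the escalation hook (e.g. routing a human task)."""
    last_error = None
    for _ in range(max_attempts):
        try:
            return invoke_robot()  # e.g. a call to the RPA vendor's API
        except Exception as exc:
            last_error = exc       # keep retry logic out of the RPA script
    if escalate is not None:
        escalate(last_error)       # warn a person via central monitoring
    raise last_error
```

Keeping this loop in the orchestrator is the point of the argument: the RPA script stays a simple, discrete action, and the exception, retry, and escalation logic lives in one place.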


The Power of High Availability Connectivity Agent

High Availability with Oracle Integration Connectivity Agent

You want your systems to be resilient to failure, and within Integration Cloud, Oracle takes care to ensure that there is always redundancy in the cloud-based components so that your integrations continue to run despite potential failures of hardware or software. However, the connectivity agent was a singleton until recently. That is no longer the case: you can now run more than one agent in an agent group.

Of Connections, Agent Groups & Agents

An agent is a software component installed on your local system that "phones home" to Integration Cloud to allow message transfer between cloud and local systems without opening any firewalls. Agents are assigned to agent groups, which are logical groupings of agents. A connection may make use of an agent group to gain access to local resources. The feature flag oic.adapter.connectivity-agent.ha allows two agents per agent group. This provides an HA solution for the agent: if one agent fails, the other continues to process messages.

Agent Networking

Agents require access to Integration Cloud using HTTPS; note that an agent may need to use a proxy to reach Integration Cloud. This access allows agents to check for messages to be delivered from the cloud to local systems or vice versa. When using multiple agents in an agent group, it is important that all agents in the group can access the same resources across the network. Failure to ensure this can cause unexpected message failures.

High Availability

When running two agents in a group, they process messages in an active-active model. All agents in the group process messages, but any given message is processed by only a single agent. This provides both high availability and potentially improved throughput.

Conclusion

If resiliency is important, then the HA agent group provides a reliable on-premises connectivity solution.
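Since every agent in a group must be able to reach the same local endpoints, a quick preflight check before adding a second agent can save debugging time later. Below is an illustrative Python sketch (not an Oracle tool; the host names and ports in the usage note are hypothetical) that probes TCP reachability and flags the resources the agent hosts disagree on:

```python
import socket

def reachable(host, port, timeout=3.0):
    """True if a TCP connection to host:port succeeds from this machine."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def resources_agents_disagree_on(agent_results):
    """agent_results maps agent name -> {resource: reachable?}.
    Returns the resources where the agents disagree - a likely source
    of intermittent message failures in an active-active agent group."""
    all_resources = set()
    for seen in agent_results.values():
        all_resources.update(seen)
    return sorted(
        r for r in all_resources
        if len({seen.get(r, False) for seen in agent_results.values()}) > 1
    )
```

Run the reachable() probe on each agent host against every local endpoint your integrations use (for example a database on port 1521 or an FTP server on port 21), then compare the per-host results; any resource reported by resources_agents_disagree_on() needs fixing before the second agent goes live.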


Sending OIC notifications from an email address of your choice

Most avid OIC users are aware that OIC notifications, whether system status reports or integration notifications, are sent out from an Oracle address, i.e., no-reply@oracle.com. But with the latest enhancements, OIC gives users the flexibility to choose the sender for these notifications. OIC achieves this by providing a simple and intuitive UI, where a user can easily add a list of email addresses which can later be approved to qualify as Approved Senders in OIC. Let's see how we can do this in a few simple steps:

1. Navigate to the Notifications page. Here, you will see a table where you can add a list of email addresses that you want to register as Approved Senders with OIC.
2. When you click the add button (plus sign) on the bottom right corner of the page, a new row is added to the table where you can enter an email address.
3. You can also choose one of the email addresses for sending System Notifications, such as status reports for successful message rate, service failure alerts, etc. You can do this by checking the box corresponding to the email address of your choice. Please note that you can only choose one email address for sending System Notifications.
4. When you are done entering the list of email addresses, click Save. Upon saving, a confirmation email is sent out to each of the email addresses in the list, and the Approval Status is updated to reflect this.
5. The recipient of the email is then required to confirm the email address by clicking the confirmation link in the mail. A sample snippet of the confirmation email is pasted below.
6. Upon confirmation, the Approval Status changes to Approved. (To refresh the approval status, please use the refresh button on the top left corner of the section.)

Congratulations! You have an approved sender registered in OIC. You can now use this approved sender in the From Address section of the Notification action on the integration orchestration canvas, as depicted below. In addition to this, you can also choose this Approved Sender for sending System Notifications.

Please note: if a registered email address is still "Waiting for User Confirmation" and the user uses it in the Notification action or chooses it to send system notifications, the sender will default to no-reply@oracle.com.

Hope this blog was able to shed some light on how OIC is helping users manage their notifications better, whether by providing the ability to register any number of email addresses, delete a previously approved email address from the list of approved senders, or change the primary sender of System Notifications any number of times. Hope you have fun incorporating this feature into your use cases!
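The fallback rule in the note above can be modeled in a few lines. This is an illustrative sketch of the behavior, not OIC code; the status strings simply mirror the labels shown in the UI:

```python
NO_REPLY = "no-reply@oracle.com"

def effective_sender(requested, approved_senders):
    """approved_senders maps email -> approval status as shown in the UI.
    OIC falls back to the default address unless the requested sender
    has been confirmed (status 'Approved')."""
    if approved_senders.get(requested) == "Approved":
        return requested
    return NO_REPLY
```

In other words, selecting an unconfirmed address in the Notification action does not fail; the notification is simply sent from the default address until the recipient clicks the confirmation link.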



Working with Create Error Activity

Want the ability to add a Create Error activity at any point in the execution of an integration flow? Now you can. You will need to enable the "oic.ics.console.integration.throw-action" feature flag to enjoy this feature.

1. Create or edit an integration.
2. Drag the 'Create Error' action. Notice the gray plus icons that appear over the connector lines.
3. The error can be dropped on any of those plus icons.
4. Once dropped, enter a Name and Description (optional) for the error.
5. Upon choosing Create, you will automatically be taken to the error details. The error can contain information such as Code, Reason, and other Details.
6. To edit the fields, click the pencil icon, which takes you to the Expression Builder to enter the information.
7. You can also provide a Skip Condition, which will prevent the error from actually being thrown.
8. Closing the details returns you to the orchestration, and the node is added to the diagram.

Without a Skip Condition, the diagram is displayed with a dashed line to the next activity, signifying that the flow will NOT continue execution after executing the error. When a Skip Condition is specified, the line to the next activity is solid while the decoration around the error activity is dashed. The solid line indicates that it's possible for the execution of the flow to bypass the error and go straight through to the next activity.
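The dashed-line versus solid-line semantics described above can be modeled in a few lines. This is an illustrative Python sketch, not OIC internals; the field names simply mirror the activity's fields, and the Skip Condition is modeled as an already-evaluated boolean:

```python
class IntegrationError(Exception):
    """Models the error raised by a Create Error activity."""
    def __init__(self, code, reason, details=""):
        super().__init__(f"{code}: {reason}")
        self.code, self.reason, self.details = code, reason, details

def create_error(code, reason, details="", skip=False):
    """If the Skip Condition holds, flow continues to the next activity
    (solid line); otherwise the error is thrown and execution does not
    proceed past this node (dashed line)."""
    if skip:
        return "continue"
    raise IntegrationError(code, reason, details)
```

The design point is that the Skip Condition is evaluated at runtime, so the same flow diagram can either stop with an error or pass straight through depending on the message being processed.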


See how easily you can switch your integration views

In OIC, we spend most of our time building integrations. Previously, when you viewed or edited an integration in the editor, it showed the integration in a vertical layout. Now, you can view/edit the integration in several ways:

- Canvas view
  - Vertical: Displays the integration vertically.
  - Horizontal: Displays the integration horizontally.
- Pseudo view: Displays the integration vertically with child nodes indented. Details about each node in the integration are displayed to the right.

In addition to the above, you can also view the integration in an outline style. You will need to enable the "oic.ics.console.integration.layout" feature flag to enjoy this feature. The above diagram shows how to select the different views and what the integration looks like in the vertical layout.

Canvas view: Canvas view allows you to select the layout. There are two options: Vertical, the default view mode, in which the integration is shown vertically; and Horizontal, which shows the integration horizontally.

Pseudo view: In this view the integration is shown vertically with indented child nodes, and the details of each node are shown next to it. This helps you easily understand the integration without needing to drill down into each node to see the details! You can use the inline menu to add new nodes/actions. In this view mode, you won't be able to change the orientation of the nodes, but you can reposition nodes, e.g., moving an Assign inside a Switch node.

Outline view: You can also view your integration as an outline. Click on the menu and select 'View Integration Outline'. The outline view is a great way to view your integration at a glance. You can expand and collapse nodes that have child nodes. You can also maximize the outline view, which will also show you details of each node. This view is particularly useful when you have a large integration and want to understand its functionality quickly. Please note that the outline is a read-only view; you can't use it to add or modify actions/nodes.

Hope you find this feature helpful. Enjoy the integration views!


Oracle OpenWorld 2018 Highlights

With another Oracle OpenWorld in the books, we want to take a moment to reflect on some of this year's highlights. First, let us start by thanking those who make OOW the success that it is: our incredible customers and partners. Your stories inspire us every day, and we are so glad to have been able to share them with thousands of attendees at OpenWorld.

Thank you to our customer and partner speakers and panelists: Erik Dvergsnes (Aker BP), Michael Morales (Quality Metrics Partners), Lonneke Dikmans (eProseed Europe), Patrick McMahon (Regional Transportation District), Steven Tremblay (Graco Inc.), Wade Quale (Graco Inc.), Suresh Sharma (Cognizant Technology Solutions Corporation), Sandeep Singh (GE), David VanWiggeren (Drop Tank), Deepak Kakar (Western Digital), Timothy Lomax (Mitsubishi Electric Automation), Candace McAvaney (Minnesota Power), Mark Harrison (Eaton Corp), Awais Bajwa (GE), Nishi Deokule (GetResource Inc), Murali Palanisamy (DXC Technology), Bhavnesh Patel (UHG Optum Services Inc.), Biswajit Dhar (Unitedhealth Group Incorporated), Karl Jonsson (Reinhart), Lakshmi Pavuluri (The Wonderful Company), Eric Doty (Greenworks Tools), Susan Gorecki (American Red Cross), Timothy Dickson (Laureate Education), Marc Murphy (Atlatl Software), Chad Ulland (Minnkota Power Cooperative), Amit Patanjali (ICU Medical), Rajendra Bhide (GE), Jonathan Hult (Mythics), Wilson Farrar (UiPath), Xander van Rooijen (Rabobank), Kevin King (AVIO Consulting), Milind Joshi (WorkSpan), Palash Kundu (Achaogen), Simon Haslam (eProseed UK), Matthew Gilbride (Skanska), Chris Maggiulli (Latham Pool Products), Duane Debique (Sinclair Broadcast Group), Ravi Gade (Calix, Inc), and Jagadish Manchikanti (Tupperware).

We would also like to congratulate our 2018 Oracle Cloud Platform Innovation Award winners: Drop Tank, Ministry of Interior Turkey, The Co-operative Group, and Meliá Hotels International. Their innovation journeys were truly inspiring!
More than anything, #OOW18 was about innovation, sharing our customers' successes, and Oracle's strategy and vision! This year's OOW was abuzz with 60,000 customers and partners from 175 countries and 19 million virtual attendees. We had 50+ sessions on integration, process, and APIs, taking center stage even in SaaS sessions for ERP, HCM, and CX cloud. This year was all about using integration to mobilize digital transformation, looking at areas like API-led integration and innovation with Robotic Process Automation, IoT, AI, blockchain, and machine learning.

Connecting with customers and partners is always a top highlight of OpenWorld. This year, Oracle VP of Product Management, Vikas Anand, had a chance to connect with UiPath's Brent Haley. Take a look to hear a bit about how we are bringing AI and RPA into Oracle's Integration Platform and more. Before OpenWorld, we shared a few of our most buzzed-about sessions with you. No matter which sessions you were able to attend, we hope you found them informative and left OOW with fresh knowledge and inspiration.

And as always, executive keynotes were a major highlight of OOW. In case you missed any, you can catch them here:

Cloud Generation 2: Larry Ellison Keynote at Oracle OpenWorld 2018
Accelerating Growth in the Cloud: Mark Hurd Keynote at Oracle OpenWorld 2018

With the help of our customers and partners, #OOW18 was a smash hit! We cannot wait to see what the next year will bring.


Integration

Little known way to change connections in an integration

Sometimes we want to change the connection(s) used in an integration. Common use cases include:

We have created an integration that uses connection A but want to replace it with connection B.
We have imported an integration that uses an Oracle Sales Cloud connection (Sample Sales Cloud), but we already have an Oracle Sales Cloud connection configured (My Sales Cloud) in our system and want to use that connection instead of the one that came with the integration.
We have cloned an integration and want the clone to use a different connection.

In this blog, I will show you the trick to replace connection(s) in an integration! A couple of points to remember before updating the connection in an integration:

The integration can't be in a locked or activated state.
Only a connection of the same adapter type can be replaced. You can't replace an Oracle Sales Cloud adapter connection with an FTP adapter connection.

I will use the update integration REST API to replace the connection. As an example, I will use the 'Incident details from Service Cloud' integration, which is delivered as a sample in OIC and uses 'Sample Service Cloud' as one of its connections. I will replace the 'Sample Service Cloud' connection with the 'My Service Cloud' connection, which I had already configured. First, I will use the retrieve integration REST API to see the details of the 'Incident details from Service Cloud' integration. You can use curl or the Postman REST client. As you can see from the dependencies section, this integration uses two connections with the following identifiers: MY_REST_ENDPOINT_INTERFAC and SAMPLE_SERVICE_CLOUD. Now, I will use the update integration REST API to replace 'SAMPLE_SERVICE_CLOUD' with 'MY_SERVICE_CLOUD' (the identifier of the 'My Service Cloud' connection). You can find the identifier of a connection by going to the connection list page and clicking the info icon for that connection.
In the body of the REST API request, you will need to provide the dependencies details. You can copy and paste the dependencies you got from the first REST API call. Make sure to use X-HTTP-Method-Override = PATCH in the HTTP header. You can then go to the integration list page and hover your mouse over the connection: you will see the integration is now using the 'My Service Cloud' connection.
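Putting the steps above together, the update call can be sketched as follows. Note this is a hedged illustration: the host, credentials, integration identifier/version, and the payload shape are placeholders standing in for the values you get back from the retrieve call, not authoritative values.

```python
import base64
import json
import urllib.request

# Hypothetical values: substitute your instance host, credentials, and the
# integration identifier|version shown in the retrieve response.
HOST = "https://example-oic.oraclecloud.com"
INTEGRATION_ID = "MY_INTEGRATION"   # placeholder identifier
VERSION = "01.00.0000"              # placeholder version

url = f"{HOST}/ic/api/integration/v1/integrations/{INTEGRATION_ID}%7C{VERSION}"

# Dependencies copied from the retrieve call, with SAMPLE_SERVICE_CLOUD
# swapped for MY_SERVICE_CLOUD (payload shape is illustrative).
body = {"dependencies": {"connections": [
    {"id": "MY_REST_ENDPOINT_INTERFAC"},
    {"id": "MY_SERVICE_CLOUD"},
]}}

auth = base64.b64encode(b"username:password").decode()
req = urllib.request.Request(
    url,
    data=json.dumps(body).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Basic {auth}",
        # The update is sent as POST with the PATCH method override header.
        "X-HTTP-Method-Override": "PATCH",
    },
    method="POST",
)
# urllib.request.urlopen(req) would submit the update; not executed here.
```

The same request works equally well from curl or Postman; the essential parts are the dependencies payload and the X-HTTP-Method-Override header.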



How to migrate from ICS to OIC?

In this blog I'd like to show you how to migrate metadata from an ICS (Integration Cloud Service) instance to an OIC (Oracle Integration Cloud) instance. Metadata that will be migrated includes the following:

Integrations, connections, lookups, libraries, packages, agent groups, custom adapters, etc. Integrations in any state (in-progress, activated, etc.) will be migrated, as will resources such as lookups and connections that are not referenced by integrations.
Endpoint configuration (configured in connections).
Certificates.
Credentials stored in the CSF store.
Settings such as database and notification settings.

The migration tool automates tasks that would otherwise have to be done manually using export and import:

Bulk export of all integrations along with their dependencies (such as connections and lookups) into a migration package.
Migration of endpoint configuration and credentials.
Automatic replacement of the host/port from the source ICS instance to the target OIC instance for "integration calling integration" use cases.
Automatic "Test Connection".
Automatic activation of previously activated integrations.

Enabling Migration in OIC

A feature flag has to be enabled in OIC to import content into OIC as part of migration. To turn on the feature flag, open a Service Request with Oracle Support.

Migration Lifecycle

High-level steps that need to be performed for the migration:

Create an object storage bucket in the underlying Oracle Cloud Infrastructure environment (if the migration target is OIC autonomous). This is needed to transfer the migration package between ICS and OIC. Check this link for detailed steps on how to create a storage bucket.
Once the above step is completed, invoke the export REST API within the ICS environment, using the storage URL and storage credentials. This copies the data from ICS into the storage service.
Invoke a REST API to check the status of the export operation if needed.
Information on which objects were exported, along with any errors or warnings raised as part of the migration, can be retrieved from a migration report. Then perform the import operation in the OIC environment, passing the storage URL and storage credentials. This imports the content from storage into OIC. Invoke a REST API to check the status of the import operation if needed. Information on which objects were imported, along with any errors or warnings raised as part of the migration, can be retrieved from the migration report.

Exporting the data from ICS

Export the data from an ICS environment using the steps below (see the section "Exporting the data from OIC" for exporting from OIC). Using administrator access, execute the export REST API. A sample is shown below using the Postman REST client.

Export Request: Construct the storage URL based on the configuration done within the storage service, using the format "https://swiftobjectstorage.region.oraclecloud.com/v1/tenancy/bucket", and pass the storage credentials as well. Check this link for more details on creating a storage bucket.
Response:
Checking status:
Checking the migration archive:

Importing the data into OIC

Import the data into an OIC environment using the steps below. The migration utility supports different modes for the import process (importActivateMode):

1. ImportOnly - Only imports the objects and doesn't activate integrations. Use this when a manual operation needs to be performed, such as adapter agent installation.
2. ImportActivate - Imports and activates all previously activated integrations.
3. ActivateOnly - Only activates previously activated integrations.

Using administrator access, execute the import REST API.
A sample is shown below using the Postman REST client:

ImportOnly Request: Construct the storage URL based on the configuration done within the storage service, using the format "https://swiftobjectstorage.region.oraclecloud.com/v1/tenancy/bucket", and pass the storage credentials as well.
ImportActivate Request:
ActivateOnly Request:
Response:
Checking the import status: Note: The jobId returned in the payload of the import request is passed in as part of the resource; in the example below the jobId is "405".

Checking the migration report

The result of the migration import process can be checked using the steps below:

Migration report location:
Sample report:

Exporting the data from OIC

Export the data from an OIC environment using the steps below. Using administrator access, execute the export REST API. A sample is shown below using the Postman REST client:

Export Request:
Export Response:
Checking status:
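As a rough sketch of the export step, the request could be assembled like this. The region, tenancy, bucket, hosts, credentials, and the request path and payload field names are all assumptions for illustration; use the endpoints and field names from your own environment and the migration REST API documentation.

```python
import base64
import json
import urllib.request

# All values below are placeholders for illustration.
REGION, TENANCY, BUCKET = "us-phoenix-1", "mytenancy", "ics-migration"

# Storage URL in the documented swift format:
# https://swiftobjectstorage.region.oraclecloud.com/v1/tenancy/bucket
storage_url = (f"https://swiftobjectstorage.{REGION}.oraclecloud.com"
               f"/v1/{TENANCY}/{BUCKET}")

# Payload field names are illustrative; check the migration API docs.
payload = {
    "storageUrl": storage_url,
    "storageUser": "storage_user",
    "storagePassword": "storage_password",
}

ics_host = "https://example-ics.oraclecloud.com"   # source ICS instance
export_path = "/icsapis/v1/migration/export"       # assumed resource path

req = urllib.request.Request(
    ics_host + export_path,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json",
             "Authorization": "Basic "
             + base64.b64encode(b"admin:password").decode()},
    method="POST",
)
# urllib.request.urlopen(req) would start the export job; the response
# includes a job id that can be polled via the status REST API.
```

The import call on the OIC side follows the same shape, with the chosen importActivateMode added to the request.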



This is the SOA session you were looking for

I often hear from customers that we don't have enough SOA Suite on-premises sessions at Oracle OpenWorld. Well, here is an exciting SOA session with three amazing speakers who will share their adventures with Oracle SOA Suite: Candace McAvaney from ALLETE/Minnesota Power, Mark Harrison from Eaton Corporation, and Awais Bajwa from GE Digital.

Candace McAvaney is the senior Enterprise Application Integration architect/developer at ALLETE/Minnesota Power. In her over 10 years of development using the SOA framework, she has implemented nearly 100 interfaces between applications including the Oracle Customer Care and Billing application, IBM Maximo Work Management application, Oracle Enterprise Business System, GE Outage Management System, Sensus Automatic Metering/Smart Grid system, and Accruent Mobile Workforce application. She has presented sessions at the Oracle SOA Customer Advisory Board and the Minnesota Fusion Middleware User's Group. During the OOW panel, Candace will provide insight into ALLETE/Minnesota Power's SOA Suite history, the reasoning behind selecting Oracle SOA Suite, their integration guidelines and best practices, and which components they have been using. She will also spend a few minutes on their plans for a hybrid platform spanning cloud and on premises.

Mark Harrison just completed 20 years' service with Eaton Corporation. He initially worked as an Oracle Applications DBA and DBA Manager, and for the past 6 years has been working within the integration space as Eaton's Information Integration Manager. He manages a virtual team of 70+ professionals who are responsible for the end-to-end business integration of systems across Eaton, business partners, and customers (internal/external). Amongst other technologies, they utilize Oracle SOA Suite, Oracle Data Integration, Oracle OAG, and MFT.
Mark is responsible for integration strategy globally, project delivery (PM certified), enterprise integration support, vendor management, and budget planning/tracking. The focus of Mark's presentation will be the use of Oracle SOA 12c within a traditional manufacturing environment at Eaton, from the use case to the implementation, best practices, lessons learned, and future plans.

Awais Bajwa is an enterprise IT leader (integration and cloud technologies) for GE Digital. Awais started his career as a Java expert and is now a seasoned professional with 18 years in the IT industry. He has been focusing exclusively on Oracle technologies, integrations, and Oracle ERPs for the last 11 years. He specializes in Oracle Cloud PaaS and iPaaS platforms including OIC/ICS/PCS, traditional on-premises SOA, OCI, and ERP integrations. Awais has extensive experience rolling out large-scale global Oracle program initiatives in different parts of the world, including North America, the Middle East, and Asia. He is also passionate about microservices architecture, event-driven architecture, API-oriented integrations, and machine learning. At GE Digital, he drives the strategy and roadmap for Oracle integration technologies and cloud adoption, founded in these modern architecture practices. Awais holds a master's in computer science from Al-Khair University and is originally from Lahore, Pakistan. Awais will discuss GE's SOA Suite use cases, the business value, their architectural governance for design time and runtime, and how they handle automation and monitoring. He will also go into detail on best practices and lessons learned, and the implementation of a business use case leveraging an API-based approach. Finally, he will give a short overview of their next-generation Oracle SOA and hybrid cloud roadmap and strategy, which includes SOA, BPM, SOA CS, and OIC.
Please also check our Focus on Document for more integration sessions and don't forget to visit us at the demo grounds in Moscone South.



Don't miss your chance to get your hands on Oracle Integration Cloud

As the countdown to Oracle OpenWorld continues, attendees are building their schedules, adding keynotes and the sessions that seem most interesting to them. But OpenWorld also provides the unique chance to get your hands on our products and work through labs with the assistance of the product managers and engineers who built them. The two hands-on labs for Oracle Integration Cloud (OIC) will introduce you to the Integration, Process, and Insight features of OIC and teach you how to build, test, and run a simple end-to-end use case. Of course, there will also be an opportunity to ask questions and get to know the OIC team. You are invited to join Antony Reynolds, Nathan Angstadt, and myself for an hour of learning and fun:

Extending and Connecting Applications with Oracle Integration Cloud [HOL6298] Wednesday, Oct 24, 11:15 a.m. - 12:15 p.m. | Marriott Marquis (Yerba Buena Level) - Salon 5/6
Enhance your CX Applications with Oracle Integration Cloud [HOL6299] Thursday, Oct 25, 10:30 a.m. - 11:30 a.m. | Marriott Marquis (Yerba Buena Level) - Salon 5/6

If you don't have time to attend the labs, or would like to get more information on the Oracle Integration Cloud Platform and learn about additional use cases, I recommend you visit the demo grounds in Moscone South. We will be there from Monday morning till Wednesday evening with demos on Integration Cloud, API Platform, Self-Service Integration (SSI), Robotic Process Automation (RPA), SOA Cloud Service, B2B, and Managed File Transfer (MFT). As always, you can find more information about our sessions, hands-on labs, and demos in the Focus On document.



A simple guide to use nested scope in orchestration

Wish you could have nested scopes in orchestration? Now you can use nested scopes in an OIC integration! You will need to enable the "oic.ics.console.integration.nested-try-scopes" feature flag to enjoy this feature. In this short blog, I will show you how to use a nested scope in your orchestration. Scope activities allow users to group other child activities, which have their own variables, fault handlers, and event handlers.

Create or edit an integration.
Drag a Scope activity onto the canvas or use the inline menu. Upon dropping the Scope, you will be prompted to enter a Name and Description (optional).
Upon clicking Create, the scope will be added to the canvas.

Scope activities can also contain other scope activities; this is referred to as nested scopes. It provides a more sophisticated way of organizing or separating activities into a subsection of the flow.

Drag a Scope activity inside the other Scope activity.
Upon dropping the Scope, you will again be prompted to enter a Name and Description (optional).
Upon clicking Create, the scope will be added to the canvas inside the other scope.

A nested scope behaves the same way as a basic scope: it provides its own container of child activities and fault handlers. There is no limit to the levels of nesting. Even a scope's fault handlers can have nested scopes. Hope you enjoyed the nested scope feature!



How to invoke an Integration From another Integration in OIC without creating a connection

In this blog, I am going to show you how to use Oracle Integration Cloud's 'Local Integration' feature to invoke an integration from another integration. With this new feature, you don't need to create an explicit connection for the integration you want to call. To utilize this feature, you will need to turn on the "oic.ics.console.integration.invoke.local.integration" feature flag. We will be creating a new integration, Invoke Hello World, to call the "Hello World" integration. The "Hello World" integration is delivered with OIC as a sample. For more info on the "Hello World" sample, see: https://docs.oracle.com/en/cloud/paas/integration-cloud-service/icsug/running-hello-world-sample.html

First, activate the Hello World integration. Then follow the steps below to create the "Invoke Hello World" integration.

From the integration list page, click "Create", select "App Driven Orchestration", provide the name "Invoke Hello World", and create the integration.
We will create a REST trigger that takes name and email as parameters. To do that, drag and drop "Sample REST Endpoint Interface" as the trigger, or use the inline menu to add it, and follow through the wizard. The "Sample REST Endpoint Interface" connection should already be in your system.
Configure the tracking field. Add the name as the tracking field. For more info on tracking, see: https://docs.oracle.com/en/cloud/paas/integration-cloud-service/icsug/assigning-business-identifiers.html

Drag and Drop Local Integration

Click on Integration Artifacts, click on Business Integrations, and then drag and drop "Local Integration" onto the integration after the REST trigger (getNameAndEmail). This will bring up the Local Integration wizard. Provide the details and click Next. This page shows the list of all the activated integrations that you can invoke. You can type the integration name to filter the list. Select "Hello World (1.2.0)" and click Next.
Select the operation and click Next. In the Summary screen, click Done. Now edit the "CallHelloWorld" map to map the name and email, and configure/edit the "getNameAndEmail" map. Then save and close the integration. From the landing page, activate the integration; also enable Tracing and Payload during activation. Run the integration using the endpoint URL; you can paste the URL in a browser and run it. It will be similar to https://host/ic/api/integration/v1/flows/rest/INVOKE_HELLO_WORLD/1.0/info?name=[name-value]&email=[email-value]

Go to Monitoring->Tracking to monitor the integration run. You will see that the Hello World integration was successfully called from the Invoke Hello World integration. You can also go to the Hello World instance from this page: click on the "CallHelloWorld" local integration invoke and select the "Go to Local Integration instance.." icon. It will show you a popup; click "Go" to see the Hello World instance, which will open in a new tab. This functionality is only applicable for REST- or SOAP-based invokes and doesn't apply to Scheduled Orchestration.

How to Invoke a Scheduled Orchestration

You can also invoke a scheduled orchestration from another integration, but only as "Submit now". I will be creating a new integration, Invoke File Transfer, which will call the "File Transfer Sample". For more info on the File Transfer Sample, see https://docs.oracle.com/en/cloud/paas/integration-cloud-service/icsug/running-file-transfer-sample.html

First, activate the "File Transfer Sample". Then create a scheduled orchestration "Invoke File Transfer" and drag and drop the Local Integration. Go through the wizard and select the "File Transfer Sample" as the local integration and "runNow" as the operation. See "Drag and Drop Local Integration" above for details. Now edit the "CallFileTransfer" map. In the mapper, click on the action to go to the "Build Mapping" screen and enter "NOW".
This is the important step for running a scheduled orchestration. Configure tracking, then save and close the integration. Activate "Invoke File Transfer" and from the menu click Submit Now to run the integration. From the Monitoring→Runs page you can see the runs. As you can see from the below screenshot, "Invoke File Transfer" ran and in turn called the File Transfer Sample. In this blog, we have learned how to use the "Local Integration" feature to call another integration. It is good practice to break a large integration into multiple smaller integrations using this pattern, which promotes better design and provides modular functionality for easier maintainability.
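The run-it-from-a-URL step above can be sketched in a few lines. The host is a placeholder; the flow path follows the endpoint pattern shown in the post, and the name/email values are made-up examples.

```python
from urllib.parse import urlencode

# Placeholder host; the flow path follows the pattern from the post.
host = "https://example-oic.oraclecloud.com"
base = f"{host}/ic/api/integration/v1/flows/rest/INVOKE_HELLO_WORLD/1.0/info"

# Query parameters consumed by the REST trigger (getNameAndEmail).
params = urlencode({"name": "Jane Doe", "email": "jane@example.com"})
endpoint = f"{base}?{params}"

# An authenticated GET to `endpoint` triggers Invoke Hello World, which in
# turn calls the Hello World integration through the Local Integration step.
print(endpoint)
```

Using urlencode (or the equivalent in your client) keeps spaces and special characters in the parameter values properly escaped.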


Eight #OOW18 Integration Cloud Sessions You Won't Want to Miss

With Oracle OpenWorld 2018 just two weeks away, you are probably ready to get busy building your schedule and looking forward to packed days in San Francisco full of fun, learning, and networking. If you're curious about Oracle Integration Cloud, we have some great sessions for you to check out. Make sure to take a look at the Focus On: App Integration guide to make the most of your time at #OOW18. Coming up this week on the blog, we will continue to highlight #OOW18 iPaaS sessions that you'll definitely want to make time for. From roadmap to how-to, you will find just what you're looking for in iPaaS knowledge.

Monday, Oct. 22nd, 2018

Oracle Cloud Platform Strategy and Roadmap [PKN5769]
Time: 9:00 a.m. - 9:45 a.m.
Location: Yerba Buena Center for the Arts (YBCA) Theater
Speaker: Amit Zavery, Executive Vice President, Fusion Middleware and PaaS Development, Oracle
But why should I attend?? Join this session to learn about the strategy and vision for Oracle's comprehensive and autonomous PaaS solutions. See demonstrations of some of the new and autonomous capabilities built into Oracle Cloud Platform, including a trust fabric and data science platform. Hear how Oracle's application development, integration, systems management, and security solutions leverage artificial intelligence to drive cost savings and operational efficiency for hybrid and multi-cloud ecosystems.

AI-Powered Oracle Autonomous Integration Cloud and Oracle API Platform Cloud Service [PRO6176]
Time: 12:30 p.m. - 1:15 p.m.
Location: Moscone West - Room 2002
Speakers: Vikas Anand, VP, Product Management, Oracle; Susan Gorecki, Information Technology Sr. Director, American Red Cross; Kevin King, Consultant / Contractor, AVIO Consulting, LLC
But why should I attend?? Join this session to learn how to best leverage Oracle's recent innovations within Oracle Autonomous Integration Cloud, Oracle API Platform Cloud Service, process automation, and robotic process automation. Learn about the most exciting artificial intelligence and machine learning integration innovations today and how you can leverage them to jumpstart tomorrow's digital transformation.

Tuesday, Oct. 23rd, 2018

Oracle Cloud: Modernize and Innovate on Your Journey to the Cloud [GEN1229]
Time: 12:30 p.m. - 1:15 p.m.
Location: Moscone West - Room 2002
Speakers: Steve Daheb, Senior Vice President, Oracle Cloud, Oracle; Erik Dvergsnes, Architect, Aker BP; Michael Morales, CEO/Managing Partner, Quality Metrics Partners
But why should I attend?? In this headliner session, you'll learn how to manage conflicting mandates: modernize, innovate, AND reduce costs. The right cloud platform can address all three, but migrating isn't always as easy as it sounds because everyone's needs are unique, and cookie-cutter approaches just don't work. Learn how the Oracle Autonomous Cloud Platform automatically repairs, secures, and drives itself, allowing you to reduce cost and risk while at the same time delivering greater insights and innovation for your organization. In this session, learn from colleagues who found success building their own unique paths to the cloud.

The Future of Integration Is Autonomous with Machine Learning and Artificial Intelligence [TIP1372]
Time: 5:45 p.m. - 6:30 p.m.
Location: Moscone West - Room 2004
Speakers: Daryl Eicher, Sr Director Product Marketing, Oracle Autonomous Integration Cloud, Oracle; Bruce Tierney, Director Product Marketing, Oracle Autonomous Integration Cloud, Oracle; Jagadish Manchikanti, IT Director, Tupperware
But why should I attend?? Self-defining integrations take the hassle out of connecting SaaS and on-premises systems by providing prebuilt adapters and machine learning-powered mapping recommendations. In this session, learn how autonomous integration, including decision modeling and other cool technologies, transforms IT for speed to revenue and conversational AI wins. Come see what the future of autonomous integration looks like now!

Wednesday, Oct. 24th, 2018

The Next Big Things for Oracle's Autonomous Cloud Platform [PKN5770]
Time: 11:15 a.m. - 12:00 p.m.
Location: The Exchange @ Moscone South - The Arena
Speaker: Amit Zavery, Executive Vice President, Fusion Middleware and PaaS Development, Oracle
But why should I attend?? Attend this session to learn about cutting-edge solutions that Oracle is developing for its autonomous cloud platform. With pervasive machine learning embedded into all Oracle PaaS offerings, see the most exciting capabilities Oracle is developing, including speech-based analytics, trust fabric, automated application development (leveraging AR and VR), and digital assistants. Find out how Oracle is innovating to bring you transformational PaaS solutions that will enhance productivity, lower costs, and accelerate innovation across your enterprise.

Oracle Integration Cloud Best Practices Panel: Transforming to Hybrid Cloud [CAS5215]
Time: 12:30 p.m. - 1:15 p.m.
Location: Marriott Marquis (Golden Gate Level) - Golden Gate C3
Speakers: Rajendra Bhide, IT Director, GE; Amit Patanjali, Oracle Solution Architect, ICU Medical; Chad Ulland, Software Development Supervisor, Minnkota Power Cooperative, Inc.
But why should I attend?? In this session, you will get tips and tricks from Oracle Integration Cloud customers as they share expertise on how to move from on-premises deployment to a hybrid on-premises and cloud integration solution.

Thursday, Oct. 25th, 2018

Antipatterns for Integration: Common Pitfalls [PRO6175]
Time: 1:00 p.m. - 1:45 p.m.
Location: Moscone West - Room 3022
Speakers: Vikas Anand, VP, Product Management, Oracle; Rajan Modi, Development Sr. Director, Oracle; Aninda Sengupta, Vice President, Software Development, Oracle
But why should I attend?? In this session, join Oracle integration, process, and API engineers to learn anti-patterns that you should be aware of as you look at solving integration needs for your enterprise. Learn best practices for application integration based on real-life experience with multiple customers spanning various use cases across real-time/batch integrations and cloud/ground applications. See how the system warns of common pitfalls and anti-patterns that impact production deployments.

In addition to the awesome sessions above, we will also have hands-on labs, demos, and theater sessions. Between sessions, don't forget to stop by The Innovation Studio at Oracle OpenWorld, located at the top of the escalators in Moscone North, where you'll be able to learn about all the ways that Oracle Cloud Platform brings together technology. To find out about our entire #OOW18 iPaaS session offerings, check out the Focus On: App & Data Integration document here.



Integration Podcast Series: #1 - The Critical Role Integration Plays in Digital Transformation

Authored by Madhu Nair, Principal Product Marketing Director, Oracle

Digital transformation is inevitable if organizations are looking to thrive in today's economy. With technologies churning out new features based on cutting-edge research, such as those based on Artificial Intelligence (AI), Machine Learning (ML), and Natural Language Processing (NLP), business models need to change to adopt and adapt to these new offerings. In the first podcast of our "Integration: Heart of the Digital Economy" podcast series, we discuss, among other questions:

What is digital transformation?
What is the role of integration in digital transformation?
What roles do application and data integration play in this transformation?

Businesses, small and big, are now able to convert every process into a risk-reducing act or a value-adding opportunity. Integration plays a central role in the digital transformation of a business. Businesses and technologies run on data. Businesses also run applications and processes. Integration helps supercharge these critical components of a business. For example, cloud platforms now offer tremendous value with their Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) offerings. Adopting and moving to the cloud helps companies take advantage of the best technologies to run their businesses without having to worry about the costs of building and maintaining these sophisticated solutions. A good data integration solution should allow you to harness the power of data, work with big and small data sets easily and cost-effectively, and make data available wherever it is needed. A good application integration solution should allow businesses to quickly and easily connect applications, orchestrate processes, and even monetize applications with the greatest efficiency and lowest risk.
Piecemeal cobbling together of such critical elements of digital transformation would undermine the larger goal of efficiency that such a strategic initiative aims to achieve. Digital transformation positions businesses to better re-evaluate their existing business models, allowing organizations to focus on their core reason for existence. Learn more about Oracle's Data Integration Solution here. Learn more about Oracle's Application Integration Solution here.

Oracle Cloud Café Podcast Channel

Be sure to check out the Oracle Cloud Café, where you can listen to conversations with Oracle Cloud customers, partners, thought leaders, and experts to get the latest information about cloud transformation and what the cloud means for your business.



Securely Connect to REST APIs from Oracle Integration Cloud

Oracle REST Adapter provides a comprehensive way of consuming external RESTful APIs, including secure APIs. In this blog we will provide an overview of the available methods of consuming protected APIs using the REST Adapter. Oracle Integration Cloud provides a reusable connection that can be used to specify the security policy for accessing protected APIs. Once configured, users can test, save, and complete the connection and use it in integration flows just like any other connection.

When a REST Adapter connection is updated with new security credentials, the change is automatically visible to all deployed integrations, and there is no need to update or redeploy the integration flows. The integration developer must ensure that the new security credentials have identical access and privileges to the APIs and resources being referenced within the integration flow. Any other change, especially in the coordinates of the external REST APIs, such as the base URI or the URL to Swagger/RAML documents, may call for deactivation and reactivation of impacted flows; in some cases, it would call for re-editing of impacted adapter endpoints.

The following security policies are supported in Oracle Integration Cloud:

Basic Authentication

HTTP Basic authentication is a simple authentication scheme built into the HTTP protocol. The client sends HTTP requests with an Authorization header that contains the word Basic followed by a space and the base64-encoded string username:password. In the REST Adapter, users should select the Basic Authentication security policy and provide the username and password. The REST Adapter ensures that the credentials are securely stored in a credentials store. During API invocation, the adapter will inject an HTTP header along with the request as follows:

Authorization: Basic <base64-encoded-value-of-credentials>

The username and password are not validated even if the test connection is successful.
Integration developers should validate the credentials before using them in this security policy.

API Key Based Authentication

To consume APIs protected with an API key, integration developers should use the API Key Based Authentication security policy. The REST Adapter provides an extensible interface for developers to declaratively define how the API key needs to be sent as part of the request. During API invocation, the adapter injects the API key as specified in the API Key Usage along with the request. The API key is not validated even if the test connection is successful; integration developers should validate the API key before using it in this security policy. Please see our detailed blog API-Key Based Authentication: Quickly and Easily for more details on this security policy.

OAuth Client Credentials

The client application directly obtains access on its own, without the resource owner's intervention, using its client ID and client secret. In the REST Adapter, users should select the OAuth Client Credentials security policy and provide the required information. Test connection uses the provided credentials to obtain an access token from the authorization server. This access token is securely cached internally and refreshed when required. The REST Adapter procures an access token using the values provided in the security policy, equivalent to:

curl -X POST '[YOUR_ACCESS_TOKEN_URI]' \
  -H 'Content-Type: [Auth Request Media Type]' \
  -H 'Accept: application/json' \
  -H 'Authorization: Basic {base64#[YOUR_CLIENT_ID]:[YOUR_CLIENT_SECRET]}' \
  -d 'grant_type=client_credentials&scope=[YOUR_SCOPE]'

During API invocation, the adapter injects the access token as an authorization header along with the request as follows:

Authorization: Bearer <access_token_obtained_using_oauth_client_credentials>

The access token is for a specific scope.
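The client-credentials token request described above can also be sketched as a small Node.js helper. The function name clientCredentialsRequest and the example values are illustrative; the adapter issues the equivalent request itself when the connection is tested.

```javascript
// Assemble an OAuth2 client-credentials token request (RFC 6749, section 4.4).
// Hypothetical helper for illustration; OIC performs this internally.
function clientCredentialsRequest(clientId, clientSecret, scope, tokenUri) {
  const basic = Buffer.from(`${clientId}:${clientSecret}`).toString('base64');
  return {
    url: tokenUri,
    method: 'POST',
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
      'Accept': 'application/json',
      'Authorization': `Basic ${basic}`,
    },
    // Form-encoded grant parameters, as token endpoints expect
    body: new URLSearchParams({ grant_type: 'client_credentials', scope: scope }).toString(),
  };
}

const req = clientCredentialsRequest('myId', 'mySecret', 'read',
  'https://auth.example.com/token');
console.log(req.body); // grant_type=client_credentials&scope=read
```

The authorization server's JSON response would then carry the access_token that the adapter caches and injects as the Bearer header.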
The integration developer must ensure that the connection is used to access resources within the same scope.

Caution: The OAuth2 specification deliberately leaves out the exact mechanism for client authentication. Access token attributes and the methods used to access protected resources are also beyond the scope of the specification. As a result, there are many implementations of OAuth2 that cannot be addressed using the standard policy. The Oracle REST Adapter provides a flexible OAuth Custom Two Legged security policy that can be used with any flavor of OAuth Client Credentials configuration. Please see our blog entry for details. We will explore this in a later section.

OAuth Resource Owner Password Credentials

The resource owner password credentials can be used directly as an authorization grant to obtain an access token. Since the resource owner shares its credentials with the client, this policy is used when there is a high degree of trust between the resource owner and the client. In the REST Adapter, users can select the OAuth Resource Owner Password Credentials security policy and provide the required information. Test connection uses the provided credentials to obtain an access token from the authorization server. This access token is securely cached internally and refreshed when required. The REST Adapter procures an access token using the values provided in the security policy, equivalent to:

curl -X POST '[YOUR_ACCESS_TOKEN_URI]' \
  -H 'Authorization: Basic {base64#[YOUR_CLIENT_ID]:[YOUR_CLIENT_SECRET]}' \
  -H 'Content-Type: [Auth Request Media Type]' \
  -d 'grant_type=password&scope=[YOUR_SCOPE]&username=[USER_NAME]&password=[PASSWORD]'

During API invocation, the adapter injects the access token as an authorization header along with the request as follows:

Authorization: Bearer <access_token_obtained_using_oauth_ropc>

The access token is for a specific scope.
The integration developer must ensure that the connection is used to access resources within the same scope.

Caution: The OAuth2 specification deliberately leaves out the exact mechanism for client authentication. Access token attributes and the methods used to access protected resources are also beyond the scope of the specification. As a result, there are many implementations of OAuth2 that cannot be addressed using the standard policy. The Oracle REST Adapter provides a flexible OAuth Custom Two Legged security policy that can be used with any flavor of OAuth Resource Owner Password Credentials configuration. Please see our blog entry for details. We will explore this in a later section.

OAuth Custom Two Legged Flow

The OAuth Custom Two Legged security policy gives Oracle Integration Cloud the flexibility to connect with a wide variety of OAuth-protected services, including services protected using the OAuth Client Credentials and OAuth Resource Owner Password Credentials flows. In the REST Adapter, users should select the OAuth Custom Two Legged Flow security policy and provide the required information. Test connection uses the provided credentials to obtain an access token from the authorization server. This access token is securely cached internally and refreshed when required. Access Token Usage provides an extensible interface for developers to declaratively define how the access token needs to be sent as part of the request. During API invocation, the adapter injects the access token as specified in the Access Token Usage along with the request. The access token is for a specific scope. The integration developer must ensure that the connection is used to access resources within the same scope. Please see our blog entry describing the OAuth Custom Two Legged policy in more detail.
OAuth Authorization Code Credential

The authorization code grant is an OAuth flow in which the resource owner must provide consent before an access token can be granted to the client application. In the REST Adapter, users should select the OAuth Authorization Code Credential security policy and provide the required information. Once configured, the integration developer can click Provide Consent, which redirects the user to the authorization URL, where the resource owner authenticates with the authorization server and provides consent to the client application. This concludes the OAuth flow successfully. The integration developer can then test, save, and complete the connection and use it in integration flows just like any other connection.

Test connection validates that the Provide Consent flow was successful and that an access token was obtained from the authorization server. This access token is securely cached internally and refreshed when required. During API invocation, the adapter injects the access token as an authorization header along with the request as follows:

Authorization: Bearer <access_token_obtained_using_code_authorization>

The access token is for a specific scope. The integration developer must ensure that the connection is used to access resources within the same scope.

Caution: The OAuth2 specification deliberately leaves out the exact mechanism for client authentication. Access token attributes and the methods used to access protected resources are also beyond the scope of the specification. As a result, there are many implementations of OAuth2 that cannot be addressed using the standard policy. The Oracle REST Adapter provides a flexible OAuth Custom Three Legged security policy. We will explore this in the next section.
OAuth Custom Three Legged Flow

The OAuth Custom Three Legged security policy gives Oracle Integration Cloud the flexibility to connect with a wide variety of OAuth2-protected services that use a code authorization flow. In the REST Adapter, users should select the OAuth Custom Three Legged Flow security policy and provide the required information. Once configured, the integration developer can click Provide Consent, which redirects the user to the authorization URL. The resource owner authenticates with the authorization server and provides consent to the client application. This concludes the OAuth flow successfully. The integration developer can then test, save, and complete the connection and use it in integration flows just like any other connection.

Test connection validates that the Provide Consent flow was successful and that an access token was obtained from the authorization server. Test connection also validates the refresh mechanism, if one is specified, by refreshing the access token. The refreshed access token is securely cached internally and refreshed again when required. Since the Provide Consent flow requires the resource owner's intervention, it is recommended that a refresh mechanism be specified so that access tokens can be refreshed at runtime without the resource owner's intervention.

Access Token Usage provides an extensible interface for developers to declaratively define how the access token needs to be sent as part of the request. During API invocation, the adapter injects the access token as specified in the Access Token Usage along with the request. The access token is for a specific scope. The integration developer must ensure that the connection is used to access resources within the same scope. We will follow up with a detailed entry describing the OAuth Custom Three Legged policy.
OAuth 1.0 One Legged Authentication

OAuth 1.0a (one-legged) enables a client to make authenticated HTTP requests to gain access to protected resources by using its credentials. The method is designed to include two sets of credentials with each request: one to identify the client, and another to identify the resource owner. Before a client can make authenticated requests on behalf of the resource owner, it must obtain a token authorized by the resource owner. Test connection only checks that the required values are provided. At runtime these credentials are used to generate a signed access token. Since authenticated tokens are meant for one-time use only, the generated tokens are not cached. During API invocation, the adapter injects the access token as an authorization header along with the request as follows:

Authorization: OAuth <generated_oauth1.0a_access_token>

The access token is for a specific scope. The integration developer must ensure that the connection is used to access resources within the same scope.

In today's connected world, where information is shared via APIs with external stakeholders and internal teams, security is a top concern. Most service providers secure access to their APIs using one of the mechanisms listed above. In this post, we reviewed how the Oracle REST Adapter can be used to securely consume these protected services. We will follow up with detailed accounts of most of the security policies listed above. In particular, the custom OAuth policies provide a flexible interface for working with a multitude of OAuth-protected services.



#OOW18 Executive Keynotes and Sessions You Won’t Want to Miss

With Oracle OpenWorld 2018 less than two weeks away, you are probably busy crafting an agenda to fit in all the sessions you want to see. We want to make sure your experience is tailored to perfection. In a couple of days, we will share our full list of integration sessions and highlight a few special events just for integration folks. In the meantime, let's start our planning with a bang by introducing you to some of the executive keynotes and sessions we are most excited about:

CLOUD PLATFORM & CLOUD INFRASTRUCTURE EXECUTIVE KEYNOTES AND SESSIONS

Cloud Platform Strategy and Roadmap (PKN5769) - Amit Zavery
Mon Oct 22, 9-9:45am | Yerba Buena Theater
In this session, learn about the strategy and vision for Oracle's comprehensive and autonomous PaaS solutions. See demonstrations of some of the new and autonomous capabilities built into Oracle Cloud Platform, including a trust fabric and data science platform. Hear how Oracle's application development, integration, systems management, and security solutions leverage artificial intelligence to drive cost savings and operational efficiency for hybrid and multi-cloud ecosystems.

Oracle Cloud: Modernize and Innovate on Your Journey to the Cloud (GEN1229) - Steve Daheb
Tue Oct 23, 12:30-1:15pm | Moscone West 2002
Companies today have three sometimes conflicting mandates: modernize, innovate, AND reduce costs. The right cloud platform can address all three, but migrating isn't always as easy as it sounds, because everyone's needs are unique and cookie-cutter approaches just don't work. Oracle Cloud Platform makes it possible to develop your own unique path to the cloud however you choose: SaaS, PaaS, or IaaS. Learn how Oracle Autonomous Cloud Platform Services automatically repair, secure, and drive themselves, allowing you to reduce cost and risk while delivering greater insights and innovation for your organization.
In this session learn from colleagues who found success building their own unique paths to the cloud.

Autonomous Platform for Big Data and Data Science (PKN3898) - Greg Pavlik
Tue Oct 23, 5:45-6:30pm | Yerba Buena Theater
Data science is the key to exploiting all your data. In this general session learn Oracle's strategy for data science: building, training, and deploying models to uncover the hidden value in your data. Topics covered include ingestion, management, and access to big data, the raw material for data science, and integration with autonomous PaaS services.

The Next Big Things for Oracle's Autonomous Cloud Platform (PKN5770) - Amit Zavery
Wed Oct 24, 11:15am-12pm | The Exchange @ Moscone South - The Arena
Attend this session to learn about cutting-edge solutions that Oracle is developing for its autonomous cloud platform. With pervasive machine learning embedded into all Oracle PaaS offerings, see the most exciting capabilities Oracle is developing, including speech-based analytics, trust fabric, automated application development (leveraging AR and VR), and digital assistants. Find out how Oracle is innovating to bring you transformational PaaS solutions that will enhance productivity, lower costs, and accelerate innovation across your enterprise.

You can start exploring App Integration and Data Integration sessions in the linked pages. We are also sharing #OOW18 updates on Twitter: App Integration and Data Integration. Make sure to follow us for all the most up-to-date information before, during, and after OpenWorld!


Timezone functionality in OIC Schedules

Time zone support is a very powerful feature when dealing with scheduled integrations. Users can now create schedules in their desired time zones.

Step-by-step guide

The following steps show how to set a preferred time zone and create a schedule in that time zone.

1. Select Preferences from the user menu. (Note: this is the Preferences entry under the OIC user menu, not the Oracle Cloud My Services Preferences.)

2. Select a time zone and click Save. Here, I selected "New York - Eastern Time (ET)" as in the snapshot below.

3. While creating a schedule for a scheduled integration, the "This schedule is effective:" section shows the From and Until dates in the selected time zone. There is also a new entry, "Time zone", which displays the selected time zone.

4. After successfully creating the schedule, the "Schedule and Future Runs" page shows the following changes: the new field "Schedule Time Zone" displays the time zone in which the schedule was created, all runs execute in this time zone, and future-run timestamps are shown in the time zone selected for the current session.

Note: Even if the user selects a different time zone in another browser session, the "Schedule Time Zone" field continues to display the time zone in which the schedule was created, and all future runs execute in that time zone.

Let's take an example. In the screenshots above, I created a schedule in New York - Eastern Time (ET). Now, in a new browser session, let's change the time zone from New York - Eastern Time (ET) to Los Angeles - Pacific Time (PT). "Schedule Time Zone" continues to show the time zone that was used when the schedule was created, while all other dates and times display in the current browser session's time zone. In the screenshot below, the current browser session time zone is Los Angeles - Pacific Time (PT) and the schedule was created in New York - Eastern Time (ET). Future Runs displays the runs in the current session time zone, which is the exact time corresponding to the Schedule Time Zone.


Using a Library in OIC

Introduction

A library is a file, or a collection of multiple files bundled in a JAR, that contains Javascript functions. A library is used within an integration and is executed by a Javascript engine on the server as part of an integration flow. This document describes the following:

Requirements that a Javascript function needs to meet to be used within an integration.
How to create a Javascript file, or a collection of Javascript files, suitable for creating a library.

1. Javascript function requirements

The following requirements must be met for a Javascript function to be registered and work correctly in OIC.

1.1 Function return values should be named

Consider this example:

function add(param1, param2) {
  return param1 + param2;
}

Even though the above is a perfectly valid Javascript function, it can't be registered as a library in OIC: without a named return value, the library metadata editor is unable to identify the parameters returned by the function, so it could not be used in the mapper for mapping downstream activities in an integration. OIC requires you to change the function and name the return parameter, as in this example:

function add(param1, param2) {
  var retValue = param1 + param2;
  return retValue;
}

In this case the return parameter is named retValue. This change lets the user map the return parameter to a downstream activity.

1.2 Inner functions

If your Javascript function defines another function within it, the inner function will not be identified by the library metadata editor in OIC, so no metadata will be created for it. When metadata is not created for a function, it can't be used in OIC. However, the inner function can still be used within the outer function.
function parseDate(d) {
  function foo(d) {
    if (typeof d === 'string') {
      return Date.parse(d.replace(/-/g, '/'));
    }
    if (!(d instanceof Array)) {
      throw new Error("parseDate: parameter must be arrays of strings");
    }
    var ret = [], k;
    for (k = 0; k < d.length; k++) {
      ret[k] = foo(d[k]);
    }
    return ret;
  }
  var retVal = foo(d);
  return retVal;
}

In the above example, foo() is defined within parseDate(), so the metadata UI editor ignores foo() and you will be able to configure only the outer function parseDate(). However, foo() is used within parseDate(), which is perfectly valid.

1.3 Long running functions

Javascript function execution should not exceed 1500 ms. If a function's execution, including its dependencies, exceeds this limit, the process is automatically killed by the server; log messages will indicate that the flow was killed because it exceeded 1500 ms.

1.4 Input parameter types

OIC currently supports String, Number, and Boolean input and return value types. If your Javascript function uses a different object type, such as an Array or a Date, the incoming parameter (which will be one of the supported types) must be converted to the type the function expects. Here is an example of how an input of type Array should be handled.

1.4.1 Array input type

Consider the following array processor example:

function myArrayProcessor(myArray) {
  var status = 'FAILED';
  for (var i = 0; i < myArray.length; i++) {
    // process myArray[i]
  }
  return status;
}

Because the supported parameter types don't include an Array type, this code has to be changed as in the following example:

function myArrayProcessor(myArray) {
  var status = 'FAILED';
  var data = myArray.replace(/'/g, '"');
  myArray = JSON.parse(data);
  for (var i = 0; i < myArray.length; i++) {
    // process myArray[i]
  }
  return status;
}

While configuring metadata for this function, mark the input type as String; when the function is used in an integration, the incoming string is parsed and converted to an Array within the function.

2.
Creating a library using a Javascript file or multiple files

2.1 Using a single Javascript file

Creating a library from a single Javascript file is straightforward. All dependencies are contained within the single file; if a function depends on another function, the dependent function must be present in the same file. When you register the library, it is enough to configure metadata only for the functions that need to be used within an integration. Other functions that are present only to satisfy dependencies need not be configured. Consider the following interdependent Javascript function example:

function funcOne(param1) {
  return param1;
}

function funcTwo(param2) {
  return param2;
}

function myFunc(param1, param2) {
  var ret = funcOne(param1) + funcTwo(param2);
  return ret;
}

In this example, funcOne() and funcTwo() are used by myFunc(). While configuring metadata for this library, it is enough to configure the myFunc() function so that it can be used within an integration.

2.2 Using multiple Javascript files

A library can also be created from multiple Javascript files. When multiple files are involved, bundle all the files into a JAR file and register the JAR file as a library. As an example, consider an EncodeAndDecode library in which encoding and decoding are done using a Base64 encode and decode scheme. The library consists of two files: Base64.js, which contains the Base64 encode and decode logic, and Encrypt.js, which contains wrapper functions that depend on the functions in Base64.js. The encode and decode wrapper functions are the ones used within integrations. Both files are contained in a JAR file, and the library is created from this JAR. To use the library, configure metadata for the encode and decode functions within Encrypt.js.

3. Debugging Javascript in a library

Over time, a Javascript library may grow in size and complexity.
A common approach to debugging a Javascript library is to test the Javascript in a browser and, once it works satisfactorily, roll the code into a library. This may not always work: browsers are regularly updated with newer execution engines that support the latest Javascript versions, and browser engines are generally more liberal about ignoring code errors, whereas the Javascript engine within OIC is stricter. To debug Javascript code, use an instance of CXConsole() as in the sample code below.

function add(param1, param2) {
  var console = new CXConsole();
  var retValue = param1 + param2;
  console.log("# retValue: ", retValue);
  console.warn("# retValue: ", retValue);
  return retValue;
}

The log messages written by the above code go into the server-diagnostic.log file.
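To make the two-file layout in section 2.2 concrete, here is a minimal sketch of what Base64.js and Encrypt.js could contain. The function names are illustrative, and this sketch uses Node's Buffer for brevity; a real OIC library would ship its own pure-Javascript Base64 implementation, since Buffer is a Node.js global rather than part of the OIC engine.

```javascript
// --- Base64.js: encode/decode logic (Buffer used here for brevity only) ---
function b64Encode(text) {
  return Buffer.from(text, 'utf8').toString('base64');
}
function b64Decode(encoded) {
  return Buffer.from(encoded, 'base64').toString('utf8');
}

// --- Encrypt.js: wrapper functions registered in the library metadata ---
function encode(input) {
  var encoded = b64Encode(input);
  return encoded; // named return value, per requirement 1.1
}
function decode(input) {
  var decoded = b64Decode(input);
  return decoded;
}

console.log(encode('hello'));    // aGVsbG8=
console.log(decode('aGVsbG8=')); // hello
```

Only encode() and decode() would be configured in the library metadata; b64Encode() and b64Decode() exist purely to satisfy the dependency, as described in section 2.1.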



Enabling the Future Today - Feature Flags in Oracle Integration Cloud

Enabling the Future Today

Last Updated: Friday 9 November 2018

Within Integration Cloud we are moving to a model that allows us to trial new features without making them available to everyone. Everyone runs the same codebase, but feature flags control what is available to a specific instance. Why would we do this? For multiple reasons:

Gain feedback on new features before rolling them out to the whole user base.
Test new features in "the wild" in a controlled manner.
Be able to roll back new features that may have unforeseen problems.

How It Works

Each new feature is given a flag that is used to control its availability. For instance, the flag for the small-footprint OIC agent was oic.adapters.connectivity-agent.light-weight-agent. If this flag was enabled for a given OIC instance, that instance could download the lightweight connectivity agent. Other OIC instances running the same code, but with the flag turned off, would not offer the new agent. Flags are controlled from a central system and can be updated in real time by Oracle development and operations. This means that feature flags can be turned on very quickly, and they can also be disabled if a problem occurs.

Feature Flag Lifecycle

Feature flags have a lifecycle, as illustrated below. The different stages are:

Internal Only
You may see a product manager demo features on an instance that are not currently available; on a production pod these may only be available to internal users. This is where we try things out internally before turning them on for any customers. Once we are happy with the feature internally, we are ready to share it with selected customers and move the feature to Feature Controlled. Note that this change in stage does not require any code changes; it just alters our internal approval process to enable the feature.

Feature Controlled
Once a feature enters the Feature Controlled stage, a customer may request that the flag be enabled for one or more of their OIC instances.
If approved, those instances will have the flag enabled, and the feature becomes available within a few minutes. Again there are no code changes to the customer instance, just the change in the flag status from disabled to enabled in the central feature flag server.

Feature Controlled General Availability
Once we are happy with the stability of a feature, we enable it for all instances. This again does not require a code change. We leave the flag in place so that if a specific customer has a problem, we can disable the feature just for them or roll it back. This is a safety measure in case problems occur that were not caught by internal users or early adopters of the feature.

General Availability
Eventually the flag controlling the feature is removed. This has no impact on the end user; it just allows us to keep the code paths clean and remove unused code made obsolete by the new feature. End users see no difference between this stage and the previous one, so it is mentioned here only to explain how we keep our codebase clean.

What Flags are Available?

The following flags are currently available in the Feature Controlled stage. We will be blogging about these features, and as we do, we will update the detailed explanation with a blog entry explaining each feature. As we add new features we will update this blog.

Feature Flag Name | Description | Detailed Explanation | Earliest Version
oic.adapter.connectivity-agent.ha | HA support for connectivity agent | The Power of High Availability Connectivity Agent | 18.3.1
oic.cloudadapter.adapter.erp.fileUpload | File upload functionality in ERP Adapter | | 18.3.5
oic.cloudadapter.adapter.hcm.fileUpload | File upload functionality in HCM Adapter | | 18.3.5
oic.cloudadapter.adapter.netsuite.customRecord | CRUD and search support for custom records in NetSuite | | 18.4.1
oic.cloudadapter.adapter.rest.awssignaturev4 | Amazon Signature Version 4 policy in REST Adapter | | 18.4.3
oic.cloudadapter.adapter.rightnow.mtom.upload | File upload as MTOM in RightNow | | 18.2.3
oic.cloudadapter.adapter.soap.dynamicInvocation | SOAP dynamic invocation | | 18.4.1
oic.cloudadapter.adapter.utilities.wsdlupload | Option to upload a WSDL for the inbound Utilities adapter | | 18.2.3
oic.cloudadapter.adapters.epm | Oracle Enterprise Performance Management Adapter (EPM Adapter) | | 18.2.3
oic.cloudadapter.adapters.otac | Oracle Talent Acquisition Cloud adapter | | 18.4.1
oic.ics.console.diagnostics.oracle-litmus-support | Litmus support for automated testing | How to use Litmus to create OIC Integration unit tests automatically and run them to catch regressions | 18.2.5
oic.ics.console.integration.generate_analysisjson | Generation of the analysis.json document | | 18.4.1
oic.ics.console.integration.invoke.local.integration | Integration calling integration activity | How to invoke an Integration From another Integration in OIC without creating a connection | 18.3.1
oic.ics.console.integration.layout | View integration as pseudo-style layout | See How Easily You Can Switch Your Integration Views | 18.3.1
oic.ics.console.integration.nested-try-scopes | Allow users to create nested scopes | A simple guide to use nested scope in orchestration | 18.2.5
oic.ics.console.integration.throw-action | Allow users to throw an error in an integration | Working with Create Error Activity | 18.2.5
oic.ics.console.schedule.parameter-override-support | Allow users to override schedule parameters | Overriding Schedule Parameters | 18.2.3
oic.ics.mapper.jetmap-enablement | New JET UI based mapper | New JET based OIC Mapper | 18.2.3
oic.ics.stagefile.reference.processing | Allow users to configure file references in stage file operations | | 18.3.3
oic.insight.consoles.instanceprogress | Support different display for the Insight instance details page | | 18.3.3

How to Request a Feature Flag

To request that a feature flag be enabled for one of your environments, raise a Service Request via My Oracle Support. Provide the following information in the SR:

Name of the feature flag that you want to be enabled.
URL of your OIC instance.
Content of the About box from your OIC instance, which should include the following:
  Version, e.g. 18.3.3.0.0 (180823.1018.14180)
  Database Schema Version, e.g. 18.08.16
  Service Instance, e.g. myinstance
  Identity Domain, e.g. idcs-xxxxxxxxxxxxxxx
  Service Type, e.g. Autonomous - 2
Justification: explain why you want the feature enabled, providing a use case.

Your request will then be submitted to a product manager for approval. Once approved, the request will be forwarded to enable the feature on your requested environment.

Caveats

Features are in controlled availability because they may still have some defects. Be aware that using feature-flag-controlled items ahead of general availability makes you an early adopter of new features, and although we do our best to ensure a smooth ride, you may experience some bumps.
Occasionally we may have to make changes to the functionality enabled by a feature flag before it becomes generally available; just something to be aware of. However, feature flags enable us to release new features to customers whose use cases will benefit from them before we are ready to make a feature generally available. We think this is good for both you, the customer, and us, Oracle. A win-win situation!

Previous Flags Now Generally Available

The following flags are no longer used, as the features they controlled are now available to all instances of Oracle Integration Cloud. Note that if you are using User Managed Oracle Integration Cloud, you may need to upgrade to the latest release to get these features.

Feature Flag Name | Description | Detailed Explanation | Earliest Version
oic.adapters.connectivity-agent.light-weight-agent | Lightweight connectivity agent | Managing the Agent Group and the On-Premises Connectivity Agent | 17.4.3
oic.adapters.hcm-cloud.atom-feed-support | Atom feed support for HCM Adapter | Subscribing to Atom Feeds in a Scheduled Integration | 18.1.3
oic.cloudadapter.adapter.aq.rawobjectqueues | Enables RAW and Object type queues for consuming and producing messages via the AQ Adapter in OIC | | 18.3.3
oic.cloudadapter.adapter.database.batchInsertUpdate | Oracle Database Adapter - Operation On Table - Insert and Update | | 18.2.3
oic.cloudadapter.adapter.database.batchSelect | Oracle Database Adapter - Operation On Table - Select and Merge | | 18.3.3
oic.cloudadapter.adapter.db2database.batchInsertUpdateSelectMerge | DB2 Database Adapter - Operation On Table - Insert, Update, Merge and Select | | 18.3.3
oic.cloudadapter.adapter.dbaasdatabase.batchInsertUpdateSelectMerge | Oracle DBaaS Adapter - Operation On Table - Insert, Update, Merge and Select | | 18.3.3
oic.cloudadapter.adapter.ebs.enableOpenInterface | Support for Oracle E-Business Suite Open Interface tables and views in the Oracle E-Business Suite adapter | | 18.4.1
oic.cloudadapter.adapter.hcm.dataExtract | Data extract for HCM Adapter | Configuring the Extract Bulk Data Option in an Integration | 18.2.3
oic.cloudadapter.adapter.mysqldatabase.batchInsertUpdateSelectMerge | MySQL Database Adapter - Operation On Table - Insert, Update, Merge and Select | | 18.3.3
oic.cloudadapter.adapter.rest.oauth10aPolicy | OAuth for REST Adapter | | 18.1.5
oic.cloudadapter.adapter.rightnow.fileDownload | RightNow (Service Cloud) Adapter file download feature | | 18.1.5
oic.cloudadapter.adapter.rightnow.noSchema | | | 18.1.5
oic.cloudadapter.adapter.rightnow.queryCSV.Validation | RightNow adapter QueryCSV validation | Specifying QueryCSV Statements when Configuring the Oracle RightNow Cloud Adapter as an Invoke | 18.1.5
oic.cloudadapter.adapter.soap.enableMtom | MTOM support for SOAP Adapter | | 17.4.5
oic.cloudadapter.adapter.sqlserverdatabase.batchInsertUpdateSelectMerge | SQL Server Database Adapter - Operation On Table - Insert, Update, Merge and Select | | 18.3.3
oic.cloudadapter.adapters.dbaasdatabase | Oracle DBaaS Adapter | | 18.2.3
oic.cloudadapter.adapters.oraclehcmtbe | Taleo Business Edition (TBE) Adapter | | 18.2.3
oic.cloudadapter.adapters.rest_opa | Oracle Policy Automation Adapter | | 18.2.3
oic.cloudadapter.adapters.uipathrpa | UiPath RPA Adapter | | 18.4.3
oic.common.clone_service | Allow cloning of an existing ICS or OIC instance | Docs for cloning an ICS instance into a new OIC instance; blog: How to Migrate from ICS to OIC | 18.2.3
oic.ics.console.connection.soap.uploadzip | Support ZIP file upload in SOAP Adapter | | 18.2.3
oic.ics.console.integration.inline-menu | Allow users to add actions/trigger/invoke inline from the canvas instead of drag and drop | | 18.3.1
oic.ics.mapper.encode-decode-on-files | Base64 encode/decode for files | | 18.1.3
oic.insight.jetui.production | JET UI for Insight | | 18.3.1

Enabling the Future Today

Last Updated: Friday 9 November 2018

Within Integration Cloud we are moving to a model that allows us to trial new features without making them available to everyone.


Overriding Schedule Parameters

A quick recap on schedule parameters

The Schedule Parameter feature supports adding scalar variables to Scheduled Orchestration Integrations. These parameter values are available across scheduled runs of a particular integration and can be overridden by downstream actions such as Assign. A maximum of five schedule parameters can be defined per integration. Requirements like the following can be met using schedule parameters:

Maintaining the last run time (position) of the scheduled integration to avoid duplicate processing of data.
Processing information for a specific directory/area/region.

As mentioned above, schedule parameters can be updated in the orchestration by using Assign. Please refer to the Oracle documentation for more details about this feature.

How to override schedule parameters

Today, if a user wants to invoke a Scheduled Integration (containing schedule parameters) with different parameter values, they need to deactivate the integration, configure a new default value, and activate it again. The schedule parameter override feature enables users to provide parameter values while invoking the integration without deactivating it. This feature is controlled by the feature flag oic.ics.console.schedule.parameter-override-support. Once the feature is enabled, a popup is displayed when the user clicks Submit Now or Start Schedule, for integrations that have schedule parameters defined. Users can view the Default and Current Value of each parameter and, if required, input a New Value to override the Current/Default values. The New Value field is optional. If no new value is specified, the Current Value is used for the next run. If the Current Value is also empty, the Default Value is used. If the integration updates these schedule parameter values using Assign, the updated value is saved and becomes the Current Value for the next run.
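The value-resolution rule described above (a New Value wins, then the saved Current Value, then the design-time Default Value) can be sketched as a tiny helper function. This is an illustrative model of the documented behavior, not OIC code:

```python
def resolve_schedule_param(new_value=None, current_value=None, default_value=None):
    """Pick the value used for the next run, per the precedence described above.

    A New Value entered in the popup wins; otherwise the Current Value
    (saved from earlier runs or Assign updates) is used; otherwise the
    Default Value from the integration's design applies.
    """
    if new_value not in (None, ""):
        return new_value
    if current_value not in (None, ""):
        return current_value
    return default_value

# No New Value entered, so the saved Current Value (e.g. a last-run
# timestamp updated by an Assign) carries forward:
next_run_value = resolve_schedule_param(
    new_value=None,
    current_value="2018-11-01T00:00:00",
    default_value="1970-01-01T00:00:00",
)
```

An Assign action that updates a parameter mid-flow effectively replaces the Current Value fed into this resolution on the next run.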
Schedule parameter values for Submit Now and Scheduled Runs

Typically, for a given integration, schedule parameter values for the Submit Now and Start Schedule use cases are not shared. So if a user defines new schedule parameter values as part of a Submit Now operation, they are saved and shown as the Current Value for all subsequent Submit Now operations; they do not impact the saved parameter values of Start Schedule. In certain use cases, it might be required to share schedule parameter values between a scheduled run and a Submit Now run. To enable sharing, the Run as part of schedule checkbox needs to be selected when doing a Submit Now operation. The image below shows the option. Note: the Run as part of schedule checkbox shows up only if a schedule is defined for the integration. Based on the checkbox selection, the appropriate Current Value and Default Value are displayed in the table.

Updating schedule parameter values of a scheduled Run after starting the schedule

Schedule parameter values of a scheduled run can be updated between runs. To do this, use the menu item Update Schedule Parameters on the Schedule and Future Runs page. This launches the schedule parameter popup and allows updating the values, which will be used for the next run. Note: if any run is in the running state, the update operation is not allowed. Updating parameter values is only allowed between runs, as mentioned above.


How to use Asserter to create OIC Integration unit tests automatically and run them to catch regressions

In this blog, I'd like to show you how easy it is to use Oracle Asserter, a new feature added to Oracle Integration Cloud, to create unit tests automatically with a few clicks and run those tests to catch regressions. Asserter supports the following use cases:

Enable Integration Cloud users to create unit tests automatically and play them back to catch regressions when they modify their integrations (typically when they enhance an already created integration before moving it to production).
Enable Integration Cloud QA to catch product regressions as part of a new release of Integration Cloud.
Send Oracle a recorded instance so that Oracle can play back the instance to reproduce an issue or a bug. This is difficult without Asserter because all the dependent endpoints and third-party adapters might not be available in-house to reproduce the issue. With Asserter, the endpoints are simulated and hence not needed to reproduce the issue.

Enabling Asserter

Let's assume that you have built an integration which runs as per your requirements and you have completed all your manual testing. Now you are ready to go to production. At this point, you might want to create an Asserter test and check it into your source repository, so that when you change that integration later, you can rely on the Asserter test to catch regressions. A regression in this case is an assertion failing because, for example, the response you're sending to the client has changed due to a bug introduced in a mapping. Enable Asserter with the following steps:

A feature flag has to be enabled in OIC to enable Oracle Asserter. To turn on the feature flag, open a Service Request with Oracle support.
Once the feature flag is enabled, log in as a developer. From the list of integrations displayed on the integrations page, click the inline menu for the integration and click Oracle Asserter -> Enable Asserter Recording.

You can also enable Asserter as part of activation.
Creating a test using Asserter

After Asserter is enabled, you can create a test for a given integration using the following steps:

Run your integration once.

That's it. Your Asserter test (also called a recording) is now created. To check the recording, go to Oracle Asserter -> Recordings and you can see the recording displayed. The last one created is displayed first. You can create up to 5 recordings for a specific integration. Note that a given integration can take multiple paths or branches depending on certain values in your input payload, so you might want to create multiple recordings by sending different input values. You can identify each recording using the Primary Identifier column.

Exporting and Importing a test

After a recording is created, you can export and import a test for a given integration using the following steps:

Log in as a developer. From the list of integrations displayed on the integrations page, click the inline menu for the integration and click Export.
Check the box that says Include Asserter Recordings.
To import a recording, click the button at the top right that says Import. Select the integration archive file (.iar) and check the box that says Include Asserter Recordings.

Playing back a recording

After a recording is created, you can play back a recording for a given integration using the following steps:

Go to Oracle Asserter -> Recordings and identify the recording from the list that you want to play back.
Click the playback button. You will see the following message displayed in the banner: "Successfully invoked playback for recording RecordName_7. Please refer Track Instances page to view the Asserter instance details."

At this point, the recording is played back asynchronously; follow the next section to check the status of the test run.

Checking the test status

After a given Asserter recording is played back, you can check the test status using the following steps:

Go to Monitoring -> Tracking.
Identify the Asserter instance from the list of runs. To differentiate the Asserter run, you can see Asserter Instance tagged at the top as displayed in the image. You will also see the test status displayed at the top.
To see the details of the assertion, click Oracle Asserter Result from the top right inline menu. This displays the golden input that was stored earlier as part of the recording, which is compared against the actual response. If there is a match, the test passes; otherwise it fails. Note that this is an XML comparison and not a string comparison, so prefix differences are ignored.

Using REST API to automate Asserter

Asserter supports a REST API so that you can automate some of the Asserter operations. The most commonly used REST operation is playing back an Asserter recording. For detailed information, check the Asserter user guide. Some of the operations supported via REST are:

Activate an integration with recording mode.
Enable/Disable Oracle Asserter recording mode.
Import (with and without recordings, both integration and package).
Export (with and without recordings).
Play back a recorded instance.
Update a recording.
Delete a recording.
Get Asserter recordings.
Submit recordings.
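As a sketch of what automating playback might look like, the snippet below builds the pieces of a Basic-auth playback call. The resource path, host, and credentials are placeholders of my own invention; consult the Asserter user guide for the actual endpoint names and payloads on your OIC version.

```python
# Hypothetical sketch only: the URL path below is illustrative, not the
# documented Asserter endpoint. Check the Asserter user guide for the real one.
import base64

def playback_request(host, recording_id, user, password):
    """Build the method, URL, and auth header for a playback call."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    # Illustrative path; the real resource name may differ per OIC version.
    url = f"{host}/ic/api/integration/v1/asserter/recordings/{recording_id}/playback"
    headers = {"Authorization": f"Basic {token}"}
    return "POST", url, headers

method, url, headers = playback_request(
    "https://myinstance.example.com",  # placeholder OIC instance URL
    "RecordName_7",                    # recording name from the banner message above
    "jdoe", "secret",                  # placeholder credentials
)
# A real client would now send this, e.g. requests.post(url, headers=headers)
```

Since playback is asynchronous, a real script would follow up by polling the Tracking/monitoring resources for the Asserter instance status rather than expecting a synchronous result.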


How to enable and use Tracing in less than 5 min

In this short blog, I'd like to show you how easy it is to enable tracing in OIC Integration and start tracing your integration flows. When tracing is enabled, OIC Integration prints detailed information before and after each action that is executed (optionally including the payload if needed). Hence, care should be taken to make sure that it is enabled only for debugging purposes and turned off before going to production.

Global Tracing

Let's assume that you have a requirement to enable or disable tracing for every integration you have created. You can use global tracing to accomplish this. Enable global tracing with the following steps:

Log in as an administrator.
Click Settings on the left side.
Click Trace on the left side.
Select Global Tracing On and click Save at the top right. Optionally, you can select Include Payload, which will additionally write the payload.

Integration Level Tracing

If your requirement is to enable tracing for one or more integrations and disable tracing for the rest, you can use integration-level tracing. Enable integration tracing with the following steps:

Log in as an administrator.
Click Settings on the left side.
Click Trace on the left side.
Select Integration Level and click Save at the top right.

After the above steps, tracing can be enabled for a specific integration in two ways:

When an integration is activated, select Tracing.
For an already activated integration, use the inline menu Tracing as shown in the image.

Checking the info added by Tracing

After tracing is enabled (Global or Integration), the information logged by tracing can be used to debug your integration. Check the tracing info using the following steps:

Run your integration once.
Go to Monitoring -> Tracking and select the instance.
Once the instance is opened (showing the run in green/red), select the inline menu at the top right and select Activity Stream.



Oracle Named a Leader in 2018 Gartner Magic Quadrant for Data Integration Tools

Oracle has been named a Leader in Gartner’s 2018 “Magic Quadrant for Data Integration Tools” report based on its ability to execute and completeness of vision. Oracle believes that this recognition is a testament to Oracle’s continued leadership and focus on its data integration solutions. The Magic Quadrant positions vendors within a particular quadrant based on their ability to execute and completeness of vision. According to Gartner’s research methodologies, “A Magic Quadrant provides a graphical competitive positioning of four types of technology providers, in markets where growth is high and provider differentiation is distinct: Leaders execute well against their current vision and are well positioned for tomorrow. Visionaries understand where the market is going or have a vision for changing market rules, but do not yet execute well. Niche Players focus successfully on a small segment, or are unfocused and do not out-innovate or outperform others. Challengers execute well today or may dominate a large segment, but do not demonstrate an understanding of market direction.” Gartner shares that, “the data integration tools market is composed of tools for rationalizing, reconciling, semantically interpreting and restructuring data between diverse architectural approaches, specifically to support data and analytics leaders in transforming data access and delivery in the enterprise.” The report adds, “This integration takes place in the enterprise and beyond the enterprise — across partners and third-party data sources and use cases — to meet the data consumption requirements of all applications and business processes.” Download the full 2018 Gartner “Magic Quadrant for Data Integration Tools” here. Oracle recently announced autonomous capabilities across its entire Oracle Cloud Platform portfolio, including application and data integration.
Autonomous capabilities include self-defining integrations that help customers rapidly automate business processes across different SaaS and on-premises applications, as well as self-defining data flows with automated data lake and data prep pipeline creation for ingesting data (streaming and batch).

A Few Reasons Why Oracle Data Integration Platform Cloud is Exciting

Oracle Data Integration Platform Cloud accelerates business transformation by modernizing technology platforms and helping companies adopt the cloud through a combination of machine learning, an open and unified data platform, prebuilt data and governance solutions, and autonomous features. Here are a few key features:

Unified data migration, transformation, governance and stream analytics – Oracle Data Integration Platform Cloud merges data replication, data transformation, data governance, and real-time streaming analytics into a single unified integration solution to shrink the time to complete end-to-end business data lifecycles.
Autonomous – Oracle Data Integration Platform Cloud is self-driving, self-securing, and self-repairing, providing recommendations and data insights, removing risks through machine-learning-assisted data governance, and automatically maintaining the platform by predicting and correcting for downtime and data drift.
Hybrid Integration – Oracle Data Integration Platform Cloud enables data access across on-premises, Oracle Cloud, and 3rd party cloud solutions so businesses have ubiquitous and real-time data access.
Integrated Data Lake and Data Warehouse Solutions – Oracle Data Integration Platform Cloud has solution-based “elevated” tasks that automate data lake and data warehouse creation and population to modernize customer analytics and decision-making platforms.

Discover DIPC for yourself by taking advantage of this limited time offer to start for free with Oracle Data Integration Platform Cloud. Check here to learn more about Oracle Data Integration Platform Cloud.
Gartner Magic Quadrant for Data Integration Tools, Mark A. Beyer, Eric Thoo, Ehtisham Zaidi, 19 July 2018. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.


Oracle Integration Day is Coming to a City near You

Are you able to innovate quickly in the new digital world? Are you looking for ways to integrate systems and data faster using a modern cloud integration platform? Is your organization able to achieve differentiation and disruption? Is your Data Integration architecture allowing you to meet your uptime, replication, and analytics/reporting needs?

Join Oracle product managers and application/data integration experts to hear about best practices for the design and development of application integrations, APIs, and data pipelines with Oracle’s Autonomous Integration Platform. Hear real-world stories about how Oracle customers are able to adopt new digital business models and accelerate innovation through integration of their cloud, SaaS, on-premises applications and databases, and Big Data systems. Learn about Oracle’s support for emerging trends such as Blockchain, Visual Application Development, and Self-Service Integration to deliver competitive advantage.

Tampa Integration Day

Oracle VP of Product Marketing, Vikas Anand, presenting his keynote at Integration Day at Oracle HQ in Redwood Shores, CA

With interactive sessions, deep-dive demos, and hands-on labs, Oracle Integration Day will help you to:

Understand Oracle's industry-leading use of Machine Learning/AI in its Autonomous Integration Platform and how it can significantly increase speed and improve delivery of IT projects
Quickly create integrations using Oracle’s simple but powerful Integration Platform as a Service (iPaaS)
Secure, manage, govern, and grow your APIs using Oracle API Platform Cloud Service
Understand how to leverage and integrate with Oracle’s new Blockchain Cloud Service for building new value chains and partner networks
Understand how Oracle’s Data Integration Platform Cloud (DIPC) can help derive business value from enterprise data; getting data to the right place at the right time reliably and ensuring high availability

Integration Day begins on August 8 in Tampa.
Register now to reserve your spot! Click the links to learn more about your local Integration Day:

September 19, 2018 – New York City
September 26, 2018 – Toronto
October 3, 2018 – Boston
December 5, 2018 – Chicago
January 23, 2019 – Atlanta
January 30, 2019 – Dallas
February 6, 2019 – Washington DC
February 20, 2019 – Santa Clara



New Whitepaper: EU GDPR as a Catalyst for Effective Data Governance and Monetizing Data Assets

The European Union (EU) General Data Protection Regulation (GDPR) was adopted on the 27th of April 2016 and comes into force on the 25th of May 2018. Although many of the principles of GDPR have been present in country-specific legislation for some time, there are a number of new requirements which impact any organization operating within the EU. As organizations implement changes to processes, organization, and technology as part of their GDPR compliance, they should consider how a broader Data Governance strategy can leverage their regulatory investment to offer opportunities to drive business value.

This paper reviews some of the Data Governance challenges associated with GDPR and considers how investment in GDPR Data Governance can be used for broader business benefit. It also reviews the part that Oracle’s data governance technologies can play in helping organizations address GDPR. The following Oracle products are discussed in this paper:

Oracle Enterprise Metadata Manager (OEMM) – metadata harvesting and data lineage
Oracle Enterprise Data Quality (EDQ) – for operational data policies and data cleansing
Oracle Data Integration Platform Cloud – Governance Edition (DIPC-GE) – for data movement, cloud-based data cleansing, and subscription-based data governance

Read the full whitepaper here.



Migration from Oracle BPM to Oracle Autonomous Integration Cloud — Streamlining Process Automation in the Cloud

By Andre Boaventura, Senior Manager of Product Management

In my last blog post, Migrating your Oracle BPM assets into Oracle Process Cloud Service (PCS), I described and demonstrated how to migrate modeling assets (essentially BPMN models) by leveraging the conversion framework that you can find in my repository on GitHub. As stated in that post, the major use case was to demonstrate how customers using Oracle BPM Composer for modeling purposes *ONLY* could streamline their migration from Oracle BPM to PCS. As noted earlier, I have seen many customers that use BPM for documentation purposes only, but, as you might be asking yourself, there are many others that have already developed many projects and processes on top of Oracle BPM not only for documentation purposes but for process automation, and who obviously want to move them to the cloud version of Oracle BPM (aka PCS), given the well-known benefits of cloud adoption such as lower costs, greater agility, improved responsiveness, and better resource utilization, among other technical and business drivers. Thus, with process automation in mind, asset migration from Oracle BPM to PCS becomes an even more serious matter, but the good news is that it is really possible. As the major goal of my posts is to share experiences with customers I have worked with in the field, the technique you will find below is no exception. This exercise came as a challenge from a specific customer that was running all their production processes on Oracle BPM for process automation, so it can also be applied to many others, since there is an increasing demand for this sort of migration given the high number of customers relying on Oracle BPM for process automation who, at the same time, want to bring their processes to the cloud as well.
Look at the video below for a quick introduction to the Oracle BPM path to the cloud. After this introduction, I guess you might be thinking about the new suite introduced a couple of months ago called Oracle Integration Cloud (aka OIC), which is a combination of PCS, ICS, and other known Oracle cloud services for integration such as Integration Insight, Visual Builder Cloud Service, and others. So the question that comes to mind is: should this migration framework work for OIC as well? The good news is that the answer is also *YES* for OIC. There will be some caveats that I will explain along the way: since OIC only supports the new Web Forms technology (not the old one based on Frevvo) that was introduced in PCS, the current forms technologies available in BPM (ADF or Basic Web Forms) won’t work within OIC; but even so, through my migration scripts you should be able to import other artifacts such as processes, indicators, integrations, etc. Throughout this post you will understand all the alternatives currently available. However, before digging deeper into the details of this migration technique, let me give you a brief introduction to the new Oracle cloud integration platform called OIC.

Oracle Integration Cloud (OIC) brings together all the capabilities of Application Integration, Process Automation, Visual Application Building, and Integration Analytics into a single unified cloud service. It brings real-time and batch-based integration, structured and unstructured processes, case management, stream analytics, and integration insight, allowing customers to serve all their end-to-end integration needs in one cohesive platform so that all users can build and deliver the capabilities needed to realize true digital business transformations. Billing is simplified with a single metric of OCPUs per hour (no more connection or user counting!).
Natively discover and invoke integration flows from processes in OIC and vice versa. Customers can configure and schedule patching according to their own schedule, and they can scale the database to accommodate their business’s retention policy. The Integration Cloud runtime can be scaled out to meet the most demanding customer volumes. Also, customers can now choose among the following alternatives for on-premises application connectivity: Connectivity Agent, API Platform, VPN, or Oracle OCI FastConnect.

OIC Key Features

SaaS and On-Premises Integration: Quickly connect to thousands of SaaS or on-premises applications seamlessly through 50+ native app adapters or technology adapters. Support for service orchestration and rich integration patterns for real-time and batch processing. (OIC Integration, formerly ICS)

Process Automation: Bring agility to your business with an easy, visual, low-code platform that simplifies day-to-day tasks by getting employees, customers, and partners the services they need to work anywhere, anytime, and on any device. Support for Dynamic Case Management. (OIC Process, formerly PCS)

Visual Application Design: Rapidly create and host engaging business applications with a visual development environment right from the comfort of your browser.

Integration Insight: Gives you the information you need, out of the box, with powerful dashboards that require no coding, configuration, or modification. Get up and running fast with a solution that delivers deep insight into your business. (Also available in SOA on-premises as an option and in the SOA Cloud Service SKU as Integration Analytics)

Stream Analytics: Stream processing for anomaly detection and reacting to Fast Data. Super-scalable with Apache Spark and Kafka.
Challenges

Now that you know that migration from Oracle BPM to either PCS or OIC is achievable by leveraging the migration framework I will be explaining in this article, and have had a brief introduction to the new integration platform in the Oracle cloud called OIC, let’s look at some of the challenges that we will need to overcome to make this migration happen. Getting back to my earlier post about BPM migration, where I highlighted some of the differences between what is supported in Oracle BPM and in PCS regarding the BPMN notation, you will notice that with process automation in the picture the scope is larger, since we must take into account the way the BPMN notation behaves while running the process itself. Additionally, we have other extensions such as Human Tasks, Script Tasks, Web Forms, Data Objects, Business Indicators, etc., so there is still a long way to go; let’s tackle it piece by piece.

The Scope of this Migration

As mentioned earlier, the goal of this blog is to share experiences and real cases from customers I have worked with in the field. For obvious reasons, I can’t share customer process information within this post, but I can use generic examples that cover all the findings I discovered during the customer migration in order to fully showcase how to do a smooth process migration from Oracle BPM to PCS. For the sake of understanding, let us use the following Oracle BPM project (Oracle Travel Approval) as the sample to demonstrate the migration in action.

Oracle BPM Travel Approval Sample Process

Note 1: This blog post leverages an Oracle BPM project sample with forms implemented in Web Forms (aka Frevvo forms).
If your Oracle BPM projects have forms implemented with Oracle ADF technology, you should still be able to import your Oracle BPM projects into Oracle PCS properly; however, you will be forced to rewrite all your forms from scratch to get your applications fully migrated, up and running in the cloud, since ADF is not supported in either PCS or OIC.

Note 2: The recommended forms technology for PCS (which is now part of Oracle Integration Cloud) is the new Web Forms technology. The previous one, a Frevvo heritage, called Basic Forms in PCS, is no longer supported in OIC, so please be advised to get your forms migrated to Web Forms in order to be able to upgrade to OIC, the latest Oracle cloud integration platform, as described in the introduction section earlier in this post.

In the process sample above, we can find some examples of activities or extensions that must be carefully handled by the conversion framework so that the BPM project can be imported and deployed on PCS smoothly. They are as follows:

Human Tasks
Script Tasks
Project Data Objects
Business Indicators
Initiator Tasks

1. Human Tasks

The following are some of the attributes that are currently available in Oracle BPM (maybe you can find more within your Oracle BPM project) but not supported in Oracle PCS:

hideCreator
excludeSaturdayAndSunday

If you don’t remove them from your task definition files, you will get the following errors while trying to deploy your application project on PCS:

… The errors are [2]: Element ‘hideCreator’ not expected. at line -1 at column -1. exception.fix: Make sure that the task definition conforms to the task definition XML schema definition. . — Please contact the administrator for more information…
The errors are [2]: Element ‘excludeSaturdayAndSunday’ not expected. at line -1 at column -1. exception.fix: Make sure that the task definition conforms to the task definition XML schema definition. .
— Please contact the administrator for more information.

2. Script Tasks

If you are creating Oracle BPM applications, it is very likely that you have already played with script tasks. In general, script tasks are a very powerful component in Oracle BPM that enables users to pass data objects through data associations. Although a user can use script tasks to implement Groovy scripts, most implementations out there use script tasks only for data association. Also, as you probably know, and as indicated in my previous post, script tasks are not available in PCS. So how can an Oracle BPM process be imported into PCS when this activity is not available in the component palette, resulting in a compilation error when trying to do so? Don’t worry, I have done the job for you, so you don’t have to get your hands dirty fixing this sort of thing. The trick is to convert every script task into something supported in PCS so the process application can be migrated smoothly. The approach I figured out to solve this problem was to convert all script tasks across all bpmn files into service task activities, then make those service tasks call any service that would allow us to keep the script task behaving as it did in Oracle BPM. A simple workaround was to call an internal PCS REST API (in my conversion script I am using getProcessDefinitions, but it could be any other one) only to be able to run the same data transformation as used in Oracle BPM. This is only valid for PCS, as you might have realized. As for OIC, the signature is slightly different (process-definitions). Please check out the REST APIs for Oracle Autonomous Integration Cloud Service here. Also, there is an alternative when running the conversion script, which is to disable all service task activities converted from scriptTask activities by passing the parameter disableScriptTasks to the conversion framework.
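The scriptTask-to-serviceTask rewrite described above is performed by the author's bash conversion framework. Purely as an illustration of the idea, here is a minimal Python sketch of my own that retags scriptTask elements in a BPMN file; the BPMN 2.0 model namespace is the standard one, while the sample document is invented:

```python
# Illustrative sketch (not the conversion framework): rewrite every
# bpmn:scriptTask in a BPMN XML document as a bpmn:serviceTask, keeping
# its attributes and children so it can later be pointed at a real service URL.
import xml.etree.ElementTree as ET

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"  # standard BPMN 2.0 model namespace

def convert_script_tasks(bpmn_xml: str) -> str:
    ET.register_namespace("bpmn", BPMN_NS)
    root = ET.fromstring(bpmn_xml)
    for task in root.iter(f"{{{BPMN_NS}}}scriptTask"):
        # Only the element name changes; ids, names, and data associations survive.
        task.tag = f"{{{BPMN_NS}}}serviceTask"
    return ET.tostring(root, encoding="unicode")

# Invented sample process with a single script task:
sample = (
    f'<bpmn:definitions xmlns:bpmn="{BPMN_NS}">'
    '<bpmn:process id="Process"><bpmn:scriptTask id="st1" name="SetData"/>'
    '</bpmn:process></bpmn:definitions>'
)
converted = convert_script_tasks(sample)
```

A sketch like this only covers script tasks used for data association; as the post notes, Groovy script bodies would still have to be reworked by hand.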
As such, after importing the application project into OIC/PCS, all these service tasks need to be manually set up to point to a valid, existing service URL in order to get the process properly validated and deployed. If you want to learn more about how it was done under the covers, please check out the following functions in my Linux bash script: addPCSRESTConnectionToProjectDefinition(), addPCSRESTConnectionToScriptTask(), and disableScriptTasks(). Migration of script tasks was only tested against those with no implementations. If you are leveraging script tasks for running Groovy scripts, then these steps will need to be converted on a manual basis, since you can’t run Groovy scripts in OIC/PCS. 3. Data Objects Unlike Oracle BPM, Oracle OIC/PCS no longer has the concept of Project Data Objects vs. Process Data Objects. There are only Data Objects, which work at the scope of a process and, as opposed to Project Data Objects, can’t be shared across different processes.   BPM Process and Project Data Objects   OIC/PCS Data Objects With that said, we first need to find out where the Project Data Objects are stored within a BPM project, and then move them to the right place within the process so that they are properly validated by the Oracle OIC/PCS process validation engine. So, having a closer look at the project files, you will notice that Project Data Objects are stored within projectInfo.xml (under the SOA folder), which is under the root folder. In my example this folder is called BpmProject, since it is always the folder bearing the process project name.
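To make the script task conversion concrete, here is a hedged sketch of the element rename at its heart. The BPMN snippet and the bpmn: prefix are assumptions for the demo; the real framework additionally injects the REST connection details handled by the functions listed above:

```shell
#!/bin/sh
# Sketch of the scriptTask -> serviceTask rewrite. Data associations
# attached to the task are left untouched, which is what preserves the
# original pass-through behavior.
set -e

# illustrative process file (a real one has far more content)
cat > Process.bpmn <<'EOF'
<bpmn:process id="Process">
  <bpmn:scriptTask id="ST1" name="SetDefaults"/>
</bpmn:process>
EOF

# rename the element across all bpmn files in the project folder
find . -maxdepth 1 -name '*.bpmn' -exec sed -i \
  -e 's/bpmn:scriptTask/bpmn:serviceTask/g' {} +

cat Process.bpmn
```

Each resulting service task still has to be pointed at a reachable service URL (or disabled via disableScriptTasks) before deployment, as described above.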
See the picture below for more details:   This is a sample Oracle BPM project export (.exp) file structure   By drilling down into projectInfo.xml, you will find the following content:   Project Data Objects section   Similarly, if you go to the process file (in my case Process.bpmn), you will find the following section:   Process.bpmn file dataObjects section   So, in order to make it OIC/PCS compatible, all project data objects must be within a process file (with the .bpmn extension). In my case, with a simple process called Process.bpmn, I will add all Project Data Objects from projectInfo.xml into this file, so at the end of this work Process.bpmn should look like the following:   Final Process.bpmn file after migration. It includes all Project Data Objects from the original Oracle BPM project, but now as process data objects. Please note that I had to find all references in the projectInfo.xml file to ns3 and replace them with bpmn before adding them to the target process file. Similarly, I had to replace all ns7 references with bpmnext. Also, you must figure out to which bpmn files you should add those former project data objects, given the way they are referenced across processes. 4. Business Indicators One of the most powerful features available in Oracle BPM and Oracle OIC/PCS is the concept of Process Analytics. Process Analytics enables you to obtain performance and workload metrics for the processes in your project. You can use these metrics to make decisions about your processes. Process analysts can monitor standard pre-defined metrics and process-specific user-defined metrics. Process developers can define process-specific metrics using Business Indicators. In Oracle BPM, Business Indicators can be bound to project data objects. Once bound, the BPMN service engine publishes the business indicator values to the process analytics stores when it runs the BPMN processes. However, as mentioned above, there is no such concept of Project Data Objects in OIC/PCS.
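Returning to the project data object move in step 3, the ns3/ns7 prefix rewrite can be sketched as follows. The XML fragment and file names are illustrative, not from a real export; only the prefix mapping comes from the procedure above:

```shell
#!/bin/sh
# Sketch of the ns3 -> bpmn and ns7 -> bpmnext prefix rewrite applied
# to project data objects copied out of projectInfo.xml before they
# are pasted into the target .bpmn file.
set -e

cat > fragment.xml <<'EOF'
<ns3:dataObject id="pdo1" name="orderTotal">
  <ns7:extensionElements/>
</ns3:dataObject>
EOF

# swap the export's namespace prefixes for the ones the .bpmn file uses
sed -i -e 's/ns3:/bpmn:/g' -e 's/ns7:/bpmnext:/g' fragment.xml
cat fragment.xml
```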
Please see the pictures below to visualize the differences between them:   Oracle OIC/PCS Indicators   Oracle BPM Business Indicators As you may have noticed, there is a small difference between BPM and OIC/PCS regarding business indicators, but as said earlier, my conversion script does the dirty job for you by converting Project Data Objects, which are supported only by Oracle BPM, into process data objects, which are then mapped to their respective process data objects in PCS. 5. Initiator Tasks The initiator task is one of the many human task interactive patterns in Oracle BPM. It is used to trigger a BPM process flow from the defined human task user interface. When you use an initiator task to initiate a BPM process, the process always starts with a none start event. The none start event does not trigger the process; rather, the human task initiator initiates it. The role associated with the swim lane defines the process participant, and that participant/assignee is the one the initiator task gets assigned to. If your Oracle BPM processes have initiator tasks, they will be converted to approval tasks in Oracle OIC/PCS. However, to keep the original behavior of an initiator task as used in Oracle BPM in either PCS or OIC (that is, the capability of triggering a BPM process flow from the defined human task user interface: an initiator task allows creating a BPM process instance through the BPM Workspace or any other UI), this should be changed either to a Form Start activity, with the form assigned to that activity, or to a submit task. I preferred to use the Form Start activity, since it simplifies the process by removing the human task and assigning the web form straight to the Form Start activity.
Although I haven’t added this as a feature to my migration script yet, you can simply follow the steps shown in the quick video below, performing them right after running a BPM migration and importing the migrated artifacts from Oracle BPM (.exp files) into Oracle OIC/PCS, similarly to what will be covered in the next section, “Getting Started with your Migration to the Cloud”. Getting started with your Migration to the Cloud After all the topics covered so far, which were essential for a deeper understanding of this migration procedure, the long-awaited time has finally come: you can get your hands dirty and see the migration in action. Let’s get down to work! Oracle BPM -> Oracle Autonomous Integration Cloud (OIC) Please watch the video below to find out how to migrate a process from Oracle BPM straight to OIC. TravelApplication is a sample application built on Oracle BPM, leveraging Web Forms (frevvo) as the forms technology. As this is not supported in OIC, note that one step done in the video is to remove the forms folder to allow the application to be properly imported into OIC. Then, as mentioned earlier, these forms will need to be created from scratch, but at least all other artifacts such as processes, business types, integrations, indicators, etc. were successfully imported into OIC. Oracle BPM -> Oracle Process Cloud Service (PCS) This is a video similar to the one above, but this time you can see an end-to-end migration from Oracle BPM to PCS, highlighting the following steps: export the process from Oracle BPM; migrate with my framework; import into PCS; validate the application; test & deploy; run the process from the Process Workspace.   Now it's time to see an actual migration in action Below you will find videos describing a real migration done for a customer by using the migration framework to migrate Oracle BPM processes to PCS, ICS and Integration Analytics.
Note that even though this process was done for a previous version of OIC (PCS & ICS), the majority of the procedures and techniques shared below are valid for OIC as well. Part 1 — BPM to OIC/PCS Migration This is the first step towards generating the first BPM project to be imported into PCS. It walks you through the original Oracle BPM application (BPM Composer and Studio) and shows how to create the first migratable project on Oracle OIC/PCS by leveraging the BPM Cloud Migration framework available at this GitHub repository. Although you will find references to PCS in this video, the same technique applies to OIC as well. Part 2 — OIC (formerly ICS), the Integration Layer This step demonstrates how to install and set up the OIC/ICS connectivity agent to be used by integrations that require access to the customer’s Oracle database tables. It also shows how to build an integration from scratch in OIC/ICS to access customer database tables and then expose them as REST services to be consumed by Oracle OIC/PCS. Although you will find references to ICS in this video, the same technique applies to OIC as well. Part 3 — PCS & ICS Integration This video demonstrates how to leverage the services created in OIC/ICS to replace those from the original process created with the Oracle DB adapter within a SOA composite. It also showcases how to link those OIC/ICS services to OIC/PCS service call activities and how to map inbound and outbound data, and shows the first deployable version to be tested and run on PCS. Although you will find references to both ICS & PCS in this video, the same technique applies to OIC as well. Part 4 — Integration Analytics This video guides you on how to create a Business Insight model with milestones and business metrics (measures and dimensions), assign them to their respective milestones, and finally expose those milestone APIs to be consumed by Oracle OIC/PCS.
Part 5 — OIC/PCS Business Insight Integration This video shows how to enable and link Business Insight within OIC/PCS and deploy the final version to be used and tested at run time. Although you will find references to PCS in this video, the same technique applies to OIC as well. PCS/ICS/Integration Analytics (now OIC) End-to-End Demo after a Successful Migration This video walks you through all the products described earlier, but from the run-time perspective. It starts by showing a process instance kicked off through a web form, then an approval in a mobile app, and integration with Content and Experience Cloud. It also guides you through all the default and custom dashboards created in Integration Analytics, as well as how to monitor integrations and track process integration instances in OIC/ICS. This is a comprehensive and seamlessly integrated demo that highlights how this Oracle PaaS service can bring more value and benefits to customers that have the same or a similar use case. Although you will find references to both ICS & PCS in the video below, the same technique applies to OIC as well. Stuff to Know Here is a summary of all available migration paths considering the current BPM offerings: Oracle BPM, PCS and OIC (the latest one), with their respective caveats/comments depending on the forms technology being used for your process automation. Oracle BPM -> Oracle PCS Web Forms: Migration will be smooth, since PCS supports the old web forms technology based on frevvo web forms. ADF: Although you can migrate processes and all other assets, you will have to rewrite your forms. The recommendation is to do so by leveraging the new Web Forms technology. BPM -> OIC Regardless of the forms technology currently used in your Oracle BPM project, the forms will have to be created from scratch if you need a straight migration to OIC. Oracle is working on a forms migration tool.
When available, it will allow you to migrate your old frevvo-based web forms to the latest web forms technology. As such, a reasonable approach (only applicable if you are using frevvo web forms) would be to migrate from BPM to PCS to keep the forms as they are, followed by a migration from PCS to OIC leveraging the new web forms migration tool expected to be available this calendar year. PCS -> OIC Migration is transparent if the forms were built with the new forms technology also available in PCS; otherwise you will need to rewrite the forms, or wait until the migration tool mentioned in the item above is available, which will allow forms built in PCS with the old forms technology (frevvo) to be properly migrated to OIC. Conclusion Hopefully this article has provided you with some guidance on how to perform a migration from Oracle BPM to our managed version in the cloud called OIC, and has also indicated some of the options currently available depending on the sort of technology employed during implementation with Oracle BPM. Below you will find some highlights of the key steps for an end-to-end migration. Disclaimer Although the procedure described in this blog post is the result of multiple successful engagements migrating customers from Oracle BPM to Oracle OIC/PCS, please be advised that it is not supported by Oracle and must be used at your own discretion. Please like this article if it has helped you understand the migration procedures from Oracle BPM to Oracle Autonomous Integration Cloud Service, and please leave a comment with your feedback. This is very important to help me keep writing new articles about this sort of content.

By Andre Boaventura, Senior Manager of Product Management


Where and How Blockchain can be a better option than the traditional centralized systems? — A straightforward answer for a very common question

By Andre Boaventura, Senior Manager of Product Management After having written this article explaining a variety of consensus mechanisms available for Blockchain today, I have received hundreds of comments and positive feedback from readers who were struggling to understand what makes cryptocurrencies and blockchain work securely in a decentralized way and who, after reading my article, clearly understood the foundation supporting Blockchain implementations (thanks to all my readers, by the way). However, there are others who are still struggling to understand the benefits of a blockchain vs. a traditional model-based system, and where blockchain can really be a “breakthrough” compared to what can be done and achieved by leveraging a centralized system model. Among the dozens of emails I have received asking the same sort of question (that is, where Blockchain can be better than the traditional systems we have been using for decades), I have decided to share one of my answers to one of these colleagues, since I noticed it could benefit others who may be asking themselves the same thing. The Question So, here is a sample question related to the concern mentioned above that I received from a colleague in the UK. Hi Andre I’ve read your comprehensive article and thanks for it being most useful and informative. I confess I may need to read it a second time to get a proper understanding of the intricate details and the “Proof of . . .” mechanisms and their pros and cons, but I think I understand the principle of Consensus Mechanisms. My fundamental question (and the reason for this reply to you) is “what are the real-world applications of a blockchain, and (playing devil’s advocate, if you’ll forgive me) why is it any more than an interesting computer science project?”.
I can easily see its application to virtual currencies like Bitcoin (whose critics would decry it as no more than a baseless speculation mechanism), but I’m struggling to imagine real-world examples of why I’d need a blockchain, or why it’s better, in some way or other, than what already exists. As an example — suppose I want to send £100 (or 100 Reais) to you. I contact my bank, tell them your banking details, and issue instructions to send the money: it arrives in your account, in due course, via the international banking system. Whilst I can imagine how the equivalent could be achieved via a blockchain in which you and I are both participants, why would doing so be any “better” than the existing mechanism? I speculate that your reply will include that it removes the need for the two banks acting as middlemen, but is that the only advantage? Or have I imagined a poor example? Thanks in advance for your expert reply.   The Answer Here was my answer to him (except for the comics, which I added while writing this article): First of all, thanks for taking the time to have a look at my article about the consensus mechanisms found in some of the most popular Blockchain implementations available today. Hopefully it has helped you and provided some insights into the key terminologies that are becoming more and more popular when talking about distributed ledger technologies. As I mentioned at the beginning of that article, people generally understand (and also associate) this technology with cryptocurrencies, which is natural indeed, since it was born of an undeniably ingenious invention to be used as the foundation of the most popular and valuable cryptocurrency we know today: Bitcoin. However, sometimes it is a bit hard (I agree with you) to get started with Blockchain and imagine something different from the traditional model, based on the very common example of money exchange between bank accounts.
Therefore, let me try to provide you with a few other benefits and examples of usage that Blockchain technology can bring that go beyond the original usage envisioned by Satoshi Nakamoto, the creator of Bitcoin. The first question one should be asking is: why is Blockchain so important, leading to everyone talking about it? Fundamentally, it cryptographically addresses the problem of shared trust: how can entities that don’t trust each other transact? Before answering this question, let us look at the current approaches used by financial services companies (the same ones used in your use case above) for solving this problem: 1) Trusted intermediary, e.g. Visa/MC, SWIFT, DTCC, EuroClear. Issues: cost, latency, single point of failure. Blockchain can remove the need for an intermediary and replace it with cryptographically secure protocols. 2) Separate records stored by all the different entities. Issues: reconciliation is costly and error-prone, does not scale, delayed settlement. Blockchain’s distributed ledger is a single source of truth, so no reconciliation is needed. In a nutshell, Blockchain as a decentralized, peer-to-peer network with no central or controlling authority means eliminating intermediaries, which results in reduced transaction costs and near real-time transaction execution. This is very different from what has been done by financial institutions, especially when transferring money overseas, which is exactly the use case you used to illustrate your question. Additionally, as a distributed-ledger-based technology, where all participants maintain a copy of the ledger, it eliminates the manual effort and delays caused by reconciliation, since data consistency is a key attribute of the distributed ledger. Often, data integration between systems of record (SORs) is driven by offline or batch reconciliation processes characterized by delays and manual exception handling.
As such, Blockchain can help by using cryptographically secured consensus protocols that assure validation and agreement by all relevant parties, as well as real-time replication of data to each participant’s copy of the ledger. So, getting back to the first question, Blockchain is important and different from what already exists because it enables distributed and autonomous marketplaces, reduces friction in business transactions and reconciliations, and securely maintains and shares decentralized records. These can be used for a variety of use cases, such as the provenance of products, documents, materials, etc., which, as you can see, are not necessarily related only to the financial perspective but can be applied to a countless number of use cases, which, by the way, I have put together below for your convenience. Potential Use Cases for Blockchain Here is a sample list of potential use cases (of course not limited only to the ones listed below) for which you are likely to achieve real benefits by leveraging the key blockchain capabilities highlighted above, such as a single source of truth, trusted transactions, an immutable ledger store and near real-time data sharing. Supply Chain Genealogy and traceability of parts, components, ingredients Maintenance parts tracking in multi-layered distribution Parts & maintenance tracking for aircraft & other regulated assets Farm-to-table food provenance Country of origin traceability Electronic compliance records Quality control records Tamper-proof IoT sensor data, non-repudiation of monitored activities Public Sector Government records (titles, birth certificates, licenses, etc.) sharing Customs (import/export licensing, excise taxes) Regulatory certifications (food, pharma, etc.)
Procurement/Acquisitions Citizen services, e.g., benefits, multi-agency programs Healthcare Electronic Health Records Service provider credential management Clinical trials Tamper-proof IoT sensor data, non-repudiation of monitored activities Anti-counterfeit track & trace for drugs Cold chain track & trace Integration with IoT devices monitoring health or equipment Telecom Roaming & interconnect billing 3rd-party service providers eSIM   Key Characteristics to Remember After reviewing the answer, we can summarize some of the key characteristics that make Blockchain unique, different, better and more innovative than a traditional centralized model-based system. Decentralized and Distributed (Ledger Storage & Integrity) Maintains a distributed ledger of facts and a history of the updates All participants see consistent data Distributed among participants Updates replicated across participants Authorized participants access data Irreversible and Immutable (Validated/Non-Repudiable Transactions) Each new block contains a hash of the previous block, creating the chain All records are encrypted, and only those authorized with the corresponding keys can view the data Records cannot be undetectably altered or deleted, only appended Consensus from a subset of nodes on new blocks/transactions The existence and validity of a record cannot be denied When consensus is reached under the network’s policies, transactions and their results are grouped into blocks, which are appended to the ledger with cryptographically secured hashes for immutability Near Real Time (Transactions verified and settled in minutes versus days) Parties interact directly, with no intermediaries Changes to the ledger are made by smart contracts (business logic) when triggered by transactions from external applications Participants execute smart contracts on the validating nodes (peers) and follow consensus protocols to verify results Blockchain is not a solution to all problems!
We used to say “there is an app for that”, but nowadays it seems there is a blockchain for everything. So, what is Blockchain good for? Besides the hype around Bitcoin and the people it made rich, what are the real applications? A lot of what we hear about the Blockchain is not what it is doing, but what it can do. In many ways, you can do with centralized systems what Blockchain promises to do, with one core difference: trust. That is the big advantage of Blockchain. However, as stated, blockchain is not a silver bullet for all use cases, so there are a few useful questions one should be asking to determine blockchain applicability. If your enterprise/customers can answer "YES" to the questions below, then it is likely that Blockchain can be a good fit. So, here are the questions to be used as a guideline for Blockchain selection (or not): Is my business process predominantly cross-departmental/cross-organizational? Are there cross-system discrepancies that slow down the business? Is there less than full trust among transacting parties? Does it involve intermediaries, possibly charging expensive fees, adding risk or delay? Does it require periodic offline (batch) reconciliations? Is there a need to improve traceability or audit trail? Do we need real-time visibility into the current state of a multi-party transaction? Can I improve a multi-party business process by automating certain steps in it? Although Blockchain technology is well suited to record certain kinds of information, traditional systems such as database systems are better suited for other kinds. It is crucial for every organization to understand what it wants from these different approaches, and gauge this against the strengths and vulnerabilities of each kind of solution before selecting one.   Conclusion Blockchain is surrounded by speculation, but it has great potential.
The tech is getting better and better, and there are solutions evolving for many of the problems I have mentioned above. What is imperative is to see whether a proposed Blockchain project is really a solution to the stated problems (i.e. they cannot be solved without using Blockchain), or is just trying to take advantage of the hype.



Oracle Named a Leader in 2018 Gartner Magic Quadrant for Enterprise Integration Platform as a Service for the Second Year in a Row

Oracle announced in a press release today that it has been named a Leader in Gartner’s 2018 “Magic Quadrant for Enterprise Integration Platform as a Service” report for the second consecutive year. Oracle believes the recognition is a testament to the continued momentum and growth of Oracle Cloud Platform over the past year.   As explained by Gartner, the Magic Quadrant positions vendors within a particular quadrant based on their ability to execute and completeness of vision, separating them into the following four categories: Leaders execute well against their current vision and are well positioned for tomorrow. Visionaries understand where the market is going or have a vision for changing market rules, but do not yet execute well. Niche Players focus successfully on a small segment, or are unfocused and do not out-innovate or outperform others. Challengers execute well today or may dominate a large segment, but do not demonstrate an understanding of market direction.   Gartner views integration platform as a service (iPaaS) as having the “capabilities to enable subscribers (aka "tenants") to implement data, application, API and process integration projects involving any combination of cloud-resident and on-premises endpoints.” The report adds, “This is achieved by developing, deploying, executing, managing and monitoring integration processes/flows that connect multiple endpoints so that they can work together.”   “GE leverages Oracle Integration Cloud to streamline commercial, fulfilment, operations and financial processes of our Digital unit across multiple systems and tools, while providing a seamless experience for our employees and customers,” said Kamil Litman, Vice President of Software Engineering, GE Digital.
“Our investment with Oracle has enabled us to significantly reduce time to market for new projects, and we look forward to the autonomous capabilities that Oracle plans to soon introduce.”   Download the full 2018 Gartner “Magic Quadrant for Enterprise Integration Platform as a Service” here.   Oracle recently announced autonomous capabilities across its entire Oracle Cloud Platform portfolio, including application and data integration. Autonomous capabilities include self-defining integrations that help customers rapidly automate business processes across different SaaS and on-premises applications, as well as self-defining data flows with automated data lake and data prep pipeline creation for ingesting data (streaming and batch).   Oracle also recently introduced Oracle Self-Service Integration, enabling business users to improve productivity and streamline daily tasks by connecting cloud applications to automate processes. Thousands of customers use Oracle Cloud Platform, including global enterprises, SMBs and ISVs, to build, test, and deploy modern applications and leverage the latest emerging technologies such as blockchain, artificial intelligence, machine learning and bots to deliver enhanced experiences.   A Few Reasons Why Oracle Autonomous Integration Cloud is Exciting    Oracle Autonomous Integration Cloud accelerates the path to digital transformation by eliminating barriers between business applications through a combination of machine learning, embedded best-practice guidance, and prebuilt application integration and process automation.  Here are a few key features: Pre-Integrated with Applications – A large library of pre-built integrations with Oracle and 3rd-party SaaS and on-premises applications through application adapters eliminates the slow and error-prone process of configuring and manually updating Web service and other styles of application integration.
Pre-Built Integration Flows – Instead of recreating the most commonly used integration flows, such as between sales applications (CRM) and configure, price, quoting (CPQ) applications, Oracle provides pre-built integration flows between applications spanning CX, ERP, HCM and more to take the guesswork out of integration.  Unified Process, Integration, and Analytics – Oracle Autonomous Integration Cloud merges the solution components of application integration, business process automation, and the associated analytics into a single seamlessly unified business integration solution to shrink the time to complete end-to-end business process lifecycles.   Autonomous – It is self-driving, self-securing, and self-repairing, providing recommendations and best next actions, removing security risks resulting from manual patching, and sensing application integration connectivity issues for corrective action.   Discover OAIC for yourself by taking advantage of this limited time offer to start for free with Oracle Autonomous Integration Cloud.   Check here for Oracle Autonomous Cloud Integration customer stories.   Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.    


A Practical Path to AI Podcast Series: Podcast #10 – Self-Service Integration Closes the Productivity Gap for AI Adoption

By Kellsey Ruppel, Principal Product Marketing Director   For the tenth podcast in our "Practical Path to AI" podcast series, I had a conversation with Daryl Eicher, Senior Product Marketing Director at Oracle. Throughout this series we have been covering how Artificial Intelligence (AI) is reshaping the business landscape and helping you better understand how to get on the path to AI adoption.   IDC forecasts that spending on AI and machine learning will grow from $8B in 2016 to $47B by 2020. Automation, integration, machine learning, and AI technologies are so pervasive that more than 68 percent of us trust and leverage these powerful technologies without knowing it. What began as consumer-centric selling and investing is entering the enterprise, and transformation leaders need to know how to employ these and related disruptive technologies to grow share of wallet and reach new markets in the attention economy. Self-service integration closes the gap between enterprise applications and the tsunami of productivity apps that threatens to overrun enterprise IT if left unmanaged. In this segment of our podcast series, Daryl and I discussed how self-service integration brings the productivity leap of thousands of innovative apps together with the governance that financial services, healthcare, public sector, manufacturing, and retail firms need to build their digital brands.   Wondering what else Daryl had to say? Be sure to catch the podcast “Self-Service Integration Closes the Productivity Gap for AI Adoption” to learn more. You can also listen to the other podcasts in the “A Practical Path to AI” series here!   Additionally, we invite you to attend the webcast, Introducing Oracle Self-Service Integration, on April 18th at 10:00am PT.
Vikas Anand, Oracle Vice President of Product Management, will discuss:

- Integration trends such as self-service, blockchain, and artificial intelligence
- The solutions available in Oracle Self-Service Integration Cloud Service

Register today!



Demystifying Blockchain and Consensus Mechanisms - Everything You Wanted to Know But Were Never Told

Article written by Andre Boaventura, Senior Manager of Product Management

You have likely heard many descriptions of what blockchain is, and most of them probably relate it somehow to money. That is no accident: popular technologies such as Bitcoin, Ethereum, Ripple, and many others in the cryptocurrency marketplace are built on DLT (Distributed Ledger Technology) as their core implementation foundation, which is the basis for trading cryptocurrencies and other assets in public and private markets. However, blockchain technology goes much further than cryptocurrencies. Today, blockchain is already part of many everyday B2B transactions, including those powered by enterprise applications such as ERP, supply chain, financial services, and healthcare systems, and the list goes on.

The blockchain is an undeniably ingenious invention, the brainchild of a person or group of people known by the pseudonym Satoshi Nakamoto. Since then, it has evolved into something greater, and the main question everyone is asking is: what is blockchain?

By definition, a blockchain is a continuously growing list of records, called blocks, which are linked and secured using cryptography. Each block typically contains a cryptographic hash of the previous block, a timestamp, and transaction data. By design, a blockchain is inherently resistant to modification of its data. It is "an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way". For use as a distributed ledger, a blockchain is typically managed by a peer-to-peer network collectively adhering to a protocol for validating new blocks.
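The hash-linking described above is easy to demonstrate. The sketch below (plain Python, illustrative only, and not any particular blockchain's real block format) builds a tiny three-block chain and shows why a retroactive edit is immediately detectable:

```python
import hashlib
import json
import time

def block_hash(block):
    """Deterministically hash a block's contents (SHA-256 over sorted JSON)."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(prev_hash, transactions):
    """Each block stores a timestamp, its data, and the previous block's hash."""
    return {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }

# Build a tiny three-block chain.
genesis = make_block("0" * 64, ["genesis"])
block2 = make_block(block_hash(genesis), ["Alice pays Bob 5"])
block3 = make_block(block_hash(block2), ["Bob pays Carol 2"])

# Tampering with an earlier block breaks every later link in the chain.
genesis["transactions"] = ["genesis", "Mallory pays Mallory 1000"]
print(block2["prev_hash"] == block_hash(genesis))  # False: tampering detected
```

Because each `prev_hash` commits to everything in the prior block, rewriting history means recomputing every subsequent block, which is exactly the collusion-of-the-majority barrier the article describes.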
Once recorded, the data in any given block cannot be altered retroactively without altering all subsequent blocks, which requires collusion of the network majority. Generally speaking, a blockchain network is a system for maintaining distributed ledgers of facts and the history of the ledgers' updates. This approach allows organizations that don't fully trust each other to agree on the updates submitted to a shared ledger by using peer-to-peer protocols rather than a central third party or a manual offline reconciliation process. Blockchain enables real-time transactions and securely shares tamper-proof data across a trusted business network.

From an access perspective, there are essentially two types of blockchain:

Permissionless

Basically, anyone can read the chain, anyone can make legitimate changes, and anyone can write a new block into the chain (as long as they follow the rules). Bitcoin is by far the most popular example of a successful public blockchain network. It is totally decentralized, and is also described as a 'censor-proof' blockchain. Bitcoin and other cryptocurrencies such as Ethereum currently secure their blockchains by requiring new entries to include a proof of work. Because of the way a public blockchain works, adding blocks cryptographically requires a resource-intensive mining process, and consensus models based on computationally expensive algorithms need the processing power of many nodes to ensure security. The great advantage of an open, permissionless (or public) blockchain network is that guarding against bad actors is not required and no access control is needed. This means that applications can be added to the network without the approval or trust of others, using the blockchain as a transport layer. For these reasons, it's also known by its widest description, a public blockchain. But this is obviously not the only way to build a blockchain.
Permissioned

Essentially a closed ecosystem where members are invited to join and keep a copy of the ledger (e.g. Hyperledger, R3 Corda). Permissioned blockchains use an access control layer to govern who has access to the network. In contrast to public blockchain networks, validators on private blockchain networks are vetted by the network owner. They do not rely on anonymous nodes to validate transactions, nor do they benefit from the network effect. Instead, they rely on a consensus protocol, like bitcoin's proof of work (the one we hear about most often), that does two basic things: it ensures that the next block in a blockchain is the one and only version of the truth, and it keeps powerful adversaries from derailing the system and successfully forking the chain. A consensus protocol comprises three basic steps:

- Endorsement: determine whether to accept or reject a transaction
- Ordering: sort all transactions within a time period into a sequence
- Validation: verify that endorsements satisfy policy and that the read set is valid

Permissioned networks can also go by the name of 'consortium' or 'hybrid' blockchains.

Blockchain consensus mechanisms

A blockchain is a decentralized peer-to-peer system with no central authority figure. While this creates a system that is devoid of corruption from a single source, it still creates a major problem: how are any decisions made? How does anything get done? Think of a normal centralized organization, where all the decisions are taken by the leader or a board of decision makers. This isn't possible in a blockchain because a blockchain has no "leader". For the blockchain to make decisions, the participants need to come to a consensus using "consensus mechanisms". So, how do these consensus mechanisms work, and why do we need them? What are some of the consensus mechanisms used in cryptocurrencies and in blockchain implementations such as Hyperledger?
All these questions will be answered shortly, but first let's understand how consensus works before talking about the available implementations. In simple terms, consensus is a dynamic way of reaching agreement in a group. While voting settles for a majority rule without any thought for the feelings and well-being of the minority, a consensus makes sure that an agreement is reached which benefits the entire group as a whole. A method by which consensus decision-making is achieved is called a "consensus mechanism". Now that we have defined what consensus is, let's look at the objectives of a consensus mechanism:

- Agreement seeking: a consensus mechanism should bring about as much agreement from the group as possible.
- Collaborative: all the participants should aim to work together to achieve a result that puts the best interest of the group first.
- Cooperative: the participants shouldn't put their own interests first, and should work as a team more than as individuals.
- Egalitarian: a group trying to achieve consensus should be as egalitarian as possible. This basically means that each and every vote has equal weight; one person's vote can't be more important than another's.
- Inclusive: as many people as possible should be involved in the consensus process. It shouldn't be like normal voting, where people don't really feel like voting because they believe their vote won't carry any weight in the long run.
- Participatory: the consensus mechanism should be such that everyone actively participates in the overall process.

Now that we have defined what consensus mechanisms are and what they should aim for, we need to consider the next question: which consensus mechanisms should be used for a blockchain network to keep its original characteristics, such as reliability, security, and availability?
We hear plenty of talk of how public blockchains are going to change the world, but to function on a global scale, a shared public ledger like Bitcoin needs a functional, efficient, and secure consensus algorithm. Before Bitcoin, there were many iterations of peer-to-peer decentralized currency systems, and they failed because they were unable to answer the biggest problem when it came to reaching a consensus. This problem is called the "Byzantine Generals Problem" (BGP).

Byzantine Generals Problem (BGP)

Imagine that several divisions of the Byzantine army are camped outside an enemy city, each division commanded by its own general. The generals can communicate with one another only by messenger. After observing the enemy, they must decide upon a common plan of action. However, some of the generals may be traitors, trying to prevent the loyal generals from reaching agreement. The generals must decide on when to attack the city, but they need a strong majority of their army to attack at the same time. The generals must have an algorithm to guarantee that: (a) all loyal generals decide upon the same plan of action, and (b) a small number of traitors cannot cause the loyal generals to adopt a bad plan. The loyal generals will all do what the algorithm says they should, but the traitors may do anything they wish. The algorithm must guarantee condition (a) regardless of what the traitors do, and the loyal generals should not only reach agreement but agree upon a reasonable plan.

The generals face two very distinct problems:

- The generals and their armies are very far apart, so centralized authority is impossible, which makes a coordinated attack very tough.
- The city has a huge army, and the only way the generals can win is if they all attack at once.
What these generals need is a consensus mechanism which can make sure that their army actually attacks as a unit despite all these setbacks. This has clear parallels to blockchain. The chain is a huge network; how can you possibly trust it? If you were sending someone 4 Bitcoin from your wallet, how would you know for sure that someone in the network isn't going to tamper with it and change 4 to 40 Bitcoins? This is where consensus mechanisms come to the rescue. Let's now go through a list of consensus mechanisms which can solve the Byzantine Generals Problem for some well-known blockchain networks such as Bitcoin, Ethereum, Ripple, Peercoin, and Hyperledger.

Proof of Work (PoW)

Bitcoin uses Proof of Work (PoW) to ensure blockchain security and consensus. "Proof of work", as its name implies, requires that the decentralized participants that validate blocks show that they have invested significant computing power in doing so. In bitcoin, validators (known as "miners") compete to process a block of transactions and add it to the blockchain. Miners compete to add the next block (a set of transactions) in the chain by racing to solve an extremely difficult cryptographic puzzle. They do this by churning through enough random guesses on their computers to come up with an answer within the parameters established by the bitcoin protocol. This process requires an immense amount of energy and computational power; the puzzles have been designed in a way that makes them hard and taxing on the system. Essentially, the puzzle that needs solving is to find a number that, when combined with the data in the block and passed through a hash function, produces a result that is within a certain range. This is much harder than it sounds. The main character in this game is called a "nonce", which is an abbreviation of "number used once". In the case of bitcoin, the nonce is an integer between 0 and 4,294,967,296. How do miners find this number?
By guessing at random. The hash function makes it impossible to predict what the output will be. So, miners guess the mystery number and apply the hash function to the combination of that guessed number and the data in the block. The resulting hash has to start with a pre-established number of zeroes. There's no way of knowing which number will work, because two consecutive integers will give wildly varying results. What's more, there may be several nonces that produce the desired result, or there may be none (in which case the miners keep trying, but with a different block configuration). When a miner solves the puzzle, they present their block to the network for verification; verifying whether the block belongs to the chain or not is an extremely simple process. The first to solve the puzzle wins the lottery. As a reward for his or her efforts, the miner receives newly minted bitcoins and a small transaction fee. The difficulty of the calculation (the required number of zeroes at the beginning of the hash string) is adjusted frequently, so that it takes on average about 10 minutes to process a block. Why 10 minutes? That is the amount of time that the bitcoin developers think is necessary for a steady and diminishing flow of new coins until the maximum number of 21 million is reached (expected some time in 2140). Yet, although a masterpiece in its own right, bitcoin's proof of work isn't quite perfect. Common criticisms include that it requires enormous amounts of computational energy, that it does not scale well (transaction confirmation takes about 10-60 minutes), and that the majority of mining is centralized in areas of the world where electricity is cheap, leading to an inefficient process because of the sheer amount of power and energy it eats up. That said, people and organizations that can afford faster and more powerful ASICs (application-specific integrated circuits) usually have a better chance of mining than the others.
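The guess-and-check loop, and the asymmetry between mining and verification, can be made concrete with a toy proof-of-work sketch in Python. A leading-zeroes hex prefix stands in for bitcoin's real target range, and the difficulty here is deliberately tiny so the search finishes instantly:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force search for a nonce such that SHA-256(data + nonce)
    starts with `difficulty` hex zeroes. Guessing is the only strategy."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int) -> bool:
    """Verification is a single hash: trivially cheap compared to mining."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = mine("block 42: Alice pays Bob 5", difficulty=4)
print(verify("block 42: Alice pays Bob 5", nonce, 4))  # True
```

Raising `difficulty` by one multiplies the expected number of guesses by 16, which is the same lever bitcoin's difficulty adjustment pulls to keep block times near 10 minutes.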
As a result, bitcoin isn't as decentralized as it wants to be. Theoretically speaking, big mining pools could simply team up with each other and launch a 51% attack on the bitcoin network, and those with significant financial resources have come to dominate the bitcoin mining space. Mining today is embodied by the emergence of enterprise-style, datacenter-hosted mining operations. Bitcoin creator Satoshi Nakamoto woke us up to the potential of the blockchain, but that doesn't mean we can't keep searching for faster, less centralized, and more energy-efficient consensus algorithms to carry us into the future. Other examples, such as Proof-of-Stake (PoS) and Proof-of-Activity, can be found below.

Proof-of-Stake (PoS)

The most common alternative to proof of work is proof of stake. In this type of consensus algorithm, instead of investing in expensive computer equipment in a race to mine blocks, a 'validator' invests in the coins of the system. Note the term validator: no coin creation (mining) exists in proof of stake. Instead, all the coins exist from day one, and validators (also called stakeholders, because they hold a stake in the system) are paid strictly in transaction fees. Systems that don't use proof of work are also often called virtual mining systems because they have no mining activity. The network selects an individual to approve new messages (that is to say, confirm the validity of new information submitted to the database) based on their proportional stake in the network. In other words, instead of individuals attempting to calculate a value in order to be chosen to establish a consensus point, the network itself runs a lottery to decide who will announce the results, and system participants are exclusively and automatically entered into that lottery in direct proportion to their total stake in the network.
As in the PoW system run by Bitcoin, the PoS system run by organizations such as Peercoin also provides an incentive to participate, which ensures the broadest possible network participation and therefore the most robust network security possible. In the Peercoin system, the chosen party is rewarded with new Peercoin in a process called 'minting' (rather than Bitcoin's 'mining'). As mentioned, proof of stake makes the entire mining process virtual and replaces miners with validators. Here is an outline of how the process works: the validators lock up some of their coins as stake. After that, they start validating blocks. That is, when they discover a block which they think can be added to the chain, they validate it by placing a bet on it. If the block gets appended, the validators get a reward proportionate to their bets. In proof of stake, your chance of being picked to create the next block depends on the fraction of coins in the system you own (or set aside for staking). A validator with 300 coins will be three times as likely to be chosen as someone with 100 coins. Once a validator creates a block, that block still needs to be committed to the blockchain. Different proof-of-stake systems vary in how they handle this: in some implementations every node in the system has to sign off on a block until a majority vote is reached, while in others a random group of signers is chosen. As you can see, the PoS protocol is a lot more resource-friendly than PoW. In PoW you NEED to waste a lot of resources to go along with the protocol; it is basically resource wastage for the sake of resource wastage.
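The stake-weighted lottery can be sketched in a few lines of Python. This is a toy model only: real PoS systems derive their randomness verifiably from the chain itself rather than from a local random generator, and the names here are invented for illustration:

```python
import random

def pick_validator(stakes):
    """Choose the next block's validator with probability proportional
    to each participant's locked-up stake."""
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return random.choices(validators, weights=weights, k=1)[0]

stakes = {"alice": 300, "bob": 100, "carol": 100}
picks = [pick_validator(stakes) for _ in range(10_000)]
# alice (300 coins) should be chosen roughly 3x as often as bob (100 coins)
print(picks.count("alice") / picks.count("bob"))
```

Over many rounds the selection ratio converges on the stake ratio, which is exactly the "three times as likely" property described above, with no hashing race and essentially no energy cost.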
Although PoS seems to be the most reasonable replacement for PoW, since it avoids the issues found in PoW (enormous amounts of computational energy, and less decentralization than intended, given that a few large pools together own more than 50% of the Bitcoin network), there is a very common problem that PoS needs to solve before it can be widely adopted in production blockchain implementations. Reviewing the way PoS works with regard to security, the common questions that arise are: what is to discourage a validator from creating two blocks and claiming two sets of transaction fees? And what is to discourage a signer from signing both of those blocks? This has been called the 'nothing-at-stake' problem: a participant with nothing to lose has no reason not to behave badly. In the burgeoning field of 'crypto-economics', blockchain engineers are exploring ways to tackle this and other problems. One answer is to require a validator to lock their currency in a type of virtual vault; if the validator tries to double sign or fork the system, those coins are slashed. This system, however, by rewarding those who are already most deeply involved in the network, inherently creates an increasingly centralized system, which is inimical to a truly robust network. Therefore proponents of PoS systems have put forward a number of modifications to help ensure that the base of their networks remains as broad (and therefore secure) as possible. Peercoin was the first coin to implement proof of stake. Ethereum currently relies on proof of work, but is planning a move to proof of stake in early 2018, addressing the 'nothing-at-stake' problem with a new approach called the Casper protocol. There is also a variation of this method called delegated proof-of-stake (DPoS).
This system works along the same lines as the PoS system, except that individuals choose an overarching entity to represent their portion of stake in the system. So imagine each individual decides whether entity 1, 2, or 3 (these could be, for example, computer servers, and are called 'delegate nodes' within a DPoS system) will 'represent' his or her individual stake in the system. This allows individuals with smaller stakes to team up to magnify their representation, thereby creating a mechanism to help balance out the power of large stakeholders. This comes, however, at the cost of greater network centralization. Bitshares is one company that employs a DPoS system.

Proof-of-Activity (PoA)

Proof of activity was created as an alternative incentive structure for bitcoin. It is a hybrid approach that combines both proof of work and proof of stake. In proof of activity, mining kicks off in a traditional proof-of-work fashion, with miners racing to solve a cryptographic puzzle. Depending on the implementation, blocks mined do not contain any transactions (they are more like templates), so the winning block will only contain a header and the miner's reward address. At this point, the system switches to proof of stake. Based on information in the header, a random group of validators is chosen to sign the new block. The more coins in the system a validator owns, the more likely he or she is to be chosen. The template becomes a full-fledged block as soon as all of the validators sign it. If some of the selected validators are not available to complete the block, then the next winning block is selected, a new group of validators is chosen, and so on, until a block receives the correct number of signatures. Fees are split between the miner and the validators who signed off on the block.
Criticisms of proof of activity are the same as for both proof of work (too much energy is required to mine blocks) and proof of stake (there is nothing to deter a validator from double signing). Decred is the only coin right now using a variation of proof of activity.

Practical Byzantine Fault Tolerance Algorithm (PBFT)

The Practical Byzantine Fault Tolerance algorithm (PBFT) was designed as a solution to the problem presented in the allegory described earlier under the Byzantine Generals Problem (BGP) section. To map the allegory to our purposes: the 'generals' in the story are the parties participating in the distributed network running the blockchain (database) in question. The messengers they send back and forth are the means of communication across the network on which the blockchain is running. The collective goal of the 'loyal generals' is to decide whether or not a piece of information submitted to the blockchain (database) is valid. A valid piece of information would be, in our allegory, a correct opportunity to decide in favor of attack. Loyal generals, for their part, are faithful blockchain participants, who are interested in ensuring the integrity of the blockchain (database) and therefore ensuring that only correct information is accepted. The treacherous generals, on the other hand, would be any party seeking to falsify information on the blockchain (the database). Their potential motives are myriad: it could be an individual seeking to spend a Bitcoin that she does not actually own, or another person who wants to get out of contractual obligations outlined in a smart contract he has already signed and submitted. Various computer scientists have outlined a number of potential solutions to the Byzantine Generals Problem from the allegory.
The practical Byzantine fault tolerance algorithm (PBFT), which is used to establish consensus in blockchain systems, is only one of those potential solutions. Three examples of blockchains that rely on PBFT for consensus are Hyperledger, Stellar, and Ripple. Very roughly, and without explaining the whole algorithm (which would take a multi-page research paper), what PBFT does is as follows: each 'general' maintains an internal state (ongoing specific information or status). When a 'general' receives a message, they use the message in conjunction with their internal state to run a computation or operation. This computation in turn tells that individual 'general' what to think about the message in question. Then, after reaching an individual decision about the new message, that 'general' shares the decision with all the other 'generals' in the system. A consensus decision is determined based on the total decisions submitted by all generals. Among other considerations, this method of establishing consensus requires less effort than the other methods described earlier. PBFT was initially devised for low-latency storage systems, something that could be applicable in digital asset-based platforms that don't require a large amount of throughput but do demand many transactions.

Hyperledger's approach to consensus

The Hyperledger project allows developers to create their own digital assets with a distributed ledger powered by nodes built on the principle of PBFT. The system could be used to digitally back a real asset (such as a house), create new coins, or form a fault-tolerant system of consensus. The idea behind Hyperledger's use of PBFT goes beyond asset-based systems: it takes the idea of an algorithm for consensus and uses it to distribute all sorts of technical solutions, not just the low-latency, high-speed file storage solution it was originally built to provide.
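The decide-then-broadcast round that PBFT builds on can be sketched as follows. This is a heavily simplified toy: real PBFT runs multi-phase pre-prepare/prepare/commit exchanges, and the per-node decision logic here (trusting a message's signature key) is invented purely for illustration:

```python
from collections import Counter

def node_decision(internal_state, message):
    """Each 'general' combines the incoming message with its own internal
    state to reach an individual accept/reject decision."""
    return "accept" if message["signature"] in internal_state["trusted_keys"] else "reject"

def consensus_round(nodes, message, faulty=frozenset()):
    """Every node decides independently and 'broadcasts' its vote; the
    network settles on a decision only with a greater-than-2/3 majority,
    tolerating a minority of faulty ('traitorous') nodes."""
    votes = []
    for i, state in enumerate(nodes):
        if i in faulty:
            votes.append("reject")  # a faulty node may vote arbitrarily
        else:
            votes.append(node_decision(state, message))
    decision, count = Counter(votes).most_common(1)[0]
    return decision if count * 3 > len(nodes) * 2 else "no-consensus"

nodes = [{"trusted_keys": {"k1", "k2"}} for _ in range(7)]
print(consensus_round(nodes, {"signature": "k1"}, faulty={0, 3}))  # accept
```

With 7 nodes and 2 traitors, the 5 honest accept votes clear the 2/3 threshold; with 3 traitors the round yields no consensus, mirroring PBFT's tolerance of fewer than one third faulty participants.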
This may be a good way to test the power of nodes that have no incentive mechanism to develop their strength. What happens without such rewards? Systems like Hyperledger aim to find out. With Byzantine fault tolerance, corruption problems are ideally contained: the other nodes can recognize that a node is misbehaving and stop responding to its messages. In distributed ledger technology, consensus has recently become synonymous with a specific algorithm, within a single function. However, consensus encompasses more than simply agreeing upon the order of transactions, and this differentiation is highlighted in Hyperledger Fabric through its fundamental role in the entire transaction flow, from proposal and endorsement to ordering, validation, and commitment. In a nutshell, consensus is defined as the full-circle verification of the correctness of a set of transactions comprising a block. In the Hyperledger implementation, consensus is ultimately achieved when the order and results of a block's transactions have met the explicit policy criteria checks. These checks and balances take place during the lifecycle of a transaction, and include the use of endorsement policies to dictate which specific members must endorse a certain transaction class, as well as system chaincodes to ensure that these policies are enforced and upheld. Prior to commitment, the peers employ these system chaincodes to make sure that enough endorsements are present and that they were derived from the appropriate entities. Moreover, a versioning check takes place during which the current state of the ledger is agreed or consented upon before any blocks containing transactions are appended to the ledger. This final check provides protection against double-spend operations and other threats that might compromise data integrity, and allows functions to be executed against non-static variables.
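As a rough illustration of the endorsement-policy idea, the sketch below checks whether enough of the right organizations have endorsed a transaction. The field names and policy shape are invented for the example and are not Fabric's actual data structures:

```python
def endorsements_satisfy_policy(endorsements, policy):
    """Return True if enough distinct organizations named in the policy
    have produced a validly signed endorsement. Illustrative only."""
    endorsing_orgs = {e["org"] for e in endorsements if e["valid_signature"]}
    required = policy["required_orgs"]
    return len(endorsing_orgs & required) >= policy["min_endorsements"]

policy = {"required_orgs": {"OrgA", "OrgB", "OrgC"}, "min_endorsements": 2}
endorsements = [
    {"org": "OrgA", "valid_signature": True},
    {"org": "OrgB", "valid_signature": True},
    {"org": "OrgD", "valid_signature": True},  # not named in the policy, ignored
]
print(endorsements_satisfy_policy(endorsements, policy))  # True
```

The same pattern generalizes to the "which specific members must endorse" rule above: the policy names the eligible organizations, and the peer simply counts valid endorsements from that set before letting the transaction proceed.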
Also, since Hyperledger Fabric requires all participants to be authenticated, owing to its permissioned nature, it can take advantage of this characteristic to govern certain levels of access control (e.g. this user can read the ledger but cannot exchange or transfer assets). It also benefits from this reliance on identity in that varying consensus algorithms (e.g. Byzantine or crash fault tolerant) can be implemented in place of the more compute-intensive proof-of-work and proof-of-stake varieties described earlier in this section. As a result, permissioned networks tend to provide higher transaction throughput rates and performance. In addition to the multitude of endorsement, validity, and versioning checks that take place, there are also ongoing identity verifications happening in all directions of the transaction flow. Access control lists are implemented on hierarchical layers of the network (from the ordering service down to channels), and payloads are repeatedly signed, verified, and authenticated as a transaction proposal passes through the different architectural components. To summarize, consensus is not merely limited to the agreed-upon order of a batch of transactions; rather, it is an overarching characterization that is achieved as a byproduct of the ongoing verifications that take place during a transaction's journey from proposal to commitment.

Conclusion about consensus mechanisms

While these systems for establishing consensus are currently the most dominant, the field is still wide open to innovation, both through variations of these implementations and through entirely new approaches. Some other examples are proof of burn, proof of capacity, and proof of elapsed time. As blockchain systems continue to gain in popularity, they will also continue to grow in scale and complexity.
Which of these consensus-building systems (if any) is best equipped to handle this ongoing expansion remains to be seen. Currently, companies choose the system for their product that best meets their (or their customers') needs for speed, efficiency, and security. It is important to note that these systems differ not only in the details of the formation of their respective consensus-building communities, but also, importantly, in how they would handle potential attacks. This is, in fact, one of the clearest distinguishing features between the consensus-building systems: the size of attack each one can comfortably withstand. If you've made it this far, then congratulations! There is still much more to explain about blockchain and Hyperledger, but at least now you have an idea of the broad outline of the genius of the programming and the concept. For the first time, we have a system that allows for convenient digital transfers in a decentralized, trust-free, and tamper-proof way. The sky is the limit for blockchain!



Great APIs need a plan!

We recently released API Platform Cloud Service 18.1.5, and with it we are introducing phase 1 of plans! To be fully transparent, we've had plans built into the service from day 1, but this marks a step in making the feature available.

What are plans, you may ask? Plans provide measured access to one or more APIs, serving as the foundation for monetization. Plans define limits at the subscriber level that stretch across APIs.

To explain this further, let's use the example of a rate limit. A rate limit controls the number of calls within a certain time period. The API Rate Limit protects a system by limiting the number of calls that may be made to a particular API, no matter who is calling the API. For example, if my back-end system can handle no more than 10,000 requests per second, I may set an API Rate Limit of 10,000 per second, which would apply to all callers. Another limit is the Application Rate Limit, which we can call the "fair share" limit. This stipulates that no one application can make more than a limited number of calls within a certain time period. For example, I may decide that no one application can make more than 1,000 calls per second. If I have 5 applications subscribed, then this means there can be a total of 5,000 requests per second.

Plans take this forward in a much richer way, in that I can set limits for the consumer. With plans, the API consumer subscribes to the plan, and APIs are entitled to that plan. When a consumer subscribes to the plan, the consumer gets access to all of the APIs entitled in the plan. We can now set a limit at the plan level. For example, a consumer may be limited to 100 calls per second. This limit would apply across all of the APIs.
This means that while the API can handle up to 10,000 calls/second, and any application can make up to 1,000 calls/second, the plan that the subscriber happens to be subscribed to sets a limit of 100 calls/second, so the limit for that subscriber is the lowest of the three. This limit also stripes across all APIs in the plan, meaning that calls are counted for the subscriber no matter how many APIs happen to be entitled in the plan. We can also, within the plan, set limits for specific APIs in that plan. This is not to be confused with the API Rate Limit policy; rather, it is a plan limit applied specifically to that API as entitled in the plan. Plans allow us to create consumer groups where we can control access based on the subscription rather than just the API itself. This provides the foundation for monetization, where you can segment consumers based on the plan to which they are entitled. This is just the beginning, as we will be bringing more monetization features, but this already provides great value for enterprises that want to define limits across groups. To learn more about API Plans, visit Managing Plans in our documentation!
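The interplay of the three limit layers can be illustrated with a toy per-second counter. This is a sketch only; the class and field names are invented for the example and do not reflect the product's actual implementation:

```python
from collections import defaultdict

class PlanEnforcer:
    """Toy model of the three limit layers described above. Counters cover
    a single one-second window; names and defaults are illustrative only."""

    def __init__(self, api_limit=10_000, app_limit=1_000, plan_limit=100):
        self.limits = {"api": api_limit, "app": app_limit, "plan": plan_limit}
        self.counts = defaultdict(int)

    def allow(self, api, app, subscriber):
        # The plan limit counts the subscriber's calls across *all* APIs in
        # the plan, so it is keyed by subscriber alone.
        keys = {"api": ("api", api), "app": ("app", app), "plan": ("plan", subscriber)}
        if any(self.counts[key] >= self.limits[layer] for layer, key in keys.items()):
            return False  # the lowest applicable limit wins
        for key in keys.values():
            self.counts[key] += 1
        return True

enforcer = PlanEnforcer(plan_limit=2)
results = [enforcer.allow("orders", "app1", "sub1") for _ in range(3)]
print(results)  # [True, True, False] -- the plan limit of 2 bites first
```

Because every call must pass all three checks, the effective ceiling for any subscriber is the minimum of the API, application, and plan limits, which is exactly the 100 calls/second outcome in the example above.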


Integration

Introducing Oracle Self-Service Integration Cloud Service

One of the most exciting innovations in integration over the last decade is arriving just in time to address the surge of productivity apps that need to be integrated into the enterprise. On average, businesses use approximately 2,300 SaaS apps that need to be integrated. Line of business (LOB) users such as marketing campaign managers and sales managers want to perform quick, simple, self-service integration of these apps themselves, without IT involvement. Oracle Self-Service Integration Cloud Service (SSI) provides the right tools for anyone who wants to connect productivity apps such as Slack or Eventbrite into their business. For example, perhaps you are a marketing campaign manager and want to receive an alert each time a new digital asset is ready for your campaign. Or you are a customer support representative trying to automate the deployment of survey links when an incident is closed. Or you are a sales manager who wants to feed your event attendees and survey respondents into your CRM. SSI has the tools to address all these needs and more.

Oracle Self-Service Integration solves these business challenges by:

Connecting productivity with enterprise apps - Addressing the rapid growth of social and productivity apps that need to be integrated with enterprise apps.
Enabling self-service integration - Giving line of business (LOB) users the ability to connect applications themselves, with no coding, to automate repetitive tasks.
Recipe-based integration - Making it easier to work faster and smarter with modern cloud apps through an easy-to-use interface, a library of cloud application connectors, and ready-to-use recipes.
SSI increases productivity by bringing together collaborative applications such as Slack with traditional enterprise applications. It reduces IT workloads by letting IT deliver the initial set-up and any required advanced integration, then offloading basic integration updates to LOB users, and it delivers faster integration and integration updates. There is no training required to use SSI: simply activate ready-to-run recipes, and the events you add will automatically trigger the integration flow.

To learn more, attend this webcast on April 18, 2018 at 10am PT/1pm ET to hear from Vikas Anand, Oracle Vice President of Product Management, as he discusses:

Integration trends such as self-service, blockchain, and artificial intelligence
The solutions available in Oracle Self-Service Integration Cloud Service
The journey to a frictionless enterprise

Register here. For a comprehensive overview of Oracle Self-Service Integration Cloud Service, take a look at our SSI ebook: Make Your Cloud Work for You.


Visit COLLABORATE 18 to Hear the Latest on Oracle Cloud

Post by Wincy Ip, Oracle

As Oracle's Cloud solutions continue to expand across SaaS, IaaS, and PaaS, customers are eagerly evaluating how these offerings can help transform how they run their businesses. Whether users are looking to modernize their business and optimize with new cloud investments, integrate and extend an existing hybrid environment with on-premise systems, or build a personalized path to the cloud, COLLABORATE is the annual Oracle user conference where attendees can learn how to accelerate business innovation and digital transformation with Oracle Cloud.

In this year's program at COLLABORATE, nearly 50% of the 1,200+ sessions will focus on cloud, developer, and emerging technologies to complement Oracle's on-premise solutions. Here's a preview of some of the education available at COLLABORATE.

In the Oracle keynote session on Monday, April 23 at 2:30 p.m., Steve Daheb, Senior Vice President for Oracle Cloud, will illuminate how the Oracle Cloud Platform makes it possible for organizations to develop their own unique path to cloud from wherever they choose (SaaS, PaaS, or IaaS) and share how organizations have designed their unique journeys.

With the introduction of the world's first autonomous database, COLLABORATE attendees will also hear about the exciting developments and get a sneak peek at Oracle Autonomous Database Cloud, and learn how Oracle is integrating AI and machine learning into its suite of cloud services to make them fully autonomous and cognitive. These sessions will explore how organizations, from business users to app developers to DBAs, can benefit from more autonomy in their software.

Additionally, more than 500 sessions span Oracle's SaaS, IaaS, and PaaS solutions, where attendees can learn how Oracle's cloud offerings can accelerate business transformation, increase agility, and optimize security with their existing solutions.
Some of these sessions include:

Your Journey to Cloud with Choice and Control [Session ID: 109730]
Move Your Oracle Workloads to Oracle Cloud: No Pain, Lots of Gain [Session ID: 109430]
Oracle Cloud Infrastructure - The Best of On-Premises and Cloud in a Single Infrastructure Solution [Session ID: 112020]
Advanced Architectures for Deploying Oracle Applications on Oracle Cloud Infrastructure [Session ID: 107820]
Extend and Enhance ERP and Supply Chain with Oracle Cloud Platform [Session ID: 104320]
Bitcoin Tech: How Blockchain Helps Extend Boundaries for Enterprise Applications and SaaS [Session ID: 104340]
The Next Big Things: AI, Machine Learning, Chatbots, IOT, and Blockchain [Session ID: 110080]

COLLABORATE is the largest annual technology and applications forum for the Oracle user community in North America. Taking place April 22-26 in Las Vegas, Nevada, and hosted by three Oracle user groups (IOUG, OAUG, and Quest International Users Group), the five-day conference will host more than 5,000 attendees in keynotes, sessions, workshops, networking events, and an exhibitor showcase with 200+ vendors.

See what COLLABORATE 18 has to offer. You can also review the complete agenda and search by keyword, education track, product line, or business goal. Register at attendcollaborate.com by April 18 and save up to 25% off the onsite registration price.


All Nippon Airways Ensures Customer Satisfaction Using Oracle API Platform Cloud

All Nippon Airways is a long-time Oracle Middleware customer, and they have seen how Oracle Integration and API Management can serve them well into the future. The Japanese airline operates over 260 aircraft for more than 50 million flyers each year. Delivering top-notch customer service is core to the company's DNA. They pride themselves on the high standards behind their safety record and 88.88% on-time performance, and the industry accolades they collect rank up there as well. Their commitment to excellence extends not only to their customers, but to their partners: as a member of the Star Alliance, ANA is committed to delivering the same level of Japanese quality to the other 27 members of the alliance.

Providing seamless access to data can present a technical challenge, however. When initially confronted with the task of providing their flight data and other internal operating information to their partners, the company faced both technical and time limitations. Partners needed to access highly regulated flight status and booking information, and expected to present it in their own form and fashion. ANA ultimately realized that this information was best exposed through APIs. Like many companies entering the digital economy, All Nippon Airways decided that investing in an API strategy was the best decision for them, both because of the innate scalability APIs provide and because they expect the tooling to deliver:

• Growth in top-line revenue
• Enhanced customer engagement
• Omni-channel availability
• Continuous innovation & automation
• Modernized backend interfaces

Read more: Why do teams need API Management? So they can build great products.

Thankfully, in times of high demand and fluctuation, when ANA's customers needed them most, the Oracle API Platform was able to ensure that their systems could flex to meet demand.
In the most trying circumstances, All Nippon Airways focuses on keeping their passengers - and partners - happy, safe, and up to date. Learn more - watch the video here.


The Power of Process Automation: The Linchpin to Digitization and AI Adoption

“How we respond to the opportunities and challenges of the outside world now determines how much the outside world values us.” - Seth Godin, Linchpin: Are You Indispensable?

The Future of Digital Workforce Automation

I think about the future of work in the intelligent enterprise a lot. I have two brilliant kids just entering the workforce and a bright three-year-old. I'm always amazed by their creativity, resilience, and engagement in the mashed-up real and virtual worlds they were born into. Disruptive technologies such as process automation with RPA and conversational AI offer the hope that my kids will never have to cut and paste across devices, apps, and reports. Less mechanics, more design, faster time to market. That's what scales. Orchestrating dynamic interactions among employees, intelligent bots, and robots is the quickest way to build your digital workforce. In this segment of our podcast series, we discuss how process automation paves the way for AI optimization while providing the governance that financial services, healthcare, public sector, manufacturing, and retail firms need to grow new revenue streams. Check out the 10-minute podcast here.

AI and the Enterprise

IDC forecasts that spending on AI and machine learning will grow from $8B in 2016 to $47B by 2021. Automation, pervasive integration, machine learning, and AI technologies are so ubiquitous that more than 68 percent of us trust and leverage these powerful technologies without knowing it. What began as consumer-focused selling and investing is entering the enterprise, and transformation leaders need to know how to employ these and related disruptive technologies to grow share of wallet and reach new markets in the attention economy. Process automation and API-first design thinking pave the way for optimizing customer and employee engagement with conversational AI and best next action recommendations.
Robotic Process Automation is a hot topic because RPA fills a long-standing gap that has slowed transformation: the ability to "mimic" human operators for repetitive tasks. This opens new possibilities for immediate productivity gains while contributing to governance. RPA robots are easy to train, don't require changes to underlying legacy systems, and execute flawlessly.

Speed to Revenue with Enterprise-Class Governance

Process automation and AI offer significant productivity and time-to-market advantages for companies of all sizes. RPA is a subset of process automation that focuses on repetitive human tasks. It provides "record and playback" to mimic the exact steps employees perform to interact with systems that may not have modern APIs. This gives transformation teams the ability to digitize end-to-end processes such as hire to retire, inquiry to order, or application to funding for new home loans. RPA robots can be easily trained to log into a system, enter data, copy and paste between SaaS and on-premises applications, and commit transactions exactly as human operators do. This frees employees to engage with customers by taking the friction out of their business conversations. AFG's IT Manager of Home Loan Business Services, Andrew McGee, spoke to me recently about how process automation has impacted his business. He's seen his capacity to innovate triple, time to market reduced by 4X, and cost of ownership drop 45% since beginning his transformation journey with Oracle's SaaS and cloud platform. For AFG, taming legacy complexity is a mantra, and process automation with RPA offers quick wins that don't require changes to the underlying systems of record.

Robotic Process Automation and AI

It's important to think of RPA and AI as peanut butter and chocolate: both great, and even better together, but not the same thing. It helps me to think of these not as just more technology but as digital workers.
Some of these digital workers are efficient but rigid, while others are more adaptive and need much more context to be effective. The global volatility attributed to automated trading shows the limits of RPA. It's not a weakness that RPA robots do exactly what they're trained to do quickly and accurately, just as it's not a weakness that stock buy and sell triggers work fast and independently to automate trades. That's great for productivity and governance. But RPA robots need to be orchestrated and managed to ensure that intended business outcomes are consistently achieved in day-to-day operations. That's where API-first design, adaptive case management, and pervasive integration come in.

AIs are digital natives that can be employed to optimize automated processes and make what seems like magic look easy. It may sound like science fiction, but the Japanese government is banking on AI to address its near-term home healthcare challenges. The world is watching SpaceX transform the cost of transporting heavy equipment with reusable, AI-controlled, autonomous booster rockets that can safely land themselves. AIs are obsessed with learning, and they need real-time data - lots of it - to be helpful and relevant in enterprise applications.

Top 2 Ways AI Simplifies Enterprise Apps

The two big application patterns AIs contribute to are conversational interactions and best next action recommendations. We experience the power of these patterns working together every time we ask Google for directions on our smartphones. That feat requires streaming data and the digital processes needed to convert raw data into personalized recommendations in time for your next turn in rush-hour traffic. I recently spoke with Bobby Patrick, CMO of UiPath, to get his perspective on how RPA and AI work together for firms of all sizes across the Financial Services, Healthcare, and Public Sector industries.
Bobby helped me to think of RPA robots and AIs as virtual assistants that augment an enterprise's digital workforce. Simply put, RPA robots are the brawn that gets repetitive tasks done efficiently, while conversational AIs and best next action recommendations boost productivity for your employees - who are, and remain, the brains of your business operations. Process automation is the linchpin that helps digital and human workers do better together in the intelligent enterprise. Like landing drone rockets in perfect tandem or recommending a faster route home from work, the idea is simple, but getting it to work at scale isn't easy. How do the right partnership and hybrid cloud platform help transform IT? Build a shared library of integrations, quickly automate end-to-end processes using pre-built connections, and then optimize your apps with conversational AI and best next actions. Simple, right?

Process Automation and Chatbots for Conversational Governance

I asked the CFO and CTO of Rubicon Red, Matt Wright, how he has helped bring conversational AI and process automation together for his retail and home health customers. His eminently practical approach is to pick the right partners, platform, and projects to deliver quick wins and build the foundation for what his customer, Ryan Klose, Executive General Manager of National Pharmacies, calls "conversational governance". National Pharmacies competes with much larger firms for the loyalty and engagement of its 350,000 members. Building shared libraries of standard operating procedures and integrations gives their "business architects" the pre-built components they need to quickly assemble, release, and analyze new innovations. Using Matt's simple approach and Oracle's hybrid cloud platform, Ryan's transformation team is seeing 250% sales increases from projects delivered in a matter of days.
Ryan credits API-first design thinking with extending governance and automation beyond his back office to include every aspect of his customer experience.

Breaking Barriers to AI Adoption

National Pharmacies is transforming its business by retooling both its IT and business leadership. And it's working in the face of stiff competition, including Amazon. I believe if you look at what HBR, MIT, and industry luminaries are saying about skills and data being the biggest barriers to CEOs adopting AI, you'll see why Ryan and Matt are making magic happen while building the trust they'll need to win in their markets.

Oracle's hybrid cloud platform is helping companies of all sizes grow their share of wallet and expand into new markets. For more on Oracle Integration Cloud's process automation with RPA and conversational AI, check out the podcast and visit oracle.com/paas.



Denver Metro RTD Delivers Predictive Stop Times with Oracle API Platform

For many years, the Denver Metro Regional Transportation District (RTD) has been at the cutting edge of new technology, highly focused on delivering an excellent experience to its riders. Keeping the trains and buses running on time is no small task, and RTD takes it very seriously. In this video, CTO Rahul Sood sat down to talk about how Denver Metro RTD provides public transportation to eight counties in the metropolitan area of Denver. The public agency serves the transportation needs of over 2.8 million people, with services including bus, rail, shuttles, ADA paratransit, and more.

The innovative solution they developed on the Oracle platform provides railway location data as well as predictive stop data that the District can publish to consumers, easing their commutes and alerting them to any challenges as soon as possible. The stop prediction service employs predictive algorithms to tell people waiting at a bus stop the arrival times of the next three buses, making planning a little bit easier. Not only that, but by partnering with the robust development community in Denver, RTD has been able to provide this data to its riders through easy-to-use mobile apps. Denver Metro has invested in API-first design and management with the Oracle API Platform so developers can easily find and access APIs. Their design-first approach to developing useful APIs results in:

2.5 million API calls/month
Publication of 1 billion predictions/year to users
Half a billion locations published

This gives both developers and riders an easy way to see what is out there, whether that is the right API for the job or the next bus coming down the road. Denver Metro's developers get an easy way to design and test APIs for their developer community to leverage in apps and services, all while assured that their systems and data are protected from unnecessary and unauthorized usage and attacks through security policies like OAuth 2.0.
API managers are able to monitor and control APIs by determining access roles, all from a central dashboard. To find out more, watch the video here.


AFG Drives Frictionless Home Loans with Oracle Cloud

4X faster delivery, 45% lower cost, and 3X increased innovation capacity - IT transformed.

Andrew McGee, IT Manager of Business Services for AFG Home Loans, began his transformation journey with a simple question: how much of his IT spend was going to innovation? As a financial services company facing new competition from specialized fintech vendors, AFG knew it needed to increase its innovation capacity, and Andrew looked to the Oracle Cloud for answers. I got to speak with Andrew about his amazing journey and how it has transformed the way his team engages with business leaders.

Moving their workloads to the cloud relieved IT from more mundane activities and saved on data center costs. In their quest to take the friction out of the end-to-end home loan process, Andrew and his team retooled with Oracle's Cloud Platform with Autonomous Services. Now new hires are able to contribute immediately to innovation efforts, and, based on a track record of quick wins that scale, his team spends more time obsessing over simplifying every aspect of approving and funding the 10,000 home loans per month AFG brokers manage.

Andrew's vision of frictionless home loans is working for AFG. In the time he has led his team on this journey, he's seen a 3X increase in innovation capacity. With Oracle Cloud Platform's hybrid integration and automation, Andrew now has a practical path to adopting disruptive technologies such as conversational AIs, Blockchain, and Robotic Process Automation. And as his colleague shares in 3 Ways National Pharmacies is Creating the Future of Work, the result is more engaged and energized IT pros who find new ways to monetize and secure their digital future. Taking hassle out and keeping risks down is what elevates AFG's connected business to deliver engaging customer and employee experiences. Watch the video where Andrew outlines his approach to accelerating project delivery by 4X while minimizing cost and risk.
To learn more about Oracle Cloud Platform with Autonomous Services, visit oracle.com/paas. 



3 Ways National Pharmacies is Creating the Future of Work

In a digital world, it's all about the conversation... When National Pharmacies was looking for new ideas to tap into disruptive technologies and drive the consumer-grade engagement its 350,000 members have come to expect from much larger competitors, it went back to basics and started its IT transformation with API-first design thinking. Ryan Klose, Executive General Manager, spoke to me about his journey and how his conversations with executives, board members, and employees have evolved along the way.

The attention economy demands engaging experiences across devices and channels, data-driven business agility, and just enough of the right kind of governance to scale enterprise-class digital offerings. At National Pharmacies, that meant delivering a mobile loyalty app, modernizing their core applications, and extending the governance they needed to build trust in their digital future. For all the resources at his disposal, a crystal ball wasn't among them. So rather than try to predict the future of work at National Pharmacies, Ryan and a core team of technical and business architects decided to work with Rubicon Red and Oracle's Cloud Platform with Autonomous Services to make it happen every day.

They knew their new workforce would be a mix of people, virtual assistants, and, eventually, robotics. The local, personal touch had served them well and needed to be carried forward into the attention economy. So they evolved what they call "conversational governance", a powerful mix of design thinking and high-productivity delivery based on three big ideas:

Build a common library of shared services to power their connected business
Automate and integrate retail and home health best practices using these services
Analyze and augment adaptive APIs to drive continuous innovation

Watch the video where Ryan Klose outlines his approach to accelerating project delivery by 5X.
To learn more about Oracle Cloud Platform with Autonomous Services, visit oracle.com/paas. 



3 Reasons Gartner App Summit Should Be On Your 2018 #AI Radar

...so many big ideas, so little time - here are my top 3 for 2018! When your CFO asks you "Should we be able to take Bitcoin payments over a mobile device?" or "Would you trust a conversational AI to place a stock trade for you?", you need a point of view that works for them - right away! This year's Gartner Application Strategies and Solutions Summit gave thousands of senior IT leaders the chance to gear up for just these kinds of career-shaping interactions. And while there were too many great sessions and expert insights for any mere human to cover, I was amazed by the reaction in the rooms when these three topics came up:

Say yes to the business!
It's all about speed
The practical path to AI

From the opening keynote to the closing bell, you couldn't miss the excitement and interest in #AI, #APIs, #iPaaS, #RPA, #Blockchain, #aPaaS, and others, but the questions were less about how these disruptive technologies might work someday than about how they can help transform what IT can do with the talent, assets, and budget it has today. Here's what I learned from the summit, along with the story of one firm's transformation using Oracle Cloud Platform.

1. Get used to saying "Yes!" to your business. This was the shot that rang around the room in the opening keynote. It challenged the longstanding belief that the business must be constrained by what IT has to deal with in the back office. Transforming IT's role from specialized technical problem solvers to more general, data-driven problem finders was the theme that unified all of the sessions and inspired so many hallway conversations. For many I spoke with, including Sinclair's Director of Finance Applications, Duane DeBique, this was a strong validation of the role Applications IT has been playing in transformation efforts. For others, it seemed to run counter to everything that has worked for IT until now.
That's a big idea whose time has come, and Oracle's Vikas Anand, Vice President of Product Management for iPaaS, engaged with Duane and a packed room of Apps IT leaders on how to make this possible for firms of all sizes!

2. It's all about speed. When Vikas and Duane shared Sinclair's vision of a "frictionless enterprise", they focused on how regional marketing consultants could better engage with direct advertisers and agencies using a unified quote-to-cash experience across devices. When Duane demoed his tablet-based quote-to-order digital app featuring Oracle Integration Cloud's Adaptive Case Management (ACM) with Robotic Process Automation (RPA), the power of his vision became clear. When he interacted with his very own conversational AI to take a payment on the spot from his iPhone, the room lit up with questions. But it was when he thanked Vikas and the Oracle team for helping Sinclair take their vision to live demos running across six SaaS and on-premises applications - in just under two weeks - that the real story emerged. It was the speed with which these innovations went from whiteboard to boardroom that Duane highlighted with his business and IT transformation team to help chart the next stage of their journey with Oracle.

3. The practical path to AI.
Whether in packed presentations or at the Oracle demo booth, conversations centered on how disruptive technologies including #Blockchain, #IoT, and #AI can open doors to new revenue streams without disrupting the systems of record that business operations teams depend on for day-to-day execution. Vikas Anand laid out the design principles that make this possible, and several sessions supported the idea that the practical path to AI starts with integration. Why? Because best next action recommendations and conversational AIs depend on content and context to be helpful and relevant. Blockchain and IoT, like AI technology, are only practical for businesses that have a modern, hybrid integration strategy to transform real-time data and events into intelligent micro-decisions that can lead to big new revenue opportunities for innovators like Sinclair Broadcasting. Digital process automation using API-first integration emerged as the next stage, because manual steps and legacy complexity fragment end-to-end operations such as quote to cash. Optimizing customer-facing activities with conversational AI and best next action recommendations depends on simplifying integration and accelerating process automation.

Getting Disruptive Technology to Work For Your Transformation

The transformations that Apps IT leaders such as Sinclair Broadcasting's Duane DeBique came to the Gartner Application Strategies and Solutions Summit to enable all depend on the agility to say "yes" when their CFOs ask for simple answers to complex technology disruptions. Amit Zavery, Oracle's SVP of Product Development, Oracle Cloud Platform and Middleware, joined Duane and Vikas onstage to support the kind of rapid integration and automation needed to make disruptive technologies such as #Blockchain, #IoT, and #AI practical elements of your application strategy in 2018!
With so many senior IT executives asking for more details on this simple yet powerful path, Vikas encouraged attendees to join him and Bobby Patrick, UiPath's Chief Marketing Officer, for a live webcast on Robotic Process Automation and Artificial Intelligence to see how Oracle Integration Cloud's process automation with RPA helps simplify legacy complexity. As Vikas summarized in his closing remarks, he is no longer just delivering industry-leading enterprise software and platforms; he's now part of an energized executive team that believes making Apps IT leaders like Duane dangerous is their business. When Apps IT innovators bring data-driven insights into how to optimize core processes and drive new revenues to their business executives, they impress. When they bring functional digital apps that executives can try on their phones, they become the thought leaders their business turns to in the face of technical and market disruption. And when those same apps work across both API-enabled SaaS applications and custom legacy systems - without the need for retraining - disruptive technologies such as conversational AI and best next action recommendations take the friction out of business operations and put your digital transformation on the fast track to success!

To learn more about how your team can transform faster with Oracle Cloud Platform, follow us @OracleIntegrate and join the conversation at https://blogs.oracle.com/integration.


Oracle at the 2017 Gartner Application Strategies & Solutions Summit

The premier conference on app strategy, innovation and customer experience. Oracle is proud to be a Platinum sponsor of the Gartner Application Strategies & Solutions Summit, December 4-6, 2017 in Las Vegas, NV. The ultimate digital experience is where mobile apps, business applications, customer data and real-time analytics come together. At Gartner Application Strategies & Solutions Summit 2017, you’ll explore the intersection of topics such as agile, CRM, event-driven architecture, CX and UX strategies.

Attend the Oracle Featured Solution Provider Session:

Date: Monday, December 4, 2017 | 12:15 p.m.
Location: Octavius 10

You won’t want to miss this session: our customer speaker and industry thought leader Duane DeBique, Financial Applications IT, Sinclair Broadcasting, will be talking about RPA, Chatbots and AI, and will demo his fresh new conversational AI to showcase his vision of a unified experience for his marketing consultants and their customers, who are advertisers and agencies. Additionally, Vikas Anand, Vice President, Oracle Integration, Process and API Product Management, will showcase some great demos around Integration, Adaptive Case Management, and Robotic Process Automation.

Session title: Digital Transformation with Bots, RPA, API-First Design, and Integration

Session abstract: Differentiation and business agility are built, not bought. Digital transformers across industries and in firms of all sizes are delivering enterprise-grade innovations faster with Oracle’s platform for low- and no-code automation and API-first development. This session reveals proven ways to get quick wins while reducing risk and training costs. Learn how APIs and microservices enable you to assemble pre-built components, analyze outcomes in real time, and monetize technical disruptions such as Blockchain and conversational commerce.
Customer Speaker: Duane DeBique, Financial Applications IT, Sinclair Broadcasting
Oracle Speaker: Vikas Anand, Vice President, Oracle Integration, Process and API Product Management

In addition to our session, attendees will have the opportunity to meet with Oracle experts in a variety of other ways, including demonstrations during the showcase receptions and an attendee lunch. Stop by Oracle booth #201 and chat with product experts to learn how to:

- Accelerate SaaS and on-premises application integration
- Manage, secure and monetize your APIs with zero code
- Simplify the development and deployment of cloud-native applications
- Drive cognitive insights, social, and conversational interactions

Booth Hours

- Monday, December 4 – 12:00 p.m. to 3:00 p.m.
- Monday, December 4 – 5:30 p.m. to 7:30 p.m.
- Tuesday, December 5 – 12:00 p.m. to 3:00 p.m.

Attendee Lunch

Wednesday, December 6: Have lunch with Daryl Eicher, Product Marketing Director – Process Cloud Service, who will moderate an informal discussion on Bots, RPA, API-First Design, and Integration with attendees over lunch. All participating exhibitors will be assigned a table, which will be noted on a sign at the entrance to the attendee lunch on Wednesday. We hope to see you in Vegas!


Oracle API & Integration Days: Learn More to Move Faster Than Ever

Companies of all sizes need to move quickly - we all know that. To innovate in a digital world that is moving faster than ever and benefit from API and integration tools, hands-on is best. We will be hosting events around North America to give you insight into how Oracle can help ease your integration pains. Join Oracle to hear from development and integration experts on best practices for the design and development of APIs and the management of integrations. Hear directly from Oracle customers about how they were able to adopt new digital business models and accelerate innovation through the integration of their SaaS and on-premises applications. With interactive sessions and hands-on labs, the Oracle API & Integration Days will help you to:

- Quickly and easily create integrations using Oracle’s simple but powerful Integration Platform as a Service (iPaaS)
- Simplify connections to cloud, on-premises and hybrid systems with the constantly growing list of adapters
- Secure, manage, govern and grow your APIs using Oracle API Platform Cloud Service

Workshop schedule & registration:

- El Segundo, CA – December 5th
- Seattle, WA – December 7th
- Redwood Shores, CA – December 12th
- Austin, TX – January 18th
- Houston, TX – January 23rd

Spaces are filling up quickly, so sign up today!


Innovation Disruptors Come and Go - Only Oracle Integration & API Cloud Lead the Way!

Author: Vikas Anand, Vice President, Oracle

With Oracle OpenWorld 2017 in the rearview mirror, I find myself just now catching my breath. This year was FANTASTIC! We had a chance to showcase our newest, coolest capabilities across Oracle Integration Cloud, Robotic Process Automation, and API Design and Management, but it was our customers and partners that made integration come to life through their digital transformation stories. They shared how they are leading the way, positioning integration, process automation, and an API-first strategy at the heart of their tech roadmaps. It has been an incredible year for Oracle iPaaS: we were recognized by industry experts in the Gartner Enterprise iPaaS Magic Quadrant and Forrester Wave reports, acquired Apiary, and launched the API Platform Cloud Service. At the same time, we focused on continuous improvement of our portfolio, adding 100+ adapters and Visual Builder Cloud. We talk a lot about innovation around here, and with good reason. We want to keep pushing forward not just for ourselves, but for the benefit of our customers and partners. We have all heard that “competition breeds innovation,” but I believe “innovation breeds innovation.” I have seen this happen not only with my team but as I meet with customers and, in turn, as they interact with each other and our partners. If there is anything I have come away with from the past few months, it is that integration is more important than it has ever been. As organizations embrace new Chatbot, Blockchain, AI and ML capabilities, one thing remains constant: the best way to move to the cloud is to leverage best-in-class SaaS, but what makes all of those initiatives work flawlessly is killer integration, process automation and API-first lifecycle management.
I heard story after story of how customers like Subaru of America, CoreLogic, and The Factory are using Oracle Integration Cloud to connect their SaaS and on-premises applications. They are orchestrating end-to-end process automation, leveraging 100+ out-of-the-box cloud adapters, and using tooling for visual application development. Customers like All Nippon Airways, Rabobank, and Denver Metro Regional Transportation District presented how Oracle’s complete API management cloud solution is allowing them to design, govern, manage, analyze, monetize, and secure APIs in a true hybrid deployment. Developers and attendees at our SOLD OUT Oracle Code event were especially excited to see Jakub Nesetril, founder of Apiary, discuss how we have pulled Apiary into our API Platform Cloud offering. Oracle’s API Platform Cloud now includes Apiary, giving developers the ability to prototype, test, document and manage their APIs with industry standards such as OpenAPI and API Blueprint. At OOW, we were excited to acknowledge companies executing this innovation perfectly at our annual Innovation Awards ceremony. We had so many entrants that selecting the winners this year was especially difficult, but here are three of our customers that edged out all others:

Innovation Award Customer Winners for Integration

From a wide field of nominees, the Integration winners embodied excellence in their Cloud journey. CoreLogic took the first step in venturing from on-premises operations into the Cloud by integrating SaaS with existing on-premises systems. Motivated by an operational requirement for real-time integration between Oracle E-Business Suite and CPQ apps, as well as by a need for a single dashboard able to convey all system statuses, they chose Oracle Integration Cloud Service for their One CoreLogic Experience.
ICS not only increased developer productivity by reducing custom code and easing provisioning; it also cut data synchronization from 4 days to a few seconds. Aided by Oracle Consulting, CoreLogic aims to increase the number of integration flows within Oracle Integration across their cloud and on-premises applications. Subaru is well on their way on their digital transformation journey. With a goal of delivering the Subaru Telematics Platform in their 2nd-generation connected car, Subaru is delivering vehicle health reports, emergency assistance, and automatic collision notifications. This robust solution is powered by Oracle SOA Cloud Service and Managed File Transfer to ensure dealers are up to date on customers’ needs. Their hybrid integration solution has allowed their back end to gracefully shift from on-premises to a hybrid solution, and ultimately into a pure Oracle Cloud environment. Their integration successes have allowed IT to develop applications and services that directly increase customer satisfaction and engagement. Not only that, but aided by partner Centroid, Subaru was able to achieve savings of over $1 million per year! National Pharmacies continues to push the envelope, leveraging the latest technologies to engage their customers and employees with Mobile and Chatbots while integrating and automating their manual processes to tie their front-end and back-end systems together. Taking the human slack out of workflows using Oracle Process Cloud and Visual Builder Cloud allowed them to streamline the multiple systems that touched their purchase orders. Their solution was built, tested, and deployed to stores in just 10 weeks, resulting in a 30% reduction in the cost of stock management as well as improved inventory management.
IKEA’s global retail brand experienced a strong need to become a digital organization, yet IKEA’s information was locked in hundreds of old legacy systems distributed across 48 countries and 500 data centers. They needed a solution that would let them innovate until modernization could happen. They wanted a solution powered by Oracle Apiary for design governance and Oracle API Platform Cloud Service for the runtime aspects of the platform, with a rapid deployment tempo of weeks rather than months and the capability of scaling up to 1,000 gateways. These requirements were central to enabling IKEA's digital strategy, which is expected to help double their current revenue to 50 billion EUR by 2020. With their implementation partner Capgemini, the result has been a fulfillment of all their requirements, plus a reduction in the overall IT budget of at least 70%! Those of you who know me already know that my team and I work hard, but we like to have FUN, too! So of course, with the help of our fantastic partners, we toasted all of our Integration Cloud, Process Automation, and API Platform Cloud customers in style! They are all WINNERS in our book! This year’s killer event was held at San Francisco’s Gotham Club inside AT&T Park, complete with a night full of batting cage practice, behind-the-scenes peeks at the park, and delicious food and beverages for all. Sponsored by our very generous partners eProseed, Capgemini, AVIO Consulting, Rubicon Red, Flexagon, and Cognizant, we had a great time, took off our jackets, and enjoyed the evening, sharing a uniquely SF experience. I even had a chance to show off my skills with a 1.000 batting average! So by now you must be wondering: when’s the next event? Keep your eyes peeled; we will be bringing a lot more of these exciting events to a location near you. Don’t forget that OOW18 is October 28 to November 1, 2018!


Use a Layered Approach with APIs to Protect your Data

We often talk about wonderful opportunities like monetization and re-use of data through APIs, but a critical concern is security. To meet security requirements, many vendors provide products like API gateways. Many API gateways provide a wide range of functionality, but should all security checks happen in one place? There is an allure to a "one size fits all" approach, mainly because configuration may seem simpler. Learning only one tool and using it end to end, from API to service implementation, may seem attractive. When subjecting our APIs to web-scale workloads, however, we may find that this approach will not scale. I talked about this in another post that focused on the API gateway and integration layers, but now let's take a step back to include more of the invocation flow. Our discussion in this post will include load balancers and content delivery networks (CDNs). We will also talk about some of the principles of enterprise deployments and why layers protect our most precious commodity: data.

Speaking of data, let's say we have some sort of data store, such as a database. We would not likely open up the database listener directly to the Internet. At the very minimum, we would have a firewall in between, and more likely we have multiple tiers in between. Data is the most valuable resource, deserving the most protection. The diagram above shows multiple layers before anything from the public Internet can reach the most secure data. Clients cannot go directly to data; they have to pass through intermediaries and, of course, can only perform certain prescribed functions, such as invoking an API. One may ask if we can simplify this process and collapse multiple functions into one tier. Let's look at each tier and its value.

Content Delivery Network/Public-Facing Firewall and Load Balancer

This is the outermost public-facing endpoint.
There are some validations that can take place here before allowing a request to proceed. This level can check the request for all sorts of exploits, including but not limited to SQL injection, cross-site scripting, GeoIP blocking, and HTTP protocol violations. Essentially, this level validates the HTTP request at the level of resource, headers, and attributes like client IP. It can:

- Protect against Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks
- Handle image caching, HTTP compression, SSL termination, etc.
- Enforce SSO and establish a session/transaction token
- Hide or modify errors; for example, not show internal server implementation details in error messages returned to the client, but log them so administrators can diagnose and resolve issues
- Provide logging and analytics

Demilitarized Zone (DMZ): API Gateways, B2B Gateways, FTP Proxies, etc.

Requests that enter the DMZ have the following features:

- Cleared of the highest-level exploits: somewhat, but not fully, trusted
- Require further processing: the request was not for cached content, so the CDN tier forwarded it to the DMZ

The gateways in the DMZ may proceed to validate and handle the request, performing functions such as:

- Key validation: application association
- Throttling/rate limiting: protect back-end systems, or enforce contracts
- AuthN/AuthZ: methods like OAuth2 to validate whether the client is authenticated and authorized to call a particular resource
- Transformation: lightweight message modification
- Redaction: protecting data
- Routing: identifying back-end services to receive the requests
- Caching: some non-sensitive/redacted data
- Logging and analytics

Green Zone/Application Tier

Requests that have made it here have been validated and have been cleared of exploits.
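Several of the gateway functions listed above, throttling and rate limiting in particular, boil down to cheap, fail-fast checks performed before a request ever reaches the back end. As one concrete illustration, here is a minimal token-bucket rate limiter of the kind a gateway might apply per API key. The class, rate, and capacity here are illustrative, not the interface of any specific product.

```python
import time

class TokenBucket:
    """Illustrative token-bucket limiter, as a gateway might apply per API key.

    Tokens refill continuously at `rate` per second, up to `capacity`.
    Each request consumes one token; an empty bucket means rejection
    (typically an HTTP 429 Too Many Requests response).
    """

    def __init__(self, rate, capacity):
        self.rate = float(rate)          # tokens added per second
        self.capacity = float(capacity)  # maximum burst size
        self.tokens = float(capacity)    # start full
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill for the elapsed interval, capped at the bucket capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # caller would answer 429 and never touch the back end

bucket = TokenBucket(rate=5, capacity=10)
allowed = [bucket.allow() for _ in range(15)]
# A burst passes until the bucket drains; the remainder are rejected early,
# which is exactly the "fail fast" protection the gateway tier provides.
```

Because the rejected requests are turned away with a few arithmetic operations, the expensive tiers behind the gateway only ever see traffic that fits the agreed contract.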
They can now be received by applications, service implementations, or the integration platform, which may perform the following:

- Connect to legacy applications: not all applications or technologies are service-enabled; sometimes an adapter is required
- Fine-grained authorization: deeper-level entitlements with access manager systems or the security layers of the applications themselves
- Heavyweight transformations: performing complex mappings, for example turning a SOAP/XML message into a concise REST/JSON message
- Orchestrations: connecting to multiple back-end systems to provide a single service
- Caching: maintaining shaped results for continued calls under the principles of eventual consistency
- Logging and analytics

Data Tier

Applications, service implementations, and the integration platform make connections to data stores. Protections at this tier include:

- Access limited to authorized servers, whether through IP filtering, router segmentation, or some sort of shared-key security along with credentials
- Data auditing
- Role-based access control
- Logging and analytics

The Layered Approach to Security

Let's return to the question of collapsing some of the layers and using one platform to perform all, or most, of the functions detailed above. Again, the allure is a simpler configuration. Technically, the answer is "yes," this is possible. But there are a few reasons why we want to maintain a fully layered approach. First, if we look at any security methodology in the physical world, we will find multiple layers. Office buildings, airports, and military installations, for example, employ layers both for security and for efficiency. To use an airport as an example: if I am taking a flight, I have a boarding pass. Let's say I have some sort of priority boarding. When entering the priority security line, an agent may want to see my pass just to check whether I have the appropriate mark indicating that I have priority boarding.
That agent is not validating that I can get on the plane; rather, they are performing a quick, cursory check to redirect me immediately if I got in the wrong line. This is an example of failing fast, and it also reduces the load downstream by rejecting a request based on the simplest of parameters. Of course, we might ask if we can use the same product across multiple tiers, and again the answer would be "yes" here as well. A word of caution, though: the more functionality is packed into any one component, the larger its attack surface and the more exposed it is to common performance issues. The more functions a particular technology performs, the heavier it tends to be. There are also often trade-offs as features begin to conflict. To revisit the airport analogy, the person who first looks at my boarding pass does not clear me to board my specific flight. Imagine having one person handle the security checks and boarding for all gates and all airlines. We could assign multiple people, but their task would quickly become more complex as they handle gate changes, boarding groups, etc. By design, I will have multiple interactions with staff specifically trained for their function as I make my way from the departures zone to the aircraft.

Scalability

In the airport analogy, we were talking about layered security and a certain level of QoS achieved by rejecting requests (entering a security line) earlier rather than later. In our layered approach from a technology perspective, we can not only reject invalid requests but also, in some cases, return the requested results without having to go all the way back to the data tier. At each step of the way, we can employ caching and also horizontally scale platforms to handle requests beyond the capacity of the back-end systems. Furthermore, we can speed up calculated results. Let's say we have an orchestration that calls multiple back-end services to provide a result.
We can cache that result and offload subsequent requests to the integration tier, for example. A caveat about caching: cached data is data at rest. We need to be vigilant about what we cache where. Non-sensitive content, such as images of products in a catalog, could be cached at the outermost layer. Business-sensitive data, on the other hand, should not be cached in the DMZ, so the integration platform can manage this level within the green zone.

Opportunities

While capable of extreme performance serving the most critical business requirements, enterprise deployments can be complex. Having multiple disparate services requires more planning, configuration, and management on the part of administrators. Furthermore, having different tiers can result in silos of information, making it more difficult for administrators to monitor and manage the infrastructure. All of this can lead to a greater risk of overlooked vulnerabilities, which goes against the rationale for having a proper enterprise deployment. Fortunately, there has been a move to decouple the user experience from the processing engines and to lift the burdens of management from users. This is moving in the right direction, and there are some opportunities going forward. Artificial intelligence: analytics needs to go beyond just showing logs, charts, and alerts. By applying AI, patterns can be identified to detect threat vectors not yet known. Solution-based design and implementation: by providing common solution-based canvases to designers and implementers, the underlying complexity can be abstracted. Users should not have to think about the underlying technology as much. A common question from users is, "for use case 1, should I use product X or product Y?" The user should just be thinking of how to solve the use case, and the vendor should handle the selection of the underlying product(s). Single pane of glass: provide users visibility across the multiple technologies.
This increases understanding and reduces complexity. Go autonomous: Oracle announced the first Autonomous Database, a huge leap forward in reducing costs and complexity. The more we make platforms autonomous, the better and safer they will be.

Conclusion

Using a layered topology is critical to ensuring security and scalability. While these topologies can be more complex, hybrid deployment offerings like Oracle Integration Cloud Service and Oracle API Platform Cloud Service reduce the complexity of integration and API management. With all of the new technologies, ranging from microservices to containers to serverless computing, the layered, hybrid deployment approach will become more critical than ever. I believe this is just the start of great things to come.

Disclaimer: The content above does not necessarily represent Oracle Corporation. All statements are my own.
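Appendix: the fail-fast layering described in this post can be reduced to a small sketch. Each tier runs only its own cheap checks and rejects a request at the first failure, so only fully vetted traffic ever reaches the data tier. Everything here (layer names, the key, the role check) is a simplified illustration, not a real deployment.

```python
# Illustrative sketch of the layered, fail-fast topology discussed above.
# Each layer performs only its own inexpensive checks and rejects early.

def cdn_layer(request):
    # Outermost tier: protocol-level screening (e.g., block obvious exploits).
    return "<script>" not in request.get("path", "")

def gateway_layer(request):
    # DMZ tier: API key validation and coarse authorization.
    return request.get("api_key") in {"key-123"}

def app_layer(request):
    # Green zone: fine-grained authorization for the specific resource.
    return request.get("user_role") == "admin" or request.get("path") != "/admin"

LAYERS = [cdn_layer, gateway_layer, app_layer]

def handle(request):
    for layer in LAYERS:
        if not layer(request):
            return "rejected"       # fail fast at the first layer that objects
    return "reached data tier"      # only fully vetted requests get this far

print(handle({"path": "/orders", "api_key": "key-123", "user_role": "user"}))
```

The point of the sketch is the ordering: a request with a bad key is turned away in the DMZ and never costs the application tier a cycle, just as the agent at the security line turns away a passenger in the wrong queue before the gate ever sees them.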


Promote Standards to Create Great APIs

When I moved from Development to Product Management, one of my first assignments was to create tools that helped developers build SOA services following prescribed patterns. Simply put, we had a "best practice" and we wanted to help developers leverage it easily and repeatedly. My main motivation was to lift the burden of compliance through automation, and I still believe that automation promotes compliance. Fast-forward to the present with microservices and APIs, and the need for a design-first approach with standards is more critical than ever. This was one of the many reasons I was excited when Oracle acquired Apiary: I knew that Apiary enables teams to create great APIs. A great example of this is presented in "Creating the New: Adidas APIs" from the Nordic APIs 2017 Platform Summit that took place in Stockholm, Sweden on October 11th of this year. Oldrich Novak from Adidas and Zdenek "Z" Nemec from GoodAPI gave a great presentation about implementing a well-defined approach to APIs. I'd like to highlight some of the points from their session, but I really do suggest you find the time to watch it for yourself to learn even more. Adidas suffered from a lack of visibility and governance, resulting in a lack of consistency that left them with APIs that were hard to control and re-use. Adidas had to approach the problem from three key areas: People, Process, and Platform. A problem like this required more than just a tool, or even some good advice; it was the combination of Apiary, GoodAPI, and Adidas' commitment to organize that resulted in their success. Zdenek was instrumental in guiding Adidas to maturity with APIs. Some highlights:

- Adidas uses Oracle Apiary to design, document, guide, and ensure compliance.
- Oracle Apiary not only provides governance; its visibility also supports re-use.
- Oracle Apiary is open to any API platform, promoting the creation of great APIs even on other platforms.
- Design-first: the API description, not the implementation, is the source of truth. Using the automated tools of Apiary, such as Dredd, with their CI/CD, they are able to maintain compliance with their API descriptions.
- Ideally, APIs should achieve level 3 of the Richardson Maturity Model to maintain robustness and true loose coupling of your clients.

There is so much more in this presentation, and again I recommend you take some time to review it. I think one of the best quotes (a bit paraphrased) is from "Z": "We don't really want to force, but we want to promote good standards." As a former developer, I think this mindset will achieve more successful adoption of standards. Leveraging Apiary can help any organization establish great practices that are easy for developers to adopt. Great products are made by great people, armed with a winning strategy and a design-first approach. If you need an expert guide, I suggest you check out GoodAPI and talk with "Z". Of course, anything stated above is my opinion alone and does not necessarily represent Oracle Corporation, Adidas, or GoodAPI.
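To make the design-first idea concrete, here is a small sketch of a contract check in which a machine-readable API description is the source of truth and real responses are validated against it. This is roughly the role a tool like Dredd plays against an API Blueprint or OpenAPI document in a CI/CD pipeline; the endpoint, fields, and helper below are hypothetical and are not Apiary's or Dredd's actual interface.

```python
# Minimal design-first contract check. The description dict stands in for an
# API Blueprint or OpenAPI document; a CI job would fail the build whenever
# the implementation drifts from it. Purely illustrative.

API_DESCRIPTION = {
    "GET /products/{id}": {
        "status": 200,
        "fields": {"id": int, "name": str, "price": float},
    },
}

def check_response(endpoint, status, body):
    """Return a list of contract violations (empty means compliant)."""
    contract = API_DESCRIPTION[endpoint]
    violations = []
    if status != contract["status"]:
        violations.append(f"expected status {contract['status']}, got {status}")
    for field, ftype in contract["fields"].items():
        if field not in body:
            violations.append(f"missing field '{field}'")
        elif not isinstance(body[field], ftype):
            violations.append(f"field '{field}' should be {ftype.__name__}")
    return violations

good = check_response("GET /products/{id}", 200,
                      {"id": 7, "name": "shoe", "price": 89.99})
bad = check_response("GET /products/{id}", 200, {"id": "7", "name": "shoe"})
# 'good' is empty; 'bad' reports the wrong id type and the missing price field.
```

The design choice worth noting is that the check knows nothing about the implementation: it compares observed behavior against the description alone, which is exactly what makes the description, not the code, the source of truth.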


#OOW17 was the #bestoneyet!

Author: Rimi S. Bewtra, Sr. Director, Oracle Cloud Business Group

WOW … I am not even sure where to begin … this Oracle OpenWorld was AWESOME!!!!! And if by chance you missed it, then catch one or more of the replays: (Larry Ellison, Mark Hurd, Thomas Kurian, Dave Donatelli). This year was all about Innovation – across areas like AI, Chatbots, App and Data Integration, APIs, Robotic Process Automation, IoT, and Content and Experience Management … and a few other things like Blockchain, Autonomous Database, and Security and Management in the Cloud also got some attention, but I will keep my focus and highlights on the areas I manage and am most familiar with and let my colleagues share their own highlights. So much was going on this year; here are just a few of my highlights! No matter where you went, you probably heard the buzz and hopefully caught one or more of our sessions on #AI, #APIs, #BigData, #Chatbots, Content & #ExperienceMgmt, Integration (App and Data), #IoT, and #RoboticProcessAutomation. Leaping off of our global Chatbots launch, our Intelligent Bots were everywhere! Across all major keynotes, the attention was on AI-powered Chatbots as the new engagement channel to enhance customer and employee experiences. There were press releases on Oracle’s AI-powered Intelligent Bots and strategic partnerships with both Chatbox and Slack:

- Oracle Introduces AI-Powered Intelligent Bots to Help Enterprises Engage Customers and Employees
- Oracle and Chatbox Collaborate to Bring Instant Apps to AI-powered Oracle Intelligent Chatbots
- Reuters: Slack locks down Oracle partnership targeting enterprises
- CNBC: Slack is partnering with Oracle to offer new in-app business bots
- ComputerWorld: Slack and Oracle move to collaborate on business app chatbots

You are going to continue to hear a lot more, so if you haven't taken 30 minutes to watch our Launch Webcast and visit www.oracle.com/bots, it is well worth the time.
More than anything, #OOW17 was about innovation, sharing our customer successes, and Oracle’s strategy and vision! I was inspired by the kids from Design Tech High School – these students are teaching Silicon Valley a few things. Clearly there is a lot to look forward to, and this next generation of entrepreneurs has already started to lead the way. The pace of innovation is not slowing down. But one thing is certain: with Oracle Cloud, we are committed to supporting our customers throughout their Cloud journey, no matter where they are. Our Cloud Platform innovations:

- Oracle Expands Open and Integrated Cloud Platform with Innovative Technologies
- Oracle Cloud Platform Innovates to Power Big Data at Scale

More than anything, across all areas it was our customers and partners that showcased the power of the Oracle Cloud Platform. Our customers, big and small and across industries, like Australia eBay, Exelon, Financial Group, CoreLogic, National Pharmacies, Orb Financial, Paysafe, Sinclair, Subaru of America, The Factory, Trek Bicycles, Trunk Club, Turning Point, and so many others, shared their stories about how they have embraced Oracle Cloud to lead their industries and embark on their own digital transformation journeys. Customers shared how they are leveraging Oracle Integration Cloud to connect their SaaS and on-premises applications and orchestrate end-to-end process automation, leveraging 100+ out-of-the-box cloud adapters and tooling for visual application development. With Oracle Data Integration Platform Cloud, organizations are able to easily integrate new sources of data in any format to help eliminate costly downtime, accelerate data integration for analytics or big data, and improve data governance.
And whether you were at one of our largest developer-focused Oracle Code events, which drew over 1,500 attendees, or one of our standing-room-only general and conference sessions, you probably got a glimpse of our most complete API management cloud solution to design, govern, manage, analyze, monetize, and secure APIs in a true hybrid deployment. Oracle’s API Platform Cloud now includes Apiary and provides developers the ability to prototype, test, document, and manage their APIs with industry standards such as OpenAPI and API Blueprint. Now I could go on and on, but let’s save something for future blogs. Before I end, I must call out our partners. They drove the customer successes by helping our customers embark on their journeys. Here are just a few of the ones I had the pleasure of working with: Auraplayer, Avio, Cognizant, Fishbowl, Rubicon Red, SunetraTech, Sofbang, TetraTech, along with, of course, Deloitte, Intel, PWC, Tata Consultancy Services, Accenture, Fujitsu, Infosys, and Wipro. Hopefully all of you had a chance to experience #OOW17, the #bestoneyet! Check out the keynote replays (Larry Ellison, Mark Hurd, Thomas Kurian, Dave Donatelli) and get ready to hear much more from Oracle – this year’s #OOW17 was #notwhatyouexpected from Oracle. With the help of our customers and partners, we were able to showcase how technology is transforming companies and industries – we were able to Explore Tomorrow, Today! P.S. Mark your calendars, #OOW18 promises to be #evenbetter!


Integration

Exciting Integration Analytics Session at OOW17

Customers are quickly beginning to see the value of a comprehensive cloud-based solution to their integration needs, and Oracle's Integration Platform as a Service (iPaaS) is gaining the attention of customers and experts alike. One key differentiator for Oracle is the bundled ability to track business-level analytics on integrations created in Oracle's Cloud platform. At session CON7015 - How Integration Analytics Enables Insight On-Premises and in the Cloud, on Monday at 12:15 p.m., a team of product managers, partners, and customers will give a preview of Oracle's new Integration Analytics features and discuss how they can be applied to give you detailed insight into how your business processes are performing. Oracle Integration Analytics allows customers to track key business metrics using a simple workflow:
Define a business analytics model. Using simple browser-based tooling, analysts identify and describe business milestones and metrics that are important to the business.
Map the model to integration artifacts. Architects can map business milestones and metrics from the analytics model to iPaaS artifacts such as orchestration flows in Oracle SOA Cloud Service (SOACS) or Oracle Integration Cloud Service (ICS). Mapping is done easily using browser-based tooling, requiring no code changes or redeployments!
Create dashboards to track key business metrics. Once the analytics model has been defined, mapped to an implementation, and activated, business stakeholders can track business progress using pre-configured dashboards. It's also easy to define custom dashboards using the metrics from the analytics model.
Highlighted in the session will be new features in Integration Analytics, including close alignment with Integration Cloud Service. See how drag-and-drop simplicity makes it a snap for ICS developers to map orchestration actions to milestones in the analytics model.
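The workflow above is driven by browser-based tooling, but the underlying idea can be sketched in a few lines. Here is a hypothetical illustration (the milestone names, flow-step names, and event format are all invented for this sketch, and this is not the Oracle Integration Analytics API): a model defines milestones, the milestones are mapped to integration-flow steps, and a dashboard-style metric counts how many instances reached each milestone.

```python
# 1. Define the business analytics model: ordered milestones of an order flow.
MILESTONES = ["OrderReceived", "CreditChecked", "OrderShipped"]

# 2. Map milestones to integration artifacts (flow step names are made up).
MAPPING = {
    "OrderReceived": "ics_flow.receive_order",
    "CreditChecked": "ics_flow.check_credit",
    "OrderShipped":  "ics_flow.ship_order",
}

# 3. Compute a dashboard metric: how many instances reached each milestone.
def milestone_counts(events):
    """events: list of (instance_id, flow_step) tuples emitted at runtime."""
    step_to_milestone = {step: m for m, step in MAPPING.items()}
    reached = {m: set() for m in MILESTONES}
    for instance_id, step in events:
        milestone = step_to_milestone.get(step)
        if milestone:
            reached[milestone].add(instance_id)
    return {m: len(ids) for m, ids in reached.items()}
```

Because the mapping lives outside the flow definitions, remapping a milestone to a different step changes no integration code, which mirrors the "no code changes or redeployments" point above.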
There will also be a real-life customer use case that outlines benefits seen by an early adopter.  We hope to see you there!    


#OOW17 Bring it All Together

Integrate and Automate …. More Important than Ever Before! Author: Rimi Bewtra, Sr. Director, Oracle Cloud Business Group. Let’s take a moment and think about what you did this morning. Did you check email, log into your employee portal, review a document, PowerPoint, or video? Chances are you were reviewing data that was sitting within an application. Much of that data was part of one or more back-end systems, social apps, enterprise apps in the cloud or on premises, or a 3rd-party app. Today’s economy is based on information. Data is the single biggest asset companies own, share, enrich, manage, and govern. Yet this data is also very hard to deal with. Consider this: 72% of big data projects have issues with data integration reliability, and typical data outages last 86 minutes, totaling an average of $690,200 in costs. No wonder companies spend $400B/year just connecting systems. So whether you are talking about #AI, #Chatbots, #Blockchain, or the #Digital Economy, what makes all of this come together is your ability to integrate and automate quickly and efficiently: integrate and automate your data, integrate and automate your apps, and integrate and automate your devices. At #OOW17, here are 5 sessions that will help you SEE, LEARN, and EXPERIENCE how Oracle Cloud Platform for Integration is bringing it all together:
Oracle Integration, API, and Process Strategy – Monday, Oct 02, 11:00 a.m. - 11:45 a.m. | Moscone West - Room 3005
Oracle Data Integration Platform Cloud Strategy and Roadmap – Monday, Oct 02, 12:15 p.m. - 1:00 p.m. | Moscone West - Room 3024
Oracle API Platform Cloud Service: Roadmap, Vision, and Demo – Tuesday, Oct 03, 11:30 a.m. - 12:15 p.m. | Moscone West - Room 3005
Oracle GoldenGate Product Update and Strategy – Tuesday, Oct 03, 5:45 p.m. - 6:30 p.m. | Moscone West - Room 3003
Differentiate with SaaS Applications Using Rapid Process Automation – Wednesday, Oct 04, 2:00 p.m. - 2:45 p.m. | Moscone West - Room 3005
Hope I see you at #OOW17. 
This year is going to be better than years past ... and I am not just saying that because I happen to head up product marketing for some of Oracle's coolest innovations. But seriously, we have a lot of smart developers and engineers, they have been busy, and we are ready to show off! See you soon. 


Exciting #OOW17iPaaS Sessions to Make You a Process Automation Whiz

With Oracle OpenWorld 2017 just over a week away, you are probably busy finalizing your schedule. If you’re curious about Oracle Process Cloud Service, we have some great sessions for you to check out. Take a look at our Focus On: Process Automation guide to make the most of your time at #OOW17. This handy reference can help you now, or as you drink coffee and plan your day from the plaza on Howard Street! Sunday, Oct. 1st, 2017 No Code: How to Extend Oracle SaaS with Oracle Application Builder Cloud Service [SUN3124] Time: 11:45 a.m. - 12:30 p.m. Location: Marriott Marquis (Yerba Buena Level) - Nob Hill A/B Speakers: Luc Bors, Technical Director, eProseed Europe s.a. But why should I attend?? You will get a chance to see Oracle Application Builder Cloud Service up close: a super-approachable, easy-to-use app dev product built for the nonprofessional developer. Oracle ABCS enables citizen developers to quickly build and publish applications that can address immediate business needs. Luc will show how easy it is to use Oracle Application Builder Cloud Service to create an extension to Oracle SaaS, how seamlessly the extension fits the SaaS look and feel, and how it integrates with PaaS like Oracle Process Cloud Service. Monday, Oct. 2nd, 2017 Oracle Integration, API, and Process Strategy [CON7118] Time: 11:00 a.m. - 11:45 a.m. Location: Moscone West - Room 3005 Speakers: Vikas Anand But why should I attend?? In this headliner session, you’ll learn how to best leverage Oracle’s recent innovations within Oracle Integration Cloud Service, Oracle Process Cloud Service, and Oracle API Platform Cloud Service. VP of Oracle Product Management, Vikas Anand, is a compelling speaker who always engages the audience and makes you want to rush home to get started with all the great new products & features that Oracle Integration has to offer! Wednesday, Oct. 4th, 2017 Building Business Agility with Rapid Process Automation: Customer Stories [CON7032] Time: 1:00 p.m. - 1:45 p.m. 
Location: Moscone West - Room 3003 Speakers: Daryl Eicher, Oracle; Brent Seaman, EVP, Mythics, Inc; Matthew Wright, CTO, Rubicon Red Pty Ltd; Ryan Klose, General Manager, National Pharmacies Group; Andrew Mcgee, IT Manager Shared Services, Australian Finance Group But why should I attend?? This great customer panel, moderated by Daryl Eicher, will showcase Oracle Process Cloud Service customers, who will talk about how they have been able to achieve efficiencies of scale and innovation within their operating models by digitizing their many manual paper- and email-intensive business processes. The future of business is in putting automation in the hands of stakeholders to transform app delivery and operations, and you will hear about the many business benefits achieved and key lessons learned. Differentiate with SaaS Applications Using Rapid Process Automation [CON7031] Time: 2:00 p.m. - 2:45 p.m. Location: Moscone West - Room 3005 Speakers: Nathan Angstadt, Oracle; Ravi Gade, Director, Calix, Inc; John Deeb, Director, Rubicon Red Pty Ltd But why should I attend?? Everyone wants to maximize their ROI, right? Oracle Process Cloud Service PM Nathan Angstadt will lay down the foundation for how process automation will do just that: max out your return on cloud and on-premises IT assets using the Oracle Cloud Platform for innovation. Low-code automation and API-first development are key steps in democratizing innovation throughout your company, and through them you will see the reach of your digital business and the omni-channel experiences of your SaaS apps skyrocket! Not only will this session be packed with real-world examples of getting more from your CRM, finance & supply-chain ERP systems, but Ravi Gade & John Deeb will share their experiences in choosing and implementing the solution. There will be plenty of time for questions & answers, so come ready to learn! 
Build Connectivity to Cloud Apps: Oracle Self-Service Integration Cloud Service [CON7093] Time: 5:30 p.m. - 6:15 p.m. Location: Moscone West - Room 3005 Speakers: Tuck Chang, Oracle; Udom Dwivedi, Oracle But why should I attend?? Oracle Self-Service Integration Cloud Service is a brand new part of Oracle Integration Cloud that provides an easy-to-use application interface designed for the business user to connect the cloud applications they use. In this session, you will hear directly from the Oracle PMs & builders of the product and come away with actionable steps to implement Oracle Self-Service Integration: quickly build connections to a custom cloud application, empower employees (with no coding experience!) to integrate and automate tasks with recipe-style self-service integration using either Oracle or non-Oracle cloud applications, and make it available to your employees right when you get back from OOW. In addition to the awesome sessions above, we will also have Hands-On Labs, Demos, and Theater Sessions. And don’t forget to stop by The Innovation Studio at Oracle OpenWorld, located at the top of the escalators in Moscone North, where you'll be able to learn about all the ways that Oracle Cloud Platform brings together technology. Find out more here. Interested in Oracle Integration Cloud? Take a look at these Five #OOW17iPaaS Integration Cloud Sessions


Integration

Five #OOW17iPaaS Integration Cloud Sessions You Won't Want to Miss

With Oracle OpenWorld 2017 just two weeks away, you are probably busy building your schedule and looking forward to packed days in San Francisco full of fun & learning. If you’re curious about Oracle Integration Cloud, we have some great sessions for you to check out. Make sure to take a look at the Focus On: Integration Cloud Service guide to make the most of your time at #OOW17. Coming up this week on the blog, we will continue to highlight #OOW17iPaaS sessions that you'll definitely want to make time for. From roadmap to how-to, you will find just what you’re looking for in iPaaS knowledge. Monday, Oct. 2nd, 2017 Oracle Integration, API, and Process Strategy [CON7118] Time: 11:00 a.m. - 11:45 a.m. Location: Moscone West - Room 3005 Speakers: Vikas Anand But why should I attend?? In this headliner session, you’ll learn how to best leverage Oracle’s recent innovations within Oracle Integration Cloud Service, Oracle Process Cloud Service, and Oracle API Platform Cloud Service. VP of Oracle Product Management, Vikas Anand, is a compelling speaker who always engages the audience and makes you want to rush home to get started with all the great new products & features that Oracle Integration has to offer! Oracle Integration Cloud Best Practices: Customer Panel [CON7014] Time: 12:15 p.m. - 1:00 p.m. Location: Moscone West - Room 3005 Speakers: Bruce Tierney, Oracle; Pankaj Varshney, IT Manager, CoreLogic; Naveen Vallamkondu, Applications Architect, Peregrine Semiconductor But why should I attend?? This is your chance to hear directly from customers that have used Oracle Integration Cloud to connect SaaS with on-premises applications. Get the background information you need to jumpstart your next CX, HCM, ERP, or other integration project with modern best practices. Join us and get the answers to all your questions. Tuesday, Oct. 3rd, 2017 Integrate Salesforce.com and Workday with Oracle Integration Cloud Service [CON7227] Time: 12:45 p.m. - 1:30 p.m. 
Location: Moscone West - Room 3005 Speakers: Mani Shankar Choudhary, Technical Manager, DAZ Systems, Inc. Ram Anantharaman, Technical Support Manager, Stitch Fix Shalindra Singh, Enterprise Architect, Bristlecone Inc But why should I attend?? If you are looking to integrate a variety of applications—Oracle or non-Oracle—with an enterprise-ready, market-leading iPaaS, then this is the session for you. You will hear straight from customers about how they have used Oracle Integration Cloud's Salesforce and Workday integration and been able to leverage Oracle ICS as their single strategic platform for integrating both Oracle and non-Oracle applications. Wednesday, Oct 4th, 2017 ERP/HCM/HRMS Integration Made Simple with Oracle Integration [CON7020] Time: Wednesday, Oct 04, 12:00 p.m. - 12:45 p.m. Location: Moscone West - Room 3005 Speakers: Ramkumar Menon, Oracle Ravindran Sankaran, Oracle Yogesh Sontakke, Oracle But why should I attend?? You will get a chance to discover how Oracle Integration Cloud can simplify the task of connecting your ERP or HCM/HR systems, straight from the PMs who built the product. This will be a use case–based session and you will leave with a solution that can be used for pure SaaS-to-SaaS connections, as well as for connecting SaaS and on-premises systems. Build Connectivity to Cloud Apps: Oracle Self-Service Integration Cloud Service [CON7093] Time: 5:30 p.m. - 6:15 p.m. Location: Moscone West - Room 3005 Speakers: Jeff Hoffman, Oracle Tuck Chang, Oracle Udom Dwivedi, Oracle But why should I attend?? We are all familiar with the demand for connectivity among cloud applications. The brand new Oracle Self-Service Integration Cloud Service easily supports connectivity to cloud applications. Oracle SSI also provides an easy-to-use application interface designed for users with very little tech experience to connect the common cloud apps they lean on most. 
In addition to the awesome sessions above, we will also have Hands-On Labs, Demos, and Theater Sessions. Between sessions, don’t forget to stop by The Innovation Studio at Oracle OpenWorld, located at the top of the escalators in Moscone North, where you'll be able to learn about all the ways that Oracle Cloud Platform brings together technology. To find out about our entire #OOW17iPaaS session offerings, check out the App & Data Integration Focus On document here.


Wondering how to attack PSD2? APIs are the answer.

In January 2018, the European Union’s Revised Payment Service Directive (PSD2) will go into effect. The impact of this directive should be completely transparent for users of banks and other financial services, but in the background, businesses are busy at work making sure they are compliant. So what does this mean? PSD2 includes provisions to:
make it easier and safer to use internet payment services
better protect consumers against fraud, abuse, and payment problems
promote innovative mobile and internet payment services
strengthen consumer rights
strengthen the role of the European Banking Authority (EBA) to both coordinate supervisory authorities and draft technical standards
In short, PSD2 allows a payment service user to have an overview of their financial situation on demand, with aggregated online information on payment accounts (like transaction data) held with other payment service providers, all within a secure environment and with strong customer authentication. The simple answer to the how of PSD2 compliance is APIs. APIs make it simple to include robust security and user authentication, to quickly design & deploy APIs that suit your business, and to manage the APIs your team has created. To find out more about how APIs can help, download our White Paper here. 
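As a rough illustration of the pattern above, here is a minimal Python sketch of an account-information API that authenticates the caller and then aggregates account data held with multiple providers. Everything in it is hypothetical: the token scheme, provider data, and function names are invented for illustration, and a real PSD2 implementation would use a standards-based strong-customer-authentication flow (such as OAuth2) governed by the EBA's regulatory technical standards.

```python
import hashlib
import hmac
from typing import Optional

# Stand-in for a real credential system; illustration only.
SECRET = b"demo-secret"

def issue_token(user_id: str) -> str:
    """Sign a user id so the API can verify it later."""
    sig = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    return f"{user_id}.{sig}"

def verify_token(token: str) -> Optional[str]:
    """Return the user id if the token signature checks out, else None."""
    user_id, _, sig = token.partition(".")
    expected = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    return user_id if hmac.compare_digest(sig, expected) else None

# Hypothetical per-provider account data the aggregation API would fetch.
PROVIDERS = {
    "bank_a": {"alice": [{"iban": "DE89...", "balance": 1200.50}]},
    "bank_b": {"alice": [{"iban": "FR14...", "balance": 310.00}]},
}

def aggregated_accounts(token: str) -> list:
    """PSD2-style account-information endpoint: authenticate the caller,
    then aggregate payment-account data held with multiple providers."""
    user = verify_token(token)
    if user is None:
        raise PermissionError("customer authentication failed")
    return [acct for bank in PROVIDERS.values() for acct in bank.get(user, [])]
```

The point of the sketch is the shape, not the mechanism: authentication happens once at the API boundary, and the consumer gets one aggregated view of accounts held with several providers.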


Integration

Oracle Named a Leader in the 2017 Analyst Evaluation for Digital Process Automation Software

The Forrester Wave™: Digital Process Automation Software, Q3 2017 report has been released, and Oracle has been named a leader among vendors selected for demonstrating proven customer adoption across geographies and verticals, strong go-to-market strategy and thought leadership, and breadth of support for DPA requirements! Oracle was also named a leader in Gartner's 2017 Magic Quadrant for Enterprise Integration Platform-as-a-Service report which included process automation as a key consideration. According to the Oracle press release published this morning, "In Forrester's 30-criteria evaluation of DPA vendors, they evaluated 12 significant software providers. Oracle was cited as a leader with the highest possible scores in the low-code/no-code, smart forms and user experience, process flow and design, mobile engagement, API support, data virtualization, deployment options, and ease of implementation criteria." In the press release, Vikas Anand, Vice President, Oracle Cloud Platform noted that: "By delivering comprehensive process automation capabilities such as no- and low-code process design, case management and simplified connections to SaaS, Social, Cloud and on-premises systems, Oracle provides customers with a powerful option to continuously deliver engaging customer, employee, and partner experiences at every stage in their business transformation journey." Download the full report here! Follow us @OracleIntegrate and don't forget to join the transformation conversation at https://blogs.oracle.com/integration.


Integration

Trunk Club Develops APIs in Style

Since 2009, Trunk Club has brought high-quality fashion to customers. For professionals who either can’t or won’t take time to shop in person, the solution was straightforward: help folks discover great clothes that are perfect for them without ever having to step into a store. The brilliant combination of top brands, expert service, and unparalleled convenience allows Trunk Club to deliver a highly personalized experience that helps customers look their best. ICYMI: Hear what thought leaders said about the evolution of API Design & Management. The same high-end experience that Nordstrom’s Trunk Club was known for bringing to its customers was missing from the technology team’s work processes. True to their start-up roots, they iterated to find the right solution, starting with bootstrapping a homegrown product and then exploring other options to support their microservices architecture. Their discovery of Oracle Apiary as an API documentation solution solved their first stumbling block, but really gave them so much more. As a result of implementing Apiary, the team improved the way they came together to understand problems and delivered higher-quality APIs faster. And they are just getting started! Read the Forbes article to hear more about Trunk Club’s tech evolution.


Integration

The Power of API Design & Management

Today’s economy is based on information – obtaining data, transporting data, exchanging data. We talk about this a lot – the digital economy – but how does it affect your business? When a business goes digital, it is not only able to leverage the information stored away in its legacy systems, but it can also make that information available outside the firewall to customers, devices, and partners. Gartner says that digital platforms are the focal point for integrating ecosystems of people, business, government and things (Harnessing APIs and Open Data, May 2017). These digital ecosystems form around the exchange of data, and APIs are the way to exchange data in a robust marketplace. Since more organizations are investing in digital transformation than ever before, the number of APIs is increasing quickly. Many businesses want to reinvent themselves, and APIs, as digital doors to their intellectual property, enable rapid assembly of new services and applications. In the face of an increasingly complex inventory of digital assets, standardization through a style guide prevents ad hoc chaos, and interactive documentation relieves the pains of API and microservice development. Being deliberate about the API design process ensures that all stakeholders understand the API from the beginning, preventing costly rework and getting better projects to deadline faster. Businesses can also achieve a complete understanding of the business through analytics as data and services are accessed and monetized, and they can secure digital assets with policies tied to their APIs to eliminate external threats to their data. Until now, API Management has been reactionary, trying to corral what has already been created. Typically, this has included the operational aspects of the lifecycle – security, deployment, consumption, and monitoring – but Oracle API Platform Cloud Service with Oracle Apiary offers another aspect of a full-lifecycle API Management tool: Design. 
After the Evolution of API Design & Management, what does modern API Management look like? Read about the conversation we had with thought leaders in our Twitter Chat here. The Oracle API Platform Cloud Service comprises the full API Lifecycle and can be deployed on-premises, in the cloud, or anywhere in between. Encompassing the complete API continuum from API Design & Standardization via Oracle Apiary’s trusted API Design & Documentation Platform through to API Security, Discovery & Consumption, Monetization, and Analysis, the API Platform Cloud Service provides a completely new, simplified API management user experience on top of a proven API gateway. Learn more about Oracle’s full-lifecycle API management solution. Join us on June 15th at 10 am. Click here to register!


Integration

Twitter Chat Reveals – The Evolution of API Design & Management

Yesterday, June 7, @OracleIntegrate hosted a live Twitter Chat at #APIDandM. The topic – “The Evolution of API Design & Management”. APIs clearly define exactly how a program will interact with the rest of the software world, and APIs are at the heart of effective mobile, cloud, and web development. With the rise of the digital economy, more organizations are investing in digital transformation, causing an explosion in APIs. Effective and efficient API Design and Management is now more important than ever. With that in mind, the live discussion explored the difference between legacy and modern API Management, as well as what’s new in modern API Management. Industry thought leaders, numerous partners, and more participated in this very engaging discussion. The interaction ranged from how API Design fills in the gaps in API Management to whether API Management is required before you enter the digital economy. From serious musings to light-hearted commentary, the Twitter Chat proved to be a great meeting of minds. One of my personal favorites was a tweet from David J. Biesack (@davidbiesack) that said "APIs without API management is API chaos. We’ve all #BTDT #APIDandM". Even if you participated, you may have missed portions of the live discussion, so we have curated the chat here; it might be worth going back and following the discussion. Catch the recap of the Twitter Chat and, while you still can, feel free to find the complete thread by searching for “#APIDandM” on Twitter. Archive: The Evolution of API Design & Management


Integration

The Evolution of API Design & Management

The digital economy has been talked about a lot, but what does it mean for your business? Digital transformation is a fundamental shift in business strategy. Not only does it involve taking on modernization projects & teams, but it really requires changing the way you think about your company’s strategic initiatives. For many, this has been on the horizon and they have been able to take it slow, allowing the market to vet the technology. As competition becomes ever more fierce and innovation happens faster and faster, now is the time to figure out what digital transformation means for you. When a business goes digital, it is not only able to leverage the information stored away in its legacy systems, but it can also make that information available outside the firewall to customers, devices, and partners. Gartner says that digital platforms are the focal point for integrating ecosystems of people, business, government and things (Harnessing APIs and Open Data, May 2017). Thriving digital ecosystems are shaped around the exchange of data, and APIs are the way to exchange data in a robust marketplace. Since more organizations are investing in digital transformation than ever before, APIs are proliferating. The reasons for investing in APIs are many, and there is no right answer for every business. For many, the ability to apply industry-standard security is of the utmost concern; others find that easy-to-implement policies like throttling, rate limiting, and version control are critical to developing quickly and to making sure that the app developers who depend on well-developed APIs get the tools they need to support agile methodologies. Still other organizations need visibility into how their data is moving, how APIs are working, and who is accessing them. Whatever the reason, API proliferation has resulted in a vast number of APIs and a need for API Management. In the most simplistic of terms, APIs are products that are created. 
Products must be managed to serve their customers the best user experience, so their product managers need to be attuned to every aspect of the API Lifecycle. API Management has historically been reactionary, trying to corral what has already been created. Typically, this has included the operational aspects of the lifecycle – security, deployment, consumption, and monitoring – but Oracle API Platform Cloud Service with Oracle Apiary offers another aspect of a full-lifecycle API Management tool: Design. When you’re driving a car, you want to see where you’re going. Looking through the windshield lets you react to the car slamming on its brakes in front of you, maneuver around a turn, or avoid the pothole up ahead. It is incredibly tough to steer a car if you’re only able to see through the rear-view mirror. Similarly, including design and standardization as critical elements of the API Lifecycle helps make sure that all stakeholders – business units, both API & app developers, executive sponsors – know the plan and are able to help course-correct when they see a problem coming. Not only does API Design save time upfront, but considering design and catching problems early saves money. Imagine allowing a buggy product to go into deployment: maintaining and supporting that product costs 100x more than catching the bug in the design stage. Design is a critical piece of the API Management puzzle. To find out more, be a part of the conversation around API lifecycle management. Join us on June 7th at 10 am PDT (1 pm EDT) when @OracleIntegrate hosts a Twitter Chat to talk about the evolution of API Design & Management. For even more information, check out the free Oracle Apiary white paper: How to Build an API  
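One way to picture design as a first-class lifecycle stage is a contract check: treat the designed endpoints as the agreed plan and diff them against what the implementation actually registers, so drift is caught before deployment rather than in the rear-view mirror. This is a minimal sketch under assumptions: the spec and route sets are hypothetical, and this is not how Apiary or Oracle API Platform represent contracts.

```python
# Endpoints promised by the (hypothetical) API design document.
DESIGNED_SPEC = {
    ("GET", "/trunks"),
    ("POST", "/trunks"),
    ("GET", "/trunks/{id}"),
}

# Routes the (hypothetical) implementation actually registers.
IMPLEMENTED_ROUTES = {
    ("GET", "/trunks"),
    ("POST", "/trunks"),
}

def contract_gaps(spec, routes):
    """Return (designed-but-missing, implemented-but-undocumented) endpoints."""
    missing = sorted(spec - routes)        # promised in the design, never built
    undocumented = sorted(routes - spec)   # built, but never agreed in the design
    return missing, undocumented

missing, undocumented = contract_gaps(DESIGNED_SPEC, IMPLEMENTED_ROUTES)
```

Run as a pre-deployment gate, a non-empty `missing` or `undocumented` list is exactly the "buggy product going into deployment" caught while it is still cheap to fix.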


Why Invest in API Management?

In this post I will highlight why APIs and API Management are essential to succeed in today’s business environment and why it is inevitable that businesses invest in a good API Management solution to accelerate innovation in today’s rapidly changing marketplace. Digital Economy We are living in a world of digital disruption. We have all used products and services that have been replaced by newer, innovative products and services. For example, Yelp has replaced the old Yellow Pages, smartphones have replaced cameras, Google Maps has replaced old paper maps, and streaming services such as Netflix are replacing cable television. And we all know about Airbnb and Uber, which have disrupted entire industries such as hotels and transportation. Why are these disruptions taking place? And why is it so easy today for unexpected vendors to come up with services that are innovative and disruptive? It is because the barriers to entry are low. Everyone has access to the new “Digital Building Blocks” such as mobile, IoT, social technologies, and cloud computing, which allow for rapid innovation. Also, the new Microservices-driven development approach is allowing for faster development, higher agility, and faster time to market. Developers building these new, innovative services have a variety of modern application development patterns to choose from. They are building lightweight, polyglot apps based on Microservices or Serverless architectures. They are building web apps, mobile apps, and bots using newer visual, low-code environments. They can choose from a variety of cloud infrastructures and data stores. They can choose from a number of app development infrastructures, DevOps tools, and deployment options such as VMs or Docker containers. Legacy architectures and large monolithic applications are now transitioning into Microservices and distributed computing architectures. 
While developers have the liberty to choose from all of these development, infrastructure, and tooling options, we need to ensure that all these apps and Microservices are able to work together. API-led integration is the glue that holds these building blocks together and allows them to communicate smoothly. APIs are at the heart of a modern IT architecture that allows businesses to survive and thrive in this new digital economy by accelerating their innovation and meeting customer demands faster. APIs – the Doors to Digital Transformation Now that we have an understanding of the type of environment we are living in, let us take a look at APIs and API-led modern architecture. API stands for Application Programming Interface and is one of the most common ways of communicating digital information today. Most apps today use APIs in the back end, even though they may be invisible to the end user. Whether it is commoditized apps such as Facebook or Twitter, or other apps provided by your banking or health institutions, they are all exchanging information with one another using APIs. APIs help businesses address two main use cases. The first is where an enterprise wants to expose a set of services or data internally for all its employees to consume. It can encapsulate commonly used services as APIs, or expose data available in back-end systems to promote re-use and governance within the enterprise. The second use case is where the enterprise would like to expose these APIs to the external world for branding, building developer mind share, marketing, customer engagement, or monetization. Some examples of APIs exposed to the external world include the Google Maps API, the Uber API, the Spotify API, and the Facebook and Twitter Login APIs. There are hundreds of other examples where developers are consuming publicly available APIs (some paid and some unpaid) to build new products and services. Is it any wonder, then, that APIs are called the Doors to Digital Transformation? 
Broadly speaking, APIs provide four main benefits to businesses:

- Customer Satisfaction & Engagement: businesses want to improve the service-level interface with the customer as part of their key business goals
- Revenue Growth: APIs make it possible to monetize existing data and service assets and create additional channels of revenue
- Operational Efficiency: cost savings and overall process improvements resulting from an enhanced customer experience
- Partner Contribution & Ecosystem: distribution of content and assets through new channels and partners. For example, the Facebook Messenger APIs allow hundreds of companies, partners, and developers to build chatbots for their business needs.

The need for API Management

Having understood why APIs are important, we must also understand why API Management is critical. API Management is the layer between the APIs and their consumers, internal or external. Having APIs is not sufficient; organizations must be able to maximize the value of those APIs. Businesses must make their APIs well documented and discoverable so consumers can maximize adoption. Organizations must also secure their digital assets with API-specific policies, since exposing APIs brings significant risks to the IT infrastructure, and this risk must be managed. They must define service level agreements (SLAs) specifying constraints such as application rate limits, quotas, and plans. Finally, organizations need visibility and metrics showing who is using the APIs, successes and failures, request and response times, and so on. This insight helps them understand how to use existing digital assets to meet their business goals. In short, companies must manage their APIs, and an API Management solution allows them to do that and keep pace in the digital economy.
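Rate limits and quotas of the kind described above are commonly enforced with a token-bucket algorithm. The sketch below is a generic illustration of the idea, not Oracle's implementation; the capacity and rate numbers are arbitrary:

```python
import time

class TokenBucket:
    """Illustrative token-bucket rate limiter: allows bursts of up to
    `capacity` requests, refilling at `rate` tokens per second."""

    def __init__(self, capacity, rate):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller would typically answer HTTP 429 Too Many Requests

bucket = TokenBucket(capacity=5, rate=1)   # 5-request burst, 1 req/sec sustained
results = [bucket.allow() for _ in range(6)]
```

In a managed platform this logic runs as a policy in the gateway, keyed per registered application, rather than in application code.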
The API Lifecycle

The API Lifecycle consists of different stages, as shown below:

Design – this stage is about designing the API. The design stage should promote API-First development, which means that APIs are designed before they are implemented. Often we have already implemented back-end services which we then expose as APIs. Although this may be fine in some cases, an API-First approach ensures that the consumers of the APIs participate in their design before they are implemented. Consumers can collaborate with API Designers to ensure that APIs are designed the way they want, which maximizes API adoption.

Personas in this stage: API Designer / API Product Manager
- Gathers the requirements of the consumer
- Applies best practices for API design
- Documents the API to make it easy to learn how to use
- Gains agreement with the consumer on the design of the API

Implement – this is the stage where the API is implemented by the API service developer. This stage also includes creating the API in the API Management layer. Creating the API backend implementation is different from creating the API in the management layer: the latter creates the API, connects it to the backend implementation, and allows API Managers to add security policies and documentation and publish it in the consumer portal where consumers can find it.

Personas in this stage: API Manager / Implementer
- Creates the API, applying policies to support the design and to ensure security
- Tests the API
- Deploys the API
- Monitors and manages the API

Deploy – this is the stage where the API is deployed to a "gateway" where it is ready to service runtime requests. Deployment requests are reviewed and approved by Gateway Managers. In some organizations the same person plays the roles of both API Manager and Gateway Manager; in others, the Gateway Manager is an operations role.
Personas in this stage: API Manager / Gateway Manager
- Deploys and configures gateway nodes
- Reviews and approves API deployment requests where necessary
- Monitors and manages the gateway(s)

Manage – this is the stage where you manage the running API. Perhaps you want to add more security policies, add rate-limiting SLAs, or undeploy or redeploy an API to a different gateway.

Personas in this stage: API Manager / Gateway Manager
- Monitors and manages the API
- Adds or removes security policies, SLAs, etc.
- Deploys / undeploys / redeploys the API

Discover – in this stage, consumers discover the APIs that are available for consumption. Typically the API Manager publishes APIs that are ready for consumption to a consumer portal. The consumers (typically developers) browse the available APIs in a Developer Portal, read the documentation, test the APIs to ensure they meet their needs, and register to use them.

Personas in this stage: API Consumer
- Searches the API catalog to identify existing APIs
- Registers desired APIs with application(s)
- May collaborate with an API Product Manager / API Manager to design the API and capture the requirements

Monitor – this is the stage where the API is monitored by the API Developer, API Manager, and API Consumer to gain visibility and metrics. The API Developer and Manager monitor who is invoking the APIs, which APIs are most popular, what the common security violations are, what the average request and response times are, and so on. The API Consumer monitors how many requests have succeeded, how many invocations have been made, average request and response times, etc. Metrics are particularly important for businesses to make sure they are using their digital assets as efficiently as possible. Accurate analytics are also important for measuring metrics related to performance, SLAs, monetization, rate plans, quotas, and more.
Monitoring an API may reveal information that requires APIs to be redesigned, or new APIs to be designed and deployed, and thus the API Lifecycle continues.

Personas in this stage: API Consumer / API Manager / API Developer
- Monitors the API
- Makes sure that performance, SLAs, and security policies are met as needed

Oracle API Platform Cloud Service

Now that we understand the need for an API Management solution, let us talk about the brand-new API Management offering from Oracle: the Oracle API Platform Cloud Service. Oracle API Platform Cloud Service provides a foundation for digital transformation through the first API Management offering that covers the full API Lifecycle. Encompassing the complete API continuum, from API design and standardization via Apiary's trusted API design and documentation platform through API security, discovery and consumption, monetization, and analysis, the API Platform Cloud Service provides a completely new, simplified API management user experience on top of a proven API gateway.

Key Features

Some of the key features of the Oracle API Platform Cloud Service include:

- Apiary Mock Server: allows users to quickly model an API on Apiary's hosted server to make sure that everyone (API Developers / API Consumers) is on board
- Apiary Interactive Documentation: gives API consumers the information they need to succeed.
The Apiary Documentation Console takes the API description and allows you not only to read and write, but also to interact with your API—even before you've built it.

- API Implementation: allows API Designers and API Consumers to build new APIs using the Apiary UI
- API Deployment: enables Gateway Managers to deploy, activate, deprecate, and remove APIs
- API Inventory & Catalog: allows API Consumers to discover which APIs are available for use
- Application Registration & Management: allows API Consumers to manage the applications using your APIs to ensure proper usage
- Operational Analytics: allows API Consumers and API Managers to gain visibility using more than 10 pre-built charts showing, for example, who is using the APIs, how, and whether they are encountering issues
- Policies: top security, quality-of-service, and routing policies
- User Roles & Grants: a comprehensive grants model to control access to your APIs with API-level entitlements

Architecture

Oracle API Platform is built using a modern Microservices-driven architecture. Each component is built as a small microservice accessible through a REST API. For example, creating metadata, deploying an API, or collecting analytics can all be done using REST APIs. The following diagram shows the product architecture at a high level.

Management Portal: the management portal runs in the Oracle Public Cloud and is the heart of the Oracle API Platform Cloud Service. The management portal, along with Apiary (also running in the cloud), enables API-First design and development of APIs. API Managers can create, manage, secure, and publish APIs using the management portal. The APIs are deployed to runtime gateways, which enforce policies at runtime.

Gateways: API gateways are the runtime components that enforce all policies specified through the management portal. Gateways also help collect data for analytics. The gateways can be deployed anywhere: on premises, on Oracle Cloud, or on any third-party cloud provider.
This allows the gateways to be deployed closest to your backend services. Some organizations may not want their runtime data to pass through the cloud, so they can deploy the gateway on premises; their data is never sent back to the cloud. Even for analytics, only aggregated information is passed back periodically to the management service running in the cloud.

Developer Portal: after an API is published, API Consumers (typically application developers) use the Developer Portal to discover, register for, and consume APIs. The Developer Portal can be completely customized to an enterprise's needs, so that it can be published as the enterprise's own portal to its customers. The Developer Portal can also run either on the Oracle Cloud or directly in the customer environment on premises.

Why Oracle API Platform Cloud Service?

The API Platform Cloud Service provides complete support for the API Lifecycle, from API design and standardization to API security, discovery and consumption, monetization, and analysis, on top of a proven API gateway. Some of the key differentiators include:

- Deployment Flexibility: the API Platform Cloud Service is a hybrid solution where management is in the cloud but gateways can be deployed anywhere: on premises, on Oracle Cloud, or on third-party cloud providers. This allows organizations to deploy the gateways closest to their backend services with the assurance that their data never goes to the cloud
- Low Operational Cost: the management portal of the API Platform Cloud Service runs in the Oracle Public Cloud and is completely managed by Oracle. As a customer you need only run the gateway, the operational cost of which is very low
- Comprehensive Functional and Operational Analytics: the platform provides robust, fine-grained analytics and out-of-the-box support for custom security policies.
It also provides a complete audit history of all APIs.

- Customizable Developer Portal: the Developer Portal allows complete customization of the CSS and the addition of languages through a REST API. The portal can also be hosted by the customer using the provided jQuery source code, allowing customers to host the Developer Portal as their own portal to their customers
- Proven Technology: the platform's gateway is based on proven technology that has been tested in rigorous production environments at telecommunications companies around the world for several years

Summary

Today's fast-changing business environment makes it mandatory for companies to invest in a comprehensive API Management solution to keep pace with rapid innovation in our digital economy. Companies that do not will stumble and fall behind. Oracle API Platform Cloud Service provides complete support for the API Lifecycle, from API design to discovery and consumption, monetization, and analysis. Using Oracle API Platform Cloud Service, companies can take advantage of their existing digital assets, be agile, and accelerate innovation, consequently increasing customer satisfaction and engagement, lowering cost, and increasing revenue.


Integration

Learn How to Integrate Systems 6 Times Faster

How much time does your IT team spend on integration work? If you're like most SMBs, it's probably a lot. And yet, somehow, it's never enough. Small to medium businesses tend to have small IT teams—usually a handful of people—and they're expected to do everything from managing software licenses to fixing the CEO's computer when it crashes. There's not a lot of time to spend on strategic projects.

The cloud has lifted some of this burden off the backs of overworked IT teams—but if your SMB has been around for a few years, you probably have some on-premises systems. And when you decide to invest in a new cloud solution, you need to integrate it with your on-premises applications. When this happens, how do you manage it? Do you hire more staff? Pay a partner to do it for you? How much does that cost? And what happens when the partner leaves?

Integration Done 6 Times Faster

Integration has traditionally been a barrier to delivering new applications and services, and to innovation in general. Speed matters—not only so that your team can support strategic projects, but so that the business can get to market more quickly with new services and digital revenue streams. There's good news out there: a complete platform that makes integration 6 times faster and far less risky—at a price that even SMBs can afford.

Join this webcast to learn how easy it is to:
- Create and automate new business processes across cloud and on-premises systems
- Rapidly build new applications, mash-ups, and mobile apps to engage customers and partners
- Leverage the latest technology such as mobile, chatbots, and the Internet of Things to offer exciting new products and services

Learn how simple it can be to connect everything for continuous innovation. Register now to attend the webcast.


API Platform for teams who want to focus on building great products

Oracle announced the availability of its new API Platform Cloud Service. This offering is a truly hybrid solution for delivering APIs, but it goes well beyond simple API Management. API Platform Cloud Service is designed for teams who build APIs and who want to focus on building great products. Until now, API teams had to stitch together multiple tools, which caused them to lose focus. API Platform Cloud Service is the only solution that supports the complete API Lifecycle, and it allows developers, architects, and business leaders to work together to build great products!

To understand this further, let's take a look at the lifecycle of an API. The term "API" is loosely thrown around and often incorrectly applied to a hastily implemented REST service. I've heard of people pointing to a REST service and calling it an API. Such a service lacks any sort of documentation (the first step of building an API) and is often just an exposure of some underlying data object in REST/JSON form.

Creating effective APIs actually involves multiple people! A well-developed API begins with the end in mind. Rather than just exposing some underlying implementation, we consider the desired outcome: are we creating the next mobile app, the next chatbot, or the next technology we haven't even considered? How does that client need to interact with the system? Great APIs begin with an idea and a design around that idea, and contributions can come from multiple participants. To foster this communication and collaboration, we need a solid "API-aware" whiteboard. Oracle Apiary provides a robust API design, collaboration, documentation, and testing platform that facilitates the design discussions of an API. Using Oracle Apiary, multiple participants from different parts of the enterprise, and even outside of it, can collaborate, design, and test their API ideas.
A key benefit of Oracle Apiary is that just by working together and defining what the API should do, a mock service and documentation are automatically created! A mock service is a sample service based on the design; it allows application developers to begin creating their applications, chatbots, and so on from the design, while service developers and API Managers work on the implementation in parallel. Beyond the mock service, Oracle Apiary supports interface testing to regularly test the implementation of the API and to alert the team if the implementation and the design ever diverge. This rapidly speeds delivery while reducing project risk.

Once the API is implemented, there is a cycle of granting the appropriate consumers access to the API, and publishing the API to make it discoverable so that consumers can find and use it. There also needs to be the ability to analyze the performance of the API and to manage it through its lifecycle.

Functional testing is very important for a successful API strategy. API Fortress expands testing from interface testing to deep-level testing of the APIs. Using API Fortress, you can validate not only the interfaces of the APIs but also the message structures. This further reduces risk: changes to the service implementation that could break your consumers are caught by API Fortress, and you are alerted. Beginning with a design-first approach in Oracle Apiary further helps you in the lifecycle, because API Fortress connects to Apiary and can build out your tests right from your design! Using Oracle API Platform and Oracle Apiary with API Fortress provides a complete API Lifecycle that allows your team to focus on the most important thing: delivering great product!
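The mock-service idea is worth making concrete. The sketch below is a deliberately minimal, hypothetical stand-in (the endpoints and payloads are invented, and real Apiary mocks are generated from an API description rather than hand-written): consumers code against canned responses agreed at design time while the real backend is built in parallel.

```python
# Toy illustration of "mock before you build": the design team agrees on
# endpoints and payload shapes, and consumers develop against canned
# responses until the real implementation exists. All endpoints and
# payloads below are invented for illustration.
MOCK_RESPONSES = {
    ("GET", "/orders/42"): (200, {"id": 42, "status": "shipped"}),
    ("POST", "/orders"):   (201, {"id": 43, "status": "created"}),
}

def mock_call(method, path):
    """Return the canned (status, body) for a designed endpoint,
    or 404 if the design does not define it."""
    return MOCK_RESPONSES.get((method, path), (404, {"error": "not in design"}))

status, body = mock_call("GET", "/orders/42")
```

Because the mock and the documentation come from the same design artifact, interface tests can later compare the real implementation's responses against these agreed shapes and flag divergence.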


Integration at the Speed of Business

Being recognized by Gartner as a Leader in Enterprise iPaaS just two years after inception has been the most welcome news yet for our team in 2017. With the explosive growth in the number of SaaS and PaaS solutions in the past few years, the cloud has transformed into a "best-of-breed playground" where organizations mix and match specialized applications that serve a specific business function or pillar (e.g., Sales, Service, Marketing, Talent Management, Payroll) and then integrate them to build end-to-end business solutions. Enterprise iPaaS platforms, with their modern user experience and ease of use, have become the new favorite of cloud adopters, helping bring down costs and accelerate time to market. They also help enterprises free up budget and resources for growth-focused, innovative projects that can differentiate them from their competition. In fact, according to Gartner, this fervent demand for iPaaS is reflected in a growing market segment estimated at around $11 billion today, with over 120 iPaaS vendors in the market, and projected to hit nearly $15 billion by 2020.

However, there are dangers lurking for organizations who make the move without a well-thought-out, organization-wide cloud strategy. The truth is that most organizations move to the cloud incrementally, from within various business units, where each team drives its own functional requirements as well as its own vision of cost savings, simplicity, and efficiency. Without an organization-wide cloud strategy, each team drives its own choice of cloud solutions without understanding the organization-wide impact. In the absence of a cloud strategy, enterprises end up spending more time integrating and managing a "Frankenstein" solution than on the revenue-generating, innovation-focused projects that can drive growth.

“Cloud helps you to simplify, but it does not change the laws of physics.
The more unrelated elements you have in a system, the more issues and problems you’ll have in the implementation and operation—and the more headaches you’ll face every time you do an update.” - Jon Chorley, GVP, Oracle SCM Cloud

In organizations that have been successful cloud adopters, Business, IT, and LOB/Apps IT business units work closely with each other, yet autonomously, knowing very well that modern iPaaS platforms are no magic pill against spaghetti architectural nightmares in the cloud. It takes "practical governance," strategy, and processes in the organization to effectively embrace the cloud. Integration is far less a technical exercise and far more a strategy exercise when it comes to measuring its long-term effectiveness within any enterprise.

When Oracle started its journey of Integration in the Cloud a few years ago, we had three key objectives in mind.

1. Simplify SaaS integration as much as possible.

This made a lot of sense for us to focus on first; Oracle itself has a large SaaS footprint, so this was critical for us as well. SaaS applications from Oracle, such as Sales Cloud, ERP, and CPQ Cloud, as well as those from third parties such as Salesforce, ServiceNow, or SuccessFactors, are diverse in form factors: underlying data models, security policies, APIs, integration patterns (real-time, bulk/batch, file-based), and so on. The magnitude of the challenge multiplies with the number of apps being on-boarded within organizations, and this rate of adoption has been on a phenomenal rise. Mid-to-large organizations own, on average, more than 23 SaaS applications across their enterprise, and there are more than 2,300 SaaS applications in the marketplace today. Talk about scale!

2. Enable ad-hoc and citizen integrators to build and deliver integrations quickly on the platform.
Unlike years past, when IT had massive budgets, several of the new integration projects were now being driven by lines of business. As an example, a VP of Sales had an annual budget to modernize sales automation, and their team of Apps IT designers was tasked with delivering a solution that involved integrating data and processes among various applications. However, several of these designers had had little to do with traditional integration and middleware technologies in their careers. These users were predominantly functional experts with some range of technical skills. They look for a modern, simple, application-centric approach to integration, where the focus is on the functional end-to-end experience. These teams cared less about the fine-grained control and configuration that traditional SOA offered, and more about ease of use.

3. Enable customers to harness the data and value of their existing on-premises application assets and middleware platforms.

Oracle's large footprint of 7,000+ on-premises middleware customers and over 100,000 on-premises Database and Enterprise application customers (EBS, JDE, Siebel, etc.) reflects the scale of the market segment of enterprises that continue to have application and middleware footprints in their data centers. A large majority of these enterprises already have a cloud strategy in the planning and/or execution phase, but they aren't eliminating their on-premises footprint any time soon. Several of these customers want to use the cloud for their digital transformation projects, even though data ownership or residency may remain within the on-premises applications. There may also be business processes running on-premises, integrating various resident home-grown systems, enterprise applications, and/or trading partners, that the customer wishes to continue leveraging.
So enabling a hybrid platform for integration was extremely important for ensuring organizations can progressively, and meaningfully, embrace the cloud.

The past two years have seen us come a very long way, not only in our customer base (we have grown our iPaaS customer base five-fold in the last two years), but also in the maturity and completeness of our vision. This is reflected in how our customers and analysts perceive us today. Today, Oracle's iPaaS portfolio is an extremely rich set of capabilities across various offerings:

- Integration Cloud Service (elevated zero-code integration for ad-hoc integrators, optimized for SaaS integration)
- SOA Cloud Service (powerful SOA for integration specialists)
- Process Cloud Service (zero-code process automation)
- Managed File Transfer Cloud Service (secure file exchange, zero code)
- Integration Insight Cloud Service (analytics for ad-hoc integrators)
- API Platform Cloud Service (API design, management, discovery, and consumption)
- Self Service Integration Cloud Service (iSaaS for citizen integrators) - coming soon!

From just over 5 native connectors to on-premises enterprise applications and just one SaaS connector back in 2015, we have "hyperlooped" to more than 50 feature-rich connectors to various SaaS applications from Oracle and key third parties. These connectors/adapters are sold and supported directly by Oracle, and several of them are built and delivered by the SaaS vendors themselves. Being part of the organization that is also the largest SaaS vendor in the world has helped our teams work closely with various SaaS strategy teams to understand the nitty-gritty of SaaS integration pain points and key use cases, and thus ensure that the APIs, integration patterns, and practices are best in class and supported natively within the platform to simplify SaaS integration.
Figure 1: The Adapter Library within Oracle Integration Cloud Service

With connectivity challenges also come security challenges. With the integration platform running in the cloud, outside of the enterprise data center, security is a critical focus for all enterprises. Oracle's vision is to create the most secure and trusted public cloud infrastructure and platform services for enterprises and government organizations. Its mission is to build secure public cloud infrastructure and platform services where there is greater trust: where Oracle customers have effective and manageable security to run their workloads with confidence and build scalable, trusted, secure cloud solutions. In fact, Oracle has published an Oracle Cloud IaaS and PaaS Security whitepaper that outlines the security capabilities of Oracle Cloud. In addition, Oracle iPaaS provides powerful and flexible security policies at all layers of communication: transport level, message level, and even custom policies, depending on the application. All key security policies, such as SSL/TLS, WS-Security, HTTP Basic Auth, SAML, and OAuth, are supported within the iPaaS platform.

One of the critical security facets of having an integration platform in the cloud is the requirement to securely integrate with applications and endpoints within the customer's enterprise (e.g., an EBS PL/SQL API or an SAP BAPI). ICS comes with a secure Connectivity Agent that enables secure connectivity to any on-premises application without needing to open a pinhole in the customer's firewall. This helps enterprises have shorter and more effective conversations with their security teams while on-boarding their iPaaS platform in the cloud.
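Of the security policies listed above, the two simplest transport-level credential schemes, HTTP Basic Auth and OAuth bearer tokens, amount to constructing an Authorization header. A minimal, client-side illustration (in the platform itself these are enforced as configured policies, not hand-built in application code; the credential values here are placeholders):

```python
import base64

# Illustrative construction of two credential schemes mentioned above.
def basic_auth_header(user, password):
    # HTTP Basic Auth: base64("user:password"), per RFC 7617.
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

def bearer_auth_header(access_token):
    # OAuth 2.0 bearer token, per RFC 6750 (the token itself is
    # obtained separately from an authorization server).
    return {"Authorization": f"Bearer {access_token}"}

headers = basic_auth_header("integration_user", "s3cret")
```

Message-level policies such as WS-Security or SAML operate on the payload rather than the transport headers, which is why a platform that supports both layers, plus custom policies, matters.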
The increased number of traditional "SOA" middleware customers embracing the cloud also brought major enterprise-readiness requirements to us, including more advanced integration constructs and enterprise-grade orchestration, monitoring, and error-handling capabilities. Several of these customers expect the same elevated zero-code user experience of ICS while modeling more advanced multi-step orchestrations. While designing and delivering these next-generation capabilities, we formed UX and functional design partnerships with integration specialists from several of these early-adopter "SOA" customers to ensure that the new web user experience is more elevated and requires users to understand fewer concepts and underlying technical details, while delivering the same power-packed functionality. Today, ICS supports a modern multi-step orchestration designer and runtime with rich logical constructs for conditional routing, iteration, scheduling, and fault handling, along with advanced resiliency features, all with no code.

Figure 2: The zero-code Orchestration designer with "No source view"

These elevated experiences in no way make SOA less significant for enterprises. Requirements for advanced SOA will continue to exist: SOA helps organizations implement medium to very complex use cases and gives integration specialists the much-needed control over the ecosystem through its powerful toolsets, scripts, and management consoles. Being able to run SOA in the cloud using Oracle SOA Cloud Service, an integral part of Oracle's iPaaS portfolio, provides an easy way for customers to quickly provision and run SOA projects in the cloud, including dev/test (and thus avoid major operational expenses and project delays), and/or lift-and-shift existing SOA projects from on-premises to the cloud to accommodate a shift in the application center of gravity.
Together with ICS and SOACS, customers can run a true bimodal practice in their organization and balance two competing priorities: providing advanced, stable, secure, high-performance services, and delivering innovative, technology-intensive services quickly.

I would also like to add another popular and critical aspect of our ecosystem: a large partner community delivering connectivity and pre-built solutions for our platform. These connectors and packaged integrations will be available on our Cloud Marketplace for discovery and use within your integrations. The Oracle Integration and API Partner summits are now running across various cities in North America and around the globe, and all of them are currently overbooked. In fact, we just wrapped up an exciting, well-attended summit on Integration and APIs in Reston, VA. Partners have been working with us on various fronts. Several of them work with us on native connectivity/integration with applications such as Adobe eSign, MS Dynamics CRM, and several others. Some partners have delivered packaged integrations between apps such as Salesforce.com, SAP, Ariba, SuccessFactors, and so on, whereas others have built dedicated Centers of Excellence and practices for delivering integration projects on Oracle's Integration Cloud Platform. With the breadth of skills available on the product and platform, customers won't face a shortage of skill sets when it comes to engaging a skilled SI to deliver an integration project on Oracle iPaaS.

The capabilities do not stop here.
There is the rich process designer for the human element in business processes; the very critical API-driven experience with Oracle's API Platform, which allows organizations to deliver and consume capabilities incrementally and scalably within and across enterprises; Integration Analytics, which is so easy to use for critical business insight into important transactions; the Self Service Integration Cloud, which business users can use to automate their critical day-to-day business activities with absolutely no knowledge of integration technologies; and so much more!

The difference between "vision with the cloud" and a "clouded vision" comes down to your understanding of the business challenges involved in your move to the cloud, and the choice of products and platform your organization has made for building a solution. It all boils down to your organization's overall cloud strategy: getting that right, across all your teams, is most important. With Oracle's iPaaS platform, we try to simplify a critical part of that problem, the pervasive integration challenge, and I believe we've gotten good at it! Join us at Gartner AADI in London to hear more about Oracle Integration Cloud Service and the rest of the Cloud Platform for Integration.


When Creating APIs, Focus your API Gateway on What it Does Best

When talking with customers and prospects, I often hear about two requirements: converting REST to SOAP (or SOAP to REST), and caching data for performance. These are great opportunities for an API Gateway, but they should be used with extreme caution. Misuse of either feature can cause performance problems in your gateway or, even worse, put your sensitive data at risk! Using an Integration Platform alongside an API Platform provides the opportunity to follow best-practice approaches to data isolation and to scaling certain heavyweight operations. Furthermore, by separating these concerns, sensitive data is better protected. First, let's look at the key purposes of an API Platform and an Integration Platform. The API Platform is responsible for the following:

- Protecting endpoints: Only allowing authorized clients to call services. An authorized client is one who is properly authenticated (AuthN), has the appropriate rights (AuthZ), has not crossed usage thresholds (rate limiting), and so on. If a client is not properly authenticated or authorized, it should be stopped at the "front door", or DMZ.
- Discoverability: We want to promote usage of APIs. We need our developer base to be able to find, learn about, and try our APIs, and our developers need a way to onboard to use them.
- Manageability: We need to know how our APIs are performing, identify any potential problems, and be able to evolve our APIs over time. We need to know who is using our APIs and ensure a proper quality of service.
- Design-first: A complete API Platform won't just manage the implementation but will also help with the concept and design of an API. Stakeholders should be able to collaborate on an API apart from any back-end implementation.
- Monetization: We may have the opportunity to derive real revenues from our APIs. Even if an API is not a great candidate to generate revenue, having control over how clients can use it is powerful.
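The front-door checks above are easy to picture in code. Below is a minimal, illustrative Python sketch of the AuthN, then AuthZ, then rate-limit ordering; it is not an Oracle API Platform interface, and every name in it (`RateLimiter`, `gateway_check`, the chosen status codes) is invented for the example.

```python
# A minimal sketch (not an Oracle API Platform interface) of the gateway's
# "front-door" checks, cheapest first: AuthN, then AuthZ, then rate limiting.
import time
from collections import defaultdict, deque

class RateLimiter:
    """Sliding-window limiter: at most `limit` calls per `window` seconds per client."""
    def __init__(self, limit, window=60):
        self.limit = limit
        self.window = window
        self.calls = defaultdict(deque)

    def allow(self, client_id):
        now = time.monotonic()
        q = self.calls[client_id]
        while q and now - q[0] > self.window:  # drop calls outside the window
            q.popleft()
        if len(q) >= self.limit:
            return False
        q.append(now)
        return True

def gateway_check(request, valid_keys, grants, limiter):
    """Stop at the first failure, mirroring the "front door" behavior described above."""
    client = valid_keys.get(request.get("api_key"))        # AuthN
    if client is None:
        return 401
    if request["resource"] not in grants.get(client, ()):  # AuthZ
        return 403
    if not limiter.allow(client):                          # rate limiting
        return 429
    return 200
```

A real gateway would of course also validate the request body and scan for exploits before letting the call through; this sketch only shows the ordering of the basic checks.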
The Integration Platform is responsible for the following:

- Connecting: In a perfect world we would have only well-designed microservices that speak REST/JSON. In the real world, we have a bunch of legacy back-end systems. We need a robust adapter portfolio to connect to various back ends, from databases to files, queues, and complex SOAP interfaces.
- Transforming: Remember that perfect world of microservices? In the real world, the back-end implementation rarely matches the requirements of the front-end interface. Sometimes an API requires multiple back-end services. Most often, an API simply requires a different format, such as JSON, whereas the back end might be SOAP.
- Caching and scaling: Many back-end systems were never designed to handle web-scale applications. Firing the same query at multiple back-end systems repeatedly for all of our devices on the Internet won't scale. We need to cache our results where possible and be able to scale out our middle tier to protect our back-end systems.

In looking at the diagram, the question might be asked: "Why not have the API Platform (or Integration Platform) do both, so only one is required?" Many platforms on the market today have a lot of overlap and can perform similar functions across domains. Here are a couple of reasons to leverage both. The API Gateway should be as fast and light as possible. It should only be addressing the question "Is this client allowed to make the call?" The gateway should fend off bad requests, starting with the most basic questions (AuthN, AuthZ) and then validating the request (proper form, checking for exploits, etc.). The API Gateway should cache only non-sensitive data. There is a danger in caching data in the gateway, as sensitive data might accidentally be cached; sensitive data should not be held on disk or in memory in the DMZ. This is resolved by using the Integration Platform. The Integration Platform should focus on transforming complex payloads.
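To make the "different format" point concrete, here is a hedged sketch of the kind of JSON-to-SOAP transformation an integration tier performs. The operation name, fields, and namespace below are invented for illustration; a real integration would map against the back end's actual WSDL.

```python
# Illustrative only: wrap a flat JSON payload in a SOAP envelope.
# The "urn:example:orders" namespace and element names are made up.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def json_to_soap(operation, payload, ns="urn:example:orders"):
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for key, value in payload.items():           # one element per JSON field
        field = ET.SubElement(op, f"{{{ns}}}{key}")
        field.text = str(value)
    return ET.tostring(env, encoding="unicode")

xml = json_to_soap("CreateOrder", {"orderId": 42, "item": "widget"})
```

Even this toy version shows why the work belongs in the integration tier rather than the gateway: it touches the full payload, which is exactly the heavyweight, data-bearing processing you want out of the DMZ.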
The Integration Platform may have to perform some complex, multi-service calls. By leveraging a robust caching system, scalability and high performance can be achieved while keeping data at rest in a properly secured app zone. The Integration Platform should also provide connectivity to incompatible back-end systems; having adapters in an API Gateway introduces more complexity and more opportunity for exploits. Does this mean that an API Gateway should never offer these features? Not necessarily. There are cases where it makes sense to cache data in an API Gateway, or even to perform some lightweight transformations. We just need to recognize that there is no "one size fits all" approach and to understand the impacts and risks on a case-by-case basis. Ultimately, we need to focus the API Gateway on what it does best, and using an Integration Platform can help in this regard. We can bend the rules a bit, but we must be careful to protect sensitive information and deliver secure and scalable APIs. As a Product Manager for Oracle API Platform Cloud Service, my focus is API Management and Integration, but of course my words are my own and do not represent any statement or commitment by Oracle.


Oracle Recognized as a Leader in Gartner “Magic Quadrant for Enterprise Integration Platform as a Service”, 2017

Oracle announced in a press release today that it has been named a Leader in Gartner's 2017 "Magic Quadrant for Enterprise Integration Platform as a Service" report. In only its second year of being included, Oracle has been recognized by Gartner as a Leader, and of the 20 vendors included, Oracle achieves the second-furthest overall position for "Completeness of Vision". As explained by Gartner, a Magic Quadrant provides a graphical competitive positioning of four types of technology providers, in markets where growth is high and provider differentiation is distinct:

- Leaders execute well against their current vision and are well positioned for tomorrow.
- Visionaries understand where the market is going or have a vision for changing market rules, but do not yet execute well.
- Niche Players focus successfully on a small segment, or are unfocused and do not out-innovate or outperform others.
- Challengers execute well today or may dominate a large segment, but do not demonstrate an understanding of market direction.

Gartner views integration platform as a service (iPaaS) as "providing capabilities to enable subscribers (aka "tenants") to implement data, application, API and process integration projects spanning cloud-resident and on-premises endpoints." The report adds, "This is achieved by developing, deploying, executing, managing and monitoring "integration flows" (aka "integration interfaces") — that is, integration applications bridging between multiple endpoints so that they can work together." As described by Oracle, Oracle Cloud Platform, which includes iPaaS offerings, has experienced explosive growth, adding thousands of customers in fiscal year 2017. Global enterprises, SMBs, and ISVs are turning to Oracle Cloud Platform to build and run modern Web, mobile, and cloud-native applications. Continuing its commitment to its customers, Oracle has delivered more than 50 cloud services in the last two years.
Download the full Gartner "Magic Quadrant for Enterprise Integration Platform as a Service", 2017. My Personal Favorites on Why Oracle Integration Cloud is Exciting: Everyone has their favorite features of the Oracle Integration Cloud Platform. Ease of use, based on the modern and intuitive Oracle Cloud user experience, especially compared to legacy-style integration, is frequently at the top of most people's lists. Here are some of my favorite features:

- Pre-Built Integration Flows – Instead of recreating the most commonly used integration flows, such as between sales applications (CRM) and configure, price, quote (CPQ) applications, Oracle provides pre-built integration flows between applications spanning CX, ERP, HCM and more to take the guesswork out of integration. You select the flow and run it as-is or tweak it to suit your needs.
- Pre-Integrated with Applications – A large library of pre-integrations with Oracle and 3rd-party SaaS and on-premises applications through application adapters eliminates the slow and error-prone process of configuring and manually updating Web service and other styles of application integration. So as your future requires you to integrate more applications, each with different integration requirements, the experience for you is mostly the same. The adapters take care of the vendor-specific application integration requirements for you.
- Recommendations – Take the guesswork out of integration mapping by leveraging crowdsourcing from everyone who has done the same integration you are about to create. You get the input of proven integrations.
- Orchestrations – All the power of synchronous, asynchronous, and fire-and-forget process orchestration without the complexity. Lets you switch activities to create multiple routing options, ad-hoc mappings, schedule-based integration and much more.
- Dynamic Task Assignments – For approvals and exception management in process automation, providing same-day change and hassle-free upgrades to systems of record.
- API Management – Bringing together the two worlds of API design and API management as a result of Oracle acquiring Apiary.
- Integration Analytics – Visualize your business KPIs with no coding, configuration, or modification.
- Choice – If you still want to keep some of your integration on-premises and some in the cloud, Oracle gives you the hybrid integration option by making the same tools available in the cloud and/or on-premises. The choice is yours.

Discover your own favorite features by taking advantage of this limited-time offer to start for free with Oracle Integration Cloud. Check here for Oracle Cloud Integration customer stories. Customers located in or near the Eastern US can also register for Oracle's API and Integration Summit on April 19th in Reston, VA to learn about best practices in the design and development of APIs and integrations. Customers can register here for the API and Integration Summit and here for the Senior IT Roundtable. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.


If Customers Mind, It Matters: Digital Design Thinking for the Experience Economy

Written by: Geoffrey Bock, Principal, Bock & Company, and Daryl Eicher, Marketing Director, Oracle. Driving growth in the experience economy means putting customers first. Yet memorable service moments depend on operational excellence. Compelling digital interactions with customers depend on connected employees to make the kinds of memories that build brand loyalty. This obsession with simple customer and employee experiences is anything but easy to deliver. It demands digital design thinking: a new, and in many cases challenging, approach to design that effectively balances ever-increasing business expectations with the scarcity of highly skilled developer resources. Digital design thinking is far more agile and incremental than traditional application requirements analysis. Business operations teams expect high-bandwidth conversations and nearly immediate release of new capabilities. They don't have time for conventional requirements documentation and approval, but they know a good design when they see it. Digital design thinking starts with mocking up what business operations will see on their phones first and defers technical details until after the business workflow is flawless. Today's killer apps deliver seamless, multi-channel experiences that assist and predict, rather than just inform and respond. These seemingly simple apps engage and delight customers, employees, and partners by contextualizing content to speed decisions and by anticipating the best next steps to streamline day-to-day activities. Digital design thinking isn't just about building yet another "app for that." It's about making the apps you have easier to use in creative ways. So, in addition to focusing on the user experience first, digital design thinking must leave plenty of room for data-driven improvement over time.
This "design for change" imperative is antithetical to traditional developer-led automation projects, where application enhancements can take months or even years to go live. Consider the many decisions and coordinated actions a government agency must take to manage its contractor workforce. Getting a new contractor's first day on the job to be productive typically requires approved actions spanning multiple departments and existing applications such as payroll, benefits, an enterprise directory, and a skills registry. Retooling every application involved isn't an option for the department heads, so approvals, supporting documentation, and exception handling are typically handled via email and sneaker net. Traditional, developer-led automation projects typically put one-off system integration efforts on the critical path to delivering the full end-to-end solution. By comparison, digital design thinking starts with first principles: delivering a simple mobile app that ensures all needed approvals are in place and easily discoverable for compliance purposes. After this approval workflow is agreed upon across department heads, the next consideration is what content is needed to speed time to decision. Subsequent releases of the mobile app may add supporting documents or forms. Only after the workflow is operating smoothly, and any existing email attachments are easily accessible to department heads from within the new app, are the difficult issues of integrating with existing systems of record considered. This pragmatic, human-centric, business-led approach depends on low-code app development, where visual models replace coding and automation is in the hands of subject matter experts. The final stage in digital design thinking is to prioritize connections with selected systems of record based on their impact on operational decision-making.
Modern low-code app dev relies on the ability to abstract the technical details of integration so that business analysts can focus on the decisions at hand without distractions and delays. Oracle Cloud Platform for digital business enables business agility and compliance with a shared service catalogue of pre-built connections. Third-party and in-house developers publish additional services to the catalogue for business analysts to use to refine their simple mobile apps over time. How should you get started? Be sure to design for simplicity on the front end. Expect to add innovative capabilities, including intelligent services running within cloud environments, to the back end. Ensure that every decision point and every human interaction is informed by contextual content. And finally, anticipate the best next steps to simplify the tasks that end users need to perform to get things done. For more on how digital design thinking powers business agility, check out this on-demand webinar and supporting solution brief.


What's New in Oracle Integration Cloud Service 17.1.3

The Oracle Integration Cloud 17.1.3 release is brimming with new features. Explore some of the key ones below, or click on "What's New" for detailed information.

ICS Moves to a Staggered Release Schedule!

Organizations wanting greater control over the upgrade schedule for their development, test, or production environments have some exciting news! Starting with 17.1.3, ICS is moving to a more flexible "staggered release schedule". Click here for more details. What does this mean for you? For 17.1.3, Oracle Integration Cloud Service instances created on or after the release date are provisioned with the new 17.1.3 release. All previously provisioned instances remain on release 16.4.5 for the next 30 calendar days, after which individual instances can be selectively upgraded to 17.1.3 (contact your Oracle Account Manager or Customer Success Manager to nominate Oracle Integration Cloud Service instances for this time window). A further 30 calendar days after this, all remaining instances are upgraded to 17.1.3. This staggered upgrade approach is intended to give you enough time to test your Oracle Integration Cloud Service integrations on the new release before upgrading.

On-Premises Connectivity Agent Enhancements

In 17.1.3, we have modernized the Agent's overall architecture for an enhanced user experience. With this release, the Agent is also certified on SUSE Linux and RHEL platforms (SUSE Linux Enterprise Server 12 SP1 and Red Hat Enterprise Linux Server release 6.6 (Santiago)).

Global Fault Handler in Orchestrations

17.1.3 allows you to model fault handlers that can catch and process errors in your orchestrations. This global fault handler can be accessed from the Orchestration Designer and provides a full-canvas design for modeling your fault-flow orchestration. You can send failed messages to the error hospital for manual review and recovery.
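As a quick footnote to the staggered release schedule described at the top of this post: the two 30-day windows are simple date arithmetic. A sketch, using a hypothetical release date (the actual date depends on when 17.1.3 reaches your instance):

```python
# Illustrative only: compute the two staggered-upgrade milestones described
# in this post (30 days to selective upgrades, 30 more to the final upgrade).
from datetime import date, timedelta

def staggered_windows(release_date):
    """Return (selective_upgrade_opens, remaining_instances_upgraded)."""
    selective = release_date + timedelta(days=30)
    remaining = selective + timedelta(days=30)
    return selective, remaining

# Hypothetical release date, purely for illustration:
opens, final = staggered_windows(date(2017, 3, 1))
```

With a March 1 release, selective upgrades would open March 31 and remaining instances would be upgraded April 30.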
More Orchestration Actions

Looping with the "While" Activity: Similar to the "For-Each" activity, you can now perform looped execution in orchestrations for as long as the expression within the "While" is true.

Delayed Execution with the "Wait" Activity: Your orchestrated flow might reach a point where it is necessary to pause before continuing. An example would be a flow that needs a product part to be delivered by a third party before any further action can be taken.

Sending Email Notifications from Orchestrations with the "Notification" Activity: Users can send email notifications from orchestrations through the new notification activity, and can parameterize and dynamically specify the recipient and mail subject/body with contextual information from the integration flow.

Security Features

Support for Mutual Authentication (Two-Way SSL) in the SOAP Adapter: SSL is a transport-level cryptographic security protocol that enables secure communication between a client and a server over a network. The goal is to provide privacy and integrity for the data being exchanged between these systems. In one-way SSL (which we usually just call SSL), the invoked service authenticates itself with a certificate that helps client applications verify the service's authenticity before establishing a communication channel. Two-way SSL adds another level of security wherein the calling client also needs to present its identity (via a certificate) to the called service, so that the service can ensure it is being called by a trusted party. ICS now supports two-way SSL. Users can upload their identity certificate through the ICS console; this certificate will be used for two-way SSL communication with an external SOAP service.

Mapper and Integration Designer Enhancements
Mapping Non-Leaf Nodes in the Mapper for Deep Copy: Users can drag and drop between non-leaf nodes of the exact same datatype in the source and target to create the mapping for all children and descendant nodes in one move (e.g., Address in the source to Address in the target).

Other Features

- Design time: Base64 encode/decode functions in the Mapper; nested Switch in orchestrations; ability to update triggers in orchestrations (e.g., a change in a SOAP endpoint); ability to clone/export locked integrations.
- Runtime/Monitoring: Specify the retention period for tracking data via API (< 3 days); purge instances based on the configured retention period.
- Connectivity: Support for invoking EBS 12.2 concurrent programs via the E-Business Suite Adapter (EBS 12.1.3 concurrent programs have been supported since 16.3.3).

What's New in 17.1.3 Documentation

Please click here to access the What's New in 17.1.3 online documentation.
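Returning to the two-way SSL support described earlier: in generic client code (outside ICS, where the identity certificate is simply uploaded through the console), mutual TLS boils down to also loading a client certificate into the TLS context. A minimal Python sketch, with placeholder file paths:

```python
# Sketch of two-way (mutual) TLS in generic client code using Python's
# standard ssl module. This is NOT ICS code; file paths are placeholders.
import ssl

def mutual_tls_context(client_cert, client_key, ca_bundle=None):
    # One-way SSL: verify the server's certificate (system CAs if no bundle given).
    ctx = ssl.create_default_context(cafile=ca_bundle)
    # The extra step that makes it two-way: load the client's own certificate
    # and private key so it can present its identity to the server.
    ctx.load_cert_chain(certfile=client_cert, keyfile=client_key)
    return ctx
```

The only difference from an ordinary HTTPS client is the `load_cert_chain` call; everything else (server verification, encryption, integrity) is the same one-way SSL machinery.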


Building Modern Applications Using APIs, Microservices and Chatbots | DeveloperWeek 2017 Keynote

Applications and architectures are evolving at a rapid rate, pushing developers to keep up. No company is immune to this pressure, and it requires the right tools in your toolbox to keep up to speed with industry changes so that work gets done at the pace customers demand. ICYMI: check out the slides from Oracle's DeveloperWeek 2017 keynote here. The new API Platform Cloud Service is a flexible, lightweight platform that allows true hybrid, full-lifecycle API management. Developed in partnership with market leader Apiary, now a part of the Oracle family, the API Platform Cloud Service allows teams to move quickly, designing thoughtfully before deploying gateways either on premises or in the cloud in the most convenient way possible. Apiary has long been a player in the API market, with over 21,000 customers having used it to design, use, and implement their APIs. Apiary's API Blueprint is a powerful, open-sourced, and developer-friendly way to quickly develop and test constraints using automated mock servers, tests, validations, proxies, and code samples. Apiary also makes sharing simple and streamlined, coordinating seamlessly with repositories like GitHub and building interactive, rich documentation to reduce the burden on the development team, allowing them to focus on the issue at hand. In addition to the focus on APIs, the shift away from traditional monolithic architecture to polyglot microservices has been an exciting approach to development, although one that takes a lot of thought to implement. Monolithic applications cause complex, burdensome conflicts that require great effort to coordinate, build, and test. Not only has the movement toward microservices allowed teams to be distributed in their location, but each team is able to select the technology that works best for its scenario.
This approach ensures fault isolation and resiliency, as well as an improved ability to scale and utilize resources, but it requires a set of modern development tools to build the service implementations and microservices in a team's language of choice. If the entire DevOps cycle is to be automated, then this also requires a highly available, secure, modern container platform. This is a lot to both demand and source from a grab bag of providers, but the Oracle Cloud Platform makes it simple. Not only is it one-stop shopping for developer tools that help your team build and manage code, an AppDev platform that supports microservices and serverless functions, AND a DevOps pipeline that integrates back-end services and monitoring tools, but it works with open source frameworks like Maven, Gradle, Git, and Jenkins that make projects so much easier today. And with the surge in microservices necessitating a seamless way to integrate them, APIs and the API Platform Cloud Service provide just that. Mobile apps and mobile development have also matured leaps and bounds in the past couple of years. While mobile apps were seen quite recently as the holy grail of customer retention and insight, monetization, and loyalty, it is actually proving exhausting for consumers to toggle among a library of apps. App fatigue is very real, with users typically devoting 80% of their mobile time to only 3 apps and engaging with just 25 apps on a monthly basis. Although chatbots and AI have become more commonplace, actually useful bots are far less common. Oracle Intelligent Chatbot Service has introduced a channel-agnostic, secure way for users to apply a unified natural-language interface to simplify both back-end support and the end user's experience. The Oracle Cloud Platform is an open, modern, and easy way for organizations to implement proven ways to build modern applications.
The platform provides a comprehensive AppDev suite, as well as an automated DevOps platform that supports CI/CD, mobile- and API-first delivery, and robust monitoring in a simple dashboard. Sign up for an Oracle Cloud Trial today!


Oracle Integration Cloud Service 16.4.5 - What's New!

It has been a busy but exciting holiday season for all of us on the Oracle Integration team: we released Oracle Integration Cloud Service version 16.4.5 (the Winter Release) in December. Here's a quick look at some of the highlights of this exciting release:

Richer Connectivity
- Bring Your Own Adapter interface on the ICS Console
- Swagger generation for inbound REST integrations
- Oracle Logistics connectivity (new adapter)
- IBM DB2 connectivity (new adapter)
- REST and SOAP connectivity enhancements

Enhanced Orchestrations
- Schedule your ICS orchestration flow
- Drag-and-drop iteration/loop activities in the ICS Designer
- Import maps into your ICS orchestration
- Create a new version of your ICS flow in a click or two
- ICS REST APIs now at v2, with support for REST 1.2

… and several other interesting enhancements!

Bring Your Own Adapter – Register and Use Your Custom Adapter in ICS

Have you built a custom ICS adapter using our Cloud Adapter SDK? Or maybe your implementation partner has built one? Register your adapter with your ICS instance through an easy-to-use interface and start using it right away. No more waiting or going through manual steps to start testing or using your own adapters. Learn more in the docs here – Registering Custom Adapters.

Swagger Generation for Inbound REST Integrations

ICS flows can now be exposed with a dynamically generated Swagger description for inbound REST-based integrations. Simply append "/swagger" to the URL that you normally use to view metadata for your ICS flow, and the Swagger description will be generated for you instantly.

Oracle Logistics – New Adapter

The all-new adapter for Oracle Logistics (Oracle Transportation Management / Oracle Global Trade Management, part of the Oracle Supply Chain Management Applications family) takes connectivity to and from Oracle Logistics applications to a new level, with the ICS Cloud adapter interface that our customers have come to know and love.
An example use case for this popular application is sending a fulfillment line from Order Management Cloud into Logistics Cloud as an order release, and sending a response back to Order Management Cloud. In addition to the adapter, our Oracle Logistics engineering and product management team has also provided 11 sample pre-built ICS integrations that illustrate how Oracle Transportation and Global Trade Management might be integrated with Product Hub Cloud, Order Management Cloud, and Inventory Management Cloud through ICS. Note: this adapter is supported only with OTM/OGTM v6.4.2 (on-premises) for this release. Learn more in the docs here – Supply Chain Management Adapters.

IBM DB2 Connectivity – New Adapter

The IBM DB2 Adapter is now available, certified for DB2 Advanced Enterprise Edition 10.5 FP 7. This new adapter supports basic functions such as SQL queries and stored procedures, as well as advanced ones such as distributed polling and multithreading. Learn more about this adapter in the docs here – Using the DB2 Adapter.

REST and SOAP Adapter Enhancements

Our REST and SOAP adapters have had several enhancements in this release, such as support for partial updates of RESTful resources, the ability to send multi-part responses (e.g., an ICS flow requesting item details may respond with a PNG image of the item and a JSON document of the item details as a multi-part message), support for the target application's TLS version, support for custom inbound/outbound SOAP HTTP headers, and more. Learn all about these in the docs here – SOAP adapter and REST adapter.

Schedule Your Orchestrations

You can now schedule your ICS orchestration flows to run at a time you prefer, e.g., a retailer scheduling an orchestration to pick up files at midnight every night from all points of sale via secure FTP and upload them to a target SaaS system. Learn more in the docs here – Scheduling Integration Runs.
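The midnight pickup scenario above is configured in the ICS UI, but the timing logic it implies can be sketched in a few lines of plain Python (purely illustrative, not ICS code):

```python
# Illustrative only: compute the next run time for a "every night at midnight"
# schedule like the retailer file-pickup example above.
from datetime import datetime, timedelta

def next_midnight_run(now):
    """Next occurrence of 00:00 strictly after `now`."""
    tomorrow = now.date() + timedelta(days=1)
    return datetime.combine(tomorrow, datetime.min.time())
```

A scheduler built on this would sleep until `next_midnight_run(now)`, trigger the FTP pickup, and repeat; ICS handles that loop for you once the schedule is defined.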
Drag-and-Drop Iteration/For-Each Loops in Your ICS Orchestrations

16.4.5 continues to make ICS's orchestration capabilities even more powerful and sophisticated. Iterate through records in your message or file with an easy-to-use looping construct: For-Each. You simply drag and drop the definition of the looping element and cursor, and the ICS Designer provides a visual indication of the looping scope. You can now easily loop through, let's say, all contacts in an incoming update from CRM and call create/update contact operations in one or more target applications. Try it out now; learn more in the docs here – Looping over Repeating Elements with a For-Each Action.

Import/Export Maps in ICS Orchestrations

No longer restricted to the Map My Data pattern, this popular feature can now be leveraged in orchestrations too. Learn more in the docs here – Importing a Map File into an Orchestrated Integration.

Create a New Version of an ICS Flow

Need to create a new version of your ICS flow without touching a live flow? Simply select Create Draft from the menu and you have a new draft version you can continue working on. Learn more in the docs here – Creating a Draft of an Integration.

ICS REST APIs, Now v2, with REST 1.2 Support

ICS REST APIs to manage and monitor integrations, adapters, connections, lookups, and packages are now available with REST 1.2 support. Learn more in the docs here – ICS REST APIs.

Learn More in Our Documentation

Get up to speed with all that's new in ICS 16.4.5 in the documentation here – What's New.
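As a plain-Python analogue of the For-Each action described above (this is not ICS code; `upsert_contact` is a stand-in for the target-application invoke):

```python
# Iterate the repeating Contact element of an incoming CRM update and invoke
# a create/update per record, mirroring the For-Each pattern in the post.
def process_crm_update(update, upsert_contact):
    results = []
    for contact in update.get("contacts", []):  # the repeating element
        results.append(upsert_contact(contact))  # one invoke per iteration
    return results
```

In ICS, the looping element and scope are defined by drag and drop in the designer rather than written out like this, but the runtime behavior is the same: one invoke per repeating record.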


Oracle Service Bus Internals: Delivered by A-team from SOA Blackbelt Training

Presented by Oracle A-team, Integration Cloud team and Middleworks. Thurs, December 15, 2016, 9:00 am | Pacific Standard Time (San Francisco, GMT-08:00) | 1 hr. REGISTER NOW. This session will be delivered by Mike Muller of the Oracle A-team, who has many years of deep expertise with the Oracle Service Bus, troubleshooting, putting out customers' fires, and answering technical and architectural questions. The content comes from the "SOA Blackbelt Training," which was previously used within Oracle to provide a deep internal and architectural understanding of components within the SOA Suite, and applies to OSB versions 11g and 12c as well as both cloud and on-premises installations. Attendees will receive a highly advanced and deeply technical presentation on some of the nitty-gritty internal details of the Oracle Service Bus; it is intended for developers and architects who already have a good understanding of OSB. Trust us, if you are looking for an intro or overview of OSB, this session will not be a good use of your time. But if you want to go from being an experienced OSB developer to the next level, we think this content will be perfect and is only available here.
Key topics on the agenda include: the service bus threading model, WebLogic thread management, work managers, throttling, and transactions. Planned participants in this session include: Mike Muller from the Oracle A-team, with some of the deepest working knowledge of Oracle Service Bus in the world, delivering OSB internals information from the Oracle internal "SOA Blackbelt Training"; David Shaffer of Middleworks, moderating and providing additional resources; and Kathryn Lustenberger of the Oracle Cloud Integration product management team. We hope to also have OSB engineering representation on the line to help with Q&A. This session will be especially fast, focused, and highly technical compared with others in this series. All who register will receive invitations to future related events and the ability to access a recording of the webinar and the slides. To access this information for previous webinars, and to see the schedule for future webinars, go to http://www.middleworks.com/soa-expert/ REGISTER NOW. As soon as your registration is submitted, you'll receive instructions for joining the meeting. Can't register? Have requests for future webinar topics, or want to offer to be a presenter? Email dave@middleworks.com. And please feel free to forward this invitation to other interested colleagues.


Oracle Managed File Transfer for SOA Customers: Overview, Demo, Q&A with Product Mgmt, Engineering and Partners

Oracle Managed File Transfer for SOA Customers: Overview, Demo, Q&A with Product Mgmt, Engineering and Partners Presented by Oracle Integration Cloud team and Middleworks Tues, November 15, 2016 9:00 am | Pacific Standard Time (San Francisco, GMT-08:00) | 1 hr REGISTER NOW This session, the third webinar in the popular SOA Expert Series, will provide a basic product introduction to Oracle MFT from the product management team, along with real-world implementation experience and advice from an experienced SOA partner doing a cloud MFT implementation. As most SOA Suite customers have needs for moving files around using managed file transfer approaches, and now that Oracle has a SOA Suite component offering this functionality, we want to answer the typical questions SOA customers have around MFT, such as: What does it do and how does it work? When should I use MFT vs. the SOA Suite file capabilities? What are other SOA customers doing with Oracle MFT today? Planned participants in this session include: • Dave Berry from the Oracle Service and Cloud Integration product management team, responsible for the MFT product, providing a product overview, release timeline, and demo • Ben Kothari of Ampliflex, talking about lessons learned and best practices from implementing MFT in the cloud for a SOA Suite 12c transportation services customer, including integrating MFT with HR systems (e.g. Fusion HCM, Taleo, payroll, benefit providers) • David Shaffer of Middleworks, moderating and providing additional resources As always, the content for this session will be fast, focused, and highly technical, with extra materials and links to documentation provided for attendees' future reference. All who register will receive invitations to future related events and the ability to access a recording of the webinar and the slides.
To access this information for previous webinars, and see the schedule for future webinars, go to http://www.middleworks.com/soa-expert/ REGISTER NOW As soon as your registration is submitted, you'll receive instructions for joining the meeting. Can't register? Have requests for future webinar topics or want to offer to be a presenter? Email dave@middleworks.com. And, please feel free to forward this invitation to other interested colleagues.


Migrating your Oracle BPM assets into Oracle Process Cloud Service (PCS)

If you are already an Oracle BPM user, you have likely heard of, or even ventured into, its cloud counterpart, Oracle Process Cloud Service (aka PCS). Essentially, Oracle PCS is a solution that enables you to rapidly design, automate, and manage business processes, just as Oracle BPM does. The major advantage is that you can do everything in the cloud, without any concerns about infrastructure installation, setup, and provisioning, keeping IT teams focused on high-value projects rather than the endless tuning, monitoring, troubleshooting, and workarounds that on-premises projects regularly require. This, in turn, allows you to focus on the business value of your solution, which is what really matters whenever we talk about Business Process Management. Oracle PCS has two environments: Composer: for developing, testing, and deploying process applications Figure 1 – PCS Composer Home Page Figure 2 – PCS Composer Deployment Page Figure 3 – PCS Composer Modeling Page Workspace: for running deployed applications, monitoring and managing them Figure 4 – PCS Workspace tasks view Figure 5 – PCS Workspace tracking view Figure 6 – PCS Workspace dashboards view I have been working with several Oracle BPM customers over the last few years in my current role as Senior Principal Product Manager, and I have seen an increasing demand from Oracle customers wishing to move to the cloud as soon as possible, given all the well-known benefits spurring adoption, such as lower costs, greater agility, improved responsiveness, and better resource utilization, among other technical and business drivers. There are also many new customers who want to get started with a streamlined solution to model, design, implement, run, and monitor their processes. Thus, as you can see, Oracle PCS is a perfect match for all of the customer requirements mentioned above.
However, as you might be thinking, many customers have already developed hundreds or thousands of projects and processes on top of the on-premises version, and obviously want to take them to the cloud without any loss. That said, the goal of this post is NOT to cover the features of PCS itself, let alone walk through the solution, since you can easily find many such walkthroughs on the internet, but to show how existing BPM customers can bring all their assets into PCS in a smooth and easy way. There is already an option within PCS that allows users to export their assets to run on BPM, as shown in the picture below: Figure 7 – Downloading a project from PCS which is compatible with Oracle BPM This is also documented in the following link (Exporting an Application to Oracle BPM), and it is really useful for customers that are using PCS for dev/test and still want to run their processes on-premises. But how about the other way around? If you want to find out how this can be accomplished, please check out my blog post at the following link: https://andreboaventurablog.wordpress.com/2016/10/10/migrating-your-bpm-assets-into-pcs/ See you in my next post!! ;-)


Processing high volumes of updates to a single row using OSA

In today's world, data is being produced at an exponential rate. While new big data technologies are available to capture all of this information and process it to discover interesting trends, sometimes we need to take immediate action or would like the current status of a specific item. To accomplish this, you could store all of the incoming data, but the trade-off could be maintaining a lot of data that you don't need and requiring users to write queries that first determine the most current row. Another technique could be to perform database updates, but this could lead to database contention in scenarios where a single row is updated very frequently, as in the domain of IoT, where devices can send out frequent status messages. One way to solve this problem that you may not be familiar with is through the use of Oracle Stream Analytics (OSA). OSA has an extremely sophisticated language called "Continuous Query Language," or CQL. It is similar to SQL and is SQL-99 compliant, but it has additional syntax because it operates in-memory and solves problems very efficiently. The problem of eliminating unnecessary updates on the database is one that OSA solves very well, very easily, in a single query. First, we'll explain some basic details about the OSA run-time. While there is a very friendly user interface for OSA which will allow you to develop an entire application in minutes, you also have the ability to develop an application in a more traditional way using JDeveloper. When developing with JDeveloper, you will need to understand that the application model is called an "Event Processing Network," or EPN. The EPN handles the flow of data through your application and contains CQL processor nodes to hold the CQL that you wish to be performed.
In our case, the CQL to perform this seemingly complex task, which might have taken dozens of lines of Java code to write, is done very easily and efficiently in-memory using a very simple statement, as shown in the picture below. In this query, we are selecting only the attributes that we need (or we could use SELECT * FROM InputChannel) and using CQL syntax to partition the stream by device ID, with "ROWS 1" meaning that only one row per device ID will be kept for the time period of 2 seconds ("RANGE 2 SECONDS"), and the results will be output every 2 seconds ("SLIDE 2 SECONDS"). In this case, where the "RANGE" value and the "SLIDE" value have the same time frame, we essentially start over from the beginning (with no state in the query) every two seconds. Note: if 2 seconds is too long for you, you can specify 1 second or even a value in milliseconds. In our test, the data will be generated at random, but we will look at the data and the results to verify that they are accurate. In this case, we are sending the data to Oracle BAM 12c. We will output the data from our OSA application using the JMS adapter, which will automatically write out the data as a JMS map message (we simply supply the event type name in the OSA JMS configuration, and the out-of-the-box JMS adapter will create the JMS messages without the need for a converter class). This is convenient for our purposes, since a BAM "Enterprise Message Source," or EMS, can be easily configured in the BAM 12c administration user interface, the map message attributes can be assigned to the data object, and everything is done in both OSA and BAM without writing any Java code. We will make sure that we start with an empty data object for this test and reset the EMS metrics.
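As a rough sketch of the kind of statement described above (the channel and attribute names are assumptions for illustration, and the exact window syntax should be checked against the CQL reference):

```sql
-- Hedged sketch: keep only the latest row per device over each 2-second window.
-- "InputChannel", "deviceId", "deviceCode", and "codeValue" are assumed names.
SELECT deviceId, deviceCode, codeValue
FROM InputChannel
     [PARTITION BY deviceId ROWS 1 RANGE 2 SECONDS SLIDE 2 SECONDS]
```

Because RANGE and SLIDE match, each 2-second window is independent, and at most one row per device survives it.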
Our BAM EMS is configured for "Upsert": it will insert the record if it does not exist (based on the key, which is defined as the "deviceID"), or update it if a row with that device ID already exists in the data object (i.e. database table). Let's first send a small, controlled set of data to see that it works as expected. We will send 30 events for only 10 unique device IDs, and we will intentionally introduce a small delay between each event. We will see from the BAM EMS metrics that only 10 messages were sent, which is correct because there were only 10 different device IDs in the sample, and 10 messages were persisted. If we examine the deviceCode and the codeValue for each device ID, we see that these are the most current 10 messages, which is what we want the user to see on the BAM dashboard (the latest message from each device). If you have two key fields for your "Upsert" operation, you can list them both in the partition by clause, separated by a comma, as follows: There will be a larger number of row combinations in the data object, but only one entry for each Device ID / Device Code combination. Try it for yourself, and you will see that we have achieved our goal of providing the BAM user with the most up-to-date device status extremely quickly and easily, without creating unnecessary JMS messages that would otherwise have to be created, sent over the network to the queue, taken off the queue by the BAM EMS, and processed, potentially hitting database contention due to so many updates of the same row. We have dramatically increased the efficiency of our solution to meet the goal of delivering the latest device status messages to the user, and as an added bonus we've improved operational efficiency by not generating messages that would quickly be overwritten, saving CPU for other purposes.
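The two-key partition clause mentioned above might be sketched like this (again, channel and attribute names are assumptions, not taken from the original query):

```sql
-- One retained row per (deviceId, deviceCode) combination per window,
-- matching an Upsert keyed on both fields.
SELECT deviceId, deviceCode, codeValue
FROM InputChannel
     [PARTITION BY deviceId, deviceCode ROWS 1 RANGE 2 SECONDS SLIDE 2 SECONDS]
```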


Oracle Cloud Day - Jumpstart Your Cloud Integration and API Management

Interested in connecting your business for faster time to market? Managing your APIs as new business opportunities? Regaining visibility and security of your APIs, and more? If so, come join us as Oracle experts explain solutions to the most common integration challenges businesses face today. The image above highlights two integration sessions: Cloud and On-premises Integration (Modernize Your Cloud to On-premises Integration) Abstract: The explosive growth of SaaS applications, combined with existing on-premises integration challenges, is likely to slow the pace of innovation and time-to-market for businesses that attempt to integrate the old way. Attend this session to learn how to seamlessly modernize the integration of your applications. Hear how Oracle customers are deploying new Oracle Cloud-based integrations to securely connect cloud applications to on-premises systems, manage APIs, and connect IoT devices. Enabling Digital Transformation Abstract: Digital transformation requires a robust yet flexible API management platform to design, create, publish, manage, and secure APIs. Oracle API Platform Cloud Service (coming soon) supports developers with speed and agility, while enabling secure ways to bring in new revenue streams for the business. Sessions vary between events, so check the registration link below to see sessions covered at a city near you: Register Here and learn how to jumpstart your integration and API management. I hope to see you there! Bruce


Oracle Integration Cloud Service 16.4.1 - What's New!

We are pleased to announce the availability of the 16.4.1 fall release of Integration Cloud Service (ICS). This October release continues to broaden its connectivity portfolio in various segments. These include: Oracle Utilities - New Adapter Oracle Eloqua - Inbound (Trigger) support New Oracle Utilities Adapter A recent survey of 100 North American electric, water, and gas utility industry executives, completed by Zpryme and the Oracle Utilities GBU, shows that utility companies are embracing the cloud, with 45% of organizations using the cloud in some form today and another 52% planning to move to it. Nearly 97% of interviewees told us they have become involved with cloud technologies or applications and computing resources delivered as services over a network connection instead of through in-house resources at a utility. This means integration is imperative for all utility customers, without doubt! This release introduces the new Oracle Utilities adapter, making the list of ICS connectivity adapters for Oracle and non-Oracle SaaS and on-premises applications even longer! This adapter enables you to easily integrate with Oracle Utilities applications that use Oracle Utilities Application Framework v4.3.0.0 or later and support web services. For those utilities applications using JMS or DB integration services, you can continue to leverage the generic ICS adapters instead. New Status and Usage APIs The brand-new /status API is now available and can be queried for system health status (runtime, storage, messaging, and security services). Similarly, the new /usage API returns metrics for system design-time (adapters, agent, application instances, lookups, integrations, packages, and runtime - messages and messaging system). Refer to the REST API documentation for more details. Support for Service Callbacks Starting with this release, you can configure outbound SOAP invocations to specify callback ICS flows for asynchronous conversations with external systems/web services.
So if you are invoking a credit rating service that responds to you asynchronously (one-way with async callback), you can receive that callback as a delayed response. Remember that the callback flow is specified within the SOAP adapter design-time as a separate ICS flow. Rich Connectivity with Cloud and On-Premises with Oracle ICS And finally, here's a quick look at some of our key cloud and on-premises adapters available today with Oracle Integration Cloud Service: Learn More More details on what's new in ICS 16.4.1 are available here. Learn more about Oracle Integration Cloud Service at http://cloud.oracle.com/integration.
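As a minimal sketch of how the /status and /usage monitoring APIs described above might be called over basic-auth HTTP (the host name, base path, and credentials below are placeholders, not documented values; consult the ICS REST API documentation for the exact URIs):

```python
# Hedged sketch of querying hypothetical ICS monitoring endpoints.
# ICS_BASE, the user, and the password are assumptions for illustration.
import base64
import json
import urllib.request

ICS_BASE = "https://my-ics-host.example.com/icsapis/v2"  # assumed base path


def basic_auth_header(user, password):
    """Build the HTTP Basic Authorization header value."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return "Basic " + token


def ics_get(path, user, password):
    """GET a JSON resource such as /status or /usage from ICS."""
    req = urllib.request.Request(ICS_BASE + path)
    req.add_header("Authorization", basic_auth_header(user, password))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example against a live instance (commented out here):
# health = ics_get("/status", "monitoring_user", "password")
# usage = ics_get("/usage", "monitoring_user", "password")
```

The same pattern works for any of the v2 management resources; only the path changes.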


REST API Now Available for Oracle Real-Time Integration Business Insight

Earlier in 2016, the Integration Analytics service type was released as part of the Oracle SOA Cloud Service. The Integration Analytics service type includes Oracle Real-Time Integration Business Insight, which allows customers to track business milestones and gather real-time metrics without coding changes to applications. You can find out more about Insight in the Oracle Cloud here, and on-premises here. In the latest release of SOACS, a new REST API is being announced that allows publishing of business events to Insight. This new feature makes it possible to include heterogeneous and custom application integrations as part of an Insight model, and dramatically expands the power of the platform. Any integration component, from custom Node.js services, to EJBs, to on-premises legacy services, can deliver business events to Insight. All that's required is the ability to send a simple REST API invocation. A key part of the magic of Real-Time Integration Business Insight is the ability to extract metrics from deployed SOA and Service Bus applications without making code changes. By simply defining milestones and metrics, and then mapping those model constructs to application components, Insight users can track business performance in real time. In many cases, however, in an effort to achieve a truly end-to-end set of business milestones, it is necessary to extend Insight models to include proprietary or non-Oracle components. The REST Event API gives customers the flexibility that they've been looking for with Insight. The API allows for sending single events to Insight targeting a single milestone, in addition to delivering metrics (indicators). When an event is received by Insight, it is correlated with business events received from other applications and processed asynchronously.
To make things even easier, Insight provides a manifest for application developers that shows exactly what JSON payload needs to be delivered so that a particular milestone can be marked as passed. The Insight API will be expanding in future releases to include access to additional resources such as models, milestones, indicators, and dashboards, making the integration possibilities even more exciting! You can review documentation for the API in the current release here. For more information on Oracle Real-Time Integration Business Insight, including overview videos, tutorials, downloads, and more, visit the product page.
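Purely as a hypothetical illustration (the real field names and structure come from the manifest Insight generates for your specific model), an event payload targeting a single milestone might look something like:

```json
{
  "milestone": "OrderShipped",
  "identifier": "ORDER-12345",
  "indicators": {
    "orderTotal": 150.75,
    "region": "US-West"
  }
}
```

Here "OrderShipped" names the milestone to mark as passed, the identifier correlates the event with others for the same business transaction, and the indicators carry the metric values; all three names are invented for this sketch.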


Oracle OpenWorld 2016 Integration Innovations

If you are coming to Oracle OpenWorld this year as it takes over San Francisco September 18-22, don't miss the integration sessions, where you will learn all about the latest cool integration innovations: API management, rapid SaaS integration, IoT integration, streaming analytics, and hot tips from expert integration architects. With all of these integration sessions, where do you start? If you're beginning a project to integrate your Software as a Service (SaaS) Oracle Sales Cloud, Salesforce.com, Oracle Service Cloud, Oracle E-Business Suite, Siebel, or any other application you can think of, here are some ideas to get you the insights you need: Session: Oracle Integration Strategy: Cloud, API Analytics, Integration, and Process Speaker: Vikas Anand, Senior Director, Product Management, Oracle Speaker: Amit Zavery, SVP, Integration Products, Oracle Date: Monday, Sep 19, 11:00 a.m. | Moscone West – 2007 Next, deep dive into the integration sessions to hear directly from customers on how they implemented Oracle integration Platform as a Service (iPaaS). Here is an example of what customers share and elaborate on during their sessions, where they describe use cases for iPaaS that might be similar to what you plan to do next: Check out the 45 "Focus On Integration Platform as a Service (iPaaS)" sessions link to see all of the integration sessions.
Here are a few from that document: Conference Sessions (selected few): Connecting Your Microservices and Cloud Services with Oracle Integration; What's New in Oracle Integration Cloud Service; Oracle E-Business Suite Integration Best Practices; Integrating Your Oracle HCM, ERP, and CX Clouds with Your On-Premises Applications; Building Solutions Using IoT, Mobile, Integration, Big Data, and Oracle Service Cloud. Panel Sessions (selected few): Transition to Hybrid Cloud Integration: Oracle SOA Cloud Service Customer Panel; Cloud Integration Best Practices: Customer Panel. Meet the Experts Session (highlighted session): Meet the Experts: Oracle Integration Cloud Service (iPaaS). HOL (Hands-on Lab) Sessions (selected few): Oracle Process Cloud Service: Digital Process Apps Made Simple; Oracle Integration Cloud Service: Simplifying Connections Between Applications. User Group Forum Session (highlighted session): Top Tips for Mastering Oracle SOA Cloud Service. So check out the full listing of all integration sessions on the: Focus On Integration Platform as a Service (iPaaS) Session Listings Focus On API Management Session Listings


Oracle Real-Time Integration Business Insight Available in SOA Cloud Service

SAN FRANCISCO, CA – August 9, 2016 – Oracle is announcing the immediate availability of Oracle Real-Time Integration Business Insight as a feature of the Oracle SOA Cloud Service. This new "Integration Analytics" service type in SOA Cloud Service allows customers to create Insight models and begin extracting business metrics from their cloud-hosted integrations based on SOA and Service Bus technology, without modification to their existing implementations. "We're very pleased to be releasing Insight as an integral part of the Oracle Cloud Integration Platform," says Amit Zavery, Senior Vice President of Integration Products. "Being able to extract business metrics in real-time from complex applications without modification to code is a differentiating feature that will serve our customers well. Add to that a powerful set of pre-configured and custom dashboards and you have a very valuable analytics tool requiring almost no startup overhead." Oracle Real-Time Integration Business Insight makes it simple to identify and define milestones and metrics that are important to the business. Using only web-based tooling, business users and architects map milestones and metrics to existing applications, and begin gathering meaningful business metrics with the click of a button. No IT projects, engineering effort, or redeployments are required. Now, as part of the Oracle SOA Cloud Service, things are even easier. Customers can provision a new instance of Insight using a simple wizard that guides cloud administrators through all of the steps required to get up and running fast. For more information on Oracle Real-Time Integration Business Insight, including overview videos, tutorials, downloads, and more, visit the product page.


DEADLINE EXTENDED!!! Oracle Excellence Awards: Oracle Cloud Platform Innovation

Calling all Oracle Cloud Platform innovators: click here to submit your nomination today. Call for Nominations: Oracle Cloud Platform Innovation 2016. Customers - Are you using Oracle Cloud Platform to deliver unique business value? If so, submit a nomination today for the 2016 Oracle Excellence Awards for Oracle Cloud Platform Innovation! These highly coveted awards honor you, our customers, and your partners for your cutting-edge solutions using Oracle Cloud Platform. Winners are selected based on the uniqueness of their business case, business benefits, level of impact relative to the size of the organization, complexity and magnitude of implementation, and the originality of architecture. Customer winners receive a free pass to Oracle OpenWorld 2016 in San Francisco (September 18-September 22) and will be honored during a special event at OpenWorld. Our 2016 award categories are: Integration and Process. Benefits: Customers and partners receive awards in an Oscar-style ceremony held at the Yerba Buena Center for the Arts during OOW, and the customer gets a FREE OOW pass. NOTE: The deadline to submit all nominations is 5pm Pacific on June 30th, 2016. Customers don't have to be in production to submit a nomination, and nominations are accepted for both cloud and on-premises solutions. Click here to submit your nomination today.


Integrating Salesforce.com Webcast - Demonstration and Overview

Registration is filling up fast! Interested in connecting Salesforce.com to the rest of your business and taking advantage of the latest in integration simplification? Register for this webcast to see a demonstration of how much easier integration has become with Oracle Integration Cloud Service. Join Oracle's VP of Integration Product Management and me as we show you how to connect your business so your field representatives get 360-degree visibility into everything that's going on in their accounts, from Salesforce.com to service applications such as ServiceNow or Oracle Service Cloud, all the way back to your Enterprise Resource Planning (ERP) software such as E-Business Suite.

Integrating Salesforce.com into Your Business with Oracle Integration Cloud Service
Date: Wednesday, April 20, 2016
Time: 10:00 AM PDT / 1:00 PM EDT

Say goodbye to integration complexity and hello to integration simplicity with Oracle Integration Cloud Service. Attend this webcast and live Q&A to learn how to:

Help your field representatives get immediate and complete access to customer data
Automate the opportunity-to-order process for faster, easier selling
Make your application infrastructure future-proof and ready for IoT and whatever comes next

Register here (link) to attend the webcast. It's a webcast, so there is no limit on registration :) Join the hundreds of others who have registered as of this writing.

Only a few days away... Oracle Process and Integration Cloud sessions at Collaborate 2016

Oracle Process and Integration Cloud sessions at Collaborate 2016

Monday, April 11th
We are Citizen Integrators! Oracle's New Integration Cloud Service
Antony Reynolds, Product Strategy Director
3:15 PM–4:15 PM | Reef F

Tuesday, April 12th
Choosing the Right Integration Tool for You
Antony Reynolds, Product Strategy Director
9:15 AM–10:15 AM | Breakers E

Boost SaaS and On-Premises Applications Connectivity: Leverage Oracle Cloud Adapters
Rajesh Kalra, Senior Principal Product Manager
9:15 AM–10:15 AM | South Pacific 1

Simplify and Accelerate Integration of Your On-Premises Applications with the Cloud Using Integration Cloud Service
Ram Menon, Principal Product Manager
4:45 PM–5:45 PM | Reef E

Wednesday, April 13th
XML Masterclass
Antony Reynolds, Product Strategy Director
9:15 AM–10:15 AM | Breakers C

For a complete listing of sessions, click here.

DEMOgrounds: Oracle Booth #1053, Exhibit Hall - Bayside C/D, Level 1, Mandalay Bay South Convention Center
· Oracle Integration Cloud Service, Oracle SOA Cloud Service: Simplify and Accelerate Integration
· Oracle Integration Adapters: Rich and Comprehensive Connectivity to SaaS, On-Premises, and More

Monday, April 11th: Welcome Reception in the Exhibitor Showcase, 5:30 PM–8:00 PM
Tuesday, April 12th: General Show Floor Hours, 9:00 AM–4:00 PM; Happy Hour in the Exhibitor Showcase, 6:00 PM–7:30 PM
Wednesday, April 13th: General Show Floor Hours, 10:15 AM–3:00 PM

Oracle GoldenGate Cloud Service Executive Webcast

We are proud to announce the availability of the much-awaited Oracle GoldenGate Cloud Service. Oracle GoldenGate Cloud Service builds on Oracle's on-premises GoldenGate solution, the industry-leading data replication and real-time streaming engine. It combines the flexibility of a scalable cloud service that can be consumed on a subscription basis with the powerful data streaming capability of an enterprise solution.

Fig 1: Oracle GoldenGate Cloud Service overview

Oracle GoldenGate Cloud Service:

Enables migration to the cloud from heterogeneous databases to the Oracle Cloud platform. This includes moving data from Amazon RDS for Oracle, among other data sources. Oracle GoldenGate Cloud Service secures and reduces risk in the data streaming process through its high-throughput, encryption-based data delivery, and Oracle delivers all this with minimal impact on critical source and business application systems.

Is critical for well-performing global real-time reporting and analytics. Oracle GoldenGate Cloud Service delivers real-time data into the cloud, enabling the population and maintenance of cloud-based data centers and data warehouses. A cloud-based global data reserve allows intensive business analytics queries to be offloaded so they do not impact source data systems, and globally located cloud data centers allow those queries to be routed to the location geographically closest to the user.

Helps enable easy dev/test in the cloud with live production data. Oracle GoldenGate Cloud Service provides fast, elastic, web-driven provisioning to create sandboxes in the cloud, complete with real-time data, to replicate production-like environments for development and testing. Among other advantages, this allows customers to test and validate time-sensitive business decisions with production data.

Fig 2: Take advantage of existing skills and architecture to extend data centers into the cloud.

Learn More

Here is a quick product overview video about Oracle GoldenGate Cloud Service. For more information about Oracle GoldenGate Cloud Service, visit our home page here.

GoldenGate Cloud Service Executive Webcast

Join us for our executive webcast featuring Amit Zavery, Senior Vice President, Oracle Cloud Platform and Integration Products, and Jeff Pollock, Vice President of Product Management at Oracle, as they discuss Oracle's latest cloud offering. Joining them will be Paul Stracke from Paychex to provide a customer perspective. Mark your calendars and join us at 10 AM PST on April 7th. Register here.
