
All Things Database: Education, Best Practices,
Use Cases & More

Recent Posts

Autonomous Database

Enabling the Autonomous Enterprise

This post was contributed by Senior Principal Product Marketing Director, Ron Craig.

Background – data overflow

The ability of enterprises to generate data is increasingly outpacing their ability to realize real value from that data. As a result, opportunities for innovation, driven by customer, market and competitive intelligence, are being left on the table. And given that only a subset of the avalanche of data is being put to good use, it's entirely possible that the use of inadequate data is leading to bad decisions. A key source of this problem is that the productivity of human beings simply hasn't kept pace with the technologies we have developed to help improve our business processes.

IDC has predicted that by 2025, 6 billion consumers will have one data interaction every 18 seconds. At that point, the volume of global data will be 175 ZB (175,000,000,000,000,000,000,000 bytes), and ~30% of that will be real-time data – a 6X increase vs. 2018. The exponential increase in effort required to clean, arrange, secure, maintain and process the data from all those customers means less effort can be dedicated to insights. As a consequence, enterprises are not truly seeing the benefits of their own success in becoming data companies.

Abstraction as a game-changer

So what's needed in response? Sometimes it's good to look to other areas for inspiration, and innovation in the semiconductor industry provides some useful insights. That industry, since its early days, has had to deal with the fact that user productivity has struggled to keep pace with advances in technology, and has surmounted those issues with innovations that address those productivity limitations head on. Digital designs – the creations that comprise everything from the silicon chips that operate a timer on a microwave oven all the way up to those that forecast the weather with a supercomputer – are at their essence created from logical components, known as gates.
These logic gates perform pretty routine Boolean operations, effectively allowing decisions or calculations to be made based on sets of inputs, and propagating those calculations in real time. Chip designers working at the gate level could be expected to produce verified designs (effectively combinations of connected gates) at a rate of ~50 gates per day – a productivity level that's remained pretty constant over time. The processors in today's high-end cellphones may contain around 100 million gates, so a team of 100 chip designers working at the gate level would take 80 years to put such a chip together. In reality though, such chips are today often developed in two years or less, as a result of innovations introduced in the chip design flow over the last twenty years.

For the purposes of this blog, and since it provides a useful analogy, the innovation we'll focus on is the introduction of the hardware description language (HDL). An HDL effectively works like software, allowing the chip designer to describe logic in terms of what it does, as opposed to how it's built, hence freeing the designer from the details of how that logic operation is implemented in hardware. HDL-based design goes hand in hand with automated synthesis algorithms, which translate those higher-level descriptions into the equivalent gates that perform the same function, and which ultimately can be realized in silicon.

As a result of these innovations, the semiconductor industry has enabled designers to keep up with the capacity of silicon chips by allowing them to be less and less concerned about the lower-level implementation details of the chips they are designing, and to put their focus on what those chips actually do. Chip designers take care of the 'what', where they can bring their creativity and experience to bear, and automation takes care of the 'how' in a reliable and repeatable fashion.
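The 'what' vs. 'how' split above can be made concrete with a small sketch (illustrative only, written in Python rather than an actual HDL): the same 1-bit full adder described behaviorally, and then as the connected gates a synthesis tool might produce.

```python
# Illustrative sketch: one circuit, described two ways.

def full_adder_behavioral(a: int, b: int, carry_in: int):
    """The 'what': add three bits, return (sum, carry_out)."""
    total = a + b + carry_in
    return total % 2, total // 2

# Gate-level primitives -- the 'how' produced by synthesis.
def XOR(x, y): return x ^ y
def AND(x, y): return x & y
def OR(x, y):  return x | y

def full_adder_gates(a: int, b: int, carry_in: int):
    """The 'how': the same adder built from five gates."""
    s1 = XOR(a, b)
    sum_bit = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return sum_bit, carry_out

# The two descriptions agree for every input combination --
# which is exactly the equivalence synthesis guarantees.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert full_adder_behavioral(a, b, c) == full_adder_gates(a, b, c)
```

The designer works at the top level; automation reliably expands it into the bottom one.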
Oracle Autonomous Database – Automating a path to innovation

The semiconductor industry experience provides a useful blueprint for a path that the data industry must also take, demonstrating why automation is the key to unlocking the potential of today's data, in the same way that innovation in the semiconductor industry has allowed designers to fully exploit the capacity of silicon chips. Across a range of industries, corporations differentiate themselves by what they do with the data they generate and collect, not by the effort they expend to manage and secure that data. To have maximum impact, database experts need to be able to maximize their focus on what their data is telling them (the 'what'), and rely on automation to keep that data available and secure (the 'how').

95% of the respondents to a recent Oracle user survey noted that they are having difficulty keeping up with the growth in their data, and the majority of data managers are performing multiple major database updates per year. In addition to simply keeping the database up and running, the survey noted that significant manual effort continues to be dedicated to general troubleshooting and tuning, backup/recovery tasks, and provisioning to handle usage peaks and troughs.

Data security also stands out as an area that can benefit significantly from automation, not only because automation can reduce manual effort, but because it can reduce risk. In an age where managers of on-premises database setups must continuously juggle the urgency of patch installation with the cost of the downtime needed to install those patches, it comes as no surprise that a recent Verizon survey noted that 85% of successful data breaches exploited vulnerabilities for which patches had been available for up to a year before the attack occurred. It makes perfect sense to instead make use of Oracle Autonomous Database to automatically apply security patches with no downtime.
In total, these automated capabilities reduce administrative costs by 80%, meaning that the Autonomous Enterprise taking advantage of these advances can dedicate significantly more effort to innovation. Coming back to our semiconductor analogy, innovations in how design is done didn't make chip designers less necessary; rather, they made them significantly more productive and enabled them to make more innovative use of advances in technology. We expect the Oracle Autonomous Database to have the same impact for DBAs and data managers in the Autonomous Enterprise.

Learn more at Oracle OpenWorld 2019

To learn more from enterprises that have already become autonomous, visit the sessions below at the 2019 Oracle OpenWorld event:
- Drop Tank: A Cloud Journey Case Study, Tuesday September 17, 11:15 AM – 12:00 PM
- Oracle Autonomous Data Warehouse: Customer Panel, Tuesday September 17, 1:45 PM – 2:30 PM
- Oracle Autonomous Transaction Processing Dedicated Deployment: The End User's Experience, Tuesday September 17, 5:15 PM – 6:00 PM
- Managing One of the Largest IoT Systems in the World with Autonomous Technologies, Wednesday September 18, 9:00 AM – 9:45 AM


Autonomous Database

Making Compromise Obsolete with Oracle Gen 2 Cloud

This post was contributed by Senior Principal Product Marketing Director, Ron Craig.

There's an old saying in the automotive world – "Cheap, fast, reliable – pick any two". What's interesting is that, as self-driving and automation in general progress, the introduction of enabling technologies means those old trade-offs may no longer apply. Out of all of those advances, one that I find particularly intriguing (for as long as the concept of a multi-speed transmission remains relevant!) is the idea of the predictive transmission.

A predictive transmission addresses the problem that a regular automatic transmission will typically only react to a steep hill or sharp bend by downshifting after encountering that change in road topology, meaning there's a period of time during which the car isn't performing optimally. With a predictive transmission, now available in many production cars, the transmission is linked to the car's navigation system, giving it the intelligence to get into the appropriate gear in advance. Having knowledge of what is ahead allows the transmission to shift appropriately before a change in road conditions, avoiding that period of sluggishness we've all encountered while most automatic transmissions are still figuring out how to react. The end result is the best of both worlds – better fuel economy and better performance – two things that you typically would have had to choose between in the past. In short, compromise becomes obsolete.

In the same way, by being proactive rather than reactive, Oracle Generation 2 Cloud anticipates and acts on your needs rather than always waiting for your guidance, meaning you don't necessarily need to choose between cost, speed and reliability. Oracle Generation 2 Cloud goes beyond first-generation cloud technology to offer organizations advanced capabilities that allow them to run their existing and future workloads better and faster.
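The reactive-vs-predictive distinction above can be sketched in a few lines (illustrative only; the gear numbers and grade thresholds are hypothetical, not a real transmission controller):

```python
# Illustrative sketch: a reactive controller sees only the current grade,
# while a predictive controller also looks at the road ahead from navigation.

def reactive_gear(current_grade_pct: float) -> int:
    """React to conditions already encountered."""
    return 3 if current_grade_pct >= 6 else 5

def predictive_gear(current_grade_pct: float, upcoming_grades_pct: list) -> int:
    """Downshift in advance if a steep section is coming up."""
    worst = max([current_grade_pct] + upcoming_grades_pct)
    return 3 if worst >= 6 else 5

# Flat road now, but an 8% climb ahead: the reactive controller stays in 5th
# and only downshifts once the car is already struggling on the hill.
road_ahead = [1.0, 4.0, 8.0]
print(reactive_gear(1.0))                # 5 -- caught out by the hill
print(predictive_gear(1.0, road_ahead))  # 3 -- already in the right gear
```

The predictive version trades nothing away: with extra knowledge, it simply makes the better decision earlier.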
The Oracle Generation 2 Cloud also allows you to get the most out of your data, while ensuring it's available and secured. On the database front, integrated machine learning (ML) capabilities in Oracle Autonomous Database allow you to extract more value from your data – moving the algorithms to the data rather than the other way around. Oracle Autonomous Database provides options to extract value from the database using SQL analytics, or to gain insights and make predictions via native ML. ML algorithms are available as native SQL functions – in effect, we've taught the database how to do higher-level math. Models that used to take weeks to build can now be built in minutes, and multiple data science user roles are supported (DBAs, app developers, data scientists, etc.).

Oracle Machine Learning Notebook

Moving the analytics to the database opens up a new world of possibilities. For example, you can now dedicate more time to things like identifying which machines in your factory will soon require maintenance, and only shut down what really needs to be shut down, rather than pausing production on all equipment for routine, perhaps unnecessary, maintenance.

Oracle Cloud technology is helping innovators disrupt a broad range of industries. We're seeing everything from financial services using AI for automatic forecasting without human intervention, to smart manufacturing utilizing real-time IoT data for equipment optimizations. In a particularly compelling example, a medical lab testing company is using Autonomous technologies to shorten the turnaround of test results from weeks to minutes. This allows for faster time to diagnosis, which means faster time to treatment, which ultimately means the shortest time to recovery. Cyber security company FireEye expects to not only better protect its customers through the ability to analyze tens of millions of emails daily, but also reduce disaster recovery costs by 40-50% by moving to Oracle Cloud.
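To make "moving algorithms to the data" concrete, here is a minimal sketch of scoring rows with Oracle's native PREDICTION SQL functions. The table name (`customers`) and model name (`churn_model`) are hypothetical; actually executing this would require a database connection (e.g. via the python-oracledb driver), which this sketch only stubs out.

```python
# Illustrative sketch: score rows inside the database with native SQL ML
# functions, so only the predictions -- not the raw data -- travel to the client.
# Table and model names below are hypothetical examples.

SCORING_QUERY = """
    SELECT customer_id,
           PREDICTION(churn_model USING *)             AS predicted_label,
           PREDICTION_PROBABILITY(churn_model USING *) AS confidence
    FROM customers
"""

def fetch_scores(connection):
    """Run the scoring query in the database; the heavy lifting (evaluating
    the model over every row) happens inside the database engine."""
    with connection.cursor() as cursor:
        cursor.execute(SCORING_QUERY)
        return cursor.fetchall()

print(SCORING_QUERY.strip())
```

The point of the design is the direction of movement: rather than exporting millions of rows to an external ML tool, the model is evaluated where the data already lives.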
Oracle Cloud Blockchain Technology is helping logistics and shipping management company CargoSmart simplify the otherwise complex documentation process and deliver a single source of truth for trusted, real-time sharing of information, increasing trust and boosting the efficiency of ocean freight shipping. OUTFRONT Media is using Oracle Analytics Cloud and Oracle Autonomous Data Warehouse to gain the insights that help their clients make the most effective use of their advertising budgets.

What do all of these companies have in common? That's simple – none of them is a database or compute infrastructure management company. They leave that to Oracle and the Oracle Generation 2 Cloud, while they focus their efforts on growing their businesses by providing differentiated value to their customers.

Learn more at Oracle OpenWorld 2019

To learn more about Oracle's Generation 2 Cloud, or to hear from our customers themselves, visit the sessions below at the 2019 Oracle OpenWorld event:
- The Power of Insights: Better Business Decisions with Oracle Autonomous Data Warehouse, Monday September 16, 1:45 PM – 2:30 PM | Moscone West – Room 3007B
- Oracle Cloud: A Path and Platform, Tuesday September 17, 11:15 AM – 12:00 PM | YBCA Theater
- Solution Keynote: Oracle Cloud Infrastructure Strategy and Roadmap, Tuesday September 17, 3:15 PM – 5:00 PM | Moscone South – Room 207/208
- Customer Case Study Session: Oracle Cloud Infrastructure Enables Threat Detonation in World-Leading Security Products, Monday September 16, 1:45 PM – 2:30 PM | Moscone South – Room 152A
- Oracle Autonomous Data Warehouse: Customer Panel, Tuesday September 17, 1:45 PM – 2:30 PM | Moscone South – Room 211


Autonomous Database

The Oracle Autonomous Data Warehouse: Architecting Advanced Data Platforms to Support Data Management

Part 2 of a 3-Post Series: Architecting Advanced Data Platforms to Support Data Management

'Smart' or cognitive solutions are all the rage in today's data-driven world. Specifically, solutions like AI, machine learning and data analytics that help your business 'think' and leverage data more effectively. Cognitive solutions are expected to flourish in the coming years, with billions of dollars in investment pouring into the cognitive solutions vertical.

But what's driving this market for cognitive data solutions? First, data is really, really important and valuable. Second, there is a general desire to make business processes more intuitively intelligent to drive competitive advantage in the modern business environment. This smartening, if you will, of the business is absolutely necessary to allow leaders to respond to market trends, positively impact customer experience, and adapt to a quickly evolving digital economy. Without these smarter solutions, businesses are anchored and left responding reactively to market dynamics.

The bottom line: this increase in demand for data-driven solutions is a direct outcome of the digitization and generation of a massive amount of data across industries. And that data footprint will only continue to grow. In the first post of this three-part series, I noted that the amount of data generated in the world will grow from 33 Zettabytes (ZB) in 2018 to 175 ZB by 2025. To put that into perspective, 90 percent of the world's data was generated in the past two years. And when it comes to managing all of this data, the market is responding: analysts forecast that the global cognitive solutions market will grow at a CAGR of almost 50% during the period 2018-2022.

Compelling stats, but you're probably asking: why should I care? Fair enough, and here's why: it's these cognitive solutions that draw insights from business process data and make data-based predictions to augment human decision-making capabilities.
That alone is a pretty big deal, but imagine what you could do if you had to do nothing – i.e., if you could automate data ingestion and processing. If you ask me, that's when the exciting part (the piece that gets the executive office engaged) takes place. It's exciting because the business can leverage smarter solutions to make better decisions, with a level of visibility at the forefront of digital transformation, allowing organizations to truly capture the momentum of the market.

Imagine being able to tell which products are doing better in specific markets based on patterns that your data warehouse helps you find proactively. Or being able to dynamically customize retail and shopping experiences based on data and previous client interactions. From a competitive perspective, solutions like sentiment analysis allow you to leverage machine learning to better understand market sentiment around your services, as well as those of competitors. There are security examples when it comes to cognitive solutions as well: AI-driven security features can detect anomalous behavior and take action proactively. Similarly, cognitive systems can help detect and stop fraud based on data metrics. When it comes to cognitive engines, smart quickly becomes the new normal.

Oracle Autonomous Data Warehouse does exactly this. It uses applied machine learning to self-tune and automatically optimize performance while the database is running. You're literally working with a solution that gives you foresight into the market to help disrupt and transform the kinds of services and products you deliver. Built on next-generation autonomous technology, Oracle Autonomous Data Warehouse uses artificial intelligence to deliver unprecedented reliability, performance, and highly elastic data management to enable data warehouse deployment in seconds. Oracle Autonomous Data Warehouse brings new meaning to the concept of automatic or 'autonomous' data management.
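The anomalous-behavior detection mentioned above can be sketched in its simplest form (illustrative only, not Oracle's implementation): flag values that sit far from the mean of recent, known-good observations.

```python
# Illustrative sketch: a basic z-score anomaly detector.
from statistics import mean, stdev

def find_anomalies(baseline, observations, threshold=3.0):
    """Return observations more than `threshold` standard deviations
    from the baseline mean."""
    mu = mean(baseline)
    sigma = stdev(baseline)
    return [x for x in observations if abs(x - mu) > threshold * sigma]

# Baseline: normal transaction amounts. The 9,500 charge stands out.
normal_amounts = [42.0, 38.5, 45.2, 40.1, 43.7, 39.9, 41.3, 44.0]
incoming = [41.0, 9500.0, 43.2]
print(find_anomalies(normal_amounts, incoming))  # [9500.0]
```

Production systems layer far more sophisticated models on top, but the principle is the same: learn what normal looks like, then act proactively on deviations.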
Here's what that means:

Self-Driving. If Oracle Autonomous Data Warehouse had wheels, it would drive itself. However, the self-driving element here refers to a fully managed data warehouse cloud service that takes care of network configuration, storage, and database patching and upgrades for you. No customer DBA required.

Self-Securing. This part is truly revolutionary and a first of its kind in our industry. The self-securing part of the database ensures that the architecture always runs the latest security patches. It will also detect anomalous behavior and conduct updates, all on its own, autonomously, while still running! Further, data at rest is encrypted by default using Transparent Data Encryption (TDE). Finally, database clients use SSL/TLS 1.2 encrypted and mutually authenticated connections. (We'll get much more into the unique self-securing architecture in the third blog of our series.)

Self-Repairing. No one wants to experience an outage. It causes a lot of stress and can impact, even halt, the flow of your business. Oracle Autonomous Data Warehouse has automated protection from downtime, purpose-built into the core of the design. High availability is built into every component, and backups are completely automated. This means you can get your nights and weekends back, knowing you've got a data platform that's actively working to keep your stuff operating.

Sounds cool, right? Let's pop the hood of this autonomous data vehicle and see how it really works. Here's what the Oracle Autonomous Data Warehouse looks like:

Oracle Autonomous Data Warehouse is your direct engine and integration point into the entire DevOps process. Most of all, data-driven applications can leverage machine learning to deliver powerful results while utilizing local services alongside third-party solutions.
This means that your existing developer tools, data integration services, data visualization, and cloud object storage easily integrate with Oracle Autonomous Data Warehouse, while still leveraging the power of the cloud. To get the most value out of the data, you can visualize it in any way you require: leverage Oracle Data Visualization Desktop, or use your own third-party business intelligence or data visualization solution.

Oracle Data Visualization

Finally, and this is the more revolutionary part, Oracle Machine Learning provides a notebook application designed for SQL users that offers interactive data analysis, letting you develop, document, share, and automate reports based on sophisticated analytics and data models.

Oracle Machine Learning Notebook

Oracle Machine Learning SQL notebooks, which are based on Apache Zeppelin technology, enable teams to collaborate, build, evaluate, and deploy predictive models and analytical methodologies. The SQL notebook acts as an interface for data scientists to perform machine learning in Oracle Autonomous Data Warehouse (ADW). These notebook technologies support the creation of scripts, while also supporting the documentation of assumptions, approaches and rationale to increase data science team productivity. With greater levels of built-in automation and intelligence, you can couple the data warehouse with powerful machine learning and cognitive solutions. This allows for fast and easy collaboration between data scientists, developers, and business users, as it leverages the scalability and performance of the Oracle platform and its cloud services.

Let's recap.
In a nutshell, here are some of the benefits of a data-driven, autonomous solution, specifically Oracle Autonomous Data Warehouse:

- Simplified, end-to-end management of data and data warehouse resources
- Fully tuned and 'ready to go' for your data requirements – including high performance, out of the box
- Fully elastic scaling with intelligence around idle-compute shut-off
- Auto-scaling based on dependencies and real-time workload requirements
- Ability to support solutions running on-premises, hybrid, or multi-cloud
- Ability to leverage native or third-party data integration tools
- High-performance queries and concurrent workloads: optimized query performance with preconfigured resource profiles for different types of users
- Powerful data migration utilities to move vast amounts of data
- Deep integration with data storage, repository, and processing engines, including Amazon AWS Redshift, SQL Server, and other databases

If you're worried about security (and who isn't?), we'll cover that critical topic in part three of our series, where I'll dig deeper into how Autonomous Data Warehouse stores all data (at rest and in motion) in encrypted format.

Final Thoughts: The Future is Data-Driven. Make Sure You and Your Data are Ready

It doesn't matter what vertical you're in or how big your company is – data impacts your future. Those organizations that find ways to not only leverage data, but also make it easy to do so, will find meaningful, competitive advantages in a digital economy. From my point of view, Oracle Autonomous Data Warehouse provides an easy-to-use, fully autonomous database that scales elastically, delivers fast query performance and requires no database administration. This is the kind of technology that removes the complexity around working with data. Most of all, it allows you to truly leverage the power of data to do innovative and bottom-line-enabling activities in a data-driven future.


Announcing Oracle APEX 18.1

Today we have guest blogger Joel Kallman, Senior Director, Software Development.

Oracle Application Express (APEX) 18.1 is now generally available! APEX enables you to develop, design and deploy beautiful, responsive, data-driven desktop and mobile applications using only a browser. This release of APEX is a dramatic leap forward in both the ease of integration with remote data sources, and the easy inclusion of robust, high-quality application features.

Keeping up with the rapidly changing industry, APEX now makes it easier than ever to build attractive and scalable applications which integrate data from anywhere – within your Oracle database, from a remote Oracle database, or from any REST service – all with no coding. And the new APEX 18.1 enables you to quickly add higher-level features which are common to many applications, delivering a rich and powerful end-user experience without writing a line of code.

"Over a half million developers are building Oracle Database applications today using Oracle Application Express (APEX). Oracle APEX is a low code, high productivity app dev tool which combines rich declarative UI components with SQL data access. With the new 18.1 release, Oracle APEX can now integrate data from REST services with data from SQL queries. This new functionality is eagerly awaited by the APEX developer community", said Andy Mendelsohn, Executive Vice President of Database Server Technologies at Oracle Corporation.

Some of the major improvements to Oracle Application Express 18.1 include:

Application Features

It has always been easy to add components to an APEX application – a chart, a form, a report. But in APEX 18.1, you now have the ability to add higher-level application features to your app, including access control, feedback, activity reporting, email reporting, dynamic user interface selection, and more.
In addition to the existing reporting and data visualization components, you can now create an application with a "cards" report interface, a dashboard, and a timeline report. The result? An easily-created, powerful and rich application, all without writing a single line of code.

REST Enabled SQL Support

Oracle REST Data Services (ORDS) REST-Enabled SQL Services enable the execution of SQL in remote Oracle databases, over HTTP and REST. You can POST SQL statements to the service, and the service then runs the SQL statements against the Oracle database and returns the result to the client in JSON format. In APEX 18.1, you can build charts, reports, calendars, trees and even invoke processes against ORDS-provided REST Enabled SQL Services. No longer is a database link necessary to include data from remote database objects in your APEX application – it can all be done seamlessly via REST Enabled SQL.

Web Source Modules

APEX now offers the ability to declaratively access data services from a variety of REST endpoints, including ordinary REST data feeds, REST services from Oracle REST Data Services, and Oracle Cloud Applications REST services. In addition to supporting smart caching rules for remote REST data, APEX also offers the unique ability to directly manipulate the results of REST data sources using industry-standard SQL.

REST Workshop

APEX includes a completely rearchitected REST Workshop, to assist in the creation of REST services against your Oracle database objects. The REST definitions are managed in a single repository, and the same definitions can be edited via the APEX REST Workshop, SQL Developer or via documented APIs. Users can exploit the data management skills they possess, such as writing SQL and PL/SQL, to define RESTful API services for their database. The new REST Workshop also includes the ability to generate Swagger documentation against your REST definitions, all with the click of a button.
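To illustrate the "POST SQL statements to the service" flow described above, here is a minimal sketch of building such a request from a client. The hostname and schema path are hypothetical; the `/_/sql` endpoint and `application/sql` content type follow the ORDS REST-Enabled SQL convention, but check your ORDS documentation for the exact details of your installation.

```python
# Illustrative sketch: construct (without sending) a REST-Enabled SQL request.
import urllib.request

def build_sql_request(base_url: str, sql: str) -> urllib.request.Request:
    """Prepare a POST of a SQL statement to a remote database over HTTP;
    ORDS runs it and returns the result set as JSON."""
    return urllib.request.Request(
        url=f"{base_url}/_/sql",
        data=sql.encode("utf-8"),
        headers={"Content-Type": "application/sql"},
        method="POST",
    )

req = build_sql_request(
    "https://example.com/ords/hr",  # hypothetical ORDS schema URL
    "SELECT employee_id, last_name FROM employees",
)
print(req.full_url)      # https://example.com/ords/hr/_/sql
print(req.get_method())  # POST

# Sending it (urllib.request.urlopen(req)) would return a JSON document
# containing the rows -- which APEX consumes to render charts and reports.
```

This is the mechanism that replaces a database link: the remote data arrives as JSON over HTTP rather than over a SQL*Net connection.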
Application Builder Improvements

In Oracle Application Express 18.1, wizards have been streamlined with smarter defaults and fewer steps, enabling developers to create components quicker than ever before. There have also been a number of usability enhancements to Page Designer, including greater use of color and graphics on page elements, and "Sticky Filter", which is used to maintain a specific filter in the property editor. These features are designed to enhance the overall developer experience and improve development productivity. APEX Spotlight Search provides quick navigation and a unified search experience across the entire APEX interface.

Social Authentication

APEX 18.1 introduces a new native authentication scheme, Social Sign-In. Developers can now easily create APEX applications which use Oracle Identity Cloud Service, Google, Facebook, generic OpenID Connect or generic OAuth2 as the authentication method, all with no coding.

Charts

The data visualization engine of Oracle Application Express is powered by Oracle JET (JavaScript Extension Toolkit), a modular open source toolkit based on modern JavaScript, CSS3 and HTML5 design and development principles. The charts in APEX are fully HTML5-capable and work on any modern browser, regardless of platform or screen size. These charts provide numerous ways to visualize a data set, including bar, line, area, range, combination, scatter, bubble, polar, radar, pie, funnel, and stock charts. APEX 18.1 features an upgraded Oracle JET 4.2 engine with updated charts and APIs. There are also new chart types, including Gantt, Box Plot and Pyramid, and better support for multi-series, sparse data sets.

Mobile UI

APEX 18.1 introduces many new UI components to assist in the creation of mobile applications. Three new component types – ListView, Column Toggle and Reflow Report – can now be used natively with the Universal Theme, and are commonly used in mobile applications.
Additional enhancements have been made to the APEX Universal Theme which are mobile-focused, namely mobile page headers and footers, which will remain consistently displayed on mobile devices, and floating item label templates, which optimize the information presented on a mobile screen. Lastly, APEX 18.1 also includes declarative support for touch-based dynamic actions – tap and double tap, press, swipe, and pan – supporting the creation of rich and functional mobile applications.

Font APEX

Font APEX is a collection of over 1,000 high-quality icons, many specifically created for use in business applications. Font APEX in APEX 18.1 includes a new set of high-resolution 32 x 32 icons, which include much greater detail, and the correctly-sized font will automatically be selected for you based upon where it is used in your APEX application.

Accessibility

APEX 18.1 includes a collection of tests in the APEX Advisor which can be used to identify common accessibility issues in an APEX application, including missing headers and titles, and more. This release also deprecates the accessibility modes, as a separate mode is no longer necessary to be accessible.

Upgrading

If you're an existing Oracle APEX customer, upgrading to APEX 18.1 is as simple as installing the latest version. The APEX engine will automatically be upgraded, and your existing applications will look and run exactly as they did in earlier versions of APEX.

"We believe that APEX-based PaaS solutions provide a complete platform for extending Oracle's ERP Cloud. APEX 18.1 introduces two new features that make it a landmark release for our customers. REST Service Consumption gives us the ability to build APEX reports from REST services as if the data were in the local database. This makes embedding data from a REST service directly into an ERP Cloud page much simpler. REST Enabled SQL allows us to incorporate data from any cloud or on-premises Oracle database into our applications.
We can't wait to introduce APEX 18.1 to our customers!", said Jon Dixon, co-founder of JMJ Cloud.

Additional Information

Application Express (APEX) is the low code, rapid app dev platform which can run in any Oracle Database and is included with every Oracle Database Cloud Service. APEX, combined with the Oracle Database, provides a fully integrated environment to build, deploy, maintain and monitor data-driven business applications that look great on mobile and desktop devices. To learn more about Oracle Application Express, visit apex.oracle.com. To learn more about Oracle Database Cloud, visit cloud.oracle.com/database.

Follow Us On Social Media:
Blog: https://blogs.oracle.com/database/
Twitter: https://twitter.com/oracledatabase
Facebook: https://www.facebook.com/OracleDatabase
LinkedIn: https://blogs.oracle.com/databaseinsider/


Forget the Turing Test—give AI the F. Scott Fitzgerald Test instead

Written by Paul Sonderegger, Big Data Strategist, Oracle

F. Scott Fitzgerald pinned human intelligence on its tolerance of paradox. But what kind of artificial intelligence could pass his test?

In his 1936 essay "The Crack-Up," Fitzgerald writes that "the test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function." For example, he says you should "be able to see that things are hopeless and yet be determined to make them otherwise." He confesses he's lost this ability – and as a result, himself. Fitzgerald's point is not that he needs a better model of the world, but that he needs many models and the freedom to switch among them. This is what allows us to forge ahead despite unexpected obstacles, conflicting priorities, or, in Fitzgerald's case, hitting his forties and feeling like someone changed the rules of the game while he wasn't looking. Fitzgerald, having lost his ability to balance opposing ideas, falls into a drab, routinized existence. Every moment, from his morning routine to dinner with friends, becomes a forced act. He mimics the life of a successful literary man without actually living it.

Take a simple example. When a navigation app redirects stop-and-go traffic from the New Jersey Turnpike onto local roads in the town of Leonia, otherwise quiet neighborhoods become overrun with shortcut-seeking app-watchers. A compassionate human might weigh the benefit of a shorter trip against the annoyance caused to hapless suburbanites. But a naïve AI, focused only on finding the fastest travel time, won't. Local authorities are now plotting to fine non-residents caught driving through the area during rush hour, even though they're just following the directions on their smartphones.
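The Leonia example can be sketched as two routing objectives (illustrative only; the route names, times, and disruption scores are hypothetical): a naive router optimizes travel time alone, while a router holding "two opposed ideas" also weighs the cost its shortcut imposes on local neighborhoods.

```python
# Illustrative sketch: single-objective vs multi-objective route choice.
# (route_name, travel_minutes, neighborhood_disruption_score)
routes = [
    ("Turnpike",        35, 0),   # slower, but stays on the highway
    ("Leonia shortcut", 28, 50),  # faster, floods quiet local streets
]

def naive_route(options):
    """Single objective: minimize travel time."""
    return min(options, key=lambda r: r[1])[0]

def considerate_route(options, disruption_weight=0.5):
    """Two objectives at once: time plus a penalty for local disruption."""
    return min(options, key=lambda r: r[1] + disruption_weight * r[2])[0]

print(naive_route(routes))        # Leonia shortcut
print(considerate_route(routes))  # Turnpike
```

The second function is a crude stand-in for Fitzgerald's test: it can hold "get there fast" and "don't wreck the neighborhood" in mind at the same time, and still function.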
Follow Us On Social Media: Blog: https://blogs.oracle.com/database/ Twitter: https://twitter.com/oracledatabase Facebook: https://www.facebook.com/OracleDatabase LinkedIn: https://blogs.oracle.com/databaseinsider/


Options for moving EBS, PeopleSoft applications to cloud – Conversation with Calix, Sherwin Williams and Astute

Today we have guest author - Navita Sood - Marketing Director, Cloud Business Group

Once you have decided you want to move to cloud, the next step is deciding whether to lift and shift your existing workloads to cloud or move to a SaaS application. You can also extend your application with SaaS and then use PaaS offerings to replicate your customizations in cloud. The choice depends on your environment, your business needs, external factors impacting your business, how much you want to invest and how much you are willing to disrupt. It’s important to do a thorough analysis to optimize your ROI. It’s also important to involve all the stakeholders in this decision, because cloud impacts everyone from IT to business. Lastly, once you choose your path, lay out a detailed execution plan with your vendor or implementation partner to make your move successful.

Recently I moderated a panel discussion at the Collaborate user group, where I invited three customers to talk about their unique paths to cloud and how they implemented them. It was interesting to see how different reasons influenced their choices and impacted their business outcomes. On the panel I had Ravi Gade, Sr. Director of Enterprise Applications at Calix; Vivek Puri, Manager – Database, Middleware & Engineered Systems at Sherwin Williams; and Arvind, Chief Solution Architect at Astute Business Solutions. Calix moved their E-Business Suite workloads to SaaS, Sherwin Williams is lifting and shifting their E-Business Suite workloads to Oracle Cloud at Customer, and Astute moved their PeopleSoft application from AWS to Oracle Cloud. Although the three are in different stages of their cloud journeys, they have already started experiencing the benefits promised by cloud.

Calix was undergoing a major business transformation, moving from a hardware business to software, and wanted a cloud solution that would support that transition. 
Although their applications were highly customized, they decided not to migrate anything that wasn’t critical for their business. Instead they preferred to standardize their applications for ease of maintenance, so they opted to move to SaaS. Three things helped them succeed in their journey: involving the business from day one; working with Oracle’s cloud business group on the transformation; and leveraging Oracle’s cloud integration services to integrate their on-premises and cloud applications.

In the two years since their transition from EBS to ERP Cloud, they have saved $2.5M per year with a 40% ROI. These savings came from data center costs, application support costs and resources for application support. They rebuilt only a few critical customizations in ERP Cloud. They were up and running in cloud in just a few months and broke even in just 18 months. They saw their IT queue shrink from 500 tickets two years ago to fewer than 20 tickets today.

One of the questions from the audience was about job cuts as an outcome of moving to cloud. Ravi was very frank in pointing out that in the two years since they embarked on the journey to cloud, no one at Calix had lost their job. In fact their roles became more prominent as Calix improved its productivity and introduced new revenue models. Calix saw its DBAs transform into data architects who were now investing more time doing tasks they enjoyed. All their employees underwent training on new cloud services, which has also made them more valuable in the market.

Sherwin Williams, being more risk averse, wanted to move to cloud very cautiously. They weren’t ready to move their applications and all their data to public cloud, so when they were considering an upgrade they chose to start their cloud journey with Oracle Cloud at Customer. They started their move only six months ago and were Oracle’s first Cloud at Customer users. 
In just six months they are seeing the following benefits. Cloud at Customer worked as a testing pad for evaluating the benefits of cloud for their business. One of their biggest concerns was integration of on-premises applications with the cloud, but the fact that everything was running homogeneously in no time helped all teams build more confidence in cloud. They had highly customized applications, with many engineered systems they weren’t willing to part with; by doing a lift and shift, they were able to move all those customizations as-is to the cloud behind their own firewall, without causing any disruptions or changes to business processes. They were able to future-proof their business and move away from having to maintain those applications and the underlying infrastructure. Financially, it is a very attractive proposition: they are using the same underlying hardware in the cloud, but now they pay for only a quarter of the rack, based on their requirements. They are evaluating moving their EBS HR to HCM Cloud, so their peripheral applications will be on SaaS while other business-critical applications stay on Cloud at Customer for now, until they gain more confidence in moving everything to cloud.

Astute, on the other hand, preferred lift and shift to minimize business disruptions and ensure the same experience for their business users, while still receiving all the benefits of Oracle Cloud around automation, performance, speed, agility and cost. They wanted to move away from the data center business three to four years ago, when they moved to AWS. Once Oracle Cloud was available, they quickly migrated so their Oracle applications could run on Oracle Cloud. This enabled them to better serve their customers and provide the latest and greatest features available through PaaS and SaaS services. Rightsizing their infrastructure helped them increase utilization levels and save costs. 
Currently they are exploring adding chatbots to their PeopleSoft application to improve the experience of their customers.

There is no single best approach to moving to cloud. Pick the most critical business problem you are facing today and see how cloud can solve it. Start your cloud journey from there and then build your custom path to cloud. All paths to cloud lead to the same goal.


Database

Threat Report: Companies Trust Cloud Security

Today we have guest blogger - Alan Zeichick -  principal analyst at Camden Associates. Is the cloud ready for sensitive data? You bet it is. Some 90% of businesses in a new survey say that at least half of their cloud-based data is indeed sensitive, the kind that cybercriminals would love to get their hands on. The migration to the cloud can’t come soon enough, as 66% of companies in the study say at least one cybersecurity incident has disrupted their operations within the past two years, and 80% say they’re concerned about the threat that cybercriminals pose to their data. The good news is that 62% of organizations consider the security of cloud-based enterprise applications to be better than the security of their on-premises applications, and another 21% consider it as good. The caveat: Companies must be proactive about their cloud-based data and can’t naively assume that “someone else” is taking care of that security. Those insights come from a brand-new threat report, the first ever jointly conducted by Oracle and KPMG. The “Oracle and KPMG Cloud Threat Report 2018,” to be released this month at the RSA Conference, fills a unique niche among the vast number of existing threat and security reports, including the well-respected Verizon Data Breach Investigations Report produced annually since 2008. The difference is the Cloud Threat Report’s emphasis on hybrid cloud, and on organizations lifting and shifting workloads and data into the cloud. “In the threat landscape, you have a wide variety of reports around infrastructure, threat analytics, malware, penetrations, data breaches, and patch management,” says one of the designers of the study, Greg Jensen, senior principal director of Oracle’s Cloud Security Business. “What’s missing is pulling this all together for the journey to the cloud.” Indeed, 87% of the 450 businesses surveyed say they have a cloud-first orientation. 
“That’s the kind of trust these organizations have in cloud-based technology,” Jensen says. Related: Try Oracle Cloud for free

Here are data points that break that idea down into more detail: 20% of respondents to the survey say the cloud is much more secure than their on-premises environments; 42% say the cloud is somewhat more secure; and 21% say the cloud is equally secure. Only 21% think the cloud is less secure. 14% say that more than half of their data is in the cloud already, and 46% say that between a quarter and half of their data is in the cloud. That cloud-based data is increasingly “sensitive,” the survey respondents say. It includes information collected from customer relationship management systems, personally identifiable information (PII), payment card data, legal documents, product designs, source code, and other types of intellectual property.

Cyberattacks Reveal the Pace Gap

Two-thirds of organizations in the study report some type of past interruption due to a security incident, such as losing the ability to provide service, diminished employee productivity, or delays to IT projects. Just more than half of the businesses say they’ve experienced a financial hit as a result, including a loss of shareholder value, the cost of data loss, or the costs of reputational damage. Oracle’s Jensen says there’s a growing realization of a “pace gap” between how fast organizations can create and deploy new business applications and how fast they can secure those applications to meet an organization’s security and compliance targets. Security is lagging behind, and this gap is exacerbated by agile application development methodologies. So should businesses slow down their deployment of new software? Jensen laughs at that suggestion. Instead, he calls for improving security training, processes, and technology. 
“A priority area that falls down is training the average end users, because they’re the most vulnerable point of attack, and some of the most successful attacks leverage social engineering, such as phishing,” Jensen says.

When it comes to processes, companies must understand the security responsibility they share with their cloud providers. As the Oracle-KPMG study explains, the line of demarcation between what cloud vendors and customers are responsible for securing differs across software as a service, infrastructure as a service, and platform as a service. With IaaS, for example, service providers “are generally responsible for securing the physical infrastructure up to and including the virtualization layer with the customer, then responsible for protecting the server workload,” the report says. “However, regardless of consumption model—IaaS, PaaS, and SaaS—the customer is generally responsible for data security and user access and identity management.”

Machine Learning and Automation Can Help

Meantime, emerging technologies can help close the pace gap by finding and addressing security issues in on-premises data centers, the cloud, and hybrid environments. The study shows that 38% of organizations use behavioral analysis and anomaly detection tools, which can instantly determine when a user is acting in a suspicious manner. For example, if an employee has never tried to download a customer database to her laptop before but is suddenly doing so at 2:00 a.m.—well, even if she has the authority to do so, something doesn’t appear to be right. Machine learning is another effective tool for reacting quickly to threats: ML algorithms can study tremendous quantities of data (such as transaction logs) and identify patterns. The Oracle-KPMG study shows that 47% of organizations are using machine learning for cybersecurity purposes. 
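The 2:00 a.m. download scenario can be sketched as a toy anomaly detector. This is not how any particular commercial tool works; it is a minimal illustration, with invented thresholds and sample data, of flagging an event that is rare in a user's own behavioral history:

```python
from collections import Counter

def is_anomalous(history_hours, event_hour, min_observations=20, threshold=0.02):
    """Flag an event whose hour-of-day is rare in the user's own history.

    history_hours: hours-of-day (0-23) of the user's past downloads.
    Returns True when there is enough history and the event hour accounts
    for less than `threshold` of it. The thresholds here are invented.
    """
    if len(history_hours) < min_observations:
        return False  # not enough data to judge
    counts = Counter(history_hours)
    frequency = counts[event_hour] / len(history_hours)
    return frequency < threshold

# A user who only ever downloads during business hours (invented data):
history = [h for h in range(9, 18)] * 10  # 90 events, 9:00 through 17:00

assert is_anomalous(history, 2)       # a 2:00 a.m. download is flagged
assert not is_anomalous(history, 10)  # a mid-morning download is normal
```

Real behavioral-analysis products build far richer baselines (device, location, data volume, peer group), but the core idea is the same: judge each event against the user's own established pattern rather than a global rule.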
Automation is also key: the more that software can handle routine security tasks, the fewer human errors can creep into system configurations and alert responses. In the study, 84% of companies say they’re committed to increased levels of security automation. Overall, the future of the cloud is bright when it comes to security. When the majority of organizations rate cloud security as better than their on-premises security, and when 90% of organizations categorize at least half of their cloud data as sensitive, we’re past the tipping point. Organizations must always remain vigilant, but the cloud has earned their trust.

Alan Zeichick is principal analyst at Camden Associates, a tech consultancy in Phoenix, Arizona, specializing in software development, enterprise networking, and cybersecurity. Follow him @zeichick.


Database Cloud Services

Lift and Shift Your Apps to the Cloud

Today we have guest blogger - Sai Valluri - Product Marketing Manager.

This year at Collaborate, I had the opportunity to host a panel discussion with three amazing customers and one partner. The title of the panel discussion was “Sharpen Your Competitive Edge: Lift and Shift Your Apps to the Cloud”. This discussion took place on 4/25 at 1:15 pm.

The panel included:

Chris Brown, Director-Application Development, Port of Houston. Chris is a seasoned IT executive with over 20 years of experience in operations-oriented enterprises. Chris has experience in full-cycle Tier One ERP implementations of JDE software. In his current role, Chris oversees staff between two terminals, supporting enterprise software port-wide (several JDE modules) and interfaces between applications both hosted and on premises.

Joe Finlinson, Business Applications Technology Director, Intermountain Healthcare. Joe is an experienced IT leader with over 15 years in enterprise IT. In his current role, Joe directs all technical aspects of the Business Applications Portfolio, including PeopleSoft FSCM, Oracle E-Business Suite HR/Payroll, Kronos Timekeeping and Analytics, Hyperion Budgeting, OBIEE/OBIA, and Oracle SOA Suite integrations, along with several other applications.

Michael Lee Sherwood, Director of IT, City of Las Vegas. Michael is an accomplished technology and innovation leader with a demonstrated history of working in government and private-sector industries, skilled in process improvement, budgeting, operations management, strategy, customer experience, and entrepreneurship. Michael graduated from the University of Southern California - Marshall School of Business.

Niklas Ivelsatt, Senior Partner, Arisant LLC. Niklas is the co-founder and Senior Partner of Arisant LLC. Founded in 2006 in Denver, Arisant is well known for designing, building and supporting scalable, cost-effective Oracle infrastructure environments. Arisant is an Oracle Platinum partner. 
The discussion covered a variety of challenges faced by customers and the business benefits accrued by moving to Oracle Cloud. Each of the customers runs a key Oracle application such as E-Business Suite, JD Edwards or PeopleSoft, so it was a diverse customer group, and each made the transition to cloud to better serve their stakeholders.

Some of the challenges faced by customers included: a retiring workforce and its impact on day-to-day operations; increasing hardware and data center costs that create a burden on innovation; providing services to end users quickly and efficiently; and the ability to set up and tear down test environments.

Business benefits: By moving to Cloud at Customer, one customer could keep their E-Business Suite and databases on premises, due to security and compliance concerns about putting core systems in the public cloud; this also addressed any latency concerns. There were savings in hard dollars compared to legacy deployments, and each of the panelists could validate those savings and how they could put the dollars toward other projects. The move to cloud also helped our customers retrain employees and attract new talent keen to work on cloud. Cloud enables customers to leverage data and analytics, and thereby better manage operations, analyze data in real time and make faster business decisions.

The panel also discussed how Oracle Cloud helped them overcome challenges, save costs and grow their business. Other topics the panelists touched on included total cost of ownership, the selection process, and best practices learned during the evaluation, implementation and post-implementation phases.

Call to Action: You can hear the entire panel discussion at: http://nnf.questdirect.org/questmediaviewer.aspx?video=268022930


Database

Every CEO is in the Data Security Business!

What would you consider your most valuable resource today? What about the world’s? Turns out they are one and the same: data. According to the Economist, data has replaced oil as the world’s most valuable resource. During every millisecond of our IT world, data is being collected on nearly every activity, creating a virtually limitless resource with equally unlimited demand.

The New Data Economy

Data drives improvement to products and services, which drives stronger customer adoption, resulting in even more data collection. The more data Tesla can collect on its self-driving cars, the better the cars will perform, and the better Tesla will outperform its competitors. This new data economy has changed the competitive landscape of the tech world. Whoever can acquire the most data the fastest—whether they collect it on their own or purchase it elsewhere—stands to win. But this new wealth of data comes with equally expensive risks. With every opportunity to collect more data comes the opportunity for hackers to steal it. Cyberattacks like Equifax or WannaCry cost companies billions, on top of the hardship imposed on the lives of individuals. Needless to say, companies stand to lose more than just money in the face of a cyberattack: a major breach can tarnish a company’s image, damage credibility and weaken consumer confidence, causing lasting damage to brand and reputation.

Cyber Risk is Growing

Smart companies aren’t preparing for the “if” of a cyberattack, but the “when,” because hackers are constantly on the lookout to steal or compromise data. In a world where the next cyberattack could strike at any time, causing a data breach that costs your company millions, how should your company defend itself? 
Even the most secure firewalls, intrusion detection, and data loss prevention solutions can’t protect against an employee accidentally downloading malware by clicking on the wrong email, a security patch that isn’t applied in a timely manner, or a simple misconfiguration that leaves an entire database open to intrusion. You could hire an army of IT security professionals, but even they wouldn’t be able to manage the tens of thousands of security alerts that come into most security operations centers from today’s hybrid IT systems. Simply put, if you can’t protect your data, your company, and your job, are at risk.

But what if you could automate the fight? What if you could get ahead of the hacker with a built-in army of “robot cyber warriors” to protect your data automatically, with all patches automatically applied and configurations self-tuned and optimized?

Protect Your Data from the Inside Out

Last October, Oracle introduced the world’s first autonomous, self-securing database. Oracle Autonomous Database is designed with built-in adaptive machine learning to protect you from external hackers as well as malicious internal users. It encrypts all of your data automatically, ensuring comprehensive data protection. It applies security updates automatically, while your system is running, with no downtime.

24/7 Data Protection

How many unpatched IT assets does your company have right now? Even one is too many, because if it is detected, it can be exploited. Oracle Autonomous Database patches itself as soon as a patch is available, without relying on humans to remember to apply it. We all know that unpatched systems leave companies open to attack, yet sometimes leaving security vulnerabilities unaddressed is a company’s only option: your IT team needs time to bring the system down, patch it, and get it up and running again. During every minute of that time, all of your applications, databases, software, and servers are vulnerable for hackers to exploit. 
With Oracle Autonomous Database, patching happens automatically, while the system is running, without any downtime.

Secure Your Tomorrow Today

Protecting your most valuable assets has never been more important, and it’s something even the most powerful IT security team can’t do on their own. By facing the reality of a potential data breach with the power of a virtual cyber robot army, you protect your data, your brand, and your reputation. Oracle Autonomous Database eliminates the chance of human error and protects your data from internal and external malicious actors. You can’t stop a hacker from trying, but with Oracle Autonomous Database, you will always stop them from succeeding.

Join a Database month event: Inside the Mind of a Database Hacker with Penny Avril, VP of Oracle Database Server Technologies, and Mark Fallon, Lead Security Architect, Oracle Database


Securing the Oracle Database eBook - Second Edition Now Available

Today we have guest blogger - Michael Mesaros - Director of Product Management.

What every data owner should read before hackers and auditors come knocking! According to the Economist, data has surpassed oil as the most valuable asset. Data gives organizations unprecedented advantages, enabling them to find new ways to serve customers and create value. Your data is your asset, but unless you protect it well it could fall into the wrong hands and become a liability.

We hear reports about breaches almost daily, and by some estimates over 10 million records on average are lost or stolen each day worldwide. In addition, new laws and regulations such as the European Union’s GDPR are forcing organizations to take a hard look at how they manage and protect data. Since databases contain most of their sensitive data assets, organizations are now appreciating the importance of securing their databases.

Oracle Database provides the industry’s most comprehensive security. Read the latest eBook from Oracle, Securing the Oracle Database: A Technical Primer, authored by the Oracle Database Security Product Management team, to: learn the various approaches hackers use to try to gain access to your sensitive data; understand the multiple layers of assessment, preventive, and detective security controls you need to protect your data; and guide your teams with strategies to shrink the attack surface and keep your databases secure, both on-premises and in the cloud.

Use this book as a quick study into what every Database or Security Director/VP should know about the security of Oracle Databases. This book will help you answer questions such as: What are my options for authenticating and authorizing database users? How do I enforce separation of duties and limit access to data by administrators and other privileged users? How can I leverage encryption and key management to protect data in motion and at rest? 
How do I create application data sets that are safe to use in test, development and production environments? How do I audit database user activities and generate management and compliance reports? How do I monitor database activity and protect from attacks such as SQL injection? How do I leverage authorization technologies to build secure applications? How can I evaluate the security posture of my database, and understand what controls I can implement to manage risk? What is the EU GDPR, and how can database security technologies help with this and other regulatory compliance requirements? What do I need to know about securing databases in the cloud?

Breaches are happening faster than ever, and it is crucial that you are prepared with a sound database security strategy. Hackers aren’t resting in their endless quest to acquire your data, and we cannot risk resting either. Arm yourself with up-to-date information about these database security concepts. Let’s start by securing the source!

Download your eBook

Join a Database month event: Inside the Mind of a Database Hacker with Penny Avril, VP of Oracle Database Server Technologies, and Mark Fallon, Lead Security Architect, Oracle Database

Learn more about Oracle Database Security Solutions
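On the SQL injection question above, the standard first-line defense is bind parameters rather than string concatenation. Here is a minimal, self-contained illustration using Python's stdlib sqlite3 module; the same pattern applies to other database clients (Oracle clients, for instance, use named placeholders), and the table and payload are invented for the demo:

```python
import sqlite3

# In-memory demo database with two invented rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query,
# turning the WHERE clause into "name = 'alice' OR '1'='1'".
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a bind parameter treats the payload as a literal string value,
# so no row matches.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(vulnerable)  # both rows leak
print(safe)        # empty result
```

Monitoring tools add a second layer on top of this (spotting injection attempts in query traffic), but parameterized statements remove the vulnerability at the source.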


Database

The Future of Data Management is Autonomous.

Gartner again names Oracle as a Leader in Data Management Solutions for Analytics. At Oracle, we believe that we continue to demonstrate superior ability to execute by delivering groundbreaking innovations to the industry. Oracle revolutionized data management with the delivery of the world’s first autonomous database. Oracle Autonomous Database Cloud uses groundbreaking machine learning to enable automation that eliminates human labor, human error, and manual tuning, enabling unprecedented availability, high performance, and security at much lower cost. “The Oracle Autonomous Database is based on a technology as revolutionary as the internet,” says Larry Ellison, Oracle Executive Chairman and CTO.

Autonomous Database: uses machine learning to automatically upgrade, patch, and tune itself; recognizes unusual behavior and fixes problems before they become outages, ensuring 99.995% reliability; and encrypts data by default, applies security patches automatically, and protects from internal and external attacks.

Transform Your Data Management Today. Read the Oracle newsletter featuring Gartner content for more details: https://www.gartner.com/technology/media-products/newsletters/oracle/1-4TXFZ4K/index.html

Gartner Magic Quadrant for Data Management Solutions for Analytics, Adam M. Ronthal, Roxane Edjlali, Rick Greenwald, 13 February 2018. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.


Oracle Collaborate 18 Key Sessions for Data Management Cloud Services

Oracle Collaborate 18 – Technology and Application Forum – is coming to Las Vegas, Nevada, April 22-26. It offers 1,200 educational sessions and events, 5,000 peer professionals, 200 exhibitors, and unlimited ideas and opportunities. To attend this event, please register here.

Below are Oracle’s Data Management Cloud sessions:

Steve Daheb, Senior Vice President, Oracle Cloud. Keynote: “Oracle Cloud — How to Build Your Own Personalized Path to Cloud.” Monday, April 23rd, 2:30 p.m.–3:30 p.m., Mandalay Bay Ballroom F

Monica Kumar, Vice President, Product Marketing, Oracle. “Revolutionize Your Data Management with world's 1st Autonomous Database.” Monday, April 23rd, 4:15 p.m.–5:15 p.m., Jasmine G. “The Future of Autonomous Cloud.” Wednesday, April 25th, 11:00 a.m.–12:00 p.m., Jasmine G

Sachin Sathaye, Sr. Director, Cloud Platform Services, Oracle. “Your journey to cloud with Choice and Control.” Monday, April 23rd, 4:15 p.m.–5:15 p.m., Lagoon L


The Forgotten Link To The Cloud

Today we have guest blogger - Francisco Munoz Alvarez - an author and popular speaker at many Oracle conferences around the world.

Last year, when presenting a session at Collaborate’17 in Las Vegas on tips and best practices for DBAs, I went through the evolution of the DBA profession and gave a few tips on how a DBA can improve and be successful in his or her career. After my session, many people approached me with questions about what will happen to the DBA profession with the introduction of cloud into our lives. Will the DBA's workload be fully automated, and will the DBA profession disappear? Should I be afraid of the cloud? Should I start looking for a new career? Many more questions like these made me aware of an unexpected situation: DBAs are blocking many possible cloud endeavors for their organizations because they are scared of what those endeavors would mean for their future in the industry. So, after discovering this unexpected situation, I decided to write this post to share my perspective on this important topic.

Automation vs. Autonomous

Let's start by clearing up some common confusions and misunderstandings. For the past 10 years of my career, I have been recommending that DBAs automate most (if not all) business-as-usual (BAU) work and concentrate on becoming as proactive as possible (if you cannot automate a BAU process, delegate it), because you have more important things to do! I keep recommending this because I constantly watch DBAs lose too much time on BAU work, which leaves them unable to spend time on career development (training and learning about new technologies), on important projects such as optimization, security, performance tuning, high availability, migrations and upgrades, and on new technologies that could seriously benefit the business and deliver the best ROI on company resources. 
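As a concrete illustration of the kind of BAU work worth automating, here is a minimal sketch of a check a DBA might script and schedule instead of running by hand. The threshold and sample rows are invented; in a real script the rows would come from a query against a view such as Oracle's DBA_TABLESPACE_USAGE_METRICS, and the alert would go to email or a chat channel:

```python
def tablespaces_to_alert(usage, warn_pct=85.0):
    """Return tablespace names whose used percentage meets the threshold.

    `usage` is a list of (tablespace_name, pct_used) rows. In a real
    monitoring script these rows would come from the database; here they
    are passed in directly so the sketch is self-contained.
    """
    return sorted(name for name, pct in usage if pct >= warn_pct)

# Invented sample data standing in for a real query result.
usage = [("SYSTEM", 62.1), ("USERS", 91.4), ("SYSAUX", 88.0)]

for name in tablespaces_to_alert(usage):
    print(f"WARNING: tablespace {name} is over 85% full")
```

The point is not this particular check, but the pattern: once the routine task is a script on a schedule, the DBA's time is freed for the proactive work described above.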
I would like to use the automotive industry as an example to clarify the difference between automation and autonomy. I love to drive my car; I love to be behind the wheel and enjoy the experience. Recently I bought a new car that includes many driver-assistance technologies (automation): adaptive cruise control (which automatically adjusts the vehicle's speed to maintain a safe distance from vehicles ahead, and can even stop the car if necessary), lane-change alert, collision alert (which warns you when a collision is imminent due to speed or proximity, and brakes for you if required), driver-behavior monitoring (which checks for fatigue and dangerous driving), automatic headlights and windscreen wipers, parking assistance, and much more. Many people might think all these features would detract from my driving experience, but remember: as the person in control (the driver), you choose which options are used and when, and they can be adjusted to your driving requirements. So they did not affect me at all; on the contrary, they made my driving easier and safer, allowing me to enjoy it even more. We have also been talking about fully autonomous cars (which drive themselves; you just tell the car where you are going) for a long time, and many companies (Toyota, Tesla, Google, and Uber, to name a few) are investing resources in this type of technology and constantly testing it in public. We know this is the future, and we know it is coming, though not anytime soon (not this year or the next), and when the time comes, the global population will adopt it gradually. The Oracle Database world is very similar to the example above. Automation within the database is a reality and badly needed; autonomous databases are coming and we cannot stop them. So let's evolve and be prepared in time; we still have a window before everyone starts adopting them gradually. Cloud, the inevitable next step in the DBA DNA evolution! 
The constant evolution of IT has, among other things, affected the role of the DBA. Today the DBA is no longer merely a Database Administrator but is morphing into a Database Architect. If you want to become a successful DBA (Database Architect) and be more competitive in the market, you need a different skill set than was required in the past. Nowadays, you need a wide-ranging understanding of architectural design, Cloud, networking, storage, licensing, versioning, automation, and much more; the more knowledge you have, the better opportunities you will find. We know without doubt that our future involves automation and Cloud technologies, so why fight it? Why keep wasting our time and energy resisting it? Let's take advantage of it now! So, what is next? First, learn to change yourself. If you want to become a successful professional, first you need to educate yourself to be successful! Your future success depends only on your attitude today. You control your career, nobody else! Becoming a successful DBA is a combination of: Your professional attitude: always think positively and look for solutions instead of drowning in a glass of water. Learning how to research: before doing something, investigate, search the internet, read the manuals. You need to show that you know how to do proper research and find solutions to your problems yourself. Being innovative: don't wait for others to do your job, and don't slack off just because the other DBAs don't care about the business. Learn to innovate, learn to become a leader, and make everyone follow your example through results. Think different! Learning to communicate properly: the best way to learn to communicate effectively is to learn to listen first. Listen, analyse what was expressed, and only then communicate an answer in a professional and honest way to your peers. Always treat everyone the way you would like to be treated. 
Albert Einstein once said: “If I had one hour to save the world, I would spend fifty-five minutes defining the problem and only five minutes finding the solution.” Second, learn to be proactive. Why check for problems only when they are critical, or when it is too late, the database is down, and the users are screaming? Being proactive is the best way to keep your database healthy and to show your company, or your clients, that you really care about them. Many DBAs spend most of their time as firefighters, fixing problems and working on user requests all the time. They don't do any proactive work, and this mentality only causes them an overload of work, thousands of dollars of overtime, several hours in which users cannot access their data, poor application performance, and, worst of all, unhappy users who think you don't have the knowledge needed to take care of their data. Here is a small example: you have the archive log area alert set to fire when the area is 95% full, and it fires in the middle of the night. Some DBAs will take the alert seriously and solve the problem quickly; others will wait until the next day to take care of it because they are tired, or asleep, or somewhere without internet access when the alert arrives. It would be a lot easier to set a proactive alert to fire at 75% or 85%, or, better yet, to look at the general health of the database before leaving your shift, so you can detect and solve a potential problem before it becomes a real one that wakes you in the middle of the night or over the weekend (remember how important your personal and family time is). I always recommend that DBAs run two checklists daily: one at the start of their shift and another before they leave. 
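The two-threshold idea above can be sketched in a few lines. This is only an illustration: the 75%/95% thresholds come from the example in the post, while the `used_pct` input and the `notify` callback are hypothetical stand-ins for a real monitoring query and paging system.

```python
# A minimal sketch of proactive vs. reactive alerting on archive log space.
# The thresholds match the post's example; check_archive_area() and its
# notify callback are illustrative assumptions, not a real monitoring API.

WARN_THRESHOLD = 75   # proactive: handled calmly during working hours
CRIT_THRESHOLD = 95   # reactive: the middle-of-the-night page we want to avoid

def check_archive_area(used_pct, notify):
    """Classify archive log area usage and send the appropriate alert."""
    if used_pct >= CRIT_THRESHOLD:
        notify(f"CRITICAL: archive log area {used_pct}% full")
        return "critical"
    if used_pct >= WARN_THRESHOLD:
        notify(f"WARNING: archive log area {used_pct}% full, clean up now")
        return "warning"
    return "ok"

alerts = []
check_archive_area(80, alerts.append)   # fires the early daytime warning
print(alerts[0])
```

Run as part of an end-of-shift checklist, the early warning gives you the whole working day to react, instead of the 95% alert's few-minutes scramble at 3 a.m.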
I know several DBAs who complain all the time about how many calls they get while on call, but who do nothing to solve the root problem; they only spend their time on the symptoms. So let's change our mentality: let's stop being firefighters and start being real heroes! Third, educate and prepare yourself for the future. Finally, here are some things you should concentrate on learning and improving: How to manage different RDBMS technologies (for example MySQL, SQL Server, DB2). How to manage NoSQL technologies (for example Cassandra, Druid, HBase, and MongoDB). How to resolve unavailability issues. Execute recovery tests from current and old backups, and document the process for your company's Disaster Recovery Plan (DRP). Ensure your company's RPO and RTO SLAs are being met by your high-availability plan and your backup and recovery strategy. Gain deep knowledge of performance tuning. Learn how your applications work and how they interact with the database and middle layers. Learn how to review and implement security. Keep up with DB trends and technologies, and use new technologies where applicable (for example Kafka, microservices, containers, virtualization). Know how to perform storage and physical design. Diagnose, troubleshoot, and resolve any DB-related problems. Ensure that Oracle networking software is configured and running properly. Mentor and train new DBAs (this allows you to review and learn new things). Learn about XML, Java, Python, PHP, HTML, and Linux, Unix, and Windows scripting. Automate all BAU work or delegate it. Implement capacity and hardware planning. Architect, deploy, and maintain Cloud environments. Improve your SQL and PL/SQL skills, and review the SQL and PL/SQL code in your environment. 
Control and execute code promotions to production environments. Master Cloud technologies (IaaS, DBaaS, PaaS, and SaaS). As you can easily see, we DBAs have a lot of things to do and learn about, so stop losing time on BAU work, because you have far more important things ahead of you. Embrace the future, the Cloud wave, the change, and the evolution. Do not stay in the past; it will only hold back you and your career. Francisco’s Bio: Francisco Munoz Alvarez is an author and popular speaker at many Oracle conferences around the world. He is also the President of CLOUG (Chilean Oracle Users Group), APACOUC (APAC Oracle Users Group Community, the umbrella organization for all of APAC), IAOUG (Independent Australia Oracle Users Group), and NZOUG (New Zealand Oracle Users Group). He worked on the first team to introduce Oracle to South America (Oracle 6 and the beta version of Oracle 7). He was also the first Master Oracle 7 Database Administrator in South America, as well as the first Latin American Oracle professional to be awarded a double ACE (ACE in 2008 and ACE Director in 2009) by Oracle HQ. In 2010, he had the privilege of receiving a prestigious Oracle Magazine Editor's Choice Award as Oracle Evangelist of the Year, a huge recognition for his outstanding achievements in the Oracle world, which include the creation and organization of the already famous OTN Tours, the biggest Oracle evangelist events in the world. Currently, Francisco works for Data Intensity, a global leader in data management consulting and services, as the Director of Innovation. He also maintains an Oracle blog (http://www.oraclenz.org), and you can always contact him through the blog or Twitter (@fcomunoz) with any questions about Oracle. 
Follow Us On Social Media: Blog: https://blogs.oracle.com/database/ Twitter: https://twitter.com/oracledatabase Facebook: https://www.facebook.com/OracleDatabase LinkedIn: https://blogs.oracle.com/databaseinsider/  


Database

Oracle Exadata Cloud Service Certified for SAP Applications

Today we have guest blogger - Bertrand Matthelie - Senior Principal Product Marketing Director. In an earlier blog, I described how moving SAP applications & Oracle databases to Oracle Cloud enables customers to preserve existing investments while accelerating innovation, relying on the only cloud architected for enterprise workloads and optimized for Oracle Database. Further enhancing the unique value of Oracle Cloud, SAP applications based on NetWeaver 7.x have now been certified on Oracle Exadata Cloud Service. Oracle Exadata is the best-performing, most available, and most secure architecture for running Oracle Database. Oracle Exadata Cloud Service enables you to: Get the full performance of Oracle Exadata in the cloud Combine the benefits of on-premises and cloud Increase business agility and operational flexibility with zero CapEx Scale out quickly and easily The Oracle Exadata Database Machine has proven to be a very popular solution for powering SAP deployments. Customers running SAP applications with Exadata on-premises can easily move their SAP workloads to Exadata Cloud Service and benefit from Oracle’s BYOL to PaaS program. All features and options of Oracle Database 12c Release 1 (12.1.0.2) and Release 2 (12.2.0.1) supported for on-premises deployments of SAP NetWeaver, including Real Application Clusters (RAC), Automatic Storage Management (ASM), and Oracle Database In-Memory, are supported and certified for Exadata Cloud Service. For more information, read SAP Note 2614028 "SAP NetWeaver Application Server ABAP/Java on Oracle Database Exadata Cloud Service" and our White Paper "SAP NetWeaver Application Server ABAP/Java on Oracle Database Exadata Cloud Service". Let us know if you have any questions or comments. 


Database

Autonomous Capabilities Will Make Data Warehouses—and DBAs—More Valuable

Today we have guest blogger - Alan Zeichick - principal analyst at Camden Associates. As the old saying goes, you can’t manage what you don’t measure. In a data-driven organization, the best tools for measuring performance are business intelligence (BI) and analytics engines, which require data. Data warehouses often provide the source of that data, by rolling up and summarizing key information from a variety of sources, and that explains why they continue to play such a crucial role in business. Data warehouses, which are themselves relational databases, can be complex to set up and manage on a daily basis, so they typically require significant human involvement from database administrators (DBAs). In a large enterprise, a team of DBAs ensures that the data warehouse is extracting data from those disparate data sources, as well as accommodating new and changed data sources. They’re also making sure the extracted data is summarized properly and stored in a structured manner that can be handled by other applications, including those BI and analytics tools. On top of that, DBAs manage the data warehouse’s infrastructure: server processor utilization, storage efficiency, data security, backups, and more. However, the labor-intensive nature of data warehouses is about to change with the advent of Oracle Autonomous Data Warehouse Cloud, announced in October 2017. The self-driving, self-repairing, self-tuning functionality of Oracle’s Data Warehouse Cloud is good for the organization, and good for the DBAs. No Performance-Tuning Knobs Data-driven organizations need timely, up-to-date business intelligence, which can feed instant decision-making, short-term predictions and business adjustments, and long-term strategy. If the data warehouse goes down, slows down, or lacks some information feeds, the impact can be significant. 
No data warehouse may mean no daily operational dashboards and reports, or inaccurate dashboards and reports. Oracle Autonomous Data Warehouse Cloud is a powerful platform because the customer doesn’t have to worry about the system itself, explains Penny Avril, vice president of product management for Oracle Databases. “Customers don’t have to worry about the operational management of the underlying database—provisioning, scaling, patching, backing up, failover, all of that is fully automated,” she says. “Customers also don’t have to worry about performance. There are no performance knobs for the customer: DBAs don’t have to tweak anything themselves.” For example, one technique used to drive Autonomous Data Warehouse’s performance is automating the creation of storage indexes, which Avril describes as the top challenge faced by database administrators. Those indexes allow applications to quickly extract the data required to handle routine reports or ad-hoc queries. “DBAs manually create custom indexes when they manage their own data warehouse. Now, the autonomous data warehouse transparently, and continually, generates indexes automatically based on the queries coming in,” she says. Those automatically created indexes keep performance high, without any manual tuning or intervention required by DBAs. Related: Join Oracle CEO Mark Hurd for the release of the world’s first autonomous database cloud service. Register now. The organization can also benefit from the automatic scaling features of Autonomous Data Warehouse. When the business requires more horsepower in the data warehouse to maintain performance during times of high utilization, the customer can add more processing power by adding more CPUs to the cloud service, for which there is an additional cost. However, Avril says, “Customers can scale back down again when the peak demand is over”—eliminating that extra cost until the next time the CPUs are needed. 
Customers can even turn off the processing entirely if needed. “When a customer suspends the service, they pay for storage, but not CPU,” she says. “That’s great for developers and test beds. It’s great for ad-hoc analytics for people running queries. When you don’t need a particular data warehouse, you can just suspend it.” Freedom for the Database Administrator Performance optimization, self-repairing, self-securing, scalability up and down—those benefits serve the organization. What about the poor DBA? Is he or she out of work? Not at all, says Avril, laughing at the question. “They can finally tackle the task backlog,” adding more value to the business, she says. Avril explains that DBAs do two types of day-to-day work. “There are generic tasks, common to all databases, including data warehouses. And there are tasks that are specific to the business. With Oracle’s Autonomous Data Warehouse, the generic tasks go away. Configuring, tuning, provisioning, backup, optimization—gone.” That leaves the good stuff, she explains: “If they aren’t overloaded with generic tasks, DBAs can do business-specific tasks, like data modeling, integrating new data sources, application tuning, and end-to-end service level management.” For example, DBAs will have to manage how applications connect to the data warehouse—and what happens if things go wrong. “If the database survives a failure through failover, does the application know to fail over instantly and transparently? The DBA still needs to manage that,” Avril says. In addition, data security still must be managed. “Oracle will take care of patching the data warehouse itself, but Oracle doesn’t see the customer’s data,” she says. “DBAs still need to understand where the data lives, what the data represents, and which people and applications should get to see which data.” No need for a resume writer: DBAs will still have plenty of work to do. 
For C-level executives, Autonomous Data Warehouse can improve the value of the data warehouse—and the responsiveness of business intelligence and other important applications—by improving availability and performance. “The value of the business is driven by data, and by the usage of the data,” says Avril. “For many companies, the data is the only real capital they have. Oracle is making it easier for the C-level to manage and use that data. That should help the bottom line.” For the DBA, Autonomous Data Warehouse means the end of generic tasks that, on their own, don’t add significant value to the business. Stop worrying about uptime. Forget about disk-drive failures. Move beyond performance tuning. DBAs, you have a business to optimize. Alan Zeichick is principal analyst at Camden Associates, a tech consultancy in Phoenix, Arizona, specializing in software development, enterprise networking, and cybersecurity. Follow him @zeichick. 


Autonomous Database

Autonomous vs. Automated

The invention of the telephone in 1876 made an immense impact on human communication, paving the way for an explosion of inventions and business opportunities. In the late 1970s the dawn of the personal computer ushered in an even more intimate connection with technology. The Internet became the ultimate technological disruption in 1991, when it enabled easy communication around the globe with personalized content delivery. Fast-forward to 2007, when the iPhone let us carry our computer in our pocket and engage with technology anytime, anywhere. Since then, technological disruption has been the norm, not the exception, driven in large part by advances in Artificial Intelligence (AI) and machine learning. In 2008, Tesla made the dream of the electric car real, and just last year Elon Musk showcased the first fully autonomous self-driving car. When it comes to cars, some might think of autonomous, automated, and self-driving as interchangeable terms, but there’s a critical difference to note. An automated car allows drivers to hand over limited functions—think cruise control—but the driver still must keep overall manual control of the vehicle. An autonomous car, by contrast, eliminates the need for human interaction, so the driver can sit back and enjoy the ride—reducing the stress of the commute and giving them a chance to focus on other important activities. And it’s all thanks to AI that an autonomous car has a level of intelligence and independence that only machine learning can bring. Machine learning is a subset of artificial intelligence with the sophistication to discover hidden opportunities, accelerate tedious processes, and identify which data insights matter. And the good news is that it’s not limited to cars. At Oracle OpenWorld last year, Oracle Chairman of the Board and CTO Larry Ellison unveiled his vision for the world’s first autonomous database cloud that is self-driving, self-securing, and self-repairing. 
"This is the most important thing Oracle has done in a long, long time. The same way self-driving cars open a new world of possibilities in our life, a self-driving database will bring the world of technology to a different level: reduced risk and cost, unprecedented availability, performance, security, flexibility." Building on the next generation of the industry-leading Oracle Database 18c, the Oracle Autonomous Database Cloud uses ground-breaking machine learning to eliminate human labor, human error, and manual tuning. The result? Unprecedented availability, high performance, and security, all for a much lower cost. The Oracle Autonomous Database is “self-driving,” meaning that it autonomously upgrades and patches itself while running. No human intervention required. The Oracle Autonomous Database, like the technological innovations that preceded it, didn’t just happen overnight. Oracle has been developing sophisticated database automation for decades, and has invested thousands of engineer-years automating key database functions, as this journey map shows. Oracle Database 18c, the latest generation of the world’s most popular database, is now available. It provides businesses of all sizes with access to the world’s fastest, most scalable, and most reliable technology. Oracle’s self-driving database disrupts the world of database management in the same way self-driving cars are changing the way we commute and revolutionizing the transportation industry. What does it mean for Oracle customers when the World’s #1 Database becomes the World’s 1st Autonomous Database? The payoff is huge. Because no human intervention is needed, the Oracle Autonomous Database eliminates mundane management tasks and reduces labor, costs, and errors, all while increasing security and availability. Game-changing technologies of the past, such as the phone, the computer, and the internet, fundamentally changed lives, opened new opportunities and innovations, and allowed us to dream big. 
The Oracle Autonomous Database Cloud is revolutionizing how data is managed, enabling faster, easier data access, helping to unlock the potential of your data so your business can benefit – and dream big. Read more about the Oracle Autonomous Database Cloud.


Autonomous Database

Education Available at COLLABORATE 18

Today, we have guest blogger - Moe Fardoost - Senior Director, Cloud and Enterprise Manager. Whether organizations are seeking to modernize their Oracle on-premise solutions, evaluate a path to the cloud, or optimize business already in the cloud, COLLABORATE offers a unique opportunity for Oracle users to learn how to maximize existing Oracle investments and take them to the next level. This year's program at COLLABORATE 18 features key themes such as how to achieve breakthrough value from existing on-premise and database offerings; how to increase the value of on-premise investments by migrating, connecting, and extending with Oracle's Cloud solutions; and how emerging technologies impact established business processes. COLLABORATE 18 offers more than 1,200 educational sessions, with 230 being led by Oracle speakers. For Oracle on-premise customers, sessions include 314 focused on Oracle E-Business Suite, 214 on JD Edwards, 179 on PeopleSoft, and 99 on Hyperion. Steve Daheb, Senior Vice President for Oracle Cloud, will present the Oracle keynote: Oracle Cloud – How to Build Your Own Personalized Path to Cloud. Attendees can then select from over 500 sessions spanning how Oracle's SaaS, IaaS, and PaaS solutions can accelerate business transformation, increase agility, and optimize security with their existing solutions. COLLABORATE, the premier annual technology and applications forum for the Oracle community, will take place this year April 22-26 at the Mandalay Bay Resort and Casino in Las Vegas, Nevada. Hosted by three Oracle user groups – IOUG, OAUG, and Quest International Users Group – the five-day user conference will host more than 5,000 attendees in keynotes, sessions, workshops, networking events, and an exhibitor showcase with 200+ vendors. 
Below are some of the sessions to explore at COLLABORATE 18:
What's New in Oracle Database 18c - William Hardie, Vice President, Oracle Database Product Management [Session ID: 1162]
Preview of Oracle Autonomous Database - Maria Colgan, Master Product Manager, Oracle [Session ID: 1363]
Revolutionize Your Data Management with World's 1st Autonomous Database - Monica Kumar, Vice President, Product Marketing, Oracle [Session ID: 1721]
The Future of Autonomous – Powered by Oracle - Monica Kumar, Vice President, Product Marketing, Oracle [Session ID: 1162]
Upgrade and Migrate to Oracle Database Cloud - Roy Swonger, Vice President, Database Upgrades & Utilities, Oracle [Session ID: 1175]
Hands-On Lab: Upgrade and Migrate to Oracle Database 18 - Roy Swonger, Vice President, Database Upgrades & Utilities, Oracle [Session ID: 1176]
Oracle Database In-Memory: Past, Present, and Future - Tirthankar Lahiri, Vice President, Data and In-Memory Technologies, Oracle [Session ID: 1337]
Check out the complete agenda, where you can search by keyword or apply filters by education track, product line, business goal, or industry. You can also create a profile to build and save your personal education plan. Haven't registered? What are you waiting for? For more information, visit attendcollaborate.com and register today!  


Database

Does GPU hardware help Database workloads?

Graphics processing units, or GPUs, are dedicated, highly parallel hardware accelerators that were originally designed to accelerate the creation of images. More recently, folks have been looking at GPUs to accelerate other workloads, like database analytics and transaction processing (OLTP). Although GPUs have little or no use for OLTP-style workloads, they have been shown to accelerate analytics. So, what kind of benefits can you expect from running the Oracle Database on a GPU, and where are you likely to see these benefits? The Oracle Database has a long history of adopting new technologies as they become available and allowing customers to take advantage of them transparently in their existing applications. Oracle is continuing this tradition by taking advantage of the latest hardware and software technologies to provide dramatically faster analytics performance. Oracle has already released Oracle Database In-Memory, which uses columnar in-memory formats to greatly accelerate analytics. The columnar in-memory algorithms make extensive use of the SIMD vector instructions already present in CPUs. SIMD vector instructions accelerate analytics by processing many data elements in a single instruction, and they benefit from full access to the very large caches and memory bandwidth of current CPU sockets. An advantage of SIMD vector instructions is that they are present in all existing CPUs, so they add no further cost, complexity, or power usage on top of the existing hardware. Oracle continues to rapidly add new SIMD vector algorithms to the database to take further advantage of these specialized instructions. SIMD vector processing has already delivered very large analytics performance gains in the Oracle Database, and customers should expect additional gains from new and improved uses of these instructions in future database releases. 
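The columnar filter-and-aggregate work that SIMD instructions accelerate can be pictured with a tiny sketch. The sales table and its values below are made up purely for illustration; the point is the shape of the computation, not the data.

```python
# The analytic access pattern that columnar SIMD processing accelerates:
# one simple predicate and one simple aggregate, applied to every value.
# A columnar store keeps each column contiguous in memory, which is what
# lets a single SIMD compare instruction test many values at once.

regions = ["EMEA", "APAC", "EMEA", "AMER"]   # one contiguous list
amounts = [120.0,  75.5,  310.25, 42.0]      # per column

# Equivalent of: SELECT SUM(amount) FROM sales WHERE region = 'EMEA'
emea_total = sum(a for r, a in zip(regions, amounts) if r == "EMEA")
print(emea_total)  # 430.25
```

In a vectorized engine the same predicate runs over blocks of column values instead of one row at a time; the query itself never changes, which is the transparency benefit the post describes.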
What's great about this approach is that these performance gains are transparent to applications and require no additional hardware or effort from the customer beyond installing the software. Further, Oracle has been actively working with Intel and other chip vendors for many years to add SIMD vector instructions to CPUs for the specific purpose of accelerating Oracle Database algorithms. Some of these instructions are now becoming available, and more will arrive as new CPU chips are released over the next few years. GPUs offer the potential to further accelerate analytic processing through two mechanisms: Adding more parallel processing Using higher-bandwidth, but much smaller, specialized memory called High Bandwidth Memory (HBM) Oracle is actively working with the major GPU vendors to implement database algorithms that use these devices. But current-generation GPUs have several disadvantages: They are heavily oriented toward floating-point and other numeric processing, so the large majority of the processing power available in these devices is not useful for accelerating database algorithms. These devices sit on the PCI bus and don't have direct access to the server's DRAM memory. Instead, GPUs have their own local high-bandwidth memory, but its size is one to two orders of magnitude smaller than the server memory. All data that a GPU processes must be moved back and forth across the PCI bus from the main CPUs. It is important to understand the basic architectural benefits and tradeoffs of GPUs in order to see where they provide the most value. The huge number of parallel computation engines in these devices excels at accelerating tasks that require large numbers of computations on small amounts of data. GPUs are extremely effective for blockchain applications because these require billions of computations on a few megabytes of data. 
GPUs are great for deep learning, since it performs repeated computational loops over megabytes to gigabytes of data. GPUs are great for graphics, because three-dimensional imaging requires millions of computations on every image. The pattern is the same: lots of computation on modest amounts of data. Database analytics has a completely different pattern of data usage. Analytics typically performs a small number of simple calculations on large amounts of data, often hundreds of gigabytes to petabytes. For example, a typical analytic query will apply a simple predicate (e.g. filter sales by date or region) and then perform a simple aggregation function (e.g. sum or average). Note that this usage pattern is the exact opposite of the GPU sweet spot described above. Because the data being processed is much larger than can fit in the local GPU memory, data must be moved back and forth across the PCI bus. This limits the total throughput to the PCI bus bandwidth, which is dramatically lower than the local memory bandwidth. This doesn't mean that GPUs provide no benefits for analytics, but users should not expect the dramatic gains seen in other applications. It is just not architecturally possible. Oracle, and other vendors, have found that some database analytics algorithms can in fact run faster on GPUs than with conventional processing methods. However, care should be taken when reading performance comparisons, because the analytic landscape is rapidly changing. As mentioned before, databases are increasingly taking advantage of SIMD vector instructions. The comparisons often published showing huge advantages for GPUs usually contrast performance using traditional database algorithms against new and highly optimized GPU algorithms. 
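The PCI-bus bottleneck argument above can be checked with back-of-envelope arithmetic. All bandwidth figures below are rough illustrative assumptions (on the order of a PCIe 3.0 x16 link, a two-socket server's DRAM, and GPU HBM of that era), not measurements of any specific product.

```python
# Back-of-envelope: time to scan 1 TB of columnar data.
# Bandwidth numbers are rough illustrative assumptions, not measurements.

DATA_GB   = 1024   # scan 1 TB of table data
PCIE_GBPS = 16     # host-to-GPU transfer rate: the bottleneck
DRAM_GBPS = 100    # CPU scanning data already resident in server memory
HBM_GBPS  = 700    # GPU local memory: very fast, but data must get there first

gpu_scan_s = DATA_GB / PCIE_GBPS   # capped by the bus, not by HBM speed
cpu_scan_s = DATA_GB / DRAM_GBPS

print(f"GPU (PCIe-bound): {gpu_scan_s:.0f} s; CPU (DRAM-bound): {cpu_scan_s:.1f} s")
```

Under these assumed numbers the CPU scan finishes several times sooner, even though the GPU's local memory is far faster than DRAM: once the working set exceeds GPU memory, the bus sets the ceiling, which is exactly the architectural point the post makes.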
Furthermore, these comparisons often use easily available but un-optimized and un-parallelized open-source databases that are orders of magnitude slower than commercial databases for analytics. Oracle's internal benchmarking shows that comparing GPUs to current vector-instruction-optimized algorithms greatly narrows the performance advantage of GPUs. New vector-optimized algorithms and parallelization that Oracle will be releasing will further narrow the gap, and in most cases we find that a standard two-socket server using these new algorithms will deliver performance similar to a server with eight GPU cards. In addition to the big changes in software algorithms coming in the analytics area, there are big changes coming in hardware. PCI buses will get faster, and future GPUs will reduce their PCI bus communication disadvantages by adding direct high-bandwidth communication with the main CPUs. On the other hand, future CPUs may add support for High Bandwidth Memory, eliminating one of the main advantages of GPUs. In summary, Oracle is actively improving its analytic algorithms by further leveraging SIMD vector instructions and improving parallelism. The Oracle Database is already dramatically faster for analytics than it was a few years ago and will get much faster in the coming releases. Oracle is working with both conventional CPU vendors and GPU vendors to add new hardware capabilities that specifically optimize database processing. Current GPUs can be shown to run some analytic algorithms faster, but achieving these advantages in a non-benchmark environment is challenging because these algorithms only work for a subset of analytic functions, and data needs to be moved back and forth across the PCI bus. Oracle is also actively working on adapting its database algorithms to take transparent advantage of GPUs, and will release these algorithms if we find that the performance gains are sufficient and sustainable.
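The "simple predicate plus simple aggregate over a large column" pattern discussed in this section can be sketched in a few lines. Real engines run this as tight loops over contiguous column values, which modern CPUs execute with SIMD vector instructions; pure Python only models the shape of the computation, not the speed, and the data here is randomly generated for illustration:

```python
# Sketch of the canonical analytic query shape: filter one column by a
# predicate, then aggregate another (e.g. SUM(amount) WHERE region = :r).
from array import array
import random

random.seed(0)
N = 100_000
region = array('b', (random.randrange(10) for _ in range(N)))   # region codes 0..9
amount = array('d', (random.random() * 100 for _ in range(N)))  # sale amounts

def filtered_sum(region, amount, r):
    # Predicate (region == r) followed by aggregation (sum of amount).
    return sum(a for g, a in zip(region, amount) if g == r)

total = filtered_sum(region, amount, 3)
share = total / sum(amount)
print(share)  # one region of ten: roughly a tenth of the total
```

The work per row is trivial; the cost is dominated by how fast the engine can stream the columns through the CPU, which is exactly why SIMD instructions and memory bandwidth, rather than raw FLOPS, decide analytic performance.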


Oracle Database 18c: Now available on the Oracle Cloud and Oracle Engineered Systems

Today we have a guest blogger, Dominic Giles, Master Product Manager for Oracle Database, providing us with insights into what to expect from Oracle Database 18c. Oracle Database 18c's arrival marks a change in the way the world's most popular database is released. It brings new functionality and improvements to features already available in Oracle Database 12c. In this blog, I'll highlight what you can expect from this new release and where you can get additional information, but first let me address the new release model that the Database team has adopted.

Release schedule

Oracle Database 18c is the first version of the product to follow a yearly release pattern. From here onwards the Oracle Database will be released every year, along with quarterly updates. You can find more details on this change by visiting Oracle Support and taking a look at support Document 2285040.1, or on Mike Dietrich's blog. If you're confused as to why we've apparently skipped six releases, it may be simpler to regard "Oracle Database 18c" as "Oracle Database 12c Release 2 12.2.0.2", where we've simply changed the naming to reflect the year in which the product is released. We believe the move to a yearly release model and the simplification of the patching process will result in a product that introduces smaller changes more frequently, without the potential issues that a monolithic update brings.

Building on a strong foundation

Oracle Database 18c, as I mentioned earlier, is the next iteration of Oracle Database 12c Release 2, and as a result it has a lot of incremental enhancements that aim to improve upon this important release. With that in mind, let's remind ourselves what was in Oracle Database 12c Release 2. The release itself focused on three major areas: Multitenant is Oracle's strategic container architecture for the Oracle Database.
It introduced the concept of a pluggable database (PDB), enabling users to plug and unplug their databases and move them to other containers, either locally or in the cloud. The architecture enables massive consolidation and the ability to manage/patch/backup many databases as one. We introduced this architecture in Oracle Database 12c and extended its capabilities in Oracle Database 12c Release 2 with the ability to hot clone, online relocate, and provide resource controls for IO, CPU and Memory on a per-PDB basis. We also ensured that all of the features available in a non-container database are available for a PDB (Flashback Database, Continuous Query, etc.). Database In-Memory enables users to perform lightning-fast analytics against their operational databases without being forced to acquire new hardware or make compromises in the way they process their data. The Oracle Database enables users to do this by adopting a dual in-memory model where OLTP data is held both as rows, enabling it to be efficiently updated, and in a columnar form, enabling it to be scanned and aggregated much faster. This columnar in-memory format then leverages compression and software-in-silicon to analyze billions of rows a second, meaning reports that used to take hours can now be executed in seconds. In Oracle Database 12c Release 2 we introduced many new performance enhancements and extended this capability with new features that enabled us to perform In-Memory analytics on JSON documents, as well as significantly improving the speed at which the In-Memory column store is available to run queries after startup. Oracle Database Sharding, released in Oracle Database 12c Release 2, provides OLTP scalability and fault isolation for users that want to scale outside of the usual confines of a typical SMP server. It also supports use cases where data needs to be placed in a specific geographic location for performance or regulatory reasons.
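The geographic-placement idea can be made concrete with a toy routing sketch. The shard names, region sets, and function names below are invented for illustration; this is not Oracle's sharding API:

```python
# Toy illustration of shard-key routing. List sharding pins each region's
# rows to a shard in a chosen geography; hash sharding spreads keys evenly
# when placement does not matter. All names here are invented examples.
import hashlib

SHARD_REGIONS = {
    "frankfurt": {"DE", "FR", "NL"},   # EU rows stay in an EU data center
    "ashburn":   {"US", "CA"},
}

def route_by_list(region_code):
    # List sharding: explicit, user-defined placement by shard-key value.
    for shard, regions in SHARD_REGIONS.items():
        if region_code in regions:
            return shard
    raise KeyError(f"no shard configured for {region_code!r}")

def route_by_hash(key, n_shards):
    # System-managed hashing: stable, even spread, no locality control.
    digest = hashlib.sha256(str(key).encode()).hexdigest()
    return int(digest, 16) % n_shards

print(route_by_list("DE"))          # frankfurt
print(route_by_hash("order-42", 4)) # some shard in 0..3, always the same one
```

The trade-off is visible even in the toy: hashing balances load automatically, while list (or range) placement gives the administrator control over where a given key's data physically lives.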
Oracle Sharding provides superior run-time performance and simpler life-cycle management compared to home-grown deployments that use a similar approach to scalability. Users can automatically scale up the shards to reflect increases in workload, making Oracle one of the most capable and flexible approaches to web-scale workloads for the enterprise today. Oracle 12c Release 2 also included over 600 new features, ranging from syntax improvements to features like improved Index Compression, Real-Time Materialized Views, Index Usage Statistics, improved JSON support, enhancements to Real Application Clusters, and many, many more. I'd strongly recommend taking a look at the "New Features guide for Oracle Database 12c Release 2", available here.

Incremental improvements across the board

As you'd expect from a yearly release, Oracle Database 18c doesn't contain any seismic changes in functionality, but there are lots of incremental improvements. These range from syntax enhancements to improvements in performance; some will require that you explicitly enable them, whilst others will happen out of the box. Whilst I'm not going to be able to cover all of the many enhancements in detail, I'll do my best to give you a flavor of some of these changes. To do this I'll break the improvements into six main areas: Performance, High Availability, Multitenant, Security, Data Warehousing and Development.

Performance

For users of Exadata and Real Application Clusters (RAC), Oracle Database 18c brings changes that will enable a significant reduction in the amount of undo that needs to be transferred across the interconnect. It achieves this by using RDMA, over the InfiniBand connection, to access the undo blocks in the remote instance. This feature, combined with a local commit cache, significantly improves the throughput of some OLTP workloads when running on top of RAC.
This, combined with all of the performance optimization that Exadata brings to the table, cements its position as the highest-performance Database Engineered System for both OLTP and Data Warehouse workloads. To support applications that fetch data primarily via a single unique key, Oracle Database 18c provides a memory-optimized lookup capability. Users simply need to allocate a portion of Oracle's memory (SGA) and identify which tables they want to benefit from this functionality; the database takes care of the rest. SQL fetches are significantly faster as they bypass the SQL layer and utilize an in-memory hash index to reduce the number of operations that need to be performed to get the row. For some classes of application this functionality can result in upwards of a 4x increase in throughput with a halving of response times. To ease the maintenance work for In-Memory, it's also now possible to have tables and partitions automatically populated into and aged out of the column store. It does this by utilizing the Heat Map, such that when the column store is under memory pressure it evicts inactive segments if more frequently accessed segments would benefit from population. Oracle Database In-Memory gets a number of improvements as well. It now uses parallel lightweight threads to scan its compression units rather than process-driven serial scans. This is available for both serial and parallel scans of data, and it can double the speed at which data is read. This improves the already exceptional scan performance of Oracle Database In-Memory. Alongside this feature, Oracle Database In-Memory also enables Oracle Number types to be held in their native binary representation (int, float, etc.). This enables the data to be processed by the vector processing units on processors like Intel's Xeon CPUs much faster than previously. For some aggregation and arithmetic operations this can result in a staggering 40x improvement in performance.
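The memory-optimized lookup described earlier in this section boils down to a familiar idea: for fetch-by-unique-key workloads, probe an in-memory hash index directly instead of going through the general-purpose query path. The class below is a hedged, illustrative sketch of that idea only, not Oracle's implementation:

```python
# Illustrative sketch: a hash index over a hot table maps a unique key
# straight to its row, replacing a full parse/optimize/execute round trip
# with a single O(1) probe. Names and structure are invented for the sketch.

class KeyLookupFastPath:
    def __init__(self, rows, key_col):
        # Build the hash index once over the designated table.
        self.index = {row[key_col]: row for row in rows}

    def get(self, key):
        # Fast path: one hash probe, no SQL-layer work.
        return self.index.get(key)

rows = [{"id": 1, "name": "widget"}, {"id": 2, "name": "gadget"}]
fast = KeyLookupFastPath(rows, "id")
print(fast.get(2))   # {'id': 2, 'name': 'gadget'}
print(fast.get(99))  # None
```

The real feature keeps such an index inside the SGA and maintains it transparently; the sketch only shows why bypassing the SQL layer shrinks the per-fetch instruction count.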
Finally, In-Memory in Oracle Database 18c also allows you to place data from external tables in the column store, enabling you to execute high-performance analytics on data outside of the database.

High Availability

Whether you are using Oracle Real Application Clusters or Oracle Data Guard, we continue to look for ways to improve on the Oracle Database's high availability functionality, and with Oracle Database 18c we're rolling out a few significant upgrades. Oracle Real Application Clusters gets a hybrid sharding model. With this technology you can enjoy all of the benefits that a shared-disk architecture provides whilst leveraging some of the benefits that sharding offers. The Oracle Database will affinitize table partitions/shards to nodes in the cluster and route connections using the Oracle Database Sharding API based on a shard key. The benefit of this approach is that it formalizes a technique often used by application developers to improve buffer cache utilization and reduce the number of cross-shard pings between instances. It also has the advantage of removing the punitive cost of cross-shard queries, simply by leveraging RAC's shared-disk architecture. Sharding also gets some improvements in Oracle Database 18c in the form of "User Defined Sharding" and "Swim Lanes". Users can now specify how shards are to be defined using either the system-managed approach, "Hashing", or an explicit user-defined model of "Range" or "List" sharding. Using either of the last two approaches gives users the ability to ensure that data is placed in a location appropriate for its access. This might be to reduce the latency between the application and the database, or simply to ensure that data is placed in a specific data center to conform to geographical or regulatory requirements. Sharded swim lanes also make it possible to route requests through sharded application servers all the way to a sharded Oracle Database.
Users do this by having their routing layer call a simple REST API. The real benefit of this approach is that it can improve throughput and reduce latency whilst minimizing the number of connections the Oracle Database needs to manage. For users of Java in the database, we're rolling out a welcome fix that will make it possible to perform rolling patching of the database.

Multitenant

Multitenant in Oracle Database 18c gets a number of updates that continue to round out the overall architecture. We're introducing the concept of a Snapshot Carousel. This enables you to define regular snapshots of PDBs. You can then use these snapshots as a source for PDB clones from various points in time, rather than simply the most current one. The Snapshot Carousel might be ideal for a development environment or to augment a non-mission-critical backup and recovery process. I'm regularly asked if we support Multitenant container-to-container active/active Data Guard standbys, where some of the primary PDBs in one container have standby PDBs in an opposing container and vice versa. We continue to move in that direction, and in Oracle Database 18c we move a step closer with the introduction of "Refreshable PDB Switchover". This enables users to create a PDB which is an incrementally updated copy of a "master" PDB. Users may then perform a planned switchover between the PDBs inside the container. When this happens, the master PDB becomes the clone and the old clone becomes the master. It's important to point out that this feature is not using Data Guard; rather, it extends the incremental cloning functionality we introduced in Oracle Database 12c Release 2. In Oracle Database 18c, Multitenant also gets some Data Guard improvements. You can now automatically maintain standby databases when you clone a PDB on the primary. This operation will ensure that the PDB, including all of its data files, is created on the standby database.
This significantly simplifies the process needed to provide disaster recovery for PDBs when running inside a container database. We have also made it possible to clone a PDB from an Active Data Guard standby. This feature dramatically simplifies the work needed to provide copies of production databases for development environments. Multitenant also gets a number of small improvements that are still worth mentioning. We now support the use of backups performed on a PDB prior to it being unplugged and plugged into a new container. You can also expect upgrades to be quicker under Multitenant in Oracle Database 18c.

Security

The Oracle Database is widely regarded as the most secure database in the industry, and we continue to innovate in this space. In Oracle Database 18c we have added a number of small but important updates. A simple change that could have a big impact on the security of some databases is the introduction of schema-only accounts. This functionality allows schemas to act as the owners of objects but not allow clients to log in, potentially reducing the attack surface of the database. To improve the isolation of Pluggable Databases (PDBs), we are adding the ability for each PDB to have its own key store, rather than one for the entire container. This also simplifies the configuration of non-container databases by introducing explicit parameters, and hence removes the requirement to edit the sqlnet.ora file. A welcome change for some Microsoft users is the integration of the Oracle Database with Active Directory. Oracle Database 18c allows Active Directory to authenticate and authorize users directly, without the need to also use Oracle Internet Directory. In the future we hope to extend this functionality to include other third-party LDAP version 3-compliant directory services. This change significantly reduces the complexity needed to perform this task and, as a result, improves the overall security and availability of this critical component.
Data Warehousing

Oracle Database 18c's support for data warehousing gets a number of welcome improvements. Whilst machine learning has gotten a lot of attention in the press and social media recently, it's important to remind ourselves that the Oracle Database has had a number of these algorithms since Oracle 9i. So, in this release we've improved upon our existing capability by implementing some of them directly inside the database without the need for callouts, as well as adding some more. One of the compromises that data warehouse users have had to accept in the past was that if they wanted to use a standby database, they couldn't use no-logging to rapidly load data into their tables. In Oracle Database 18c that no longer has to be the case. Users can choose between two modes for accommodating the loading of non-logged data. The first ensures that standbys receive non-logged data changes with minimum impact on loading speed at the primary, but at the cost of allowing the standby to have transient non-logged blocks. These non-logged blocks are automatically resolved by managed standby recovery. The second ensures all standbys have the data when the primary load commits, but at the cost of throttling the speed of loading data at the primary, which means the standbys never have any non-logged blocks. One of the most interesting developments in Oracle Database 18c is the introduction of polymorphic table functions. Table functions are a popular feature that enables a developer to encapsulate potentially complicated data transformations, aggregations, security rules, etc. inside a function that, when selected from, returns the data as if it were coming from a physical table. For very complicated ETL operations these table functions can be pipelined and even executed in parallel. The only downside of this approach was that you had to declare the shape of the data returned as part of the definition of the function, i.e. the columns to be returned.
With polymorphic table functions, the shape of the data to be returned is determined by the parameters passed to the table function. This allows polymorphic table functions to be more generic in nature, at the cost of a little more code. One of my personal favorite features of this release is the ability to merge partitions online. This is particularly useful if you partition your data by some unit of time, e.g. minutes, hours, days, weeks, and at some stage, as the data is less frequently updated, you aggregate some of the partitions into larger partitions to simplify administration. This was possible in previous versions of the database, but the table was inaccessible whilst it took place. In Oracle Database 18c you can merge your partitions online and maintain the indexes as well. This rounds out a whole list of online table and partition operations that we introduced in Oracle Database 12c Release 1 and Release 2, e.g. move table online, split partition online, convert table to partition online, etc. For some classes of queries, getting a relatively accurate approximate answer fast is more useful than getting an exact answer slowly. In Oracle Database 12c we introduced the function APPROX_COUNT_DISTINCT, whose accuracy was typically 97% or greater but which can provide the result orders of magnitude faster. We added additional functions in Oracle Database 12c Release 2, and in 18c we provide some additional aggregation (on group) operations: APPROX_COUNT(), APPROX_SUM() and APPROX_RANK(). Oracle Spatial and Graph also added some improvements in this release. We added support for graphs in Oracle Database 12c Release 2, and now in Oracle Database 18c you can use the Property Graph Query Language (PGQL) to simplify the querying of the data held within them. Performance was also boosted with the introduction of support for Oracle Database In-Memory and list-hash partitioning. We also added a little bit of syntactic sugar for using external tables.
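The speed-for-accuracy trade behind approximate distinct counts comes from sketch algorithms that keep a small, fixed amount of state instead of every value seen. Below is a k-minimum-values (KMV) sketch, one classic technique of this family, shown purely to illustrate the idea; it is not necessarily the algorithm Oracle uses:

```python
# KMV sketch: remember only the k smallest distinct hash values. If hashes
# are uniform in [0, 1), the k-th smallest sits near k/n for n distinct
# values, so (k - 1) / h_k estimates n using O(k) memory.
import hashlib
import heapq

def approx_count_distinct(values, k=256):
    heap = []        # max-heap via negation: holds the k smallest hashes
    members = set()  # hashes currently in the sketch, to skip duplicates
    for v in values:
        # Hash each value to a pseudo-uniform point in [0, 1).
        h = int(hashlib.sha256(str(v).encode()).hexdigest(), 16) / 2**256
        if h in members:
            continue
        if len(heap) < k:
            heapq.heappush(heap, -h)
            members.add(h)
        elif h < -heap[0]:
            evicted = -heapq.heappushpop(heap, -h)
            members.discard(evicted)
            members.add(h)
    if len(heap) < k:
        return len(heap)  # fewer than k distinct values seen: count is exact
    return int((k - 1) / -heap[0])

print(approx_count_distinct(range(100)))  # 100 (exact: fewer than k values)
```

With k = 256 the estimate is typically within a few percent of the true distinct count, while the sketch itself never grows beyond k entries, which is why such functions can be orders of magnitude cheaper than an exact distinct.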
You can now specify the external table definition inline on an insert statement, so there's no longer any need to create definitions that are used once and then dropped.

Development

As you'd expect, there are a number of Oracle Database 18c improvements for developers, but we are also updating our tools and APIs. JSON is rapidly becoming the preferred format for application developers to transfer data between application tiers. In Oracle Database 12c we introduced support that enabled JSON to be persisted to the Oracle Database and queried using dot notation. This gave developers a no-compromise platform for JSON persistence with the power and industry-leading analytics of the Oracle Database. Developers could also treat the Oracle Database as if it were a NoSQL database using the Simple Oracle Document Access (SODA) API. This meant that whilst some developers could work using REST or Java NoSQL APIs to build applications, others could build out analytical reports using SQL. In Oracle Database 18c we've added a new SODA API for C and PL/SQL and included a number of improvements to functions that return or manipulate JSON in the database via SQL. We've also enhanced the support for Oracle Sharding and JSON. Global Temporary Tables are an excellent way to hold transient data used in reporting or batch jobs within the Oracle Database. However, their shape, determined by their columns, is persisted across all sessions in the database. In Oracle Database 18c we provide a more flexible approach with Private Temporary Tables. These allow users to define the shape of a table that is only visible for a given session, or even just a transaction. This approach provides more flexibility in the way developers write code and can ultimately lead to better code maintenance. Oracle Application Express, Oracle SQL Developer, Oracle SQLcl and ORDS have all been tested with 18c, and in some instances get small bumps in functionality, such as support for Sharding.
We also plan to release a REST API for the Oracle Database. This will ship with ORDS 18.1 a little later this year.

And One Other Thing…

We're also introducing a new mode for Connection Manager. If you're not familiar with what Connection Manager (CMAN) does today, I'd recommend taking a look here. Basically, CMAN acts as a connection concentrator, enabling you to funnel thousands of sessions into a single Oracle Database. With the new mode introduced in Oracle Database 18c, it's able to do a lot more. It can now automatically route connections to surviving database resources in the event of an outage. It can also redirect connections transparently if you relocate a PDB. It can load-balance connections across databases and PDBs whilst also transparently enabling connection performance enhancements such as statement caching and pre-fetching. And it can now significantly improve the security of incoming connections to the database. All in all, an exciting improvement to a great networking resource for the Oracle Database.

Where to get more information

We've covered some of the bigger changes in Oracle Database 18c, but there are many more that we don't have space to cover here. If you want a more comprehensive list, take a look at the new features guide here.
https://docs.oracle.com/en/database/oracle/oracle-database/18/newft/new-features.html

You can also find more information on the application development tools here:
http://www.oracle.com/technetwork/developer-tools/sql-developer/overview/index.html
http://www.oracle.com/technetwork/developer-tools/rest-data-services/overview/index.html
http://www.oracle.com/technetwork/developer-tools/sqlcl/overview/sqlcl-index-2994757.html
http://www.oracle.com/technetwork/developer-tools/apex/overview/index.html

If you'd like to try out Oracle Database 18c, you can do so with LiveSQL: https://livesql.oracle.com/apex/livesql/file/index.html

For more information on when Oracle Database 18c will be available on other platforms, please refer to Oracle Support Document 742060.1.


Autonomous Database

Oracle Data Management Solutions: Success From The Data Center To The Cloud

Oracle is off to a strong start in 2018. Industry analyst firm Gartner has recognized Oracle as a leader among 22 technology providers evaluated for their offerings in data management and analytic solutions. Gartner evaluates solution providers based on a variety of criteria, including ability to execute and completeness of vision. Based on these parameters, Gartner has published its latest February 2018 Magic Quadrant for Data Management Solutions for Analytics, and it is our opinion that Oracle stood out from the competition by offering truly meaningful technologies. The announcement of Oracle Autonomous Database and Autonomous Data Warehouse Cloud enables Oracle to deliver new automation innovations in cloud-based data management services.

Figure-1: Gartner February 2018 Magic Quadrant for Data Management Solutions for Analytics

Oracle's complete, unified and highly automated data management portfolio provides customers a seamless path to the cloud. As customers become familiar with and find value in the Oracle Database or Oracle Database Cloud Service, there is a natural expansion into common cloud services such as database backup or IaaS, but also data monetization solutions such as data warehousing, BI, analytics or even big data.

Figure-2: Oracle Solutions For the Data Life-Cycle

One of the big challenges for customers is that many public cloud providers lack support for a true hybrid-cloud deployment, which is the model that is predicted to be the norm for most organizations by the end of 2018. Oracle offers true enterprise-grade cloud solutions that are engineered for a 100% compatible hybrid-cloud deployment. Whether your business requires data to remain on-premises, managed inside your data center, or deployed in the cloud, with Oracle you have access to the same enterprise-scale data management technology virtually anywhere.
Read the Gartner report: http://www.gartner.com/reprints/?id=1-4O3NVDI&ct=180109&st=sb

For more information on Oracle Data Management Cloud Services, please visit us at https://cloud.oracle.com/en_US/data-management

Gartner Magic Quadrant for Data Management Solutions for Analytics, Adam M. Ronthal, Roxane Edjlali, Rick Greenwald, 13 February 2018. This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Oracle. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Follow Us On Social Media:
Blog: https://blogs.oracle.com/database/
Twitter: https://twitter.com/oracledatabase
Facebook: https://www.facebook.com/OracleDatabase
LinkedIn: https://blogs.oracle.com/databaseinsider/


Autonomous Database

CloudWorld NYC 2018: Data Management Cloud Keynote, Sessions and Activities

Oracle CloudWorld 2018, New York City, is nearly upon us. To attend the event, please register here. Oracle's Data Management Cloud Services will have a significant presence at CloudWorld with the following sessions:

Keynote: "Revolutionize Your Data Management with World's 1st Autonomous Database" with Monica Kumar, Vice President, Oracle - 11:20 a.m. – 11:50 a.m. - New York Ballroom West, Third Floor

"The Autonomous Data Warehouse Cloud – Simplifying the Path to Innovation" with George Lumpkin, Vice President, Oracle, and Edgar Haren, Product Marketing, Oracle - 12:40 p.m. – 1:10 p.m. - New York Ballroom West, Third Floor

"Move Your Workloads: No Pain, Lots of Gain" with Sachin Sathaye, Senior Director, Oracle, Zach Vinduska, Vice President of IT and Infrastructure, ClubCorp, and Arvind Rajan, Entrepreneur and Architect, Astute

Join Monica Kumar, Vice President of Oracle Cloud Platform Product Marketing, in the keynote session as she discusses the rise of data as a valuable asset of your organization, and therefore your enterprise database, and the impact it will have on the future success of your business. Whether you are looking to unlock the power of your application data or are interested in integrating all your data and making it accessible to all employees, the Autonomous Database can offer significant advantages to your organization. The Oracle Autonomous Data Warehouse Cloud session explores how this new solution can help customers easily and rapidly deploy data warehouses or data marts to gain faster insights from their business-critical data. This session examines common customer challenges with traditional data warehouse deployments, while detailing the value of the Autonomous Data Warehouse Cloud autonomous life-cycle and the features aimed at helping overcome these obstacles.
In addition, attendees will learn how Oracle's new data management solution fits into a complete ecosystem consisting of business analytics, data integration, visualization, IoT and more. Considering moving your workloads to the cloud? The third session brings customers to the forefront, with leaders from ClubCorp and Astute discussing their individual cloud migration experiences. Hear how and why these organizations decided to "Lift and Shift" their IT deployments to Oracle Cloud. Also find out why Astute decided to migrate their existing cloud deployment off of Amazon Web Services to Oracle. This session will also cover the key benefits gained by the customers, with tangible metrics on TCO and performance. Learn from Oracle leadership and customers the value of moving your workloads to Oracle Cloud. To see the Autonomous Data Warehouse demo and find out how you can lift and shift your applications, visit us in the demo area and mini-theatre on the show floor and talk to Oracle experts there. See you in New York City!


Autonomous Database

Oracle Releases Database Security Assessment Tool: A New Weapon in the War to Protect Your Data

Evaluate your database security before hackers do it for you! Today we have a guest blogger: Vipin Samar, Senior Vice President, Oracle.

Data is a treasure. And in my last 20 years of working in security, I've found that hackers have understood this better than many of the organizations that own and process the data. Attackers are relentless in their pursuit of data, but many organizations ignore database security, focusing only on network and endpoint security. When I ask the leaders responsible for securing their data why this is so, the most frequent answers I hear are:
- Our databases are protected by multiple firewalls and therefore must be secure.
- Our databases have had no obvious breaches so far, so whatever we have been doing must be working.
- Our databases do not have anything sensitive, so there is no need to secure them.

And yet, when they see the results from our field-driven security assessment, the same organizations backtrack. They admit that their databases do, in fact, have sensitive data, and while there may be firewalls, there are very limited security measures in place to directly protect the databases. They are even unsure how secure their databases are, or whether they have ever been hacked. Given the high volume of breaches, they realize that they must get ready to face attacks, but don't know where to start. Assessing database security is a good first step, but it can be quite an arduous task. It involves finding holes from various angles, including different points of entry, analyzing the data found, and then prioritizing next steps. With DBAs focused on database availability and performance, spending the time to run security assessments or to develop database security expertise is often not a priority. Hackers, on the other hand, are motivated to attack, finding the fastest way in and then the fastest way out.
They map out the target databases, looking for vulnerabilities in database configuration and over-privileged users, run automated tools to quickly penetrate systems, and then exfiltrate sensitive data without leaving behind much of a trail. If this were a war between organizations and hackers, it would be an asymmetric one. In such situations, assessing your own weaknesses and determining vulnerable points of attack becomes critical.

Assess First

I am excited to announce the availability of the Oracle Database Security Assessment Tool (DBSAT). DBSAT helps organizations assess the security configuration of their databases, identify sensitive data, and evaluate database users for risk exposure. Hackers take similar steps during their reconnaissance, but now organizations can do the same, and do it first.

DBSAT is a simple, lightweight, and free tool that helps Oracle customers quickly assess their databases. Designed to be used by all Oracle database customers, in small or large organizations, DBSAT has no dependency on other tools or infrastructure and needs no special expertise. DBAs can download DBSAT and get actionable reports in as little as 10 minutes.

What can you expect DBSAT to find? Based on decades of Oracle's field experience in securing databases against common threats, DBSAT looks at various configuration parameters, identifies gaps, discovers missing security patches, and suggests remediation. It checks whether security measures such as encryption, auditing, and access control are deployed, and how they compare against best practices. It evaluates user accounts, roles, and associated security policies, determining who can access the database, whether they have highly sensitive privileges, and how those users should be secured. Finally, DBSAT searches your database metadata for more than 50 types of sensitive data, including personally identifiable information, job data, health data, financial data, and information technology data.
You can also customize the search patterns to look for sensitive data specific to your organization or industry. DBSAT helps you discover not only how much sensitive data you have, but also which schemas and tables hold it. With easy-to-understand summary tables and detailed findings, organizations can quickly assess their risk exposure and plan mitigation steps. All of this can be accomplished in a few minutes, without overloading valuable DBAs or requiring them to take special training. Reviewing your DBSAT assessment report may be surprising, and in some cases shocking, but the suggested remediation steps can improve your security dramatically.

Privacy Regulations and Compliance

DBSAT also provides recommendations to assist you with regulatory compliance, including the European Union General Data Protection Regulation (EU GDPR), which calls for impact assessments and other enhanced privacy protections. Additionally, DBSAT highlights findings that are applicable to EU GDPR and the Center for Internet Security (CIS) benchmark.

Nothing Could Be Easier

Oracle is a leader in preventive and detective controls for databases, and now, with the introduction of DBSAT, security assessment is available to all Oracle Database customers. I urge you to download and try DBSAT. After all, it's better that you assess your database's security before the hackers do it for you!
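The idea behind DBSAT's customizable, pattern-based sensitive-data discovery can be pictured with a small sketch. This is a toy illustration of the concept only, not DBSAT's actual implementation: match column names against regular-expression patterns per sensitive-data category, and report which tables and columns look risky.

```python
import re

# Toy illustration of pattern-based sensitive-data discovery, in the
# spirit of DBSAT's discoverer (NOT its actual implementation).
# Each category maps to a regex tried against column names; custom
# patterns could be added for your organization or industry.
PATTERNS = {
    "EMAIL":       re.compile(r"e[-_]?mail", re.IGNORECASE),
    "NATIONAL_ID": re.compile(r"ssn|national[-_]?id", re.IGNORECASE),
    "CREDIT_CARD": re.compile(r"card[-_]?(num|no)|pan\b", re.IGNORECASE),
    "SALARY":      re.compile(r"salary|compensation", re.IGNORECASE),
}

def discover(schema):
    """schema: {table_name: [column_name, ...]} -> list of findings."""
    findings = []
    for table, columns in schema.items():
        for col in columns:
            for category, pattern in PATTERNS.items():
                if pattern.search(col):
                    findings.append((table, col, category))
    return findings

# Hypothetical schema metadata for the demo.
schema = {
    "EMPLOYEES": ["EMP_ID", "EMAIL_ADDR", "SALARY"],
    "ORDERS":    ["ORDER_ID", "CARD_NO", "SHIP_DATE"],
}
for table, col, category in discover(schema):
    print(f"{table}.{col}: possible {category}")
```

The real tool goes much further, sampling data values as well as metadata, but the report shape (schema, table, column, category) is the same kind of output described above.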


Database

Why Move SAP Applications to Oracle Cloud?

Today we have guest blogger Bertrand Matthelie, Senior Principal Product Marketing Director, providing us with insights into the value of migrating your SAP applications to Oracle Cloud.

SAP NetWeaver-based applications were certified on Oracle Cloud Infrastructure last year. NetWeaver-based applications represent most of the deployed SAP applications, and the majority of them are powered by Oracle databases. Indeed, while SAP is encouraging customers to move to S/4HANA, a Rimini Street survey shows that 65% of them have no plans to do so. They are unable to build a business case, deem the ROI unclear, consider S/4HANA an unproven, early-stage product, and face significant migration and implementation costs. Most customers instead want to keep running the existing, proven SAP applications that they spent years customizing to their needs. At the same time, however, they face pressure to reduce costs and improve agility to better support the business. Digital disruption is hard at work in all industries, and organizations are looking for ways to shift resources from maintenance to innovation. Up to 80% of IT budgets can be spent on "keeping the lights on"; moving enterprise applications to the cloud represents an attractive way to reduce costs, free up resources, and focus on higher-value activities than infrastructure management.

Moving SAP applications and Oracle Databases to Oracle Cloud enables customers to preserve existing investments while accelerating innovation, relying on the only cloud architected for enterprise workloads and optimized for Oracle Database. Key benefits include:

- High and predictable performance for SAP/enterprise applications, with dedicated bare metal instances as well as high-performance network and storage resources.
- Best Oracle Database performance: As demonstrated in a recent Accenture report focused on running enterprise workloads in the cloud, Oracle databases run up to 7.8x faster on Oracle Cloud Infrastructure (OCI) than on a leading cloud provider.
- Lower costs and transparency: The Accenture report also shows that customers can benefit from up to 34% lower infrastructure costs for their SAP/enterprise workloads on OCI versus a leading cloud provider. Additionally, there are no hidden costs with Oracle Cloud, and Universal Credits give you simple, flexible, and predictable pricing.
- Security and governance: Compute and network isolation help ensure data security; compartment capabilities, coupled with identity and access management and audit, provide visibility and control for your SAP deployments.
- A complete and integrated cloud, enabling you to leverage Oracle's comprehensive PaaS and SaaS offerings to, for example, connect your existing SAP applications to SaaS modules from any provider, or extend your SAP applications with mobile interfaces or chatbots.

According to the Rimini Street survey mentioned earlier, 30% of surveyed SAP customers look to augment their existing platforms with cloud applications for innovation. Various resources to learn more are at your disposal; discover now how you can ensure business continuity, reduce costs, and accelerate innovation! Let us know if you have any questions or comments.

Follow Us On Social Media:
- Blog: https://blogs.oracle.com/database/
- Twitter: https://twitter.com/oracledatabase
- Facebook: https://www.facebook.com/OracleDatabase
- LinkedIn: https://blogs.oracle.com/databaseinsider/


Autonomous Database

Oracle Ask TOM Office Hours Now Live!

Hundreds of thousands of companies around the world rely on Oracle Database to power their mission-critical applications. Millions of developers and administrators rely on the Oracle Database dev team to provide them with the tools and knowledge they need to succeed. AskTOM Office Hours continues the pioneering tradition of Ask TOM. Launched in 2000 by Tom Kyte, the site now has a dedicated team who answer hundreds of questions each month. Together they’ve helped millions of developers understand and use Oracle Database. AskTOM Office Hours takes this service to the next level, giving you live, direct access to a horde of experts within Oracle. All dedicated to helping you get the most out of your Oracle investment. To take advantage of this new program, visit the Office Hours home page and find an expert who can help. Sign up for the session and, at the appointed hour, join the webinar. There you can put your questions to the host or listen to the Q&A of others, picking up tips and learning about new features. Have a question about upgrading your database? Sign up for Mike Dietrich and Roy Swonger's session. Struggling to make sense of SQL analytic functions? AskTOM's very own Connor McDonald and Chris Saxon will be online each month to help. JSON, Application Express, PL/SQL, In-Memory, Multitenant...if you've got a question, we've got an expert with an answer, and they're just a URL away. Our experts live all over the globe. So even if you inhabit Middleofnowhereland, you’re sure to find a time-slot that suits you. You need to make the most of Oracle Database and its related technologies. It's our job to make it easy for you. So visit https://asktom.oracle.com/pls/apex/f?p=100:500 and sign up for your sessions.  
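As a taste of the analytic-function material covered in Connor and Chris's sessions, here is a minimal, runnable sketch of a window function. It uses SQLite through Python's standard library so it runs anywhere; the `RANK() OVER (PARTITION BY ... ORDER BY ...)` syntax shown is the same in Oracle SQL. Table and data are made up for illustration.

```python
import sqlite3

# Rank employees by salary within each department using an analytic
# (window) function. SQLite is used only so the example is portable;
# the SQL itself is standard and works the same way in Oracle Database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO emp VALUES (?, ?, ?)",
    [("Ada", "ENG", 120), ("Ben", "ENG", 95),
     ("Cal", "SALES", 80), ("Dee", "SALES", 110)],
)
rows = conn.execute(
    """
    SELECT dept, name, salary,
           RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS rnk
    FROM emp
    ORDER BY dept, rnk
    """
).fetchall()
for dept, name, salary, rnk in rows:
    print(dept, name, salary, rnk)
```

Unlike a `GROUP BY`, the window function keeps every row while computing the rank over its partition, which is exactly the kind of distinction an Office Hours session would walk through.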
AskTOM Office Hours: Dedicated to Customer Success

Resources:
- Announcement
- Topic Page
- Larry Keynote
- Larry Demonstrates how AWS is 5x-13x more expensive video
- iPaper: Oracle Autonomous Data Warehouse Cloud (PDF)
- Podcast: Oracle Autonomous Data Warehouse Cloud (9:35)
- Oracle Autonomous Data Warehouse Cloud Animated Video
- Oracle Autonomous Data Warehouse Cloud Executive Video


Autonomous Database

Forecast the Cost Benefits of Oracle Database Cloud Service with the New TCO Calculator

When looking to migrate a workload to the cloud, one of the first business challenges can be quantifying the value of the proposed move. Many organizations are looking for tools that will help them build the business case for their management. Oracle's Data Management Cloud Services team has recently launched a new Total Cost of Ownership (TCO) calculator to help organizations get a detailed breakdown of the potential cost savings of moving their on-premises database deployments to Oracle's Database Cloud Service. The TCO calculator provides a side-by-side comparison with an equivalent on-premises deployment, as well as a detailed report of your potential cost savings across compute, storage, software, and facility expenditures. Here are the simple steps to using the new tool:

1. Go to the TCO Tool by clicking here.
2. Once there, click on the "Start Now" arrow.
3. Input your company name, and select a location and a currency type.
4. Input your on-premises core and storage values. I have used 32 cores and 20TB for this demo. As you can see, you immediately get a snapshot of your potential savings via Oracle Cloud on the right. You can now click on the blue tab to see a detailed breakdown of the assumptions used in the calculator, and to edit any metrics. For instance, the calculator is preset to a three-year TCO calculation, but you could change the value to a single year in the editing fields. Once ready, you can also click on the orange tab to see and download the detailed report.
5. In the next frame, you will see the detailed breakdown by cost segment, with a synopsis at the top and a graphical depiction to the left. Users can still edit the sections or go to the "view results" tab.
6. Once you click on "view results" you will come to the download page.
Here you will have the option to access other informational assets, or continue to download the detailed PDF report of your results. Your detailed report provides graphical representations as well as a full breakdown of the assumptions and metrics used to calculate the TCO results. I hope you find the tool helpful; for more information on Oracle's Data Management Cloud Services, please see the resources below.

Resources:
- Announcement
- Topic Page
- Larry Keynote
- Larry Demonstrates how AWS is 5x-13x more expensive video
- iPaper: Oracle Autonomous Data Warehouse Cloud (PDF)
- Podcast: Oracle Autonomous Data Warehouse Cloud (9:35)
- Oracle Autonomous Data Warehouse Cloud Animated Video
- Oracle Autonomous Data Warehouse Cloud Executive Video
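The kind of arithmetic behind such a TCO comparison can be sketched in a few lines. Every rate below is an invented placeholder for illustration only; the real calculator uses Oracle's own pricing assumptions and a far more detailed cost model.

```python
# Toy three-year TCO comparison, in the spirit of the calculator
# described above. All rates are made-up placeholders, NOT Oracle pricing.
YEARS = 3

def on_prem_tco(cores, storage_tb):
    hardware = cores * 2000 + storage_tb * 800          # one-time purchase
    software = cores * 1500 * YEARS                     # annual licenses
    facility = (cores * 120 + storage_tb * 40) * YEARS  # power, space, admin
    return hardware + software + facility

def cloud_tco(cores, storage_tb):
    compute = cores * 1200 * YEARS      # subscription per core per year
    storage = storage_tb * 300 * YEARS  # per TB per year
    return compute + storage

cores, storage_tb = 32, 20  # the demo inputs used in the post
on_prem = on_prem_tco(cores, storage_tb)
cloud = cloud_tco(cores, storage_tb)
savings_pct = 100 * (on_prem - cloud) / on_prem
print(f"on-prem: {on_prem}, cloud: {cloud}, savings: {savings_pct:.1f}%")
```

The side-by-side structure (capital expense plus licenses plus facilities versus a pure subscription) is what the calculator's detailed report breaks down by cost segment.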


Data Warehouse / Big Data

Weekly Live Demo Webcast Series: Oracle Autonomous Data Warehouse Cloud

The Oracle data management cloud marketing and product management teams are excited to bring you a weekly webcast series focused on the newly announced Oracle Autonomous Data Warehouse Cloud, the first of Oracle's upcoming Autonomous Database services. Join us for a live demo every week and see how quickly and easily you can provision a data warehouse on Oracle Cloud. Each week, we will present a live walkthrough of this cloud service and highlight use cases that can help you unleash the power of your data. The demo will cover the following topics:

- Provisioning a new data warehouse
- Connecting tools and SQL clients
- Creating users
- Loading data
- Running queries on external data
- Scaling the service
- Monitoring the service

Presenter: Yasin Baskan, Senior Principal Product Manager, Oracle Autonomous Data Warehouse

Held every Tuesday, 9:00 a.m. PT/12:00 p.m. ET (except on 12/26 and 1/02). Join the next live demo:

- Tue., December 19, 2017
- Tue., January 9, 2018
- Tue., January 16, 2018
- Tue., January 23, 2018
- Tue., January 30, 2018
- Tue., February 6, 2018
- Tue., February 13, 2018

To register for the event, please click here.

Other Resources:
- Announcement
- Topic Page
- Larry Demonstrates how AWS is 5x-13x more expensive video
- Your New Autonomous Data Warehouse eBook
- iPaper: Oracle Autonomous Data Warehouse Cloud (PDF)
- Oracle Versus Commodity Cloud Vendors Competitive Solution Brief 2017
- Oracle Autonomous Data Warehouse Cloud Animated Video


Customer Stories

Oracle OpenWorld 2017 Data Management PaaS Customer & Partner Stories Part-One

This year we had a great collection of customer and partner videos; so many, in fact, that I am making this a two-part blog. First, let's take a look at some customer videos.

Calypso Technology, Inc. is a leading provider of cross-asset, front-to-back technology solutions for financial markets. With 20 years of experience delivering cross-asset solutions for trading, processing, risk management and accounting, Calypso focuses its significant resources on customer problems, bringing simplicity to the most complex business and technology issues. The constant pressures for better allocation of capital and improved risk management, matched by an ever-changing regulatory landscape in the financial markets, demand technology solutions that are reliable, adaptable and scalable. In response, Calypso provides customers with a single platform designed from the outset to enable consolidation, innovation and growth. In this video, Vikram Pradham, Sr. Director of Engineering, discusses how Oracle Cloud helps Calypso meet demanding customer SLAs.

FICO (NYSE: FICO) is a leading analytics software company, helping businesses in 90+ countries make better decisions that drive higher levels of growth, profitability and customer satisfaction. The company's groundbreaking use of Big Data and mathematical algorithms to predict consumer behavior has transformed entire industries. FICO provides analytics software and tools used across multiple industries to manage risk, fight fraud, build more profitable customer relationships, optimize operations and meet strict government regulations. In this video, Doug Clare, VP of Marketing, describes how Oracle Cloud and Oracle Exadata helped extend the performance of one of FICO's key anti-money-laundering and compliance applications.
Now let's take a look at what our partners are saying.

Flexagon was founded in 2013 by a team who had spent their entire careers in roles related to custom software development, systems integration, product development and automation. Flexagon's FlexDeploy is a market-leading DevOps and Application Release Automation product that supports companies from the smallest contenders to the largest enterprises. Dan Goerdt, President at Flexagon, shares his key takeaways from Oracle OpenWorld 2017, including the announcements of Universal Credits for Oracle Cloud and the ability for customers to bring PaaS licenses from on-premises to the cloud, as well as Oracle's ongoing investment in open source technologies such as Docker and Kubernetes, and how those announcements benefit Oracle and Flexagon customers.

For more than 25 years and across hundreds of successful implementations, Vlamis Software Solutions has been integrating Oracle's business analytics solutions for many of America's leading organizations. From large retailers to world-class financial institutions, and from CPG manufacturers to the entertainment industry's leading lights, Vlamis has delivered exceptional analytic insights and blazing performance. At Oracle OpenWorld 2017, Tim Vlamis, Oracle ACE and VP at Vlamis Software Solutions, shared his thoughts on the new Oracle Autonomous Data Warehouse Cloud and how well it fits with Oracle Analytics Cloud to deliver business value to customers immediately.

Viscosity is a recognized consulting firm and trusted advisor to many Fortune 500 organizations, and is now a recognized Oracle Platinum Partner for its expertise in mitigating technical challenges and providing enterprise solutions.
Recognized in the industry for their in-depth expertise in high availability solutions, database technologies, cloud readiness and migrations, application development, big data integration, and performance tuning, Viscosity supports thousands of mission-critical and business-critical databases all over the world. Nitin Vengurlekar, Oracle ACE Director and CTO at Viscosity, shares how he thinks Oracle Autonomous Data Warehouse can benefit modern organizations by offering deployment simplicity, high performance through pre-optimized configuration, and elasticity.

Stay tuned for further blogs sharing our customer and partner videos. For more information, please take a look at the resources below, and be sure to take a test drive of any of Oracle's cloud services with free cloud credits here.

Resources:
- Announcement
- Topic Page
- Larry Keynote
- Larry Demonstrates how AWS is 5x-13x more expensive video
- iPaper: Oracle Autonomous Data Warehouse Cloud (PDF)
- Podcast: Oracle Autonomous Data Warehouse Cloud (9:35)
- Oracle Autonomous Data Warehouse Cloud Animated Video
- Oracle Autonomous Data Warehouse Cloud Executive Video


Database

Oracle Revolutionizes Cloud with the World’s First Self-Driving Database

At Oracle OpenWorld, Oracle Chairman of the Board and CTO Larry Ellison unveiled his vision for the world's first autonomous database cloud. Powered by Oracle Database 18c, the next generation of the industry-leading database, Oracle Autonomous Database Cloud uses groundbreaking machine learning to enable automation that eliminates human labor, human error, and manual tuning, delivering unprecedented availability, high performance, and security at much lower cost. The new autonomous database was designed on a set of key tenets:

- Self-Driving: Provides continuous adaptive performance tuning based on machine learning. Automatically upgrades and patches itself while running, and automatically applies security updates while running to protect against cyberattacks.
- Self-Scaling: Instantly resizes compute and storage without downtime. Cost savings are multiplied because Oracle Autonomous Database Cloud consumes less compute and storage than Amazon, with lower manual administration costs.
- Self-Diagnosis: Automatically diagnoses the root cause of performance issues, while keeping detailed performance and resource utilization history. Real-time SQL monitoring automatically diagnoses how resources are used in SQL statements, issues are scanned for across all layers of the stack using key diagnostic tools, and Oracle engineering uses machine learning to prevent issues from happening.
- Self-Repairing: Provides automated protection from downtime. The SLA guarantees 99.995 percent reliability and availability, which reduces costly planned and unplanned downtime to less than 30 minutes a year.
- Self-Migration and Data Loading: Automated migration and data loading to the cloud using Backup to Cloud, Data Pump Export, Data Guard Standby, and Golden Gate; the database directly reads and loads flat-file data from the object store.
- Self-Backup: Backups are scheduled on a nightly basis to the Oracle Database Backup Cloud Service, with configurable retention time.
The Oracle Autonomous Database Cloud will be available for many different workload styles, including transactions, mixed workloads, data warehouses, graph analytics, departmental applications, document stores, and IoT workloads. With built-in automation at all levels to perform maintenance tasks, companies can now use their valuable IT resources to focus on extracting more value from the data they currently manage, to directly influence business opportunities and outcomes.

In addition, Oracle also announced its latest data management cloud offering, the Autonomous Data Warehouse Cloud. Oracle Autonomous Data Warehouse Cloud is a next-generation cloud service built on the self-driving Oracle Autonomous Database technology, using machine learning to deliver unprecedented performance, reliability, and ease of deployment for data warehouses. As an autonomous cloud service, it eliminates error-prone manual management tasks and frees up DBA resources, which can now be applied to implementing more business projects. Highlights of the Oracle Autonomous Data Warehouse Cloud include:

- Simplicity: Unlike traditional cloud services with complex, manual configurations that require a database expert to specify data distribution keys and sort keys, build indexes, reorganize data, or adjust compression, Oracle Autonomous Data Warehouse Cloud is a simple "load and go" service. Users specify tables, load data, and then run their workloads in a matter of seconds; no manual tuning is needed.
- Industry-Leading Performance: Unlike traditional cloud services, which use generic compute shapes for database cloud services, Oracle Autonomous Data Warehouse Cloud is built on the high-performance Exadata platform. Performance is further enhanced by fully integrated machine learning algorithms that drive automatic caching, adaptive indexing, and advanced compression.
- Instant Elasticity: Oracle Autonomous Data Warehouse Cloud allocates new data warehouses of any size in seconds and scales compute and storage resources independently of one another with no downtime. Elasticity enables customers to pay for exactly the resources their database workloads require as they grow and shrink.

Many customers are proving the value of data warehouses in the cloud through "sandbox" environments, line-of-business data marts, and database backups. More advanced monetization use cases include high-performance data management projects, data warehouses coupled with cloud computing analytics, and big data cloud implementations. Oracle's revolutionary Autonomous Database for Data Warehouse cloud service is the industry's first solution for delivering business insights with unmatched reliability.

Oracle is the market leader for data warehousing solutions. Oracle Autonomous Database Cloud Service for Data Warehouse makes a highly scalable solution available to customers with the ease, simplicity, high performance, security, and value that only Oracle can deliver. With Oracle, customers have deployment choice and can preserve their existing investment while enabling new monetization opportunities for their most valuable asset: their data.

For more information, please take a look at the resources below, and be sure to take a test drive of any of Oracle's cloud services with free cloud credits here.

Resources:
- Announcement
- Topic Page
- Larry Keynote
- Larry Demonstrates how AWS is 5x-13x more expensive video
- iPaper: Oracle Autonomous Data Warehouse Cloud (PDF)
- Podcast: Oracle Autonomous Data Warehouse Cloud (9:35)
- Oracle Autonomous Data Warehouse Cloud Animated Video
- Oracle Autonomous Data Warehouse Cloud Executive Video


Customer Stories

Oracle OpenWorld 2017 Key Sessions for Data Management Cloud Services

It's that time of year again, and Oracle OpenWorld is upon us. Oracle OpenWorld is October 1–5, 2017, in San Francisco. To attend OpenWorld 2017, please register here. Below are links to Oracle "Focus On" documents that detail sessions of interest for specific categories. We have created Focus On documents for several of the primary use cases of Oracle's Data Management Platform:

- Focus on Moving Workloads to the Cloud: https://events.rainfocus.com/widget/oracle/oow17/1504907826863001TLXh
- Focus on High Performance Oracle Database: https://events.rainfocus.com/widget/oracle/oow17/1504907827354001TdHH
- Focus on Business Continuity and Recovery with Oracle Database: https://events.rainfocus.com/widget/oracle/oow17/1504907827095001TmTS
- Focus on Analytics with Oracle Database: https://events.rainfocus.com/widget/oracle/oow17/1504907910673001Cxao
- Focus on Oracle Database Consolidation: https://events.rainfocus.com/widget/oracle/oow17/1504907827454001TG7M
- Focus on Data Management Solutions for Developers: https://events.rainfocus.com/widget/oracle/oow17/1504907827561001TtpI
- Focus on Customer Stories for Oracle Database Cloud: https://events.rainfocus.com/widget/oracle/oow17/1505418112914001d3Do
- Focus on Oracle Database Cloud Services: https://events.rainfocus.com/widget/oracle/oow17/1504907826765001TblN
- Focus on Oracle Database Exadata Cloud Service: https://events.rainfocus.com/widget/oracle/oow17/1504907827253001T4Mv
- Focus on Oracle In-Memory Technologies: https://events.rainfocus.com/widget/oracle/oow17/1504907911350001Clke

Resources:
- Oracle Versus Commodity Cloud Vendors Solution Brief
- Oracle Database 12c Release 2 For The Cloud Solution Brief
- Oracle Database Exadata Express Cloud Service Solution Brief
- Oracle Database Cloud Service 30-Day Trial


Customer Stories

Oracle And AT&T Leadership Discuss Their Massive Cloud Migration Deal

Today we have a great editorial from IT World that focuses on the details behind AT&T's massive migration to Oracle's Cloud. In this Q&A by IDG News Service Editor in Chief Marc Ferranti, we get perspectives from Oracle CEO Mark Hurd and AT&T Communications CEO John Donovan. Below is the article.

If worries about digital transformation projects keep you up at night, imagine how it would feel to be responsible for moving thousands of internal databases to the cloud for a company with more than $160 billion in annual sales and 260,000 employees. That's the job that AT&T Communications CEO John Donovan is undertaking, and he's working with Oracle CEO Mark Hurd to do it. When the companies announced in May that they were working together, Hurd called the agreement "historic." While hyperbole is part of everyday life in tech, lessons learned from the massive project are bound to reverberate across enterprises in a variety of fields, as Hurd noted in the following discussion with Donovan and Ferranti.

AT&T, the biggest telecom company in the world by revenue, had already plunged into virtualizing and software-controlling its wide area network. The goal is to virtualize 75 percent of its core network functions by 2020, hitting 55 percent by the end of this year. The move to bring its databases to the cloud, meanwhile, will allow it to knock down information silos, achieve greater operational efficiency, and innovate new products and services. Oracle, meanwhile, has been knocked for being late to the cloud, allowing IaaS (infrastructure as a service) leaders like Amazon, Google and Microsoft to scoop up customers.
But over the past year it introduced its Generation2 IaaS offering, forged ahead on regular updates to its SaaS (software as a service) line, and polished up its Cloud at Customer offering, which lets enterprises put data and applications behind their firewalls while taking advantage of cloud pricing models and technology. This was important for AT&T, which must be mindful of regulatory requirements for customer data. In the following discussion, edited for publication, Hurd and Donovan talk about working together on what they (half-jokingly?) refer to as the lunatic fringe of digital transformation.

IDG: "Collaboration" is a term sometimes used for basically straightforward business deals: a customer pays a vendor for products and services. You've called this a collaboration. In what sense is the Oracle-AT&T deal a true collaboration?

John Donovan: In this case we didn't buy what Mark was selling off the shelf. We didn't look at where Oracle was. We looked at what we were trying to accomplish as a company, how vast the job at hand was, and then we looked at the evolution and the architecture of what Oracle was doing in their cloud strategy in order to find a territory where we could buy and they had to build. Oracle is going to address our specific need: How do you tear down a massive database and regionally distribute it so that you can be really fast in how you're managing your IT application changes that rest on top of this data? Secondly, on our side, it changes fundamentally how we do things. We don't have to waste our energy and time now on one by each for 40,000 databases deciding how we're going to migrate to this new architecture. We created an economic construct and a technical construct that allows my team to focus on how you get it done instead of what are we gonna do and who are we gonna do it with.
So I think we met at a point where Oracle has to build a world-class database roadmap to migrate an extremely large company from fixed databases to clouds. That, in my view, is what collaboration is defined as. IDG: Mark, did you have to change the underlying technology architecture of your infrastructure as a service offering or your platform as a service offering to get this deal done? Mark Hurd: Well we’ve rewritten all of our technologies including Database itself for the quote/unquote cloud, so that’s true with the new infrastructure offering, all of our platform offerings including Java, analytics, Database, etcetera. Now in addition to that, to John’s point, we did some other things in this that were probably unique, meaning that we took what we do in the public cloud and we’re actually bringing that to AT&T – we do the patching, we do the managing, etcetera. And then we have a joint team together that’s doing the migration of many of these legacy databases that John described over to the modern database architecture. So it’s both R&D, product development, some unique things that we’re doing for AT&T, to make all of this happen. IDG: So just to clarify -- are the databases going to the public cloud or are they going to the Cloud at Customer offering? Or is it a combination of both? Donovan: From AT&T’s perspective, certain data has a regulatory requirement that it sit inside our walls. Other data is capable of being moved to Oracle’s cloud and being consumed as a service. In addition to just looking at databases we also looked at applications, and within applications, capabilities like AI and machine learning, where we could create derived data – and some has to sit on premises, some off. We’re comfortable that we’ve partitioned the things on prem that need to stay on prem. Then we have this added dimension of having this stuff secure in motion. 
So our data is secure at rest and it's secure in motion, whether it is in our data center or in an Oracle facility. Hurd: That’s a really good description of the benefits of Cloud at Customer, because you now have the same version of our public cloud that will actually be here with Cloud at Customer, so we can move data back and forth to the public cloud as AT&T so desires. It really gives us the ability to take all of these benefits of the cloud to the customer and to burst and move data back and forth seamlessly as we go forward. One other point John brought up which I think is interesting is that it also gives us a platform now for all of our modern applications to sit on this infrastructure as well, integrated with AI, integrated with analytics. For the rest of the article and interview, please be sure to visit the link below. Resources: Oracle's Hurd, AT&T's Donovan on Their Massive Cloud Migration Deal (full article) Oracle Database Exadata Cloud Machine Product Page Oracle Versus Commercial Cloud Database Providers Competitive Solution Brief Oracle Database 12c Release 2 For The Cloud Solution Brief Oracle Analytics Cloud Product Page Oracle Database Cloud Service 30-Day Trial Follow Us On Social Media: Blog: https://blogs.oracle.com/database/ Twitter: https://twitter.com/oracledatabase Facebook: https://www.facebook.com/OracleDatabase LinkedIn: https://blogs.oracle.com/databaseinsider/


Data Warehouse / Big Data

Big Data at UPS. Interview with Jack Levis

Today we have a guest blog interview featuring Roberto V. Zicari, editor of ODBMS.ORG (www.odbms.org). ODBMS.ORG is designed to meet the fast-growing need for resources focusing on Big Data, Data Science, Analytical Data Platforms, Scalable Cloud platforms, NewSQL databases, NoSQL datastores, In-Memory Databases, and new approaches to concurrency control. In this blog Roberto interviews Jack Levis, Senior Director, Industrial Engineering at UPS. Below is their interview - Q1. Can you give us a background on UPS and some of the challenges that UPS faces? Jack Levis: The e-Commerce revolution brings with it some interesting challenges. First, from a cost-to-serve standpoint, residential deliveries are less dense in terms of distance between stops and pieces per stop. Second, residential customers want personalization in their delivery experience, which adds cost and other challenges. UPS has met these challenges by utilizing operations technology and advanced analytics. Q2. UPS won the 2016 Franz Edelman prize for its On Road Integrated Optimization and Navigation system (ORION). You led the four-year-long development of ORION, which completed deployment in 2016. How did you manage to reduce miles driven by 100 million annually and save UPS $300 to $400 million each year? Jack Levis: UPS has a long history of innovation and constant improvement. The ORION story actually began in the late 1990s when UPS started building our Package Flow Technologies (PFT) data infrastructure. PFT created predictive models, a “virtual network”, and a suite of planning and execution tools. This was the foundation for ORION. PFT deployed in 2003 and reduced 85 million miles driven per year. ORION was built on this robust foundation. Using the discipline of operations research, ORION created a proprietary route optimization brain. Research into ORION began in 2003, and the first model was field tested in 2008 in Gettysburg, Pennsylvania. 
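Route design of this kind is combinatorially explosive, which is why a dedicated optimization "brain" is needed at all: the number of possible stop orderings grows factorially with the number of stops. The toy sketch below is purely illustrative (the stop coordinates are invented, and ORION's proprietary algorithm is far more sophisticated, balancing business rules, driver consistency, and live map data); it only shows why exhaustive search stops being feasible almost immediately.

```python
from itertools import permutations
from math import dist, factorial

# Even a small route has a combinatorial number of stop orderings:
# with just 9 stops there are 9! = 362,880 possible sequences, so
# real routes with 100+ stops cannot be brute-forced.
print(factorial(9))  # 362880

# Toy brute-force tour over a handful of hypothetical (x, y) stops.
stops = [(0, 0), (2, 1), (1, 3), (4, 2), (3, 0)]

def tour_length(order):
    # Total Euclidean distance visiting stops in order, returning to start.
    legs = list(order) + [order[0]]
    return sum(dist(legs[i], legs[i + 1]) for i in range(len(legs) - 1))

# Exhaustively check all 5! = 120 orderings; feasible only at toy scale.
best = min(permutations(stops), key=tour_length)
print(round(tour_length(best), 2))
```

At ORION's scale, exhaustive enumeration like this is impossible, which is why the system relies on operations-research heuristics to evaluate a large but tractable subset of candidate routes in seconds.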
The result was that ORION could find ways to serve all customers in a route while at the same time reducing cost. It does so by systematically analyzing more than 200,000 different ways a route can be run and then presenting the optimal route to a driver. It does so in seconds. The ORION savings of 100 million miles and $300 million to $400 million annually are in addition to the savings from PFT. Q3. What data infrastructure and data analytics tools did you use to implement ORION? Jack Levis: As mentioned above, ORION sits on top of our proprietary PFT technology. The analytics tools are also built in house by UPS’s operations research team. The marriage of operations research, IT and business processes is part of the ORION success story. Q4. What were the main challenges and pitfalls you encountered in the project? Jack Levis: The first challenge was to build the ORION optimization engine (brain) that could not only meet service while reducing cost but do so while thinking like a driver. This meant that ORION needed to balance consistency and optimality. It made no sense to throw things up in the air just to save a penny. To do so, UPS had to reevaluate business rules, methods, procedures, etc. In essence, UPS redesigned the delivery process. The second challenge was ensuring the data fed to ORION was “pristine”, and maps were a major challenge in this regard. Off-the-shelf maps were not accurate enough. UPS patented a process for utilizing our “big data” infrastructure to help make maps accurate enough for ORION. For instance, if a speed limit changes or a bridge is out, this edit must be made and updated information sent to ORION. From the time the map is edited, an optimized route can be in a driver’s hands in 30 seconds. The third and largest challenge was change management. By definition, optimization tools like ORION will require people to change behavior. UPS tested training procedures, new metrics, analysis tools, etc. 
to ensure the change would take place. Ultimately, the deployment team grew to 700 people. This team impacted tens of thousands of front-line personnel. The deployment team and the front line are the true heroes of the story. Q5. What are the three most important lessons learned during this project? Jack Levis: Never assume you know the answers. The first four years were spent truly understanding the delivery problem. There were many guidelines but few rules. ORION had to turn these guidelines into acceptable algorithms. The team did so by working with drivers and the front line until ORION started thinking like a driver. Data is always a bottleneck. Initially, the team thought maps that could be purchased would be accurate enough. When the optimizations gave bad answers, the team looked at the algorithm. As it turned out, the problem was the map data accuracy and not the algorithm. The algorithm could not be truly tested until the data inputs were “pristine.” Don’t forget deployment and change management. Do not think that “If you build it, they will come.” ORION required extensive change management and front-line support. It is important to have support from the top and show that the results are achievable. Without understanding the importance of change management, new programs run the risk of becoming a “flavor of the month.” Q6. Can you recommend three best practices so that other projects can have a smoother path? Jack Levis: Focus on decisions. Put effort into areas where a better decision will make an impact. Focus on deployment and data from the beginning. The ORION deployment strategy is now the standard for all operations projects at UPS. Ensure the right data infrastructure is in place with proper data. Utilize appropriate associations and networking for help, like the Institute for Operations Research and the Management Sciences (INFORMS). Q7. What were the key elements that made this project a success? 
Jack Levis: Support from senior management to allow the team to continue the research even when failures occurred. Proving that benefits could be achieved. The teams ran 11 different tests in multiple operations to test things like benefit achievement, training, metrics, best practices, etc. No site could be deployed unless entry criteria were met. No site could say deployment was completed unless exit criteria were met. There was constant evaluation of metrics and issues. ORION was built inside a delivery process. Operations personnel do not know they are using such advanced mathematics. They are just doing their job. Q8. In general, how do you evaluate if the insight you obtain from data analytics is “correct” or “good” or “relevant” to the problem domain? Jack Levis: The key is to start with decisions. See below (Q9). Q9. What are the typical mistakes made when analyzing data for a large-scale data project? How can they be avoided in practice? Jack Levis: We do NOT focus on technology. We focus on decisions. Big data is a how, not a what. We care less about Big Data than we care about big insight and big impact. By focusing on decisions that need to be made, priorities become clearer. Ensuring decision-makers have the right information to make decisions and then measuring the impact of better decisions helps with the process. It helps to ensure the proper data is in place to make an impact. If an impact can be made from a simple tool, that is a good thing. Q10. What are the main barriers to applying machine learning at scale? Jack Levis: The largest barrier is not focusing on decisions. Q11. What is your next project? Jack Levis: UPS will build out ORION. ORION will begin making suggestions to drivers throughout the day. ORION will also provide navigation to drivers. We are also working on ORION making “dispatch” decisions. ORION will begin deciding which driver should serve customers. In essence, at some point it will automate the pickup/delivery process. 
Simultaneously, UPS will provide ORION-like functionality in other areas of the business. There will be a PFT/ORION for transportation (city-to-city movements) and a PFT/ORION for inside-the-building operations (sorting, loading, moving of vehicles, etc.). Qx. Anything else you wish to add? Jack Levis: The advances mentioned above, along with automated facilities, will begin to automate and optimize the network. Today, ORION optimizes a single driver. Tomorrow, ORION will begin to optimize an entire delivery area. UPS has a bold vision to optimize an entire network from end to end. Jack Levis, Senior Director, Industrial Engineering, UPS: “Through the marriage of technology, information, and analytics UPS reduces cost and improves services. These advanced technologies streamline our business processes and ultimately benefit our customers.” Jack Levis, Senior Director of Industrial Engineering, drives the development of operational technology solutions. These solutions require advanced analytics to reengineer current processes to streamline the business and maximize productivity. Jack has been the business owner and process designer for UPS’s Package Flow Technology suite of systems, which includes its award-winning delivery optimization, ORION (On Road Integrated Optimization and Navigation). These tools have been a breakthrough for UPS, resulting in a reduction of 185 million miles driven each year. ORION has completed deployment and is providing significant operational benefits to UPS and its customers. UPS estimates that ORION is reducing costs by $300 million to $400 million per year. Jack earned his Bachelor of Arts degree in psychology from California State University, Northridge. He also holds a Master’s Certificate in Project Management from George Washington University. Jack is a fellow of the Institute for Operations Research and the Management Sciences (INFORMS), receiving their prestigious Kimball Medal and the President’s Award. 
Jack holds advisory council positions for multiple universities and associations, including the U.S. Census Bureau Scientific Advisory Committee. The blog was originally published here: http://www.odbms.org/blog/2017/08/big-data-at-ups-interview-with-jack-levis/ UPS Resources: UPS Wins 2016 INFORMS Franz Edelman Award For Changing The Future Of Package Delivery https://pressroom.ups.com/pressroom/ContentDetailsViewer.page?ConceptType=PressReleases&id=1460463832713-874 UPS On-Road Integrated Optimization and Navigation (ORION) https://pressroom.ups.com/pressroom/media-kits/mediakits.page?id=1426416927295-300&ConceptType=MediaKits ORION Backgrounder https://pressroom.ups.com/pressroom/ContentDetailsViewer.page?ConceptType=FactSheets&id=1426321616277-282 UPS ORION: The Driver Helper (Infographic) https://pressroom.ups.com/mobile0c9a66/assets/pdf/pressroom/infographic/UPS_ORION_Infographic_v6.pdf Oracle Resources: Oracle Big Data Cloud Service Oracle Database 12c Release 2 For The Cloud Solution Brief Oracle Database 12c Release 2 for Data Warehousing and Big Data Whitepaper Oracle Analytics Cloud


Customer Stories

The Rise of the Renaissance Database Administrator

Today, I wanted to share, via our blog, a portion of our new whitepaper entitled "The Rise of the Renaissance Database Administrator: Cloud Opens Up New Career Vistas". For the complete whitepaper, please download the asset in the resource section at the end of the blog.
Introduction
What do these market-defining trends have in common?
· Analytics for all
· Analytics as competitive differentiator
· Internet of Things
· Artificial intelligence/Machine learning/Cognitive computing
· Real-time analytics/event management
They all rely on data – timely, accurate data delivered within an insightful context – to deliver value. The question is: who in the enterprise is most qualified and prepared to help deliver on the vision and values of the data-driven enterprise? It’s going to take a special type of professional to deliver that value to enterprises. Organizations are seeking professionals to step forward and take the lead, provide guidance, and lend expertise to move into the brave new world of digital. The move to digital and all that it entails – sophisticated data analytics, online customer engagement and digital process efficiency – requires, above all, the skills and knowledge associated with handling data and turning it into insights. The move to digital is also a search for leadership, and senior executives are only too aware that they alone do not have the skills and know-how to make it happen. Database administrators are in the right place, at the right time, for this emerging opportunity. This is the time for DBAs to step forward, take a leadership role, and move the enterprise to take action. Who else can we look to as stewards of customer information, transaction data and intellectual property? DBAs have the knowledge and education to secure sensitive data on behalf of organizations and citizens of the known universe. Now is the time for DBAs to be proactive and lead their companies to successfully move into the digital realm.
Cloud as Digital Growth Catalyst
Cloud computing is the gateway to the digital economy, and the growth of cloud is also boosting the profiles of DBAs in a dramatic fashion, creating increased demand as well as elevating the nature of their jobs. The U.S. Bureau of Labor Statistics predicts that demand for DBAs as an occupation will keep rising, with 11% more jobs available within the next seven years – faster than most occupational categories. Within the tech and cloud sector, DBA growth is even more pronounced – expanding at a rate of 26% within this same time period. As BLS describes in its most recent career outlook analysis: “Growth in this occupation will be driven by the increased data needs of companies in all sectors of the economy. Database administrators will be needed to organize and present data in a way that makes it easy for analysts and other stakeholders to understand.” BLS also notes that “the increasing popularity of Database as a Service, which allows database administration to be done by a third party over the Internet, could increase the employment of DBAs at cloud computing firms in the data processing, hosting, and related services industry. The increasing adoption of cloud services by small and medium-sized businesses that do not have their own dedicated information technology (IT) departments could increase the employment of DBAs in establishments in this industry.” A look at the nature of cloud traffic – and the increasing importance of data in the cloud – was revealed in the latest data center traffic estimates from Cisco. The latest report estimates, for example, that 3.9 zettabytes of cloud traffic moved through the world’s data centers in the year 2015. By the year 2020, this traffic is projected to nearly quadruple, to more than 14 zettabytes. 
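As a back-of-envelope sanity check on those Cisco figures (this arithmetic is ours, not from the report), growing from 3.9 to 14 zettabytes over five years implies a compound annual growth rate of roughly 29%:

```python
# Implied compound annual growth rate (CAGR) of cloud data-center traffic:
# ~3.9 ZB in 2015 growing to ~14 ZB by 2020 (five years).
start_zb, end_zb, years = 3.9, 14.0, 5
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"{cagr:.1%}")  # prints 29.1%
```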
More than 20% of this traffic is and will continue to be data analytics and Internet of Things workloads.
Clouds and DBA Tasks
For years, the roles and responsibilities of DBAs were well defined and contained within the confines of the data center or IT department. DBAs were the ultimate caretakers for their corporate databases, making sure everything was running smoothly, managing the loading and extraction of data, setting access rights and data security, and ensuring that everything was backed up. As cloud computing is accelerating the growth of data and the ability to move organizations into the digital race, it is also shifting DBAs’ roles and responsibilities. Cloud services – both from databases connected to the cloud, as well as databases run within clouds – make the role of DBA more important than ever. Cloud services may help automate the management, installation, troubleshooting, performance, patching, backing up and security of data, but DBAs are needed to ensure that data environments continue to run efficiently and line up with business requirements. A survey of members of the Independent Oracle Users Group, conducted by Unisphere Research, a division of Information Today, documents the relentless growth of data – as well as its increasing importance to the business – and demonstrates the increasing value of DBAs as they engage new avenues of the business. A majority, 62%, report their company’s data volume has grown more than 10% annually, and 74% state their companies’ requirements for secure, well-governed data environments that meet compliance mandates are growing as well. Many organizations are turning to data to help make better business decisions, engage with customers, and increase speed to market. Cloud frees up valuable DBA time and resources to provide this value to the business. 
Cloud-born services can help DBAs in a number of ways, handling much of the lower-level or infrastructure maintenance concerns, including the provisioning of disk space, increasing high availability, providing failover, integrating data, and enhancing security. An additional IOUG-Unisphere survey finds growing interest in Database as a Service (DBaaS) as a viable approach to serving enterprises’ needs for greater agility and faster time to market with cloud computing. DBaaS is taking off, with adoption expected to triple between 2016 and 2018. There will be a significant amount of enterprise data shifting to the cloud over that same time period as well, as enterprises rethink data management in the cloud. Seventy-three percent of managers and professionals expect to be using DBaaS within their enterprises by that time, versus 27% at the present time.
Enter the ‘Renaissance DBA’
Historically, DBAs have focused on maintenance tasks within their enterprises. The duties of a DBA have typically included the following types of tasks:
· Performing backup and recovery duties
· Monitoring database server health
· Ensuring disaster recovery and high availability
· Tuning database performance
· Establishing standards and schedules for database backups
· Working with developers and designers and assisting them in their tasks
· Establishing and enforcing standards in database design and use
· Overseeing the transfer, replication, and loading of data
· Scheduling events or jobs
· Ensuring database accessibility to authorized users
· Changing database parameters and setup to ensure optimal database performance
· Managing space allocated for the database
· Assuring compliance
For the time being, as their organizations shift to cloud, DBAs may still oversee many of these essential tasks, depending on the degree of automation achieved within their organizations. 
For on-premises systems, such as an ERP application, the DBA has usually had to get involved with all the nuts and bolts of implementing and maintaining the system, from vetting data sources to ensuring integrity. In cloud-based environments, much of the lower-level task work is conducted by the cloud service provider. The cloud provider likely has already set up the database, installed the application, created the tables, and will provide the upgrades. The DBA only has to focus on what’s happening on the business end of the engagement – customizing endpoints and output, usually using the provider’s tools.   At the same time, DBAs are answering to a new calling, serving a much greater role within the business. Thanks to cloud, many lower-level maintenance tasks will either be diminished or go away. Low-level tasks will be done at the click of the button, offered through cloud providers. At the same time, DBAs are assuming greater roles as analysts or consultants to their businesses – providing both strategic and operational guidance to help transform their organizations into consumers of intelligent insights coming from their data resources. This emerging “Renaissance DBA” is part technologist who understands the workings of databases, and part trusted advisor who helps business leaders leverage data insights to move their organizations forward. For the conclusion of the article, and to find out key roles of the “Renaissance DBA”, please download the complete whitepaper in the resource tab below.   About Oracle Cloud Oracle Cloud is the industry’s broadest and most integrated public cloud. It offers best-in-class services across software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS), and even lets you put Oracle Cloud in your own data center. Oracle Cloud helps organizations drive innovation and business transformation by increasing business agility, lowering costs, and reducing IT complexity. 
Sign up for a free trial at cloud.oracle.com/tryit. To learn more about how Oracle Cloud Platform can help your organization easily migrate applications to the cloud, visit oracle.com/paas. Also, please find the complete whitepaper below in the resource section. Training for database administrators is available at Oracle University. Resources: The Rise of the Renaissance Database Administrator Oracle Database 12c Release 2 For The Cloud Solution Brief Oracle Database 12c Release 2 for Data Warehousing and Big Data Whitepaper Oracle Analytics Cloud


Customer Stories

Oracle Cloud Platform allows Asahi Refining to focus on their Core Competencies

Today we have a guest blog from John Klinke, Senior Principal Product Manager for Oracle Middleware. Asahi Refining is a leading provider of precious metal assaying, refining, and bullion products. They recently decided to reduce IT costs by moving away from the heavy lifting of managing and maintaining their on-premises IT infrastructure in order to focus more on their core competencies of refining gold and silver. By choosing Oracle Cloud Platform, they have been able to easily migrate their Java application to the cloud and automate the management, patching and maintenance of their application by running it on Oracle Database Cloud Service and Oracle Java Cloud Service. Watch this video to hear more about the benefits Asahi Refining saw in moving to the Oracle Cloud Platform. To learn more about how Oracle Cloud Platform can help your organization easily migrate applications to the cloud, visit oracle.com/paas. Resources: Oracle Versus Commodity Cloud Vendors Solution Brief Oracle Database 12c Release 2 For The Cloud Solution Brief Oracle Database Exadata Express Cloud Service Solution Brief Oracle Database Cloud Service 30-Day Trial


Database

Three Ways To Energize Your DBA Career In The Cloud

Today we have a guest blog from Jeff Erickson, Director of Content Strategy at Oracle. It’s no secret that data is the new black gold—a source of wealth creation on par with financial capital, say some. So it’s a little ironic that a group of people who understand better than anyone how to manage data are feeling increasingly left out of the party. Database Administrators, or DBAs, are the experts responsible for the performance and security of a growing list of database types, from relational to in-memory to NoSQL. These databases provide the cornerstone technology of digital business activities like ecommerce, mobile computing, and social media, and are integral to trends like big data, artificial intelligence, and Internet of Things (IoT). But here’s the rub for DBAs: Cloud vendors now offer fully managed database services that take over a lot of the daily tasks of a company’s in-house DBAs. And, due to automation and economies of scale, those service providers can often do it faster and more cheaply. So where does that leave DBAs? “In a very good position,” says Penny Avril, vice president of database product management at Oracle. “Data has enormous value and the importance of the DBA isn’t going away. But their role is changing.”  A lot of the mundane tasks of a DBA are going away, “including a lot of the tasks where DBAs would be on call 24/7,” Avril says. Such tasks might include applying software updates or backing up data, which are automated with a database cloud service. Meanwhile, the parts of the job that are more interesting and more visible to a line of business leader, such as data modeling and data security, will only grow, she says. Here, according to Avril and several other experienced DBAs, are three ways to energize a DBA career in the age of the cloud: 1. Deliver the Right Data to Drive a Business Strategy “There is a whole lot more data coming online from social and data sharing services—and IoT,” says Avril. 
“So the role of making data available to their business users just radically increases.” This is a natural fit for the DBA, she says, because they know how data moves. For example, a company might want to analyze data from multiple sources, requiring a DBA who understands the data formats and can bring them together. A DBA can deliver even more value by working to understand the company’s business model and advising on what data is most valuable to the business. This shift doesn’t require getting a PhD in statistics. “DBAs may not become data scientists,” notes Avril. “But they can make the data scientist role a lot more valuable by getting the data to them quicker and more efficiently and by providing an understanding of different data sources.” David Start, who heads the Independent Oracle Users Group (IOUG), agrees. “There’s a lot more [that] DBAs need to know” besides how to manage a database, he says. “It’s solutions. It’s architectures. It’s critical thinking. It’s communication skills and problem-solving.” His user group members are, he says, determined to help each other meet these challenges. “You can’t just be the Oracle DBA with your hands on the keyboard anymore, you have to look up and understand how business leaders want to use the data.” A good way to start, says Oracle’s Avril, is for DBAs to speak to business analysts from the various lines of business to understand what they hope to gain from new data sources and from the agility that the cloud provides. The other challenge that doesn’t go away, says Avril, is information security, even though the cloud makes that less of a hands-on task as well. “Someone has to worry about data security as the business expands access into all these forms and sources of data,” she says. The cloud vendor will encrypt it, but doesn’t know what it is or means—nor does it want to. 
“Someone on the customer side needs to understand and classify the data and understand the privilege model they have for who can access the data.”   2. Master the Tactical Side of Cloud Migration Organizations will take a long time transitioning from today’s world of on-premises databases to a future of fully managed cloud databases. When the IT team moves an Oracle Database to cloud infrastructures such as Amazon Web Services or Oracle Bare Metal infrastructure, “their databases will still be primarily managed by their own DBAs,” Avril says. “Over time, however, as companies use more cloud-based applications, the database they use will be fully managed by the cloud vendor.” During this transition, says longtime DBA Robert Dawson of Oracle implementation partner Meta7, there are a lot of tactical needs that DBAs can help meet. “When a company is working with on-premises and cloud technology, someone needs to understand the cloud environment,” he says, “but also things like networking, VPN access, security, basic tenets of infrastructure, and database as well.” Avril’s advice is to help your company understand which applications should be moved to the cloud immediately and which should wait. “Know the mechanics of how to move applications and data to the cloud and take advantage of free cloud trials to learn how to integrate cloud services with your on-premises data sources.”   3. Grow into New Roles Change is nothing new to the DBA role. “We’ve seen this before in earlier versions of the database when Oracle introduced automation to a whole bunch of DBA tasks—automated storage management, automatic workload repository, and several others,” says Avril. “DBAs initially thought, ‘What am I supposed to do? Will I lose my job?’” But over time DBAs learned to rely on the automation and shifted their time to more high-value work. Not that the transition is always easy, says Dawson. 
“There’s a level of frustration because DBAs are used to being responsible for the most complex stuff, such as installing Oracle Exadata, installing Oracle RAC, or doing version upgrades,” he says. “Now they need to go back and grow their skills in new ways.”

But as Dawson has seen in his own company, the opportunities are boundless if you’re willing to learn new skills. “You can explore open source tools like Docker or Ansible, and learn to use REST services,” he says. “Now you’re growing again. You might have been doing the same thing for the last 10 years, and now you’re opening up the O’Reilly book and reading about infrastructure as code and automation in cloud environments. That’s empowering. That’s career growth for DBAs.”

IOUG’s Start agrees that the cloud will be good for DBAs: “It’s the best thing that’s ever happened because it’s shaking up people. It’s a faster pace. It gets people on their toes. It gets the data moving through the business.”

For more information, visit www.cloud.oracle.com or sign up for a free trial at cloud.oracle.com/en_US/tryit.

Interested in some hands-on cloud experience? Oracle Code events are one-day, developer-focused events with practical labs and sessions.

Resources:
Oracle Versus Commodity Cloud Vendors Solution Brief
Oracle Database 12c Release 2 For The Cloud Solution Brief
Oracle Database Exadata Express Cloud Service Solution Brief
Oracle Database Cloud Service 30-Day Trial

Follow Us On Social Media:
Blog: https://blogs.oracle.com/database/
Twitter: https://twitter.com/oracledatabase
Facebook: https://www.facebook.com/OracleDatabase
LinkedIn: https://blogs.oracle.com/databaseinsider/


Customer Stories

Kscope 2017: Oracle Sessions

ODTUG Kscope17 is the premier Oracle Developer Conference, attracting Oracle experts from all over the world. Below you will find the registration page as well as all the Oracle sessions. Be sure to also check out our partner and Oracle ACE Directors' presentations.

Kscope 2017 Conference Home Page

Oracle Sessions (all speakers are from Oracle Corporation unless otherwise noted)

Sunday, June 25: APEX Sessions (Grand Oaks E/F/G)
8:30-9:30 AM: APEX 5.1 Advanced Universal Theme (Shakeeb Rahman)
9:30-10:30 AM: APEX 5.1 Advanced Charts (Marc Sewtz)
11:00 AM-12:00 PM: APEX 5.1 Advanced Interactive Grid (John Snyders)
1:00-2:00 PM: APEX 5.1 Other Advanced New Features (Patrick Wolf)
2:00-3:00 PM: Advances with Packaged Apps (Marc Sewtz)
3:15-4:15 PM: Update from Development (Joel Kallman)

Sunday, June 25: Business Intelligence Sessions (Cibolo Canyon 2/3/4 and Freesia)
8:40-10:00 AM: Oracle Analytics Keynote and Product Strategy (Gabby Rubin)
10:00-10:30 AM: What is New in Oracle BI Extended Universe (OBIEU)? (Alan Lee)
11:20 AM-12:00 PM: OAC Samples and Platform Capabilities for Application Developers (Philippe Lions)
1:10-1:30 PM: Review of Oracle Data Lake Patterns (Cloud and On-Premises) (Jean-Pierre Dijcks)
1:30-2:15 PM: Deep Dive Into Big Data Cloud Service and Embedded Components (Jean-Pierre Dijcks)
2:15-2:45 PM: Special Update on Data Warehousing in the Cloud (Yasin Baskan)
3:00-3:45 PM: Introducing Oracle Data Integration Platform (Denis Gray)
3:45-4:10 PM: Hub and Spoke is Dead! Lambda/Kappa Data Integration Patterns (Denis Gray)
4:10-4:30 PM: Demo of Oracle Data Integration Platform Cloud (Denis Gray)
1:00-1:30 PM: The Essbase Session (Gabby Rubin)
1:30-2:15 PM: The Next Next-Gen of Mobile Analysis (Jacques Vigeant)
2:15-2:45 PM: The Analytics Wipe Out (Matt Milella and Jacques Vigeant)
3:15-4:35 PM: Data Visualization Workshop (Philippe Lions)

Sunday, June 25: Database Sessions (Cibolo Canyon 8/9/10)
Session 1, 8:30-10:30 AM:
Ten Rules for Doing a PL/SQL Performance Experiment (Bryn Llewellyn)
Why Test Cases Matter (Chris Saxon)
Lifecycle Management and Continuous Integration for Database Code (Shay Shmeltzer)
Session 2, 11:00 AM-12:00 PM:
100 SQL Developer Tricks in 20 Minutes (Jeff Smith)
Leveraging Hyperlinks in REST APIs (Colm Divilly)
Oracle Optimizer Isn’t Black Magic (Maria Colgan)
Session 3, 1:00-2:45 PM:
ORDS REST Development Flexibility (Jeff Smith)
SQL Magic: The Cups and Balls Trick (Chris Saxon)
How to Protect the Crown Jewels (Maria Colgan)
Welding Together REST APIs (Colm Divilly)
Building Your Own Oracle Developer Day VM (Barry McGillin)
Session 4, 3:15-5:00 PM:
The "Hot Patching" Myth—Or Why You Have No Choice but to Use EBR for Zero-Downtime Patching (Bryn Llewellyn)
Hacking into Oracle Database (Chris Saxon)
New Open Source Projects from Database Tools (Barry McGillin)
30-Minute Q&A Panel, moderated by Helen Sanders

Monday, June 26: EPM/BI
10:30-11:30 AM (Grand Oaks H): Oracle Internet of Things Cloud Service: Making Sense of Fast Data (Sujay Sarkhel)
10:30-11:30 AM (Magnolia): Architecture Live: Designing an Analytics Platform for the Big Data Era (Jean-Pierre Dijcks)
2:00-3:00 PM (Grand Oaks I): What's New and Coming in Planning (Prasad Kulkarni)
3:15-4:15 PM (Cibolo Canyon 8/9/10): Build Once, Run Anywhere with Oracle Analytics Cloud (Alan Lee)
3:15-4:15 PM (Peony): The Future of On-Prem EPM (Matt Bradley)

Monday, June 26: Database
10:30-11:30 AM (Larkspur): Migrating Oracle Forms to the Cloud Using Application Express (David Peake)
10:30-11:30 AM (Verbena): Developing JavaScript Applications: The Oracle Offering (Shay Shmeltzer)
11:45 AM-12:45 PM (Bluebonnet/Dogwood): Deploying REST Services at Scale (Colm Divilly)

Tuesday, June 27: EPM/BI
8:30-9:30 AM (Cibolo Canyon 8/9/10): A Story About Data: Anything, Everywhere with Everyone (Matt Milella)
11:15 AM-12:15 PM (Cibolo Canyon 8/9/10): What's Cooking with Oracle Smart View for Office? (Al Marciante)
11:15 AM-12:15 PM (Grand Oaks H): Essbase Cloud Service: Architectural Evolution (Kumar Ramaiyer)
3:30-4:30 PM (Cibolo Canyon 8/9/10): Oracle Analytics Cloud: A Comprehensive Overview (Alan Lee)
3:30-4:30 PM (Grand Oaks I): What’s New, What’s Next: Oracle Planning and Budgeting Cloud Service (Shankar Viswanathan)
4:45-5:45 PM (Grand Oaks I): Oracle Analytics Cloud: Essbase New Capabilities (Kumar Ramaiyer)

Tuesday, June 27: Database
8:30-9:30 AM (Begonia): The Future of Database (Maria Colgan)
11:15 AM-12:15 PM (Larkspur): Low Code Challenge: Build a Real-World Application in 60 Minutes (David Peake)
2:00-3:00 PM (Verbena): Understanding SQL Trace, TKPROF, and Execution Plan for Beginners (Carlos Sierra)
3:30-4:30 PM (Verbena): SQLcl & SQL Developer Tips & Tricks (Jeff Smith)

Wednesday, June 28: EPM/BI
8:30-9:30 AM (Grand Oaks A/B): Configuring EPBCS for Operational Planning Requirements (Manish Daga)
9:45-10:45 AM (Cibolo Canyon 8/9/10): Oracle Analytics Mobile: Disrupt the Enterprise (In a Good Way) (Jacques Vigeant)
9:45-10:45 AM (Grand Oaks I): World-Class Marketing Planning with Oracle Cloud (Marc Seewald)
9:45-10:45 AM (Magnolia): Integrating Raspberry Pi Sensors with Data Visualization (Justin Baker)
9:45-10:45 AM (Peony): Current Trends and Best Practices for Improving User Adoption and Productivity for EPM and BI (D.J. Hoelscher)
11:15 AM-12:15 PM (Grand Oaks I): EPBCS Workforce and Strategic Workforce Planning in Action (Mark Rinaldi)
1:45-2:45 PM (Cibolo Canyon 2/3/4): On-Prem to EPM Cloud (Hybrid) and Cloud-Only-Based Integration Solutions (Michael Casey)
3:00-4:00 PM (Cibolo Canyon 8/9/10): Your Data Speaks with Enterprise Performance Reporting Cloud (Al Marciante)
3:00-4:00 PM (Magnolia): Analytic Views: A New Type of Database View for BI and Data Warehousing (William Endress)

Wednesday, June 28: Database
8:30-9:30 AM (Verbena): SQL Magic! (Chris Saxon)
11:15 AM-12:15 PM (Verbena): SQL Developer: Three Features You're Not Using But Should Be (Jeff Smith)
3:00-4:00 PM (Bluebonnet): SQL Tuning 101 (Carlos Sierra)
4:15-5:15 PM (Azalea): REST Enabling the Oracle Database (Colm Divilly)
4:15-5:15 PM (Larkspur): APEX 5.2 Preliminary Features (Marc Sewtz)

Thursday Deep-Dive Sessions
9:00-11:00 AM (Grand Oaks E/F), Database: A Real-World Comparison of the NoPlsql and Thick Database Paradigms (Bryn Llewellyn and Toon Koppelaars)

For more information on Oracle Data Management Cloud Services, visit www.cloud.oracle.com or sign up for a free trial at cloud.oracle.com/en_US/tryit.

Resources:
Oracle Developer Home Page
Oracle Database 12c Release 2 For The Cloud Solution Brief
Oracle Database Exadata Express Cloud Service Solution Brief
Oracle Database Cloud Service 30-Day Trial


Data Warehouse / Big Data

ODTUG Kscope17 is the premier Oracle Developer Conference that attracts Oracle experts from all over the world.

No other conference in the Oracle world features this kind of development-specific training and content, including lessons learned by other companies facing similar challenges and hundreds of opportunities to learn money-saving techniques. In addition, it's a great forum to meet key Oracle partners and Oracle ACE Directors.

ODTUG Kscope17 offers a multitude of benefits for attendees. Among the highlights are:
• 300+ technical sessions
• Four days of hands-on training at no additional charge
• All-day symposiums at no additional charge
• Lunch-and-learn sessions with Oracle experts
• Plenty of time to meet and talk with the experts and other participants

Kscope 2017 Conference Home Page

If you can't make it to ODTUG Kscope17, you can still participate from home. Check out the list of sessions we're bringing you live from San Antonio, Texas!

Using R for Data Profiling (Michelle Kolbe, Red Pill Analytics)
When: Monday, June 26, 2017, Session 3, 2:00-3:00 PM CDT
Topic: BI and Reporting
Post your questions on the Livestream and the presenter will try to answer them during the Q&A.
LIVESTREAM LINK

Alexa! How Do You Work with Oracle REST Data Services? (Jonathan Dixon, JMJ Cloud)
When: Tuesday, June 27, 2017, Session 7, 11:15 AM-12:15 PM CDT
Topic: Database
Post your questions on the Livestream and the presenter will try to answer them during the Q&A.
LIVESTREAM LINK

Advanced Calculation Techniques: Going Beyond the Calc Dim (George Cooper, Gap)
When: Wednesday, June 28, 2017, Session 12, 9:45-10:45 AM CDT
Topic: Essbase; Subtopic: Essbase Calculation
Post your questions on the Livestream and the presenter will try to answer them during the Q&A.
LIVESTREAM LINK

#LetsWreckThisTogether APEX Talks (Thursday Deep Dive)
When: Thursday, June 29, 2017, 9:00-11:00 AM CDT
Topic: APEX
Post your questions on the Livestream and the presenter will try to answer them during the Q&A. 
LIVESTREAM LINK
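The first livestream session above covers data profiling in R. As a rough illustration of what basic profiling involves (per-column null counts, distinct values, and value ranges), here is a minimal sketch in Python; it is a generic example, not the session's actual code, and the sample data is invented:

```python
# Minimal data-profiling sketch: for each column, report null count,
# distinct values, and min/max. Sample rows are invented for illustration.

def profile(rows):
    """Summarize a list of dicts; returns one summary dict per column."""
    columns = {key for row in rows for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in rows]
        present = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(present),
            "distinct": len(set(present)),
            "min": min(present) if present else None,
            "max": max(present) if present else None,
        }
    return report

rows = [
    {"region": "West", "sales": 120},
    {"region": "East", "sales": 95},
    {"region": "West", "sales": None},
]
print(profile(rows))
```

A real profiling pass would also look at data types, value distributions, and outliers, but the core idea is the same: scan every column and summarize what is actually in it before trusting it for analysis.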


Data Warehouse / Big Data

Getting Clear About Clouds: How to Find the Best Approach for Your Enterprise

Today we have a guest blog from our peers on the Oracle Cloud Machine team. Maywun Wong, Director, Product Marketing, discusses the value of Oracle Exadata Cloud Machine and the use cases behind this solution.

Most business and IT executives understand the extent to which cloud computing is transforming business and disrupting IT organizations. These services and solutions—including Software-as-a-Service (SaaS), Infrastructure-as-a-Service (IaaS), and Platform-as-a-Service (PaaS)—are ushering in new cost efficiencies, productivity gains, and innovation. However, sorting through the options and choosing the right approach can be daunting.

Historically, business and IT decision-makers have had three choices when adopting cloud technologies: private clouds, public clouds, and hybrid clouds. Each approach delivers different features, capabilities, challenges, and benefits. How do you choose the best cloud configuration for your organization? How do you ensure that you are minimizing costs, maximizing gains, and retaining the flexibility and agility required for digital business? With the right strategic framework, it's possible to transform the cloud from a tactical benefit into a strategic asset. Let's take a look at the three traditional types of clouds:

Public clouds. This approach relies on publicly available infrastructure and services to deliver computing capacity, applications, and/or services. It can be less costly than private clouds, it's relatively fast to deploy and easy to use, and it's highly flexible. The potential downside, however, is that it's operated by a third party, it's a shared resource, and outages and downtime can occur. For organizations whose demands fluctuate and that may need to scale up or down quickly, public cloud is a versatile, cost-effective way to reap the benefits of the cloud. Oracle's public cloud solution is Oracle Cloud, offering a high level of scalability, flexibility, and dependability. Learn more about Oracle Cloud. 
Private clouds. This approach uses a proprietary IT architecture to deliver infrastructure, applications, and/or services. A private cloud is developed specifically for a single organization and, consequently, offers a high degree of control and reliability. It's also more readily customizable. However, it does require a higher CapEx investment than other cloud options, as well as greater IT expertise. For organizations operating in regulated industries and subject to data sovereignty laws, private clouds offer the control needed to meet compliance demands. Private clouds also facilitate strict data access controls. Oracle provides a number of solutions for organizations looking to deploy a private cloud: Oracle Private Cloud Appliance, Oracle Exadata, Oracle Exalogic, and Oracle SuperCluster.

Hybrid clouds. This environment relies on a mix of public and private cloud infrastructures, applications, and/or services. By combining elements of both types of clouds, an organization can balance cost and data sovereignty concerns with greater flexibility and scalability. For example, businesses may choose a hybrid cloud model to keep critical or data-sensitive applications or functions behind their firewall, but take advantage of the ease and flexibility of the public cloud for less critical functions such as test/dev.

A New Model Emerges: Oracle Cloud at Customer

Oracle has introduced Cloud at Customer. Built on the public Oracle Cloud but located in the customer's data center, it is managed by Oracle. Cloud at Customer breaks down the traditional barriers to public cloud adoption by combining the best of both worlds: the high flexibility of the public Oracle Cloud and the privacy of your own data center. Oracle Cloud at Customer includes Oracle Cloud Machine and Exadata Cloud Machine, and offers several important advantages:

• A lower overall cost point with a high level of flexibility, making it ideal for delivering cloud-native solutions that require an on-premises framework
• A user experience consistent with Oracle Cloud, with low latency
• The ability to address data sovereignty and governance concerns
• The same cloud software stack across the public Oracle Cloud and the on-premises Cloud at Customer elements—such as Oracle Cloud Machine and Exadata Cloud Machine—to simplify workload portability
• An OpEx approach, which lets your organization buy only the resources it needs and establish a highly predictable OpEx budget; equipment is owned and operated by Oracle
• Maintenance of data behind the firewall, so your IT department can integrate the cloud technology with your company's network security
• An architecture that is especially suited to server-side Java users requiring high performance and scalability

Cloud at Customer delivers the highest level of performance and the most available infrastructure for running an Oracle database, including high-performance middleware and native InfiniBand connectivity. Organizations can tap the latest innovations and take advantage of the rapid development the cloud makes possible.

Net Gains

Private, public, and hybrid clouds all remain viable options, and all play a role in the modern enterprise. However, organizations that require a public cloud infrastructure should consider their choices carefully. Oracle Cloud at Customer builds a foundation for a more agile and flexible IT and business framework. It's an environment that supports everything from agile initiatives and DevOps to real-time analytics and advanced features for mobile apps—all of which helps ensure that your organization is on the leading edge of innovation and disruption. 
Resources:
Cloud Operations for Oracle Cloud Machine Whitepaper
Oracle Versus Commodity Cloud Vendors Solution Brief
Oracle Database 12c Release 2 For The Cloud Solution Brief
Oracle Database Cloud Service 30-Day Trial


Want More from Cloud? New Exadata Subscription-Based On-Premise Services

Today we have a guest blog from our peers on the Oracle Cloud Machine team. Anne Plese, Director, Product Marketing, Cloud Infrastructure Database Solutions, discusses the value of Oracle Exadata Cloud Machine and the use cases behind this solution.

Cloud-integrated engineered systems for Oracle Databases address many of the key concerns that inhibit enterprises from making a swifter transition away from legacy on-premises infrastructures. It's fair to say that the vast majority of companies are well aware of the benefits provided by cloud-hosted applications. But for businesses that have real or perceived concerns about implementing cloud-hosted solutions for their more complex database, analytics, or OLTP needs, many of the benefits this technology provides remain just out of reach.

Cloud Concerns

The primary worries about the public cloud include data residency, security, and vendor lock-in. For private cloud implementations, lock-in is the biggest concern, followed by a lack of appropriate skills, and then security. Data security does appear to be the primary inhibitor, however. According to a Cloud Industry Forum report, privacy and security worries remain as big a barrier to adoption as they were five years ago.

In this video, for example, Juan Loaiza, SVP, Oracle, reviews the most common objections companies communicate to us, including:
• Regulatory or corporate policies that require data to be local to the company or territory
• Limited resources or IT skills needed to manage database infrastructure
• Lack of proximity to public cloud data centers, or latency issues that compel many organizations to demand the performance of a LAN infrastructure
• Legacy database and infrastructure complexity
• Risk associated with data security, and worry about the cost and quality of services cloud providers offer

The Confident Transition

Oracle appreciates the lingering concerns that some companies have about hosting mission-critical business processes in the cloud. 
That's why we offer more choices of database infrastructure services to help companies transition to the cloud at their own pace. Central to that is the Oracle Exadata Cloud Machine. This fully integrated engineered system is delivered to your data center to address latency and data sovereignty requirements—with all of the benefits of a modern cloud service.

Here's how it works: the services are physically delivered from inside your firewall, next to systems and data where compliance and operation policies have already been defined. Oracle conforms to regulatory, privacy, and legal frameworks, and meters only the services you consume.

A Good Fit if You Are New to Exadata

Exadata Cloud Machine is identical to Oracle's public cloud service in that it is managed for you by Oracle cloud experts. It's a cloud-based service that is still under your control, but all of the resource-consuming activities associated with managing infrastructure components—servers, storage, storage software, networking, and firmware—are handled by Oracle Cloud Operations. As with our Exadata Cloud Service public cloud offering, companies avoid the costs of hiring, training, and retaining staff with specialized skills.

Exadata Cloud Machine enables easy migration of existing databases across the LAN and simplifies database backup to existing data center infrastructure. Plus, it provides a maximum-availability architectural configuration with built-in best practices. For companies that want more from the cloud, the Exadata Cloud Machine is a proven path away from cumbersome and costly traditional on-premises infrastructures.

Learn more about how our engineered systems can address any concerns you have with migrating to the cloud. 
