OCI Cache with Redis: The lightning-fast way to improve your application performance

October 17, 2023 | 3 minute read
Jim Battenberg
Senior Director, Product Management
Mike Sorola
Senior Product Manager, Data and AI Services

High performance and low latency are critical for many applications, including ecommerce, financial services, real-time location-based applications, real-time data analytics, gaming, and the internet of things (IoT). To achieve this performance, developers often use in-memory data stores like Redis to store and retrieve data with millisecond latency. We're glad to announce that OCI Cache with Redis is now generally available in all OCI regions.

What is Redis?

Remote Dictionary Server, or Redis, is an open source, in-memory data store that offers a wide range of data structures, including strings, hashes, lists, sets, and sorted sets. It’s often used as a caching layer to improve the performance of applications and as a data store for real-time applications.
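Each of these data structures maps to its own family of commands. The sketch below shows the call shapes as they appear in a client such as redis-py; because it doesn’t assume a reachable cluster, a minimal in-memory stand-in class mimics the handful of calls used. The key names are illustrative, and against a real deployment you would replace the stand-in with a `redis.Redis(...)` connection.

```python
# Illustrates the five core Redis data structures via redis-py-style calls.
# FakeRedis is an in-memory stand-in so the sketch runs without a server;
# with a real cluster you would use redis.Redis(host=..., port=...) instead.

class FakeRedis:
    def __init__(self):
        self._data = {}

    # Strings: simple key -> value
    def set(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data.get(key)

    # Hashes: field -> value maps under one key
    def hset(self, key, field, value):
        self._data.setdefault(key, {})[field] = value

    def hgetall(self, key):
        return self._data.get(key, {})

    # Lists: ordered, push to the tail
    def rpush(self, key, *values):
        self._data.setdefault(key, []).extend(values)

    def lrange(self, key, start, stop):
        lst = self._data.get(key, [])
        return lst[start:] if stop == -1 else lst[start:stop + 1]

    # Sets: unordered, unique members
    def sadd(self, key, *members):
        self._data.setdefault(key, set()).update(members)

    def smembers(self, key):
        return self._data.get(key, set())

    # Sorted sets: members ranked by a numeric score
    def zadd(self, key, mapping):
        self._data.setdefault(key, {}).update(mapping)

    def zrange(self, key, start, stop):
        items = sorted(self._data.get(key, {}).items(), key=lambda kv: kv[1])
        members = [m for m, _ in items]
        return members[start:] if stop == -1 else members[start:stop + 1]


r = FakeRedis()
r.set("page:home:title", "Welcome")              # string
r.hset("user:42", "name", "Ada")                 # hash field
r.rpush("recent:orders", "o1", "o2", "o3")       # list
r.sadd("tags:post:7", "redis", "caching")        # set
r.zadd("leaderboard", {"ada": 310, "bob": 120})  # sorted set, scored

print(r.get("page:home:title"))        # Welcome
print(r.zrange("leaderboard", 0, -1))  # ['bob', 'ada'] (ascending by score)
```

The sorted set is the structure behind common patterns like leaderboards: members are kept ordered by score, so range queries are cheap.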


Why use OCI Cache with Redis?

Oracle Cloud Infrastructure (OCI) Cache with Redis is a fully managed Redis version 7.0.5 service that makes it easy to deploy and manage Redis on OCI. It offers numerous benefits over self-managed Redis deployments, including the following:

  • Automated management: OCI Cache with Redis automatically manages the provisioning and scaling of a Redis cluster while also patching the OS. This automation allows you to focus on developing your applications. You can use the same Redis API you use today.

  • Scalability: OCI Cache with Redis is easily scalable. You can add or remove nodes from your cluster as needed to meet the demands of your application.

  • Performance: OCI Cache with Redis offers high performance and low latency, so your applications can experience lightning-fast response times when accessing data from Redis.

A graphic depicting the architecture for a deployment using OCI Cache with Redis.
Figure 1: Typical use case of OCI Cache with Redis in front of your database
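The deployment in Figure 1 is typically implemented as a cache-aside pattern: the application checks Redis first and, on a miss, reads the database and writes the result back with a time-to-live (TTL). The sketch below is illustrative — the key naming, TTL value, and the in-memory stand-ins for the Redis client and the database are assumptions, not service specifics; with a client like redis-py, the same `get`/`setex` calls work against a real cluster endpoint.

```python
import time

# Cache-aside: check the cache first; on a miss, read the database and
# cache the result with a TTL so subsequent reads are served from memory.

CACHE_TTL_SECONDS = 300  # illustrative TTL; tune to your staleness tolerance


class FakeCache:
    """In-memory stand-in for a Redis client (get/setex subset)."""

    def __init__(self):
        self._store = {}

    def get(self, key):
        value, expires_at = self._store.get(key, (None, 0.0))
        if value is not None and time.monotonic() < expires_at:
            return value
        return None  # missing or expired

    def setex(self, key, ttl, value):
        self._store[key] = (value, time.monotonic() + ttl)


def fetch_product(cache, db, product_id):
    """Return (value, source) where source shows which tier served the read."""
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached, "cache"
    row = db[product_id]  # stand-in for a real database query
    cache.setex(key, CACHE_TTL_SECONDS, row)
    return row, "database"


db = {"sku-1": "Espresso Machine"}
cache = FakeCache()
print(fetch_product(cache, db, "sku-1"))  # ('Espresso Machine', 'database')
print(fetch_product(cache, db, "sku-1"))  # ('Espresso Machine', 'cache')
```

The first read falls through to the database and populates the cache; the second is served from memory, which is where the latency win in Figure 1 comes from.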

Key features of OCI Cache with Redis

  • Flexible memory shapes: OCI Cache with Redis offers full flexibility on memory, so you can select the configuration that best meets the needs of your application, from a small 2-GB cluster on a single node to a large 500-GB cluster with up to five nodes.

  • Minimal downtime on scaling: You can scale your Redis cluster up and out with one second or less of downtime, which makes it easy to adjust your cluster to meet the changing demands of your application.

  • Automated high availability: Your data is automatically replicated across multiple nodes for redundancy, and when you deploy a cluster with two or more nodes, those nodes are automatically distributed across availability domains and fault domains for better resilience to infrastructure failures.

  • Competitive pricing: OCI Cache with Redis is priced based on the amount of memory you use, so you only pay for what you need.

A screenshot of the Create cluster screen in the Oracle Cloud Console, showing how to configure nodes for flexible Redis.
Figure 2: Node configuration panel


OCI Cache with Redis is priced based only on the total amount of memory you use, so pricing is simple and predictable. You can find our pricing for the service here or estimate your costs here.

What customers are saying

“OCI Cache with Redis has been a game-changer for our application,” said Joon Daroy, development manager of IT at SmartVisit Solutions. “It has helped us to improve performance and reliability, and it has made it much easier to manage our Redis deployment.”


OCI Cache with Redis is a powerful and easy-to-use Redis service that can help you improve the performance and reliability of your applications. If you’re looking for a fully managed Redis service, we encourage you to try OCI Cache with Redis.


You can find OCI Cache with Redis in the Oracle Cloud Console under “Databases.” Visit OCI Cache with Redis to learn more about the service.

Jim Battenberg

Senior Director, Product Management

Jim Battenberg joined Oracle in late 2016. He is a Senior Director, Product Management for the Data Management Services team at Oracle, where his focus spans product management, business operations and solution architecture.

Jim’s time in “the cloud” dates back to the mid-to-late 1990s, when the shared hosting, dedicated hosting, and ASP markets first began implementing the core concepts of virtualization. He developed and launched a suite of profitable hosting services from scratch and helped take Interliant (later purchased by Navisite) public.

Prior to Oracle, Jim led the Platform Enablement team for the CenturyLink Cloud. Before that, he led marketing and created the first product marketing function for the Rackspace Cloud. He has also held senior product management and marketing leadership roles at enterprise hardware, software, and hosting/cloud companies, from startups to Fortune 500 firms. Jim holds a finance degree and an MBA from the University of Houston.

Finally, Jim loves yoga, poker and trap music, and claims to be able to combine all 3 – but you’ll have to ask him to see it!

Mike Sorola

Senior Product Manager, Data and AI Services

Mike Sorola is a product manager in Data and AI Services. He joined OCI in 2020 and focuses on managed, open-source services.
