Hello world! OCI Generative AI is here!

September 19, 2023 | 4 minute read
Luis Cabrera-Cordon
Senior Director of Product Management

Artificial intelligence (AI), and more specifically generative AI, is driving a technological revolution akin to the rise of the internet or mobile communications. At Oracle, we want to ensure that you have cutting-edge tools to harness the power of this revolution for your enterprise.

We are excited to announce the beta launch of OCI Generative AI, a new service hosted in Oracle Cloud Infrastructure (OCI). OCI Generative AI lets you seamlessly add generative AI capabilities based on large language models (LLMs) to your applications and workflows through simple APIs. The OCI Generative AI service provides LLMs from Cohere, the leading generative AI company for enterprise-grade LLMs. We are thrilled to be working closely with Cohere and to bring their models and technology to you and to the rest of the Oracle ecosystem.

“Oracle and Cohere are poised to lead the enterprise AI revolution, which will fundamentally change how companies do business,” said Aidan Gomez, founder and CEO of Cohere. “Together, we’re building custom-tailored models to meet the needs of each company in every industry, while providing outstanding performance and data security.”

Try OCI Generative AI in beta

Available large language models

Through OCI Generative AI, you can consume the following prebuilt models. You can also customize or fine-tune models with your own data to meet your domain-specific needs.

  • Command model: Command is Cohere’s highly performant generation model. It takes a user command as input and generates text that follows the instructions. In the beta release, we’re offering two model sizes: a 52-billion-parameter model and a 6-billion-parameter model. The 52-billion-parameter model is Cohere’s best and most advanced model, while the 6-billion-parameter model comes at a lower price point.
  • Summarization model endpoint: Summarization runs as an optimization on top of Cohere’s Command model, so you don’t need to provide instructions. Instead, you enter the text to summarize and can set parameters such as the length and format of the summary. For the beta release, we’re exposing a 52-billion-parameter model for summarization. Use this model endpoint to summarize news articles, documents, blog posts, and so on.
  • Embed model: Using the Embed model, you can transform text, whether a single word, a sentence, or a larger passage, into a high-dimensional numeric vector that represents its meaning. This immensely powerful tool enables text classification, clustering, and even semantic search; see the sketch that follows this list.
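
Here is a minimal sketch of requesting embeddings for a few strings. The class names reflect the current OCI Python SDK surface and may differ during the beta; the service endpoint, compartment OCID, and model ID are placeholders to replace with values from your own tenancy.

```python
# Minimal embedding sketch using the OCI Python SDK (pip install oci).
# Endpoint, compartment OCID, and model ID below are illustrative placeholders.
import oci
from oci.generative_ai_inference import GenerativeAiInferenceClient, models

config = oci.config.from_file()  # reads ~/.oci/config

client = GenerativeAiInferenceClient(
    config,
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",  # example region
)

details = models.EmbedTextDetails(
    compartment_id="ocid1.compartment.oc1..exampleuniqueid",  # your compartment OCID
    serving_mode=models.OnDemandServingMode(
        model_id="<embed model ID from the console>"          # placeholder
    ),
    inputs=[
        "How do I reset my password?",
        "Steps to recover account access",
        "Quarterly revenue grew 8% year over year",
    ],
)

response = client.embed_text(details)
# One numeric vector per input string; vectors that are close together
# represent semantically similar text, which is what powers clustering,
# classification, and semantic search.
print(len(response.data.embeddings), "vectors returned")
```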

With OCI Generative AI, you can put the pretrained models to work right away through the OCI console or a simple API call and power many of your enterprise applications. In many situations, these models work out of the box to solve your use cases with little to no additional effort.
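
For example, a simple call to the Command generation model might look like the sketch below. The class names, endpoint, model ID, and compartment OCID are illustrative assumptions; check the console and SDK reference for the exact values available in your tenancy during the beta.

```python
# Minimal text generation sketch using the OCI Python SDK (pip install oci).
# Endpoint, compartment OCID, and model ID below are illustrative placeholders.
import oci
from oci.generative_ai_inference import GenerativeAiInferenceClient, models

config = oci.config.from_file()  # reads ~/.oci/config

client = GenerativeAiInferenceClient(
    config,
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",  # example region
)

details = models.GenerateTextDetails(
    compartment_id="ocid1.compartment.oc1..exampleuniqueid",  # your compartment OCID
    serving_mode=models.OnDemandServingMode(
        model_id="cohere.command"                             # assumed ID for the Command model
    ),
    inference_request=models.CohereLlmInferenceRequest(
        prompt="Write a one-paragraph product description for a solar-powered desk lamp.",
        max_tokens=200,
        temperature=0.3,
    ),
)

response = client.generate_text(details)
print(response.data)  # the generated text is in the inference response payload
```

On-demand calls like this are billed by the tokens you send and receive; dedicated AI clusters, described later in this post, are the alternative for reserved capacity.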

Adapting the models to your needs

One of the requirements for many industry-specific use cases is the ability to customize the models to perform a specific industry task, to use a specific domain of knowledge, or to generate text that follows a particular industry or corporate format. In OCI Generative AI, you can fine-tune the Cohere models to meet your own specific needs. Fine-tuning lets you customize the models without needing to train a model from scratch or provide a large amount of data. Because the models already have knowledge of language, a modest number of examples, on the order of 100 to 1,000, can be sufficient for the model to learn to tackle your unique problem. Learn more in the OCI Generative AI fine-tuning documentation.
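
As a rough illustration of how small such a dataset can be, the sketch below writes a handful of prompt and completion pairs to a JSON Lines file, a common format for instruction fine-tuning. The exact schema and upload location are assumptions; confirm both in the OCI Generative AI fine-tuning documentation before you train.

```python
# Sketch of a tiny fine-tuning dataset as JSON Lines with prompt/completion
# pairs. The schema is an assumption for illustration; check the service
# documentation for the exact format it expects.
import json

examples = [
    {
        "prompt": "Summarize this support ticket in one sentence: Customer cannot "
                  "log in after the latest mobile app update.",
        "completion": "A customer is locked out of the mobile app following the latest update.",
    },
    {
        "prompt": "Draft a polite reply declining a meeting request for Friday.",
        "completion": "Thank you for the invitation. Unfortunately, I am unavailable on Friday. "
                      "Could we find another time next week?",
    },
    # ...on the order of 100 to 1,000 examples in total
]

with open("fine_tune_train.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```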

Private and secure

Oracle respects your enterprise data, your privacy, and your security. OCI hosts both the prebuilt and custom models, and none of your data is shared with Cohere or other customers. In addition, you’re the only entity allowed to use custom models trained on your data.

Peace of mind with dedicated AI clusters

One of the challenges unique to generative AI enterprise users today is ensuring the availability of compute resources to meet their stringent performance needs. In addition to consuming the prebuilt models on demand and paying for the tokens you send and receive, OCI Generative AI introduces a new way for enterprises to scale their generative AI workloads: you can reserve dedicated compute capacity for a monthly price (a sketch of reserving a cluster through the SDK follows the list below). This capability brings you the following advantages:

  • You can control the throughput and price tradeoff. For example, if you need a higher throughput for your workload, you can increase the AI units allocated to your dedicated AI cluster.
  • You have full control of your monthly expenses, making it easier to get expense approvals and reducing the chance of billing surprises.
  • This pricing model gives you peace of mind, knowing that you have the resources you need for training and inferencing.
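
As a rough sketch of what reserving capacity could look like programmatically, the snippet below creates a dedicated AI cluster with the OCI Python SDK’s Generative AI management client. The client name, detail model, cluster type, unit shape, and unit count are assumptions for illustration; the console lists the AI unit shapes and cluster sizes actually available to your tenancy.

```python
# Sketch of reserving a dedicated AI cluster with the OCI Python SDK.
# Class names, cluster type, and unit shape are illustrative assumptions.
import oci
from oci.generative_ai import GenerativeAiClient, models

config = oci.config.from_file()  # reads ~/.oci/config

client = GenerativeAiClient(config)

details = models.CreateDedicatedAiClusterDetails(
    compartment_id="ocid1.compartment.oc1..exampleuniqueid",  # your compartment OCID
    display_name="genai-hosting-cluster",
    type="HOSTING",                                 # assumed value; fine-tuning clusters use a training type
    unit_shape="<AI unit shape from the console>",  # placeholder
    unit_count=1,                                   # add AI units to raise throughput
)

response = client.create_dedicated_ai_cluster(details)
print(response.data.lifecycle_state)  # the cluster provisions asynchronously
```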


Let the AI revolution begin!

We’re very excited to partner with Cohere and to make the OCI Generative AI service available to you through Oracle’s Beta Program. We can’t wait to see what you create with these revolutionary new tools.

Oracle is also plugging generative AI experiences into its suite of business applications, from Cerner to Fusion. We envision a world where generative AI models help you extract insights from enterprise data more quickly, augment your creativity, and empower you to solve real-world problems.

The enterprise generative AI era begins now!