Oracle Autonomous Database Select AI helps simplify and automate using generative AI in your applications with a wide range of AI providers you can choose from. Select AI has recently expanded the set of AI providers to include Amazon Bedrock and those with OpenAI-compatible APIs, such as xAI, Mistral AI, and Fireworks AI, among others. This expansion gives you even more models from which you can choose the best fit for your specific use cases and business needs.
You can easily add an AI provider and LLM with Select AI by using a simple Select AI profile specification. This lets you quickly try out and switch between different AI models, helping you find the best results and price performance for your application.
Let’s look at these new providers and examples of how to set up the Select AI profile so you can take advantage of them.
Amazon Bedrock
Amazon Bedrock provides consolidated access to over 100 fully managed models from leading AI companies. Note that AWS Bedrock foundation models require access permissions via the Amazon Bedrock console.
To set up a Select AI profile with a model offered by Amazon Bedrock, you will need AWS credentials and a model identifier (modelId). To create the AWS credential, specify the username as the AWS access key ID and the password as the AWS secret access key (see AWS Identity and Access Management). With your AWS credentials, log in to the Amazon Bedrock console to enable access to the foundation models you want to use.
The modelId depends on the resources that you use. Refer to the AWS documentation to get the modelId — for example, “amazon.titan-text-express-v1” or “anthropic.claude-v2”. If you use a:
- Base model: Specify the model ID or its ARN (Amazon Resource Name). For a list of model IDs for base models, see Amazon Bedrock base model IDs.
- Inference profile: Specify the inference profile ID or its ARN. For a list of inference profile IDs, see Supported Regions and models for cross-region inference.
- Provisioned model: Specify the ARN of the Provisioned Throughput. For more information, see Run inference using a Provisioned Throughput.
- Custom model: First purchase Provisioned Throughput for it. Then specify the ARN of the resulting provisioned model. For more information, see Use a custom model in Amazon Bedrock.
- Amazon Bedrock Marketplace model: Specify the ID or ARN of the marketplace endpoint that you created. For more information, see Amazon Bedrock Marketplace.
For AWS Bedrock, you set up the credential and network ACL access using:
BEGIN
  DBMS_CLOUD.create_credential(
    credential_name => 'AWS_CRED',
    username        => 'Your_AWS_Access_Key_ID',
    password        => 'Your_AWS_Secret_Access_Key');
END;
/

BEGIN
  DBMS_NETWORK_ACL_ADMIN.APPEND_HOST_ACE(
    host => 'bedrock-runtime.us-east-1.amazonaws.com',
    ace  => xs$ace_type(
              privilege_list => xs$name_list('http'),
              principal_name => 'ADMIN',
              principal_type => xs_acl.ptype_db));
END;
/
Then, in your AI profile, specify the provider as “aws”, the AWS credential object name, and the model you want to use. In this example, the profile uses Anthropic Claude (anthropic.claude-v2).
BEGIN
  DBMS_CLOUD_AI.create_profile(
    'AWS',
    '{"provider": "aws",
      "credential_name": "AWS_CRED",
      "object_list": [{"owner": "ADMIN", "name": "users"},
                      {"owner": "ADMIN", "name": "movies"},
                      {"owner": "ADMIN", "name": "genres"},
                      {"owner": "ADMIN", "name": "watch_history"},
                      {"owner": "ADMIN", "name": "movie_genres"}],
      "model": "anthropic.claude-v2"}');
END;
/
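Once the profile exists, you can activate it for your session and ask natural-language questions against the tables in its object_list. The following is a minimal sketch; the questions (and the columns they imply) are illustrative assumptions about the example schema, not part of the profile definition:

```sql
-- Make the AWS profile active for the current session
BEGIN
  DBMS_CLOUD_AI.set_profile('AWS');
END;
/

-- Select AI generates and runs SQL from the natural-language prompt
SELECT AI how many movies has each user watched;

-- Use the showsql action to inspect the generated SQL without running it
SELECT AI showsql how many movies has each user watched;
```

The showsql action is a convenient way to validate what a newly configured model produces before relying on its answers.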
AI providers with OpenAI-compatible APIs
With OpenAI-compatible providers, such as xAI, Mistral AI, and Fireworks AI, you can explore and use an even wider set of AI models, such as Grok, Mistral, and more.
To enable this capability, Select AI introduces a new AI profile attribute provider_endpoint for specifying the API endpoint for OpenAI-compatible providers. The AI profile “model” attribute is also required along with the provider endpoint. See each provider’s documentation for available models and their corresponding names.
To set up a Select AI profile with an LLM from an OpenAI-compatible provider, you will need an API key, the endpoint (base URL), and the model name. OpenAI-compatible providers use bearer token authentication, so you’ll need to obtain an API key from your provider to create the credential. Refer to each provider’s documentation for details.
To determine the OpenAI-compatible base URL, first locate the full API request URL in the provider’s documentation. For example, for Fireworks AI, the full request URL is “https://api.fireworks.ai/inference/v1/chat/completions”. Since “/v1/chat/completions” is the OpenAI-compatible path, specify the URL without this path in the provider_endpoint attribute: “https://api.fireworks.ai/inference”.
For the model name, some providers may require a prefix, so always refer to their documentation for the correct format. For Fireworks AI, for example, you would specify the “model” attribute as “accounts/fireworks/models/llama-v3p1-8b-instruct”.
Continuing with Fireworks AI in our example, you set up the credential and network ACL access using:
BEGIN
  DBMS_CLOUD.create_credential(
    credential_name => 'FIREWORKS_CRED',
    username        => 'FIREWORKS',
    password        => '<api_key>');
END;
/

BEGIN
  DBMS_NETWORK_ACL_ADMIN.APPEND_HOST_ACE(
    host => 'api.fireworks.ai',
    ace  => xs$ace_type(
              privilege_list => xs$name_list('http'),
              principal_name => 'ADMIN',
              principal_type => xs_acl.ptype_db));
END;
/
Then, in your AI profile, specify the Fireworks credential object name and the model you want to use.
BEGIN
  DBMS_CLOUD_AI.create_profile(
    'FIREWORKS',
    '{"credential_name": "FIREWORKS_CRED",
      "object_list": [{"owner": "ADMIN", "name": "GENRE"},
                      {"owner": "ADMIN", "name": "CUSTOMER"},
                      {"owner": "ADMIN", "name": "movie_genres"},
                      {"owner": "ADMIN", "name": "STREAMS"},
                      {"owner": "ADMIN", "name": "MOVIES"},
                      {"owner": "ADMIN", "name": "ACTIONS"}],
      "model": "accounts/fireworks/models/llama-v3p1-405b-instruct",
      "provider_endpoint": "api.fireworks.ai/inference",
      "conversation": "true"}');
END;
/
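As with the AWS example, you activate the profile and then query in natural language. Because this profile sets "conversation": "true", prompts in the same session can build on earlier ones. This is a sketch; the prompts below are illustrative assumptions about the example schema:

```sql
-- Make the Fireworks AI profile active for the current session
BEGIN
  DBMS_CLOUD_AI.set_profile('FIREWORKS');
END;
/

-- Natural-language query against the tables in the profile's object_list
SELECT AI which genres have the most streams;

-- With conversation enabled, a follow-up prompt can reference the prior one
SELECT AI narrate summarize that result in one sentence;
```

Switching between this profile and the AWS one is just another set_profile call, which makes it easy to compare answer quality and cost across models.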
Select AI offers a choice of AI providers
As noted above, Select AI offers a wide range of AI providers you can choose from. Here’s the latest list:
• OCI Generative AI service
• OpenAI
• Azure OpenAI Service
• Cohere
• Google
• Anthropic
• Hugging Face
• Amazon Bedrock
• OpenAI API-compatible providers
See Select AI Usage Guidelines for more information.
Several of the providers have default models. However, for AWS and OpenAI-compatible providers, the model attribute is required. See Configure Select AI to Use Supported AI Providers, Example: Select AI with AWS, and Example: Select AI with OpenAI-Compatible Providers for details and additional examples.
Resources
For more information…
- Try Select AI for free on OCI: Autonomous Database Free Trial
- Video: Getting Started with Oracle Select AI
- Documentation
- LiveLabs