Wendy Yip

Senior Product Manager, OCI Data Science

Recent Blogs

Introducing Private Endpoints in OCI Data Science: Enhanced Security ...

OCI Data Science is introducing private endpoints for model deployment. Previously, all inference endpoints were either public-facing or accessible through the internal Oracle network. Private endpoints ensure that all traffic remains within your virtual cloud network (VCN), eliminating exposure to the public internet.

OCI Data Science 2024: A year of innovation in AI

2024 has been an incredible year for Oracle Cloud Infrastructure (OCI) Data Science, marked by innovative advancements, powerful new features, and inspiring use cases. Oracle has continued to empower data scientists and developers with tools that help simplify complex workflows, enable cutting-edge AI applications, and deliver actionable insights.

Introducing Llama 3.3 model on OCI Data Science

Meta has introduced the Llama 3.3 70B model, an instruction-tuned, text-only large language model offering enhanced reasoning, math, and instruction-following capabilities comparable to its larger predecessors but with lower costs and broader GPU compatibility. This model can be seamlessly integrated into OCI Data Science, which supports the entire machine learning lifecycle, including no-code solutions like AI Quick Actions for deploying, fine-tuning, and managing models.

LLM inferencing with Arm-based OCI Ampere A1 Compute in OCI Data ...

Oracle Cloud Infrastructure (OCI) Data Science AI Quick Actions now supports inferencing of models in GPT-Generated Unified Format (GGUF) with OCI Ampere A1 CPU shapes. OCI Data Science already supports OCI Ampere A1 shapes elsewhere in the platform. Oracle and Ampere have partnered to optimize the llama.cpp inference framework for Ampere Arm64 CPUs, and customers can use a service-managed container with the optimized llama.cpp for model inferencing in AI Quick Actions.

Now introducing: Fine-tune and deploy Llama 3.2 models on OCI Data ...

Meta’s Llama models have become the go-to standard for open large language models (LLMs). Oracle Cloud Infrastructure (OCI) Data Science already supports Llama 2, 3, and 3.1 models, even on CPUs. In a step forward for AI development and deployment, OCI Data Science now supports Llama 3.2 through AI Quick Actions and the Bring Your Own Container (BYOC) feature.