Navigating the frontier: Key considerations for developing a generative AI integration strategy for the enterprise

February 9, 2024 | 6 minute read
Jyotika Singh
Principal Applied Scientist, OCI Gen AI
Sid Padgaonkar
Sr. Director - Product Management (Gen AI) - Strategic Customers

Generative AI adoption is trending, and Fortune 500 leaders are eager to create adoption plans for their enterprises. A huge part of adoption is educating customers. This post is the first of a six-part series that equips you to harness generative AI’s potential effectively. In this blog post, we dive into key considerations for generative AI adoption, such as identifying use cases, data readiness, legacy system integration, cultural shifts, and assessing the risks associated with data, undesired outcomes, and unrealistic expectations, all in service of a smooth, value-driven adoption. We also discuss prioritizing internal use cases and using the insights they provide to help minimize external risks.

In the ever-evolving landscape of enterprise technology, the race to harness the potential of generative AI is on. For the enterprise especially, integrating generative AI into operations is a strategic imperative. However, business leaders should weigh some key considerations before they chart a path for the enterprise adoption of generative AI.

Assess whether generative AI is applicable or required

The widespread adoption of generative AI and large language models (LLMs) across industries naturally raises the question of whether implementing these technologies can benefit your business. While the potential of generative AI is significant, it is essential to recognize that its effectiveness depends on how and why it is employed. The key to realizing its full potential lies in identifying suitable use cases and objectives for its application. To illustrate this point, let’s explore some practical examples.

Example 1

Company X has an extensive collection of product reviews in their database. Their objective is to better understand customer reactions and make informed decisions about future product development. They want to analyze the sentiment of each review and thus gain valuable insights into their products.

Developing a custom generative AI solution for this task might be unnecessary and potentially excessive. Considering other machine learning (ML) models that have already been developed for this problem is often a helpful start. Because sentiment analysis is such a common problem, many open source tools and cloud-based services, such as Oracle Cloud Infrastructure (OCI) Language, can automate the process of analyzing sentiment in text.
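As a rough illustration, the following minimal sketch shows how an off-the-shelf, pre-trained open source sentiment model can cover this use case without any custom generative AI work. The example reviews are placeholders, and OCI Language offers a comparable managed alternative.

# Minimal sketch: sentiment analysis with an off-the-shelf open source model.
# The reviews below are illustrative placeholders.
from transformers import pipeline

# Load a pre-trained sentiment classifier; no custom model training required.
sentiment = pipeline("sentiment-analysis")

reviews = [
    "The new headphones sound fantastic and the battery lasts all day.",
    "Shipping took three weeks and the product arrived scratched.",
]

for review, result in zip(reviews, sentiment(reviews)):
    # Each result contains a label (POSITIVE/NEGATIVE) and a confidence score.
    print(f"{result['label']:>8} ({result['score']:.2f}): {review}")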

Example 2

Company Y supports chat communications with customers, requiring human representatives to compile a synopsis of each chat conversation, which is sent back to the customer to outline the nature of their inquiry and the subsequent resolution.

This use case is an excellent candidate for using AI models to automate the process. These conversations have many different components, such as questions, answers, and general discussion. Given the complexity of such chat dialogues, simpler methods and models are unlikely to perform as well in generating a pointed summary. Because LLMs work well with use cases like free-form Q&A, information extraction, and summarization, automating this process can save both time and resources.
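As a hedged sketch of how this automation might look, the snippet below assumes a hypothetical generate_text function standing in for whichever hosted LLM service you use; the prompt wording and sample chat are illustrative only.

# Minimal sketch: summarizing a support chat with an LLM.
# generate_text is a hypothetical stand-in for your LLM provider's API.

def generate_text(prompt: str) -> str:
    """Placeholder: call your LLM provider's completion endpoint here."""
    raise NotImplementedError("Wire this up to your LLM service of choice.")

def summarize_chat(transcript: list[str]) -> str:
    prompt = (
        "Summarize the following customer support conversation in 2-3 sentences, "
        "covering the customer's inquiry and how it was resolved.\n\n"
        + "\n".join(transcript)
    )
    return generate_text(prompt)

chat = [
    "Customer: My invoice shows two charges for the same order.",
    "Agent: I see the duplicate. I've refunded the second charge; it should post in 3-5 days.",
    "Customer: Great, thank you!",
]
# print(summarize_chat(chat))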

Identify use cases and define clear business objectives

To determine the areas where generative AI can offer opportunities, a thorough understanding of your business needs and the data available to you is vital. We recommend defining a clear business objective and identifying the specific challenges you’re looking to address with generative AI. Consider positioning your generative AI strategy as a catalyst for achieving your business goals. A great starting point is knowing a few examples of what generative AI can help you with and ideating from there, such as the following:

  • Streamlining existing processes: Explore if generative AI can be the solution for tasks that are currently being done manually or are time-consuming.
  • Amplifying current operations: Generative AI can help provide valuable insights from your existing data and merge them with external data sources to strengthen your analytics.
  • Developing new offerings: To determine whether generative AI is suitable for your new business ideas, define your use cases based on your business objectives and the data assets available to you.

The following examples can assist you in this use case identification process. If your intended application aligns with any of the provided use cases, or if you can create a new application for your business that resembles them, generative AI might be a worthwhile choice.

Figure: Example use cases for generative AI.

Assess risks and benefits

Adopting generative AI can yield many benefits, such as process automation, revenue generation, cost savings, improved efficiency, and improved customer experience. Being aware of the risks and setting the right expectations is equally important.

We recommend assessing the state of your data and security, and identifying and addressing any issues that arise during model inference.

While the potential of generative AI is substantial, overhyping the technology and setting unrealistic expectations carry real risks. Leaders might face pressure to deliver transformative results immediately, leading to disappointment if the technology doesn’t live up to the hype. Other important factors include awareness of data provenance and privacy, as well as the considerable resource investments and computation costs involved.

Gauge data readiness and quality

Access to the underlying data and its quality can influence the success of AI initiatives. We recommend that leaders conduct a comprehensive assessment of their data infrastructure. Siloed or poor-quality data can often impede the progress of generative AI initiatives.

Evaluate integration with legacy systems and technical debt

The true potential of generative AI can only be realized when it’s integrated into your core business operations. Many organizations operate with legacy systems that aren’t optimized for integration with generative AI solutions. Integrating these systems can introduce technical challenges, increase complexity, and result in performance bottlenecks that dilute the value of your generative AI solution.

Consider a phased implementation approach to minimize risks

Trying out generative AI for a smaller internal project can be a great first step to take before rolling it out on a larger scale. This smaller trial can help you see how this technology might benefit your business. Learning user behavior from internal use cases can help minimize risks.

At Oracle, we first implemented LLMs using a retrieval-augmented generation (RAG) approach to improve our internal search engine, MyOracle Search, which assists employees in searching through thousands of internal resources. The RAG approach expands the LLM’s knowledge beyond the data it was trained on, and it’s a great choice for reaping the benefits of an LLM on custom documents.

Before the introduction of generative AI, the search engine operated on lexical similarity, finding keyword matches between search terms and the available resources and presenting users with the relevant documents to browse. With the integration of generative AI, employees can now enter more complex and flexible queries, because the system no longer relies solely on keywords. The AI model generates specific answers to these queries by compiling relevant information from different resources, improving general search functions and offering users the advantages of both search types.
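For readers curious about the mechanics, the following minimal sketch outlines the general RAG pattern described above: embed the query and documents, retrieve the most similar documents, and ask the LLM to answer from that context. The embed and generate_text functions are hypothetical placeholders, not the actual MyOracle Search implementation.

# Minimal RAG sketch: retrieve the most relevant documents by embedding
# similarity, then ask an LLM to answer using only those documents.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder: call your embedding model here."""
    raise NotImplementedError

def generate_text(prompt: str) -> str:
    """Placeholder: call your LLM endpoint here."""
    raise NotImplementedError

def answer(query: str, documents: list[str], top_k: int = 3) -> str:
    doc_vectors = np.stack([embed(d) for d in documents])
    query_vector = embed(query)
    # Cosine similarity between the query and every document.
    scores = doc_vectors @ query_vector / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(query_vector)
    )
    # Keep the top_k most similar documents as grounding context.
    context = "\n\n".join(documents[i] for i in np.argsort(scores)[::-1][:top_k])
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )
    return generate_text(prompt)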

Based on the success of this functionality for this smaller project, we can now expand to our other search engines more confidently, both internal and external to the company.

Audit the enterprise’s ability to absorb generative AI adoption

Introducing generative AI often necessitates a cultural shift and the development of new skill sets across the organization. We recommend that leaders invest in change management initiatives that prepare employees for the introduction of generative AI, including upskilling existing teams, hiring new talent with expertise in AI, and adapting business practices and operations to accommodate generative AI. The objective is to prepare the organization to incorporate generative AI services in a way that drives the business forward and aligns with its strategic objectives.

Plan for continuous monitoring and improvement

Generative AI isn’t a one-and-done implementation. It’s a continuously evolving technology that requires monitoring and improvement. We recommend that leaders establish mechanisms for ongoing performance evaluation and monitor generative AI models for accuracy and efficacy. Creating feedback loops and business processes can help gather insights and audit inferences at critical decision junctions. A commitment to continuous monitoring and improvement ensures that the generative AI integration remains adaptive and continues to deliver value over the long term.
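As one hedged example of such a feedback loop, the sketch below logs each model response together with an optional user rating so that accuracy and efficacy can be reviewed over time. The file name, field names, and rating scheme are illustrative assumptions, not a prescribed design.

# Minimal sketch of a feedback loop: log every model response alongside any
# user rating so the model's behavior can be audited and reviewed over time.
# All names here are illustrative, not part of any specific product.
import json
import time

LOG_PATH = "genai_inference_log.jsonl"

def log_inference(prompt: str, response: str, user_rating: int | None = None) -> None:
    record = {
        "timestamp": time.time(),
        "prompt": prompt,
        "response": response,
        "user_rating": user_rating,  # e.g., thumbs up (+1) / thumbs down (-1) from the UI
    }
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")

def helpfulness_rate() -> float:
    """Share of rated responses that users marked as helpful."""
    with open(LOG_PATH) as f:
        rated = [r for r in map(json.loads, f) if r["user_rating"] is not None]
    return sum(r["user_rating"] > 0 for r in rated) / max(len(rated), 1)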

Conclusion

Generative AI possesses substantial potential beyond mere buzzwords, providing advantages like increased revenue, cost reduction, improved resource utilization, and the capability to undertake new tasks for business expansion. By incorporating important considerations and risks into the enterprise strategy, the integration of generative AI tools and LLMs can not only propel businesses forward but also safeguard against falling behind in the rapidly advancing landscape.

Read part 2 of this six-part blog series: “Comprehensive Tactics for Optimizing Large Language Models (LLMs) for Your Application”

If you’re new to Oracle Cloud Infrastructure, try Oracle Cloud Free Trial, a free 30-day trial with US$300 in credits.

Jyotika Singh

Principal Applied Scientist, OCI Gen AI

Jyotika is an accomplished Data Science leader and practitioner, AI book author, speaker, researcher, and mentor. She currently works as Principal Applied Scientist at Oracle, where she builds Machine Learning and Generative AI solutions. She has shared her insights as a speaker at over 30 conferences and events, and her work in Data Science has led to the invention of multiple patents utilized by esteemed companies. Her prior roles as Director of Data Science at Placemakr and VP of Data Science at ICX Media have allowed her to contribute to various industry verticals and have led to a successful acquisition, resulting in substantial revenue gains through practical applications of data science and Natural Language Processing (NLP). Jyotika recently authored a book with CRC Press/Taylor and Francis titled 'Natural Language Processing in the Real World.' She is also an open-source contributor and has created Python libraries, including pyAudioProcessing, which has been utilized in research for government bodies, businesses, and several educational institutions across the globe. Jyotika actively volunteers for promoting diversity in STEM and mentoring aspiring individuals. Her contributions have been acknowledged through several awards, including one of the Top 50 Women of Impact in 2023 and 2024 by Women Impact Tech, and being named among the Top 100 most Influential People in Data and Analytics in 2022 by DataIQ, along with other accolades.

Sid Padgaonkar

Sr. Director - Product Management (Gen AI) - Strategic Customers

Sid Padgaonkar is a Senior Director with OCI's Strategic Customers Group, focused on Gen AI product incubation, outbound product management, and GTM strategy.

