Wednesday, April 29, 2026

#1144 - OIC Agents powered by OCI GenAI

Introduction

The 26.04 release adds a new feature: the ability to use one of the models available in OCI GenAI as the LLM for your OIC Agents.

I will go through the setup, step by step.
Let's begin by checking that our OIC instance has access to OCI GenAI. I do this by creating a simple integration that uses the OCI GenAI native action.

Validate OIC has access to OCI GenAI


Mapping is as follows -

Type = 'TEXT'
Text = request field 'prompt'
Role = 'USER'
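The mapping above amounts to a single USER message in the chat payload. Here is a rough sketch - the field names are illustrative of the generic chat message shape, not necessarily the exact wire format OIC emits:

```python
def build_chat_message(prompt: str) -> dict:
    """Build one chat message mirroring the mapping above:
    Type = 'TEXT', Text = the request's 'prompt' field, Role = 'USER'."""
    return {
        "role": "USER",
        "content": [{"type": "TEXT", "text": prompt}],
    }

# Example: the prompt used later in the post.
msg = build_chat_message("who was Cathal Brugha")
```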

I run the integration with prompt set to - who was Cathal Brugha -

The only extra prep required was to create a policy allowing my OIC instance to use the GenAI family in the relevant compartment (the one in which OIC is running).

Here's an example of such -
Allow dynamic-group yourDynamicGroup to manage generative-ai-family in compartment yourCompartment

 
The dynamic group will have entries such as - 
resource.id = 'yourOIC Instance ClientID'

The client id can be found in the entry for your OIC instance, created under Oracle Cloud Services -

You can filter by 'Integration', if you have a lot of entries - 

Once you find your instance, click on OAuth configuration

Scroll down and you will see the Client Id - 

Copy this, including the _APPID suffix.




Check out the models available to you

Please check the following page when using OCI GenAI as the LLM provider for your OIC Agent -

My OIC instance is in Phoenix, so I see the relevant GenAI region is Chicago (ORD).

Now you can open your OCI console and navigate to GenAI -

Ensure you are in the correct region (in my case, Chicago), then click on Chat. Select the compartment where OIC is located -

Here is the list of models available to you.

As I'm in the us-chicago-1 region, I see models from Cohere, Meta, OpenAI, Google Gemini and xAI Grok. The Google Gemini models are not hosted in an Oracle data center, so I'll not choose one of them. I go for openai.gpt-oss-120b.
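For context, a chat call against one of these pre-trained models uses on-demand serving with the model id. Here is a minimal sketch of such a request payload - the field names follow the Generative AI generic chat format, so treat this as illustrative rather than the exact structure OIC builds; the compartment OCID is a hypothetical placeholder:

```python
def build_chat_details(compartment_id: str, model_id: str, prompt: str) -> dict:
    """Sketch of a chat request: on-demand serving mode plus a
    generic-format chat request carrying a single USER message."""
    return {
        "compartmentId": compartment_id,
        "servingMode": {"servingType": "ON_DEMAND", "modelId": model_id},
        "chatRequest": {
            "apiFormat": "GENERIC",
            "messages": [
                {"role": "USER",
                 "content": [{"type": "TEXT", "text": prompt}]}
            ],
        },
    }

details = build_chat_details(
    "ocid1.compartment.oc1..exampleCompartment",  # hypothetical compartment OCID
    "openai.gpt-oss-120b",
    "who was Cathal Brugha",
)
```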

I ask the usual question - who was Cathal Brugha -

Which models are actually hosted on OCI? 

Data residency is often a reason for choosing OCI GenAI models. However, as just mentioned, the Google Gemini models are not hosted on OCI. Check out this page for more information -

I check out the Google Gemini models -
Ergo, if you use any of these models, your data is passed to GCP, i.e. it leaves your OCI region.

Another point to note - the US regions leverage Chicago (us-chicago-1). In Europe it's a bit different - EU data centers use Frankfurt, while the UK uses LHR (uk-london-1).
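The routing just described can be captured in a small lookup - a sketch based only on the examples in this post (Phoenix to Chicago, EU to Frankfurt, UK to London); the official docs have the authoritative list:

```python
# Illustrative OIC-home-region -> GenAI-serving-region mapping,
# based only on the regions mentioned in this post.
GENAI_SERVING_REGION = {
    "us-phoenix-1": "us-chicago-1",      # US regions leverage Chicago (ORD)
    "eu-frankfurt-1": "eu-frankfurt-1",  # EU data centers use Frankfurt
    "uk-london-1": "uk-london-1",        # UK uses LHR
}

def genai_region(oic_region: str) -> str:
    """Return the GenAI region serving a given OIC home region."""
    return GENAI_SERVING_REGION[oic_region]
```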

Data Privacy 

You can check out the data handling in GenAI here


Net net - no data is stored or shared if you use models hosted on OCI.

Back to our case at hand - let's use the xai.grok-4 model.

OIC Agents using GenAI LLMs

I'm back in my order processing demo project; here I'll create a new Agent Pattern - 

Note the pre-seeded settings - 

There is an input field on the right; this is for the model type. Here I enter openai.gpt-oss-120b.


Some other artifacts have also been created. One is the integration that orchestrates agent actions -

You can treat this as a black box, but it's good to know why it is there.

A new lookup has also been created - 

This you need to configure - 

I enter my compartment id and destination region - in my case, ORD (us-chicago-1).

These values are used by the aforementioned integration when invoking the LLM.
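For reference, the region value from the lookup determines the GenAI inference endpoint, which follows the usual OCI service endpoint pattern. A sketch - the helper function is hypothetical, but the endpoint format is the documented one for the Generative AI inference service:

```python
def genai_inference_endpoint(region: str) -> str:
    """Build the OCI Generative AI inference endpoint for a region,
    e.g. us-chicago-1 -> inference.generativeai.us-chicago-1.oci.oraclecloud.com"""
    return f"https://inference.generativeai.{region}.oci.oraclecloud.com"

endpoint = genai_inference_endpoint("us-chicago-1")
```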

Now to the Agent - 

This agent is a clone of the one I already described in previous posts, so I will not go into the gory details again.

I run the new agent - 


Summa Summarum

There are multiple reasons for using the OCI GenAI based LLM. They include - 
  • common billing - costs come out of the same universal credits pot as OIC
  • data privacy
  • ease of use
Do try it out!
