Tuesday, March 17, 2026

#1129 - OCI Log Analytics MCP Server

Introduction

As you all know, I'm a great fan of OCI Log Analytics, especially with respect to OIC. Before delving into the MCP aspect, let's have a quick recap of what Log Analytics offers the OIC administrator -

  • OOTB OIC-focused dashboards, especially useful for OIC fleet management, based on -
    • OCI Service Metrics for Oracle Integration
    • OIC Activity Stream logs
  • Ability to create one's own specific dashboards for OIC
  • Ad hoc querying of the OIC Activity Stream, via Log Explorer
  • Ability to trace requests over multiple OCI services, e.g.
    • OCI API Gateway to OIC
  • Long term storage of OIC logs
  • etc. etc.

Net, net, a very compelling addition to the OIC Admin toolkit. OCI Log Analytics also offers LoganAI, the ability to query your logs using natural language. Please check out my blog post on this topic before continuing.

This cool feature has now been augmented with the ability to connect to Log Analytics from your favourite AI Assistant.

A big thanks to my ORCL colleague, Rishabh G. His repo is available here.

As he states, this is an "MCP server that connects AI assistants to OCI Log Analytics. Query, analyze, and explore your logs through natural language. No query language expertise required."

Log Analytics MCP Server

This is running on a compute node - 

Claude as my AI Assistant

As you can see, I've added the MCP server to my Claude config.
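For reference, the entry goes in Claude Desktop's claude_desktop_config.json under mcpServers. The server name, command, and arguments below are purely illustrative (my server runs on a remote compute node); check Rishabh's repo README for the actual launch command:

```json
{
  "mcpServers": {
    "oci-log-analytics": {
      "command": "ssh",
      "args": ["opc@<compute-node-ip>", "<command-to-start-the-mcp-server>"]
    }
  }
}
```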

I'm running Claude on Windows, hence my use of PuTTY.

Before firing up Claude, let's enter a couple of orders in OIC. These run with trace set to Audit.

The integration will behave as follows - 
  • All orders for an iCar will be rejected, as we don't stock such a product.
  • Every order for an iBoat takes longer, due to the extra processing required for this cool product.
  • Orders over 10K for customer NiallC will be rejected, as this breaches his credit limit.
  • All other orders will be approved and processed.
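The rules above amount to a simple decision function; here's a minimal Python sketch of them (the function and field names are my own, not the actual OIC integration):

```python
def process_order(order_nr, customer, product, price):
    """Mimic the demo integration's routing rules (sketch only)."""
    if product == "iCar":
        return "rejected"          # we don't stock iCars
    if customer == "NiallC" and price > 10000:
        return "rejected"          # breaches NiallC's credit limit
    if product == "iBoat":
        return "approved-slow"     # extra processing for iBoats
    return "approved"

print(process_order(100, "Anyone", "iCar", 500))  # rejected
```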
I will execute 7 orders -
  • orderNr 100 - order for an iCar - this will fail
  • orderNr 101 - order for an iBoat - this will take longer to process
  • orderNr 102 - order for an iBike - approved and processed
  • orderNr 103 - order for an iBoat - this will take longer to process
  • orderNr 104 - order for an iScooter - approved and processed
  • orderNr 105 - order for an iBike - approved and processed
  • orderNr 106 - order over 10K from NiallC - rejected
The tracking fields are as follows - 


Now to Claude -

I now make many API calls over a period of a few hours, using SoapUI.
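If you want to script the same load instead, the request bodies are easy to generate. A sketch - the customer names (other than NiallC) and the payload shape here are my own invention, not the real OIC trigger schema:

```python
import json

# The 7 test orders from the list above (hypothetical customer names).
ORDERS = [
    (100, "CustomerA", "iCar",     500),
    (101, "CustomerB", "iBoat",   1200),
    (102, "CustomerC", "iBike",    800),
    (103, "CustomerD", "iBoat",   1500),
    (104, "CustomerE", "iScooter", 300),
    (105, "CustomerF", "iBike",    900),
    (106, "NiallC",    "iBike",  12000),
]

def build_payload(order_nr, customer, product, price):
    """Serialise one order as a JSON request body."""
    return json.dumps({"orderNr": order_nr, "customer": customer,
                       "product": product, "price": price})

payloads = [build_payload(*order) for order in ORDERS]
print(len(payloads))  # 7
```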


Friday, March 13, 2026

#1128 - Pushing OIC Activity Stream Logs directly to OCI Log Analytics

Introduction

OCI Log Analytics is a great value-add to OIC monitoring, especially with respect to OIC fleet management.
Previously, we needed to send the logs via OCI Logging and, from there, via a Connector to OCI Log Analytics. Some customers also experienced data truncation when pushing the activity stream logs via OCI Logging.

Net, net, the above problem is solved; as you will see, we've simplified this completely.

Just go to the service instances page in the OCI console, select your service instance, and then check out the Settings section on the Details tab -

Now for a bit of reading - check out the docs here.

We have to ensure we have a log group for OIC logs in OCI Log Analytics - 

Click on Administration

We need to create a policy to allow uploads from OIC.

For this we need the OIC client id; this can be retrieved from the Oracle Cloud Services app, which was auto-created when you provisioned your instance.

Click on the link -

Click OAuth configuration and then scroll down to General Information; here you will find the client id -

We will add this client id to a dynamic group, which we will now create - 
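Something along these lines should do it - note the matching rule below is just my sketch of the idea (match the resource whose id equals the OIC client id); check the docs linked above for the exact rule syntax to use:

```
ALL {resource.id = '<your-OIC-client-id>'}
```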


Now to the Policy which will grant permissions to the dynamic group.

allow dynamic-group DynamicGroup to {LOG_ANALYTICS_LOG_GROUP_UPLOAD_LOGS} in compartment LogGroup_Compartment
allow dynamic-group DynamicGroup to {LOG_ANALYTICS_SOURCE_READ} in tenancy
allow dynamic-group DynamicGroup to use loganalytics-ondemand-upload in tenancy
allow dynamic-group DynamicGroup to {LOG_ANALYTICS_LOG_GROUP_UPLOAD_LOGS} in tenancy

That's the prep work done; all we need to do now is enable this on the OIC service instance page -

Click Enable and enter the OCID of your target OCI Log Analytics log group.

Sanity Test


Here's the data in OCI Log Analytics Log Explorer

Note, I ran the integration in Debug mode.


Summa Summarum

We have made it even easier to ingest OIC activity stream logs to OCI Log Analytics. It's just so easy; no excuse for not trying it out!


Thursday, March 12, 2026

#1127 - n8n AI Agent using OIC Tools


n8n is a really cool workflow automation platform out of Berlin.

Here I'm kicking the tyres with my ubiquitous Order Processing agent example.

Again, this demo is simple, from a functional perspective; I'm simply concentrating on the mechanics here.


Simple n8n Demo

This workflow will process a purchase order; if all is OK, it will be posted to one of our ERPs - NetSuite for Irish customers, SAP for German customers.
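The routing rule itself is trivial; as a Python sketch (the dictionary and the fallback value are mine - the demo only covers the two countries):

```python
def target_erp(country):
    """Pick the ERP for a valid order based on customer country."""
    routing = {"Ireland": "NetSuite", "Germany": "SAP"}
    # Fallback for other countries is an assumption on my part.
    return routing.get(country, "manual-review")

print(target_erp("Ireland"), target_erp("Germany"))  # NetSuite SAP
```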

Let's check out the AI Agent configuration - 

AI Agent Configuration

I set the usual system prompt for this order processing use case - 

---- Guidelines ------

### Order Processing Steps

You will receive input, such as - Process the following Order  - OrderNr is 123, Customer is NiallC, Product is iBike, Price is 2345, Country is Ireland, Customer Email is niall.commiskey@oracle.com.


Tool policy: Call each tool at most once. Never re-check with the same tool. If anything is missing/unclear, ask the user and stop. 

Here are your instructions, including which tools to use.

#### 0. Duplicate Order Check

Use the GET_ORDER tool to check if an order already exists. 

If it does, then use the NOTIFY_CUSTOMER tool to inform the customer, set the subject to 'Duplicate Order Received', then stop processing. 

If the order is not found, then continue to the Validate Order step.

#### 1. Validate Order

Use the VALIDATE_ORDER tool to check if an order passes our validation rules. 

If the order is invalid, then use the NOTIFY_CUSTOMER tool to inform the customer. 

Also include the "message" returned by VALIDATE_ORDER. If an order is invalid, processing stops immediately; in such cases, detail exactly why this order failed validation.

If an order is valid, then proceed to Check Inventory.

#### 2. Check Inventory

Use the CHECK_PRODUCT_INVENTORY tool to check if the product is in stock. The tool returns inStock equal to true or false.

If the product is out of stock, use the NOTIFY_CUSTOMER tool to inform the customer. Ensure the "message" passed to the NOTIFY_CUSTOMER tool is relevant and friendly. Remember, we are razor-focused on customer satisfaction.

If an order's product is not in stock (inStock = false), processing stops immediately; in such cases, detail exactly why this order's processing has been terminated. Otherwise, continue to Create Order in one of our ERP systems.

#### 3. Create Order in one of our ERP systems

Use the CREATE_ORDER_SAP tool to create orders for German customers.

Use the CREATE_ORDER_NETSUITE tool to create orders for Irish customers.

#### 4. Email Customer detailing the final outcome of the order process for valid orders.

Use the NOTIFY_CUSTOMER tool to email the customer, detailing the order number from the ERP system used.

#### 5. Display Order Details 

Finally, display all the order details in the agent log, including the order nr returned by the ERP system. Also end with a quote from Marcus Aurelius.

---

Simple enough!
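The prompt's control flow amounts to a short-circuiting pipeline. Here's a Python sketch of it - the tool names come from the prompt above, but the callables, return shapes, and the `tools` dictionary are stand-ins for the real OIC integrations exposed over MCP:

```python
def run_agent(order, tools):
    """Short-circuiting pipeline mirroring steps 0-4 of the system prompt.

    `tools` maps tool names to callables; stubbed here for illustration.
    """
    # 0. Duplicate check
    if tools["GET_ORDER"](order["orderNr"]):
        tools["NOTIFY_CUSTOMER"](order, subject="Duplicate Order Received")
        return "duplicate"
    # 1. Validation
    valid, message = tools["VALIDATE_ORDER"](order)
    if not valid:
        tools["NOTIFY_CUSTOMER"](order, subject="Order Invalid", message=message)
        return "invalid"
    # 2. Inventory
    if not tools["CHECK_PRODUCT_INVENTORY"](order["product"]):
        tools["NOTIFY_CUSTOMER"](order, subject="Out of Stock")
        return "out-of-stock"
    # 3. ERP routing: SAP for German customers, NetSuite for Irish
    create = tools["CREATE_ORDER_SAP"] if order["country"] == "Germany" \
        else tools["CREATE_ORDER_NETSUITE"]
    erp_order_nr = create(order)
    # 4. Final confirmation email
    tools["NOTIFY_CUSTOMER"](order, subject="Order Confirmed",
                             message=f"ERP order nr: {erp_order_nr}")
    return "processed"
```

Each branch either stops with a customer notification or falls through to the next step, which is exactly the behaviour the prompt asks of the agent.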

OpenAI Chat Model Configuration

I enter my API key for OpenAI and select the model I want.

MCP Client Configuration

Here I enter my OIC project MCP server url along with a valid Bearer token.
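Under the hood, the MCP client simply calls the endpoint with the Bearer token in the Authorization header, using JSON-RPC 2.0 messages. A hand-rolled illustration (the URL and token below are placeholders, and n8n does all of this for you):

```python
import json
import urllib.request

MCP_URL = "https://<oic-host>/<project-mcp-endpoint>"  # placeholder
TOKEN = "<bearer-token>"                               # placeholder

def mcp_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request of the kind MCP uses over HTTP (sketch)."""
    body = json.dumps({"jsonrpc": "2.0", "id": req_id,
                       "method": method, "params": params or {}}).encode()
    return urllib.request.Request(
        MCP_URL, data=body,
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"})

req = mcp_request("tools/list")  # e.g. ask the server which tools it offers
```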


Test the Workflow

Agent begins processing the order -

5 tools were invoked during processing - we can see these in the Logs panel above.

I validate in OIC Observability - 

I check my email - 

Summa Summarum

n8n is a really cool product - simple to use, with a great UI and monitoring capabilities.

All I need to do is put the OIC stamp on this -