Thursday, July 22, 2021

#871 OIC August 21 Release New Features update


Check out Antony's blog post here

Wednesday, July 21, 2021

#870 OIC Monitoring and Logging Analytics - adding Business Data to Dashboards

Integration Insight is THE solution for business-user-facing dashboards on top of your integrations and processes. However, you can also surface business / payload data in Logging Analytics. The example below is again based on the OIC Activity Stream logs. 

Now we do suggest turning off logging for your production integrations; however, there may be reasons to keep logging, such as compliance, e.g. I need to prove when order nr 123 was processed. The OIC LOG action will still write to the OIC Activity Stream, even when flow logging has not been enabled. 

Here is my simple order processing integration - 


I activate, without enabling logging - 

I test - 

Check out the Activity Stream - 

I see the relevant log entry in Logging Analytics - 

Note, there is only 1 log entry for this integration flow - a small overhead. The admin needs a dashboard that shows which products are being ordered - in my case, different brews. So I need to parse the message and extract 'keg of Spalter Dunkel'. Spalter Dunkel is a great beer from my local brewery here in Franconia. Click here for the beer.

I need to add an extended field to the Logging Analytics Source for this value - 

Here is the regex - 

The extract expression contains a little cosmetic flaw ("Schönheitsfehler") - I have included the order nr, as you may have noticed. I really need to change the LOG action to log just the product.
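The extract expression on the Source is essentially a regular expression with a capture group. Here is a minimal Python sketch of the idea, assuming a hypothetical message format written by the LOG action - the real expression lives on the Logging Analytics Source:

```python
import re

# Hypothetical Activity Stream message written by the LOG action.
message = "Order nr 123 processed: keg of Spalter Dunkel"

# Capture everything after the colon - anchoring on ': ' avoids
# dragging the order nr into the extracted field value.
match = re.search(r":\s*(?P<product>.+)$", message)
if match:
    print(match.group("product"))  # keg of Spalter Dunkel
```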

But, it is an imperfect world, so onwards ("weiter, weiter")!

I have used the pre-seeded field - Message Info - as the field to hold the product name. 

The Log Explorer query is very simple -
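The actual query is in the screenshot; as a sketch, assuming the Activity Stream log source name and the Message Info field mapping used above, it would be along the lines of -

```
'Log Source' = 'OCI Integration Activity Stream' | stats count by 'Message Info'
```

The source and field names here are assumptions - use the names from your own Logging Analytics setup.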

I add a couple more orders -

I add this to an existing Dashboard - 

There are other Visualizations available, for example - 

1. Line - 

2. Word Cloud - 

3. Tree Map - 

4. Sunburst -


There are others as well - try them out!

Great stuff - now time for a pint - 

Monday, July 19, 2021

#869 OIC Monitoring and Logging Analytics - more steps towards Fleet Management

The previous post mentioned 2 OIC instances in two different regions. A typical fleet management requirement would be to check whether both OIC instances are up and running. Here's where OCI APM - Application Performance Monitoring - can help -

It contains a feature called Synthetic Monitoring. The APM docs tell us that Synthetic Monitoring helps in simulating a path through the application that a user would normally take, ensuring that the user can transition through the different web pages in that path smoothly. This helps in recognizing application performance issues before the end user experiences them.

I created 2 simple ping-style integrations, one in my London and one in my Phoenix OIC instance.
These can then be called by APM - 

The configuration of such a Monitor is as follows - 

I have set this to invoke these "ping" integrations every 5 minutes.
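Under the hood, such a monitor just calls the integration endpoint and checks for a healthy response. A rough Python sketch of that check - the URL is a placeholder, and a real call would also need OIC authentication (OAuth / basic), which is omitted here:

```python
import urllib.request

def is_up(status: int) -> bool:
    """A 2xx response counts as 'up' for availability purposes."""
    return 200 <= status < 300

def ping(url: str, timeout: float = 10.0) -> bool:
    """Invoke the 'ping' integration and report whether it answered."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return is_up(resp.status)
    except Exception:
        return False

# Placeholder endpoint - the real one comes from the integration's
# endpoint metadata after activation.
# ping("https://myoic-lon.example.ocp.oraclecloud.com/ic/api/integration/v1/flows/rest/PING/1.0/ping")
```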

I can check the results via the History link - 

Metrics from APM are available for use in Logging Analytics. Here is my existing Dashboard - 

I edit this Dashboard and add a metrics based source -

I drop Availability - then edit as follows - 

Now I do the same for the Phoenix monitor - 

Now I deactivate the 'ping' integration on the London instance for ca. 30 minutes - 
The Dashboard is as follows - 
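The Availability number on the widget is simply the percentage of successful monitor runs in the selected window. A quick sketch of the arithmetic, with made-up sample data:

```python
# Hypothetical sample: 12 runs at 5-minute intervals (one hour),
# with 6 failures while the London 'ping' integration was deactivated.
runs = [True] * 6 + [False] * 6

availability = 100.0 * sum(runs) / len(runs)
print(f"Availability: {availability:.1f}%")  # Availability: 50.0%
```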


Thursday, July 15, 2021

#868 OIC and Logging Analytics - steps towards fleet management

This post covers the scenario of aggregating the logs of OIC instances from different regions within the one tenancy. In my case, OIC1 is in UK South (London) and OIC2 is in US West (Phoenix). So how best to approach this?

We've seen in a previous post how easy it is to push the OIC Activity Stream logs to OCI Logging Service. That is the starting point for us.

So back to our use-case - this is the high level flow - 

OIC1 London Activity Stream Logs to OCI Logging - 

Logs to Object Storage - 

Replication to Phoenix Object Storage - 

Create Log Collection Rule - following documentation here

Step 1 is to assign the permissions required to collect logs from Object Storage - 

allow service loganalytics to read buckets in compartment yourcompartment

allow service loganalytics to read objects in compartment yourcompartment

allow group yourGroup to manage all-resources in compartment OICPMCompartment where any {request.permission='LOG_ANALYTICS_OBJECT_COLLECTION_RULE_CREATE', request.permission='LOG_ANALYTICS_LOG_GROUP_UPLOAD_LOGS', request.permission='LOG_ANALYTICS_ENTITY_UPLOAD_LOGS', request.permission='LOG_ANALYTICS_SOURCE_READ', request.permission='BUCKET_UPDATE', request.permission='LOG_ANALYTICS_OBJECT_COLLECTION_RULE_DELETE'}

Step 2 - create the Rule via OCI CLI -

oci log-analytics object-collection-rule create --from-json <json_file_name> --namespace-name <namespace_name>

My json file is called create.json -


The file contents - 
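The screenshot holds the actual contents; as a sketch, a create.json for an object collection rule would look something like the following - all names and OCIDs below are placeholders, and the exact keys should be checked against the CLI's generated JSON skeleton:

```json
{
  "name": "oic-activity-stream-collection",
  "compartmentId": "ocid1.compartment.oc1..aaaa...",
  "osNamespace": "mytenancynamespace",
  "osBucketName": "oic-logs-bucket",
  "logGroupId": "ocid1.loganalyticsloggroup.oc1.phx.aaaa...",
  "logSourceName": "OCI Integration Activity Stream",
  "collectionType": "HISTORIC_LIVE"
}
```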

The response - 

The setup to push the OIC Activity Stream logs from OIC2 in Phoenix is as described in the previous post - enable logging at the OIC instance level, then create a Service Connector to push the logs to OCI Logging Analytics. 

Ok, so now let's execute some requests to both OIC instances -

10 requests to AA-Hello-World in Phoenix.

10 requests to AA-HiWelt in London -

I check out the logs in OCI Logging Analytics - 

I can also see a high level comparison between the 2 OIC instances - 

Later, I do a check comparing the data with that from the OIC Monitoring console -

As you can see, my Phoenix instance processed 449 flows in the last 15 minutes.
The data from OIC Monitoring for the same period - 

A small difference in the timings, due to the time it took me to take the screenshots!

Wednesday, July 14, 2021

#867 - 1 million Page Views - Thank You!


I couldn't resist the opportunity to bring the count over the 1 million mark. But, of course, it's only a number; unfortunately, it does not directly correlate to € or $. 

Net net, I hope my blog provides you with value-add in respect of Oracle Integration and OCI. I am looking forward to the next million mark.

Again, thanks for using this resource, it spurs me on to continue posting.