Tuesday, September 3, 2019

#727 OIC - Integrations leveraging Process for Error Handling Part 1

Here is a simple example of leveraging Process for human intervention in error handling.

I have an integration that uses the connectivity agent to write to an on-premises file.
The use case is simple - the JSON request contains customer details, and these are simply written to a file.

CreateCustomer is the action that invokes the File Adapter.
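Conceptually, the CreateCustomer step does something like the following. This is only a sketch - the field names (custnr, name) and the target file name are assumptions for illustration; the real write happens on the on-premises host via the connectivity agent and the File Adapter.

```python
import json
import os
import tempfile

# Hypothetical customer payload; custnr/name are assumed field names
# based on the mapping described later in the post.
customer = {"custnr": "1001", "name": "Niall"}

# Stand-in for the on-premises target directory and file name.
out_dir = tempfile.mkdtemp()
out_file = os.path.join(out_dir, "customer_1001.json")

# The integration's CreateCustomer action effectively does this write.
with open(out_file, "w") as f:
    json.dump(customer, f)

with open(out_file) as f:
    print(json.load(f)["custnr"])  # → 1001
```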

The Global Fault Handler is configured as follows -

As you can see, the fault handler is calling a process.
This process will display the error and the customer payload.
The use case here - the person to whom this task is assigned views the error and, if possible, takes corrective action.
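Conceptually, the fault handler hands two things to the process: the fault details and the original customer payload. The structure below is a hypothetical sketch of that shape - the real mapping is done in the OIC mapper, not in code, and the actual field names may differ.

```python
# Hypothetical shape of the data mapped into the human task;
# placeholder values stand in for the runtime fault information.
task_payload = {
    "error": {
        "code": "...",     # fault code from the fault object
        "reason": "...",   # fault reason
        "details": "...",  # fault details shown to the task assignee
    },
    "customer": {"custnr": "1001", "name": "Niall"},
}

# The task assignee sees both parts and decides whether to re-submit.
print(sorted(task_payload))  # → ['customer', 'error']
```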

In my simple example the connectivity agent is down. The task assignee restarts the agent and then re-submits the customer data to the integration.

Now to a test -

As you can see, the request to OIC has timed out.

Here is the Instance Tracking in OIC -

I log in to Process Workspace and see the following task -

The error message is succinct and to the point -

No response received within response time out window of 260 seconds. Agent may not be running, or temporarily facing connectivity issues to Oracle Integration Cloud Service.

I restart the agent -

I then click Re-Submit -

Customer data is processed and the file is written -

OK, this may not be the most appropriate use case for the process user.
So how about one that is?

I have a DB table - Customers -

Working with the customer data from the previous example - we will insert a new record into the customer DB, mapping custnr to cust_id.
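The mapping can be sketched as follows, using an in-memory SQLite table as a stand-in for the on-premises Customers table. Only cust_id is named in the post; the name column and the value lengths are assumptions for illustration.

```python
import sqlite3

# In-memory stand-in for the Customers DB table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customers (cust_id TEXT PRIMARY KEY, name TEXT)")

# Incoming request field custnr is mapped to the cust_id column,
# as described in the post.
request = {"custnr": "1001", "name": "Niall"}
conn.execute(
    "INSERT INTO Customers (cust_id, name) VALUES (?, ?)",
    (request["custnr"], request["name"]),
)

row = conn.execute("SELECT cust_id, name FROM Customers").fetchone()
print(row)  # → ('1001', 'Niall')
```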

The integration has been amended as follows -

I added a scope for the DB Insert - I also added a call to our error handling process in the Scope Fault Handler -

I test with the following payload -

This returns an HTTP 200 - OK.

I check in OIC Monitoring -

However, when I view this -

This is caught by the Scope Fault Handler - because the scope handler absorbs the fault, the integration itself completes normally, hence the HTTP 200 above.

Ergo, the process has been called.

I check Workspace -

Apologies for the error message in German, but such is life!
The gist: the name value is too large for the column.
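This is the classic "value too large for column" style of DB error. A minimal sketch of the check the re-submitting user effectively performs, assuming a hypothetical 10-character limit on the name column (the actual column length is not stated in the post):

```python
# Hypothetical column limit; the real Customers.name length is unknown.
NAME_MAX_LEN = 10

def fits_column(value: str, max_len: int = NAME_MAX_LEN) -> bool:
    """Return True if value would fit in the (assumed) VARCHAR column."""
    return len(value) <= max_len

print(fits_column("Niall"))                  # → True
print(fits_column("A very very long name"))  # → False
```

The task assignee shortens the name so it fits, then re-submits.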

Now I can fix this error and re-submit.

Now the issue is that 2 files have been written - the file write had already succeeded on the first invocation, and re-submitting re-runs the whole integration from the start, so the file is written a second time -

So let's change the integration somewhat and leverage only a Global Fault Handler - no more Scope Fault Handler.

I test with the following payload -

This time I get an HTTP 500 response in Postman -

The Process has been called - I fix the name length issue and re-submit -

Integration completes successfully -

File is written -

DB is updated -
