Monday, September 9, 2019

#729 OIC AQ adapter


















Queue Setup in Oracle Advanced Queuing


The first step was to set up the queue in Advanced Queuing (AQ).

CREATE TYPE Message_typ AS OBJECT (
  subject  VARCHAR2(30),
  text     VARCHAR2(80));
/

EXECUTE DBMS_AQADM.CREATE_QUEUE_TABLE ( -
  queue_table        => 'objmsgs80_qtab', -
  queue_payload_type => 'Message_typ');

EXECUTE DBMS_AQADM.CREATE_QUEUE (queue_name => 'msg_queue', queue_table => 'objmsgs80_qtab');
EXECUTE DBMS_AQADM.START_QUEUE (queue_name => 'msg_queue');
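
Not in the original post, but as a quick sanity check you can confirm the queue was created and started by querying USER_QUEUES - both flags should read YES once START_QUEUE has run.

-- optional sanity check: confirm the queue is enabled for enqueue and dequeue
SELECT name, enqueue_enabled, dequeue_enabled
  FROM user_queues
 WHERE name = 'MSG_QUEUE';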

I then created a procedure to enqueue a message -

CREATE OR REPLACE PROCEDURE P_AQ_ENQ AS
  enqueue_options     dbms_aq.enqueue_options_t;
  message_properties  dbms_aq.message_properties_t;
  message_handle      RAW(16);
  message             Message_typ;
BEGIN
  message := Message_typ('NC MESSAGE', 'Gruess Gott von AQ');
  dbms_aq.enqueue(queue_name         => 'msg_queue',
                  enqueue_options    => enqueue_options,
                  message_properties => message_properties,
                  payload            => message,
                  msgid              => message_handle);
  COMMIT;
END;
/
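
For completeness, here is a rough sketch of the dequeue side - essentially what the OIC AQ adapter does for us when it consumes a message. This procedure is not part of the post's integration; it is just a way to test the queue directly from the database. Note the default wait means the call blocks until a message arrives.

CREATE OR REPLACE PROCEDURE P_AQ_DEQ AS
  dequeue_options     dbms_aq.dequeue_options_t;
  message_properties  dbms_aq.message_properties_t;
  message_handle      RAW(16);
  message             Message_typ;
BEGIN
  -- blocks until a message is available (default wait is DBMS_AQ.FOREVER)
  dbms_aq.dequeue(queue_name         => 'msg_queue',
                  dequeue_options    => dequeue_options,
                  message_properties => message_properties,
                  payload            => message,
                  msgid              => message_handle);
  dbms_output.put_line('subject: ' || message.subject || ', text: ' || message.text);
  COMMIT;
END;
/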

Create the Integration in OIC

The use case is simple: dequeue the message and write it to a file.


























The AQ getMsg endpoint is configured as follows -


























I set Tracking -















I now execute the PL/SQL procedure to enqueue a message -
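
For reference, the equivalent call from SQL*Plus is just an anonymous block (the commit happens inside the procedure) -

BEGIN
  P_AQ_ENQ;
END;
/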













I check my FTP directory -













I check out the Monitoring/Tracking screen -













Simple and succinct.




Wednesday, September 4, 2019

#728 OIC CI/CD with Flexagon

A comprehensive Oracle partner offering for CI/CD for Oracle Integration Cloud.
Check it out here

















Tuesday, September 3, 2019

#727 OIC - Integrations leveraging Process for Error Handling Part 1

Here is a simple example of leveraging Process for human intervention in error handling.

I have an integration that uses the connectivity agent to write to an on-premises file.
The use case is simple - a JSON request contains customer details, and these are simply written to a file.






































CreateCustomer is the action that invokes the File Adapter.

The Global Fault Handler is configured as follows -



















As you can see, the fault handler is calling a process.
This process will display the error and the customer payload.
The use case here: the person to whom this task is assigned views the error and, if possible, takes corrective action.

In my simple example the connectivity agent is down. The assignee restarts the agent and then re-submits the customer data to the integration.

Now to a test -


As you can see, the request to OIC has timed out.

Here is the Instance Tracking in OIC -




















































I log in to Process Workspace and see the following task -







































The error message is clear and to the point -

No response received within response time out window of 260 seconds. Agent may not be running, or temporarily facing connectivity issues to Oracle Integration Cloud Service.

I re-start the agent -













I then click Re-Submit -

























Customer data is processed and the file is written -










OK, this may not be the most appropriate use case for the process user, so how about one that is?

I have a DB table - Customers -







Working with the customer data from the previous example, we will insert a new record into the CUSTOMERS table, mapping custnr to cust_id.
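
The DDL for the table is not shown here, so the following is only a minimal sketch - the column names and sizes are assumptions based on the mapping (custnr to cust_id) and on the NAME length error that shows up later.

-- hypothetical DDL; column names and sizes are assumptions
CREATE TABLE customers (
  cust_id  NUMBER PRIMARY KEY,
  name     VARCHAR2(20)   -- deliberately small: an oversized name raises ORA-12899
);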

The integration has been amended as follows -

I added a Scope for the DB insert, and I also added a call to our error handling process in the Scope Fault Handler -






















I test with the following payload -












This returns an HTTP 200 - OK.

I check in OIC Monitoring -
















However, when I view this -

























This is caught by the Scope Fault Handler -




















Ergo, the process has been called.

I check Workspace -

























Apologies for the error message in German, but such is life!
The name value is too large for the column.
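
To confirm which column the value overflowed, and what its maximum length is, a quick look at the data dictionary helps (a sketch, assuming the table is called CUSTOMERS) -

SELECT column_name, data_type, data_length
  FROM user_tab_columns
 WHERE table_name = 'CUSTOMERS';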

Now I can fix this error and re-submit.


























Now the issue is that two files have been written - the file write succeeded on the original invocation, and the re-submission ran the whole integration, including the file write, again.










So let's change the integration somewhat and leverage only a Global Fault Handler - no more Scope.















I test with the following payload -












This time I get an HTTP 500 response in Postman -


























The Process has been called - I fix the name length issue and re-submit -

























Integration completes successfully -


















File is written -


















DB is updated -












Tuesday, August 20, 2019

#726 OIC - using the Google Calendar adapter

A simple example of the above -













Prerequisites



The first step is to complete the prerequisites defined in the Google adapter documentation here

Navigate to https://console.developers.google.com/apis

Click on Enable APIs and Services












Select Google Calendar -













Once the API is enabled, navigate to Credentials and create credentials for a web application.












Credentials include -

Client ID
Client secret

You also need to set the redirect URI to

https://yourOIC:443/icsapis/agent/oauth/callback























Create the OIC Integration 

Create the connection, leveraging the client ID, client secret, etc.




















Note the Scope: this can be set to either -
https://www.googleapis.com/auth/calendar or
https://www.googleapis.com/auth/calendar.readonly


This simple integration creates an event in my Google Calendar -

























Here is the configuration of the adapter -





















I map the following request fields -




















The response mapping is as follows -

Note: I have filtered on the mapped fields -















I activate and test -












The response is as follows -











I consult my Google Calendar for September 20th -






















The Google Calendar API docs are here