Wednesday, September 30, 2020

#802 - Creating Journal Entries in Netsuite via OIC

 My first real foray into the magical world of bookkeeping.

What is a journal entry?

Apparently journal entries are used to record business transactions. And remember, the books must always balance - so here's an example of a sales entry -

I sell my Hare of the Dog T-Shirt on credit, so I need to debit accounts receivable and credit sales.

Ask your accountant if you need to know more.

Net net, I can create a Journal Entry in Netsuite via the OIC Netsuite adapter.



















Again, I am not a Netsuite or bookkeeping expert, so my first step is to create a Journal Entry in Netsuite itself and then look at the XML version.

























Now I have an idea of the structure.
I create a new app-driven integration in OIC with a REST Trigger - request payload as follows -














This seems to me the minimum required:
credit account 2 and debit account 4.
I got these account numbers from the XML representation of the Netsuite Journal Entry.
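
Purely as an illustration - the field names below are my own assumptions, not necessarily what I defined on the trigger - a minimal request along these lines would do:

# Illustrative only - field names are assumptions; the actual payload is whatever
# you define on the REST trigger. Two balancing lines: debit account 4, credit account 2.
journal_entry_request = {
    "journalEntry": {
        "memo": "Hare of the Dog T-Shirt sale",
        "lines": [
            {"account": "4", "type": "debit",  "amount": 25.00},
            {"account": "2", "type": "credit", "amount": 25.00},
        ],
    }
}

print(journal_entry_request)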

I map as follows - 
















I add an if condition before mapping debit/credit, so each line populates only the relevant field.
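
The condition itself lives in the OIC mapper, not in code; just to spell out the branching logic, using the field names from my assumed payload above:

# Plain-Python illustration of the debit/credit branching - the real thing is an
# if condition in the OIC mapper. Field names follow my assumed payload.
def map_line(line: dict) -> dict:
    target = {"account": line["account"]}
    if line["type"] == "credit":
        target["credit"] = line["amount"]   # credit lines populate only the credit field
    else:
        target["debit"] = line["amount"]    # debit lines populate only the debit field
    return target

print(map_line({"account": "2", "type": "credit", "amount": 25.00}))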

Now to testing - 
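
If you prefer to test from outside the OIC console, a simple POST to the activated integration's endpoint does the job - the URL, credentials and payload below are placeholders:

# Placeholder endpoint, credentials and payload - adjust to your OIC instance
# and to however you defined the trigger.
import requests

OIC_ENDPOINT = (
    "https://my-oic-instance.example.com"
    "/ic/api/integration/v1/flows/rest/CREATE_NS_JOURNAL_ENTRY/1.0/journalEntries"
)

payload = {
    "journalEntry": {
        "memo": "Hare of the Dog T-Shirt sale",
        "lines": [
            {"account": "4", "type": "debit", "amount": 25.00},
            {"account": "2", "type": "credit", "amount": 25.00},
        ],
    }
}

response = requests.post(OIC_ENDPOINT, json=payload, auth=("oic_user", "oic_password"))
response.raise_for_status()
print(response.status_code, response.text)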





















Looks good, except for the Permissions Error -

I give myself this permission in Netsuite 




















I re-run the integration -




















Validate in Netsuite -



Tuesday, September 29, 2020

#801 Oracle Integration (OIC) Recipes - HCM Directory Synchronisation

 




The 2 previous posts dealt with Technical Accelerators; now to the recipes.
Recipes are best-practice implementations of common use cases. They provide a very quick start to implementing these.

Let's look at the above recipe; as you can see, I have already installed it.












It installs as a package - 











This is what it contains - 






Here is the Integration - I can either clone and edit or edit directly -
I clone it -



Note the Keywords here - you can have a max of 10, so delete those that do not apply to you.

If you do not do this, then you will see the following error when trying to save the integration -


 
Thanks for pointing this out, Harris!

Ok, so I delete the superfluous keywords -




Open the integration - 









Let's look at the scope - 

















Easy enough to follow - this integration is a scheduled job that runs according to your schedule. It invokes the Atom feed from HCM to retrieve new employees entered after a particular date - dateAtomLastRun.

It uses the REST adapter to invoke an HCM REST API to retrieve the new employee details.
Then the FTP adapter is invoked to write the new employees to a file.

This final step can, of course, be replaced by your specific functionality.
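
To make the flow concrete, here is the same logic sketched outside OIC - the host, endpoint paths, credentials and field names are all placeholders/assumptions, and the feed handling is deliberately simplified:

# Conceptual sketch only - host, endpoint paths, credentials and field names are
# assumptions; the recipe does all of this declaratively inside OIC.
import csv
import io
import xml.etree.ElementTree as ET
import requests

HCM_BASE = "https://my-hcm-instance.example.com"
AUTH = ("hcm_user", "hcm_password")
ATOM_NS = {"atom": "http://www.w3.org/2005/Atom"}
date_atom_last_run = "2020-09-01T00:00:00.000Z"   # plays the role of dateAtomLastRun

# 1. Invoke the HCM new-hire Atom feed for entries updated after the last run
feed = requests.get(
    f"{HCM_BASE}/hcmCoreApi/atomservlet/employee/newhire",   # path is an assumption
    params={"updated-min": date_atom_last_run},
    auth=AUTH,
)
feed.raise_for_status()

# 2. The recipe then calls an HCM REST API per entry for the full employee
#    details (the feed invoke only returns header data) - omitted here; the
#    Atom entry title and updated timestamp stand in for that detail data.
rows = [["Title", "Updated"]]
for entry in ET.fromstring(feed.text).findall("atom:entry", ATOM_NS):
    rows.append([
        entry.findtext("atom:title", default="", namespaces=ATOM_NS),
        entry.findtext("atom:updated", default="", namespaces=ATOM_NS),
    ])

# 3. Write the new employees out as a CSV - the recipe hands this file to the FTP adapter
buffer = io.StringIO()
csv.writer(buffer).writerows(rows)
print(buffer.getvalue())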


So what do I need to do for this to work?

Firstly - complete the connections that came with the recipe -


  






































Now to the FTP connection -
I first create a directory on my FTP server - I am using DriveHQ here.



















Ok, all connections are tested and saved - let's look at the Lookup and the Library -














Change emailalias to suit your needs.

The Library function is used when setting the value of dateLastRun for the HCM Atom Feed.
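
OIC library functions are JavaScript; just to show the kind of logic involved (the recipe's actual function may well differ), formatting the last-run timestamp for the feed filter boils down to something like this:

# Illustration only - the recipe's library function is JavaScript inside OIC
# and its exact logic may differ.
from datetime import datetime, timezone

def format_last_run(ts: datetime) -> str:
    # e.g. 2020-09-29T00:00:00.000Z - the format assumed for the Atom feed filter
    return ts.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

print(format_last_run(datetime.now(timezone.utc)))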



















Now to the final step, before amending the recipe integration -


Now we can return to the integration -

Note the Schedule Parameter and its default value - adjust as necessary -











SetupVariables











The GetNewHireATomFeed action leverages the HCM connection -

















Again, you may want to make some changes here, e.g. amending values such as the maximum number of entries to be processed.

Now to the newEmployee processing loop -










As already mentioned, the HCM REST API is used here to retrieve the employee details. Remember, the Atom feed invoke has been configured to return only header data - again, something you could change. This action could also be replaced by invoking the REST API via the HCM adapter.

What I do have to change is the FTP invoke - to specify my output directory.
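
For reference, writing the generated file into a specific directory on an FTP server amounts to the following - host, credentials and directory below are placeholders (I'm on DriveHQ, yours will differ):

# Placeholder host, credentials and paths - the FTP invoke in the recipe
# does the equivalent of this.
import io
from ftplib import FTP

csv_bytes = b"Title,Updated\nNew Hire - Jane Doe,2020-09-28T10:00:00Z\n"   # sample content

with FTP("ftp.example.com") as ftp:
    ftp.login("ftp_user", "ftp_password")
    ftp.cwd("/newHires")                                    # output directory (name assumed)
    ftp.storbinary("STOR newHires.csv", io.BytesIO(csv_bytes))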



 









Note that the file format is based on a CSV - again, this is something you can change to suit your requirements.




















That's it - now I can activate and test.






Monday, September 14, 2020

#800 OIC Technical Accelerator - Alert Notifications





















What is it?

A technical accelerator package that allows one to send Alert Notifications from your integrations, based on a variety of parameters - 
  • channels - EMAIL, Pager, JIRA, Custom
  • integrations - different channels/recipients for different integrations etc.
  • Errors/Failure Messages -  different channels/recipients based on the error message thrown etc.
Ergo, this technical accelerator is designed to be called from one or more of your integrations.

What's in it?


The Accelerator contains the following artifacts -

  • integration - Oracle Alerting Service
  • connection - Oracle Alerting Service Invoke
  • lookups - 3 of them - discussed below.






















































Lookups -














The first lookup contains details of who should be informed and how -













The second lookup specifies the channel(s) to be used for each integration -












The third and final lookup allows one to specify the channel, based on error type/number -














Let's look at the Integration -

Anatomy of the Integration


REST Trigger Request Payload -


messageID is the error message id thrown by the integration.
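
From the fields referenced in the rest of this post (messageID, IntgCodeVersion, the instance id), the request looks roughly like this - any field name or value not mentioned explicitly in the post is an assumption on my part:

# Rough reconstruction - treat field names/values not mentioned in the post as assumptions.
alert_request = {
    "IntgCodeVersion": "AA-BIGFILETEST|01.00.0000",   # integration code + version (format assumed)
    "instanceId": "12345",                            # failed instance id
    "messageID": "OSB-380001",                        # error message id thrown by the integration (example)
    "errorMessage": "File not found in source directory",
}

print(alert_request)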















assign_notificationType - creates variables and assigns initial values to some of them.


















Scope by Scope...

InitializeVariables



















Here the variables are set to the request values, where the latter are not null,
e.g.
the integration variable var_integrationcodeversion is set to the request field IntgCode_Version.

The assign action below leverages the Lookup - ORCL-T-GENRIC_ENS_NOTIFICATION_SELECTOR -













to set var_NotificationType, based on the version of the integration, i.e.
request parameter - IntgCodeVersion

Simple stuff!

DetermineNotificationType

This scope sets the var_instanceID and then leverages the Lookup -
ORCL-T-GENRIC_ENS_NOTIFICATION_SELECTOR_MSGID -
to set var_notificationType and var_Subject.
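
Taken together, the two selector lookups give you something like the following as a mental model - the column names and values below are my own assumptions, not the actual lookup contents:

# Mental model only - in OIC this is done with lookups (dvm:lookupValue);
# column names and sample values are assumptions.
NOTIFICATION_SELECTOR = {
    # IntgCodeVersion -> default notification type for that integration
    "AA-BIGFILETEST|01.00.0000": "EMAIL",
    "ORDER-SYNC|01.02.0000": "PAGERDUTY",
}

NOTIFICATION_SELECTOR_MSGID = {
    # error message id -> (notification type, subject) override
    "OSB-380001": ("JIRA", "File processing failure"),
}

def resolve_notification(intg_code_version: str, message_id: str) -> tuple:
    default_type = NOTIFICATION_SELECTOR.get(intg_code_version, "EMAIL")
    return NOTIFICATION_SELECTOR_MSGID.get(message_id, (default_type, "Integration alert"))

print(resolve_notification("AA-BIGFILETEST|01.00.0000", "OSB-380001"))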


Email_Scope -














This scope sets the Email Address(es), leveraging the Lookup -
ORCL-T-GENRIC_ENS_NOTIFICATION_DATA and then sends the email.


PAGERDUTY_Scope -













This scope sets the Pager fields, leveraging the Lookup -
ORCL-T-GENRIC_ENS_NOTIFICATION_DATA and then creates the pager duty incident.

JIRA_Scope - 













This scope sets the JIRA fields, leveraging the Lookup -
ORCL-T-GENRIC_ENS_NOTIFICATION_DATA and then creates the JIRA ticket.


CustomService_Scope -


notificationType="CUSTOM" -
This scope is essentially a placeholder for you to do your own custom processing.
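
So, as a rough picture of how the scopes hang together - one handler per notificationType, with CUSTOM left for you to fill in (the stubs below are mine, not the accelerator's implementations):

# Stub handlers - a rough picture of the per-channel scopes only.
def send_email(subject, body):
    print(f"EMAIL: {subject} - {body}")

def create_pagerduty_incident(subject, body):
    print(f"PAGERDUTY: {subject} - {body}")

def create_jira_ticket(subject, body):
    print(f"JIRA: {subject} - {body}")

def custom_handler(subject, body):
    print(f"CUSTOM: {subject} - {body}")   # your own processing goes here

HANDLERS = {
    "EMAIL": send_email,
    "PAGERDUTY": create_pagerduty_incident,
    "JIRA": create_jira_ticket,
    "CUSTOM": custom_handler,
}

HANDLERS["JIRA"]("Integration alert", "File not found in source directory")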


Leveraging the Technical Accelerator

First step is to clone the base integration -







and then Activate it.

Next step is to review the request payload - 

I will need to pass the above payload to the Alerting Service integration.
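
Within OIC this is simply an invoke of the (activated) Alerting Service integration from my Global Fault Handler; from outside, the equivalent call would be a POST like this - the endpoint URL, credentials and payload values are placeholders:

# Placeholder endpoint and credentials - inside OIC this is just a local invoke
# of the activated Alerting Service integration.
import requests

ALERTING_ENDPOINT = (
    "https://my-oic-instance.example.com"
    "/ic/api/integration/v1/flows/rest/MY_ALERTING_SERVICE_CLONE/1.0/notifications"
)

fault_payload = {
    "IntgCodeVersion": "AA-BIGFILETEST|01.00.0000",
    "instanceId": "12345",
    "messageID": "OSB-380001",
    "errorMessage": "File testFile.xml not found in source directory",
}

response = requests.post(ALERTING_ENDPOINT, json=fault_payload, auth=("oic_user", "oic_password"))
response.raise_for_status()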

Now to the calling integration - 



I will add a Global Fault Handler to this integration later.

The Lookups will need to be amended for my AA-BigFileTest integration - 



Ok, so back to my calling integration - 

I add a Global Fault Handler that calls the Alerting Service -


Mapping -


Mapping Test result -

This integration simply copies a file from one FTP directory to another -
The file is called testFile.xml.


I delete this file from the ftp source directory - this will cause an error.


Here is the output of my test -



Here is the email I received -