Friday, May 7, 2021

#850 OIC May 21 Release New Features - ATP Bulk Data Import

To begin, a BIG thank you to my colleague Prakash M. here.

This scenario covers the bulk load of data into ATP from an FTP server. I've got a file full of orders and need to get them into my DB ORDERS table ASAP. With the May 21 release, the OIC ATP adapter has been enhanced to do just this.

The diagram below shows the flow -

OIC reads the file via the FTP adapter. Then the ATP adapter does the magic - it stores the file in my OCI Object Storage bucket and then loads the data from there into ATP via the DBMS_CLOUD.COPY_DATA procedure.
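To give a feel for what the adapter is doing for me, a hand-rolled load in ATP itself would look roughly like the following - the credential name, region, namespace and object name here are placeholders, not what the adapter actually generates -

BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'ORDERS',
    credential_name => 'OCI_CRED',
    file_uri_list   => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mytenancy/b/bucketNC/o/orders.csv',
    format          => '{"type":"csv", "skipheaders":"1"}'
  );
END;
/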


 

So here are all the parts - I use the bucket bucketNC.

OCI Object Storage.






Here is my DB table - 







Here is the orders data - this is a .csv file.
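If you want to follow along, a table along these lines will do the job - the columns and sample rows here are purely illustrative, your ORDERS table will no doubt differ -

create table orders (
  order_id    number,
  customer    varchar2(100),
  order_date  date,
  amount      number(10,2)
);

-- matching csv content, header row included:
-- ORDER_ID,CUSTOMER,ORDER_DATE,AMOUNT
-- 1,ACME Corp,2021-05-01,99.50
-- 2,Globex,2021-05-02,250.00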

So let's get those orders from the csv file into the ORDERS table.


Firstly to OCI...

The ATP adapter has been augmented to allow connectivity over OCI - there is a new security option available - JDBC With OCI Signature. Configuring this option requires some information from your OCI setup -

Here are the keys I will use - please refer to my previous post on OCI for details on how to generate keys.


















Apart from the private key, I will also require the following (the SQL sketch after this list shows how these values hang together) -
  • Tenancy OCID
  • Compartment OCID
  • User OCID
  • API Fingerprint
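These are essentially the same values you would feed into DBMS_CLOUD.CREATE_CREDENTIAL if you were wiring up object storage access in the DB by hand - the OCIDs, fingerprint and key below are placeholders, not real values -

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'OCI_CRED',
    user_ocid       => 'ocid1.user.oc1..aaaa...',
    tenancy_ocid    => 'ocid1.tenancy.oc1..aaaa...',
    private_key     => 'MIIEvQIBADANBg...',   -- key content only, no BEGIN/END lines
    fingerprint     => 'a1:b2:c3:d4:...'
  );
END;
/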

Now to OIC... 

ATP adapter configuration -





















I save and test - 








I now leverage this connection in an integration  -





Here is the configuration of the ATP Invoke - BulkLoadOrders

























































Now to define the input file format - 



















I leave the Record Delimiter blank - please see this post from Prakash for a detailed description of these options, as well as the Advanced options -

























Now to the mapping - here I simply pass the file reference from the FTP read -








Note the format field in the target - 











This allows me to make formatting changes on the fly.
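I'm assuming this maps through to the same format options DBMS_CLOUD.COPY_DATA understands, so something along these lines (values are just examples) would let you skip a header row, change the delimiter or set a date format -

format => '{"delimiter":",", "skipheaders":"1", "dateformat":"YYYY-MM-DD"}'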

Now essentially that's it - 

I activate and test - 
























Note the OPERATION_ID returned - more about that later.

I check out the ATP Table - 







Just to prove to those sceptics out there that OCI Storage is being used...
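You can also see this from the DB side - DBMS_CLOUD can list what is sitting in the bucket (credential and URI below are placeholders again) -

select object_name, bytes, last_modified
  from dbms_cloud.list_objects(
         'OCI_CRED',
         'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mytenancy/b/bucketNC/o/');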




















I activate and test again - deleting the DB rows beforehand.
































I check out the Bucket - 





















So the file is in the bucket, and the orders are in the DB -










Check out the DB schema - a couple of new tables appear -
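These are the log and bad file tables that COPY_DATA creates per load operation. USER_LOAD_OPERATIONS tells you their actual names, and they are worth a look if any rows get rejected - the table names below are just examples -

select id, logfile_table, badfile_table
  from user_load_operations;

select * from copy$5_log;   -- load log for operation 5
select * from copy$5_bad;   -- rejected rows, if any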






































Check out the following in the DB - the id here is the OPERATION_ID returned by the invoke earlier -

select type, status, table_name, rows_loaded, file_uri_list 
from user_load_operations 
where id = 5;








I also did a load test with a file containing 3 million orders.
The 3 million orders were loaded within 2 minutes. 




