Monday, March 30, 2015

#388 Stream Explorer --> Patterns - Eliminate Duplicates and Missing Event Patterns.

Introduction


According to the ORCL docs -

A pattern provides you with a simple way to explore event streams based on common business scenarios.

A pattern is a template of an Oracle Event Processing application that already has the business logic built into it. The visual representation of the event stream varies from one pattern type to another based on the key fields you choose.

So which patterns are provided?








































The names are essentially self-explanatory, but, if required, you can find in-depth descriptions in the user's guide here

I'll content myself with giving you some examples here.

Eliminate Duplicates Pattern

I have a file that contains lots of credit card transactions, many of them duplicates.











My key here is the combination of CreditCardNr, MerchantNr and TransactionValue. In my particular scenario,
that combination has to be unique.
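Under the hood, eliminating duplicates on a composite key amounts to remembering the keys already seen and letting only the first occurrence through. Here is a minimal plain-Java sketch of that idea - not the CQL Stream Explorer generates - with illustrative CSV columns in the order CreditCardNr, MerchantNr, TransactionValue and no quoted fields:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class EliminateDuplicates {

    // Builds the composite key CreditCardNr|MerchantNr|TransactionValue
    // from one CSV line of the form: ccNr,merchantNr,txValue
    static String key(String csvLine) {
        String[] f = csvLine.split(",");
        return f[0] + "|" + f[1] + "|" + f[2];
    }

    // Keeps the first occurrence of each composite key, drops the rest.
    static List<String> dedupe(List<String> lines) {
        Set<String> seen = new LinkedHashSet<>();
        List<String> out = new ArrayList<>();
        for (String line : lines) {
            if (seen.add(key(line))) {
                out.add(line);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> tx = List.of(
            "4111-1111,M42,100.00",
            "4111-1111,M42,100.00",   // duplicate - dropped
            "4111-1111,M42,250.00");  // same card+merchant, new amount - kept
        System.out.println(dedupe(tx).size()); // prints 2
    }
}
```

The sample card and merchant numbers are made up; the point is simply that two rows only count as duplicates when all three key fields match.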


I create a Stream based on a CSV file.

















Note: I do not create an Exploration.
















Now create the Pattern, Eliminate Duplicates -
Specify the Stream.
Specify the key.















Here is a closeup of the settings -









Play around with the Time Window settings to see how they affect the output.
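The Time Window governs how long a key keeps suppressing later duplicates: the same key seen again inside the window is dropped, but once the window has passed it is treated as a fresh event. That behaviour can be approximated in plain Java (a sketch only - the key format and the 10-second window are illustrative, not what Stream Explorer generates):

```java
import java.util.HashMap;
import java.util.Map;

public class WindowedDedupe {

    private final long windowMillis;
    private final Map<String, Long> lastSeen = new HashMap<>();

    WindowedDedupe(long windowMillis) {
        this.windowMillis = windowMillis;
    }

    // Returns true if the event should pass through, i.e. the key
    // has not been seen within the last windowMillis.
    boolean accept(String key, long timestampMillis) {
        Long prev = lastSeen.get(key);
        if (prev != null && timestampMillis - prev < windowMillis) {
            return false; // duplicate inside the window - suppress
        }
        lastSeen.put(key, timestampMillis);
        return true;
    }

    public static void main(String[] args) {
        WindowedDedupe w = new WindowedDedupe(10_000); // 10 second window
        System.out.println(w.accept("4111|M42|100.00", 0));      // true
        System.out.println(w.accept("4111|M42|100.00", 5_000));  // false
        System.out.println(w.accept("4111|M42|100.00", 15_000)); // true
    }
}
```

Widening the window in the sketch suppresses more repeats, which mirrors what you see in the Live Output Stream when you grow the pattern's Time Window.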















The test file is available here

Missing Event Pattern

Here I have a simple sensor heartbeat scenario -





I then create the Pattern Missing Event based on this stream.

Here are my results -














 

To recap, my input file is as follows -



Further down the file, we are missing some heartbeats from sensor 7.
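The Missing Event pattern fires when a sensor that has been reporting stops doing so within the expected interval. The core of that detection can be sketched like this - a simplification of what the pattern actually does, with made-up sensor ids and interval:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MissingHeartbeat {

    // Flags every sensor whose latest heartbeat is older than
    // intervalMillis at time 'now'.
    static List<String> missing(Map<String, Long> lastBeat,
                                long now, long intervalMillis) {
        List<String> out = new ArrayList<>();
        for (Map.Entry<String, Long> e : lastBeat.entrySet()) {
            if (now - e.getValue() > intervalMillis) {
                out.add(e.getKey());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Long> lastBeat = new HashMap<>();
        lastBeat.put("sensor7", 0L);       // last beat long ago
        lastBeat.put("sensor1", 58_000L);  // beat just now
        // At t=60s with a 10s expected interval, only sensor7 is stale.
        System.out.println(missing(lastBeat, 60_000, 10_000)); // prints [sensor7]
    }
}
```

Note the sketch can only flag sensors it has heard from at least once - which is exactly why the rogue sensors below behave the way they do.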

















I've also added some rogue sensors (10 and 11) -
















Just a note on the Exploration name - it has defaulted to Exploration3.
























Looks better now -









The test input file is available here


#387 Stream Explorer --> Source - JMS / Target - JMS

Introduction

In this blog post I detail the setup and creation of a JMS Stream.
I will create a JMS client that writes credit card TXs to a queue.
This queue will then be the source for my Stream -










JMS Queue Setup

I need to create the queue and the connection factory via the WLS console.







JMS Client

This is a simple Java JMS client that creates a Map Message to log a credit card transaction.
The input fields are –
  • ccNr – credit card number
  • cardholder
  • merNr – merchant number
  • merCountry – merchant country
  • txAmt – transaction amount.

Note how these map to the message –
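For reference, the mapping can be modelled in plain Java as an ordered map of field name to value - a sketch only, with made-up sample values; in the real client each entry becomes a typed call on the javax.jms.MapMessage, e.g. msg.setString("ccNr", ccNr) and msg.setDouble("txAmt", txAmt):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CreditCardTxMapping {

    // Models the entries the JMS client sets on the MapMessage.
    // Field names follow the input list above.
    static Map<String, Object> toMessageMap(String ccNr, String cardholder,
                                            String merNr, String merCountry,
                                            double txAmt) {
        Map<String, Object> m = new LinkedHashMap<>();
        m.put("ccNr", ccNr);
        m.put("cardholder", cardholder);
        m.put("merNr", merNr);
        m.put("merCountry", merCountry);
        m.put("txAmt", txAmt);
        return m;
    }

    public static void main(String[] args) {
        Map<String, Object> m = toMessageMap("4111-1111", "N. Bourbaki",
                                             "M42", "IE", 99.50);
        System.out.println(m.keySet());
        // prints [ccNr, cardholder, merNr, merCountry, txAmt]
    }
}
```

Keeping the field names identical on both sides matters: these are the names you will type in again when defining the shape of the Stream in Stream Explorer.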


















The full JMS client code is available here

Test the JMS Client


 










I have exposed the JMS client as a web service and then called
it from a SOA Composite.







This is essentially a BPEL process that calls the JMS client n times.
















The JDev App containing the JMS Client and the SOA project is available here

Now I have 10 messages in the queue.

Create the Stream Explorer artifacts

First I create the Stream -



Specify the connection info.
















Now manually define the shape of the message being received -



















Now define the Exploration -















Execute the composite to ensure you have JMS messages.
Then check the Live Output Stream for the transactions -















Create the JMS Target Queue via the WLS console


Here I create a queue called CCTXOutQueue























I now add a target to the Exploration.


















Publish the Exploration -











Test again - I execute the composite specifying 10 transactions -

I monitor the out queue -

















Tuesday, March 24, 2015

#386 Stream Explorer - Targets --> Type: CSV File

According to the ORCL docs -

Every exploration must be configured with a target to send the details downstream. Unless the exploration is published, you cannot send the events downstream to the configured target.

Ok, so Targets are required to send data downstream, e.g. via EDN to a SOA composite, or simply via a file. Before we look at Targets in detail, a word about the difference between draft and published.

By default, Stream Explorer explorations are created as drafts.

Again, to quote the docs -

The draft exploration is not available to other users of the application. An exploration moves to a published state when it is published. When a published exploration is deleted or unpublished, the exploration moves to a draft state.

Now that we have cleared that up, let's move on to Targets -

Let's look at our simple exploration again -











I want to write this data to a file so I set up the following Target -
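A CSV target simply appends a header plus one delimited line per output event. Stream Explorer does this for you once the target is published, but the equivalent plain-Java write looks roughly like this (file path and column names are illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class CsvTarget {

    // Writes a header row plus one comma-separated line per event.
    static void write(Path file, List<String[]> rows, String... header)
            throws IOException {
        StringBuilder sb =
            new StringBuilder(String.join(",", header)).append('\n');
        for (String[] r : rows) {
            sb.append(String.join(",", r)).append('\n');
        }
        Files.writeString(file, sb.toString());
    }

    public static void main(String[] args) throws IOException {
        Path out = Files.createTempFile("sxTarget", ".csv");
        write(out,
              List.of(new String[] {"4111-1111", "M42", "100.00"}),
              "CreditCardNr", "MerchantNr", "TransactionValue");
        System.out.println(Files.readAllLines(out).size()); // prints 2
    }
}
```

Note the sketch does no quoting or escaping - fine for simple numeric fields like these, but real CSV output needs to handle embedded commas.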









I now have to Publish the Exploration -













Note the Exploration has status Published.

















Now, where is the output file?









Here is the data -



#385 Stream Explorer - Preferences

In the previous posts I have covered data manipulation via summaries, group by, filters, etc.
I also demonstrated the use of a simple time window - analysing orders over a rolling 5-minute time window.

Now I would like to take a step back and look at the Stream Explorer Preferences and what we can do with them.











Under General - we can set the default page after login -












Changing this setting to Catalog results in me being redirected straight to the Catalog list after login.















View Mode allows us to select between Browser and Projector.

Selecting Projector will apply a different skin, appropriate when viewing the app on a projector.
















You may notice the subtle difference.

Notifications -












You are usually notified via a popup box in the UI when you create or amend artifacts in Stream Explorer. If you do not want to see these messages, de-select the options above.

Catalog -

Lets us configure how we want to view the Catalog.

















I want to see my Favourites first -









Exploration / Live Output Stream -
















Here I can set the number of rows to be displayed.
Note: the default is 100.