3.10.1 Fundamentals of TQL programming

This section covers the TQL language, basic programming tasks, and best practices.

TQL programming rules and best practices

This section covers the rules you must follow when writing TQL applications as well as best practices that will make development easier and more efficient. Refer to the Striim concepts or Glossary if you are unfamiliar with any of the terminology.

Namespaces

Namespaces are logical domains within the Striim environment that contain applications, flows, their components such as sources, streams, and so on, and dashboards.

Every user account has a personal namespace with the same name. For example, the admin user has the admin namespace.

Every Striim application exists in a namespace. For example, if you install an evaluation version using the installer, the PosApp application is in the Samples namespace.

See Using namespaces for more information.

Connections

A Striim application consists of a set of components and the connections between them. Some components may be connected directly, others must use an intermediate stream, as follows:

  • source (OUTPUT TO) > stream > CQ (SELECT FROM)

  • source (OUTPUT TO) > stream > window (OVER)

  • source (OUTPUT TO) > stream > target (INPUT FROM)

  • cache > CQ (SELECT FROM)

  • WActionStore > CQ (SELECT FROM)

  • window > CQ (SELECT FROM)

  • window > CQ (SELECT FROM window, INSERT INTO stream) > stream > window (OVER)

  • CQ (INSERT INTO) > WActionStore

  • CQ (INSERT INTO) > stream > target (INPUT FROM)

Note

The output from a cache or window must be processed by a CQ's SELECT statement before it can be passed to a target. In other words, cache > stream > target and window > stream > target are invalid sequences.
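
For example, a valid window-to-target chain (using hypothetical component names) routes the window's output through a CQ and an intermediate stream:

CREATE CQ PassThroughCQ
INSERT INTO ProcessedStream
SELECT * FROM DataWindow;

CREATE TARGET ProcessedOutput
USING SysOut(name:processed)
INPUT FROM ProcessedStream;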

Joins

Inner joins may be performed implicitly by specifying multiple sources in the FROM clause. For example:

FROM PosData5Minutes p, HourlyAveLookup l

Other joins must be explicitly declared. See CREATE CQ (query).

A join must include bound data, in other words, at least one cache, event table, WActionStore, or window. For example, assuming that StoreOrders is a stream that originated from a source and ZipCodeLookup is a cache, the following is valid:

SELECT s.storeId, z.latVal, z.longVal
FROM StoreOrders s, ZipCodeLookup z
WHERE s.zip = z.zip

When joining events from multiple streams, the CQ's SELECT FROM should reference windows rather than streams. See the LargeRTCheck and ZeroContentCheck flows in the MultiLogApp sample application for examples. Define the windows so that they are large enough to include the events to be joined. If necessary, use the GRACE PERIOD option (see CREATE STREAM) and/or a sorter (see CREATE SORTER) to ensure that the events' timestamps are in sync.
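
As a sketch (with hypothetical stream, window, and field names), a two-stream join might buffer each stream in a time-based window and join the windows in a CQ:

CREATE JUMPING WINDOW OrdersWindow
OVER OrdersStream KEEP WITHIN 5 MINUTE ON orderTime;

CREATE JUMPING WINDOW PaymentsWindow
OVER PaymentsStream KEEP WITHIN 5 MINUTE ON paymentTime;

CREATE CQ MatchOrdersCQ
INSERT INTO MatchedStream
SELECT o.orderId, p.paymentId
FROM OrdersWindow o, PaymentsWindow p
WHERE o.orderId = p.orderId;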

Dependencies

You must create a component before you can reference it in the CREATE statement for another component.

application
  Create before: any other component

flow*
  Create after: its containing application
  Create before: the components it contains

source
  Create before:
    • any window for which its output stream is an input (OVER)
    • any query for which its output stream is an input (SELECT FROM)

type
  Create before:
    • any stream for which it is the type (OF)**
    • any cache for which it is the type (OF)
    • any WActionStore that references it in a CONTEXT OF or EVENT TYPES clause

stream
  Create after: its type**
  Create before:
    • any source for which it is the output (OUTPUT TO)
    • any window for which it is the input (OVER)
    • any query for which it is an input (SELECT FROM)
    • any target for which it is the input (INPUT FROM)

cache
  Create before: any query for which it is an input (SELECT FROM)

window
  Create after: its input stream (OVER)
  Create before: any query for which it is an input (SELECT FROM)

WActionStore
  Create after: its types (CONTEXT OF and EVENT TYPES)
  Create before:
    • any query for which it is an input (SELECT FROM)
    • any query for which it is the output (INSERT INTO)

CQ (query)
  Create after:
    • all input streams (SELECT FROM)
    • all input caches (SELECT FROM)
    • all input windows (SELECT FROM)
    • its output stream (INSERT INTO)
  Create before: any target that references its output stream

target
  Create after: its input stream (INPUT FROM)

*When an application contains only one flow, it does not need to be explicitly declared by a CREATE FLOW statement. See PosApp for an example of an application with a single flow and MultiLogApp for an example with multiple flows.

**When a stream is created automatically by referencing it in a source's OUTPUT TO clause, it will use the built-in type associated with the source's adapter, so it is not necessary to manually create the type first.
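
For example, here is a sketch of a flow (with hypothetical names) declared in dependency order: the type before the stream that uses it, the stream before the window over it, and the window before the CQ that selects from it:

CREATE TYPE OrderType(
  orderId String KEY,
  amount double
);
CREATE STREAM OrderStream OF OrderType;

CREATE JUMPING WINDOW OrderWindow
OVER OrderStream KEEP 100 ROWS;

CREATE CQ OrderTotalsCQ
INSERT INTO OrderTotalsStream
SELECT SUM(amount)
FROM OrderWindow;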

Component names

Names of applications, flows, sources, and so on:

  • must contain only alphanumeric characters and underscores

  • may not start with a numeric character

  • must be unique within the namespace

Components cannot be renamed, but in the Flow Designer you can copy a component and give the copy a new name.

Grouping statements and commenting

TQL supports SQL-style comments. For example:

-- The PosApp sample application demonstrates how a credit card
-- payment processor might use Striim to generate reports on current
-- transaction activity by merchant and send alerts when transaction
-- counts for a merchant are higher or lower than average for the time
-- of day.

CREATE APPLICATION PosApp; ...

To make your TQL easier to read, we recommend grouping statements by CQ or flow and preceding each group with explanatory comments as necessary. See the TQL source code for the Sample applications for programmers for examples.

Including one TQL file in another

You can include one TQL file in another by reference. For example, if part of PosApp were in another file, you could include it in the main file using:

@Samples/PosApp/PosApp_part_2.tql;

Specify the path relative to the Striim program directory or as an absolute path from root. Otherwise, the include will fail with a "file not found" error.

Debugging

When debugging, it may be helpful to run Striim as a process (see Running Striim as a process) so you can use SysOut targets to view the output of sources and CQs (see Writing a simple TQL application for examples). This will also let you quickly restart with different options.

See Reading log files and Changing the log level for instructions on viewing errors.

Writing a simple TQL application

When developing a TQL application, add major components (sources, CQs, WActionStores, targets) one step at a time, verifying that the output is correct before moving on to the next component.

Create a sample data set

When initially coding an application, it is best to use a small sample of the data set for testing and debugging. Once you know all the parts of the application are working, you can move on to test it with real data.

Save the following (the first three lines of the sample data for the PosApp sample application) as .../Striim/Samples/simple.csv:

BUSINESS NAME, MERCHANT ID, PRIMARY ACCOUNT NUMBER, POS DATA CODE, DATETIME, EXP DATE,
CURRENCY CODE, AUTH AMOUNT, TERMINAL ID, ZIP, CITY
COMPANY 1,D6RJPwyuLXoLqQRQcOcouJ26KGxJSf6hgbu,6705362103919221351,0,20130312173210,0916,
USD,2.20,5150279519809946,41363,Quicksand
COMPANY 2,OFp6pKTMg26n1iiFY00M9uSqh9ZfMxMBRf1,4710011837121304048,4,20130312173210,0815,
USD,22.78,5985180438915120,16950,Westfield

Acquire the data using a source

As discussed in TQL programming rules and best practices, you must create a component before you can reference it in another component. This means that the first step in writing a TQL application is to create a source.

Save the following as …/Striim/Samples/simple.tql:

CREATE APPLICATION simple;

CREATE SOURCE SimpleSource USING FileReader (
  directory:'Samples',
  wildcard:'simple.csv',
  positionByEOF:false
)
PARSE USING DSVParser (
  header:Yes,
  trimquote:false
) OUTPUT TO RawDataStream;

CREATE TARGET SimpleRawOutput
USING SysOut(name:simpleRaw)
INPUT FROM RawDataStream;

END APPLICATION simple;

The SysOut adapter writes the output to striim-node.log (see Reading log files). Alternatively, you may run Striim as a process (Running Striim as a process), in which case SysOut will write to the server terminal. This is useful for initial development since you can verify that the data is as expected at each step.

Load, deploy, and start simple.tql as described in Loading and reloading TQL applications during development. You should see the contents of RawDataStream in striim-node.log (if you are running Striim as a process, SysOut output will be written to the terminal running the server process instead):

simple: WAEvent{
  data: ["COMPANY 1","D6RJPwyuLXoLqQRQcOcouJ26KGxJSf6hgbu","6705362103919221351","0",
"20130312173210","0916","USD","2.20","5150279519809946","41363","Quicksand"]
  metadata: {"FileName":"simple.csv","FileOffset":138}
  before: null
  dataPresenceBitMap: "AAA="
  beforePresenceBitMap: "AAA="
  typeUUID: null
};
simple: WAEvent{
  data: ["COMPANY 2","OFp6pKTMg26n1iiFY00M9uSqh9ZfMxMBRf1","4710011837121304048","4",
"20130312173210","0815","USD","22.78","5985180438915120","16950","Westfield"]
  metadata: {"FileName":"simple.csv","FileOffset":268}
  before: null
  dataPresenceBitMap: "AAA="
  beforePresenceBitMap: "AAA="
  typeUUID: null
}; 

The second and third lines of simple.csv have been converted to events. WAEvent is the Striim type used for FileReader output. The data field for each event is an array containing the comma-delimited fields from the source file. The five other fields are not used by this application.

Filter the data with a CQ SELECT statement

The next step is to filter the data.

In simple.tql, replace these lines:

CREATE TARGET SimpleRawOutput
USING SysOut(name:simpleRaw)
INPUT FROM RawDataStream;

with the following:

CREATE TYPE FilteredDataType(
  merchantId String KEY,
  dateTime DateTime,
  amount double,
  zip String
);
CREATE STREAM FilteredDataStream OF FilteredDataType;

CREATE CQ Raw2FilteredCQ
INSERT INTO FilteredDataStream
SELECT data[1],
  TO_DATEF(data[4],'yyyyMMddHHmmss'),
  TO_DOUBLE(data[7]),
  data[9]
FROM RawDataStream;

CREATE TARGET SimpleFilteredOutput
USING SysOut(name:simpleFiltered)
INPUT FROM FilteredDataStream;

This selects four of the fields from the source data, converts the data types as necessary, and discards the remaining data. Save the file; then stop, undeploy, and drop the old version, and load, deploy, and run the new version (see Loading and reloading TQL applications during development). You should see these results in striim-node.log:

simpleFiltered: FilteredDataType_1_0{
  merchantId: "D6RJPwyuLXoLqQRQcOcouJ26KGxJSf6hgbu"
  dateTime: 1363134730000
  amount: 2.2
  zip: "41363"
};
simpleFiltered: FilteredDataType_1_0{
  merchantId: "OFp6pKTMg26n1iiFY00M9uSqh9ZfMxMBRf1"
  dateTime: 1363134730000
  amount: 22.78
  zip: "16950"
};

Join data from a cache using a CQ

Caches provide data from non-real-time sources. In this case, the cache is a subset of the USAddresses.txt file used by the PosApp sample application.

Save the following as …/Striim/Samples/simplezipdata.txt:

zip	city	state	latVal	longVal
16950	Westfield	PA	41.9193	-77.523
41363	Quicksand	KY	37.5331	-83.3652

In simple.tql, replace these lines:

CREATE TARGET SimpleFilteredOutput
USING SysOut(name:simpleFiltered)
INPUT FROM FilteredDataStream;

with the following:

CREATE TYPE ZipCacheType(
  zip String KEY,
  city String,
  state String,
  latVal double,
  longVal double
);

CREATE CACHE ZipCache using FileReader (
  directory: 'Samples',
  wildcard: 'simplezipdata.txt')
PARSE USING DSVParser (
  header: Yes,
  columndelimiter: '\t',
  trimquote:false
) QUERY (keytomap:'zip') OF ZipCacheType;

CREATE TYPE JoinedDataType(
  merchantId String KEY,
  zip String,
  city String,
  state String,
  latVal double,
  longVal double
);
CREATE STREAM JoinedDataStream OF JoinedDataType;

CREATE CQ JoinDataCQ
INSERT INTO JoinedDataStream
SELECT  f.merchantId,
        f.zip,
        z.city,
        z.state,
        z.latVal,
        z.longVal
FROM FilteredDataStream f, ZipCache z
WHERE f.zip = z.zip;

CREATE TARGET simpleJoinedData
USING SysOut(name:simpleJoined)
INPUT FROM JoinedDataStream;

Reload and restart simple.tql and you should see these results in striim-node.log:

simpleJoined: JoinedDataType_1_0{
  merchantId: "D6RJPwyuLXoLqQRQcOcouJ26KGxJSf6hgbu"
  zip: "41363"
  city: "Quicksand"
  state: "KY"
  latVal: 37.5331
  longVal: -83.3652
};
simpleJoined: JoinedDataType_1_0{
  merchantId: "OFp6pKTMg26n1iiFY00M9uSqh9ZfMxMBRf1"
  zip: "16950"
  city: "Westfield"
  state: "PA"
  latVal: 41.9193
  longVal: -77.523
};

Populate a Dashboard map from a WActionStore

In simple.tql, replace these lines:

CREATE TARGET simpleJoinedData
USING SysOut(name:simpleJoined)
INPUT FROM JoinedDataStream;

with the following:

CREATE JUMPING WINDOW FilteredDataWindow
OVER FilteredDataStream KEEP 1 ROWS;

CREATE WACTIONSTORE MapData 
CONTEXT OF JoinedDataType
EVENT TYPES ( JoinedDataType KEY(merchantId) )
PERSIST NONE USING();

CREATE CQ PopulateMapDataCQ
INSERT INTO MapData
SELECT  f.merchantId,
        f.zip,
        z.city,
        z.state,
        z.latVal,
        z.longVal
FROM FilteredDataWindow f, ZipCache z
WHERE f.zip = z.zip; 

Reload and restart simple.tql, then create a dashboard (see Dashboard Guide and Dashboard rules and best practices), add a Vector Map, and specify this as its query's SELECT statement:

select * from MapData;

After saving the query, click the map's cog icon, and specify its properties as follows:

[Screenshot: simple_map.png]

See Visualization types and properties for more information about these values.

Leave the other properties at their defaults, save the map, click Done, and refresh the browser. You should see something like this:

[Screenshot: simple_map_2.png]

Loading and reloading TQL applications during development

Before you can start a new TQL application, you must first create and deploy it (see Managing deployment groups). For example, the following series of console commands creates, deploys, and runs simple.tql:

W (admin): @Samples/simple.tql;
 Processing - CREATE APPLICATION simple
 ...
 Processing - END APPLICATION simple
 Elapsed time: 2473 ms
W (admin): deploy application simple in default;
 Processing - deploy application simple in default
 Elapsed time: 180 ms
W (admin): start application simple;
 Processing - start application simple
 Elapsed time: 73 ms

After editing the TQL file, to run the new version of the application, you must:

  • undeploy and drop the old version

  • load, deploy, and start the new version

For example, the following series of commands will drop the application loaded by the above commands.

W (admin) > undeploy application simple;
 Processing - undeploy application simple
 Elapsed time: 358 ms
W (admin) > drop application simple cascade;

Now repeat the @, DEPLOY, and START commands you used the first time you ran the application.

You can automate this by adding the commands to your application:

UNDEPLOY APPLICATION PosApp;
DROP APPLICATION PosApp CASCADE;
CREATE APPLICATION PosApp;
...
END APPLICATION PosApp;
DEPLOY APPLICATION PosApp;
START APPLICATION PosApp;

Parsing the data field of WAEvent

WAEvent is the data type used by the output stream of many readers. Its data field is an array containing one event's field values. Here is a sample event in WAEvent format from the output stream of CsvDataSource in the PosApp sample application:

WAEvent{
  data: ["COMPANY 1159","IQ6wCy3k7PnAiRAN71ROxcNBavvVoUcwp7y","8229344557372754288","1","20130312173212",
"0614","USD","329.64","2094770823399082","79769","Odessa"]
  metadata: {"RecordStatus":"VALID_RECORD","FileName":"posdata.csv","FileOffset":154173}
  before: null
  dataPresenceBitMap: "AAA="
  beforePresenceBitMap: "AAA="
  typeUUID: null
}; 

dataPresenceBitMap, beforePresenceBitMap, and typeUUID are reserved and should be ignored.

To parse the data array, PosApp uses the following TQL:

CREATE CQ CsvToPosData
INSERT INTO PosDataStream
SELECT TO_STRING(data[1]) AS merchantId,
  TO_DATEF(data[4],'yyyyMMddHHmmss') AS dateTime,
  DHOURS(TO_DATEF(data[4],'yyyyMMddHHmmss')) AS hourValue,
  TO_DOUBLE(data[7]) AS amount,
  TO_STRING(data[9]) AS zip
FROM CsvStream;

  • PosDataStream is created automatically using the data types and AS strings in the SELECT statement.

  • The order of the data[#] functions in the SELECT clause determines the order of the fields in the output. These may be specified in any order: for example, data[1] could precede data[0].

  • Fields not referenced by the SELECT clause are discarded.

  • The data[#] function counts the fields in the array starting from 0, so in this example the first field in the array (COMPANY 1159) is omitted.

  • Non-string values are converted to the types required by the output stream (as defined by the PosData type) by the TO_DATEF, DHOURS, and TO_DOUBLE functions (see Functions for more information).

In the PosDataStream output, the parsed version of the sample event shown above is:

merchantId: "IQ6wCy3k7PnAiRAN71ROxcNBavvVoUcwp7y"
dateTime: 1363134732000
hourValue: 17
amount: 329.64
zip: "79769"

See PosApp for more information. See also the discussions of ParseAccessLog and ParseLog4J in MultiLogApp for additional examples.

To put the raw, unparsed data array into a field in a stream, use this syntax:

CREATE CQ CsvRawData
INSERT INTO PosRawDataStream
SELECT data AS object[]
FROM CsvStream;

The field type will be an array of Java objects.

Using regular expressions (regex)

Striim supports the use of regular expressions (regex) in TQL applications. Keep in mind that Striim's regex implementation is Java-based (see java.util.regex.Pattern), which has a few consequences for how you write your expressions:

  • The backslash character ( \ ) is recognized as an escape character in Java strings, so if you want to define something like \w in regex, use \\w in such cases.

  • In regex, \\ matches a single backslash literal. Therefore if you want to use the backslash character as a literal in the Striim Java implementation of regex, you must actually use \\\\.

  • The java.lang.String class provides you with these methods supporting regex: matches(), split(), replaceFirst(), replaceAll(). Note that the String.replace() methods do not support regex.

  • TQL supports the regex syntax and constructs from java.util.regex. Note that this has some differences from POSIX regex.
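
For example (with a hypothetical field and pattern), extracting a numeric ID with the MATCH function requires the Java-string form \\d rather than \d:

-- captures the digits following "order-"; note \\d, not \d
MATCH(data[0], 'order-(\\d+)')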

You may use regex in LIKE and NOT LIKE expressions. For example:

  • WHERE ProcessName NOT LIKE '%.tmp%': filter out data from temp files

  • WHERE instance_applications LIKE '%Apache%': select only applications with Apache in their names

  • WHERE MerchantID LIKE '45%': select only merchants with IDs that start with 45.
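
For instance, a CQ applying the last filter (with hypothetical stream names) might look like this:

CREATE CQ FilterMerchantsCQ
INSERT INTO FilteredMerchantStream
SELECT * FROM MerchantStream
WHERE MerchantID LIKE '45%';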

The following entry from the MultiLogApp sample Apache access log data includes information about a REST API call in line 4:

0: 206.130.134.68
1: -
2: AWashington
3: 25/Oct/2013:11:28:36.960 -0700
4: GET http://cloud.saas.me/query?type=ChatterMessage&id=01e33d9a-34ee-ccd0-84b9-
   14109fcf2383&jsessionId=01e33d9a-34c9-1c68-84b9-14109fcf2383 HTTP/1.1
5: 200
6: 0
7: -
8: Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu 
   Chromium/28.0.1500.71 Chrome/28.0.1500.71 Safari/537.36
9: 1506

Regex is also used by the MATCH function. For example, the MATCH function in the ParseAccessLog CQ parses line 4 above to extract the session ID:

MATCH(data[4], ".*jsessionId=(.*) ")

The parsed output is:

sessionId: "01e33d9a-34c9-1c68-84b9-14109fcf2383"

The following, also from MultiLogApp, is an example of the data[2] element of a RawXMLStream WAEvent data array:

"Problem in API call [api=login] [session=01e3928f-e975-ffd4-bdc5-14109fcf2383] 
[user=HGonzalez] [sobject=User]","com.me.saas.SaasMultiApplication$SaasException: 
Problem in API call [api=login] [session=01e3928f-e975-ffd4-bdc5-14109fcf2383] 
[user=HGonzalez] [sobject=User]\n\tat com.me.saas.SaasMultiApplication.login
(SaasMultiApplication.java:1253)\n\tat 
sun.reflect.GeneratedMethodAccessor11.invoke(Unknown Source)\n\tat 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
\n\tat java.lang.reflect.Method.invoke(Method.java:606)\n\tat 
com.me.saas.SaasMultiApplication$UserApiCall.invoke(SaasMultiApplication.java:360)\n\tat 
com.me.saas.SaasMultiApplication$Session.login(SaasMultiApplication.java:1447)\n\tat 
com.me.saas.SaasMultiApplication.main(SaasMultiApplication.java:1587)"

This is parsed by the ParseLog4J CQ as follows:

MATCH(data[2], '\\\\[api=([a-zA-Z0-9]*)\\\\]'),
MATCH(data[2], '\\\\[session=([a-zA-Z0-9\\-]*)\\\\]'),
MATCH(data[2], '\\\\[user=([a-zA-Z0-9\\-]*)\\\\]'),
MATCH(data[2], '\\\\[sobject=([a-zA-Z0-9]*)\\\\]')

The parsed output is:

api: "login"
sessionId: "01e3928f-e975-ffd4-bdc5-14109fcf2383"
userId: "HGonzalez"
sobject: "User"

See Parsing sources with regular expressions, FreeFormTextParser, and MultiFileReader for additional examples.

Sending alerts from applications

See also Sending alerts about servers and applications.

Applications can send alerts via email or the web UI. To send alerts, generate a stream of type AlertEvent and use it as the input for a subscription (a kind of target).

The syntax for subscriptions is:

CREATE SUBSCRIPTION name 
USING [ EmailAdapter | WebAlertAdapter] (<properties>) 
INPUT FROM <stream of type Global.AlertEvent>

Alerts generated by the WebAlertAdapter appear only in the alert counter in the upper right corner of some pages of the Striim web UI. This delivery method is suitable mostly for development purposes since the counter may be reset before the user sees an alert. You do not need to specify any properties for this adapter.

The EmailAdapter properties are:

bccEmailList (java.lang.String)
  "bcc" address(es) for the alerts (separate addresses with commas) or %<field name>%

ccEmailList (java.lang.String)
  "cc" address(es) for the alerts (separate addresses with commas) or %<field name>%

contentType (java.lang.String; default: text/html; charset=utf-8)
  the other supported value is text/plain; charset=utf-8

emailList (java.lang.String)
  "to" address(es) for the alerts (separate addresses with commas) or %<field name>%

senderEmail (java.lang.String)
  "from" address for the alerts (if this is not a valid, monitored mailbox, the alert text should instruct the user not to reply) or %<field name>%

smtp_auth (Boolean; default: True)
  set to False if the SMTP server does not require authentication, in which case leave smtpUser and smtpPassword blank

smtpPassword (com.webaction.security.Password)
  password for the SMTP account (see Encrypted passwords); leave blank if smtpUser is not specified

smtpPropertiesName (String)
  a Striim property set containing SMTP server properties (any properties specified in the EmailAdapter override those in the property set)

smtpUrl (String)
  network_name:port for the SMTP server (if port is not specified, defaults to 587)

smtpUser (String)
  user name of the account on the SMTP server; leave blank if authentication is not required

starttls_enable (Boolean; default: False)
  set to True if required by the SMTP server

subject (String)
  subject line for the alerts

threadCount (int; default: 4)
  number of threads on the Striim server to be used to send alerts

userids (java.lang.String)
  Striim user(s) to receive alerts at the email address(es) specified in their Striim account properties, or %<field name>%

The following would create a property set smtpprop which could then be specified as the value for smtpPropertiesName:

CREATE PROPERTYSET smtpprop (
  SMTPUSER:'xx@example.com',
  SmtpPassword:'secret', 
  smtpurl:'smtp.example.com', 
  threadCount:"5", 
  senderEmail:"alertsender@example.com" );

The input stream for a subscription must use the AlertEvent type. Its fields are:

name (String)
  reserved

keyVal (String)
  for any given keyVal, an alert is sent for the first event with a flag value of raise; subsequent events with the same keyVal and a flag value of raise are ignored until a cancel is received for that keyVal

severity (String)
  valid values: error, warning, or info

flag (String)
  valid values: raise or cancel

message (String)
  the text of the alert, typically passed from a log entry

The following sample code (based on PosApp) generates both types of alerts:

CREATE STREAM AlertStream OF Global.AlertEvent;

CREATE CQ GenerateAlerts
INSERT INTO AlertStream
SELECT n.CompanyName,
  m.MerchantId,
  CASE
    WHEN m.Status = 'OK' THEN 'info'
    ELSE 'warning' END,
  CASE
    WHEN m.Status = 'OK' THEN 'cancel'
    ELSE 'raise' END,
  CASE
    WHEN m.Status = 'OK'
      THEN 'Merchant ' + n.companyName + ' count of ' + m.count +
        ' is back between ' + ROUND_DOUBLE(m.lowerLimit,0) + ' and ' + 
        ROUND_DOUBLE(m.upperLimit,0)
    WHEN m.Status = 'TOOHIGH'
      THEN 'Merchant ' + n.companyName + ' count of ' + m.count +
        ' is above upper limit of ' + ROUND_DOUBLE(m.upperLimit,0)
    WHEN m.Status = 'TOOLOW'
      THEN 'Merchant ' + n.companyName + ' count of ' + m.count +
        ' is below lower limit of ' + ROUND_DOUBLE(m.lowerLimit,0)
    ELSE ''
    END
FROM MerchantTxRateWithStatusStream m, NameLookup n
WHERE m.merchantId = n.merchantId;

CREATE SUBSCRIPTION PosAppEmailAlert
USING EmailAdapter (
  SMTPUSER:'sender@example.com',
  SMTPPASSWORD:'********', 
  smtpurl:'smtp.gmail.com',
  starttls_enable:'true',
  subject:"test subject",
  emailList:"recipient@example.com,recipient2@example.com",
  senderEmail:"alertsender@example.com" 
)
INPUT FROM AlertStream;

CREATE SUBSCRIPTION PosAppWebAlert 
USING WebAlertAdapter( ) 
INPUT FROM AlertStream;

When a merchant's status changes to TOOLOW or TOOHIGH, Striim will send an alert such as, "WARNING - alert from Striim - POSUnusualActivity - 2013-12-20 13:55:14 - Merchant Urban Outfitters Inc. count of 12012 is below lower limit of 13304.347826086958." The "raise" value for the flag field instructs the subscription not to send another alert until the status returns to OK.

Using field values in email alerts

When sending alerts with the EmailAdapter, you can populate the subject, sender address, and recipient addresses with values from the fields of the subscription's input stream.

To do this, first create a custom alert stream with the extra fields you want to use. The first five fields must be identical to Global.AlertEvent. To those, you may add fields containing the subjects, sender addresses, and recipient addresses.

CREATE TYPE CustomEmailAlert (   
   name String,
   keyVal String,
   severity String,
   flag String,
   message String,
   emailsubject String,
   senderEmail String,
   recipientList String
);
CREATE STREAM CustomAlertStream OF CustomEmailAlert;

Reference the subject, sender, and recipient fields in the EmailAdapter properties as follows:

CREATE SUBSCRIPTION PosAppCustomEmailAlert
USING EmailAdapter (
  SMTPUSER:'sender@example.com',
  SMTPPASSWORD:'********', 
  smtpurl:'smtp.gmail.com',
  starttls_enable:'true',
  subject:'%emailsubject%',
  emailList:'%recipientList%',
  senderEmail:'%senderEmail%' 
)
INPUT FROM CustomAlertStream;

You do not need to use all three. For example, you could populate only emailList with field values:

  subject:"test subject",
  emailList:'%recipientList%',
  senderEmail:"alertsender@example.com"

The values in the recipientList field may include multiple email addresses separated by commas (no spaces).

Handling exceptions

By default, when Striim encounters a non-fatal exception, it ignores it and continues. You may add an EXCEPTIONHANDLER clause to your CREATE APPLICATION statement to log exceptions and take various actions. The syntax is:

CREATE APPLICATION ... EXCEPTIONHANDLER ([<exception>:'<action>',...]);

Supported exceptions are:

  • AdapterException

  • ArithmeticException

  • ClassCastException

  • ConnectionException

  • InvalidDataException

  • NullPointerException

  • NumberFormatException

  • SystemException

  • UnexpectedDDLException

  • UnknownException

Supported actions are:

  • IGNORE

  • CRASH ("Stop Processing" in Flow Designer)
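
For example, a hypothetical application that ignores arithmetic errors but stops processing on invalid data would begin:

CREATE APPLICATION MyApp
EXCEPTIONHANDLER (ArithmeticException:'IGNORE', InvalidDataException:'CRASH');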

See also Writing exceptions to a WActionStore.
