Resilient tasks are designed to prevent data loss; data integrity refers to the validity of the processed data.
This article describes how to design tasks with resilience and data integrity in mind, using techniques such as transactional processing and retry mechanisms.
BPA Platform and Transactions
BPA Platform interacts with multiple systems, frequently linking disparate applications together. Because of this, there is a need to treat all steps in a task as a single transaction — for example, extracting data from a database, writing it to another database, then emailing a notification on completion would be handled as a single transaction. In practice, however, this would be difficult to achieve — if the email notification failed, the update to the second database would need to be rolled back, which is difficult because BPA Platform doesn't have access to the external systems' transaction statements.
Instead, a staging database or table should be set up to store the transactions until processing has finished — it is this that provides the resilience.
The BPA Platform Marketplace Solutions make use of such a staging database, found in between the eCommerce and ERP sides of the solution. It provides a common data "holding" area between both sides, giving communication resilience between cloud and on-premise systems. It also helps with checking order progress by providing an error capture, rectify, and retry process, as well as an observation point.
Let's take as an example a task that monitors a sales order database and when a new order comes in, a report is emailed:
- Microsoft SQL Server Trigger — Fires when a new order is added to the sales order database
- Database Query (ODBC) — Extracts the order details for the triggered order; this tool extracts specific records from a relational database, using a valid ODBC driver to make the connection, and makes them available to other task steps as a recordset
- Format as HTML Pro — Formats the extracted order details as an HTML report
- Send Email (SMTP) — Emails the report
The problem with this task is that it doesn't guard against failure. Two potential failures could occur:
- The Database Query (ODBC) step times out because the database server is busy
- The Send Email (SMTP) step fails because the email server is unreachable
If either of the above failures occurs, the event is "lost" and no report is sent for that order.
Adding Resilience to the Task
To prevent lost reports, a retry mechanism can be added to the task which does the following:
- Places the event in a queue
- Retries the event when a failure has occurred
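Before walking through the BPA Platform configuration, the queue-and-retry pattern itself can be sketched in a few lines. The following is a minimal illustration using Python's sqlite3 module in place of SQL Server; the table mirrors the EVENTS table built in Stage 1, and the `flaky_send` function and order ID are hypothetical stand-ins for the real task steps.

```python
import sqlite3

# In-memory database standing in for the staging database created in Stage 1.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE EVENTS (
        EVENT_ID    TEXT NOT NULL PRIMARY KEY,
        TASK_ID     INTEGER,
        TASK_STATUS INTEGER   -- 0 = unprocessed, 1 = processed
    )
""")

def queue_event(order_id, task_id):
    # Stage the event as soon as the trigger fires (the INSERT_EVENT role).
    conn.execute("INSERT INTO EVENTS VALUES (?, ?, 0)", (order_id, task_id))

def mark_processed(order_id):
    # Called only after every step has succeeded (the UPDATE_EVENT role).
    conn.execute("UPDATE EVENTS SET TASK_STATUS = 1 WHERE EVENT_ID = ?",
                 (order_id,))

def retry_unprocessed(send_report):
    # The scheduled retry task: re-run anything still marked unprocessed.
    rows = conn.execute(
        "SELECT EVENT_ID FROM EVENTS WHERE TASK_STATUS = 0").fetchall()
    for (order_id,) in rows:
        try:
            send_report(order_id)      # e.g. query, format, and email the report
            mark_processed(order_id)
        except Exception:
            pass  # leave TASK_STATUS = 0 so the next scheduled run retries it

def flaky_send(order_id):
    raise RuntimeError("SMTP server unreachable")

queue_event("ORD-1001", 42)
retry_unprocessed(flaky_send)              # send fails, event stays queued
status_after_failure = conn.execute(
    "SELECT TASK_STATUS FROM EVENTS WHERE EVENT_ID = 'ORD-1001'").fetchone()[0]

retry_unprocessed(lambda order_id: None)   # next run, the email server is back
status_after_retry = conn.execute(
    "SELECT TASK_STATUS FROM EVENTS WHERE EVENT_ID = 'ORD-1001'").fetchone()[0]
print(status_after_failure, status_after_retry)  # 0 1
```

The key design point is that the status is only flipped to "processed" after the final step succeeds; a failure anywhere simply leaves the row queued for the next scheduled run.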
Stage 1: Creating the EventsQueue Table
The "queue" is implemented as a database table that must be created separately. A local installation of SQL Server is required on the BPA Platform server. Note that SQL Server Express is also suitable for this situation.
- Either create a database that BPA Platform can access or use an existing, accessible one.
- In the database, create a table called EVENTS with the following columns:
| Column Name | Data Type | Description |
| --- | --- | --- |
| EVENT_ID | char(20) | This column is the primary key. Populated using the ID of the record that triggered the task; the data type should match the primary key of the sales orders table. |
| TASK_ID | int | The task ID. |
| TASK_STATUS | int | Whether the event has been processed: 0 = unprocessed, 1 = processed. |
The SQL to create this is:
CREATE TABLE [EVENTS] (
    -- assumes the external system's primary key is char(20)
    EVENT_ID char(20) NOT NULL,
    -- to get the Task ID: open the task, in the General tab click Options, go to the Advanced tab
    TASK_ID int NULL,
    -- unprocessed = 0; processed = 1
    TASK_STATUS int NULL,
    PRIMARY KEY CLUSTERED (EVENT_ID)
)
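A side effect of making EVENT_ID the primary key is worth noting: if the same order somehow triggers the task twice, the second insert is rejected, so an event cannot be queued more than once. A quick sketch of this behaviour, using SQLite in place of SQL Server for illustration (the order ID is an example value):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE EVENTS (
        EVENT_ID    TEXT NOT NULL PRIMARY KEY,
        TASK_ID     INTEGER,
        TASK_STATUS INTEGER
    )
""")

conn.execute("INSERT INTO EVENTS VALUES ('ORD-1001', 42, 0)")
try:
    # Same order triggering again: the primary key rejects the duplicate row.
    conn.execute("INSERT INTO EVENTS VALUES ('ORD-1001', 42, 0)")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True

queued = conn.execute("SELECT COUNT(*) FROM EVENTS").fetchone()[0]
print(duplicate_rejected, queued)  # True 1
```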
Stage 2: Creating the Stored Procedures to Populate the Events Table
Two stored procedures are required to process the EVENTS table — one to insert the event when the task is triggered and the other to update it when all task steps have completed. Create the following stored procedures in the same database hosting the EVENTS table:
Insert Stored Procedure
CREATE PROCEDURE INSERT_EVENT
    @PRIMARY_KEY char(20),
    @TASK_ID int
AS
BEGIN
    INSERT INTO dbo.EVENTS (EVENT_ID, TASK_ID, TASK_STATUS)
    VALUES (@PRIMARY_KEY, @TASK_ID, 0)  -- initial state is always "Unprocessed"
END
Update Stored Procedure
-- If this procedure is called, all preceding task steps
-- have completed, so the status can be changed to "Processed"
CREATE PROCEDURE UPDATE_EVENT
    @PRIMARY_KEY char(20)
AS
BEGIN
    UPDATE dbo.EVENTS
    SET TASK_STATUS = 1
    WHERE EVENT_ID = @PRIMARY_KEY
END
Stage 3: Adding the Stored Procedures to the Task
The Call Stored Procedure (OLEDB) tool executes database stored procedures and functions to provide read, write, and automated data processing; it is used here to run the stored procedures during the task.
Adding the INSERT_EVENT Stored Procedure
Do the following:
- Open the task.
- Add a Call Stored Procedure (OLEDB) step.
- In the General tab, Name the step; no other configuration is required in this tab.
- Go to the Main tab.
- Change Connection used to the database hosting the EVENTS table.
- Click Browse and select the INSERT_EVENT stored procedure.
- Map the variable containing the ORDER_ID from the Microsoft SQL Server Trigger step to @PRIMARY_KEY.
- Map the TaskID property (Task Browser > Environment tab > Scope = Task) to @TASK_ID.
- Click OK to save the step.
- Change the sequence of the task so the new Call Stored Procedure (OLEDB) step is immediately after the Microsoft SQL Server Trigger step.
Adding the UPDATE_EVENT Stored Procedure
Do the following:
- Add another Call Stored Procedure (OLEDB) step to the task.
- This time, select the UPDATE_EVENT stored procedure.
- Again, map the variable containing the ORDER_ID from the Microsoft SQL Server Trigger step to @PRIMARY_KEY.
- Make this step the last one of the task.
Stage 4: Adding the Retry Mechanism to the Task
The simplest method of detecting failed orders and retrying them is to create a scheduled task that checks the EVENTS table for records where TASK_STATUS = 0, that is, events that were queued but never marked as processed.
- Create a task with the same Database Query (ODBC), Format as HTML Pro, and Send Email (SMTP) steps as before.
- Open the Database Query (ODBC) step.
- Click to change into free type mode.
- Add a JOIN statement for the EVENTS and orders tables where TASK_STATUS = 0; for example:

JOIN "EVENTS" ON "SalesOrders"."ORDER_ID" = "EVENTS"."EVENT_ID"
WHERE "EVENTS"."TASK_STATUS" = 0
- Click OK to save and close the step.
- Now open the Send Email (SMTP) step.
- Here we add a Call Stored Procedure memory definition to the step. Memory definitions specify the data, generated by a task at run-time, to be memorised; memorised data can be saved in an internal Repository, in an ODBC or OLEDB data source, or in an OLEDB data source via a stored procedure. This works through those records where TASK_STATUS = 0, marking them as processed (TASK_STATUS = 1) when the email is sent.
- Go to the Main tab and click Memory.
- Add a Call Stored Procedure Memory Provider definition.
- Select the UPDATE_EVENT stored procedure.
- The task variable that holds the ORDER_ID that triggered the previous task is not available to this task. Instead, because this new task is stepping through all unprocessed records, we want the ID of the record currently being processed.
- In the Task Browser's Environment tab, change Scope to This Step.
- Expand DocumentSource > Recordset.
- Map @PRIMARY_KEY to the ID field of the recordset. In this scenario, this is the ORDER_ID taken from the SalesOrders table.
- Click OK to save and close the memory definition.
- Click OK to save and close the Send Email (SMTP) step.
- Finally, add a Schedule step to the task to check the EVENTS table at regular intervals. The Schedule tool triggers a task to run based on a date and time schedule; any number of Schedule steps can be created for a task.
Tip: Instead of using a Schedule step, you could add a Call Task step to the original task so it triggers this task to run after it has finished sending the email. The Call Task tool triggers an external task to run; using variables, BPA Platform data can be passed between the two tasks at run-time, if required.
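The Stage 4 steps above can be verified outside BPA Platform: the free-type query should return only the orders still queued for retry. The sketch below runs the same JOIN/WHERE shape against SQLite sample data; the SalesOrders table, column names, and order values are illustrative, with only EVENTS matching the table built in Stage 1.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE SalesOrders (ORDER_ID TEXT PRIMARY KEY, CUSTOMER TEXT, TOTAL REAL);
    CREATE TABLE EVENTS (EVENT_ID TEXT PRIMARY KEY, TASK_ID INTEGER, TASK_STATUS INTEGER);

    INSERT INTO SalesOrders VALUES ('ORD-1001', 'Alpha Ltd', 250.00);
    INSERT INTO SalesOrders VALUES ('ORD-1002', 'Beta plc',  99.50);

    INSERT INTO EVENTS VALUES ('ORD-1001', 42, 1);  -- report already sent
    INSERT INTO EVENTS VALUES ('ORD-1002', 42, 0);  -- failed earlier, still queued
""")

# The same JOIN/WHERE shape as the free-type query in the step above:
query = """
    SELECT "SalesOrders".*
    FROM "SalesOrders"
    JOIN "EVENTS" ON "SalesOrders"."ORDER_ID" = "EVENTS"."EVENT_ID"
    WHERE "EVENTS"."TASK_STATUS" = 0
"""
unprocessed = conn.execute(query).fetchall()
print(unprocessed)  # only ORD-1002 comes back for retry
```

Because processed events are filtered out by TASK_STATUS, each scheduled run picks up exactly the orders whose reports were never sent, and nothing is emailed twice.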
Article originally published 03 December 2015 — Article updated 18 October 2019