Resilient tasks are designed to prevent data loss. Data integrity refers to the validity of processed data.

This article describes how to design tasks with data resilience and integrity in mind, using transactional processing techniques such as staging tables and retry mechanisms.

BPA Platform and Transactions

BPA Platform interacts with multiple systems, frequently linking disparate applications together. Because of this, there is a need to treat all steps in a task as a single transaction: for example, extracting data from a database, writing it to another database, then emailing a notification on completion would be handled as one transaction. In practice, however, this is difficult to achieve. If the email notification failed, the update to the second database would need to be rolled back, which BPA Platform cannot do as it doesn't have access to the transaction statements (BEGIN TRANSACTION, COMMIT, and ROLLBACK).

Instead, a staging database or table should be set up to store the transactions until processing has finished; it is this that provides the resilience.

The BPA Platform Marketplace Solutions make use of a staging database sitting between the eCommerce and ERP sides of the solution. It provides a common data "holding" area for both sides, giving communication resilience between cloud and on-premise systems. It also helps with checking order progress by providing an error capture, rectify, and retry process, as well as an observation point.

Example Scenario

Let's take as an example a task that monitors a sales order database and, when a new order comes in, emails a report:

Task with No Resilience

Microsoft SQL Server Trigger — Triggers the task to run when a new record is added to the database. The Order_ID is mapped to a task variable.
Database Query (ODBC) — Extracts the order details according to the Order_ID passed from the Microsoft SQL Server Trigger step.
Format as HTML Pro — Populates an HTML-formatted report with the data from the Database Query (ODBC) step.
Send Email (SMTP) — Emails the report produced by the Format as HTML Pro step.

The problem with this task is that it doesn't guard against failure. Two potential failures could occur:

  1. The Database Query (ODBC) step times out because the database server is busy
  2. The Send Email (SMTP) step fails because the email server is unreachable

If either of the above failures occurs, the event is "lost" as no report is sent for that order.

Adding Resilience to the Task

To prevent lost reports, a retry mechanism can be added to the task which does the following:

Places the event in a queue
Retries the event when a failure has occurred

Stage 1: Creating the EventsQueue Table

The "queue" is implemented as a database table that must be created separately. A local installation of SQL Server is required on the BPA Platform server; SQL Server Express is also suitable for this situation.

  1. Either create a database that BPA Platform can access or use an existing, accessible one.
  2. In the database, create a table called EVENTS with the following columns:
Column Name | Data Type | Description
EVENT_ID | Matches the primary key of the sales orders table, ORDER_ID (char(20) in this example) | This column is the primary key. Populated with the ORDER_ID that triggered the event.
TASK_ID | int | The task ID.
TASK_STATUS | bit, NOT NULL | Unprocessed = 0; Processed = 1

The SQL to create this is:

-- assumes the external system's primary key is char(20)
CREATE TABLE EVENTS (
    EVENT_ID char(20) NOT NULL PRIMARY KEY,
    -- to get the Task ID: open the task, in the General tab click Options, go to the Advanced tab
    TASK_ID int,
    -- unprocessed = 0; processed = 1
    TASK_STATUS bit NOT NULL
)

Stage 2: Creating the Stored Procedures to Populate the Events Table

Two stored procedures are required to process the EVENTS table — one to insert the event when the task is triggered and the other to update it when all task steps have completed. Create the following stored procedures in the same database hosting the EVENTS table:

Insert Stored Procedure

CREATE PROCEDURE INSERT_EVENT
    @PRIMARY_KEY char(20),
    @TASK_ID int
AS
INSERT INTO EVENTS (EVENT_ID, TASK_ID, TASK_STATUS)
VALUES (@PRIMARY_KEY, @TASK_ID, 0) -- initial state is always "Unprocessed"

Update Stored Procedure

CREATE PROCEDURE UPDATE_EVENT
    @PRIMARY_KEY char(20)
AS
-- If this procedure is called, it means that all
-- preceding task steps have completed,
-- therefore the status can be changed to "Processed"
UPDATE EVENTS SET TASK_STATUS = 1
WHERE EVENT_ID = @PRIMARY_KEY
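The behaviour of these two procedures can be sketched with an in-memory SQLite database (illustrative only; the table layout and semantics follow the article, but this is not BPA Platform or SQL Server code):

```python
import sqlite3

# In-memory SQLite stand-in for the SQL Server EVENTS table
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE EVENTS (
        EVENT_ID    CHAR(20) PRIMARY KEY,
        TASK_ID     INTEGER,
        TASK_STATUS INTEGER NOT NULL   -- 0 = Unprocessed, 1 = Processed
    )
""")

def insert_event(primary_key, task_id):
    # Mirrors INSERT_EVENT: a new event always starts as Unprocessed (0)
    conn.execute(
        "INSERT INTO EVENTS (EVENT_ID, TASK_ID, TASK_STATUS) VALUES (?, ?, 0)",
        (primary_key, task_id),
    )

def update_event(primary_key):
    # Mirrors UPDATE_EVENT: called only after all task steps have completed
    conn.execute(
        "UPDATE EVENTS SET TASK_STATUS = 1 WHERE EVENT_ID = ?",
        (primary_key,),
    )

insert_event("ORD-0001", 42)
insert_event("ORD-0002", 42)
update_event("ORD-0001")   # only the first order completed successfully

unprocessed = [row[0] for row in conn.execute(
    "SELECT EVENT_ID FROM EVENTS WHERE TASK_STATUS = 0")]
print(unprocessed)   # ORD-0002 remains queued for retry
```

Anything that reaches the UPDATE step is marked Processed; anything that doesn't stays in the table at status 0, which is what the retry task later looks for.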

Stage 3: Adding the Stored Procedures to the Task

The Call Stored Procedure (OLEDB) tool, which executes database stored procedures and functions to provide read, write, and automated data processing, is used to process stored procedures during a task.

Adding the INSERT_EVENT Stored Procedure

Do the following:

  1. Open the task.
  2. Add a Call Stored Procedure (OLEDB) step.
  3. In the General tab, Name the step; no other configuration is required in this tab.
  4. Go to the Main tab.
  5. Change Connection used to the database hosting the EVENTS table.
  6. Click Browse and select the INSERT_EVENT stored procedure.
  7. Map the variable containing the ORDER_ID from the Microsoft SQL Server Trigger step to @PRIMARY_KEY.
  8. Map the TaskID property (Task Browser > Environment tab > Scope = Task) to @TASK_ID.

    Call Stored Procedure (OLEDB) - INSERT_EVENT

  9. Click OK to save the step.
  10. Change the sequence of the task so the new Call Stored Procedure (OLEDB) step is immediately after the Microsoft SQL Server Trigger step.

    Resilience Task with INSERT Call Stored Procedure

Adding the UPDATE_EVENT Stored Procedure

Do the following:

  1. Add another Call Stored Procedure (OLEDB) step to the task.
  2. This time, select the UPDATE_EVENT stored procedure.
  3. Again, map the variable containing the ORDER_ID from the Microsoft SQL Server Trigger step to @PRIMARY_KEY.

    Call Stored Procedure (OLEDB) - UPDATE_EVENT

  4. Make this step the last one of the task:

    Resilience Task with INSERT and UPDATE Call Stored Procedures
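The task shape built in this stage can be sketched in Python (illustrative only, not BPA Platform code): because INSERT_EVENT runs immediately after the trigger and UPDATE_EVENT runs last, a failure anywhere in between leaves the event flagged Unprocessed.

```python
# events maps ORDER_ID -> status (0 = Unprocessed, 1 = Processed),
# standing in for the EVENTS table
def run_task(order_id, send_email, events):
    events[order_id] = 0            # INSERT_EVENT step: status Unprocessed
    try:
        report = f"<html>Report for {order_id}</html>"  # query + format steps
        send_email(report)          # Send Email (SMTP) step
        events[order_id] = 1        # UPDATE_EVENT step: only reached on success
    except Exception:
        pass                        # event stays at 0; the retry task picks it up

events = {}

def flaky_send(report):
    # Simulates the failure case from the example scenario
    raise ConnectionError("SMTP server unreachable")

run_task("ORD-0003", flaky_send, events)
print(events)   # ORD-0003 stays at 0 -- queued, not lost
```

The key design point is that the success flag is written only after every intermediate step has completed, so any failure mode (database timeout, unreachable mail server) leaves a durable record to retry.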

Stage 4: Adding the Retry Mechanism to the Task

The simplest method of detecting failed orders and retrying them is to create a scheduled task that checks the EVENTS table for records where TASK_STATUS=0 (Unprocessed).

  1. Create a task with the same Database Query (ODBC), Format as HTML Pro, and Send Email (SMTP) steps as before.
  2. Open the Database Query (ODBC) step.
  3. Click Database Query (ODBC) - SQL Free Type Mode to change into free type mode.
  4. Add a JOIN statement for the EVENTS and SalesOrders tables where TASK_STATUS=0; for example:

      SELECT "SalesOrders"."ORDER_ID" AS "ORDER_ID",
             "SalesOrders"."ORDER_DATE" AS "ORDER_DATE"
      FROM "SalesOrders" INNER JOIN "EVENTS"
      ON "SalesOrders"."ORDER_ID" = "EVENTS"."EVENT_ID"
      WHERE "EVENTS"."TASK_STATUS" = 0

  5. Click OK to save and close the step.
  6. Now open the Send Email (SMTP) step.
  7. Here we add a Call Stored Procedure memory definition to the step. (Memory definitions specify the data, generated by a task at run-time, to be memorised; memorised data can be saved in an internal repository, in an ODBC or OLEDB data source, or in an OLEDB data source via a stored procedure.) This works through those records where TASK_STATUS=0, marking them as processed (1) when the email is sent.
    1. Go to the Main tab and click Memory.
  8. Add a Call Stored Procedure Memory Provider definition.
  9. Select the UPDATE_EVENT stored procedure.
  10. The task variable that holds the ORDER_ID that triggered the previous task is not available to this task. Instead, because this new task is stepping through all unprocessed records, we want the ID of the record currently being processed.
    1. In the Task Browser's Environment tab, change Scope to This Step.
    2. Expand DocumentSource > Recordset.
    3. Map @PRIMARY_KEY to the ID field of the recordset. In this scenario, this is the ORDER_ID taken from the SalesOrders table.
    4. Click OK to save and close the memory definition.
  11. Click OK to save and close the Send Email (SMTP) step.
  12. Finally, add a Schedule step to the task to check the EVENTS table at regular intervals.

Retry Mechanism Task
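The retry pass above can be sketched as follows (names are illustrative, not BPA Platform code): join the queue to the orders, re-send each unprocessed report, and mark an event Processed only when the send succeeds.

```python
# orders stands in for the SalesOrders table; events for the EVENTS table
# (0 = Unprocessed, 1 = Processed)
def retry_unprocessed(orders, events, send_email):
    for order_id, status in list(events.items()):
        if status != 0:
            continue                      # WHERE "EVENTS"."TASK_STATUS" = 0
        order = orders[order_id]          # INNER JOIN on ORDER_ID = EVENT_ID
        try:
            send_email(f"<html>{order}</html>")
            events[order_id] = 1          # the UPDATE_EVENT memory definition
        except Exception:
            pass                          # still unprocessed; next run retries

orders = {"ORD-0002": "2 widgets", "ORD-0003": "5 gadgets"}
events = {"ORD-0002": 0, "ORD-0003": 1}   # ORD-0003 already processed
sent = []

retry_unprocessed(orders, events, sent.append)
print(sent, events)   # only ORD-0002 is re-sent and marked Processed
```

Because already-processed events are skipped, the scheduled task can run as often as needed without duplicating reports.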

Tip: Instead of using a Schedule step, you could add a Call Task step to the original task so it triggers this task to run after it has finished sending the email.

Article originally published 03 December 2015 — Article updated 18 October 2019