Monitor

Track and monitor execution status, set up automated alert rules, and view user activities, SAP® ERP performance, and connection pools.


Note
Any time stamp displayed in Integration is based on the user’s registered time zone specified in iPaaS. Note that not all time zones in iPaaS are supported in Integration; if a time zone in iPaaS is not supported, the time stamp in Integration defaults to the Pacific Standard Time (PST) time zone. To change the time zone for a user profile, go to iPaaS using the App Switcher and click Administration. On the Administration page, edit the user profile and, under Locale information, select the required time zone. Save the changes, log out, and then log in again to see the changes.
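
As a quick illustration of the fallback rule described in this note, here is a minimal sketch; the supported-zone set and the function name are hypothetical and not part of the product:

```python
# Illustrative sketch only: SUPPORTED_ZONES and resolve_display_zone are hypothetical.
# It mirrors the documented rule: fall back to Pacific Standard Time (PST) when the
# time zone registered in iPaaS is not supported in Integration.
from zoneinfo import ZoneInfo

SUPPORTED_ZONES = {"America/Los_Angeles", "Europe/Berlin", "Asia/Kolkata"}  # example subset

def resolve_display_zone(ipaas_zone: str) -> ZoneInfo:
    """Return the zone used to render time stamps, defaulting to PST."""
    if ipaas_zone in SUPPORTED_ZONES:
        return ZoneInfo(ipaas_zone)
    return ZoneInfo("America/Los_Angeles")  # PST fallback
```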

Let’s understand each of these sections in detail.

Execution Results

The Execution Results section provides a quick overview of the statistics associated with the workflow executions and Flow service executions (if enabled) along with their respective execution logs.

In the Monitor page, click on Execution Results as shown below:

Let’s now glance through the options provided by the Execution Results:

Dashboard

The Dashboard option offers you a consolidated view of all Workflow and Flow service execution statuses for the selected time period, along with a graphical representation of the same.

You can fetch the execution data for a specific duration by selecting the relevant time frame option given on the top-right corner of the screen.

You can also view the detailed execution logs for each successful/failed Workflow and Flow service by clicking on the relevant figure as shown below:

This will take you to a new screen where you can click the relevant workflow/Flow service name to view the detailed execution log associated with it.

If you want to view a detailed execution log for successful Flow services, click on the relevant figure. This will take you to the following screen:

Workflow Execution

The Workflow Execution option lets you view and monitor workflow execution-related data in a detailed graphical format.

A workflow can have one of the following statuses:

In case of async webhook-enabled workflows, you can see a Failed status if you have exceeded the execution rate limit for your tenant.

Note
The transaction consumption calculation is done based on the duration in which the workflow completes execution. This duration includes the time taken for the pre-processing activities such as fetching metadata, the post-processing activities, and the actual workflow execution.


Due to this, the duration shown in the workflow execution log can sometimes be greater than the elapsed time, which indicates the total time taken for workflow trigger/action execution.
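
To make the distinction concrete, here is a small worked example with purely illustrative numbers; the platform does not report these individual figures:

```python
# Illustrative numbers only; not values reported by the platform.
pre_processing_s = 1.2    # e.g., fetching metadata before the run
action_elapsed_s = 4.0    # time taken by the trigger and actions themselves
post_processing_s = 0.8   # e.g., persisting results after the run

duration_s = pre_processing_s + action_elapsed_s + post_processing_s
print(f"Elapsed (trigger/actions): {action_elapsed_s}s, logged duration: {duration_s}s")
# Elapsed (trigger/actions): 4.0s, logged duration: 6.0s
```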


Using filters to fetch specific workflow execution data

You can fetch specific workflow execution data for a certain time frame and/or a certain project, workflow, and execution status.

Fetching workflow execution data for specific time frame

You can fetch the execution data for a specific duration by selecting the relevant time frame option given at the top-right corner of the screen.

You can alternatively specify a custom time frame using the date picker option.

Note
The maximum custom date range that you can specify is 30 days.

Fetching workflow execution data associated with specific projects, workflows, execution status, and context ID

You can fetch the execution data associated with one or more projects, one or more workflows, and one or more execution statuses. This can be achieved by applying the filter criteria.

To apply a filter, click on the Filters option.

You will see the following three filter criteria:

Once you select the required filter options, click Apply. Doing this will display the execution details (in a graphical format) and execution logs for the selected projects and workflows.

To reset the filter back to default, click Reset.

Viewing workflow execution logs

The Workflow Execution option lets you view and monitor the execution log for each workflow that you execute.

Note
You can set the number of items to be displayed per page by clicking the drop-down arrow located at the bottom of the Executions table. By default, 50 items are displayed on a single page. The maximum allowed page size is 150.

You can select the columns you want to view in the Executions table based on your requirements by clicking the Settings button located beside the Download Logs option. The Settings button, when clicked, displays a list of column names and allows you to select the columns you want to view in the Executions table. Column names that are not selected will be hidden in the Executions table.

To get detailed information on the performance details of a particular workflow, click on the name of that workflow.

Clicking on the workflow’s name will take you to a screen having complete execution log information about that workflow. You can optionally export the execution log of a particular workflow to your local machine by clicking on the Export Logs button located at the top right of the screen.

Next, to view detailed information about the configured trigger or actions for the selected workflow, click on the name of the trigger or action as shown below:

Downloading workflow execution logs

You can optionally download the workflow execution logs (all or filtered records) either in JSON or CSV format to your local machine. To do this, click Download Logs located on the right side of the screen and select the format of the workflow execution log that you want to download.

The required execution logs will be automatically downloaded as a .zip file to the default download location in your machine.

Note
  • A maximum of *100,000* execution records will be contained in a single file inside a compressed zipped folder by default. If the workflow execution logs that you want to download exceed the specified limit, Integration will automatically generate another file inside the zipped folder, which will hold a maximum of 100,000 records. For example, if you have a total of 200,000 execution logs, a zipped folder with two files, each containing 100,000 records, will be generated and downloaded to your machine.

  • Integration uses the *tenantname_starttimestamp_endtimestamp* format to name a zipped folder/file. If you have multiple files created in a zipped folder, the files are named by appending an incrementing number suffix to the zipped folder name.
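
As a rough sketch of the packaging rules described in this note, the following splits records into files of 100,000 and applies the naming convention; the code itself is illustrative and not the platform's implementation:

```python
# Illustrative sketch of the documented packaging rules; not the platform's code.
import json
import zipfile

MAX_RECORDS_PER_FILE = 100_000

def package_logs(records, tenant, start_ts, end_ts, out_dir="."):
    base = f"{tenant}_{start_ts}_{end_ts}"          # tenantname_starttimestamp_endtimestamp
    zip_path = f"{out_dir}/{base}.zip"
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for i in range(0, len(records), MAX_RECORDS_PER_FILE):
            chunk = records[i : i + MAX_RECORDS_PER_FILE]
            # Additional files get an incrementing numeric suffix, e.g. base_1, base_2, ...
            name = base if i == 0 else f"{base}_{i // MAX_RECORDS_PER_FILE}"
            zf.writestr(f"{name}.json", json.dumps(chunk))
    return zip_path

# 200,000 records -> one zip containing two files of 100,000 records each.
```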

Resuming workflows

The Resume feature allows you to resume the execution of your failed and timed out workflows. Since this feature works at the workflow level, you need to enable it for each workflow that you may want to resume in the future. To enable this feature for a workflow, navigate to Workflow -> Workflow Settings and then check the Save status of each successfully executed action checkbox under the Execution Settings tab.

Note

  • You can Resume the Workflow executions from the main execution listing page of the Monitor tab. This option is provided under the Actions column.

  • If a workflow is invoked through the Messaging subscriber, then the Resume option is not applicable.

If you have enabled the Resume feature for your workflow and it fails, you will see the Resume button in the execution log of that workflow.

When you click on the Resume button, a dialog box will appear where you will be prompted to specify whether you’d like to edit the input JSON data of failed action(s) before resuming the workflow execution or resume the workflow execution directly.

Click on Resume to immediately resume the workflow execution from the point it failed in the previous run.

Click on Edit Input to modify the input JSON of failed action(s).
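
For illustration only, the edited input of a failed action might look like the following; the action type and field names are hypothetical:

```python
# Hypothetical example: the action and field names are not from the product.
# A corrected value replaces the one that caused the failure in the previous run.
edited_action_input = {
    "url": "https://api.example.com/v1/customers",
    "method": "POST",
    "body": {
        "name": "Jane Smith",
        "email": "jane.smith@example.com"   # previously an invalid address
    }
}
```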

Once this is done, click Resume. This will resume the workflow execution from the point it failed in the previous run, using the modified JSON input data for the failed actions, and you will be redirected to the Execution Logs page. Next, refresh the page by clicking the Refresh icon located on the left side of the screen. Clicking this icon fetches the latest status of the workflow execution in Execution Logs.

If the workflow is executed successfully, you will see that the execution status of the workflow is changed from Failed to Success. Moreover, when you click the workflow execution log, you can see the complete execution log details (previously failed action logs and current successfully executed action logs).

Restarting workflows

The Restart feature enables you to restart the execution of your failed, timed out, and stopped workflows. Additionally, this feature supports restarting failed manual workflow executions.

As this feature works at the workflow-level, you need to enable it for each workflow that you may want to restart in the future. To enable this feature for a workflow, go to Workflow -> Workflow Settings and then select the Save status of each successfully executed action checkbox under the Execution Settings tab.

You can restart one or multiple workflows as per your requirements.

Restarting a single workflow

You can Restart the Workflow executions from the main execution listing page of the Monitor tab. This option is provided under the Actions column.


If you have enabled the Restart feature for your workflow and the workflow fails, times out, or is stopped, the Restart button appears in the execution log of that particular workflow.

When you click on the Restart button, a dialog box will appear on screen where you will be prompted to specify whether you’d like to modify the webhook payload data before restarting the workflow or restart the workflow directly.

If you don’t want to modify the webhook payload data, click on Restart. This will immediately restart the workflow execution using the existing webhook payload.

If you want to modify the webhook payload data before restarting the workflow, click on Edit Payload.

Once you have modified the webhook payload as per your requirements, click Restart. This will restart the workflow execution using the modified webhook payload.
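
Purely for illustration, a corrected webhook payload supplied before restarting might look like this; the structure and field names are hypothetical:

```python
# Hypothetical webhook payload; structure and field names are illustrative only.
edited_webhook_payload = {
    "event": "order.created",
    "data": {
        "orderId": "A-1042",
        "quantity": 3          # corrected from the value sent in the failed run
    }
}
```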

You will be redirected to the Execution Logs page, where you will need to refresh the page by clicking on the Refresh icon located on the left side of the screen. Clicking on this icon will fetch the latest status of the workflow execution in Execution Logs.

When a workflow execution is restarted, in the Status column of the executions table, the label Restarted is placed below Failed, indicating that the workflow has been restarted after encountering a failure.

Note
You can restart a failed workflow multiple times by clicking on the Restart icon located in the Actions column.

You can check the restart history of the restarted workflow by clicking on the icon placed beside the Restarted label in the Status column. Upon clicking the icon, the Restart History window appears, displaying a chronological list of restart events with timestamps. You can click on a timestamp to view the complete execution log information for that specific workflow execution.

You can click on the name of the restarted workflow to view its execution details.

Clicking on the Restart Reference ID will direct you to a screen showing the execution details of the initially restarted workflow. Additionally, this screen also displays the total number of times the workflow has been restarted. Clicking on the Restart Count will return you to the Restart History window.

Restarting multiple workflows

You can select multiple failed Workflow executions to restart in just one click. To achieve this, in the Executions table, select the checkboxes beside the names of the failed workflows and click the Restart button.

Note
  • When restarting multiple workflows simultaneously, any workflow executions that have already been restarted previously cannot be selected in the bulk restart operation.
  • While restarting multiple workflow executions, if one of the selected workflows fails, times out, or stops, make the necessary changes to the corresponding workflow from the canvas only. Modifying the webhook payload data is not supported when you restart multiple workflows.
  • You can select a maximum of 150 workflow executions to restart at a time.

Flow service Execution

The Flow service Execution option lets you view and monitor Flow service execution-related data in a detailed graphical format.

A Flow service can have one of the following statuses:

Using filters to fetch specific Flow service execution data

You can fetch specific Flow service execution data for a certain time frame and/or a certain project, Flow service, and execution status.

Fetching Flow service execution data for specific time frame

You can fetch the execution data for a specific duration by selecting the relevant time frame option given at the top-right corner of the screen.

You can alternatively specify a custom time frame using the date picker option.

Note
  • The maximum custom date range that you can specify is 30 days.
  • The Flow service monitoring data is consolidated on an hourly basis only.

Fetching Flow service execution data based on execution source, project, Flow service, execution status, and context ID

You can fetch the execution data associated with an execution source, a project, a Flow service, an execution status, and the context ID. This can be achieved by applying the filter criteria.

To apply a filter, click on the Filters option.

You will see the following filter criteria:

Note
  • Currently, you can select only one Execution Source from the drop-down list.
  • Selecting values in other filters first and then in the Execution Source will automatically clear all the applied values in other filters.
  • By default, selecting REST APIs or SOAP APIs as the Execution Source will replace the filter - Flow services with the filter - REST API or SOAP API respectively. You can then choose one or more APIs (REST API or SOAP API) from the drop-down list to fetch the Flow service execution data. On the other hand, selecting Scheduler, User Interface, HTTP Interface, Workflow Interface, Streaming, or JMSTrigger as the Execution Source will not affect other filters.
To fetch Flow service execution data using a custom context ID:

  1. In a Flow service, select the setCustomContextID service available in the Flow category. It is recommended to add the setCustomContextID service as the first step in a Flow service.

  2. In the mapping editor, set a value for the id field and then save and run the Flow service.

    Note
    The maximum number of characters supported for the id field is 36. If you specify more than 36 characters, the additional characters are truncated (see the sketch after this procedure).
  3. Search for the custom context ID in the Monitor > Execution Results > Flow service execution page by clicking Filters and by specifying the custom context ID in the Context ID field.

  4. Once you select the required filter options, click Apply.
    The execution details will be displayed in a graphical format and the execution logs will appear for the selected filter criteria. To reset the filter back to default, click Reset.
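
The following minimal sketch illustrates the 36-character limit mentioned in the note under step 2; the helper function is ours and not a platform service:

```python
# Illustrative only: shows the documented 36-character limit on the custom context ID.
MAX_CONTEXT_ID_LEN = 36

def effective_context_id(value: str) -> str:
    """Characters beyond 36 are truncated, as described in the note above."""
    return value[:MAX_CONTEXT_ID_LEN]

print(effective_context_id("order-2024-11-07-retry-batch-eu-west-0001"))
# -> 'order-2024-11-07-retry-batch-eu-west'
```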

Viewing Flow service execution logs

The Flow service Execution option lets you view and monitor the execution log for each Flow service that you run.

You can select the columns you want to view in the Executions table based on your requirements by clicking on the Settings button located beside the Download Logs option. The Settings button, when clicked, displays a list of column names and allows you to select the columns you want to view in the Executions table. Column names that are not selected will be hidden in the Executions table.

Note
  • Selecting or deselecting the columns - Name and Actions is not supported.
  • By default, the columns - Transaction count and Context ID are not selected.

You can view a list of Flow service execution details in an ascending or descending order using the Sort by option. The Sort by option appears when you hover over the column names in the Execution table as shown below:

Note
Currently, the Sort by option is available only for the columns - Name, Start Time, and Duration.

Click on the Sort by option to view the execution logs in an ascending or descending order. You can sort the order based on the following criteria:

To view the detailed execution logs, click on the name of the relevant flow.

This will take you to a screen having complete information about that particular Flow service. The execution page also displays the total number of documents processed by the Flow service, the number of documents processed successfully, the number of documents that did not process successfully, and the success score.

Next, to view detailed information about operations for the selected Flow service, click on the name of the operation as shown below:

You can view additional information about operation execution including Details, Results, and Business Data. Click on the relevant options to view the required information.

By default, you can retain Flow service Execution result entries for 30 days. You can optionally specify the number of days (up to 30) for which you would like to retain the Flow service execution logs by clicking the Modify Retention Period link. Once the retention period is over, the Flow service execution logs are deleted automatically.

Note
By default, Flow service Execution Results for new tenants are retained for a period of 7 days.

Downloading Flow service execution logs

You can optionally download the execution logs of Flow services to your local machine by clicking Download Logs as shown:

Terminating running Flow service executions

You can optionally terminate the ongoing Flow service executions from the Flow service Execution details page. To do so, navigate to the Running Executions section, select one, multiple, or all running Flow service executions, and click on the Terminate button as shown below:

Note
The Terminate button is available for ongoing Flow service executions only.

Resuming Flow services

The Resume feature enables you to resume execution of your successful or failed Flow services.

For more information, see the Resuming Flow services section.

Restarting Flow services

The Restart feature enables you to restart execution of your successful or failed Flow services.

For more information, see the Restarting Flow services section.

Alert Rules

You can set automated alert rules for your projects to send notifications to specific users when a workflow or Flow service fails, times out, or completes execution.

In the Monitor page, click on Alert Rules as shown below:

Workflow Alerts

The Alert Rules option lets you send alert notifications to specific recipients when certain events occur during workflow execution. This will help you keep relevant users updated about the status of workflow executions.

Adding Workflow Alert Rules

To add a new alert rule, navigate to tenant homescreen > Monitor > Workflow Alerts.

Click on the Add Alert button located at the right of the screen to create a new alert rule.

You will be redirected to a New Workflow Alert configuration screen.

In the New Workflow Alert configuration screen that appears, enter the details as given below:

Once you have entered these details, click Save. This will add the specified alert rule for your tenant. By default, the status of a newly added alert rule is inactive. You will have to manually activate the alert rule by using the Active toggle button.

After this, whenever the selected workflow fails, times out, or completes execution, an alert notification will be sent to the specified email address.

Managing Workflow Alert Rules

When you click on Workflow Alerts, you will see the list of all existing alert rules for your tenant. Here, you can see the name of each alert rule, along with its description and status (active/inactive).

Note
Integration provides the following options when you set an alert rule for workflows:
  • Use custom sender address for alert email notifications: Previously, all alert email notifications were sent from the default sender address set by Integration. You can now use your own SMTP configurations to send alert email notifications from a custom sender address. To do so, you will need to share the relevant details of your SMTP configuration with our Customer Service team, which in turn will set up a custom SMTP config in our internal portal.
  • Display tenant name in the subject of the alert email notification: All alert email notifications will now contain the name of the tenant for which the notification is created. This will help you to clearly distinguish between alert notifications of different tenants. Subject format: Integration alert -<workflow_name> from Tenant: <tenantname> (see the sketch after this note).
  • Display tenant timezone in alert email notifications: Previously, all alert email notifications had the UTC timezone. Now, each alert email notification will display the relevant time zone of the tenant for which the notification is created.
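
As a small illustration of the documented subject format, with placeholder values:

```python
# Placeholder values; only the subject format itself comes from the documentation.
workflow_name = "Untitled Workflow"
tenant_name = "acme-dev"
subject = f"Integration alert -{workflow_name} from Tenant: {tenant_name}"
# -> "Integration alert -Untitled Workflow from Tenant: acme-dev"
```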

Flow service Alerts

The Alert Rules option lets you send custom notifications to specific recipients when certain events occur during Flow service execution. This will help you keep relevant users updated about the status of Flow service executions.

Adding Alert Rules

To add a new alert rule, navigate to tenant homescreen > Monitor > Flow service Alerts.

Click on the Add Alert button located at the right of the screen to create a new alert rule. Along with a new alert rule, you can also add an alert frequency period (5 mins, 10 mins, or 15 mins) to specify how often the alert rule should be run.

You will be redirected to the New Flow service Alert configuration screen.

In the New Flow service Alert configuration screen that appears, enter the details as given below:

Once you have entered these details, click Add. This will add the specified alert rule for your tenant. After this, whenever the selected Flow service fails, times out, or completes execution, an alert notification will be sent to the specified email address.

Note
If you delete a Flow service from your tenant, the alert rule (if added) associated with that Flow service will be automatically removed from the list of existing alert rules.

Managing Flow service Alert Rules

Note
You can select the alert frequency period (5 mins, 10 mins, or 15 mins) to send email messages as per your requirements by clicking on the drop-down arrow beside Alert Frequency.

When you click on Flow service Alerts, you will see the list of all existing alert rules for your tenant. Here, you can see the name of each alert rule, along with its description and status (active/inactive).

General

The General section allows you to view, track, and monitor your tenant activities. You can view audit logs, current month’s transaction usage statistics for your transaction-based tenants, and clear storage locks for integrations.

Audit Logs

The Audit Logs section maintains a record of all the activities performed by users. It maintains a history of all the actions performed within a tenant, including details such as the type of action performed, the user who performed the action, and the date/time.

For the Develop Anywhere, Deploy Anywhere and Central Control, Distributed Execution capabilities, audit logs are supported for the following operations:

To access the tenant audit logs, navigate to tenant homescreen and then click Monitor > Audit Logs.

Using Filters in Audit Logs

You can apply filters on the audit logs to retrieve specific logs.

Filter Logs by Time-range Selector

You can view audit logs for a certain time frame using the date picker located at the top-right corner of the Audit Logs screen.

By default, you will see the logs from the last 12 hours. You can either use the time-range menu to select the required time range or specify the start and end dates to fetch the audit logs created between them.

Note
The maximum custom date range that you can specify is 30 days.

Filter Logs by Search Query

You can perform two types of searches, namely, Simple Search and Advanced Search to search and view log events as per your requirements. You can search and filter a set of particular log entries by specifying a search term or a query expression in the search query box. Let’s now understand each of these search capabilities in detail.

Simple Search

You can quickly search through your audit logs by entering a search term in the search query box to fetch specific log details.

A search term can consist of a word (such as John or Doe) or a phrase (like untitled workflow). When a search term is entered, the search scans all columns to retrieve log entries that contain the specified word or phrase.

Note
By default, search queries are not case-sensitive.

For instance, to fetch all log details associated with a Default project, you can simply enter Default or default.

Note
  • You can perform searches on columns - Module, Title, and Action. In addition, the platform allows you to perform searches on other attributes - Module ID and Metadata.
  • You can perform a partial text search for the column - Title, which means all records containing the specified search term will be retrieved. However, for other columns - Module and Action, and for other attributes - Module ID and Metadata, you will need to enter a search term that exactly matches your target keyword.

Similarly, if you want to fetch details of all untitled workflows, you can type the phrase Untitled Workflow or untitled workflow in the search query box.
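
The following is a minimal sketch of this kind of case-insensitive, all-column scan, assuming each log entry is a simple dictionary; it is not the platform's implementation:

```python
# Illustrative only: a case-insensitive scan across all columns of each log entry.
logs = [
    {"module": "project", "title": "Default", "action": "create", "user": "johndoe"},
    {"module": "workflow", "title": "Untitled Workflow", "action": "delete", "user": "janesmith"},
]

def simple_search(entries, term):
    term = term.lower()
    return [e for e in entries if any(term in str(v).lower() for v in e.values())]

print(simple_search(logs, "default"))            # matches the first entry
print(simple_search(logs, "untitled workflow"))  # phrase match on the second entry
```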

Advanced Search

You can perform an advanced search on your logs to narrow down your searches, form complex queries, and fetch specific log details.

This functionality allows you to search for a particular set of log entries that satisfy the condition specified in a query expression. A query expression consists of multiple search terms in conjunction with operators.

Note
  • You can perform searches on columns - Module, Title, and Action. In addition, the platform allows you to perform searches on other attributes - Module ID and Metadata.
  • You can perform a partial text search for the column - Title, which means all records containing the specified search term will be retrieved. However, for other columns - Module and Action, and for other attributes - Module ID and Metadata, you will need to enter a search term that exactly matches your target keyword.
  • By default, search queries are not case-sensitive.
  • Operators are case-sensitive and hence should always be capitalized. Operators written in lowercase are treated as search terms.

Supported Query Expressions

Using an advanced search capability, you can combine multiple search terms with different operators to perform a more specific search. A query expression can either consist of multiple search terms separated by commas or multiple search terms grouped together with parentheses.

Following are the different query expressions that you can perform:

A query that has search terms separated by a comma returns log entries containing the specified values in the same row. For example, the query - Project, Delete - will search for log entries containing Project and Delete in the same row.

A query that has multiple groups of search terms joined by an OR operator, with each group enclosed in parentheses, returns log entries containing either or both of the groups of search terms in the same row.

For example, the query - (Project, Delete) OR (Untitled Workflow) - will search for log lines containing either Project and Delete or Untitled Workflow or both in the same row.

A query with field-based searches returns the log entries where the attribute equals the specified value in the same row.

For example, the query - (module: project, action: create) will search for log entries where the module field equals project and the action field equals create in the same row.

Supported Operators

A query expression allows the following operators:

| Operator | Example | Description |
| --- | --- | --- |
| OR | (Project, Delete) OR (Default) | Searches for log entries containing either or both of the specified search terms in the same row |
| , (comma) | Project, Delete | Searches for log entries containing all the specified search terms in the same row |
| : (colon) | (module:project, action:delete) | Searches for log entries where the field equals the specified value |

Example Query Expressions

Following are examples of a few query expressions for searching a specific set of log entries:

| Query Expression | Description |
| --- | --- |
| workflow, delete | Retrieve log entries containing workflow and delete in the same row |
| created, johndoe | Retrieve log entries for which the ‘Created’ action is performed by ‘johndoe’ |
| (created, johndoe) OR (updated, johndoe) | Retrieve log entries for which the ‘Created’ or ‘Updated’ action is performed by ‘johndoe’ |
| (published, janesmith) OR (published, johndoe) OR (published, veronicasmith) | Retrieve log entries for projects that are published by ‘janesmith’, ‘johndoe’, or ‘veronicasmith’ |
| (module: project, action: delete) | Retrieve log entries where the column fields ‘module’ and ‘action’ equal the values ‘project’ and ‘delete’ respectively |

Downloading Audit Logs

You can download the audit logs (all or filtered records) either in JSON or CSV format to your local machine.

To do this, click the Download Logs button located at the upper-right corner of the Audit Logs screen.

Next, select the desired format type of audit log report that you want to download.

With this, the required audit logs will be downloaded to the default download location in your machine.

Usage

Note
This option is visible only if transactions are enabled for your tenant.

For all transaction-based tenants, a certain number of transactions is allocated to the account every month, depending on the selected plan. You can view the current month’s transaction usage statistics of your transaction-based tenants through the Usage tab.

Note
Paid tenants access a different Transaction Usage screen compared to non-paid and Free Forever Edition (FFE) tenants.

For non-paid and Free Forever Edition tenants:

To view the transaction usage of your tenant, navigate to the homescreen and then click Monitor > Usage.

Here, you can check the number of transactions already consumed by your tenant workflows and Flow services out of the total allocated transactions, for the current month.

Note
To upgrade your plan for more transactions, contact our support team by clicking the Contact Support link.

For paid tenants:

You will not see the usage bar that is visible to non-paid and Free Forever Edition (FFE) tenants. Only the total number of transactions consumed is displayed when you click the Usage tab.

Clear Storage Locks

Storage locks refer to a mechanism where the system temporarily locks or marks a shared storage resource (Storage service and scheduled integrations with Prevent concurrent executions) during the execution of an integration to prevent other processes from interfering with it. IBM webMethods Integration uses a short-term store for information that needs to persist across server restarts.

The lock mechanism is used to control access and execution of integrations to avoid conflicts, especially in scenarios where multiple processes or instances might interact with the same data or resources concurrently. Locks help ensure that only one instance of an integration or process is executed at a given time, preventing issues like data inconsistency or resource conflicts. This is achieved by acquiring a lock on a storage resource before performing operations on it and releasing the lock afterward to allow other processes to access the resource.
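
Conceptually, the pattern looks like the following sketch, which uses a generic in-process lock as a stand-in for the platform's storage lock; the storage key and function names are illustrative:

```python
# Conceptual sketch only: a generic lock standing in for the platform's storage lock.
import threading

storage_locks = {"orders-sync": threading.Lock()}    # one illustrative lock per storage key

def run_integration(key: str):
    lock = storage_locks[key]
    if not lock.acquire(blocking=False):              # another run already holds the lock
        print(f"{key}: skipped, a concurrent execution holds the lock")
        return
    try:
        print(f"{key}: executing with exclusive access to the shared resource")
        # ... integration logic that reads/writes the shared storage ...
    finally:
        lock.release()                                # release so later runs can proceed

run_integration("orders-sync")
```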

The Integration system automatically removes integration locks at scheduled intervals, occurring every 1 hour and 15 minutes. However, if you want to manually clear the locks before the designated time, you can clear the locks from the Clear Storage Locks page.

Note
The Clear Storage Locks feature is applicable only for Flow services.

The following are instances when the locks are applied on an integration, and you must manually clear the locks:

Scenario 1

If you choose the Prevent concurrent executions option when scheduling an integration, a lock is applied to the integration before each execution.

In the event of a system outage in Integration while a scheduled integration is ongoing, the lock remains in place and is not automatically released. As a result,

Scenario 2

If you have added the Storage add and lock services in an integration, a lock is placed on the integration. This lock is automatically released upon the completion of the integration execution.

Scenario 3

Suppose you have an integration that may run for a few hours, for example, the runtime exceeds 1 hour and 15 minutes, and the integration acquires a lock through the Storage add and lock services or by using the Prevent concurrent executions option in the scheduler. Now, even if the integration is still running, Integration automatically releases the lock during its routine lock removal schedule. To prevent this scenario and increase the lock clear time, contact Support.

Accessing Clear Storage Locks

  1. Go to Monitor > General > Clear storage locks.

    The Clear Storage Locks page displays the following details:

    • Storage Context: Name of the storage provided when adding the storage service in the integration.

    • Key: Name of the storage key provided when adding the storage service in the integration.

    • Locked On: Date and time when the lock was applied on the integration.

Clearing Storage Locks

  1. Go to Monitor > General > Clear storage locks. All integrations on which the locks have been applied appear.

  2. Select the checkboxes corresponding to the integrations for which you want to release the lock.

  3. Click Clear. The lock is cleared and the integration will be executed successfully.

Connectors

This section provides a quick overview of the SAP® ERP performance and SAP® ERP connection pools.

On the Monitor page, click Connectors > SAP® ERP.

SAP® ERP allows you to monitor the following:

Insights

The metering functionality provides you with an overall execution count. However, this information alone is insufficient for a deeper analysis of your usage patterns based on conditions such as time range or specific category-keys.

What you need are:

The Insights functionality is designed to provide you with all of these options and more.

Introduction

The Insights feature adds a new module to the Integration portfolio. The primary objective of Insights is to provide you with statistics of transactions. This is achieved through the use of graphical depictions highlighting your usage of Flow services and Workflows. The filtering parameters include:

Accessing Insights

  1. Log in to your tenant. Select the Monitor tab.

  2. Select Insights > Overview in the left-hand side menu.

    • This dashboard provides you with an overview of the number and types of transactions processed through your Integration account.
    • The cards provide you with data on the number of transactions for the Last 7 Days and from the start of the month. Each card shows you a percentage change in comparison with the previous 7 days and the previous month respectively. A negative change is depicted with red, whereas green depicts a positive change.
    • The Timeline card provides a plot of the number of transactions over days, displayed as an XY plot.
    • The Share card provides you with a donut chart that displays the division of transactions shared among the Workflows and Flow services.

  3. Use the calendar to choose your date range. Your cards automatically update graphs based on your calendar selection.

    • When you select a date range, the system applies a start time of 00:00 and an end time of 23:59.
    • The current date information is not captured in the depictions.
    • Your data aggregation is done on a per-day basis, based on the GMT time zone.

  4. Enable or disable the attributes by clicking on the graph legends. Y-axis auto-scales based on the number of transactions. For example, 2 million transactions is displayed as 2M, and 500 thousand transactions is displayed as 500K.

  5. The Analytics section provides you with advanced transaction count charts based on selected date range. The Services tab provides data on Workflows and Flow services. If you want to see all the data again under the tab, click on the Reset button at any time.

    • Turn on the Trend line to depict a pattern across all displayed service types.

  6. In the Analytics section, the Projects tab displays a detailed breakdown of all transactions for each project in your tenant. Click on a graph legend to enable or disable a specific project.

    • The Project list is paginated, with each page displaying data for only five projects. If you do not see your project listed, select the next page.
    • Turn on the Trend line to depict a pattern across all displayed projects.

  7. The Highlights section displays transaction data, sorted by the highest transaction count by default. This information is useful in identifying which services or projects are contributing to more transactions and can assist with performance optimization.

    • The highlights list is paginated, with the default listing as 10 items per page.
    • This list includes a filter based on Transaction count and Execution count. For example, if your Execution count - lower limit is 10, then all services with Executions less than 10 are not displayed. The filter range is 1-100.
    • The Project filter list also displays deleted projects.

  8. The Reports section displays the transaction and execution count of Workflows and Flow services month wise in separate rows.

    • The reports list is paginated, with the default listing as 10 items per page.
    • This list includes a filter based on Service Type, Project, and Services. The Services filter is enabled upon selection of the Project and allows you to select a specific service or all the services from the dropdown menu.
    • The report is downloadable as a csv file for storage and future reference by clicking the Download Reports button located at the upper-right corner of the Reports screen.
    • The Project filter list also displays deleted projects.

    Note
    • The deleted project name for a Flow service is displayed as Deleted Project on the Reports page, similar to the Monitor page.

    • If there are multiple deleted Flow services with the same name, then you can view the duplicate Flow services using the Integrations filter on the Reports page.
  9. The Top Consumers section provides you with data on the top transaction consumers, categorized by:

    • Projects
      • Hover on the graph for these details: Project: Number of transactions

    • Workflows:
      • Naming convention workflowName:projectName
      • Hover on the graph for these details: Workflow: Number of transactions

    • Flow services:
      • Naming convention flowserviceName:projectName
      • Hover on the graph for these details: Flow service: Number of transactions

Note
Metering retains the final authority with regard to metering data in Insights. The Insights capability does not capture transactions when data is returned from the cache of a Flow service. However, these transactions continue to be metered.