Explore the release highlights, usage notes, and known issues for webMethods.io Integration.
|New features and enhancements||Description|
|Streaming support and replaying Salesforce events||webMethods.io Integration now allows you to integrate with streaming APIs and process streaming API events using listeners. You can subscribe to any supported streaming API event and specify the FlowService to be invoked when the subscribed event occurs. Currently, this feature is enabled only for the Salesforce connector.
You can create a Salesforce CRM listener, select a subscription channel, and specify the FlowService to be invoked on the incoming events. Additionally, you can configure the headers and parameters as well as enable and disable the listener. Once enabled, the listener receives the streaming API events and processes the received events.
The Salesforce CRM listener can subscribe to and listen for Salesforce events. Salesforce stores standard-volume events for 24 hours, so for Salesforce versions later than v37.0, you can retrieve events that are still within the retention window. You can replay the lost events by selecting one of the following replay options:
|Two-way SSL communication for hybrid integrations||webMethods Integration Server 10.7 and later versions support two-way SSL communication between the on-premises Integration Server and webMethods.io Integration. In two-way SSL communication, the on-premises Integration Server and webMethods.io Integration each present a certificate and validate the other’s certificate, using their private keys to prove certificate ownership.|
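As a conceptual illustration only (not product configuration), a two-way SSL client context can be sketched with Python's standard ssl module; the file paths are hypothetical placeholders:

```python
import ssl

def make_mutual_tls_context(ca_file, cert_file, key_file):
    """Build a client-side SSL context for two-way (mutual) TLS.

    All file paths are hypothetical placeholders for illustration.
    """
    # Trust the peer only if its certificate chains to our CA...
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=ca_file)
    # ...and present our own certificate; the private key proves we own it.
    ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
    ctx.verify_mode = ssl.CERT_REQUIRED
    return ctx
```

With such a context, both sides of the connection must supply a valid certificate for the handshake to succeed.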
|Support for multipart request body||For some connectors and operations, webMethods.io Integration now supports the multipart/form-data media type, which lets you embed binary data such as files in the request body. Here’s the list of connectors that currently support multipart functionality:
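To illustrate what the multipart/form-data media type looks like on the wire, here is a minimal standard-library sketch that assembles a multipart body; the field names and file content are illustrative only:

```python
import uuid

def build_multipart(fields, files):
    """Assemble a multipart/form-data body (bytes) and its Content-Type header.

    fields: {name: text_value}; files: {name: (filename, bytes, content_type)}.
    """
    boundary = uuid.uuid4().hex
    parts = []
    for name, value in fields.items():
        parts.append(
            (f'--{boundary}\r\n'
             f'Content-Disposition: form-data; name="{name}"\r\n\r\n'
             f'{value}\r\n').encode()
        )
    for name, (filename, content, ctype) in files.items():
        header = (f'--{boundary}\r\n'
                  f'Content-Disposition: form-data; name="{name}"; filename="{filename}"\r\n'
                  f'Content-Type: {ctype}\r\n\r\n').encode()
        parts.append(header + content + b'\r\n')
    parts.append(f'--{boundary}--\r\n'.encode())  # closing boundary
    return b''.join(parts), f'multipart/form-data; boundary={boundary}'

body, content_type = build_multipart(
    {"description": "invoice"},
    {"file": ("invoice.pdf", b"%PDF-1.4 sample", "application/pdf")},
)
```

Each file part carries its own Content-Type, so binary data travels alongside ordinary form fields in one request body.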
|New Services||Compress category
Use Compress services to compress the data before sending the HTTP request and decompress it after receiving the HTTP response.
compressData: Performs compression of data.
decompressData: Performs decompression of data.
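Conceptually, the compress/decompress round trip works like this standard-library Python sketch; the JSON payload is illustrative and stands in for an HTTP request or response body:

```python
import gzip

# Shrink the payload before sending the HTTP request...
payload = b'{"orderId": 1042, "items": ["A", "B", "C"]}' * 50
compressed = gzip.compress(payload)

# ...and restore it after receiving the HTTP response.
restored = gzip.decompress(compressed)
```

Repetitive payloads like the one above compress well, which is the main benefit of compressing before transfer.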
Use Datetime services to build or increment a date/time. The services in the Datetime category provide more explicit time zone processing than similar services in the Date category.
build: Builds a date/time string using the specified pattern and the supplied date/time elements.
increment: Increments or decrements a date and time by a specified amount of time.
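The build/increment pattern with explicit time zone handling can be sketched in Python; the date, pattern, and fixed UTC-4 offset are illustrative:

```python
from datetime import datetime, timedelta, timezone

# "build": assemble a date/time from explicit elements with an explicit zone
# (a fixed UTC-4 offset here, standing in for a named time zone).
dt = datetime(2021, 10, 15, 9, 30, tzinfo=timezone(timedelta(hours=-4), "EDT"))
built = dt.strftime("%Y-%m-%d %H:%M %Z")

# "increment": shift the date/time forward by a specified amount of time.
incremented = dt + timedelta(hours=3)
```

Carrying the zone on the value itself is what makes the arithmetic unambiguous across time zones.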
In the existing Flow category, the new setCustomContextID service is now available. setCustomContextID associates a custom value with an auditing context. This custom value can be used to search for FlowService executions in the webMethods.io Integration Monitor screen.
|Execution Source||In the webMethods.io Integration Monitor screen, the Execution Source filter is now available, which lets you search for FlowService executions and fetch relevant execution data based on the execution source. Available values are Scheduler, User Interface, HTTP Interface, REST APIs, SOAP APIs, Workflow Interface, Streaming, and JMSTrigger.
|Custom Transaction ID||You can now select the setCustomContextID service available in the Flow category, set a value for the id field in the mapping editor, and save and run the FlowService. You can then search for the custom context ID on the Monitor > Execution Results > FlowService execution page by clicking Filters and specifying the custom context ID in the Context ID field.
|Support for Partner Certificate for AS2 connector||webMethods.io Integration now allows you to add a certificate of a trusted partner. The partner’s certificate contains a public key which is required to encrypt outbound request messages and verify the signature of inbound messages.
The certificate received from your trusted partner can be used while configuring accounts for certain connectors (currently supported only in the AS2 connector) to securely send and receive business data across the communication protocol.
|Database Application changes||Dynamic SQL and custom SQL action templates
Custom SQL: Defines and executes custom SQL to perform database operations. You can run almost any SQL statement required, such as data management statements and data definition statements.
Dynamic SQL: Configures a dynamic SQL statement, part of which you set at run time using input fields. At run time, the Dynamic SQL action creates the SQL statement by combining the contents of the input fields and then runs it. This is useful when you need the flexibility to set all or part of a SQL statement at run time instead of at design time.
Database Application supports the following databases: SAP HANA, Amazon Redshift, Snowflake, Heroku PostgreSQL, Teradata, and Vertica.
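The Dynamic SQL idea, assembling part of a statement from run-time input fields while binding the values as parameters, can be sketched with an in-memory SQLite database; the helper function and the orders table are hypothetical:

```python
import sqlite3

def run_dynamic_select(conn, table, columns, where_column, value):
    """Build part of the SQL statement from run-time inputs, bind the value.

    Identifiers (table/column names) come from trusted design-time or
    run-time input fields; data values are always bound as parameters.
    """
    sql = f"SELECT {', '.join(columns)} FROM {table} WHERE {where_column} = ?"
    return conn.execute(sql, (value,)).fetchall()

# Illustrative data set for the sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 'open'), (2, 'closed')")
rows = run_dynamic_select(conn, "orders", ["id", "status"], "status", "open")
```

The statement text is finalized only at run time, which is the flexibility the Dynamic SQL action template provides.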
|Dynamic Accounts for predefined connector operations||The Dynamic Account feature enables you to dynamically override certain details of the account being used to run a predefined operation of a connector.
Currently, the dynamic account feature is enabled for certain operations in the following predefined connectors:
|Dynamic Accounts for custom REST connector operations||The Dynamic Account feature enables you to dynamically override certain details of the account being used to run a custom action created under a custom REST connector.
You can enable this feature while creating a new custom action or by editing an existing custom action of a custom REST connector.
|Input and Output field Validations||The Define IO feature has been enhanced to validate the input and output fields at run time. These validations are performed based on the constraints specified while defining the fields.
The validation process prevents erroneous information, such as data of incorrect length or inappropriate composition, out-of-range values, and invalid or missing data, from being processed and stored. It also ensures that the data entered is reliable and accurate, as identifying and correcting errors later can be very time-consuming and expensive.
Two new fields, Validate input and Validate output, have been added to the Define input and output fields screen in support of this feature. You can enable or disable the validations based on your requirements.
|Expression Editor||A new Expression Editor interface has been added to help you easily create complex expressions for conditional controls such as If, Else If, While, and Do Until in FlowServices. Further, the Expression Editor:
|Map by Condition||A new Map by Condition option has been added in Pipeline that allows you to define conditions for mapping between fields. For example, map the fields BuyersTotal and OrderTotal only if BuyersTotal has a value, that is, is Not Null. The fields are mapped only when the condition is satisfied; otherwise, the FlowService fails.
This helps you to validate the values at design time and ensure that the FlowService runs without errors.
|Indexed Mapping||The Array Mapping feature in Pipeline has been enhanced to allow you to map array fields at the element level. For example, you can specify the index of the array element you want to map, that is, map the third element in a String Array to a String.
Earlier, mapping at the array element level was not possible. A new Indexed Mapping dialog box has been introduced to map the array elements.
This option is currently not available for Document and Document Reference arrays.
|Managing subscriber states in Messaging||You can now view, change, and manage the state of a subscriber. The states that can be set for a subscriber are Enable, Disable, and Suspend.
|Import and export functionality for Messaging connector||webMethods.io Integration now allows you to import and export workflows or FlowServices that use a Messaging connector.|
|Cloning Messaging connector||webMethods.io Integration now allows you to clone workflows or FlowServices that use a Messaging connector.|
|Account usage for connectors||In earlier releases, webMethods.io Integration supported two different authentication mechanisms for connectors in Workflows and FlowServices.
Starting with this release, webMethods.io Integration supports one common Account for both CloudStreams and Node.js connectors. This reduces the effort of creating multiple Accounts for actions and triggers.
The common authentication functionality is currently available for the following connectors:
|Previewing FlowService Recipe Pipeline Mappings||In earlier releases, you could preview the FlowService steps of a Recipe but not the pipeline mappings. You can now view the mappings of each FlowService step that does not include either custom operations or other FlowServices. This helps you gain more insight into the Recipe functions before using them.
|Recipe Lock||A message will now appear for a user who is trying to import a recipe when another user is already importing the same or a different recipe concurrently, in the same tenant.|
|Selecting all data fields to use for business objects||You can now select all the data fields to use for the business object while creating a custom operation by selecting the Fields option.
|Search functional areas while creating custom operations||For some connectors, for example, Coupa and Cumulocity, you can now search for the functional areas while creating a custom operation.
|Tooltip for predefined connectors in FlowServices||While creating a FlowService, if you hover over a predefined connector, you can now view a pop-up window that provides detailed information about the particular connector.
|Revamped Getting Started wizard||The revamped Getting Started wizard helps you familiarize yourself with webMethods.io Integration and access the referenced resources.
|Intercom removed from the UI||In earlier releases, you could use the Intercom available on the user interface for support. In this release, the Intercom is removed.
For support, go to the Help menu and click Support.
|New Recipes for Magento Connector||The following new recipes have been added in Workflows for Magento:
This section provides any additional information that you need to work with webMethods.io Integration.
webMethods.io Integration is supported on the latest version of Google Chrome web browser (v92 or later).
After webMethods.io Integration is upgraded, the older version browser’s cache and cookies are not cleared automatically. This may lead to incorrect display of page content. Ensure that you clear the browser’s cache manually after every upgrade, and then log in to webMethods.io Integration.
webMethods.io Integration is best experienced when the scale and layout are adjusted to the following settings:
webMethods.io Integration supports Transport Layer Security (TLS) version 1.1 and version 1.2 for connectivity.
Any time stamp displayed in webMethods.io Integration is based on the user’s registered time zone specified in Software AG Cloud. Not all the time zones in Software AG Cloud are supported in webMethods.io Integration. If a time zone in Software AG Cloud is not supported, then the time stamp in webMethods.io Integration defaults to the Pacific Standard Time (PST) time zone.
To change the time zone for a user profile, use the App Switcher to go to Software AG Cloud, and click My Cloud > Administration. On the Administration page, edit the user profile, and under Locale Information, select the required time zone. Save the changes, log out, and then log in again to see the changes.
Software AG Cloud products are available in several geographical regions, operated by different infrastructure providers. Click here for information on the available products and also the underlying infrastructure provider in each region.
Information on major releases, hotfixes, and patch releases including the latest statistics on cloud environments availability, incident history, and planned maintenance events are available on the webMethods.io iPaaS Service Status page.
If you have allowed the Cloud Universal Messaging (UM) hostname in the firewall, you must also allow the new UM hostnames along with the old one. Click here and select the Show IP option on the Software AG Cloud Regions website for information regarding IP addresses.
Accounts created for SOAP and REST connectors using the Flow Editor do not display the Edit and Delete icons when the same Accounts are displayed on the Connectors page in webMethods.io Integration. You can use the existing accounts configured using the Flow Editor, but you cannot edit or modify the account details. Ensure that you create a new account for the SOAP and REST connectors if any changes are required for the existing accounts.
Further, the accounts created for SOAP and REST connectors using the Flow Editor cannot be used in Workflows, that is, pre-existing accounts will not be available in the Connect to drop-down list while creating a Workflow. Ensure that you create new accounts in such cases. Also, for migrated REST applications, you can modify an existing operation by creating a new account but not with an existing account.
You can simultaneously open up to five FlowServices in different tabs in a browser.
If you want to import a FlowService that uses an on-premises connector into another tenant, ensure that you upload the same on-premises connector to that tenant in webMethods.io Integration before importing; otherwise, you will not be able to import the FlowService.
When you publish a project that has a FlowService, and if the FlowService has an On-Premises connector, the On-Premises connector is not pushed to the destination tenant. Ensure that you configure the same On-Premises connector on the destination tenant to make the published FlowService work.
If the FlowService you are importing uses SOAP or REST connectors and those connectors do not exist in your system, continue importing the FlowService. The connectors are imported along with the FlowService. After importing, create the Accounts and then configure them in the imported FlowService.
After updating a REST-based connector’s document type, if you try to use that connector, the input doctype change is not reflected. After changing the doctype, ensure that you update the operation that uses this doctype; only then is the operation’s signature updated.
If a FlowService you are importing uses an on-premises connector and if the connector does not exist in your system, the Account appears only after you have uploaded the on-premises connector.
If a FlowService is created with a HashTable output and a workflow is created with the same FlowService, the HashTable is not available in the FlowService output when you run the service. Set the Accept header to application/json so that the output contains the HashTable object.
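As a sketch of the workaround, the Accept header can be set on the request that invokes the service; the endpoint URL below is a placeholder:

```python
from urllib.request import Request

# Hypothetical invocation URL; the point is the Accept header, which asks
# for JSON so that the HashTable object appears in the FlowService output.
req = Request(
    "https://example.com/integration/flow/run",  # placeholder endpoint
    headers={"Accept": "application/json"},
)
```

Any HTTP client can set this header; the request object above is only built, not sent.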
The FTP connector used in a FlowService is different from the FTP connector used in a workflow. So if you create an FTP account in a FlowService, the FTP account will not be available inside a workflow. Similarly, if you create an FTP account in a workflow, the same FTP account will not be available inside a FlowService.
A workflow, which contains an on-premises connector, cannot be cloned.
Export and import of workflows with custom Node.js CLI connectors across environments as well as import of a Workflow containing custom Node.js CLI connectors into the My Recipes section is currently not supported.
Workflows containing custom connectors that are available only in a particular tenant, cannot be exported to other tenants. When you try to export such workflows, the following message appears: Cannot export this workflow as the associated CLI app is not published globally. Please publish the app first, and try again.
After navigating to Software AG Cloud, if the trial period has expired, you will not be able to switch back to webMethods.io Integration.
After creating a SOAP connector, updating the description of that connector takes longer than usual if you try to update it immediately after creation. After creating a SOAP connector, go to the Connectors page and update it from there.
Additional headers are noticed when running a REST application.
When a REST application FlowService is run with changed header attributes, sw6 headers are observed in the pipeline output result. sw6 is a valid request header used by End-to-End Monitoring for monitoring Software AG product runtimes. This has no impact on feature functionalities.
If you are not able to map multiple documents to a document array based on indexing in a FlowService, create a document array that holds all the required documents using the insertDocument operation of the Document service. For example, if you want to map three documents (D1, D2, and D3) to a document array (DA1[ ]), add the insertDocument operation thrice, once for each document, and insert D1, D2, and D3 into the documents array. You can also use loops to run the insertDocument operation thrice. Finally, map the documents[ ] array, which holds all the three documents, to DA1[ ].
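The workaround can be pictured as building one list of documents by inserting them one at a time; the document contents and the DA1 name mirror the example above but are otherwise illustrative:

```python
# Mirrors the insertDocument workaround: run an insert step once per document
# to build one document array, then map the complete array to the target.
documents = []
for doc in ({"id": "D1"}, {"id": "D2"}, {"id": "D3"}):
    documents.append(doc)  # one insertDocument invocation per document

DA1 = documents  # final step: map documents[] to DA1[]
```

A loop over the three insert steps, as the note suggests, produces the same result as three explicit inserts.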
- Although the Workflow to FlowService or on-premises application mapping of the following data types is allowed in webMethods.io Integration, such Workflows, FlowServices, or on-premises applications may not run as expected:
- String type in workflows to Num (Double/Float/Int/Short/Long) type in on-premises applications
- String type in workflows to Byte type in on-premises applications
- Boolean type in workflows to Byte type in on-premises applications
- Object type in workflows to Byte type in on-premises applications
- Number type in workflows to Byte type in on-premises applications
- Array type in workflows to Num type in on-premises applications
- Number type in workflows to Boolean type in on-premises applications
- Boolean type in workflows to Num type in on-premises applications
- Object type in workflows to Num type in on-premises applications
- String of String Array type in workflows to Boolean type in on-premises applications
- Float type in workflows to Num type in on-premises applications
- If the data type of a FlowService is set to Object in the I/O signature, then it is considered as a String type in the associated workflow. Such workflows may not run as expected.
- To avoid issues related to access token invalidation, configure a unique set of client application details for a custom OAuth-based account.
This section lists the issues for this release that were known when this release readme was published.
In a transform pipeline, when both direct mappings and prediction mappings are present and you try to scroll the pipelineInput or pipelineOutput panel, the mapping lines are broken.
Workaround: Collapse and expand the direct mappings. The mapping lines will be repainted properly. This issue exists only when the scroll bar exists and you scroll either horizontally or vertically.
The icon position for adding a new FlowService step is incorrect when the Expand All and Collapse All buttons are used.
Workaround: Toggle the expanded state of the step on which the icon appears. This corrects the icon position when adding a new step.
Unable to create a REST API with JSON file.
This issue is observed when a REST API is created from a Swagger file that has OAuth as the authentication scheme.
Currently, creating REST APIs from Swagger files that have OAuth as the authentication scheme is not supported.
Users are unable to update the custom time field under the Monitor tab with a simple click in Firefox browser.
Workaround: In Firefox, users can clear the time field value they want to update and re-enter the required value. Alternatively, they can use any other browser to update the custom time field value with a simple click.
In the Database application, updating an Update operation leads to incorrect output.
Workaround: Once you add or delete the table columns or data fields, verify the input/output names and update the proper input/output names if they are not correct.
After logging out from Software AG Cloud, when a user logs in again using a different username and then navigates to webMethods.io Integration using the App Switcher, the user sees that the earlier user’s session has still not been logged out.
Workaround: Log out from webMethods.io Integration, instead of from Software AG Cloud.
Editing an imported workflow or FlowService having database application operations displays an error.
When a workflow or a FlowService is exported using the webMethods.io Integration export functionality, the connection name is not saved in the exported workflow or FlowService. On editing the database operation in the imported workflow or FlowService, the connection associated with the action is not found.
Workaround: The imported workflow or FlowService having database operations can be executed with a valid connection.
The SQL query in CustomSQL is parsed by the system using a third-party SQL query parser (FoundationDB SQL Parser) to populate the input and output. When the SQL query parser is unable to parse, the error “Failed to fetch Database Metadata” appears.
Workaround: If the input and output for the CustomSQL operation are not populated, ignore the error and configure the input and output explicitly.
When you try to import a workflow that contains a Messaging trigger into another environment, the Messaging trigger is not imported.