FlowServices

Use FlowServices to encapsulate a sequence of services within a single service, manage the flow of data among them, and create complex, advanced integration scenarios involving multiple application endpoints.

FlowServices provide you with rich data mapping capabilities, a familiar debugging mechanism, a large collection of built-in services, and more.

FlowServices

webMethods.io Integration offers various features that enable you to automate tasks based on specific requirements. However, there are some scenarios where you want to create complex integrations that require advanced data transformation and custom logic implementation. Earlier, you could do this by switching to the Flow Editor using the App Switcher and creating the required integrations in the Flow Editor.

We have simplified this process by providing you with FlowServices directly in your webMethods.io Integration project, thus eliminating the need to access the Flow Editor through the App Switcher. With FlowServices, you can encapsulate a sequence of services within a single service, manage the flow of data among them, and create complex, advanced integration scenarios involving multiple application endpoints.

In the Flow Editor, you used a graphical drag and drop tool to create integrations. You set up your integrations by plugging blocks together in the workspace and coding concepts were represented as interlocking blocks.

In FlowServices, you can easily build a FlowService by adding steps and selecting constructs such as Connectors, Controls, FlowServices, and Services within the steps. The editor is visually enhanced to offer ease of use and is more interactive.

A FlowService step is a basic unit of work that webMethods.io Integration interprets and runs at run time. The steps are the building blocks of a FlowService and are shown as rectangular blocks prefixed with step numbers. Steps may be compared to an actual implementation of a service in a programming language, where the business logic or the algorithm is written. webMethods.io Integration lists steps sequentially from top to bottom and also runs them in that order.

Enabling FlowServices

Let us see how to enable the FlowServices capability in webMethods.io Integration.

When you log in to webMethods.io Integration, you are redirected to the Projects page by default. You can either create a new project or select an existing project.

Create a new workflow inside a project by clicking the icon.

On the new workflow canvas, click Proceed to enable the FlowServices option as shown below. This switch is permanent. Once you enable FlowServices for a user, you cannot disable it.

Once you click Proceed, the FlowServices capability gets enabled and the FlowServices icon appears.

Click the FlowServices icon to view the list of FlowServices. You can click the icon to go to the FlowServices editor where you can start creating a FlowService.

Once you have created a FlowService, it is automatically added to the FlowServices panel of the workflow canvas, and you can use it in any workflow of this project.

The FlowServices tab also appears as shown below once you enable FlowServices. You can click the icon to start creating a FlowService. Here, you will see the list of all FlowServices created under the selected project and all FlowServices created from here will be automatically added under the FlowServices panel in the Workflow canvas.

Note: Click here for information on how to manage roles and project permissions.

Migrating Flow Editor Integrations to FlowServices

If you are an existing customer and have created integrations in the Flow Editor, you can migrate those integrations to FlowServices in webMethods.io Integration using the Migrate Integrations functionality. You can migrate integrations from a Flow Editor project to the same project in FlowServices.

How it works

  1. In webMethods.io Integration, enable FlowServices. Ensure that you have the Developer and Admin roles assigned from the Settings > Roles page.

  2. In webMethods.io Integration, select the project where you want to migrate the Flow Editor integrations and click the FlowServices tab. webMethods.io Integration displays the number of integrations available for migration.

  3. Click Migrate Integrations.

A dialog box appears displaying the list of integrations that will be migrated. Click OK to continue.

A dialog box appears showing the migration results. Click OK.

If the migration is successful, all integrations available in the Flow Editor project are migrated and available in FlowServices. If any errors occur, it is recommended that you recreate those integrations in FlowServices.

Notes

  • After you migrate the integrations, you will not be able to open or edit the same integrations in the Flow Editor.
  • If a Flow Editor integration refers to another integration, both the integrations will be migrated.

Core elements, constructs, and components of FlowServices

Let us see the core elements, constructs, and components that are used to create and run a FlowService.

On the new FlowService page, click on the rectangular box as shown below.

By default, the left panel lists the recently used Connectors, Controls, FlowServices, and Services.

You can type a keyword in the rectangular box and search for the available elements. webMethods.io Integration filters the data based on what you type in the search box.

Click All to view the categories available on the right panel, which you can use to build the FlowService.

Categories

Connectors

Displays the connectors available to create the FlowService.

Connectors are grouped in the following categories on the FlowServices panel.

Predefined Connectors

Predefined and configurable connectors. These connectors allow you to connect to SaaS providers.

Note: A few connectors are deprecated in this release. A deprecated connector displays the Deprecated label just below the connector wherever it appears in the user interface. Deprecated connectors continue to work as before and are fully supported by Software AG. If you are using deprecated connectors in your existing Workflows and/or FlowServices, they will work as expected, but no new feature enhancements will be made for deprecated connectors. If you are creating new Workflows or FlowServices, it is recommended that you use the provided alternative connectors instead of the deprecated connectors. The deprecation is applicable only for Actions, not for Triggers; that is, Triggers are supported for both deprecated and alternative connectors. For the list of triggers, see the documentation for alternative connectors.

REST Connectors

You can define REST Resources and Methods and create custom REST connectors. You can invoke a REST API in a FlowService by using a REST connector.

SOAP Connectors

Displays custom SOAP connectors. Custom SOAP connectors enable you to access third party web services hosted in the cloud or on-premises environment. You can also invoke a SOAP API in a FlowService by using a SOAP connector.

On-Premises Connectors

Displays on-premises applications uploaded from on-premises systems.

Flat File Connectors

Displays the Flat File connectors created either manually or from a sample file.

Controls

Controls are the programming constructs in a FlowService. They allow you to run a specified sequence based on a field value, try a set of steps, and catch and handle failures. The panel displays conditional expressions, looping structures, and the Transform Pipeline. Conditional expressions perform different computations or actions depending on whether a specified boolean condition evaluates to true or false.

Sequence

Use the Sequence step to build a set of steps that you want to treat as a group. Steps in a group are run in order, one after another.

Conditional Controls

Loops

Loops run a set of steps multiple times. A Loop repeats a sequence of child steps once for each element in an array that you specify. For example, if your pipeline contains an array of purchase-order line items, you could use a Loop to process each line item in the array. A Loop requires you to specify an input array that contains the individual elements used as input to one or more steps in the Loop. At run time, the Loop runs one pass for each member of the specified array. For example, to run a Loop for each line item stored in a purchase order, you would use the document list in which the order’s line items are stored as the Loop’s input array.

A Loop takes as input an array field that is in the pipeline. It loops over the members of an input array, executing its child steps each time through the loop. For example, if you have a FlowService that takes a string as input and a string list in the pipeline, use Loops to invoke the FlowService one time for each string in the string list. You identify a single array field to use as input when you set the properties for the Loop. You can also designate a single field for the output. Loop collects an output value each time it runs through the loop and creates an output array that contains the collected output values.

Note: In case of an infinite loop, there is a default timeout configured in webMethods.io Integration. If the time taken for execution exceeds this limit, the FlowService execution is terminated. Contact your administrator for customizing the default timeout.
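The Loop semantics described above can be sketched in Python. This is a minimal mental model only; the names (run_loop, lineItems, lineTotals) are illustrative stand-ins, not the actual webMethods.io Integration runtime API.

```python
def run_loop(pipeline, input_array_name, output_array_name, child_step):
    """Run child_step once per element of the input array and collect each
    output into a single output array, as a Loop step does."""
    results = []
    for element in pipeline.get(input_array_name, []):
        results.append(child_step(element))
    pipeline[output_array_name] = results
    return pipeline

# One pass per line item in the purchase order's document list:
pipeline = {"lineItems": [{"qty": 2, "price": 5.0}, {"qty": 1, "price": 9.5}]}
run_loop(pipeline, "lineItems", "lineTotals",
         lambda item: item["qty"] * item["price"])
print(pipeline["lineTotals"])  # [10.0, 9.5]
```

The key point the sketch captures is that the Loop both consumes an input array and builds an output array of collected results, one entry per pass.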

Error Handling

Exit

This step exits the entire FlowService and signals success or failure as part of the exit.

Switch, Case

Switch allows a variable to be tested for equality against a list of values. Each value is called a case, and the variable being switched on is checked for each case, that is, Switch evaluates a variable and skips to the value that matches the case. For example, if the Switch variable evaluates as “A”, then case “A” is run.

A switch statement can have an optional default case, which must appear at the end of the switch. The default case can be used for performing a task when none of the cases are true. You cannot insert multiple default statements.

You can include case steps that match null or empty switch values. A switch value is considered to be null if the variable does not exist in the pipeline or is explicitly set to null. A switch value is considered to be an empty string if the variable exists in the pipeline but its value is a zero length string. Switch runs the first case that matches the value, and exits the block.
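As a rough sketch of these matching rules, a Switch could behave as below. The run_switch function and case labels are hypothetical illustrations under the stated semantics, not the product API.

```python
def run_switch(pipeline, variable, cases, default=None):
    """The first case matching the switch value runs, then the block exits.
    A missing pipeline variable counts as null; "" matches an empty string."""
    value = pipeline.get(variable)
    for label, step in cases:
        matched = (
            (label == "$null" and value is None)
            or (label == "" and value == "")
            or value == label
        )
        if matched:
            return step()  # first match wins; exit the Switch block
    return default() if default else None

ran_a = run_switch({"code": "A"}, "code",
                   [("A", lambda: "case A"), ("B", lambda: "case B")])
ran_null = run_switch({}, "code", [("$null", lambda: "null case")])
ran_default = run_switch({"code": "Z"}, "code",
                         [("A", lambda: "case A")],
                         default=lambda: "default case")
print(ran_a, ran_null, ran_default)  # case A null case default case
```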

Branch

Branch is a conditional execution of steps and constitutes a group of expressions. Within a Branch, webMethods.io Integration runs the first expression that evaluates to true. Expressions are the conditions on the pipeline variables.

At run time, Branch evaluates the conditions provided and runs the first expression whose condition evaluates to true. If none of the expressions are true, the default expression, if included, is run.

The following figure is the default template of a Branch construct.

If you want to perform different actions on different values of one or more pipeline variables, use the Branch construct. In the following example, action is a pipeline variable created using the Define input and output fields dialog box.

The Branch contains two conditional expressions and a default expression.

Scenario 1

If the value of action starts with mult, webMethods.io Integration evaluates the first expression (action = /^mult/) and runs step 3, that is, performs the multiplication operation.

Scenario 2

If the value of action is addition (action == “addition”), then the Branch starts its execution from step 2. As step 2 is evaluated to false, the execution moves to the next expression, that is, step 4. Step 4 is evaluated to true, hence step 5 is run, that is, the addition operation is performed. Remaining expressions in the Branch, if any, are ignored and the execution falls through to the next step after the Branch in the FlowService.

Scenario 3

Let us assume that the value of action is subtraction. The Branch then starts its execution from step 2. As webMethods.io Integration evaluates step 2 to false, the execution moves to the next expression, that is, step 4. webMethods.io Integration evaluates Step 4 to false, hence evaluates step 6 ($default), that is, runs step 7, which makes the execution exit the FlowService.
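The three scenarios above can be approximated in Python. This is an illustrative sketch only: run_branch and the lambdas stand in for Branch expressions on pipeline variables, and the $default handling mirrors the rule that the default is evaluated last.

```python
import re

def run_branch(pipeline, expressions):
    """Run the first expression whose condition is true; the $default
    expression, if present, is always evaluated last."""
    default_step = None
    for condition, step in expressions:
        if condition == "$default":
            default_step = step
            continue  # defer the default until all other expressions fail
        if condition(pipeline):
            return step(pipeline)
    return default_step(pipeline) if default_step else None

branch = [
    (lambda p: re.match(r"^mult", p["action"]), lambda p: p["x"] * p["y"]),
    (lambda p: p["action"] == "addition",       lambda p: p["x"] + p["y"]),
    ("$default",                                lambda p: "exit"),
]
print(run_branch({"action": "multiply", "x": 3, "y": 4}, branch))     # 12
print(run_branch({"action": "addition", "x": 3, "y": 4}, branch))     # 7
print(run_branch({"action": "subtraction", "x": 3, "y": 4}, branch))  # exit
```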

Notes

  • If you are specifying a field in a document or in a document reference, format it as document/documentVariable. For example, if you want to specify a field name, from the document employeeProfile, then format it as employeeProfile/name.

  • If you specify a literal value in an expression, the value you specify must exactly match the run-time value of the pipeline variable. The value is case-sensitive.

  • webMethods.io Integration runs only the first target step whose expression evaluates to true. If none of the expressions evaluate to true, none of the child steps are invoked, and the execution falls through to the next step in the FlowService, if there is no default expression.

  • If you want to prevent the execution from falling through a Branch step when an unmatched value occurs at run time, include a default target step to handle unmatched cases. Branch can have zero to many default expressions. webMethods.io Integration runs the first sequentially encountered default expression.

  • The default step does not need to be the last step of a Branch but webMethods.io Integration always evaluates the default step at the end.

Only an expression step can be a direct child of a Branch step, and you cannot add an expression step anywhere outside a Branch. If you are branching on expressions, ensure that the expressions you assign to the target steps are mutually exclusive. In addition, do not use null or empty values when branching on expressions; webMethods.io Integration ignores such expressions and does not display any errors. You can provide multiple conditions for each expression and can also use regular expressions, for example, /^mult/. The expressions you create can also specify a range of values for the variables.

Specify the value of the variables in the expressions, as mentioned in the following table:

To Match… | Specify…
That exact string | A string
The string representation of the object’s value (for example, true for a Boolean object or 123 for an Integer object) | A constrained object value
Any string matching the criteria specified by the regular expression (for example, /^REL/) | A regular expression
An empty string | A blank field
A null value | $null
Any unmatched value (that is, run the step if the value does not match any other label) | $default

Transform Pipeline

As systems rarely produce data in the exact format that the other systems need, at times you need to transform data in terms of format, structure, or values of data. Using the Transform Pipeline control in the FlowService you can do transformations on data in the Pipeline.

You can insert multiple transformers in a single Transform Pipeline step to perform multiple data transformations. When multiple transformers are added, they run independently of each other and not in a guaranteed order. Consequently, the output of one transformer cannot be used as the input to another transformer. These characteristics make the Transform Pipeline step different from a normal step in a FlowService.
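A minimal sketch of this behavior, assuming each transformer reads the same input snapshot and the outputs are merged only after all transformers run (the function names are illustrative, not the product API):

```python
def transform_pipeline(pipeline, transformers):
    """All transformers read the same snapshot of the pipeline, so the
    output of one is never visible as input to another in the same step."""
    snapshot = dict(pipeline)
    merged = {}
    for transformer in transformers:
        merged.update(transformer(snapshot))
    pipeline.update(merged)
    return pipeline

p = {"name": "usd"}
transform_pipeline(p, [
    lambda s: {"upper": s["name"].upper()},
    lambda s: {"sawSiblingOutput": "upper" in s},  # False: snapshot only
])
print(p)  # {'name': 'usd', 'upper': 'USD', 'sawSiblingOutput': False}
```

The second transformer cannot see the first transformer's output, which illustrates why chained transformations need separate steps.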

Inserting a Transform Pipeline Step in a FlowService
  1. Create a FlowService. A FlowService step is created initially without any data.

  2. Select Transform Pipeline from the step drop-down list. The Transform Pipeline step is added.

Adding Transformers
  1. Create a FlowService. A FlowService step is created initially without any data.

  2. Select Transform Pipeline from the step drop-down list.

  3. Click on the FlowService step. The Pipeline panel appears.

  4. Click on the Transformers column header. The Select Transformer step is added under the Transformers column.

  5. Select the function from Select Transformer. The input field values are modified based on the selected transformer.

Mapping Pipeline Fields to Transformer Fields

The pipeline fields can be mapped to the service fields of the transformers added.

  1. Go to the Transformer step (for which the transformers have been added) in the Pipeline panel.

  2. Click in the Transformer step. The fields are displayed.

  3. Link the fields as per your requirements.

Notes

  • Only one transformer can be expanded at a time. When collapsed, all the existing transformers are visible.
  • You can delete a transformer using the delete icon against each transformer. If a transformer is deleted, the mappings made to its service fields will also be deleted.
Mapping Fields Directly

The Pipeline Input fields can be directly mapped to any of the Pipeline Output fields. This is known as Direct Mapping.

Example

Let’s see how transformers work with the help of an example. In the following example, we will transform the case of a string field to uppercase.

1. Provide a name and description of the new FlowService and click the I/O icon.

2. Define an input field Currency Code and click Done.

3. Select Transform Pipeline.

4. Click the pipeline mapping icon and open the pipeline mapping window.

5. On the pipeline mapping window, click the Add New Transformer option. The Select Transformer panel appears.

6. Click ALL on the transformer panel.

7. Select the toupper service.

8. Click the Expand icon.

9. Map Currency Code to inString and value to Currency Code.

10. Save the FlowService and run it. In the Input values dialog box, type usd as the Currency Code.

11. Click Run on the Input values dialog box. The transformer converts the lowercase string usd to uppercase.
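The mapping in steps 9 to 11 can be sketched as follows. The toupper function here is only a stand-in for the built-in service; Currency Code, inString, and value mirror the field names used above.

```python
# Stand-in for the built-in toupper service: inString in, value out.
def toupper(inputs):
    return {"value": inputs["inString"].upper()}

pipeline = {"Currency Code": "usd"}
# Map Currency Code -> inString, then map value -> Currency Code:
service_output = toupper({"inString": pipeline["Currency Code"]})
pipeline["Currency Code"] = service_output["value"]
print(pipeline["Currency Code"])  # USD
```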

Note: You can delete a transformer by clicking the delete icon against each transformer. If a transformer is deleted, the mappings made to the service variables are also deleted.

FlowServices

Displays the FlowServices available in the selected project. This enables you to invoke other FlowServices from the current FlowService. The input and output of the referenced FlowService must be mapped accordingly for a successful execution.

Services

Displays the service categories. An extensive library of built-in services is available for performing common tasks such as transforming data values, performing simple mathematical operations, and so on.

Service input and output parameters are the names and types of fields that the service requires as input and generates as output and these parameters are collectively referred to as a signature.

Note: Related services are grouped in categories. Click here for information on the service categories, and input parameters, output parameters, and usage notes if any for each service.

Tasks associated with FlowServices

You can perform the following tasks for FlowServices by clicking the icons available on the FlowServices panel.

Icons Task/Description
Provide a FlowService name and description. The asterisk denotes that the FlowService is modified.
Search for text in the FlowService, inside the editor, and within the steps.
Debug a FlowService and inspect the data flow during the debugging session. For more information, see Debug FlowServices.
Run the FlowService after you save it. This enables you to test the FlowService execution in real time and view the test results.
Define the input and output fields of a FlowService. For more information, see Input and Output Fields Declaration.
This option allows you to log business data at the FlowService level. You need to first click the I/O icon and define the input and output fields. Then click the Log business data icon and select the defined fields in the Log business data dialog box.
You can choose On Failure to log business data only when errors occur, or Always to always log business data. The default setting is On Failure.
When selecting fields for logging, you can specify the same display name for more than one field, but this is not recommended. Having the same display name might make monitoring the fields at run time difficult.
The business data log appears in the Execution Details page under Business Data.
For more information, see Log Business Data.
Click the Navigator icon to view a summary of the FlowService steps, which appear on the right panel. Move the slider available on the navigator panel to move through the FlowService. If you move through the main FlowService page, the navigator view will automatically move. You can click on a navigator step to go to that step in the FlowService.
Press Ctrl and select a step or multiple steps in the FlowService. Then click the delete step(s) icon to delete the selected step or multiple steps in the FlowService. You can also delete steps by right-clicking on a step on the FlowService editor and selecting the delete option.
Press Ctrl and select a step or multiple steps in the FlowService. Then click the move step(s) icon and select the move steps up, down, right, or left options.
You can also move steps by right-clicking on a step on the FlowService editor and selecting the available options.
Cut, Copy, Paste, Duplicate, Enable, and Disable steps in the FlowService.
You can also right-click on a step on the FlowService editor and add a step or a child step for the selected step, cut, copy, and paste steps, copy the selected step(s) and paste the step(s) after the selected step(s) by clicking Duplicate, delete a step, and move steps up, down, right, or left.
Undo a step action.
Redo a step action.
Whenever you save a FlowService, a newer version is added to the Version History with the default commit message. You can also provide a custom commit message by clicking the drop-down arrow beside the Save option and selecting Save with message.
Access Help topics or take a tour.
The following options are available:
  • Schedule: Define a schedule for the FlowService execution. Select Once if you want to schedule the FlowService to run just once immediately or run once at a specified date and time. Select Recurring if you want to define a recurrence pattern. You can define a recurrence pattern daily, weekly, monthly, and in hours. Select the frequency (Hourly, Daily, Weekly, Monthly) with which the pattern recurs, and then select the options for the frequency. Click the + icon to repeat the execution for daily, weekly, and monthly schedules. Click the delete icon to delete the selected execution time for daily, weekly, and monthly schedules.
    Select Prevent concurrent executions to skip the next scheduled execution if the previous scheduled execution is still running. An exception is when the previous scheduled execution has been running for an extended period; in that case, the next scheduled execution starts even if the previous scheduled execution is still running. If you do not select this option, the next scheduled execution starts even if the previous scheduled execution has not yet completed. In the Input Value Set panel, provide inputs to the FlowService based on the defined input signature. Click Delete if you want to permanently remove the current recurrence schedule.
    Note: Any time stamp displayed in webMethods.io Integration is based on the time zone you have specified in Software AG Cloud. All time zones available in Software AG Cloud are currently not supported in webMethods.io Integration. If a time zone in Software AG Cloud is not supported, then the time stamp in webMethods.io Integration defaults to the Pacific Standard Time (PST) time zone.
    Note: You can pause a FlowService execution that was scheduled by clicking the Pause option on the Overview page of a FlowService.
  • Key Shortcuts: View keyboard shortcut keys.
  • Execution History: View successful and failed FlowService executions, operations, and business data logs.
  • Version History: View the version commit history of the FlowService.
An account is linked and configured.
Note: For a migrated or imported FlowService, the step-level validation might show that the account is not configured. This means that the steps are linked to an account alias but the account does not exist. In such a case, you can create the account inline and refresh the page, or you can go to the Connectors page and create the account with the same name as the alias.
An account is not linked.
A red circle on a step may appear for many error scenarios, for example, if an account is not configured, or the associated action is not selected, or for any invalid constructs.
Map or edit existing pipeline mappings.
Click the ellipsis icon and perform the following tasks:
  • Comment: Add a note for the selected step.
  • Disable: Disable or deactivate a step. You can enable the step by clicking the enable step icon . You can cut, copy, or delete a step after you disable it.
  • Log business data: This option allows you to log business data at the step level.
    Note: At the step level, the Log business data option is enabled only for connectors.
    In the Log business data dialog box, choose On Failure to log business data only when errors occur, or Always to always log business data. The default setting is On Failure. Expand the Input Fields and Output Fields trees to display the fields available in the signature, and select the check boxes next to the fields you want to log. If you want to define a display name for a field, type the display name beside the field. The display name defaults to the name of the selected field, but it can be modified. When selecting fields for logging, you can have the same display name for more than one field, but this is not recommended. Having the same display name might make monitoring the fields at run time difficult.
    The business data log appears in the Execution Details page under Operations > Business Data. See the Log Business Data section for more information.
  • Delete: Delete the step.
Click this icon to open the referenced FlowService if a FlowService references any other FlowService.
Indicates count of only the immediate child steps.

Input and Output Fields Declaration

The Define I/O feature allows you to define the input and output fields for a FlowService. You can define the input and output fields from the Define input and output fields screen. You can access the screen by clicking the Define I/O icon () from the FlowServices action bar.

The Define input and output fields screen has two tabs: Input Fields and Output Fields.

You can declare the fields of a FlowService manually, by loading XML or JSON content, or by using a document reference.

Notes: The declared input and output fields are automatically displayed in the Pipeline Input and Pipeline Output columns of the Pipeline panel. The field names are case-sensitive.

Guidelines for Defining Input and Output Fields

Although declaring input and output fields for a FlowService is optional, it is strongly recommended that you make it a practice to declare the fields for every FlowService that you create.

Input Fields

Output Fields

Declaring Fields Manually

You can declare fields manually in one of the following ways:

Using Add a new set

Here, the instructions are explained for input fields. You can follow the same instructions to add output fields from the Output Fields tab.

  1. Go to the Define input and output fields screen.

  2. Select Add a new set in the Input Fields tab.

  3. Click Add in the Data Fields section. The fields are listed under Data fields.

  4. Enter the details as per your requirements.

  5. Click Done. The input fields are declared.

Notes:

  • When you select the Type as String in the Display Type field, select:

    • Text Field if you want the input entered in a text field.

    • Password if you want the input entered as a password.

    • Large Editor if you want the input entered in a large text area instead of a text field. This is useful if you expect a large amount of text as input for the field, or you need to have new line characters in the input.

  • In the Pick List field, define the values that will appear as choices when webMethods.io Integration prompts for input at run time.

  • In the Content Type field, you can define constraints for the field, such as the minimum value of a field or the maximum length of a string.

Loading through XML or JSON

Here, the instructions are explained for input fields. You can follow the same instructions to add output fields from the Output Fields tab.

  1. Go to the Define input and output fields screen.

  2. Select Add a new set in the Input Fields tab.

  3. Perform one of the following actions:

    • Load XML - Click Load XML in the Data Fields section to define the content in XML. The Type or paste XML content text box appears.

    • Load JSON - Click Load JSON in the Data Fields section to define content in JSON. The Type or paste JSON content text box appears.

  4. Type the details as per your requirements. Alternatively, you can paste the details from an XML or JSON file.

  5. Perform one of the following actions:

    • Load XML - Click Load XML if you have defined the fields in the XML format.

    • Load JSON - Click Load JSON if you have defined the fields in the JSON format.

    The input fields are defined.

Declaring Fields using Document Reference

You can use a document type to define the input or output fields for a FlowService. For example, if there are multiple FlowServices with identical input fields but different output fields, you can use a document type to define the input fields rather than manually specifying individual input fields for each FlowService. When you assign a Document Type to the Input or Output side, you cannot add, modify, or delete the fields on that part of the tab.

You can select Document Type from the Document reference drop-down list. A Document Type can be created using the Document Types option under Project > Configurations > FlowService.

You can create pipeline fields as document references, create document types comprising document references, and also define the signature of FlowServices comprising document references.

  1. Go to the Define input and output fields screen.

  2. Select Document Reference in the Input Fields tab. The Choose Document Reference drop-down list appears listing all documents available in the project.

  3. Select the document from the Choose Document Reference drop-down list.

  4. Click Done. The input fields are declared.

Input and Output Field Validations

The Input and Output Field Validations feature enables you to validate the input and output fields at run time. While declaring the input and output fields, you can provide constraints such as the minimum value a field can accept or the maximum length of a string. At run time, webMethods.io Integration validates the values, and if the constraints are not satisfied, the FlowService execution fails.

You can enable this feature using the options Validate input and Validate output in the Input Fields and Output Fields tabs of the Define input and output fields screen.
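As a hedged sketch of what such run-time validation amounts to (validate_fields and the constraint keys min and maxLength are illustrative names, not the product's actual constraint syntax):

```python
def validate_fields(values, constraints):
    """Raise an error, failing the (sketched) execution, when a declared
    constraint such as a minimum value or maximum length is violated."""
    for name, value in values.items():
        rule = constraints.get(name, {})
        if "min" in rule and value < rule["min"]:
            raise ValueError(f"{name}: below minimum {rule['min']}")
        if "maxLength" in rule and len(value) > rule["maxLength"]:
            raise ValueError(f"{name}: exceeds max length {rule['maxLength']}")

# Passes: the quantity meets the minimum and the code fits the length limit.
validate_fields({"qty": 5, "code": "USD"},
                {"qty": {"min": 1}, "code": {"maxLength": 3}})
```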

Example

Summary

Let us see how to provide input values to a FlowService. We will create a FlowService slacktest that posts a user-defined message on a specific Slack channel.

Basic Flow

1. Create the FlowService.

2. Click the Define IO () icon and define the input fields channel and message.

3. Click the mapping icon () and define the pipeline mapping as shown in the following illustration.

4. Click the Run icon (). The Input values dialog box appears.

Note: If you do not define any input fields in a FlowService, then the Input values dialog box does not appear and the FlowService executes directly.

5. Specify the input values as general and Hello Team !!! and save the value set as slack_valueset_1.

You can also run the FlowService without saving the value set. A value set is a placeholder to store run-time input values. You can use the same value set if you want to run the same FlowService with the same input values. This is useful when you have multiple input fields. Value sets are stored in your browser’s local storage and are retained until the local storage is cleared.

Include empty values for string types: If you do not provide an input value for a string field, the value is null by default at run time. If you select this option, the value is an empty string at run time.

6. Click Run.

7. View the result. The values for the input fields channel and message are displayed.
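The "Include empty values for string types" option described in the procedure above can be sketched as follows (illustrative Python, not the product's implementation; the function name is hypothetical):

```python
# Sketch of the "Include empty values for string types" option: a missing
# string input becomes null by default, or an empty string if the option
# is selected. Names here are illustrative, not the product's API.
def resolve_string_input(value, include_empty_strings):
    if value is None:
        return "" if include_empty_strings else None
    return value

resolve_string_input(None, True)   # empty string at run time
resolve_string_input(None, False)  # null at run time
```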

Pipeline Mapping

The Pipeline feature offers a graphical view of all of your data and allows you to map the data between the FlowService's user input and connectors or services. Pipeline is the general term for the data structure in which the input and output values are maintained for each step of a FlowService. The pipeline starts with the user-defined input to the FlowService and carries the inputs and outputs of every step forward to the next step. When the FlowService executes, each step has access to the data produced up to and including the previous step's output.
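As a conceptual model only (not the actual engine), the pipeline can be pictured as a dictionary that starts with the user input and accumulates each step's output for later steps:

```python
# Conceptual model of the pipeline: a dictionary that starts with the
# user-defined input and accumulates each step's output for later steps.
def run_flow(steps, user_input):
    pipeline = dict(user_input)          # pipeline starts with the FlowService input
    for step in steps:
        pipeline.update(step(pipeline))  # each step reads earlier data, adds its output
    return pipeline

# Two illustrative steps: one converts centimeters to meters, one formats.
to_meters = lambda p: {"meters": p["centimeters"] / 100}
label     = lambda p: {"text": f"{p['meters']} m"}

run_flow([to_meters, label], {"centimeters": 250})
# the pipeline now holds centimeters, meters, and text
```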

Pipeline Panel

The Pipeline panel enables you to map the input and output fields. You can access the Pipeline panel by clicking the mapping icon () on the FlowService step. The Pipeline panel appears only after you select an operation of the connector or service; for a connector, an account must also be configured. The FlowService generates the fields based on the selected operation and the service or connector. You can define the input or output fields in the Define IO section and map them to the service or connector fields.

The Pipeline panel has the following columns:

You can map the fields in the following order:

  1. Pipeline Input fields to Service Input fields

  2. Service Output fields to Pipeline Output fields

Note: For easy identification, services are indicated with the fn text in the column header, and the connector icon is displayed for connectors.

Pipeline Taskbar

You can perform the following actions while mapping the input and output fields in a FlowService. The taskbar is available at the top-right corner of the Pipeline panel.

Name and Icon Description
Recommended Mappings () Recommends mappings based on the pattern of previous mappings. The recommended mappings are represented with dotted lines. For more information, see Smart Mapping.
Show only Mapped () Displays only the mapped fields.
Copy () Copies a field from the Pipeline Input and Pipeline Output sections. This icon is enabled only when a field is selected. For more information, see Copy and Paste Fields.
Clear () Clears all mappings and values set for fields in the pipeline. This icon is always enabled. For more information, see Clear Values.
Delete () Deletes the selected mapping. You can also delete the values set for input and output fields, delete the map between fields, and delete the drop value. This icon is enabled when a particular mapping is selected by clicking the mapping line. For more information, see Delete Mappings.
Move () Moves the selected field left, right, up, or down in the pipeline. The left and right icons are enabled only when the selected field can move to the immediate parent level in the hierarchy (if any). The up and down icons are enabled only when the selected field can move one level up or down among its sibling fields.
Map () Connects service and pipeline fields in the pipeline with a line called a map (link). For more information, see Map Pipeline Fields.
Set Value () Sets the values for input and output fields. For more information, see Set Values.
Drop () Drops a field from the Pipeline Input or Pipeline Output sections. For more information, see Drop Fields.
Map by Condition () Sets a condition on a pipeline map. The target field is assigned a value only when the condition is satisfied. For more information, see Map by Condition.
Expand () Expands the Pipeline panel. For more information, see Expand Mapping Panel.
Close () Closes the mapping panel and returns to the FlowService. This icon is always enabled. Any mappings or data manipulations done up to that point are saved.
Paste () Pastes the field in the pipeline after a copy field action. For more information, see Copy and Paste Fields.
Add () Adds a field to the pipeline. You can add fields that were not declared as input or output fields (through Define IO) of the FlowService. For more information, see Add Fields.

Note: You can also access these actions by right-clicking a field in the pipeline as shown:

Set Values

The Set Values feature allows you to set values for the fields by clicking on the field in the Pipeline panel.

The behavior of the Set Value feature is as follows:

For example, let us see the Adding Integers (AddInts) functionality using the Math service.

  1. Create a FlowService. For example, AddingIntegers. A FlowService step is created initially without any data.

  2. Select the AddInts operation in the FlowService step.

  3. Click on the FlowService step. The Pipeline panel appears. Two service input fields num1 and num2 are listed in the Service Input column.

  4. Click any of the fields, for example, num1. The Set Value dialog box appears.

  5. Enter the value in the num1 field.

  6. Click Save. The value is set and the corresponding field is represented with the SetValue icon. If you export this step, the set value of this field is exported.

  7. Repeat the above steps to set values for the required fields.
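The AddingIntegers example above can be reduced to a short sketch (plain Python; the output field name value is an assumption for illustration, not the service's documented signature):

```python
# Sketch of the AddInts math service used in the example above: it adds the
# two integer inputs supplied via Set Value. The output field name "value"
# is an assumption for illustration.
def add_ints(num1, num2):
    return {"value": num1 + num2}

add_ints(2, 3)  # {"value": 5}
```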

Pipeline Variable Substitution

The Perform pipeline variable substitution check box indicates whether you want the application to perform pipeline variable substitution at runtime. If selected, the application replaces the pipeline variable with the runtime value of the variable while running the FlowService.

Clear Values

The Clear Values feature allows you to reset the field values. If the field is a document or document reference type, Clear All resets the nested fields as well. The following table lists the default values of different field types on default Set Value (opening the Set Value screen and saving without providing any input) and on Clear All:

Field Type On Default Set Value On Clear All
String "" (empty string) "" (empty string)
Boolean False False
Array [] []
Float, Int, Double, Long, Short Nothing is set Nothing is set
Document, Document Reference An object with all String fields at all levels set to "" (empty string); nothing is set for other types. An object with all String fields at all levels set to "" (empty string); nothing is set for other types.

Map Pipeline Fields

The Map Pipeline Fields feature allows you to connect service and pipeline fields in the pipeline with a line called a map (link). You can select two input fields and click the Map icon to create a map between them. Creating a map between fields copies the value from one field to another at run time. There are two types of mapping:

Implicit Mapping

Within a FlowService, webMethods.io Integration implicitly maps fields whose names are the same and whose data types are compatible. FlowService connects implicitly mapped fields with a dotted line.
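A minimal sketch of the implicit-mapping idea above (illustrative Python, not the product's matching logic, which also checks type compatibility):

```python
# Sketch of implicit mapping: fields whose names are the same in two stages
# of the pipeline are linked automatically. Type-compatibility checks, which
# the product also performs, are omitted here for brevity.
def implicit_maps(source_fields, target_fields):
    return sorted(set(source_fields) & set(target_fields))

implicit_maps(["FirstName", "LastName", "Id"],
              ["FirstName", "LastName", "Email"])
# → ["FirstName", "LastName"]
```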

Explicit Mapping

You can map input fields from the Pipeline Input column to Service Input fields, and can also map the output from the Service Output to a different field in the Pipeline Output column. Explicit mapping can be achieved in the following ways:

Delete Mappings

The Delete Mappings feature allows you to remove mappings. You can delete each map individually or all mappings at once.

Search Fields

The Search feature allows you to search fields. When you type field names in the Search text box, webMethods.io Integration performs a dynamic search across all fields (including nested fields), and displays the resultant fields in the respective panels. Additionally, each column has its own search bar that allows you to search in the respective column.

Expand Mapping Panel

The Expand feature allows you to resize the Pipeline panel. On clicking the Expand icon, the Pipeline panel size increases and you can resize the column widths as per your requirements to view the mappings clearly.

Tip: This feature is useful when there are many mappings and the default column width is not sufficient to view them clearly. Click Expand to increase the Pipeline panel size and resize the columns to view the mappings with more clarity.

Copy and Paste Fields

The Copy and Paste feature allows you to copy any field and paste it in the pipeline. Depending on the context, either the field or the field path is pasted. For example, if you copy a field and paste it in the Set Value dialog box, the field path is pasted. Alternatively, you can use the keyboard shortcuts Ctrl+C to copy and Ctrl+V to paste fields. You can perform the copy and paste actions in the following ways:

Drop Fields

The Drop feature allows you to remove fields from a pipeline that are no longer used by subsequent steps. You can drop fields only from the Pipeline Input and Pipeline Output columns. By dropping unwanted fields, you reduce the size of the pipeline at run time. The length and complexity of the Pipeline Input and Pipeline Output columns are also reduced, making the Pipeline panel easier to use when you are working with a complex FlowService. If an output field is dropped, that field is not sent as Pipeline Input to the next step. This way, you can restrict the flow of fields from one step to another.

For easy identification, the Drop icon appears after a field is dropped.

You cannot drop a field in the following scenarios:

Add Fields

The Add Fields feature allows you to add fields to a pipeline.

Map by Condition

The Map by Condition feature allows you to define conditions for the maps (links) drawn between fields in a pipeline. A condition consists of one or more expressions that allow you to:

For example, map the fields BuyersTotal and OrderTotal only if BuyersTotal has a value, that is, Not Null.

During runtime, the application evaluates all conditional mappings and assigns a value to the target field only when the condition is true. Otherwise, the application ignores the mapping. In scenarios where the fields are mandatory and the conditions are not satisfied, you might observe FlowService runtime failures.

You can map multiple source fields to the same target field if the mappings to the target have conditions. When you map multiple source fields to a single target field, ensure that at most one of the conditions you define can be true at run time. If more than one condition to the same target field evaluates to true, the result is not deterministic because the order in which the conditions are run is not guaranteed.
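The conditional-mapping behavior described above can be sketched as follows (illustrative Python, not the product's engine; field names follow the BuyersTotal/OrderTotal example):

```python
# Sketch of Map by Condition: each map to a target carries a condition, and
# the target field is assigned only by a map whose condition evaluates to
# true; otherwise the mapping is ignored.
def apply_conditional_maps(pipeline, maps):
    for source, target, condition in maps:
        if condition(pipeline):
            pipeline[target] = pipeline[source]
    return pipeline

# Map BuyersTotal to OrderTotal only when BuyersTotal is not null:
maps = [("BuyersTotal", "OrderTotal",
         lambda p: p.get("BuyersTotal") is not None)]

apply_conditional_maps({"BuyersTotal": 120}, maps)   # OrderTotal becomes 120
apply_conditional_maps({"BuyersTotal": None}, maps)  # OrderTotal is not assigned
```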

You can define conditions for a map using the Condition Editor screen. The Condition Editor screen can be accessed by selecting the map and clicking the Map by Condition () icon on the Pipeline panel or by double-clicking the map.

The Condition Editor screen functionality is similar to that of the Expression Editor screen. For more information about the Condition Editor screen description, see Expression Editor.

Points to Consider when Defining Conditions for Maps

The condition expression given for the map behaves similar to the FlowService containing a Branch condition. For more information on Branch steps, see Branch.

Adding Conditions to a Map

Example

Assume that there is a FlowService CentimetertoMeterConversion that converts the values provided by you from centimeters to meters. If the meter value is zero, then a custom message is logged. The FlowService includes the following:

Let us modify the logic of the FlowService such that it converts only specific values instead of all values based on the following conditions:

Before you Begin

Basic Flow

  1. Go to the CentimetertoMeterConversion FlowService.

  2. Select the centimeters to num1 map and click the Map by Condition icon on the pipeline taskbar. The Condition Editor screen appears.

    By default, the first field is selected and displayed in the editor.

  3. Ensure that the Enable Condition during execution check box is selected. Otherwise, the condition is not considered during runtime.

  4. Define the conditions. For more information on how to define conditions and sub conditions, see Creating Complex Expressions.

    As per the example, the condition minValue > 1 and maxValue < 9999 is defined for the selected mapping as shown in the following illustration:

  5. Click Save. The condition is added to the map and the icon is added to indicate that the map has a condition defined.

  6. Map the default Pipeline Input field to num1.

  7. Define a condition for the Default Pipeline Input field.

    Note: To remove a condition from a map:

    1. Select the map and click the Map By Condition icon. The Condition Editor appears.

    2. Click Remove Condition.

  8. Click Run. The results are displayed based on the values you provide.

Default Pipeline Mapping Rules and Behavior

Before you start mapping fields, it is recommended to go through the mapping guidelines to avoid runtime issues due to incorrect mappings.

Notes:

  • When you map between a multidimensional data type (Target) and a single-dimensional data type (Source), the Target field is assigned the specified element of the Source field. For the rules that determine whether mapping occurs at runtime, see the tables in the Default Pipeline Mapping Rules and Behavior section.
    • Single-dimension data types - String, Short, Long, Integer, Float, Double, Boolean, Byte Array (b[]), Big Decimal, Big Integer, Object
    • Multi-dimension data types - String[], Short[], Long[], Integer[], Float[], Double[], Boolean[], Big Decimal[], Big Integer[], Object[]
  • Document and Document Reference fields can be mapped only with Document, Document Reference, and arrays of these (Document[], Document Reference[]).

Indexed Mapping

The Indexed Mapping feature allows you to map array fields (String List or Object List) and specify which element in the array you want to map to or from. For example, for String Lists and Object Lists, you can specify the index of the list element you want to map, that is, map the third element in a String List to a String.

You can map the array fields at the index level from the Indexed Mapping dialog box. The Indexed Mapping dialog box automatically appears when you map array fields in the Pipeline panel.

Note: Indexed mapping for the Document and Document Reference data types is not supported.
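A minimal sketch of the indexed-mapping idea (illustrative Python, not the product's implementation):

```python
# Sketch of indexed mapping: map a specific element of an array field to a
# scalar field, e.g. the third element (index 2) of a String List to a String.
def map_indexed(pipeline, source, index, target):
    pipeline[target] = pipeline[source][index]
    return pipeline

map_indexed({"names": ["alpha", "beta", "gamma"]}, "names", 2, "third")
# the scalar field "third" is now "gamma"
```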

The Indexed Mapping dialog box has the following options:

Notes:

  • If you have not mapped:
    • any indexes and the Map specific elements in an array option is selected, then the parent fields are automatically mapped.
    • any one of the target indexes with a source index, then that index element is mapped to the parent.
  • If you map multiple source indexes to a single target, the result is not deterministic because the order in which the mappings are run is not guaranteed.
  • If you want to add or delete a row index, you can use the Add and Delete buttons in the Indexed Mapping dialog box.

Guidelines for Mapping Array Fields

Default Pipeline Rules for Mapping Array Fields

When you create maps between scalar and array fields, you can specify which element of the array field you want to map to or from. Scalar fields are those that hold a single value, such as String and Object. Array fields are those that hold multiple values, such as String List and Object List. For example, you can map a String to the second element of a String List.

If you do not specify which element in the array that you want to map to or from, the application uses the default rules in Pipeline to determine the value of the target field. The following table lists the default pipeline rules considered for mapping to and from array fields:

Field 1 - Field 2 Behavior
A scalar field - An array field that is empty (the field does not have a defined length) The map defines the length of the array field; that is, it contains one element and has length of one. The first (and only) element in the array is assigned the value of the scalar field.
A scalar field - An array field with a defined length The length of the array is preserved and each element of the array is assigned the value of the scalar field.
An array field - A scalar field The scalar field is assigned the first element in the array.
An array field - An array field that does not have a defined length The map defines the length of the target array field; that is, it will be the same length as the source array field. The elements in the target array field are assigned the values of the corresponding elements in the source array field.
An array field - An array field that has a defined length The length of the source array field must equal the length of the target array field. If the lengths do not match, the map does not occur. If the lengths are equal, the elements in the target array field are assigned the values of the corresponding elements in the source array field.
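The default rules in the table above can be expressed as a short sketch (illustrative Python only; here None stands for an array field with no defined length):

```python
# Sketch of the default pipeline rules for mapping to and from array fields.
# None represents an array field with no defined length.
def map_scalar_to_array(scalar, array):
    if array is None:              # empty array: the map defines a length of one
        return [scalar]
    return [scalar] * len(array)   # defined length: every element gets the value

def map_array_to_scalar(array):
    return array[0]                # the scalar is assigned the first element

def map_array_to_array(source, target):
    if target is None:             # undefined target takes the source's length
        return list(source)
    if len(source) != len(target): # lengths must match, else the map does not occur
        return target
    return list(source)

map_scalar_to_array("x", None)         # ["x"]
map_scalar_to_array("x", ["a", "b"])   # ["x", "x"]
```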

Smart Mapping

Software AG leverages the collective intelligence of the webMethods.io Integration community to suggest which fields should be mapped in a data map by learning from the mappings you create.

Smart mapping provides you with recommendations while mapping the pipeline data and utilizes a series of algorithms to determine the likely accuracy of a given suggestion, enabling you to toggle between high, medium, and low levels of mapping confidence. A Machine Learning (ML) algorithm is applied to provide the suggestions. The ML algorithm learns from the mappings you create and provides suggestions automatically to map similar fields. The algorithm benefits from having more data from more number of users.

Smart mapping is anonymous and does not store any customer-specific information or actual data of any kind and is available to only those tenants who provide their consent to share their mapping information. Individual fields contained within the records are indexed. Specifically, the field names and their hierarchy are indexed, as well as the mapping relationships between them.

After providing consent, if you have the relevant role permissions, you can choose to opt out of smart mapping. If you opt out, you will not be able to see or utilize any mapping suggestions, and your maps will no longer be indexed. However, as the index is anonymous, any maps and profiles indexed during the opt-in period will remain in the database. Further, the data collected is confined to the data center where the tenant resides.

Mapping recommendations are not tenant specific, so mapping inputs from one tenant may be used to make recommendations for another tenant. When you create a mapping, the collected data is not immediately reflected as a recommendation for another user because the information is recorded and processed in our database only at specified intervals.

Note: For trial tenants, the mapping data is collected by default. Further, for trial tenants and for the Free Forever Edition, smart mapping is always enabled and cannot be changed. For paid tenants, only an Administrator has the permission to enable or disable smart mapping.

The following information is not indexed:

How it works

  1. To provide your consent to share the mapping information, from the webMethods.io Integration navigation bar, click on the profile icon located at the top-right corner of the home page, and select Settings > Preferences.

  2. On the Configure Tenant Preferences page, select the Publish integration mappings to recommendations engine option and click Save to enable smart mapping. This provides you mapping recommendations whenever you do mapping. By enabling this option, you are also providing us your consent to collect your mapping information. For trial tenants and for the Free Forever Edition, this option is always enabled by default and cannot be changed.

  3. Select a project and then the FlowService for which you want to do smart mapping.

  4. On the pipeline mapping screen, select Recommend Mappings to see the recommended mappings. If you select the Show only mapped option, the Recommend mappings option is automatically cleared.
    When the screen is initially loaded, only high recommendations appear by default.

  5. Drag the recommendation accuracy slider to view the mapping recommendations and filter the recommendations based on their recommendation accuracy. The recommendation accuracy or confidence number appears when you hover the pointer over a mapping and is the level of mapping confidence of that specific recommendation.

    Based on the level of mapping confidence, the recommendations are grouped into High, Medium, and Low categories:

    • High - Mappings have the highest probability
    • Medium - Mappings have medium probability
    • Low - Mappings have the least probability

    You can select more than one category by moving the slider. In the following example, mappings that have medium and high probability are displayed.

  6. Select a mapping and click Accept to hard map it. You can select one or more recommendations and accept only those recommendations (Selected (x)), or you can accept all the recommendations that appear on the screen (All shown (x)). The accepted recommendation behaves like an actual mapping.

  7. To unmap a hard mapping, select the hard mapping and then click the delete icon available on the pipeline action bar. Once the hard mapping is deleted, the recommendations appear again. To hard map the recommendations again, click Accept and select the desired option.

Note: If you are an Administrator of a paid tenant and have cleared the Publish Integration Mappings to Recommendations Engine option, webMethods.io Integration displays a message asking whether you want to enable the smart mapping feature. If you select Yes, the Predict Mappings option appears on the pipeline mapping screen.

Expression Editor

The Expression Editor is an interface that allows you to define complex expressions (conditions) using the conditional controls such as If, Else If, While, and Do Until. A complex expression is formed by combining simple conditions (rules) with logical operators AND and OR or negating these conditions with logical negation operator NOT.
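The structure of such a complex expression can be sketched in plain Python (a conceptual illustration only; the field names status, count, and region are hypothetical):

```python
# Sketch of a complex expression: simple rules combined with AND, OR, and
# the negation NOT, as built in the Expression Editor. The pipeline field
# names used here are hypothetical.
def condition(p):
    rule1 = p["status"] == "active"
    rule2 = p["count"] > 10
    rule3 = p["region"] == "EU"
    # rule1 AND (rule2 OR NOT rule3)
    return rule1 and (rule2 or not rule3)

condition({"status": "active", "count": 20, "region": "EU"})  # True
```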

You can access Expression Editor by clicking the Expression () icon on the FlowService step. The Expression icon appears only when you select a conditional construct control in the FlowService step.

Components of Expression Editor

Expression Editor consists of two sections:

Expression View

The Expression View section displays the conditions defined. Additionally, when you hover the mouse pointer over an operator of a condition, the condition's scope is highlighted for easy reference, as shown in the following illustration:

Expression Builder

The Expression Builder section allows you to create complex conditions by defining rules. By default, a rule step is displayed to start with the condition definition. A rule step consists of Left Operand, Operator, and Right Operand to define a condition. The operands list the fields available in the Pipeline. You can group multiple rules to define sub conditions.

Expression Builder contains the following controls that aid you in defining the complex conditions:

At any time you can either modify or delete a part or whole of the complex condition using these controls.

When you hover over an operand, the complete path of the operand appears.

Creating Complex Expressions

Example

Let us create a complex expression for checking the vaccination slot availability with the following conditions:

Before you Begin

Basic Flow

  1. Go to FlowServices.

  2. Select the project where you want to create the new FlowService. You can also create a new project.

  3. Click to create a FlowService. The Start Creating the FlowService screen appears.

  4. Provide a name, for example, GetVaccinationSlotStatus, and an optional description for the new FlowService.

  5. Select a conditional control in the FlowService step. For example, If.

  6. Click the Expression () icon on the conditional FlowService step. The Expression Editor screen appears.

  7. Perform the following steps to define a rule:

    1. Select a left operand.

    2. Select an operator.

    3. Select a right operand.

    The rule is defined. As per the example, the rule 1 - Location is Washington is defined as follows:

  8. [Optional] Click or Add Rule. A new rule step is added.

  9. Repeat the above steps to define more rules as per your requirement.

  10. Select the operator (group-level) from the And drop-down list to conjunct the previous and new rules. As per the example, rule 2 - Zipcode is 20010 is defined and both are combined with an And operator as follows:

  11. Perform the following steps to define a group:

    1. Click Add Group. A new rule step appears and is grouped under a separate block.

    2. Select the operands and operator to define the rule.

    3. [Optional] Click or Add Rule (group-level). A new rule step is added to the group.

    4. [Optional] Select the operator from the And drop-down list (group-level) to conjunct the previous and new rules in the group.

    5. Repeat the above steps to define all rules in the group. The sub condition is defined.

    As per the example, group 1 - Vaccine can be either Pfizer, Moderna, or Sputnik, group 2 - Age is less than or equal to 45 and slot exists are defined as follows. The groups and other rules are combined with an And operator.

  12. Define the rule 3 - Venue is Howard University Hospital as per the example. The expression created is as follows:

  13. [Optional] Perform the following steps to remove a rule or rule group:

    • Rule - Click adjacent to the rule that must be deleted. The rule is deleted from the complex condition.

    • Rule Group - Click Remove Group in the group that must be deleted. The rule group is deleted from the complex condition.

    Tip: You can use the keys Ctrl+Z and Ctrl+Y to undo and redo the delete actions.

  14. [Optional] Perform the following steps to negate a rule or rule group:

    • Rule - Click adjacent to the rule that must be negated. The rule is negated in the complex condition.

    • Rule Group - Click Negate Group in the group that must be negated. The rule group is negated in the complex condition. The button is renamed to Remove Negation and allows you to remove the negation for that group.

  15. Click on the Expression Editor screen. The condition is added to the FlowService step. You can click the down and up arrows in the FlowService step to view and hide the complete expression.

The complex condition is created and you can proceed with the other FlowService steps.

Creating FlowServices

See the following examples to learn how to create FlowServices. Click here for information on how to manage roles and project permissions.

Get leads from Salesforce CRM and create corresponding customer leads in Microsoft Dynamics 365 CRM

Summary

Get leads from Salesforce CRM and create corresponding customer leads in Microsoft Dynamics 365 CRM.

Before you begin

  • Log in to your tenant and enable FlowServices.
  • Check if you have the Developer and Admin roles assigned from the Settings > Roles page.
  • Note: Click here for information on how to manage roles and project permissions.

  • Obtain the credentials to log in to Salesforce CRM and Microsoft Dynamics 365 CRM back end accounts.
  • In webMethods.io Integration, create Salesforce CRM and Microsoft Dynamics 365 CRM accounts.
Basic Flow

1. Select the project where you want to create the new FlowService. You can also create a new project.

2. Click the FlowServices tab and on the FlowServices page, click the icon.

3. Provide a name, for example, SalesforceToMSDynamics365CRM, and an optional description for the new FlowService.

4. Type Salesforce in the search box and select Salesforce CRM.

5. Select the queryleads operation.

6. Select the SalesforceCRM_1 account. You can also create or configure an account inline.

7. Click the icon to add a new step.

8. Type repeat, select Repeat to create a repeat step, and then select the /Lead option.

9. Select Microsoft Dynamics 365 CRM and then select the associated createLead action.

10. Select the msdynamics_1 account. You can also create or configure an account inline.

11. Click the mapping icon to map the input and output fields.

12. Click the Pipeline Input fields (FirstName and LastName) and drag them to the service input fields (firstname and lastname). The service output is automatically mapped to the Pipeline Output by a dotted line.

13. Save the FlowService and run it by clicking the Run icon.

14. View the FlowService execution results for a successful run. The lead ID generated is acf4f843-2970-ea11-a811-000d3a4da920.

Note: If a value is null for a field or if the datatype is unknown, the obj (object) icon is displayed.

15. Download the results.

16. You can also view the previous FlowService execution results.

Note: By default, all FlowService execution results are retained for 30 days. You can optionally specify the number of days (up to 30 days) for which you would like to retain the FlowService execution logs by clicking the Modify Retention Period link available at Monitor > Execution Results > FlowService Execution. Once the retention period is over, the FlowService execution logs are deleted automatically.

Get attendees from Concur and create contacts in Salesforce CRM

Summary

Get attendees from Concur and create contacts in Salesforce CRM.

Before you begin

  • Log in to your tenant and enable FlowServices.
  • Check if you have the Developer and Admin roles assigned from the Settings > Roles page.
  • Note: Click here for information on how to manage roles and project permissions.

  • Obtain the credentials to log in to Concur and Salesforce CRM back end accounts.
  • In webMethods.io Integration, create Concur (Concur_1) and Salesforce CRM (SalesforceCRM_1) accounts.
Basic Flow

1. Select the project where you want to create the new FlowService. You can also create a new project.

2. Click the FlowServices tab and on the FlowServices page, click the icon.

3. Provide a name, for example, ConcurAttendeeToSalesforce, and an optional description for the new FlowService.

4. Type Concur in the search box, select Concur, and then in the Type to choose action field, select Add Custom Operation.

5. Do the steps as shown in the following images to create the GetConcurAttendees custom operation.

6. Select GetConcurAttendees, click the icon, and select the Concur_1 account. You can also create this account inline.

7. Click the icon to add a new step.

8. Type repeat, select Repeat to create a repeat step, and then select the /item option.

9. Select Salesforce CRM and then select the associated createcontact action.

10. Select the SalesforceCRM_1 account. You can also create or configure this account inline.

11. Click the mapping icon to map the input and output fields.

12. Map the input and output fields as shown below. Double-click the OtherCity and OtherCountry fields and set the required input values. You can also select the field and click the Set Value icon. The service output is automatically mapped to the Pipeline Output by a dotted line.

13. Save the FlowService and run it by clicking the Run icon.

14. View the FlowService execution results.

Retrieve files stored in Amazon Simple Storage Service (S3) bucket and log the content of the files

Summary

Retrieve files stored in Amazon Simple Storage Service (S3) bucket and log the content of the files.

Before you begin

  • Log in to your tenant and enable FlowServices.
  • Check if you have the Developer and Admin roles assigned from the Settings > Roles page.
  • Note: Click here for information on how to manage roles and project permissions.

  • Obtain the credentials to log in to Amazon S3 back end account.
  • In webMethods.io Integration, create an Amazon S3 account, for example, AmazonSimpleStorageServiceS3_1.
  • Amazon S3 Bucket name to retrieve the files.
  • Basic Flow

    1.Select the project where you want to create the new FlowService. You can also create a new project.

    2.Click the FlowServices tab and on the FlowServices page, click the icon.

    3.Provide a name, for example, AmazonS3, and an optional description for the new FlowService.

    4.Type Amazon Simple Storage Service (S3) in the search box, select it, and then select the getBucket operation.

    5.Select the AmazonSimpleStorageServiceS3_1 account. You can also create or configure an account inline.

    6.Click the mapping icon to set a value for getBucketInput.

    7.Expand getBucketInput and click bucketName.

    8.Type softwareagbucketaws as the value for bucketName.

    9.Click the icon to add a new step. Then type repeat to select the Repeat step and select the /content option.

    10.Select Amazon Simple Storage Service (S3) and choose the getObject action to retrieve the object from the specified bucket.

    11.Click the mapping icon to map the input and output fields.

    12.As shown below, click the pipeline input fields (name and key) and drag them to the service input fields (bucketName and objectName). The service output (getObjectOutput) is automatically mapped to the pipeline output by a dotted line.


    13.Click the + icon to add a new step as shown below.

    14.Select the Flow function and then select the logCustomMessage service. You can also type logCustomMessage and select it. This logs a message, which you can view in the FlowService execution results screen.

    15.Map the input (stream) to message. This step logs the contents of the files in the bucket.

    16.Save the FlowService and then run it.

    17.You can also view the execution result in the Monitor > FlowService Execution page. Click the execution result name link to view the execution details.

    Creating Custom Operations in FlowServices

    webMethods.io Integration provides predefined connectors, which contain SaaS provider-specific information that enables you to connect to a particular SaaS provider. Further, each connector uses an account to connect to the provider’s back end and perform operations. Each connector comes with a predefined set of operations. You can also create your own custom operations while creating a FlowService.


    Let us see how to create a custom operation while creating a FlowService and then use that custom operation to create a Salesforce CRM back end account.

    1.After you log in, select a project or create a new project where you want to create the FlowService.

    2.Click the FlowServices tab and on the FlowServices page, click the icon.

    3.Provide a name, for example, SalesforceCreateAccountCustom, and a description for the new FlowService.

    4.Type Salesforce in the search box, select Salesforce CRM, and then select Add Custom Operation.

    5.On the Connect to account page, select the supported Authentication type and the Salesforce CRM account created from the drop-down list. Provide a name and description of the custom operation.

    6.Select the create operation.

    Note that for REST-based connectors, after selecting the operation, you can click Headers to add input headers, if required. webMethods.io Integration displays the default HTTP transport headers for the operation, along with their default values. At run time, while processing the headers, webMethods.io Integration substitutes the values as necessary. To customize the headers, do the following:

    a. Click + Add to add a custom Header.

    b. Click the icon to specify the header name and an optional default value for the header variable. If the variable is null in the input pipeline, this value will be used at run time. If the variable already has an existing default value defined, this value will overwrite the existing value at run time.

    c. If headers appear in the signature, select Active to activate the headers in the signature.

    d. To delete a custom header that you have added, click Delete.

    Note: You cannot delete the required headers.

    You can also customize the parameters for REST-based connectors after selecting the operation, by clicking the Parameter option. Review the operation parameter details. webMethods.io Integration displays the parameter Name and Description, the Data Type used to represent the kind of information the parameter can hold, the parameterization Type of the request, and the default value needed to access the operation. To specify a default value for the parameter, click the icon and then type or paste the default value. The default value is used at run time. You cannot add or delete request parameters. You can only modify the default value of a parameter. All parameters appear in the signature.

    Now let us go back to our example.

    7. After you select the create operation, click Next, and then select the Business Object Account. Business Objects appear only for certain connectors and operations.

    8.Select the data fields and confirm the action to create the custom operation. Data fields appear only for certain connectors and operations.

    9.On the FlowService editor, click the icon to define the input and output fields.

    10.Create two Input Fields, AccountName and CityName.

    11.Click to edit the FlowService mapping. Only fields selected earlier are shown in the input panel.

    12.Save the FlowService and click Run. Provide the custom field values in the AccountName and CityName fields, for example, Software AG and Bangalore respectively, and then run the FlowService.

    As the results show, the Software AG account is created.

    Support for Multipart Request Body

    webMethods.io Integration supports the multipart/form-data media type, which lets you embed binary data such as files into the request body. Though application/x-www-form-urlencoded is a more natural way of encoding, it becomes inefficient for encoding binary data or text containing non-ASCII characters. The multipart/form-data media type is therefore preferred for request payloads that contain files, non-ASCII text, and binary data.

    For example, if you want to create a user and also upload a photo, the request has to be a multipart request, where one part is an image file and the other part is a JSON request body. Similarly, uploading individual CRM contacts to Salesforce one record at a time is time consuming; using the MIME/multipart attachments capability, you can upload a CSV/JSON file containing data for multiple contacts to Salesforce in a single run.

    For some connectors and operations, for example, Salesforce(R) Bulk v2 Data Loader, DocuWare, Google Drive, Amazon S3, and FreshService, webMethods.io Integration supports a multipart request body.

    Example of a multipart request

    A multipart/form-data request body contains a series of parts separated by a boundary delimiter, constructed using a Carriage Return Line Feed (CRLF), "--", and the value of the boundary parameter. The boundary delimiter must not appear inside any of the encapsulated parts.

    Each part has the Name, Type, Content-Type, and Part ID fields.

    --BOUNDARY
    Content-Type: application/json
    Content-Disposition: form-data; name="job"

    {
    "object":"Contact",
    "contentType":"CSV",
    "operation":"insert"
    }
    --BOUNDARY
    Content-Type: text/csv
    Content-Disposition: form-data; name="content"; filename="content"

    (Content of your CSV file)
    --BOUNDARY--
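    The boundary-delimited layout above can be reproduced programmatically. The following Java sketch assembles such a multipart/form-data body as a string; the boundary value, JSON job description, and CSV content are illustrative assumptions, not values required by webMethods.io Integration:

```java
public class MultipartBodyDemo {
    public static void main(String[] args) {
        String boundary = "BOUNDARY";   // assumed boundary value
        String crlf = "\r\n";           // CRLF separates the lines of each part
        String job = "{\"object\":\"Contact\",\"contentType\":\"CSV\",\"operation\":\"insert\"}";
        String csv = "FirstName,LastName\nAda,Lovelace";  // stand-in CSV content

        StringBuilder body = new StringBuilder();
        body.append("--").append(boundary).append(crlf)
            .append("Content-Type: application/json").append(crlf)
            .append("Content-Disposition: form-data; name=\"job\"").append(crlf)
            .append(crlf).append(job).append(crlf)
            .append("--").append(boundary).append(crlf)
            .append("Content-Type: text/csv").append(crlf)
            .append("Content-Disposition: form-data; name=\"content\"; filename=\"content\"").append(crlf)
            .append(crlf).append(csv).append(crlf)
            // The closing delimiter repeats the boundary and appends a trailing "--"
            .append("--").append(boundary).append("--").append(crlf);

        System.out.print(body);
    }
}
```

    Note how each part carries its own Content-Type and Content-Disposition headers, separated from the part body by an empty line.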

    Part Configuration

    While adding a custom action, for example, for the Freshservice Create Child Ticket With Attachments operation, you can click Attachments to view the list of all the configured parts to be sent to the service provider. You can send a multipart/form-data payload which contains either a file, or text, or a document type.

    The parts to be sent to the service provider appear in the input signature. All options including the Add option to add a custom part are disabled if the resource is not of type multipart/form-data. Currently, multipart/form-data payload is supported only at the request level, that is, only in the input signature.

    Example to create a ticket with attachments (multipart data) in Freshservice

    Before you begin

    Basic Flow

    1. Select the project where you want to create the new FlowService. You can also create a new project.
    2. Click the FlowServices tab and on the FlowServices page, click the icon.

    3. Provide a name for the FlowService, for example, CreateTicketWithAttachment, and an optional description for the new FlowService.

    4. Upload a file from the FTP server to Freshservice. Select the File Transfer Protocol (FTP/FTPS) connector, the getFile operation, and the FTPS_4 account.

    5. Let us upload the hello.txt file available on the FTP server to Freshservice. Click Set Value for remoteFile and select the file in the Set Value - remoteFile dialog box. Click Save.

    6. Select the Freshservice connector. As you are creating a ticket with an attachment, select the createTicketWithAttachment operation and the Freshservice_1 account.

    7. Click the Edit Mapping icon. Under Pipeline Input, map contentStream to value.

    8. Set the following values on the mapping screen:

      • Set the value for filename as hello.txt.
      • Set the value for email as abc@xxxx.com.
      • Set the value for subject as create ticket.
      • Set the value for description as create ticket with attachment.
      • Set the value for priority as 1.
      • Set the value for status as 2.
      • Set the value for requester_id as 27000450816. This is the agent ID in Freshservice.
      • Set the value for phone as 123456789.
      • Set the value for source as test.
    9. Save and run the FlowService.

    10. Go to Freshservice, click Tickets, and check that the ticket is generated.

    Create bulk accounts in Salesforce using a multipart predefined operation

    For some connectors and operations, for example, the Salesforce(R) Bulk v2 Data Loader connector and its createAndUploadDataUsingMultipart predefined operation, webMethods.io Integration supports a multipart request body.

    Before you begin

  • Log in to your tenant and enable FlowServices.
  • Check if you have the Developer and Admin roles assigned from the User Management > Roles page.
  • Obtain the credentials to log in to the Salesforce back end account.
  • In webMethods.io Integration, create the Salesforce(R) Bulk v2 Data Loader account, SFBulkV2_02Apr.
  • Basic Flow

    1.Select the project where you want to create the new FlowService. You can also create a new project.

    2.Click the FlowServices tab and on the FlowServices page, click the icon.

    3.Provide a name and an optional description for the new FlowService.

    4.Type IO in the search box, select IO, and then select the stringToStream service.

    5.Click the mapping icon to map the input and output fields.

    6.Set the below value for string:

    7.Set the value for encoding as UTF-8.

    8.Click the icon to add a new step.

    9.Select the Salesforce(R) Bulk v2 Data Loader connector and the createAndUploadDataUsingMultipart predefined operation. Select the SFBulkV2_02Apr account.

    10.Click the mapping icon to map the input and output fields.

    11.Map inputStream to value. The service output is automatically mapped to the Pipeline Output by a dotted line.

    12.Set values for fileroot as shown below:

    13.For fileRoot1, set values for name as content, contentType as text/csv, type as FILE, and filename as content.

    14.Save the FlowService and run it by clicking the Run icon.

    15.View the FlowService execution results for a successful run.

    Built-in Services

    webMethods.io Integration has an extensive library of services for performing common integration tasks such as transforming data values, performing simple mathematical operations, and so on.

    Services are invoked at run time. While creating a FlowService, you can sequence services and manage the flow of data among them.

    Related services are grouped in categories. Input and output parameters are the names and types of fields that the service requires as input and generates as output and these parameters are collectively referred to as a signature.

    Service Category Description
    Compress Use Compress services to compress the data before sending the HTTP request and decompress it after receiving the HTTP response.
    Date Use Date services to generate and format date values.
    Datetime Use Datetime services to build or increment a date/time. The services in datetime provide more explicit timezone processing than similar services in the Date category.
    Document Use Document services to perform operations on documents.
    Flat File Use Flat File services to convert data bytes, data stream, and data string to a document and vice versa.
    Flow Use Flow services to perform utility-type tasks.
    Hashtable Use Hashtable services to create, update, and obtain information about the hashtable.
    IO Use IO services to convert data between byte[ ], characters, and InputStream representations. These services are used for reading and writing bytes, characters, and streamed data to the file system. These services behave like the corresponding methods in the java.io.InputStream class. These services can be invoked only by other services. Streams cannot be passed between clients and the server, so these services will not execute if they are invoked from a client.
    JSON Use JSON services to convert JSON content into a document and to convert a document into JSON content.
    List Use List services to retrieve, replace, or add elements in an Object List, Document List, or String List, including converting String Lists to Document Lists.
    Math Use Math services to perform mathematical operations on string-based numeric values. Services that operate on integer values use Java’s long data type (64-bit, two’s complement). Services that operate on float values use Java’s double data type (64-bit IEEE 754). If extremely precise calculations are critical to your application, you should write your own Java services to perform math functions.
    MIME Use MIME services to create MIME messages and extract information from MIME messages.
    Storage Use Storage services to insert, retrieve, update, and remove entries from a data store.
    String Use String services to perform string manipulation and substitution operations.
    Transaction Use Transaction services only in conjunction with the Database Application operations. These services are applicable when the Database Application account is of type Transactional.
    Utils Contains utility services.
    XML Use XML services to convert a document to XML content and XML content to a document.

    Compress Services

    Use Compress services to compress the data before sending the HTTP request and decompress it after receiving the HTTP response.

    The following Compress services are available:

    Service Description
    compressData Performs compression of data.
    decompressData Performs decompression of data.

    compressData

    Compresses the data before sending the HTTP request using any of the specified compression schemes.

    Input Parameters

    Output Parameters

    decompressData

    Decompresses the data based on the response header of the HTTP response.

    Input Parameters

    Output Parameters

    Date Services

    Use Date services to generate and format date values.

    Pattern String Symbols - Many of the Date services require you to specify pattern strings describing the data’s current format and/or the format to which you want it converted. For services that require a pattern string, use the symbols in the following table to describe the format of your data. For example, to describe a date in the January 15, 1999 format, you would use the pattern string MMMMM dd, yyyy. To describe the format 01/15/99, you would use the pattern string MM/dd/yy.

    Symbol Meaning Presentation Example
    G era designator Text AD
    y year Number 1996 or 96
    M month in year Text or Number July or Jul or 07
    d day in month Number 10
    h hour in am/pm (1-12) Number 12
    H hour in day (0-23) Number 0
    m minute in hour Number 30
    s second in minute Number 55
    S millisecond Number 978
    E day in week Text Tuesday or Tue
    D day in year Number 189
    F day of week in month Number 2 (2nd Wed in July)
    w week in year Number 27
    W week in month Number 2
    a am/pm marker Text PM
    k hour in day (1-24) Number 24
    K hour in am/pm (0-11) Number 0
    z time zone Text Pacific Standard Time or PST or GMT-08:00
    Z RFC 822 time zone (JVM 1.4 or later) Number -0800 (offset from GMT/UT)
    ' escape for text Delimiter
    '' single quote Literal '
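    These pattern symbols mirror Java's SimpleDateFormat pattern letters, so you can verify a pattern with a short sketch (the class name and the fixed English locale are illustrative choices, made so the month name is reproducible):

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.Locale;

public class PatternDemo {
    public static void main(String[] args) {
        // January 15, 1999 (months are zero-based in Calendar)
        Calendar cal = new GregorianCalendar(1999, Calendar.JANUARY, 15);

        // Four or more M's produce the full month name
        String longForm = new SimpleDateFormat("MMMMM dd, yyyy", Locale.ENGLISH)
                .format(cal.getTime());
        String shortForm = new SimpleDateFormat("MM/dd/yy", Locale.ENGLISH)
                .format(cal.getTime());

        System.out.println(longForm);  // January 15, 1999
        System.out.println(shortForm); // 01/15/99
    }
}
```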

    Time Zones - When working with date services, you can specify time zones. The Earth is divided into 24 standard time zones, one for every 15 degrees of longitude. Using the time zone that includes Greenwich, England (known as Greenwich Mean Time, or GMT) as the starting point, the time increases by an hour for each time zone east of Greenwich and decreases by an hour for each time zone west of Greenwich. The difference between a time zone and the time zone including Greenwich, England (GMT) is referred to as the raw offset.

    The following table identifies the different time zones for the Earth and the raw offset for each zone from Greenwich, England. The effects of daylight savings time are ignored in this table.

    Note: Greenwich Mean Time (GMT) is also known as Universal Time (UT).

    ID Raw Offset Name
    MIT -11 Midway Islands Time
    HST -10 Hawaii Standard Time
    AST -9 Alaska Standard Time
    PST -8 Pacific Standard Time
    PNT -7 Phoenix Standard Time
    MST -7 Mountain Standard Time
    CST -6 Central Standard Time
    EST -5 Eastern Standard Time
    IET -5 Indiana Eastern Standard Time
    PRT -4 Puerto Rico and U.S. Virgin Islands Time
    CNT -3.5 Canada Newfoundland Time
    AGT -3 Argentina Standard Time
    BET -3 Brazil Eastern Time
    GMT 0 Greenwich Mean Time
    ECT +1 European Central Time
    CAT +2 Central Africa Time
    EET +2 Eastern European Time
    ART +2 (Arabic) Egypt Standard Time
    EAT +3 Eastern African Time
    MET +3.5 Middle East Time
    NET +4 Near East Time
    PLT +5 Pakistan Lahore Time
    IST +5.5 India Standard Time
    BST +6 Bangladesh Standard Time
    VST +7 Vietnam Standard Time
    CTT +8 China Taiwan Time
    JST +9 Japan Standard Time
    ACT +9.5 Australian Central Time
    AET +10 Australian Eastern Time
    SST +11 Solomon Standard Time
    NST +12 New Zealand Standard Time

    Examples - You can specify timezone input parameters in the following formats:

    The following Date services are available:

    Service Description
    calculateDateDifference Calculates the difference between two dates and returns the result as seconds, minutes, hours, and days.
    compareDates Compares two dates and returns the result as integer.
    currentNanoTime Returns the current time returned by the most precise system timer, in nanoseconds.
    dateBuild Builds a date String using the specified pattern and the specified date services.
    dateTimeBuild Builds a date/time string using the specified pattern and the specified date services.
    dateTimeFormat Converts date/time (represented as a String) string from one format to another.
    elapsedNanoTime Calculates the time elapsed between the current time and the given time, in nanoseconds.
    formatDate Formats a Date object as a string.
    getCurrentDate Returns the current date as a Date object.
    getCurrentDateString Returns the current date as a String in a specified format.
    incrementDate Increments a date by a specified period.

    calculateDateDifference

    Calculates the difference between two dates and returns the result as seconds, minutes, hours, and days.

    Input Parameters

    Output Parameters

    Usage Notes

    Each output value represents the same date difference, but in a different scale. Do not add these values together. Make sure your subsequent FlowService steps use the correct output, depending on the scale required.
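    The point about scales can be illustrated with plain Java date arithmetic; this is a sketch of the semantics, not the service's actual implementation, and the pattern and dates are made-up values:

```java
import java.text.SimpleDateFormat;
import java.util.Locale;

public class DateDifferenceSketch {
    public static void main(String[] args) throws Exception {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss", Locale.ENGLISH);
        long millis = fmt.parse("2023-01-03 12:00:00").getTime()
                    - fmt.parse("2023-01-01 12:00:00").getTime();

        // Each value expresses the SAME difference on a different scale;
        // adding them together would count the interval several times over.
        long seconds = millis / 1000;   // 172800
        long minutes = seconds / 60;    // 2880
        long hours   = minutes / 60;    // 48
        long days    = hours / 24;      // 2
        System.out.println(days + " days = " + hours + " hours = "
                + minutes + " minutes = " + seconds + " seconds");
    }
}
```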

    compareDates

    Compares two dates and returns the result as an integer.

    Input Parameters

    Output Parameters

    Usage Notes

    If the formats specified in the startDatePattern and endDatePattern parameters are different, webMethods.io Integration treats the units that are not specified in the startDate and endDate values as 0.

    That is, if the startDatePattern is yyyyMMdd HH:mm and the startDate is 20151030 11:11, and if the endDatePattern is yyyyMMdd HH:mm:ss.SSS and the endDate is 20151030 11:11:55.111, then the compareDates service considers the start date to be before the end date and returns the result as -1.

    To calculate the difference between two dates, use the calculateDateDifference service.
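    The effect of treating unspecified units as 0 can be sketched with Java's SimpleDateFormat (an illustration of the comparison semantics, not the service itself; the patterns and timestamps come from the example above):

```java
import java.text.SimpleDateFormat;
import java.util.Locale;

public class CompareDatesSketch {
    public static void main(String[] args) throws Exception {
        // Units absent from a pattern parse as 0, so the shorter timestamp
        // sorts before the one that also carries seconds and milliseconds.
        SimpleDateFormat startFmt = new SimpleDateFormat("yyyyMMdd HH:mm", Locale.ENGLISH);
        SimpleDateFormat endFmt = new SimpleDateFormat("yyyyMMdd HH:mm:ss.SSS", Locale.ENGLISH);

        long start = startFmt.parse("20151030 11:11").getTime();
        long end = endFmt.parse("20151030 11:11:55.111").getTime();

        int result = start < end ? -1 : (start > end ? 1 : 0);
        System.out.println(result); // -1: the start date is before the end date
    }
}
```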

    currentNanoTime

    Returns the current time returned by the most precise system timer, in nanoseconds.

    Input Parameters

    None.

    Output Parameters

    dateBuild

    Builds a date String using the specified pattern and the specified date services.

    Input Parameters

    Output Parameters

    dateTimeBuild

    Builds a date/time string using the specified pattern and the specified date services.

    Input Parameters

    Output Parameters

    dateTimeFormat

    Converts date/time (represented as a String) string from one format to another.

    Input Parameters

    Output Parameters

    Usage Notes

    As described in the “Notes on Invalid Dates” section, if the pattern yy is used for the year, dateTimeFormat uses a 50-year moving window to interpret the value of the year.

    If currentPattern does not contain a time zone, the value is assumed to be in the default time zone.

    If newPattern contains a time zone, the default time zone is used.

    elapsedNanoTime

    Calculates the time elapsed between the current time and the given time, in nanoseconds.

    Input Parameters

    Output Parameters

    formatDate

    Formats a Date object as a string.

    Input Parameters

    Output Parameters

    getCurrentDate

    Returns the current date as a Date object.

    Input Parameters

    None.

    Output Parameters

    getCurrentDateString

    Returns the current date as a String in a specified format.

    Input Parameters

    Output Parameters

    incrementDate

    Increments a date by a specified amount of time.

    Input Parameters

    Output Parameters

    Usage Notes

    The addYears, addMonths, addDays, addHours, addMinutes, addSeconds, and addMilliSeconds input parameters can take positive or negative values. For example, if startDate is 10/10/2001, startDatePattern is MM/dd/yyyy, addYears is 1, and addMonths is -1, endDate will be 09/10/2002.

    If you specify only the startDate, startDatePattern, and endDatePattern input parameters and do not specify any of the optional input parameters to increment the period, the incrementDate service just converts the format of startDate from startDatePattern to endDatePattern and returns it as endDate.

    The format of the date specified in the startDate parameter must match the format specified in the startDatePattern and the format of the date specified in the endDate parameter must match the endDatePattern format.
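    The increment behavior described above can be approximated with java.util.Calendar; this is a sketch of the semantics using the worked example from the usage notes, not the service's implementation:

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Locale;

public class IncrementDateSketch {
    public static void main(String[] args) throws Exception {
        SimpleDateFormat fmt = new SimpleDateFormat("MM/dd/yyyy", Locale.ENGLISH);

        Calendar cal = Calendar.getInstance();
        cal.setTime(fmt.parse("10/10/2001")); // startDate
        cal.add(Calendar.YEAR, 1);            // addYears = 1
        cal.add(Calendar.MONTH, -1);          // addMonths = -1

        System.out.println(fmt.format(cal.getTime())); // 09/10/2002
    }
}
```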

    Datetime Services

    Use Datetime services to build or increment a date/time. The services in datetime provide more explicit timezone processing than similar services in the date category.

    Providing Time Zones

    You can specify timezone input parameters to the datetime services in the following formats:

    The following Datetime services are available:

    Service Description
    build Builds a date/time string using the specified pattern and the supplied date/time elements.
    increment Increments or decrements a date and time by a specified amount of time.

    build

    Builds a date/time string using the specified pattern and the supplied date/time elements.

    Input Parameters

    Output Parameters

    Usage Notes

    The build service replaces the date:dateBuild and date:dateTimeBuild services, which are deprecated.

    If you specify a parameter that does not exist in the supplied pattern, the service ignores that parameter.

    If you do not specify a timezone, useSystemTimeZone is set to false, and the pattern includes a time zone, the service ends with an exception.

    If a time zone is provided as input to the service either in the timezone parameter or by setting useSystemTimeZone to true, the build service calculates the date/time starting with a “zoned” date/time. The resulting values can differ when daylight savings time transitions are in effect. If no time zone is provided as input to the service either by not specifying timezone or by setting useSystemTimeZone to false, then the build service calculates the date/time starting with an “unzoned” date/time.

    The build service is similar to date:dateBuild and date:dateTimeBuild; however, the build service allows building a date/time that does not include a time zone. Furthermore, the build service assembles a date/time using each of the provided parameters. Consequently, the build service can build a date/time with a value that would be invalid in the current time zone, such as a date/time that falls into the gap of a daylight saving time transition. This is unlike the date:dateBuild and date:dateTimeBuild services, which build a local java.util.Date object that uses the time zone of the machine running webMethods.io Integration and then apply the offset between the local time zone and the specified time zone.

    increment

    Increments or decrements a date and time by a specified amount of time.

    Input Parameters

    Output Parameters

    Usage Notes

    The increment service replaces the date:incrementDate service, which is deprecated.

    The increment service can be used to decrement a date and time by specifying negative numbers. The addYears, addMonths, addDays, addHours, addMinutes, addSeconds, and addMilliSeconds input parameters can take positive or negative values.

    The service ends with an exception if the format of the date specified in the startDate parameter does not match the format specified in the startDatePattern.

    If endDatePattern includes a time zone, such as “z”, then the input string and startDatePattern must also have time zone fields, or timeZone must be set, or useSystemTimeZone must be true. Otherwise the service ends with an error.

    If you specify only the startDate, startDatePattern, and endDatePattern input parameters and do not specify any of the optional input parameters to increment the period, the increment service just converts the format of startDate from startDatePattern to endDatePattern and returns it as endDate.

    If you specify a value for timezone, the service ignores the useSystemTimeZone parameter value.

    If you specify a value for timezone and the startDate includes a time zone, then the service uses the supplied timezone to convert the startDate time zone.

    If you do not specify a value for timezone and the startDate includes a time zone, then the service uses the time zone in the startDate and ignores the useSystemTimeZone parameter.

    If you do not specify a value for timezone, the startDate does not include a time zone, and useSystemTimeZone parameter is true, then the service uses the system time zone.

    If startDate does not include a time zone, you do not specify a value for timezone, and useSystemTimeZone is false, the resulting endDate will not include a time zone.

    The increment service is similar to date:incrementDate, however, the increment service provides more specific handling of time zones. To match the behavior of date:incrementDate, set useSameInstant to true.

    Document Services

    Use Document services to perform operations on documents.

    The following Document services are available:

    Service Description
    bytesToDocument Converts an array of bytes to a document.
    deleteDocuments Deletes the specified documents from a set of documents.
    documentListToDocument Constructs a document from a document list by generating key/value pairs from the values of two elements that you specify in the document list.
    documentToBytes Converts a document to an array of bytes.
    documentToDocumentList Expands the contents of a document into a list of documents. Each key/value pair in the source document is transformed to a single document containing two keys (whose names you specify), holding the key name and value of the original pair.
    findDocuments Searches a set of documents for entries matching a set of criteria.
    groupDocuments Groups a set of documents based on specified criteria.
    insertDocument Inserts a new document in a set of documents at a specified position.
    searchDocuments Searches a set of documents for entries matching a set of criteria.
    sortDocuments Sorts a set of input documents based on the specified sortCriteria.

    bytesToDocument

    Converts an array of bytes to a document. This service can only be used with byte arrays created by executing the documentToBytes service.

    Input Parameters

    Output Parameters

    Usage Notes

    Use this service with the documentToBytes service, which converts a document into a byte array. You can pass the resulting byte array to the bytesToDocument service to convert it back into the original document.

    In order for the document-to-bytes-to-document conversion to work, the entire content of the document must be serializable. Every object in the document must be of a data type known to webMethods.io Integration, or it must support the java.io.Serializable interface.

    If webMethods.io Integration encounters an unknown object in the document that does not support the java.io.Serializable interface, that object’s value will be lost. It will be replaced with a string containing the object’s class name.
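    The round trip described here behaves like standard Java object serialization. A minimal sketch, assuming a document is represented as a serializable Map (the key names and values are made up; this is analogous to, not the implementation of, documentToBytes and bytesToDocument):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.HashMap;
import java.util.Map;

public class DocumentBytesRoundTrip {
    public static void main(String[] args) throws Exception {
        // A "document" as a serializable map of known data types
        Map<String, Object> document = new HashMap<>();
        document.put("AccountName", "Software AG");
        document.put("CityName", "Bangalore");

        // document -> bytes (analogous to documentToBytes)
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(document);
        oos.flush();
        byte[] bytes = bos.toByteArray();

        // bytes -> document (analogous to bytesToDocument)
        ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes));
        @SuppressWarnings("unchecked")
        Map<String, Object> restored = (Map<String, Object>) ois.readObject();

        System.out.println(restored.equals(document)); // true
    }
}
```

    The round trip only succeeds because every value in the map supports java.io.Serializable, which is the same constraint the usage notes describe.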

    deleteDocuments

    Deletes the specified documents from a set of documents.

    Input Parameters

    Output Parameters

    Usage Notes

    The deleteDocuments service returns an error if the indices parameter value is less than zero or more than the number of documents in the documents input parameter.

    documentListToDocument

    Constructs a document from a document list by generating key/value pairs from the values of two elements that you specify in the document list.

    Input Parameters

    Output Parameters

    Usage Notes

    The following example illustrates how the documentListToDocument service would convert a document list that contains three documents to a single document containing three key/value pairs. When you use the documentListToDocument service, you specify which two elements from the source list are to be transformed into the keys and values in the output document. In the following example, the values from the pName elements in the source list are transformed into key names, and the values from the pValue elements are transformed into the values for these keys.

    A documentList containing these three documents:

        Key      Value
        pName    cx_timeout
        pValue   1000

        Key      Value
        pName    cx_max
        pValue   2500

        Key      Value
        pName    cx_min
        pValue   10

    Would be converted to a document containing these three keys:

        Key         Value
        cx_timeout  1000
        cx_max      2500
        cx_min      10
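    The pName/pValue transformation above can be sketched in plain Java, with documents modeled as Maps. The class and method names here are illustrative only, not the service's actual implementation:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DocListToDoc {
    // Build a single document (Map) from a document list: the value stored
    // under keyName becomes the new key, the value under valueName its value.
    public static Map<String, Object> toDocument(
            List<Map<String, Object>> docList, String keyName, String valueName) {
        Map<String, Object> result = new LinkedHashMap<>();
        for (Map<String, Object> doc : docList) {
            result.put((String) doc.get(keyName), doc.get(valueName));
        }
        return result;
    }

    public static void main(String[] args) {
        List<Map<String, Object>> docList = new ArrayList<>();
        docList.add(doc("pName", "cx_timeout", "pValue", "1000"));
        docList.add(doc("pName", "cx_max", "pValue", "2500"));
        docList.add(doc("pName", "cx_min", "pValue", "10"));
        // Prints the three resulting key/value pairs.
        System.out.println(toDocument(docList, "pName", "pValue"));
    }

    private static Map<String, Object> doc(String k1, Object v1, String k2, Object v2) {
        Map<String, Object> m = new LinkedHashMap<>();
        m.put(k1, v1);
        m.put(k2, v2);
        return m;
    }
}
```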

    documentToBytes

    Converts a document to an array of bytes.

    Input Parameters

    Output Parameters

    Usage Notes

    Use the documentToBytes service with the bytesToDocument service, which converts the byte array created by this service back into the original document.

    The documentToBytes service is useful when you want to write a document to a file, an input stream, or a cache.

    In order for the document-to-bytes-to-document conversion to work, the entire content of the document must be serializable. Every object in the document must be of a data type known to webMethods.io Integration, or it must support the java.io.Serializable interface. If webMethods.io Integration encounters an unknown object in the document that does not support the java.io.Serializable interface, that object’s value will be lost. webMethods.io Integration will replace it with a string containing the object’s class name.
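    The round trip described above resembles standard Java object serialization. A minimal sketch under that assumption (the service's internal format is not documented here, so this is an analogy, not its implementation):

```java
import java.io.*;
import java.util.HashMap;

public class DocBytesRoundTrip {
    // Serialize a document (here a HashMap) to bytes; every value in the
    // map must implement java.io.Serializable for this to succeed.
    public static byte[] documentToBytes(HashMap<String, Object> doc) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(doc);
            }
            return bos.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @SuppressWarnings("unchecked")
    public static HashMap<String, Object> bytesToDocument(byte[] bytes) {
        try (ObjectInputStream ois =
                new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (HashMap<String, Object>) ois.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        HashMap<String, Object> doc = new HashMap<>();
        doc.put("cx_timeout", "1000");
        HashMap<String, Object> restored = bytesToDocument(documentToBytes(doc));
        System.out.println(restored.get("cx_timeout")); // 1000
    }
}
```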

    documentToDocumentList

    Expands the contents of a document into a list of documents.

    Each key/value pair in the source document is transformed to a single document containing two keys (whose names you specify). These two keys will contain the key name and value of the original pair.

    Input Parameters

    Output Parameters

    Usage Notes

    The following example shows how a document containing three keys would be converted to a document list containing three documents. In this example, the names pName and pValue are specified as names for the two new keys in the document list.

    A document containing these three keys:

        Key         Value
        cx_timeout  1000
        cx_max      2500
        cx_min      10

    Would be converted to a document list containing these three documents:

        Key      Value
        pName    cx_timeout
        pValue   1000

        Key      Value
        pName    cx_max
        pValue   2500

        Key      Value
        pName    cx_min
        pValue   10
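    This is the inverse of the documentListToDocument transformation, and can be sketched the same way in plain Java (illustrative names, not the service's code):

```java
import java.util.*;

public class DocToDocList {
    // Expand each key/value pair of a document into its own two-field
    // document, using the supplied names for the two new keys.
    public static List<Map<String, Object>> toDocumentList(
            Map<String, Object> doc, String keyName, String valueName) {
        List<Map<String, Object>> out = new ArrayList<>();
        for (Map.Entry<String, Object> e : doc.entrySet()) {
            Map<String, Object> d = new LinkedHashMap<>();
            d.put(keyName, e.getKey());
            d.put(valueName, e.getValue());
            out.add(d);
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new LinkedHashMap<>();
        doc.put("cx_timeout", "1000");
        doc.put("cx_max", "2500");
        doc.put("cx_min", "10");
        // Prints three documents, each with a pName and a pValue field.
        System.out.println(toDocumentList(doc, "pName", "pValue"));
    }
}
```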

    findDocuments

    Searches a set of documents for entries matching a set of criteria.

    Input Parameters

    Output Parameters

    groupDocuments

    Groups a set of documents based on specified criteria.

    Input Parameters

    Output Parameters

    Usage Notes

    The following example illustrates how to specify the values for the groupCriteria parameter:

    key        compareStringsAs  pattern
    name       string
    age        numeric
    birthdate  datetime          yyyy-MM-dd

    The input documents will be grouped based on name, age, and birth date.
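    Grouping by several keys at once can be sketched in plain Java with Collectors.groupingBy over a composite key. This simplified sketch compares values by equality only; it does not reproduce the service's compareStringsAs numeric and datetime modes:

```java
import java.util.*;
import java.util.stream.Collectors;

public class GroupDocs {
    // Group documents by the values of the given keys; the composite
    // key is the List of those values in order.
    public static Map<List<Object>, List<Map<String, Object>>> group(
            List<Map<String, Object>> docs, List<String> keys) {
        return docs.stream().collect(Collectors.groupingBy(
                doc -> keys.stream().map(doc::get).collect(Collectors.toList()),
                LinkedHashMap::new,
                Collectors.toList()));
    }

    public static void main(String[] args) {
        List<Map<String, Object>> docs = List.of(
                Map.<String, Object>of("name", "ana", "age", "30"),
                Map.<String, Object>of("name", "ana", "age", "30"),
                Map.<String, Object>of("name", "bo", "age", "41"));
        // Two groups: [ana, 30] with two documents, [bo, 41] with one.
        System.out.println(group(docs, List.of("name", "age")).size()); // 2
    }
}
```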

    insertDocument

    Inserts a new document in a set of documents at a specified position.

    Input Parameters

    Output Parameters

    searchDocuments

    Searches a set of documents for entries matching a set of criteria.

    Input Parameters

    Output Parameters

    Usage Note

    For example, if you want to search a set of documents for documents where BirthDate is 10th January 2008, the values for the searchCriteria parameter would be:

    key        value       compareStringsAs  pattern
    Birthdate  2008-01-10  datetime          yyyy-MM-dd

    sortDocuments

    Sorts a set of input documents based on the specified sortCriteria.

    Input Parameters

    Output Parameters

    Usage Notes

    For example, if you want to sort a set of documents based on name, age, and then on birth date, the values for sortCriteria parameter would be:

    key        order       compareStringsAs  pattern
    Name       ascending   string
    Age        descending  numeric
    Birthdate  ascending   datetime          yyyy-MM-dd
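    A multi-key sort like this can be sketched in plain Java with a chained Comparator; the first two criteria rows above (Name ascending as a string, Age descending as a number) look like this (illustrative code, not the service's implementation):

```java
import java.util.*;

public class SortDocs {
    // Sort by Name ascending (string comparison), then Age descending
    // (numeric comparison), mirroring two sortCriteria rows.
    public static List<Map<String, String>> sortDocs(List<Map<String, String>> docs) {
        List<Map<String, String>> sorted = new ArrayList<>(docs);
        sorted.sort(Comparator.<Map<String, String>, String>comparing(d -> d.get("Name"))
                .thenComparing(d -> Integer.parseInt(d.get("Age")),
                        Comparator.reverseOrder()));
        return sorted;
    }

    public static void main(String[] args) {
        List<Map<String, String>> docs = List.of(
                Map.of("Name", "bo", "Age", "9"),
                Map.of("Name", "ana", "Age", "7"),
                Map.of("Name", "ana", "Age", "30"));
        for (Map<String, String> d : sortDocs(docs)) {
            System.out.println(d.get("Name") + " " + d.get("Age"));
        }
        // ana 30
        // ana 7
        // bo 9
    }
}
```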

    Flat File Services

    Use Flat File services to convert data bytes, data stream, and data string to a document and vice versa.

    The following Flat File services are available:

    Service Description
    delimitedDataBytesToDocument Converts delimited data bytes (byte array) to a document.
    delimitedDataStreamToDocument Converts a delimited data stream to a document.
    delimitedDataStringToDocument Converts a delimited data string to a document.
    documentToDelimitedDataBytes Converts a document to delimited data bytes (byte array object).
    documentToDelimitedDataStream Converts a document to a delimited data stream.
    documentToDelimitedDataString Converts a document to a delimited data string.

    delimitedDataBytesToDocument

    Converts delimited data bytes (byte array) to a document.

    Input Parameters

    Output Parameters

    delimitedDataStreamToDocument

    Converts a delimited data stream to a document. The permissible size of the content stream is based on your tenancy.

    Input Parameters

    Output Parameters

    delimitedDataStringToDocument

    Converts a delimited data string to a document.

    Input Parameters

    Output Parameters

    documentToDelimitedDataBytes

    Converts a document to delimited data bytes (byte array object).

    Input Parameters

    Output Parameters

    documentToDelimitedDataStream

    Converts a document to a delimited data stream.

    Input Parameters

    Output Parameters

    documentToDelimitedDataString

    Converts a document to a delimited data string.

    Input Parameters

    Output Parameters

    Flow Services

    Use Flow services to perform utility-type tasks.

    The following Flow services are available:

    Service Description
    clearPipeline Removes all fields from the pipeline. You may optionally specify fields that should not be cleared by this service.
    countProcessedDocuments Counts the number of documents processed by a FlowService. Details about the processed documents can be viewed in the Execution Results screen.
    getHTTPRequest Gets information about the HTTP request received by webMethods.io Integration.
    getLastError Obtains detailed information about the last error that was trapped within a FlowService.
    getSessionInfo Obtains detailed information about the current logged-in user session. Also provides the current FlowService name and the execution result reference identifier.
    logCustomMessage Logs a message, which can be viewed in the Execution Results screen.
    setCustomContextID Associates a custom value with an auditing context. You can use the custom value to search for FlowService executions based on the custom ID in the webMethods.io Integration Monitor screen.
    setHTTPResponse Sets the HTTP response information to be returned by webMethods.io Integration.
    sleep Causes the currently executing FlowService to pause for the specified number of seconds.

    clearPipeline

    Removes all fields from the pipeline. You may optionally specify fields that should not be cleared by this service.

    Input Parameters

    Output Parameters

    None.

    countProcessedDocuments

    Counts the number of documents processed by a FlowService. Details about the processed documents can be viewed in the Execution Results screen.

    Input Parameters

    Output Parameters

    None.

    Usage Notes

    To increment the count by the number of documents in a list, use the sizeOfList service in the List service category.

    getHTTPRequest

    Retrieves information about the HTTP request received by webMethods.io Integration.

    Input Parameters

    None.

    Output Parameters

    getLastError

    Obtains detailed information about the last error that was trapped within a FlowService.

    Input Parameters

    None.

    Output Parameters

    Usage Notes

    You can use this service in the catch section of a try-catch block. Each execution of a FlowService or a service (whether the FlowService or the service succeeds or fails) updates the value returned by getLastError. Because getLastError is itself a service, its own execution resets the value of lastError. Therefore, if the results of getLastError will be used as input to subsequent FlowServices, map the value of lastError to a variable in the pipeline.

    If a map has multiple transformers, then a subsequent call to getLastError will return the error associated with the last failed transformer in the map, even if it is followed by successful transformers.

    getSessionInfo

    Obtains detailed information about the current logged-in user session. Also provides the current FlowService name and the execution result reference identifier.

    Input Parameters

    None.

    Output Parameters

    logCustomMessage

    Logs a message, which can be viewed in the Execution Results screen.

    Input Parameters

    Output Parameters

    None.

    setCustomContextID

    Associates a custom value with an auditing context. This custom value can be used to search for FlowService executions in the Monitor screen.

    Input Parameters

    Output Parameters

    None.

    Usage Notes

    setHTTPResponse

    Sets the HTTP response information to be returned by webMethods.io Integration.

    Input Parameters

    Output Parameters

    None.

    sleep

    Causes the currently executing FlowService to pause for the specified number of seconds.

    Input Parameters

    Output Parameters

    None.

    Hashtable Services

    Use Hashtable services to create, update, and obtain information about the hashtable.

    The following Hashtable services are available:

    Service Description
    containsKey Checks for the existence of a hashtable element.
    createHashtable Creates a hashtable object.
    get Gets the value for a specified key in the hashtable.
    listKeys Lists all the keys stored in the hashtable.
    put Adds a key/value pair in the hashtable.
    remove Removes a key/value pair from the hashtable.
    size Gets the number of elements in the hashtable.
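    These services correspond closely to the operations of java.util.Hashtable; assuming that correspondence holds, the table above maps onto plain Java like this:

```java
import java.util.Hashtable;

public class HashtableDemo {
    // Build a small table; put() corresponds to the put service.
    public static Hashtable<String, String> sample() {
        Hashtable<String, String> table = new Hashtable<>(); // createHashtable
        table.put("cx_timeout", "1000");                     // put
        return table;
    }

    public static void main(String[] args) {
        Hashtable<String, String> table = sample();
        System.out.println(table.containsKey("cx_timeout")); // containsKey -> true
        System.out.println(table.get("cx_timeout"));         // get -> 1000
        System.out.println(table.size());                    // size -> 1
        table.remove("cx_timeout");                          // remove
        System.out.println(table.keySet());                  // listKeys -> []
    }
}
```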

    containsKey

    Checks for the existence of a hashtable element.

    Input Parameters

    Output Parameters

    createHashtable

    Creates a hashtable object.

    Input Parameters

    None.

    Output Parameters

    get

    Gets the value for a specified key in the hashtable.

    Input Parameters

    Output Parameters

    listKeys

    Lists all the keys stored in the hashtable.

    Input Parameters

    Output Parameters

    put

    Adds a key/value pair in the hashtable.

    Input Parameters

    Output Parameters

    remove

    Removes a key/value pair from the hashtable.

    Input Parameters

    Output Parameters

    size

    Gets the number of elements in the hashtable.

    Input Parameters

    Output Parameters

    IO Services

    Use IO services to convert data between byte[ ], characters, and InputStream representations. These services are used for reading and writing bytes, characters, and streamed data to the file system and behave like the corresponding methods in the java.io.InputStream class. These services can be invoked only by other services. Streams cannot be passed between clients and the server, so these services will not run if they are invoked from a client.

    The following IO services are available:

    Service Description
    bytesToStream Converts a byte[ ] to java.io.ByteArrayInputStream.
    streamToBytes Creates a byte[ ] from data that is read from an InputStream.
    streamToString Creates a string from data that is read from an InputStream.
    stringToStream Converts a string to a binary stream.

    bytesToStream

    Converts a byte[ ] to java.io.ByteArrayInputStream.

    Input Parameters

    Output Parameters

    Usage Notes

    This service constructs the stream from the byte array using the constructor ByteArrayInputStream(byte[ ]). This constructor does not make a copy of the byte array, so any changes to bytes will be reflected in the data read from the stream.
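    The no-copy behavior of ByteArrayInputStream(byte[ ]) is easy to demonstrate in plain Java:

```java
import java.io.ByteArrayInputStream;

public class BytesToStreamDemo {
    // Wrap an array, then mutate it: the stream sees the mutation because
    // ByteArrayInputStream keeps a reference to the array, not a copy.
    public static int readAfterMutation() {
        byte[] bytes = {'a', 'b', 'c'};
        ByteArrayInputStream stream = new ByteArrayInputStream(bytes);
        bytes[0] = 'z'; // change the array after wrapping it
        return stream.read();
    }

    public static void main(String[] args) {
        System.out.println((char) readAfterMutation()); // z, not a
    }
}
```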

    streamToBytes

    Creates a byte[ ] from data that is read from an InputStream.

    Input Parameters

    Output Parameters

    Usage Notes

    This service reads all of the bytes from stream until the end of file is reached, and then it closes the InputStream.
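    The read-until-EOF-then-close behavior can be sketched in plain Java (an illustration of the pattern, not the service's code):

```java
import java.io.*;

public class StreamToBytesDemo {
    // Read every byte from the stream until EOF, then close the stream
    // (the try-with-resources block closes it even on error).
    public static byte[] streamToBytes(InputStream in) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[8192];
        try (InputStream stream = in) {
            int n;
            while ((n = stream.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] bytes = streamToBytes(new ByteArrayInputStream(new byte[] {1, 2, 3, 4, 5}));
        System.out.println(bytes.length); // 5
    }
}
```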

    streamToString

    Creates a string from data that is read from an InputStream.

    Input Parameters

    Output Parameters

    stringToStream

    Converts a string to a binary stream.

    Input Parameters

    Output Parameters

    JSON Services

    Use JSON services to convert JSON content into a document and to convert a document into JSON content.

    The following JSON services are available:

    Service Description
    documentToJSONBytes Converts a document to JSON bytes (byte array).
    documentToJSONStream Converts a document to a JSON stream.
    documentToJSONString Converts a document to a JSON string.
    jsonBytesToDocument Converts JSON content in bytes (byte array) to a document.
    jsonStreamToDocument Converts content from the JSON content stream to a document.
    jsonStringToDocument Converts content from the JSON string to a document.

    documentToJSONBytes

    Converts a document to JSON bytes (byte array).

    Input Parameters

    Output Parameters

    documentToJSONStream

    Converts a document to a JSON stream.

    Input Parameters

    Output Parameters

    documentToJSONString

    Converts a document to a JSON string.

    Input Parameters

    Output Parameters

    jsonBytesToDocument

    Converts JSON content in bytes (byte array) to a document.

    Input Parameters

    Output Parameters

    jsonStreamToDocument

    Converts content from the JSON content stream to a document. The permissible size of the content stream is based on your tenancy.

    Input Parameters

    Output Parameters

    jsonStringToDocument

    Converts content from the JSON content string to a document.

    Input Parameters

    Output Parameters

    List Services

    Use List services to retrieve, replace, or add elements in an Object List, Document List, or String List, including converting String Lists to Document Lists.

    The following List services are available:

    Service Description
    addItemToVector Adds an item or a list of items to a java.util.Vector object.
    appendToDocumentList Adds documents to a document list.
    appendToStringList Adds Strings to a String list.
    sizeOfList Returns the number of elements in a list.
    stringListToDocumentList Converts a String list to a document list.
    vectorToArray Converts a java.util.Vector object to an array.

    addItemToVector

    Adds an item or a list of items to a java.util.Vector object.

    Input Parameters

    Output Parameters

    Usage Notes

    Although the item and itemList input parameters are each optional, you must specify at least one of them.

    appendToDocumentList

    Adds documents to a document list.

    Input Parameters

    Output Parameters

    Usage Notes

    The documents contained in fromList and fromItem are not actually appended as entries to toList. Instead, references to the documents in fromList and fromItem are appended as entries to toList. Consequently, any changes made to the documents in fromList and fromItem also affect the resulting toList.
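    The reference semantics described above mirror what happens when Java collections of mutable objects are appended; a short demonstration:

```java
import java.util.*;

public class AppendByReference {
    // Append a document to another list, then mutate the original:
    // the appended entry reflects the change because only a reference
    // was appended, not a copy.
    public static Object statusAfterMutation() {
        Map<String, Object> doc = new HashMap<>();
        doc.put("status", "new");

        List<Map<String, Object>> fromList = new ArrayList<>();
        fromList.add(doc);

        List<Map<String, Object>> toList = new ArrayList<>();
        toList.addAll(fromList); // appends references, not copies

        doc.put("status", "changed"); // mutate the original document
        return toList.get(0).get("status");
    }

    public static void main(String[] args) {
        System.out.println(statusAfterMutation()); // changed
    }
}
```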

    appendToStringList

    Adds Strings to a String list.

    Input Parameters

    Output Parameters

    Usage Notes

    The Strings contained in fromList and fromItem are not actually appended as entries to toList. Instead, references to the Strings in fromList and fromItem are appended as entries to toList. Consequently, any changes made to the Strings in fromList and fromItem also affect the resulting toList.

    sizeOfList

    Returns the number of elements in a list.

    Input Parameters

    Output Parameters

    Usage Notes

    For example, if fromList consists of:

    The result would be:

    stringListToDocumentList

    Converts a String list to a document list.

    Input Parameters

    Output Parameters

    Usage Notes

    Creates a document list containing one document for each element in the fromList. Each document will contain a single String element named key.

    vectorToArray

    Converts a java.util.Vector object to an array.

    Input Parameters

    Output Parameters

    Math Services

    Use Math services to perform mathematical operations on string-based numeric values. Services that operate on integer values use Java’s long data type (64-bit, two’s complement). Services that operate on float values use Java’s double data type (64-bit IEEE 754). If extremely precise calculations are critical to your application, you should write your own Java services to perform math functions.

    The following Math services are available:

    Service Description
    absoluteValue Returns the absolute value of the input number.
    addFloatList Adds a list of floating point numbers (represented in a string list) and returns the sum.
    addFloats Adds one floating point number (represented as a String) to another and returns the sum.
    addIntList Adds a list of integers (represented in a String list) and returns the sum.
    addInts Adds one integer (represented as a String) to another and returns the sum.
    addObjects Adds one java.lang.Number object to another and returns the sum.
    divideFloats Divides one floating point number (represented as a String) by another (num1/num2) and returns the quotient.
    divideInts Divides one integer (represented as a String) by another (num1/num2) and returns the quotient.
    divideObjects Divides one java.lang.Number object by another (num1/num2) and returns the quotient.
    max Returns the largest number from a list of numbers.
    min Returns the smallest number from a list of numbers.
    multiplyFloatList Multiplies a list of floating point numbers (represented in a String list) and returns the product.
    multiplyFloats Multiplies one floating point number (represented as a String) by another and returns the product.
    multiplyIntList Multiplies a list of integers (represented in a String list) and returns the product.
    multiplyInts Multiplies one integer (represented as a String) by another and returns the product.
    multiplyObjects Multiplies one java.lang.Number object by another and returns the product.
    randomDouble Returns the next pseudorandom, uniformly distributed double between 0.0 and 1.0.
    roundNumber Returns a rounded number.
    subtractFloats Subtracts one floating point number (represented as a String) from another and returns the difference.
    subtractInts Subtracts one integer (represented as a String) from another and returns the difference.
    subtractObjects Subtracts one java.lang.Number object from another and returns the difference.
    toNumber Converts a string to numeric data type.

    absoluteValue

    Returns the absolute value of the input number.

    Input Parameters

    Output Parameters

    addFloatList

    Adds a list of floating point numbers (represented in a string list) and returns the sum.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the strings that are passed to the service in numList are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.
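    The 357 in the example comes from the comma being read as a grouping separator rather than a decimal point, so "1,23" is parsed as 123 and "2,34" as 234. A plain-Java illustration of the difference (not the service's code):

```java
import java.text.NumberFormat;
import java.text.ParseException;
import java.util.Locale;

public class LocaleNeutralParsing {
    // Sum two strings the way a parser that treats ',' as a grouping
    // separator would: "1,23" becomes 123 and "2,34" becomes 234.
    public static double sumWithGroupingParser(String a, String b) {
        try {
            NumberFormat nf = NumberFormat.getInstance(Locale.US);
            return nf.parse(a).doubleValue() + nf.parse(b).doubleValue();
        } catch (ParseException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // Locale-neutral input: '.' is the decimal separator -> about 3.57.
        System.out.println(Double.parseDouble("1.23") + Double.parseDouble("2.34"));
        // Locally formatted German input misread as grouped integers -> 357.0.
        System.out.println(sumWithGroupingParser("1,23", "2,34")); // 357.0
    }
}
```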

    addFloats

    Adds one floating point number (represented as a String) to another and returns the sum.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the strings that are passed to the service in num1 and num2 are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    addIntList

    Adds a list of integers (represented in a String list) and returns the sum.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the strings that are passed to the service in numList are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    addInts

    Adds one integer (represented as a String) to another and returns the sum.

    Input Parameters

    Output Parameters

    Usage Notes

    Ensure that the result of your calculation is less than 64 bits in width (the maximum width for the long data type). If the result exceeds this limit, it will generate a data overflow.

    Ensure that the strings that are passed to the service in num1 and num2 are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.
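    The 64-bit limit mentioned above is the range of Java's long type; exceeding it wraps silently unless you check for overflow explicitly, as this plain-Java demonstration shows:

```java
public class LongOverflow {
    public static void main(String[] args) {
        long max = Long.MAX_VALUE; // 2^63 - 1, the widest long value
        long wrapped = max + 1;    // silently overflows (wraps around)
        System.out.println(wrapped == Long.MIN_VALUE); // true

        try {
            Math.addExact(max, 1L); // throws instead of wrapping
        } catch (ArithmeticException e) {
            System.out.println("long overflow");
        }
    }
}
```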

    addObjects

    Adds one java.lang.Number object to another and returns the sum.

    Input Parameters

    Output Parameters

    Usage Notes

    This service accepts the following sub-classes of java.lang.Number: java.lang.Byte, java.lang.Double, java.lang.Float, java.lang.Integer, java.lang.Long, java.lang.Short.

    This service applies the following rules for binary numeric promotion to the operands in order:

    These promotion rules mirror the Java rules for numeric promotion of numeric types.
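    The rule list did not survive in this copy, but since the text says the rules mirror Java's binary numeric promotion (JLS 5.6.2), a sketch of that behavior gives the idea; treat it as an approximation of the service's rules, not their definition:

```java
public class NumericPromotion {
    // Mirrors Java's binary numeric promotion: widen to the wider of the
    // two operand types (double > float > long > int) before operating.
    public static Number add(Number a, Number b) {
        if (a instanceof Double || b instanceof Double) {
            return a.doubleValue() + b.doubleValue();
        } else if (a instanceof Float || b instanceof Float) {
            return a.floatValue() + b.floatValue();
        } else if (a instanceof Long || b instanceof Long) {
            return a.longValue() + b.longValue();
        } else {
            return a.intValue() + b.intValue(); // Byte, Short, Integer -> int
        }
    }

    public static void main(String[] args) {
        System.out.println(add(1, 2L));  // 3 (promoted to Long)
        System.out.println(add(1, 2.5)); // 3.5 (promoted to Double)
    }
}
```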

    divideFloats

    Divides one floating point number (represented as a String) by another (num1/num2) and returns the quotient.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the strings that are passed to the service in num1 and num2 are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    divideInts

    Divides one integer (represented as a String) by another (num1/num2) and returns the quotient.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the strings that are passed to the service in num1 and num2 are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    divideObjects

    Divides one java.lang.Number object by another (num1/num2) and returns the quotient.

    Input Parameters

    Output Parameters

    Usage Notes

    This service accepts the following sub-classes of java.lang.Number: java.lang.Byte, java.lang.Double, java.lang.Float, java.lang.Integer, java.lang.Long, java.lang.Short.

    This service applies the following rules for binary numeric promotion to the operands in order:

    These promotion rules mirror the Java rules for numeric promotion of numeric types.

    max

    Returns the largest number from a list of numbers.

    Input Parameters

    Output Parameters

    min

    Returns the smallest number from a list of numbers.

    Input Parameters

    Output Parameters

    multiplyFloatList

    Multiplies a list of floating point numbers (represented in a String list) and returns the product.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the strings that are passed to the service in numList are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    multiplyFloats

    Multiplies one floating point number (represented as a String) by another and returns the product.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the strings that are passed to the service in num1 and num2 are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    multiplyIntList

    Multiplies a list of integers (represented in a String list) and returns the product.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the result of your calculation is less than 64 bits in width (the maximum width for the long data type). If the result exceeds this limit, it will generate a data overflow.

    Make sure the strings that are passed to the service in numList are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    multiplyInts

    Multiplies one integer (represented as a String) by another and returns the product.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the result of your calculation is less than 64 bits in width (the maximum width for the long data type). If the result exceeds this limit, it will generate a data overflow.

    Make sure the strings that are passed to the service in num1 and num2 are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    multiplyObjects

    Multiplies one java.lang.Number object by another and returns the product.

    Input Parameters

    Output Parameters

    Usage Notes

    This service accepts the following sub-classes of java.lang.Number: java.lang.Byte, java.lang.Double, java.lang.Float, java.lang.Integer, java.lang.Long, java.lang.Short.

    This service applies the following rules for binary numeric promotion to the operands in order:

    These promotion rules mirror the Java rules for numeric promotion of numeric types.

    randomDouble

    Returns the next pseudorandom, uniformly distributed double between 0.0 and 1.0.

    Random number generators of this kind are called pseudorandom because the values are produced by a deterministic algorithm, so the sequence of numbers eventually repeats over time.

    Input Parameters

    Output Parameters

    roundNumber

    Returns a rounded number.

    Input Parameters

    Output Parameters

    subtractFloats

    Subtracts one floating point number (represented as a String) from another and returns the difference.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the strings that are passed to the service in num1 and num2 are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    subtractInts

    Subtracts one integer (represented as a String) from another and returns the difference.

    Input Parameters

    Output Parameters

    Usage Notes

    Make sure the result of your calculation is less than 64 bits in width (the maximum width for the long data type). If the result exceeds this limit, it will generate a data overflow.

    Make sure the strings that are passed to the service in num1 and num2 are in a locale-neutral format (that is, using the pattern -####.##). Passing locally formatted strings may result in unexpected results. For example, calling addFloats in a German locale with the arguments 1,23 and 2,34 will result in the value 357, not 3.57 or 3,57.

    subtractObjects

    Subtracts one java.lang.Number object from another and returns the difference.

    Input Parameters

    Output Parameters

    Usage Notes

    This service accepts the following sub-classes of java.lang.Number: java.lang.Byte, java.lang.Double, java.lang.Float, java.lang.Integer, java.lang.Long, java.lang.Short.

    This service applies the following rules for binary numeric promotion to the operands, in order:

    These promotion rules mirror the Java rules for numeric promotion of numeric types.

    toNumber

    Converts a string to numeric data type.

    Input Parameters

    Output Parameters

    MIME Services

    Use MIME services to create MIME messages and extract information from MIME messages.

    The following MIME services are available:

    Service Function
    addBodyPart Adds a body part (header fields and content) to a specified MIME object.
    addMimeHeader Adds one or more header fields to a specified MIME object.
    createMimeData Creates a MIME object.
    getBodyPartContent Retrieves the content (payload) from the specified MIME object.
    getBodyPartHeader Returns the list of header fields for the specified body part.
    getContentType Returns the value of the Content-Type message header from the specified MIME object.
    getEnvelopeStream Generates an InputStream representation of a MIME message from a specified MIME object.
    getMimeHeader Returns the list of message headers from a specified MIME object.
    getNumParts Returns the number of body parts in the specified MIME object.
    getPrimaryContentType Returns the top-level portion of a MIME object’s Content-Type value.
    getSubContentType Returns the sub-type portion of a MIME object’s Content-Type value.
    mergeHeaderAndBody Concatenates the contents of the header and body mapped to the input.

    addBodyPart

    Adds a body part (header fields and content) to a specified MIME object.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    The way in which the contenttype and encoding parameters are applied depends on whether the finished message is single-part or multipart.

    For single-part messages:

    For multipart messages:

    addMimeHeader

    Adds one or more header fields to a specified MIME object.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    If you add MIME headers before you add multiple body parts, the header fields will be added to each of the body parts. If you do not want this behavior, either drop mimeHeader from the pipeline immediately after you execute addMimeHeader, or invoke addMimeHeader after you’ve added all body parts to the MIME object.

    Be aware that the contenttype and encoding parameters used by the addBodyPart service will override any Content-Type or Content-Transfer-Encoding settings in mimeData. Moreover, in certain cases, the getEnvelopeStream service will override these settings when it generates a multipart message. For information about how the Content-Type or Content-Transfer-Encoding headers are derived at run time, see the Usage Notes under addBodyPart.

    createMimeData

    Creates a MIME object.

    If no input parameter is passed to this service, the service creates an empty MIME object. Otherwise, the service creates a MIME object containing the elements (header fields and content) from the MIME message in input.

    Input Parameters

    Output Parameters

    Usage Notes

    All of the other MIME services operate on the mimeData IData object produced by this service. They do not operate directly on MIME message streams.

    Important: You can examine the contents of mimeData during testing and debugging. However, because the internal structure of mimeData is subject to change without notice, do not explicitly set or map data to/from these elements in your service. To manipulate or access the contents of mimeData, use only the MIME services that are provided.

    getBodyPartContent

    Retrieves the content (payload) from the specified MIME object.

    You use this service for both single-part and multi-part messages.

    To retrieve content from a multi-part message, you set the index (to select the part by index number) or contentID (to select the part by contentID value) parameter to specify the body part whose content you want to retrieve. To get the content from a single-part message, you omit the index and contentID parameters or set index to 0.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    If you omit index or contentID when retrieving content from a multi-part message, getBodyPartContent returns the payload from the first body part. If you use index or contentID to select a body part that does not exist in mimeData, content will be null.
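    Selecting a body part by index or Content-ID can be sketched in Python; get_body_part_content below is a hypothetical analogue of the service, built on the standard email package:

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

msg = MIMEMultipart()
part = MIMEText("hello")
part.add_header("Content-ID", "<greeting>")
msg.attach(part)
msg.attach(MIMEText("world"))

def get_body_part_content(message, index=None, content_id=None):
    """Return a part's payload by index or Content-ID; None if absent."""
    parts = message.get_payload()
    if content_id is not None:
        for p in parts:
            if p.get("Content-ID") == content_id:
                return p.get_payload()
        return None                  # selected part does not exist -> null
    return parts[index or 0].get_payload()
```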

    getBodyPartHeader

    Returns the list of header fields for the specified body part.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    If you omit index or contentID, getBodyPartHeader returns the message headers from the first body part. If you use index or contentID to select a body part that does not exist in mimeData, the output will be null.

    getContentType

    Returns the value of the Content-Type message header from the specified MIME object.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    getEnvelopeStream

    Generates an InputStream representation of a MIME message from a specified MIME object.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    If you omit index or contentID, getEnvelopeStream generates the MIME message from the entire contents of mimeData. If you use index or contentID to select a body part that does not exist in mimeData, the output will be null.

    getEnvelopeStream automatically inserts the MIME-Version and Message-ID message headers into the MIME message it puts into envStream.

    getMimeHeader

    Returns the list of message headers from a specified MIME object.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    getNumParts

    Returns the number of body parts in the specified MIME object.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    getPrimaryContentType

    Returns the top-level portion of a MIME object’s Content-Type value.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    getSubContentType

    Returns the sub-type portion of a MIME object’s Content-Type value.

    Input Parameters

    Output Parameters

    Usage Notes

    This service operates on the MIME object (mimeData) produced by createMimeData.

    mergeHeaderAndBody

    Concatenates the contents of the header and body mapped to the input.

    You can use this service to reassemble the message into its original form so that it can be used as input to the createMimeData service (or any other service that requires the entire http response as an InputStream).

    Input Parameters

    Output Parameters

    Usage Notes

    Use this service to merge the header and body to reconstruct the original MIME message.
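    The merge itself is a plain concatenation of the header block, an empty line, and the body. A minimal Python sketch (merge_header_and_body is a hypothetical analogue, and the exact line-ending handling of the real service is an assumption):

```python
from io import BytesIO

def merge_header_and_body(header: bytes, body: bytes) -> BytesIO:
    # A MIME message is the header block, a blank line, then the body.
    return BytesIO(header.rstrip(b"\r\n") + b"\r\n\r\n" + body)
```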

    Storage Services

    Use Storage services to insert, retrieve, update, and remove entries from a data store.

    When using the storage services, keep in mind that the short-term store is not intended to be used as a general-purpose storage engine. Rather, it is primarily provided to support shared storage of application resources and transient data in webMethods.io Integration. It is recommended that you do not use the short-term store to process high volumes of data, to store large records, or to permanently archive records.

    Notes:

    • User-specific data, which may be considered personal data, is stored and retained until the end of the retention period defined in Execution Results.
    • These services are a tool for maintaining state information in the short-term store. It is up to the developer of the FlowService to make sure that the FlowService keeps track of its state and correctly handles restarts.

    Locking Considerations

    The following sections describe in general how the storage services handle locking requests.

    Entry Locking

    To maintain data integrity, the short-term store uses locking to ensure that multiple threads do not modify the same entry at the same time. For insertions and removals, the short-term store sets and releases the lock. For updates, the client must set and release the lock. Using locking improperly, that is, creating a lock but not releasing it, can cause deadlocks in the short-term store.
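    The discipline that get locks an entry and put or unlock releases it can be sketched with a toy Python class. ShortTermStoreSketch is purely illustrative; the real short-term store's locking is server-side and is not exposed through an API like this:

```python
import threading

class ShortTermStoreSketch:
    """Toy analogue: get() locks the entry; put()/unlock() release it."""

    def __init__(self):
        self._data = {}
        self._locks = {}
        self._guard = threading.Lock()

    def _entry_lock(self, key):
        with self._guard:
            return self._locks.setdefault(key, threading.Lock())

    def get(self, key):
        self._entry_lock(key).acquire()   # caller now holds the entry lock
        return self._data.get(key)

    def put(self, key, value):
        lock = self._entry_lock(key)
        lock.acquire(blocking=False)      # lock for the duration if not held
        self._data[key] = value
        lock.release()                    # put always releases the lock

    def unlock(self, key):
        self._entry_lock(key).release()
```

    Forgetting the final put or unlock after a get is exactly the "creating a lock but not releasing it" mistake that leads to deadlocks.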

    The following sections provide guidelines that can help you avoid short-term store deadlocks.

    Data Store Locking

    When a storage service locks an entry, the service also implicitly locks the data store in which the entry resides. This behavior prevents another thread from deleting the entire data store and the entries it contains while your thread is working with the entry. When the locked entry is unlocked, the implicit lock on the data store is also released.

    Be careful when explicitly unlocking data stores. Consider the following example:

    1. User_A locks an item. This creates two locks: an explicit lock on the entry, and an implicit lock on the data store.
    2. User_A later unlocks the data store explicitly while still holding the lock on the entry.
    3. User_B locks, then deletes the data store, including the entry locked by User_A in the first step.

    When User_A explicitly unlocked the data store in step 2, User_B was able to delete the entry that User_A was working with.

    Automatic Promotion to Exclusive Lock

    If a storage service tries to acquire an exclusive lock on an object, but finds a shared lock from the same thread already in place on the object, the service will try to promote the lock to an exclusive lock.

    If a storage service that requires an exclusive lock encounters a shared or exclusive lock held by another thread, it will wait until the object becomes available. If the object remains locked for the period specified by the waitlength parameter passed by the service, the service will fail.
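    The waitlength behavior resembles a timed lock acquisition, sketched here with Python's threading module (acquire_exclusive is a hypothetical helper, not a webMethods API):

```python
import threading

def acquire_exclusive(lock: threading.Lock, waitlength: float) -> None:
    # Fail if the object stays locked past the waitlength interval,
    # as the storage services do.
    if not lock.acquire(timeout=waitlength):
        raise TimeoutError("object remained locked past waitlength")
```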

    Sample FlowService for Checkpoint Restart

    The following diagram shows how to build checkpoint restarts into your FlowServices. It illustrates the logic of a FlowService and shows where the various storage services are used to achieve checkpoint restarts.

    The following Storage services are available:

    Service Description
    add Inserts a new entry into a data store.
    deleteStore Deletes a data store and all its contents. Any data in the data store is deleted. If the data store does not exist, the service takes no action.
    get Retrieves a value from a data store and locks the entry and the data store on behalf of the thread that invoked the service.
    keys Obtains a list of all the keys in a data store.
    lock Locks an entry and/or data store on behalf of the thread invoking this service.
    put Inserts or updates an entry in a data store. If the key does not exist in the data store, the entry is inserted.
    remove Removes an entry from a data store.
    unlock Unlocks an entry or a data store.

    add

    Inserts a new entry into a data store.

    If the key already exists in the data store, the service does nothing.

    Input Parameters

    Output Parameters

    deleteStore

    Deletes a data store and all its contents. Any data in the data store is deleted. If the data store does not exist, the service takes no action.

    Input Parameters

    Output Parameters

    Usage Notes

    This service obtains an exclusive lock on the data store, but no locks on the individual entries in the data store. If this service finds a shared lock from the same thread on the data store, the service will automatically promote the lock to an exclusive lock. The exclusive lock prevents other threads from acquiring locks on the data store or entries within the data store during the delete operation.

    get

    Retrieves a value from a data store and locks the entry and the data store on behalf of the thread that invoked the service.

    Important: This service does not automatically release the lock on the data store or entry after performing the get operation, so you need to ensure that the lock is released by calling the put or unlock services. If you do not release the lock, webMethods.io Integration will release the lock at the end of the FlowService execution.

    Input Parameters

    Output Parameters

    Usage Notes

    keys

    Obtains a list of all the keys in a data store.

    Input Parameters

    Output Parameters

    lock

    Locks an entry and/or data store on behalf of the thread invoking this service.

    Important: When you lock an entry or data store using this service, you must release the lock by using a put or an explicit unlock. If you do not release the lock, webMethods.io Integration will release the lock at the end of the FlowService execution. Further, be careful when releasing locks with the unlock service. If you release a lock on a data store, another thread can obtain a lock on the data store and delete it, and the entries it contains, even if your thread still has locks on one or more of the entries.

    Input Parameters

    Output Parameters

    None.

    Usage Notes

    put

    Inserts or updates an entry in a data store. If the key does not exist in the data store, the entry is inserted.

    If the requested entry is not currently locked by the thread that invoked this service, the put service will automatically attempt to lock the entry for the duration of the put operation.

    The service obtains an exclusive lock on the entry and a shared lock on the data store. If the service finds a shared lock from the same thread on the entry, the service will automatically promote the shared lock to an exclusive lock.

    This service releases the lock when the put operation has completed.

    Input Parameters

    Output Parameters

    Usage Notes

    When storing and retrieving the flow state in the short-term store for checkpoint restart purposes, ensure that the value of key is unique to the transaction.
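    The difference between add (insert only if absent) and put (insert or update) mirrors two dictionary idioms in Python; this is a behavioral sketch only, ignoring locking:

```python
store = {}

def add(key, value):
    # add: insert a new entry; if the key already exists, do nothing.
    store.setdefault(key, value)

def put(key, value):
    # put: insert the entry, or update it if the key already exists.
    store[key] = value
```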

    remove

    Removes an entry from a data store. This service obtains an exclusive lock on the entry and a shared lock on the data store.

    Input Parameters

    Output Parameters

    unlock

    Unlocks an entry or a data store.

    When a FlowService retrieves an entry using the get service, the entry is locked to prevent modification by other users before the FlowService completes. The entry remains locked until the lock owner invokes the put service. To unlock an entry without using the put service, use the unlock service.

    In addition, if a FlowService uses the lock service to lock an entry or data store, you must use the unlock or put service to release the lock.

    Important: Be careful when releasing locks with this service. If you release a lock on a data store, another thread can obtain a lock on the data store and delete it, and the entries it contains, even if the original thread still has locks on one or more of the entries.

    Input Parameters

    Output Parameters

    None.

    String Services

    Use String services to perform string manipulation and substitution operations.

    The following String services are available:

    Service Description
    base64Decode Decodes a Base-64 encoded string into a sequence of bytes.
    base64Encode Converts a sequence of bytes into a Base64-encoded String.
    bytesToString Converts a sequence of bytes to a String.
    compareStrings Performs a case-sensitive comparison of two strings, and indicates whether the strings are identical.
    concat Concatenates two strings.
    fuzzyMatch Performs an approximate (fuzzy) match of a given string against a set of strings and returns the matchedValue whose similarity exceeds similarityThreshold. If more than one string matches approximately, the first match is returned.
    HTMLDecode Replaces HTML character entities with native characters.
    HTMLEncode Replaces HTML-sensitive characters with equivalent HTML character entities.
    indexOf Returns the index of the first occurrence of a sequence of characters in a string.
    isAlphanumeric Determines whether a string consists entirely of alphanumeric characters (in the ranges A–Z, a–z, or 0–9).
    isDate Determines whether a string follows a specified date pattern.
    isNullOrBlank Checks a string for a null or a blank value.
    isNumber Determines whether the contents of a string can be converted to a float value.
    length Returns the length of a string.
    lookupDictionary Looks up a given key in a hash table and returns the string to which that key is mapped.
    makeString Builds a single string by concatenating the elements of a String List.
    messageFormat Formats an array of strings into a given message pattern.
    numericFormat Formats a number into a given numeric pattern.
    objectToString Converts an object to string representation using the Java toString() method of the object.
    padLeft Pads a string to a specified length by adding pad characters to the beginning of the string.
    padRight Pads a string to a specified length by adding pad characters to the end of the string.
    replace Replaces all occurrences of a specified substring with a substitute string.
    stringToBytes Converts a string to a byte array.
    substitutePipelineVariables Replaces a pipeline variable with its corresponding value.
    substring Returns a substring of a given string.
    tokenize Tokenizes a string using specified delimiter characters and generates a String List from the resulting tokens.
    toLower Converts all characters in a given string to lowercase.
    toUpper Converts all characters in a given string to uppercase.
    trim Trims leading and trailing white space from a given string.
    URLDecode Decodes a URL-encoded string.
    URLEncode URL-encodes a string.

    base64Decode

    Decodes a Base-64 encoded string into a sequence of bytes.

    Input Parameters

    Output Parameters

    base64Encode

    Converts a sequence of bytes into a Base64-encoded String.

    Input Parameters

    Output Parameters

    Usage Notes

    By default, the base64Encode service inserts line breaks after 76 characters of data, which is not the canonical lexical form expected by implementations such as MTOM. You can use the useNewLine parameter to remove the line breaks.
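    Python's base64 module shows the same contrast between line-wrapped and single-line output (this illustrates the general Base64 behavior, not the service's exact parameters):

```python
import base64

data = b"A" * 80
one_line = base64.b64encode(data)      # canonical form, no line breaks
wrapped = base64.encodebytes(data)     # inserts a break after 76 characters
```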

    bytesToString

    Converts a sequence of bytes to a String.

    Input Parameters

    Output Parameters

    compareStrings

    Performs a case-sensitive comparison of two strings and indicates whether the strings are identical.

    Input Parameters

    Output Parameters

    concat

    Concatenates two strings.

    Input Parameters

    Output Parameters

    fuzzyMatch

    Performs an approximate (fuzzy) match of a given string against a set of strings. If the similarity of a candidate string exceeds similarityThreshold, the service returns it as matchedValue. If more than one candidate matches approximately, the first match is returned.

    Input Parameters

    Output Parameters

    Usage Notes

    For more information about the Levenshtein and Jaro-Winkler algorithms, see publicly available references.
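    The behavior can be approximated in Python with difflib, which uses a Ratcliff-Obershelp similarity rather than Levenshtein or Jaro-Winkler; fuzzy_match below is a hypothetical analogue of the service:

```python
from difflib import SequenceMatcher

def fuzzy_match(value, candidates, similarity_threshold=0.8):
    # Return the first candidate whose similarity to value meets the
    # threshold, or None when nothing is close enough.
    for candidate in candidates:
        if SequenceMatcher(None, value, candidate).ratio() >= similarity_threshold:
            return candidate
    return None
```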

    HTMLDecode

    Replaces HTML character entities with native characters.

    Specifically, this service makes the following replacements:

    Replaces this HTML character entity… With…
    &gt; >
    &lt; <
    &amp; &
    &quot; "

    Input Parameters

    Output Parameters

    HTMLEncode

    Replaces HTML-sensitive characters with equivalent HTML character entities.

    Specifically, this service makes the following replacements:

    Replaces this native language character… With…
    > &gt;
    < &lt;
    & &amp;
    " &quot;
    ' &#39;

    These translations are useful when displaying text in an HTML context.

    Input Parameters

    Output Parameters
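    Python's html module performs the equivalent translations (shown for illustration; the service's exact entity set may differ):

```python
import html

encoded = html.escape('5 < 6 & "ok"')   # HTMLEncode analogue
decoded = html.unescape(encoded)        # HTMLDecode analogue
```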

    indexOf

    Returns the index of the first occurrence of a sequence of characters in a string.

    Input Parameters

    Output Parameters

    isAlphanumeric

    Determines whether a string consists entirely of alphanumeric characters (in the ranges A–Z, a–z, or 0–9).

    Input Parameters

    Output Parameters

    The service returns false if inString is not specified.

    isDate

    Determines whether a string follows a specified date pattern.

    Input Parameters

    Output Parameters

    The service returns false if inString is not specified.

    Usage Notes

    The service returns an error if both inString and pattern are not specified.

    You can specify the same arbitrary string (for example, 111212) as both inString and pattern, and the service returns true. This is because the java.text.SimpleDateFormat class treats characters that are not pattern letters as literals, so an input string that is identical to the pattern always parses to a valid date.
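    The same try-to-parse check can be sketched in Python with datetime.strptime (strptime directives, not SimpleDateFormat letters; is_date is a hypothetical analogue). The quirk above also reproduces here, because characters that are not format directives are treated as literals:

```python
from datetime import datetime

def is_date(in_string: str, pattern: str) -> bool:
    # True when in_string parses successfully under pattern.
    try:
        datetime.strptime(in_string, pattern)
        return True
    except ValueError:
        return False
```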

    isNullOrBlank

    Checks a string for a null or a blank value.

    Input Parameters

    Output Parameters

    isNumber

    Determines whether the contents of a string can be converted to a float value.

    Input Parameters

    Output Parameters

    The service returns false if inString is not specified.

    length

    Returns the length of a string.

    Input Parameters

    Output Parameters

    lookupDictionary

    Looks up a given key in a hash table and returns the string to which that key is mapped.

    Input Parameters

    Output Parameters

    makeString

    Builds a single string by concatenating the elements of a String List.

    Input Parameters

    Output Parameters

    messageFormat

    Formats an array of strings into a given message pattern.

    Input Parameters

    Output Parameters

    numericFormat

    Formats a number into a given numeric pattern.

    Input Parameters

    Output Parameters

    objectToString

    Converts an object to string representation using the Java toString() method of the object.

    Input Parameters

    Output Parameters

    padLeft

    Pads a string to a specified length by adding pad characters to the beginning of the string.

    Input Parameters

    Output Parameters

    Usage Notes

    If padString is longer than one character and does not fit exactly into the resulting string, the beginning of padString is aligned with the beginning of the resulting string. For example, suppose inString equals shipped and padString equals x9y.

    If length equals… Then value will contain…
    7 shipped
    10 x9yshipped
    12 x9x9yshipped

    If inString is longer than length characters, only the last length characters from inString are returned. For example, if inString equals acct1234 and length equals 4, value will contain 1234.
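    The table above can be reproduced with a short Python sketch (pad_left is a hypothetical analogue: a partial copy of padString goes at the front, followed by full copies):

```python
def pad_left(in_string: str, pad_string: str, length: int) -> str:
    if len(in_string) >= length:
        return in_string[-length:]       # keep only the last `length` chars
    full, rem = divmod(length - len(in_string), len(pad_string))
    return pad_string[:rem] + pad_string * full + in_string
```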

    padRight

    Pads a string to a specified length by adding pad characters to the end of the string.

    Input Parameters

    Output Parameters

    Usage Notes

    If padString is longer than one character and does not fit exactly into the resulting string, the end of padString is aligned with the end of the resulting string. For example, suppose inString equals shipped and padString equals x9y.

    If length equals… Then value will contain…
    7 shipped
    10 shippedx9y
    12 shippedx9y9y

    If inString is longer than length characters, only the first length characters from inString are returned. For example, if inString equals 1234acct and length equals 4, value will contain 1234.
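    The mirror-image sketch for padRight (pad_right is a hypothetical analogue: full copies of padString first, then a trailing partial copy taken from its end):

```python
def pad_right(in_string: str, pad_string: str, length: int) -> str:
    if len(in_string) >= length:
        return in_string[:length]        # keep only the first `length` chars
    full, rem = divmod(length - len(in_string), len(pad_string))
    tail = pad_string[-rem:] if rem else ""
    return in_string + pad_string * full + tail
```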

    replace

    Replaces all occurrences of a specified substring with a substitute string.

    Input Parameters

    Output Parameters

    stringToBytes

    Converts a string to a byte array.

    Input Parameters

    Output Parameters

    substitutePipelineVariables

    Replaces a pipeline variable with its corresponding value.

    Input Parameters

    Output Parameters

    Usage Notes

    The service returns an error if inString is not specified.

    If inString does not contain any variable between the % symbols, or contains a value other than the pipeline variable between the % symbols, the service does not perform any variable substitution from the pipeline.

    If you want to include the % symbol in the output, you can specify it as \% in inString. To specify the value of the pipeline variable as a percentage in the output, append \% after the variable name in inString. For example, suppose a pipeline variable revenueIncreasePercent has a value of 100.

    If inString equals… Then value will contain…
    %revenueIncreasePercent%\% 100%

    The service cannot be used for substitution of global variables.
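    The substitution rules (replace %name% with the pipeline value, leave unknown names alone, and treat \% as a literal percent sign) can be sketched with a regular expression; substitute_pipeline_variables is a hypothetical analogue:

```python
import re

def substitute_pipeline_variables(in_string: str, pipeline: dict) -> str:
    # Protect escaped percent signs so they survive as literal '%'.
    placeholder = "\x00"
    s = in_string.replace(r"\%", placeholder)

    def repl(match):
        name = match.group(1)
        # Substitute known pipeline variables; leave others untouched.
        return str(pipeline[name]) if name in pipeline else match.group(0)

    s = re.sub(r"%([^%]+)%", repl, s)
    return s.replace(placeholder, "%")
```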

    substring

    Returns a substring of a given string.

    Input Parameters

    Output Parameters

    tokenize

    Tokenizes a string using specified delimiter characters and generates a String List from the resulting tokens.

    This service does not return delimiters as tokens.

    Input Parameters

    Output Parameters
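    A sketch of delimiter-based tokenizing in Python (tokenize below is a hypothetical analogue; like the service, it does not return delimiters, and it drops empty tokens):

```python
import re

def tokenize(in_string: str, delimiters: str = ",;") -> list:
    # Split on any single delimiter character and drop empty tokens.
    return [t for t in re.split(f"[{re.escape(delimiters)}]", in_string) if t]
```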

    toLower

    Converts all characters in a given string to lowercase.

    Input Parameters

    Output Parameters

    toUpper

    Converts all characters in a given string to uppercase.

    Input Parameters

    Output Parameters

    trim

    Trims leading and trailing white space from a given string.

    Input Parameters

    Output Parameters

    URLDecode

    Decodes a URL-encoded string.

    Input Parameters

    Output Parameters

    URLEncode

    URL-encodes a string.

    Encodes characters the same way that data posted from a WWW form is encoded, that is, the application/x-www-form-urlencoded MIME type.

    Input Parameters

    Output Parameters

    Transaction Services

    Use Transaction services only in conjunction with Database Connector operations. These services are applicable when the Database Connector account is of type Transactional.

    The following Transaction services are available:

    Service Description
    commit Commits an explicit transaction.
    rollback Rolls back an explicit transaction.
    setTimeout Manually sets a transaction timeout interval for implicit and explicit transactions.
    start Starts an explicit transaction.

    commit

    Commits an explicit transaction.

    Input Parameters

    Output Parameters

    None.

    Usage Notes

    This service must be used in conjunction with the Transaction:start service. If the transactionName parameter was not provided in a prior call to Transaction:start, a run-time error will be returned.

    rollback

    Rolls back an explicit transaction.

    Input Parameters

    Output Parameters

    None.

    Usage Notes

    This service must be used in conjunction with the Transaction:start service. If the given transactionName parameter was not provided in a prior call to Transaction:start, a run-time error will be returned.

    setTimeout

    Manually sets a transaction timeout interval for implicit and explicit transactions.

    Input Parameters

    Output Parameters

    None.

    Usage Notes

    You must call this service before you call the Transaction:start service. If the execution of a transaction takes longer than the transaction timeout interval, all transacted operations are rolled back.

    start

    Starts an explicit transaction.

    Input Parameters

    Output Parameters

    Usage Notes

    This service is intended for use with the Transaction:commit or Transaction:rollback service. The transactionName value returned by a call to this service can be provided to Transaction:commit (to commit the transaction) or Transaction:rollback (to roll back the transaction).
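    The start/commit/rollback pattern resembles explicit transaction control in any database API. A Python sqlite3 sketch of the same control flow (an analogue only; Database Connector operations are not shown):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER)")
try:
    conn.execute("INSERT INTO orders VALUES (1)")
    conn.commit()                        # analogue of Transaction:commit
    conn.execute("INSERT INTO orders VALUES (2)")
    raise RuntimeError("simulated failure")
except RuntimeError:
    conn.rollback()                      # analogue of Transaction:rollback
```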

    Utils Services

    Contains utility services.

    The following Utils services are available:

    Service Description
    generateUUID Generates a random Universally Unique Identifier (UUID).

    generateUUID

    Generates a random Universally Unique Identifier (UUID).

    Input Parameters

    None.

    Output Parameters

    XML Services

    Use XML services to convert a document to XML content and XML content to a document.

    The following XML services are available:

    Service Description
    documentToXMLBytes Converts a document to XML content bytes, as a byte array object.
    documentToXMLStream Converts a document to XML stream, as a java.io.InputStream object.
    documentToXMLString Converts a document to XML content string.
    getXMLNodeType Returns information about an XML node.
    queryXMLNode Queries an XML node.
    xmlBytesToDocument Converts XML content bytes (byte array) to a document.
    xmlNodeToDocument Converts an XML node to a document.
    xmlStreamToDocument Converts an XML content stream to a document.
    xmlStringToDocument Converts an XML string to a document.
    xmlStringToXMLNode Converts a String, byte[ ], or InputStream containing an XML document to an XML node.

    documentToXMLBytes

    Converts a document to XML content bytes, as a byte array object. This service will recurse through a given document and build an XML representation from the elements within it. Key names are turned into XML elements, and the key values are turned into the contents of those elements.

    Input Parameters

    Output Parameters

    documentToXMLStream

    Converts a document to an XML stream, as a java.io.InputStream object. This service will recurse through a given document and build an XML representation from the elements within it. Key names are turned into XML elements, and the key values are turned into the contents of those elements.

    Input Parameters

    Output Parameters

    documentToXMLString

    Converts a document to an XML content string. This service will recurse through a given document and build an XML representation from the elements within it. Key names are turned into XML elements, and the key values are turned into the contents of those elements.

    Input Parameters

    Output Parameters
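    The recursion can be sketched in a few lines of Python; document_to_xml_string is a hypothetical analogue that ignores attributes, namespaces, and escaping:

```python
def document_to_xml_string(doc: dict) -> str:
    # Keys become XML elements; values become their contents.
    parts = []
    for key, value in doc.items():
        body = document_to_xml_string(value) if isinstance(value, dict) else str(value)
        parts.append(f"<{key}>{body}</{key}>")
    return "".join(parts)
```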

    getXMLNodeType

    Returns information about an XML node.

    Input Parameters

    Output Parameters

    queryXMLNode

    Queries an XML node.

    The fields parameter specifies how data is extracted from the node to produce an output variable. This output variable is called a “binding,” because the fields parameter binds a certain part of the document to a particular output variable. At run time, the service must include at least one entry in fields. The result of each query you specify in fields is returned in a variable whose name and type you specify.

    Input Parameters

    Output Parameters

    Usage Notes

    If queryXMLNode fails, it throws an exception. Common reasons for queryXMLNode to fail include:

    xmlBytesToDocument

    Converts XML content bytes (byte array) to a document. This service transforms each element and attribute in XML content bytes to an element in a Document.

    Input Parameters

    Output Parameters

    xmlNodeToDocument

    Converts an XML node to a document.

    This service transforms each element and attribute in the XML node to an element in a Document.

    Notes:

    Input Parameters

    Output Parameters

    xmlStreamToDocument

    Converts an XML content stream to a document. This service transforms each element and attribute in the XML content stream to an element in a Document.

    Input Parameters

    Output Parameters

    xmlStringToDocument

    Converts an XML string to a document. This service transforms each element and attribute in the XML string to an element in a Document.

    Input Parameters

    Output Parameters

    xmlStringToXMLNode

    Converts a String, byte[ ], or InputStream containing an XML document to an XML node.

    An XML node is a representation of an XML document that can be consumed by webMethods.io Integration.

    Input Parameters

    Output Parameters

    Usage Notes

    The input parameters xmldata, $filedata, and $filestream are mutually exclusive. Specify only one of these parameters. webMethods.io Integration checks the parameters in the following order: $filedata, $filestream, and xmldata, and uses the first parameter that has a value.
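    The string-to-node idea corresponds to ordinary XML parsing; for illustration, Python's xml.etree produces a queryable node from an XML string (an analogue of the node concept, not the webMethods node type):

```python
import xml.etree.ElementTree as ET

node = ET.fromstring("<order><id>7</id></order>")
```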

    Log Business Data

    While executing a FlowService, webMethods.io Integration allows you to log business data at the step level as well as at the FlowService level.

    Log business data at the step level

    1. Provide a name and description of the FlowService.

    2. As we will query contacts from Salesforce CRM and log the business data, select the Salesforce CRM connector, the queryContacts operation, and SalesforceCRM_2 as the account.

    3. Select the Log business data option available at the step level. At the step level, the Log business data option is enabled only for connectors.

    4. In the Log business data dialog box, choose Always to always log business data. As we will query the contacts from Salesforce CRM, select the output fields and specify the display names for AccountId, LastName, and FirstName.

    5. Run the FlowService and go to the Execution History page by clicking the Execution history option.

    6. Click the execution history entry to view the execution details page. The business data is logged.

    Log business data at the FlowService level

    1. Click the icon. Let us define the input field FirstName.

    2. Click the Log business data option available at the FlowService level.

    3. In the Log business data dialog box, choose Always to always log business data.

    4. In an earlier step, you defined the input field FirstName. Select it and specify the display name as First Name of Customer.

    5. Run the FlowService. For the FirstName field, enter the value John and again click Run.

    6. Click the Execution history option to go to the Execution History page.

    7. Click the execution history entry to view the execution details page. The business data is logged.

    Reference Data

    Reference data is data that defines the set of permissible values to be used by other data fields. It is a collection of key-value pairs, which can be used to determine the value of a data field based on the value of another data field. For example, the value of a status field in an Application can be “Canceled” and that needs to be interpreted as “CN” in another Application.

    webMethods.io Integration allows you to upload reference data from a text file containing tabular data separated by a delimiter character, such as a comma or a semicolon. The uploaded file must not have an empty column heading or a space in the first row, and the first row cannot be empty.

    Reference data appears under Services in the FlowServices workspace. You can access the uploaded reference data in FlowServices as a list of documents by using the reference data service and providing an appropriate name. You can filter the documents returned into the pipeline by the reference data service.

    You can Delete, Download, or Edit a reference data from the Reference Data screen. If a reference data is used in a FlowService, you cannot delete it; first remove the reference data from the FlowService, and then delete it. The Download option allows you to download the previously uploaded reference data, edit it, and then upload the modified file.

    Reference Data Signature

    Reference data signature is derived from the column names of the uploaded text file. You can filter the Reference data by providing an appropriate matchCriteria. The output of Reference data is a list of documents that match the specified matchCriteria.

    Input Parameters

    matchCriteria: Document. Criteria on which documents from the Reference data are matched.
    Parameters for matchCriteria are:

    joins: List of join criteria.

    Output Parameters

    Reference Data Name: Document List. List of documents that match the specified matchCriteria.
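    Conceptually, the reference data service returns the documents whose fields satisfy every join in matchCriteria. A rough Python sketch under the assumption that each join pairs a column name with a required value (the service's actual join structure is not shown here):

    ```python
    def filter_reference_data(documents, match_criteria):
        """Return the documents matching every join in match_criteria.

        Assumes each join is a {"column": ..., "value": ...} pair; the real
        service's join fields may differ.
        """
        joins = match_criteria.get("joins", [])
        return [
            doc for doc in documents
            if all(doc.get(j["column"]) == j["value"] for j in joins)
        ]
    ```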

    Creating Reference Data

    Let us see how to create reference data and use it in a FlowService with the help of an example.
    In this example, we will upload a file that contains a list of courier vendors as reference data in webMethods.io Integration, and then import the list of vendors as contacts into Salesforce CRM.

    Before you begin

  • Log in to your tenant and enable FlowServices.
  • Check if you have the Developer and Admin roles assigned from the Settings > Roles page.
  • Obtain the credentials to log in to the Salesforce CRM back end account.
  • Create a Salesforce CRM account in webMethods.io Integration. You can also create this account inline at a later stage.
    Basic Flow

    1.Select the project where you want to create the reference data. You can also create a new project.

    2.Click Configurations > FlowService > Reference Data > Add Reference Data.

    3.Provide a name (CourierVendors) and an optional description for the reference data.

    4.For the Reference Data File, click Browse file and select the file. Both CSV files and text files containing tabular data are supported. The maximum file size you can upload is 1 MB. As shown in the sample, the file must not have an empty column heading or a space in the first row, and the first row cannot be empty, because the first row is read as the column headings.

    5.Click Next to define the reference data. Select the Field separator and the Text qualifier. Determine the encoding of the reference data file and select the same encoding from the File Encoding drop-down list.

    6.Click Next to preview the data. If you select an incorrect encoding, garbage characters may appear in the preview pane.

    7.Click Done to create the reference data. The new reference data appears in the Reference Data page. The reference data also appears under the Services category in the FlowServices workspace.

    8.Go to FlowServices and create a new FlowService. Provide a name ImportCourierVendorsAsContacts and description of the FlowService. Then type reference data in the first step and select Reference Data.

    9.Select CourierVendors. Click the icon to add a new step. We need to create a contact for each vendor, so type Repeat to select the Repeat step, and then select the CourierVendors option.

    10.Select Salesforce CRM, select createcontact as the action, and associate the SalesforceCRM_1 account.

    11.Click the mapping icon and map the input and output fields as shown below.

    12.Save and run the FlowService, and view the results.
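    The steps above amount to: read the reference data, then call createcontact once per vendor document. A rough Python sketch of that logic, where the helper and field names (create_contact, VendorName, LastName) are assumptions, not the connector's real inputs:

    ```python
    def import_courier_vendors_as_contacts(vendors, create_contact):
        """vendors: list of documents from the CourierVendors reference data.

        create_contact stands in for the Salesforce CRM createcontact action.
        """
        created = []
        for vendor in vendors:  # the Repeat step: one iteration per document
            contact = {"LastName": vendor.get("VendorName", "")}
            create_contact(contact)
            created.append(contact)
        return created
    ```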

    Cloning FlowServices

    The Clone feature allows you to copy the existing functionality of a FlowService into a new FlowService. This is particularly useful if you want to recreate an existing FlowService but change a few options: you need not start from scratch. When you clone a FlowService, the clone is exactly the same as the original, and you can modify it whenever you want, just like any other FlowService.

    To clone a FlowService:

    1. Locate the FlowService that you want to clone.

    2. Click the ellipsis icon available on the FlowService and click Clone from the displayed menu. The Clone FlowService dialog box appears.

    3. Enter the FlowService Name and Project.

    4. Click Clone. The FlowService is created under the specified project. You can modify it according to your needs, just like any other FlowService.

    Notes

    • Ensure that you create the account or reference data associated with the respective asset in the target project.
    • You can create a clone of a cloned FlowService.
    • Editing a cloned FlowService will not affect the original FlowService from which it was cloned.
    • A cloned FlowService, containing the Messaging connector, placed within the same project will inherit all assets, configurations, and authorizations of the original FlowService. However, cloning a FlowService to a different project will copy items of the original FlowService but not retain assets, configurations, and authorizations. You will need to create the account or reference data associated with the respective asset in the destination project.

    Document Types

    What is a document type?

    A document type contains a set of fields used to define the structure and type of data in a document. You can use a document type to specify the input or output parameters for a FlowService. Input and output parameters are the names and types of fields that a FlowService requires as input and generates as output. These parameters are collectively referred to as a signature.

    For example, a FlowService takes two string values, an account number (AcctNum) and a dollar amount (OrderTotal), as inputs and produces an authorization code (AuthCode) as the output. If you have multiple FlowServices with identical input parameters but different output parameters, you can use a document type to define the input parameters rather than manually specifying individual input fields for each FlowService.
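    In programming terms, the signature in this example is roughly the following (types only; the field names are taken from the example above, and TypedDict is simply a convenient way to write the shape down):

    ```python
    from typing import TypedDict

    # Input parameters of the example FlowService, as a reusable signature.
    # A document type plays the same role: several services can share it.
    class AuthorizeOrderInput(TypedDict):
        AcctNum: str      # account number
        OrderTotal: str   # dollar amount, passed as a string

    class AuthorizeOrderOutput(TypedDict):
        AuthCode: str     # authorization code
    ```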

    Benefits of creating a document type

    Document types provide the following benefits:

  • Using a document type as the input or output signature for a FlowService reduces the effort required to build a FlowService.
  • Using a document type to build document or document list fields reduces the effort and time needed to declare input or output parameters or build other document fields.
  • Document types improve accuracy because there is less chance of introducing errors while typing field names.
  • Document types make future changes easier to implement because you make a change at one place (the document type) rather than everywhere the document type is used.
    How do I create a document type?

    You can create a document type in the following ways from Projects > select a project > Configurations > FlowService > Document Types > Add Document Type:

  • Build from scratch: Create an empty document type and define the structure of the document type yourself by inserting fields to define its contents and structure.
  • Build from XML Schema Definition: Create a document type from an XML Schema Definition. The structure and content of the document type matches that of the source file.
    You can also create document types for already created REST connectors from Projects > select a project > Connectors > REST > Document Types option, or from the Request Body and Response Body panels while creating a REST connector.

    Document types created for a REST connector do not appear in the Projects > select a project > Configurations > FlowService > Document Types page, but appear in the Document Types panel for the selected REST connector.

    Notes

    • You can copy a document type from the Document Types page or from the Document Types page for a REST connector. Currently, you cannot copy a document type across projects. However, when you copy a FlowService or a Workflow across projects, the document type associated with it is also copied.
    • If a document type is used in any other document type or in a FlowService, you cannot delete it. In addition, if other document types were created along with the selected document type from the same xsd source, and any of those generated document types is used in a FlowService or in a document type other than the generated ones, you also cannot delete the selected document type.

    Creating Document Types from Scratch

    Creating a document type from scratch allows you to create an empty document type, and specify the structure of the document type by inserting fields to define its contents and structure.

    To add or edit a document type from scratch

    1.From the webMethods.io Integration navigation bar, click Projects. Select a project and click Configurations > FlowService > Document Types. From the Document Types page, you can add, edit, delete, or copy a document type. To edit an existing document type, on the Document Types page, click the Edit icon for the document type.

    2.To create a new document type from scratch, from the Document Types page, click Add Document Type > Build from scratch.

    3.Provide a name and description of your document type. Required fields are marked with an asterisk on the page.

    4.Click Load XML to generate a document type from the XML structure or click Load JSON to generate a document type from the JSON structure.

    5.Click the icon to add a new field and update the field properties.

    Provide the Name and Type of the field to define the structure and content of the document type. The type can be a String, Document, Document Reference, Object, Boolean, Double, Float, Integer, Long, or Short. If you select the Type as Document Reference, select a Document Reference. Types are used to declare the expected content and structure of the signatures, document contents, and pipeline contents.

    If you select the Type as String, in the Display Type field, select Text Field if you want the input entered in a text field. Select Password if you want the input entered as a password, with asterisks reflected instead of characters. Select Large Editor if you want the input entered in a large text area instead of a text field. This is useful if you expect a large amount of text as input for the field, or you need to have new line characters in the input. In the Pick List field, define the values that will appear as choices when webMethods.io Integration prompts for input at run time.

    In addition to specifying the name and type of the field, and whether the type is an Array, you can set properties that specify an XML Namespace and indicate whether the field is required at run time by selecting the Required field option. You can also select a Content Type, which applies to String, String list, or String table variables. See Content Types and Variable Constraints for information.

    You can duplicate a field in the fields panel by clicking the duplicate field icon. You can also copy a field from the fields panel and, depending on the context, paste either the field or the field path. For example, if you copy a field and paste it in the Set Value window in a FlowService, the field path is pasted. If you copy an array item, the pasted path includes the item index. You cannot modify or paste the child fields of a Document Reference. When defining a document type, avoid adding identically named fields to the document; in particular, do not add identically named fields of the same data type.
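    One reason to avoid identically named fields: in key-value representations such as JSON, a later field silently replaces an earlier one with the same name, so data can be lost. For example:

    ```python
    import json

    # Two fields named "status": Python's json parser keeps only the last one,
    # so the first value is silently discarded
    doc = json.loads('{"status": "CN", "status": "OK"}')
    ```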

    You can assign an XML namespace and prefix to a field by specifying a URI for the XML namespace property and by using the prefix:fieldName format for the field name. For example, suppose a field is named eg:account and the XML namespace property is set to http://www.example.com. The prefix is eg, the localname is account, and the namespace name is http://www.example.com.
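    The same decomposition can be checked in code: XML parsers resolve the prefix eg through the declared URI and report the namespace name together with the localname. A quick sketch with Python's standard parser, using the example values above:

    ```python
    import xml.etree.ElementTree as ET

    # eg is the prefix, account the localname,
    # and http://www.example.com the namespace name
    xml_doc = '<eg:account xmlns:eg="http://www.example.com">A-1001</eg:account>'
    root = ET.fromstring(xml_doc)
    # ElementTree reports the tag as {namespace}localname; the prefix itself
    # is only a shorthand for the URI
    ```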

    Keep the following points in mind when assigning XML namespaces and prefixes to a field:

  • The field name must be in the format: prefix:fieldName.
  • You must specify a URI in the XML namespace property.
  • Do not use the same prefix for different namespaces in the same document type, input signature, or output signature.
    6.Click Save after you have entered the details and constraints for each field.

    Note: When you edit a document type, any change is automatically propagated to all FlowServices that use or reference the document type.

    The new document type appears in the Document Types page.

    Creating Document Types from an XML Schema Definition

    Let us see how to create a document type from an XML Schema Definition with the help of an example. We will create a document type from an XML Schema Definition and then use the document type to create Accounts in Salesforce CRM.

    Before you begin

  • Check if you have the Developer and Admin roles assigned from the Settings > Roles page.
  • Obtain the credentials to log in to the Salesforce CRM back end account.
  • Create the Salesforce Account (SalesforceCRM_1) in webMethods.io Integration. You can also create this account inline at a later step.
    1.Log in to your tenant and from the webMethods.io Integration navigation bar, click Projects. Select a project and then select Configurations > FlowService > Document Types. From the Document Types page, you can add, edit, delete, or copy a document type. To edit an existing document type, on the Document Types page, click the Edit icon for the document type.

    2.To create a new document type from an XML Schema Definition, from the Document Types page, click Add Document Type > Build from XML Schema Definition.

    3.Provide a name and description of your document type, for example, customerOrder. Required fields are marked with an asterisk on the page.

    4.On the Source selection panel, under XML schema source, do one of the following to specify the source file for the document type:

  • To use an XML Schema Definition that resides on the Internet as the source, select URL. Then, type the URL of the resource. The URL you specify must begin with http: or https:.
  • To use an XML Schema Definition that resides in your local file system as the source, select File. Then click Browse and select the file. You can add additional imported or included XML schema files.

    Note: The maximum file upload size is 5 MB which includes the primary source file and additional files, if any.

    5.Click Next and on the Processing options panel, under Content model compliances, select a content model compliance to indicate how strictly webMethods.io Integration represents content models from the XML Schema Definition in the resulting document type. Let us select None.

    6.You can specify whether webMethods.io Integration enforces strict, lax, or no content model compliance when generating the document type. Content models provide a formal description of the structure and allowed content for a complex type. The type of compliance that you specify can affect whether webMethods.io Integration generates a document type from a particular XML Schema Definition successfully. Currently, webMethods.io Integration does not support repeating model groups, nested model groups, or the any attribute. If you select strict compliance, webMethods.io Integration does not generate a document type from any XML Schema Definition that contains those items.

    Select… To…
    Strict Generate the document type only if webMethods.io Integration can represent the content models defined in the XML Schema Definition correctly. Document type generation fails if webMethods.io Integration cannot accurately represent the content models in the source XML Schema Definition. Currently, webMethods.io Integration does not support repeating model groups, nested model groups, or the any attribute. If you select strict compliance, webMethods.io Integration does not generate a document type from any XML Schema Definition that contains those items.
    Lax When possible, generate a document type that correctly represents the content models for the complex types defined in the XML Schema Definition. If webMethods.io Integration cannot correctly represent the content model in the XML Schema Definition in the resulting document type, webMethods.io Integration generates the document type using a compliance mode of None. When you select lax compliance, webMethods.io Integration generates the document type even if the content models in the XML Schema Definition cannot be represented correctly.
    None Generate a document type that does not necessarily represent or maintain the content models in the source XML Schema Definition.

    7.If you select strict or lax compliance, do one of the following to specify whether document types generated contain multiple *body fields to preserve the location of text in instance documents.

  • Select the Preserve text position check box to indicate that the document type generated preserves the locations for text in instance documents. The resulting document type contains a *body field after each field and includes a leading *body field. In instance documents for this document type, webMethods.io Integration places text that appears after a field in the *body.
  • Clear the Preserve text position check box to indicate that the document type generated does not preserve the locations for text in instance documents. The resulting document type contains a single *body field at the top of the document type. In instance documents for this document type, text data around fields is all placed in the same *body field.
    8.If you want webMethods.io Integration to use the Xerces parser to validate the XML Schema Definition, select the Validate schema using Xerces check box.

    Note: webMethods.io Integration automatically uses an internal schema parser to validate the XML Schema Definition. However, the Xerces parser provides more strict validation than the internal schema parser. As a result, some schemas that the internal schema parser considers to be valid might be considered invalid by the Xerces parser.

    9.Click Next and on the Select root nodes panel, select the elements that you want to use as the root elements for the document type. The resulting document type contains all of the selected root elements as top-level fields in the generated document type.

    10.Click Save.

    webMethods.io Integration creates the document type. As you can see, other document types are created automatically from the same xsd source file.

    Note: If an element in the XML Schema Definition is a complex type, webMethods.io Integration automatically creates the document type that defines the structure of that complex type, along with the main document type. For example, in the EmployeeDetails.xsd file below, the field address is a complex type containing Doorno, street, city, and pincode as its fields.

    Two document types are then created from this xsd source: the main document type, Employeedetails, and docTypeRef_AddressType, which represents the structure of the complex type.
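    The relationship between the two generated document types can be pictured as nested structures. A sketch in Python, where the employee fields other than address are assumptions added for illustration:

    ```python
    # docTypeRef_AddressType: structure of the address complex type from the xsd
    docTypeRef_AddressType = {
        "Doorno": "string",
        "street": "string",
        "city": "string",
        "pincode": "string",
    }

    # Employeedetails: the main document type, referencing the address type
    Employeedetails = {
        "name": "string",  # assumed field, not taken from the source file
        "address": docTypeRef_AddressType,
    }
    ```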

    Notes

    • If you have selected strict compliance and webMethods.io Integration cannot represent the content model in the complex type accurately, webMethods.io Integration does not generate any document type.

    • If you have selected lax compliance and indicated that webMethods.io Integration should preserve text locations for content types that allow mixed content (you selected the Preserve text position check box), webMethods.io Integration adds *body fields in the document type only if the complex type allows mixed content and webMethods.io Integration can correctly represent the content model declared in the complex type definition. If webMethods.io Integration cannot represent the content model in a document type, webMethods.io Integration adds a single *body field to the document type.

    • If the XML Schema Definition contains an element reference to an element declaration whose type is a named complex type definition (as opposed to an anonymous complex type definition), webMethods.io Integration creates a document type for the named complex type definition only if it is referred to multiple times in the schema.

    • webMethods.io Integration uses the prefixes declared in the XML Schema or the ones you specified as part of the field names. Field names have the format prefix:elementName or prefix:@attributeName.

    • If the XML Schema does not use prefixes, webMethods.io Integration creates prefixes for each unique namespace and uses those prefixes in the field names. webMethods.io Integration uses “ns” as the prefix.

    • If the XML Schema Definition contains a user-specified namespace prefix and a default namespace declaration, both pointing to the same namespace URI, webMethods.io Integration uses the user-specified namespace prefix and not the default namespace.


    11.Now let us use the document type customerOrder to create Accounts in Salesforce CRM.
    Go to the FlowServices page, click the + icon to create a new FlowService, and then provide a name CreateAccountsDocType and description of the FlowService.

    Click the I/O icon as shown above, and define the input and output fields. Select the document reference and save it.

    12.Select Salesforce CRM and link the SalesforceCRM_1 account.

    13.Click the mapping icon and map the input and output fields.

    14.Close the mapping panel by clicking the icon, click the run icon and enter the input values.

    15.Click Run and inspect the results.

    Content Types

    The following table identifies the content types you can apply to String or String list variables. Each of these content types corresponds to a built-in simple type defined in the specification XML Schema Part 2: Datatypes.

    Content Types Description
    anyURI A Uniform Resource Identifier Reference. The value of anyURI may be absolute or relative. Constraining Facets
    enumeration, length, maxLength, minLength, pattern
    Note: The anyURI type indicates that the variable value plays the role of a URI and is defined like a URI. URI references are not validated because it is impractical for applications to check the validity of a URI reference.
    base64Binary Base64-encoded binary data. Constraining Facets
    enumeration, length, maxLength, minLength, pattern
    boolean True or false. Constraining Facets
    pattern
    Example
    true, 1, false, 0
    byte A whole number whose value is greater than or equal to -128 but less than or equal to 127. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    -128, -26, 0, 15, 125
    date A calendar date from the Gregorian calendar. Values need to match the following pattern: CCYY-MM-DD
    Where CC represents the century, YY the year, MM the month, DD the day. The pattern can include a Z at the end to indicate Coordinated Universal Time or to indicate the difference between the time zone and Coordinated Universal Time.
    Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    1997-08-09 (August 9, 1997)
    dateTime A specific instant of time (a date and time of day). Values need to match the following pattern: CCYY-MM-DDThh:mm:ss.sss
    Where CC represents the century, YY the year, MM the month, DD the day, T the date/time separator, hh the hour, mm the minutes, and ss the seconds. The pattern can include a Z at the end to indicate Coordinated Universal Time or to indicate the difference between the time zone and Coordinated Universal Time.
    Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    2000-06-29T17:30:00-05:00 represents 5:30 pm Eastern Standard time on June 29, 2000. (Eastern Standard Time is 5 hours behind Coordinated Universal Time.)
    decimal A number with an optional decimal point. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    8.01, 290, -47.24
    double Double-precision 64-bit floating point type. Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example 6.02E23, 3.14, -26, 1.25e-2
    duration A length of time. Values need to match the following pattern: PnYnMnDTnHnMnS
    Where nY represents the number of years, nM represents the number of months, nD is the number of days, T separates the date and time, nH the number of hours, nM the number of minutes and nS the number of seconds. Precede the duration with a minus (-) sign to indicate a negative duration.
    Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    P2Y10M20DT5H50M represents a duration of 2 years, 10 months, 20 days, 5 hours, and 50 minutes
    ENTITIES Sequence of whitespace-separated ENTITY values declared in the DTD. Represents the ENTITIES attribute type from the XML 1.0 Recommendation. Constraining Facets
    enumeration, length, maxLength, minLength
    ENTITY Name associated with an unparsed entity of the DTD. Represents the ENTITY attribute type from the XML 1.0 Recommendation. Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    float A number with a fractional part. Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    8.01, 25, 6.02E23, -5.5
    gDay A specific day that recurs every month. Values must match the following pattern: ---DD
    Where DD represents the day. The pattern can include a Z at the end to indicate Coordinated Universal Time or to indicate the difference between the time zone and Coordinated Universal Time.
    Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    ---24 indicates the 24th day of each month
    gMonth A Gregorian month that occurs every year. Values must match the following pattern: --MM
    Where MM represents the month. The pattern can include a Z at the end to indicate Coordinated Universal Time or to indicate the difference between the time zone and Coordinated Universal Time.
    Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    --11 represents November
    gMonthDay A specific day and month that recurs every year in the Gregorian calendar. Values must match the following pattern: --MM-DD
    Where MM represents the month and DD represents the day. The pattern can include a Z at the end to indicate Coordinated Universal Time or to indicate the difference between the time zone and Coordinated Universal Time.
    Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    --09-24 represents September 24th
    gYear A specific year in the Gregorian calendar. Values must match the following pattern: CCYY
    Where CC represents the century, and YY the year. The pattern can include a Z at the end to indicate Coordinated Universal Time or to indicate the difference between the time zone and Coordinated Universal Time.
    Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    2001 indicates the year 2001.
    gYearMonth A specific month and year in the Gregorian calendar. Values must match the following pattern: CCYY-MM
    Where CC represents the century, YY the year, and MM the month. The pattern can include a Z at the end to indicate Coordinated Universal Time or to indicate the difference between the time zone and Coordinated Universal Time
    Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    2001-04 indicates April 2001.
    hexBinary Hex-encoded binary data. Constraining Facets
    enumeration, length, maxLength, minLength, pattern
    ID A name that uniquely identifies an individual element in an instance document. The value for ID needs to be a valid XML name. The ID datatype represents the ID attribute type from the XML 1.0 Recommendation. Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    IDREF A reference to an element with a unique ID. The value of IDREF is the same as the ID value. The IDREF datatype represents the IDREF attribute type from the XML 1.0 Recommendation. Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    IDREFS Sequence of white space separated IDREFs used in an XML document. The IDREFS datatype represents the IDREFS attribute type from the XML 1.0 Recommendation. Constraining Facets
    enumeration, length, maxLength, minLength
    int A whole number with a value greater than or equal to -2147483648 but less than or equal to 2147483647. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    -21474836, -55500, 0, 33123, 4271974
    integer A positive or negative whole number. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    -2500, -5, 0, 15, 365
    language Language identifiers used to indicate the language in which the content is written. Natural language identifiers are defined in IETF RFC 1766. Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    long A whole number with a value greater than or equal to -9223372036854775808 but less than or equal to 9223372036854775807. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    -55600, -23, 0, 256, 3211569432
    Name XML names that match the Name production of XML 1.0 (Second Edition). Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    NCName Non-colonized XML names. Set of all strings that match the NCName production of Namespaces in XML. Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    negativeInteger An integer with a value less than or equal to -1. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    -255556, -354, -3, -1
    NMTOKEN Any mixture of name characters. Represents the NMTOKEN attribute type from the XML 1.0 Recommendation. Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    NMTOKENS Sequences of NMTOKENS. Represents the NMTOKENS attribute type from the XML 1.0 Recommendation. Constraining Facets
    enumeration, length, maxLength, minLength
    nonNegativeInteger An integer with a value greater than or equal to 0. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    0, 15, 32123
    nonPositiveInteger An integer with a value less than or equal to 0. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits, whiteSpace
    Example
    -256453, -357, -1, 0
    normalizedString Represents white space normalized strings. Set of strings (sequence of UCS characters) that do not contain the carriage return (#xD), line feed (#xA), or tab (#x9) characters. Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    Example
    MAB-0907
    positiveInteger An integer with a value greater than or equal to 1. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    1, 1500, 23000
    short A whole number with a value greater than or equal to -32768 but less than or equal to 32767. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    -32000, -543, 0, 456, 3265
    string Character strings in XML. A sequence of UCS characters (ISO 10646 and Unicode). By default, all white space is preserved for variables with a string content constraint. Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    Example
    MAB-0907
    time An instant of time that occurs every day. Values must match the following pattern: hh:mm:ss.sss
    Where hh indicates the hour, mm the minutes, and ss the seconds. The pattern can include a Z at the end to indicate Coordinated Universal Time, or a time zone offset such as -05:00 to indicate the difference between the time zone and Coordinated Universal Time.
    Constraining Facets
    enumeration, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern
    Example
    18:10:00-05:00 (6:10 pm, Eastern Standard Time) Eastern Standard Time is 5 hours behind Coordinated Universal Time.
    token Represents tokenized strings. Set of strings that do not contain the carriage return (#xD), line feed (#xA), or tab (#x9) characters, leading or trailing spaces (#x20), or sequences of two or more spaces. Constraining Facets
    enumeration, length, maxLength, minLength, pattern, whiteSpace
    unsignedByte A whole number greater than or equal to 0, but less than or equal to 255. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    0, 112, 200
    unsignedInt A whole number greater than or equal to 0, but less than or equal to 4294967295. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    0, 22335, 123223333
    unsignedLong A whole number greater than or equal to 0, but less than or equal to 18446744073709551615. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    0, 2001, 3363124
    unsignedShort A whole number greater than or equal to 0, but less than or equal to 65535. Constraining Facets
    enumeration, fractionDigits, maxExclusive, maxInclusive, minExclusive, minInclusive, pattern, totalDigits
    Example
    0, 1000, 65000

    Customizing a String Content Type

    Instead of applying an existing content type or simple type to a String or String list, you can customize an existing type and apply the new, modified type to a variable. You can customize a content type or simple type by changing the constraining facets applied to the type.

    When you customize a type, you actually create a new content type. The constraining facets you can specify depend on the content type. Note that content types and constraining facets correspond to datatypes and constraining facets defined in XML Schema. For more information about constraining facets for a datatype, see the specification XML Schema Part 2: Datatypes (http://www.w3.org/TR/xmlschema-2/).

    To customize a content type

    1.Select the variable to which you want to apply a customized content type and click the edit icon.

    2.Click the Content type drop-down arrow and then in the Content type list, select the content type you want to customize. The constraining facet fields below the Content type list are made available for data entry.

    3.In the fields below the Content type field, specify the constraining facet values you want to apply to the content type and click Save.

    Note: The constraining facets displayed below the Content type list depend on the primitive type from which the simple type is derived. Primitive types are the basic data types from which all other data types are derived. For example, if the primitive type is string, the constraining facets Enumeration, Length, Minimum Length, Maximum Length, Whitespace, and Pattern are displayed. For more information about primitive types, see XML Schema Part 2: Datatypes at http://www.w3.org/TR/xmlschema-2/.

    Variable Constraints

    You apply content constraints to variables in the document types that you want to use as blueprints in data validation. Content constraints describe the data a variable can contain. At validation time, if the variable value does not conform to the content constraints applied to the variable, the validation engine considers the value to be invalid.

    When applying content constraints to variables, do the following:

    Select a content type: A content type specifies the type of data for the variable value, such as string, integer, boolean, or date. A content type corresponds to a simple type definition in a schema.

    Set constraining facets: Constraining facets restrict the content type, which in turn, restrict the value of the variable to which the content type is applied. Each content type has a set of constraining facets. For example, you can set a length restriction for a string content type, or a maximum value restriction for an integer content type.

    For example, for a String variable named itemQuantity, specify a content type that requires the variable value to be an integer. You could then set constraining facets that limit the content of itemQuantity to a value between 1 and 100.
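    For illustration, the itemQuantity constraint described above corresponds to an XML Schema simple type definition such as the following sketch (the type name itemQuantityType is made up for this example):

```xml
<xs:simpleType name="itemQuantityType">
  <!-- Content type: integer; constraining facets limit the value to 1..100 -->
  <xs:restriction base="xs:integer">
    <xs:minInclusive value="1"/>
    <xs:maxInclusive value="100"/>
  </xs:restriction>
</xs:simpleType>
```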

    The content types and constraining facets correspond to the built-in data types and constraining facets in XML Schema. The World Wide Web Consortium (W3C) defines the built-in data types and constraining facets in the specification XML Schema Part 2: Datatypes (http://www.w3.org/TR/xmlschema-2/).

    Applying Constraints to a Variable

    You can apply content constraints to variables in the document types that you want to use as blueprints in data validation.

    To apply constraints to a variable

    1.On the Document Types page, for a document type, click the Edit icon. Select a field and click the edit field icon to view the field properties panel.

    You can apply constraints to the fields of a document type, and also the fields declared on the Input/Output page after you click the Define I/O option in a FlowService. If the selected variable is a String or String list, and you want to specify content constraints for the variable, do the following:

    If you want to use a content type that corresponds to a built-in simple type in XML schema, in the Content type list, select the type for the variable contents. To apply the selected type to the variable, click Save.

    2.Repeat this procedure for each variable to which you want to apply the constraints in the document type and click Save.

    Lock and Unlock FlowServices

    webMethods.io Integration allows you to manage a FlowService during the development life cycle by auto locking. When you edit a FlowService, it is automatically locked for you. This restricts multiple users from editing the FlowService at the same time. After you edit a FlowService and save the changes, you can exit the edit mode to unlock the FlowService and make it available for other users.

    If you are editing a FlowService and if another user opens the FlowService, then that user sees the following message.

    A FlowService cannot be edited in view mode.

    After you edit a FlowService and save the changes, the other user sees the following message:

    If you are editing a FlowService and another user tries to delete the FlowService, the other user sees the following message:

    If a FlowService lock remains unresolved because the user session was left idle for a long time or because of other issues, only a user with the Admin role can unlock the FlowService and make it available for editing.

    To unlock a FlowService, click the ellipsis icon available on a FlowService and select Unlock.

    If the Admin unlocks the FlowService, all editing permissions are automatically revoked for the user who locked the FlowService, and any unsaved changes are lost. The following message appears after the Admin unlocks the FlowService:

    Further, all users who are currently editing or viewing the FlowService, see the following message:

    Version Management of FlowServices

    webMethods.io Integration allows you to view the version history of a FlowService. Click the ellipsis icon and select the Version history option available on the FlowServices toolbar panel to view the version history.

    When you save a FlowService, a new version is added to the Version History with the default commit message. You can also provide a custom commit message by clicking the drop-down arrow beside the Save option and selecting Save with message.
    You can click on any version on the Version history panel and view the corresponding FlowService version.
    To restore an earlier version, select the earlier version of the FlowService, and click the Restore icon.
    If you have reverted to an earlier version and there is a scheduled execution for the FlowService, the reverted version runs as per the defined schedule.

    Notes

    • If a FlowService references any other FlowService, then the pipeline mapping of the referenced FlowService is also restored to that particular version. But if the pipeline mapping of the referenced FlowService has been modified in a later version, the modification might break the mappings and the FlowService execution will not be successful.
    • If a FlowService references any document types, reference data, custom operations, REST connectors, and SOAP connectors, and if those references have been modified, then those references will not be restored.
    • If you delete a FlowService and then create another FlowService with the same name, the version history of the deleted FlowService will be available.

    Debug FlowServices

    You can debug a FlowService and inspect the data flow during the debugging session. The FlowService is automatically saved when you start the debug session.

    You can do the following in debug mode:

  • Start a FlowService in debug mode, specify the input values, and inspect the results.
  • Examine and edit the pipeline data before and after executing the individual steps.
  • Monitor the execution path, execute the steps one at a time, or specify breakpoints where you want to halt the execution.

    To start a debug session, from the FlowServices page, select a FlowService, insert the breakpoints, and then click the Debug icon. The debug session halts at the first step even if you have not inserted a breakpoint.

    If you have defined input fields for the FlowService, the Input Values dialog box appears where you can specify the input values. If no input fields are defined, the Debug panel appears directly.

    Notes

    • If you have not defined any input fields or if there are no pipeline output variables, a message appears while debugging stating that the pipeline is empty. Further, if you disable a step, that step is not considered while debugging. The FlowService Execution page (Monitor > FlowService Execution) and the Execution History page do not display any execution logs for a debug session.
    • If a FlowService has a child FlowService, webMethods.io Integration will not step into the child FlowService during a debug session.

    The following table describes the options available while debugging the FlowService:

    Icon Applicable for… Action/Description
    Insert Breakpoints Insert a breakpoint in a FlowService by clicking on the step number.
    To remove a breakpoint, click on the step number where the breakpoint is inserted.
    Breakpoints are recognized only when you run a FlowService in a debug session.
    A breakpoint is a point where you want processing to pause when you debug the FlowService. Breakpoints can help you isolate a section of code or examine data values at a particular point in the execution path. For example, you might want to set a pair of breakpoints before and after a particular step, so that you can examine the pipeline before and after that step executes. When you run a FlowService that contains a breakpoint, the FlowService is executed up to, but not including, the designated breakpoint.
    Disable Breakpoints Ignores all breakpoints inserted in the FlowService steps.
    Enable Breakpoints Enables all breakpoints inserted in the FlowService steps.
    Resume Resumes the debug session but pauses at the next breakpoint.
    Stop Terminates the debug session. A debug session might also stop by itself for the following reasons:
    • The FlowService that you are debugging executes to completion (error or success).
    • You select Step over for the last step in the FlowService.
    • You exit the FlowService.
    Restart Restarts the debug session from the first step.
    Step over Executes the FlowService on a step-by-step basis. For conditional controls:
    • Conditions are evaluated.
    • If the conditions are true, the steps inside the control are executed on the next step over.
    Clear all breakpoints Removes all breakpoints inserted in the FlowService.
    Close Closes the Debug panel and goes back to the FlowService.

    Modifying the current pipeline data while debugging

    While debugging, you can modify the contents of the pipeline by clicking the field values. The changed values are not applied to the current step but to successive steps when you do a Step over or Resume.

    While modifying the pipeline, keep the following points in mind:

  • You can modify the pipeline data only during an active debug session.
  • When you modify values in the pipeline, the changes apply only to the current debug session. The FlowService is not permanently changed.
  • You can modify existing variables but cannot add new variables to the pipeline.
  • Note: While running or debugging FlowServices and testing operations, if the input has any duplicate keys, or if the service returns an output with duplicate keys, you can view those keys.

    Restart and Resume FlowServices

    webMethods.io Integration allows you to restart or resume failed FlowService executions. If an execution fails, you can resume that execution from the point where it had failed. Resuming an execution does not execute the previously successful operations but executes only the failed operations and operations that are not yet executed. When a FlowService execution is restarted, the execution occurs from the beginning.

    Note: The Restart and Resume options are available only if you have the required capability for restarting and resuming FlowServices. Contact Global Support to enable the Restart and Resume capability as these options are not available by default in the user interface.

    The following table provides information on when you can restart or resume a FlowService execution:

    Execution Result Status Restartable Resumable
    Successful Executions Yes No
    Failed Executions Yes Yes
    Completed with Errors Yes No
    Running Executions No No

    How It Works

    1.From the webMethods.io Integration navigation bar, click Projects. Select a project and then select FlowServices.

    2.Click the ellipsis icon available on the FlowService and select Overview.

    3.On the Overview page, select the Enable FlowService to be restarted option. Enabling this option increases the execution time of a FlowService. If the FlowService is updated, you cannot restart or resume executions that occurred before the update.
    Further, FlowServices that use Operations or other FlowServices with fields of type Object in their signature may not execute properly when restarted or resumed.

    4.Select the FlowService and run it.

    5.Provide the input values, if needed, and then click Run.

    6.Go to the Monitor > Execution Results > FlowService Execution page. Click the execution log, and on the execution details page, click the Restart or Resume option.

    7.To restart the execution from the beginning, click Restart, provide the input values, and then click Run. To resume the execution from the point where it had failed in the previous run, click Resume and edit the input data if needed.

    Invoke FlowServices over HTTP

    webMethods.io Integration allows you to trigger the execution of a FlowService from an external system. This option provides you with another way to trigger FlowService executions from a software application, for example, a REST client, apart from manual and scheduled executions from the user interface.

    On the external program, provide the HTTP URL, Method, required JSON body, and necessary header parameters, including the required security credentials (user name and password) to invoke the FlowService. After the FlowService is executed, the response contains the pipeline data.

    How it Works

    1.Log in to your tenant and from the webMethods.io Integration navigation bar, click Projects. Select a project and then click FlowServices.

    2.Click the ellipsis icon available on a FlowService and select Overview.

    The Overview page appears.

    3.On the Overview page, select the Enable FlowService to be invoked over HTTP option. Once the FlowService is enabled to be invoked over HTTP, the HTTP request URL appears.

    4.Click the Advanced details section to view the HTTP Method, sample JSON input, and the parameters that are required to invoke the FlowService from an external system.

    Synchronous URL

    You can execute FlowServices synchronously using the run URL:

    https://sub-domain.domain/integration/rest/external/integration/run/stagename/integrationname

    run - FlowService executes and the response contains the pipeline data.

    sub-domain is a domain that is part of the primary domain.

    stagename is the name of the active stage.

    integrationname is the name of the FlowService.

    Note: You must provide your user name and password to execute the FlowService from the external program; otherwise, you may encounter a 401 - Unauthorized error.
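    As a sketch, a synchronous invocation from a Python client might look like the following. The tenant host, stage, FlowService name, and credentials are placeholders, not real values; the URL follows the run pattern described above.

```python
import base64
import json
import urllib.request

# Placeholders -- substitute your own tenant host (sub-domain.domain),
# active stage name, and FlowService name.
TENANT_HOST = "mytenant.example.com"
STAGE = "development"
FLOW_SERVICE = "myFlowService"

# Synchronous run URL, built from the pattern documented above.
RUN_URL = (f"https://{TENANT_HOST}/integration/rest/external/integration/"
           f"run/{STAGE}/{FLOW_SERVICE}")

def invoke_flow_service(payload, username, password):
    """POST the JSON input with Basic authentication and return the
    pipeline data from the response."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    request = urllib.request.Request(
        RUN_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
            # Without valid credentials the call fails with 401 - Unauthorized.
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))
```

    The Content-Type and Accept headers mirror the parameters listed in the Advanced details section of the FlowService Overview page.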

    Asynchronous URL

    You can execute FlowServices asynchronously using the submit URL:

    https://sub-domain.domain/integration/rest/external/integration/submit/stagename/integrationname

    submit - FlowService has been submitted for execution and the response contains a status indicating whether the FlowService has been submitted for execution. When the request is submitted for execution using the submit option, the response contains a reference to the execution result identifier so that a new HTTP call can be made later to get the execution results.

    Application Status Codes for submit

  • 0 - SUCCESS: Successfully submitted the FlowService for execution.
  • -1 - ERROR: Problem while submitting the FlowService for execution.
    To get the execution results, construct the URL of a new HTTP call from the URI field available in the Response section: add the response URI obtained from resultReference to https://sub-domain.domain.

    Response URI format: https://sub-domain.domain/integration/rest/external/integration/execution/result?resultReference=765733-6a21-4b02-864f-e958f698373
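    The submit-then-fetch pattern can be sketched as follows. The host is a placeholder; the two helper functions simply assemble the submit URL and the follow-up results URL from the patterns documented above.

```python
# Placeholder host -- substitute your own sub-domain.domain.
BASE_URL = "https://mytenant.example.com"

def submit_url(stagename, integrationname):
    """Asynchronous submit URL for a FlowService."""
    return (f"{BASE_URL}/integration/rest/external/integration/"
            f"submit/{stagename}/{integrationname}")

def result_url(result_reference):
    """URL of the follow-up HTTP call that fetches the execution results,
    built from the resultReference returned in the submit response."""
    return (f"{BASE_URL}/integration/rest/external/integration/"
            f"execution/result?resultReference={result_reference}")

# A successful submit (application status code 0) returns a resultReference
# in its response; pass that value to result_url() for the follow-up call.
```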

    HTTP Status Codes

    5.Download the Postman app (Windows 64-bit) from https://www.postman.com/downloads, install the app, and then sign in to Postman.

    6.On the Postman page, click Create a request.

    7.On the Postman app Request page, select POST. Then copy the Synchronous URL from the webMethods.io Integration FlowService Overview page and paste it in the Enter Request URL field.

    8.Go to the webMethods.io Integration FlowService Overview > Advanced details page and get the Content-Type and Accept header parameters.

    9.Go to the Postman app Request page, click Headers, and enter the Content-Type and Accept values.

    10.On the Postman app Request page, click Authorization, select the Type as Basic Auth, enter your webMethods.io Integration login user name and password, and click Send.

    The result appears in the Body section at the lower part of the Postman Request page. The execution result also appears on the webMethods.io Integration FlowService Execution History page and on the Monitor > Execution Results > FlowService Execution page under Execution Logs.

    Export FlowServices

    webMethods.io Integration allows you to export FlowServices from the FlowServices page. You can export FlowServices only if you have the required capability for exporting FlowServices. You can export a FlowService from one tenant and import that FlowService to another tenant.

    How It Works

    1.From the webMethods.io Integration navigation bar, click Projects. Select a project and then click FlowServices. The FlowServices page appears.

    2.Click the ellipsis icon available on the FlowService and select Export. If the FlowService you are exporting uses Reference Data, Document Types, SOAP, or REST Connectors, then those components are also exported along with the FlowService.

    The Confirm Export dialog box appears.

    3.Click Export to export the FlowService. The FlowService is downloaded as a zip file to your default download folder. The zip file has the same name as that of the FlowService. Do not modify the contents of the exported zip file. If you modify the contents of the zip file, the FlowService cannot be imported back to webMethods.io Integration.

    Exporting a FlowService having an On-Premises Connector

    After exporting a FlowService that has an on-premises connector, if you want to import the FlowService, first ensure that you upload the same on-premises connector to webMethods.io Integration. Otherwise, you will not be able to import the FlowService.

    Import FlowServices

    webMethods.io Integration allows you to import FlowServices from a zip file that was earlier exported from webMethods.io Integration. You can export FlowServices from one tenant and import those FlowServices to another tenant. You can import FlowServices provided you have the required capability.

    Note: If you want to import a FlowService that has an on-premises connector, before importing the FlowService, ensure that you upload the same on-premises connector to webMethods.io Integration, else you will not be able to import the FlowService.

    How It Works

    1.From the webMethods.io Integration navigation bar, click Projects. Select a project and then click FlowServices. The FlowServices page appears.

    2.Click Import and select the zip file that contains the exported FlowService.

    While importing a FlowService, if a dependent FlowService conflicts with an existing FlowService in the same project, you can choose how to resolve the conflict.

    Notes

    • If you are importing a FlowService that uses a Messaging connector, you will need to create the Accounts and destinations, and then configure them in the imported FlowService.
    • If the FlowService you are importing uses SOAP or REST connectors and those connectors do not exist in your system, continue importing the FlowService. The connectors are imported along with the FlowService. After importing, create the Accounts and then configure them in the imported FlowService.
    • If a FlowService you are importing uses an on-premises connector and if the connector does not exist in your system, the Account appears only after you have uploaded the on-premises connector.
    • If an Account is used in multiple steps in a FlowService, after importing the FlowService in a different project, the Account name appears in the relevant steps.


      In the FlowService step, the account appears as configured, but is not available in the project.


      In such cases, create the Account with the same name. The Account will be automatically configured in the relevant steps. If you create the Account with a different name, you have to configure the Account manually at each step. If an Account with the same name already exists in the project, then the Account will be automatically linked in the relevant steps.