Event Streaming in Integration Server
 
An event stream is a continuous sequence of events that flows from producers to a streaming provider, such as Apache Kafka, and on to consumers that process the events. Each event captures an activity that has been deemed of interest in a solution or system. These activities can include user interactions, data changes, system events, or anything else relevant to the functionality or behavior of the system.
Event streaming is an implementation of the publish-and-subscribe model: producers send events, a streaming provider receives, organizes, and distributes them, and consumers retrieve and process them. However, event streaming differs from traditional publish-and-subscribe in the volume and types of events. In traditional publish-and-subscribe, a producer publishes an event, the event broker routes it to subscribers, and consumers retrieve and process the event. This pattern suits critical events, such as a purchase order or a financial transaction, that must be processed in a guaranteed and ordered manner.

In event streaming, producers send streams of events to topics on the streaming provider, which saves each event to the specified topic. Consumers tap into the stream and retrieve only the events in which they are interested. The streaming provider places the events in a log and does not remove them after consumers retrieve them, which allows events to be replayed and to be retrieved by multiple consumers. The event streaming pattern suits processing, analyzing, and aggregating a high volume of events. In some cases, the overall pattern of events might be of more interest than the individual events themselves.
Integration Server provides a generic, highly scalable solution for sending, receiving, and processing streams of events. The event streaming capabilities included in Integration Server are provider-agnostic, allowing Integration Server to integrate with and process events from multiple streaming providers. That is, the capabilities are not tailored to a specific event streaming platform, such as Kafka, and Integration Server may not support all the proprietary extensions offered by a particular streaming platform. Integration Server event streaming capabilities can be used with event streaming patterns as well as with traditional message processing.
Integration Server can act as an event producer that sends events to topics on a streaming provider, and as an event consumer that retrieves events from the streaming provider. Specifically:
*Integration Server can publish events (messages) to a topic on the streaming provider using the built-in service pub.streaming:send. A topic is a log that may be distributed across multiple partitions, which may be located on one or more servers for the streaming provider. The streaming provider manages appending messages to the log and placing each message in the correct topic partition. Note that not all streaming providers use partitions. A sketch of invoking pub.streaming:send from a Java service appears after this list.
*Integration Server consumes events from topics on the streaming provider using consumers created for event triggers. Consumers read events from the topic and then pass the events to the processing logic defined for the trigger.
If the streaming provider supports partitions (as Kafka does), each consumer is assigned one or more partitions and receives the events from each assigned partition in the order in which they were published. The event trigger can be configured to operate with multiple consumers, and each consumer can process events from different partitions concurrently. However, multiple consumers cannot access the same partition concurrently.
If you are running multiple instances of the same Integration Server (for example, in a cluster), there may be more consumers than partitions. In this case, some of the consumers will be inactive, as the partition-assignment sketch after this list illustrates.
*Integration Server uses a streaming connection alias to establish the connection to the streaming provider. The streaming connection alias contains all the information that Integration Server needs to connect to the streaming provider. Event producers and consumers rely on the streaming connection alias: an invocation of pub.streaming:send identifies the streaming connection alias to use to send the event, and an event trigger identifies the streaming connection alias to use to receive events.
Event specifications use a streaming connection alias to identify the streaming provider to which instances of the event specification are sent. An event specification defines the event streaming topic and the expected structure of the event content. For more information about event specifications, see Event Specifications.
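The following sketch shows one way an event producer might invoke pub.streaming:send from a Java service, as mentioned in the first list item above. The input parameter names (connectionAliasName, topic, events) and the alias, topic, and field names are illustrative assumptions rather than the documented service signature; consult the built-in services reference for your release for the exact inputs.

import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;
import com.wm.lang.ns.NSName;

public final class SendEventSketch {

    public static void sendOrderEvent(String orderId) throws Exception {
        // Build a single event document. In practice, the structure of the
        // event content would be dictated by an event specification.
        IData event = IDataFactory.create();
        IDataCursor ec = event.getCursor();
        IDataUtil.put(ec, "orderId", orderId); // hypothetical field
        ec.destroy();

        // Assemble the inputs for pub.streaming:send. The parameter names
        // below are assumptions for illustration only.
        IData input = IDataFactory.create();
        IDataCursor ic = input.getCursor();
        IDataUtil.put(ic, "connectionAliasName", "myStreamingAlias"); // hypothetical alias
        IDataUtil.put(ic, "topic", "orders");                         // hypothetical topic
        IDataUtil.put(ic, "events", new IData[] { event });
        ic.destroy();

        // Invoke the built-in service; the alias identifies the streaming
        // provider to which the events are sent.
        Service.doInvoke(NSName.create("pub.streaming", "send"), input);
    }
}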
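The relationship between consumers and partitions can also be shown with a small, provider-agnostic sketch. This is not Integration Server code; it models a simple round-robin assignment (one common strategy, assumed here for illustration) to show why consumers beyond the partition count remain inactive and why no partition is ever shared between consumers.

import java.util.ArrayList;
import java.util.List;

public final class PartitionAssignmentSketch {

    // Assigns each partition to exactly one consumer, round-robin.
    // Returns one list of partition ids per consumer; a consumer whose
    // list is empty receives no partitions and remains inactive.
    static List<List<Integer>> assign(int partitions, int consumers) {
        List<List<Integer>> assignment = new ArrayList<>();
        for (int c = 0; c < consumers; c++) {
            assignment.add(new ArrayList<>());
        }
        for (int p = 0; p < partitions; p++) {
            // Each partition goes to a single consumer, so two consumers
            // never read the same partition concurrently.
            assignment.get(p % consumers).add(p);
        }
        return assignment;
    }

    public static void main(String[] args) {
        // Three partitions and five consumers: consumers 3 and 4 get no
        // partitions, mirroring the "more consumers than partitions" case.
        List<List<Integer>> result = assign(3, 5);
        for (int c = 0; c < result.size(); c++) {
            System.out.println("consumer " + c + ": "
                    + (result.get(c).isEmpty() ? "inactive" : "partitions " + result.get(c)));
        }
    }
}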
Note:
A streaming provider might also be referred to as a broker. The Integration Server documentation uses the term streaming provider to avoid confusion with the IBM webMethods Broker product.