Gateway Hub

Publishing

Overview

The Publishing page allows you to configure publishing of normalised metric and event data to a downstream Kafka instance. Only one downstream Kafka instance is supported by a Gateway Hub cluster.

For more information on how to configure publishing to an external Kafka instance, see Kafka publishing message formats.
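The published topics can be consumed with any standard Kafka client. The following is a minimal consumer sketch, assuming the default itrs- topic prefix and string-serialised payloads; the host name and consumer group shown are illustrative, not part of the product.

```java
import java.time.Duration;
import java.util.Properties;
import java.util.regex.Pattern;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class HubDataConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Illustrative connection details for the downstream Kafka instance.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka1.example.com:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "hub-downstream");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Subscribe to every topic published under the configured prefix.
            consumer.subscribe(Pattern.compile("itrs-.*"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s: %s%n", record.topic(), record.value());
                }
            }
        }
    }
}
```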

Status

If you configure Kafka publishing, an information bar at the top of this configuration screen shows the current publishing status. Important information, such as a failure to connect to the configured Kafka instance, is displayed here.

Note: Retrieving the status from a remote Kafka instance can take 30 seconds or more, depending on factors such as connection speed, network distance, and processing capability.

The possible statuses are as follows:

  • Configuration error
  • Waiting
  • Success

Kafka Publisher Configuration

This section is where you enter the details of your downstream Kafka instance and any additional settings.

Enable or disable publishing using the toggle to the right of Publishing.

  • Bootstrap servers: Host:port values that the Gateway Hub uses to establish a connection to Kafka. Add more rows using the add button.
  • Topic Prefix: Optional prefix that allows you to avoid collisions with existing Kafka topic names. The default is itrs-.
  • Producer Configuration Name: Name of an additional Kafka setting. You can use any setting (other than callbacks) defined in the Apache Kafka documentation. Add more rows as needed using the add button.
  • Producer Configuration Value: Value associated with the setting defined in Producer Configuration Name.
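As an illustration only, these fields correspond to standard Apache Kafka producer settings along the lines of the sketch below; the host names and the particular settings chosen are examples, not recommendations or Gateway Hub defaults.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;

public class ProducerSettingsExample {
    public static Properties settings() {
        Properties props = new Properties();
        // Bootstrap servers: one host:port entry per row added in the UI.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,
                  "kafka1.example.com:9092,kafka2.example.com:9092");
        // Producer Configuration Name / Value pairs: any documented Kafka
        // producer setting other than callbacks, for example compression
        // and batching tuning.
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");
        props.put(ProducerConfig.LINGER_MS_CONFIG, "100");
        return props;
    }
}
```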

Click Request Snapshot to request a snapshot of all metric data.

Click Request Schema to request all schemas.

Security protocol

This section is where you select and configure the security protocol. The options are:

  • PLAINTEXT
  • SASL_SSL

If you select SASL_SSL, then the following configuration options appear:

  • CA certificate: Use Upload File to select the certificate used to sign the Kafka broker's public keys.
  • Kerberos principal: Principal name used for Kerberos.
  • Kerberos keytab: The keytab file encoding the password for the Kerberos principal.
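For reference, these fields broadly correspond to the standard Kafka client SASL_SSL and Kerberos (GSSAPI) settings sketched below; the file paths, principal, and service name are assumptions for illustration, and Gateway Hub derives its own equivalent configuration from the uploaded files.

```java
import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.config.SslConfigs;

public class SaslSslSettingsExample {
    public static Properties settings() {
        Properties props = new Properties();
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        // Trust store containing the CA certificate that signed the brokers' keys.
        props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/etc/kafka/truststore.jks");
        props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "changeit");
        // Kerberos (GSSAPI) authentication using the principal and keytab.
        props.put(SaslConfigs.SASL_MECHANISM, "GSSAPI");
        props.put(SaslConfigs.SASL_KERBEROS_SERVICE_NAME, "kafka");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                  "com.sun.security.auth.module.Krb5LoginModule required "
                + "useKeyTab=true storeKey=true "
                + "keyTab=\"/etc/security/keytabs/hub.keytab\" "
                + "principal=\"hub@EXAMPLE.COM\";");
        return props;
    }
}
```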

Filtering

Filtering allows you to publish a smaller subset of your data. This can considerably reduce the storage and processing requirements of a downstream application. You can use one or more filter predicates to filter the data by message type and entity query. A record is published if it meets any of the include conditions. However, a record is not published if it meets any of the exclude conditions, even if it also meets an include condition.

For example, given the configuration below, all metrics and events are published for Entities where Application = Fidessa, except where Department = 'Fixed Income'.

Include/Exclude   Message type   Entities
Include           Events         Application = Fidessa
Include           Metrics        Application = Fidessa
Exclude           All            Department = 'Fixed Income'
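The include/exclude semantics amount to a simple rule: a record is published when it matches at least one include predicate and no exclude predicate. The following is a minimal, self-contained sketch of that logic using the example above; the DataRecord type and attribute handling are illustrative, not the Gateway Hub implementation.

```java
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

public class FilterExample {
    // Illustrative record shape: a message type plus entity attributes.
    record DataRecord(String messageType, Map<String, String> entityAttributes) {}

    static boolean shouldPublish(DataRecord r,
                                 List<Predicate<DataRecord>> includes,
                                 List<Predicate<DataRecord>> excludes) {
        if (excludes.stream().anyMatch(p -> p.test(r))) {
            return false; // an exclude match always wins
        }
        return includes.stream().anyMatch(p -> p.test(r));
    }

    public static void main(String[] args) {
        // Include Events and Metrics where Application = Fidessa.
        List<Predicate<DataRecord>> includes = List.of(
            r -> r.messageType().equals("Events")
                 && "Fidessa".equals(r.entityAttributes().get("Application")),
            r -> r.messageType().equals("Metrics")
                 && "Fidessa".equals(r.entityAttributes().get("Application")));
        // Exclude all message types where Department = 'Fixed Income'.
        List<Predicate<DataRecord>> excludes = List.of(
            r -> "Fixed Income".equals(r.entityAttributes().get("Department")));

        DataRecord equitiesMetric = new DataRecord("Metrics",
            Map.of("Application", "Fidessa", "Department", "Equities"));
        DataRecord fixedIncomeMetric = new DataRecord("Metrics",
            Map.of("Application", "Fidessa", "Department", "Fixed Income"));

        System.out.println(shouldPublish(equitiesMetric, includes, excludes));    // true
        System.out.println(shouldPublish(fixedIncomeMetric, includes, excludes)); // false
    }
}
```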

You can specify filters on entities using the basic or advanced menus:

  • The basic menu allows you to select entities by specifying an attribute and a value.
  • The advanced menu allows you to input a filter manually.

Note: Events and metrics data are treated independently; some Gateway Hub components use only events data.