
Publisher

Overview

The Publisher plug-in enables you to publish data from an FKM outbound stream to an index on any of the following:

  • Elasticsearch host

  • Splunk server

The Publisher plug-in supports the following versions:

Destination        Version(s)
Elasticsearch      6.2.4 to 7.4.1
Splunk server      7.3.1

Intended audience

This guide is directed towards Geneos users who want to publish data from a configured FKM outbound stream to any of the following:

  • Elasticsearch server

  • Splunk server

As a user, you should be familiar with the use and capability of the FKM plug-in, the Elasticsearch API, and Splunk.

Prerequisites

Java requirements

Caution: The Java installation and environment configuration is a common source of errors for users setting up Java-based components and plug-ins. It is recommended to read Configure the Java environment to help you understand your Java installation.

Elasticsearch credentials

If you are looking to publish data to Elasticsearch, then you need the following:

  • Elasticsearch server host name or IP address
  • Elasticsearch server port
  • Elasticsearch server credentials, if applicable

You must also be familiar with the Elasticsearch API, as well as how it is implemented in your organisation.
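
Before configuring the sampler, it can help to confirm that the host, port, and credentials you have gathered actually reach the cluster. The following Python sketch is a minimal reachability check, assuming the requests library is available; the host, port, and credentials shown are placeholders.

    # Minimal reachability check for the Elasticsearch prerequisites above.
    # The host, port, and credentials are placeholders for your own values.
    import requests

    response = requests.get(
        "https://es-host.example.com:9200/",
        auth=("elastic-user", "elastic-password"),  # omit if no credentials apply
        timeout=10,
    )
    response.raise_for_status()
    print(response.json()["version"]["number"])     # for example, 7.4.1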

Splunk credentials

If you are looking to publish data to Splunk, then you need the following:

  • Splunk server host name or IP address
  • Splunk HEC port
  • Splunk HEC token

FKM outbound stream

A Publisher sampler receives messages from the FKM outbound stream. You must have a configured FKM sampler ready to use for the Publisher sampler.

Setup and configuration

Setup involves the following tasks:

  1. Create the Publisher sampler.
  2. Associate the Publisher sampler with a managed entity.
  3. Publish an outbound stream from the FKM sampler.

Note: If you are using this plugin with Gateway Hub, you must create a user defined data schema. For instructions, see Create a data schema.

Create the Publisher sampler

  1. In the Gateway Setup Editor, create a new sampler by right-clicking the Samplers folder and selecting New Sampler.
  2. Enter a name for this sampler in the Name field.
  3. In the Plugin field, click the drop-down list and select publisher.
  4. In the Destination field, click the drop-down list and select one of the following:
    • collectionAgent if you want to publish to the Collection Agent. To configure, follow the steps in Publisher.
    • elasticsearch if you want to publish to an Elasticsearch server. To configure, follow the steps in Configure the Elasticsearch destination.
  5. Click Save current document to apply your changes.

Success: The sampler can now be associated with a managed entity.

Configure the Elasticsearch destination

If you are publishing to Elasticsearch, configure the plugin as follows:

  1. In the Host field, enter the Elasticsearch server host name or IP address.
  2. In the Port field, enter the port number.

    Note: You can toggle between data and var for the Host and Port fields. This option allows you to define either a text or numerical value (data) or a variable (var) for these fields.

  3. In the Index field, enter the Elasticsearch index where you want to add the JSON document.
  4. If you want to change the _type endpoint, specify the endpoint in the Endpoint field.
  5. If you want to use an HTTPS connection between the Publisher sampler and the Elasticsearch host, select Https under Protocols.
  6. If authentication is needed to access the Elasticsearch host, click Authentication > Type and select Basic.
    • In the Username field, enter the Elasticsearch host username.
    • In the Password field, enter the Elasticsearch host password. You can toggle between setting and encrypting a password (stdAES) or defining a variable (var).
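
For orientation, each message the sampler publishes corresponds to an Index API request built from the Host, Port, Index, Endpoint, Protocols, and Authentication settings above. The Python sketch below is illustrative only, with placeholder values; it is not the plug-in's implementation.

    # Illustrative only: the kind of Index API request that the settings above
    # describe. All values are placeholders, not the plug-in's implementation.
    import requests

    ES_URL = "https://es-host.example.com:9200"   # Protocols, Host, Port
    INDEX = "geneos-fkm"                          # Index
    ENDPOINT = "_doc"                             # Endpoint (default _doc)

    document = {"stream": "ME.Publisher", "message": "ERROR Connection refused"}

    response = requests.post(
        f"{ES_URL}/{INDEX}/{ENDPOINT}",
        json=document,
        auth=("elastic-user", "elastic-password"),  # Basic authentication, if configured
        timeout=10,
    )
    response.raise_for_status()
    print(response.json()["result"])  # "created" when the document is indexed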

Configure the Splunk destination

If you are publishing to Splunk, configure the plugin as follows:

  1. In the Host field, enter the Splunk server host name or IP address.
  2. In the Port field, enter the port number.

    Note: You can toggle between data and var for the Host and Port fields. This option allows you to define either a text or numerical value (data) or a variable (var) for these fields.

  3. If you want to change the index, specify the index where you want to add the JSON document in the Index field.
  4. If you want to change the batch size, specify the value in the Batch size field.
  5. In the Token field, enter the HEC token or a variable.
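
For orientation, publishing through the Splunk HTTP Event Collector amounts to a POST to the HEC event endpoint with the token sent in an Authorization header. The sketch below is illustrative only; the host, token, and index are placeholders.

    # Illustrative only: sending one event to the Splunk HTTP Event Collector.
    # The host, token, and index are placeholders.
    import requests

    HEC_URL = "https://splunk-host.example.com:8088/services/collector/event"
    HEC_TOKEN = "00000000-0000-0000-0000-000000000000"   # Token

    payload = {
        "event": {"stream": "ME.Publisher", "message": "ERROR Connection refused"},
        "index": "geneos",   # optional; omit to let the HEC default apply
    }

    response = requests.post(
        HEC_URL,
        json=payload,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()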

Associate the sampler with a managed entity

  1. In the Gateway Setup Editor, create a new managed entity by right-clicking the Managed entities folder and selecting New Managed entity.
  2. Enter a name for this managed entity. For example, enter "publisher-me" in the Name field.
  3. In the Options field, select the probe on which you want the sampler to run.
  4. Under the Sampler field, click Add new.
  5. In the text field under Ref, select the sampler you just created from the drop-down list.
  6. Click Save current document to apply your changes.

Success: The Publisher Admin dataview now appears under the managed entity in the Active Console state tree.

Publish an outbound stream from the FKM sampler

  1. In the Gateway Setup Editor, locate and select the FKM sampler you wish to publish an outbound stream from.
  2. In Files, click inside the Outbound stream name field for the source you want to publish from.
  3. In the Outbound stream name field, specify the Publisher sampler you have just created. The format must follow a fully qualified stream name:

    managedEntity-name.publisher-sampler(type)

    For example:

    ME.Publisher

    Note: The managed entity part of the format can be omitted if the sampler falls under the same managed entity as the FKM sampler.

  4. Click Save current document to apply your changes.

Success: The Publisher sampler now receives outbound stream messages coming from the configured FKM sampler.

Elasticsearch Admin View

The Publisher sampler automatically creates the Admin view to monitor the status of its streams, if there are any.

Headline legend

Name Description
protocol Connection protocol used. For example, HTTP or HTTPS.
host Elasticsearch server host name or IP address that the Publisher sampler is connected to.
port Elasticsearch server port that the Publisher sampler is connected to.
index

Elasticsearch index where the stream data is published.

This field conforms to the Elasticsearch REST API. For more information, see the Index API page of the Elasticsearch Reference.

endpoint

Elasticsearch _type endpoint where the stream data is published. By default, this is the document type, _doc.

This field conforms to the Elasticsearch REST API. For more information, see the Index API page of the Elasticsearch Reference.

   

Table legend

Name Description
name Name of the FKM outbound stream tied to the Publisher sampler.
bufferSize

Number of messages that the sampler holds in the stream.

The sampler holds these messages until they are consumed by another sampler.

pending Number of messages waiting to be consumed by the Publisher sampler from the native stream.
sending Number of messages waiting to be received by Elasticsearch from the Publisher sampler.
success

Number of messages successfully published to Elasticsearch.

failed Number of messages that failed to be published to Elasticsearch. This can be due to an issue with the schema, or the connection dropping between the Publisher sampler and the Elasticsearch host.
lost

Total number of messages that did not reach the Publisher sampler. This can be due to the buffer filling up too quickly.

Lost messages indicate that you may need to increase the Buffer size or throttle the FKM sampler.

   

Note: Stream messages are stored in the buffer until they are consumed by another component. However, if there are no samplers or clients consuming the stream, then the stream registry purges the messages immediately.
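
As a mental model for the bufferSize, pending, and lost figures above, think of the stream buffer as a bounded queue: a message that arrives while the buffer is full is counted as lost rather than queued. The sketch below is purely conceptual, uses hypothetical names, and is not the sampler's implementation.

    # Conceptual sketch of a bounded stream buffer producing "lost" messages.
    # Hypothetical names; not the sampler's implementation.
    from collections import deque

    BUFFER_SIZE = 1000              # Buffer size setting (default 1000)
    buffer, lost = deque(), 0

    def on_stream_message(message):
        """Called for each message arriving on the FKM outbound stream."""
        global lost
        if len(buffer) < BUFFER_SIZE:
            buffer.append(message)  # held (pending) until it is consumed
        else:
            lost += 1               # buffer filled faster than it drained

    def consume_one():
        """Called when the destination is ready for the next message."""
        return buffer.popleft() if buffer else None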

Basic configuration

A Publisher sampler receives its stream from a corresponding FKM sampler. If you wish to assign an outbound stream to a Publisher sampler, see File Keyword Monitor configuration.

Note: You can safely update the configuration of this plug-in without causing the Netprobe to restart.

Note: If you are using this plugin with Gateway Hub, you must create a user defined data schema. For instructions, see Create a data schema.

Configuration option Description
Host

Elasticsearch server host name or IP address.

You can toggle between entering a text or numerical value (data) or a variable (var).

Mandatory: Yes

Port

Elasticsearch server port.

You can toggle between entering a numerical value (data) or a variable (var).

Mandatory: No

Default: 9200

Index

Index where you want to add the JSON document.

This field conforms to the Elasticsearch REST API. For more information, see the Index API page of the Elasticsearch Reference.

Mandatory: Yes

Buffer size

Sets the maximum number of messages that the Publisher sampler holds in memory at a time.

Messages clear the buffer when the stream is received by the Elasticsearch server.

Mandatory: No

Default: 1000

Endpoint

Elasticsearch _type endpoint where you want to publish the stream data. By default, this is the document type, _doc.

This option conforms to the Elasticsearch REST API. For more information, see the Index API page of the Elasticsearch Reference.

Mandatory: No

Protocol

Connection protocol to use. By default, this is HTTP.

Use HTTPS if you want to set a secure connection.

Mandatory: No

Authentication

Authentication method to use.

The Publisher plug-in supports the following authentication types:

  • None — requires no authentication. This is the default setting.
  • Basic — requires basic authentication. If you choose this type, then you must provide a username and password.
  • Bearer — requires bearer authentication using Elasticsearch token API. If you choose this type, then you must provide the requisite fields. For more information, see Bearer authentication.

Mandatory: No

   

Advanced configuration

Field Description
Create admin view

Enables or disables the Elasticsearch Admin View on the managed entity. The Admin view is enabled by default.

You can toggle between a checkbox (data) or a variable (var).

Default: Enabled

   

Bearer authentication

The bearer authentication option enables you to connect to an Elasticsearch server via token API, without needing basic authentication.

The Publisher sampler supports the bearer authentication password grant type, as defined in the Elasticsearch API. For detailed information, see the Get token API page of the Elasticsearch Reference.

password

This grant type implements the OAuth 2.0 resource owner password credentials grant. A trusted user (the grantor) can retrieve a token either for their own use or on behalf of an end-user (the grantee).

Publisher plug-in bearer authentication password grant type

Field Description
Username

For the Grantor, this is the username of the trusted user to retrieve an access token. This field is required.

For the Grantee, this is the username of the end-user to access the Elasticsearch server. This is an optional field.

You can toggle between entering a text or numerical value (data) or a variable (var).

Password

For the Grantor, this is the password of the trusted user to retrieve an access token. This field is required.

For the Grantee, this is the password of the end-user to access the Elasticsearch server. This is an optional field.

Choose the appropriate field when specifying the password:

  • stdAES — use this to input your plaintext password. If you select stdAES, you can define your password directly in the sampler and store it in standard AES encryption hash in the Gateway.
  • var — use this to pass the password as a variable. The variable is defined in Managed entity > Advanced > Var. This is useful for situations where you have multiple samplers that use the same credentials.
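
As a point of reference, the password grant maps onto the Elasticsearch get token API along the lines of the sketch below. Treat it as a hedged illustration: the host and credentials are placeholders, and how the request must be authenticated depends on your cluster's security configuration.

    # Illustrative only: retrieving a bearer token with the password grant type
    # via the Elasticsearch get token API. Host and credentials are placeholders.
    import requests

    ES_URL = "https://es-host.example.com:9200"

    response = requests.post(
        f"{ES_URL}/_security/oauth2/token",
        json={
            "grant_type": "password",
            "username": "end-user",          # grantee (or the grantor's own username)
            "password": "end-user-password",
        },
        auth=("trusted-user", "trusted-password"),  # grantor authenticates the request
        timeout=10,
    )
    response.raise_for_status()
    token = response.json()["access_token"]

    # The token is then presented as: Authorization: Bearer <token>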
   

Splunk Admin View

The Publisher sampler automatically creates the Admin view to monitor the status of its streams, if there are any.

Headline legend

Name Description
protocol Connection protocol used. For example, HTTP or HTTPS.
host Splunk server host name or IP address that the Publisher sampler is connected to.
port Splunk HEC port that the Publisher sampler is connected to.
index

Splunk index where the stream data is published.

Table legend

Name Description
name Name of the FKM outbound stream tied to the Publisher sampler.
bufferSize

Number of messages that the sampler holds in the stream.

The sampler holds these messages until they are consumed by another sampler.

pending

Number of messages waiting to be consumed by the Publisher sampler from the native stream.

Note:  Incoming messages are counted as pending until the Batch size is reached.

sending Number of messages waiting to be received by Splunk from the Publisher sampler.
success

Number of messages successfully published to Splunk.

failed Number of messages that failed to be published to Splunk. This can be due to an issue with the schema, or the connection dropping between the Publisher sampler and the Collection Agent host.
lost

Total number of messages that did not reach the Publisher sampler. This can be due to the buffer filling up too quickly.

Lost messages indicate that you may need to increase the Buffer size or throttle the FKM sampler.

   

Note: Stream messages are stored in the buffer until they are consumed by another component. However, if there are no samplers or clients consuming the stream, then the stream registry purges the messages immediately.

Basic configuration

A Publisher sampler receives its stream from a corresponding FKM sampler. If you wish to assign an outbound stream to a Publisher sampler, see File Keyword Monitor configuration.

Note: You can safely update the configuration of this plug-in without causing the Netprobe to restart.

Note: If you are using this plugin with Gateway Hub, you must create a user defined data schema. For instructions, see Create a data schema.

Field Description
Host

Splunk server host name or IP address.

You can toggle between entering a text or numerical value (data) or a variable (var).

Mandatory: Yes

Port

Splunk HEC port.

You can toggle between entering a numerical value (data) or a variable (var).

Mandatory: No

Default: 8088

Index

Index where you want to add the JSON document.

If set, the index must exist in the Splunk server. The Publisher plugin cannot verify if the index exists or not.

If not set, the index is determined by the Splunk HEC.

Mandatory: No

Buffer size

Sets the maximum number of messages that the Publisher sampler holds in memory at a time.

Messages clear the buffer when the stream is received by the Collection Agent.

Mandatory: No

Default: 1000

Protocol

Connection protocol to use. Use HTTPS if you want to set a secure connection.

Mandatory: No

Default: Https

Token

Authentication token used to access the Splunk server.

Choose the appropriate field when specifying the token:

  • stdAES — use this to input your plaintext token. If you select stdAES, you can define your token directly in the sampler and store it in standard AES encryption hash in the Gateway.
  • var — use this to pass the token as a variable. The variable is defined in Managed entity > Advanced > Var. This is useful for situations where you have multiple samplers that use the same credentials.

Mandatory: Yes

Batch size

Number of messages to be published at a time.

Mandatory: No

Default: 5

Timeout

Sets the waiting time for a batch of messages defined by the Batch size to be collected. The timeout is reset when this batch of messages is received and published.

If the timeout expires, pending messages are published and the timeout is reset.

Mandatory: No

Default: 60

Unit: seconds
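
The interplay between Batch size and Timeout can be pictured as a collect-then-flush loop: messages accumulate until either the batch is full or the timeout elapses, and every flush restarts the timer. The sketch below is a conceptual illustration under those assumptions, not the plug-in's implementation.

    # Conceptual sketch of the Batch size / Timeout behaviour described above.
    # Not the plug-in's implementation.
    import queue
    import time

    BATCH_SIZE = 5        # Batch size (default 5)
    TIMEOUT = 60          # Timeout in seconds (default 60)

    def publish_loop(messages, publish):
        """Drain a queue.Queue of messages, publishing in batches."""
        batch, deadline = [], time.monotonic() + TIMEOUT
        while True:
            remaining = max(deadline - time.monotonic(), 0)
            try:
                batch.append(messages.get(timeout=remaining))
            except queue.Empty:
                pass                           # timeout expired with a partial batch
            if len(batch) >= BATCH_SIZE or time.monotonic() >= deadline:
                if batch:
                    publish(batch)             # pending messages are published
                batch, deadline = [], time.monotonic() + TIMEOUT   # timer resets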

   

Advanced configuration

Field Description
Create admin view

Enables or disables the Splunk Admin View on the managed entity.

Default: Enabled