Configure Global Log Receiver

Objective

This guide provides instructions on how to enable sending your tenant logs from F5® Distributed Cloud Regional Edge (RE) Sites to an external log collection system. The sent logs include all system and application logs of your tenant, including logs of all Customer Edge (CE) Sites of that tenant. For conceptual information about logging, see Logs.

Each day, a folder is created in your log collection system, and within that folder, a folder is created for each hour. Logs are sent to the hourly folder every 5 minutes and are stored as compressed gzip files.
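The daily and hourly layout described above can be sketched with a short helper. The exact key format below is an illustrative assumption, not the documented layout; check the folders actually created in your log collection system.

```python
from datetime import datetime, timezone

def hourly_prefix(base: str, ts: datetime) -> str:
    """Build a hypothetical daily/hourly folder prefix for a log batch.

    The date/hour path format here is an assumption for illustration
    only; verify against your own log collection system.
    """
    return f"{base}/{ts:%Y-%m-%d}/{ts:%H}/"

# Example: a batch sent at 09:35 UTC lands in the 09 hourly folder.
ts = datetime(2024, 5, 1, 9, 35, tzinfo=timezone.utc)
print(hourly_prefix("tenant-logs", ts))  # tenant-logs/2024-05-01/09/
```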

Global log receiver supports sending logs to the following log collection systems:

  • Amazon S3
  • Generic HTTP or HTTPS server
  • Datadog
  • Splunk
  • AWS Cloudwatch
  • Kafka
  • Azure Event Hubs
  • Azure Blob Storage

Note: Currently, global log receiver supports sending only request (access) logs, security events, and audit logs for all HTTP Load Balancers and sites.

Using the instructions provided in this guide, you can configure a Global Log Receiver in the F5® Distributed Cloud Console (Console) to send logs to an external log collection system.


Prerequisites

Note: If you do not have an account, see Create an Account.

  • A publicly reachable external log collection system.

  • Add the following IP ranges to your firewall's allow-list:

    • 193.16.236.68/32
    • 185.160.8.156/32
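If you automate firewall rules, a quick sanity check that a source address falls inside these published ranges can be sketched with the Python standard library. This is a minimal illustration, not an official validation tool.

```python
import ipaddress

# The two Global Log Receiver source ranges listed in this guide.
ALLOWED_RANGES = [
    ipaddress.ip_network(cidr)
    for cidr in ("193.16.236.68/32", "185.160.8.156/32")
]

def is_allowed(source_ip: str) -> bool:
    """Return True if source_ip falls inside one of the allow-listed ranges."""
    ip = ipaddress.ip_address(source_ip)
    return any(ip in net for net in ALLOWED_RANGES)

print(is_allowed("193.16.236.68"))  # True
print(is_allowed("203.0.113.10"))   # False (TEST-NET address)
```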

Enable Global Receiver of Logs

You can configure the global log receiver in either the system namespace or the shared namespace. When configuring in the shared namespace, you can send logs from the shared namespace, from all namespaces, or from a specific list of namespaces. When configuring in the system namespace, you can send logs only from the system namespace.

The example shown in this guide creates a global log receiver object in the Console in the system namespace to send logs to the external log collection system.

Perform the following in the F5® Distributed Cloud Console:

Step 1: Start creating a global log receiver.
  • On the Console home page, select the Cloud and Edge Sites service or the Shared Configuration service.

  • For the Cloud and Edge Sites service, select Management > Log Management in the primary navigation menu. For the Shared Configuration service, select Manage > Global Log Receiver.

  • For the Cloud and Edge Sites service, select Global Log Receiver.

Figure: Navigate to Global Log Receiver Page

  • Select the Add Global Log Receiver button.

Step 2: Configure global log receiver properties.

Do the following in the Global Log Receiver section:

  • Enter a name in the metadata section. Optionally, set labels and add a description.

  • Select Request Logs or Security Events for the Log Type field. Request Logs is the default.

  • For the Cloud and Edge Sites service, select logs from current namespace for the Log Message Selection field. This is also the default option.

  • For the Shared Configuration service, you can select one of the following options:

    • Select logs from current namespace - send logs from the shared namespace.
    • Select logs from all namespaces - send logs from all namespaces.
    • Select logs in specific namespaces - send logs from specified namespaces. Enter the namespace names in the displayed namespaces list. Use the Add item button to add more than one namespace.
  • Select S3 Receiver in the Receiver Configuration box. Configure the following for the S3 receiver:

    • Enter your AWS S3 bucket name in the S3 Bucket Name field.

    • In the AWS Cloud Credentials box, select a cloud credentials object from the drop-down. Alternatively, use the Create new Cloud Credential button to create a new object. For instructions on creating cloud credentials, see Cloud Credentials.

    • In the AWS Region box, select a region from the drop-down. Ensure that you select the same region in which the S3 bucket is configured.

Figure: Global Log Receiver Configuration

Note: Similarly, you can configure receivers for other systems such as HTTP(S) server, Datadog, Splunk, Azure Event Hubs, or Azure Blob Storage.

Step 3: Optionally, configure advanced settings.

Advanced settings include configuring batch options and TLS. Using batch options, you can apply limits such as the maximum number of events, the maximum number of bytes, or a timeout for a batch of logs to be sent to the receiver.

Select Show Advanced Fields toggle and do the following in the Batch Options section:

  • Select Timeout Seconds for the Batch Timeout Options and enter a timeout value in the Timeout Seconds box.
  • Select Max Events for the Batch Max Events and enter a value between 32 and 2000 in the Max Events box.
  • Select Max Bytes for the Batch Bytes and enter a value between 4096 and 1048576 in the Batch Bytes box. Logs are sent after the batch size is equal to or greater than the specified byte size.
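The three batch limits above can be modeled with a short sketch. This is not the actual Global Log Receiver implementation, only an illustration of how the flush conditions interact; the 60-second default timeout is an assumption for the example.

```python
import time

class BatchSketch:
    """Illustrative model of the batch flush conditions described above:
    timeout seconds, max events (32-2000), and max bytes (4096-1048576).
    Not the actual Global Log Receiver implementation."""

    def __init__(self, timeout_s: float = 60.0, max_events: int = 2000,
                 max_bytes: int = 1048576):
        self.timeout_s, self.max_events, self.max_bytes = timeout_s, max_events, max_bytes
        self.events: list[bytes] = []
        self.size = 0
        self.started = time.monotonic()

    def add(self, event: bytes) -> bool:
        """Queue an event; return True if any limit says the batch should flush."""
        self.events.append(event)
        self.size += len(event)
        return (len(self.events) >= self.max_events
                or self.size >= self.max_bytes
                or time.monotonic() - self.started >= self.timeout_s)

batch = BatchSketch(max_events=3, max_bytes=4096)
print(batch.add(b'{"msg": "a"}'))  # False: below all limits
print(batch.add(b'{"msg": "b"}'))  # False
print(batch.add(b'{"msg": "c"}'))  # True: max_events reached
```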

Do the following in the TLS section:

  • Select Use TLS for the TLS field.
  • Select Server CA Certificates for the Trusted CA field. Enter the certificates in PEM or Base64 format in the Server CA Certificates box.
  • Select Enable mTLS for mTLS config and enter client certificate in PEM or Base64 format in the Client Certificate box.
    • Select Configure in the Client Private Key field and enter the secret in the box with the type selected as Text.
    • Select Blindfold, wait for the operation to complete, and click Apply.

Step 4: Complete log receiver creation.

Select Save & Exit to complete creating the global log receiver. Verify that logs are received in your S3 bucket in AWS.
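When spot-checking a downloaded batch, you can decompress it and parse the records. The sketch below assumes the logs are newline-delimited JSON, which may differ from your configured format; it builds a sample gzip blob in memory so it is self-contained (a stand-in for an object downloaded from your bucket).

```python
import gzip
import json

# Build a sample gzipped "batch" in memory (stand-in for a downloaded object).
sample = b"\n".join(
    json.dumps({"msg": f"event-{i}"}).encode() for i in range(3)
)
blob = gzip.compress(sample)

# Decompress and parse newline-delimited JSON records.
records = [
    json.loads(line)
    for line in gzip.decompress(blob).splitlines()
    if line
]
print(len(records), records[0]["msg"])  # 3 event-0
```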

Configuring Splunk Receiver

When configuring a Splunk receiver, note that, per Splunk's documentation, there are two different Splunk HEC URIs:

  • For Splunk Cloud customers:

    https://http-inputs-firehose-customer_stack.splunkcloud.com/services/collector

  • For customers running HEC on their own deployments or using the Splunk test drive instance, port 8088 must be specified:

    https://input-prd-uniqueid.cloud.splunk.com:8088/services/collector

In either scenario, you can use the following command to validate the URL:

  • For Splunk Cloud, enter: % nslookup http-inputs-<customer_stack>.splunkcloud.com

  • For the Splunk test drive instance, enter: % nslookup input-prd-uniqueid.cloud.splunk.com
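The two URI shapes can be captured in a small helper. The function name is illustrative (not part of any F5 or Splunk tooling), and the hosts below keep the placeholders (`customer_stack`, `uniqueid`) from the examples above; substitute your own values.

```python
def hec_url(host: str, splunk_cloud: bool) -> str:
    """Build a Splunk HEC collector URL (illustrative helper only).

    Splunk Cloud endpoints use the default HTTPS port; self-managed and
    test-drive deployments need port 8088 specified explicitly.
    """
    if splunk_cloud:
        return f"https://{host}/services/collector"
    return f"https://{host}:8088/services/collector"

print(hec_url("http-inputs-firehose-customer_stack.splunkcloud.com", True))
print(hec_url("input-prd-uniqueid.cloud.splunk.com", False))
# Events are then posted to this URL with an
# "Authorization: Splunk <token>" header, per Splunk's HEC documentation.
```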


Concepts