This guide provides instructions on how to enable sending of your tenant logs from F5® Distributed Cloud Regional Edge (RE) Sites to an external log collection system. The sent logs include all system and application logs of your tenant. This also includes logs of all Customer Edge (CE) Sites of that tenant. For conceptual information about logging, see Logs.
Each day, a folder is created in your log collection system, and within that daily folder, a folder is created for each hour. Logs are sent to the hourly folder every 5 minutes and are stored in compressed gzip files.
Global log receiver supports sending logs to the following log collection systems:
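Because each delivered batch is a gzip file, you can inspect its contents with a few lines of Python. The sketch below assumes the records inside a batch are newline-delimited JSON, which you should confirm against a real batch from your own receiver:

```python
import gzip
import io
import json

def read_log_batch(gz_bytes: bytes) -> list:
    """Decompress one gzip batch and parse newline-delimited JSON records.

    The NDJSON record format is an assumption; inspect a real batch from
    your receiver to confirm how records are delimited.
    """
    with gzip.open(io.BytesIO(gz_bytes), mode="rt", encoding="utf-8") as fh:
        return [json.loads(line) for line in fh if line.strip()]

# Demo with a synthetic batch standing in for a downloaded object:
raw = "\n".join(json.dumps({"msg": f"event {i}"}) for i in range(3)).encode()
batch = gzip.compress(raw)
records = read_log_batch(batch)
print(len(records))  # 3
```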
AWS Cloudwatch
AWS S3
Azure Blob Storage
Azure Event Hubs
Datadog
GCP Bucket
Generic HTTP or HTTPS server
IBM QRadar
Kafka
NewRelic
Splunk
SumoLogic
Note: Currently, global log receiver supports sending only request (access) logs, security events, and audit logs for all HTTP Load Balancers and sites.
Using the instructions provided in this guide, you can configure a Global Log Receiver in the F5® Distributed Cloud Console (Console) to enable sending of logs to an external log collection system.
A publicly reachable external log collection system.
Add the following IP ranges to your firewall's allow-list:
193.16.236.68/32
185.160.8.156/32
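If you automate firewall or security-group rules, a quick local sanity check against the published ranges can catch mistakes before deployment. A minimal sketch using only the standard library:

```python
import ipaddress

# Source ranges published in this guide that must be allow-listed.
ALLOWED_RANGES = [
    ipaddress.ip_network("193.16.236.68/32"),
    ipaddress.ip_network("185.160.8.156/32"),
]

def is_allowed(source_ip: str) -> bool:
    """Return True if source_ip falls inside one of the published ranges."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in ALLOWED_RANGES)

print(is_allowed("193.16.236.68"))  # True
print(is_allowed("10.0.0.1"))       # False
```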
Enable Global Receiver of Logs
You can configure the global log receiver in either the system namespace or the shared namespace. When configuring in the shared namespace, you can send logs from the shared namespace only, from all namespaces, or from a specific list of namespaces. When configuring in the system namespace, you can send logs only from the system namespace.
The example shown in this guide creates a global log receiver object in the Console in system namespace for sending the logs to the external log collection system.
Perform the following in the F5® Distributed Cloud Console:
Step 1: Start creating a global log receiver.
In the Console home page, select the Multi-Cloud Network Connect service or the Shared Configuration service.
Shared Configuration service: Select Manage > Global Log Receiver.
Figure: Navigate to Global Log Receiver Page
Select the Add Global Log Receiver button.
Step 2: Configure global log receiver properties.
Enter a name in the metadata section. Optionally, set labels and add a description.
Select Request Logs, Security Events, or Audit Logs for the Log Type field. Request Logs is the default.
For the Multi-Cloud Network Connect service, select logs from current namespace for the Log Message Selection field. This is also the default option.
For the Shared Configuration service, you can select one of the following options:
Select logs from current namespace - send logs from the shared namespace.
Select logs from all namespaces - send logs from all namespaces.
Select logs in specific namespaces - send logs from specified namespaces. Enter the namespace names in the displayed namespaces list. Use the Add item button to add more than one namespace.
Select a receiver in the Receiver Configuration drop-down menu and follow the corresponding instructions below.
AWS Cloudwatch Receiver
Figure: AWS Cloudwatch Global Log Receiver Configuration
Enter the group name for the target CloudWatch logs stream in the Group Name field.
Enter the stream name for the target CloudWatch logs stream in the Stream Name field.
In the AWS Cloud Credentials box, select a cloud credentials object from the drop-down. Alternatively, use the Create new Cloud Credential button to create a new object. For instructions on creating cloud credentials, see Cloud Credentials.
Note: For an AWS cloud credential, only the option AWS Programmatic Access Credentials is supported.
Select a region from the AWS Region drop-down menu. Ensure that you select the same region in which the CloudWatch log group is configured.
AWS S3 Receiver
Figure: AWS S3 Global Log Receiver Configuration
Enter your AWS S3 bucket name in the S3 Bucket Name field.
In the AWS Cloud Credentials box, select a cloud credentials object from the drop-down. Alternatively, use the Create new Cloud Credential button to create a new object. For instructions on creating cloud credentials, see Cloud Credentials.
Note: For an AWS cloud credential, only the option AWS Programmatic Access Credentials is supported.
In the AWS Region box, select a region from the drop-down. Ensure that you select the same region in which the S3 storage is configured.
Azure Blob Storage
Figure: Azure Blob Global Log Receiver Configuration
Click Configure to set up the Azure Blob storage connection string.
Figure: Azure Blob Connection String
Select Blindfolded Secret from the Secret Type drop-down menu.
Use the Action drop-down menu to select:
Use Blindfolded Secret: Enter your connection string into the Blindfolded Secret field.
Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your connection string into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your connection string.
Where to get the Azure Blob connection string:
Go to All Services > Storage accounts > (name of account).
Select Access Keys on the navigation menu for the storage account you have selected.
In the Access Keys pane, choose one of the access keys and click Show for the Connection String. Click Copy to Clipboard on the left side of the shown field.
Figure: Azure Blob Connection String Location
In the Container Name field, enter the name of the Azure container into which the logs should be sent.
Azure Event Hubs
Figure: Azure Event Hubs Global Log Receiver Configuration
Click Configure to set up the Azure Event Hubs connection string.
Figure: Azure Event Hubs Connection String
Select Blindfolded Secret from the Secret Type drop-down menu.
Use the Action drop-down menu to select:
Use Blindfolded Secret: Enter your connection string into the Blindfolded Secret field.
Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your connection string into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your connection string.
Note: Your connection string must include EntityPath=<EventHubName>.
In the Container Name field, enter the name of the Azure container into which the logs should be stored.
Datadog Receiver
Figure: Datadog Global Log Receiver Configuration
Enter the URL for Datadog into the Datadog Site field (for example, datadoghq.com).
Click Configure to set up the Datadog API Key.
Figure: Datadog API Key
Select Blindfolded Secret from the Secret Type drop-down menu.
Use the Action drop-down menu to select:
Use Blindfolded Secret: Enter your Datadog API Key into the Blindfolded Secret field.
Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your Datadog API Key into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your Datadog API Key.
GCP Bucket Receiver
Figure: GCP Global Log Receiver Configuration
In the GCP Bucket Name field, enter the name of the bucket into which the logs should be sent.
Use the GCP Cloud Credentials drop-down menu to select an existing set of credentials. Alternatively select Add Item to create new credentials. For help creating new credentials, see Cloud Credentials.
HTTP Receiver
Figure: HTTP Global Log Receiver Configuration
Enter the URI for your HTTP receiver in the HTTP Uri field.
Choose an authentication type from the Authentication drop-down menu.
None: No authentication is performed.
Basic Authentication: Enter your user name in the User Name field and click Configure to enter your password.
Token Authentication: Click Configure to enter your authentication token.
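To see what a generic HTTP receiver has to handle, the sketch below stands up a minimal local endpoint that checks a token and accepts a POSTed batch. The endpoint path, the `Bearer` header scheme, and the gzip `Content-Encoding` handling are assumptions for illustration, not the service's documented wire format:

```python
import gzip
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical shared token; use the same value you enter under
# Token Authentication in the Console (header scheme is an assumption).
EXPECTED_TOKEN = "example-token"

class LogHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Reject requests that do not carry the expected bearer token.
        if self.headers.get("Authorization") != f"Bearer {EXPECTED_TOKEN}":
            self.send_response(401)
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # Batches may arrive gzip-compressed; decompress if flagged.
        if self.headers.get("Content-Encoding") == "gzip":
            body = gzip.decompress(body)
        self.send_response(200)
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # silence per-request logging in this sketch

# Demo: serve on a free local port and POST one batch to it.
server = HTTPServer(("127.0.0.1", 0), LogHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
req = urllib.request.Request(
    f"http://127.0.0.1:{port}/",
    data=b'{"msg": "test event"}',
    headers={"Authorization": f"Bearer {EXPECTED_TOKEN}"},
)
status = urllib.request.urlopen(req).status
server.shutdown()
print(status)  # 200
```

A real deployment would sit behind TLS on a publicly reachable host, matching the HTTP Uri you configure above.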
IBM QRadar Receiver
Figure: QRadar Global Log Receiver Configuration
Note: In some cases, multiple events are sent to the global log receiver at once. The QRadar receiver must be configured to split these events into multiple records. To do this, configure the Message Pattern field as follows:
Figure: QRadar Message Pattern Configuration
Kafka Receiver
Figure: Kafka Global Log Receiver Configuration
Enter a Kafka bootstrap server in the form of host:port into the Kafka Bootstrap Server List. Use the Add Item button to add additional pairs.
Enter the Kafka topic name for the reported events.
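Each bootstrap entry must be a `host:port` pair. A small local validator (an illustrative helper, not part of the product) can catch malformed entries before you paste them into the Console:

```python
def parse_bootstrap_server(entry: str) -> tuple:
    """Validate one host:port entry for the Kafka Bootstrap Server List.

    This is a local sanity check of the format only; it does not
    contact the broker.
    """
    host, sep, port = entry.rpartition(":")
    if not sep or not host:
        raise ValueError(f"expected host:port, got {entry!r}")
    port_num = int(port)  # raises ValueError for non-numeric ports
    if not 1 <= port_num <= 65535:
        raise ValueError(f"port out of range in {entry!r}")
    return host, port_num

print(parse_bootstrap_server("broker-1.example.com:9092"))
```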
NewRelic Receiver
Select the endpoint that is applicable to your NewRelic account in the NewRelic Account Endpoint drop-down menu.
Click Configure to set up the NewRelic license key.
Figure: NewRelic License Key
Select Blindfolded Secret from the Secret Type drop-down menu.
Use the Action drop-down menu to select:
Use Blindfolded Secret: Enter your NewRelic license key into the Blindfolded Secret field.
Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your NewRelic license key into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your NewRelic license key.
Splunk Receiver
Select Blindfolded Secret from the Secret Type drop-down menu.
Use the Action drop-down menu to select:
Use Blindfolded Secret: Enter your Splunk HEC token into the Blindfolded Secret field.
Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your Splunk HEC token into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your Splunk HEC token.
Splunk Configuration Details
According to Splunk documentation, there are two different Splunk HEC URI formats:
SumoLogic Receiver
Click Configure to set up the SumoLogic Source Address URL.
Figure: SumoLogic Source Address URL
Select Blindfolded Secret from the Secret Type drop-down menu.
Use the Action drop-down menu to select:
Use Blindfolded Secret: Enter your SumoLogic HTTP collector URL into the Blindfolded Secret field.
Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your SumoLogic HTTP collector URL into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your SumoLogic HTTP collector URL.
Step 3: Optionally, configure advanced settings.
Advanced settings include configuring batch options and TLS. Using batch options, you can apply limits such as a maximum number of messages, a maximum byte size, or a timeout for a batch of logs sent to the receiver.
Select the Show Advanced Fields toggle and do the following in the Batch Options section:
Select Timeout Seconds for the Batch Timeout Options and enter a timeout value in the Timeout Seconds box.
Select Max Events for the Batch Max Events and enter a value between 32 and 2000 in the Max Events box.
Select Max Bytes for the Batch Bytes and enter a value between 4096 and 1048576 in the Batch Bytes box. Logs are sent after the batch size is equal to or greater than the specified byte size.
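The three batch options interact: a batch is flushed when whichever limit is hit first. The sketch below is an illustrative model of that flush rule under the documented defaults and ranges, not the service's actual implementation:

```python
import time

class BatchBuffer:
    """Illustrative model of the batch options above.

    A batch flushes when the event count reaches max_events, the byte
    size reaches max_bytes, or timeout_seconds have elapsed since the
    first buffered event, whichever comes first.
    """

    def __init__(self, timeout_seconds=60, max_events=2000, max_bytes=1048576):
        self.timeout_seconds = timeout_seconds
        self.max_events = max_events
        self.max_bytes = max_bytes
        self.events = []
        self.size = 0
        self.first_at = None  # monotonic time of first buffered event

    def add(self, event: bytes):
        if self.first_at is None:
            self.first_at = time.monotonic()
        self.events.append(event)
        self.size += len(event)

    def should_flush(self, now=None) -> bool:
        if not self.events:
            return False
        now = time.monotonic() if now is None else now
        return (
            len(self.events) >= self.max_events
            or self.size >= self.max_bytes
            or now - self.first_at >= self.timeout_seconds
        )

buf = BatchBuffer(max_events=3)
for e in (b"a", b"b", b"c"):
    buf.add(e)
print(buf.should_flush())  # True: event count reached max_events
```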
Do the following in the TLS section:
Select Use TLS for the TLS field.
Select Server CA Certificates for the Trusted CA field. Enter the certificates in PEM or Base64 format in the Server CA Certificates box.
Select Enable mTLS for mTLS config and enter client certificate in PEM or Base64 format in the Client Certificate box.
Select Configure in the Client Private Key field, and enter the secret in the box with the type set to Text.
Select Blindfold, wait for the operation to complete, and click Apply.
Step 4: Complete log receiver creation.
Select Save & Exit to complete creating the global log receiver.
Step 5: Inspect your connection and verify that logs are collected in the receiver.
Select ... > Test Connection in the Actions column for your global log receiver object. Wait for the test to complete and return a message indicating a successful connection.
Note: F5 Distributed Cloud performs up to 20 retries in case of connection failure. Retries are performed only when one of the following HTTP response codes is returned:
429
500 and above, but not 501
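If you build your own receiver, it is worth mirroring this rule when choosing response codes: return 429 or a 5xx other than 501 only for failures you want retried. A one-line predicate captures the documented rule:

```python
def is_retryable(status: int) -> bool:
    """Mirror the documented retry rule: 429, or 500 and above except 501."""
    return status == 429 or (status >= 500 and status != 501)

print([s for s in (200, 404, 429, 500, 501, 502, 503) if is_retryable(s)])
# [429, 500, 502, 503]
```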
Verify that logs are collected into your receiver (such as S3 bucket in AWS).
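For an S3 receiver, verification can be scripted. The sketch below builds a date/hour prefix matching the daily/hourly folder layout described at the start of this guide and lists matching objects; the exact key naming is an assumption to adjust after inspecting one object your receiver actually created, and the listing step requires the third-party boto3 library plus AWS credentials:

```python
from datetime import datetime, timezone

def hourly_prefix(now=None) -> str:
    """Build a date/hour key prefix for the daily/hourly folder layout.

    The naming scheme is an assumption; confirm it against a real
    object key in your bucket.
    """
    now = now or datetime.now(timezone.utc)
    return f"{now:%Y-%m-%d}/{now:%H}/"

def list_recent_log_objects(bucket: str):
    """List this hour's log objects in an S3 bucket.

    Hypothetical usage: requires boto3 and configured AWS credentials,
    so it is imported lazily and not run here.
    """
    import boto3  # third-party dependency
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=hourly_prefix())
    return [obj["Key"] for obj in resp.get("Contents", [])]

print(hourly_prefix(datetime(2024, 5, 1, 13, tzinfo=timezone.utc)))  # 2024-05-01/13/
```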