Configure Global Log Receiver
Objective
This guide provides instructions on how to send your tenant's logs from F5® Distributed Cloud Regional Edge (RE) Sites to an external log collection system. The sent logs include all system and application logs of your tenant, including the logs of all Customer Edge (CE) Sites of that tenant. For conceptual information about logging, see Logs.
A folder is created in your log collection system each day, and within that folder, a folder is created for each hour. Logs are sent to the hourly folder every 5 minutes and are stored as compressed gzip files.
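For orientation, the delivery layout can be sketched in code. The snippet below builds a sample day/hour folder containing one gzip archive and reads it back; the folder and file names are illustrative only (the actual naming scheme is receiver-dependent), and the assumption that log lines are newline-delimited JSON depends on your receiver and log type:

```python
import gzip
import json
import tempfile
from pathlib import Path

def read_log_archive(path):
    """Decompress one 5-minute log archive and yield one parsed
    record per line (assumes newline-delimited JSON log lines)."""
    with gzip.open(path, mode="rt", encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:
                yield json.loads(line)

# Build a sample daily-folder/hourly-folder layout with one archive.
# These names are hypothetical, not the actual naming scheme.
root = Path(tempfile.mkdtemp())
hour_dir = root / "2024-05-01" / "13"
hour_dir.mkdir(parents=True)
sample = hour_dir / "logs-1305.log.gz"
with gzip.open(sample, mode="wt", encoding="utf-8") as fh:
    fh.write(json.dumps({"req_id": "abc123", "rsp_code": 200}) + "\n")

records = list(read_log_archive(sample))
print(records[0]["rsp_code"])  # prints 200
```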
Global log receiver supports sending the logs for the following log collection systems:
- AWS Cloudwatch
- AWS S3
- Azure Blob Storage
- Azure Event Hubs
- Datadog
- GCP Bucket
- Generic HTTP or HTTPS server
- IBM QRadar
- Kafka
- NewRelic
- Splunk
- SumoLogic
Note: Currently, global log receiver supports only sending the request (access) logs, security events, and audit logs for all HTTP Load Balancers and sites.
Using the instructions provided in this guide, you can configure a Global Log Receiver in the F5® Distributed Cloud Console (Console) to enable sending of logs to an external log collection system.
Prerequisites
- A valid Account is required.
Note: If you do not have an account, see Create an Account.
- A publicly reachable external log collection system.
- The following IP ranges must be added to your firewall's allow-list:
- 193.16.236.64/29
- 185.160.8.152/29
Enable Global Receiver of Logs
You can configure the global log receiver either in the system namespace (Multi-Cloud Network Connect service) or in the shared namespace (Shared Configuration service). When configuring in the shared namespace, you can send logs from the shared namespace, from all namespaces, or from a specific list of namespaces. When configuring in the system namespace, you can send logs only from the system namespace.
The example shown in this guide creates a global log receiver object in the Console in system namespace for sending the logs to the external log collection system.
Perform the following in the F5® Distributed Cloud Console:
Step 1: Start creating a global log receiver.
- In the Console home page, select the Multi-Cloud Network Connect service or the Shared Configuration service.
- Multi-Cloud Network Connect service: Select Manage > Log Management > Global Log Receiver.
- Shared Configuration service: Select Manage > Global Log Receiver.
Figure: Navigate to Global Log Receiver Page
- Select the Add Global Log Receiver button.
Step 2: Configure global log receiver properties.
- Enter a name in the metadata section. Optionally, set labels and add a description.
- Select Request Logs, Security Events, or Audit Logs for the Log Type field. Request Logs is the default.
- In case of the Multi-Cloud Network Connect service, select logs from the current namespace for the Log Message Selection field. This is also the default option.
- In case of Shared Configuration, you can select one of the following options:
  - Select logs from current namespace: Send logs from the shared namespace.
    Note: In case of DNS Request Logs, sending logs from the current namespace is not supported. Either select logs from all namespaces or select logs from specific namespaces, and select the system namespace to send logs.
  - Select logs from all namespaces: Send logs from all namespaces.
  - Select logs in specific namespaces: Send logs from the specified namespaces. Enter the namespace name in the displayed namespaces list. Use the Add item button to add more than one namespace.
- Select a receiver in the Receiver Configuration drop-down menu and follow the corresponding instructions below.
AWS Cloudwatch Receiver
Figure: AWS Cloudwatch Global Log Receiver Configuration
- Enter the group name for the target CloudWatch logs stream in the Group Name field.
- Enter the stream name for the target CloudWatch logs stream in the Stream Name field.
- In the AWS Cloud Credentials box, select a cloud credentials object from the drop-down menu. Alternatively, use the Create new Cloud Credential button to create a new object. For instructions on creating cloud credentials, see Cloud Credentials.
  Note: For an AWS cloud credential, only the AWS Programmatic Access Credentials option is supported.
- Select a region from the AWS Region drop-down menu. Ensure that you select the region in which the CloudWatch log group is configured.
AWS S3 Receiver
Figure: AWS S3 Global Log Receiver Configuration
- Enter your AWS S3 bucket name in the S3 Bucket Name field.
- In the AWS Cloud Credentials box, select a cloud credentials object from the drop-down menu. Alternatively, use the Create new Cloud Credential button to create a new object. For instructions on creating cloud credentials, see Cloud Credentials.
  Note: For an AWS cloud credential, only the AWS Programmatic Access Credentials option is supported.
- In the AWS Region box, select a region from the drop-down menu. Ensure that you select the same region in which the S3 storage is configured.
Azure Blob Storage
Figure: Azure Blob Global Log Receiver Configuration
- Click Configure to set up the Azure Blob storage connection string.
  Figure: Azure Blob Connection String
- Select Blindfolded Secret from the Secret Type drop-down menu.
- Use the Action drop-down menu to select one of the following:
  - Use Blindfolded Secret: Enter your connection string into the Blindfolded Secret field.
  - Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your connection string into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your connection string.
- Where to get the Azure Blob connection string:
  - Go to All Services > Storage accounts > (name of account).
  - Select Access Keys on the navigation menu for the storage account you selected.
  - In the Access Keys pane, choose one of the access keys and click Show for the Connection String. Click Copy to Clipboard on the left side of the shown field.
  Figure: Azure Blob Connection String Location
- In the Container Name field, enter the name of the Azure container into which the logs should be sent.
Azure Event Hubs
Figure: Azure Event Hubs Global Log Receiver Configuration
- Click Configure to set up the Azure Event Hubs connection string.
  Figure: Azure Event Hubs Connection String
- Select Blindfolded Secret from the Secret Type drop-down menu.
- Use the Action drop-down menu to select one of the following:
  - Use Blindfolded Secret: Enter your connection string into the Blindfolded Secret field.
  - Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your connection string into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your connection string.
  Note: Your connection string should include EntityPath=<EventHubName>.
- Where to get the Azure Event Hubs connection string: Refer to the following Microsoft article: https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-get-connection-string.
- In the Container Name field, enter the name of the Azure container into which the logs should be stored.
Datadog Receiver
Figure: Datadog Global Log Receiver Configuration
- Enter the URL for Datadog in the Datadog Site field (for example, datadoghq.com).
- Click Configure to set up the Datadog API Key.
  Figure: Datadog API Key
- Select Blindfolded Secret from the Secret Type drop-down menu.
- Use the Action drop-down menu to select one of the following:
  - Use Blindfolded Secret: Enter your Datadog API Key into the Blindfolded Secret field.
  - Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your Datadog API Key into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your Datadog API Key.
- Where to get the Datadog API Key:
  - Go to https://app.datadoghq.com/organization-settings/api-keys.
  - Create an API key or select an existing one.
  - A dialog will pop up. Click Copy in the dialog and paste this value into the global log receiver configuration for the API Key.
  Datadog Documentation: https://docs.datadoghq.com/account_management/api-app-keys/#add-an-api-key-or-client-token
  Figure: Datadog API Key Location
GCP Bucket Receiver
Figure: GCP Global Log Receiver Configuration
- In the GCP Bucket Name field, enter the name of the bucket into which the logs should be sent.
- Use the GCP Cloud Credentials drop-down menu to select an existing set of credentials. Alternatively, select Add Item to create new credentials. For help creating new credentials, see Cloud Credentials.
HTTP Receiver
Figure: HTTP Global Log Receiver Configuration
- Enter the URI for your HTTP receiver in the HTTP Uri field.
- Choose an authentication type from the Authentication drop-down menu:
  - None: No authentication is performed.
  - Basic Authentication: Enter your login in the User Name field and click Configure to enter your password.
  - Token Authentication: Click Configure to enter your authentication token.
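If the generic HTTP receiver points at a service you operate, that service must accept POSTed log batches and enforce the configured authentication. The following is a minimal sketch of such an endpoint using only the Python standard library; the Bearer header convention, the token value, and the handler names are illustrative assumptions, not Distributed Cloud specifics:

```python
import http.server
import threading

AUTH_TOKEN = "example-token"  # hypothetical; must match the token set in the Console

class LogBatchHandler(http.server.BaseHTTPRequestHandler):
    """Accept POSTed log batches; reject requests without the expected token."""

    def do_POST(self):
        # Reject requests whose bearer token does not match.
        if self.headers.get("Authorization") != f"Bearer {AUTH_TOKEN}":
            self.send_response(401)
            self.end_headers()
            return
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        # A real receiver would decompress, parse, and persist the batch here.
        print(f"received {len(body)} bytes of log data")
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the example

def serve(port=0):
    """Start the receiver on a background thread; port 0 picks a free port."""
    server = http.server.ThreadingHTTPServer(("127.0.0.1", port), LogBatchHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

In production you would terminate TLS in front of such an endpoint and configure the matching token in the Console under Token Authentication.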
IBM QRadar Receiver
Figure: QRadar Global Log Receiver Configuration
Enter the URI for your QRadar receiver in the Log Source Collector URL field. For more information, refer to the following IBM document: https://www.ibm.com/docs/en/dsm?topic=options-http-receiver-protocol-configuration
Note: In some cases, multiple events are sent to the global log receiver at once. The QRadar receiver must be configured to split the events into multiple records. To do this, configure the Message Pattern field as follows:
Figure: QRadar Message Pattern Configuration
Kafka Receiver
Figure: Kafka Global Log Receiver Configuration
- Enter a Kafka bootstrap server in the form host:port in the Kafka Bootstrap Server List. Use the Add Item button to add additional servers.
- Enter the Kafka topic name for the reported events.
NewRelic Receiver
Figure: NewRelic Log Receiver Configuration
- Select the endpoint that's applicable to your NewRelic account in the NewRelic Account Endpoint drop-down menu.
- Click Configure to set up the NewRelic license key.
  Figure: NewRelic License Key
- Select Blindfolded Secret from the Secret Type drop-down menu.
- Use the Action drop-down menu to select one of the following:
  - Use Blindfolded Secret: Enter your NewRelic license key into the Blindfolded Secret field.
  - Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your NewRelic license key into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your NewRelic license key.
Splunk Receiver
Figure: Splunk Log Receiver Configuration
- Enter the Splunk HEC Logs Endpoint.
- Click Configure to set up the Splunk HEC token.
  Figure: Splunk License Key
- Select Blindfolded Secret from the Secret Type drop-down menu.
- Use the Action drop-down menu to select one of the following:
  - Use Blindfolded Secret: Enter your Splunk HEC token into the Blindfolded Secret field.
  - Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your Splunk HEC token into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your Splunk HEC token.
Splunk Configuration Details:
According to a Splunk article, there are different Splunk HEC URIs:
- For Splunk Cloud customers, the standard HEC URI is: https://http-inputs-customer_stack.splunkcloud.com. Splunk Cloud customers do NOT need to specify port 8088; all HEC traffic goes over port 443.
- For customers using AWS Firehose, you will have a second HEC URL: https://http-inputs-firehose-customer_stack.splunkcloud.com
- For customers running HEC on their own deployments or using the Splunk test drive instance, port 8088 must be specified: https://input-prd-uniqueid.cloud.splunk.com:8088
In any of these scenarios, you can use the following command to validate the URL:
- In case of Splunk Cloud, enter: nslookup http-inputs-<customer_stack>.splunkcloud.com
- In case of Splunk Test Drive, enter: nslookup input-prd-uniqueid.cloud.splunk.com
Splunk Documentation: https://docs.splunk.com/Documentation/Splunk/9.0.4/Data/UsetheHTTPEventCollector
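The nslookup checks above can also be done programmatically. This sketch resolves a candidate HEC hostname and returns the addresses found (an empty list means the name did not resolve); the function name is ours for illustration, not part of any product:

```python
import socket

def resolve_hec_host(hostname, port=443):
    """Resolve an HEC hostname, as a programmatic analogue of the
    nslookup validation; returns [] if the name does not resolve."""
    try:
        infos = socket.getaddrinfo(hostname, port, proto=socket.IPPROTO_TCP)
    except socket.gaierror:
        return []
    # Deduplicate and sort the resolved addresses.
    return sorted({info[4][0] for info in infos})

# Example (placeholder stack name, as in the nslookup command above):
# resolve_hec_host("http-inputs-<customer_stack>.splunkcloud.com")
```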
SumoLogic Receiver
Figure: SumoLogic Log Receiver Configuration
- Click Configure to set up the SumoLogic Source Address URL.
  Figure: SumoLogic Source Address URL
- Select Blindfolded Secret from the Secret Type drop-down menu.
- Use the Action drop-down menu to select one of the following:
  - Use Blindfolded Secret: Enter your SumoLogic HTTP collector URL into the Blindfolded Secret field.
  - Blindfold New Secret: Select Built-in from the Policy Type drop-down menu and enter your SumoLogic HTTP collector URL into the Secret to Blindfold field, or select Custom for the policy type and then choose a custom policy and enter your SumoLogic HTTP collector URL.
Step 3: Optionally, configure advanced settings.
Advanced settings include configuring batch options and TLS. Using batch options, you can apply limits such as a maximum number of events, a maximum byte size, or a timeout for a batch of logs to be sent to the receiver.
Select the Show Advanced Fields toggle and do the following in the Batch Options section:
- Select Timeout Seconds for the Batch Timeout Options and enter a timeout value in the Timeout Seconds box.
- Select Max Events for the Batch Max Events and enter a value between 32 and 2000 in the Max Events box.
- Select Max Bytes for the Batch Bytes and enter a value between 4096 and 1048576 in the Batch Bytes box. Logs are sent after the batch size is equal to or more than the specified byte size.
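The three batch limits interact simply: a batch is flushed as soon as any one of the timeout, event-count, or byte-size limits is reached. The sketch below models those semantics for illustration only; it is not Distributed Cloud code, and the class and defaults are our own:

```python
import time

class LogBatcher:
    """Illustrative model: flush a batch when any configured limit is
    reached, whether timeout seconds, max events, or max bytes."""

    def __init__(self, timeout_seconds=5.0, max_events=32, max_bytes=4096):
        self.timeout_seconds = timeout_seconds
        self.max_events = max_events
        self.max_bytes = max_bytes
        self._events, self._bytes = [], 0
        self._started = time.monotonic()

    def add(self, event: bytes):
        """Append an event; return the flushed batch if a limit tripped."""
        self._events.append(event)
        self._bytes += len(event)
        if (len(self._events) >= self.max_events
                or self._bytes >= self.max_bytes
                or time.monotonic() - self._started >= self.timeout_seconds):
            return self.flush()
        return None

    def flush(self):
        """Hand back the pending events and reset all counters."""
        batch, self._events, self._bytes = self._events, [], 0
        self._started = time.monotonic()
        return batch
```

For example, with max_events=3 the third add() returns the whole batch, mirroring how a configured Max Events limit triggers delivery.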
Do the following in the TLS section:
- Select Use TLS for the TLS field.
- Select Server CA Certificates for the Trusted CA field. Enter the certificates in PEM or Base64 format in the Server CA Certificates box.
- Select Enable mTLS for mTLS Config and enter the client certificate in PEM or Base64 format in the Client Certificate box.
  - Select Configure in the Client Private Key field and enter the secret in the box with the type selected as Text.
  - Select Blindfold, wait for the operation to complete, and click Apply.
Step 4: Complete log receiver creation.
Select Save & Exit to complete creating the global log receiver.
Step 5: Inspect your connection and verify that logs are collected in the receiver.
- Select ... > Test Connection in the Actions column for your global log receiver object. Wait for the test to complete and return a message indicating a successful connection.
  Note: F5 Distributed Cloud performs up to 20 retries in case of connection failure. The retries are performed only when the following HTTP response codes are returned:
  - 429
  - 500 and above, except 501
- Verify that logs are collected in your receiver (such as an S3 bucket in AWS).