



No Data in Splunk Even Though uberAgent Sends Successfully

Scenario 1: uberAgent is sending data directly to Splunk

Symptoms

According to uberAgent’s log file, there are no events in its in-memory send queue, indicating a successful data transfer to the Splunk server(s):

ReceiverStatistics,Splunk; SPLUNKSERVER:19500 - Events in queue: 0, queue size: 0.0 KB, sent: 1597, added to queue: 1597, rejected from queue: 0

When searching Splunk’s internal logs for errors using the search index=_internal error, you see messages like the following:

ERROR TcpInputProc - Message rejected. Received unexpected 707406419 byte message! from src=IPADDRESS:PORT. Maximum message allowed: 67108864

Cause

The port uberAgent sends data to (default: 19500) is configured to receive data from Universal Forwarders.

Splunk Universal Forwarders do not send raw event data; they use a specific forwarding protocol. uberAgent, on the other hand, sends raw event data. If the format sent by the source does not match the format expected by the target, the above error message may be logged and the incoming data is ignored on the Splunk server.

Resolution

If uberAgent is sending data directly to Splunk, do not open port 19500 via Forwarding and receiving on the Splunk server. Instead, install the uberAgent_indexer app, which opens port 19500 as a raw TCP port.
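To illustrate the difference, here is a minimal inputs.conf sketch of a raw TCP input on port 19500, which is essentially what the uberAgent_indexer app provides. The index name is a placeholder and the stanzas shipped with the app may differ in detail:

# inputs.conf on the Splunk server
# Raw TCP input on port 19500 - this is what the uberAgent_indexer app provides
[tcp://19500]
index = uberagent

# A forwarder (Splunk-to-Splunk) input on the same port would reject
# uberAgent's raw events with the "Message rejected" error shown above:
# [splunktcp://19500]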

Scenario 2: uberAgent is sending data to a locally installed Splunk Universal Forwarder

Symptoms

According to uberAgent’s log file, there are no events in its in-memory send queue, indicating a successful data transfer to the locally installed Universal Forwarder:

ReceiverStatistics,Splunk; localhost:19500 - Events in queue: 0, queue size: 0.0 KB, sent: 1597, added to queue: 1597, rejected from queue: 0

When searching for incoming data on port 19500 on the Splunk server using the search index=* source="tcp:19500", you see messages like the following (shortened):

--splunk-cooked-mode-v3--\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00...

Cause

The port the Universal Forwarder sends data to (default: 9997) is configured to receive data from uberAgent, i.e. it was opened as a raw TCP port instead of a regular forwarder receiving port.

Splunk Universal Forwarders do not send raw event data; they use a specific forwarding protocol. uberAgent, on the other hand, sends raw event data. If the format sent by the source does not match the format expected by the target, messages like the above may be indexed and the dashboards remain empty.

Resolution

If uberAgent is sending data to Splunk via a locally installed Universal Forwarder, open TCP port 19500 on the Universal Forwarder (the same machine uberAgent runs on) and configure uberAgent to send to localhost:19500 as described here.
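For orientation, a receiver definition in uberAgent’s configuration pointing at the local Universal Forwarder might look similar to the sketch below. The section and setting names are assumptions and may differ between uberAgent versions; the linked configuration documentation is authoritative:

# uberAgent configuration (sketch only, setting names may vary by version):
# send raw data to the Universal Forwarder on the local machine
[Receiver]
Name = Local Universal Forwarder
Type = Splunk
Protocol = TCP
Servers = localhost:19500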

To enable Splunk to receive the Universal Forwarder’s data, open a receiving port (default: 9997) on the Splunk server via Forwarding and receiving, as described here. Do not open port 9997 as a raw TCP port.
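Put together, the data flow of this scenario corresponds roughly to the following Splunk configuration. The index name and server address are placeholders:

# inputs.conf on the Universal Forwarder (the machine uberAgent runs on):
# raw TCP input that accepts uberAgent's data
[tcp://19500]
index = uberagent

# outputs.conf on the Universal Forwarder:
# forward the data to the Splunk server over the regular forwarding port
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = SPLUNKSERVER:9997

# inputs.conf on the Splunk server:
# Splunk-to-Splunk receiving port for forwarder traffic - not a raw TCP port
[splunktcp://9997]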
