Getting Data In

Splunk forwarder maximum monitored file limit?

JasonCzerak
Explorer

Is there a maximum file count a single forwarder can monitor? I have some Oracle applications that generate tens of thousands of new files daily, of various sizes. I have a dedicated host with these shared mounts mounted, and it just scans them for new data. Every now and again it crashes the box, and I can't make anything out on the console.

Am I running into some sort of file limit?


dart
Splunk Employee

Can you use a local forwarder on the source data, so we can determine whether it's a Splunk issue or a mount-related issue?
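
For reference, a local Universal Forwarder would typically pick these files up with a monitor stanza in inputs.conf; the path, sourcetype, and index below are placeholders rather than anything from this thread:

# $SPLUNK_HOME/etc/apps/oracle_logs/local/inputs.conf on the Oracle host
# (example only; adjust path, sourcetype, and index to your environment)
[monitor:///mnt/oracle/app_logs]
disabled = false
sourcetype = oracle_app
index = main
# Skip files that have not been modified recently to reduce the tailing load
ignoreOlderThan = 7d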


dart
Splunk Employee

The Splunk on Splunk app will give you an idea of what's causing the queue blockage.
It's possible for a single Universal Forwarder to saturate an indexer if it has sufficient event throughput.
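
The same queue data is also visible without the app by searching the forwarder's metrics.log in the _internal index; this is a generic queue-fill search, with the host name as a placeholder:

index=_internal source=*metrics.log group=queue host=your_forwarder_host
| timechart span=5m max(current_size) by name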


JasonCzerak
Explorer

09-04-2012 00:10:36.644 -0500 INFO TailingProcessor - Could not send data to output queue (parsingQueue), retrying...
09-04-2012 00:10:37.836 -0500 INFO TailingProcessor - ...continuing.
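
One common cause of the parsingQueue backing up on a Universal Forwarder is its default forwarding throughput cap of 256 KB/s; raising it in limits.conf is one thing to test, and the value below is illustrative rather than a recommendation from this thread:

# $SPLUNK_HOME/etc/system/local/limits.conf on the forwarder
[thruput]
# Universal Forwarder default is 256 (KB/s); 0 removes the cap
maxKBps = 0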


JasonCzerak
Explorer

For one of the data sources I have, it works just fine with about 100,000 files. It was when I added several other data sources that it broke down.

I did just notice this message in splunkd.log:
09-04-2012 03:06:06.730 -0500 INFO TailingProcessor - File descriptor cache is full (100), trimming...
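
That message refers to the tailing processor's file descriptor cache, which defaults to 100 open descriptors and can be raised in limits.conf on the forwarder; the value below is only an example:

# $SPLUNK_HOME/etc/system/local/limits.conf on the forwarder
[inputproc]
# Default is 100 file descriptors in the tailing processor's cache
max_fd = 256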
