I am trying to calculate the hardware requirements for a Splunk installation.
The main issue I have is that the hardware planning guide seems to be aimed at systems where the number of concurrent users grows along with the data indexing volume. The system we are looking to install would index between 2 GB and 8 GB of data per day. However, it would have a large number of scheduled saved searches, multiple users running searches via dashboards (saved searches and real-time searches), and operational users who may be building more dashboard functionality or carrying out operational troubleshooting.
The current proposed hardware is:
2x quad-core processors @ 2.?GHz, 72 GB RAM, 1 TB SAN.
Does it seem realistic that a server of this specification would do a good job when the amount to be indexed is so small? Any thoughts and advice would be appreciated.
asked 27 Jan '11, 07:48
At those rates, the hardware will spend very little I/O and CPU time on indexing overall.
When building dashboards, it's not shocking if the first attempts are somewhat inefficient.
With 8 cores, you can expect to see up to 3 cores used by scheduled searches, and at that data load probably less than one core used for indexing. This means you will probably get around 4-5 concurrent longer-running searches before performance begins to drop off seriously for CPU reasons, as opposed to disk reasons. I don't know how to estimate real usage, where your uptake and the time users spend in the tool may vary. However, if you imagine tens of concurrent search-running users (not idle in the UI), you will almost certainly need to spread the load further or add cores.
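The core budget above can be sketched as a quick back-of-envelope calculation. The numbers are the assumptions from this answer (3 cores for scheduled searches, roughly one core each for indexing and for each long-running ad hoc search), not measurements from a real deployment:

```python
# Rough core-budget sketch for the proposed 8-core box.
# All inputs are assumptions from the answer, not measured values.

total_cores = 8
scheduled_search_cores = 3   # assumed upper bound consumed by scheduled searches
indexing_cores = 1           # generous: 2-8 GB/day needs well under one core

# Each long-running ad hoc search tends to occupy roughly one core,
# so the remaining budget is the rough concurrent-search headroom.
concurrent_search_headroom = total_cores - scheduled_search_cores - indexing_cores
print(concurrent_search_headroom)  # 4
```

If real scheduled-search load turns out lower, the headroom shifts toward the 5-search end of the estimate.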
If your searches will generally be over very recent time, then the operating system should be able to keep most of the working set in RAM, meaning that recent time searches shouldn't care much about SAN performance. Of course historical searches still will.
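To see why recent-time searches can live mostly in RAM, here is a hedged estimate of how many days of indexed data the page cache could hold. The on-disk compression fraction and the cache headroom are assumptions for illustration only; actual on-disk size depends heavily on the data:

```python
# Back-of-envelope: how many days of indexed data fit in the OS page cache?
# The fractions below are illustrative assumptions, not Splunk-documented values.

ram_gb = 72
daily_raw_gb = 8            # worst case from the question
on_disk_fraction = 0.5      # assumption: indexed data ~50% of raw size on disk
usable_cache_gb = ram_gb * 0.8  # assumption: leave headroom for OS and processes

days_in_cache = usable_cache_gb / (daily_raw_gb * on_disk_fraction)
print(round(days_in_cache, 1))  # 14.4
```

Even at the 8 GB/day worst case, roughly two weeks of data could stay cached, which is why searches over recent time windows would rarely touch the SAN.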
answered 27 Jan '11, 18:30