Alert if field value is greater than X for Y amount of time

ipstatic2
New Member

I would like to trigger an alert if the value of a field (queued_jobs) is equal to or greater than 10 for more than 2 minutes. I don't care if the value hits 50 or 100 once or twice. I only care whether, within a 2-minute window, queued_jobs was consistently equal to or greater than 10.


DaleFRice
Explorer
*search criteria* earliest=-2m queued_jobs<10

Set it as an alert over a rolling window of two minutes, and tell it to alert you if the number of results equals 0. If there's ever a two-minute window during which the field stays at 10 or greater the whole time, you'll be alerted.
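If you manage the alert in savedsearches.conf rather than the UI, the setup would look roughly like the sketch below. This is only an approximation (the stanza name is made up, and *search criteria* is your base search); the same thing can be configured entirely from the alert dialog with a real-time rolling window and a "number of results is equal to 0" trigger.

[queued_jobs_sustained_high]
# Rolling 2-minute real-time window, so no earliest=-2m in the search itself
search = *search criteria* queued_jobs<10
dispatch.earliest_time = rt-2m
dispatch.latest_time = rt
enableSched = 1
# Fire when zero results drop below 10, i.e. queued_jobs stayed at 10+ for the whole window
counttype = number of events
relation = equal to
quantity = 0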

If you instead want to alert when the average is greater than 10, it's a bit trickier. Try this:

*search criteria* earliest=-2m | outlier queued_jobs | stats avg(queued_jobs) as average

Like before, set it as an alert over a rolling window of two minutes. This time, set it to trigger when a custom condition is met. Set the custom condition to:

search average>=10

This should let you know when the average is 10 or more queued jobs. The outlier command removes absurd spikes (if you suddenly get 2000 tiny jobs that clear up within a few seconds, they won't throw off the average), so this condition should alert you whenever the average number of queued jobs stays at 10 or above across a two-minute window.
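If you'd rather skip the custom condition entirely, one alternative (an untested sketch, using the same *search criteria* placeholder) is to fold the threshold into the search itself:

*search criteria* earliest=-2m
| outlier queued_jobs
| stats avg(queued_jobs) as average
| where average>=10

Here the search only returns a row when the two-minute average is 10 or more, so the standard "number of results is greater than 0" trigger does the job.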
