Splunk Search

dashboard SLA time calculation

dm2
Explorer

I have two different queries: one calculates the total number of critical alerts, and the second calculates the total time those critical alerts were "opened".

I need to calculate the average between them (time/count). How can I achieve that?

[screenshot attachment]

[screenshot attachments]

 

0 Karma

Richfez
SplunkTrust
SplunkTrust

I'm not completely clear on what the question is asking, but for the sake of a) getting discussion started to determine that and b) maybe this is what you are looking for anyway, ...

The big search for total time - try changing those last three lines to the following five:

...
| stats sum(time_difference) AS TOTAL, count as COUNT
| eval AVERAGE_TIME_PER_EVENT = TOTAL / COUNT
| fieldformat TOTAL = strftime(TOTAL, "%H:%M:%S")
| eval AVERAGE_TIME_PER_EVENT = tostring(AVERAGE_TIME_PER_EVENT, "duration") 
| table COUNT, TOTAL, AVERAGE_TIME_PER_EVENT

 I don't think I have any typos in the above, but maybe they exist.  🙂

Anyway, all I do differently is add the count into your stats and eval a new field that's TOTAL / COUNT.  Then I leave your fieldformat the same, but right after it I introduce "tostring" with a type of "duration".  That does what you want just like the fieldformat (%H:%M:%S), but will ALSO add days to the front when appropriate (like 1+14:21:54 for 1 day plus 14 hours, 21 minutes, 54 seconds).  You can change both to eval or use fieldformat for both - it's fine either way.

Lastly I added the two new fields to the table.
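If you want to see what that "duration" formatting looks like on its own, here's a quick throwaway search (just an illustration - 138114 is an arbitrary number of seconds I picked, one day plus a bit):

| makeresults
| eval seconds = 138114
| eval pretty = tostring(seconds, "duration")

That should render pretty as something like 1+14:21:54 - one day, fourteen hours and change - which is the same formatting you'll get on your AVERAGE_TIME_PER_EVENT.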

If this is nowhere near what you want, ... well, maybe clarify your question a bit!  🙂

Happy Splunking,

Rich

0 Karma

dm2
Explorer

I did not get results.

I have to calculate the average time for closing alerts by severity. In this case I am calculating "medium" severity alerts, so the equation should be: total time of "closing" medium alerts / total medium alerts.

[screenshot attachment]

 

0 Karma

Richfez
SplunkTrust
SplunkTrust

As to the missing results - sure, because your TOTAL field appears empty.  You should debug that first.  All the extra conditional logic you want can be implemented later, once you get this core piece working.

Here's how I'd approach it:

Temporarily comment out (triple backticks before and after them) or remove all the fieldformats and the trailing table command, so you can see what values all the fields actually have.  When you put them back in, do those "pretty it up" tasks as one of the last steps, after all the actual "work" has been done.  That also makes the search easier to follow, because it's structured better: first get your data, then do your calculations, and lastly make things pretty.

Then just backtrack - divide and conquer.  Remove everything after the stats command where TOTAL is calculated.  If there's no result for TOTAL, figure out why: since TOTAL is the sum of time_difference, take out everything from the stats onward and see what time_difference is in the events.  If it's blank, work backwards one more step and see where it comes from - the incident review time and the notable time - so what are the values for *those* fields?  At some point you'll spot what I'm sure is a facepalm somewhere in there.

Once you have all that straightened out, add back in the extra stuff one step at a time, confirming the results at each step.  You'll have a lot better understanding of the data you are working with and also how all this works, too.
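As a rough sketch of that stripped-back search (the field names here are my guesses from your description - substitute whatever your actual eval uses):

... your base search ...
| eval time_difference = incident_review_time - notable_time
| table notable_time, incident_review_time, time_difference

If time_difference comes back blank here, one of those two input fields is empty or misspelled, and that's your culprit.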

THEN.

There's likely no reason at all to do only "medium" severity separately.  I suspect if you remove the "where" near the top and then do all your stats "by severity", you can calculate the answers for all severities in one pass.  But again, baby steps: get it working first, then we can modify it to do that.
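For when you get there, the by-severity version would look something like this (untested sketch - I'm assuming a severity field exists in your events, as it appears to in your screenshots):

... your base search, without the | where severity="medium" ...
| stats sum(time_difference) AS TOTAL, count AS COUNT by severity
| eval AVERAGE_TIME_PER_EVENT = TOTAL / COUNT
| eval AVERAGE_TIME_PER_EVENT = tostring(AVERAGE_TIME_PER_EVENT, "duration")
| table severity, COUNT, TOTAL, AVERAGE_TIME_PER_EVENT

That gives you one row per severity, each with its own count, total time, and average time per alert.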

 

0 Karma