Graphing Elapsed Time

Memphis
Engager

Hi all - 

I am a Splunk novice, especially when it comes to writing my own queries. I have created a Splunk query that serves my first goal: calculate the elapsed time between two events.

Now, goal #2 is to graph that over a time period (e.g., 7 days). What is stalling my brain is that these events happen every day - in fact, they are batches that run on a cron schedule, so they had better be happening every day! So I can't just change the time preset and graph this, because I am using the earliest and latest events to calculate the beginning and end. Here is my query to calculate the duration:

 

index=*XYZ "Batchname1"
| stats earliest(_time) AS Earliest, latest(_time) AS Latest
| eval Elapsed_Time=Latest-Earliest, Start_Time_Std=strftime(Earliest,"%H:%M:%S:%Y-%m-%d"), End_Time_Std=strftime(Latest,"%H:%M:%S:%Y-%m-%d")
| eval Elapsed_Time=Elapsed_Time/60
| table Start_Time_Std, End_Time_Std, Elapsed_Time



Any ideas on how to graph this duration over time so I can develop trend lines, etc?  Thanks all for the help! 

1 Solution

yuanliu
SplunkTrust

First, on the thought process: Splunk allows you to create additional fields in the event stream. If you mark each day as "day -1", "day -2", etc., you can group the earliest and latest times by day.

This is how to do that in Splunk:

 

index=*XYZ "Batchname1" earliest=-7d@d latest=-0d@d
| eval dayback = mvrange(0, 7)
| eval day = mvmap(dayback, if(_time < relative_time(now(), "-" . dayback . "d@day") AND relative_time(now(), "-" . tostring(dayback + 1) . "d@day") < _time, dayback, null()))
| stats min(_time) as Earliest max(_time) as Latest by day
| fieldformat Earliest = strftime(Earliest, "%F %T")
| fieldformat Latest = strftime(Latest, "%F %T")
| eval day = "day -" . tostring(day + 1)

 

The output looks like this:

day       Earliest              Latest
day -1    2024-04-23 00:01:00   2024-04-23 23:53:00
day -2    2024-04-22 09:29:00   2024-04-22 23:31:00
day -3    2024-04-21 14:29:00   2024-04-21 14:29:00
day -4    2024-04-20 00:01:00   2024-04-20 19:14:00
day -5    2024-04-19 01:13:00   2024-04-19 23:47:00
day -6    2024-04-18 00:01:00   2024-04-18 23:28:00
day -7    2024-04-17 00:01:00   2024-04-17 23:14:00

Two pointers:

  1. It doesn't seem to make sense to search the current day, so I shifted the search period to end at latest=-0d@d (midnight today). If your requirement includes the current day, you need to change latest as well as tweak the definition of day a little.
  2. Do not use earliest(_time); min(_time) is cheaper.

The following is the emulation I use to test the above.

 

index = _audit earliest=-7d@d latest=-0d@d action=validate_token
| timechart span=1m count
| where count > 0
``` emulation of
index=*XYZ "Batchname1" earliest=-7d@d latest=-0d@d
```

 

 


PickleRick
SplunkTrust

Aren't you overcomplicating it a bit?

Just render the date to a field

| eval day=strftime(_time,"%F")

and you're ready to go

| stats min(_time) as earliest max(_time) as latest by day
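
Putting this simpler approach together with the elapsed-time calculation from the question, a complete query might look something like the sketch below (index and search terms are copied from the question; this has not been tested against the actual data):

index=*XYZ "Batchname1" earliest=-7d@d latest=-0d@d
``` tag each event with its calendar date ```
| eval day=strftime(_time,"%F")
``` first and last event per day ```
| stats min(_time) as Earliest max(_time) as Latest by day
``` elapsed minutes between the first and last event of each day ```
| eval Elapsed_Time=(Latest-Earliest)/60
| fieldformat Earliest=strftime(Earliest,"%F %T")
| fieldformat Latest=strftime(Latest,"%F %T")
| sort - day

Because day is an ISO date (YYYY-MM-DD), sorting it as a string already gives chronological order (here, most recent day first).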

Memphis
Engager

Thank you @yuanliu ! This worked really well. I added my eval commands to it as well and was able to produce the table that I was seeking, with your great query as a guide. I've expanded the time range to 14 days because I realized 7 days was a little pointless, since most of my batches only run M-F. My final query ended up being:

index=*app_pcf cf_app_name="mddr-batch-integration-flow" "posbatch04" earliest=-14d@d latest=-0d@d
| eval dayback = mvrange(0, 14)
| eval day = mvmap(dayback, if(_time < relative_time(now(), "-" . dayback . "d@day") AND relative_time(now(), "-" . tostring(dayback + 1) . "d@day") < _time, dayback, null()))
| stats min(_time) as Earliest max(_time) as Latest by day
| fieldformat Earliest = strftime(Earliest, "%F %T")
| fieldformat Latest = strftime(Latest, "%F %T")
| eval day = "day -" . tostring(day + 1)
| eval Elapsed_Time=Latest-Earliest, Start_Time_Std=strftime(Earliest,"%H:%M:%S:%Y-%m-%d"), End_Time_Std=strftime(Latest,"%H:%M:%S:%Y-%m-%d")
| eval Elapsed_Time=Elapsed_Time/60


Lastly, I will figure out how to organize this by day in descending order; right now it is sorting the results by another column...

Much appreciated for the help and the fast response; I would never have figured this out 🙂


yuanliu
SplunkTrust

Lastly I will figure out how to organize this by Day in desc order; right now it is sorting the results by another column... 

Yes, Splunk has some weird obsession with alphabetic/ASCII ordering unless you tell it otherwise. (It kind of surprises me that the "natural" sort order is already set by the stats groupby - numeric on the original value of day - yet Splunk loses it after the character string "day -" is prepended.) All you need to do is insert a sort after stats and before that string conversion.

When you say descending order, I imagine that you want reverse numeric order. Is that correct?

index=*app_pcf cf_app_name="mddr-batch-integration-flow" "posbatch04" earliest=-14d@d latest=-0d@d
| eval dayback = mvrange(0, 14)
| eval day = mvmap(dayback, if(_time < relative_time(now(), "-" . dayback . "d@day") AND relative_time(now(), "-" . tostring(dayback + 1) . "d@day") < _time, dayback, null()))
| stats min(_time) as Earliest max(_time) as Latest by day
| sort - day
| fieldformat Earliest = strftime(Earliest, "%F %T")
| fieldformat Latest = strftime(Latest, "%F %T")
| eval day = "day -" . tostring(day + 1)
| eval Elapsed_Time=Latest-Earliest, Start_Time_Std=strftime(Earliest,"%H:%M:%S:%Y-%m-%d"), End_Time_Std=strftime(Latest,"%H:%M:%S:%Y-%m-%d")
| eval Elapsed_Time=Elapsed_Time/60

If you want the most recent day first, just use | sort day instead.
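
To get back to the original goal of a trend chart rather than a table, one possible follow-on (just a sketch, not from this thread) is to put the numeric Earliest back into _time and let timechart plot the elapsed minutes per day. This relies on the fact that fieldformat only changes the display and leaves the underlying epoch value intact. For example, appending:

| eval _time=Earliest
| timechart span=1d max(Elapsed_Time) as Elapsed_Minutes

and then selecting a line chart visualization should give one point per day that a trend line can be fitted to.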
