Splunk Search

How to extract stats from a multivalue field?

Bizzaro_Shake
Explorer

I have the following query that sets 'Results' based on the JSON portion of my logs below:

index="internallogs"
sourcetype="internal"
host="*hostadress*"
source="FunctionApp"
(/api/assignmentsearch OR AssignmentSearchResults)
| eval _raw=replace(_raw,"(\\\\r\\\\n\s+)","")
| eval _raw=replace(_raw,"\\\\(\")","\"")
| rename additionalProperties.Key as TransactionID
| rename additionalProperties.JSON as JSON
| spath input=JSON path=AssignmentSearchResults{}.FullName output=RsUWName
| spath input=JSON path=Headers{}.profitCenterCode{} output=RqProfitCenter
| lookup uwmanagement-divisions.csv ProfitCenterCode as RqProfitCenter Outputnew ProfitCenterDescription as ProfitCenterOutput
| transaction TransactionID
| eval Results=case(
isnull(RsUWName), "Zero Returned",
mvcount(RsUWName) = 1, "Exactly One Returned",
mvcount(RsUWName) > 1, "More than One Returned"
)

There are three profit centers extracted from the logs: 1) Div1, 2) Div2, and 3) Div3.

How would I extract the percentage of results where Results = 'Exactly One Returned', broken down by profitCenterCode? I need this in chart format. Please let me know what other questions you have.


Bizzaro_Shake
Explorer

Thank you for the response, yuanliu! Ultimately, what I am attempting to do is 'append' a statistics row showing how often exactly 1 UW is returned in a log. An example of my SPL query is below:

index="internallogs"
sourcetype="internal"
host="*hostadress*"
source="FunctionApp"
(/api/assignmentsearch OR AssignmentSearchResults)
| eval _raw=replace(_raw,"(\\\\r\\\\n\s+)","")
| eval _raw=replace(_raw,"\\\\(\")","\"")
| rename additionalProperties.Key as TransactionID
| rename additionalProperties.JSON as JSON
| spath input=JSON path=AssignmentSearchResults{}.FullName output=RsUWName
| spath input=JSON path=Headers{}.profitCenterCode{} output=RqProfitCenter
| lookup uwmanagement-divisions.csv ProfitCenterCode as RqProfitCenter Outputnew ProfitCenterDescription as ProfitCenterOutput
| transaction TransactionID
| eval Results=case(
isnull(RsUWName), "Zero Returned",
mvcount(RsUWName) = 1, "Exactly One Returned",
mvcount(RsUWName) > 1, "More than One Returned"
)
| chart count by Results, ProfitCenterOutput
| addcoltotals labelfield=Results label="Total Calls"


Currently, the output is as follows:

Results        Div1    Div2    Div3
1 UW              9     227     222
0 UWs             1     173       0
>1 UW             1       2       9
Total Calls      11     402     231

What my desired chart would look like:

Results        Div1    Div2    Div3
1 UW              9     227     222
0 UWs             1     173       0
>1 UW             1       2       9
Total Calls      11     402     231
% 1UW*          81%     56%     96%
SLA**           90%     90%     90%

* Where % 1UW is the percentage of calls in which a log has exactly one returned UW (i.e. Results="Exactly One Returned"). So along the lines of "Exactly One Returned" / ("Exactly One Returned" + "Zero Returned" + "More than One Returned") * 100.
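(For Div1, for example, that is 9 / (9 + 1 + 1) * 100 ≈ 81.8%, shown as 81% in the table above.)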

** Service Level Agreement (SLA) is a hard-coded value that is specific to a ProfitCenterOutput (i.e. Div1 90%, Div2 75%, Div3 60%).

I know that using the append command can add an additional row to a chart, but I am having difficulty getting the stats for "% 1UW" from "Results".


yuanliu
SplunkTrust

Thanks for explaining the intention and desired outcome. So far, I conclude that the gap between the current output and the desired output has nothing to do with multivaluedness of any field. Is that correct? (I see that the array elements AssignmentSearchResults{}, Headers{}, and Headers{}.profitCenterCode{} are only used in the group-by phrase, so technically you don't have to expand them into single values before chart.)

The command you are looking for is eventstats. If you replace chart with stats, the calculation is very straightforward and efficient:

 

| stats count by ProfitCenterOutput Results
| eventstats sum(count) as Total by ProfitCenterOutput
| eval Percent1UW = if(Results == "Exactly One Returned", count / Total * 100, null())
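If you also want the hard-coded SLA next to that percentage, a minimal sketch (using the per-division SLA values from your footnote, and assuming ProfitCenterOutput comes back as literally Div1, Div2, or Div3) is to add

| eval SLA = case(ProfitCenterOutput == "Div1", 90, ProfitCenterOutput == "Div2", 75, ProfitCenterOutput == "Div3", 60)

and, if you only want one row per division, finish with | where isnotnull(Percent1UW).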

 

However, if you must use ProfitCenterOutput values as column names, as you illustrated, you need appendpipe, not append. But the only way I have worked out is terribly tortured AND extremely expensive.

| chart count by Results ProfitCenterOutput
| addcoltotals labelfield=Results label="Total Calls"
| appendpipe
    [foreach Div* ``` you need a common substring among ProfitCenters ```
        [eval <<FIELD>> = if(Results IN ("Exactly One Returned", "Total Calls"), <<FIELD>>, null())
        ]
    | stats list(*) as *
    | foreach Div*
        [eval <<FIELD>> = tonumber(mvindex(<<FIELD>>, 0)) / tonumber(mvindex(<<FIELD>>, 1)) * 100]
    | eval Results = "% 1UW"
    ]
| appendpipe
    [foreach *
        [where Results == "% 1UW"
        | eval <<FIELD>> = "90%"]
    | eval Results = "SLA"]
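For the SLA row specifically, a somewhat simpler sketch (assuming the chart columns are literally Div1, Div2, and Div3, and taking the per-division SLAs from your footnote) could be

| appendpipe
    [where Results == "% 1UW"
    | eval Div1 = 90, Div2 = 75, Div3 = 60
    | eval Results = "SLA"]

which copies the freshly appended % 1UW row and overwrites it with the hard-coded values.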

Personally I don't think this format is worth the complexity.


ITWhisperer
SplunkTrust

Try something like this

| appendpipe
    [| transpose 0 header_field=Results column_name=ProfitCenterOutput
    | eval "% 1UW" = floor(100*'1 UW'/'Total Calls')
    | eval SLA=case(ProfitCenterOutput="Div1",90,ProfitCenterOutput="Div2",75,ProfitCenterOutput="Div3",60)
    | fields ProfitCenterOutput "% 1UW" SLA
    | transpose 0 header_field=ProfitCenterOutput column_name=Results]
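This transposes the chart so each division becomes a row, calculates % 1UW and the hard-coded SLA per division, then transposes back so those two values are appended as extra rows under the original chart. Note that it assumes the row labels are exactly 1 UW and Total Calls, as in your example table; with the Results values from your search you would reference 'Exactly One Returned' instead of '1 UW'.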

yuanliu
SplunkTrust

The question is not exactly clear. Can you illustrate the Results values that you get from the above search? I cannot see how that is multivalued.

Meanwhile, based on your search, your JSON contains several multivalued parts. Generally speaking, you should make those subnodes single-valued first.

```
| spath input=JSON path=AssignmentSearchResults{}.FullName output=RsUWName
| spath input=JSON path=Headers{}.profitCenterCode{} output=RqProfitCenter
```
| spath input=JSON path=AssignmentSearchResults{}
| mvexpand AssignmentSearchResults{}
| spath input=AssignmentSearchResults{}
| rename FullName AS RsUWName
| spath input=JSON path=Headers{}
| mvexpand Headers{}
| spath input=Headers{} path=profitCenterCode{} output=RqProfitCenter
| mvexpand RqProfitCenter

After this, RqProfitCenter is still a JSON object. For ProfitCenterCode in the lookup to match RqProfitCenter as your sample code suggests, that field would need to be stored in JSON text format in the lookup. I'm also not sure this will work out too well. You are probably looking for something like this (purely speculative):

index="internallogs"
sourcetype="internal"
host="*hostadress*"
source="FunctionApp"
(/api/assignmentsearch OR AssignmentSearchResults)
| eval _raw=replace(_raw,"(\\\\r\\\\n\s+)","")
| eval _raw=replace(_raw,"\\\\(\")","\"")
| rename additionalProperties.Key as TransactionID
| rename additionalProperties.JSON as JSON
```
| spath input=JSON path=AssignmentSearchResults{}.FullName output=RsUWName
| spath input=JSON path=Headers{}.profitCenterCode{} output=RqProfitCenter
```
| spath input=JSON path=AssignmentSearchResults{}
| mvexpand AssignmentSearchResults{}
| spath input=AssignmentSearchResults{}
| rename FullName AS RsUWName
| spath input=JSON path=Headers{}
| mvexpand Headers{}
| spath input=Headers{} path=profitCenterCode{} output=RqProfitCenter
| mvexpand RqProfitCenter
| spath input=RqProfitCenter ``` speculate output will contain a field named "Code" ```
| lookup uwmanagement-divisions.csv ProfitCenterCode as Code Outputnew ProfitCenterDescription as ProfitCenterOutput
| transaction TransactionID
| eval Results=case(
isnull(RsUWName), "Zero Returned",
mvcount(RsUWName) = 1, "Exactly One Returned",
mvcount(RsUWName) > 1, "More than One Returned"
)

At this point, Results should be single-valued.  You can then tell us what the expected stats should look like.
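For example, a quick sanity check such as

| stats count by Results

should then return at most three rows (Zero Returned, Exactly One Returned, More than One Returned), and the percentage calculation can be built on top of that.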
