Splunk Search

How to extract cancelled transactions from logs?

syed
Observer

I'm looking at events and trying to determine which files are not deleted from a folder on a server after they have been uploaded. If a file is deleted, it means it has been successfully transferred. I'm able to use the transaction command to determine the duration of a successful file transfer; however, I'm not able to figure out which files are stuck in the folder, since the delete event did not occur for some files. Help would be appreciated.

This is what I have so far, but it needs fixing to determine which files are "stuck"... I think a join might be needed?

index=main* ("Found new file" OR "Deleted file") 
| rex field=_raw "Found new file .*\\\\(?P<files>.*)\"}"
| rex field=_raw "Deleted file (?P<files>.*)\"}"
| transaction user files keepevicted=t mvlist=true startswith="Found new file" endswith="Deleted file"
| table user files duration _raw
| sort _time desc
| where duration=0

richgalloway
SplunkTrust

The current transaction command will find only completed transactions, that is, those with both a "Found new file" and a "Deleted file" event. To find files that were not deleted, you want to locate "orphan" transactions - those with a matching startswith but no matching endswith. The keeporphans option should do it.

index=main* ("Found new file" OR "Deleted file") 
| rex field=_raw "Found new file .*\\\\(?P<files>.*)\"}"
| rex field=_raw "Deleted file (?P<files>.*)\"}"
| transaction user files keepevicted=t keeporphans=t mvlist=true startswith="Found new file" endswith="Deleted file"
| where _txn_orphan=1
| table user files duration _raw
| sort _time desc
| where duration=0

 

---
If this reply helps you, Karma would be appreciated.

syed
Observer

Unfortunately, I get no results when searching over 30 days. Would there be another way to tackle this search outside of the transaction command, perhaps?


richgalloway
SplunkTrust

Try removing the | where duration=0 line.
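
That is, the earlier search becomes:

index=main* ("Found new file" OR "Deleted file") 
| rex field=_raw "Found new file .*\\\\(?P<files>.*)\"}"
| rex field=_raw "Deleted file (?P<files>.*)\"}"
| transaction user files keepevicted=t keeporphans=t mvlist=true startswith="Found new file" endswith="Deleted file"
| where _txn_orphan=1
| table user files duration _raw
| sort _time desc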

---
If this reply helps you, Karma would be appreciated.

syed
Observer

So this is showing the files that failed, but it doesn't take into account that they may successfully transfer after the failure. How can we take the data from this and then run a subsearch to see if they upload after the first failure? Thanks


bowesmana
SplunkTrust

Using the transaction command over a long period is perhaps not the best solution for this, as memory constraints can make the results incorrect. You can probably use stats to achieve the same thing, e.g. this search

index=main* ("Found new file" OR "Deleted file") 
| rex field=_raw "Found new file .*\\\\(?P<files>.*)\"}"
| rex field=_raw "Deleted file (?P<files>.*)\"}"
| eval deleted=if(match(_raw, "Deleted file"), 1, 0)
| stats earliest(_time) as earliest latest(_time) as latest max(deleted) as deleted by user files
| where deleted=0

will provide a table of the earliest/latest times by 'user' and 'files'. It works by setting the 'deleted' field to 1 if the file was deleted and 0 if there is no deleted message, so max(deleted) is 0 only when no delete event was ever seen for that file. The where clause then keeps only the rows where the file was never deleted, i.e. the files that are stuck.

Using stats will certainly perform faster than transaction and be reliable, as it won't have memory constraints.

You can perform calculations after the stats to get the duration if needed, e.g. | eval duration=latest-earliest
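
To make the grouping logic concrete, here is a plain-Python sketch of what that stats search does, including the duration calculation. The sample events, user names, and file names below are invented for illustration; the regexes mirror the two rex commands from the thread.

```python
import re
from collections import defaultdict

# Invented sample events: (user, _time, _raw), mimicking the log format in the thread
events = [
    ("alice", 100, 'Found new file c:\\inbox\\report.csv"}'),
    ("alice", 160, 'Deleted file report.csv"}'),
    ("bob",   120, 'Found new file c:\\inbox\\orders.csv"}'),
]

# Same capture logic as the two rex commands: file name after the last backslash,
# or after "Deleted file ", up to the closing "}
found_re = re.compile(r'Found new file .*\\(?P<files>[^"]*)"\}')
deleted_re = re.compile(r'Deleted file (?P<files>[^"]*)"\}')

# stats earliest(_time) latest(_time) max(deleted) by user, files
rows = defaultdict(lambda: {"earliest": float("inf"), "latest": float("-inf"), "deleted": 0})
for user, t, raw in events:
    m = found_re.search(raw) or deleted_re.search(raw)
    if not m:
        continue
    row = rows[(user, m.group("files"))]
    row["earliest"] = min(row["earliest"], t)
    row["latest"] = max(row["latest"], t)
    if deleted_re.search(raw):
        row["deleted"] = 1  # max(deleted) ends up 1 if any delete event was seen

# where deleted=0 -> files still sitting in the folder
stuck = [key for key, row in rows.items() if row["deleted"] == 0]
# eval duration=latest-earliest, for the completed pairs
durations = {key: row["latest"] - row["earliest"] for key, row in rows.items() if row["deleted"] == 1}
```

Here `stuck` holds (user, file) pairs with no delete event, and `durations` the transfer time for the completed ones.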

 

richgalloway
SplunkTrust

How long might it be until the re-try is successful? You can try extending the time span of the transaction (the maxspan option), but that may not help if the re-try takes a long time. Also, longer transactions use more memory and make the search run longer.
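
As a sketch, adding maxspan to the earlier transaction search would look like this (the 4h window here is just an illustrative guess, not a recommendation):

| transaction user files keepevicted=t keeporphans=t maxspan=4h mvlist=true startswith="Found new file" endswith="Deleted file"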

---
If this reply helps you, Karma would be appreciated.