Getting Data In

How do I configure data inputs for .csv files with dynamic field headers so that each line becomes a new event?

fox
Path Finder

Running 4.2.1, we are monitoring many csv files whose field lists differ from one another. We have Splunk configured to dynamically read the header row for field names (props.conf: CHECK_FOR_HEADER = true), and this works brilliantly! However, the events are not splitting correctly - Splunk is indexing 256 rows into a single event, even though this is a .csv file with a clear newline separating each event...
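For reference, the configuration described above would look something like this in props.conf (the sourcetype name is illustrative):

```
[my_csv_sourcetype]
CHECK_FOR_HEADER = true
```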

Has anyone else done this successfully?

Any ideas?


gkanapathy
Splunk Employee

Most likely, Splunk is not detecting a timestamp in your rows. The default rule for Splunk is to merge lines together (SHOULD_LINEMERGE = true), but to split them whenever it detects a date (BREAK_ONLY_BEFORE_DATE = true). The easiest and best way to break on newlines is to simply set SHOULD_LINEMERGE = false, but if there are dates in your data and Splunk isn't finding them, you should also set TIME_FORMAT and TIME_PREFIX and maybe MAX_TIMESTAMP_LOOKAHEAD.
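Putting those settings together, a props.conf stanza might look like the sketch below. The sourcetype name and timestamp format are illustrative assumptions; the TIME_* settings are only needed if your rows contain timestamps that Splunk is not detecting on its own:

```
[my_csv_sourcetype]
CHECK_FOR_HEADER = true
# Break on every newline instead of merging lines into multi-line events
SHOULD_LINEMERGE = false
# If timestamps exist but go undetected, tell Splunk where and how to find them
# (example assumes each row starts with e.g. 2011-05-12 14:30:00)
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
# Limit how far past TIME_PREFIX Splunk scans for the timestamp
MAX_TIMESTAMP_LOOKAHEAD = 19
```

With SHOULD_LINEMERGE = false, line merging (and hence the 256-line cap on merged events) no longer applies, so each csv row indexes as its own event.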


Ayn
Legend

By default Splunk will merge lines in incoming logs and then break them up according to certain rules. This behavior is controlled by the SHOULD_LINEMERGE directive in props.conf (default is true). Setting SHOULD_LINEMERGE to false will tell Splunk not to combine several lines into a single event, which will give you the behavior you want.
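A minimal sketch of that change, assuming your csv input already has a sourcetype stanza in props.conf (stanza name is illustrative):

```
[my_csv_sourcetype]
# Treat each incoming line as its own event; do not merge lines
SHOULD_LINEMERGE = false
```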
