Getting Data In

Extracting data using a script: getting "none" in the timestamp field

anilkapoor123
Explorer

Hi Team,

I need your help. While I am ingesting data using a Python script (i.e. a scripted input), I am getting a "none" value for the timestamp field. In the script itself the data is populated fine, but when it is ingested into Splunk, the timestamp field gets an extra "none" value.

 

Any help is appreciated.

0 Karma
1 Solution

isoutamo
SplunkTrust

You should always define your own sourcetype instead of using _json. So change your inputs.conf like this:

[script://./bin/networker_alerts.py]
disabled = 0
index = test
interval = 4-59/5 * * * *
source = script:networker_alerts.py
sourcetype = json:with:timestamp

Then you should define your props.conf for that sourcetype on that HF. Please create your own app for it:

[json:with:timestamp]
SHOULD_LINEMERGE=true
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
CHARSET=UTF-8
INDEXED_EXTRACTIONS=json
KV_MODE=none
category=Structured
description=Your own JSON definition for networker_alerts.py script
disabled=false
pulldown_type=true
TIME_FORMAT=%Y-%m-%dT%H:%M:%S%:z
TIMESTAMP_FIELDS=timestamp
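
As a point of reference, that TIME_FORMAT expects an ISO 8601 value with a colon in the UTC offset, which matches the sample event shared later in this thread:

"timestamp": "2023-07-03T08:51:25+02:00"
              %Y-%m-%dT%H:%M:%S%:z

If the script emitted the offset without the colon (e.g. +0200), %z would be the directive to use instead of %:z.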

Then you must restart that HF so that it reads that props.conf.
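
For example, from the CLI on the HF (assuming a standard install under $SPLUNK_HOME):

$SPLUNK_HOME/bin/splunk restart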


anilkapoor123
Explorer

Any resolution would be appreciated here.

You mentioned one more solution, if indexed_extractions is set to none: you suggested converting timestamp to _time.

Can you tell me how I can do that, and where in props.conf: on the HF or on the Search Head?

Thanks

0 Karma

isoutamo
SplunkTrust

Nothing more, basically. When you have that earlier props.conf on the HF and you don't have it on the SH, it should work. If not, then there is probably something else that hasn't come to our minds yet. We would have to see the situation with our own hands to find out what the real issue in your system is.

On the HF side, just install that networker_inputs app. It should be enough.

TIME_FORMAT=%Y-%m-%dT%H:%M:%S%:z
TIMESTAMP_FIELDS=timestamp
DATETIME_CONFIG =

Check that you have the above in your TA/app on the HF. That should take care of the correct _time field.

On the SH side, just create an app/TA where you set KV_MODE=none for that sourcetype, without the INDEXED_EXTRACTIONS change.
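
A minimal sketch of that SH-side props.conf, reusing the sourcetype name from the earlier post:

[json:with:timestamp]
KV_MODE = none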

 

0 Karma

anilkapoor123
Explorer

@isoutamo 

After changing the above configurations in props.conf on the HF, I am still getting "none" for the timestamp field:

TIME_FORMAT=%Y-%m-%dT%H:%M:%S%:z
TIMESTAMP_FIELDS=timestamp
DATETIME_CONFIG =

Any help here?

0 Karma

anilkapoor123
Explorer

@isoutamo 

Any update?

0 Karma

anilkapoor123
Explorer

There is only a single event showing in Splunk Web, but when I check the timestamp field with

index=xx | table timestamp

it is showing multiple values in a single field.

Both parameters are set as per the props.conf values you gave; these are the settings:

indexed_extractions = json

kv_mode = none
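
One way to confirm whether timestamp is really coming through as a multivalue field is a quick check like this (a sketch, reusing the index from the search above):

index=xx | eval timestamp_count=mvcount(timestamp) | table _time timestamp timestamp_count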

0 Karma

isoutamo
SplunkTrust

Is this a separate field from _time? And are there any other fields duplicated, or only this one?

0 Karma

anilkapoor123
Explorer

All fields that come in the scripted input output are duplicated, like below:

category

message

priority

timestamp

Script output:

{"category": "disk space", "message": "'xxx' host '/nsr' disk path occupied with '92.42%' of disk space. Free up the space.", "priority": "warning", "timestamp": "2023-07-03T08:51:25+02:00"}

timestamp is a different field than _time; it comes in the output as shown above.

0 Karma

anilkapoor123
Explorer

@isoutamo , can you help me understand what the issue was?

One doubt: if I put props.conf in /opt/splunk/etc/system/local, it will be used by all data inputs, right?

Here I have kept it only inside the /etc/apps/<app_name>/local folder.

 

0 Karma

isoutamo
SplunkTrust

It depends on how the app's configurations have been exported. Usually, for the indexing phase, you should export those globally. Then it doesn't matter where you have put them, except for the precedence, which is defined by app names.

If/when your job is to do data onboarding to Splunk, I suggest you take the Splunk Data Administration course. It explains this well.
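
One way to see which copy of props.conf actually wins for a given sourcetype is btool (a sketch, using the sourcetype name from earlier in this thread):

$SPLUNK_HOME/bin/splunk btool props list json:with:timestamp --debug

The --debug flag shows which file each setting comes from, so the app-name precedence becomes visible.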

0 Karma

anilkapoor123
Explorer

Thank you isoutamo, this solution solved my problem.

0 Karma