I am using a universal forwarder.
I wish to tag my logs with the application name and some custom information like groupA or groupB. So I wish to have multiple tags on my events, namely applog1 and groupA.
I understand we can do it via _meta in inputs.conf, but that does not seem to work. In fact, once I add a tag, none of the events are shown in search.
Is there any other way to do that? Any pointers will help.
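What I tried in inputs.conf looks roughly like this (the monitor path and field names here are placeholders, not my real config):

```ini
# inputs.conf on the universal forwarder (hypothetical monitor path)
[monitor:///var/log/applog1/app.log]
sourcetype = applog1
# _meta adds indexed key::value pairs to every event from this input
_meta = application::applog1 group::groupA
```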
Thanks and Regards,
Abhay Dandekar
You can do it at the app level in tags.conf.
You can create one tag from the UI (so you see where it ends up) and then add more to that tags.conf.
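For example, a tags.conf stanza might look like the following (the event type and tag names are placeholders for whatever you define):

```ini
# tags.conf -- applied at search time
[eventtype=my_app_events]
applog1 = enabled
groupA = enabled
```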
Is tags.conf a config that can be applied on the universal forwarder, or is it an indexer-only feature?
Where would it have to be placed in the forwarder directory to make it effective?
I have tried:
/opt/splunkforwarder/etc/apps/SplunkUniversalForwarder/default/tags.conf
with the following config:
[host=myhost.mydomain.net]
nonproduction=enabled
testsystem=enabled
Tags are applied at search time, so the easiest way to do this is to build event types for each of your scenarios and then attach a tag to each event type.
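A minimal sketch of that pattern, with hypothetical event type, search, and tag names:

```ini
# eventtypes.conf -- define an event type per scenario
[groupA_apps]
search = sourcetype=applog1 host=serverA*

# tags.conf -- attach tags to that event type
[eventtype=groupA_apps]
groupA = enabled
nonproduction = enabled
```

Both files live on the search side (search head or indexer), not on the universal forwarder, since tagging happens at search time.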
There is a great splunk video here: http://www.splunk.com/view/SP-CAAAGYJ
This video is excellent - one of my favorites ;-)
Thanks, the video was great.
But is there any automated way to add institutional data to Splunk? I was looking at forwarders for that, but somehow they are not working for adding tags to the existing fields.
Thanks and Regards,
Abhay Dandekar
What do you mean by "institutional data"?
If you mean common sourcetypes like syslog, Windows events, LDAP, network events, firewall logs, IDS logs, antivirus logs, proxy logs, and so on, then take a look at Splunkbase, where you will find apps for all sorts of data. These applications commonly include automatic extractions and tags that identify and classify those data types and map them to a common information model.
If you mean something which is a little more bespoke to your organisation, you may need to perform some of the extractions and apply tags yourself.
Thanks for a quick response.
By institutional data, I mean data that cannot be derived directly from the events and needs to be provided by the user.
I am planning to add such information as tags at the forwarding level itself.
Does that provide enough information?
Thanks and Regards,
Abhay Dandekar
Splunk is principally (but not exclusively) used for collecting data from log files. In these cases one would normally deploy a universal forwarder to read the logs on the target system and send the content back to the Splunk indexer. This assumes your data is in log files of some type.
If your data is to be collected from an API, a TCP socket, or a scripted process you have built, you will probably want to consider a heavy forwarder, which has the capabilities of the UF as well as the means to run shell scripts or hooks via Python.
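As a sketch, a scripted input on a heavy forwarder could look like this (the script name, interval, sourcetype, and index are placeholders):

```ini
# inputs.conf -- run a custom collection script every 60 seconds
[script://./bin/collect_custom_data.sh]
interval = 60
sourcetype = custom:institutional
index = main
```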
I must admit that from your description I am not clear on your use case, but "tagging" has a specific meaning in the context of Splunk. I hope I am not doing you a disservice by suggesting you are referring to something else 🙂
Are you able to provide some sample data, or maybe describe your use case in more detail?