I have my Spark logs in Splunk.
I have two Spark streaming jobs running. Each produces logs at different levels (INFO, WARN, ERROR, etc.).
I want to create a dashboard for the error count by hour, or any better way (suggestions welcome).
index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | stats count as total_logs count(eval(level="ERROR")) as total_errors
Please also advise if you have any better suggestions for a useful dashboard.
index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | timechart count as total_logs count(eval(level="ERROR")) as total_errors span=1h
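To turn that search into a dashboard panel, one option is a Simple XML dashboard that embeds the query. This is a minimal sketch (the dashboard label, panel title, and time range are placeholders you would adjust):

```xml
<dashboard>
  <label>Spark Streaming Logs</label>
  <row>
    <panel>
      <title>Logs and errors per hour</title>
      <chart>
        <search>
          <!-- Same timechart search as above -->
          <query>index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | timechart count as total_logs count(eval(level="ERROR")) as total_errors span=1h</query>
          <earliest>-24h</earliest>
          <latest>now</latest>
        </search>
        <!-- Column chart works well for hourly counts -->
        <option name="charting.chart">column</option>
      </chart>
    </panel>
  </row>
</dashboard>
```

You can also build the same panel from the UI: run the search, then use "Save As > Dashboard Panel".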
Thank you a lot.
1) As I have two source types (one per application), do I need to make separate graphs? Or is there a way to identify which job each ERROR/INFO/WARN log belongs to?
2) Also, what if I need to add text matches like "error", "failed", or "exceptions" to the ERROR bucket in the above example?
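For question 1, a single chart can show both jobs if you split the timechart by `sourcetype` (assuming the `level` field is extracted, as in the searches above):

```spl
index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) level=ERROR
| timechart span=1h count by sourcetype
```

This produces one series per source type, so errors from sparkjob1 and sparkjob2 appear as separate lines/columns on the same graph.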
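For question 2, one way is to count an event as an error if either the `level` field is ERROR or the raw text matches keywords. A sketch using `match()` with a case-insensitive regex (the keyword list is illustrative, adjust to your logs):

```spl
index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2)
| eval is_error=if(level="ERROR" OR match(_raw, "(?i)error|failed|exception"), 1, 0)
| timechart span=1h count as total_logs sum(is_error) as total_errors
```

Note this will also count INFO/WARN lines that merely mention the word "error", so you may want to tighten the regex.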