Hi @Ledion Bitincka,
I'm also getting a similar error. I'm using the trial version of Splunk Analytics for Hadoop.
[myhadoopprovider2] Error while running external process, return_code=255. See search.log for more info
[myhadoopprovider2] Exception - com.splunk.mr.JobStartException: Failed to start MapReduce job. Please consult search.log for more information. Message: [ Failed to start MapReduce job, name=SPLK_ABC.com_1481188820.161_0 ] and [ Permission denied: user=splunkd1, access=WRITE, inode="/tmp/hadoop-yarn/staging/splunkd1/.staging":hdpuser:supergroup:drwxr-xr-x
hdpuser --> hadoop cluster
splunkd1 --> splunk search head
I ran this command and I'm still seeing the above error:
hadoop fs -chown splunkd1 hdfs://nnhost:port/path/to/working/dir
hadoop fs -ls /data/input/splunk/linux/
16/12/08 04:07:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 3 items
-rwxrwxrwx 1 splunkd1 supergroup 15429 2016-12-08 02:25 /data/input/splunk/linux/LICENSE.txt
-rwxrwxrwx 1 splunkd1 supergroup 101 2016-12-08 02:25 /data/input/splunk/linux/NOTICE.txt
-rwxrwxrwx 1 splunkd1 supergroup 1366 2016-12-08 02:25 /data/input/splunk/linux/README.txt
In fact, I have also granted 777 permissions on the entire HDFS directory.
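For reference, that was done along these lines (a sketch; the exact path used may differ):

```shell
# Recursively grant rwx to owner, group, and others on the data directory
# (sketch; assumes the /data/input/splunk/linux path listed above).
hadoop fs -chmod -R 777 /data/input/splunk/linux
```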
Kindly help
The error message contains the directory Splunk is failing to write to, along with its permissions. A simple fix here would be to have Splunk run as user hdpuser instead of splunkd1.
Permission denied: user=splunkd1, access=WRITE, inode="/tmp/hadoop-yarn/staging/splunkd1/.staging":hdpuser:supergroup:drwxr-xr-x
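If switching the search head to run as hdpuser isn't practical, one alternative sketch (an assumption, not a confirmed fix, and only applicable with simple, non-Kerberos Hadoop authentication) is to have the Hadoop client submit jobs as hdpuser by setting the HADOOP_USER_NAME environment variable in the provider stanza:

```
[provider:myhadoopprovider2]
# Sketch, assuming simple (non-Kerberos) authentication: the Hadoop client
# submits jobs as whatever user HADOOP_USER_NAME names.
vix.env.HADOOP_USER_NAME = hdpuser
```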
Hi @Ledion Bitincka ,
In a previous post (link below) you mentioned executing the following command as hdpuser (i.e., changing the ownership of that directory to splunkd1).
I did that:
hadoop fs -chown splunkd1 hdfs://data/input/splunk/linux/
and it didn't seem to fix the issue.
Kindly guide
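One thing worth checking (a sketch based on the path in the error, not a confirmed fix): the chown above targets the data directory, while the WRITE denial is on the YARN staging directory, so ownership may need to change there instead:

```shell
# Run as hdpuser (or an HDFS superuser), since it currently owns the path.
# The path comes from the Permission denied message earlier in the thread.
hadoop fs -chown -R splunkd1 /tmp/hadoop-yarn/staging/splunkd1
hadoop fs -ls /tmp/hadoop-yarn/staging   # verify the new owner
```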
Can you share your Splunk configurations?
What are your values for the NameNode and the YARN ResourceManager?
Hi @rdagan,
Here are the configs:
[provider:myhadoopprovider2]
vix.command.arg.3 = $SPLUNK_HOME/bin/jars/SplunkMR-hy2.jar
vix.env.HADOOP_HOME = /home/splunkd1/hadoop-2.7.2
vix.env.JAVA_HOME = /opt/common/jdk1.7.0_21
vix.family = hadoop
vix.fs.default.name = hdfs://XXX:9000
vix.mapreduce.framework.name = yarn
vix.output.buckets.max.network.bandwidth = 0
vix.splunk.home.hdfs = /user/home/working/
vix.yarn.resourcemanager.address = XXX:8032
vix.yarn.resourcemanager.scheduler.address = XXX:8030
[vihadooptest2]
vix.input.1.path = /data/input/splunk/linux/...
vix.provider = myhadoopprovider2
Could you please help? I'm still facing this issue.
I moved this question from a comment on a two-year-old thread.
Oh OK, no problem, as long as I get some help 🙂
You're more likely to get help from a new question.