Hadoop MapReduce failing with CreateSymbolicLink

2016-11-18T15:46:39

Team,

Please be informed that I am running the Hadoop MapReduce examples (version 2.7.1). They are failing with the error below.

Exit code: 1
Exception message: CreateSymbolicLink error (1314): A required privilege is not held by the client.
Stack trace: ExitCodeException exitCode=1: CreateSymbolicLink error (1314): A required privilege is not held by the client.
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
        at org.apache.hadoop.util.Shell.run(Shell.java:456)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Shell output:         1 file(s) moved.
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.

I have searched on Google, and it seems to be a permission issue.
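To sanity-check that on my side, I assume a quick standalone test like the one below (plain java.nio, nothing Hadoop-specific; SymlinkCheck is just a name I made up) would hit the same 1314 error if my account lacks the "Create symbolic links" privilege:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    // Hypothetical class name; just a standalone privilege check.
    public class SymlinkCheck {
        public static void main(String[] args) throws IOException {
            Path target = Files.createTempFile("symlink-target", ".txt");
            Path link = target.resolveSibling("symlink-test-link");
            try {
                // On Windows this call needs the "Create symbolic links" privilege
                // (SeCreateSymbolicLinkPrivilege); without it, Java reports the same
                // 1314 "A required privilege is not held by the client" error that
                // winutils hits when the NodeManager launches a container.
                Files.createSymbolicLink(link, target);
                System.out.println("Symlink created: the privilege is available.");
            } catch (IOException e) {
                System.out.println("Symlink failed: " + e.getMessage());
            } finally {
                Files.deleteIfExists(link);
                Files.deleteIfExists(target);
            }
        }
    }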

My assumption is that MapReduce is accessing some restricted path/location. Is there any way I can change the location it accesses?

I ask because I don't have admin permission.
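For example, if changing the location means pointing YARN somewhere else, I assume the relevant setting is yarn.nodemanager.local-dirs (the standard property for the NodeManager's container working directories, which is where the symbolic links get created). The sketch below only prints the current values from my configuration; I have not verified that moving those directories avoids the symlink step:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.yarn.conf.YarnConfiguration;

    // Hypothetical class name; only prints the directories the NodeManager
    // would use, it does not change anything.
    public class ShowLocalDirs {
        public static void main(String[] args) {
            // YarnConfiguration loads core-site.xml and yarn-site.xml from the
            // classpath (HADOOP_CONF_DIR).
            Configuration conf = new YarnConfiguration();
            // Directories where containers are localized and where the failing
            // symbolic links are created.
            System.out.println("yarn.nodemanager.local-dirs = "
                    + conf.get(YarnConfiguration.NM_LOCAL_DIRS));
            System.out.println("hadoop.tmp.dir = " + conf.get("hadoop.tmp.dir"));
        }
    }

If pointing yarn.nodemanager.local-dirs at a folder I own (in yarn-site.xml, then restarting the NodeManager) is the right way to change the location, please confirm; otherwise I suspect the real issue is the missing symlink privilege rather than the path itself.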

Note: I am running on Windows.

Your suggestions are highly appreciated.

Copyright License:
Author: VinayS. Reproduced under the CC 4.0 BY-SA copyright license with link to original source & disclaimer.
Link: https://stackoverflow.com/questions/40671822/hadoop-mapreduce-failing-with-createsymboliclink

