Hadoop HADOOP_CLASSPATH issues

2012-10-18T01:46:54

This question is not about distributing jars across the whole cluster for the workers to use.

It refers to specifying a number of additional libraries on the client machine. To be more specific: I'm trying to run the following command in order to retrieve the contents of a SequenceFile:

   /path/to/hadoop/script fs -text /path/in/HDFS/to/my/file

It throws this error:

   text: java.io.IOException: WritableName can't load class: util.io.DoubleArrayWritable

I have a Writable class called DoubleArrayWritable. In fact, on another computer everything works fine.
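
(For what it's worth, it's easy to confirm that the class really is packaged in the jar; the jar name below is just a placeholder:)

   jar tf my-writables.jar | grep DoubleArrayWritable
   # should list something like util/io/DoubleArrayWritable.class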

I tried setting HADOOP_CLASSPATH to include the jar containing that class, but with no result. In fact, when I run:

   /path/to/hadoop/script classpath 

the output doesn't include the jar I added to HADOOP_CLASSPATH.
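
For completeness, this is roughly the sequence I'm running (the jar path below is only a placeholder):

   # export the extra jar in the same shell, then invoke the hadoop script
   export HADOOP_CLASSPATH=/home/me/libs/my-writables.jar
   /path/to/hadoop/script classpath | tr ':' '\n' | grep my-writables    # prints nothing
   /path/to/hadoop/script fs -text /path/in/HDFS/to/my/file              # still fails with the IOException above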

The question is: how do you specify extra libraries when running hadoop ("extra" meaning libraries other than the ones the hadoop script automatically includes in the classpath)?

Some more info which might help:

  • I can't modify the hadoop.sh script (nor any associated scripts)
  • I can't copy my library to the /lib directory under the hadoop installation directory
  • In hadoop-env.sh, which is sourced from hadoop.sh, there is this line: export HADOOP_CLASSPATH=$HADOOP_HOME/lib, which overwrites whatever value I export beforehand and probably explains why my HADOOP_CLASSPATH env var is ignored (see the sketch after this list).
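
If my reading of the scripts is right (a sketch below; I may be off on the exact stock contents of hadoop-env.sh), the difference between overwriting and appending is the whole problem:

   # what hadoop-env.sh does now: assigns a fixed value, discarding whatever I exported beforehand
   export HADOOP_CLASSPATH=$HADOOP_HOME/lib

   # what would preserve my value (but I'm not allowed to edit the script): append instead of overwrite
   # export HADOOP_CLASSPATH=$HADOOP_HOME/lib:$HADOOP_CLASSPATH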

Copyright License:
Author: Razvan. Reproduced under the CC BY-SA 4.0 license, with a link to the original source and disclaimer.
Link to: https://stackoverflow.com/questions/12940239/hadoop-hadoop-classpath-issues
