HDP: How to change HADOOP_CLASSPATH value

2016-07-07T15:15:51

I need to add a value to the HADOOP_CLASSPATH environment variable, according to this troubleshooting article: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_installing_manually_book/content/troubleshooting-phoenix.html

When I type echo $HADOOP_CLASSPATH in the console, I get an empty result back. I think I need to set this value in a config .xml file...

Where or how can I set this new value for the environment variable?

Can I set it in spark-submit?
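
For context, this is roughly what I had in mind; the phoenix-client jar path is only a guess based on my HDP install, and com.example.MyApp / my-app.jar are placeholders for my Spark job, so please treat it as a sketch rather than something I know to be correct:

    # Option 1: export the variable in the shell (or in /etc/hadoop/conf/hadoop-env.sh)
    # before launching the job; the jar path below is a placeholder for my install
    export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/phoenix-client/phoenix-client.jar
    echo $HADOOP_CLASSPATH    # should no longer print an empty line

    # Option 2: hand the jar to spark-submit directly instead of going through HADOOP_CLASSPATH
    spark-submit \
      --class com.example.MyApp \
      --driver-class-path /usr/hdp/current/phoenix-client/phoenix-client.jar \
      --conf spark.executor.extraClassPath=/usr/hdp/current/phoenix-client/phoenix-client.jar \
      my-app.jar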

Copyright License:
Author: D. Müller. Reproduced under the CC 4.0 BY-SA copyright license with link to original source & disclaimer.
Link to: https://stackoverflow.com/questions/38239577/hdp-how-to-change-hadoop-classpath-value
