Hue supervisor startup error - $HADOOP_CLASSPATH

2014-03-05T15:38:17

I have installed Hadoop and all of its services are working well. Hue is likewise installed as per the instructions and configured properly (and the same goes for Hive). I have checked everything many times and it all looks good, but when I try to start Hue (the supervisor) I get the error below. Please suggest what I should do to solve this.

root@slave3:/usr/local/master/hue/build/env# bin/supervisor
$HADOOP_HOME=/usr/local/master/hadoop
$HIVE_CONF_DIR=/etc/hive/conf
$HIVE_HOME=/usr/lib/hive
find: `/usr/lib/hive/lib': No such file or directory
$HADOOP_CLASSPATH=
$HADOOP_OPTS=-Dlog4j.configuration=log4j.properties
$HADOOP_CONF_DIR=/etc/hive/conf:/usr/local/master/hue/apps/beeswax/src/beeswax/../../../../desktop/conf:/usr/local/master/hadoop/conf
CWD=/usr/local/master/hue/build/env
Executing /usr/local/master/hadoop/bin/hadoop jar /usr/local/master/hue/apps/beeswax/src/beeswax/../../java-lib/BeeswaxServer.jar --beeswax 8002 --desktop-host 127.0.0.1 --desktop-port 8802 --query-lifetime 604800000 --metastore 8003
(30362) *** Controller starting at Tue Mar  4 23:46:57 2014
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
 at java.lang.Class.forName0(Native Method)
 at java.lang.Class.forName(Class.java:266)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:190)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
 at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
 ... 3 more
Should start 1 new children
Controller.spawn_children(number=1)
$HADOOP_HOME=/usr/local/master/hadoop
$HIVE_CONF_DIR=/etc/hive/conf
$HIVE_HOME=/usr/lib/hive
find: `/usr/lib/hive/lib': No such file or directory
$HADOOP_CLASSPATH=
$HADOOP_OPTS=-Dlog4j.configuration=log4j.properties
$HADOOP_CONF_DIR=/etc/hive/conf:/usr/local/master/hue/apps/beeswax/src/beeswax/../../../../desktop/conf:/usr/local/master/hadoop/conf
CWD=/usr/local/master/hue/build/env
Executing /usr/local/master/hadoop/bin/hadoop jar /usr/local/master/hue/apps/beeswax/src/beeswax/../../java-lib/BeeswaxServer.jar --beeswax 8002 --desktop-host 127.0.0.1 --desktop-port 8802 --query-lifetime 604800000 --metastore 8003
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
 at java.lang.Class.forName0(Native Method)
 at java.lang.Class.forName(Class.java:266)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:190)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
 at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
 ... 3 more
$HADOOP_HOME=/usr/local/master/hadoop
$HIVE_CONF_DIR=/etc/hive/conf
$HIVE_HOME=/usr/lib/hive
find: `/usr/lib/hive/lib': No such file or directory
$HADOOP_CLASSPATH=
$HADOOP_OPTS=-Dlog4j.configuration=log4j.properties
$HADOOP_CONF_DIR=/etc/hive/conf:/usr/local/master/hue/apps/beeswax/src/beeswax/../../../../desktop/conf:/usr/local/master/hadoop/conf
CWD=/usr/local/master/hue/build/env
Executing /usr/local/master/hadoop/bin/hadoop jar /usr/local/master/hue/apps/beeswax/src/beeswax/../../java-lib/BeeswaxServer.jar --beeswax 8002 --desktop-host 127.0.0.1 --desktop-port 8802 --query-lifetime 604800000 --metastore 8003
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
 at java.lang.Class.forName0(Native Method)
 at java.lang.Class.forName(Class.java:266)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:190)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
 at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
 ... 3 more
$HADOOP_HOME=/usr/local/master/hadoop
$HIVE_CONF_DIR=/etc/hive/conf
$HIVE_HOME=/usr/lib/hive
find: `/usr/lib/hive/lib': No such file or directory
$HADOOP_CLASSPATH=
$HADOOP_OPTS=-Dlog4j.configuration=log4j.properties
$HADOOP_CONF_DIR=/etc/hive/conf:/usr/local/master/hue/apps/beeswax/src/beeswax/../../../../desktop/conf:/usr/local/master/hadoop/conf
CWD=/usr/local/master/hue/build/env
Executing /usr/local/master/hadoop/bin/hadoop jar /usr/local/master/hue/apps/beeswax/src/beeswax/../../java-lib/BeeswaxServer.jar --beeswax 8002 --desktop-host 127.0.0.1 --desktop-port 8802 --query-lifetime 604800000 --metastore 8003
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
 at java.lang.Class.forName0(Native Method)
 at java.lang.Class.forName(Class.java:266)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:190)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
 at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
 ... 3 more
(30417) *** Child exiting
(30362) *** Controller exiting
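
For context, the failing part appears to be the classpath setup: $HIVE_HOME defaults to /usr/lib/hive, the launcher's find over /usr/lib/hive/lib fails because that directory does not exist on this machine, so $HADOOP_CLASSPATH stays empty and BeeswaxServer.jar cannot load org.apache.hadoop.hive.conf.HiveConf. Below is a rough sketch of what I am checking; /usr/local/master/hive is only a guess at where Hive might actually be unpacked, and I am assuming the launcher honours HIVE_HOME from the environment (if it does not, the hive_home_dir / hive_conf_dir settings under [beeswax] in hue.ini would presumably be the place to change it):

# Does the directory the launcher searches actually exist, and does it hold jars?
ls /usr/lib/hive/lib/*.jar

# If Hive really lives somewhere else (the path below is a guess), point
# HIVE_HOME at the real install and pre-build HADOOP_CLASSPATH from its jars
# before starting the supervisor again:
export HIVE_HOME=/usr/local/master/hive
export HIVE_CONF_DIR=/etc/hive/conf
export HADOOP_CLASSPATH=$(find "$HIVE_HOME/lib" -name '*.jar' | tr '\n' ':')
bin/supervisor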

I have also installed Python and MySQL, and in MySQL I created a database named hadoop. I did everything exactly as in my previous Hue installation, which works fine, yet this installation keeps giving problems.
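
For reference, the MySQL side was created roughly like this (the hue user and password below are placeholders, not the real credentials):

# Create the backend database for Hue and a user that can reach it:
mysql -u root -p <<'SQL'
CREATE DATABASE hadoop DEFAULT CHARACTER SET utf8;
GRANT ALL PRIVILEGES ON hadoop.* TO 'hue'@'localhost' IDENTIFIED BY 'huepassword';
FLUSH PRIVILEGES;
SQL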

Copyright License:
Author: prabu. Reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and disclaimer.
Link: https://stackoverflow.com/questions/22191475/hue-supervisor-startup-error-hadoop-classpath
