`hadoop dfs` command java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FsShell

2016-04-10T03:36:52

I am trying to run the hadoop dfs command on Cygwin with Hadoop 2.6.3.

I am running the following command

    /cygdrive/c/hadoop-2.6.4/bin/hadoop dfs -put word1 words/

which eventually throws an error:

java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FsShell
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FsShell
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
Exception in thread "main"
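
FsShell ships in the hadoop-common jar under share/hadoop/common, so the class is evidently not making it onto the classpath the launcher builds. A minimal way to check that (just a sketch, assuming the standard 2.6.x binary layout) is:

    # Print the classpath the hadoop launcher actually assembles
    /cygdrive/c/hadoop-2.6.4/bin/hadoop classpath

    # Check that the jar containing FsShell is present in the install
    ls /cygdrive/c/hadoop-2.6.4/share/hadoop/common/hadoop-common-*.jar

    # And that the class really is inside it (jar is part of the JDK)
    jar tf /cygdrive/c/hadoop-2.6.4/share/hadoop/common/hadoop-common-2.6.4.jar | grep FsShell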

I have set up the paths correctly:

  $ echo $JAVA_HOME
C:\Program Files\Java\jdk1.6.0_31

and

$ echo $HADOOP_HOME
/cygdrive/c/hadoop-2.6.4/
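
One thing worth ruling out is the space in JAVA_HOME (C:\Program Files\...), which the Hadoop launcher scripts on Cygwin often fail to quote properly. A rough sketch of a workaround, assuming C:\Progra~1 is the 8.3 short name for Program Files on this machine:

    # The 8.3 short path avoids the space; Progra~1 is an assumption,
    # "dir /x" in cmd.exe shows the actual short names
    export JAVA_HOME='C:\Progra~1\Java\jdk1.6.0_31'

    # HADOOP_HOME without the trailing slash, and its bin dir on PATH
    export HADOOP_HOME=/cygdrive/c/hadoop-2.6.4
    export PATH="$HADOOP_HOME/bin:$PATH"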

Can anyone help me here?

I also tried running the hadoop-env.sh file from $HADOOP_HOME/etc/hadoop, but in vain.
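
Note that executing hadoop-env.sh as a program only sets its variables in a throwaway subshell; for its exports to affect the current shell the file has to be sourced, roughly like this:

    # Sourcing applies the exports to the current shell;
    # running the script directly does not
    . /cygdrive/c/hadoop-2.6.4/etc/hadoop/hadoop-env.sh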

Copyright License:
Author: veer7. Reproduced under the CC BY-SA 4.0 license with a link to the original source and disclaimer.
Link to: https://stackoverflow.com/questions/36521733/hadoop-dfs-command-java-lang-noclassdeffounderror-org-apache-hadoop-fs-fsshel

