Single-node Hadoop - how to copy a local filesystem file to the Hadoop file system

2013-10-27T13:55:32

I have installed Hadoop. Now I am trying to copy a file from the local file system to the Hadoop file system using the command below.

hadoop fs -copyFromLocal /mnt/PRO/wvdial.conf hdfs://virus/mydata/hdfs/datanode/wvdial.conf

but I am getting the error below.

copyFromLocal: Call From Virus/127.0.0.1 to Virus:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
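Before changing any configuration, it can help to confirm whether the NameNode process is actually running and listening on port 8020. A diagnostic sketch (assuming a standard single-node install; the exact output will differ on each machine):

```shell
# List the running Hadoop JVMs; a healthy single-node setup should show NameNode
jps

# Check whether anything is listening on the NameNode RPC port (8020)
netstat -tlnp | grep 8020

# Ask HDFS itself for a status report (fails fast if the NameNode is unreachable)
hdfs dfsadmin -report
```

If `jps` shows no NameNode, or nothing is listening on 8020, the "Connection refused" is coming from the daemon not being up (or binding a different address) rather than from the copy command itself.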

I went through the http://wiki.apache.org/hadoop/ConnectionRefused documentation and found this note: "Check that there isn't an entry for your hostname mapped to 127.0.0.1 or 127.0.1.1 in /etc/hosts (Ubuntu is notorious for this)".

Yes, my host "Virus" is mapped to 127.0.0.1 in /etc/hosts. I have installed Hadoop on a single node, so I have to map my host to 127.0.0.1. What change should I make to my configuration so that I can copy a local filesystem file to HDFS?
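For reference, on a single-node install the client resolves the NameNode address from fs.defaultFS in core-site.xml, and that URI must agree with the /etc/hosts mapping. A minimal sketch, assuming Hadoop 2.x property names and the default RPC port 8020 (not taken from my actual configuration):

```xml
<!-- core-site.xml: a sketch, assuming Hadoop 2.x.
     The host in fs.defaultFS must resolve to the address
     the NameNode actually binds to. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>
```

With fs.defaultFS set, the destination can be given as a plain HDFS path (for example `hadoop fs -copyFromLocal /mnt/PRO/wvdial.conf /wvdial.conf`) instead of spelling out the full `hdfs://virus/...` URI.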

Please find the Hadoop configuration in my installation documentation: http://omsopensource.blogspot.in/search/label/Hadoop. I am using Fedora 19.

Copyright License:
Author: Gaurav Pant. Reproduced under the CC 4.0 BY-SA copyright license with link to original source & disclaimer.
Link to:https://stackoverflow.com/questions/19615115/single-node-hadoop-how-to-copy-local-filesystem-file-to-hadoop-file-system
