Is it possible to run Hadoop and copy a file from the local FS to HDFS in Java, WITHOUT installing Hadoop on the file system?

2015-04-24T14:47:12

I have NOT installed Hadoop on my Linux file system. I would like to run Hadoop and copy a file from the local file system to HDFS WITHOUT installing Hadoop on my Linux file system. I have written some sample code, but it fails with "Wrong FS, expected file:///". Any help with this?

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;

import java.net.URI;

/**
 * Created by Ashish on 23/4/15.
 */
public class SampleHadoop {

    public static void main(String[] args) {
        try {
            Configuration configuration = new Configuration();
            // Connect to the remote NameNode by URI.
            FileSystem fs = FileSystem.get(new URI("hdfs://192.168.1.170:54310/"), configuration);
            // Copy a file from the local file system into HDFS.
            fs.copyFromLocalFile(new Path("./part-m-00000"),
                    new Path("hdfs://192.168.1.170:54310/user/hduser/samplefile"));
            fs.close();
        } catch (Exception ex) {
            System.out.println("Exception " + ex.toString());
        }
    }
}
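One likely cause of the "Wrong FS" error (an assumption, since the question does not include the stack trace): with no core-site.xml on the classpath, fs.default.name falls back to file:///, so the default FileSystem is the local one, and any fully-qualified hdfs:// path checked against it is rejected with "Wrong FS, expected file:///". A quick way to verify which file system a bare Configuration resolves to:

Configuration conf = new Configuration();
// With no core-site.xml on the classpath this prints file:///
System.out.println(conf.get("fs.default.name"));
// ...and the default FileSystem is the local one:
System.out.println(FileSystem.get(conf).getUri());

If both lines print file:///, the client is not picking up any HDFS configuration from its classpath.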

pom.xml:

<dependencies>
    <dependency>
        <groupId>org.postgresql</groupId>
        <artifactId>postgresql</artifactId>
        <version>9.3-1102-jdbc41</version>
    </dependency>
    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpclient</artifactId>
        <version>4.3.4</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>1.0.4</version>
    </dependency>
    <dependency>
        <groupId>org.apache.sqoop</groupId>
        <artifactId>sqoop-client</artifactId>
        <version>1.99.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.sqoop</groupId>
        <artifactId>sqoop</artifactId>
        <version>1.4.0-incubating</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.34</version>
    </dependency>
    <dependency>
        <groupId>org.apache.sqoop</groupId>
        <artifactId>sqoop-tools</artifactId>
        <version>1.99.4</version>
    </dependency>
    <dependency>
        <groupId>commons-httpclient</groupId>
        <artifactId>commons-httpclient</artifactId>
        <version>3.1</version>
    </dependency>
</dependencies>

I looked for all possible solutions and found the following:

...
Configuration conf = new Configuration();
conf.addResource(new Path("/home/user/hadoop/conf/core-site.xml"));
conf.addResource(new Path("/home/user/hadoop/conf/hdfs-site.xml"));

BUT in my case I do not want to install Hadoop on my Linux file system, so I cannot specify a path like "/home/user/hadoop". I would prefer to make it run using only jar files.
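A minimal jar-only sketch of the usual fix: instead of loading core-site.xml via addResource(), set the NameNode address programmatically on the Configuration. This assumes the hadoop-client 1.0.4 dependency from the pom above, where the property is still named fs.default.name (Hadoop 2.x renames it to fs.defaultFS); the host, port, and paths are the ones from the question, and the class name is just for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class JarOnlyCopy {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the remote NameNode instead of reading
        // core-site.xml from a local Hadoop install.
        // Hadoop 1.x property name; on Hadoop 2.x+ use "fs.defaultFS".
        conf.set("fs.default.name", "hdfs://192.168.1.170:54310");

        FileSystem fs = FileSystem.get(conf);
        // With the default FS set, a plain destination path resolves against HDFS.
        fs.copyFromLocalFile(new Path("./part-m-00000"),
                new Path("/user/hduser/samplefile"));
        fs.close();
    }
}

The jars pulled in by hadoop-client read the local source file themselves, so nothing needs to be installed on the client machine; only the remote cluster has to be running, and the hadoop-client version should match the cluster's Hadoop version or the RPC layer may refuse the connection.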

Copyright License:
Author: Ashish Pancholi. Reproduced under the CC 4.0 BY-SA license with link to original source & disclaimer.
Link to: https://stackoverflow.com/questions/29840527/is-that-possible-to-run-hadoop-and-copy-a-file-from-local-fs-to-hdfs-in-java-but
