hadoop copy file to current directory from hdfs

2018-10-14T17:51:03

I want to rewrite a function that copies a file from HDFS to the local file system.

    String src = args[0]; // HDFS source path
    String dst = args[1]; // local destination path

    /*
     * Prepare the input and output filesystems.
     */
    Configuration conf = new Configuration();
    FileSystem inFS = FileSystem.get(URI.create(src), conf);
    FileSystem outFS = FileSystem.get(URI.create(dst), conf);

    /*
     * Prepare the input and output streams.
     */
    FSDataInputStream in = inFS.open(new Path(src));

    FSDataOutputStream out = outFS.create(new Path(dst),
            new Progressable() {
        /*
         * Print a dot whenever 64 KB of data has been written to
         * the datanode pipeline.
         */
        public void progress() {
            System.out.print(".");
        }
    });

    // Copy the bytes and close both streams when done.
    IOUtils.copyBytes(in, out, 4096, true);
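To illustrate what I think is going wrong with the destination path, here is a minimal sketch in plain Java (java.nio, not the Hadoop API) of how the local destination could be resolved: if `dst` is an existing directory, append the source file name so a file is created inside it instead of trying to overwrite the directory itself. The class and method names here are hypothetical, just for illustration.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class DestinationResolver {
    /**
     * If the destination is an existing directory, append the source
     * file name so the copy lands inside it; otherwise treat the
     * destination as the target file path as given.
     */
    public static Path resolve(String dst, String srcName) {
        Path dstPath = Paths.get(dst);
        if (Files.isDirectory(dstPath)) {
            return dstPath.resolve(srcName);
        }
        return dstPath;
    }

    public static void main(String[] args) {
        // "." is a directory, so the file name is appended.
        System.out.println(resolve(".", "part-00000"));
        // A non-existent path is returned unchanged.
        System.out.println(resolve("no_such_dir/out.txt", "part-00000"));
    }
}
```

This is only to show the path logic; the same resolved path would then be handed to the Hadoop `FileSystem` calls.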

(1) When I set the local path to my current directory:

--> Java I/O error: mkdirs failed to create file

(2) When I set the path to a non-existent folder:

--> a new folder is created and my copied file ends up inside it.

What should I do?

I believe I should not be using FileSystem.create() here. Is that right?
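For reference, the FileSystem API also exposes copyToLocalFile(Path src, Path dst), which handles the local-side file creation and byte copying in one call. A minimal sketch of what I might try instead (untested, and the argument handling is just a placeholder):

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyToLocal {
    public static void main(String[] args) throws Exception {
        String src = args[0]; // HDFS source path
        String dst = args[1]; // local destination path

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(src), conf);

        // Copies the HDFS file to the local filesystem; the
        // destination may name either the target file or a directory.
        fs.copyToLocalFile(new Path(src), new Path(dst));
        fs.close();
    }
}
```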

EDIT

A related FileSystem API link: https://hadoop.apache.org/docs/current/api/org/apache/hadoop/fs/FileSystem.html#create(org.apache.hadoop.fs.Path)

Copyright License:
Author: 「Joe Black」, reproduced under the CC 4.0 BY-SA copyright license with link to original source & disclaimer.
Link to: https://stackoverflow.com/questions/52801455/hadoop-copy-file-to-current-directory-from-hdfs

