Making Pig 0.12 work with Hadoop 2.2.0

2014-06-04T08:04:42

I have Hadoop 2.2.0 running on a remote cluster and Pig 0.12 on a separate machine. I need to make Pig communicate with Hadoop, and the first step seems to be to build Pig 0.12 against Hadoop 2.2.0. Here is what I did:

  1. In ivy/libraries.properties, changed hadoop-core.version, hadoop-common.version, hadoop-hdfs.version, and hadoop-mapreduce.version to 2.2.0.
  2. In ivy.xml, replaced the hadoop-core dependency with the hadoop-client dependency.
  3. Built Pig using "ant clean jar-all -Dhadoopversion=23" (each step is sketched below).
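
For reference, a minimal sketch of those three steps as shell commands, assuming the Pig 0.12 source tree is the current directory (the sed edits are just one way to apply the version changes; the property names are the ones listed in step 1):

    # Step 1: point the Hadoop artifact versions in ivy/libraries.properties at 2.2.0
    sed -i 's/^hadoop-core.version=.*/hadoop-core.version=2.2.0/'           ivy/libraries.properties
    sed -i 's/^hadoop-common.version=.*/hadoop-common.version=2.2.0/'       ivy/libraries.properties
    sed -i 's/^hadoop-hdfs.version=.*/hadoop-hdfs.version=2.2.0/'           ivy/libraries.properties
    sed -i 's/^hadoop-mapreduce.version=.*/hadoop-mapreduce.version=2.2.0/' ivy/libraries.properties

    # Step 2: ivy.xml edited by hand so the hadoop-core dependency becomes hadoop-client

    # Step 3: rebuild against the Hadoop 2.x (0.23+) profile
    ant clean jar-all -Dhadoopversion=23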

When I run "pig" from the command line, I get the following error:

ERROR 2998: Unhandled internal error. org/apache/hadoop/hdfs/DistributedFileSystem

java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/DistributedFileSystem
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:173)
        at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:122)
        at org.apache.pig.impl.PigContext.connect(PigContext.java:301)
        at org.apache.pig.PigServer.<init>(PigServer.java:222)
        at org.apache.pig.PigServer.<init>(PigServer.java:207)
        at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:47)
        at org.apache.pig.Main.run(Main.java:538)
        at org.apache.pig.Main.main(Main.java:156)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.DistributedFileSystem
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
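
For context, org.apache.hadoop.hdfs.DistributedFileSystem ships in the hadoop-hdfs jar, so this NoClassDefFoundError means that jar is not on Pig's runtime classpath. A minimal way to check that on the Pig machine, assuming a local Hadoop 2.2.0 installation unpacked at /opt/hadoop-2.2.0 (a hypothetical path), would be:

    # Hypothetical install location; adjust to wherever the Hadoop 2.2.0 client libraries live
    export HADOOP_HOME=/opt/hadoop-2.2.0
    export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

    # The class that cannot be loaded lives here in the 2.2.0 binary distribution
    ls $HADOOP_HOME/share/hadoop/hdfs/hadoop-hdfs-2.2.0.jar

    # Let the pig launcher pick up the cluster configuration and Hadoop jars
    export PIG_CLASSPATH=$HADOOP_CONF_DIR
    pig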

Copyright License:
Author: user2242284. Reproduced under the CC 4.0 BY-SA copyright license with link to original source & disclaimer.
Link to: https://stackoverflow.com/questions/24026952/making-pig-0-12-work-with-hadoop-2-2-0
