java.lang.reflect.InvocationTargetException in Reduce DataJoin ( Hadoop In Action )
Asked by Cindy Lee, 2012-12-30T15:10:33


I am having a problem running the DataJoin example from Hadoop in Action. While running the job, a java.lang.reflect.InvocationTargetException is thrown. I have been trying for a day and it still doesn't work. What did I do wrong?

Below is the exception:

12/12/30 01:54:06 INFO mapred.JobClient: Task Id : attempt_201212280853_0032_m_000000_2, Status : FAILED
java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:72)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:389)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.ja
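
For context on the trace: "Error in configuring object" means ReflectionUtils invoked the mapper's configure(JobConf) reflectively and the call threw; the real error is the getCause() of the wrapped InvocationTargetException, which is cut off in the paste above. In the contrib join package, DataJoinMapperBase.configure() hands the value of the map.input.file property to generateInputTag(), so one plausible root cause (my assumption; the truncated trace does not confirm it) is a NullPointerException from inputFile.split("-") when that property is not set. A debugging sketch that could be dropped into MapClass to check this:

    // Debugging sketch (assumption, not a confirmed fix): log the property that
    // DataJoinMapperBase.configure() hands to generateInputTag(), so a null
    // value shows up in the task logs instead of a bare NullPointerException.
    @Override
    public void configure(JobConf job) {
        String inputFile = job.get("map.input.file"); // what generateInputTag() receives
        System.err.println("map.input.file = " + inputFile);
        super.configure(job); // calls generateInputTag(inputFile) internally
    }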

Below is the code, taken from Hadoop in Action; the full listing follows the note below. I have customers.txt and orders.txt in my input directory. I also tried renaming these two files to part-0000.txt and part-0001.txt, and it still doesn't work.
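
A note on that rename: generateInputTag() in the listing derives the tag by splitting the input file name on "-". With customers.txt and orders.txt there is no "-", so each file gets its own tag, but part-0000.txt and part-0001.txt both produce the tag "part". With only one distinct tag per group, the tags.length < 2 check in combine() then discards every group, so as far as I can tell the renamed job would produce empty output rather than a join. A sketch of a tag generator that stays distinct per source file (my variant, not the book's):

    // Sketch (my variant, not the book's): tag each source by its base file
    // name without the extension, so customers.txt -> "customers" and
    // orders.txt -> "orders"; even part-0000.txt and part-0001.txt would
    // still get the distinct tags "part-0000" and "part-0001".
    protected Text generateInputTag(String inputFile) {
        String name = inputFile.substring(inputFile.lastIndexOf('/') + 1); // strip directories
        int dot = name.lastIndexOf('.');
        String datasource = (dot < 0) ? name : name.substring(0, dot);     // strip extension
        return new Text(datasource);
    }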

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.KeyValueTextInputFormat;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;
import org.apache.hadoop.util.ReflectionUtils;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

import org.apache.hadoop.contrib.utils.join.DataJoinMapperBase;
import org.apache.hadoop.contrib.utils.join.DataJoinReducerBase;
import org.apache.hadoop.contrib.utils.join.TaggedMapOutput;

/**
 * DataJoinMapperBase declares a map method - http://hadoop.apache.org/docs/mapreduce/r0.21.0/api/org/apache/hadoop/contrib/utils/join/DataJoinMapperBase.html
 * 
 * map(Object key, Object value, OutputCollector output, Reporter reporter) 
 *  
 */

public class DataJoin extends Configured implements Tool {

    public static class MapClass extends DataJoinMapperBase {


        protected Text generateInputTag(String inputFile) {
            String datasource = inputFile.split("-")[0];
            return new Text(datasource);

        }


        protected Text generateGroupKey(TaggedMapOutput aRecord) {
            String line = ((Text) aRecord.getData()).toString();
            String[] tokens = line.split(",");
            String groupKey = tokens[0];
            return new Text(groupKey);
        }

        protected TaggedMapOutput generateTaggedMapOutput(Object value) {
            TaggedWritable retv = new TaggedWritable();
            retv.setData((Text) value);
            retv.setTag(this.inputTag);
            return retv;
        }
    }

    public static class Reduce extends DataJoinReducerBase {

        protected TaggedMapOutput combine(Object[] tags, Object[] values) {
            if (tags.length < 2) return null;  
            String joinedStr = ""; 
            for (int i=0; i<values.length; i++) {
                if (i > 0) joinedStr += ",";
                TaggedWritable tw = (TaggedWritable) values[i];
                String line = ((Text) tw.getData()).toString();
                String[] tokens = line.split(",", 2);
                joinedStr += tokens[1];
            }
            TaggedWritable retv = new TaggedWritable();
            retv.setData(new Text(joinedStr));
            retv.setTag((Text) tags[0]); 
            return retv;
        }
    }

    public static class TaggedWritable extends TaggedMapOutput {

        private Writable data;

//        public TaggedWritable(Writable data) {
//            this.tag = new Text("");
//            this.data = data;
//        }
        public TaggedWritable() {
            this.tag = new Text();
        }        

        public Writable getData() {
            return data;
        }

        public void setData(Writable data) {
            this.data = data;
        }        

        public void write(DataOutput out) throws IOException {
            this.tag.write(out);
            // Note: readFields() below reads a class name with in.readUTF(),
            // but nothing here writes one with out.writeUTF(...).
            this.data.write(out);
        }

        public void readFields(DataInput in) throws IOException {
            this.tag.readFields(in);
            String dataClz = in.readUTF();
            if (this.data == null
                    || !this.data.getClass().getName().equals(dataClz)) {
                try {
                    this.data = (Writable) ReflectionUtils.newInstance(
                            Class.forName(dataClz), null);
                } catch (ClassNotFoundException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
            }
            this.data.readFields(in);
        }        

//        public void readFields(DataInput in) throws IOException {
//            this.tag.readFields(in);
//            this.data.readFields(in);
//        }
    }

    public int run(String[] args) throws Exception {
        Configuration conf = getConf();

        JobConf job = new JobConf(conf, DataJoin.class);

        Path in = new Path(args[0]);
        Path out = new Path(args[1]);
        FileInputFormat.setInputPaths(job, in);
        FileOutputFormat.setOutputPath(job, out);

        job.setJobName("DataJoin");
        job.setMapperClass(MapClass.class);
        job.setReducerClass(Reduce.class);

        job.setInputFormat(TextInputFormat.class);
        job.setOutputFormat(TextOutputFormat.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(TaggedWritable.class);
        job.set("mapred.textoutputformat.separator", ",");

        JobClient.runJob(job); 
        return 0;
    }

    public static void main(String[] args) throws Exception { 
        int res = ToolRunner.run(new Configuration(),
                                 new DataJoin(),
                                 args);

        System.exit(res);
    }
}
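
Independently of the trace, there is a concrete inconsistency in TaggedWritable above: readFields() reads a class name with in.readUTF(), but write() never writes one, so the byte stream produced on the map side cannot be parsed back on the reduce side. Hadoop round-trips every map output value through this pair, so the two methods must be symmetric. A sketch of a symmetric pair (the repair commonly suggested for this example, not verbatim from the book):

    public void write(DataOutput out) throws IOException {
        this.tag.write(out);
        out.writeUTF(this.data.getClass().getName()); // matches the readUTF() below
        this.data.write(out);
    }

    public void readFields(DataInput in) throws IOException {
        this.tag.readFields(in);
        String dataClz = in.readUTF();
        try {
            if (this.data == null
                    || !this.data.getClass().getName().equals(dataClz)) {
                // re-create the concrete Writable recorded by write()
                this.data = (Writable) ReflectionUtils.newInstance(
                        Class.forName(dataClz), null);
            }
        } catch (ClassNotFoundException e) {
            throw new IOException("unknown class in stream: " + dataClz, e);
        }
        this.data.readFields(in);
    }

Whichever variant is used, write() and readFields() have to agree byte for byte, or the reducer fails while deserializing values.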


I hit a similar problem, and java.lang.reflect.InvocationTargetException was again the root cause. What did I do wrong? The stack trace is identical to the one above, and the code, also taken from Hadoop in Action, matches the listing above except that TaggedWritable keeps the book's original readFields (with the reflection-based variant commented out):

        public void readFields(DataInput in) throws IOException {
            this.tag.readFields(in);
            this.data.readFields(in);
        }
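
With this original version, a TaggedWritable freshly created by Hadoop's reflection machinery has data == null, because the no-arg constructor only initializes tag; the first readFields() call then throws a NullPointerException, which Hadoop reports wrapped in exactly this kind of InvocationTargetException. A minimal guard (assuming the map output values are always Text, as they are in this job):

        public void readFields(DataInput in) throws IOException {
            this.tag.readFields(in);
            if (this.data == null) {
                this.data = new Text(); // assumption: values are Text in this job
            }
            this.data.readFields(in);
        }

The reflection-based readFields() in the first listing generalizes this, but it only works if write() records the class name, as sketched above.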

Copyright notice: content author 「Cindy Lee」, reproduced under the CC BY-SA 4.0 license with a link to the original source and this disclaimer.
Link to original article: https://stackoverflow.com/questions/14088977/java-lang-reflect-invocationtargetexception-in-reduce-datajoin-hadoop-in-actio
