Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.mapred.ReduceTask — solution
java.lang.NoSuchMethodError: org.apache.hadoop.mapred.TaskID.<init>(Lorg/apache/hadoop/mapreduce/JobID;Lorg/apache/hadoop/mapreduce/TaskType;I)V
    at org.apache.spark.rdd.HadoopRDD.addLocalConfiguration(HadoopRDD.scala:384)
    at org.apache.spark.rdd.HadoopRDD$$anon$1.<init>(HadoopRDD.scala:246)
    at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:211)
    at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:102)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
    at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
    at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
    at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
    at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
    at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
    at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
    at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
Cause:
This error occurs because the HBase version in use is too low. A NoSuchMethodError thrown at runtime means the org.apache.hadoop.mapred.TaskID class actually on the classpath is older than the one Spark was compiled against, so it lacks the constructor Spark tries to call.
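Before changing versions, it can help to confirm which jar the stale class is actually being loaded from. The following is an illustrative diagnostic, not part of the original post; the class name FindClassSource and its locate helper are invented for this sketch. On a Spark classpath you would pass org.apache.hadoop.mapred.TaskID as the argument.

```java
// FindClassSource.java — print which jar (or directory) supplies a class at
// runtime, useful for tracking down the version conflict behind the
// NoSuchMethodError above.
public class FindClassSource {

    // Returns the jar or directory a class was loaded from, or "(bootstrap)"
    // when the class comes from the JDK's bootstrap class loader (which has
    // no CodeSource).
    static String locate(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        return (src == null) ? "(bootstrap)" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // e.g.  java FindClassSource org.apache.hadoop.mapred.TaskID
        String target = (args.length > 0) ? args[0] : "java.lang.String";
        System.out.println(target + " -> " + locate(target));
    }
}
```

If the printed jar is an old hadoop-core (or an HBase jar bundling old Hadoop classes), that is the dependency to upgrade or exclude.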
Solution: upgrade the existing HBase dependency to a newer version, one whose transitive Hadoop classes match the Hadoop version Spark was built against.
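For a Maven build, the upgrade might look like the sketch below. The version number is a placeholder (pick a release matching your cluster), and the exclusion is only needed if an old hadoop-core is still being dragged in transitively; running `mvn dependency:tree -Dincludes=org.apache.hadoop` will show exactly which Hadoop artifacts each dependency contributes.

```xml
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <!-- placeholder version: choose one compatible with your Hadoop/Spark -->
  <version>1.2.6</version>
  <exclusions>
    <!-- drop any legacy hadoop-core an older HBase line still references -->
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```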