
I am trying to resolve the following error:

13/05/05 19:49:04 INFO handler.OpenRegionHandler: Opening of region {NAME => '-ROOT-,,0', STARTKEY => '', ENDKEY => '', ENCODED => 70236052,} failed, marking as FAILED_OPEN in ZK
13/05/05 19:49:04 INFO regionserver.HRegionServer: Received request to open region: -ROOT-,,0.70236052
13/05/05 19:49:04 INFO regionserver.HRegion: Setting up tabledescriptor config now ...
13/05/05 19:49:04 ERROR handler.OpenRegionHandler: Failed open of region=-ROOT-,,0.70236052, starting to roll back the global memstore size.
java.lang.IllegalStateException: Could not instantiate a region instance.
    at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:3747)
    at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:3927)
    at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:332)
    at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:108)
    at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:175)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.GeneratedConstructorAccessor17.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:3744)
    ... 7 more
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost
    at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:421)
    ... 11 more

I have the following Maven dependencies:

<properties>
  <hadoopCDHMRVersion>2.0.0-mr1-cdh4.2.0</hadoopCDHMRVersion>
  <hadoopCDHVersion>2.0.0-cdh4.2.0</hadoopCDHVersion>
  <hbaseCDHVersion>0.94.2-cdh4.2.0</hbaseCDHVersion>
</properties>

<dependencyManagement>
<dependencies>
<!-- Apache -->
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>${hadoopCDHMRVersion}</version>
        <exclusions>
          <exclusion>
            <groupId>tomcat</groupId>
            <artifactId>jasper-compiler</artifactId>
          </exclusion>
          <exclusion>
            <groupId>tomcat</groupId>
            <artifactId>jasper-runtime</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoopCDHVersion}</version>
        <exclusions>
          <exclusion>
            <groupId>org.mockito</groupId>
            <artifactId>mockito-all</artifactId>
          </exclusion>
          <exclusion>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
          </exclusion>
          <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
          </exclusion>
          <exclusion>
            <groupId>tomcat</groupId>
            <artifactId>jasper-compiler</artifactId>
          </exclusion>
          <exclusion>
            <groupId>tomcat</groupId>
            <artifactId>jasper-runtime</artifactId>
          </exclusion>
          <exclusion>
            <groupId>org.mortbay.jetty</groupId>
            <artifactId>jetty</artifactId>
          </exclusion>
          <exclusion>
            <groupId>org.mortbay.jetty</groupId>
            <artifactId>jetty-util</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoopCDHVersion}</version>
      </dependency>
      <!-- Test -->
      <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase</artifactId>
        <scope>test</scope>
        <classifier>tests</classifier>
        <version>${hbaseCDHVersion}</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase</artifactId>
        <scope>provided</scope>
        <version>${hbaseCDHVersion}</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-test</artifactId>
        <version>${hadoopCDHMRVersion}</version>
        <scope>test</scope>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-minicluster</artifactId>
        <version>${hadoopCDHMRVersion}</version>
        <scope>test</scope>
      </dependency>
</dependencies>
</dependencyManagement>
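
In the child pom the managed entries are then re-declared without versions, roughly like this (an illustrative sketch, not my actual child pom):

<!-- child pom: version, scope and exclusions come from the parent's dependencyManagement -->
<dependencies>
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
  </dependency>
</dependencies>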

That is how I bring the dependencies from the parent pom into the child pom. The code I am testing:

// Start a mini cluster for the unit test
final Configuration startingConf = HBaseConfiguration.create();
startingConf.setLong("hbase.client.keyvalue.maxsize", 65536);
startingConf.setStrings(HConstants.ZOOKEEPER_QUORUM, "localhost");
startingConf.setStrings("mapreduce.jobtracker.address", "local");
startingConf.setLong(HConstants.HBASE_CLIENT_PAUSE, 50);
startingConf.setInt(HConstants.HBASE_CLIENT_RETRIES_NUMBER, 200);
testUtil = new HBaseTestingUtility(startingConf);
// point of failure
testUtil.startMiniCluster();


I get the error after calling startMiniCluster(). It gets through most of the work of setting up the environment, but fails partway through with the error above. Things I have tried:

  1. If I roll hbaseCDHVersion back from 0.94.2-cdh4.2.0 to any of the 0.92.1-cdh4.XX versions, it works.
  2. Deleted the .m2 cache completely and made sure that only 0.94.2-cdh4.2.0 gets pulled down again.
  3. Tried almost every version of 0.94.2-cdh4.XX.
  4. Ran mvn clean and install from the command line instead of relying on Eclipse to do its magic; I also tried eclipse:eclipse.
  5. Checked the type/resource of the missing class through Eclipse; it points at the correct version in the local repo, so Eclipse can resolve it.
  6. Inspected the dependency tree for any conflicts (see the commands sketched after this list).
  7. Opened the jar in the local repo myself and the class is there.
  8. Tried creating a new project and building the pom file from scratch.
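
For points 6 and 7 the checks look roughly like this (the jar path is simply what the standard local-repo layout would give on my machine):

mvn dependency:tree

jar tf ~/.m2/repository/org/apache/hbase/hbase/0.94.2-cdh4.2.0/hbase-0.94.2-cdh4.2.0.jar | grep RegionCoprocessorHost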

Any pointers would be greatly appreciated.


1 Answer


The problem turned out to be the commons-configuration jar. The parent pom was pulling in version 1.9, which conflicted with the 1.6 version brought in by the hadoop-common jar. The only way to find it was to keep the dependencies in the parent pom to a minimum and uncomment them one by one to narrow the problem down. Once the culprit was found, it was just a matter of excluding those dependencies on the hadoop-common dependency (a rough sketch is below). Hope this helps someone. The hadoop jars really should upgrade their commons-configuration, which is five years old by now; alternatively, the newer jar can be rolled back from 1.9 to 1.6.
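
A rough sketch of the general idea; the placeholder coordinates stand in for whichever dependency is dragging in the unwanted commons-configuration, and the explicit 1.6 entry is the "roll back from 1.9 to 1.6" option:

<!-- placeholder coordinates: exclude commons-configuration from the dependency that pulls in 1.9 -->
<dependency>
  <groupId>some.group</groupId>
  <artifactId>offending-artifact</artifactId>
  <version>x.y.z</version>
  <exclusions>
    <exclusion>
      <groupId>commons-configuration</groupId>
      <artifactId>commons-configuration</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- and/or pin the 1.6 version that hadoop-common expects, e.g. in dependencyManagement -->
<dependency>
  <groupId>commons-configuration</groupId>
  <artifactId>commons-configuration</artifactId>
  <version>1.6</version>
</dependency>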

answered 2013-05-07T03:16:51.640