Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/tracing/SpanReceiverHost


I am running Hadoop 2.8.1 and Hive 2.3.0.
I am trying to read values from a table created in Hive.
The current exception is:

java.lang.ClassNotFoundException: org.apache.hadoop.tracing.SpanReceiverHost
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

Here is the code I use to read the table:

    // Imports used by this snippet, shown for clarity about which artifacts the classes come from.
    import java.util.ArrayList;
    import java.util.List;

    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
    import org.apache.hadoop.io.WritableComparable;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.RecordReader;
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
    import org.apache.hadoop.mapreduce.TaskAttemptID;
    import org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl;
    import org.apache.hive.hcatalog.data.HCatRecord;
    import org.apache.hive.hcatalog.data.schema.HCatFieldSchema;
    import org.apache.hive.hcatalog.data.schema.HCatSchema;
    import org.apache.hive.hcatalog.mapreduce.HCatInputFormat;

    public static final String HIVEURL = "jdbc:hive2://localhost:10000";
    public static final String DB_NAME = "default";
    public static final String TABLE_NAME = "order_line";

    public static void main(String[] args) throws Exception {
        HiveConf hiveConf = new HiveConf();
        //hiveConf.setVar(HiveConf.ConfVars.METASTOREURIS, HIVEURL);
        HiveMetaStoreClient hiveClient = new HiveMetaStoreClient(hiveConf);

        Job job = Job.getInstance();
        TaskAttemptContext ctx = new TaskAttemptContextImpl(job.getConfiguration(), new TaskAttemptID());
        HCatInputFormat hcif = HCatInputFormat.setInput(job, DB_NAME, TABLE_NAME);

        HCatSchema allCols = hcif.getTableSchema(job.getConfiguration());
        List<HCatFieldSchema> usedList = new ArrayList<>();
        usedList.add(allCols.get(2)); // for example...
        HCatSchema someCols = new HCatSchema(usedList);
        hcif.setOutputSchema(job, someCols);

        for (InputSplit split : hcif.getSplits(job)) {
            RecordReader<WritableComparable, HCatRecord> rr = hcif.createRecordReader(split, ctx);
            rr.initialize(split, ctx);

            while (rr.nextKeyValue()) {
                HCatRecord record = rr.getCurrentValue();
                // use record.get(...) to fetch the column...
                //Object o = record.get(1);
                //System.out.println(o.toString());
            }

            rr.close();
        }

        hiveClient.close();
    }

Here is the pom file I am using:

<dependencies>
    <dependency>
        <groupId>org.apache.hive.hcatalog</groupId>
        <artifactId>hive-hcatalog-core</artifactId>
        <version>2.3.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hive.hcatalog</groupId>
        <artifactId>hive-hcatalog</artifactId>
        <version>0.13.1-cdh5.3.5</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-common</artifactId>
        <version>2.3.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive</artifactId>
        <version>0.13.1-cdh5.3.5</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-metastore</artifactId>
        <version>2.3.0</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.8.1</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>2.6.0-mr1-cdh5.12.1</version>
        <type>pom</type>
    </dependency>

    <dependency>
        <groupId>org.apache.thrift</groupId>
        <artifactId>libthrift</artifactId>
        <version>0.9.3</version>
    </dependency>
</dependencies>

The solution is as follows:

I can't really tell from the stack-trace snippet what triggered the loadClass, but it looks like that class does not actually exist in the 2.8.1 version of hadoop-common you are using. It seems to have disappeared after 2.7.2.
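One quick way to confirm this on your side is to ask the JVM, on the same classpath your job runs with, whether the class resolves at all and which jar it would come from. Below is a minimal diagnostic sketch: the class name comes from the stack trace above, and SpanReceiverHostCheck is just an illustrative wrapper name.

    // Diagnostic sketch: check whether org.apache.hadoop.tracing.SpanReceiverHost
    // is visible on the current classpath, and if so, report which jar provides it.
    public class SpanReceiverHostCheck {   // hypothetical class name, for illustration only
        public static void main(String[] args) {
            String name = "org.apache.hadoop.tracing.SpanReceiverHost";
            try {
                Class<?> clazz = Class.forName(name);
                // getCodeSource() can be null for bootstrap classes, hence the null check
                java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
                System.out.println(name + " loaded from "
                        + (src != null ? src.getLocation() : "an unknown location"));
            } catch (ClassNotFoundException e) {
                // Expected with hadoop-common 2.8.1 if the class was removed after 2.7.2
                System.out.println(name + " is NOT on the classpath");
            }
        }
    }

Run it with the exact same classpath as the failing program; if it prints "NOT on the classpath", the problem is purely a dependency/version issue, not your table-reading code.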

It, or something with the same name, does appear in the hbase source.

Do you have a mix-and-match of versions going on?
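Judging from the pom above, yes: hive-hcatalog-core, hive-metastore and hive-common are on 2.3.0, while hive-hcatalog is on 0.13.1-cdh5.3.5 and hadoop-core is a CDH 2.6.0 artifact, all alongside hadoop-common 2.8.1. As a rough sketch of what an aligned dependency set could look like (assuming you stay on Hive 2.3.0 and drop back to a Hadoop 2.7.x release, where the missing class still exists per the answer above; the exact versions and artifact choices here are assumptions, not a tested combination):

    <dependencies>
        <!-- Hive / HCatalog, all on the same 2.3.0 line -->
        <dependency>
            <groupId>org.apache.hive.hcatalog</groupId>
            <artifactId>hive-hcatalog-core</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-metastore</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-common</artifactId>
            <version>2.3.0</version>
        </dependency>

        <!-- Hadoop: a single 2.7.x line instead of mixing 2.8.1 with CDH artifacts -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.7.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>2.7.2</version>
        </dependency>
    </dependencies>

The idea is simply to have one version line per project: the CDH-era artifacts (0.13.1-cdh5.3.5, 2.6.0-mr1-cdh5.12.1) were compiled against much older Hadoop internals, and mixing them with hadoop-common 2.8.1 is exactly the kind of setup that surfaces as a missing class at runtime.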