Java HDFS connection error: Class org.apache.hadoop.hdfs.DistributedFileSystem not found


Looking at the source of the Configuration class shows that it uses the current program's default class loader: the constructor takes the current thread's context class loader, falling back to the loader that loaded Configuration itself.

public Configuration(boolean loadDefaults) {
        this.quietmode = true;
        this.allowNullValueProperties = false;
        this.resources = new ArrayList();
        this.finalParameters = Collections.newSetFromMap(new ConcurrentHashMap());
        this.loadDefaults = true;
        // Key part: the class loader defaults to the current thread's context
        // class loader, falling back to the loader that loaded Configuration itself.
        this.classLoader = Thread.currentThread().getContextClassLoader();
        if (this.classLoader == null) {
            this.classLoader = Configuration.class.getClassLoader();
        }

        this.loadDefaults = loadDefaults;
        this.updatingResource = new ConcurrentHashMap();
        Class var2 = Configuration.class;
        synchronized(Configuration.class) {
            REGISTRY.put(this, (Object)null);
        }
    }
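As a minimal sketch of my own (not from the original post), the error can be reproduced by asking Configuration to resolve the class through its default loader; when that loader cannot see the hadoop-hdfs jar, getClassByName fails exactly as reported above:

import org.apache.hadoop.conf.Configuration;

// Sketch only: shows where the "class not found" failure comes from.
Configuration conf = new Configuration();
System.out.println(conf.getClassLoader());   // usually the thread context class loader
try {
    conf.getClassByName("org.apache.hadoop.hdfs.DistributedFileSystem");
} catch (ClassNotFoundException e) {
    // Same failure that surfaces as
    // "Class org.apache.hadoop.hdfs.DistributedFileSystem not found"
    e.printStackTrace();
}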

In my case the code runs as a Kettle plugin. Classes were being resolved through Kettle's default (root) class loader, which cannot see the plugin's jars, so the plugin's HDFS-related classes could not be loaded.
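A quick way to confirm the mismatch (my own sketch, not from the original post) is to log both loaders from inside the plugin; they differ when the plugin's jars are isolated from Kettle's root loader:

// Hypothetical logging inside the plugin code:
System.out.println("context loader: " + Thread.currentThread().getContextClassLoader());
System.out.println("plugin loader:  " + this.getClass().getClassLoader());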

Solution: set your own class loader on the Configuration when initializing it:

Configuration hdfsConf = new Configuration();
hdfsConf.setClassLoader(this.getClass().getClassLoader());
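Putting it together, here is a fuller sketch of my own (the namenode URI and paths are placeholders, not from the original post) that sets the class loader before obtaining the FileSystem:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsClientExample {
    public static void main(String[] args) throws Exception {
        Configuration hdfsConf = new Configuration();
        // Point Configuration at the class loader that can actually see the
        // hadoop-hdfs jars (inside a Kettle plugin, the plugin's own loader).
        hdfsConf.setClassLoader(HdfsClientExample.class.getClassLoader());

        // "hdfs://namenode:8020" is a placeholder URI for illustration only.
        try (FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), hdfsConf)) {
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
        }
    }
}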


Original post: https://www.cnblogs.com/jinhai-wow/p/13474634.html