
Example 1 with HdfsHelper

Use of com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper in project plugins by qlangtech.

The createHdfsHelper method of the class BasicHdfsWriterJob builds an HdfsHelper from the writer plugin's FileSystemFactory and a Hadoop JobConf pointed at the configured defaultFS:

public static HdfsHelper createHdfsHelper(Configuration pluginJobConf, BasicFSWriter hiveWriter) {
    Objects.requireNonNull(pluginJobConf, "pluginJobConf can not be null");
    Objects.requireNonNull(hiveWriter, "hiveWriter can not be null");
    try {
        // Obtain the target file system from the writer plugin and wrap it in an HdfsHelper.
        FileSystemFactory fs = hiveWriter.getFs();
        HdfsHelper hdfsHelper = new HdfsHelper(fs.getFileSystem().unwrap());
        // Build a Hadoop configuration that resolves classes through the TIS plugin class loader.
        org.apache.hadoop.conf.Configuration cfg = new org.apache.hadoop.conf.Configuration();
        cfg.setClassLoader(TIS.get().getPluginManager().uberClassLoader);
        // Point the JobConf at the configured HDFS and the writer's root directory.
        org.apache.hadoop.mapred.JobConf conf = new JobConf(cfg);
        conf.set(FileSystem.FS_DEFAULT_NAME_KEY, pluginJobConf.getString("defaultFS"));
        conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        conf.set(JobContext.WORKING_DIR, fs.getFileSystem().getRootDir().unwrap(Path.class).toString());
        hdfsHelper.conf = conf;
        return hdfsHelper;
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
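
A minimal calling sketch follows, assuming an already-initialized BasicFSWriter and a DataX job Configuration that carries a "defaultFS" entry; the wrapper class name HdfsHelperUsageSketch and the sample namenode URI are hypothetical, and only createHdfsHelper itself comes from the snippet above. Imports for BasicFSWriter and BasicHdfsWriterJob are omitted because their packages are not shown here.

import com.alibaba.datax.common.util.Configuration;
import com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper;

public class HdfsHelperUsageSketch {

    // Hypothetical caller: given an initialized writer plugin and its DataX job
    // configuration (which must carry a "defaultFS" entry such as "hdfs://namenode:8020"),
    // obtain a ready-to-use HdfsHelper for subsequent HDFS file operations.
    static HdfsHelper buildHelper(Configuration pluginJobConf, BasicFSWriter writer) {
        return BasicHdfsWriterJob.createHdfsHelper(pluginJobConf, writer);
    }
}

The returned helper already carries the JobConf with the defaultFS, the DistributedFileSystem binding, and the working directory, so later HdfsHelper calls operate against the writer's target file system.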
Also used: Configuration (com.alibaba.datax.common.util.Configuration), HdfsHelper (com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper), FileSystemFactory (com.qlangtech.tis.offline.FileSystemFactory), IOException (java.io.IOException), Path (org.apache.hadoop.fs.Path), JobConf (org.apache.hadoop.mapred.JobConf)

Aggregations

Configuration (com.alibaba.datax.common.util.Configuration): 1
HdfsHelper (com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper): 1
FileSystemFactory (com.qlangtech.tis.offline.FileSystemFactory): 1
IOException (java.io.IOException): 1
Path (org.apache.hadoop.fs.Path): 1
JobConf (org.apache.hadoop.mapred.JobConf): 1