Example 16 with LoadException

Use of com.baidu.hugegraph.loader.exception.LoadException in project incubator-hugegraph-toolchain by apache.

From the class HugeGraphLoader, method loadStructs:

private void loadStructs(List<InputStruct> structs) {
    // Load input structs one by one
    for (InputStruct struct : structs) {
        if (this.context.stopped()) {
            break;
        }
        if (struct.skip()) {
            continue;
        }
        // Create and init InputReader, fetch next batch lines
        try (InputReader reader = InputReader.create(struct.input())) {
            // Init reader
            reader.init(this.context, struct);
            // Load data from current input mapping
            this.loadStruct(struct, reader);
        } catch (InitException e) {
            throw new LoadException("Failed to init input reader", e);
        }
    }
}
Also used : InputReader(com.baidu.hugegraph.loader.reader.InputReader) InputStruct(com.baidu.hugegraph.loader.mapping.InputStruct) InitException(com.baidu.hugegraph.loader.exception.InitException) LoadException(com.baidu.hugegraph.loader.exception.LoadException)
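
As a hedged illustration only (this calling pattern is not part of the example above), the unchecked LoadException raised here can be handled once at whatever entry point drives loadStructs; loader.load() below is an assumed entry method, not confirmed project API:

import com.baidu.hugegraph.loader.exception.LoadException;

// Hypothetical caller: loader.load() stands in for the public entry point that
// eventually reaches loadStructs(); adjust to the real API if it differs.
try {
    loader.load();
} catch (LoadException e) {
    System.err.println("Load aborted: " + e.getMessage());
    if (e.getCause() != null) {
        System.err.println("Caused by: " + e.getCause());
    }
}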

Example 17 with LoadException

Use of com.baidu.hugegraph.loader.exception.LoadException in project incubator-hugegraph-toolchain by apache.

From the class OrcFileLineFetcher, method openReader:

@Override
public void openReader(Readable readable) {
    Path path = readable.path();
    try {
        OrcFile.ReaderOptions options = OrcFile.readerOptions(this.conf);
        this.reader = OrcFile.createReader(path, options);
        this.recordReader = this.reader.rows();
        this.inspector = (StructObjectInspector) this.reader.getObjectInspector();
        this.row = null;
    } catch (IOException e) {
        throw new LoadException("Failed to open orc reader for '%s'", e, readable);
    }
    this.resetOffset();
}
Also used : Path(org.apache.hadoop.fs.Path) OrcFile(org.apache.hadoop.hive.ql.io.orc.OrcFile) IOException(java.io.IOException) LoadException(com.baidu.hugegraph.loader.exception.LoadException)
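
A hedged sketch of how the state prepared by openReader might be consumed afterwards (the fetcher's real fetch logic is not shown in this example); RecordReader.hasNext()/next() and StructObjectInspector.getStructFieldsDataAsList() are standard Hive ORC APIs, and java.util.List is assumed to be imported:

// Illustrative only: iterate the rows using the reader state set up above.
// Runs inside a method that declares or handles IOException.
while (this.recordReader.hasNext()) {
    this.row = this.recordReader.next(this.row);
    List<Object> fields = this.inspector.getStructFieldsDataAsList(this.row);
    // ... convert `fields` into the line/record representation the caller expects
}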

Example 18 with LoadException

Use of com.baidu.hugegraph.loader.exception.LoadException in project incubator-hugegraph-toolchain by apache.

From the class HDFSFileReader, method scanReadables:

@Override
protected List<Readable> scanReadables() throws IOException {
    Path path = new Path(this.source().path());
    FileFilter filter = this.source().filter();
    List<Readable> paths = new ArrayList<>();
    if (this.hdfs.isFile(path)) {
        if (!filter.reserved(path.getName())) {
            throw new LoadException("Please check path name and extensions, ensure " + "that at least one path is available for reading");
        }
        paths.add(new HDFSFile(this.hdfs, path));
    } else {
        assert this.hdfs.isDirectory(path);
        FileStatus[] statuses = this.hdfs.listStatus(path);
        Path[] subPaths = FileUtil.stat2Paths(statuses);
        for (Path subPath : subPaths) {
            if (filter.reserved(subPath.getName())) {
                paths.add(new HDFSFile(this.hdfs, subPath));
            }
        }
    }
    return paths;
}
Also used : Path(org.apache.hadoop.fs.Path) FileStatus(org.apache.hadoop.fs.FileStatus) ArrayList(java.util.ArrayList) Readable(com.baidu.hugegraph.loader.reader.Readable) FileFilter(com.baidu.hugegraph.loader.source.file.FileFilter) LoadException(com.baidu.hugegraph.loader.exception.LoadException)
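
For context, a minimal sketch of how an HDFS handle such as this.hdfs is typically obtained; the reader's actual initialization is not part of this example, and the namenode URI below is a placeholder:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

// Illustrative setup only. FileSystem.get(...) throws IOException, so call it
// from a method that declares or handles IOException.
Configuration conf = new Configuration();
FileSystem hdfs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);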

Example 19 with LoadException

Use of com.baidu.hugegraph.loader.exception.LoadException in project incubator-hugegraph-toolchain by apache.

From the class FailWriter, method write:

public void write(InsertException e) {
    try {
        this.writeLine("#### INSERT ERROR: " + e.getMessage());
        this.writeLine(e.line());
    } catch (IOException ex) {
        throw new LoadException("Failed to write insert error '%s'", ex, e.line());
    }
}
Also used : IOException(java.io.IOException) LoadException(com.baidu.hugegraph.loader.exception.LoadException)

Example 20 with LoadException

Use of com.baidu.hugegraph.loader.exception.LoadException in project incubator-hugegraph-toolchain by apache.

From the class FailWriter, method write:

public void write(ParseException e) {
    try {
        this.writeLine("#### PARSE ERROR: " + e.getMessage());
        this.writeLine(e.line());
    } catch (IOException ex) {
        throw new LoadException("Failed to write parse error '%s'", ex, e.line());
    }
}
Also used : IOException(java.io.IOException) LoadException(com.baidu.hugegraph.loader.exception.LoadException)
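
A hedged usage sketch for the two write(...) overloads above; failWriter and parseAndInsert are hypothetical names, FailWriter's construction is not shown in these examples, and ParseException/InsertException refer to the loader's own exception types:

// Illustrative routing of bad lines to the failure log; not project source.
try {
    parseAndInsert(line);    // hypothetical parse-and-insert step for one input line
} catch (ParseException e) {
    failWriter.write(e);     // records "#### PARSE ERROR: ..." plus the offending line
} catch (InsertException e) {
    failWriter.write(e);     // records "#### INSERT ERROR: ..." plus the offending line
}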

Aggregations

LoadException (com.baidu.hugegraph.loader.exception.LoadException) 32
IOException (java.io.IOException) 18
File (java.io.File) 10
SQLException (java.sql.SQLException) 4
ArrayList (java.util.ArrayList) 4
FileFilter (com.baidu.hugegraph.loader.source.file.FileFilter) 3
Path (org.apache.hadoop.fs.Path) 3
ServerException (com.baidu.hugegraph.exception.ServerException) 2
InitException (com.baidu.hugegraph.loader.exception.InitException) 2
LoadOptions (com.baidu.hugegraph.loader.executor.LoadOptions) 2
LoadSummary (com.baidu.hugegraph.loader.metrics.LoadSummary) 2
Readable (com.baidu.hugegraph.loader.reader.Readable) 2
InputStream (java.io.InputStream) 2
InputStreamReader (java.io.InputStreamReader) 2
CompressorInputStream (org.apache.commons.compress.compressors.CompressorInputStream) 2
CompressionInputStream (org.apache.hadoop.io.compress.CompressionInputStream) 2
HugeClient (com.baidu.hugegraph.driver.HugeClient) 1
HugeClientBuilder (com.baidu.hugegraph.driver.HugeClientBuilder) 1
GroovyExecutor (com.baidu.hugegraph.loader.executor.GroovyExecutor) 1
InputStruct (com.baidu.hugegraph.loader.mapping.InputStruct) 1
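
The examples above exercise two constructor shapes: (message, cause) and (format, cause, args...). A minimal sketch of an exception type compatible with both follows; it is an illustrative reconstruction, not the source of com.baidu.hugegraph.loader.exception.LoadException, and the name LoadExceptionSketch is deliberately different to avoid confusion:

// Illustrative sketch; the real class may differ in hierarchy and details.
public class LoadExceptionSketch extends RuntimeException {

    public LoadExceptionSketch(String message, Throwable cause) {
        super(message, cause);
    }

    // Matches calls such as:
    // new LoadException("Failed to open orc reader for '%s'", e, readable)
    public LoadExceptionSketch(String format, Throwable cause, Object... args) {
        super(String.format(format, args), cause);
    }
}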