Example 1 with CountingJsonReader

Use of org.apache.drill.exec.store.easy.json.reader.CountingJsonReader in project drill by axbaretto.

From the class JSONRecordReader, method setup:

@Override
public void setup(final OperatorContext context, final OutputMutator output) throws ExecutionSetupException {
    try {
        if (hadoopPath != null) {
            // Open the input file, transparently decompressing it if needed.
            this.stream = fileSystem.openPossiblyCompressedStream(hadoopPath);
        }
        this.writer = new VectorContainerWriter(output, unionEnabled);
        if (isSkipQuery()) {
            // Skip query (e.g. SELECT COUNT(*)): no columns are projected, so only count records.
            this.jsonReader = new CountingJsonReader(fragmentContext.getManagedBuffer(), enableNanInf);
        } else {
            // Full reader: materializes the projected columns into value vectors.
            this.jsonReader = new JsonReader.Builder(fragmentContext.getManagedBuffer())
                    .schemaPathColumns(ImmutableList.copyOf(getColumns()))
                    .allTextMode(enableAllTextMode)
                    .skipOuterList(true)
                    .readNumbersAsDouble(readNumbersAsDouble)
                    .enableNanInf(enableNanInf)
                    .build();
        }
        setupParser();
    } catch (final Exception e) {
        handleAndRaise("Failure reading JSON file", e);
    }
}
Also used: VectorContainerWriter (org.apache.drill.exec.vector.complex.impl.VectorContainerWriter), CountingJsonReader (org.apache.drill.exec.store.easy.json.reader.CountingJsonReader), JsonReader (org.apache.drill.exec.vector.complex.fn.JsonReader), UserException (org.apache.drill.common.exceptions.UserException), IOException (java.io.IOException), OutOfMemoryException (org.apache.drill.exec.exception.OutOfMemoryException), ExecutionSetupException (org.apache.drill.common.exceptions.ExecutionSetupException), JsonParseException (com.fasterxml.jackson.core.JsonParseException)
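
For intuition, the snippet below is a minimal, self-contained sketch of what a counting reader does: stream through the input with a plain Jackson parser and count top-level records without materializing any values. It is an illustration only, not Drill's CountingJsonReader; the class name JsonRecordCounter and the file name records.json are made up for the example.

import java.io.File;
import java.io.IOException;

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

public class JsonRecordCounter {
    public static void main(String[] args) throws IOException {
        long count = 0;
        try (JsonParser parser = new JsonFactory().createParser(new File("records.json"))) {
            JsonToken token;
            while ((token = parser.nextToken()) != null) {
                if (token == JsonToken.START_OBJECT) {
                    count++;               // every object reached here is a top-level record...
                    parser.skipChildren(); // ...because its nested content is consumed in one call
                }
            }
        }
        System.out.println(count + " records");
    }
}

Because skipChildren() consumes each object's entire body, the loop only ever sees records at the outermost level, whether the input is a stream of objects or a single outer array of objects (the same shape that skipOuterList(true) handles in the full reader).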

Example 2 with CountingJsonReader

Use of org.apache.drill.exec.store.easy.json.reader.CountingJsonReader in project drill by apache.

From the class JSONRecordReader, method setup:

@Override
public void setup(OperatorContext context, OutputMutator output) throws ExecutionSetupException {
    try {
        if (hadoopPath != null) {
            // Open the input file, transparently decompressing it if needed.
            stream = fileSystem.openPossiblyCompressedStream(hadoopPath);
        }
        writer = new VectorContainerWriter(output, unionEnabled);
        if (isSkipQuery()) {
            // Skip query (e.g. SELECT COUNT(*)): no columns are projected, so only count records.
            jsonReader = new CountingJsonReader(fragmentContext.getManagedBuffer(), enableNanInf, enableEscapeAnyChar);
        } else {
            // Full reader: materializes the projected columns into value vectors.
            jsonReader = new JsonReader.Builder(fragmentContext.getManagedBuffer())
                    .schemaPathColumns(ImmutableList.copyOf(getColumns()))
                    .allTextMode(enableAllTextMode)
                    .skipOuterList(true)
                    .readNumbersAsDouble(readNumbersAsDouble)
                    .enableNanInf(enableNanInf)
                    .enableEscapeAnyChar(enableEscapeAnyChar)
                    .build();
        }
        setupParser();
    } catch (Exception e) {
        handleAndRaise("Failure reading JSON file", e);
    }
}
Also used: VectorContainerWriter (org.apache.drill.exec.vector.complex.impl.VectorContainerWriter), CountingJsonReader (org.apache.drill.exec.store.easy.json.reader.CountingJsonReader), UserException (org.apache.drill.common.exceptions.UserException), OutOfMemoryException (org.apache.drill.exec.exception.OutOfMemoryException), ExecutionSetupException (org.apache.drill.common.exceptions.ExecutionSetupException), JsonParseException (com.fasterxml.jackson.core.JsonParseException), IOException (java.io.IOException)
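
The apache variant differs from the axbaretto one only by the extra enableEscapeAnyChar flag, threaded through both the counting and the full reader. Both leniency flags are plausibly backed by standard Jackson parser features; the sketch below shows that mapping, but it is an assumption inferred from the flag names, not something the excerpt confirms, and LenientParserFactory is a hypothetical helper.

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;

public class LenientParserFactory {
    // Hypothetical helper: builds a Jackson factory with the two leniency
    // options the Drill flags appear to correspond to (an assumption).
    public static JsonFactory create(boolean enableNanInf, boolean enableEscapeAnyChar) {
        JsonFactory factory = new JsonFactory();
        if (enableNanInf) {
            // Accept NaN, Infinity, -Infinity as numeric literals.
            factory.enable(JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS);
        }
        if (enableEscapeAnyChar) {
            // Accept backslash-escaping of any character, e.g. "\x".
            factory.enable(JsonParser.Feature.ALLOW_BACKSLASH_ESCAPING_ANY_CHARACTER);
        }
        return factory;
    }
}

With both features enabled, input such as {"a": NaN} or the string "\x" parses instead of raising a JsonParseException.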

Aggregations

JsonParseException (com.fasterxml.jackson.core.JsonParseException): 2
IOException (java.io.IOException): 2
ExecutionSetupException (org.apache.drill.common.exceptions.ExecutionSetupException): 2
UserException (org.apache.drill.common.exceptions.UserException): 2
OutOfMemoryException (org.apache.drill.exec.exception.OutOfMemoryException): 2
CountingJsonReader (org.apache.drill.exec.store.easy.json.reader.CountingJsonReader): 2
VectorContainerWriter (org.apache.drill.exec.vector.complex.impl.VectorContainerWriter): 2
JsonReader (org.apache.drill.exec.vector.complex.fn.JsonReader): 1