
Example 41 with ExecutionSetupException

use of org.apache.drill.common.exceptions.ExecutionSetupException in project drill by apache.

the class TestScanBasics method testNoReader.

/**
 * Pathological case in which a scan operator is provided no readers.
 * The scan will throw a user exception because the downstream operators
 * can't handle this case, so we choose to stop the show early to
 * avoid getting into a strange state.
 */
@Test
public void testNoReader() {
    // Create the scan operator
    ScanFixture scanFixture = simpleFixture();
    ScanOperatorExec scan = scanFixture.scanOp;
    try {
        scan.buildSchema();
    } catch (UserException e) {
        // Expected
        assertTrue(e.getCause() instanceof ExecutionSetupException);
    }
    // Must close the DAG (context and scan operator) even on failures
    scanFixture.close();
}
Also used : ExecutionSetupException(org.apache.drill.common.exceptions.ExecutionSetupException) ScanOperatorExec(org.apache.drill.exec.physical.impl.scan.ScanOperatorExec) UserException(org.apache.drill.common.exceptions.UserException) Test(org.junit.Test) EvfTest(org.apache.drill.categories.EvfTest)
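
Worth noting: as written, the test passes silently if buildSchema() returns without throwing, because the try/catch only inspects an exception that actually occurs. A stricter variant is sketched below; it assumes JUnit 4.13+ for org.junit.Assert.assertThrows, and the method name is illustrative:

@Test
public void testNoReaderStrict() {
    ScanFixture scanFixture = simpleFixture();
    ScanOperatorExec scan = scanFixture.scanOp;
    try {
        // assertThrows fails the test if buildSchema() completes normally.
        UserException e = assertThrows(UserException.class, scan::buildSchema);
        assertTrue(e.getCause() instanceof ExecutionSetupException);
    } finally {
        // Must close the DAG (context and scan operator) even on failures
        scanFixture.close();
    }
}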

Example 42 with ExecutionSetupException

use of org.apache.drill.common.exceptions.ExecutionSetupException in project drill by apache.

the class DruidScanBatchCreator method getBatch.

@Override
public CloseableRecordBatch getBatch(ExecutorFragmentContext context, DruidSubScan subScan, List<RecordBatch> children) throws ExecutionSetupException {
    Preconditions.checkArgument(children.isEmpty());
    List<RecordReader> readers = Lists.newArrayList();
    List<SchemaPath> columns;
    for (DruidSubScan.DruidSubScanSpec scanSpec : subScan.getScanSpec()) {
        try {
            columns = subScan.getColumns();
            readers.add(new DruidRecordReader(scanSpec, columns, subScan.getMaxRecordsToRead(), context, subScan.getStorageEngine()));
        } catch (Exception ex) {
            throw new ExecutionSetupException(ex);
        }
    }
    logger.debug("Number of record readers initialized - {}", readers.size());
    return new ScanBatch(subScan, context, readers);
}
Also used : ExecutionSetupException(org.apache.drill.common.exceptions.ExecutionSetupException) SchemaPath(org.apache.drill.common.expression.SchemaPath) RecordReader(org.apache.drill.exec.store.RecordReader) ScanBatch(org.apache.drill.exec.physical.impl.ScanBatch)
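
This wrap-and-rethrow shape recurs across storage plugins: any exception raised while constructing a reader is converted into the checked ExecutionSetupException, giving the fragment executor a single setup-failure type to handle. A minimal standalone sketch of the pattern, where ReaderSpec and createReader are illustrative placeholders, not Drill API:

List<RecordReader> readers = new LinkedList<>();
for (ReaderSpec spec : specs) {
    try {
        readers.add(createReader(spec));
    } catch (Exception e) {
        // Preserve the root cause; callers can recover it via getCause().
        throw new ExecutionSetupException(e);
    }
}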

Example 43 with ExecutionSetupException

use of org.apache.drill.common.exceptions.ExecutionSetupException in project drill by apache.

the class OpenTSDBBatchCreator method getBatch.

@Override
public CloseableRecordBatch getBatch(ExecutorFragmentContext context, OpenTSDBSubScan subScan, List<RecordBatch> children) throws ExecutionSetupException {
    List<RecordReader> readers = new LinkedList<>();
    List<SchemaPath> columns;
    for (OpenTSDBSubScan.OpenTSDBSubScanSpec scanSpec : subScan.getTabletScanSpecList()) {
        try {
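            // A null projection means the query selects everything; default to all columns.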
            if ((columns = subScan.getColumns()) == null) {
                columns = GroupScan.ALL_COLUMNS;
            }
            readers.add(new OpenTSDBRecordReader(subScan.getStorageEngine().getClient(), scanSpec, columns));
        } catch (Exception e) {
            throw new ExecutionSetupException(e);
        }
    }
    return new ScanBatch(subScan, context, readers);
}
Also used : ExecutionSetupException(org.apache.drill.common.exceptions.ExecutionSetupException) SchemaPath(org.apache.drill.common.expression.SchemaPath) RecordReader(org.apache.drill.exec.store.RecordReader) ScanBatch(org.apache.drill.exec.physical.impl.ScanBatch) LinkedList(java.util.LinkedList)

Example 44 with ExecutionSetupException

use of org.apache.drill.common.exceptions.ExecutionSetupException in project drill by apache.

the class ClassicConnectorLocator method create.

/**
 * Creates a plugin instance with the given {@code name} and configuration {@code pluginConfig}.
 * The plugin needs to be present in the list of available plugins and be enabled in the configuration.
 *
 * @param name name of the plugin
 * @param pluginConfig plugin configuration
 * @return plugin instance, or {@code null} if the plugin is disabled
 */
@Override
public StoragePlugin create(String name, StoragePluginConfig pluginConfig) throws ExecutionSetupException {
    StoragePlugin plugin;
    Constructor<? extends StoragePlugin> constructor = availablePlugins.get(pluginConfig.getClass());
    if (constructor == null) {
        throw new ExecutionSetupException(String.format("Failure finding StoragePlugin constructor for config %s", pluginConfig.getClass().getName()));
    }
    try {
        plugin = constructor.newInstance(pluginConfig, context.drillbitContext(), name);
        plugin.start();
        return plugin;
    } catch (ReflectiveOperationException | IOException e) {
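        // Unwrap InvocationTargetException so the reported cause is the
        // plugin constructor's actual failure, not the reflection wrapper.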
        Throwable t = e instanceof InvocationTargetException ? ((InvocationTargetException) e).getTargetException() : e;
        if (t instanceof ExecutionSetupException) {
            throw ((ExecutionSetupException) t);
        }
        throw new ExecutionSetupException(String.format("Failure setting up new storage plugin configuration for config %s", pluginConfig.getClass().getSimpleName()), t);
    }
}
Also used : ExecutionSetupException(org.apache.drill.common.exceptions.ExecutionSetupException) IOException(java.io.IOException) InvocationTargetException(java.lang.reflect.InvocationTargetException)
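
A hedged usage sketch of create() follows; locator and config stand in for objects supplied by the surrounding registry code:

try {
    // create() both constructs and starts the plugin.
    StoragePlugin plugin = locator.create("myPlugin", config);
    // ... use or register the started plugin ...
} catch (ExecutionSetupException e) {
    // Because of the unwrapping above, getCause() is the constructor's or
    // start()'s real failure, not the reflective InvocationTargetException.
    logger.error("Failed to set up storage plugin", e);
}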

Example 45 with ExecutionSetupException

use of org.apache.drill.common.exceptions.ExecutionSetupException in project drill by apache.

the class KuduRecordReader method setup.

@Override
public void setup(OperatorContext context, OutputMutator output) throws ExecutionSetupException {
    this.output = output;
    this.context = context;
    try {
        KuduTable table = client.openTable(scanSpec.getTableName());
        KuduScannerBuilder builder = client.newScannerBuilder(table);
        if (!isStarQuery()) {
            List<String> colNames = Lists.newArrayList();
            for (SchemaPath p : this.getColumns()) {
                colNames.add(p.getRootSegmentPath());
            }
            builder.setProjectedColumnNames(colNames);
        }
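        // Attribute time spent blocked on the Kudu server to this operator's wait stats.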
        context.getStats().startWait();
        try {
            scanner = builder.lowerBoundRaw(scanSpec.getStartKey()).exclusiveUpperBoundRaw(scanSpec.getEndKey()).build();
        } finally {
            context.getStats().stopWait();
        }
    } catch (Exception e) {
        throw new ExecutionSetupException(e);
    }
}
Also used : ExecutionSetupException(org.apache.drill.common.exceptions.ExecutionSetupException) KuduScannerBuilder(org.apache.kudu.client.KuduScanner.KuduScannerBuilder) SchemaPath(org.apache.drill.common.expression.SchemaPath) KuduTable(org.apache.kudu.client.KuduTable) UserException(org.apache.drill.common.exceptions.UserException) SchemaChangeException(org.apache.drill.exec.exception.SchemaChangeException)

Aggregations

ExecutionSetupException (org.apache.drill.common.exceptions.ExecutionSetupException): 94
IOException (java.io.IOException): 43
ScanBatch (org.apache.drill.exec.physical.impl.ScanBatch): 26
SchemaPath (org.apache.drill.common.expression.SchemaPath): 25
RecordReader (org.apache.drill.exec.store.RecordReader): 24
SchemaChangeException (org.apache.drill.exec.exception.SchemaChangeException): 22
LinkedList (java.util.LinkedList): 16
Map (java.util.Map): 14
MaterializedField (org.apache.drill.exec.record.MaterializedField): 13
ExecutionException (java.util.concurrent.ExecutionException): 10
DrillRuntimeException (org.apache.drill.common.exceptions.DrillRuntimeException): 10
OperatorContext (org.apache.drill.exec.ops.OperatorContext): 8
UserException (org.apache.drill.common.exceptions.UserException): 7
MajorType (org.apache.drill.common.types.TypeProtos.MajorType): 7
JobConf (org.apache.hadoop.mapred.JobConf): 7
HashMap (java.util.HashMap): 6
List (java.util.List): 6
OutOfMemoryException (org.apache.drill.exec.exception.OutOfMemoryException): 6
VectorContainerWriter (org.apache.drill.exec.vector.complex.impl.VectorContainerWriter): 6
Path (org.apache.hadoop.fs.Path): 6