Example 1 with FilterTransformer

Use of org.apache.drill.metastore.iceberg.transform.FilterTransformer in project drill by apache.

From the class TestTablesOperationTransformer, method testToOverwriteOperation.

@Test
public void testToOverwriteOperation() {
    // Describe the table whose metadata will be overwritten.
    TableMetadataUnit unit = TableMetadataUnit.builder()
        .storagePlugin("dfs")
        .workspace("tmp")
        .tableName("nation")
        .metadataKey("dir0")
        .build();
    TableKey tableKey = new TableKey(unit.storagePlugin(), unit.workspace(), unit.tableName());

    // Build filter conditions from the table key plus the metadata key.
    Map<MetastoreColumn, Object> filterConditions = new HashMap<>(tableKey.toFilterConditions());
    filterConditions.put(MetastoreColumn.METADATA_KEY, unit.metadataKey());

    // 'location' and 'transformer' are fixtures initialized elsewhere in the test class.
    String location = tableKey.toLocation(TestTablesOperationTransformer.location);
    Expression expression = new FilterTransformer().transform(filterConditions);
    Overwrite operation = transformer.toOverwrite(location, expression, Collections.singletonList(unit));

    // The overwrite operation must carry the same filter and place its data file under 'location'.
    assertEquals(expression.toString(), operation.filter().toString());
    Path path = new Path(String.valueOf(operation.dataFile().path()));
    File file = new File(path.toUri().getPath());
    assertTrue(file.exists());
    assertEquals(location, path.getParent().toUri().getPath());
}
Also used:
Path (org.apache.hadoop.fs.Path)
Overwrite (org.apache.drill.metastore.iceberg.operate.Overwrite)
TableMetadataUnit (org.apache.drill.metastore.components.tables.TableMetadataUnit)
HashMap (java.util.HashMap)
FilterExpression (org.apache.drill.metastore.expressions.FilterExpression)
Expression (org.apache.iceberg.expressions.Expression)
File (java.io.File)
MetastoreColumn (org.apache.drill.metastore.MetastoreColumn)
FilterTransformer (org.apache.drill.metastore.iceberg.transform.FilterTransformer)
Test (org.junit.Test)
IcebergBaseTest (org.apache.drill.metastore.iceberg.IcebergBaseTest)
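
For reference, the filter built in this test can be approximated directly with Iceberg's expression API. The following is a minimal, hypothetical sketch, assuming FilterTransformer turns each map entry into an equality predicate and joins them with AND; the column names mirror the values set on the builder and are stand-ins for the actual MetastoreColumn names.

import org.apache.iceberg.expressions.Expression;
import org.apache.iceberg.expressions.Expressions;

public class FilterSketch {
    public static void main(String[] args) {
        // Hypothetical equivalent of new FilterTransformer().transform(filterConditions):
        // one equality predicate per condition, combined pairwise with AND.
        Expression expected = Expressions.and(
            Expressions.and(
                Expressions.equal("storagePlugin", "dfs"),
                Expressions.equal("workspace", "tmp")),
            Expressions.and(
                Expressions.equal("tableName", "nation"),
                Expressions.equal("metadataKey", "dir0")));
        System.out.println(expected);
    }
}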

Example 2 with FilterTransformer

Use of org.apache.drill.metastore.iceberg.transform.FilterTransformer in project drill by apache.

From the class IcebergRead, method internalExecute.

@Override
protected List<T> internalExecute() {
    // Project the requested columns, or the default column set if none were given.
    String[] selectedColumns = columns.isEmpty()
        ? defaultColumns
        : columns.stream().map(MetastoreColumn::columnName).toArray(String[]::new);
    // Combine the metadata-type filter and the caller's filter into one Iceberg row filter.
    FilterTransformer filterTransformer = context.transformer().filter();
    Expression rowFilter = filterTransformer.combine(
        filterTransformer.transform(metadataTypes), filterTransformer.transform(filter));
    // Scan the table with projection and filter, then map the records to output units.
    Iterable<Record> records = IcebergGenerics.read(context.table())
        .select(selectedColumns).where(rowFilter).build();
    return context.transformer().outputData()
        .columns(selectedColumns).records(Lists.newArrayList(records)).execute();
}
Also used:
Expression (org.apache.iceberg.expressions.Expression)
Record (org.apache.iceberg.data.Record)
MetastoreColumn (org.apache.drill.metastore.MetastoreColumn)
FilterTransformer (org.apache.drill.metastore.iceberg.transform.FilterTransformer)
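
The scan above is a thin wrapper over Iceberg's generic data API. Below is a minimal standalone sketch of the same select/where/build pattern, assuming a Table handle is available; the table parameter, column names, and filter value are hypothetical stand-ins for context.table() and the MetastoreColumn names.

import java.io.IOException;
import org.apache.iceberg.Table;
import org.apache.iceberg.data.IcebergGenerics;
import org.apache.iceberg.data.Record;
import org.apache.iceberg.expressions.Expressions;
import org.apache.iceberg.io.CloseableIterable;

public class ScanSketch {
    // 'table' stands in for context.table(); the column names are assumptions.
    static void scan(Table table) throws IOException {
        try (CloseableIterable<Record> records = IcebergGenerics.read(table)
                .select("storagePlugin", "workspace", "tableName")
                .where(Expressions.equal("storagePlugin", "dfs"))
                .build()) {
            for (Record record : records) {
                // Projected fields are accessible by name on each generic record.
                System.out.println(record.getField("tableName"));
            }
        }
    }
}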

Aggregations

MetastoreColumn (org.apache.drill.metastore.MetastoreColumn): 2
FilterTransformer (org.apache.drill.metastore.iceberg.transform.FilterTransformer): 2
Expression (org.apache.iceberg.expressions.Expression): 2
File (java.io.File): 1
HashMap (java.util.HashMap): 1
TableMetadataUnit (org.apache.drill.metastore.components.tables.TableMetadataUnit): 1
FilterExpression (org.apache.drill.metastore.expressions.FilterExpression): 1
IcebergBaseTest (org.apache.drill.metastore.iceberg.IcebergBaseTest): 1
Overwrite (org.apache.drill.metastore.iceberg.operate.Overwrite): 1
Path (org.apache.hadoop.fs.Path): 1
Record (org.apache.iceberg.data.Record): 1
Test (org.junit.Test): 1