
Example 6 with CompiledPlan

Use of org.apache.flink.table.api.CompiledPlan in project flink by apache.

Class TableEnvironmentImpl, method compilePlanAndWrite:

private CompiledPlan compilePlanAndWrite(
        String filePath, boolean ifNotExists, Operation operation) {
    File file = Paths.get(filePath).toFile();
    if (file.exists()) {
        // The plan file already exists: either reuse it (IF NOT EXISTS) or require
        // table.plan.force-recompile to be set before overwriting it.
        if (ifNotExists) {
            return loadPlan(PlanReference.fromFile(filePath));
        }
        if (!tableConfig.getConfiguration().get(TableConfigOptions.PLAN_FORCE_RECOMPILE)) {
            throw new TableException(
                    String.format(
                            "Cannot overwrite the plan file '%s'. "
                                    + "Either manually remove the file or, "
                                    + "if you're debugging your job, "
                                    + "set the option '%s' to true.",
                            filePath, TableConfigOptions.PLAN_FORCE_RECOMPILE.key()));
        }
    }

    // Compile the statement set or the single modify operation into a CompiledPlan.
    CompiledPlan compiledPlan;
    if (operation instanceof StatementSetOperation) {
        compiledPlan = compilePlan(((StatementSetOperation) operation).getOperations());
    } else if (operation instanceof ModifyOperation) {
        compiledPlan = compilePlan(Collections.singletonList((ModifyOperation) operation));
    } else {
        throw new TableException(
                "Unsupported operation to compile: "
                        + operation.getClass()
                        + ". This is a bug, please file an issue.");
    }

    // Persist the plan; overwriting was already handled above, so do not ignore an existing file.
    compiledPlan.writeToFile(file, false);
    return compiledPlan;
}
Also used: CompiledPlan (org.apache.flink.table.api.CompiledPlan), TableException (org.apache.flink.table.api.TableException), ModifyOperation (org.apache.flink.table.operations.ModifyOperation), SinkModifyOperation (org.apache.flink.table.operations.SinkModifyOperation), CollectModifyOperation (org.apache.flink.table.operations.CollectModifyOperation), File (java.io.File), StatementSetOperation (org.apache.flink.table.operations.StatementSetOperation)
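The user-facing counterpart of compilePlanAndWrite is a compile-then-persist round trip. Below is a minimal sketch, assuming a TableEnvironment that exposes the same compilePlanSql, loadPlan and executePlan calls used in the examples on this page; the table names MySource/MySink and the file path are placeholders.

import java.io.File;

import org.apache.flink.table.api.CompiledPlan;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.PlanReference;
import org.apache.flink.table.api.TableEnvironment;

public class PersistPlanSketch {

    public static void main(String[] args) throws Exception {
        TableEnvironment tableEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());
        // Assumes 'MySource' and 'MySink' have already been created via DDL.
        File planFile = new File("/tmp/insert-plan.json");

        CompiledPlan plan;
        if (planFile.exists()) {
            // Reuse the persisted plan, mirroring the ifNotExists branch above.
            plan = tableEnv.loadPlan(PlanReference.fromFile(planFile.getAbsolutePath()));
        } else {
            plan = tableEnv.compilePlanSql("INSERT INTO MySink SELECT * FROM MySource");
            // 'false': do not silently ignore an already existing file.
            plan.writeToFile(planFile, false);
        }

        tableEnv.executePlan(plan).await();
    }
}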

Example 7 with CompiledPlan

Use of org.apache.flink.table.api.CompiledPlan in project flink by apache.

Class TableCsvFormatITCase, method testReadingMetadata:

@Test
public void testReadingMetadata() throws Exception {
    // Source with a metadata column 'm' exposed via the test 'values' connector.
    createTestValuesSourceTable(
            "MyTable",
            JavaScalaConversionUtil.toJava(TestData.smallData3()),
            new String[] {"a int", "b bigint", "m varchar metadata"},
            new HashMap<String, String>() {
                {
                    put("readable-metadata", "m:STRING");
                }
            });
    File sinkPath = createSinkTable("MySink", "a bigint", "m varchar");

    CompiledPlan compiledPlan =
            tableEnv.compilePlanSql("insert into MySink select a, m from MyTable");
    tableEnv.executePlan(compiledPlan).await();

    assertResult(Arrays.asList("1,Hi", "2,Hello", "3,Hello world"), sinkPath);
}
Also used: CompiledPlan (org.apache.flink.table.api.CompiledPlan), File (java.io.File), Test (org.junit.Test)
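createTestValuesSourceTable and createSinkTable are helpers from the test base and are not shown here. As a rough, hypothetical illustration of what the source registration amounts to, the sketch below declares the same metadata column against Flink's test-only 'values' connector (the registration of the actual rows, e.g. the connector's data-id option, is omitted).

import org.apache.flink.table.api.TableEnvironment;

final class MetadataTableDdlSketch {

    // Declares a source with a metadata column 'm', mirroring the helper call above.
    static void createMyTable(TableEnvironment tableEnv) {
        tableEnv.executeSql(
                "CREATE TABLE MyTable (\n"
                        + "  a INT,\n"
                        + "  b BIGINT,\n"
                        + "  m VARCHAR METADATA\n"
                        + ") WITH (\n"
                        + "  'connector' = 'values',\n"
                        + "  'readable-metadata' = 'm:STRING'\n"
                        + ")");
    }
}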

Example 8 with CompiledPlan

Use of org.apache.flink.table.api.CompiledPlan in project flink by apache.

Class TableCsvFormatITCase, method testPushDowns:

@Test
public void testPushDowns() throws Exception {
    // Source with filter, partition and watermark push-down enabled through the
    // test 'values' connector options; partitioned on 'b'.
    createTestValuesSourceTable(
            "MyTable",
            JavaScalaConversionUtil.toJava(TestData.data3WithTimestamp()),
            new String[] {
                "a int", "b bigint", "c varchar", "ts timestamp(3)",
                "watermark for ts as ts - interval '5' second"
            },
            "b",
            new HashMap<String, String>() {
                {
                    put("readable-metadata", "a:INT");
                    put("filterable-fields", "a");
                    put("enable-watermark-push-down", "true");
                    put("partition-list", "b:1;b:2;b:3;b:4;b:5;b:6");
                }
            });
    File sinkPath = createSinkTable("MySink", "a int", "ts timestamp(3)");

    CompiledPlan compiledPlan =
            tableEnv.compilePlanSql(
                    "insert into MySink select a, ts from MyTable where b = 3 and a > 4");
    tableEnv.executePlan(compiledPlan).await();

    assertResult(
            Arrays.asList("5," + formatSqlTimestamp(5000L), "6," + formatSqlTimestamp(6000L)),
            sinkPath);
}
Also used: CompiledPlan (org.apache.flink.table.api.CompiledPlan), File (java.io.File), Test (org.junit.Test)
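To see that the filter, partition and watermark push-downs actually end up encoded in the plan, the compiled plan can be inspected before it is executed. A small sketch, assuming CompiledPlan's explain() and asJsonString() accessors (available in recent Flink releases) and the table names from the test above:

import org.apache.flink.table.api.CompiledPlan;
import org.apache.flink.table.api.TableEnvironment;

final class PushDownInspectionSketch {

    // Compiles the same INSERT statement as the test and prints the plan so the
    // baked-in push-downs can be checked by eye.
    static void inspect(TableEnvironment tableEnv) {
        CompiledPlan plan =
                tableEnv.compilePlanSql(
                        "insert into MySink select a, ts from MyTable where b = 3 and a > 4");
        // Human-readable operator tree of the compiled plan.
        System.out.println(plan.explain());
        // Raw JSON representation, i.e. what writeToFile would persist.
        System.out.println(plan.asJsonString());
    }
}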

Example 9 with CompiledPlan

Use of org.apache.flink.table.api.CompiledPlan in project flink by apache.

Class TableCsvFormatITCase, method testPartitionPushDown:

@Test
public void testPartitionPushDown() throws Exception {
    // Source partitioned on 'p'; only partition p=2 should be read.
    createTestValuesSourceTable(
            "MyTable",
            JavaScalaConversionUtil.toJava(TestData.smallData3()),
            new String[] {"a int", "p bigint", "c varchar"},
            "p",
            new HashMap<String, String>() {
                {
                    put("partition-list", "p:1;p:2");
                }
            });
    File sinkPath = createSinkTable("MySink", "a int", "p bigint", "c varchar");

    CompiledPlan compiledPlan =
            tableEnv.compilePlanSql("insert into MySink select * from MyTable where p = 2");
    tableEnv.executePlan(compiledPlan).await();

    assertResult(Arrays.asList("2,2,Hello", "3,2,Hello world"), sinkPath);
}
Also used: CompiledPlan (org.apache.flink.table.api.CompiledPlan), File (java.io.File), Test (org.junit.Test)

Example 10 with CompiledPlan

Use of org.apache.flink.table.api.CompiledPlan in project flink by apache.

Class JsonPlanTestBase, method compileSqlAndExecutePlan:

protected TableResult compileSqlAndExecutePlan(String sql) {
    // Compile the statement, verify that every transformation has a UID set,
    // then submit the compiled plan for execution.
    CompiledPlan compiledPlan = tableEnv.compilePlanSql(sql);
    checkTransformationUids(compiledPlan);
    return tableEnv.executePlan(compiledPlan);
}
Also used: CompiledPlan (org.apache.flink.table.api.CompiledPlan), ExecNodeGraphCompiledPlan (org.apache.flink.table.planner.plan.ExecNodeGraphCompiledPlan)
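A test built on this helper then reduces to a single call plus an assertion on the sink. The sketch below is hypothetical: the subclass name and the tables MyTable/MySink are placeholders assumed to be registered in a setup method, while compileSqlAndExecutePlan and await() are used exactly as in the examples above.

import org.junit.Test;

public class MyJsonPlanITCase extends JsonPlanTestBase {

    @Test
    public void testSimpleInsert() throws Exception {
        // Compiles the statement, checks transformation UIDs, and runs the job.
        compileSqlAndExecutePlan("insert into MySink select * from MyTable").await();
        // ... assert on the sink contents here.
    }
}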

Aggregations

CompiledPlan (org.apache.flink.table.api.CompiledPlan): 10
File (java.io.File): 7
Test (org.junit.Test): 7
TableException (org.apache.flink.table.api.TableException): 2
CollectModifyOperation (org.apache.flink.table.operations.CollectModifyOperation): 2
ModifyOperation (org.apache.flink.table.operations.ModifyOperation): 2
SinkModifyOperation (org.apache.flink.table.operations.SinkModifyOperation): 2
StatementSetOperation (org.apache.flink.table.operations.StatementSetOperation): 2
IOException (java.io.IOException): 1
ArrayList (java.util.ArrayList): 1
HashMap (java.util.HashMap): 1
List (java.util.List): 1
Map (java.util.Map): 1
Optional (java.util.Optional): 1
ExecutionException (java.util.concurrent.ExecutionException): 1
ExplainDetail (org.apache.flink.table.api.ExplainDetail): 1
SqlParserException (org.apache.flink.table.api.SqlParserException): 1
TableSchema (org.apache.flink.table.api.TableSchema): 1
ValidationException (org.apache.flink.table.api.ValidationException): 1
Catalog (org.apache.flink.table.catalog.Catalog): 1