Search in sources:

Example 1 with SqlCreateTable

Use of org.apache.storm.sql.parser.SqlCreateTable in project storm by apache.

From the class StormSqlImpl, the submit method.

@Override
public void submit(String name, Iterable<String> statements, Map<String, ?> stormConf, SubmitOptions opts, StormSubmitter.ProgressListener progressListener, String asUser) throws Exception {
    Map<String, ISqlTridentDataSource> dataSources = new HashMap<>();
    for (String sql : statements) {
        StormParser parser = new StormParser(sql);
        SqlNode node = parser.impl().parseSqlStmtEof();
        if (node instanceof SqlCreateTable) {
            handleCreateTableForTrident((SqlCreateTable) node, dataSources);
        } else if (node instanceof SqlCreateFunction) {
            handleCreateFunction((SqlCreateFunction) node);
        } else {
            QueryPlanner planner = new QueryPlanner(schema);
            AbstractTridentProcessor processor = planner.compile(dataSources, sql);
            TridentTopology topo = processor.build();
            Path jarPath = null;
            try {
                // QueryPlanner on Trident mode configures the topology with compiled classes,
                // so we need to add new classes into topology jar
                // Topology will be serialized and sent to Nimbus, and deserialized and executed in workers.
                jarPath = Files.createTempFile("storm-sql", ".jar");
                System.setProperty("storm.jar", jarPath.toString());
                packageTopology(jarPath, processor);
                StormSubmitter.submitTopologyAs(name, stormConf, topo.build(), opts, progressListener, asUser);
            } finally {
                if (jarPath != null) {
                    Files.delete(jarPath);
                }
            }
        }
    }
}
Also used: Path (java.nio.file.Path), HashMap (java.util.HashMap), TridentTopology (org.apache.storm.trident.TridentTopology), SqlCreateFunction (org.apache.storm.sql.parser.SqlCreateFunction), ISqlTridentDataSource (org.apache.storm.sql.runtime.ISqlTridentDataSource), StormParser (org.apache.storm.sql.parser.StormParser), SqlCreateTable (org.apache.storm.sql.parser.SqlCreateTable), QueryPlanner (org.apache.storm.sql.planner.trident.QueryPlanner), SqlNode (org.apache.calcite.sql.SqlNode)
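
For orientation, here is a minimal sketch of how a client might drive this submit path. It assumes StormSql.construct() is the public factory returning this implementation and that the abstract StormSql exposes the same submit signature shown above; the DDL text, Kafka LOCATION URI, and topology name are illustrative placeholders, not taken from the Storm sources.

import java.util.Arrays;
import java.util.HashMap;
import org.apache.storm.generated.SubmitOptions;
import org.apache.storm.generated.TopologyInitialStatus;
import org.apache.storm.sql.StormSql;

public class SubmitSketch {
    public static void main(String[] args) throws Exception {
        // Illustrative statements: a table definition followed by the query to run.
        Iterable<String> statements = Arrays.asList(
                "CREATE EXTERNAL TABLE ORDERS (ID INT PRIMARY KEY) "
                        + "LOCATION 'kafka://localhost:2181/brokers?topic=orders'",
                "SELECT ID FROM ORDERS WHERE ID > 0");
        StormSql sql = StormSql.construct();
        // submit() parses each statement: DDL updates the schema, while the query is
        // compiled into a Trident topology, packaged into a temporary jar, and sent to Nimbus.
        sql.submit("orders-filter", statements, new HashMap<String, Object>(),
                new SubmitOptions(TopologyInitialStatus.ACTIVE), null, null);
    }
}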

Example 2 with SqlCreateTable

Use of org.apache.storm.sql.parser.SqlCreateTable in project storm by apache.

From the class StormSqlImpl, the updateSchema method.

private List<FieldInfo> updateSchema(SqlCreateTable n) {
    TableBuilderInfo builder = new TableBuilderInfo(typeFactory);
    List<FieldInfo> fields = new ArrayList<>();
    for (ColumnDefinition col : n.fieldList()) {
        builder.field(col.name(), col.type(), col.constraint());
        RelDataType dataType = col.type().deriveType(typeFactory);
        Class<?> javaType = (Class<?>) typeFactory.getJavaClass(dataType);
        ColumnConstraint constraint = col.constraint();
        boolean isPrimary = constraint != null && constraint instanceof ColumnConstraint.PrimaryKey;
        fields.add(new FieldInfo(col.name(), javaType, isPrimary));
    }
    if (n.parallelism() != null) {
        builder.parallelismHint(n.parallelism());
    }
    Table table = builder.build();
    schema.add(n.tableName(), table);
    return fields;
}
Also used: Table (org.apache.calcite.schema.Table), ChainedSqlOperatorTable (org.apache.calcite.sql.util.ChainedSqlOperatorTable), SqlStdOperatorTable (org.apache.calcite.sql.fun.SqlStdOperatorTable), SqlCreateTable (org.apache.storm.sql.parser.SqlCreateTable), SqlOperatorTable (org.apache.calcite.sql.SqlOperatorTable), TableBuilderInfo (org.apache.storm.sql.compiler.CompilerUtil.TableBuilderInfo), ColumnConstraint (org.apache.storm.sql.parser.ColumnConstraint), ArrayList (java.util.ArrayList), RelDataType (org.apache.calcite.rel.type.RelDataType), FieldInfo (org.apache.storm.sql.runtime.FieldInfo), ColumnDefinition (org.apache.storm.sql.parser.ColumnDefinition)
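
updateSchema is private, so it cannot be called directly; as context, here is a small sketch of the input it consumes: a CREATE EXTERNAL TABLE statement parsed with StormParser and read back through the same accessors (tableName(), fieldList(), parallelism()) the method iterates over. The DDL text and LOCATION URI are illustrative only.

import org.apache.calcite.sql.SqlNode;
import org.apache.storm.sql.parser.ColumnDefinition;
import org.apache.storm.sql.parser.SqlCreateTable;
import org.apache.storm.sql.parser.StormParser;

public class ParseCreateTableSketch {
    public static void main(String[] args) throws Exception {
        // Illustrative DDL; the LOCATION URI is a placeholder.
        String ddl = "CREATE EXTERNAL TABLE ORDERS (ID INT PRIMARY KEY, UNIT_PRICE INT) "
                + "LOCATION 'kafka://localhost:2181/brokers?topic=orders'";
        SqlNode node = new StormParser(ddl).impl().parseSqlStmtEof();
        SqlCreateTable table = (SqlCreateTable) node;
        // The same accessors updateSchema() uses to populate the TableBuilderInfo above.
        System.out.println("table: " + table.tableName());
        for (ColumnDefinition col : table.fieldList()) {
            System.out.println(col.name() + " " + col.type() + " constraint=" + col.constraint());
        }
        System.out.println("parallelism hint: " + table.parallelism());
    }
}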

Example 3 with SqlCreateTable

Use of org.apache.storm.sql.parser.SqlCreateTable in project storm by apache.

From the class StormSqlImpl, the execute method.

@Override
public void execute(Iterable<String> statements, ChannelHandler result) throws Exception {
    Map<String, DataSource> dataSources = new HashMap<>();
    for (String sql : statements) {
        StormParser parser = new StormParser(sql);
        SqlNode node = parser.impl().parseSqlStmtEof();
        if (node instanceof SqlCreateTable) {
            handleCreateTable((SqlCreateTable) node, dataSources);
        } else if (node instanceof SqlCreateFunction) {
            handleCreateFunction((SqlCreateFunction) node);
        } else {
            FrameworkConfig config = buildFrameWorkConfig();
            Planner planner = Frameworks.getPlanner(config);
            SqlNode parse = planner.parse(sql);
            SqlNode validate = planner.validate(parse);
            RelNode tree = planner.convert(validate);
            PlanCompiler compiler = new PlanCompiler(typeFactory);
            AbstractValuesProcessor proc = compiler.compile(tree);
            proc.initialize(dataSources, result);
        }
    }
}
Also used: HashMap (java.util.HashMap), AbstractValuesProcessor (org.apache.storm.sql.runtime.AbstractValuesProcessor), SqlCreateTable (org.apache.storm.sql.parser.SqlCreateTable), ISqlTridentDataSource (org.apache.storm.sql.runtime.ISqlTridentDataSource), DataSource (org.apache.storm.sql.runtime.DataSource), PlanCompiler (org.apache.storm.sql.compiler.backends.standalone.PlanCompiler), RelNode (org.apache.calcite.rel.RelNode), SqlCreateFunction (org.apache.storm.sql.parser.SqlCreateFunction), Planner (org.apache.calcite.tools.Planner), QueryPlanner (org.apache.storm.sql.planner.trident.QueryPlanner), FrameworkConfig (org.apache.calcite.tools.FrameworkConfig), StormParser (org.apache.storm.sql.parser.StormParser), SqlNode (org.apache.calcite.sql.SqlNode)
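
The dispatch here follows the same pattern as the other overloads: DDL nodes are handled directly, and anything else goes through Calcite's Planner (parse, validate, convert) and the standalone PlanCompiler. A condensed sketch of that classification step, using only the parser calls shown above; the sample query string is illustrative:

import org.apache.calcite.sql.SqlNode;
import org.apache.storm.sql.parser.SqlCreateFunction;
import org.apache.storm.sql.parser.SqlCreateTable;
import org.apache.storm.sql.parser.StormParser;

public class StatementDispatchSketch {
    // Mirrors the dispatch in execute(): DDL statements update the schema or
    // function registry, everything else is planned and compiled into an
    // AbstractValuesProcessor that runs against the registered DataSources.
    static String classify(String sql) throws Exception {
        SqlNode node = new StormParser(sql).impl().parseSqlStmtEof();
        if (node instanceof SqlCreateTable) {
            return "DDL: CREATE EXTERNAL TABLE";
        } else if (node instanceof SqlCreateFunction) {
            return "DDL: CREATE FUNCTION";
        }
        return "query: goes through Planner.parse/validate/convert and PlanCompiler";
    }

    public static void main(String[] args) throws Exception {
        System.out.println(classify("SELECT ID FROM ORDERS WHERE ID > 0"));
    }
}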

Example 4 with SqlCreateTable

Use of org.apache.storm.sql.parser.SqlCreateTable in project storm by apache.

From the class StormSqlImpl, the explain method.

@Override
public void explain(Iterable<String> statements) throws Exception {
    for (String sql : statements) {
        System.out.println("===========================================================");
        System.out.println("query>");
        System.out.println(sql);
        System.out.println("-----------------------------------------------------------");
        StormParser parser = new StormParser(sql);
        SqlNode node = parser.impl().parseSqlStmtEof();
        if (node instanceof SqlCreateTable) {
            sqlContext.interpretCreateTable((SqlCreateTable) node);
            System.out.println("No plan presented on DDL");
        } else if (node instanceof SqlCreateFunction) {
            sqlContext.interpretCreateFunction((SqlCreateFunction) node);
            System.out.println("No plan presented on DDL");
        } else {
            String plan = sqlContext.explain(sql);
            System.out.println("plan>");
            System.out.println(plan);
        }
        System.out.println("===========================================================");
    }
}
Also used: SqlCreateFunction (org.apache.storm.sql.parser.SqlCreateFunction), StormParser (org.apache.storm.sql.parser.StormParser), SqlCreateTable (org.apache.storm.sql.parser.SqlCreateTable), SqlNode (org.apache.calcite.sql.SqlNode)
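
A minimal sketch of invoking this explain path from a client, assuming StormSql.construct() is the public entry point; the DDL and query strings are placeholders. The table definition is interpreted first so the following query can be planned, and only the query prints a plan.

import java.util.Arrays;
import org.apache.storm.sql.StormSql;

public class ExplainSketch {
    public static void main(String[] args) throws Exception {
        Iterable<String> statements = Arrays.asList(
                "CREATE EXTERNAL TABLE ORDERS (ID INT PRIMARY KEY) "
                        + "LOCATION 'kafka://localhost:2181/brokers?topic=orders'",
                "SELECT ID FROM ORDERS WHERE ID > 0");
        // Prints "No plan presented on DDL" for the table definition and the
        // physical plan for the query, as implemented above.
        StormSql.construct().explain(statements);
    }
}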

Example 5 with SqlCreateTable

Use of org.apache.storm.sql.parser.SqlCreateTable in project storm by apache.

From the class StormSqlLocalClusterImpl, the runLocal method.

public void runLocal(LocalCluster localCluster, Iterable<String> statements, Predicate<Void> waitCondition, long waitTimeoutMs) throws Exception {
    final Config conf = new Config();
    conf.setMaxSpoutPending(20);
    for (String sql : statements) {
        StormParser parser = new StormParser(sql);
        SqlNode node = parser.impl().parseSqlStmtEof();
        if (node instanceof SqlCreateTable) {
            sqlContext.interpretCreateTable((SqlCreateTable) node);
        } else if (node instanceof SqlCreateFunction) {
            sqlContext.interpretCreateFunction((SqlCreateFunction) node);
        } else {
            AbstractStreamsProcessor processor = sqlContext.compileSql(sql);
            StormTopology topo = processor.build();
            if (processor.getClassLoaders() != null && processor.getClassLoaders().size() > 0) {
                CompilingClassLoader lastClassloader = processor.getClassLoaders().get(processor.getClassLoaders().size() - 1);
                Utils.setClassLoaderForJavaDeSerialize(lastClassloader);
            }
            try (LocalCluster.LocalTopology stormTopo = localCluster.submitTopology("storm-sql", conf, topo)) {
                waitForCompletion(waitTimeoutMs, waitCondition);
            } finally {
                while (localCluster.getTopologySummaries().size() > 0) {
                    Thread.sleep(10);
                }
                Utils.resetClassLoaderForJavaDeSerialize();
            }
        }
    }
}
Also used: CompilingClassLoader (org.apache.storm.sql.javac.CompilingClassLoader), Config (org.apache.storm.Config), StormTopology (org.apache.storm.generated.StormTopology), SqlCreateFunction (org.apache.storm.sql.parser.SqlCreateFunction), StormParser (org.apache.storm.sql.parser.StormParser), SqlCreateTable (org.apache.storm.sql.parser.SqlCreateTable), SqlNode (org.apache.calcite.sql.SqlNode)
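
A rough sketch of how a test might call runLocal. Only the runLocal signature comes from the code above; the package and no-arg constructor of StormSqlLocalClusterImpl, the mock LOCATION scheme, and the wait-condition lambda are assumptions for illustration.

import java.util.Arrays;
import org.apache.storm.LocalCluster;
import org.apache.storm.sql.StormSqlLocalClusterImpl; // package assumed for illustration

public class RunLocalSketch {
    public static void main(String[] args) throws Exception {
        Iterable<String> statements = Arrays.asList(
                "CREATE EXTERNAL TABLE ORDERS (ID INT PRIMARY KEY) LOCATION 'mock:///orders'",
                "SELECT ID FROM ORDERS");
        // LocalCluster is AutoCloseable in recent Storm releases.
        try (LocalCluster cluster = new LocalCluster()) {
            // Assumes a no-arg constructor; the trivial wait condition returns true
            // immediately, whereas a real test would check the collected output.
            StormSqlLocalClusterImpl runner = new StormSqlLocalClusterImpl();
            runner.runLocal(cluster, statements, ignored -> true, 60_000L);
        }
    }
}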

Aggregations

SqlCreateTable (org.apache.storm.sql.parser.SqlCreateTable): 7 usages
SqlNode (org.apache.calcite.sql.SqlNode): 5 usages
SqlCreateFunction (org.apache.storm.sql.parser.SqlCreateFunction): 5 usages
StormParser (org.apache.storm.sql.parser.StormParser): 5 usages
Path (java.nio.file.Path): 2 usages
ArrayList (java.util.ArrayList): 2 usages
HashMap (java.util.HashMap): 2 usages
RelDataType (org.apache.calcite.rel.type.RelDataType): 2 usages
Table (org.apache.calcite.schema.Table): 2 usages
SqlOperatorTable (org.apache.calcite.sql.SqlOperatorTable): 2 usages
SqlStdOperatorTable (org.apache.calcite.sql.fun.SqlStdOperatorTable): 2 usages
ChainedSqlOperatorTable (org.apache.calcite.sql.util.ChainedSqlOperatorTable): 2 usages
StormTopology (org.apache.storm.generated.StormTopology): 2 usages
ColumnConstraint (org.apache.storm.sql.parser.ColumnConstraint): 2 usages
ColumnDefinition (org.apache.storm.sql.parser.ColumnDefinition): 2 usages
QueryPlanner (org.apache.storm.sql.planner.trident.QueryPlanner): 2 usages
FieldInfo (org.apache.storm.sql.runtime.FieldInfo): 2 usages
ISqlTridentDataSource (org.apache.storm.sql.runtime.ISqlTridentDataSource): 2 usages
RelNode (org.apache.calcite.rel.RelNode): 1 usage
FrameworkConfig (org.apache.calcite.tools.FrameworkConfig): 1 usage