Example 11 with SchemaKStream

Use of io.confluent.ksql.structured.SchemaKStream in project ksql by confluentinc.

From the class PhysicalPlanBuilder, the method buildPlanForBareQuery:

private QueryMetadata buildPlanForBareQuery(
    final QueuedSchemaKStream schemaKStream, final KsqlBareOutputNode bareOutputNode,
    final String serviceId, final String transientQueryPrefix, final String statement) {
    final String applicationId = addTimeSuffix(getBareQueryApplicationId(serviceId, transientQueryPrefix));
    KafkaStreams streams = buildStreams(builder, applicationId, ksqlConfig, overriddenStreamsProperties);
    SchemaKStream sourceSchemaKstream = schemaKStream.getSourceSchemaKStreams().get(0);
    return new QueuedQueryMetadata(
        statement, streams, bareOutputNode,
        schemaKStream.getExecutionPlan(""), schemaKStream.getQueue(),
        (sourceSchemaKstream instanceof SchemaKTable)
            ? DataSource.DataSourceType.KTABLE : DataSource.DataSourceType.KSTREAM,
        applicationId, kafkaTopicClient, builder.build());
}
Also used: SchemaKTable (io.confluent.ksql.structured.SchemaKTable), KafkaStreams (org.apache.kafka.streams.KafkaStreams), QueuedQueryMetadata (io.confluent.ksql.util.QueuedQueryMetadata), SchemaKStream (io.confluent.ksql.structured.SchemaKStream), QueuedSchemaKStream (io.confluent.ksql.structured.QueuedSchemaKStream)
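
The ternary on the source type is what decides whether the transient query is reported as reading from a table or a stream. Below is a minimal sketch of that check factored into a hypothetical helper (it is not part of PhysicalPlanBuilder), assuming the same SchemaKStream, SchemaKTable and DataSource types used in the excerpt above.

// Hypothetical helper, illustrative only: maps the concrete structured-stream
// type to the DataSourceType reported in the QueuedQueryMetadata.
private static DataSource.DataSourceType sourceTypeOf(final SchemaKStream source) {
    // SchemaKTable is a subclass of SchemaKStream, so a table-backed source
    // is reported as KTABLE and everything else as KSTREAM.
    return (source instanceof SchemaKTable)
        ? DataSource.DataSourceType.KTABLE
        : DataSource.DataSourceType.KSTREAM;
}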

Example 12 with SchemaKStream

Use of io.confluent.ksql.structured.SchemaKStream in project ksql by confluentinc.

From the class KsqlStructuredDataOutputNodeTest, the method shouldPartitionByFieldNameInPartitionByProperty:

@Test
public void shouldPartitionByFieldNameInPartitionByProperty() {
    createOutputNode(Collections.singletonMap(DdlConfig.PARTITION_BY_PROPERTY, "field2"));
    final SchemaKStream schemaKStream = buildStream();
    final Field keyField = schemaKStream.getKeyField();
    assertThat(keyField, equalTo(new Field("field2", 1, Schema.STRING_SCHEMA)));
    assertThat(schemaKStream.getSchema().fields(), equalTo(schema.fields()));
}
Also used: Field (org.apache.kafka.connect.data.Field), SchemaKStream (io.confluent.ksql.structured.SchemaKStream), Test (org.junit.Test)
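
The expected Field hard-codes the index 1 and Schema.STRING_SCHEMA. An equivalent assertion could resolve the expected key field from the source schema instead, so that index and type stay in sync with the schema definition. A small sketch under that assumption (illustrative only, not part of the test class):

// Illustrative alternative to the hard-coded Field: look the expected key
// field up in the schema rather than restating its index and type.
final Field expectedKeyField = schema.field("field2");
assertThat(schemaKStream.getKeyField().name(), equalTo(expectedKeyField.name()));
assertThat(schemaKStream.getKeyField().schema(), equalTo(expectedKeyField.schema()));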

Example 13 with SchemaKStream

Use of io.confluent.ksql.structured.SchemaKStream in project ksql by confluentinc.

From the class KsqlStructuredDataOutputNodeTest, the method shouldCreateSinkWithCorrectCleanupPolicyNonWindowedTable:

@Test
public void shouldCreateSinkWithCorrectCleanupPolicyNonWindowedTable() {
    KafkaTopicClient topicClientForNonWindowTable = EasyMock.mock(KafkaTopicClient.class);
    KsqlStructuredDataOutputNode outputNode = getKsqlStructuredDataOutputNode(false);
    StreamsBuilder streamsBuilder = new StreamsBuilder();
    Map<String, String> topicConfig = ImmutableMap.of(TopicConfig.CLEANUP_POLICY_CONFIG, TopicConfig.CLEANUP_POLICY_COMPACT);
    topicClientForNonWindowTable.createTopic("output", 4, (short) 3, topicConfig);
    EasyMock.replay(topicClientForNonWindowTable);
    SchemaKStream schemaKStream = outputNode.buildStream(
        streamsBuilder, ksqlConfig, topicClientForNonWindowTable,
        new FunctionRegistry(), new HashMap<>(), new MockSchemaRegistryClient());
    assertThat(schemaKStream, instanceOf(SchemaKTable.class));
    EasyMock.verify(topicClientForNonWindowTable);
}
Also used: StreamsBuilder (org.apache.kafka.streams.StreamsBuilder), SchemaKTable (io.confluent.ksql.structured.SchemaKTable), FunctionRegistry (io.confluent.ksql.function.FunctionRegistry), KafkaTopicClient (io.confluent.ksql.util.KafkaTopicClient), MockSchemaRegistryClient (io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient), SchemaKStream (io.confluent.ksql.structured.SchemaKStream), Test (org.junit.Test)
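
This test leans on EasyMock's record/replay/verify cycle: the createTopic call made before replay() records an expectation, and verify(mock) fails the test if buildStream never issues a matching call. A condensed sketch of the pattern, reusing the same KafkaTopicClient interface; expectedConfig is a hypothetical Map<String, String> standing in for the test's topicConfig.

// 1. Record: calling the mock before replay() registers the expected invocation.
KafkaTopicClient topicClient = EasyMock.mock(KafkaTopicClient.class);
topicClient.createTopic("output", 4, (short) 3, expectedConfig);
// 2. Replay: switch the mock from recording mode to checking mode.
EasyMock.replay(topicClient);
// 3. Exercise the code under test, e.g. outputNode.buildStream(..., topicClient, ...),
//    which is expected to create the sink topic through the client.
// 4. Verify: fails if the recorded createTopic call was never made.
EasyMock.verify(topicClient);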

Example 14 with SchemaKStream

Use of io.confluent.ksql.structured.SchemaKStream in project ksql by confluentinc.

From the class KsqlStructuredDataOutputNodeTest, the method shouldCreateSinkWithCorrectCleanupPolicyWindowedTable:

@Test
public void shouldCreateSinkWithCorrectCleanupPolicyWindowedTable() {
    KafkaTopicClient topicClientForWindowTable = EasyMock.mock(KafkaTopicClient.class);
    KsqlStructuredDataOutputNode outputNode = getKsqlStructuredDataOutputNode(true);
    StreamsBuilder streamsBuilder = new StreamsBuilder();
    topicClientForWindowTable.createTopic("output", 4, (short) 3, Collections.emptyMap());
    EasyMock.replay(topicClientForWindowTable);
    SchemaKStream schemaKStream = outputNode.buildStream(
        streamsBuilder, ksqlConfig, topicClientForWindowTable,
        new FunctionRegistry(), new HashMap<>(), new MockSchemaRegistryClient());
    assertThat(schemaKStream, instanceOf(SchemaKTable.class));
    EasyMock.verify(topicClientForWindowTable);
}
Also used: StreamsBuilder (org.apache.kafka.streams.StreamsBuilder), SchemaKTable (io.confluent.ksql.structured.SchemaKTable), FunctionRegistry (io.confluent.ksql.function.FunctionRegistry), KafkaTopicClient (io.confluent.ksql.util.KafkaTopicClient), MockSchemaRegistryClient (io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient), SchemaKStream (io.confluent.ksql.structured.SchemaKStream), Test (org.junit.Test)
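
Taken together, the two tests pin down the sink-topic configuration: a non-windowed table sink is expected to be created with cleanup.policy=compact, while a windowed table sink is created with no config overrides, so the broker default retention policy applies. A hedged sketch of that decision follows; it is illustrative only, not the actual KsqlStructuredDataOutputNode code, and isWindowed, topicClient, sinkTopicName, partitions and replicationFactor are hypothetical names.

// Illustrative only: the per-sink topic config the two tests expect.
final Map<String, String> sinkTopicConfig = isWindowed
    ? Collections.<String, String>emptyMap()
    : ImmutableMap.of(TopicConfig.CLEANUP_POLICY_CONFIG, TopicConfig.CLEANUP_POLICY_COMPACT);
// Non-windowed tables are plain upserts per key, so their changelog-like sink
// topic can be compacted; windowed sinks keep the broker default.
topicClient.createTopic(sinkTopicName, partitions, replicationFactor, sinkTopicConfig);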

Example 15 with SchemaKStream

Use of io.confluent.ksql.structured.SchemaKStream in project ksql by confluentinc.

From the class StructuredDataSourceNodeTest, the method shouldBuildSchemaKTableWhenKTableSource:

@Test
public void shouldBuildSchemaKTableWhenKTableSource() {
    StructuredDataSourceNode node = new StructuredDataSourceNode(
        new PlanNodeId("0"),
        new KsqlTable("sqlExpression", "datasource", schema, schema.field("field"), schema.field("timestamp"),
            new KsqlTopic("topic2", "topic2", new KsqlJsonTopicSerDe()), "statestore", false),
        schema);
    final SchemaKStream result = build(node);
    assertThat(result.getClass(), equalTo(SchemaKTable.class));
}
Also used: SchemaKTable (io.confluent.ksql.structured.SchemaKTable), KsqlJsonTopicSerDe (io.confluent.ksql.serde.json.KsqlJsonTopicSerDe), KsqlTable (io.confluent.ksql.metastore.KsqlTable), SchemaKStream (io.confluent.ksql.structured.SchemaKStream), KsqlTopic (io.confluent.ksql.metastore.KsqlTopic), Test (org.junit.Test)
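
The assertion checks the runtime type because SchemaKTable is a subclass of SchemaKStream: a table-backed source is expected to build a SchemaKTable, a stream-backed source a plain SchemaKStream. A small caller-side sketch under that assumption (illustrative only, using the test's own build(node) helper):

// Illustrative only: downstream code can branch on the concrete type returned
// when the data source node is built.
final SchemaKStream result = build(node);
if (result instanceof SchemaKTable) {
    // Table semantics: a changelog/upsert view of the topic, keyed by the table's key field.
} else {
    // Stream semantics: an append-only sequence of records.
}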

Aggregations

SchemaKStream (io.confluent.ksql.structured.SchemaKStream): 17
SchemaKTable (io.confluent.ksql.structured.SchemaKTable): 10
Test (org.junit.Test): 8
KsqlException (io.confluent.ksql.util.KsqlException): 5
StreamsBuilder (org.apache.kafka.streams.StreamsBuilder): 4
MockSchemaRegistryClient (io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient): 3
FunctionRegistry (io.confluent.ksql.function.FunctionRegistry): 3
KsqlTopicSerDe (io.confluent.ksql.serde.KsqlTopicSerDe): 3
KafkaTopicClient (io.confluent.ksql.util.KafkaTopicClient): 3
Field (org.apache.kafka.connect.data.Field): 3
GenericRow (io.confluent.ksql.GenericRow): 2
KsqlTable (io.confluent.ksql.metastore.KsqlTable): 2
QueuedSchemaKStream (io.confluent.ksql.structured.QueuedSchemaKStream): 2
HashMap (java.util.HashMap): 2
Schema (org.apache.kafka.connect.data.Schema): 2
KudafAggregator (io.confluent.ksql.function.udaf.KudafAggregator): 1
KudafInitializer (io.confluent.ksql.function.udaf.KudafInitializer): 1
KsqlTopic (io.confluent.ksql.metastore.KsqlTopic): 1
AddTimestampColumn (io.confluent.ksql.physical.AddTimestampColumn): 1
KsqlBareOutputNode (io.confluent.ksql.planner.plan.KsqlBareOutputNode): 1