
Example 1 with Materialization

Use of io.confluent.ksql.execution.streams.materialization.Materialization in project ksql by confluentinc.

From the class QueryBuilderTest, method shouldStartPersistentQueryWithCorrectMaterializationProvider.

@Test
public void shouldStartPersistentQueryWithCorrectMaterializationProvider() {
    // Given:
    final PersistentQueryMetadata queryMetadata = buildPersistentQuery(SOURCES, KsqlConstants.PersistentQueryType.CREATE_AS, QUERY_ID);
    queryMetadata.initialize();
    queryMetadata.start();
    // When:
    final Optional<Materialization> result = queryMetadata.getMaterialization(QUERY_ID, stacker);
    // Then:
    assertThat(result.get(), is(materialization));
}
Also used : KsMaterialization(io.confluent.ksql.execution.streams.materialization.ks.KsMaterialization) Materialization(io.confluent.ksql.execution.streams.materialization.Materialization) PersistentQueryMetadata(io.confluent.ksql.util.PersistentQueryMetadata) Test(org.junit.Test)

Example 2 with Materialization

Use of io.confluent.ksql.execution.streams.materialization.Materialization in project ksql by confluentinc.

From the class QueryBuilderTest, method shouldStartCreateSourceQueryWithMaterializationProvider.

@Test
public void shouldStartCreateSourceQueryWithMaterializationProvider() {
    when(ksqlConfig.getBoolean(KsqlConfig.KSQL_SHARED_RUNTIME_ENABLED)).thenReturn(true);
    // Given:
    final DataSource source = givenSource("foo");
    when(source.getSchema()).thenReturn(SINK_SCHEMA);
    when(source.getKsqlTopic()).thenReturn(ksqlTopic);
    final PersistentQueryMetadata queryMetadata = buildPersistentQuery(ImmutableSet.of(source), KsqlConstants.PersistentQueryType.CREATE_SOURCE, QUERY_ID, Optional.empty());
    queryMetadata.initialize();
    queryMetadata.register();
    queryMetadata.start();
    // When:
    final Optional<Materialization> result = queryMetadata.getMaterialization(QUERY_ID, stacker);
    // Then:
    assertThat(result.get(), is(materialization));
}
Also used : KsMaterialization(io.confluent.ksql.execution.streams.materialization.ks.KsMaterialization) Materialization(io.confluent.ksql.execution.streams.materialization.Materialization) PersistentQueryMetadata(io.confluent.ksql.util.PersistentQueryMetadata) DataSource(io.confluent.ksql.metastore.model.DataSource) Test(org.junit.Test)

Example 3 with Materialization

Use of io.confluent.ksql.execution.streams.materialization.Materialization in project ksql by confluentinc.

From the class KsMaterializationFunctionalTest, method shouldQueryMaterializedTableWithKeyFieldsInProjection.

@Test
public void shouldQueryMaterializedTableWithKeyFieldsInProjection() {
    // Given:
    final PersistentQueryMetadata query = executeQuery("CREATE TABLE " + output + " AS" + " SELECT USERID, COUNT(*), AS_VALUE(USERID) AS USERID_2 FROM " + USER_TABLE + " GROUP BY USERID;");
    final LogicalSchema schema = schema("KSQL_COL_0", SqlTypes.BIGINT, "USERID_2", SqlTypes.STRING);
    final Map<String, GenericRow> rows = waitForUniqueUserRows(STRING_DESERIALIZER, schema);
    // When:
    final Materialization materialization = query.getMaterialization(queryId, contextStacker).get();
    // Then:
    assertThat(materialization.windowType(), is(Optional.empty()));
    final MaterializedTable table = materialization.nonWindowed();
    rows.forEach((rowKey, value) -> {
        final GenericKey key = genericKey(rowKey);
        final List<Row> rowList = withRetry(() -> Lists.newArrayList(table.get(key, PARTITION)));
        assertThat(rowList.size(), is(1));
        assertThat(rowList.get(0).schema(), is(schema));
        assertThat(rowList.get(0).key(), is(key));
        assertThat(rowList.get(0).value(), is(value));
    });
}
Also used : GenericRow(io.confluent.ksql.GenericRow) Materialization(io.confluent.ksql.execution.streams.materialization.Materialization) LogicalSchema(io.confluent.ksql.schema.ksql.LogicalSchema) GenericKey(io.confluent.ksql.GenericKey) Row(io.confluent.ksql.execution.streams.materialization.Row) WindowedRow(io.confluent.ksql.execution.streams.materialization.WindowedRow) GenericRow(io.confluent.ksql.GenericRow) MaterializedTable(io.confluent.ksql.execution.streams.materialization.MaterializedTable) PersistentQueryMetadata(io.confluent.ksql.util.PersistentQueryMetadata) IntegrationTest(org.apache.kafka.test.IntegrationTest) Test(org.junit.Test)
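The `withRetry(...)` helper used in the lookup above is not shown on this page; its role is to retry the state-store read because the materialized store may briefly lag behind the rows produced to the topic. A minimal generic sketch of such a helper (assumed semantics: retry on `RuntimeException` with a short linear back-off, not the actual ksql test utility) could look like:

```java
import java.util.function.Supplier;

// Hypothetical stand-in for the withRetry(...) helper used in the tests above.
// Assumed semantics: retry the supplier on RuntimeException with a short
// linear back-off, since the materialized state store may lag briefly.
public final class RetrySketch {
    private static final int MAX_ATTEMPTS = 5;

    public static <T> T withRetry(final Supplier<T> supplier) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
            try {
                return supplier.get();
            } catch (final RuntimeException e) {
                last = e;
                try {
                    // Back off a little longer after each failed attempt.
                    Thread.sleep(50L * attempt);
                } catch (final InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    throw e;
                }
            }
        }
        throw last;
    }
}
```

Wrapping `table.get(key, PARTITION)` in such a helper makes the assertions tolerant of the small window between producing input rows and the store becoming queryable.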

Example 4 with Materialization

Use of io.confluent.ksql.execution.streams.materialization.Materialization in project ksql by confluentinc.

From the class KsMaterializationFunctionalTest, method shouldQueryMaterializedTableForAggregatedTable.

@Test
public void shouldQueryMaterializedTableForAggregatedTable() {
    // Given:
    final PersistentQueryMetadata query = executeQuery("CREATE TABLE " + output + " AS" + " SELECT USERID, COUNT(*) FROM " + USER_TABLE + " GROUP BY USERID;");
    final LogicalSchema schema = schema("KSQL_COL_0", SqlTypes.BIGINT);
    final Map<String, GenericRow> rows = waitForUniqueUserRows(STRING_DESERIALIZER, schema);
    // When:
    final Materialization materialization = query.getMaterialization(queryId, contextStacker).get();
    // Then:
    assertThat(materialization.windowType(), is(Optional.empty()));
    final MaterializedTable table = materialization.nonWindowed();
    rows.forEach((rowKey, value) -> {
        final GenericKey key = genericKey(rowKey);
        final Iterator<Row> rowIterator = withRetry(() -> table.get(key, PARTITION));
        assertThat(rowIterator.hasNext(), is(true));
        final Row row = rowIterator.next();
        assertThat(row.schema(), is(schema));
        assertThat(row.key(), is(key));
        assertThat(row.value(), is(value));
    });
    final GenericKey key = genericKey("Won't find me");
    assertThat("unknown key", withRetry(() -> table.get(key, PARTITION).hasNext()), is(false));
}
Also used : GenericRow(io.confluent.ksql.GenericRow) Materialization(io.confluent.ksql.execution.streams.materialization.Materialization) LogicalSchema(io.confluent.ksql.schema.ksql.LogicalSchema) GenericKey(io.confluent.ksql.GenericKey) Row(io.confluent.ksql.execution.streams.materialization.Row) WindowedRow(io.confluent.ksql.execution.streams.materialization.WindowedRow) GenericRow(io.confluent.ksql.GenericRow) MaterializedTable(io.confluent.ksql.execution.streams.materialization.MaterializedTable) PersistentQueryMetadata(io.confluent.ksql.util.PersistentQueryMetadata) IntegrationTest(org.apache.kafka.test.IntegrationTest) Test(org.junit.Test)

Example 5 with Materialization

Use of io.confluent.ksql.execution.streams.materialization.Materialization in project ksql by confluentinc.

From the class KsMaterializationFunctionalTest, method shouldHandleHavingClause.

@Test
public void shouldHandleHavingClause() {
    // Note: HAVING clauses are handled centrally by KsqlMaterialization. This logic will have been
    // installed as part of building the statement below:
    // Given:
    final PersistentQueryMetadata query = executeQuery("CREATE TABLE " + output + " AS" + " SELECT USERID, COUNT(*) AS COUNT FROM " + USER_TABLE + " GROUP BY USERID" + " HAVING SUM(REGISTERTIME) > 2;");
    final LogicalSchema schema = schema("COUNT", SqlTypes.BIGINT);
    final int matches = (int) USER_DATA_PROVIDER.data().values().stream().filter(row -> ((Long) row.get(0)) > 2).count();
    final Map<String, GenericRow> rows = waitForUniqueUserRows(matches, STRING_DESERIALIZER, schema);
    // When:
    final Materialization materialization = query.getMaterialization(queryId, contextStacker).get();
    // Then:
    final MaterializedTable table = materialization.nonWindowed();
    rows.forEach((rowKey, value) -> {
        // Rows passing the HAVING clause:
        final GenericKey key = genericKey(rowKey);
        final List<Row> rowList = withRetry(() -> Lists.newArrayList(table.get(key, PARTITION)));
        assertThat(rowList.size(), is(1));
        assertThat(rowList.get(0).schema(), is(schema));
        assertThat(rowList.get(0).key(), is(key));
        assertThat(rowList.get(0).value(), is(value));
    });
    USER_DATA_PROVIDER.data().entries().stream().filter(e -> !rows.containsKey(e.getKey().get(0))).forEach(e -> {
        // Rows filtered by the HAVING clause:
        final List<Row> rowList = withRetry(() -> Lists.newArrayList(table.get(e.getKey(), PARTITION)));
        assertThat(rowList.isEmpty(), is(true));
    });
}
Also used : GenericRow(io.confluent.ksql.GenericRow) Materialization(io.confluent.ksql.execution.streams.materialization.Materialization) LogicalSchema(io.confluent.ksql.schema.ksql.LogicalSchema) GenericKey(io.confluent.ksql.GenericKey) GenericKey.genericKey(io.confluent.ksql.GenericKey.genericKey) Row(io.confluent.ksql.execution.streams.materialization.Row) Lists(org.apache.commons.compress.utils.Lists) MaterializedTable(io.confluent.ksql.execution.streams.materialization.MaterializedTable) PersistentQueryMetadata(io.confluent.ksql.util.PersistentQueryMetadata) IntegrationTest(org.apache.kafka.test.IntegrationTest) Test(org.junit.Test)
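The note in Example 5 says the HAVING clause is applied centrally by KsqlMaterialization rather than inside the state store, which is why keys filtered out by HAVING return an empty iterator instead of a raw row. A conceptual sketch of that idea (not the real KsqlMaterialization code; the names here are illustrative) is a decorator that post-filters raw store lookups:

```java
import java.util.List;
import java.util.function.Function;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Conceptual sketch of applying a HAVING clause centrally as a post-filter:
// the decorated lookup returns only rows satisfying the predicate, so keys
// whose rows fail the HAVING condition yield an empty result, matching the
// behaviour asserted in shouldHandleHavingClause above.
public final class HavingFilterSketch {
    public static <K, V> Function<K, List<V>> withHavingFilter(
            final Function<K, List<V>> rawLookup,
            final Predicate<V> havingPredicate) {
        return key -> rawLookup.apply(key).stream()
                .filter(havingPredicate)
                .collect(Collectors.toList());
    }
}
```

Under this scheme the underlying Kafka Streams store keeps every aggregate, and the filter is applied once, at query time, for all query paths.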

Aggregations

Materialization (io.confluent.ksql.execution.streams.materialization.Materialization) 16
PersistentQueryMetadata (io.confluent.ksql.util.PersistentQueryMetadata) 16
Test (org.junit.Test) 16
IntegrationTest (org.apache.kafka.test.IntegrationTest) 13
GenericKey (io.confluent.ksql.GenericKey) 8
GenericRow (io.confluent.ksql.GenericRow) 8
WindowedRow (io.confluent.ksql.execution.streams.materialization.WindowedRow) 8
LogicalSchema (io.confluent.ksql.schema.ksql.LogicalSchema) 8
MaterializedWindowedTable (io.confluent.ksql.execution.streams.materialization.MaterializedWindowedTable) 7
MaterializedTable (io.confluent.ksql.execution.streams.materialization.MaterializedTable) 5
Row (io.confluent.ksql.execution.streams.materialization.Row) 5
Window (io.confluent.ksql.Window) 4
Windowed (org.apache.kafka.streams.kstream.Windowed) 4
KsMaterialization (io.confluent.ksql.execution.streams.materialization.ks.KsMaterialization) 3
Optional (java.util.Optional) 3
ConsumerRecord (org.apache.kafka.clients.consumer.ConsumerRecord) 3
TestKsqlContext (io.confluent.ksql.integration.TestKsqlContext) 2
Range (com.google.common.collect.Range) 1
GenericKey.genericKey (io.confluent.ksql.GenericKey.genericKey) 1
QueryContext (io.confluent.ksql.execution.context.QueryContext) 1