
Example 11 with KsqlTopic

Use of io.confluent.ksql.execution.ddl.commands.KsqlTopic in project ksql by confluentinc.

The class QueryBuilder, method buildPersistentQueryInSharedRuntime.

@SuppressWarnings("ParameterNumber")
PersistentQueryMetadata buildPersistentQueryInSharedRuntime(
        final KsqlConfig ksqlConfig,
        final KsqlConstants.PersistentQueryType persistentQueryType,
        final String statementText,
        final QueryId queryId,
        final Optional<DataSource> sinkDataSource,
        final Set<DataSource> sources,
        final ExecutionStep<?> physicalPlan,
        final String planSummary,
        final QueryMetadata.Listener listener,
        final Supplier<List<PersistentQueryMetadata>> allPersistentQueries,
        final String applicationId,
        final MetricCollectors metricCollectors) {
    final SharedKafkaStreamsRuntime sharedKafkaStreamsRuntime = getKafkaStreamsInstance(
        applicationId,
        sources.stream().map(DataSource::getName).collect(Collectors.toSet()),
        queryId,
        metricCollectors);
    final Map<String, Object> queryOverrides = sharedKafkaStreamsRuntime.getStreamProperties();
    final LogicalSchema logicalSchema;
    final KeyFormat keyFormat;
    final ValueFormat valueFormat;
    final KsqlTopic ksqlTopic;
    switch(persistentQueryType) {
        // CREATE_SOURCE does not have a sink, so the schema is obtained from the query source
        case CREATE_SOURCE:
            final DataSource dataSource = Iterables.getOnlyElement(sources);
            logicalSchema = dataSource.getSchema();
            keyFormat = dataSource.getKsqlTopic().getKeyFormat();
            valueFormat = dataSource.getKsqlTopic().getValueFormat();
            ksqlTopic = dataSource.getKsqlTopic();
            break;
        default:
            logicalSchema = sinkDataSource.get().getSchema();
            keyFormat = sinkDataSource.get().getKsqlTopic().getKeyFormat();
            valueFormat = sinkDataSource.get().getKsqlTopic().getValueFormat();
            ksqlTopic = sinkDataSource.get().getKsqlTopic();
            break;
    }
    final PhysicalSchema querySchema = PhysicalSchema.from(logicalSchema, keyFormat.getFeatures(), valueFormat.getFeatures());
    final NamedTopologyBuilder namedTopologyBuilder = sharedKafkaStreamsRuntime.getKafkaStreams().newNamedTopologyBuilder(queryId.toString(), PropertiesUtil.asProperties(queryOverrides));
    final RuntimeBuildContext runtimeBuildContext = buildContext(applicationId, queryId, namedTopologyBuilder);
    final Object result = buildQueryImplementation(physicalPlan, runtimeBuildContext);
    final NamedTopology topology = namedTopologyBuilder.build();
    final Optional<MaterializationProviderBuilderFactory.MaterializationProviderBuilder> materializationProviderBuilder =
        getMaterializationInfo(result).map(info -> materializationProviderBuilderFactory.materializationProviderBuilder(
            info, querySchema, keyFormat, queryOverrides, applicationId, queryId.toString()));
    final Optional<ScalablePushRegistry> scalablePushRegistry = applyScalablePushProcessor(
        querySchema.logicalSchema(), result, allPersistentQueries, queryOverrides,
        applicationId, ksqlConfig, ksqlTopic, serviceContext);
    final BinPackedPersistentQueryMetadataImpl binPackedPersistentQueryMetadata = new BinPackedPersistentQueryMetadataImpl(
        persistentQueryType,
        statementText,
        querySchema,
        sources.stream().map(DataSource::getName).collect(Collectors.toSet()),
        planSummary,
        applicationId,
        topology,
        sharedKafkaStreamsRuntime,
        runtimeBuildContext.getSchemas(),
        config.getOverrides(),
        queryId,
        materializationProviderBuilder,
        physicalPlan,
        getUncaughtExceptionProcessingLogger(queryId),
        sinkDataSource,
        listener,
        queryOverrides,
        scalablePushRegistry,
        (streamsRuntime) -> getNamedTopology(streamsRuntime, queryId, applicationId, queryOverrides, physicalPlan));
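    // Note: 'real' and 'config' are QueryBuilder fields not shown in this snippet; 'real' is
    // presumably false when building queries for a sandboxed engine, hence the wrapper below.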
    if (real) {
        return binPackedPersistentQueryMetadata;
    } else {
        return SandboxedBinPackedPersistentQueryMetadataImpl.of(binPackedPersistentQueryMetadata, listener);
    }
}
Also used : ValueFormat(io.confluent.ksql.serde.ValueFormat) LogicalSchema(io.confluent.ksql.schema.ksql.LogicalSchema) KeyFormat(io.confluent.ksql.serde.KeyFormat) DataSource(io.confluent.ksql.metastore.model.DataSource) ScalablePushRegistry(io.confluent.ksql.physical.scalablepush.ScalablePushRegistry) SharedKafkaStreamsRuntime(io.confluent.ksql.util.SharedKafkaStreamsRuntime) RuntimeBuildContext(io.confluent.ksql.execution.runtime.RuntimeBuildContext) PhysicalSchema(io.confluent.ksql.schema.ksql.PhysicalSchema) NamedTopologyBuilder(org.apache.kafka.streams.processor.internals.namedtopology.NamedTopologyBuilder) NamedTopology(org.apache.kafka.streams.processor.internals.namedtopology.NamedTopology) KsqlTopic(io.confluent.ksql.execution.ddl.commands.KsqlTopic) BinPackedPersistentQueryMetadataImpl(io.confluent.ksql.util.BinPackedPersistentQueryMetadataImpl) SandboxedBinPackedPersistentQueryMetadataImpl(io.confluent.ksql.util.SandboxedBinPackedPersistentQueryMetadataImpl)
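
In the snippet above, the switch only decides where the schema, formats, and topic come from. Condensed, the selection amounts to the sketch below, which reuses the same identifiers; the orElseThrow on the sink is an illustrative assumption, since the original simply calls get():

// CREATE_SOURCE has no sink, so schema/formats come from the single query source;
// every other persistent query type takes them from the sink data source.
final DataSource schemaSource =
    persistentQueryType == KsqlConstants.PersistentQueryType.CREATE_SOURCE
        ? Iterables.getOnlyElement(sources)
        : sinkDataSource.orElseThrow(() -> new IllegalStateException("sink expected"));
final KsqlTopic ksqlTopic = schemaSource.getKsqlTopic();
final PhysicalSchema querySchema = PhysicalSchema.from(
    schemaSource.getSchema(),
    ksqlTopic.getKeyFormat().getFeatures(),
    ksqlTopic.getValueFormat().getFeatures());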

Example 12 with KsqlTopic

Use of io.confluent.ksql.execution.ddl.commands.KsqlTopic in project ksql by confluentinc.

The class KsqlAuthorizationValidatorImpl, method validateQuery.

private void validateQuery(final KsqlSecurityContext securityContext, final MetaStore metaStore, final Query query) {
    for (KsqlTopic ksqlTopic : extractQueryTopics(query, metaStore)) {
        checkTopicAccess(securityContext, ksqlTopic.getKafkaTopicName(), AclOperation.READ);
        checkSchemaAccess(securityContext, ksqlTopic, AclOperation.READ);
    }
}
Also used : KsqlTopic(io.confluent.ksql.execution.ddl.commands.KsqlTopic)
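
extractQueryTopics is not part of this excerpt; presumably it collects the KsqlTopic of every source referenced by the query, for example via the SourceTopicsExtractor that also appears in Example 14. A minimal, hypothetical sketch:

// Hypothetical sketch: gather the KsqlTopic of every source referenced by the query.
private Set<KsqlTopic> extractQueryTopics(final Query query, final MetaStore metaStore) {
    final SourceTopicsExtractor extractor = new SourceTopicsExtractor(metaStore);
    extractor.process(query, null);
    return extractor.getSourceTopics();
}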

Example 13 with KsqlTopic

Use of io.confluent.ksql.execution.ddl.commands.KsqlTopic in project ksql by confluentinc.

The class KsqlAuthorizationValidatorImpl, method validateCreateAsSelect.

private void validateCreateAsSelect(final KsqlSecurityContext securityContext, final MetaStore metaStore, final CreateAsSelect createAsSelect) {
    /*
     * Check topic access for CREATE STREAM/TABLE AS SELECT statements.
     *
     * Validates Write on the target topic if exists, and Read on the query sources topics.
     *
     * The Create access is validated by the TopicCreateInjector which will attempt to create
     * the target topic using the same ServiceContext used for validation.
     */
    validateQuery(securityContext, metaStore, createAsSelect.getQuery());
    // At this point, the topic should have been created by the TopicCreateInjector
    final KsqlTopic sinkTopic = getCreateAsSelectSinkTopic(metaStore, createAsSelect);
    checkTopicAccess(securityContext, sinkTopic.getKafkaTopicName(), AclOperation.WRITE);
    checkSchemaAccess(securityContext, sinkTopic, AclOperation.WRITE);
}
Also used : KsqlTopic(io.confluent.ksql.execution.ddl.commands.KsqlTopic)
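
For orientation, the validator's public entry point is not included in these excerpts; a plausible (hypothetical) dispatch over the parsed statement type could look like this, where Statement stands for ksql's parser statement base type:

// Hypothetical dispatch sketch; the actual entry-point method is not shown above.
public void checkAuthorization(
    final KsqlSecurityContext securityContext,
    final MetaStore metaStore,
    final Statement statement
) {
    if (statement instanceof Query) {
        validateQuery(securityContext, metaStore, (Query) statement);
    } else if (statement instanceof CreateAsSelect) {
        validateCreateAsSelect(securityContext, metaStore, (CreateAsSelect) statement);
    }
}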

Example 14 with KsqlTopic

Use of io.confluent.ksql.execution.ddl.commands.KsqlTopic in project ksql by confluentinc.

The class KsqlAuthorizationValidatorImpl, method getCreateAsSelectSinkTopic.

private KsqlTopic getCreateAsSelectSinkTopic(final MetaStore metaStore, final CreateAsSelect createAsSelect) {
    final CreateSourceAsProperties properties = createAsSelect.getProperties();
    final String sinkTopicName;
    final KeyFormat sinkKeyFormat;
    final ValueFormat sinkValueFormat;
    if (!properties.getKafkaTopic().isPresent()) {
        final DataSource dataSource = metaStore.getSource(createAsSelect.getName());
        if (dataSource != null) {
            sinkTopicName = dataSource.getKafkaTopicName();
            sinkKeyFormat = dataSource.getKsqlTopic().getKeyFormat();
            sinkValueFormat = dataSource.getKsqlTopic().getValueFormat();
        } else {
            throw new KsqlException("Cannot validate for topic access from an unknown stream/table: " + createAsSelect.getName());
        }
    } else {
        sinkTopicName = properties.getKafkaTopic().get();
        // If no format is specified for the sink topic, then use the format from the primary
        // source topic.
        final SourceTopicsExtractor extractor = new SourceTopicsExtractor(metaStore);
        extractor.process(createAsSelect.getQuery(), null);
        final KsqlTopic primaryKsqlTopic = extractor.getPrimarySourceTopic();
        final Optional<Format> keyFormat = properties.getKeyFormat().map(formatName -> FormatFactory.fromName(formatName));
        final Optional<Format> valueFormat = properties.getValueFormat().map(formatName -> FormatFactory.fromName(formatName));
        sinkKeyFormat = keyFormat
            .map(format -> KeyFormat.of(
                FormatInfo.of(format.name()),
                format.supportsFeature(SerdeFeature.SCHEMA_INFERENCE)
                    ? SerdeFeatures.of(SerdeFeature.SCHEMA_INFERENCE)
                    : SerdeFeatures.of(),
                Optional.empty()))
            .orElse(primaryKsqlTopic.getKeyFormat());
        sinkValueFormat = valueFormat
            .map(format -> ValueFormat.of(
                FormatInfo.of(format.name()),
                format.supportsFeature(SerdeFeature.SCHEMA_INFERENCE)
                    ? SerdeFeatures.of(SerdeFeature.SCHEMA_INFERENCE)
                    : SerdeFeatures.of()))
            .orElse(primaryKsqlTopic.getValueFormat());
    }
    return new KsqlTopic(sinkTopicName, sinkKeyFormat, sinkValueFormat);
}
Also used : ValueFormat(io.confluent.ksql.serde.ValueFormat) KeyFormat(io.confluent.ksql.serde.KeyFormat) Format(io.confluent.ksql.serde.Format) CreateSourceAsProperties(io.confluent.ksql.parser.properties.with.CreateSourceAsProperties) SourceTopicsExtractor(io.confluent.ksql.topic.SourceTopicsExtractor) KsqlException(io.confluent.ksql.util.KsqlException) DataSource(io.confluent.ksql.metastore.model.DataSource) KsqlTopic(io.confluent.ksql.execution.ddl.commands.KsqlTopic)
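
The key- and value-format fallback in the else branch above is symmetric. Factored out, it could look like this hypothetical helper, which mirrors the expression assigned to sinkKeyFormat:

// Hypothetical helper: prefer the format declared in the WITH clause, inheriting schema
// inference if the format supports it, and fall back to the primary source topic's format.
private static KeyFormat resolveSinkKeyFormat(
    final Optional<Format> declared,
    final KsqlTopic primarySourceTopic
) {
    return declared
        .map(format -> KeyFormat.of(
            FormatInfo.of(format.name()),
            format.supportsFeature(SerdeFeature.SCHEMA_INFERENCE)
                ? SerdeFeatures.of(SerdeFeature.SCHEMA_INFERENCE)
                : SerdeFeatures.of(),
            Optional.empty()))
        .orElse(primarySourceTopic.getKeyFormat());
}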

Example 15 with KsqlTopic

Use of io.confluent.ksql.execution.ddl.commands.KsqlTopic in project ksql by confluentinc.

The class SourceNodeTest, method shouldBuildFromDataSource.

@Test
public void shouldBuildFromDataSource() {
    // Given:
    final LogicalSchema schema = LogicalSchema.builder().valueColumn(ColumnName.of("bob"), SqlTypes.BIGINT).build();
    final KsqlTopic topic = mock(KsqlTopic.class);
    when(topic.getKeyFormat()).thenReturn(KeyFormat.windowed(
        FormatInfo.of("AVRO", ImmutableMap.of("some", "prop")),
        SerdeFeatures.of(SerdeFeature.UNWRAP_SINGLES),
        WindowInfo.of(WindowType.HOPPING, Optional.of(Duration.ofMillis(10)))));
    when(topic.getValueFormat()).thenReturn(ValueFormat.of(
        FormatInfo.of("DELIMITED", ImmutableMap.of("some1", "prop1")),
        SerdeFeatures.of(SerdeFeature.WRAP_SINGLES)));
    final DataSource source = mock(DataSource.class);
    when(source.getName()).thenReturn(SourceName.of("the Name"));
    when(source.getDataSourceType()).thenReturn(DataSourceType.KTABLE);
    when(source.getSchema()).thenReturn(schema);
    when(source.getKsqlTopic()).thenReturn(topic);
    // When:
    final SourceNode sourceNode = SourceNode.fromDataSource(source);
    // Then:
    assertThat(sourceNode, is(new SourceNode(
        "the Name",
        "TABLE",
        Optional.of(schema.toString()),
        Optional.of(new KeyFormatNode(Optional.of("AVRO"), Optional.of(WindowType.HOPPING), Optional.of(10L))),
        Optional.of("DELIMITED"),
        Optional.of(ImmutableSet.of(SerdeFeature.UNWRAP_SINGLES)),
        Optional.of(ImmutableSet.of(SerdeFeature.WRAP_SINGLES)),
        Optional.of(false))));
}
Also used : LogicalSchema(io.confluent.ksql.schema.ksql.LogicalSchema) KsqlTopic(io.confluent.ksql.execution.ddl.commands.KsqlTopic) DataSource(io.confluent.ksql.metastore.model.DataSource) Test(org.junit.Test)
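
The test mocks the topic; outside of tests, the same mapping starts from a real KsqlTopic, for example built with the three-argument constructor and format factories seen in Example 14 (topic name and formats below are illustrative assumptions):

// Illustrative, non-mocked setup; "users_topic", KAFKA, and JSON are assumed values.
final KsqlTopic topic = new KsqlTopic(
    "users_topic",
    KeyFormat.of(FormatInfo.of("KAFKA"), SerdeFeatures.of(), Optional.empty()),
    ValueFormat.of(FormatInfo.of("JSON"), SerdeFeatures.of()));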

Aggregations

KsqlTopic (io.confluent.ksql.execution.ddl.commands.KsqlTopic) 33
DataSource (io.confluent.ksql.metastore.model.DataSource) 10
LogicalSchema (io.confluent.ksql.schema.ksql.LogicalSchema) 10
KsqlStream (io.confluent.ksql.metastore.model.KsqlStream) 7
KeyFormat (io.confluent.ksql.serde.KeyFormat) 6
Test (org.junit.Test) 6
MetaStoreImpl (io.confluent.ksql.metastore.MetaStoreImpl) 5
KsqlConfig (io.confluent.ksql.util.KsqlConfig) 5
Before (org.junit.Before) 5
KsqlTable (io.confluent.ksql.metastore.model.KsqlTable) 4
KsqlStructuredDataOutputNode (io.confluent.ksql.planner.plan.KsqlStructuredDataOutputNode) 4
Matchers.containsString (org.hamcrest.Matchers.containsString) 4
InternalFunctionRegistry (io.confluent.ksql.function.InternalFunctionRegistry) 3
ValueFormat (io.confluent.ksql.serde.ValueFormat) 3
PersistentQueryMetadata (io.confluent.ksql.util.PersistentQueryMetadata) 3
ImmutableMap (com.google.common.collect.ImmutableMap) 2
SuppressFBWarnings (edu.umd.cs.findbugs.annotations.SuppressFBWarnings) 2
CreateTableCommand (io.confluent.ksql.execution.ddl.commands.CreateTableCommand) 2
RuntimeBuildContext (io.confluent.ksql.execution.runtime.RuntimeBuildContext) 2
MutableMetaStore (io.confluent.ksql.metastore.MutableMetaStore) 2