
Example 11 with KsqlServerException

Use of io.confluent.ksql.util.KsqlServerException in the project ksql by confluentinc.

From the class DefaultConnectClient, method connectors.

@SuppressWarnings("unchecked")
@Override
public ConnectResponse<List<String>> connectors() {
    try {
        LOG.debug("Issuing request to Kafka Connect at URI {} to list connectors", connectUri);
        final ConnectResponse<List<String>> connectResponse = withRetries(() -> Request
            .get(resolveUri(CONNECTORS))
            .setHeaders(requestHeaders)
            .responseTimeout(Timeout.ofMilliseconds(requestTimeoutMs))
            .connectTimeout(Timeout.ofMilliseconds(requestTimeoutMs))
            .execute(httpClient)
            .handleResponse(createHandler(
                HttpStatus.SC_OK,
                new TypeReference<List<String>>() {},
                Function.identity())));
        connectResponse.error().ifPresent(error -> LOG.warn("Could not list connectors: {}.", error));
        return connectResponse;
    } catch (final Exception e) {
        throw new KsqlServerException(e);
    }
}
Also used : ArrayList(java.util.ArrayList) ImmutableList(com.google.common.collect.ImmutableList) List(java.util.List) RetryException(com.github.rholder.retry.RetryException) URISyntaxException(java.net.URISyntaxException) ExecutionException(java.util.concurrent.ExecutionException) KsqlServerException(io.confluent.ksql.util.KsqlServerException) KsqlException(io.confluent.ksql.util.KsqlException)
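
The listing relies on a withRetries helper that is not shown on this page. Given the RetryException import from com.github.rholder.retry, a minimal sketch of such a helper using the guava-retrying library might look as follows; the retry policy (three attempts, fixed 100 ms wait, retry on any exception) and the class name are illustrative assumptions, not the actual DefaultConnectClient configuration.

import com.github.rholder.retry.RetryException;
import com.github.rholder.retry.Retryer;
import com.github.rholder.retry.RetryerBuilder;
import com.github.rholder.retry.StopStrategies;
import com.github.rholder.retry.WaitStrategies;
import io.confluent.ksql.util.KsqlServerException;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;

final class RetrySketch {

    // Hypothetical stand-in for DefaultConnectClient#withRetries: run the request a few
    // times with a fixed backoff and surface terminal failures as KsqlServerException.
    static <T> T withRetries(final Callable<T> request) {
        final Retryer<T> retryer = RetryerBuilder.<T>newBuilder()
            .retryIfException()
            .withWaitStrategy(WaitStrategies.fixedWait(100, TimeUnit.MILLISECONDS))
            .withStopStrategy(StopStrategies.stopAfterAttempt(3))
            .build();
        try {
            return retryer.call(request);
        } catch (final ExecutionException | RetryException e) {
            throw new KsqlServerException(e);
        }
    }
}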

Example 12 with KsqlServerException

Use of io.confluent.ksql.util.KsqlServerException in the project ksql by confluentinc.

From the class DefaultConnectClient, method topics.

@Override
public ConnectResponse<Map<String, Map<String, List<String>>>> topics(final String connector) {
    try {
        LOG.debug("Issuing request to Kafka Connect at URI {} to get active topics for {}", connectUri, connector);
        final ConnectResponse<Map<String, Map<String, List<String>>>> connectResponse = withRetries(() -> Request
            .get(resolveUri(CONNECTORS + "/" + connector + TOPICS))
            .setHeaders(requestHeaders)
            .responseTimeout(Timeout.ofMilliseconds(requestTimeoutMs))
            .connectTimeout(Timeout.ofMilliseconds(requestTimeoutMs))
            .execute(httpClient)
            .handleResponse(createHandler(
                HttpStatus.SC_OK,
                new TypeReference<Map<String, Map<String, List<String>>>>() {},
                Function.identity())));
        connectResponse.error().ifPresent(error -> LOG.warn("Could not query topics of connector {}: {}", connector, error));
        return connectResponse;
    } catch (final Exception e) {
        throw new KsqlServerException(e);
    }
}
Also used : ArrayList(java.util.ArrayList) ImmutableList(com.google.common.collect.ImmutableList) List(java.util.List) Map(java.util.Map) ImmutableMap(com.google.common.collect.ImmutableMap) RetryException(com.github.rholder.retry.RetryException) URISyntaxException(java.net.URISyntaxException) ExecutionException(java.util.concurrent.ExecutionException) KsqlServerException(io.confluent.ksql.util.KsqlServerException) KsqlException(io.confluent.ksql.util.KsqlException)
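
The nested Map<String, Map<String, List<String>>> mirrors the Kafka Connect REST payload for GET /connectors/{name}/topics, which is keyed by connector name and carries a "topics" list per connector. A small self-contained sketch of unpacking that structure (the class and method names are illustrative):

import java.util.Collections;
import java.util.List;
import java.util.Map;

final class ActiveTopicsSketch {

    // Extract the active topic names for one connector from the nested response map,
    // returning an empty list if the connector or its "topics" entry is absent.
    static List<String> activeTopics(
        final Map<String, Map<String, List<String>>> response,
        final String connector
    ) {
        return response
            .getOrDefault(connector, Collections.emptyMap())
            .getOrDefault("topics", Collections.emptyList());
    }
}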

Example 13 with KsqlServerException

Use of io.confluent.ksql.util.KsqlServerException in the project ksql by confluentinc.

From the class DefaultConnectClient, method create.

@Override
public ConnectResponse<ConnectorInfo> create(final String connector, final Map<String, String> config) {
    try {
        LOG.debug("Issuing create request to Kafka Connect at URI {} with name {} and config {}", connectUri, connector, config);
        final ConnectResponse<ConnectorInfo> connectResponse = withRetries(() -> Request
            .post(resolveUri(CONNECTORS))
            .setHeaders(requestHeaders)
            .responseTimeout(Timeout.ofMilliseconds(requestTimeoutMs))
            .connectTimeout(Timeout.ofMilliseconds(requestTimeoutMs))
            .bodyString(
                MAPPER.writeValueAsString(ImmutableMap.of("name", connector, "config", config)),
                ContentType.APPLICATION_JSON)
            .execute(httpClient)
            .handleResponse(createHandler(
                HttpStatus.SC_CREATED,
                new TypeReference<ConnectorInfo>() {},
                Function.identity())));
        connectResponse.error().ifPresent(error -> LOG.warn("Did not CREATE connector {}: {}", connector, error));
        return connectResponse;
    } catch (final Exception e) {
        throw new KsqlServerException(e);
    }
}
Also used : ConnectorInfo(org.apache.kafka.connect.runtime.rest.entities.ConnectorInfo) RetryException(com.github.rholder.retry.RetryException) URISyntaxException(java.net.URISyntaxException) ExecutionException(java.util.concurrent.ExecutionException) KsqlServerException(io.confluent.ksql.util.KsqlServerException) KsqlException(io.confluent.ksql.util.KsqlException)
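
The request body is produced by serializing a two-entry map, which yields the standard Connect create-connector payload of the form {"name": "...", "config": {...}}. A standalone sketch of that serialization step, with a local ObjectMapper standing in for the MAPPER field and an illustrative class name:

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.collect.ImmutableMap;
import java.util.Map;

final class CreateBodySketch {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Build the JSON body sent to POST /connectors, e.g.
    // {"name":"my-connector","config":{"connector.class":"...","topics":"..."}}
    static String createBody(final String connector, final Map<String, String> config)
        throws JsonProcessingException {
        return MAPPER.writeValueAsString(ImmutableMap.of("name", connector, "config", config));
    }
}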

Example 14 with KsqlServerException

Use of io.confluent.ksql.util.KsqlServerException in the project ksql by confluentinc.

From the class RestTestExecutor, method compareKeyValueTimestamp.

private static void compareKeyValueTimestamp(final ConsumerRecord<?, ?> actual, final Record expected) {
    final long actualTimestamp = actual.timestamp();
    final Object actualKey = actual.key();
    final Object actualValue = actual.value();
    final Object expectedKey = coerceExpectedKey(expected.key(), actualKey);
    final JsonNode expectedValue = expected.getJsonValue().orElseThrow(
        () -> new KsqlServerException("could not get expected value from test record: " + expected));
    final long expectedTimestamp = expected.timestamp().orElse(actualTimestamp);
    final AssertionError error = new AssertionError(
        "Expected <" + expectedKey + ", " + expectedValue + "> "
            + "with timestamp=" + expectedTimestamp
            + " but was <" + actualKey + ", " + actualValue + "> "
            + "with timestamp=" + actualTimestamp);
    if (!Objects.equals(actualKey, expectedKey)) {
        throw error;
    }
    if (!ExpectedRecordComparator.matches(actualValue, expectedValue)) {
        throw error;
    }
    if (actualTimestamp != expectedTimestamp) {
        throw error;
    }
}
Also used : JsonNode(com.fasterxml.jackson.databind.JsonNode) KsqlServerException(io.confluent.ksql.util.KsqlServerException)
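
ExpectedRecordComparator itself is not shown on this page. As a much-simplified, hypothetical stand-in, the structural comparison can be pictured as rendering the actual value to a Jackson JsonNode and comparing it with the expected node; the real comparator performs additional coercions (numeric widening, decimals, timestamps), so the following is only an illustration.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

final class RecordMatchSketch {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Convert the actual value to a JsonNode and compare it structurally
    // against the expected JsonNode from the test record.
    static boolean matches(final Object actualValue, final JsonNode expectedValue) {
        final JsonNode actualNode = MAPPER.valueToTree(actualValue);
        return actualNode.equals(expectedValue);
    }
}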

Example 15 with KsqlServerException

Use of io.confluent.ksql.util.KsqlServerException in the project ksql by confluentinc.

From the class KsqlEngine, method getStartOffsetsForStreamPullQuery.

private ImmutableMap<TopicPartition, Long> getStartOffsetsForStreamPullQuery(final Admin admin, final TopicDescription topicDescription) {
    final Map<TopicPartition, OffsetSpec> topicPartitions = topicDescription.partitions().stream()
        .map(td -> new TopicPartition(topicDescription.name(), td.partition()))
        .collect(toMap(identity(), tp -> OffsetSpec.earliest()));
    final ListOffsetsResult listOffsetsResult = admin.listOffsets(
        topicPartitions,
        // The query's consumer reads uncommitted records,
        // so we should do the same when checking end offsets.
        new ListOffsetsOptions(IsolationLevel.READ_UNCOMMITTED));
    try {
        final Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> partitionResultMap = listOffsetsResult.all().get(10, TimeUnit.SECONDS);
        final Map<TopicPartition, Long> result = partitionResultMap.entrySet().stream().collect(toMap(Entry::getKey, e -> e.getValue().offset()));
        return ImmutableMap.copyOf(result);
    } catch (final InterruptedException e) {
        log.error("Admin#listOffsets(" + topicDescription.name() + ") interrupted", e);
        throw new KsqlServerException("Interrupted");
    } catch (final ExecutionException e) {
        log.error("Error executing Admin#listOffsets(" + topicDescription.name() + ")", e);
        throw new KsqlServerException("Internal Server Error");
    } catch (final TimeoutException e) {
        log.error("Admin#listOffsets(" + topicDescription.name() + ") timed out", e);
        throw new KsqlServerException("Backend timed out");
    }
}
Also used : Query(io.confluent.ksql.parser.tree.Query) UnaryOperator.identity(java.util.function.UnaryOperator.identity) SourceName(io.confluent.ksql.name.SourceName) ServiceContext(io.confluent.ksql.services.ServiceContext) LoggerFactory(org.slf4j.LoggerFactory) RoutingOptions(io.confluent.ksql.execution.streams.RoutingOptions) TimeoutException(java.util.concurrent.TimeoutException) ProcessingLogContext(io.confluent.ksql.logging.processing.ProcessingLogContext) MutableMetaStore(io.confluent.ksql.metastore.MutableMetaStore) Context(io.vertx.core.Context) TransientQueryMetadata(io.confluent.ksql.util.TransientQueryMetadata) ListOffsetsResult(org.apache.kafka.clients.admin.ListOffsetsResult) TransientQueryCleanupListener(io.confluent.ksql.internal.TransientQueryCleanupListener) RewrittenAnalysis(io.confluent.ksql.analyzer.RewrittenAnalysis) Collectors.toMap(java.util.stream.Collectors.toMap) Map(java.util.Map) QueryLogger(io.confluent.ksql.logging.query.QueryLogger) QueryId(io.confluent.ksql.query.QueryId) PersistentQueryMetadata(io.confluent.ksql.util.PersistentQueryMetadata) QueryMetadata(io.confluent.ksql.util.QueryMetadata) TopicPartition(org.apache.kafka.common.TopicPartition) ImmutableAnalysis(io.confluent.ksql.analyzer.ImmutableAnalysis) ImmutableMap(com.google.common.collect.ImmutableMap) FunctionRegistry(io.confluent.ksql.function.FunctionRegistry) ScalablePushQueryMetadata(io.confluent.ksql.util.ScalablePushQueryMetadata) Set(java.util.Set) ConsumerConfig(org.apache.kafka.clients.consumer.ConsumerConfig) ScalablePushQueryMetrics(io.confluent.ksql.internal.ScalablePushQueryMetrics) ConfiguredStatement(io.confluent.ksql.statement.ConfiguredStatement) KsqlConfig(io.confluent.ksql.util.KsqlConfig) KafkaFuture(org.apache.kafka.common.KafkaFuture) ExecutableDdlStatement(io.confluent.ksql.parser.tree.ExecutableDdlStatement) Executors(java.util.concurrent.Executors) Objects(java.util.Objects) MetaStoreImpl(io.confluent.ksql.metastore.MetaStoreImpl) List(java.util.List) PullQueryExecutorMetrics(io.confluent.ksql.internal.PullQueryExecutorMetrics) QueryPlannerOptions(io.confluent.ksql.planner.QueryPlannerOptions) KsqlExecutionContext(io.confluent.ksql.KsqlExecutionContext) ConsistencyOffsetVector(io.confluent.ksql.util.ConsistencyOffsetVector) StreamPullQueryMetadata(io.confluent.ksql.util.StreamPullQueryMetadata) Entry(java.util.Map.Entry) KsqlEngineMetrics(io.confluent.ksql.internal.KsqlEngineMetrics) KsqlException(io.confluent.ksql.util.KsqlException) Optional(java.util.Optional) Statement(io.confluent.ksql.parser.tree.Statement) Builder(com.google.common.collect.ImmutableList.Builder) SuppressFBWarnings(edu.umd.cs.findbugs.annotations.SuppressFBWarnings) PullQueryResult(io.confluent.ksql.physical.pull.PullQueryResult) HARouting(io.confluent.ksql.physical.pull.HARouting) StreamsConfig(org.apache.kafka.streams.StreamsConfig) PushRouting(io.confluent.ksql.physical.scalablepush.PushRouting) HashMap(java.util.HashMap) MetricCollectors(io.confluent.ksql.metrics.MetricCollectors) Function(java.util.function.Function) ImmutableList(com.google.common.collect.ImmutableList) Analysis(io.confluent.ksql.analyzer.Analysis) ConfiguredKsqlPlan(io.confluent.ksql.planner.plan.ConfiguredKsqlPlan) MetaStore(io.confluent.ksql.metastore.MetaStore) ParsedStatement(io.confluent.ksql.parser.KsqlParser.ParsedStatement) ScheduledExecutorService(java.util.concurrent.ScheduledExecutorService) Admin(org.apache.kafka.clients.admin.Admin) PushRoutingOptions(io.confluent.ksql.physical.scalablepush.PushRoutingOptions) 
TopicDescription(org.apache.kafka.clients.admin.TopicDescription) QueryIdGenerator(io.confluent.ksql.query.id.QueryIdGenerator) QueryContainer(io.confluent.ksql.parser.tree.QueryContainer) Logger(org.slf4j.Logger) QueryAnalyzer(io.confluent.ksql.analyzer.QueryAnalyzer) ServiceInfo(io.confluent.ksql.ServiceInfo) KsqlStatementException(io.confluent.ksql.util.KsqlStatementException) ExecutionException(java.util.concurrent.ExecutionException) TimeUnit(java.util.concurrent.TimeUnit) OffsetSpec(org.apache.kafka.clients.admin.OffsetSpec) KsqlServerException(io.confluent.ksql.util.KsqlServerException) IsolationLevel(org.apache.kafka.common.IsolationLevel) StreamsErrorCollector(io.confluent.ksql.metrics.StreamsErrorCollector) Closeable(java.io.Closeable) VisibleForTesting(com.google.common.annotations.VisibleForTesting) ListOffsetsOptions(org.apache.kafka.clients.admin.ListOffsetsOptions) Collections(java.util.Collections) PreparedStatement(io.confluent.ksql.parser.KsqlParser.PreparedStatement)
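
The method above resolves the earliest offset of each partition; a stream pull query also needs end offsets to know where to stop reading. A sketch of the symmetric lookup with OffsetSpec.latest() follows; it is an illustration against the same Admin API, not the project's actual end-offset code, and it omits the logging and exception translation shown above.

import java.util.Map;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.ListOffsetsOptions;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.admin.TopicDescription;
import org.apache.kafka.common.IsolationLevel;
import org.apache.kafka.common.TopicPartition;

final class EndOffsetsSketch {

    // Resolve the current end offset of every partition of the topic, using the same
    // READ_UNCOMMITTED isolation level as the start-offset lookup above.
    static Map<TopicPartition, Long> endOffsets(
        final Admin admin,
        final TopicDescription topicDescription
    ) throws Exception {
        final Map<TopicPartition, OffsetSpec> request = topicDescription.partitions().stream()
            .collect(Collectors.toMap(
                td -> new TopicPartition(topicDescription.name(), td.partition()),
                td -> OffsetSpec.latest()));
        final ListOffsetsResult result =
            admin.listOffsets(request, new ListOffsetsOptions(IsolationLevel.READ_UNCOMMITTED));
        return result.all().get(10, TimeUnit.SECONDS).entrySet().stream()
            .collect(Collectors.toMap(Map.Entry::getKey, e -> e.getValue().offset()));
    }
}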

Aggregations

KsqlServerException (io.confluent.ksql.util.KsqlServerException): 21
KsqlException (io.confluent.ksql.util.KsqlException): 13
ExecutionException (java.util.concurrent.ExecutionException): 10
RetryException (com.github.rholder.retry.RetryException): 8
URISyntaxException (java.net.URISyntaxException): 8
List (java.util.List): 6
ImmutableList (com.google.common.collect.ImmutableList): 5
ImmutableMap (com.google.common.collect.ImmutableMap): 3
SuppressFBWarnings (edu.umd.cs.findbugs.annotations.SuppressFBWarnings): 3
MetaStore (io.confluent.ksql.metastore.MetaStore): 3
Statement (io.confluent.ksql.parser.tree.Statement): 3
JsonNode (com.fasterxml.jackson.databind.JsonNode): 2
VisibleForTesting (com.google.common.annotations.VisibleForTesting): 2
Builder (com.google.common.collect.ImmutableList.Builder): 2
KsqlExecutionContext (io.confluent.ksql.KsqlExecutionContext): 2
ServiceInfo (io.confluent.ksql.ServiceInfo): 2
Analysis (io.confluent.ksql.analyzer.Analysis): 2
ImmutableAnalysis (io.confluent.ksql.analyzer.ImmutableAnalysis): 2
QueryAnalyzer (io.confluent.ksql.analyzer.QueryAnalyzer): 2
RewrittenAnalysis (io.confluent.ksql.analyzer.RewrittenAnalysis): 2