
Example 56 with KafkaFuture

use of org.apache.kafka.common.KafkaFuture in project core-ng-project by neowu.

From the class KafkaController, method topics.

public Response topics(Request request) throws ExecutionException, InterruptedException {
    ControllerHelper.assertFromLocalNetwork(request.clientIP());
    List<KafkaTopic> views = Lists.newArrayList();
    try (AdminClient admin = kafka.admin()) {
        Set<String> topics = admin.listTopics().names().get();
        DescribeTopicsResult descriptions = admin.describeTopics(topics);
        for (Map.Entry<String, KafkaFuture<TopicDescription>> entry : descriptions.values().entrySet()) {
            String name = entry.getKey();
            TopicDescription description = entry.getValue().get();
            KafkaTopic view = view(name, description);
            views.add(view);
        }
    }
    return Response.bean(views);
}
Also used : KafkaFuture(org.apache.kafka.common.KafkaFuture) DescribeTopicsResult(org.apache.kafka.clients.admin.DescribeTopicsResult) TopicDescription(org.apache.kafka.clients.admin.TopicDescription) Map(java.util.Map) AdminClient(org.apache.kafka.clients.admin.AdminClient)
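
A minimal sketch of a variant, assuming the same helpers as the example above (kafka.admin(), view(...) and the views list): DescribeTopicsResult.all() collapses the per-topic futures into a single KafkaFuture<Map<String, TopicDescription>>, so the loop no longer blocks once per topic. On newer clients the equivalent, non-deprecated call is allTopicNames().

try (AdminClient admin = kafka.admin()) {
    Set<String> topics = admin.listTopics().names().get();
    // all() resolves only once every per-topic description has arrived
    Map<String, TopicDescription> descriptions = admin.describeTopics(topics).all().get();
    for (Map.Entry<String, TopicDescription> entry : descriptions.entrySet()) {
        views.add(view(entry.getKey(), entry.getValue()));
    }
}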

Example 57 with KafkaFuture

use of org.apache.kafka.common.KafkaFuture in project kafka by apache.

From the class DescribeUserScramCredentialsResult, method description.

/**
 * @param userName the name of the user description being requested
 * @return a future indicating the description results for the given user. The future will complete exceptionally if
 * the future returned by {@link #users()} completes exceptionally.  Note that if the given user does not exist in
 * the list of described users then the returned future will complete exceptionally with
 * {@link org.apache.kafka.common.errors.ResourceNotFoundException}.
 */
public KafkaFuture<UserScramCredentialsDescription> description(String userName) {
    final KafkaFutureImpl<UserScramCredentialsDescription> retval = new KafkaFutureImpl<>();
    dataFuture.whenComplete((data, throwable) -> {
        if (throwable != null) {
            retval.completeExceptionally(throwable);
        } else {
            // it is possible that there is no future for this user (for example, the original describe request was
            // for users 1, 2, and 3 but this is looking for user 4), so explicitly take care of that case
            Optional<DescribeUserScramCredentialsResponseData.DescribeUserScramCredentialsResult> optionalUserResult = data.results().stream().filter(result -> result.user().equals(userName)).findFirst();
            if (!optionalUserResult.isPresent()) {
                retval.completeExceptionally(new ResourceNotFoundException("No such user: " + userName));
            } else {
                DescribeUserScramCredentialsResponseData.DescribeUserScramCredentialsResult userResult = optionalUserResult.get();
                if (userResult.errorCode() != Errors.NONE.code()) {
                    // RESOURCE_NOT_FOUND is included here
                    retval.completeExceptionally(Errors.forCode(userResult.errorCode()).exception(userResult.errorMessage()));
                } else {
                    retval.complete(new UserScramCredentialsDescription(userResult.user(), getScramCredentialInfosFor(userResult)));
                }
            }
        }
    });
    return retval;
}
Also used : Objects(java.util.Objects) List(java.util.List) ResourceNotFoundException(org.apache.kafka.common.errors.ResourceNotFoundException) InterfaceStability(org.apache.kafka.common.annotation.InterfaceStability) Map(java.util.Map) Errors(org.apache.kafka.common.protocol.Errors) Optional(java.util.Optional) HashMap(java.util.HashMap) KafkaFuture(org.apache.kafka.common.KafkaFuture) KafkaFutureImpl(org.apache.kafka.common.internals.KafkaFutureImpl) DescribeUserScramCredentialsResponseData(org.apache.kafka.common.message.DescribeUserScramCredentialsResponseData) Collectors(java.util.stream.Collectors)
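
A hedged usage sketch of the method above from the caller's side; the Properties instance props and the user name "alice" are placeholders. Admin.describeUserScramCredentials(...) returns the result object whose description(userName) future is built by the code shown here.

try (Admin admin = Admin.create(props)) {
    DescribeUserScramCredentialsResult result =
            admin.describeUserScramCredentials(Collections.singletonList("alice"));
    try {
        UserScramCredentialsDescription description = result.description("alice").get();
        for (ScramCredentialInfo info : description.credentialInfos()) {
            System.out.println(info.mechanism() + ", iterations=" + info.iterations());
        }
    } catch (InterruptedException | ExecutionException e) {
        // an unknown user surfaces as an ExecutionException caused by ResourceNotFoundException
        System.err.println("lookup failed: " + e.getCause());
    }
}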

Example 58 with KafkaFuture

use of org.apache.kafka.common.KafkaFuture in project kafka by apache.

From the class MirrorCheckpointTask, method refreshIdleConsumerGroupOffset.

private void refreshIdleConsumerGroupOffset() {
    Map<String, KafkaFuture<ConsumerGroupDescription>> consumerGroupsDesc = targetAdminClient.describeConsumerGroups(consumerGroups).describedGroups();
    for (String group : consumerGroups) {
        try {
            ConsumerGroupDescription consumerGroupDesc = consumerGroupsDesc.get(group).get();
            ConsumerGroupState consumerGroupState = consumerGroupDesc.state();
            // (2) dead: a new consumer group recently created at the source that has never existed at the target
            if (consumerGroupState.equals(ConsumerGroupState.EMPTY)) {
                idleConsumerGroupsOffset.put(group, targetAdminClient.listConsumerGroupOffsets(group).partitionsToOffsetAndMetadata().get().entrySet().stream().collect(Collectors.toMap(Entry::getKey, Entry::getValue)));
            }
        // new consumer upstream has state "DEAD" and will be identified during the offset sync-up
        } catch (InterruptedException | ExecutionException e) {
            log.error("Error querying for consumer group {} on cluster {}.", group, targetClusterAlias, e);
        }
    }
}
Also used : Entry(java.util.Map.Entry) KafkaFuture(org.apache.kafka.common.KafkaFuture) ConsumerGroupDescription(org.apache.kafka.clients.admin.ConsumerGroupDescription) ConsumerGroupState(org.apache.kafka.common.ConsumerGroupState) ExecutionException(java.util.concurrent.ExecutionException)
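
A minimal single-group sketch of the same flow, with the admin client, group id and target map passed in as placeholders rather than read from task fields: describe the group on the target cluster, and cache its committed offsets only when the group is EMPTY.

private void refreshOneGroup(Admin targetAdminClient, String group,
                             Map<String, Map<TopicPartition, OffsetAndMetadata>> idleConsumerGroupsOffset) {
    try {
        ConsumerGroupDescription description = targetAdminClient
                .describeConsumerGroups(Collections.singleton(group))
                .describedGroups().get(group).get();
        if (description.state() == ConsumerGroupState.EMPTY) {
            // only idle (empty) groups have their committed offsets cached
            idleConsumerGroupsOffset.put(group, targetAdminClient
                    .listConsumerGroupOffsets(group)
                    .partitionsToOffsetAndMetadata().get());
        }
    } catch (InterruptedException | ExecutionException e) {
        // a real task would log the failure and continue with the next group
    }
}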

Example 59 with KafkaFuture

use of org.apache.kafka.common.KafkaFuture in project kafka by apache.

From the class KafkaAdminClientTest, method testAlterUserScramCredentialsUnknownMechanism.

@Test
public void testAlterUserScramCredentialsUnknownMechanism() throws Exception {
    try (AdminClientUnitTestEnv env = mockClientEnv()) {
        env.kafkaClient().setNodeApiVersions(NodeApiVersions.create());
        final String user0Name = "user0";
        ScramMechanism user0ScramMechanism0 = ScramMechanism.UNKNOWN;
        final String user1Name = "user1";
        ScramMechanism user1ScramMechanism0 = ScramMechanism.UNKNOWN;
        final String user2Name = "user2";
        ScramMechanism user2ScramMechanism0 = ScramMechanism.SCRAM_SHA_256;
        AlterUserScramCredentialsResponseData responseData = new AlterUserScramCredentialsResponseData();
        responseData.setResults(Arrays.asList(new AlterUserScramCredentialsResponseData.AlterUserScramCredentialsResult().setUser(user2Name)));
        env.kafkaClient().prepareResponse(new AlterUserScramCredentialsResponse(responseData));
        AlterUserScramCredentialsResult result = env.adminClient().alterUserScramCredentials(Arrays.asList(new UserScramCredentialDeletion(user0Name, user0ScramMechanism0), new UserScramCredentialUpsertion(user1Name, new ScramCredentialInfo(user1ScramMechanism0, 8192), "password"), new UserScramCredentialUpsertion(user2Name, new ScramCredentialInfo(user2ScramMechanism0, 4096), "password")));
        Map<String, KafkaFuture<Void>> resultData = result.values();
        assertEquals(3, resultData.size());
        Arrays.asList(user0Name, user1Name).stream().forEach(u -> {
            assertTrue(resultData.containsKey(u));
            try {
                resultData.get(u).get();
                fail("Expected request for user " + u + " to complete exceptionally, but it did not");
            } catch (Exception expected) {
            // ignore
            }
        });
        assertTrue(resultData.containsKey(user2Name));
        try {
            resultData.get(user2Name).get();
        } catch (Exception e) {
            fail("Expected request for user " + user2Name + " to NOT complete excdptionally, but it did");
        }
        try {
            result.all().get();
            fail("Expected 'result.all().get()' to throw an exception since at least one user failed, but it did not");
        } catch (final Exception expected) {
        // ignore, expected
        }
    }
}
Also used : KafkaFuture(org.apache.kafka.common.KafkaFuture) AlterUserScramCredentialsResponse(org.apache.kafka.common.requests.AlterUserScramCredentialsResponse) ThrottlingQuotaExceededException(org.apache.kafka.common.errors.ThrottlingQuotaExceededException) KafkaException(org.apache.kafka.common.KafkaException) UnknownTopicOrPartitionException(org.apache.kafka.common.errors.UnknownTopicOrPartitionException) AuthenticationException(org.apache.kafka.common.errors.AuthenticationException) SecurityDisabledException(org.apache.kafka.common.errors.SecurityDisabledException) ExecutionException(java.util.concurrent.ExecutionException) GroupAuthorizationException(org.apache.kafka.common.errors.GroupAuthorizationException) ClusterAuthorizationException(org.apache.kafka.common.errors.ClusterAuthorizationException) UnknownServerException(org.apache.kafka.common.errors.UnknownServerException) UnknownMemberIdException(org.apache.kafka.common.errors.UnknownMemberIdException) TimeoutException(org.apache.kafka.common.errors.TimeoutException) GroupSubscribedToTopicException(org.apache.kafka.common.errors.GroupSubscribedToTopicException) ConfigException(org.apache.kafka.common.config.ConfigException) TopicAuthorizationException(org.apache.kafka.common.errors.TopicAuthorizationException) InvalidRequestException(org.apache.kafka.common.errors.InvalidRequestException) NotLeaderOrFollowerException(org.apache.kafka.common.errors.NotLeaderOrFollowerException) InvalidTopicException(org.apache.kafka.common.errors.InvalidTopicException) TopicDeletionDisabledException(org.apache.kafka.common.errors.TopicDeletionDisabledException) UnknownTopicIdException(org.apache.kafka.common.errors.UnknownTopicIdException) SaslAuthenticationException(org.apache.kafka.common.errors.SaslAuthenticationException) FencedInstanceIdException(org.apache.kafka.common.errors.FencedInstanceIdException) UnsupportedVersionException(org.apache.kafka.common.errors.UnsupportedVersionException) LogDirNotFoundException(org.apache.kafka.common.errors.LogDirNotFoundException) TopicExistsException(org.apache.kafka.common.errors.TopicExistsException) LeaderNotAvailableException(org.apache.kafka.common.errors.LeaderNotAvailableException) OffsetOutOfRangeException(org.apache.kafka.common.errors.OffsetOutOfRangeException) ApiException(org.apache.kafka.common.errors.ApiException) AlterUserScramCredentialsResponseData(org.apache.kafka.common.message.AlterUserScramCredentialsResponseData) ParameterizedTest(org.junit.jupiter.params.ParameterizedTest) Test(org.junit.jupiter.api.Test)
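
A hedged alternative for the assertion pattern in the test above: JUnit 5's assertThrows and assertDoesNotThrow express the same expectations as the try/fail blocks, reusing resultData, result and the user names from the test.

for (String u : Arrays.asList(user0Name, user1Name)) {
    assertTrue(resultData.containsKey(u));
    // an UNKNOWN mechanism makes the per-user future complete exceptionally
    assertThrows(ExecutionException.class, () -> resultData.get(u).get());
}
assertTrue(resultData.containsKey(user2Name));
assertDoesNotThrow(() -> resultData.get(user2Name).get());
// all() fails as soon as any single per-user future fails
assertThrows(ExecutionException.class, () -> result.all().get());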

Example 60 with KafkaFuture

use of org.apache.kafka.common.KafkaFuture in project kafka by apache.

From the class KafkaAdminClientTest, method testDescribeLogDirsOfflineDirDeprecated.

@SuppressWarnings("deprecation")
@Test
public void testDescribeLogDirsOfflineDirDeprecated() throws ExecutionException, InterruptedException {
    Set<Integer> brokers = singleton(0);
    String logDir = "/var/data/kafka";
    Errors error = Errors.KAFKA_STORAGE_ERROR;
    try (AdminClientUnitTestEnv env = mockClientEnv()) {
        env.kafkaClient().setNodeApiVersions(NodeApiVersions.create());
        env.kafkaClient().prepareResponseFrom(prepareDescribeLogDirsResponse(error, logDir, emptyList()), env.cluster().nodeById(0));
        DescribeLogDirsResult result = env.adminClient().describeLogDirs(brokers);
        Map<Integer, KafkaFuture<Map<String, DescribeLogDirsResponse.LogDirInfo>>> deprecatedValues = result.values();
        assertEquals(brokers, deprecatedValues.keySet());
        assertNotNull(deprecatedValues.get(0));
        Map<String, DescribeLogDirsResponse.LogDirInfo> valuesMap = deprecatedValues.get(0).get();
        assertEquals(singleton(logDir), valuesMap.keySet());
        assertEquals(error, valuesMap.get(logDir).error);
        assertEquals(emptySet(), valuesMap.get(logDir).replicaInfos.keySet());
        Map<Integer, Map<String, DescribeLogDirsResponse.LogDirInfo>> deprecatedAll = result.all().get();
        assertEquals(brokers, deprecatedAll.keySet());
        Map<String, DescribeLogDirsResponse.LogDirInfo> allMap = deprecatedAll.get(0);
        assertNotNull(allMap);
        assertEquals(singleton(logDir), allMap.keySet());
        assertEquals(error, allMap.get(logDir).error);
        assertEquals(emptySet(), allMap.get(logDir).replicaInfos.keySet());
    }
}
Also used : KafkaFuture(org.apache.kafka.common.KafkaFuture) DescribeLogDirsResponse(org.apache.kafka.common.requests.DescribeLogDirsResponse) Errors(org.apache.kafka.common.protocol.Errors) Map(java.util.Map) HashMap(java.util.HashMap) ParameterizedTest(org.junit.jupiter.params.ParameterizedTest) Test(org.junit.jupiter.api.Test)
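
For comparison, a minimal sketch of the non-deprecated path, assuming a freshly prepared mock response and the same brokers and logDir values as the test above: DescribeLogDirsResult.descriptions() yields LogDirDescription objects, whose error() carries an ApiException instead of an Errors code.

DescribeLogDirsResult descriptionsResult = env.adminClient().describeLogDirs(brokers);
Map<Integer, KafkaFuture<Map<String, LogDirDescription>>> descriptions = descriptionsResult.descriptions();
assertEquals(brokers, descriptions.keySet());
Map<String, LogDirDescription> byDir = descriptions.get(0).get();
assertEquals(singleton(logDir), byDir.keySet());
// KAFKA_STORAGE_ERROR surfaces as a KafkaStorageException on the description
assertEquals(KafkaStorageException.class, byDir.get(logDir).error().getClass());
assertEquals(emptySet(), byDir.get(logDir).replicaInfos().keySet());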

Aggregations

KafkaFuture (org.apache.kafka.common.KafkaFuture): 84
HashMap (java.util.HashMap): 59
Map (java.util.Map): 43
KafkaFutureImpl (org.apache.kafka.common.internals.KafkaFutureImpl): 31
ExecutionException (java.util.concurrent.ExecutionException): 30
TimeoutException (org.apache.kafka.common.errors.TimeoutException): 21
ArrayList (java.util.ArrayList): 16
TopicPartition (org.apache.kafka.common.TopicPartition): 16
ConfigResource (org.apache.kafka.common.config.ConfigResource): 16
UnknownTopicOrPartitionException (org.apache.kafka.common.errors.UnknownTopicOrPartitionException): 15
Test (org.junit.jupiter.api.Test): 15
HashSet (java.util.HashSet): 14
ParameterizedTest (org.junit.jupiter.params.ParameterizedTest): 14
Test (org.junit.Test): 12
TopicPartitionReplica (org.apache.kafka.common.TopicPartitionReplica): 10
TopicExistsException (org.apache.kafka.common.errors.TopicExistsException): 10
NewTopic (org.apache.kafka.clients.admin.NewTopic): 8
AbstractResponse (org.apache.kafka.common.requests.AbstractResponse): 8
AdminClient (org.apache.kafka.clients.admin.AdminClient): 7
ReplicaLogDirInfo (org.apache.kafka.clients.admin.DescribeReplicaLogDirsResult.ReplicaLogDirInfo): 7