
Example 6 with KafkaUser

Use of io.strimzi.api.kafka.model.KafkaUser in project strimzi by strimzi.

The class SecurityST, method testCertRenewalInMaintenanceWindow.

@ParallelNamespaceTest
@Tag(INTERNAL_CLIENTS_USED)
void testCertRenewalInMaintenanceWindow(ExtensionContext extensionContext) {
    final String namespaceName = StUtils.getNamespaceBasedOnRbac(namespace, extensionContext);
    final String clusterName = mapWithClusterNames.get(extensionContext.getDisplayName());
    final String topicName = mapWithTestTopics.get(extensionContext.getDisplayName());
    final String secretName = KafkaResources.clusterCaCertificateSecretName(clusterName);
    final String userName = mapWithTestUsers.get(extensionContext.getDisplayName());
    final LabelSelector kafkaSelector = KafkaResource.getLabelSelector(clusterName, KafkaResources.kafkaStatefulSetName(clusterName));
    LocalDateTime maintenanceWindowStart = LocalDateTime.now().withSecond(0);
    long maintenanceWindowDuration = 14;
    maintenanceWindowStart = maintenanceWindowStart.plusMinutes(5);
    final long windowStartMin = maintenanceWindowStart.getMinute();
    final long windowStopMin = windowStartMin + maintenanceWindowDuration > 59 ? windowStartMin + maintenanceWindowDuration - 60 : windowStartMin + maintenanceWindowDuration;
    String maintenanceWindowCron = "* " + windowStartMin + "-" + windowStopMin + " * * * ? *";
    LOGGER.info("Maintenance window is: {}", maintenanceWindowCron);
    resourceManager.createResource(extensionContext, KafkaTemplates.kafkaPersistent(clusterName, 3, 1).editSpec().addToMaintenanceTimeWindows(maintenanceWindowCron).endSpec().build());
    KafkaUser user = KafkaUserTemplates.tlsUser(clusterName, userName).build();
    resourceManager.createResource(extensionContext, user);
    resourceManager.createResource(extensionContext, KafkaTopicTemplates.topic(clusterName, topicName).build());
    resourceManager.createResource(extensionContext, KafkaClientsTemplates.kafkaClients(true, clusterName + "-" + Constants.KAFKA_CLIENTS, user).build());
    String defaultKafkaClientsPodName = kubeClient(namespaceName).listPodsByPrefixInName(namespaceName, clusterName + "-" + Constants.KAFKA_CLIENTS).get(0).getMetadata().getName();
    InternalKafkaClient internalKafkaClient = new InternalKafkaClient.Builder()
        .withUsingPodName(defaultKafkaClientsPodName)
        .withTopicName(topicName)
        .withNamespaceName(namespaceName)
        .withClusterName(clusterName)
        .withMessageCount(MESSAGE_COUNT)
        .withKafkaUsername(user.getMetadata().getName())
        .withListenerName(Constants.TLS_LISTENER_DEFAULT_NAME)
        .build();
    Map<String, String> kafkaPods = PodUtils.podSnapshot(namespaceName, kafkaSelector);
    LOGGER.info("Annotate secret {} with secret force-renew annotation", secretName);
    Secret secret = new SecretBuilder(kubeClient(namespaceName).getSecret(namespaceName, secretName)).editMetadata().addToAnnotations(Ca.ANNO_STRIMZI_IO_FORCE_RENEW, "true").endMetadata().build();
    kubeClient(namespaceName).patchSecret(namespaceName, secretName, secret);
    LOGGER.info("Wait until maintenance windows starts");
    LocalDateTime finalMaintenanceWindowStart = maintenanceWindowStart;
    TestUtils.waitFor("maintenance window start", Constants.GLOBAL_POLL_INTERVAL, Duration.ofMinutes(maintenanceWindowDuration).toMillis() - 10000, () -> LocalDateTime.now().isAfter(finalMaintenanceWindowStart));
    LOGGER.info("Maintenance window starts");
    assertThat("Rolling update was performed out of maintenance window!", kafkaPods, is(PodUtils.podSnapshot(namespaceName, kafkaSelector)));
    LOGGER.info("Wait until rolling update is triggered during maintenance window");
    RollingUpdateUtils.waitTillComponentHasRolled(namespaceName, kafkaSelector, 3, kafkaPods);
    assertThat("Rolling update wasn't performed in correct time", LocalDateTime.now().isAfter(maintenanceWindowStart));
    LOGGER.info("Checking consumed messages to pod:{}", defaultKafkaClientsPodName);
    internalKafkaClient.checkProducedAndConsumedMessages(internalKafkaClient.sendMessagesTls(), internalKafkaClient.receiveMessagesTls());
}
Also used: LocalDateTime(java.time.LocalDateTime), Secret(io.fabric8.kubernetes.api.model.Secret), SecretBuilder(io.fabric8.kubernetes.api.model.SecretBuilder), GenericKafkaListenerBuilder(io.strimzi.api.kafka.model.listener.arraylistener.GenericKafkaListenerBuilder), ResourceRequirementsBuilder(io.fabric8.kubernetes.api.model.ResourceRequirementsBuilder), CertificateAuthorityBuilder(io.strimzi.api.kafka.model.CertificateAuthorityBuilder), LabelSelector(io.fabric8.kubernetes.api.model.LabelSelector), InternalKafkaClient(io.strimzi.systemtest.kafkaclients.clients.InternalKafkaClient), Matchers.containsString(org.hamcrest.Matchers.containsString), KafkaUser(io.strimzi.api.kafka.model.KafkaUser), ParallelNamespaceTest(io.strimzi.systemtest.annotations.ParallelNamespaceTest), Tag(org.junit.jupiter.api.Tag)
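
The cron window arithmetic in the test has to wrap the stop minute when the window crosses the top of the hour. A minimal standalone sketch of that computation (the helper name is illustrative, not part of the test class):

static String maintenanceWindowCron(int windowStartMin, long durationMinutes) {
    // Wrap the stop minute around when start + duration passes 59 (top of the hour),
    // mirroring the ternary used in testCertRenewalInMaintenanceWindow above.
    long windowStopMin = windowStartMin + durationMinutes > 59
            ? windowStartMin + durationMinutes - 60
            : windowStartMin + durationMinutes;
    // Quartz-style cron: any second within the start-stop minute range, any hour/day/month.
    return "* " + windowStartMin + "-" + windowStopMin + " * * * ? *";
}

// e.g. maintenanceWindowCron(50, 14) yields "* 50-4 * * * ? *"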

Example 7 with KafkaUser

Use of io.strimzi.api.kafka.model.KafkaUser in project strimzi by strimzi.

The class MirrorMaker2IsolatedST, method testKMM2RollAfterSecretsCertsUpdateScramsha.

/**
 * Tests mirroring of messages by MirrorMaker 2.0 over a TLS transport using SCRAM-SHA-512 auth
 * while the user SCRAM passwords and the cluster and clients CA certificates are changed.
 */
@ParallelNamespaceTest
@SuppressWarnings({ "checkstyle:MethodLength" })
void testKMM2RollAfterSecretsCertsUpdateScramsha(ExtensionContext extensionContext) {
    TestStorage testStorage = new TestStorage(extensionContext);
    String kafkaClusterSourceName = testStorage.getClusterName() + "-source";
    String kafkaClusterTargetName = testStorage.getClusterName() + "-target";
    String topicSourceNameA = MIRRORMAKER2_TOPIC_NAME + "-a-" + rng.nextInt(Integer.MAX_VALUE);
    String topicSourceNameB = MIRRORMAKER2_TOPIC_NAME + "-b-" + rng.nextInt(Integer.MAX_VALUE);
    String topicTargetNameA = kafkaClusterSourceName + "." + topicSourceNameA;
    String topicTargetNameB = kafkaClusterSourceName + "." + topicSourceNameB;
    String kafkaUserSourceName = testStorage.getClusterName() + "-my-user-source";
    String kafkaUserTargetName = testStorage.getClusterName() + "-my-user-target";
    String kafkaTlsScramListenerName = "tlsscram";
    // Deploy source kafka with tls listener and SCRAM-SHA authentication
    resourceManager.createResource(extensionContext, KafkaTemplates.kafkaPersistent(kafkaClusterSourceName, 1, 1)
        .editSpec()
            .editKafka()
                .withListeners(
                    new GenericKafkaListenerBuilder().withName(kafkaTlsScramListenerName).withPort(9093)
                        .withType(KafkaListenerType.INTERNAL).withTls(true)
                        .withAuth(new KafkaListenerAuthenticationScramSha512()).build(),
                    new GenericKafkaListenerBuilder().withName(Constants.TLS_LISTENER_DEFAULT_NAME).withPort(9094)
                        .withType(KafkaListenerType.INTERNAL).withTls(true)
                        .withAuth(new KafkaListenerAuthenticationTls()).build())
            .endKafka()
        .endSpec()
        .build());
    // Deploy target kafka with tls listeners with tls and SCRAM-SHA authentication
    resourceManager.createResource(extensionContext, KafkaTemplates.kafkaPersistent(kafkaClusterTargetName, 1, 1)
        .editSpec()
            .editKafka()
                .withListeners(
                    new GenericKafkaListenerBuilder().withName(kafkaTlsScramListenerName).withPort(9093)
                        .withType(KafkaListenerType.INTERNAL).withTls(true)
                        .withAuth(new KafkaListenerAuthenticationScramSha512()).build(),
                    new GenericKafkaListenerBuilder().withName(Constants.TLS_LISTENER_DEFAULT_NAME).withPort(9094)
                        .withType(KafkaListenerType.INTERNAL).withTls(true)
                        .withAuth(new KafkaListenerAuthenticationTls()).build())
            .endKafka()
        .endSpec()
        .build());
    // Deploy topics
    resourceManager.createResource(extensionContext, KafkaTopicTemplates.topic(kafkaClusterSourceName, topicSourceNameA).build(), KafkaTopicTemplates.topic(kafkaClusterTargetName, topicSourceNameB).build());
    // Create Kafka user for source and target cluster
    KafkaUser userSource = KafkaUserTemplates.scramShaUser(kafkaClusterSourceName, kafkaUserSourceName).build();
    KafkaUser userTarget = KafkaUserTemplates.scramShaUser(kafkaClusterTargetName, kafkaUserTargetName).build();
    resourceManager.createResource(extensionContext, userSource, userTarget);
    // Initialize PasswordSecretSource to set this as PasswordSecret in Source/Target MirrorMaker2 spec
    PasswordSecretSource passwordSecretSource = new PasswordSecretSource();
    passwordSecretSource.setSecretName(kafkaUserSourceName);
    passwordSecretSource.setPassword("password");
    PasswordSecretSource passwordSecretTarget = new PasswordSecretSource();
    passwordSecretTarget.setSecretName(kafkaUserTargetName);
    passwordSecretTarget.setPassword("password");
    // Initialize CertSecretSource with certificate and secret names for source
    CertSecretSource certSecretSource = new CertSecretSource();
    certSecretSource.setCertificate("ca.crt");
    certSecretSource.setSecretName(KafkaResources.clusterCaCertificateSecretName(kafkaClusterSourceName));
    // Initialize CertSecretSource with certificate and secret names for target
    CertSecretSource certSecretTarget = new CertSecretSource();
    certSecretTarget.setCertificate("ca.crt");
    certSecretTarget.setSecretName(KafkaResources.clusterCaCertificateSecretName(kafkaClusterTargetName));
    // Deploy client
    resourceManager.createResource(extensionContext, KafkaClientsTemplates.kafkaClients(testStorage.getNamespaceName(), true, testStorage.getKafkaClientsName(), userSource, userTarget).build());
    String kafkaClientsPodName = kubeClient().listPodsByPrefixInName(testStorage.getKafkaClientsName()).get(0).getMetadata().getName();
    KafkaMirrorMaker2ClusterSpec sourceClusterWithScramSha512Auth = new KafkaMirrorMaker2ClusterSpecBuilder()
        .withAlias(kafkaClusterSourceName)
        .withBootstrapServers(KafkaResources.tlsBootstrapAddress(kafkaClusterSourceName))
        .withNewKafkaClientAuthenticationScramSha512()
            .withUsername(kafkaUserSourceName)
            .withPasswordSecret(passwordSecretSource)
        .endKafkaClientAuthenticationScramSha512()
        .withNewTls()
            .withTrustedCertificates(certSecretSource)
        .endTls()
        .build();
    KafkaMirrorMaker2ClusterSpec targetClusterWithScramSha512Auth = new KafkaMirrorMaker2ClusterSpecBuilder()
        .withAlias(kafkaClusterTargetName)
        .withBootstrapServers(KafkaResources.tlsBootstrapAddress(kafkaClusterTargetName))
        .withNewKafkaClientAuthenticationScramSha512()
            .withUsername(kafkaUserTargetName)
            .withPasswordSecret(passwordSecretTarget)
        .endKafkaClientAuthenticationScramSha512()
        .withNewTls()
            .withTrustedCertificates(certSecretTarget)
        .endTls()
        .addToConfig("config.storage.replication.factor", -1)
        .addToConfig("offset.storage.replication.factor", -1)
        .addToConfig("status.storage.replication.factor", -1)
        .build();
    resourceManager.createResource(extensionContext, KafkaMirrorMaker2Templates.kafkaMirrorMaker2(testStorage.getClusterName(), kafkaClusterTargetName, kafkaClusterSourceName, 1, true).editSpec().withClusters(targetClusterWithScramSha512Auth, sourceClusterWithScramSha512Auth).editFirstMirror().editSourceConnector().addToConfig("refresh.topics.interval.seconds", 1).endSourceConnector().endMirror().endSpec().build());
    InternalKafkaClient internalKafkaClient = new InternalKafkaClient.Builder().withUsingPodName(kafkaClientsPodName).withTopicName(topicSourceNameA).withNamespaceName(testStorage.getNamespaceName()).withClusterName(kafkaClusterSourceName).withKafkaUsername(kafkaUserSourceName).withMessageCount(messagesCount).withListenerName(kafkaTlsScramListenerName).build();
    int sent = internalKafkaClient.sendMessagesTls();
    internalKafkaClient.checkProducedAndConsumedMessages(sent, internalKafkaClient.receiveMessagesTls());
    internalKafkaClient = internalKafkaClient.toBuilder().withTopicName(topicTargetNameA).withClusterName(kafkaClusterTargetName).withKafkaUsername(kafkaUserTargetName).withConsumerGroupName(ClientUtils.generateRandomConsumerGroup()).build();
    LOGGER.info("Now messages should be mirrored to target topic and cluster");
    internalKafkaClient.checkProducedAndConsumedMessages(sent, internalKafkaClient.receiveMessagesTls());
    LOGGER.info("Messages successfully mirrored");
    String kmm2DeploymentName = KafkaMirrorMaker2Resources.deploymentName(testStorage.getClusterName());
    Map<String, String> mmSnapshot = DeploymentUtils.depSnapshot(testStorage.getNamespaceName(), kmm2DeploymentName);
    LOGGER.info("Changing KafkaUser sha-password on KMM2 Source and make sure it rolled");
    Secret passwordSource = new SecretBuilder().withNewMetadata().withName(kafkaUserSourceName).endMetadata().addToData("password", "c291cmNlLXBhc3N3b3Jk").build();
    kubeClient().patchSecret(testStorage.getNamespaceName(), kafkaUserSourceName, passwordSource);
    mmSnapshot = DeploymentUtils.waitTillDepHasRolled(testStorage.getNamespaceName(), kmm2DeploymentName, 1, mmSnapshot);
    LOGGER.info("Changing KafkaUser sha-password on KMM2 Target");
    Secret passwordTarget = new SecretBuilder().withNewMetadata().withName(kafkaUserTargetName).endMetadata().addToData("password", "dGFyZ2V0LXBhc3N3b3Jk").build();
    kubeClient().patchSecret(testStorage.getNamespaceName(), kafkaUserTargetName, passwordTarget);
    DeploymentUtils.waitTillDepHasRolled(testStorage.getNamespaceName(), kmm2DeploymentName, 1, mmSnapshot);
    LOGGER.info("Recreate kafkaClients pod with new passwords.");
    resourceManager.deleteResource(kubeClient().namespace(testStorage.getNamespaceName()).getDeployment(testStorage.getKafkaClientsName()));
    resourceManager.createResource(extensionContext, KafkaClientsTemplates.kafkaClients(testStorage.getNamespaceName(), true, testStorage.getKafkaClientsName(), userSource, userTarget).build());
    kafkaClientsPodName = kubeClient().listPodsByPrefixInName(testStorage.getKafkaClientsName()).get(0).getMetadata().getName();
    internalKafkaClient = internalKafkaClient.toBuilder().withUsingPodName(kafkaClientsPodName).withTopicName(topicSourceNameB).withClusterName(kafkaClusterSourceName).withKafkaUsername(kafkaUserSourceName).withListenerName(kafkaTlsScramListenerName).build();
    sent = internalKafkaClient.sendMessagesTls();
    internalKafkaClient = internalKafkaClient.toBuilder().withTopicName(topicTargetNameB).withClusterName(kafkaClusterTargetName).withKafkaUsername(kafkaUserTargetName).withConsumerGroupName(ClientUtils.generateRandomConsumerGroup()).build();
    LOGGER.info("Now messages should be mirrored to target topic and cluster");
    internalKafkaClient.consumesTlsMessagesUntilOperationIsSuccessful(sent);
    LOGGER.info("Messages successfully mirrored");
}
Also used: GenericKafkaListenerBuilder(io.strimzi.api.kafka.model.listener.arraylistener.GenericKafkaListenerBuilder), JobBuilder(io.fabric8.kubernetes.api.model.batch.v1.JobBuilder), KafkaClientsBuilder(io.strimzi.systemtest.kafkaclients.internalClients.KafkaClientsBuilder), KafkaMirrorMaker2ClusterSpecBuilder(io.strimzi.api.kafka.model.KafkaMirrorMaker2ClusterSpecBuilder), SecretBuilder(io.fabric8.kubernetes.api.model.SecretBuilder), PasswordSecretSource(io.strimzi.api.kafka.model.PasswordSecretSource), Matchers.containsString(org.hamcrest.Matchers.containsString), KafkaListenerAuthenticationScramSha512(io.strimzi.api.kafka.model.listener.KafkaListenerAuthenticationScramSha512), Secret(io.fabric8.kubernetes.api.model.Secret), KafkaListenerAuthenticationTls(io.strimzi.api.kafka.model.listener.KafkaListenerAuthenticationTls), KafkaMirrorMaker2ClusterSpec(io.strimzi.api.kafka.model.KafkaMirrorMaker2ClusterSpec), InternalKafkaClient(io.strimzi.systemtest.kafkaclients.clients.InternalKafkaClient), TestStorage(io.strimzi.systemtest.storage.TestStorage), CertSecretSource(io.strimzi.api.kafka.model.CertSecretSource), KafkaUser(io.strimzi.api.kafka.model.KafkaUser), ParallelNamespaceTest(io.strimzi.systemtest.annotations.ParallelNamespaceTest)

Example 8 with KafkaUser

Use of io.strimzi.api.kafka.model.KafkaUser in project strimzi by strimzi.

The class ThrottlingQuotaST, method setupKafkaUserInNamespace.

void setupKafkaUserInNamespace(ExtensionContext extensionContext, String kafkaUsername) {
    // Deploy KafkaUser with defined Quotas
    KafkaUser kafkaUserWQuota = KafkaUserTemplates.defaultUser(namespace, CLUSTER_NAME, kafkaUsername)
        .editOrNewSpec()
            .withNewQuotas()
                .withControllerMutationRate(1.0)
            .endQuotas()
            .withAuthentication(new KafkaUserScramSha512ClientAuthentication())
        .endSpec()
        .build();
    resourceManager.createResource(extensionContext, kafkaUserWQuota);
}
Also used: KafkaUserScramSha512ClientAuthentication(io.strimzi.api.kafka.model.KafkaUserScramSha512ClientAuthentication), KafkaUser(io.strimzi.api.kafka.model.KafkaUser)
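
KafkaUserTemplates.defaultUser is a system-test helper; built directly against the Strimzi API, the same quota-limited user looks roughly like the sketch below (the explicit metadata and label are assumptions about what the template fills in):

// Rough equivalent of the templated user above, built directly with KafkaUserBuilder.
// The metadata (name, namespace, strimzi.io/cluster label) is normally supplied by the template.
KafkaUser kafkaUserWQuota = new KafkaUserBuilder()
    .withNewMetadata()
        .withName(kafkaUsername)
        .withNamespace(namespace)
        .addToLabels("strimzi.io/cluster", CLUSTER_NAME)
    .endMetadata()
    .withNewSpec()
        .withNewQuotas()
            // Caps the rate of mutation requests (e.g. topic/partition creation) allowed for this user
            .withControllerMutationRate(1.0)
        .endQuotas()
        .withAuthentication(new KafkaUserScramSha512ClientAuthentication())
    .endSpec()
    .build();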

Example 9 with KafkaUser

Use of io.strimzi.api.kafka.model.KafkaUser in project strimzi by strimzi.

The class KafkaUserModelTest, method testGenerateSecretUseDesiredPasswordMissingKey.

@Test
public void testGenerateSecretUseDesiredPasswordMissingKey() {
    KafkaUser user = new KafkaUserBuilder(scramShaUser)
        .editSpec()
            .withNewKafkaUserScramSha512ClientAuthentication()
                .withNewPassword()
                    .withNewValueFrom()
                        .withNewSecretKeyRef("my-password", "my-secret", false)
                    .endValueFrom()
                .endPassword()
            .endKafkaUserScramSha512ClientAuthentication()
        .endSpec()
        .build();
    Secret desiredPasswordSecret = new SecretBuilder().withNewMetadata().withName("my-secret").endMetadata().addToData("my-other-password", DESIRED_BASE64_PASSWORD).build();
    KafkaUserModel model = KafkaUserModel.fromCrd(user, UserOperatorConfig.DEFAULT_SECRET_PREFIX, UserOperatorConfig.DEFAULT_STRIMZI_ACLS_ADMIN_API_SUPPORTED, false);
    InvalidResourceException e = assertThrows(InvalidResourceException.class, () -> {
        model.maybeGeneratePassword(Reconciliation.DUMMY_RECONCILIATION, passwordGenerator, null, desiredPasswordSecret);
    });
    assertThat(e.getMessage(), is("Secret my-secret does not contain the key my-password with requested user password."));
}
Also used: Secret(io.fabric8.kubernetes.api.model.Secret), SecretBuilder(io.fabric8.kubernetes.api.model.SecretBuilder), InvalidResourceException(io.strimzi.operator.cluster.model.InvalidResourceException), KafkaUserBuilder(io.strimzi.api.kafka.model.KafkaUserBuilder), KafkaUser(io.strimzi.api.kafka.model.KafkaUser), Test(org.junit.jupiter.api.Test)
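
For contrast, a hedged happy-path sketch (not part of the original test class): when the desired Secret actually contains the requested key, maybeGeneratePassword is expected to complete without throwing. It reuses only the fields and calls shown above, plus JUnit's assertDoesNotThrow.

@Test
public void testGenerateSecretUseDesiredPasswordPresentKeySketch() {
    // Same user as above: password requested from key "my-password" in Secret "my-secret"
    KafkaUser user = new KafkaUserBuilder(scramShaUser)
        .editSpec()
            .withNewKafkaUserScramSha512ClientAuthentication()
                .withNewPassword()
                    .withNewValueFrom()
                        .withNewSecretKeyRef("my-password", "my-secret", false)
                    .endValueFrom()
                .endPassword()
            .endKafkaUserScramSha512ClientAuthentication()
        .endSpec()
        .build();
    // This time the Secret carries the requested key, so the password lookup should succeed
    Secret desiredPasswordSecret = new SecretBuilder()
        .withNewMetadata().withName("my-secret").endMetadata()
        .addToData("my-password", DESIRED_BASE64_PASSWORD)
        .build();
    KafkaUserModel model = KafkaUserModel.fromCrd(user, UserOperatorConfig.DEFAULT_SECRET_PREFIX, UserOperatorConfig.DEFAULT_STRIMZI_ACLS_ADMIN_API_SUPPORTED, false);
    assertDoesNotThrow(() -> model.maybeGeneratePassword(Reconciliation.DUMMY_RECONCILIATION, passwordGenerator, null, desiredPasswordSecret));
}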

Example 10 with KafkaUser

Use of io.strimzi.api.kafka.model.KafkaUser in project strimzi by strimzi.

The class KafkaUserModelTest, method testGetSimpleAclRulesWithNoSimpleAuthorizationReturnsNull.

@Test
public void testGetSimpleAclRulesWithNoSimpleAuthorizationReturnsNull() {
    Secret userCert = ResourceUtils.createUserSecretTls();
    KafkaUser user = ResourceUtils.createKafkaUserTls();
    user.setSpec(new KafkaUserSpec());
    KafkaUserModel model = KafkaUserModel.fromCrd(user, UserOperatorConfig.DEFAULT_SECRET_PREFIX, UserOperatorConfig.DEFAULT_STRIMZI_ACLS_ADMIN_API_SUPPORTED, false);
    assertThat(model.getSimpleAclRules(), is(nullValue()));
}
Also used: Secret(io.fabric8.kubernetes.api.model.Secret), KafkaUserSpec(io.strimzi.api.kafka.model.KafkaUserSpec), KafkaUser(io.strimzi.api.kafka.model.KafkaUser), Test(org.junit.jupiter.api.Test)

Aggregations

KafkaUser (io.strimzi.api.kafka.model.KafkaUser): 128
Test (org.junit.jupiter.api.Test): 70
Secret (io.fabric8.kubernetes.api.model.Secret): 68
KafkaUserBuilder (io.strimzi.api.kafka.model.KafkaUserBuilder): 58
SecretBuilder (io.fabric8.kubernetes.api.model.SecretBuilder): 56
LabelSelector (io.fabric8.kubernetes.api.model.LabelSelector): 44
CrdOperator (io.strimzi.operator.common.operator.resource.CrdOperator): 44
SecretOperator (io.strimzi.operator.common.operator.resource.SecretOperator): 44
HashSet (java.util.HashSet): 44
KafkaUserStatus (io.strimzi.api.kafka.model.status.KafkaUserStatus): 42
Reconciliation (io.strimzi.operator.common.Reconciliation): 42
Promise (io.vertx.core.Promise): 42
Checkpoint (io.vertx.junit5.Checkpoint): 40
CopyOnWriteArraySet (java.util.concurrent.CopyOnWriteArraySet): 40
ArgumentMatchers.anyString (org.mockito.ArgumentMatchers.anyString): 40
KafkaUserQuotas (io.strimzi.api.kafka.model.KafkaUserQuotas): 36
CertManager (io.strimzi.certs.CertManager): 36
KafkaUserModel (io.strimzi.operator.user.model.KafkaUserModel): 36
SimpleAclRule (io.strimzi.operator.user.model.acl.SimpleAclRule): 36
Future (io.vertx.core.Future): 36