
Example 6 with Record

Use of org.akhq.models.Record in project akhq by tchiotludo.

The class RecordRepository, method newRecord:

private Record newRecord(ConsumerRecord<byte[], byte[]> record, BaseOptions options, Topic topic) {
    SchemaRegistryType schemaRegistryType = this.schemaRegistryRepository.getSchemaRegistryType(options.clusterId);
    SchemaRegistryClient client = this.kafkaModule.getRegistryClient(options.clusterId);
    return new Record(
        client,
        record,
        schemaRegistryType,
        this.schemaRegistryRepository.getKafkaAvroDeserializer(options.clusterId),
        schemaRegistryType == SchemaRegistryType.CONFLUENT ? this.schemaRegistryRepository.getKafkaJsonDeserializer(options.clusterId) : null,
        schemaRegistryType == SchemaRegistryType.CONFLUENT ? this.schemaRegistryRepository.getKafkaProtoDeserializer(options.clusterId) : null,
        this.avroToJsonSerializer,
        this.customDeserializerRepository.getProtobufToJsonDeserializer(options.clusterId),
        this.customDeserializerRepository.getAvroToJsonDeserializer(options.clusterId),
        avroWireFormatConverter.convertValueToWireFormat(record, client, schemaRegistryType),
        topic
    );
}
Also used: ProducerRecord(org.apache.kafka.clients.producer.ProducerRecord) Record(org.akhq.models.Record) SchemaRegistryType(org.akhq.configs.SchemaRegistryType) SchemaRegistryClient(io.confluent.kafka.schemaregistry.client.SchemaRegistryClient)
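The ternaries in newRecord() gate the JSON and Protobuf deserializers on the registry type: only CONFLUENT registries get them, every other type falls back to null. A minimal self-contained sketch of that selection pattern (the class, the GLUE constant, and the string labels are stand-ins for illustration, not akhq API):

```java
class DeserializerSelection {
    enum SchemaRegistryType { CONFLUENT, GLUE }

    // Stand-in for the repository lookup: only CONFLUENT registries
    // get the extra JSON deserializer; other types yield null,
    // mirroring the ternary in newRecord() above.
    static String jsonDeserializerFor(SchemaRegistryType type) {
        return type == SchemaRegistryType.CONFLUENT ? "kafka-json-deserializer" : null;
    }
}
```

The null is then handed straight to the Record constructor, which is why the Record model must tolerate absent deserializers for non-Confluent registries.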

Example 7 with Record

Use of org.akhq.models.Record in project akhq by tchiotludo.

The class RecordRepository, method tail:

public Flowable<Event<TailEvent>> tail(String clusterId, TailOptions options) {
    return Flowable.generate(() -> {
        KafkaConsumer<byte[], byte[]> consumer = this.kafkaModule.getConsumer(options.clusterId);
        Map<String, Topic> topics = topicRepository.findByName(clusterId, options.topics).stream()
            .collect(Collectors.toMap(Topic::getName, Function.identity()));
        consumer.assign(topics.values().stream()
            .flatMap(topic -> topic.getPartitions().stream()
                .map(partition -> new TopicPartition(topic.getName(), partition.getId())))
            .collect(Collectors.toList()));
        if (options.getAfter() != null) {
            options.getAfter().forEach(s -> {
                String[] split = s.split(",");
                consumer.seek(new TopicPartition(split[0], Integer.parseInt(split[1])), Long.parseLong(split[2]));
            });
        }
        return new TailState(consumer, new TailEvent(), topics);
    }, (state, subscriber) -> {
        ConsumerRecords<byte[], byte[]> records = this.poll(state.getConsumer());
        TailEvent tailEvent = state.getTailEvent();
        List<Record> list = new ArrayList<>();
        for (ConsumerRecord<byte[], byte[]> record : records) {
            tailEvent.offsets.put(ImmutableMap.of(record.topic(), record.partition()), record.offset());
            Record current = newRecord(record, options, state.getTopics().get(record.topic()));
            if (searchFilter(options, current)) {
                list.add(current);
                log.trace("Record [topic: {}] [partition: {}] [offset: {}] [key: {}]", record.topic(), record.partition(), record.offset(), record.key());
            }
        }
        tailEvent.records = list;
        subscriber.onNext(Event.of(tailEvent).name("tailBody"));
        state.tailEvent = tailEvent;
        return state;
    });
}
Also used: JsonProperty(com.fasterxml.jackson.annotation.JsonProperty) Environment(io.micronaut.context.env.Environment) org.apache.kafka.clients.consumer(org.apache.kafka.clients.consumer) java.util(java.util) TopicController(org.akhq.controllers.TopicController) ProducerRecord(org.apache.kafka.clients.producer.ProducerRecord) Record(org.akhq.models.Record) SchemaRegistryClient(io.confluent.kafka.schemaregistry.client.SchemaRegistryClient) Debug(org.akhq.utils.Debug) RecordHeader(org.apache.kafka.common.header.internals.RecordHeader) Function(java.util.function.Function) Event(io.micronaut.http.sse.Event) RecordsToDelete(org.apache.kafka.clients.admin.RecordsToDelete) Topic(org.akhq.models.Topic) AvroToJsonSerializer(org.akhq.utils.AvroToJsonSerializer) KafkaProducer(org.apache.kafka.clients.producer.KafkaProducer) Flowable(io.reactivex.Flowable) AtomicInteger(java.util.concurrent.atomic.AtomicInteger) KeyValue(org.akhq.models.KeyValue) KafkaModule(org.akhq.modules.KafkaModule) SchemaSerializer(org.akhq.modules.schemaregistry.SchemaSerializer) URIBuilder(org.codehaus.httpcache4j.uri.URIBuilder) Splitter(com.google.common.base.Splitter) RecordWithSchemaSerializerFactory(org.akhq.modules.schemaregistry.RecordWithSchemaSerializerFactory) TopicPartition(org.apache.kafka.common.TopicPartition) ImmutableMap(com.google.common.collect.ImmutableMap) ConcurrentHashMap(java.util.concurrent.ConcurrentHashMap) Singleton(jakarta.inject.Singleton) Value(io.micronaut.context.annotation.Value) KafkaFuture(org.apache.kafka.common.KafkaFuture) RecordMetadata(org.apache.kafka.clients.producer.RecordMetadata) Collectors(java.util.stream.Collectors) lombok(lombok) Partition(org.akhq.models.Partition) ExecutionException(java.util.concurrent.ExecutionException) StringUtils(io.micronaut.core.util.StringUtils) Slf4j(lombok.extern.slf4j.Slf4j) Stream(java.util.stream.Stream) SchemaRegistryType(org.akhq.configs.SchemaRegistryType) Inject(jakarta.inject.Inject) DeletedRecords(org.apache.kafka.clients.admin.DeletedRecords)
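The options.getAfter() entries consumed in tail() are comma-separated "topic,partition,offset" strings, split and fed to consumer.seek(). A small self-contained sketch of parsing one such entry (the SeekPosition class is hypothetical, not part of akhq):

```java
class SeekPosition {
    final String topic;
    final int partition;
    final long offset;

    SeekPosition(String topic, int partition, long offset) {
        this.topic = topic;
        this.partition = partition;
        this.offset = offset;
    }

    // Mirrors the split done in tail(): "topic,partition,offset".
    // Kafka topic names cannot contain commas, so a plain split is safe.
    static SeekPosition parse(String entry) {
        String[] split = entry.split(",");
        return new SeekPosition(split[0], Integer.parseInt(split[1]), Long.parseLong(split[2]));
    }
}
```

In tail() the parsed position goes straight into `consumer.seek(new TopicPartition(topic, partition), offset)`, so tailing resumes exactly one record past the last one the client saw.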

Example 8 with Record

Use of org.akhq.models.Record in project akhq by tchiotludo.

The class RecordRepositoryTest, method produceAndConsumeRecordUsingJsonSchema:

@Test
void produceAndConsumeRecordUsingJsonSchema() throws ExecutionException, InterruptedException, IOException, RestClientException {
    Schema keyJsonSchema = registerSchema("json_schema/key.json", KafkaTestCluster.TOPIC_JSON_SCHEMA + "-key");
    Schema valueJsonSchema = registerSchema("json_schema/album.json", KafkaTestCluster.TOPIC_JSON_SCHEMA + "-value");
    Album objectSatisfyingJsonSchema = new Album("title", List.of("artist_1", "artist_2"), 1989, List.of("song_1", "song_2"));
    String recordAsJsonString = objectMapper.writeValueAsString(objectSatisfyingJsonSchema);
    String keyJsonString = new JSONObject(Collections.singletonMap("id", "83fff9f8-b47a-4bf7-863b-9942c4369f06")).toString();
    RecordMetadata producedRecordMetadata = repository.produce(
        KafkaTestCluster.CLUSTER_ID,
        KafkaTestCluster.TOPIC_JSON_SCHEMA,
        recordAsJsonString,
        Collections.emptyMap(),
        Optional.of(keyJsonString),
        Optional.empty(),
        Optional.empty(),
        Optional.of(keyJsonSchema.getId()),
        Optional.of(valueJsonSchema.getId())
    );
    RecordRepository.Options options = new RecordRepository.Options(environment, KafkaTestCluster.CLUSTER_ID, KafkaTestCluster.TOPIC_JSON_SCHEMA);
    List<Record> records = consumeAllRecord(options);
    Optional<Record> consumedRecord = records.stream().filter(record -> Objects.equals(record.getKey(), keyJsonString)).findFirst();
    assertTrue(consumedRecord.isPresent());
    Record recordToAssert = consumedRecord.get();
    assertEquals(recordToAssert.getKey(), keyJsonString);
    assertEquals(recordToAssert.getValue(), recordAsJsonString);
    assertEquals(recordToAssert.getValueSchemaId(), valueJsonSchema.getId());
    // clear schema registry as it is shared between tests
    schemaRegistryRepository.delete(KafkaTestCluster.CLUSTER_ID, keyJsonSchema.getSubject());
    schemaRegistryRepository.delete(KafkaTestCluster.CLUSTER_ID, valueJsonSchema.getSubject());
}
Also used: RecordMetadata(org.apache.kafka.clients.producer.RecordMetadata) Environment(io.micronaut.context.env.Environment) java.util(java.util) Record(org.akhq.models.Record) ResourceTestUtil(org.akhq.utils.ResourceTestUtil) Album(org.akhq.utils.Album) AtomicBoolean(java.util.concurrent.atomic.AtomicBoolean) Disabled(org.junit.jupiter.api.Disabled) Schema(org.akhq.models.Schema) Topic(org.akhq.models.Topic) JSONObject(org.json.JSONObject) AtomicInteger(java.util.concurrent.atomic.AtomicInteger) KafkaTestCluster(org.akhq.KafkaTestCluster) RestClientException(io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException) MatcherAssert.assertThat(org.hamcrest.MatcherAssert.assertThat) Assertions.assertEquals(org.junit.jupiter.api.Assertions.assertEquals) URIBuilder(org.codehaus.httpcache4j.uri.URIBuilder) ObjectMapper(com.fasterxml.jackson.databind.ObjectMapper) IOException(java.io.IOException) AbstractTest(org.akhq.AbstractTest) Test(org.junit.jupiter.api.Test) ExecutionException(java.util.concurrent.ExecutionException) Slf4j(lombok.extern.slf4j.Slf4j) Assertions.assertTrue(org.junit.jupiter.api.Assertions.assertTrue) Inject(jakarta.inject.Inject) Matchers.containsString(org.hamcrest.Matchers.containsString)

Example 9 with Record

Use of org.akhq.models.Record in project akhq by tchiotludo.

The class RecordRepositoryTest, method consumeAllRecord:

private List<Record> consumeAllRecord(RecordRepository.Options options) throws ExecutionException, InterruptedException {
    boolean hasNext = true;
    List<Record> all = new ArrayList<>();
    do {
        List<Record> datas = repository.consume(KafkaTestCluster.CLUSTER_ID, options);
        all.addAll(datas);
        datas.forEach(record -> log.debug("Records [Topic: {}] [Partition: {}] [Offset: {}] [Key: {}] [Value: {}]", record.getTopic(), record.getPartition(), record.getOffset(), record.getKey(), record.getValue()));
        log.info("Consume {} records", datas.size());
        URIBuilder after = options.after(datas, URIBuilder.empty());
        if (datas.size() == 0) {
            hasNext = false;
        } else if (after != null) {
            options.setAfter(after.getParametersByName("after").get(0).getValue());
        }
    } while (hasNext);
    return all;
}
Also used: Record(org.akhq.models.Record) URIBuilder(org.codehaus.httpcache4j.uri.URIBuilder)
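consumeAllRecord() is a cursor-pagination loop: fetch a batch, append it, advance the "after" cursor, and stop on the first empty batch. A simulated, self-contained sketch of the same loop shape (CursorPager and the integer pages are illustrative, not akhq API; the page index stands in for the "after" query parameter):

```java
import java.util.ArrayList;
import java.util.List;

class CursorPager {
    // Fetch pages until an empty one signals the end, mirroring
    // consumeAllRecord(): consume, accumulate, advance the cursor.
    static List<Integer> consumeAll(List<List<Integer>> pages) {
        List<Integer> all = new ArrayList<>();
        int cursor = 0;                 // stands in for the "after" position
        boolean hasNext = true;
        do {
            List<Integer> batch = cursor < pages.size()
                ? pages.get(cursor)
                : List.of();
            all.addAll(batch);
            if (batch.isEmpty()) {
                hasNext = false;        // empty batch: nothing left to read
            } else {
                cursor++;               // advance past the consumed batch
            }
        } while (hasNext);
        return all;
    }
}
```

The real method differs only in how the cursor is encoded: options.after() serializes the last consumed offsets into a URIBuilder "after" parameter, which is fed back via options.setAfter() for the next pass.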

Example 10 with Record

Use of org.akhq.models.Record in project akhq by tchiotludo.

The class SseControllerTest, method searchApi:

@Test
void searchApi() {
    RxSseClient sseClient = embeddedServer.getApplicationContext().createBean(RxSseClient.class, embeddedServer.getURL());
    List<Record> results = sseClient
        .eventStream(
            BASE_URL + "/" + KafkaTestCluster.TOPIC_HUGE + "/data/search?searchByKey=key_100_C",
            TopicController.SearchRecord.class
        )
        .toList()
        .blockingGet()
        .stream()
        .flatMap(r -> r.getData() != null && r.getData().getRecords() != null
            ? r.getData().getRecords().stream()
            : Stream.empty())
        .collect(Collectors.toList());
    assertThat(results.size(), is(3));
}
Also used: Property(io.micronaut.context.annotation.Property) Record(org.akhq.models.Record) Collectors(java.util.stream.Collectors) AbstractTest(org.akhq.AbstractTest) Test(org.junit.jupiter.api.Test) List(java.util.List) RxSseClient(io.micronaut.rxjava2.http.client.sse.RxSseClient) Stream(java.util.stream.Stream) EmbeddedServer(io.micronaut.runtime.server.EmbeddedServer) KafkaTestCluster(org.akhq.KafkaTestCluster) Matchers.is(org.hamcrest.Matchers.is) MatcherAssert.assertThat(org.hamcrest.MatcherAssert.assertThat) Inject(jakarta.inject.Inject)

Aggregations

Record (org.akhq.models.Record)10 ProducerRecord (org.apache.kafka.clients.producer.ProducerRecord)6 SchemaRegistryClient (io.confluent.kafka.schemaregistry.client.SchemaRegistryClient)4 Inject (jakarta.inject.Inject)4 AtomicInteger (java.util.concurrent.atomic.AtomicInteger)4 SchemaRegistryType (org.akhq.configs.SchemaRegistryType)4 org.apache.kafka.clients.consumer (org.apache.kafka.clients.consumer)4 TopicPartition (org.apache.kafka.common.TopicPartition)4 URIBuilder (org.codehaus.httpcache4j.uri.URIBuilder)4 Environment (io.micronaut.context.env.Environment)3 java.util (java.util)3 ExecutionException (java.util.concurrent.ExecutionException)3 Collectors (java.util.stream.Collectors)3 Stream (java.util.stream.Stream)3 Slf4j (lombok.extern.slf4j.Slf4j)3 AbstractTest (org.akhq.AbstractTest)3 Topic (org.akhq.models.Topic)3 RecordMetadata (org.apache.kafka.clients.producer.RecordMetadata)3 JsonProperty (com.fasterxml.jackson.annotation.JsonProperty)2 Splitter (com.google.common.base.Splitter)2