Example 6 with ObjectMapperSerde

Use of io.quarkus.kafka.client.serialization.ObjectMapperSerde in the project AD482-apps by RedHatTraining.

Class VehiclePositionsStream, method buildTopology:

@Produces
public Topology buildTopology() {
    StreamsBuilder builder = new StreamsBuilder();
    // Create a serde to deserialize VehiclePosition messages
    ObjectMapperSerde<VehiclePosition> vehiclePositionSerde = new ObjectMapperSerde<>(VehiclePosition.class);
    // Create the stream from the "vehicle-positions" topic
    // (stringSerde is a Serde<String> field defined on the enclosing class)
    KStream<String, VehiclePosition> stream = builder.stream("vehicle-positions", Consumed.with(stringSerde, vehiclePositionSerde));
    // Print the stream values
    stream.foreach((key, value) -> System.out.println("Received vehicle position: " + value));
    // Map positions to elevations in feet and send the resulting stream
    // to the "vehicle-feet-elevations" topic
    stream.map((key, value) -> {
        Double feet = value.elevation * 3.28084;
        return KeyValue.pair(value.vehicleId, feet);
    }).to("vehicle-feet-elevations", Produced.with(Serdes.Integer(), Serdes.Double()));
    // Group positions by vehicle id
    KGroupedStream<Integer, VehiclePosition> positionsByVehicle = stream.groupBy(
            (key, value) -> value.vehicleId,
            Grouped.with(Serdes.Integer(), vehiclePositionSerde));
    // Count positions by vehicle
    KTable<Integer, Long> countsByVehicle = positionsByVehicle.count();
    // Print the count values
    countsByVehicle.toStream().foreach((vehicleId, count) ->
            System.out.println("Vehicle: " + vehicleId + " Positions reported: " + count + "\n"));
    return builder.build();
}
Also used : StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) KTable(org.apache.kafka.streams.kstream.KTable) Produces(javax.enterprise.inject.Produces) KGroupedStream(org.apache.kafka.streams.kstream.KGroupedStream) Produced(org.apache.kafka.streams.kstream.Produced) Consumed(org.apache.kafka.streams.kstream.Consumed) KeyValue(org.apache.kafka.streams.KeyValue) KStream(org.apache.kafka.streams.kstream.KStream) Grouped(org.apache.kafka.streams.kstream.Grouped) ObjectMapperSerde(io.quarkus.kafka.client.serialization.ObjectMapperSerde) Serde(org.apache.kafka.common.serialization.Serde) Serdes(org.apache.kafka.common.serialization.Serdes) ApplicationScoped(javax.enterprise.context.ApplicationScoped) Topology(org.apache.kafka.streams.Topology)
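The meters-to-feet conversion inside the map() step is easy to verify in isolation. Below is a minimal sketch with the conversion pulled out into a hypothetical helper class (not part of the course code); the factor 3.28084 feet per meter comes from the example itself:

```java
// Hypothetical helper extracting the conversion used by the map() step above.
public class ElevationConverter {

    // Feet per meter, as used in the VehiclePositionsStream example.
    static final double FEET_PER_METER = 3.28084;

    // Converts an elevation in meters to feet.
    public static double toFeet(double meters) {
        return meters * FEET_PER_METER;
    }
}
```

Extracting pure record-level logic like this keeps the lambda in the topology trivial and the arithmetic unit-testable without a running Kafka cluster.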

Example 7 with ObjectMapperSerde

Use of io.quarkus.kafka.client.serialization.ObjectMapperSerde in the project AD482-apps by RedHatTraining.

Class NotifyAboutLowProfitMarginPipeline, method onStart:

void onStart(@Observes StartupEvent startupEvent) {
    StreamsBuilder builder = new StreamsBuilder();
    ObjectMapperSerde<WindTurbineProfitMarginWasCalculated> profitEventSerde = new ObjectMapperSerde<>(WindTurbineProfitMarginWasCalculated.class);
    ObjectMapperSerde<LowProfitMarginWasDetected> alertsEventSerde = new ObjectMapperSerde<>(LowProfitMarginWasDetected.class);
    // Build the stream topology: read profit margin events, keep those below
    // a 10% margin, map them to alert events, and write them to the alert topic
    builder.stream(WIND_TURBINE_PROFIT_MARGINS_TOPIC, Consumed.with(Serdes.Integer(), profitEventSerde))
        .filter((key, profit) -> profit.profitMargin < 0.10)
        .map((key, profit) -> {
            logLowProfitMarginAlert(key, profit.profitMargin);
            return new KeyValue<>(key, new LowProfitMarginWasDetected(key, profit.profitMargin));
        })
        .to(LOW_PROFIT_MARGIN_TOPIC, Produced.with(Serdes.Integer(), alertsEventSerde));
    streams = new KafkaStreams(builder.build(), generateStreamConfig());
    // Starting from a clean state
    streams.cleanUp();
    streams.start();
}
Also used : StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) Produced(org.apache.kafka.streams.kstream.Produced) Consumed(org.apache.kafka.streams.kstream.Consumed) Logger(org.jboss.logging.Logger) KeyValue(org.apache.kafka.streams.KeyValue) ShutdownEvent(io.quarkus.runtime.ShutdownEvent) LowProfitMarginWasDetected(com.redhat.energy.profit.event.alert.LowProfitMarginWasDetected) WindTurbineProfitMarginWasCalculated(com.redhat.energy.profit.event.WindTurbineProfitMarginWasCalculated) ObjectMapperSerde(io.quarkus.kafka.client.serialization.ObjectMapperSerde) Observes(javax.enterprise.event.Observes) Serdes(org.apache.kafka.common.serialization.Serdes) ApplicationScoped(javax.enterprise.context.ApplicationScoped) StartupEvent(io.quarkus.runtime.StartupEvent) KafkaStreams(org.apache.kafka.streams.KafkaStreams) StreamProcessor(com.redhat.energy.profit.stream.common.StreamProcessor)
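The generateStreamConfig() helper is referenced but not shown in the example. As a rough sketch only: a Kafka Streams configuration typically supplies at least an application id and the broker address. The property values below are placeholders, not the course's actual configuration:

```java
import java.util.Properties;

public class StreamConfigSketch {

    // Hypothetical stand-in for the generateStreamConfig() helper; the real
    // method's body is not shown in the example, and these values are guesses.
    public static Properties generateStreamConfig() {
        Properties props = new Properties();
        // Standard Kafka Streams property names
        props.put("application.id", "low-profit-margin-pipeline"); // placeholder id
        props.put("bootstrap.servers", "localhost:9092");          // placeholder address
        return props;
    }
}
```

The application id matters here because it determines the consumer group and the naming of internal topics and state directories that streams.cleanUp() wipes.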

Example 8 with ObjectMapperSerde

Use of io.quarkus.kafka.client.serialization.ObjectMapperSerde in the project AD482-apps by RedHatTraining.

Class WindTurbineProfitMarginsPipeline, method onStart:

void onStart(@Observes StartupEvent startupEvent) {
    StreamsBuilder builder = new StreamsBuilder();
    ObjectMapperSerde<WindTurbineEarningWasAdded> earningEventSerde = new ObjectMapperSerde<>(WindTurbineEarningWasAdded.class);
    ObjectMapperSerde<WindTurbineExpenseWasAdded> expenseEventSerde = new ObjectMapperSerde<>(WindTurbineExpenseWasAdded.class);
    ObjectMapperSerde<AverageData> averageDataSerde = new ObjectMapperSerde<>(AverageData.class);
    ObjectMapperSerde<WindTurbineProfitMarginWasCalculated> profitEventsSerde = new ObjectMapperSerde<>(WindTurbineProfitMarginWasCalculated.class);
    // Create a KStream for the earning events
    KStream<Integer, WindTurbineEarningWasAdded> earningsStream = builder.stream(
            WIND_TURBINE_EARNINGS_TOPIC, Consumed.with(Serdes.Integer(), earningEventSerde));
    // Aggregate the earnings into a running count and sum per turbine
    KTable<Integer, AverageData> aggregatedEarnings = earningsStream.groupByKey().aggregate(
            AverageData::new,
            (key, value, aggregate) -> {
                aggregate.increaseCount(1);
                aggregate.increaseSum(value.amount);
                return aggregate;
            },
            Materialized.<Integer, AverageData, KeyValueStore<Bytes, byte[]>>as(AGGREGATED_EARNINGS_STORE).withKeySerde(Serdes.Integer()).withValueSerde(averageDataSerde));
    // Calculate the average earnings
    KTable<Integer, Double> averageEarningsTable = aggregatedEarnings.mapValues(
            value -> value.sum / value.count,
            Materialized.<Integer, Double, KeyValueStore<Bytes, byte[]>>as(AVERAGE_EARNINGS_STORE).withKeySerde(Serdes.Integer()).withValueSerde(Serdes.Double()));
    // Create a KStream for the expense events
    KStream<Integer, WindTurbineExpenseWasAdded> expensesStream = builder.stream(
            WIND_TURBINE_EXPENSES_TOPIC, Consumed.with(Serdes.Integer(), expenseEventSerde));
    // Aggregate the expenses into a running count and sum per turbine
    KTable<Integer, AverageData> aggregatedExpenses = expensesStream.groupByKey().aggregate(
            AverageData::new,
            (key, value, aggregate) -> {
                aggregate.increaseCount(1);
                aggregate.increaseSum(value.amount);
                return aggregate;
            },
            Materialized.<Integer, AverageData, KeyValueStore<Bytes, byte[]>>as(AGGREGATED_EXPENSES_STORE).withKeySerde(Serdes.Integer()).withValueSerde(averageDataSerde));
    // Calculate the average expenses
    KTable<Integer, Double> averageExpensesTable = aggregatedExpenses.mapValues(
            value -> value.sum / value.count,
            Materialized.<Integer, Double, KeyValueStore<Bytes, byte[]>>as(AVERAGE_EXPENSES_STORE).withKeySerde(Serdes.Integer()).withValueSerde(Serdes.Double()));
    // Join the averages to calculate the profit margin per turbine
    averageEarningsTable.join(averageExpensesTable, WindTurbineProfitMarginWasCalculated::new)
            .toStream()
            .to(WIND_TURBINE_PROFIT_MARGINS_TOPIC, Produced.with(Serdes.Integer(), profitEventsSerde));
    streams = new KafkaStreams(builder.build(), generateStreamConfig());
    // Starting from a clean state
    streams.cleanUp();
    streams.start();
}
Also used : KafkaStreams(org.apache.kafka.streams.KafkaStreams) AverageData(com.redhat.energy.profit.model.AverageData) WindTurbineProfitMarginWasCalculated(com.redhat.energy.profit.event.WindTurbineProfitMarginWasCalculated) StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) Bytes(org.apache.kafka.common.utils.Bytes) WindTurbineEarningWasAdded(com.redhat.energy.profit.event.WindTurbineEarningWasAdded) ObjectMapperSerde(io.quarkus.kafka.client.serialization.ObjectMapperSerde) WindTurbineExpenseWasAdded(com.redhat.energy.profit.event.WindTurbineExpenseWasAdded)
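The aggregation relies on AverageData keeping a running count and sum, which the mapValues() step then divides. A minimal sketch of such an accumulator, assuming public sum and count fields as the lambdas above imply (the real com.redhat.energy.profit.model.AverageData may differ):

```java
// Sketch of an AverageData-like accumulator; field and method names follow
// the calls made in the example, but the real class may differ.
public class AverageDataSketch {
    public double sum = 0.0;
    public int count = 0;

    public void increaseCount(int by) {
        count += by;
    }

    public void increaseSum(double amount) {
        sum += amount;
    }

    // The mapValues() step computes the average as sum / count.
    public double average() {
        return sum / count;
    }
}
```

Keeping the sum and count separately, rather than a precomputed average, is what makes the aggregate incrementally updatable one event at a time.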

Example 9 with ObjectMapperSerde

Use of io.quarkus.kafka.client.serialization.ObjectMapperSerde in the project AD482-apps by RedHatTraining.

Class RepartitionStream, method onStart:

void onStart(@Observes StartupEvent startupEvent) {
    StreamsBuilder builder = new StreamsBuilder();
    ObjectMapperSerde<TemperatureWasMeasuredInCelsius> temperaturesEventSerde = new ObjectMapperSerde<>(TemperatureWasMeasuredInCelsius.class);
    KStream<String, TemperatureWasMeasuredInCelsius> stream = builder.stream(TEMPERATURES_TOPIC, Consumed.with(Serdes.String(), temperaturesEventSerde));
    // Implement the repartitioning topology: re-key each measurement by its
    // location id so records are partitioned by location
    stream.map((key, measure) -> {
        LOGGER.infov("Repartitioning ID {0}, {1}°C ...", measure.locationId, measure.measure);
        return new KeyValue<>(measure.locationId, measure);
    }).to(TEMPERATURES_REPARTITIONED_TOPIC, Produced.with(Serdes.Integer(), temperaturesEventSerde));
    streams = new KafkaStreams(builder.build(), generateStreamConfig());
    streams.start();
}
Also used : StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) Produces(javax.enterprise.inject.Produces) Produced(org.apache.kafka.streams.kstream.Produced) Consumed(org.apache.kafka.streams.kstream.Consumed) Logger(org.jboss.logging.Logger) KeyValue(org.apache.kafka.streams.KeyValue) ShutdownEvent(io.quarkus.runtime.ShutdownEvent) KStream(org.apache.kafka.streams.kstream.KStream) ObjectMapperSerde(io.quarkus.kafka.client.serialization.ObjectMapperSerde) TemperatureWasMeasuredInCelsius(com.redhat.monitor.event.TemperatureWasMeasuredInCelsius) Observes(javax.enterprise.event.Observes) Serdes(org.apache.kafka.common.serialization.Serdes) ApplicationScoped(javax.enterprise.context.ApplicationScoped) StartupEvent(io.quarkus.runtime.StartupEvent) KafkaStreams(org.apache.kafka.streams.KafkaStreams)

Example 10 with ObjectMapperSerde

Use of io.quarkus.kafka.client.serialization.ObjectMapperSerde in the project quarkus-quickstarts by quarkusio.

Class TopologyProducer, method buildTopology:

@Produces
public Topology buildTopology() {
    StreamsBuilder builder = new StreamsBuilder();
    ObjectMapperSerde<WeatherStation> weatherStationSerde = new ObjectMapperSerde<>(WeatherStation.class);
    ObjectMapperSerde<Aggregation> aggregationSerde = new ObjectMapperSerde<>(Aggregation.class);
    KeyValueBytesStoreSupplier storeSupplier = Stores.persistentKeyValueStore(WEATHER_STATIONS_STORE);
    GlobalKTable<Integer, WeatherStation> stations = builder.globalTable(WEATHER_STATIONS_TOPIC, Consumed.with(Serdes.Integer(), weatherStationSerde));
    builder.stream(TEMPERATURE_VALUES_TOPIC, Consumed.with(Serdes.Integer(), Serdes.String()))
        .join(stations,
            (stationId, timestampAndValue) -> stationId,
            (timestampAndValue, station) -> {
                String[] parts = timestampAndValue.split(";");
                return new TemperatureMeasurement(station.id, station.name,
                        Instant.parse(parts[0]), Double.valueOf(parts[1]));
            })
        .groupByKey()
        .aggregate(Aggregation::new,
            (stationId, value, aggregation) -> aggregation.updateFrom(value),
            Materialized.<Integer, Aggregation>as(storeSupplier)
                .withKeySerde(Serdes.Integer())
                .withValueSerde(aggregationSerde))
        .toStream()
        .to(TEMPERATURES_AGGREGATED_TOPIC, Produced.with(Serdes.Integer(), aggregationSerde));
    return builder.build();
}
Also used : StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) Aggregation(org.acme.kafka.streams.aggregator.model.Aggregation) WeatherStation(org.acme.kafka.streams.aggregator.model.WeatherStation) TemperatureMeasurement(org.acme.kafka.streams.aggregator.model.TemperatureMeasurement) KeyValueBytesStoreSupplier(org.apache.kafka.streams.state.KeyValueBytesStoreSupplier) ObjectMapperSerde(io.quarkus.kafka.client.serialization.ObjectMapperSerde) Produces(javax.enterprise.inject.Produces)
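The join's value joiner assumes each record on the temperature topic is a String of the form "ISO-8601 timestamp;reading". A small sketch of that parsing step on its own, using a hypothetical holder class (not part of the quickstart):

```java
import java.time.Instant;

// Hypothetical holder for the parsed "timestamp;value" payload used in the join.
public class ParsedTemperature {
    public final Instant timestamp;
    public final double value;

    public ParsedTemperature(String timestampAndValue) {
        // Same split the example's value joiner performs before building
        // a TemperatureMeasurement
        String[] parts = timestampAndValue.split(";");
        this.timestamp = Instant.parse(parts[0]);
        this.value = Double.parseDouble(parts[1]);
    }
}
```

Note that Instant.parse expects a strict ISO-8601 instant such as "2020-01-01T00:00:00Z"; any other timestamp format on the topic would make the joiner throw.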

Aggregations

ObjectMapperSerde (io.quarkus.kafka.client.serialization.ObjectMapperSerde): 15 uses
StreamsBuilder (org.apache.kafka.streams.StreamsBuilder): 13 uses
ApplicationScoped (javax.enterprise.context.ApplicationScoped): 8 uses
Serdes (org.apache.kafka.common.serialization.Serdes): 8 uses
KafkaStreams (org.apache.kafka.streams.KafkaStreams): 8 uses
Produces (javax.enterprise.inject.Produces): 7 uses
KeyValue (org.apache.kafka.streams.KeyValue): 7 uses
Consumed (org.apache.kafka.streams.kstream.Consumed): 7 uses
Produced (org.apache.kafka.streams.kstream.Produced): 7 uses
ShutdownEvent (io.quarkus.runtime.ShutdownEvent): 5 uses
StartupEvent (io.quarkus.runtime.StartupEvent): 5 uses
Observes (javax.enterprise.event.Observes): 5 uses
KStream (org.apache.kafka.streams.kstream.KStream): 5 uses
Logger (org.jboss.logging.Logger): 5 uses
Bytes (org.apache.kafka.common.utils.Bytes): 3 uses
Topology (org.apache.kafka.streams.Topology): 3 uses
WindTurbineProfitMarginWasCalculated (com.redhat.energy.profit.event.WindTurbineProfitMarginWasCalculated): 2 uses
MWattsMeasurement (com.redhat.energy.records.MWattsMeasurement): 2 uses
TemperatureWasMeasuredInCelsius (com.redhat.monitor.event.TemperatureWasMeasuredInCelsius): 2 uses
ObjectMapper (com.fasterxml.jackson.databind.ObjectMapper): 1 use