Use of org.apache.storm.kafka.trident.mapper.FieldNameBasedTupleToKafkaMapper in project storm by apache.
Example: the setup method of the class TridentKafkaTest.
@Before
public void setup() {
    // Start an embedded Kafka broker and a consumer for reading back produced records.
    broker = new KafkaTestBroker();
    simpleConsumer = TestUtils.getKafkaConsumer(broker);
    // Map the tuple fields "key" and "message" to the Kafka record key and value.
    TridentTupleToKafkaMapper<Object, Object> mapper =
        new FieldNameBasedTupleToKafkaMapper<Object, Object>("key", "message");
    // Route every tuple to the fixed test topic.
    KafkaTopicSelector topicSelector = new DefaultTopicSelector(TestUtils.TOPIC);
    state = new TridentKafkaState()
        .withKafkaTopicSelector(topicSelector)
        .withTridentTupleToKafkaMapper(mapper);
    state.prepare(TestUtils.getProducerProperties(broker.getBrokerConnectionString()));
}
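The prepared state is what the test then drives: a batch of Trident tuples is handed to it and the resulting records are read back from Kafka through simpleConsumer. The sketch below only illustrates that flow; the TridentTupleView.createFreshTuple helper used to build test tuples is an assumption about the available test API, not part of the snippet above.

// Illustrative sketch only: drive the state prepared in setup() with one tuple.
// Assumes TridentTupleView.createFreshTuple(Fields, Object...) is available for
// building in-memory test tuples (imports: java.util.*, org.apache.storm.trident.tuple.*).
@Test
public void writesTupleToKafka() {
    List<TridentTuple> batch = new ArrayList<>();
    batch.add(TridentTupleView.createFreshTuple(new Fields("key", "message"), "key-1", "value-1"));
    // Push the batch through the Kafka-backed state; no collector is needed here.
    state.updateState(batch, null);
    // The record keyed "key-1" with value "value-1" should now be readable
    // from TestUtils.TOPIC via simpleConsumer.
}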
Example: the buildTopology method of the class TridentKafkaTopology.
private static StormTopology buildTopology(String brokerConnectionString) {
    // A cycling spout that endlessly replays four (word, count) tuples.
    Fields fields = new Fields("word", "count");
    FixedBatchSpout spout = new FixedBatchSpout(fields, 4,
        new Values("storm", "1"),
        new Values("trident", "1"),
        new Values("needs", "1"),
        new Values("javadoc", "1"));
    spout.setCycle(true);
    TridentTopology topology = new TridentTopology();
    Stream stream = topology.newStream("spout1", spout);

    // Producer configuration pointing at the supplied broker.
    Properties props = new Properties();
    props.put("bootstrap.servers", brokerConnectionString);
    props.put("acks", "1");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    // Write every tuple to the "test" topic, using the "word" field as the
    // Kafka key and the "count" field as the message value.
    TridentKafkaStateFactory stateFactory = new TridentKafkaStateFactory()
        .withProducerProperties(props)
        .withKafkaTopicSelector(new DefaultTopicSelector("test"))
        .withTridentTupleToKafkaMapper(new FieldNameBasedTupleToKafkaMapper("word", "count"));
    stream.partitionPersist(stateFactory, fields, new TridentKafkaUpdater(), new Fields());
    return topology.build();
}
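buildTopology only assembles the topology; a caller still has to submit it. A minimal sketch of an in-process submission follows, assuming a LocalCluster run is acceptable; the topology name, broker address, and run duration are illustrative placeholders, not taken from the snippet above.

// Illustrative sketch only: submit the topology to an in-process cluster.
// Assumes imports for org.apache.storm.Config and org.apache.storm.LocalCluster.
public static void main(String[] args) throws Exception {
    Config conf = new Config();
    conf.setMaxSpoutPending(5);
    LocalCluster cluster = new LocalCluster();
    // "wordCounter" and "localhost:9092" are placeholder values for illustration.
    cluster.submitTopology("wordCounter", conf, buildTopology("localhost:9092"));
    Thread.sleep(60 * 1000);
    cluster.killTopology("wordCounter");
    cluster.shutdown();
}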