Use of org.apache.storm.tuple.Values in project storm by apache.
The class WordCounter, method execute:
public void execute(Tuple input, BasicOutputCollector collector) {
    String word = input.getStringByField("word");
    int count;
    if (wordCounter.containsKey(word)) {
        count = wordCounter.get(word) + 1;
    } else {
        count = 1;
    }
    wordCounter.put(word, count);
    collector.emit(new Values(word, String.valueOf(count)));
}
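Detached from Storm, the same "increment if present, otherwise start at 1" logic can be collapsed into a single `Map.merge` call. A minimal sketch (the class and method names here are hypothetical, not part of the Storm codebase):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the bolt's counting state. Map.merge performs the
// containsKey/get/put sequence from the execute method in one call.
class WordCounterTally {
    private final Map<String, Integer> wordCounter = new HashMap<>();

    // Returns the updated count for the word, mirroring the value the bolt emits.
    int count(String word) {
        return wordCounter.merge(word, 1, Integer::sum);
    }
}
```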
Use of org.apache.storm.tuple.Values in project storm by apache.
The class WordCountValueMapper, method toValues:
@Override
public List<Values> toValues(ITuple tuple, Result result) throws Exception {
    List<Values> values = new ArrayList<>();
    Cell[] cells = result.rawCells();
    for (Cell cell : cells) {
        Values value = new Values(Bytes.toString(CellUtil.cloneQualifier(cell)),
                Bytes.toLong(CellUtil.cloneValue(cell)));
        values.add(value);
    }
    return values;
}
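The mapper turns each HBase cell into one output tuple of (column qualifier, counter value). Since `Values` itself extends `ArrayList<Object>`, the same shape can be sketched without the HBase and Storm types, using map entries in place of cells (this sketch class is hypothetical, for illustration only):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

class CellToValuesSketch {
    // One List<Object> per entry, analogous to one Values per Cell.
    static List<List<Object>> toValues(Map<String, Long> cells) {
        List<List<Object>> values = new ArrayList<>();
        for (Map.Entry<String, Long> cell : cells.entrySet()) {
            List<Object> value = new ArrayList<>();
            value.add(cell.getKey());   // qualifier -> first tuple field
            value.add(cell.getValue()); // counter value -> second tuple field
            values.add(value);
        }
        return values;
    }
}
```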
Use of org.apache.storm.tuple.Values in project storm by apache.
The class TridentFileTopology, method buildTopology:
public static StormTopology buildTopology(String hdfsUrl) {
    FixedBatchSpout spout = new FixedBatchSpout(new Fields("sentence", "key"), 1000,
            new Values("the cow jumped over the moon", 1L),
            new Values("the man went to the store and bought some candy", 2L),
            new Values("four score and seven years ago", 3L),
            new Values("how many apples can you eat", 4L),
            new Values("to be or not to be the person", 5L));
    spout.setCycle(true);
    TridentTopology topology = new TridentTopology();
    Stream stream = topology.newStream("spout1", spout);
    Fields hdfsFields = new Fields("sentence", "key");
    FileNameFormat fileNameFormat = new DefaultFileNameFormat()
            .withPath("/tmp/trident")
            .withPrefix("trident")
            .withExtension(".txt");
    RecordFormat recordFormat = new DelimitedRecordFormat().withFields(hdfsFields);
    FileRotationPolicy rotationPolicy = new FileSizeRotationPolicy(5.0f, FileSizeRotationPolicy.Units.MB);
    HdfsState.Options options = new HdfsState.HdfsFileOptions()
            .withFileNameFormat(fileNameFormat)
            .withRecordFormat(recordFormat)
            .withRotationPolicy(rotationPolicy)
            .withFsUrl(hdfsUrl)
            .withConfigKey("hdfs.config");
    StateFactory factory = new HdfsStateFactory().withOptions(options);
    stream.partitionPersist(factory, hdfsFields, new HdfsUpdater(), new Fields());
    return topology.build();
}
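Before each batch is written to HDFS, the `DelimitedRecordFormat` serializes a tuple's fields into one text record. A rough sketch of that formatting step with plain strings, assuming a comma field delimiter and newline record delimiter (which I believe are the defaults, though both are configurable):

```java
class DelimitedRecordSketch {
    // Joins tuple field values with a field delimiter and terminates the record,
    // approximating what a delimited record format produces for one tuple.
    static String format(Object[] fields, String fieldDelimiter, String recordDelimiter) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) {
                sb.append(fieldDelimiter);
            }
            sb.append(fields[i]);
        }
        return sb.append(recordDelimiter).toString();
    }
}
```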
Use of org.apache.storm.tuple.Values in project storm by apache.
The class AbstractRankerBolt, method emitRankings:
private void emitRankings(BasicOutputCollector collector) {
    collector.emit(new Values(rankings.copy()));
    getLogger().debug("Rankings: " + rankings);
}
Use of org.apache.storm.tuple.Values in project storm by apache.
The class TridentWindowingInmemoryStoreTopology, method buildTopology:
public static StormTopology buildTopology(WindowsStoreFactory windowStore, WindowConfig windowConfig) throws Exception {
    FixedBatchSpout spout = new FixedBatchSpout(new Fields("sentence"), 3,
            new Values("the cow jumped over the moon"),
            new Values("the man went to the store and bought some candy"),
            new Values("four score and seven years ago"),
            new Values("how many apples can you eat"),
            new Values("to be or not to be the person"));
    spout.setCycle(true);
    TridentTopology topology = new TridentTopology();
    Stream stream = topology.newStream("spout1", spout)
            .parallelismHint(16)
            .each(new Fields("sentence"), new Split(), new Fields("word"))
            .window(windowConfig, windowStore, new Fields("word"), new CountAsAggregator(), new Fields("count"))
            .peek(new Consumer() {
                @Override
                public void accept(TridentTuple input) {
                    LOG.info("Received tuple: [{}]", input);
                }
            });
    return topology.build();
}
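The stream above splits each sentence tuple into word tuples, and `CountAsAggregator` then emits the number of tuples that landed in each window. The split-and-count core, detached from Trident's windowing machinery, can be sketched as follows (a hypothetical illustration class; the real `Split` function splits on whitespace):

```java
import java.util.ArrayList;
import java.util.List;

class WindowCountSketch {
    // Mirrors Storm's Split function: one word per whitespace-separated token.
    static List<String> split(String sentence) {
        List<String> words = new ArrayList<>();
        for (String word : sentence.split("\\s+")) {
            if (!word.isEmpty()) {
                words.add(word);
            }
        }
        return words;
    }

    // CountAsAggregator analogue: the aggregate over a window is simply the
    // number of word tuples that fell into it.
    static long countWindow(List<String> sentencesInWindow) {
        long count = 0;
        for (String sentence : sentencesInWindow) {
            count += split(sentence).size();
        }
        return count;
    }
}
```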