Example 6 with Values

use of org.apache.storm.tuple.Values in project storm by apache.

the class WordCounter method execute.

public void execute(Tuple input, BasicOutputCollector collector) {
    String word = input.getStringByField("word");
    int count;
    if (wordCounter.containsKey(word)) {
        count = wordCounter.get(word) + 1;
    } else {
        count = 1;
    }
    wordCounter.put(word, count);
    collector.emit(new Values(word, String.valueOf(count)));
}
Also used : Values(org.apache.storm.tuple.Values)
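
The bolt emits a two-element Values (the word and its stringified count), so the matching declareOutputFields has to declare a two-field schema. A minimal sketch of that declaration, assuming the field names "word" and "count" (the project may use different names; only the arity is fixed by the emit above):

@Override
public void declareOutputFields(OutputFieldsDeclarer declarer) {
    // Two fields to match new Values(word, String.valueOf(count)).
    // Field names are assumptions; OutputFieldsDeclarer comes from
    // org.apache.storm.topology, Fields from org.apache.storm.tuple.
    declarer.declare(new Fields("word", "count"));
}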

Example 7 with Values

use of org.apache.storm.tuple.Values in project storm by apache.

the class WordCountValueMapper method toValues.

@Override
public List<Values> toValues(ITuple tuple, Result result) throws Exception {
    List<Values> values = new ArrayList<Values>();
    Cell[] cells = result.rawCells();
    for (Cell cell : cells) {
        Values value = new Values(Bytes.toString(CellUtil.cloneQualifier(cell)), Bytes.toLong(CellUtil.cloneValue(cell)));
        values.add(value);
    }
    return values;
}
Also used : ArrayList(java.util.ArrayList), Values(org.apache.storm.tuple.Values), Cell(org.apache.hadoop.hbase.Cell)
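
The per-Cell loop works because Values is just an ArrayList<Object> with a varargs constructor, so the returned List<Values> is effectively a list of rows, each readable by position. A tiny standalone illustration (the class name ValuesDemo is made up for the example):

import java.util.List;
import org.apache.storm.tuple.Values;

public class ValuesDemo {
    public static void main(String[] args) {
        // Values extends ArrayList<Object>, so positional access works directly.
        Values row = new Values("apples", 42L);
        System.out.println(row.get(0) + " -> " + row.get(1)); // apples -> 42
        List<Object> asList = row; // it really is a List<Object>
        System.out.println(asList.size()); // 2
    }
}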

Example 8 with Values

use of org.apache.storm.tuple.Values in project storm by apache.

the class TridentFileTopology method buildTopology.

public static StormTopology buildTopology(String hdfsUrl) {
    FixedBatchSpout spout = new FixedBatchSpout(new Fields("sentence", "key"), 1000,
        new Values("the cow jumped over the moon", 1L),
        new Values("the man went to the store and bought some candy", 2L),
        new Values("four score and seven years ago", 3L),
        new Values("how many apples can you eat", 4L),
        new Values("to be or not to be the person", 5L));
    spout.setCycle(true);
    TridentTopology topology = new TridentTopology();
    Stream stream = topology.newStream("spout1", spout);
    Fields hdfsFields = new Fields("sentence", "key");
    FileNameFormat fileNameFormat = new DefaultFileNameFormat().withPath("/tmp/trident").withPrefix("trident").withExtension(".txt");
    RecordFormat recordFormat = new DelimitedRecordFormat().withFields(hdfsFields);
    FileRotationPolicy rotationPolicy = new FileSizeRotationPolicy(5.0f, FileSizeRotationPolicy.Units.MB);
    HdfsState.Options options = new HdfsState.HdfsFileOptions()
        .withFileNameFormat(fileNameFormat)
        .withRecordFormat(recordFormat)
        .withRotationPolicy(rotationPolicy)
        .withFsUrl(hdfsUrl)
        .withConfigKey("hdfs.config");
    StateFactory factory = new HdfsStateFactory().withOptions(options);
    TridentState state = stream.partitionPersist(factory, hdfsFields, new HdfsUpdater(), new Fields());
    return topology.build();
}
Also used : TridentState(org.apache.storm.trident.TridentState), Values(org.apache.storm.tuple.Values), FileRotationPolicy(org.apache.storm.hdfs.trident.rotation.FileRotationPolicy), Fields(org.apache.storm.tuple.Fields), StateFactory(org.apache.storm.trident.state.StateFactory), TridentTopology(org.apache.storm.trident.TridentTopology), FileInputStream(java.io.FileInputStream), Stream(org.apache.storm.trident.Stream), InputStream(java.io.InputStream), FileSizeRotationPolicy(org.apache.storm.hdfs.trident.rotation.FileSizeRotationPolicy)
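
buildTopology only assembles the StormTopology; submitting it needs a Config that also carries HDFS settings under the "hdfs.config" key referenced by withConfigKey above. A minimal local-mode sketch, where the empty settings map, the topology name, and the hdfs://localhost:8020 URL are all placeholder assumptions:

import java.util.HashMap;
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;

public class SubmitTridentFileTopology {
    public static void main(String[] args) throws Exception {
        Config conf = new Config();
        conf.setMaxSpoutPending(5);
        // withConfigKey("hdfs.config") makes the HDFS state read extra settings
        // from this entry; an empty map is only a placeholder here.
        conf.put("hdfs.config", new HashMap<String, Object>());
        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("trident-hdfs", conf,
            TridentFileTopology.buildTopology("hdfs://localhost:8020"));
        Thread.sleep(60_000); // let a few batches flow before shutting down
        cluster.shutdown();
    }
}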

Example 9 with Values

use of org.apache.storm.tuple.Values in project storm by apache.

the class AbstractRankerBolt method emitRankings.

private void emitRankings(BasicOutputCollector collector) {
    collector.emit(new Values(rankings.copy()));
    getLogger().debug("Rankings: " + rankings);
}
Also used : Values(org.apache.storm.tuple.Values)
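
The emitted tuple carries a single value, the copied Rankings object, which travels as-is and is recovered with a cast on the consuming side. A hedged sketch of such a consumer (the Rankings type and the positional access are assumptions about the downstream bolt, not code from the project):

public void execute(Tuple input, BasicOutputCollector collector) {
    // The rankings were emitted as the tuple's only value; pull them back out.
    Rankings rankings = (Rankings) input.getValue(0);
    // ... merge these partial rankings or report them downstream ...
}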

Example 10 with Values

use of org.apache.storm.tuple.Values in project storm by apache.

the class TridentWindowingInmemoryStoreTopology method buildTopology.

public static StormTopology buildTopology(WindowsStoreFactory windowStore, WindowConfig windowConfig) throws Exception {
    FixedBatchSpout spout = new FixedBatchSpout(new Fields("sentence"), 3,
        new Values("the cow jumped over the moon"),
        new Values("the man went to the store and bought some candy"),
        new Values("four score and seven years ago"),
        new Values("how many apples can you eat"),
        new Values("to be or not to be the person"));
    spout.setCycle(true);
    TridentTopology topology = new TridentTopology();
    Stream stream = topology.newStream("spout1", spout)
        .parallelismHint(16)
        .each(new Fields("sentence"), new Split(), new Fields("word"))
        .window(windowConfig, windowStore, new Fields("word"), new CountAsAggregator(), new Fields("count"))
        .peek(new Consumer() {

        @Override
        public void accept(TridentTuple input) {
            LOG.info("Received tuple: [{}]", input);
        }
    });
    return topology.build();
}
Also used : FixedBatchSpout(org.apache.storm.trident.testing.FixedBatchSpout), Fields(org.apache.storm.tuple.Fields), Consumer(org.apache.storm.trident.operation.Consumer), TridentTopology(org.apache.storm.trident.TridentTopology), CountAsAggregator(org.apache.storm.trident.testing.CountAsAggregator), Values(org.apache.storm.tuple.Values), Stream(org.apache.storm.trident.Stream), Split(org.apache.storm.trident.testing.Split), TridentTuple(org.apache.storm.trident.tuple.TridentTuple)
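
The window only sees one word per tuple because Split fans each sentence out into single-field tuples first. Roughly what that testing helper does, sketched here as an equivalent Trident function (not copied verbatim from the project):

import org.apache.storm.trident.operation.BaseFunction;
import org.apache.storm.trident.operation.TridentCollector;
import org.apache.storm.trident.tuple.TridentTuple;
import org.apache.storm.tuple.Values;

public class Split extends BaseFunction {
    @Override
    public void execute(TridentTuple tuple, TridentCollector collector) {
        // Emit one single-field tuple per whitespace-separated token.
        for (String word : tuple.getString(0).split(" ")) {
            if (!word.isEmpty()) {
                collector.emit(new Values(word));
            }
        }
    }
}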

Aggregations

Values (org.apache.storm.tuple.Values) 187
Test (org.junit.Test) 80
ArrayList (java.util.ArrayList) 42
Fields (org.apache.storm.tuple.Fields) 40
HashMap (java.util.HashMap) 36
ChannelHandler (org.apache.storm.sql.runtime.ChannelHandler) 26
TridentTopology (org.apache.storm.trident.TridentTopology) 21
FixedBatchSpout (org.apache.storm.trident.testing.FixedBatchSpout) 14
Stream (org.apache.storm.trident.Stream) 12
TupleImpl (org.apache.storm.tuple.TupleImpl) 12
TestUtils (org.apache.storm.sql.TestUtils) 11
TridentState (org.apache.storm.trident.TridentState) 11
List (java.util.List) 9
Config (org.apache.storm.Config) 9
TopologyBuilder (org.apache.storm.topology.TopologyBuilder) 9
Tuple (org.apache.storm.tuple.Tuple) 9
Map (java.util.Map) 8
GeneralTopologyContext (org.apache.storm.task.GeneralTopologyContext) 8
TridentTuple (org.apache.storm.trident.tuple.TridentTuple) 8
StateFactory (org.apache.storm.trident.state.StateFactory) 7