
Example 31 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class TridentMapExample, method buildTopology:

public static StormTopology buildTopology(LocalDRPC drpc) {
    FixedBatchSpout spout = new FixedBatchSpout(new Fields("word"), 3,
            new Values("the cow jumped over the moon"),
            new Values("the man went to the store and bought some candy"),
            new Values("four score and seven years ago"),
            new Values("how many apples can you eat"),
            new Values("to be or not to be the person"));
    spout.setCycle(true);
    TridentTopology topology = new TridentTopology();
    TridentState wordCounts = topology.newStream("spout1", spout)
            .parallelismHint(16)
            .flatMap(split)
            .map(toUpper, new Fields("uppercased"))
            .filter(theFilter)
            .peek(new Consumer() {

                @Override
                public void accept(TridentTuple input) {
                    System.out.println(input.getString(0));
                }
            })
            .groupBy(new Fields("uppercased"))
            .persistentAggregate(new MemoryMapState.Factory(), new Count(), new Fields("count"))
            .parallelismHint(16);
    topology.newDRPCStream("words", drpc)
            .flatMap(split, new Fields("word"))
            .groupBy(new Fields("word"))
            .stateQuery(wordCounts, new Fields("word"), new MapGet(), new Fields("count"))
            .filter(new FilterNull())
            .aggregate(new Fields("count"), new Sum(), new Fields("sum"));
    return topology.build();
}
Also used: FixedBatchSpout (org.apache.storm.trident.testing.FixedBatchSpout), FilterNull (org.apache.storm.trident.operation.builtin.FilterNull), Fields (org.apache.storm.tuple.Fields), Consumer (org.apache.storm.trident.operation.Consumer), TridentTopology (org.apache.storm.trident.TridentTopology), TridentState (org.apache.storm.trident.TridentState), Values (org.apache.storm.tuple.Values), MapGet (org.apache.storm.trident.operation.builtin.MapGet), Sum (org.apache.storm.trident.operation.builtin.Sum), Count (org.apache.storm.trident.operation.builtin.Count), TridentTuple (org.apache.storm.trident.tuple.TridentTuple)
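For reference, the per-batch computation that this pipeline performs (split into words, uppercase, count per word) can be sketched in plain Java without any Storm dependency. WordCountSketch is a hypothetical name, and the topology's theFilter step is omitted since its definition is not shown here:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

// Plain-Java reference for what the Trident stream computes on one batch:
// flatMap(split) -> map(toUpper) -> groupBy -> count.
class WordCountSketch {
    static Map<String, Long> count(String... sentences) {
        return Arrays.stream(sentences)
                .flatMap(s -> Arrays.stream(s.split(" ")))   // flatMap(split)
                .map(String::toUpperCase)                    // map(toUpper)
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }
}
```

Unlike this one-shot sketch, the topology maintains the counts incrementally in a MemoryMapState, which the DRPC stream then queries with MapGet.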

Example 32 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class HBaseState, method updateState:

public void updateState(List<TridentTuple> tuples, TridentCollector collector) {
    List<Mutation> mutations = Lists.newArrayList();
    for (TridentTuple tuple : tuples) {
        byte[] rowKey = options.mapper.rowKey(tuple);
        ColumnList cols = options.mapper.columns(tuple);
        mutations.addAll(hBaseClient.constructMutationReq(rowKey, cols, options.durability));
    }
    try {
        hBaseClient.batchMutate(mutations);
    } catch (Exception e) {
        collector.reportError(e);
        throw new FailedException(e);
    }
}
Also used: FailedException (org.apache.storm.topology.FailedException), ColumnList (org.apache.storm.hbase.common.ColumnList), TridentTuple (org.apache.storm.trident.tuple.TridentTuple)

Example 33 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class JdbcState, method batchRetrieve:

public List<List<Values>> batchRetrieve(List<TridentTuple> tridentTuples) {
    List<List<Values>> batchRetrieveResult = Lists.newArrayList();
    try {
        for (TridentTuple tuple : tridentTuples) {
            List<Column> columns = options.jdbcLookupMapper.getColumns(tuple);
            List<List<Column>> rows = jdbcClient.select(options.selectQuery, columns);
            for (List<Column> row : rows) {
                List<Values> values = options.jdbcLookupMapper.toTuple(tuple, row);
                batchRetrieveResult.add(values);
            }
        }
    } catch (Exception e) {
        LOG.warn("Batch get operation failed. Triggering replay.", e);
        throw new FailedException(e);
    }
    return batchRetrieveResult;
}
Also used: Column (org.apache.storm.jdbc.common.Column), FailedException (org.apache.storm.topology.FailedException), Values (org.apache.storm.tuple.Values), ArrayList (java.util.ArrayList), List (java.util.List), TridentTuple (org.apache.storm.trident.tuple.TridentTuple)
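Note the fan-out in the inner loop: one lookup tuple can match several rows, and every matching row becomes its own entry in the batch result. This shape can be sketched in plain Java against a hypothetical in-memory "table" (LookupSketch is an illustrative name; the real code issues a JDBC select per tuple):

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java sketch of batchRetrieve's one-to-many fan-out: for each input
// key, every matching row is appended to the result as a separate entry.
class LookupSketch {
    static List<List<String>> batchRetrieve(List<String> keys, List<String[]> table) {
        List<List<String>> result = new ArrayList<>();
        for (String key : keys) {
            for (String[] row : table) {
                if (row[0].equals(key)) {
                    result.add(List.of(row));
                }
            }
        }
        return result;
    }
}
```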

Example 34 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class HiveState, method writeTuples:

private void writeTuples(List<TridentTuple> tuples) throws Exception {
    for (TridentTuple tuple : tuples) {
        List<String> partitionVals = options.getMapper().mapPartitions(tuple);
        HiveEndPoint endPoint = HiveUtils.makeEndPoint(partitionVals, options);
        HiveWriter writer = getOrCreateWriter(endPoint);
        writer.write(options.getMapper().mapRecord(tuple));
        currentBatchSize++;
        if (currentBatchSize >= options.getBatchSize()) {
            flushAllWriters();
            currentBatchSize = 0;
        }
    }
}
Also used: HiveWriter (org.apache.storm.hive.common.HiveWriter), TridentTuple (org.apache.storm.trident.tuple.TridentTuple)
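The size-triggered flush in writeTuples (increment a counter per record, flush all writers and reset once the configured batch size is reached) can be sketched independently of Hive. Batcher is a hypothetical stand-in, with a string buffer in place of the HiveWriter map:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for HiveState's flush logic: buffer records and
// flush once the batch-size threshold is reached, mirroring writeTuples.
class Batcher {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();
    private int flushes = 0;

    Batcher(int batchSize) {
        this.batchSize = batchSize;
    }

    void write(String record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flushAll();           // corresponds to flushAllWriters()
        }
    }

    void flushAll() {
        buffer.clear();           // resetting the buffer plays the role of currentBatchSize = 0
        flushes++;
    }

    int flushCount() {
        return flushes;
    }
}
```

One consequence of this design, visible in the original too: records written after the last threshold crossing sit in the writers until some later flush (e.g. at commit time).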

Example 35 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class OpenTsdbState, method update:

public void update(List<TridentTuple> tridentTuples, TridentCollector collector) {
    try {
        List<OpenTsdbMetricDatapoint> metricDataPoints = new ArrayList<>();
        for (TridentTuple tridentTuple : tridentTuples) {
            for (ITupleOpenTsdbDatapointMapper tupleOpenTsdbDatapointMapper : tupleMetricPointMappers) {
                metricDataPoints.add(tupleOpenTsdbDatapointMapper.getMetricPoint(tridentTuple));
            }
        }
        final ClientResponse.Details details = openTsdbClient.writeMetricPoints(metricDataPoints);
        if (details != null && (details.getFailed() > 0)) {
            final String errorMsg = "Failed in writing metrics to TSDB with details: " + details;
            LOG.error(errorMsg);
            throw new RuntimeException(errorMsg);
        }
    } catch (Exception e) {
        collector.reportError(e);
        throw new FailedException(e);
    }
}
Also used: ClientResponse (org.apache.storm.opentsdb.client.ClientResponse), OpenTsdbMetricDatapoint (org.apache.storm.opentsdb.OpenTsdbMetricDatapoint), FailedException (org.apache.storm.topology.FailedException), ArrayList (java.util.ArrayList), ITupleOpenTsdbDatapointMapper (org.apache.storm.opentsdb.bolt.ITupleOpenTsdbDatapointMapper), TridentTuple (org.apache.storm.trident.tuple.TridentTuple)
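The error handling above treats a partial failure (some data points rejected by TSDB) the same as a total one: any failed count escalates to an exception, which the outer catch converts to a FailedException so Trident replays the whole batch. The check can be sketched with a hypothetical WriteDetails class standing in for ClientResponse.Details:

```java
// Hypothetical stand-in for ClientResponse.Details, illustrating
// OpenTsdbState's partial-failure check: any rejected point fails the write.
class WriteDetails {
    final int success;
    final int failed;

    WriteDetails(int success, int failed) {
        this.success = success;
        this.failed = failed;
    }

    static void checkWrite(WriteDetails details) {
        if (details != null && details.failed > 0) {
            // In the original, this RuntimeException is caught and rethrown
            // as a FailedException, triggering a batch replay.
            throw new RuntimeException(
                    "Failed in writing metrics to TSDB, failed points: " + details.failed);
        }
    }
}
```

Because the whole batch is replayed, already-accepted points may be written again; this is safe only if the downstream writes are idempotent.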

Aggregations

TridentTuple (org.apache.storm.trident.tuple.TridentTuple): 46
ArrayList (java.util.ArrayList): 18
FailedException (org.apache.storm.topology.FailedException): 11
List (java.util.List): 10
Values (org.apache.storm.tuple.Values): 8
ISqlTridentDataSource (org.apache.storm.sql.runtime.ISqlTridentDataSource): 6
Test (org.junit.Test): 6
HashMap (java.util.HashMap): 5
TridentTopology (org.apache.storm.trident.TridentTopology): 5
Consumer (org.apache.storm.trident.operation.Consumer): 5
StateUpdater (org.apache.storm.trident.state.StateUpdater): 5
Stream (org.apache.storm.trident.Stream): 4
Fields (org.apache.storm.tuple.Fields): 4
Map (java.util.Map): 3
FixedBatchSpout (org.apache.storm.trident.testing.FixedBatchSpout): 3
BatchStatement (com.datastax.driver.core.BatchStatement): 2
Statement (com.datastax.driver.core.Statement): 2
IOException (java.io.IOException): 2
Future (java.util.concurrent.Future): 2
ProducerRecord (org.apache.kafka.clients.producer.ProducerRecord): 2