
Example 16 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class HBaseState, method batchRetrieve.

public List<List<Values>> batchRetrieve(List<TridentTuple> tridentTuples) {
    List<List<Values>> batchRetrieveResult = Lists.newArrayList();
    List<Get> gets = Lists.newArrayList();
    for (TridentTuple tuple : tridentTuples) {
        byte[] rowKey = options.mapper.rowKey(tuple);
        gets.add(hBaseClient.constructGetRequests(rowKey, options.projectionCriteria));
    }
    try {
        Result[] results = hBaseClient.batchGet(gets);
        for (int i = 0; i < results.length; i++) {
            Result result = results[i];
            TridentTuple tuple = tridentTuples.get(i);
            List<Values> values = options.rowToStormValueMapper.toValues(tuple, result);
            batchRetrieveResult.add(values);
        }
    } catch (Exception e) {
        LOG.warn("Batch get operation failed. Triggering replay.", e);
        throw new FailedException(e);
    }
    return batchRetrieveResult;
}
Also used: Values(org.apache.storm.tuple.Values), FailedException(org.apache.storm.topology.FailedException), ColumnList(org.apache.storm.hbase.common.ColumnList), List(java.util.List), TridentTuple(org.apache.storm.trident.tuple.TridentTuple)
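The method above follows a common Trident state pattern: build one request per tuple, execute them as a single batch, pair each result with its tuple by index, and throw FailedException on any error so Storm replays the whole batch. A minimal, self-contained sketch of that pattern, using hypothetical stand-in types (`Store`, `ReplayException`) in place of Storm's HBaseClient and FailedException:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchRetrieveSketch {
    // Stand-in for org.apache.storm.topology.FailedException: throwing it
    // from a Trident state method tells Storm to replay the whole batch.
    static class ReplayException extends RuntimeException {
        ReplayException(Throwable cause) {
            super(cause);
        }
    }

    // Hypothetical backing store with a batch lookup, like HBase's batchGet.
    interface Store {
        List<String> batchGet(List<String> keys) throws Exception;
    }

    static List<String> batchRetrieve(Store store, List<String> keys) {
        try {
            // One round trip for the whole batch instead of one per tuple.
            List<String> results = store.batchGet(keys);
            List<String> out = new ArrayList<>();
            for (int i = 0; i < results.size(); i++) {
                // Results come back in request order, so index i pairs
                // the i-th result with the i-th input key.
                out.add(keys.get(i) + "=" + results.get(i));
            }
            return out;
        } catch (Exception e) {
            // Fail the batch as a unit; Trident will replay it.
            throw new ReplayException(e);
        }
    }
}
```

The index-based pairing only works because the batch call preserves request order; a store that returns results out of order would need a keyed lookup instead.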

Example 17 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class HdfsStateTest, method createMockTridentTuples.

private List<TridentTuple> createMockTridentTuples(int count) {
    TridentTuple tuple = mock(TridentTuple.class);
    when(tuple.getValueByField(any(String.class))).thenReturn("data");
    List<TridentTuple> tuples = new ArrayList<>();
    for (int i = 0; i < count; i++) {
        tuples.add(tuple);
    }
    return tuples;
}
Also used: ArrayList(java.util.ArrayList), TridentTuple(org.apache.storm.trident.tuple.TridentTuple)
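The test helper stubs one TridentTuple with Mockito and reuses the same instance for every slot in the list, which is fine for read-only test tuples. The same idea can be sketched without Mockito, using a hypothetical `Tuple` interface in place of TridentTuple:

```java
import java.util.ArrayList;
import java.util.List;

public class MockTuplesSketch {
    // Hypothetical stand-in for the one TridentTuple method the test needs.
    interface Tuple {
        Object getValueByField(String field);
    }

    static List<Tuple> createMockTuples(int count) {
        // One stub answering "data" for every field, shared by all slots;
        // safe because the tests never mutate the tuple.
        Tuple stub = field -> "data";
        List<Tuple> tuples = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            tuples.add(stub);
        }
        return tuples;
    }
}
```

Sharing one stub keeps the helper cheap for large counts, but a test that needs per-tuple values would have to create distinct stubs instead.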

Example 18 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class MongoState, method batchRetrieve.

public List<List<Values>> batchRetrieve(List<TridentTuple> tridentTuples) {
    List<List<Values>> batchRetrieveResult = Lists.newArrayList();
    try {
        for (TridentTuple tuple : tridentTuples) {
            Bson filter = options.queryCreator.createFilter(tuple);
            Document doc = mongoClient.find(filter);
            List<Values> values = options.lookupMapper.toTuple(tuple, doc);
            batchRetrieveResult.add(values);
        }
    } catch (Exception e) {
        LOG.warn("Batch get operation failed. Triggering replay.", e);
        throw new FailedException(e);
    }
    return batchRetrieveResult;
}
Also used: FailedException(org.apache.storm.topology.FailedException), Values(org.apache.storm.tuple.Values), List(java.util.List), Document(org.bson.Document), TridentTuple(org.apache.storm.trident.tuple.TridentTuple), Bson(org.bson.conversions.Bson)

Example 19 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class AbstractRedisStateQuerier, method batchRetrieve.

/**
 * {@inheritDoc}
 */
@Override
public List<List<Values>> batchRetrieve(T state, List<TridentTuple> inputs) {
    List<List<Values>> values = Lists.newArrayList();
    List<String> keys = Lists.newArrayList();
    for (TridentTuple input : inputs) {
        keys.add(lookupMapper.getKeyFromTuple(input));
    }
    List<String> redisVals = retrieveValuesFromRedis(state, keys);
    for (int i = 0; i < redisVals.size(); i++) {
        values.add(lookupMapper.toTuple(inputs.get(i), redisVals.get(i)));
    }
    return values;
}
Also used: List(java.util.List), TridentTuple(org.apache.storm.trident.tuple.TridentTuple)

Example 20 with TridentTuple

Use of org.apache.storm.trident.tuple.TridentTuple in project storm by apache.

The class AbstractRedisStateUpdater, method updateState.

/**
 * {@inheritDoc}
 */
@Override
public void updateState(T state, List<TridentTuple> inputs, TridentCollector collector) {
    Map<String, String> keyToValue = new HashMap<String, String>();
    for (TridentTuple input : inputs) {
        String key = storeMapper.getKeyFromTuple(input);
        String value = storeMapper.getValueFromTuple(input);
        keyToValue.put(key, value);
    }
    updateStatesToRedis(state, keyToValue);
}
Also used: HashMap(java.util.HashMap), TridentTuple(org.apache.storm.trident.tuple.TridentTuple)
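Because the updater collects the batch into a Map before writing, duplicate keys within one batch collapse to the last value seen (last-write-wins), and only one value per key reaches updateStatesToRedis. A small sketch of that consequence, with a hypothetical `collect` helper standing in for the loop above:

```java
import java.util.HashMap;
import java.util.Map;

public class UpdaterDedupSketch {
    // Mirrors the key/value collection loop in updateState: each pair is
    // a (key, value) extracted from one tuple, in batch order.
    static Map<String, String> collect(String[][] keyValuePairs) {
        Map<String, String> keyToValue = new HashMap<>();
        for (String[] kv : keyValuePairs) {
            // Map.put overwrites, so a later tuple with the same key
            // silently replaces an earlier one within the batch.
            keyToValue.put(kv[0], kv[1]);
        }
        return keyToValue;
    }
}
```

That dedup is usually the desired semantics for a key/value store update, and it also shrinks the Redis write to at most one operation per distinct key.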

Aggregations

TridentTuple (org.apache.storm.trident.tuple.TridentTuple): 46
ArrayList (java.util.ArrayList): 18
FailedException (org.apache.storm.topology.FailedException): 11
List (java.util.List): 10
Values (org.apache.storm.tuple.Values): 8
ISqlTridentDataSource (org.apache.storm.sql.runtime.ISqlTridentDataSource): 6
Test (org.junit.Test): 6
HashMap (java.util.HashMap): 5
TridentTopology (org.apache.storm.trident.TridentTopology): 5
Consumer (org.apache.storm.trident.operation.Consumer): 5
StateUpdater (org.apache.storm.trident.state.StateUpdater): 5
Stream (org.apache.storm.trident.Stream): 4
Fields (org.apache.storm.tuple.Fields): 4
Map (java.util.Map): 3
FixedBatchSpout (org.apache.storm.trident.testing.FixedBatchSpout): 3
BatchStatement (com.datastax.driver.core.BatchStatement): 2
Statement (com.datastax.driver.core.Statement): 2
IOException (java.io.IOException): 2
Future (java.util.concurrent.Future): 2
ProducerRecord (org.apache.kafka.clients.producer.ProducerRecord): 2