
Example 16 with FailedException

Use of org.apache.storm.topology.FailedException in project storm by apache.

The class SolrState, method updateState.

public void updateState(List<TridentTuple> tuples) {
    try {
        SolrRequest solrRequest = solrMapper.toSolrRequest(tuples);
        solrClient.request(solrRequest, solrMapper.getCollection());
        solrClient.commit(solrMapper.getCollection());
    } catch (Exception e) {
        final String msg = String.format("Failed to update Solr state for tuples: %s", tuples);
        logger.warn(msg, e);
        throw new FailedException(msg, e);
    }
}
Also used : FailedException(org.apache.storm.topology.FailedException) SolrRequest(org.apache.solr.client.solrj.SolrRequest)

Example 17 with FailedException

Use of org.apache.storm.topology.FailedException in project storm by apache.

The class MongoState, method updateState.

/**
 * Update Mongo state.
 * @param tuples trident tuples
 * @param collector trident collector
 */
public void updateState(List<TridentTuple> tuples, TridentCollector collector) {
    List<Document> documents = Lists.newArrayList();
    for (TridentTuple tuple : tuples) {
        Document document = options.mapper.toDocument(tuple);
        documents.add(document);
    }
    try {
        this.mongoClient.insert(documents, true);
    } catch (Exception e) {
        LOG.warn("Batch write failed but some requests might have succeeded. Triggering replay.", e);
        throw new FailedException(e);
    }
}
Also used : FailedException(org.apache.storm.topology.FailedException) Document(org.bson.Document) TridentTuple(org.apache.storm.trident.tuple.TridentTuple)

Example 18 with FailedException

Use of org.apache.storm.topology.FailedException in project storm by apache.

The class MongoMapState, method multiGet.

@Override
public List<T> multiGet(List<List<Object>> keysList) {
    List<T> retval = new ArrayList<>();
    try {
        for (List<Object> keys : keysList) {
            Bson filter = options.queryCreator.createFilterByKeys(keys);
            Document doc = mongoClient.find(filter);
            if (doc != null) {
                retval.add(this.serializer.deserialize((byte[]) doc.get(options.serDocumentField)));
            } else {
                retval.add(null);
            }
        }
    } catch (Exception e) {
        LOG.warn("Batch get operation failed.", e);
        throw new FailedException(e);
    }
    return retval;
}
Also used : FailedException(org.apache.storm.topology.FailedException) ArrayList(java.util.ArrayList) Document(org.bson.Document) Bson(org.bson.conversions.Bson)

Example 19 with FailedException

Use of org.apache.storm.topology.FailedException in project storm by apache.

The class HiveState, method commit.

@Override
public void commit(Long txId) {
    try {
        flushAllWriters();
        currentBatchSize = 0;
    } catch (HiveWriter.TxnFailure | InterruptedException | HiveWriter.CommitFailure | HiveWriter.TxnBatchFailure ex) {
        LOG.warn("Commit failed. Failing the batch.", ex);
        throw new FailedException(ex);
    }
}
Also used : FailedException(org.apache.storm.topology.FailedException)

Example 20 with FailedException

Use of org.apache.storm.topology.FailedException in project storm by apache.

The class JdbcState, method batchRetrieve.

public List<List<Values>> batchRetrieve(List<TridentTuple> tridentTuples) {
    List<List<Values>> batchRetrieveResult = Lists.newArrayList();
    try {
        for (TridentTuple tuple : tridentTuples) {
            List<Column> columns = options.jdbcLookupMapper.getColumns(tuple);
            List<List<Column>> rows = jdbcClient.select(options.selectQuery, columns);
            for (List<Column> row : rows) {
                List<Values> values = options.jdbcLookupMapper.toTuple(tuple, row);
                batchRetrieveResult.add(values);
            }
        }
    } catch (Exception e) {
        LOG.warn("Batch get operation failed. Triggering replay.", e);
        throw new FailedException(e);
    }
    return batchRetrieveResult;
}
Also used : Column(org.apache.storm.jdbc.common.Column) FailedException(org.apache.storm.topology.FailedException) Values(org.apache.storm.tuple.Values) ArrayList(java.util.ArrayList) List(java.util.List) TridentTuple(org.apache.storm.trident.tuple.TridentTuple)

Aggregations

FailedException (org.apache.storm.topology.FailedException): 27
TridentTuple (org.apache.storm.trident.tuple.TridentTuple): 11
ArrayList (java.util.ArrayList): 10
Values (org.apache.storm.tuple.Values): 8
List (java.util.List): 5
IOException (java.io.IOException): 4
Document (org.bson.Document): 4
Statement (com.datastax.driver.core.Statement): 3
Bson (org.bson.conversions.Bson): 3
BatchStatement (com.datastax.driver.core.BatchStatement): 2
InterruptedIOException (java.io.InterruptedIOException): 2
BigInteger (java.math.BigInteger): 2
Get (org.apache.hadoop.hbase.client.Get): 2
Result (org.apache.hadoop.hbase.client.Result): 2
ColumnList (org.apache.storm.hbase.common.ColumnList): 2
ReportedFailedException (org.apache.storm.topology.ReportedFailedException): 2
ResultSet (com.datastax.driver.core.ResultSet): 1
Row (com.datastax.driver.core.Row): 1
BufferedWriter (java.io.BufferedWriter): 1
OutputStreamWriter (java.io.OutputStreamWriter): 1
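
All of the examples above share one pattern: catch whatever the backing client throws, log a warning, and rethrow it wrapped in FailedException so that Trident fails and replays the whole batch instead of silently dropping it. The sketch below is a minimal, self-contained illustration of that pattern; it uses a local stand-in class for org.apache.storm.topology.FailedException so it compiles without Storm on the classpath, and the batch writer is a hypothetical simulation, not a real client.

```java
import java.util.Arrays;
import java.util.List;
import java.util.logging.Logger;

public class ReplayPatternSketch {

    // Stand-in for org.apache.storm.topology.FailedException; a real topology
    // would import Storm's class, which Trident interprets as "replay this batch".
    static class FailedException extends RuntimeException {
        FailedException(String msg, Throwable cause) {
            super(msg, cause);
        }
    }

    private static final Logger LOG = Logger.getLogger(ReplayPatternSketch.class.getName());

    // Simulated batch write: fails when the batch contains a null entry,
    // mimicking a client-side error from Solr, Mongo, Hive, or JDBC.
    static void writeBatch(List<String> tuples) {
        for (String t : tuples) {
            if (t == null) {
                throw new IllegalStateException("client write failed");
            }
        }
    }

    // The common updateState shape: any failure is wrapped so the batch is replayed.
    public static void updateState(List<String> tuples) {
        try {
            writeBatch(tuples);
        } catch (Exception e) {
            LOG.warning("Batch write failed, triggering replay: " + e.getMessage());
            throw new FailedException("failed batch of " + tuples.size() + " tuples", e);
        }
    }

    public static void main(String[] args) {
        updateState(Arrays.asList("a", "b"));
        try {
            updateState(Arrays.asList("a", null));
        } catch (FailedException expected) {
            System.out.println("replay requested: " + expected.getMessage());
        }
    }
}
```

Note that partial writes may have succeeded before the failure (the Mongo example's log message calls this out), so the state-update logic is expected to be idempotent under replay.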