
Example 1 with SplitFailedException

Use of com.mongodb.hadoop.splitter.SplitFailedException in project mongo-hadoop by mongodb.

From the class MongoInputFormat, method getSplits:

public InputSplit[] getSplits(final JobConf job, final int numSplits) throws IOException {
    try {
        MongoSplitter splitterImpl = MongoSplitterFactory.getSplitter(job);
        LOG.info("Using " + splitterImpl + " to calculate splits. (old mapreduce API)");
        final List<org.apache.hadoop.mapreduce.InputSplit> splits = splitterImpl.calculateSplits();
        return splits.toArray(new InputSplit[splits.size()]);
    } catch (SplitFailedException spfe) {
        // The old mapred API only declares IOException, so wrap the split failure.
        throw new IOException(spfe);
    }
}
Also used: MongoSplitter (com.mongodb.hadoop.splitter.MongoSplitter), IOException (java.io.IOException), MongoInputSplit (com.mongodb.hadoop.input.MongoInputSplit), InputSplit (org.apache.hadoop.mapred.InputSplit), SplitFailedException (com.mongodb.hadoop.splitter.SplitFailedException)
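The catch block above translates the splitter's checked SplitFailedException into the IOException that the mapred getSplits contract allows, preserving the original failure as the cause. The same pattern can be sketched in plain Java; the SplitFailedException and calculateSplits below are simplified stand-ins, not the real mongo-hadoop classes:

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class SplitTranslationSketch {

    // Simplified stand-in for com.mongodb.hadoop.splitter.SplitFailedException.
    static class SplitFailedException extends Exception {
        SplitFailedException(String message) {
            super(message);
        }
    }

    // Simplified stand-in for a MongoSplitter: may fail while calculating splits.
    static List<String> calculateSplits(boolean fail) throws SplitFailedException {
        if (fail) {
            throw new SplitFailedException("unable to calculate input splits");
        }
        return Arrays.asList("split-0", "split-1");
    }

    // Mirrors getSplits: the mapred API only declares IOException, so the
    // splitter's checked exception is wrapped rather than propagated directly.
    static String[] getSplits(boolean fail) throws IOException {
        try {
            List<String> splits = calculateSplits(fail);
            return splits.toArray(new String[0]);
        } catch (SplitFailedException spfe) {
            // Keep the original failure reachable via getCause().
            throw new IOException(spfe);
        }
    }
}
```

Callers see only IOException, but can still inspect getCause() to find the underlying SplitFailedException when diagnosing a failed job.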

Example 2 with SplitFailedException

Use of com.mongodb.hadoop.splitter.SplitFailedException in project mongo-hadoop by mongodb.

From the class HiveMongoInputFormat, method getSplits:

@Override
public FileSplit[] getSplits(final JobConf conf, final int numSplits) throws IOException {
    try {
        MongoSplitter splitterImpl = MongoSplitterFactory.getSplitter(conf);
        final List<org.apache.hadoop.mapreduce.InputSplit> splits = splitterImpl.calculateSplits();
        InputSplit[] splitIns = splits.toArray(new InputSplit[splits.size()]);
        // wrap InputSplits in FileSplits so that 'getPath' 
        // doesn't produce an error (Hive bug)
        FileSplit[] wrappers = new FileSplit[splitIns.length];
        Path path = new Path(conf.get(MongoStorageHandler.TABLE_LOCATION));
        for (int i = 0; i < wrappers.length; i++) {
            wrappers[i] = new MongoHiveInputSplit(splitIns[i], path);
        }
        return wrappers;
    } catch (SplitFailedException spfe) {
        // split failed because no namespace found 
        // (so the corresponding collection doesn't exist)
        LOG.error(spfe.getMessage(), spfe);
        throw new IOException(spfe.getMessage(), spfe);
    } catch (Exception e) {
        throw new IOException(e);
    }
}
Also used: Path (org.apache.hadoop.fs.Path), MongoSplitter (com.mongodb.hadoop.splitter.MongoSplitter), IOException (java.io.IOException), FileSplit (org.apache.hadoop.mapred.FileSplit), SplitFailedException (com.mongodb.hadoop.splitter.SplitFailedException), MongoInputSplit (com.mongodb.hadoop.input.MongoInputSplit), InputSplit (org.apache.hadoop.mapred.InputSplit)
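The loop above decorates each Mongo split with the table location so that Hive's calls to getPath do not fail. That wrapper idea can be sketched in plain Java; InputSplit, MongoSplit, and PathedSplit below are hypothetical stand-ins for the Hadoop/Hive types, not the real API:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SplitWrapperSketch {

    // Stand-in for org.apache.hadoop.mapred.InputSplit.
    interface InputSplit {
        long getLength();
    }

    // Stand-in for a Mongo-backed split, which has no filesystem path.
    static class MongoSplit implements InputSplit {
        private final long length;
        MongoSplit(long length) { this.length = length; }
        public long getLength() { return length; }
    }

    // Stand-in for MongoHiveInputSplit: delegates to the wrapped split
    // but also exposes a path so callers of getPath() do not break.
    static class PathedSplit implements InputSplit {
        private final InputSplit delegate;
        private final String path;
        PathedSplit(InputSplit delegate, String path) {
            this.delegate = delegate;
            this.path = path;
        }
        public long getLength() { return delegate.getLength(); }
        String getPath() { return path; }
    }

    // Mirrors the wrapping loop in getSplits: every split gets the table location.
    static List<PathedSplit> wrapAll(List<InputSplit> splits, String tableLocation) {
        List<PathedSplit> wrappers = new ArrayList<>();
        for (InputSplit split : splits) {
            wrappers.add(new PathedSplit(split, tableLocation));
        }
        return wrappers;
    }
}
```

The wrapper forwards everything else to the delegate, so downstream code that only needs the split's data is unaffected; only the path query is answered by the decorator.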

Aggregations

- MongoInputSplit (com.mongodb.hadoop.input.MongoInputSplit): 2 uses
- MongoSplitter (com.mongodb.hadoop.splitter.MongoSplitter): 2 uses
- SplitFailedException (com.mongodb.hadoop.splitter.SplitFailedException): 2 uses
- IOException (java.io.IOException): 2 uses
- InputSplit (org.apache.hadoop.mapred.InputSplit): 2 uses
- Path (org.apache.hadoop.fs.Path): 1 use
- FileSplit (org.apache.hadoop.mapred.FileSplit): 1 use