
Example 1 with HBaseAddElementsFromHdfsJobFactory

Use of uk.gov.gchq.gaffer.hbasestore.operation.hdfs.handler.job.factory.HBaseAddElementsFromHdfsJobFactory in project Gaffer by gchq.

From the class AddElementsFromHdfsHandler, the method fetchElements:

private void fetchElements(final AddElementsFromHdfs operation, final HBaseStore store) throws OperationException {
    try {
        /* Parse any Hadoop arguments passed on the command line and use these to configure the Tool */
        final Configuration configuration = new GenericOptionsParser(store.getConfiguration(), operation.getCommandLineArgs()).getConfiguration();
        final AddElementsFromHdfsTool fetchTool = new AddElementsFromHdfsTool(new HBaseAddElementsFromHdfsJobFactory(configuration), operation, store);
        LOGGER.info("Running FetchElementsFromHdfsTool job");
        ToolRunner.run(fetchTool, operation.getCommandLineArgs());
        LOGGER.info("Finished running FetchElementsFromHdfsTool job");
    } catch (final Exception e) {
        LOGGER.error("Failed to fetch elements from HDFS: {}", e.getMessage());
        throw new OperationException("Failed to fetch elements from HDFS", e);
    }
}
Also used :
Configuration (org.apache.hadoop.conf.Configuration)
AddElementsFromHdfsTool (uk.gov.gchq.gaffer.hdfs.operation.handler.job.tool.AddElementsFromHdfsTool)
HBaseAddElementsFromHdfsJobFactory (uk.gov.gchq.gaffer.hbasestore.operation.hdfs.handler.job.factory.HBaseAddElementsFromHdfsJobFactory)
IOException (java.io.IOException)
OperationException (uk.gov.gchq.gaffer.operation.OperationException)
GenericOptionsParser (org.apache.hadoop.util.GenericOptionsParser)
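The flow in fetchElements — parse generic command-line arguments into a configuration, wrap the job factory in a tool, then hand the tool to a runner — can be sketched with a stdlib-only analogue. The Tool interface, parseGenericOptions, and run below are simplified, hypothetical stand-ins for Hadoop's org.apache.hadoop.util.Tool, GenericOptionsParser, and ToolRunner, not the real API:

```java
import java.util.HashMap;
import java.util.Map;

public class ToolRunnerSketch {

    /** Simplified stand-in for Hadoop's Tool: run with args, return an exit code. */
    interface Tool {
        int run(String[] args) throws Exception;
    }

    /** Stand-in for GenericOptionsParser: collect -D key=value pairs into a config map. */
    static Map<String, String> parseGenericOptions(String[] args) {
        Map<String, String> conf = new HashMap<>();
        for (int i = 0; i < args.length - 1; i++) {
            if ("-D".equals(args[i])) {
                String[] kv = args[i + 1].split("=", 2);
                conf.put(kv[0], kv.length > 1 ? kv[1] : "");
            }
        }
        return conf;
    }

    /** Stand-in for ToolRunner.run: execute the configured tool. */
    static int run(Tool tool, String[] args) throws Exception {
        return tool.run(args);
    }

    public static void main(String[] args) throws Exception {
        // Parse the "generic" options first, as fetchElements does before building the tool.
        Map<String, String> conf = parseGenericOptions(new String[]{"-D", "mapreduce.job.reduces=4"});
        // The tool closes over the parsed configuration, like AddElementsFromHdfsTool does.
        Tool addTool = a -> conf.containsKey("mapreduce.job.reduces") ? 0 : 1;
        System.out.println("exit code: " + run(addTool, args)); // prints "exit code: 0"
    }
}
```

The point of the indirection is the same as in the real handler: argument parsing and job configuration happen once, up front, and the tool itself only sees the resulting configuration.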

Example 2 with HBaseAddElementsFromHdfsJobFactory

Use of uk.gov.gchq.gaffer.hbasestore.operation.hdfs.handler.job.factory.HBaseAddElementsFromHdfsJobFactory in project Gaffer by gchq.

From the class AddElementsFromHdfsHandler, the method checkHdfsDirectories:

private void checkHdfsDirectories(final AddElementsFromHdfs operation, final HBaseStore store) throws IOException {
    final AddElementsFromHdfsTool tool = new AddElementsFromHdfsTool(new HBaseAddElementsFromHdfsJobFactory(), operation, store);
    LOGGER.info("Checking that the correct HDFS directories exist");
    final FileSystem fs = FileSystem.get(tool.getConfig());
    final Path outputPath = new Path(operation.getOutputPath());
    LOGGER.info("Ensuring output directory {} doesn't exist", outputPath);
    if (fs.exists(outputPath)) {
        if (fs.listFiles(outputPath, true).hasNext()) {
            LOGGER.error("Output directory exists and is not empty: {}", outputPath);
            throw new IllegalArgumentException("Output directory exists and is not empty: " + outputPath);
        }
        LOGGER.info("Output directory exists and is empty so deleting: {}", outputPath);
        fs.delete(outputPath, true);
    }
}
Also used :
Path (org.apache.hadoop.fs.Path)
FileSystem (org.apache.hadoop.fs.FileSystem)
AddElementsFromHdfsTool (uk.gov.gchq.gaffer.hdfs.operation.handler.job.tool.AddElementsFromHdfsTool)
HBaseAddElementsFromHdfsJobFactory (uk.gov.gchq.gaffer.hbasestore.operation.hdfs.handler.job.factory.HBaseAddElementsFromHdfsJobFactory)
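The output-path rule in checkHdfsDirectories — fail if the directory exists and is non-empty, delete it if it exists and is empty — can be mirrored on a local filesystem with java.nio.file. This is a sketch of the logic only; the real method runs against HDFS through org.apache.hadoop.fs.FileSystem, and the class and method names here are illustrative:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class OutputDirCheck {

    /**
     * Mirror of checkHdfsDirectories' output-path rule: if the directory
     * exists and is non-empty, fail fast; if it exists and is empty, delete it
     * so the job can recreate it.
     */
    static void ensureOutputPathAvailable(Path outputPath) throws IOException {
        if (Files.exists(outputPath)) {
            try (Stream<Path> entries = Files.list(outputPath)) {
                if (entries.findAny().isPresent()) {
                    throw new IllegalArgumentException(
                            "Output directory exists and is not empty: " + outputPath);
                }
            }
            // Directory is empty, so a non-recursive delete is safe.
            Files.delete(outputPath);
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("gaffer-output");
        ensureOutputPathAvailable(dir); // empty, so it is removed
        System.out.println("deleted: " + !Files.exists(dir)); // prints "deleted: true"
    }
}
```

Failing fast on a non-empty output directory protects existing data; the HDFS version uses fs.delete(outputPath, true) only after confirming the directory holds no files.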

Aggregations

HBaseAddElementsFromHdfsJobFactory (uk.gov.gchq.gaffer.hbasestore.operation.hdfs.handler.job.factory.HBaseAddElementsFromHdfsJobFactory) : 2
AddElementsFromHdfsTool (uk.gov.gchq.gaffer.hdfs.operation.handler.job.tool.AddElementsFromHdfsTool) : 2
IOException (java.io.IOException) : 1
Configuration (org.apache.hadoop.conf.Configuration) : 1
FileSystem (org.apache.hadoop.fs.FileSystem) : 1
Path (org.apache.hadoop.fs.Path) : 1
GenericOptionsParser (org.apache.hadoop.util.GenericOptionsParser) : 1
OperationException (uk.gov.gchq.gaffer.operation.OperationException) : 1