Search in sources:

Example 21 with OperationException

Use of uk.gov.gchq.gaffer.operation.OperationException in project Gaffer by gchq.

Class OperationExample, method createExampleGraph:

protected Graph createExampleGraph() {
    final Graph graph = new Graph.Builder()
            .addSchemas(StreamUtil.openStreams(getClass(), "/example/operation/schema"))
            .storeProperties(StreamUtil.openStream(getClass(), "/example/operation/mockaccumulostore.properties"))
            .build();

    // Create the data generator
    final DataGenerator dataGenerator = new DataGenerator();

    // Load the data into memory
    final List<String> data = DataUtils.loadData(
            StreamUtil.openStream(getClass(), "/example/operation/data.txt", true));

    // Add the edges to the graph using an operation chain consisting of:
    //   generateElements - generating edges from the data (note these are directed edges)
    //   addElements - adding the edges to the graph
    final OperationChain addOpChain = new OperationChain.Builder()
            .first(new GenerateElements.Builder<String>()
                    .generator(dataGenerator)
                    .objects(data)
                    .build())
            .then(new AddElements())
            .build();

    try {
        graph.execute(addOpChain, new User());
    } catch (final OperationException e) {
        throw new RuntimeException(e);
    }
    return graph;
}
Also used: AddElements(uk.gov.gchq.gaffer.operation.impl.add.AddElements) Graph(uk.gov.gchq.gaffer.graph.Graph) User(uk.gov.gchq.gaffer.user.User) DataGenerator(uk.gov.gchq.gaffer.example.operation.generator.DataGenerator) OperationChain(uk.gov.gchq.gaffer.operation.OperationChain) OperationException(uk.gov.gchq.gaffer.operation.OperationException)
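
Because createExampleGraph wraps the checked OperationException in a RuntimeException, a failure to load the example data reaches callers as an unchecked exception. A minimal sketch of a hypothetical call site that recovers the original cause; only names already used in the example above are assumed:

// Hypothetical call site: the wrapped OperationException surfaces as a RuntimeException,
// so callers that want to report load failures catch the unchecked exception instead.
final Graph graph;
try {
    graph = createExampleGraph();
} catch (final RuntimeException e) {
    // e.getCause() is the OperationException originally thrown by graph.execute(...)
    throw new IllegalStateException("Could not load the example graph", e.getCause());
}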

Example 22 with OperationException

Use of uk.gov.gchq.gaffer.operation.OperationException in project Gaffer by gchq.

Class GetDataFrameOfElementsExample, method runExamples:

@Override
public void runExamples() {
    // Logging needs to be turned off and back on explicitly, as Spark produces some
    // logs even when the log level is set to off.
    ROOT_LOGGER.setLevel(Level.OFF);
    final SparkConf sparkConf = new SparkConf()
            .setMaster("local")
            .setAppName("getDataFrameOfElementsWithEntityGroup")
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .set("spark.kryo.registrator", "uk.gov.gchq.gaffer.spark.serialisation.kryo.Registrator")
            .set("spark.driver.allowMultipleContexts", "true");
    final SparkContext sc = new SparkContext(sparkConf);
    sc.setLogLevel("OFF");
    final SQLContext sqlc = new SQLContext(sc);
    final Graph graph = getGraph();
    try {
        getDataFrameOfElementsWithEntityGroup(sqlc, graph);
        getDataFrameOfElementsWithEdgeGroup(sqlc, graph);
    } catch (final OperationException e) {
        throw new RuntimeException(e);
    }
    sc.stop();
    ROOT_LOGGER.setLevel(Level.INFO);
}
Also used: SparkContext(org.apache.spark.SparkContext) Graph(uk.gov.gchq.gaffer.graph.Graph) SparkConf(org.apache.spark.SparkConf) SQLContext(org.apache.spark.sql.SQLContext) OperationException(uk.gov.gchq.gaffer.operation.OperationException)
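
ROOT_LOGGER is referenced here but not declared in the snippet. A minimal sketch of the assumed declaration, based on the Log4j 1.x Level class the example uses:

import org.apache.log4j.Level;
import org.apache.log4j.Logger;

// Assumed field on the example class: the root Log4j logger, switched to OFF around the
// Spark work and restored to INFO afterwards, as in runExamples() above.
private static final Logger ROOT_LOGGER = Logger.getRootLogger();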

Example 23 with OperationException

Use of uk.gov.gchq.gaffer.operation.OperationException in project Gaffer by gchq.

Class GetJavaRDDOfAllElementsExample, method runExamples:

@Override
public void runExamples() {
    // Logging needs to be turned off and back on explicitly, as Spark produces some
    // logs even when the log level is set to off.
    ROOT_LOGGER.setLevel(Level.OFF);
    final SparkConf sparkConf = new SparkConf()
            .setMaster("local")
            .setAppName("GetJavaRDDOfAllElementsExample")
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .set("spark.kryo.registrator", "uk.gov.gchq.gaffer.spark.serialisation.kryo.Registrator")
            .set("spark.driver.allowMultipleContexts", "true");
    final JavaSparkContext sc = new JavaSparkContext(sparkConf);
    sc.setLogLevel("OFF");
    final Graph graph = getGraph();
    try {
        getJavaRddOfAllElements(sc, graph);
        getJavaRddOfAllElementsReturningEdgesOnly(sc, graph);
    } catch (final OperationException e) {
        throw new RuntimeException(e);
    }
    sc.stop();
    ROOT_LOGGER.setLevel(Level.INFO);
}
Also used: Graph(uk.gov.gchq.gaffer.graph.Graph) JavaSparkContext(org.apache.spark.api.java.JavaSparkContext) SparkConf(org.apache.spark.SparkConf) OperationException(uk.gov.gchq.gaffer.operation.OperationException)
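
Note that if either getJavaRddOfAllElements call throws, the wrapped exception propagates before sc.stop() runs, so the local Spark context stays open and the root logger stays off. A hedged sketch of the same handling with the cleanup moved into a finally block; all names are as in the example above:

try {
    getJavaRddOfAllElements(sc, graph);
    getJavaRddOfAllElementsReturningEdgesOnly(sc, graph);
} catch (final OperationException e) {
    throw new RuntimeException(e);
} finally {
    // Runs whether or not an operation fails: stop the context and restore logging.
    sc.stop();
    ROOT_LOGGER.setLevel(Level.INFO);
}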

Example 24 with OperationException

Use of uk.gov.gchq.gaffer.operation.OperationException in project Gaffer by gchq.

Class AccumuloKeyRangePartitioner, method getSplits:

public static synchronized String[] getSplits(final AccumuloStore store) throws OperationException {
    final Connector connector;
    try {
        connector = store.getConnection();
    } catch (StoreException e) {
        throw new OperationException("Failed to create accumulo connection", e);
    }
    final String table = store.getProperties().getTable();
    try {
        final Collection<Text> splits = connector.tableOperations().listSplits(table);
        final String[] arr = new String[splits.size()];
        return splits.parallelStream()
                .map(text -> text.toString())
                .collect(Collectors.toList())
                .toArray(arr);
    } catch (TableNotFoundException | AccumuloSecurityException | AccumuloException e) {
        throw new OperationException("Failed to get accumulo split points from table " + table, e);
    }
}
Also used: Connector(org.apache.accumulo.core.client.Connector) TableNotFoundException(org.apache.accumulo.core.client.TableNotFoundException) AccumuloException(org.apache.accumulo.core.client.AccumuloException) Text(org.apache.hadoop.io.Text) AccumuloSecurityException(org.apache.accumulo.core.client.AccumuloSecurityException) OperationException(uk.gov.gchq.gaffer.operation.OperationException) StoreException(uk.gov.gchq.gaffer.store.StoreException)
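
The intermediate List and pre-sized array in getSplits are not strictly needed: a Java 8 stream can collect straight into a String[]. A minimal equivalent sketch for the return statement in the second try block above, using only the same types:

// Drop-in replacement for the return statement above: map each Text split point to a
// String and collect directly into an array.
return splits.parallelStream()
        .map(Text::toString)
        .toArray(String[]::new);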

Example 25 with OperationException

Use of uk.gov.gchq.gaffer.operation.OperationException in project Gaffer by gchq.

Class AbstractImportKeyValuePairRDDToAccumuloHandler, method getConfiguration:

protected Configuration getConfiguration(final T operation) throws OperationException {
    final Configuration conf = new Configuration();
    final String serialisedConf = operation.getOption(AbstractGetRDDHandler.HADOOP_CONFIGURATION_KEY);
    if (serialisedConf != null) {
        try {
            final ByteArrayInputStream bais = new ByteArrayInputStream(serialisedConf.getBytes(CommonConstants.UTF_8));
            conf.readFields(new DataInputStream(bais));
        } catch (final IOException e) {
            throw new OperationException("Exception decoding Configuration from options", e);
        }
    }
    return conf;
}
Also used: Configuration(org.apache.hadoop.conf.Configuration) ByteArrayInputStream(java.io.ByteArrayInputStream) IOException(java.io.IOException) DataInputStream(java.io.DataInputStream) OperationException(uk.gov.gchq.gaffer.operation.OperationException)
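
getConfiguration reverses a serialisation step: the operation option is expected to hold a Hadoop Configuration written out as UTF-8 text. A minimal sketch of the matching encode step; the helper name is hypothetical, and only the Hadoop Writable methods plus the CommonConstants.UTF_8 constant used above are assumed:

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;

// Hypothetical helper: serialise a Configuration into the string form that
// getConfiguration(...) above reads back with readFields(...).
// CommonConstants.UTF_8 is the same charset name constant used in the example.
public static String serialiseConfiguration(final Configuration conf) throws IOException {
    final ByteArrayOutputStream baos = new ByteArrayOutputStream();
    conf.write(new DataOutputStream(baos));
    return new String(baos.toByteArray(), CommonConstants.UTF_8);
}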

Aggregations

OperationException (uk.gov.gchq.gaffer.operation.OperationException) 38
User (uk.gov.gchq.gaffer.user.User) 9
StoreException (uk.gov.gchq.gaffer.store.StoreException) 7
Element (uk.gov.gchq.gaffer.data.element.Element) 6
IOException (java.io.IOException) 4
ArrayList (java.util.ArrayList) 4
Configuration (org.apache.hadoop.conf.Configuration) 4
FileSystem (org.apache.hadoop.fs.FileSystem) 4
Path (org.apache.hadoop.fs.Path) 4
Test (org.junit.Test) 4
Graph (uk.gov.gchq.gaffer.graph.Graph) 4
SparkConf (org.apache.spark.SparkConf) 3
IteratorSettingException (uk.gov.gchq.gaffer.accumulostore.key.exception.IteratorSettingException) 3
Edge (uk.gov.gchq.gaffer.data.element.Edge) 3
JobDetail (uk.gov.gchq.gaffer.jobtracker.JobDetail) 3
AddElements (uk.gov.gchq.gaffer.operation.impl.add.AddElements) 3
BufferedWriter (java.io.BufferedWriter) 2
ByteArrayInputStream (java.io.ByteArrayInputStream) 2
DataInputStream (java.io.DataInputStream) 2
OutputStreamWriter (java.io.OutputStreamWriter) 2