
Example 1 with MockAccumuloStore

Use of uk.gov.gchq.gaffer.accumulostore.MockAccumuloStore in project Gaffer by gchq.

From the class AddElementsFromHdfsIT, method createGraph:

private Graph createGraph(final Class<? extends AccumuloKeyPackage> keyPackageClass) throws StoreException {
    final Schema schema = Schema.fromJson(StreamUtil.schemas(getClass()));
    final AccumuloProperties properties = AccumuloProperties.loadStoreProperties(StreamUtil.storeProps(getClass()));
    properties.setKeyPackageClass(keyPackageClass.getName());
    properties.setInstance("instance_" + keyPackageClass.getName());
    final AccumuloStore store = new MockAccumuloStore();
    store.initialise(schema, properties);
    store.updateConfiguration(createLocalConf(), new View(), new User());
    return new Graph.Builder().store(store).build();
}
Also used: User (uk.gov.gchq.gaffer.user.User), AccumuloProperties (uk.gov.gchq.gaffer.accumulostore.AccumuloProperties), MockAccumuloStore (uk.gov.gchq.gaffer.accumulostore.MockAccumuloStore), Schema (uk.gov.gchq.gaffer.store.schema.Schema), AccumuloStore (uk.gov.gchq.gaffer.accumulostore.AccumuloStore), View (uk.gov.gchq.gaffer.data.elementdefinition.view.View)
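
Taken together, createGraph boils down to a small, self-contained pattern: load a Schema and AccumuloProperties from the test classpath, point the properties at a key package, initialise a MockAccumuloStore and wrap it in a Graph. Below is a minimal sketch of that pattern as a standalone class; the hard-coded ByteEntityKeyPackage choice and the example class name are assumptions, only the calls visible in the snippet above are relied on.

import uk.gov.gchq.gaffer.accumulostore.AccumuloProperties;
import uk.gov.gchq.gaffer.accumulostore.MockAccumuloStore;
import uk.gov.gchq.gaffer.accumulostore.key.core.impl.byteEntity.ByteEntityKeyPackage;
import uk.gov.gchq.gaffer.commonutil.StreamUtil;
import uk.gov.gchq.gaffer.graph.Graph;
import uk.gov.gchq.gaffer.store.StoreException;
import uk.gov.gchq.gaffer.store.schema.Schema;

public class MockAccumuloStoreGraphExample {

    public Graph buildMockGraph() throws StoreException {
        // Schema and store properties are loaded from the test classpath,
        // exactly as in createGraph above.
        final Schema schema = Schema.fromJson(StreamUtil.schemas(getClass()));
        final AccumuloProperties properties =
                AccumuloProperties.loadStoreProperties(StreamUtil.storeProps(getClass()));

        // Pick a key package and give the mock instance a unique name so
        // parallel tests do not collide (assumption: byte-entity keys).
        properties.setKeyPackageClass(ByteEntityKeyPackage.class.getName());
        properties.setInstance("instance_" + ByteEntityKeyPackage.class.getName());

        // MockAccumuloStore keeps everything in memory, so no running
        // Accumulo instance is required for the test.
        final MockAccumuloStore store = new MockAccumuloStore();
        store.initialise(schema, properties);

        return new Graph.Builder().store(store).build();
    }
}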

Example 2 with MockAccumuloStore

Use of uk.gov.gchq.gaffer.accumulostore.MockAccumuloStore in project Gaffer by gchq.

From the class AccumuloRangeIDRetrieverTest, method setup:

@BeforeClass
public static void setup() throws StoreException, IOException {
    byteEntityStore = new MockAccumuloStore();
    gaffer1KeyStore = new MockAccumuloStore();
    byteEntityStore.initialise(schema, PROPERTIES);
    gaffer1KeyStore.initialise(schema, CLASSIC_PROPERTIES);
    defaultView = new View.Builder().edge(TestGroups.EDGE).entity(TestGroups.ENTITY).build();
    setupGraph(byteEntityStore, numEntries);
    setupGraph(gaffer1KeyStore, numEntries);
}
Also used: MockAccumuloStore (uk.gov.gchq.gaffer.accumulostore.MockAccumuloStore), BeforeClass (org.junit.BeforeClass)
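
The setup method refers to several static fields declared elsewhere in AccumuloRangeIDRetrieverTest (byteEntityStore, gaffer1KeyStore, schema, PROPERTIES, CLASSIC_PROPERTIES, numEntries, defaultView). A plausible shape for those declarations is sketched below; the classic-keys properties file name and the value of numEntries are assumptions, only the field types follow from the snippet.

// Assumed field declarations backing the setup() method above.
private static final int numEntries = 1000; // assumed size; any positive count would do
private static MockAccumuloStore byteEntityStore;
private static MockAccumuloStore gaffer1KeyStore;
private static View defaultView;

private static final Schema schema =
        Schema.fromJson(StreamUtil.schemas(AccumuloRangeIDRetrieverTest.class));
private static final AccumuloProperties PROPERTIES =
        AccumuloProperties.loadStoreProperties(StreamUtil.storeProps(AccumuloRangeIDRetrieverTest.class));
// The classic key package needs its own properties file; this file name is an assumption.
private static final AccumuloProperties CLASSIC_PROPERTIES =
        AccumuloProperties.loadStoreProperties(
                StreamUtil.openStream(AccumuloRangeIDRetrieverTest.class, "/accumuloStoreClassicKeys.properties"));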

Example 3 with MockAccumuloStore

Use of uk.gov.gchq.gaffer.accumulostore.MockAccumuloStore in project Gaffer by gchq.

From the class InputFormatTest, method shouldReturnCorrectDataToMapReduceJob:

private void shouldReturnCorrectDataToMapReduceJob(final Schema schema, final KeyPackage kp, final List<Element> data, final View view, final User user, final String instanceName, final Set<String> expectedResults) throws Exception {
    final AccumuloStore store = new MockAccumuloStore();
    final AccumuloProperties properties = AccumuloProperties.loadStoreProperties(StreamUtil.storeProps(getClass()));
    switch(kp) {
        case BYTE_ENTITY_KEY_PACKAGE:
            properties.setKeyPackageClass(ByteEntityKeyPackage.class.getName());
            properties.setInstance(instanceName + "_BYTE_ENTITY");
            break;
        case CLASSIC_KEY_PACKAGE:
            properties.setKeyPackageClass(ClassicKeyPackage.class.getName());
            properties.setInstance(instanceName + "_CLASSIC");
    }
    try {
        store.initialise(schema, properties);
    } catch (StoreException e) {
        fail("StoreException thrown: " + e);
    }
    setupGraph(store, data);
    // Set up local conf
    final JobConf conf = new JobConf();
    conf.set("fs.default.name", "file:///");
    conf.set("mapred.job.tracker", "local");
    final FileSystem fs = FileSystem.getLocal(conf);
    // Update configuration with instance, table name, etc.
    store.updateConfiguration(conf, view, user);
    // Run Driver
    final File outputFolder = testFolder.newFolder();
    FileUtils.deleteDirectory(outputFolder);
    final Driver driver = new Driver(outputFolder.getAbsolutePath());
    driver.setConf(conf);
    driver.run(new String[] {});
    // Read results and check correct
    final SequenceFile.Reader reader = new SequenceFile.Reader(fs, new Path(outputFolder + "/part-m-00000"), conf);
    final Text text = new Text();
    final Set<String> results = new HashSet<>();
    while (reader.next(text)) {
        results.add(text.toString());
    }
    reader.close();
    assertEquals(expectedResults, results);
    FileUtils.deleteDirectory(outputFolder);
}
Also used: Path (org.apache.hadoop.fs.Path), ClassicKeyPackage (uk.gov.gchq.gaffer.accumulostore.key.core.impl.classic.ClassicKeyPackage), MockAccumuloStore (uk.gov.gchq.gaffer.accumulostore.MockAccumuloStore), AccumuloProperties (uk.gov.gchq.gaffer.accumulostore.AccumuloProperties), Text (org.apache.hadoop.io.Text), ByteEntityKeyPackage (uk.gov.gchq.gaffer.accumulostore.key.core.impl.byteEntity.ByteEntityKeyPackage), StoreException (uk.gov.gchq.gaffer.store.StoreException), SequenceFile (org.apache.hadoop.io.SequenceFile), FileSystem (org.apache.hadoop.fs.FileSystem), AccumuloStore (uk.gov.gchq.gaffer.accumulostore.AccumuloStore), JobConf (org.apache.hadoop.mapred.JobConf), File (java.io.File), HashSet (java.util.HashSet)
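
The method switches on a KeyPackage value and delegates the MapReduce work to a Driver; both are test-local helpers in InputFormatTest rather than part of the store API. The enum implied by the switch statement is simply:

// Test-local enum selecting which Accumulo key package the store uses
// (shape inferred from the switch statement above).
enum KeyPackage {
    BYTE_ENTITY_KEY_PACKAGE,
    CLASSIC_KEY_PACKAGE
}

By the same reading, the Driver is a small Hadoop Tool that runs a map-only job over the store's configured input format and writes each element's string form into the output folder as a SequenceFile, which is what the reader loop and the final assertEquals verify.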

Aggregations

MockAccumuloStore (uk.gov.gchq.gaffer.accumulostore.MockAccumuloStore): 3 usages
AccumuloProperties (uk.gov.gchq.gaffer.accumulostore.AccumuloProperties): 2 usages
AccumuloStore (uk.gov.gchq.gaffer.accumulostore.AccumuloStore): 2 usages
File (java.io.File): 1 usage
HashSet (java.util.HashSet): 1 usage
FileSystem (org.apache.hadoop.fs.FileSystem): 1 usage
Path (org.apache.hadoop.fs.Path): 1 usage
SequenceFile (org.apache.hadoop.io.SequenceFile): 1 usage
Text (org.apache.hadoop.io.Text): 1 usage
JobConf (org.apache.hadoop.mapred.JobConf): 1 usage
BeforeClass (org.junit.BeforeClass): 1 usage
ByteEntityKeyPackage (uk.gov.gchq.gaffer.accumulostore.key.core.impl.byteEntity.ByteEntityKeyPackage): 1 usage
ClassicKeyPackage (uk.gov.gchq.gaffer.accumulostore.key.core.impl.classic.ClassicKeyPackage): 1 usage
View (uk.gov.gchq.gaffer.data.elementdefinition.view.View): 1 usage
StoreException (uk.gov.gchq.gaffer.store.StoreException): 1 usage
Schema (uk.gov.gchq.gaffer.store.schema.Schema): 1 usage
User (uk.gov.gchq.gaffer.user.User): 1 usage