
Example 6 with RFileWriter

Use of org.apache.accumulo.core.client.rfile.RFileWriter in project accumulo by apache.

The class CryptoTest, method testRFileEncrypted:

@Test
public void testRFileEncrypted() throws Exception {
    AccumuloConfiguration cryptoOnConf = getAccumuloConfig(ConfigMode.CRYPTO_ON);
    FileSystem fs = FileSystem.getLocal(hadoopConf);
    ArrayList<Key> keys = testData();
    SummarizerConfiguration sumConf = SummarizerConfiguration.builder(KeyCounter.class.getName()).build();
    String file = "target/testFile1.rf";
    fs.delete(new Path(file), true);
    // Write the test keys into an encrypted RFile, attaching the KeyCounter summarizer
    try (RFileWriter writer = RFile.newWriter().to(file).withFileSystem(fs).withTableProperties(cryptoOnConf).withSummarizers(sumConf).build()) {
        Value empty = new Value();
        writer.startDefaultLocalityGroup();
        for (Key key : keys) {
            writer.append(key, empty);
        }
    }
    // Read the file back with the same crypto configuration and verify all keys are returned
    Scanner iter = RFile.newScanner().from(file).withFileSystem(fs).withTableProperties(cryptoOnConf).build();
    ArrayList<Key> keysRead = new ArrayList<>();
    iter.forEach(e -> keysRead.add(e.getKey()));
    assertEquals(keys, keysRead);
    // Read the summary statistics stored in the file and check the KeyCounter results
    Collection<Summary> summaries = RFile.summaries().from(file).withFileSystem(fs).withTableProperties(cryptoOnConf).read();
    Summary summary = Iterables.getOnlyElement(summaries);
    assertEquals(keys.size(), (long) summary.getStatistics().get("keys"));
    assertEquals(1, summary.getStatistics().size());
    assertEquals(0, summary.getFileStatistics().getInaccurate());
    assertEquals(1, summary.getFileStatistics().getTotal());
}
Also used: Path (org.apache.hadoop.fs.Path), Scanner (org.apache.accumulo.core.client.Scanner), RFileWriter (org.apache.accumulo.core.client.rfile.RFileWriter), ArrayList (java.util.ArrayList), FileSystem (org.apache.hadoop.fs.FileSystem), Value (org.apache.accumulo.core.data.Value), Summary (org.apache.accumulo.core.client.summary.Summary), SummarizerConfiguration (org.apache.accumulo.core.client.summary.SummarizerConfiguration), Key (org.apache.accumulo.core.data.Key), AccumuloConfiguration (org.apache.accumulo.core.conf.AccumuloConfiguration), Test (org.junit.jupiter.api.Test)
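
The test above relies on helpers defined elsewhere in CryptoTest (getAccumuloConfig, testData, and the KeyCounter summarizer). For reference, here is a minimal, self-contained sketch of the same write/read round trip, without encryption or summaries; it is not from the Accumulo test suite, and the class name, file path, and row/family/qualifier values are made up for illustration.

import java.nio.charset.StandardCharsets;
import java.util.Map;
import org.apache.accumulo.core.client.Scanner;
import org.apache.accumulo.core.client.rfile.RFile;
import org.apache.accumulo.core.client.rfile.RFileWriter;
import org.apache.accumulo.core.data.Key;
import org.apache.accumulo.core.data.Value;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RFileRoundTripSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.getLocal(new Configuration());
        String file = "target/sketch.rf";
        fs.delete(new Path(file), true);
        // Write a few key/value pairs, in sorted key order, into a new RFile
        try (RFileWriter writer = RFile.newWriter().to(file).withFileSystem(fs).build()) {
            writer.startDefaultLocalityGroup();
            writer.append(new Key("row1", "fam", "qual"), new Value("v1".getBytes(StandardCharsets.UTF_8)));
            writer.append(new Key("row2", "fam", "qual"), new Value("v2".getBytes(StandardCharsets.UTF_8)));
        }
        // Scan the file back and print its contents
        try (Scanner scanner = RFile.newScanner().from(file).withFileSystem(fs).build()) {
            for (Map.Entry<Key, Value> entry : scanner) {
                System.out.println(entry.getKey() + " -> " + entry.getValue());
            }
        }
    }
}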

Example 7 with RFileWriter

Use of org.apache.accumulo.core.client.rfile.RFileWriter in project accumulo by apache.

The class BulkFailureIT, method createTestFile:

private String createTestFile(long txid, SortedMap<Key, Value> testData, FileSystem fs) throws IOException {
    Path base = new Path(getCluster().getTemporaryPath(), "testBulk_ICI_" + txid);
    fs.delete(base, true);
    fs.mkdirs(base);
    Path files = new Path(base, "files");
    // Write all of the test data into a single RFile inside the "files" directory
    try (RFileWriter writer = RFile.newWriter().to(new Path(files, "ici_01.rf").toString()).withFileSystem(fs).build()) {
        writer.append(testData.entrySet());
    }
    // Return the fully qualified path of the bulk-import directory
    return fs.makeQualified(files).toString();
}
Also used: Path (org.apache.hadoop.fs.Path), RFileWriter (org.apache.accumulo.core.client.rfile.RFileWriter)
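
createTestFile produces a qualified directory path that a bulk import can then consume. As a hedged usage sketch, not part of BulkFailureIT, the directory could be loaded into an existing table with the Accumulo 2.x bulk import API; the AccumuloClient, table name, and helper name below are assumptions for illustration.

import org.apache.accumulo.core.client.AccumuloClient;

// Hypothetical helper, not in the original test: bulk-load the directory returned by
// createTestFile into an existing table. "client" and "tableName" are assumed inputs.
private void loadBulkDir(AccumuloClient client, String tableName, String bulkDir) throws Exception {
    client.tableOperations().importDirectory(bulkDir).to(tableName).load();
}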

Aggregations

RFileWriter (org.apache.accumulo.core.client.rfile.RFileWriter): 7
Path (org.apache.hadoop.fs.Path): 7
AccumuloConfiguration (org.apache.accumulo.core.conf.AccumuloConfiguration): 6
Key (org.apache.accumulo.core.data.Key): 6
Value (org.apache.accumulo.core.data.Value): 6
Configuration (org.apache.hadoop.conf.Configuration): 4
ArrayList (java.util.ArrayList): 2
Scanner (org.apache.accumulo.core.client.Scanner): 2
SamplerConfiguration (org.apache.accumulo.core.client.sample.SamplerConfiguration): 2
FileSystem (org.apache.hadoop.fs.FileSystem): 2
RecordWriter (org.apache.hadoop.mapred.RecordWriter): 2
Reporter (org.apache.hadoop.mapred.Reporter): 2
RecordWriter (org.apache.hadoop.mapreduce.RecordWriter): 2
TaskAttemptContext (org.apache.hadoop.mapreduce.TaskAttemptContext): 2
Test (org.junit.jupiter.api.Test): 2
SummarizerConfiguration (org.apache.accumulo.core.client.summary.SummarizerConfiguration): 1
Summary (org.apache.accumulo.core.client.summary.Summary): 1