
Example 21 with FSDataInputStreamWrapper

Use of org.apache.hadoop.hbase.io.FSDataInputStreamWrapper in project hbase by apache.

From the class TestHFileEncryption, method testDataBlockEncryption:

@Test
public void testDataBlockEncryption() throws IOException {
    final int blocks = 10;
    final int[] blockSizes = new int[blocks];
    for (int i = 0; i < blocks; i++) {
        blockSizes[i] = (1024 + RNG.nextInt(1024 * 63)) / Bytes.SIZEOF_INT;
    }
    for (Compression.Algorithm compression : HBaseCommonTestingUtil.COMPRESSION_ALGORITHMS) {
        Path path = new Path(TEST_UTIL.getDataTestDir(), "block_v3_" + compression + "_AES");
        LOG.info("testDataBlockEncryption: encryption=AES compression=" + compression);
        long totalSize = 0;
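        // Carry both the compression codec and the AES encryption context in the HFile context.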
        HFileContext fileContext = new HFileContextBuilder()
                .withCompression(compression)
                .withEncryptionContext(cryptoContext)
                .build();
        FSDataOutputStream os = fs.create(path);
        try {
            for (int i = 0; i < blocks; i++) {
                totalSize += writeBlock(TEST_UTIL.getConfiguration(), os, fileContext, blockSizes[i]);
            }
        } finally {
            os.close();
        }
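        // Re-open the file and wrap the stream in FSDataInputStreamWrapper so the block reader can consume it.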
        FSDataInputStream is = fs.open(path);
        ReaderContext context = new ReaderContextBuilder()
                .withInputStreamWrapper(new FSDataInputStreamWrapper(is))
                .withFilePath(path)
                .withFileSystem(fs)
                .withFileSize(totalSize)
                .build();
        try {
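            // Construct a block reader over the wrapped stream, then read and verify each block in turn.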
            HFileBlock.FSReaderImpl hbr = new HFileBlock.FSReaderImpl(context, fileContext, ByteBuffAllocator.HEAP, TEST_UTIL.getConfiguration());
            long pos = 0;
            for (int i = 0; i < blocks; i++) {
                pos += readAndVerifyBlock(pos, fileContext, hbr, blockSizes[i]);
            }
        } finally {
            is.close();
        }
    }
}
Also used: Path (org.apache.hadoop.fs.Path), Compression (org.apache.hadoop.hbase.io.compress.Compression), FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream), FSDataInputStreamWrapper (org.apache.hadoop.hbase.io.FSDataInputStreamWrapper), FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream), Test (org.junit.Test)
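
For reference, the wrapper's role in this test is simply to adapt the open FSDataInputStream for the ReaderContextBuilder. The sketch below isolates that wiring on its own; the class and helper names are hypothetical, and the builder calls are only those already shown in the example above.

import java.io.IOException;

import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.io.FSDataInputStreamWrapper;
import org.apache.hadoop.hbase.io.hfile.ReaderContext;
import org.apache.hadoop.hbase.io.hfile.ReaderContextBuilder;

public class ReaderContextSketch {

    // Hypothetical helper: open the file, wrap the stream, and describe the file
    // to the block reader. The builder calls mirror testDataBlockEncryption above.
    static ReaderContext openBlockReaderContext(FileSystem fs, Path path, long fileSize) throws IOException {
        FSDataInputStream is = fs.open(path);
        return new ReaderContextBuilder()
                .withInputStreamWrapper(new FSDataInputStreamWrapper(is))
                .withFilePath(path)
                .withFileSystem(fs)
                .withFileSize(fileSize)
                .build();
    }
}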

Aggregations

FSDataInputStreamWrapper (org.apache.hadoop.hbase.io.FSDataInputStreamWrapper): 21 usages
Path (org.apache.hadoop.fs.Path): 16 usages
FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream): 9 usages
FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream): 8 usages
ArrayList (java.util.ArrayList): 7 usages
Compression (org.apache.hadoop.hbase.io.compress.Compression): 7 usages
ByteBuff (org.apache.hadoop.hbase.nio.ByteBuff): 7 usages
DataOutputStream (java.io.DataOutputStream): 6 usages
Configuration (org.apache.hadoop.conf.Configuration): 5 usages
SingleByteBuff (org.apache.hadoop.hbase.nio.SingleByteBuff): 5 usages
Test (org.junit.Test): 5 usages
Random (java.util.Random): 4 usages
HBaseConfiguration (org.apache.hadoop.hbase.HBaseConfiguration): 4 usages
Algorithm (org.apache.hadoop.hbase.io.compress.Compression.Algorithm): 4 usages
MultiByteBuff (org.apache.hadoop.hbase.nio.MultiByteBuff): 4 usages
IOException (java.io.IOException): 3 usages
FileSystem (org.apache.hadoop.fs.FileSystem): 3 usages
KeyValue (org.apache.hadoop.hbase.KeyValue): 3 usages
ByteArrayInputStream (java.io.ByteArrayInputStream): 2 usages
DataInputStream (java.io.DataInputStream): 2 usages