
Example 16 with Compression

Use of org.apache.hadoop.hbase.io.compress.Compression in project hbase by apache, taken from the class TestHFileEncryption, method testDataBlockEncryption.

@Test
public void testDataBlockEncryption() throws IOException {
    final int blocks = 10;
    final int[] blockSizes = new int[blocks];
    // Choose a random payload size (number of ints) for each of the 10 blocks.
    for (int i = 0; i < blocks; i++) {
        blockSizes[i] = (1024 + RNG.nextInt(1024 * 63)) / Bytes.SIZEOF_INT;
    }
    // Pair AES encryption with every compression algorithm under test.
    for (Compression.Algorithm compression : HBaseCommonTestingUtil.COMPRESSION_ALGORITHMS) {
        Path path = new Path(TEST_UTIL.getDataTestDir(), "block_v3_" + compression + "_AES");
        LOG.info("testDataBlockEncryption: encryption=AES compression=" + compression);
        long totalSize = 0;
        HFileContext fileContext = new HFileContextBuilder().withCompression(compression).withEncryptionContext(cryptoContext).build();
        // Write the encrypted (and possibly compressed) blocks to the test file.
        FSDataOutputStream os = fs.create(path);
        try {
            for (int i = 0; i < blocks; i++) {
                totalSize += writeBlock(TEST_UTIL.getConfiguration(), os, fileContext, blockSizes[i]);
            }
        } finally {
            os.close();
        }
        // Re-open the file and verify that every block reads back correctly through the block reader.
        FSDataInputStream is = fs.open(path);
        ReaderContext context = new ReaderContextBuilder().withInputStreamWrapper(new FSDataInputStreamWrapper(is)).withFilePath(path).withFileSystem(fs).withFileSize(totalSize).build();
        try {
            HFileBlock.FSReaderImpl hbr = new HFileBlock.FSReaderImpl(context, fileContext, ByteBuffAllocator.HEAP, TEST_UTIL.getConfiguration());
            long pos = 0;
            for (int i = 0; i < blocks; i++) {
                pos += readAndVerifyBlock(pos, fileContext, hbr, blockSizes[i]);
            }
        } finally {
            is.close();
        }
    }
}
Also used: Path (org.apache.hadoop.fs.Path), Compression (org.apache.hadoop.hbase.io.compress.Compression), FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream), FSDataInputStreamWrapper (org.apache.hadoop.hbase.io.FSDataInputStreamWrapper), FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream), Test (org.junit.Test)
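
The cryptoContext referenced above is created in the test's setup rather than shown here. As an illustrative sketch only (not the project's exact setup code; the variable names, the use of SecureRandom (java.security.SecureRandom) and SecretKeySpec (javax.crypto.spec.SecretKeySpec), and the hard-coded "AES" algorithm name are assumptions), an encryption context of this kind could be built with the org.apache.hadoop.hbase.io.crypto.Encryption and Cipher classes roughly as follows:

// Illustrative sketch: build an Encryption.Context carrying an AES cipher and a randomly
// generated data key, suitable for HFileContextBuilder.withEncryptionContext(...).
Configuration conf = TEST_UTIL.getConfiguration();
// Look up the AES cipher from the configured cipher provider.
Cipher aes = Encryption.getCipherProvider(conf).getCipher("AES");
// Generate random key material of the length the cipher expects.
byte[] keyBytes = new byte[aes.getKeyLength()];
new SecureRandom().nextBytes(keyBytes);
// Attach the cipher and key to a fresh encryption context.
Encryption.Context cryptoContext = Encryption.newContext(conf);
cryptoContext.setCipher(aes);
cryptoContext.setKey(new SecretKeySpec(keyBytes, "AES"));

The block writer only needs a context with a cipher and key set; wrapping and storing per-file keys through a key provider is a separate concern from writing and reading individual encrypted blocks as this test does.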

Aggregations

Compression (org.apache.hadoop.hbase.io.compress.Compression): 16
Path (org.apache.hadoop.fs.Path): 9
ArrayList (java.util.ArrayList): 7
Algorithm (org.apache.hadoop.hbase.io.compress.Compression.Algorithm): 7
Configuration (org.apache.hadoop.conf.Configuration): 6
FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream): 6
FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream): 6
FSDataInputStreamWrapper (org.apache.hadoop.hbase.io.FSDataInputStreamWrapper): 6
DataBlockEncoding (org.apache.hadoop.hbase.io.encoding.DataBlockEncoding): 5
HBaseConfiguration (org.apache.hadoop.hbase.HBaseConfiguration): 4
SingleByteBuff (org.apache.hadoop.hbase.nio.SingleByteBuff): 4
DataOutputStream (java.io.DataOutputStream): 3
IOException (java.io.IOException): 3
ColumnFamilyDescriptor (org.apache.hadoop.hbase.client.ColumnFamilyDescriptor): 3
ByteBuff (org.apache.hadoop.hbase.nio.ByteBuff): 3
MultiByteBuff (org.apache.hadoop.hbase.nio.MultiByteBuff): 3
DataInputStream (java.io.DataInputStream): 2
ByteBuffer (java.nio.ByteBuffer): 2
Random (java.util.Random): 2
Cell (org.apache.hadoop.hbase.Cell): 2