
Example 1 with CryptoFSDataInputStream

Use of org.apache.hadoop.fs.crypto.CryptoFSDataInputStream in the Apache Hadoop project.

From the class CryptoUtils, method wrapIfNecessary:

/**
   * Wraps a given FSDataInputStream with a CryptoInputStream. The size of the
   * data buffer required for the stream is specified by the
   * "mapreduce.job.encrypted-intermediate-data.buffer.kb" Job configuration
   * variable.
   * 
   * @param conf configuration
   * @param in given input stream
   * @return FSDataInputStream encrypted input stream if encryption is
   *         enabled; otherwise the given input stream itself
   * @throws IOException exception in case of error
   */
public static FSDataInputStream wrapIfNecessary(Configuration conf, FSDataInputStream in) throws IOException {
    if (isEncryptedSpillEnabled(conf)) {
        CryptoCodec cryptoCodec = CryptoCodec.getInstance(conf);
        int bufferSize = getBufferSize(conf);
        // The first 8 bytes are a length header that the matching output stream
        // always writes; it is not needed here, but must be consumed so the IV
        // that follows can be read from the correct offset.
        IOUtils.readFully(in, new byte[8], 0, 8);
        byte[] iv = new byte[cryptoCodec.getCipherSuite().getAlgorithmBlockSize()];
        IOUtils.readFully(in, iv, 0, cryptoCodec.getCipherSuite().getAlgorithmBlockSize());
        if (LOG.isDebugEnabled()) {
            LOG.debug("IV read from Stream [" + Base64.encodeBase64URLSafeString(iv) + "]");
        }
        return new CryptoFSDataInputStream(in, cryptoCodec, bufferSize, getEncryptionKey(), iv);
    } else {
        return in;
    }
}
Also used: CryptoFSDataInputStream (org.apache.hadoop.fs.crypto.CryptoFSDataInputStream), CryptoCodec (org.apache.hadoop.crypto.CryptoCodec)
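
A minimal usage sketch of the method above (illustrative only: the local spill path, the caller class, and the read loop are assumptions, and CryptoUtils is assumed to be org.apache.hadoop.mapreduce.CryptoUtils). The raw stream is passed through wrapIfNecessary, so the caller reads decrypted bytes when encrypted intermediate data is enabled and the unmodified stream otherwise:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.CryptoUtils;

public class SpillReaderSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical spill file, written through the matching wrapIfNecessary
        // on the output side (length header + IV + ciphertext).
        Path spill = new Path("file:///tmp/spill0.out");
        FileSystem fs = FileSystem.getLocal(conf);
        // Closing the wrapped stream also closes the underlying raw stream.
        try (FSDataInputStream in = CryptoUtils.wrapIfNecessary(conf, fs.open(spill))) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) > 0) {
                // ... consume decrypted (or plain) bytes ...
            }
        }
    }
}

Because wrapIfNecessary returns the original stream when encrypted spill is disabled, the caller needs no branching of its own; the same read loop works in both modes.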

Aggregations

CryptoCodec (org.apache.hadoop.crypto.CryptoCodec): 1 usage
CryptoFSDataInputStream (org.apache.hadoop.fs.crypto.CryptoFSDataInputStream): 1 usage