
Example 1 with MD5MD5CRC32CastagnoliFileChecksum

Use of org.apache.hadoop.fs.MD5MD5CRC32CastagnoliFileChecksum in the apache/hadoop project.

From the class JsonUtilClient, method toMD5MD5CRC32FileChecksum:

/** Convert a Json map to a MD5MD5CRC32FileChecksum. */
static MD5MD5CRC32FileChecksum toMD5MD5CRC32FileChecksum(final Map<?, ?> json) throws IOException {
    if (json == null) {
        return null;
    }
    final Map<?, ?> m = (Map<?, ?>) json.get(FileChecksum.class.getSimpleName());
    final String algorithm = (String) m.get("algorithm");
    final int length = ((Number) m.get("length")).intValue();
    final byte[] bytes = StringUtils.hexStringToByte((String) m.get("bytes"));
    final DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
    final DataChecksum.Type crcType = MD5MD5CRC32FileChecksum.getCrcTypeFromAlgorithmName(algorithm);
    final MD5MD5CRC32FileChecksum checksum;
    // Recreate what DFSClient would have returned.
    switch(crcType) {
        case CRC32:
            checksum = new MD5MD5CRC32GzipFileChecksum();
            break;
        case CRC32C:
            checksum = new MD5MD5CRC32CastagnoliFileChecksum();
            break;
        default:
            throw new IOException("Unknown algorithm: " + algorithm);
    }
    checksum.readFields(in);
    //check algorithm name
    if (!checksum.getAlgorithmName().equals(algorithm)) {
        throw new IOException("Algorithm not matched. Expected " + algorithm + ", Received " + checksum.getAlgorithmName());
    }
    //check length
    if (length != checksum.getLength()) {
        throw new IOException("Length not matched: length=" + length + ", checksum.getLength()=" + checksum.getLength());
    }
    return checksum;
}
Also used:
ByteArrayInputStream (java.io.ByteArrayInputStream)
DataInputStream (java.io.DataInputStream)
IOException (java.io.IOException)
Map (java.util.Map)
MD5MD5CRC32CastagnoliFileChecksum (org.apache.hadoop.fs.MD5MD5CRC32CastagnoliFileChecksum)
MD5MD5CRC32FileChecksum (org.apache.hadoop.fs.MD5MD5CRC32FileChecksum)
MD5MD5CRC32GzipFileChecksum (org.apache.hadoop.fs.MD5MD5CRC32GzipFileChecksum)
DataChecksum (org.apache.hadoop.util.DataChecksum)
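The key step above is dispatching on the checksum algorithm name: Hadoop encodes the CRC variant as a suffix of names like "MD5-of-262144MD5-of-512CRC32C". A minimal, self-contained sketch of that dispatch (a simplified re-implementation, not the actual MD5MD5CRC32FileChecksum.getCrcTypeFromAlgorithmName; it throws IllegalArgumentException instead of IOException for brevity):

```java
public class CrcTypeDispatch {

    // Stand-in for org.apache.hadoop.util.DataChecksum.Type
    enum Type { CRC32, CRC32C }

    // Pick the CRC variant from an algorithm name such as
    // "MD5-of-262144MD5-of-512CRC32C". CRC32C must be tested
    // first because "CRC32" is a suffix of "CRC32C".
    static Type crcTypeFromAlgorithmName(String algorithm) {
        if (algorithm.endsWith("CRC32C")) {
            return Type.CRC32C;
        } else if (algorithm.endsWith("CRC32")) {
            return Type.CRC32;
        }
        throw new IllegalArgumentException("Unknown algorithm: " + algorithm);
    }

    public static void main(String[] args) {
        System.out.println(crcTypeFromAlgorithmName("MD5-of-262144MD5-of-512CRC32C"));
        System.out.println(crcTypeFromAlgorithmName("MD5-of-262144MD5-of-512CRC32"));
    }
}
```

Note the ordering of the suffix checks: testing "CRC32" first would misclassify every Castagnoli name, which is why the real code dispatches on a parsed enum rather than raw string matching.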

Example 2 with MD5MD5CRC32CastagnoliFileChecksum

Use of org.apache.hadoop.fs.MD5MD5CRC32CastagnoliFileChecksum in the apache/hadoop project.

From the class MD5MD5CRC32FileChecksum, method valueOf:

/** Return the object represented in the attributes. */
public static MD5MD5CRC32FileChecksum valueOf(Attributes attrs) throws SAXException {
    final String bytesPerCRC = attrs.getValue("bytesPerCRC");
    final String crcPerBlock = attrs.getValue("crcPerBlock");
    final String md5 = attrs.getValue("md5");
    String crcType = attrs.getValue("crcType");
    DataChecksum.Type finalCrcType;
    if (bytesPerCRC == null || crcPerBlock == null || md5 == null) {
        return null;
    }
    try {
        // old versions don't support crcType.
        if (crcType == null || crcType.equals("")) {
            finalCrcType = DataChecksum.Type.CRC32;
        } else {
            finalCrcType = DataChecksum.Type.valueOf(crcType);
        }
        switch(finalCrcType) {
            case CRC32:
                return new MD5MD5CRC32GzipFileChecksum(Integer.parseInt(bytesPerCRC), Integer.parseInt(crcPerBlock), new MD5Hash(md5));
            case CRC32C:
                return new MD5MD5CRC32CastagnoliFileChecksum(Integer.parseInt(bytesPerCRC), Integer.parseInt(crcPerBlock), new MD5Hash(md5));
            default:
                // Should never reach here: finalCrcType holds a valid type,
                // or valueOf above already threw an exception.
                return null;
        }
    } catch (Exception e) {
        throw new SAXException("Invalid attributes: bytesPerCRC=" + bytesPerCRC + ", crcPerBlock=" + crcPerBlock + ", crcType=" + crcType + ", md5=" + md5, e);
    }
}
Also used:
IOException (java.io.IOException)
MD5MD5CRC32CastagnoliFileChecksum (org.apache.hadoop.fs.MD5MD5CRC32CastagnoliFileChecksum)
MD5MD5CRC32GzipFileChecksum (org.apache.hadoop.fs.MD5MD5CRC32GzipFileChecksum)
MD5Hash (org.apache.hadoop.io.MD5Hash)
DataChecksum (org.apache.hadoop.util.DataChecksum)
SAXException (org.xml.sax.SAXException)
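The backward-compatibility branch above is easy to miss: old HDFS servers never sent a crcType attribute, so a missing or empty value must default to CRC32 (the Gzip polynomial). A minimal sketch of just that fallback logic (simplified stand-ins for DataChecksum.Type and the SAX attribute lookup, which are assumptions here):

```java
public class ChecksumAttrs {

    // Stand-in for org.apache.hadoop.util.DataChecksum.Type
    enum Type { CRC32, CRC32C }

    // Resolve the crcType attribute from an XML response.
    // null or "" means the server predates crcType support,
    // so we fall back to plain CRC32, matching the real valueOf.
    static Type resolveCrcType(String crcType) {
        if (crcType == null || crcType.isEmpty()) {
            return Type.CRC32;
        }
        return Type.valueOf(crcType);
    }

    public static void main(String[] args) {
        System.out.println(resolveCrcType(null));     // old server, no attribute
        System.out.println(resolveCrcType("CRC32C")); // newer server
    }
}
```

Enum.valueOf throws IllegalArgumentException for an unrecognized name, which is what the surrounding try/catch in the real valueOf converts into a SAXException with the full attribute context.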

Aggregations

IOException (java.io.IOException) 2
MD5MD5CRC32CastagnoliFileChecksum (org.apache.hadoop.fs.MD5MD5CRC32CastagnoliFileChecksum) 2
MD5MD5CRC32GzipFileChecksum (org.apache.hadoop.fs.MD5MD5CRC32GzipFileChecksum) 2
DataChecksum (org.apache.hadoop.util.DataChecksum) 2
ByteArrayInputStream (java.io.ByteArrayInputStream) 1
DataInputStream (java.io.DataInputStream) 1
Map (java.util.Map) 1
MD5MD5CRC32FileChecksum (org.apache.hadoop.fs.MD5MD5CRC32FileChecksum) 1
MD5Hash (org.apache.hadoop.io.MD5Hash) 1
SAXException (org.xml.sax.SAXException) 1