Example 11 with FileChecksum

Use of org.apache.hadoop.fs.FileChecksum in project hadoop by apache, in the class HttpFSFileSystem, method getFileChecksum.

@Override
public FileChecksum getFileChecksum(Path f) throws IOException {
    Map<String, String> params = new HashMap<String, String>();
    params.put(OP_PARAM, Operation.GETFILECHECKSUM.toString());
    HttpURLConnection conn = getConnection(Operation.GETFILECHECKSUM.getMethod(), params, f, true);
    HttpExceptionUtils.validateResponse(conn, HttpURLConnection.HTTP_OK);
    final JSONObject json = (JSONObject) ((JSONObject) HttpFSUtils.jsonParse(conn)).get(FILE_CHECKSUM_JSON);
    // Wrap the parsed JSON fields in a read-only FileChecksum view.
    return new FileChecksum() {

        @Override
        public String getAlgorithmName() {
            return (String) json.get(CHECKSUM_ALGORITHM_JSON);
        }

        @Override
        public int getLength() {
            return ((Long) json.get(CHECKSUM_LENGTH_JSON)).intValue();
        }

        @Override
        public byte[] getBytes() {
            return StringUtils.hexStringToByte((String) json.get(CHECKSUM_BYTES_JSON));
        }

        @Override
        public void write(DataOutput out) throws IOException {
            throw new UnsupportedOperationException();
        }

        @Override
        public void readFields(DataInput in) throws IOException {
            throw new UnsupportedOperationException();
        }
    };
}
Also used: DataInput (java.io.DataInput), DataOutput (java.io.DataOutput), HttpURLConnection (java.net.HttpURLConnection), JSONObject (org.json.simple.JSONObject), HashMap (java.util.HashMap), FileChecksum (org.apache.hadoop.fs.FileChecksum)
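
For context, here is a minimal client-side sketch of consuming such a FileChecksum through the FileSystem API; the endpoint URI and file path are hypothetical placeholders, and StringUtils is Hadoop's org.apache.hadoop.util.StringUtils (the same helper used above for hex conversion).

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileChecksum;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.util.StringUtils;

public class ChecksumPrinter {

    public static void main(String[] args) throws Exception {
        // Hypothetical HttpFS endpoint and path; substitute real values.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("webhdfs://httpfs-host:14000"), conf);
        FileChecksum checksum = fs.getFileChecksum(new Path("/user/example/data.txt"));
        if (checksum != null) {
            System.out.println("algorithm = " + checksum.getAlgorithmName());
            System.out.println("length    = " + checksum.getLength());
            System.out.println("bytes     = " + StringUtils.byteToHexString(checksum.getBytes()));
        }
        fs.close();
    }
}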

Example 12 with FileChecksum

Use of org.apache.hadoop.fs.FileChecksum in project hadoop by apache, in the class TestFileChecksum, method testStripedFileChecksumWithMissedDataBlocks1.

@Test(timeout = 90000)
public void testStripedFileChecksumWithMissedDataBlocks1() throws Exception {
    prepareTestFiles(fileSize, new String[] { stripedFile1 });
    FileChecksum stripedFileChecksum1 = getFileChecksum(stripedFile1, fileSize, false);
    FileChecksum stripedFileChecksumRecon = getFileChecksum(stripedFile1, fileSize, true);
    LOG.info("stripedFileChecksum1:" + stripedFileChecksum1);
    LOG.info("stripedFileChecksumRecon:" + stripedFileChecksumRecon);
    Assert.assertTrue("Checksum mismatches!", stripedFileChecksum1.equals(stripedFileChecksumRecon));
}
Also used: FileChecksum (org.apache.hadoop.fs.FileChecksum), Test (org.junit.Test)
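
The equals-based comparison above is the same kind of check tools like DistCp use to decide whether two files have identical contents. A hedged sketch, assuming both filesystems actually expose checksums (a null return means the filesystem cannot provide one); the paths are illustrative only.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileChecksum;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ChecksumCompare {

    // True only when both filesystems return a non-null checksum and the two are equal.
    static boolean sameContent(FileSystem srcFs, Path src, FileSystem dstFs, Path dst)
            throws Exception {
        FileChecksum srcSum = srcFs.getFileChecksum(src);
        FileChecksum dstSum = dstFs.getFileChecksum(dst);
        return srcSum != null && srcSum.equals(dstSum);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical paths, for illustration only.
        FileSystem fs = FileSystem.get(new Configuration());
        Path a = new Path("/tmp/a.txt");
        Path b = new Path("/tmp/b.txt");
        System.out.println("same content: " + sameContent(fs, a, fs, b));
    }
}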

Example 13 with FileChecksum

Use of org.apache.hadoop.fs.FileChecksum in project hadoop by apache, in the class TestFileChecksum, method testStripedFileChecksumWithMissedDataBlocksRangeQuery.

private void testStripedFileChecksumWithMissedDataBlocksRangeQuery(String stripedFile, int requestedLen) throws Exception {
    LOG.info("Checksum file:{}, requested length:{}", stripedFile, requestedLen);
    prepareTestFiles(fileSize, new String[] { stripedFile });
    FileChecksum stripedFileChecksum1 = getFileChecksum(stripedFile, requestedLen, false);
    FileChecksum stripedFileChecksumRecon = getFileChecksum(stripedFile, requestedLen, true);
    LOG.info("stripedFileChecksum1:" + stripedFileChecksum1);
    LOG.info("stripedFileChecksumRecon:" + stripedFileChecksumRecon);
    Assert.assertTrue("Checksum mismatches!", stripedFileChecksum1.equals(stripedFileChecksumRecon));
}
Also used: FileChecksum (org.apache.hadoop.fs.FileChecksum)
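
A hedged sketch of how such a private helper is typically driven from individual @Test methods; the test name and requested length here are illustrative and not necessarily the ones used in TestFileChecksum itself.

@Test(timeout = 90000)
public void testStripedFileChecksumRangeQueryHalfFile() throws Exception {
    // Hypothetical caller: query a checksum over roughly half the file.
    testStripedFileChecksumWithMissedDataBlocksRangeQuery(stripedFile1, fileSize / 2);
}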

Example 14 with FileChecksum

Use of org.apache.hadoop.fs.FileChecksum in project hadoop by apache, in the class TestFileChecksum, method getFileChecksum.

private FileChecksum getFileChecksum(String filePath, int range, boolean killDn) throws Exception {
    int dnIdxToDie = -1;
    if (killDn) {
        dnIdxToDie = getDataNodeToKill(filePath);
        DataNode dnToDie = cluster.getDataNodes().get(dnIdxToDie);
        shutdownDataNode(dnToDie);
    }
    Path testPath = new Path(filePath);
    FileChecksum fc;
    if (range >= 0) {
        fc = fs.getFileChecksum(testPath, range);
    } else {
        fc = fs.getFileChecksum(testPath);
    }
    if (dnIdxToDie != -1) {
        cluster.restartDataNode(dnIdxToDie);
    }
    return fc;
}
Also used: Path (org.apache.hadoop.fs.Path), DataNode (org.apache.hadoop.hdfs.server.datanode.DataNode), FileChecksum (org.apache.hadoop.fs.FileChecksum)
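
The range argument above maps to FileSystem#getFileChecksum(Path, long), which checksums only the first length bytes of the file; HDFS (DistributedFileSystem) implements this, while other filesystems may simply return null. A minimal sketch with a hypothetical path:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileChecksum;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RangedChecksum {

    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Hypothetical path; substitute a real HDFS file.
        Path p = new Path("/tmp/data.bin");

        // Checksum over the entire file.
        FileChecksum whole = fs.getFileChecksum(p);
        // Checksum over only the first 1 MB; for files longer than 1 MB this
        // generally differs from the full-file checksum.
        FileChecksum firstMb = fs.getFileChecksum(p, 1024L * 1024L);

        System.out.println("whole   = " + whole);
        System.out.println("firstMb = " + firstMb);
    }
}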

Example 15 with FileChecksum

Use of org.apache.hadoop.fs.FileChecksum in project hadoop by apache, in the class TestGetFileChecksum, method testGetFileChecksum.

public void testGetFileChecksum(final Path foo, final int appendLength) throws Exception {
    final int appendRounds = 16;
    FileChecksum[] fc = new FileChecksum[appendRounds + 1];
    DFSTestUtil.createFile(dfs, foo, appendLength, REPLICATION, 0L);
    fc[0] = dfs.getFileChecksum(foo);
    for (int i = 0; i < appendRounds; i++) {
        DFSTestUtil.appendFile(dfs, foo, appendLength);
        fc[i + 1] = dfs.getFileChecksum(foo);
    }
    for (int i = 0; i < appendRounds + 1; i++) {
        FileChecksum checksum = dfs.getFileChecksum(foo, appendLength * (i + 1));
        Assert.assertTrue(checksum.equals(fc[i]));
    }
}
Also used: FileChecksum (org.apache.hadoop.fs.FileChecksum)
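
The final loop asserts that a ranged checksum over the first (i + 1) * appendLength bytes of the fully appended file matches the full-file checksum captured when the file was exactly that long. A hedged sketch of a JUnit driver for this helper; the path and append length are illustrative only:

@Test
public void testChecksumStableAcrossAppends() throws Exception {
    // Hypothetical parameters; the real test derives its own path and append length.
    testGetFileChecksum(new Path("/test/checksum-append"), 1024);
}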

Aggregations

Types used together with FileChecksum in the examples above, with occurrence counts:

FileChecksum (org.apache.hadoop.fs.FileChecksum): 28
Path (org.apache.hadoop.fs.Path): 13
Test (org.junit.Test): 11
FileSystem (org.apache.hadoop.fs.FileSystem): 8
IOException (java.io.IOException): 6
ArrayList (java.util.ArrayList): 2
Configuration (org.apache.hadoop.conf.Configuration): 2
FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream): 2
FileStatus (org.apache.hadoop.fs.FileStatus): 2
DatanodeInfo (org.apache.hadoop.hdfs.protocol.DatanodeInfo): 2
LocatedBlock (org.apache.hadoop.hdfs.protocol.LocatedBlock): 2
DataInput (java.io.DataInput): 1
DataOutput (java.io.DataOutput): 1
FileNotFoundException (java.io.FileNotFoundException): 1
FileOutputStream (java.io.FileOutputStream): 1
OutputStream (java.io.OutputStream): 1
HttpURLConnection (java.net.HttpURLConnection): 1
SocketTimeoutException (java.net.SocketTimeoutException): 1
HashMap (java.util.HashMap): 1
TreeMap (java.util.TreeMap): 1