
Example 26 with StorageBlockReport

Use of org.apache.hadoop.hdfs.server.protocol.StorageBlockReport in project hadoop by apache, from the class TestLargeBlockReport, method createReports.

/**
   * Creates storage block reports, consisting of a single report with the
   * requested number of blocks.  The block data is fake, because the tests just
   * need to validate that the messages can pass correctly.  This intentionally
   * uses the old-style decoding method as a helper.  The test needs to cover
   * the new-style encoding technique.  Passing through that code path here
   * would trigger an exception before the test is ready to deal with it.
   *
   * @param numBlocks requested number of blocks
   * @return storage block reports
   */
private StorageBlockReport[] createReports(int numBlocks) {
    int longsPerBlock = 3;
    int blockListSize = 2 + numBlocks * longsPerBlock;
    List<Long> longs = new ArrayList<Long>(blockListSize);
    longs.add(Long.valueOf(numBlocks));
    longs.add(0L);
    for (int i = 0; i < blockListSize; ++i) {
        longs.add(Long.valueOf(i));
    }
    BlockListAsLongs blockList = BlockListAsLongs.decodeLongs(longs);
    StorageBlockReport[] reports = new StorageBlockReport[] { new StorageBlockReport(dnStorage, blockList) };
    return reports;
}
Also used: ArrayList (java.util.ArrayList), BlockListAsLongs (org.apache.hadoop.hdfs.protocol.BlockListAsLongs), StorageBlockReport (org.apache.hadoop.hdfs.server.protocol.StorageBlockReport)
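
For readers who want to see what decodeLongs() actually consumes, here is a minimal, hedged sketch that builds the same kind of list by hand. It assumes the legacy long layout of a two-long header (finalized block count, under-construction block count) followed by three longs per finalized block (block ID, length, generation stamp); the storage ID string and the class name are made-up placeholders, not values from the test.

import java.util.Arrays;
import java.util.List;

import org.apache.hadoop.hdfs.protocol.BlockListAsLongs;
import org.apache.hadoop.hdfs.protocol.BlockListAsLongs.BlockReportReplica;
import org.apache.hadoop.hdfs.server.protocol.DatanodeStorage;
import org.apache.hadoop.hdfs.server.protocol.StorageBlockReport;

public class BlockListLayoutSketch {
    public static void main(String[] args) {
        // Header: 2 finalized blocks, 0 under-construction blocks, then three
        // longs per finalized block (assumed order: blockId, numBytes, genStamp).
        List<Long> longs = Arrays.asList(
            2L, 0L,
            1001L, 1024L, 1L,
            1002L, 2048L, 1L);
        BlockListAsLongs blockList = BlockListAsLongs.decodeLongs(longs);
        for (BlockReportReplica replica : blockList) {
            System.out.println("block " + replica.getBlockId()
                + ", bytes " + replica.getNumBytes());
        }
        // Wrap the decoded list in a single-storage report, as createReports() does.
        DatanodeStorage storage = new DatanodeStorage("DS-placeholder-id");
        StorageBlockReport[] reports =
            new StorageBlockReport[] { new StorageBlockReport(storage, blockList) };
        System.out.println(reports.length + " report, "
            + blockList.getNumberOfBlocks() + " blocks");
    }
}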

Example 27 with StorageBlockReport

Use of org.apache.hadoop.hdfs.server.protocol.StorageBlockReport in project hadoop by apache, from the class TestLargeBlockReport, method testBlockReportExceedsLengthLimit.

@Test
public void testBlockReportExceedsLengthLimit() throws Exception {
    initCluster();
    // Create a large enough report that we expect it will go beyond the RPC
    // server's length validation, and also protobuf length validation.
    StorageBlockReport[] reports = createReports(6000000);
    try {
        nnProxy.blockReport(bpRegistration, bpId, reports, new BlockReportContext(1, 0, reportId, fullBrLeaseId, sorted));
        fail("Should have failed because of the too long RPC data length");
    } catch (Exception e) {
    // Expected.  We can't reliably assert anything about the exception type
    // or the message.  The NameNode just disconnects, and the details are
    // buried in the NameNode log.
    }
}
Also used: BlockReportContext (org.apache.hadoop.hdfs.server.protocol.BlockReportContext), StorageBlockReport (org.apache.hadoop.hdfs.server.protocol.StorageBlockReport), Test (org.junit.Test)
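
The five arguments to BlockReportContext are easy to misread, so here is a small annotated sketch. The field meanings in the comments are my reading of the parameter names and of how the test uses them, not something stated on this page, and the literal values are stand-ins for the test's reportId, fullBrLeaseId and sorted fields.

import org.apache.hadoop.hdfs.server.protocol.BlockReportContext;

public class BlockReportContextSketch {
    public static void main(String[] args) {
        long reportId = 1L;       // stand-in: identifier shared by all RPCs of one full report
        long fullBrLeaseId = 0L;  // stand-in: full block report lease ID (my reading: 0 means no lease)
        boolean sorted = true;    // stand-in: whether replicas in the report are sorted

        // totalRpcs = 1 and curRpc = 0: the whole report travels in a single RPC,
        // exactly as the tests above send it.
        BlockReportContext context =
            new BlockReportContext(1, 0, reportId, fullBrLeaseId, sorted);
        System.out.println("context created: " + context);
    }
}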

Example 28 with StorageBlockReport

Use of org.apache.hadoop.hdfs.server.protocol.StorageBlockReport in project hadoop by apache, from the class TestLargeBlockReport, method testBlockReportSucceedsWithLargerLengthLimit.

@Test
public void testBlockReportSucceedsWithLargerLengthLimit() throws Exception {
    // 128 MB
    conf.setInt(IPC_MAXIMUM_DATA_LENGTH, 128 * 1024 * 1024);
    initCluster();
    StorageBlockReport[] reports = createReports(6000000);
    nnProxy.blockReport(bpRegistration, bpId, reports, new BlockReportContext(1, 0, reportId, fullBrLeaseId, sorted));
}
Also used: BlockReportContext (org.apache.hadoop.hdfs.server.protocol.BlockReportContext), StorageBlockReport (org.apache.hadoop.hdfs.server.protocol.StorageBlockReport), Test (org.junit.Test)
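
The larger limit here is the ipc.maximum.data.length setting (IPC_MAXIMUM_DATA_LENGTH, 64 MB by default), which caps the size of a single RPC request the NameNode will accept. Below is a minimal sketch, not the test's actual initCluster(), of how that limit could be raised before starting a MiniDFSCluster; the class name and the try/finally shutdown are illustrative choices.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.CommonConfigurationKeys;
import org.apache.hadoop.hdfs.HdfsConfiguration;
import org.apache.hadoop.hdfs.MiniDFSCluster;

public class LargeReportClusterSketch {
    public static void main(String[] args) throws Exception {
        // Raise the RPC payload limit to 128 MB so a 6,000,000-block report
        // can fit in a single blockReport() call.
        Configuration conf = new HdfsConfiguration();
        conf.setInt(CommonConfigurationKeys.IPC_MAXIMUM_DATA_LENGTH,
            128 * 1024 * 1024);
        MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf)
            .numDataNodes(1)
            .build();
        try {
            cluster.waitActive();
            // ... build the oversized reports and send them to the NameNode,
            // as the test above does ...
        } finally {
            cluster.shutdown();
        }
    }
}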

Aggregations

StorageBlockReport (org.apache.hadoop.hdfs.server.protocol.StorageBlockReport) 28
Test (org.junit.Test) 21
DatanodeRegistration (org.apache.hadoop.hdfs.server.protocol.DatanodeRegistration) 15
Path (org.apache.hadoop.fs.Path) 11
BlockListAsLongs (org.apache.hadoop.hdfs.protocol.BlockListAsLongs) 8
BlockReportContext (org.apache.hadoop.hdfs.server.protocol.BlockReportContext) 8
Block (org.apache.hadoop.hdfs.protocol.Block) 7
LocatedBlock (org.apache.hadoop.hdfs.protocol.LocatedBlock) 7
ArrayList (java.util.ArrayList) 6
ExtendedBlock (org.apache.hadoop.hdfs.protocol.ExtendedBlock) 6
DatanodeProtocolClientSideTranslatorPB (org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB) 6
DatanodeStorage (org.apache.hadoop.hdfs.server.protocol.DatanodeStorage) 6
NameNode (org.apache.hadoop.hdfs.server.namenode.NameNode) 4
FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream) 3
BlockReportResponseProto (org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos.BlockReportResponseProto) 3
DataNode (org.apache.hadoop.hdfs.server.datanode.DataNode) 3
DatanodeCommand (org.apache.hadoop.hdfs.server.protocol.DatanodeCommand) 3
InvocationOnMock (org.mockito.invocation.InvocationOnMock) 3
ServiceException (com.google.protobuf.ServiceException) 2
IOException (java.io.IOException) 2