
Example 86 with DFSClient

Use of org.apache.hadoop.hdfs.DFSClient in project hadoop by apache.

Class TestFailoverWithBlockTokensEnabled, method ensureInvalidBlockTokensAreRejected.

@Test
public void ensureInvalidBlockTokensAreRejected() throws IOException, URISyntaxException {
    cluster.transitionToActive(0);
    FileSystem fs = HATestUtil.configureFailoverFs(cluster, conf);
    DFSTestUtil.writeFile(fs, TEST_PATH, TEST_DATA);
    assertEquals(TEST_DATA, DFSTestUtil.readFile(fs, TEST_PATH));
    DFSClient dfsClient = DFSClientAdapter.getDFSClient((DistributedFileSystem) fs);
    DFSClient spyDfsClient = Mockito.spy(dfsClient);
    Mockito.doAnswer(new Answer<LocatedBlocks>() {

        @Override
        public LocatedBlocks answer(InvocationOnMock arg0) throws Throwable {
            LocatedBlocks locatedBlocks = (LocatedBlocks) arg0.callRealMethod();
            for (LocatedBlock lb : locatedBlocks.getLocatedBlocks()) {
                Token<BlockTokenIdentifier> token = lb.getBlockToken();
                BlockTokenIdentifier id = lb.getBlockToken().decodeIdentifier();
                // This will make the token invalid, since the password
                // won't match anymore
                id.setExpiryDate(Time.now() + 10);
                Token<BlockTokenIdentifier> newToken = new Token<BlockTokenIdentifier>(
                        id.getBytes(), token.getPassword(), token.getKind(), token.getService());
                lb.setBlockToken(newToken);
            }
            return locatedBlocks;
        }
    }).when(spyDfsClient).getLocatedBlocks(Mockito.anyString(), Mockito.anyLong(), Mockito.anyLong());
    DFSClientAdapter.setDFSClient((DistributedFileSystem) fs, spyDfsClient);
    try {
        assertEquals(TEST_DATA, DFSTestUtil.readFile(fs, TEST_PATH));
        fail("Shouldn't have been able to read a file with invalid block tokens");
    } catch (IOException ioe) {
        GenericTestUtils.assertExceptionContains("Could not obtain block", ioe);
    }
}
Also used: DFSClient (org.apache.hadoop.hdfs.DFSClient), BlockTokenIdentifier (org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier), InvocationOnMock (org.mockito.invocation.InvocationOnMock), FileSystem (org.apache.hadoop.fs.FileSystem), DistributedFileSystem (org.apache.hadoop.hdfs.DistributedFileSystem), LocatedBlocks (org.apache.hadoop.hdfs.protocol.LocatedBlocks), LocatedBlock (org.apache.hadoop.hdfs.protocol.LocatedBlock), Token (org.apache.hadoop.security.token.Token), IOException (java.io.IOException), Test (org.junit.Test)
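
Every example on this page assumes an already-constructed DFSClient; here it is unwrapped from the DistributedFileSystem via the DFSClientAdapter test helper. Outside a test harness, a client can also be built directly against a NameNode. A minimal, hedged sketch with a placeholder address:

void directClientSketch() throws Exception {
    // Placeholder address: point this at a reachable NameNode.
    Configuration conf = new Configuration();
    try (DFSClient client = new DFSClient(new URI("hdfs://localhost:8020"), conf)) {
        // The same RPC the spy above intercepts: block locations for the whole file.
        LocatedBlocks blocks = client.getLocatedBlocks("/testfile", 0, Long.MAX_VALUE);
        System.out.println(blocks);
    }
}

Uses: DFSClient (org.apache.hadoop.hdfs.DFSClient), Configuration (org.apache.hadoop.conf.Configuration), URI (java.net.URI), LocatedBlocks (org.apache.hadoop.hdfs.protocol.LocatedBlocks)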

Example 87 with DFSClient

Use of org.apache.hadoop.hdfs.DFSClient in project hadoop by apache.

Class TestRetryCacheWithHA, method testCreateSymlink.

@Test(timeout = 60000)
public void testCreateSymlink() throws Exception {
    final DFSClient client = genClientWithDummyHandler();
    AtMostOnceOp op = new CreateSymlinkOp(client, "/testfile", "/testlink");
    testClientRetryWithFailover(op);
}
Also used: DFSClient (org.apache.hadoop.hdfs.DFSClient), Test (org.junit.Test)
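
genClientWithDummyHandler() and the AtMostOnceOp subclasses are helpers defined elsewhere in TestRetryCacheWithHA and are not reproduced on this page. As a rough sketch, a CreateSymlinkOp is built around a single DFSClient call of this shape (paths mirror the test's arguments; a connected client is assumed):

void createSymlinkSketch(DFSClient client) throws IOException {
    // Arguments: target, link, createParent.
    client.createSymlink("/testfile", "/testlink", true);
}

Uses: DFSClient (org.apache.hadoop.hdfs.DFSClient), IOException (java.io.IOException)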

Example 88 with DFSClient

Use of org.apache.hadoop.hdfs.DFSClient in project hadoop by apache.

Class TestRetryCacheWithHA, method testAppend.

@Test(timeout = 60000)
public void testAppend() throws Exception {
    final DFSClient client = genClientWithDummyHandler();
    AtMostOnceOp op = new AppendOp(client, "/testfile");
    testClientRetryWithFailover(op);
}
Also used: DFSClient (org.apache.hadoop.hdfs.DFSClient), Test (org.junit.Test)
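
AppendOp likewise wraps a single append on the client. The sketch below uses the FileSystem-level append, which delegates to the underlying DFSClient; it is an illustrative stand-in, not the test's actual helper:

void appendSketch(DistributedFileSystem fs) throws IOException {
    // Append a little extra data to the existing test file.
    try (FSDataOutputStream out = fs.append(new Path("/testfile"))) {
        out.writeBytes("more data");
    }
}

Uses: DistributedFileSystem (org.apache.hadoop.hdfs.DistributedFileSystem), FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream), Path (org.apache.hadoop.fs.Path), IOException (java.io.IOException)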

Example 89 with DFSClient

Use of org.apache.hadoop.hdfs.DFSClient in project hadoop by apache.

Class TestRetryCacheWithHA, method testCreateSnapshot.

@Test(timeout = 60000)
public void testCreateSnapshot() throws Exception {
    final DFSClient client = genClientWithDummyHandler();
    AtMostOnceOp op = new CreateSnapshotOp(client, "/test", "s1");
    testClientRetryWithFailover(op);
}
Also used: DFSClient (org.apache.hadoop.hdfs.DFSClient), Test (org.junit.Test)
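
A CreateSnapshotOp is expected to revolve around DFSClient.createSnapshot; a hedged sketch with the test's root and snapshot name (the root must already have snapshots allowed):

void createSnapshotSketch(DFSClient client) throws IOException {
    // Returns the path of the created snapshot, e.g. /test/.snapshot/s1.
    String snapshotPath = client.createSnapshot("/test", "s1");
    System.out.println("created " + snapshotPath);
}

Uses: DFSClient (org.apache.hadoop.hdfs.DFSClient), IOException (java.io.IOException)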

Example 90 with DFSClient

Use of org.apache.hadoop.hdfs.DFSClient in project hadoop by apache.

Class TestRetryCacheWithHA, method testDelete.

@Test(timeout = 60000)
public void testDelete() throws Exception {
    final DFSClient client = genClientWithDummyHandler();
    AtMostOnceOp op = new DeleteOp(client, "/testfile");
    testClientRetryWithFailover(op);
}
Also used: DFSClient (org.apache.hadoop.hdfs.DFSClient), Test (org.junit.Test)
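
DeleteOp boils down to a recursive delete of the test file; a minimal sketch:

void deleteSketch(DFSClient client) throws IOException {
    // Arguments: src, recursive; returns true if the path was deleted.
    boolean deleted = client.delete("/testfile", true);
    System.out.println("deleted: " + deleted);
}

Uses: DFSClient (org.apache.hadoop.hdfs.DFSClient), IOException (java.io.IOException)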

Aggregations

DFSClient (org.apache.hadoop.hdfs.DFSClient): 97
Test (org.junit.Test): 53
IOException (java.io.IOException): 35
Nfs3FileAttributes (org.apache.hadoop.nfs.nfs3.Nfs3FileAttributes): 27
FileHandle (org.apache.hadoop.nfs.nfs3.FileHandle): 26
VisibleForTesting (com.google.common.annotations.VisibleForTesting): 18
Path (org.apache.hadoop.fs.Path): 18
DistributedFileSystem (org.apache.hadoop.hdfs.DistributedFileSystem): 17
InetSocketAddress (java.net.InetSocketAddress): 13
MiniDFSCluster (org.apache.hadoop.hdfs.MiniDFSCluster): 13
Configuration (org.apache.hadoop.conf.Configuration): 12
NfsConfiguration (org.apache.hadoop.hdfs.nfs.conf.NfsConfiguration): 12
FileSystem (org.apache.hadoop.fs.FileSystem): 11
HdfsFileStatus (org.apache.hadoop.hdfs.protocol.HdfsFileStatus): 11
HdfsDataOutputStream (org.apache.hadoop.hdfs.client.HdfsDataOutputStream): 9
WccData (org.apache.hadoop.nfs.nfs3.response.WccData): 9
ShellBasedIdMapping (org.apache.hadoop.security.ShellBasedIdMapping): 8
ExtendedBlock (org.apache.hadoop.hdfs.protocol.ExtendedBlock): 7
LocatedBlock (org.apache.hadoop.hdfs.protocol.LocatedBlock): 7
ArrayList (java.util.ArrayList): 6