
Example 1 with WebHdfsFileSystem

use of org.apache.hadoop.hdfs.web.WebHdfsFileSystem in project hadoop by apache.

the class TestEncryptionZones method testRootDirEZTrash.

@Test
public void testRootDirEZTrash() throws Exception {
    final HdfsAdmin dfsAdmin = new HdfsAdmin(FileSystem.getDefaultUri(conf), conf);
    final String currentUser = UserGroupInformation.getCurrentUser().getShortUserName();
    final Path rootDir = new Path("/");
    dfsAdmin.createEncryptionZone(rootDir, TEST_KEY, NO_TRASH);
    final Path encFile = new Path("/encFile");
    final int len = 8192;
    DFSTestUtil.createFile(fs, encFile, len, (short) 1, 0xFEED);
    Configuration clientConf = new Configuration(conf);
    clientConf.setLong(FS_TRASH_INTERVAL_KEY, 1);
    FsShell shell = new FsShell(clientConf);
    verifyShellDeleteWithTrash(shell, encFile);
    // Trash path should be consistent
    // if root path is an encryption zone
    Path encFileCurrentTrash = shell.getCurrentTrashDir(encFile);
    Path rootDirCurrentTrash = shell.getCurrentTrashDir(rootDir);
    assertEquals("Root trash should be equal with ezFile trash", encFileCurrentTrash, rootDirCurrentTrash);
    // Use webHDFS client to test trash root path
    final WebHdfsFileSystem webFS = WebHdfsTestUtil.getWebHdfsFileSystem(conf, WebHdfsConstants.WEBHDFS_SCHEME);
    final Path expectedTrash = new Path(rootDir, new Path(FileSystem.TRASH_PREFIX, currentUser));
    Path webHDFSTrash = webFS.getTrashRoot(encFile);
    assertEquals(expectedTrash.toUri().getPath(), webHDFSTrash.toUri().getPath());
    assertEquals(encFileCurrentTrash.getParent().toUri().getPath(), webHDFSTrash.toUri().getPath());
}
Also used : Path(org.apache.hadoop.fs.Path) FsShell(org.apache.hadoop.fs.FsShell) Configuration(org.apache.hadoop.conf.Configuration) HdfsAdmin(org.apache.hadoop.hdfs.client.HdfsAdmin) Mockito.anyString(org.mockito.Mockito.anyString) WebHdfsFileSystem(org.apache.hadoop.hdfs.web.WebHdfsFileSystem) Test(org.junit.Test)
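
Outside the test harness there is no WebHdfsTestUtil; a plain client can reach the same getTrashRoot call through an ordinary webhdfs:// URI. The sketch below is illustrative only: the host "namenode", port 9870, and the method name printWebHdfsTrashRoot are placeholders, not part of the Hadoop code above.

static void printWebHdfsTrashRoot() throws Exception {
    Configuration conf = new Configuration();
    // Placeholder NameNode HTTP address; 9870 is the Hadoop 3 default.
    FileSystem webFs = FileSystem.get(URI.create("webhdfs://namenode:9870/"), conf);
    // getTrashRoot resolves the trash directory for the path's encryption zone,
    // which is what the test's assertions compare against the shell's trash dir.
    Path trashRoot = webFs.getTrashRoot(new Path("/encFile"));
    System.out.println("Trash root: " + trashRoot.toUri().getPath());
}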

Example 2 with WebHdfsFileSystem

use of org.apache.hadoop.hdfs.web.WebHdfsFileSystem in project hadoop by apache.

the class TestAuditLogs method testAuditWebHdfsStat.

/** test that stat via webhdfs puts proper entry in audit log */
@Test
public void testAuditWebHdfsStat() throws Exception {
    final Path file = new Path(fnames[0]);
    fs.setPermission(file, new FsPermission((short) 0644));
    fs.setOwner(file, "root", null);
    setupAuditLogs();
    WebHdfsFileSystem webfs = WebHdfsTestUtil.getWebHdfsFileSystemAs(userGroupInfo, conf, WebHdfsConstants.WEBHDFS_SCHEME);
    FileStatus st = webfs.getFileStatus(file);
    verifyAuditLogs(true);
    assertTrue("failed to stat file", st != null && st.isFile());
}
Also used : Path(org.apache.hadoop.fs.Path) FileStatus(org.apache.hadoop.fs.FileStatus) FsPermission(org.apache.hadoop.fs.permission.FsPermission) WebHdfsFileSystem(org.apache.hadoop.hdfs.web.WebHdfsFileSystem) Test(org.junit.Test)
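
For comparison, a non-test client can issue the same stat over WebHDFS and read back the fields the test set beforehand (permission 0644, owner "root"). This is a minimal sketch; the URI, the file path, and the method name statOverWebHdfs are placeholders.

static void statOverWebHdfs() throws Exception {
    Configuration conf = new Configuration();
    FileSystem webFs = FileSystem.get(URI.create("webhdfs://namenode:9870/"), conf);
    // Each WebHDFS GETFILESTATUS call like this one is what the audit test
    // expects to show up as an entry in the NameNode audit log.
    FileStatus st = webFs.getFileStatus(new Path("/some/file"));
    System.out.println(st.getPermission() + " " + st.getOwner() + " " + st.isFile());
}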

Example 3 with WebHdfsFileSystem

use of org.apache.hadoop.hdfs.web.WebHdfsFileSystem in project hadoop by apache.

the class TestAuditLogs method testAuditWebHdfsDenied.

/** test that denied access via webhdfs puts proper entry in audit log */
@Test
public void testAuditWebHdfsDenied() throws Exception {
    final Path file = new Path(fnames[0]);
    fs.setPermission(file, new FsPermission((short) 0600));
    fs.setOwner(file, "root", null);
    setupAuditLogs();
    try {
        WebHdfsFileSystem webfs = WebHdfsTestUtil.getWebHdfsFileSystemAs(userGroupInfo, conf, WebHdfsConstants.WEBHDFS_SCHEME);
        InputStream istream = webfs.open(file);
        int val = istream.read();
        fail("open+read must not succeed, got " + val);
    } catch (AccessControlException E) {
        System.out.println("got access denied, as expected.");
    }
    verifyAuditLogsRepeat(false, 2);
}
Also used : Path(org.apache.hadoop.fs.Path) InputStream(java.io.InputStream) AccessControlException(org.apache.hadoop.security.AccessControlException) FsPermission(org.apache.hadoop.fs.permission.FsPermission) WebHdfsFileSystem(org.apache.hadoop.hdfs.web.WebHdfsFileSystem) Test(org.junit.Test)
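
WebHdfsTestUtil.getWebHdfsFileSystemAs wraps the standard UserGroupInformation doAs pattern. A hedged sketch of that pattern outside the test is shown below; the user "alice", the URI, and the file path are placeholders.

static void readAsRemoteUser() throws Exception {
    final Configuration conf = new Configuration();
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("alice");
    // Obtain the WebHDFS client while running as the remote user.
    FileSystem webFs = ugi.doAs(new PrivilegedExceptionAction<FileSystem>() {
        @Override
        public FileSystem run() throws Exception {
            return FileSystem.get(URI.create("webhdfs://namenode:9870/"), conf);
        }
    });
    try (InputStream in = webFs.open(new Path("/some/file"))) {
        in.read();
    } catch (AccessControlException e) {
        // Denied access over WebHDFS is exactly what the audit test above
        // expects to see recorded in the audit log.
        System.out.println("access denied for alice, as expected");
    }
}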

Example 4 with WebHdfsFileSystem

use of org.apache.hadoop.hdfs.web.WebHdfsFileSystem in project hadoop by apache.

the class TestDelegationTokenFetcher method testReturnedTokenIsNull.

/**
   * If the returned token is null, saveDelegationToken should not
   * throw a NullPointerException.
   */
@Test
public void testReturnedTokenIsNull() throws Exception {
    WebHdfsFileSystem fs = mock(WebHdfsFileSystem.class);
    doReturn(null).when(fs).getDelegationToken(anyString());
    Path p = new Path(f.getRoot().getAbsolutePath(), tokenFile);
    DelegationTokenFetcher.saveDelegationToken(conf, fs, null, p);
    // When Token returned is null, TokenFile should not exist
    Assert.assertFalse(p.getFileSystem(conf).exists(p));
}
Also used : Path(org.apache.hadoop.fs.Path) WebHdfsFileSystem(org.apache.hadoop.hdfs.web.WebHdfsFileSystem) Test(org.junit.Test)

Example 5 with WebHdfsFileSystem

use of org.apache.hadoop.hdfs.web.WebHdfsFileSystem in project hadoop by apache.

the class TestDelegationTokenFetcher method testTokenFetchFail.

/**
   * Fetching a token when no HTTP server is reachable should fail with an IOException.
   */
@Test(expected = IOException.class)
public void testTokenFetchFail() throws Exception {
    WebHdfsFileSystem fs = mock(WebHdfsFileSystem.class);
    doThrow(new IOException()).when(fs).getDelegationToken(anyString());
    Path p = new Path(f.getRoot().getAbsolutePath(), tokenFile);
    DelegationTokenFetcher.saveDelegationToken(conf, fs, null, p);
}
Also used : Path(org.apache.hadoop.fs.Path) IOException(java.io.IOException) WebHdfsFileSystem(org.apache.hadoop.hdfs.web.WebHdfsFileSystem) Test(org.junit.Test)
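
Both delegation-token tests stub WebHdfsFileSystem with Mockito rather than contacting a cluster. Against a live cluster, a fetch-and-save amounts roughly to the sketch below; treat it as an approximation rather than the fetcher's actual implementation, and note that the URI, the renewer string, and the output path are placeholders.

static void fetchAndSaveToken() throws Exception {
    Configuration conf = new Configuration();
    FileSystem webFs = FileSystem.get(URI.create("webhdfs://namenode:9870/"), conf);
    // getDelegationToken may return null (Example 4) or throw an IOException
    // when no HTTP server is reachable (Example 5).
    Token<?> token = webFs.getDelegationToken("renewer");
    if (token != null) {
        Credentials creds = new Credentials();
        creds.addToken(token.getService(), token);
        // Persist the token in Hadoop's standard token-storage file format.
        creds.writeTokenStorageFile(new Path("/tmp/webhdfs.token"), conf);
    }
}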

Aggregations

WebHdfsFileSystem (org.apache.hadoop.hdfs.web.WebHdfsFileSystem): 21 usages
Test (org.junit.Test): 19 usages
Path (org.apache.hadoop.fs.Path): 18 usages
Configuration (org.apache.hadoop.conf.Configuration): 10 usages
URI (java.net.URI): 9 usages
HttpURLConnection (java.net.HttpURLConnection): 5 usages
URL (java.net.URL): 5 usages
ContentSummary (org.apache.hadoop.fs.ContentSummary): 5 usages
FsPermission (org.apache.hadoop.fs.permission.FsPermission): 5 usages
FileStatus (org.apache.hadoop.fs.FileStatus): 3 usages
IOException (java.io.IOException): 2 usages
InputStream (java.io.InputStream): 2 usages
FsShell (org.apache.hadoop.fs.FsShell): 2 usages
HdfsAdmin (org.apache.hadoop.hdfs.client.HdfsAdmin): 2 usages
DelegationTokenIdentifier (org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenIdentifier): 2 usages
Text (org.apache.hadoop.io.Text): 2 usages
AccessControlException (org.apache.hadoop.security.AccessControlException): 2 usages
Credentials (org.apache.hadoop.security.Credentials): 2 usages
Token (org.apache.hadoop.security.token.Token): 2 usages
Mockito.anyString (org.mockito.Mockito.anyString): 2 usages