Example 21 with CacheDirectiveInfo

use of org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo in project hadoop by apache.

the class TestAuditLoggerWithCommands method testModifyCacheDirective.

@Test
public void testModifyCacheDirective() throws Exception {
    removeExistingCachePools(null);
    proto.addCachePool(new CachePoolInfo("pool1").setMode(new FsPermission((short) 0)));
    CacheDirectiveInfo alpha = new CacheDirectiveInfo.Builder().setPath(new Path("/alpha")).setPool("pool1").build();
    fileSys = DFSTestUtil.getFileSystemAs(user1, conf);
    // The directive is added through fs (the superuser's FileSystem); fileSys
    // belongs to user1, who has no permission on pool1 (created with mode 0).
    Long id = ((DistributedFileSystem) fs).addCacheDirective(alpha);
    try {
        ((DistributedFileSystem) fileSys).modifyCacheDirective(new CacheDirectiveInfo.Builder().setId(id).setReplication((short) 1).build());
        fail("The operation should have failed with AccessControlException");
    } catch (AccessControlException ace) {
        // Expected: pool1 was created with mode 0, so user1 may not modify its directives.
    }
    String aceModifyCachePattern = ".*allowed=false.*ugi=theDoctor.*cmd=modifyCache.*";
    verifyAuditLogs(aceModifyCachePattern);
    fileSys.close();
    try {
        ((DistributedFileSystem) fileSys).modifyCacheDirective(new CacheDirectiveInfo.Builder().setId(id).setReplication((short) 1).build());
        fail("The operation should have failed with IOException");
    } catch (IOException e) {
        // Expected: fileSys was closed above, so the call fails with "Filesystem closed".
    }
}
Also used : Path(org.apache.hadoop.fs.Path) CacheDirectiveInfo(org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo) AccessControlException(org.apache.hadoop.security.AccessControlException) FsPermission(org.apache.hadoop.fs.permission.FsPermission) IOException(java.io.IOException) DistributedFileSystem(org.apache.hadoop.hdfs.DistributedFileSystem) CachePoolInfo(org.apache.hadoop.hdfs.protocol.CachePoolInfo) Test(org.junit.Test)
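
For reference, a minimal standalone sketch of the same add-then-modify flow against the public DistributedFileSystem API, outside the test harness. The pool name, path, and replication value are illustrative, and the snippet assumes the client Configuration points at a running HDFS cluster:

import java.util.EnumSet;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.CacheFlag;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo;
import org.apache.hadoop.hdfs.protocol.CachePoolInfo;

public class ModifyDirectiveSketch {
    public static void main(String[] args) throws Exception {
        // Assumes fs.defaultFS resolves to a running HDFS cluster.
        Configuration conf = new Configuration();
        DistributedFileSystem dfs = (DistributedFileSystem) FileSystem.get(conf);
        // Create a pool and cache an illustrative path in it.
        dfs.addCachePool(new CachePoolInfo("pool1"));
        long id = dfs.addCacheDirective(
            new CacheDirectiveInfo.Builder()
                .setPath(new Path("/alpha"))
                .setPool("pool1")
                .build(),
            EnumSet.noneOf(CacheFlag.class));
        // Change only the cache replication of the existing directive;
        // fields left unset in the builder keep their current values.
        dfs.modifyCacheDirective(
            new CacheDirectiveInfo.Builder()
                .setId(id)
                .setReplication((short) 2)
                .build());
        dfs.close();
    }
}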

Example 22 with CacheDirectiveInfo

use of org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo in project hadoop by apache.

the class TestRetryCacheWithHA method testListCacheDirectives.

/**
   * Add a list of cache directives, list cache directives,
   * switch active NN, and list cache directives again.
   */
@Test(timeout = 60000)
public void testListCacheDirectives() throws Exception {
    final int poolCount = 7;
    HashSet<String> poolNames = new HashSet<String>(poolCount);
    Path path = new Path("/p");
    for (int i = 0; i < poolCount; i++) {
        String poolName = "testListCacheDirectives-" + i;
        CacheDirectiveInfo directiveInfo = new CacheDirectiveInfo.Builder().setPool(poolName).setPath(path).build();
        dfs.addCachePool(new CachePoolInfo(poolName));
        dfs.addCacheDirective(directiveInfo, EnumSet.of(CacheFlag.FORCE));
        poolNames.add(poolName);
    }
    listCacheDirectives(poolNames, 0);
    // Fail over from the first NameNode to the second, then list again.
    cluster.transitionToStandby(0);
    cluster.transitionToActive(1);
    cluster.waitActive(1);
    listCacheDirectives(poolNames, 1);
}
Also used : Path(org.apache.hadoop.fs.Path) CacheDirectiveInfo(org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo) CachePoolInfo(org.apache.hadoop.hdfs.protocol.CachePoolInfo) HashSet(java.util.HashSet) Test(org.junit.Test)
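
The listCacheDirectives(poolNames, nnIndex) helper invoked above is not shown in this excerpt. As a rough, hypothetical sketch of the same kind of check with the public API (the real helper also verifies which NameNode served the retried call, which is omitted here):

import java.util.HashSet;
import java.util.Set;

import org.apache.hadoop.fs.RemoteIterator;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.CacheDirectiveEntry;
import org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo;

public class ListDirectivesSketch {
    // Hypothetical check: every expected pool has at least one cache directive.
    static void assertPoolsHaveDirectives(DistributedFileSystem dfs,
            Set<String> expectedPools) throws Exception {
        Set<String> seen = new HashSet<>();
        // An empty filter (no id, pool, or path) matches all directives.
        RemoteIterator<CacheDirectiveEntry> it =
            dfs.listCacheDirectives(new CacheDirectiveInfo.Builder().build());
        while (it.hasNext()) {
            seen.add(it.next().getInfo().getPool());
        }
        if (!seen.containsAll(expectedPools)) {
            throw new AssertionError("Missing cache directives for some pools: "
                + expectedPools);
        }
    }
}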

Example 23 with CacheDirectiveInfo

use of org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo in project SSM by Intel-bigdata.

the class CacheFileAction method isCached.

public boolean isCached(String fileName) throws Exception {
    CacheDirectiveInfo.Builder filterBuilder = new CacheDirectiveInfo.Builder();
    filterBuilder.setPath(new Path(fileName));
    CacheDirectiveInfo filter = filterBuilder.build();
    RemoteIterator<CacheDirectiveEntry> directiveEntries = dfsClient.listCacheDirectives(filter);
    return directiveEntries.hasNext();
}
Also used : Path(org.apache.hadoop.fs.Path) CacheDirectiveInfo(org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo) CacheDirectiveEntry(org.apache.hadoop.hdfs.protocol.CacheDirectiveEntry)
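
A path-only filter matches a directive created by any pool. If the check should count only directives owned by a specific pool, a small variation (the pool parameter here is an assumption, not part of the SSM code above) could filter on both:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.hdfs.protocol.CacheDirectiveEntry;
import org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo;

public class IsCachedSketch {
    // Illustrative variant of isCached that also restricts the match to one pool.
    static boolean isCachedInPool(DFSClient dfsClient, String fileName, String pool)
            throws Exception {
        CacheDirectiveInfo filter = new CacheDirectiveInfo.Builder()
            .setPath(new Path(fileName))
            .setPool(pool)
            .build();
        RemoteIterator<CacheDirectiveEntry> entries =
            dfsClient.listCacheDirectives(filter);
        return entries.hasNext();
    }
}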

Example 24 with CacheDirectiveInfo

use of org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo in project SSM by Intel-bigdata.

the class CacheFileAction method addDirective.

private void addDirective(String fileName) throws Exception {
    CacheDirectiveInfo.Builder filterBuilder = new CacheDirectiveInfo.Builder();
    filterBuilder.setPath(new Path(fileName));
    filterBuilder.setPool(SSMPOOL);
    CacheDirectiveInfo filter = filterBuilder.build();
    EnumSet<CacheFlag> flags = EnumSet.noneOf(CacheFlag.class);
    dfsClient.addCacheDirective(filter, flags);
}
Also used : Path(org.apache.hadoop.fs.Path) CacheFlag(org.apache.hadoop.fs.CacheFlag) CacheDirectiveInfo(org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo)
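
For completeness, a hedged sketch of the complementary operation: removing whatever directive was added for a file. The lookup-by-path approach mirrors isCached above; the method name is illustrative and not part of CacheFileAction:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.hdfs.protocol.CacheDirectiveEntry;
import org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo;

public class RemoveDirectiveSketch {
    // Illustrative helper: find the directive(s) cached for a path and remove them by id.
    static void removeDirectivesForPath(DFSClient dfsClient, String fileName)
            throws Exception {
        CacheDirectiveInfo filter = new CacheDirectiveInfo.Builder()
            .setPath(new Path(fileName))
            .build();
        RemoteIterator<CacheDirectiveEntry> entries =
            dfsClient.listCacheDirectives(filter);
        while (entries.hasNext()) {
            long id = entries.next().getInfo().getId();
            dfsClient.removeCacheDirective(id);
        }
    }
}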

Aggregations

CacheDirectiveInfo (org.apache.hadoop.hdfs.protocol.CacheDirectiveInfo): 24
Path (org.apache.hadoop.fs.Path): 15
CachePoolInfo (org.apache.hadoop.hdfs.protocol.CachePoolInfo): 10
IOException (java.io.IOException): 9
CacheDirectiveEntry (org.apache.hadoop.hdfs.protocol.CacheDirectiveEntry): 9
Test (org.junit.Test): 9
InvalidRequestException (org.apache.hadoop.fs.InvalidRequestException): 6
AccessControlException (org.apache.hadoop.security.AccessControlException): 6
FsPermission (org.apache.hadoop.fs.permission.FsPermission): 5
DistributedFileSystem (org.apache.hadoop.hdfs.DistributedFileSystem): 4
CacheDirective (org.apache.hadoop.hdfs.protocol.CacheDirective): 4
CacheFlag (org.apache.hadoop.fs.CacheFlag): 3
CachePoolEntry (org.apache.hadoop.hdfs.protocol.CachePoolEntry): 3
ArrayList (java.util.ArrayList): 2
Date (java.util.Date): 2
ServiceException (com.google.protobuf.ServiceException): 1
HashSet (java.util.HashSet): 1
LinkedList (java.util.LinkedList): 1
List (java.util.List): 1
BatchedListEntries (org.apache.hadoop.fs.BatchedRemoteIterator.BatchedListEntries): 1