Example 6 with DFSInputStream

Use of org.apache.hadoop.hdfs.DFSInputStream in project SSM by Intel-bigdata.

From class ReadFileAction, method execute:

@Override
protected void execute() throws Exception {
    if (filePath == null) {
        throw new IllegalArgumentException("File parameter is missing.");
    }
    appendLog(String.format("Action starts at %s : Read %s", Utils.getFormatedCurrentTime(), filePath));
    if (!dfsClient.exists(filePath)) {
        throw new ActionException("ReadFile Action fails, file " + filePath + " doesn't exist!");
    }
    byte[] buffer = new byte[bufferSize];
    // Drain the file from HDFS; the bytes are discarded, since this action only
    // exercises the read path. try-with-resources closes the stream even if a read fails.
    try (DFSInputStream dfsInputStream = dfsClient.open(filePath)) {
        while (dfsInputStream.read(buffer, 0, bufferSize) != -1) {
            // intentionally empty: keep reading until EOF
        }
    }
}
Also used: ActionException (org.smartdata.action.ActionException), DFSInputStream (org.apache.hadoop.hdfs.DFSInputStream)
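For readers outside SSM, the same drain loop can be reproduced with a bare DFSClient. A minimal sketch, assuming a NameNode at hdfs://localhost:9000 and an existing file /tmp/example.txt (both placeholders):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.hdfs.DFSInputStream;

public class DrainHdfsFile {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode URI; point this at your cluster.
        DFSClient client = new DFSClient(URI.create("hdfs://localhost:9000"), conf);
        try (DFSInputStream in = client.open("/tmp/example.txt")) {
            byte[] buffer = new byte[64 * 1024];
            long total = 0;
            int n;
            while ((n = in.read(buffer, 0, buffer.length)) != -1) {
                total += n;
            }
            System.out.println("Read " + total + " bytes");
        } finally {
            client.close();
        }
    }
}

DFSInputStream is a regular Closeable InputStream, so try-with-resources covers the close() that the action above has to perform by hand.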

Example 7 with DFSInputStream

Use of org.apache.hadoop.hdfs.DFSInputStream in project SSM by Intel-bigdata.

From class SmartDFSClient, method open:

@Override
public DFSInputStream open(HdfsPathHandle fd, int buffersize, boolean verifyChecksum) throws IOException {
    String src = fd.getPath();
    DFSInputStream is = super.open(fd, buffersize, verifyChecksum);
    if (is.getFileLength() == 0) {
        // A zero-length file may be a stub whose real bytes live elsewhere
        // (e.g. a small file compacted into a container file), so close the
        // plain stream and re-open through a state-specific SmartInputStream.
        is.close();
        FileState fileState = getFileState(src);
        if (fileState.getFileStage().equals(FileState.FileStage.PROCESSING)) {
            // SSM is still transforming the file; reading now would see partial data.
            throw new IOException("Cannot open " + src + " when it is under PROCESSING to " + fileState.getFileType());
        }
        is = SmartInputStreamFactory.create(this, src, verifyChecksum, fileState);
    }
    reportFileAccessEvent(src);
    return is;
}
Also used: NormalFileState (org.smartdata.model.NormalFileState), CompactFileState (org.smartdata.model.CompactFileState), FileState (org.smartdata.model.FileState), CompressionFileState (org.smartdata.model.CompressionFileState), DFSInputStream (org.apache.hadoop.hdfs.DFSInputStream), IOException (java.io.IOException)
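A caller that hits the "under PROCESSING" IOException above has little choice but to wait and retry. A hedged sketch of such a retry wrapper; openWithRetry, the attempt count, and the 1-second backoff are all invented for illustration:

import java.io.IOException;
import org.apache.hadoop.hdfs.DFSInputStream;
import org.smartdata.hdfs.client.SmartDFSClient;

public class RetryOpen {
    // Hypothetical helper: retry open() while SSM is still transforming the file.
    static DFSInputStream openWithRetry(SmartDFSClient client, String path)
            throws IOException, InterruptedException {
        IOException last = null;
        for (int attempt = 0; attempt < 3; attempt++) {
            try {
                return client.open(path, 4096, true);
            } catch (IOException e) {
                // Note: this retries on any IOException, not only the PROCESSING
                // case; a real caller would inspect the message or the file state.
                last = e;
                Thread.sleep(1000L); // crude backoff before the next attempt
            }
        }
        throw last;
    }
}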

Example 8 with DFSInputStream

Use of org.apache.hadoop.hdfs.DFSInputStream in project SSM by Intel-bigdata.

From class ReadFileAction, method execute:

@Override
protected void execute() {
    ActionStatus actionStatus = getActionStatus();
    actionStatus.begin();
    try {
        HdfsFileStatus fileStatus = dfsClient.getFileInfo(filePath);
        if (fileStatus == null) {
            // Bail out early: without this return, the action would still
            // try to open the missing file below.
            resultOut.println("ReadFile Action fails, file doesn't exist!");
            actionStatus.setSuccessful(false);
            return;
        }
        DFSInputStream dfsInputStream = dfsClient.open(filePath);
        byte[] buffer = new byte[bufferSize];
        // Drain the file from HDFS; the bytes are discarded.
        while (dfsInputStream.read(buffer, 0, bufferSize) != -1) {
        }
        dfsInputStream.close();
        actionStatus.setSuccessful(true);
    } catch (IOException e) {
        actionStatus.setSuccessful(false);
        resultOut.println("ReadFile Action fails!\n" + e.getMessage());
    } finally {
        actionStatus.end();
    }
}
Also used: HdfsFileStatus (org.apache.hadoop.hdfs.protocol.HdfsFileStatus), DFSInputStream (org.apache.hadoop.hdfs.DFSInputStream), IOException (java.io.IOException), ActionStatus (org.smartdata.actions.ActionStatus)
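The early return matters because dfsClient.getFileInfo() signals a missing path with null rather than an exception. A small guard can make that contract harder to forget; a sketch (requireExists is a made-up helper, not an HDFS or SSM API):

import java.io.FileNotFoundException;
import java.io.IOException;
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.hdfs.protocol.HdfsFileStatus;

public class PathGuards {
    // Hypothetical helper: turn getFileInfo()'s null-on-missing result
    // into an exception at the call site.
    static HdfsFileStatus requireExists(DFSClient client, String path) throws IOException {
        HdfsFileStatus status = client.getFileInfo(path); // null when the path is absent
        if (status == null) {
            throw new FileNotFoundException("No such file: " + path);
        }
        return status;
    }
}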

Example 9 with DFSInputStream

Use of org.apache.hadoop.hdfs.DFSInputStream in project SSM by Intel-bigdata.

From class SmartDFSClient, method open:

@Override
public DFSInputStream open(String src, int buffersize, boolean verifyChecksum) throws IOException, UnresolvedLinkException {
    DFSInputStream is = super.open(src, buffersize, verifyChecksum);
    // Record the access so SSM rules (e.g. ones based on access counts) see this read.
    reportFileAccessEvent(src);
    return is;
}
Also used: DFSInputStream (org.apache.hadoop.hdfs.DFSInputStream)
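This short override is the core of SmartDFSClient's access accounting: delegate to DFSClient, then record the event. The same decorator idea can be sketched outside SSM; the class name and the println stand-in for reportFileAccessEvent are mine:

import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.UnresolvedLinkException;
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.hdfs.DFSInputStream;

public class AccessLoggingDFSClient extends DFSClient {
    public AccessLoggingDFSClient(URI nameNodeUri, Configuration conf) throws IOException {
        super(nameNodeUri, conf);
    }

    @Override
    public DFSInputStream open(String src, int buffersize, boolean verifyChecksum)
            throws IOException, UnresolvedLinkException {
        DFSInputStream is = super.open(src, buffersize, verifyChecksum);
        // Stand-in for SmartDFSClient's reportFileAccessEvent(src).
        System.out.println("HDFS open: " + src);
        return is;
    }
}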

Example 10 with DFSInputStream

Use of org.apache.hadoop.hdfs.DFSInputStream in project SSM by Intel-bigdata.

From class TestSmallFileRead, method testRead:

@Test
public void testRead() throws Exception {
    waitTillSSMExitSafeMode();
    SmartDFSClient smartDFSClient = new SmartDFSClient(smartContext.getConf());
    // Sequential reads: a single byte, then a byte[] with the remainder.
    DFSInputStream is = smartDFSClient.open("/test/small_files/file_0");
    Assert.assertEquals(1, is.getAllBlocks().size());
    Assert.assertEquals(fileLength, is.getFileLength());
    Assert.assertEquals(0, is.getPos());
    int byteRead = is.read();
    Assert.assertEquals(ret, byteRead);
    byte[] bytes = new byte[50];
    Assert.assertEquals(fileLength - 1, is.read(bytes));
    is.close();
    // ByteBuffer read via the ByteBufferReadable overload.
    is = smartDFSClient.open("/test/small_files/file_0");
    ByteBuffer buffer = ByteBuffer.allocate(50);
    Assert.assertEquals(fileLength, is.read(buffer));
    is.close();
    // Positioned read (pread) starting at file offset 2.
    is = smartDFSClient.open("/test/small_files/file_0");
    Assert.assertEquals(fileLength - 2, is.read(2, bytes, 1, 50));
    is.close();
}
Also used: DFSInputStream (org.apache.hadoop.hdfs.DFSInputStream), ByteBuffer (java.nio.ByteBuffer), SmartDFSClient (org.smartdata.hdfs.client.SmartDFSClient), Test (org.junit.Test)
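The test touches three distinct read paths DFSInputStream supports: plain sequential InputStream reads, the ByteBufferReadable overload, and the positioned read (pread) inherited from FSInputStream. A sketch of the same three calls against an ordinary HDFS file; the NameNode URI, path, and buffer sizes are placeholders:

import java.net.URI;
import java.nio.ByteBuffer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.hdfs.DFSInputStream;

public class ReadVariants {
    public static void main(String[] args) throws Exception {
        DFSClient client = new DFSClient(URI.create("hdfs://localhost:9000"), new Configuration());
        try (DFSInputStream in = client.open("/tmp/example.txt")) {
            byte[] bytes = new byte[50];
            int n1 = in.read(bytes);                      // sequential, advances getPos()
            ByteBuffer buf = ByteBuffer.allocate(50);
            int n2 = in.read(buf);                        // ByteBufferReadable overload
            int n3 = in.read(2L, bytes, 0, bytes.length); // pread at offset 2
            System.out.println(n1 + " " + n2 + " " + n3);
        } finally {
            client.close();
        }
    }
}

Unlike the sequential reads, the pread at the end does not move the stream position, which is also why the test above reopens the file before each read style rather than relying on getPos().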

Aggregations

DFSInputStream (org.apache.hadoop.hdfs.DFSInputStream): 16
IOException (java.io.IOException): 7
Test (org.junit.Test): 6
Path (org.apache.hadoop.fs.Path): 4
CompressionFileState (org.smartdata.model.CompressionFileState): 4
HdfsFileStatus (org.apache.hadoop.hdfs.protocol.HdfsFileStatus): 3
ActionException (org.smartdata.action.ActionException): 3
SmartDFSClient (org.smartdata.hdfs.client.SmartDFSClient): 3
FileState (org.smartdata.model.FileState): 3
Gson (com.google.gson.Gson): 2
OutputStream (java.io.OutputStream): 2
DFSStripedInputStream (org.apache.hadoop.hdfs.DFSStripedInputStream): 2
SmartConf (org.smartdata.conf.SmartConf): 2
CompactFileState (org.smartdata.model.CompactFileState): 2
NormalFileState (org.smartdata.model.NormalFileState): 2
CmdletManager (org.smartdata.server.engine.CmdletManager): 2
CacheLoader (com.google.common.cache.CacheLoader): 1
TypeToken (com.google.gson.reflect.TypeToken): 1
FilterInputStream (java.io.FilterInputStream): 1
Field (java.lang.reflect.Field): 1