
Example 86 with FSDataInputStream

Use of org.apache.hadoop.fs.FSDataInputStream in project hadoop by apache.

From the class HadoopLogsAnalyzer, the method maybeUncompressedPath:

private LineReader maybeUncompressedPath(Path p) throws FileNotFoundException, IOException {
    CompressionCodecFactory codecs = new CompressionCodecFactory(getConf());
    inputCodec = codecs.getCodec(p);
    FileSystem fs = p.getFileSystem(getConf());
    FSDataInputStream fileIn = fs.open(p);
    if (inputCodec == null) {
        return new LineReader(fileIn, getConf());
    } else {
        inputDecompressor = CodecPool.getDecompressor(inputCodec);
        return new LineReader(inputCodec.createInputStream(fileIn, inputDecompressor), getConf());
    }
}
Also used: CompressionCodecFactory (org.apache.hadoop.io.compress.CompressionCodecFactory), FileSystem (org.apache.hadoop.fs.FileSystem), LineReader (org.apache.hadoop.util.LineReader), FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream)
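The snippet above picks a decompression codec from the file's extension via CompressionCodecFactory and wraps the raw stream only when a codec matches. The same extension-based dispatch can be sketched with plain java.util.zip, without the Hadoop classpath; the ".gz" mapping and class names here are illustrative assumptions, not Hadoop API:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class MaybeCompressedReader {

    // Mirrors the shape of CompressionCodecFactory.getCodec(p):
    // choose a decompressing wrapper by file extension, or pass through.
    static InputStream maybeUncompressed(String path, InputStream raw) throws IOException {
        if (path.endsWith(".gz")) {
            return new GZIPInputStream(raw); // compressed: wrap before reading
        }
        return raw; // no codec matched: read the bytes as-is
    }

    public static void main(String[] args) throws IOException {
        // Build a gzipped payload in memory to stand in for fs.open(p).
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write("line one".getBytes("UTF-8"));
        }
        try (InputStream in = maybeUncompressed("/logs/job.gz",
                new ByteArrayInputStream(buf.toByteArray()))) {
            System.out.println(new String(in.readAllBytes(), "UTF-8")); // prints "line one"
        }
    }
}
```

Note that the Hadoop version also borrows a Decompressor from CodecPool; callers are expected to return it with CodecPool.returnDecompressor when the stream is closed.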

Example 87 with FSDataInputStream

Use of org.apache.hadoop.fs.FSDataInputStream in project hadoop by apache.

From the class StatePool, the method reloadState:

private boolean reloadState(Path stateFile, Configuration configuration) throws Exception {
    FileSystem fs = stateFile.getFileSystem(configuration);
    try (FSDataInputStream in = fs.open(stateFile)) {
        System.out.println("Reading state from " + stateFile.toString());
        read(in);
        return true;
    } catch (FileNotFoundException e) {
        System.out.println("No state information found for " + stateFile);
        return false;
    }
}
Also used: FileSystem (org.apache.hadoop.fs.FileSystem), FileNotFoundException (java.io.FileNotFoundException), FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream)
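The pattern above is worth noting: try-with-resources closes the FSDataInputStream on both paths, and a missing file is treated as a normal outcome rather than an error. The same shape can be reproduced with plain java.nio, where NoSuchFileException plays the role of FileNotFoundException; the file names and the byte-draining stand-in for read(in) are hypothetical:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;

public class ReloadState {

    // Mirrors StatePool.reloadState: true if the state file was read, false if absent.
    static boolean reloadState(Path stateFile) throws IOException {
        try (InputStream in = Files.newInputStream(stateFile)) {
            System.out.println("Reading state from " + stateFile);
            in.readAllBytes(); // stand-in for the real read(in) deserialization
            return true;
        } catch (NoSuchFileException e) {
            System.out.println("No state information found for " + stateFile);
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("state", ".bin");
        Files.write(tmp, new byte[] { 1, 2, 3 });
        System.out.println(reloadState(tmp));                      // existing file: true
        System.out.println(reloadState(Path.of("no-such-state"))); // missing file: false
        Files.deleteIfExists(tmp);
    }
}
```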

Example 88 with FSDataInputStream

Use of org.apache.hadoop.fs.FSDataInputStream in project hadoop by apache.

From the class TestSwiftFileSystemExtendedContract, the method testWriteReadFile:

@Test(timeout = SWIFT_TEST_TIMEOUT)
public void testWriteReadFile() throws Exception {
    final Path f = new Path("/test/test");
    final FSDataOutputStream fsDataOutputStream = fs.create(f);
    final String message = "Test string";
    fsDataOutputStream.write(message.getBytes());
    fsDataOutputStream.close();
    assertExists("created file", f);
    FSDataInputStream open = null;
    try {
        open = fs.open(f);
        final byte[] bytes = new byte[512];
        final int read = open.read(bytes);
        final byte[] buffer = new byte[read];
        System.arraycopy(bytes, 0, buffer, 0, read);
        assertEquals(message, new String(buffer));
    } finally {
        fs.delete(f, false);
        IOUtils.closeStream(open);
    }
}
Also used: Path (org.apache.hadoop.fs.Path), FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream), FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream), Test (org.junit.Test)
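The single open.read(bytes) call in this test assumes the whole payload arrives in one read, which the InputStream contract does not guarantee in general; it happens to hold here because the file is tiny. A defensive read loop, sketched with plain java.io streams, looks like this:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class FullRead {

    // Read up to len bytes, looping because InputStream.read may return fewer
    // than requested even when more data is coming. Returns the count actually read.
    static int readFully(InputStream in, byte[] buf, int len) throws IOException {
        int total = 0;
        while (total < len) {
            int n = in.read(buf, total, len - total);
            if (n < 0) {
                break; // end of stream before len bytes: return what we have
            }
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] message = "Test string".getBytes("UTF-8");
        byte[] buf = new byte[512];
        int got = readFully(new ByteArrayInputStream(message), buf, buf.length);
        System.out.println(got + " bytes: " + new String(buf, 0, got, "UTF-8"));
    }
}
```

FSDataInputStream itself extends DataInputStream, so when the expected length is known, in.readFully(buffer) gives the same guarantee without a hand-written loop.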

Example 89 with FSDataInputStream

Use of org.apache.hadoop.fs.FSDataInputStream in project hadoop by apache.

From the class TestSwiftFileSystemExtendedContract, the method testOpenNonExistingFile:

@Test(timeout = SWIFT_TEST_TIMEOUT)
public void testOpenNonExistingFile() throws IOException {
    final Path p = new Path("/test/testOpenNonExistingFile");
    //open it as a file, should get FileNotFoundException
    try {
        final FSDataInputStream in = fs.open(p);
        in.close();
        fail("didn't expect to get here");
    } catch (FileNotFoundException fnfe) {
        LOG.debug("Expected: " + fnfe, fnfe);
    }
}
Also used: Path (org.apache.hadoop.fs.Path), FileNotFoundException (java.io.FileNotFoundException), FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream), Test (org.junit.Test)

Example 90 with FSDataInputStream

Use of org.apache.hadoop.fs.FSDataInputStream in project hadoop by apache.

From the class TestSwiftFileSystemRename, the method testRenameFile:

@Test(timeout = SWIFT_TEST_TIMEOUT)
public void testRenameFile() throws Exception {
    assumeRenameSupported();
    final Path old = new Path("/test/alice/file");
    final Path newPath = new Path("/test/bob/file");
    fs.mkdirs(newPath.getParent());
    final FSDataOutputStream fsDataOutputStream = fs.create(old);
    final byte[] message = "Some data".getBytes();
    fsDataOutputStream.write(message);
    fsDataOutputStream.close();
    assertTrue(fs.exists(old));
    rename(old, newPath, true, false, true);
    final FSDataInputStream bobStream = fs.open(newPath);
    final byte[] bytes = new byte[512];
    final int read = bobStream.read(bytes);
    bobStream.close();
    final byte[] buffer = new byte[read];
    System.arraycopy(bytes, 0, buffer, 0, read);
    assertEquals(new String(message), new String(buffer));
}
Also used: Path (org.apache.hadoop.fs.Path), FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream), FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream), SwiftTestUtils.readBytesToString (org.apache.hadoop.fs.swift.util.SwiftTestUtils.readBytesToString), Test (org.junit.Test)

Aggregations

Types co-occurring with FSDataInputStream across the indexed examples, with usage counts:

FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream): 431
Path (org.apache.hadoop.fs.Path): 271
FileSystem (org.apache.hadoop.fs.FileSystem): 143
Test (org.junit.Test): 135
IOException (java.io.IOException): 125
Configuration (org.apache.hadoop.conf.Configuration): 94
FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream): 93
FileStatus (org.apache.hadoop.fs.FileStatus): 62
InputStreamReader (java.io.InputStreamReader): 37
BufferedReader (java.io.BufferedReader): 36
FileNotFoundException (java.io.FileNotFoundException): 26
IgfsPath (org.apache.ignite.igfs.IgfsPath): 26
MiniDFSCluster (org.apache.hadoop.hdfs.MiniDFSCluster): 21
ArrayList (java.util.ArrayList): 20
Random (java.util.Random): 19
EOFException (java.io.EOFException): 18
HashMap (java.util.HashMap): 16
DistributedFileSystem (org.apache.hadoop.hdfs.DistributedFileSystem): 15
URI (java.net.URI): 14
File (java.io.File): 13