
Example 26 with FileSystem

use of org.apache.hadoop.fs.FileSystem in project camel by apache.

the class HdfsAppendTest method testAppend.

@Test
public void testAppend() throws Exception {
    context.addRoutes(new RouteBuilder() {

        @Override
        public void configure() throws Exception {
            from("direct:start1").to("hdfs2://localhost:9000/tmp/test/test-camel-simple-write-file1?append=true&fileSystemType=HDFS");
        }
    });
    startCamelContext();
    for (int i = 0; i < 10; ++i) {
        template.sendBody("direct:start1", "PIPPQ");
    }
    Configuration conf = new Configuration();
    Path file = new Path("hdfs://localhost:9000/tmp/test/test-camel-simple-write-file1");
    FileSystem fs = FileSystem.get(file.toUri(), conf);
    FSDataInputStream in = fs.open(file);
    byte[] buffer = new byte[5];
    int ret = 0;
    for (int i = 0; i < 20; ++i) {
        ret = in.read(buffer);
        System.out.println("> " + new String(buffer));
    }
    ret = in.read(buffer);
    assertEquals(-1, ret);
    in.close();
}
Also used : Path(org.apache.hadoop.fs.Path) RouteBuilder(org.apache.camel.builder.RouteBuilder) Configuration(org.apache.hadoop.conf.Configuration) FileSystem(org.apache.hadoop.fs.FileSystem) FSDataInputStream(org.apache.hadoop.fs.FSDataInputStream) Test(org.junit.Test)
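
A note on the verification loop: read(byte[]) is not guaranteed to fill the buffer, and the loop above prints the whole buffer regardless of how many bytes came back. Below is a minimal standalone sketch of the same check that only converts the bytes actually read; the HDFS URI and file path are copied from the test and are assumptions about your local setup.

import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadAppendedFile {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Same URI and path as the test above; adjust for your cluster.
        Path file = new Path("hdfs://localhost:9000/tmp/test/test-camel-simple-write-file1");
        FileSystem fs = FileSystem.get(file.toUri(), conf);
        // try-with-resources closes the stream even if a read throws
        try (FSDataInputStream in = fs.open(file)) {
            byte[] buffer = new byte[5];
            int read;
            long total = 0;
            while ((read = in.read(buffer)) != -1) {
                // only convert the bytes returned by this read
                System.out.println("> " + new String(buffer, 0, read, StandardCharsets.UTF_8));
                total += read;
            }
            System.out.println("total bytes read: " + total);
        }
    }
}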

Example 27 with FileSystem

use of org.apache.hadoop.fs.FileSystem in project camel by apache.

the class HdfsAppendTest method testAppendWithDynamicFileName.

@Test
public void testAppendWithDynamicFileName() throws Exception {
    context.addRoutes(new RouteBuilder() {

        @Override
        public void configure() throws Exception {
            from("direct:start1").to("hdfs2://localhost:9000/tmp/test-dynamic/?append=true&fileSystemType=HDFS");
        }
    });
    startCamelContext();
    for (int i = 0; i < ITERATIONS; ++i) {
        template.sendBodyAndHeader("direct:start1", "HELLO", Exchange.FILE_NAME, "camel-hdfs2.log");
    }
    Configuration conf = new Configuration();
    Path file = new Path("hdfs://localhost:9000/tmp/test-dynamic/camel-hdfs2.log");
    FileSystem fs = FileSystem.get(file.toUri(), conf);
    FSDataInputStream in = fs.open(file);
    byte[] buffer = new byte[5];
    for (int i = 0; i < ITERATIONS; ++i) {
        assertEquals(5, in.read(buffer));
        System.out.println("> " + new String(buffer));
    }
    int ret = in.read(buffer);
    assertEquals(-1, ret);
    in.close();
}
Also used : Path(org.apache.hadoop.fs.Path) RouteBuilder(org.apache.camel.builder.RouteBuilder) Configuration(org.apache.hadoop.conf.Configuration) FileSystem(org.apache.hadoop.fs.FileSystem) FSDataInputStream(org.apache.hadoop.fs.FSDataInputStream) Test(org.junit.Test)
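
Since every exchange appends one fixed-size body under the same Exchange.FILE_NAME header, the resulting file length can also be checked through FileStatus instead of reading the file back. A minimal sketch, assuming the test's ITERATIONS constant is 10 (its value is not shown in the snippet) and the same localhost HDFS URI:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckAppendedLength {
    public static void main(String[] args) throws Exception {
        // Placeholder for the test's ITERATIONS constant (value not shown in the snippet).
        final int iterations = 10;
        final int bodyLength = "HELLO".length(); // 5 bytes per appended exchange

        Configuration conf = new Configuration();
        Path file = new Path("hdfs://localhost:9000/tmp/test-dynamic/camel-hdfs2.log");
        FileSystem fs = FileSystem.get(file.toUri(), conf);

        // Each append adds one body, so the expected length is iterations * bodyLength.
        FileStatus status = fs.getFileStatus(file);
        System.out.println("expected=" + (long) iterations * bodyLength
                + " actual=" + status.getLen());
    }
}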

Example 28 with FileSystem

use of org.apache.hadoop.fs.FileSystem in project hadoop by apache.

the class MapFile method main.

public static void main(String[] args) throws Exception {
    String usage = "Usage: MapFile inFile outFile";
    if (args.length != 2) {
        System.err.println(usage);
        System.exit(-1);
    }
    String in = args[0];
    String out = args[1];
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.getLocal(conf);
    MapFile.Reader reader = null;
    MapFile.Writer writer = null;
    try {
        reader = new MapFile.Reader(fs, in, conf);
        writer = new MapFile.Writer(conf, fs, out, reader.getKeyClass().asSubclass(WritableComparable.class), reader.getValueClass());
        WritableComparable<?> key = ReflectionUtils.newInstance(reader.getKeyClass().asSubclass(WritableComparable.class), conf);
        Writable value = ReflectionUtils.newInstance(reader.getValueClass().asSubclass(Writable.class), conf);
        // copy all entries
        while (reader.next(key, value)) {
            writer.append(key, value);
        }
    } finally {
        IOUtils.cleanup(LOG, writer, reader);
    }
}
Also used : Configuration(org.apache.hadoop.conf.Configuration) FileSystem(org.apache.hadoop.fs.FileSystem)
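
The copy above uses the older MapFile.Reader(FileSystem, String, Configuration) constructor, which recent Hadoop 2 releases deprecate in favour of Path-based constructors. What a MapFile adds over a plain SequenceFile is keyed random access, sketched below; the Text key and value types are assumptions, so substitute whatever Writable types your MapFile actually stores.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.MapFile;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;

public class MapFileLookup {
    public static void main(String[] args) throws Exception {
        String usage = "Usage: MapFileLookup mapFileDir key";
        if (args.length != 2) {
            System.err.println(usage);
            System.exit(-1);
        }
        Configuration conf = new Configuration();
        // Path-based reader; the FileSystem is resolved from the path and configuration.
        try (MapFile.Reader reader = new MapFile.Reader(new Path(args[0]), conf)) {
            Text key = new Text(args[1]);
            Text value = new Text();
            // get() binary-searches the index file, then seeks into the data file
            Writable found = reader.get(key, value);
            System.out.println(found == null ? "not found" : key + "\t" + value);
        }
    }
}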

Example 29 with FileSystem

use of org.apache.hadoop.fs.FileSystem in project hadoop by apache.

the class TestFileSystemCaching method testCacheDisabled.

@Test
public void testCacheDisabled() throws Exception {
    Configuration conf = new Configuration();
    conf.set("fs.uncachedfile.impl", FileSystem.getFileSystemClass("file", null).getName());
    conf.setBoolean("fs.uncachedfile.impl.disable.cache", true);
    FileSystem fs1 = FileSystem.get(new URI("uncachedfile://a"), conf);
    FileSystem fs2 = FileSystem.get(new URI("uncachedfile://a"), conf);
    assertNotSame(fs1, fs2);
}
Also used : Configuration(org.apache.hadoop.conf.Configuration) FileSystem(org.apache.hadoop.fs.FileSystem) URI(java.net.URI) Test(org.junit.Test)
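
For contrast, with the per-scheme disable flag left unset, FileSystem.get returns the cached instance for the same scheme, authority, and user, so the two calls are reference-equal. A minimal sketch against the local file system (the file:// scheme stands in for the test's synthetic uncachedfile scheme):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class FsCacheDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Default behaviour: same scheme, authority and user hit the FileSystem cache.
        FileSystem cached1 = FileSystem.get(new URI("file:///"), conf);
        FileSystem cached2 = FileSystem.get(new URI("file:///"), conf);
        System.out.println("cached identical: " + (cached1 == cached2)); // expected: true

        // fs.<scheme>.impl.disable.cache makes every get() build a fresh instance.
        conf.setBoolean("fs.file.impl.disable.cache", true);
        FileSystem fresh1 = FileSystem.get(new URI("file:///"), conf);
        FileSystem fresh2 = FileSystem.get(new URI("file:///"), conf);
        System.out.println("uncached identical: " + (fresh1 == fresh2)); // expected: false
    }
}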

Example 30 with FileSystem

use of org.apache.hadoop.fs.FileSystem in project hadoop by apache.

the class TestFileSystemCaching method testDeleteOnExit.

@Test
public void testDeleteOnExit() throws IOException {
    FileSystem mockFs = mock(FileSystem.class);
    FileSystem fs = new FilterFileSystem(mockFs);
    Path path = new Path("/a");
    // delete on close if path does exist
    when(mockFs.getFileStatus(eq(path))).thenReturn(new FileStatus());
    assertTrue(fs.deleteOnExit(path));
    verify(mockFs).getFileStatus(eq(path));
    reset(mockFs);
    when(mockFs.getFileStatus(eq(path))).thenReturn(new FileStatus());
    fs.close();
    verify(mockFs).getFileStatus(eq(path));
    verify(mockFs).delete(eq(path), eq(true));
}
Also used : FileSystem(org.apache.hadoop.fs.FileSystem) Test(org.junit.Test)
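
The mock verifies that close() walks the deleteOnExit set and issues a recursive delete for each registered path. The same behaviour can be observed against a real local file system, as in the sketch below; the scratch file name is purely illustrative, and newInstanceLocal() is used so that close() does not tear down the process-wide cached instance.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DeleteOnExitDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // A fresh LocalFileSystem instance, so close() does not affect the shared cached one.
        FileSystem fs = FileSystem.newInstanceLocal(conf);

        // Illustrative scratch file in the JVM temp directory.
        Path tmp = new Path(System.getProperty("java.io.tmpdir"), "delete-on-exit-demo.txt");
        fs.create(tmp).close();   // create an empty file
        fs.deleteOnExit(tmp);     // register it for deletion on close()/JVM exit

        System.out.println("before close, exists: " + fs.exists(tmp)); // true

        fs.close();               // processDeleteOnExit() deletes the registered path here

        FileSystem check = FileSystem.newInstanceLocal(conf);
        System.out.println("after close, exists: " + check.exists(tmp)); // false
        check.close();
    }
}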

Aggregations

FileSystem (org.apache.hadoop.fs.FileSystem): 2611
Path (org.apache.hadoop.fs.Path): 2199
Test (org.junit.Test): 1034
Configuration (org.apache.hadoop.conf.Configuration): 890
IOException (java.io.IOException): 757
FileStatus (org.apache.hadoop.fs.FileStatus): 419
FSDataOutputStream (org.apache.hadoop.fs.FSDataOutputStream): 264
MiniDFSCluster (org.apache.hadoop.hdfs.MiniDFSCluster): 227
ArrayList (java.util.ArrayList): 208
File (java.io.File): 181
DistributedFileSystem (org.apache.hadoop.hdfs.DistributedFileSystem): 165
JobConf (org.apache.hadoop.mapred.JobConf): 163
FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream): 151
HdfsConfiguration (org.apache.hadoop.hdfs.HdfsConfiguration): 145
URI (java.net.URI): 135
SequenceFile (org.apache.hadoop.io.SequenceFile): 118
Text (org.apache.hadoop.io.Text): 112
FileNotFoundException (java.io.FileNotFoundException): 102
FsPermission (org.apache.hadoop.fs.permission.FsPermission): 94
Job (org.apache.hadoop.mapreduce.Job): 81