
Example 71 with FloatWritable

Use of org.apache.hadoop.io.FloatWritable in project flink by apache.

From class HiveShimV100, method javaToWritable.

Optional<Writable> javaToWritable(@Nonnull Object value) {
    Writable writable = null;
    // in case value is already a Writable
    if (value instanceof Writable) {
        writable = (Writable) value;
    } else if (value instanceof Boolean) {
        writable = new BooleanWritable((Boolean) value);
    } else if (value instanceof Byte) {
        writable = new ByteWritable((Byte) value);
    } else if (value instanceof Short) {
        writable = new ShortWritable((Short) value);
    } else if (value instanceof Integer) {
        writable = new IntWritable((Integer) value);
    } else if (value instanceof Long) {
        writable = new LongWritable((Long) value);
    } else if (value instanceof Float) {
        writable = new FloatWritable((Float) value);
    } else if (value instanceof Double) {
        writable = new DoubleWritable((Double) value);
    } else if (value instanceof String) {
        writable = new Text((String) value);
    } else if (value instanceof HiveChar) {
        writable = new HiveCharWritable((HiveChar) value);
    } else if (value instanceof HiveVarchar) {
        writable = new HiveVarcharWritable((HiveVarchar) value);
    } else if (value instanceof HiveDecimal) {
        writable = new HiveDecimalWritable((HiveDecimal) value);
    } else if (value instanceof Date) {
        writable = new DateWritable((Date) value);
    } else if (value instanceof Timestamp) {
        writable = new TimestampWritable((Timestamp) value);
    } else if (value instanceof BigDecimal) {
        // BigDecimal has no Writable of its own; convert through HiveDecimal
        HiveDecimal hiveDecimal = HiveDecimal.create((BigDecimal) value);
        writable = new HiveDecimalWritable(hiveDecimal);
    } else if (value instanceof byte[]) {
        writable = new BytesWritable((byte[]) value);
    }
    // unsupported types fall through and yield Optional.empty()
    return Optional.ofNullable(writable);
}
Also used: Writable (org.apache.hadoop.io.Writable), BooleanWritable (org.apache.hadoop.io.BooleanWritable), ByteWritable (org.apache.hadoop.hive.serde2.io.ByteWritable), ShortWritable (org.apache.hadoop.hive.serde2.io.ShortWritable), IntWritable (org.apache.hadoop.io.IntWritable), LongWritable (org.apache.hadoop.io.LongWritable), FloatWritable (org.apache.hadoop.io.FloatWritable), DoubleWritable (org.apache.hadoop.hive.serde2.io.DoubleWritable), Text (org.apache.hadoop.io.Text), BytesWritable (org.apache.hadoop.io.BytesWritable), DateWritable (org.apache.hadoop.hive.serde2.io.DateWritable), TimestampWritable (org.apache.hadoop.hive.serde2.io.TimestampWritable), HiveCharWritable (org.apache.hadoop.hive.serde2.io.HiveCharWritable), HiveVarcharWritable (org.apache.hadoop.hive.serde2.io.HiveVarcharWritable), HiveDecimalWritable (org.apache.hadoop.hive.serde2.io.HiveDecimalWritable), HiveChar (org.apache.hadoop.hive.common.type.HiveChar), HiveVarchar (org.apache.hadoop.hive.common.type.HiveVarchar), HiveDecimal (org.apache.hadoop.hive.common.type.HiveDecimal), Timestamp (java.sql.Timestamp), Date (java.sql.Date), LocalDate (java.time.LocalDate), BigDecimal (java.math.BigDecimal), CatalogColumnStatisticsDataDate (org.apache.flink.table.catalog.stats.CatalogColumnStatisticsDataDate)
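
To see the dispatch outside of Flink, here is a minimal, self-contained sketch of the same pattern, trimmed to the plain Hadoop types (the Hive-specific branches are omitted). JavaToWritableDemo and its local javaToWritable are illustrative names, not part of the Flink codebase:

import java.util.Optional;
import org.apache.hadoop.io.BooleanWritable;
import org.apache.hadoop.io.FloatWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;

public class JavaToWritableDemo {

    // local re-implementation of the dispatch pattern, plain Hadoop types only
    static Optional<Writable> javaToWritable(Object value) {
        Writable writable = null;
        if (value instanceof Writable) {
            // already a Writable: pass it through unchanged
            writable = (Writable) value;
        } else if (value instanceof Boolean) {
            writable = new BooleanWritable((Boolean) value);
        } else if (value instanceof Integer) {
            writable = new IntWritable((Integer) value);
        } else if (value instanceof Long) {
            writable = new LongWritable((Long) value);
        } else if (value instanceof Float) {
            writable = new FloatWritable((Float) value);
        } else if (value instanceof String) {
            writable = new Text((String) value);
        }
        return Optional.ofNullable(writable);
    }

    public static void main(String[] args) {
        System.out.println(javaToWritable(3.14F));        // Optional[3.14]  (FloatWritable)
        System.out.println(javaToWritable("hello"));      // Optional[hello] (Text)
        System.out.println(javaToWritable(new Object())); // Optional.empty  (unsupported type)
    }
}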

Example 72 with FloatWritable

Use of org.apache.hadoop.io.FloatWritable in project goldenorb by jzachr.

From class SampleFloatMessageTest, method startServer.

/**
 * Starts the RPC server and connects a client proxy to it before each test.
 */
@SuppressWarnings("unchecked")
@Before
public void startServer() throws IOException {
    server = new RPCServer<FloatMessage, FloatWritable>(SERVER_PORT);
    server.start();
    Configuration conf = new Configuration();
    InetSocketAddress addr = new InetSocketAddress("localhost", SERVER_PORT);
    if (client == null) {
        // create the shared proxy once; waitForProxy retries until it can reach the server
        client = (RPCProtocol<FloatMessage, FloatWritable>) RPC.waitForProxy(RPCProtocol.class, RPCProtocol.versionID, addr, conf);
    }
}
Also used: FloatWritable (org.apache.hadoop.io.FloatWritable), Configuration (org.apache.hadoop.conf.Configuration), InetSocketAddress (java.net.InetSocketAddress), FloatMessage (org.goldenorb.types.message.FloatMessage), Before (org.junit.Before)
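
A matching teardown keeps the port and client proxy from leaking between test classes. This is a sketch under two assumptions: that the server and client fields are static (as the null-guard above suggests), and that goldenorb's RPCServer exposes a stop() counterpart to start(); RPC.stopProxy, by contrast, is standard Hadoop API:

@AfterClass
public static void stopServer() {
    // standard Hadoop API (org.apache.hadoop.ipc.RPC): releases the client-side proxy
    RPC.stopProxy(client);
    // assumption: goldenorb's RPCServer has a stop() counterpart to start()
    server.stop();
}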

Example 73 with FloatWritable

Use of org.apache.hadoop.io.FloatWritable in project goldenorb by jzachr.

From class CheckPointDataTest, method testCheckpointInput.

/**
 * Tests the CheckPointDataInput class by reading several different types of Writables from the checkpoint.
 * Asserts that the Writables written earlier come back from HDFS with the same values and types.
 *
 * @throws Exception
 */
@Test
public void testCheckpointInput() throws Exception {
    int superStep = 0;
    int partition = 0;
    OrbConfiguration orbConf = new OrbConfiguration();
    orbConf.set("fs.default.name", "hdfs://localhost:" + cluster.getNameNodePort());
    orbConf.setJobNumber("0");
    orbConf.setFileOutputPath("test");
    CheckPointDataInput checkpointInput = new CheckPointDataInput(orbConf, superStep, partition);
    // Data is read on a FIFO basis
    IntWritable intInput = new IntWritable();
    intInput.readFields(checkpointInput);
    LongWritable longInput = new LongWritable();
    longInput.readFields(checkpointInput);
    Text textInput = new Text();
    textInput.readFields(checkpointInput);
    FloatWritable floatInput = new FloatWritable();
    floatInput.readFields(checkpointInput);
    checkpointInput.close();
    assertThat(checkpointInput, notNullValue());
    assertEquals(intInput.get(), 4);
    assertEquals(longInput.get(), 9223372036854775807L);
    assertEquals(textInput.toString(), "test");
    // exact equality is safe here: the same 32-bit float pattern round-trips through the checkpoint
    assertTrue(floatInput.get() == 3.14159F);
}
Also used: FloatWritable (org.apache.hadoop.io.FloatWritable), OrbConfiguration (org.goldenorb.conf.OrbConfiguration), Text (org.apache.hadoop.io.Text), LongWritable (org.apache.hadoop.io.LongWritable), CheckPointDataInput (org.goldenorb.io.input.checkpoint.CheckPointDataInput), IntWritable (org.apache.hadoop.io.IntWritable)
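
As an aside, the final assertion can use JUnit's float overload with an explicit delta, which yields a clearer failure message than assertTrue. A drop-in replacement for the last check in the test above (a delta of 0.0F keeps the exact-equality semantics):

assertEquals(3.14159F, floatInput.get(), 0.0F);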

Example 74 with FloatWritable

Use of org.apache.hadoop.io.FloatWritable in project goldenorb by jzachr.

From class CheckPointDataTest, method testCheckpointOutput.

/**
 * Tests the CheckPointDataOutput class by writing several different types of Writables to the checkpoint.
 *
 * @throws Exception
 */
@Test
public void testCheckpointOutput() throws Exception {
    int superStep = 0;
    int partition = 0;
    OrbConfiguration orbConf = new OrbConfiguration();
    orbConf.set("fs.default.name", "hdfs://localhost:" + cluster.getNameNodePort());
    orbConf.setJobNumber("0");
    orbConf.setFileOutputPath("test");
    CheckPointDataOutput checkpointOutput = new CheckPointDataOutput(orbConf, superStep, partition);
    IntWritable intOutput = new IntWritable(4);
    intOutput.write(checkpointOutput);
    LongWritable longOutput = new LongWritable(9223372036854775807L);
    longOutput.write(checkpointOutput);
    Text textOutput = new Text("test");
    textOutput.write(checkpointOutput);
    FloatWritable floatOutput = new FloatWritable(3.14159F);
    floatOutput.write(checkpointOutput);
    checkpointOutput.close();
    assertThat(checkpointOutput, notNullValue());
}
Also used: CheckPointDataOutput (org.goldenorb.io.output.checkpoint.CheckPointDataOutput), FloatWritable (org.apache.hadoop.io.FloatWritable), OrbConfiguration (org.goldenorb.conf.OrbConfiguration), Text (org.apache.hadoop.io.Text), LongWritable (org.apache.hadoop.io.LongWritable), IntWritable (org.apache.hadoop.io.IntWritable)
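
Examples 73 and 74 are the two halves of one contract: fields come back in exactly the order they were written. The same contract can be exercised without an HDFS mini-cluster using Hadoop's in-memory buffers; a self-contained sketch (WritableRoundTrip is an illustrative name, not goldenorb code):

import org.apache.hadoop.io.DataInputBuffer;
import org.apache.hadoop.io.DataOutputBuffer;
import org.apache.hadoop.io.FloatWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;

public class WritableRoundTrip {
    public static void main(String[] args) throws Exception {
        // write the same sequence the checkpoint tests use
        DataOutputBuffer out = new DataOutputBuffer();
        new IntWritable(4).write(out);
        new LongWritable(9223372036854775807L).write(out);
        new Text("test").write(out);
        new FloatWritable(3.14159F).write(out);

        // read back in the same (FIFO) order
        DataInputBuffer in = new DataInputBuffer();
        in.reset(out.getData(), out.getLength());
        IntWritable intIn = new IntWritable();
        intIn.readFields(in);
        LongWritable longIn = new LongWritable();
        longIn.readFields(in);
        Text textIn = new Text();
        textIn.readFields(in);
        FloatWritable floatIn = new FloatWritable();
        floatIn.readFields(in);

        // prints: 4 9223372036854775807 test 3.14159
        System.out.println(intIn.get() + " " + longIn.get() + " " + textIn + " " + floatIn.get());
    }
}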

Example 75 with FloatWritable

Use of org.apache.hadoop.io.FloatWritable in project Cloud9 by lintool.

From class AnalyzeBigramRelativeFrequency, method main.

@SuppressWarnings({ "static-access" })
public static void main(String[] args) {
    Options options = new Options();
    options.addOption(OptionBuilder.withArgName("path").hasArg().withDescription("input path").create(INPUT));
    CommandLine cmdline = null;
    CommandLineParser parser = new GnuParser();
    try {
        cmdline = parser.parse(options, args);
    } catch (ParseException exp) {
        System.err.println("Error parsing command line: " + exp.getMessage());
        System.exit(-1);
    }
    if (!cmdline.hasOption(INPUT)) {
        System.out.println("args: " + Arrays.toString(args));
        HelpFormatter formatter = new HelpFormatter();
        formatter.setWidth(120);
        formatter.printHelp(AnalyzeBigramRelativeFrequency.class.getName(), options);
        ToolRunner.printGenericCommandUsage(System.out);
        System.exit(-1);
    }
    String inputPath = cmdline.getOptionValue(INPUT);
    System.out.println("input path: " + inputPath);
    List<PairOfWritables<PairOfStrings, FloatWritable>> pairs = SequenceFileUtils.readDirectory(new Path(inputPath));
    List<PairOfWritables<PairOfStrings, FloatWritable>> list1 = Lists.newArrayList();
    List<PairOfWritables<PairOfStrings, FloatWritable>> list2 = Lists.newArrayList();
    for (PairOfWritables<PairOfStrings, FloatWritable> p : pairs) {
        PairOfStrings bigram = p.getLeftElement();
        if (bigram.getLeftElement().equals("light")) {
            list1.add(p);
        }
        if (bigram.getLeftElement().equals("contain")) {
            list2.add(p);
        }
    }
    // sort by relative frequency (descending), breaking ties by the bigram itself
    Collections.sort(list1, new Comparator<PairOfWritables<PairOfStrings, FloatWritable>>() {

        public int compare(PairOfWritables<PairOfStrings, FloatWritable> e1, PairOfWritables<PairOfStrings, FloatWritable> e2) {
            if (e1.getRightElement().compareTo(e2.getRightElement()) == 0) {
                return e1.getLeftElement().compareTo(e2.getLeftElement());
            }
            return e2.getRightElement().compareTo(e1.getRightElement());
        }
    });
    Iterator<PairOfWritables<PairOfStrings, FloatWritable>> iter1 = Iterators.limit(list1.iterator(), 10);
    while (iter1.hasNext()) {
        PairOfWritables<PairOfStrings, FloatWritable> p = iter1.next();
        PairOfStrings bigram = p.getLeftElement();
        System.out.println(bigram + "\t" + p.getRightElement());
    }
    // same ordering for the "contain" list; see the shared-comparator sketch after this example
    Collections.sort(list2, new Comparator<PairOfWritables<PairOfStrings, FloatWritable>>() {

        public int compare(PairOfWritables<PairOfStrings, FloatWritable> e1, PairOfWritables<PairOfStrings, FloatWritable> e2) {
            if (e1.getRightElement().compareTo(e2.getRightElement()) == 0) {
                return e1.getLeftElement().compareTo(e2.getLeftElement());
            }
            return e2.getRightElement().compareTo(e1.getRightElement());
        }
    });
    Iterator<PairOfWritables<PairOfStrings, FloatWritable>> iter2 = Iterators.limit(list2.iterator(), 10);
    while (iter2.hasNext()) {
        PairOfWritables<PairOfStrings, FloatWritable> p = iter2.next();
        PairOfStrings bigram = p.getLeftElement();
        System.out.println(bigram + "\t" + p.getRightElement());
    }
}
Also used: Path (org.apache.hadoop.fs.Path), Options (org.apache.commons.cli.Options), GnuParser (org.apache.commons.cli.GnuParser), HelpFormatter (org.apache.commons.cli.HelpFormatter), CommandLine (org.apache.commons.cli.CommandLine), FloatWritable (org.apache.hadoop.io.FloatWritable), PairOfWritables (tl.lin.data.pair.PairOfWritables), PairOfStrings (tl.lin.data.pair.PairOfStrings), CommandLineParser (org.apache.commons.cli.CommandLineParser), ParseException (org.apache.commons.cli.ParseException)
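
The two anonymous comparators in main are identical, so they can be hoisted into a single shared constant inside AnalyzeBigramRelativeFrequency. A Java 8 sketch of that refactoring (BY_FREQUENCY_DESC is an illustrative name, not how Cloud9 ships the code):

// field on AnalyzeBigramRelativeFrequency; descending by relative frequency,
// ties broken by the bigram itself
private static final Comparator<PairOfWritables<PairOfStrings, FloatWritable>> BY_FREQUENCY_DESC =
        (e1, e2) -> {
            int byFrequency = e2.getRightElement().compareTo(e1.getRightElement());
            return byFrequency != 0 ? byFrequency : e1.getLeftElement().compareTo(e2.getLeftElement());
        };

// usage, replacing both anonymous classes:
//   Collections.sort(list1, BY_FREQUENCY_DESC);
//   Collections.sort(list2, BY_FREQUENCY_DESC);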

Aggregations

FloatWritable (org.apache.hadoop.io.FloatWritable): 111 usages
IntWritable (org.apache.hadoop.io.IntWritable): 68 usages
LongWritable (org.apache.hadoop.io.LongWritable): 65 usages
BooleanWritable (org.apache.hadoop.io.BooleanWritable): 54 usages
Text (org.apache.hadoop.io.Text): 51 usages
Test (org.junit.Test): 49 usages
DoubleWritable (org.apache.hadoop.hive.serde2.io.DoubleWritable): 44 usages
ShortWritable (org.apache.hadoop.hive.serde2.io.ShortWritable): 40 usages
BytesWritable (org.apache.hadoop.io.BytesWritable): 40 usages
ByteWritable (org.apache.hadoop.hive.serde2.io.ByteWritable): 37 usages
Writable (org.apache.hadoop.io.Writable): 28 usages
HiveDecimalWritable (org.apache.hadoop.hive.serde2.io.HiveDecimalWritable): 27 usages
ArrayList (java.util.ArrayList): 24 usages
Configuration (org.apache.hadoop.conf.Configuration): 18 usages
HiveCharWritable (org.apache.hadoop.hive.serde2.io.HiveCharWritable): 18 usages
ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector): 18 usages
Path (org.apache.hadoop.fs.Path): 17 usages
HiveChar (org.apache.hadoop.hive.common.type.HiveChar): 17 usages
HiveVarchar (org.apache.hadoop.hive.common.type.HiveVarchar): 17 usages
HiveVarcharWritable (org.apache.hadoop.hive.serde2.io.HiveVarcharWritable): 17 usages