Example 11 with AccumuloHiveRow

Use of org.apache.hadoop.hive.accumulo.AccumuloHiveRow in project hive by apache.

From the class TestHiveAccumuloTableInputFormat, method testGetOnlyName.

@Test
public void testGetOnlyName() throws Exception {
    FileInputFormat.addInputPath(conf, new Path("unused"));
    InputSplit[] splits = inputformat.getSplits(conf, 0);
    assertEquals(1, splits.length);
    RecordReader<Text, AccumuloHiveRow> reader = inputformat.getRecordReader(splits[0], conf, null);
    // First row: r1 should carry the NAME qualifier with value "brian".
    Text rowId = new Text("r1");
    AccumuloHiveRow row = new AccumuloHiveRow();
    assertTrue(reader.next(rowId, row));
    assertEquals(rowId.toString(), row.getRowId());
    assertTrue(row.hasFamAndQual(COLUMN_FAMILY, NAME));
    assertArrayEquals("brian".getBytes(), row.getValue(COLUMN_FAMILY, NAME));
    // Second row: r2 -> "mark".
    rowId = new Text("r2");
    assertTrue(reader.next(rowId, row));
    assertEquals(rowId.toString(), row.getRowId());
    assertTrue(row.hasFamAndQual(COLUMN_FAMILY, NAME));
    assertArrayEquals("mark".getBytes(), row.getValue(COLUMN_FAMILY, NAME));
    // Third row: r3 -> "dennis".
    rowId = new Text("r3");
    assertTrue(reader.next(rowId, row));
    assertEquals(rowId.toString(), row.getRowId());
    assertTrue(row.hasFamAndQual(COLUMN_FAMILY, NAME));
    assertArrayEquals("dennis".getBytes(), row.getValue(COLUMN_FAMILY, NAME));
    // No fourth row remains.
    assertFalse(reader.next(rowId, row));
}
Also used: Path (org.apache.hadoop.fs.Path), Text (org.apache.hadoop.io.Text), InputSplit (org.apache.hadoop.mapred.InputSplit), AccumuloHiveRow (org.apache.hadoop.hive.accumulo.AccumuloHiveRow), Test (org.junit.Test)
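The test above reads cells into an AccumuloHiveRow and queries them by column family and qualifier. As a rough, self-contained sketch of that storage model (a hypothetical SimpleHiveRow standing in for AccumuloHiveRow, built only on the JDK, not the real Hive class):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for AccumuloHiveRow: a row id plus a list of
// (family, qualifier, value) cells, queried by family/qualifier pair.
public class SimpleHiveRow {
    private String rowId;
    private final List<String[]> cells = new ArrayList<>(); // {fam, qual, value}

    public void setRowId(String rowId) { this.rowId = rowId; }
    public String getRowId() { return rowId; }

    public void add(String fam, String qual, String value) {
        cells.add(new String[] { fam, qual, value });
    }

    public boolean hasFamAndQual(String fam, String qual) {
        for (String[] c : cells)
            if (c[0].equals(fam) && c[1].equals(qual)) return true;
        return false;
    }

    public String getValue(String fam, String qual) {
        for (String[] c : cells)
            if (c[0].equals(fam) && c[1].equals(qual)) return c[2];
        return null; // no matching cell
    }

    public static void main(String[] args) {
        SimpleHiveRow row = new SimpleHiveRow();
        row.setRowId("r1");
        row.add("cf", "name", "brian");
        System.out.println(row.hasFamAndQual("cf", "name")); // true
        System.out.println(row.getValue("cf", "name"));      // brian
    }
}
```

The real class stores byte arrays (hence assertArrayEquals in the test) and is Writable for MapReduce; this sketch keeps only the lookup shape the assertions exercise.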

Example 12 with AccumuloHiveRow

Use of org.apache.hadoop.hive.accumulo.AccumuloHiveRow in project hive by apache.

From the class TestAccumuloSerDe, method invalidColMapping.

@Test(expected = InvalidColumnMappingException.class)
public void invalidColMapping() throws Exception {
    Properties properties = new Properties();
    Configuration conf = new Configuration();
    // The first entry, "cf", names a bare column family with no qualifier,
    // which is not a valid mapping for these columns.
    properties.setProperty(AccumuloSerDeParameters.COLUMN_MAPPINGS, "cf,cf:f2,cf:f3");
    properties.setProperty(serdeConstants.LIST_COLUMNS, "field2,field3,field4");
    serde.initialize(conf, properties);
    AccumuloHiveRow row = new AccumuloHiveRow();
    row.setRowId("r1");
    Object obj = serde.deserialize(row);
    assertTrue(obj instanceof LazyAccumuloRow);
    LazyAccumuloRow lazyRow = (LazyAccumuloRow) obj;
    // Accessing a field forces evaluation of the bad mapping, which is
    // expected to throw InvalidColumnMappingException.
    lazyRow.getField(0);
}
Also used: Configuration (org.apache.hadoop.conf.Configuration), LazyAccumuloRow (org.apache.hadoop.hive.accumulo.LazyAccumuloRow), Properties (java.util.Properties), AccumuloHiveRow (org.apache.hadoop.hive.accumulo.AccumuloHiveRow), Test (org.junit.Test)
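The test hinges on the column-mapping string: each comma-separated entry is expected to pair a family with a qualifier, and the bare "cf" entry breaks that. A minimal, self-contained sketch of that kind of check (a hypothetical validator using only the JDK, not the real AccumuloSerDe parser):

```java
// Hypothetical sketch of the sort of validation a column-mapping parser
// performs; the real AccumuloSerDe logic is more involved (types, maps,
// rowID handling) and throws InvalidColumnMappingException on failure.
public class ColumnMappingCheck {
    static final String ROW_ID = ":rowID"; // special token mapping the Accumulo row id

    // Each entry must be the :rowID token or a "family:qualifier" pair
    // with non-empty parts on both sides of the colon.
    public static boolean isValid(String mappings) {
        for (String entry : mappings.split(",")) {
            if (entry.equals(ROW_ID)) continue;
            int colon = entry.indexOf(':');
            if (colon <= 0 || colon == entry.length() - 1) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isValid("cf,cf:f2,cf:f3"));     // false: "cf" has no qualifier
        System.out.println(isValid(":rowID,cf:f2,cf:f3")); // true
    }
}
```

Under this sketch, the test's mapping "cf,cf:f2,cf:f3" fails on its first entry, matching the exception the test expects.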

Aggregations

AccumuloHiveRow (org.apache.hadoop.hive.accumulo.AccumuloHiveRow): 12
Test (org.junit.Test): 12
LazyString (org.apache.hadoop.hive.serde2.lazy.LazyString): 7
Properties (java.util.Properties): 6
Configuration (org.apache.hadoop.conf.Configuration): 6
Path (org.apache.hadoop.fs.Path): 6
LazyAccumuloRow (org.apache.hadoop.hive.accumulo.LazyAccumuloRow): 6
Text (org.apache.hadoop.io.Text): 6
InputSplit (org.apache.hadoop.mapred.InputSplit): 6
Mutation (org.apache.accumulo.core.data.Mutation): 4
BatchWriter (org.apache.accumulo.core.client.BatchWriter): 3
BatchWriterConfig (org.apache.accumulo.core.client.BatchWriterConfig): 3
Value (org.apache.accumulo.core.data.Value): 3
Authorizations (org.apache.accumulo.core.security.Authorizations): 3
ByteArrayRef (org.apache.hadoop.hive.serde2.lazy.ByteArrayRef): 3
ByteArrayOutputStream (java.io.ByteArrayOutputStream): 2
Date (java.sql.Date): 2
Timestamp (java.sql.Timestamp): 2
Connector (org.apache.accumulo.core.client.Connector): 2
MockInstance (org.apache.accumulo.core.client.mock.MockInstance): 2