
Example 26 with InputRow

Use of org.apache.druid.data.input.InputRow in project druid by druid-io.

From the class TransformerTest, method testTransformTimeColumn.

@Test
public void testTransformTimeColumn() {
    final Transformer transformer = new Transformer(
        new TransformSpec(
            null,
            ImmutableList.of(
                new ExpressionTransform("__time", "timestamp_shift(__time, 'P1D', -2)", TestExprMacroTable.INSTANCE)
            )
        )
    );
    final DateTime now = DateTimes.nowUtc();
    final InputRow row = new MapBasedInputRow(now, ImmutableList.of("dim"), ImmutableMap.of("__time", now, "dim", false));
    final InputRow actual = transformer.transform(row);
    Assert.assertNotNull(actual);
    Assert.assertEquals(now.minusDays(2), actual.getTimestamp());
}
Also used: MapBasedInputRow (org.apache.druid.data.input.MapBasedInputRow), InputRow (org.apache.druid.data.input.InputRow), DateTime (org.joda.time.DateTime), InitializedNullHandlingTest (org.apache.druid.testing.InitializedNullHandlingTest), Test (org.junit.Test)
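
The expression timestamp_shift(__time, 'P1D', -2) shifts the timestamp by the ISO-8601 period P1D applied -2 times, which is why the test expects now.minusDays(2). Below is a minimal sketch of the same arithmetic in plain Joda-Time; it is an illustration rather than Druid source, and it additionally assumes org.joda.time.Period is available.

// Illustrative only: what timestamp_shift(__time, 'P1D', -2) computes,
// i.e. the timestamp shifted by a one-day period, -2 times.
final DateTime now = DateTimes.nowUtc();
final DateTime shifted = now.plus(Period.days(1).multipliedBy(-2));
// Matches the expectation in the test above.
Assert.assertEquals(now.minusDays(2), shifted);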

Example 27 with InputRow

Use of org.apache.druid.data.input.InputRow in project druid by druid-io.

From the class TransformerTest, method testTransformWithStringTransformOnListColumnThrowingException.

@Ignore("Disabled until https://github.com/apache/druid/issues/9824 is fixed")
@Test
public void testTransformWithStringTransformOnListColumnThrowingException() {
    final Transformer transformer = new Transformer(
        new TransformSpec(
            null,
            ImmutableList.of(
                new ExpressionTransform("dim", "strlen(dim)", TestExprMacroTable.INSTANCE)
            )
        )
    );
    final InputRow row = new MapBasedInputRow(DateTimes.nowUtc(), ImmutableList.of("dim"), ImmutableMap.of("dim", ImmutableList.of(10, 20, 100)));
    final InputRow actual = transformer.transform(row);
    Assert.assertNotNull(actual);
    Assert.assertEquals(ImmutableList.of("dim"), actual.getDimensions());
    // Unlike for querying, Druid doesn't explode multi-valued columns automatically for ingestion.
    expectedException.expect(AssertionError.class);
    actual.getRaw("dim");
}
Also used: MapBasedInputRow (org.apache.druid.data.input.MapBasedInputRow), InputRow (org.apache.druid.data.input.InputRow), Ignore (org.junit.Ignore), InitializedNullHandlingTest (org.apache.druid.testing.InitializedNullHandlingTest), Test (org.junit.Test)
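
The test is disabled because applying strlen to a multi-valued (list) column currently fails during ingestion (see the linked issue). For contrast, here is a hypothetical sketch, not part of TransformerTest, of the same transform on a single-valued string column, where the expression evaluates normally; like the test, it relies on constructing Transformer directly from within the same package.

final Transformer transformer = new Transformer(
    new TransformSpec(
        null,
        ImmutableList.of(new ExpressionTransform("dim", "strlen(dim)", TestExprMacroTable.INSTANCE))
    )
);
// A single-valued string row; strlen("druid") evaluates to 5.
final InputRow row = new MapBasedInputRow(
    DateTimes.nowUtc(),
    ImmutableList.of("dim"),
    ImmutableMap.of("dim", "druid")
);
// getRaw returns the evaluated expression value instead of throwing.
final Object length = transformer.transform(row).getRaw("dim");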

Example 28 with InputRow

Use of org.apache.druid.data.input.InputRow in project druid by druid-io.

From the class HyperUniquesSerdeForTest, method getExtractor.

@Override
public ComplexMetricExtractor getExtractor() {
    return new ComplexMetricExtractor() {

        @Override
        public Class<HyperLogLogCollector> extractedClass() {
            return HyperLogLogCollector.class;
        }

        @Override
        public HyperLogLogCollector extractValue(InputRow inputRow, String metricName) {
            Object rawValue = inputRow.getRaw(metricName);
            if (rawValue instanceof HyperLogLogCollector) {
                return (HyperLogLogCollector) rawValue;
            } else {
                HyperLogLogCollector collector = HyperLogLogCollector.makeLatestCollector();
                List<String> dimValues = inputRow.getDimension(metricName);
                if (dimValues == null) {
                    return collector;
                }
                for (String dimensionValue : dimValues) {
                    collector.add(hashFn.hashBytes(StringUtils.toUtf8(dimensionValue)).asBytes());
                }
                return collector;
            }
        }
    };
}
Also used: HyperLogLogCollector (org.apache.druid.hll.HyperLogLogCollector), InputRow (org.apache.druid.data.input.InputRow)
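
A short usage sketch for the extractor above; it is illustrative rather than taken from the Druid test sources, and it assumes HyperUniquesSerdeForTest is constructed with a Guava HashFunction (for example Hashing.murmur3_128()), mirroring HyperUniquesSerde.

// Hypothetical call site: a plain string dimension is hashed into a fresh collector,
// because the raw value is not already a HyperLogLogCollector.
final HyperUniquesSerdeForTest serde = new HyperUniquesSerdeForTest(Hashing.murmur3_128());
final ComplexMetricExtractor extractor = serde.getExtractor();
final InputRow row = new MapBasedInputRow(
    DateTimes.nowUtc(),
    ImmutableList.of("user"),
    ImmutableMap.of("user", "alice")
);
final HyperLogLogCollector collector =
    (HyperLogLogCollector) extractor.extractValue(row, "user");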

Example 29 with InputRow

Use of org.apache.druid.data.input.InputRow in project druid by druid-io.

From the class TransformSpecTest, method testTransformTimeFromOtherFields.

@Test
public void testTransformTimeFromOtherFields() {
    final TransformSpec transformSpec = new TransformSpec(
        null,
        ImmutableList.of(
            new ExpressionTransform("__time", "(a + b) * 3600000", TestExprMacroTable.INSTANCE)
        )
    );
    Assert.assertEquals(ImmutableSet.of("a", "b"), transformSpec.getRequiredColumns());
    final InputRowParser<Map<String, Object>> parser = transformSpec.decorate(PARSER);
    final InputRow row = parser.parseBatch(ROW1).get(0);
    Assert.assertNotNull(row);
    Assert.assertEquals(DateTimes.of("1970-01-01T05:00:00Z"), row.getTimestamp());
    Assert.assertEquals(DateTimes.of("1970-01-01T05:00:00Z").getMillis(), row.getTimestampFromEpoch());
}
Also used: InputRow (org.apache.druid.data.input.InputRow), ImmutableMap (com.google.common.collect.ImmutableMap), Map (java.util.Map), InitializedNullHandlingTest (org.apache.druid.testing.InitializedNullHandlingTest), Test (org.junit.Test)
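
ROW1 and PARSER are fields of the surrounding TransformSpecTest class and are not shown in this excerpt. A plausible reconstruction is sketched below, consistent with the assertions above (a + b must equal 5 to produce the 05:00 timestamp); the concrete field names and values are assumptions, not the actual test fixtures.

// Hypothetical fixtures; the real TransformSpecTest fields may differ.
private static final MapInputRowParser PARSER = new MapInputRowParser(
    new TimeAndDimsParseSpec(
        new TimestampSpec("t", "auto", DateTimes.of("2000-01-01")),
        new DimensionsSpec(DimensionsSpec.getDefaultSchemas(ImmutableList.of("dim")))
    )
);

// (a + b) * 3600000 = 5 * 3600000 ms since epoch, i.e. 1970-01-01T05:00:00Z.
private static final Map<String, Object> ROW1 = ImmutableMap.<String, Object>builder()
    .put("t", "2000-01-01")
    .put("dim", "x")
    .put("a", 2.0)
    .put("b", 3.0)
    .build();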

Example 30 with InputRow

Use of org.apache.druid.data.input.InputRow in project druid by druid-io.

From the class IndexMergerTestBase, method persistAndLoad.

private QueryableIndex persistAndLoad(List<DimensionSchema> schema, InputRow... rows) throws IOException {
    IncrementalIndex toPersist = IncrementalIndexTest.createIndex(null, new DimensionsSpec(schema));
    for (InputRow row : rows) {
        toPersist.add(row);
    }
    final File tempDir = temporaryFolder.newFolder();
    return closer.closeLater(indexIO.loadIndex(indexMerger.persist(toPersist, tempDir, indexSpec, null)));
}
Also used: IncrementalIndex (org.apache.druid.segment.incremental.IncrementalIndex), OnheapIncrementalIndex (org.apache.druid.segment.incremental.OnheapIncrementalIndex), InputRow (org.apache.druid.data.input.InputRow), MapBasedInputRow (org.apache.druid.data.input.MapBasedInputRow), DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec), File (java.io.File)
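
A hypothetical call site for this helper (not taken from IndexMergerTestBase): two map-based rows with a single string dimension are persisted and reloaded as a QueryableIndex.

// Hypothetical usage; the schema name and row values are illustrative.
final List<DimensionSchema> schema = ImmutableList.of(new StringDimensionSchema("dim"));
final InputRow r1 = new MapBasedInputRow(
    DateTimes.of("2014-01-01"),
    ImmutableList.of("dim"),
    ImmutableMap.of("dim", "a")
);
final InputRow r2 = new MapBasedInputRow(
    DateTimes.of("2014-01-01"),
    ImmutableList.of("dim"),
    ImmutableMap.of("dim", "b")
);
final QueryableIndex index = persistAndLoad(schema, r1, r2);
// The two rows carry different dimension values, so both survive rollup.
Assert.assertEquals(2, index.getNumRows());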

Aggregations

InputRow (org.apache.druid.data.input.InputRow): 266 usages
Test (org.junit.Test): 193 usages
MapBasedInputRow (org.apache.druid.data.input.MapBasedInputRow): 57 usages
InputEntityReader (org.apache.druid.data.input.InputEntityReader): 54 usages
InputRowSchema (org.apache.druid.data.input.InputRowSchema): 52 usages
DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec): 52 usages
TimestampSpec (org.apache.druid.data.input.impl.TimestampSpec): 49 usages
ArrayList (java.util.ArrayList): 46 usages
List (java.util.List): 37 usages
ImmutableList (com.google.common.collect.ImmutableList): 33 usages
JSONPathSpec (org.apache.druid.java.util.common.parsers.JSONPathSpec): 33 usages
InitializedNullHandlingTest (org.apache.druid.testing.InitializedNullHandlingTest): 33 usages
InputRowListPlusRawValues (org.apache.druid.data.input.InputRowListPlusRawValues): 29 usages
File (java.io.File): 27 usages
HadoopDruidIndexerConfig (org.apache.druid.indexer.HadoopDruidIndexerConfig): 27 usages
JSONPathFieldSpec (org.apache.druid.java.util.common.parsers.JSONPathFieldSpec): 27 usages
DateTime (org.joda.time.DateTime): 24 usages
Map (java.util.Map): 23 usages
IOException (java.io.IOException): 18 usages
Interval (org.joda.time.Interval): 18 usages