Example 11 with DateTime

use of org.joda.time.DateTime in project druid by druid-io.

the class MapVirtualColumnTest method constructorFeeder.

@Parameterized.Parameters
public static Iterable<Object[]> constructorFeeder() throws IOException {
    final Supplier<SelectQueryConfig> selectConfigSupplier = Suppliers.ofInstance(new SelectQueryConfig(true));
    SelectQueryRunnerFactory factory = new SelectQueryRunnerFactory(
        new SelectQueryQueryToolChest(
            new DefaultObjectMapper(),
            QueryRunnerTestHelper.NoopIntervalChunkingQueryRunnerDecorator(),
            selectConfigSupplier
        ),
        new SelectQueryEngine(selectConfigSupplier),
        QueryRunnerTestHelper.NOOP_QUERYWATCHER
    );
    final IncrementalIndexSchema schema = new IncrementalIndexSchema.Builder()
        .withMinTimestamp(new DateTime("2011-01-12T00:00:00.000Z").getMillis())
        .withQueryGranularity(Granularities.NONE)
        .build();
    final IncrementalIndex index = new OnheapIncrementalIndex(schema, true, 10000);
    final StringInputRowParser parser = new StringInputRowParser(
        new DelimitedParseSpec(
            new TimestampSpec("ts", "iso", null),
            new DimensionsSpec(DimensionsSpec.getDefaultSchemas(Arrays.asList("dim", "keys", "values")), null, null),
            "\t",
            ",",
            Arrays.asList("ts", "dim", "keys", "values")
        ),
        "utf8"
    );
    CharSource input = CharSource.wrap(
        "2011-01-12T00:00:00.000Z\ta\tkey1,key2,key3\tvalue1,value2,value3\n"
        + "2011-01-12T00:00:00.000Z\tb\tkey4,key5,key6\tvalue4\n"
        + "2011-01-12T00:00:00.000Z\tc\tkey1,key5\tvalue1,value5,value9\n"
    );
    IncrementalIndex index1 = TestIndex.loadIncrementalIndex(index, input, parser);
    QueryableIndex index2 = TestIndex.persistRealtimeAndLoadMMapped(index1);
    return transformToConstructionFeeder(
        Arrays.asList(
            makeQueryRunner(factory, "index1", new IncrementalIndexSegment(index1, "index1"), "incremental"),
            makeQueryRunner(factory, "index2", new QueryableIndexSegment("index2", index2), "queryable")
        )
    );
}
Also used: CharSource(com.google.common.io.CharSource), IncrementalIndex(io.druid.segment.incremental.IncrementalIndex), OnheapIncrementalIndex(io.druid.segment.incremental.OnheapIncrementalIndex), DelimitedParseSpec(io.druid.data.input.impl.DelimitedParseSpec), SelectQueryRunnerFactory(io.druid.query.select.SelectQueryRunnerFactory), SelectQueryConfig(io.druid.query.select.SelectQueryConfig), DateTime(org.joda.time.DateTime), SelectQueryQueryToolChest(io.druid.query.select.SelectQueryQueryToolChest), SelectQueryEngine(io.druid.query.select.SelectQueryEngine), StringInputRowParser(io.druid.data.input.impl.StringInputRowParser), TimestampSpec(io.druid.data.input.impl.TimestampSpec), DimensionsSpec(io.druid.data.input.impl.DimensionsSpec), DefaultObjectMapper(io.druid.jackson.DefaultObjectMapper), IncrementalIndexSchema(io.druid.segment.incremental.IncrementalIndexSchema)
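
The schema's minimum timestamp above comes straight from Joda-Time's ISO-8601 string constructor. A minimal standalone sketch of that conversion (the literal value is taken from the test; the class name here is ours):

import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;

public class MinTimestampSketch {
    public static void main(String[] args) {
        // Joda-Time parses ISO-8601 strings directly; the trailing 'Z' pins the instant to UTC.
        DateTime minTimestamp = new DateTime("2011-01-12T00:00:00.000Z", DateTimeZone.UTC);
        // getMillis() yields the epoch milliseconds that IncrementalIndexSchema.Builder expects.
        System.out.println(minTimestamp.getMillis()); // 1294790400000
    }
}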

Example 12 with DateTime

use of org.joda.time.DateTime in project druid by druid-io.

the class AvroStreamInputRowParser method parseGenericRecord.

protected static InputRow parseGenericRecord(GenericRecord record, ParseSpec parseSpec, List<String> dimensions, boolean fromPigAvroStorage, boolean binaryAsString) {
    GenericRecordAsMap genericRecordAsMap = new GenericRecordAsMap(record, fromPigAvroStorage, binaryAsString);
    TimestampSpec timestampSpec = parseSpec.getTimestampSpec();
    DateTime dateTime = timestampSpec.extractTimestamp(genericRecordAsMap);
    return new MapBasedInputRow(dateTime, dimensions, genericRecordAsMap);
}
Also used: GenericRecordAsMap(io.druid.data.input.avro.GenericRecordAsMap), TimestampSpec(io.druid.data.input.impl.TimestampSpec), DateTime(org.joda.time.DateTime)
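
The heavy lifting above happens in TimestampSpec.extractTimestamp, which resolves the configured column and format against any Map view of the record (GenericRecordAsMap is exactly such a view over an Avro record). A minimal sketch of that call in isolation, with an illustrative column name and value:

import com.google.common.collect.ImmutableMap;
import io.druid.data.input.impl.TimestampSpec;
import org.joda.time.DateTime;

public class TimestampSpecSketch {
    public static void main(String[] args) {
        // "ts" is the timestamp column, "iso" the format; null means no fallback for missing values.
        TimestampSpec spec = new TimestampSpec("ts", "iso", null);
        DateTime dateTime = spec.extractTimestamp(ImmutableMap.<String, Object>of("ts", "2011-01-12T00:00:00.000Z"));
        System.out.println(dateTime.getMillis()); // 1294790400000
    }
}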

Example 13 with DateTime

use of org.joda.time.DateTime in project druid by druid-io.

the class OrcHadoopInputRowParser method parse.

@Override
public InputRow parse(OrcStruct input) {
    Map<String, Object> map = Maps.newHashMap();
    List<? extends StructField> fields = oip.getAllStructFieldRefs();
    for (StructField field : fields) {
        ObjectInspector objectInspector = field.getFieldObjectInspector();
        switch (objectInspector.getCategory()) {
            case PRIMITIVE:
                PrimitiveObjectInspector primitiveObjectInspector = (PrimitiveObjectInspector) objectInspector;
                map.put(field.getFieldName(), primitiveObjectInspector.getPrimitiveJavaObject(oip.getStructFieldData(input, field)));
                break;
            case LIST: // array case - only 1-depth arrays are supported so far
                ListObjectInspector listObjectInspector = (ListObjectInspector) objectInspector;
                map.put(field.getFieldName(), getListObject(listObjectInspector, oip.getStructFieldData(input, field)));
                break;
            default:
                break;
        }
    }
    TimestampSpec timestampSpec = parseSpec.getTimestampSpec();
    DateTime dateTime = timestampSpec.extractTimestamp(map);
    return new MapBasedInputRow(dateTime, dimensions, map);
}
Also used: ListObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector), PrimitiveObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector), ObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector), StructObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector), StructField(org.apache.hadoop.hive.serde2.objectinspector.StructField), TimestampSpec(io.druid.data.input.impl.TimestampSpec), MapBasedInputRow(io.druid.data.input.MapBasedInputRow), DateTime(org.joda.time.DateTime)
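
The getListObject helper called in the LIST branch is not shown in the snippet. A hedged sketch of what it plausibly does, assuming it simply materializes a one-level ORC list through the element inspector (the real Druid implementation may differ in null handling and supported element types):

import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;

public class GetListObjectSketch {
    static Object getListObject(ListObjectInspector listObjectInspector, Object listObject) {
        List<?> items = listObjectInspector.getList(listObject);
        if (items == null) {
            return null;
        }
        ObjectInspector elementInspector = listObjectInspector.getListElementObjectInspector();
        if (elementInspector.getCategory() == ObjectInspector.Category.PRIMITIVE) {
            // Unwrap each element into its plain Java equivalent, mirroring the PRIMITIVE branch above.
            PrimitiveObjectInspector primitive = (PrimitiveObjectInspector) elementInspector;
            List<Object> unwrapped = new ArrayList<>(items.size());
            for (Object item : items) {
                unwrapped.add(primitive.getPrimitiveJavaObject(item));
            }
            return unwrapped;
        }
        // Nested lists are not handled, matching the "only 1-depth arrays" limitation noted above.
        return items;
    }
}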

Example 14 with DateTime

use of org.joda.time.DateTime in project druid by druid-io.

the class OrcIndexGeneratorJobTest method verifyJob.

private void verifyJob(IndexGeneratorJob job) throws IOException {
    JobHelper.runJobs(ImmutableList.<Jobby>of(job), config);
    int segmentNum = 0;
    for (DateTime currTime = interval.getStart(); currTime.isBefore(interval.getEnd()); currTime = currTime.plusDays(1)) {
        Integer[][] shardInfo = shardInfoForEachSegment[segmentNum++];
        File segmentOutputFolder = new File(
            String.format(
                "%s/%s/%s_%s/%s",
                config.getSchema().getIOConfig().getSegmentOutputPath(),
                config.getSchema().getDataSchema().getDataSource(),
                currTime.toString(),
                currTime.plusDays(1).toString(),
                config.getSchema().getTuningConfig().getVersion()
            )
        );
        Assert.assertTrue(segmentOutputFolder.exists());
        Assert.assertEquals(shardInfo.length, segmentOutputFolder.list().length);
        int rowCount = 0;
        for (int partitionNum = 0; partitionNum < shardInfo.length; ++partitionNum) {
            File individualSegmentFolder = new File(segmentOutputFolder, Integer.toString(partitionNum));
            Assert.assertTrue(individualSegmentFolder.exists());
            File descriptor = new File(individualSegmentFolder, "descriptor.json");
            File indexZip = new File(individualSegmentFolder, "index.zip");
            Assert.assertTrue(descriptor.exists());
            Assert.assertTrue(indexZip.exists());
            DataSegment dataSegment = mapper.readValue(descriptor, DataSegment.class);
            Assert.assertEquals(config.getSchema().getTuningConfig().getVersion(), dataSegment.getVersion());
            Assert.assertEquals(new Interval(currTime, currTime.plusDays(1)), dataSegment.getInterval());
            Assert.assertEquals("local", dataSegment.getLoadSpec().get("type"));
            Assert.assertEquals(indexZip.getCanonicalPath(), dataSegment.getLoadSpec().get("path"));
            Assert.assertEquals(Integer.valueOf(9), dataSegment.getBinaryVersion());
            Assert.assertEquals(dataSourceName, dataSegment.getDataSource());
            Assert.assertTrue(dataSegment.getDimensions().size() == 1);
            String[] dimensions = dataSegment.getDimensions().toArray(new String[dataSegment.getDimensions().size()]);
            Arrays.sort(dimensions);
            Assert.assertEquals("host", dimensions[0]);
            Assert.assertEquals("visited_num", dataSegment.getMetrics().get(0));
            Assert.assertEquals("unique_hosts", dataSegment.getMetrics().get(1));
            Integer[] hashShardInfo = shardInfo[partitionNum];
            HashBasedNumberedShardSpec spec = (HashBasedNumberedShardSpec) dataSegment.getShardSpec();
            Assert.assertEquals((int) hashShardInfo[0], spec.getPartitionNum());
            Assert.assertEquals((int) hashShardInfo[1], spec.getPartitions());
            File dir = Files.createTempDir();
            unzip(indexZip, dir);
            QueryableIndex index = HadoopDruidIndexerConfig.INDEX_IO.loadIndex(dir);
            QueryableIndexIndexableAdapter adapter = new QueryableIndexIndexableAdapter(index);
            for (Rowboat row : adapter.getRows()) {
                Object[] metrics = row.getMetrics();
                rowCount++;
                Assert.assertTrue(metrics.length == 2);
            }
        }
        Assert.assertEquals(rowCount, data.size());
    }
}
Also used: HashBasedNumberedShardSpec(io.druid.timeline.partition.HashBasedNumberedShardSpec), DataSegment(io.druid.timeline.DataSegment), DateTime(org.joda.time.DateTime), QueryableIndexIndexableAdapter(io.druid.segment.QueryableIndexIndexableAdapter), QueryableIndex(io.druid.segment.QueryableIndex), OrcFile(org.apache.orc.OrcFile), File(java.io.File), Rowboat(io.druid.segment.Rowboat), Interval(org.joda.time.Interval)
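
The outer loop above leans on Joda-Time's immutable arithmetic to walk the interval one day at a time, pairing each day with the segment directory named <start>_<end>. A standalone sketch of that pattern with illustrative interval bounds:

import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;
import org.joda.time.Interval;

public class IntervalWalkSketch {
    public static void main(String[] args) {
        Interval interval = new Interval(
            new DateTime("2014-10-22T00:00:00Z", DateTimeZone.UTC),
            new DateTime("2014-10-24T00:00:00Z", DateTimeZone.UTC)
        );
        // DateTime is immutable: plusDays(1) returns a fresh instant, so the loop variable is reassigned each step.
        for (DateTime curr = interval.getStart(); curr.isBefore(interval.getEnd()); curr = curr.plusDays(1)) {
            System.out.println(new Interval(curr, curr.plusDays(1))); // one segment interval per day
        }
    }
}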

Example 15 with DateTime

use of org.joda.time.DateTime in project druid by druid-io.

the class StatsDEmitterTest method testNoConvertRange.

@Test
public void testNoConvertRange() {
    StatsDClient client = createMock(StatsDClient.class);
    StatsDEmitter emitter = new StatsDEmitter(new StatsDEmitterConfig("localhost", 8888, null, null, null, null), new ObjectMapper(), client);
    client.time("broker.query.time.data-source.groupBy", 10);
    replay(client);
    emitter.emit(
        new ServiceMetricEvent.Builder()
            .setDimension("dataSource", "data-source")
            .setDimension("type", "groupBy")
            .setDimension("interval", "2013/2015")
            .setDimension("some_random_dim1", "random_dim_value1")
            .setDimension("some_random_dim2", "random_dim_value2")
            .setDimension("hasFilters", "no")
            .setDimension("duration", "P1D")
            .setDimension("remoteAddress", "194.0.90.2")
            .setDimension("id", "ID")
            .setDimension("context", "{context}")
            .build(new DateTime(), "query/time", 10)
            .build("broker", "brokerHost1")
    );
    verify(client);
}
Also used: StatsDClient(com.timgroup.statsd.StatsDClient), StatsDEmitter(io.druid.emitter.statsd.StatsDEmitter), ServiceMetricEvent(com.metamx.emitter.service.ServiceMetricEvent), ObjectMapper(com.fasterxml.jackson.databind.ObjectMapper), DateTime(org.joda.time.DateTime), StatsDEmitterConfig(io.druid.emitter.statsd.StatsDEmitterConfig), Test(org.junit.Test)
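
Note that the event above is stamped with new DateTime(), which captures the current instant in the JVM's default time zone; tests that compare rendered timestamps usually pin the zone explicitly. A tiny sketch of the distinction (the printed values are illustrative):

import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;

public class NowSketch {
    public static void main(String[] args) {
        // Same instant, two renderings: the no-arg constructor uses the JVM default zone,
        // while an explicit zone makes output reproducible across machines.
        System.out.println(new DateTime());                 // e.g. 2017-03-01T10:15:30.000-08:00
        System.out.println(new DateTime(DateTimeZone.UTC)); // e.g. 2017-03-01T18:15:30.000Z
    }
}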

Aggregations

DateTime (org.joda.time.DateTime): 3381 usages
Test (org.junit.Test): 1000 usages
Test (org.testng.annotations.Test): 499 usages
DateTimeRfc1123 (com.microsoft.rest.DateTimeRfc1123): 349 usages
ResponseBody (okhttp3.ResponseBody): 332 usages
ArrayList (java.util.ArrayList): 299 usages
LocalDate (org.joda.time.LocalDate): 256 usages
Date (java.util.Date): 239 usages
Interval (org.joda.time.Interval): 200 usages
Result (io.druid.query.Result): 153 usages
ServiceCall (com.microsoft.rest.ServiceCall): 148 usages
HashMap (java.util.HashMap): 144 usages
BigDecimal (java.math.BigDecimal): 132 usages
List (java.util.List): 131 usages
DateTimeZone (org.joda.time.DateTimeZone): 127 usages
LocalDateTime (org.joda.time.LocalDateTime): 98 usages
UUID (java.util.UUID): 93 usages
DateTimeFormatter (org.joda.time.format.DateTimeFormatter): 88 usages
IOException (java.io.IOException): 85 usages
Map (java.util.Map): 85 usages