Example 6 with SpatialDimFilter

Use of org.apache.druid.query.filter.SpatialDimFilter in project druid by druid-io.

The class SpatialFilterBonusTest, method testSpatialQueryMorePoints.

@Test
public void testSpatialQueryMorePoints() {
    TimeseriesQuery query = Druids.newTimeseriesQueryBuilder()
        .dataSource("test")
        .granularity(Granularities.DAY)
        .intervals(Collections.singletonList(Intervals.of("2013-01-01/2013-01-07")))
        .filters(new SpatialDimFilter("dim.geo", new RectangularBound(new float[] { 0.0f, 0.0f }, new float[] { 9.0f, 9.0f })))
        .aggregators(Arrays.asList(new CountAggregatorFactory("rows"), new LongSumAggregatorFactory("val", "val")))
        .build();
    List<Result<TimeseriesResultValue>> expectedResults = Arrays.asList(
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-01T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 17L).build())),
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-02T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 29L).build())),
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-03T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 13L).build())),
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-04T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 91L).build())),
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-05T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 47L).build()))
    );
    try {
        TimeseriesQueryRunnerFactory factory = new TimeseriesQueryRunnerFactory(
            new TimeseriesQueryQueryToolChest(),
            new TimeseriesQueryEngine(),
            QueryRunnerTestHelper.NOOP_QUERYWATCHER
        );
        QueryRunner runner = new FinalizeResultsQueryRunner(factory.createRunner(segment), factory.getToolchest());
        TestHelper.assertExpectedResults(expectedResults, runner.run(QueryPlus.wrap(query)));
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
Also used : TimeseriesResultValue(org.apache.druid.query.timeseries.TimeseriesResultValue) TimeseriesQuery(org.apache.druid.query.timeseries.TimeseriesQuery) LongSumAggregatorFactory(org.apache.druid.query.aggregation.LongSumAggregatorFactory) TimeseriesQueryQueryToolChest(org.apache.druid.query.timeseries.TimeseriesQueryQueryToolChest) QueryRunner(org.apache.druid.query.QueryRunner) FinalizeResultsQueryRunner(org.apache.druid.query.FinalizeResultsQueryRunner) IOException(java.io.IOException) RectangularBound(org.apache.druid.collections.spatial.search.RectangularBound) Result(org.apache.druid.query.Result) TimeseriesQueryEngine(org.apache.druid.query.timeseries.TimeseriesQueryEngine) SpatialDimFilter(org.apache.druid.query.filter.SpatialDimFilter) TimeseriesQueryRunnerFactory(org.apache.druid.query.timeseries.TimeseriesQueryRunnerFactory) CountAggregatorFactory(org.apache.druid.query.aggregation.CountAggregatorFactory) Test(org.junit.Test)
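
The builder call above constructs the filter programmatically; the same filter also has a native JSON form in Druid queries. The following is a minimal sketch, not taken from the test, that serializes the filter with a plain Jackson ObjectMapper; the exact JSON layout (a "spatial" type wrapping a "rectangular" bound) is an assumption based on the class's Jackson annotations, and real code might prefer Druid's own configured mapper.

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.druid.collections.spatial.search.RectangularBound;
import org.apache.druid.query.filter.DimFilter;
import org.apache.druid.query.filter.SpatialDimFilter;

public class SpatialFilterJsonSketch {
    public static void main(String[] args) throws Exception {
        // Same filter as in the test above: points inside the box (0,0)-(9,9).
        DimFilter filter = new SpatialDimFilter(
            "dim.geo",
            new RectangularBound(new float[] { 0.0f, 0.0f }, new float[] { 9.0f, 9.0f })
        );
        // Expected (assumed) output shape, roughly:
        // {"type":"spatial","dimension":"dim.geo","bound":{"type":"rectangular",...}}
        System.out.println(new ObjectMapper().writeValueAsString(filter));
    }
}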

Example 7 with SpatialDimFilter

Use of org.apache.druid.query.filter.SpatialDimFilter in project druid by druid-io.

The class SpatialFilterTest, method testSpatialQueryMorePoints.

@Test
public void testSpatialQueryMorePoints() {
    TimeseriesQuery query = Druids.newTimeseriesQueryBuilder()
        .dataSource("test")
        .granularity(Granularities.DAY)
        .intervals(Collections.singletonList(Intervals.of("2013-01-01/2013-01-07")))
        .filters(new SpatialDimFilter("dim.geo", new RectangularBound(new float[] { 0.0f, 0.0f }, new float[] { 9.0f, 9.0f })))
        .aggregators(Arrays.asList(new CountAggregatorFactory("rows"), new LongSumAggregatorFactory("val", "val")))
        .build();
    List<Result<TimeseriesResultValue>> expectedResults = Arrays.asList(
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-01T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 17L).build())),
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-02T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 29L).build())),
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-03T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 13L).build())),
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-04T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 91L).build())),
        new Result<TimeseriesResultValue>(DateTimes.of("2013-01-05T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 47L).build()))
    );
    try {
        TimeseriesQueryRunnerFactory factory = new TimeseriesQueryRunnerFactory(
            new TimeseriesQueryQueryToolChest(),
            new TimeseriesQueryEngine(),
            QueryRunnerTestHelper.NOOP_QUERYWATCHER
        );
        QueryRunner runner = new FinalizeResultsQueryRunner(factory.createRunner(segment), factory.getToolchest());
        TestHelper.assertExpectedResults(expectedResults, runner.run(QueryPlus.wrap(query)));
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
Also used : TimeseriesResultValue(org.apache.druid.query.timeseries.TimeseriesResultValue) TimeseriesQuery(org.apache.druid.query.timeseries.TimeseriesQuery) LongSumAggregatorFactory(org.apache.druid.query.aggregation.LongSumAggregatorFactory) TimeseriesQueryQueryToolChest(org.apache.druid.query.timeseries.TimeseriesQueryQueryToolChest) QueryRunner(org.apache.druid.query.QueryRunner) FinalizeResultsQueryRunner(org.apache.druid.query.FinalizeResultsQueryRunner) IOException(java.io.IOException) RectangularBound(org.apache.druid.collections.spatial.search.RectangularBound) Result(org.apache.druid.query.Result) TimeseriesQueryEngine(org.apache.druid.query.timeseries.TimeseriesQueryEngine) SpatialDimFilter(org.apache.druid.query.filter.SpatialDimFilter) TimeseriesQueryRunnerFactory(org.apache.druid.query.timeseries.TimeseriesQueryRunnerFactory) CountAggregatorFactory(org.apache.druid.query.aggregation.CountAggregatorFactory) InitializedNullHandlingTest(org.apache.druid.testing.InitializedNullHandlingTest) Test(org.junit.Test)

Example 8 with SpatialDimFilter

Use of org.apache.druid.query.filter.SpatialDimFilter in project druid by druid-io.

The class IndexMergerV9WithSpatialIndexTest, method testSpatialQueryMorePoints.

@Test
public void testSpatialQueryMorePoints() {
    TimeseriesQuery query = Druids.newTimeseriesQueryBuilder()
        .dataSource("test")
        .granularity(Granularities.DAY)
        .intervals(Collections.singletonList(Intervals.of("2013-01-01/2013-01-07")))
        .filters(new SpatialDimFilter("dim.geo", new RectangularBound(new float[] { 0.0f, 0.0f }, new float[] { 9.0f, 9.0f })))
        .aggregators(Arrays.asList(new CountAggregatorFactory("rows"), new LongSumAggregatorFactory("val", "val")))
        .build();
    List<Result<TimeseriesResultValue>> expectedResults = Arrays.asList(
        new Result<>(DateTimes.of("2013-01-01T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 17L).build())),
        new Result<>(DateTimes.of("2013-01-02T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 29L).build())),
        new Result<>(DateTimes.of("2013-01-03T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 13L).build())),
        new Result<>(DateTimes.of("2013-01-04T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 91L).build())),
        new Result<>(DateTimes.of("2013-01-05T00:00:00.000Z"), new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 1L).put("val", 47L).build()))
    );
    try {
        TimeseriesQueryRunnerFactory factory = new TimeseriesQueryRunnerFactory(
            new TimeseriesQueryQueryToolChest(),
            new TimeseriesQueryEngine(),
            QueryRunnerTestHelper.NOOP_QUERYWATCHER
        );
        QueryRunner runner = new FinalizeResultsQueryRunner(factory.createRunner(segment), factory.getToolchest());
        TestHelper.assertExpectedResults(expectedResults, runner.run(QueryPlus.wrap(query)));
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
Also used : TimeseriesResultValue(org.apache.druid.query.timeseries.TimeseriesResultValue) TimeseriesQuery(org.apache.druid.query.timeseries.TimeseriesQuery) LongSumAggregatorFactory(org.apache.druid.query.aggregation.LongSumAggregatorFactory) TimeseriesQueryQueryToolChest(org.apache.druid.query.timeseries.TimeseriesQueryQueryToolChest) QueryRunner(org.apache.druid.query.QueryRunner) FinalizeResultsQueryRunner(org.apache.druid.query.FinalizeResultsQueryRunner) IOException(java.io.IOException) RectangularBound(org.apache.druid.collections.spatial.search.RectangularBound) Result(org.apache.druid.query.Result) TimeseriesQueryEngine(org.apache.druid.query.timeseries.TimeseriesQueryEngine) SpatialDimFilter(org.apache.druid.query.filter.SpatialDimFilter) TimeseriesQueryRunnerFactory(org.apache.druid.query.timeseries.TimeseriesQueryRunnerFactory) CountAggregatorFactory(org.apache.druid.query.aggregation.CountAggregatorFactory) InitializedNullHandlingTest(org.apache.druid.testing.InitializedNullHandlingTest) Test(org.junit.Test)

Example 9 with SpatialDimFilter

Use of org.apache.druid.query.filter.SpatialDimFilter in project druid by druid-io.

The class IndexMergerV9WithSpatialIndexTest, method testSpatialQuery.

@Test
public void testSpatialQuery() {
    TimeseriesQuery query = Druids.newTimeseriesQueryBuilder()
        .dataSource("test")
        .granularity(Granularities.ALL)
        .intervals(Collections.singletonList(Intervals.of("2013-01-01/2013-01-07")))
        .filters(new SpatialDimFilter("dim.geo", new RadiusBound(new float[] { 0.0f, 0.0f }, 5)))
        .aggregators(Arrays.asList(new CountAggregatorFactory("rows"), new LongSumAggregatorFactory("val", "val")))
        .build();
    List<Result<TimeseriesResultValue>> expectedResults = Collections.singletonList(
        new Result<>(
            DateTimes.of("2013-01-01T00:00:00.000Z"),
            new TimeseriesResultValue(ImmutableMap.<String, Object>builder().put("rows", 3L).put("val", 59L).build())
        )
    );
    try {
        TimeseriesQueryRunnerFactory factory = new TimeseriesQueryRunnerFactory(
            new TimeseriesQueryQueryToolChest(),
            new TimeseriesQueryEngine(),
            QueryRunnerTestHelper.NOOP_QUERYWATCHER
        );
        QueryRunner runner = new FinalizeResultsQueryRunner(factory.createRunner(segment), factory.getToolchest());
        TestHelper.assertExpectedResults(expectedResults, runner.run(QueryPlus.wrap(query)));
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
Also used : TimeseriesResultValue(org.apache.druid.query.timeseries.TimeseriesResultValue) TimeseriesQuery(org.apache.druid.query.timeseries.TimeseriesQuery) LongSumAggregatorFactory(org.apache.druid.query.aggregation.LongSumAggregatorFactory) TimeseriesQueryQueryToolChest(org.apache.druid.query.timeseries.TimeseriesQueryQueryToolChest) QueryRunner(org.apache.druid.query.QueryRunner) FinalizeResultsQueryRunner(org.apache.druid.query.FinalizeResultsQueryRunner) IOException(java.io.IOException) Result(org.apache.druid.query.Result) TimeseriesQueryEngine(org.apache.druid.query.timeseries.TimeseriesQueryEngine) SpatialDimFilter(org.apache.druid.query.filter.SpatialDimFilter) TimeseriesQueryRunnerFactory(org.apache.druid.query.timeseries.TimeseriesQueryRunnerFactory) RadiusBound(org.apache.druid.collections.spatial.search.RadiusBound) CountAggregatorFactory(org.apache.druid.query.aggregation.CountAggregatorFactory) InitializedNullHandlingTest(org.apache.druid.testing.InitializedNullHandlingTest) Test(org.junit.Test)
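
Examples 6-8 filter with a RectangularBound while Example 9 swaps in a RadiusBound; both plug into the same SpatialDimFilter constructor. The sketch below only reuses the constructor calls already shown in these tests and puts the two bound types side by side for comparison.

import org.apache.druid.collections.spatial.search.RadiusBound;
import org.apache.druid.collections.spatial.search.RectangularBound;
import org.apache.druid.query.filter.SpatialDimFilter;

public class SpatialBoundSketch {
    public static void main(String[] args) {
        // Axis-aligned box with corners (0,0) and (9,9), as in Examples 6-8.
        SpatialDimFilter boxFilter = new SpatialDimFilter(
            "dim.geo",
            new RectangularBound(new float[] { 0.0f, 0.0f }, new float[] { 9.0f, 9.0f })
        );
        // All points within distance 5 of the origin, as in Example 9.
        SpatialDimFilter radiusFilter = new SpatialDimFilter(
            "dim.geo",
            new RadiusBound(new float[] { 0.0f, 0.0f }, 5)
        );
        System.out.println(boxFilter);
        System.out.println(radiusFilter);
    }
}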

Example 10 with SpatialDimFilter

Use of org.apache.druid.query.filter.SpatialDimFilter in project druid by druid-io.

The class IngestSegmentFirehoseTest, method testReadFromIndexAndWriteAnotherIndex.

@Test
public void testReadFromIndexAndWriteAnotherIndex() throws Exception {
    // Tests a "reindexing" use case that is a common use of ingestSegment.
    File segmentDir = tempFolder.newFolder();
    createTestIndex(segmentDir);
    try (final QueryableIndex qi = indexIO.loadIndex(segmentDir);
        final IncrementalIndex index = new OnheapIncrementalIndex.Builder()
            .setIndexSchema(
                new IncrementalIndexSchema.Builder()
                    .withDimensionsSpec(DIMENSIONS_SPEC_REINDEX)
                    .withMetrics(AGGREGATORS_REINDEX.toArray(new AggregatorFactory[0]))
                    .build()
            )
            .setMaxRowCount(5000)
            .build()) {
        final StorageAdapter sa = new QueryableIndexStorageAdapter(qi);
        final WindowedStorageAdapter wsa = new WindowedStorageAdapter(sa, sa.getInterval());
        final IngestSegmentFirehose firehose = new IngestSegmentFirehose(
            ImmutableList.of(wsa, wsa),
            TransformSpec.NONE,
            ImmutableList.of("host", "spatial"),
            ImmutableList.of("visited_sum", "unique_hosts"),
            null
        );
        int count = 0;
        while (firehose.hasMore()) {
            final InputRow row = firehose.nextRow();
            Assert.assertNotNull(row);
            if (count == 0) {
                Assert.assertEquals(DateTimes.of("2014-10-22T00Z"), row.getTimestamp());
                Assert.assertEquals("host1", row.getRaw("host"));
                Assert.assertEquals("0,1", row.getRaw("spatial"));
                Assert.assertEquals(10L, row.getRaw("visited_sum"));
                Assert.assertEquals(1.0d, ((HyperLogLogCollector) row.getRaw("unique_hosts")).estimateCardinality(), 0.1);
            }
            count++;
            index.add(row);
        }
        Assert.assertEquals(18, count);
        // Check the index
        Assert.assertEquals(9, index.size());
        final IncrementalIndexStorageAdapter queryable = new IncrementalIndexStorageAdapter(index);
        Assert.assertEquals(2, queryable.getAvailableDimensions().size());
        Assert.assertEquals("host", queryable.getAvailableDimensions().get(0));
        Assert.assertEquals("spatial", queryable.getAvailableDimensions().get(1));
        Assert.assertEquals(ImmutableList.of("visited_sum", "unique_hosts"), queryable.getAvailableMetrics());
        // Do a spatial filter
        final IngestSegmentFirehose firehose2 = new IngestSegmentFirehose(
            ImmutableList.of(new WindowedStorageAdapter(queryable, Intervals.of("2000/3000"))),
            TransformSpec.NONE,
            ImmutableList.of("host", "spatial"),
            ImmutableList.of("visited_sum", "unique_hosts"),
            new SpatialDimFilter("spatial", new RadiusBound(new float[] { 1, 0 }, 0.1f))
        );
        final InputRow row = firehose2.nextRow();
        Assert.assertFalse(firehose2.hasMore());
        Assert.assertEquals(DateTimes.of("2014-10-22T00Z"), row.getTimestamp());
        Assert.assertEquals("host2", row.getRaw("host"));
        Assert.assertEquals("1,0", row.getRaw("spatial"));
        Assert.assertEquals(40L, row.getRaw("visited_sum"));
        Assert.assertEquals(1.0d, ((HyperLogLogCollector) row.getRaw("unique_hosts")).estimateCardinality(), 0.1);
    }
}
Also used : IncrementalIndex(org.apache.druid.segment.incremental.IncrementalIndex) OnheapIncrementalIndex(org.apache.druid.segment.incremental.OnheapIncrementalIndex) StorageAdapter(org.apache.druid.segment.StorageAdapter) IncrementalIndexStorageAdapter(org.apache.druid.segment.incremental.IncrementalIndexStorageAdapter) QueryableIndexStorageAdapter(org.apache.druid.segment.QueryableIndexStorageAdapter) LongSumAggregatorFactory(org.apache.druid.query.aggregation.LongSumAggregatorFactory) HyperUniquesAggregatorFactory(org.apache.druid.query.aggregation.hyperloglog.HyperUniquesAggregatorFactory) AggregatorFactory(org.apache.druid.query.aggregation.AggregatorFactory) SpatialDimFilter(org.apache.druid.query.filter.SpatialDimFilter) RadiusBound(org.apache.druid.collections.spatial.search.RadiusBound) QueryableIndex(org.apache.druid.segment.QueryableIndex) InputRow(org.apache.druid.data.input.InputRow) File(java.io.File) Test(org.junit.Test)
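
Example 10 passes the SpatialDimFilter straight to IngestSegmentFirehose, but since it is an ordinary DimFilter it also composes with other dimension filters. The sketch below is a hypothetical illustration, not part of the test; it assumes the list-based AndDimFilter constructor and the three-argument SelectorDimFilter constructor used elsewhere in Druid.

import java.util.Arrays;
import org.apache.druid.collections.spatial.search.RadiusBound;
import org.apache.druid.query.filter.AndDimFilter;
import org.apache.druid.query.filter.DimFilter;
import org.apache.druid.query.filter.SelectorDimFilter;
import org.apache.druid.query.filter.SpatialDimFilter;

public class CombinedSpatialFilterSketch {
    // Rows must match both conditions: host equals "host2" AND the "spatial"
    // dimension falls within 0.1 of the point (1, 0).
    public static DimFilter hostNearPoint() {
        return new AndDimFilter(
            Arrays.asList(
                new SelectorDimFilter("host", "host2", null),
                new SpatialDimFilter("spatial", new RadiusBound(new float[] { 1, 0 }, 0.1f))
            )
        );
    }
}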

Aggregations

LongSumAggregatorFactory (org.apache.druid.query.aggregation.LongSumAggregatorFactory): 10
SpatialDimFilter (org.apache.druid.query.filter.SpatialDimFilter): 10
Test (org.junit.Test): 10
IOException (java.io.IOException): 9
FinalizeResultsQueryRunner (org.apache.druid.query.FinalizeResultsQueryRunner): 9
QueryRunner (org.apache.druid.query.QueryRunner): 9
Result (org.apache.druid.query.Result): 9
CountAggregatorFactory (org.apache.druid.query.aggregation.CountAggregatorFactory): 9
TimeseriesQuery (org.apache.druid.query.timeseries.TimeseriesQuery): 9
TimeseriesQueryEngine (org.apache.druid.query.timeseries.TimeseriesQueryEngine): 9
TimeseriesQueryQueryToolChest (org.apache.druid.query.timeseries.TimeseriesQueryQueryToolChest): 9
TimeseriesQueryRunnerFactory (org.apache.druid.query.timeseries.TimeseriesQueryRunnerFactory): 9
TimeseriesResultValue (org.apache.druid.query.timeseries.TimeseriesResultValue): 9
RadiusBound (org.apache.druid.collections.spatial.search.RadiusBound): 6
InitializedNullHandlingTest (org.apache.druid.testing.InitializedNullHandlingTest): 6
RectangularBound (org.apache.druid.collections.spatial.search.RectangularBound): 4
File (java.io.File): 1
InputRow (org.apache.druid.data.input.InputRow): 1
AggregatorFactory (org.apache.druid.query.aggregation.AggregatorFactory): 1
FilteredAggregatorFactory (org.apache.druid.query.aggregation.FilteredAggregatorFactory): 1