
Example 86 with DefaultLimitSpec

use of org.apache.druid.query.groupby.orderby.DefaultLimitSpec in project druid by druid-io.

the class FixedBucketsHistogramGroupByQueryTest method testGroupByWithFixedHistogramAgg.

@Test
public void testGroupByWithFixedHistogramAgg() {
    FixedBucketsHistogramAggregatorFactory aggFactory = new FixedBucketsHistogramAggregatorFactory("histo", "index", 10, 0, 2000, FixedBucketsHistogram.OutlierHandlingMode.OVERFLOW, false);
    GroupByQuery query = new GroupByQuery.Builder()
        .setDataSource(QueryRunnerTestHelper.DATA_SOURCE)
        .setGranularity(QueryRunnerTestHelper.ALL_GRAN)
        .setDimensions(new DefaultDimensionSpec(QueryRunnerTestHelper.MARKET_DIMENSION, "marketalias"))
        .setInterval(QueryRunnerTestHelper.FULL_ON_INTERVAL)
        .setLimitSpec(new DefaultLimitSpec(Collections.singletonList(new OrderByColumnSpec("marketalias", OrderByColumnSpec.Direction.DESCENDING)), 1))
        .setAggregatorSpecs(QueryRunnerTestHelper.ROWS_COUNT, aggFactory)
        .setPostAggregatorSpecs(Collections.singletonList(new QuantilePostAggregator("quantile", "histo", 0.5f)))
        .build();
    List<ResultRow> expectedResults = Collections.singletonList(
        GroupByQueryRunnerTestHelper.createExpectedRow(
            query,
            "1970-01-01T00:00:00.000Z",
            "marketalias", "upfront",
            "rows", 186L,
            "quantile", 969.6969604492188f,
            "histo", new FixedBucketsHistogram(0, 2000, 10, FixedBucketsHistogram.OutlierHandlingMode.OVERFLOW, new long[] { 0, 0, 4, 33, 66, 35, 25, 11, 10, 2 }, 186, 1870.061029, 545.990623, 0, 0, 0).toString()
        )
    );
    Iterable<ResultRow> results = GroupByQueryRunnerTestHelper.runQuery(factory, runner, query);
    TestHelper.assertExpectedObjects(expectedResults, results, "fixed-histo");
}
Also used : OrderByColumnSpec(org.apache.druid.query.groupby.orderby.OrderByColumnSpec) ResultRow(org.apache.druid.query.groupby.ResultRow) GroupByQuery(org.apache.druid.query.groupby.GroupByQuery) DefaultLimitSpec(org.apache.druid.query.groupby.orderby.DefaultLimitSpec) DefaultDimensionSpec(org.apache.druid.query.dimension.DefaultDimensionSpec) GroupByQueryRunnerTest(org.apache.druid.query.groupby.GroupByQueryRunnerTest) InitializedNullHandlingTest(org.apache.druid.testing.InitializedNullHandlingTest) Test(org.junit.Test)
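
For reference, the limit spec used in this query (and in the next example), shown in isolation. This is just the relevant fragment of the builder call above, using only classes already listed under "Also used": it orders the grouped rows by the "marketalias" dimension alias in descending order and keeps a single row.

// Order by "marketalias" descending, then truncate the result to one row.
DefaultLimitSpec limitSpec = new DefaultLimitSpec(
    Collections.singletonList(new OrderByColumnSpec("marketalias", OrderByColumnSpec.Direction.DESCENDING)),
    1
);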

Example 87 with DefaultLimitSpec

use of org.apache.druid.query.groupby.orderby.DefaultLimitSpec in project druid by druid-io.

the class FixedBucketsHistogramGroupByQueryTest method testGroupByWithSameNameComplexPostAgg.

@Test(expected = IllegalArgumentException.class)
public void testGroupByWithSameNameComplexPostAgg() {
    FixedBucketsHistogramAggregatorFactory aggFactory = new FixedBucketsHistogramAggregatorFactory("histo", "index", 10, 0, 2000, FixedBucketsHistogram.OutlierHandlingMode.OVERFLOW, false);
    GroupByQuery query = new GroupByQuery.Builder()
        .setDataSource(QueryRunnerTestHelper.DATA_SOURCE)
        .setGranularity(QueryRunnerTestHelper.ALL_GRAN)
        .setDimensions(new DefaultDimensionSpec(QueryRunnerTestHelper.MARKET_DIMENSION, "marketalias"))
        .setInterval(QueryRunnerTestHelper.FULL_ON_INTERVAL)
        .setLimitSpec(new DefaultLimitSpec(Collections.singletonList(new OrderByColumnSpec("marketalias", OrderByColumnSpec.Direction.DESCENDING)), 1))
        .setAggregatorSpecs(QueryRunnerTestHelper.ROWS_COUNT, aggFactory)
        .setPostAggregatorSpecs(Collections.singletonList(new QuantilePostAggregator("quantile", "quantile", 0.5f)))
        .build();
    List<ResultRow> expectedResults = Collections.singletonList(
        GroupByQueryRunnerTestHelper.createExpectedRow(query, "1970-01-01T00:00:00.000Z", "marketalias", "upfront", "rows", 186L, "quantile", 969.6969604492188f)
    );
    Iterable<ResultRow> results = GroupByQueryRunnerTestHelper.runQuery(factory, runner, query);
    TestHelper.assertExpectedObjects(expectedResults, results, "fixed-histo");
}
Also used : OrderByColumnSpec(org.apache.druid.query.groupby.orderby.OrderByColumnSpec) ResultRow(org.apache.druid.query.groupby.ResultRow) GroupByQuery(org.apache.druid.query.groupby.GroupByQuery) DefaultLimitSpec(org.apache.druid.query.groupby.orderby.DefaultLimitSpec) DefaultDimensionSpec(org.apache.druid.query.dimension.DefaultDimensionSpec) GroupByQueryRunnerTest(org.apache.druid.query.groupby.GroupByQueryRunnerTest) InitializedNullHandlingTest(org.apache.druid.testing.InitializedNullHandlingTest) Test(org.junit.Test)
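
Note the expected = IllegalArgumentException.class annotation: the only difference from Example 86 is that the QuantilePostAggregator here uses "quantile" as both its output name and its input field name (instead of reading from the "histo" aggregator), and the test expects this query to fail rather than return the expected row constructed above.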

Example 88 with DefaultLimitSpec

use of org.apache.druid.query.groupby.orderby.DefaultLimitSpec in project druid by druid-io.

the class LimitedBufferHashGrouperTest method makeGrouperWithOrderBy.

private static LimitedBufferHashGrouper<Integer> makeGrouperWithOrderBy(TestColumnSelectorFactory columnSelectorFactory, String orderByColumn, OrderByColumnSpec.Direction direction) {
    final StringComparator stringComparator = "value".equals(orderByColumn) ? StringComparators.LEXICOGRAPHIC : StringComparators.NUMERIC;
    final DefaultLimitSpec orderBy = DefaultLimitSpec.builder()
        .orderBy(new OrderByColumnSpec(orderByColumn, direction, stringComparator))
        .limit(LIMIT)
        .build();
    LimitedBufferHashGrouper<Integer> grouper = new LimitedBufferHashGrouper<>(
        Suppliers.ofInstance(ByteBuffer.allocate(12120)),
        new GroupByIshKeySerde(orderBy),
        AggregatorAdapters.factorizeBuffered(
            columnSelectorFactory,
            ImmutableList.of(new LongSumAggregatorFactory("valueSum", "value"), new CountAggregatorFactory("count"))
        ),
        Integer.MAX_VALUE,
        0.5f,
        2,
        LIMIT,
        !orderBy.getColumns().get(0).getDimension().equals("value")
    );
    grouper.init();
    return grouper;
}
Also used : OrderByColumnSpec(org.apache.druid.query.groupby.orderby.OrderByColumnSpec) DefaultLimitSpec(org.apache.druid.query.groupby.orderby.DefaultLimitSpec) CountAggregatorFactory(org.apache.druid.query.aggregation.CountAggregatorFactory) LongSumAggregatorFactory(org.apache.druid.query.aggregation.LongSumAggregatorFactory) StringComparator(org.apache.druid.query.ordering.StringComparator)

Example 89 with DefaultLimitSpec

use of org.apache.druid.query.groupby.orderby.DefaultLimitSpec in project druid by druid-io.

the class CalciteMultiValueStringQueryTest method testMultiValueStringConcatBackwardsCompat0dot22andOlder.

@Test
public void testMultiValueStringConcatBackwardsCompat0dot22andOlder() throws Exception {
    try {
        ExpressionProcessing.initializeForHomogenizeNullMultiValueStrings();
        // Cannot vectorize due to usage of expressions.
        cannotVectorize();
        ImmutableList<Object[]> results;
        if (useDefault) {
            results = ImmutableList.of(new Object[] { "", 6L }, new Object[] { "b", 4L }, new Object[] { "a", 2L }, new Object[] { "c", 2L }, new Object[] { "d", 2L });
        } else {
            results = ImmutableList.of(new Object[] { null, 4L }, new Object[] { "b", 4L }, new Object[] { "", 2L }, new Object[] { "a", 2L }, new Object[] { "c", 2L }, new Object[] { "d", 2L });
        }
        testQuery("SELECT MV_CONCAT(dim3, dim3), SUM(cnt) FROM druid.numfoo GROUP BY 1 ORDER BY 2 DESC", ImmutableList.of(GroupByQuery.builder().setDataSource(CalciteTests.DATASOURCE3).setInterval(querySegmentSpec(Filtration.eternity())).setGranularity(Granularities.ALL).setVirtualColumns(expressionVirtualColumn("v0", "array_concat(\"dim3\",\"dim3\")", ColumnType.STRING)).setDimensions(dimensions(new DefaultDimensionSpec("v0", "_d0", ColumnType.STRING))).setAggregatorSpecs(aggregators(new LongSumAggregatorFactory("a0", "cnt"))).setLimitSpec(new DefaultLimitSpec(ImmutableList.of(new OrderByColumnSpec("a0", OrderByColumnSpec.Direction.DESCENDING, StringComparators.NUMERIC)), Integer.MAX_VALUE)).setContext(QUERY_CONTEXT_DEFAULT).build()), results);
    } finally {
        ExpressionProcessing.initializeForTests(null);
    }
}
Also used : OrderByColumnSpec(org.apache.druid.query.groupby.orderby.OrderByColumnSpec) DefaultLimitSpec(org.apache.druid.query.groupby.orderby.DefaultLimitSpec) LongSumAggregatorFactory(org.apache.druid.query.aggregation.LongSumAggregatorFactory) DefaultDimensionSpec(org.apache.druid.query.dimension.DefaultDimensionSpec) Test(org.junit.Test)
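
Because the SQL statement has ORDER BY 2 DESC but no LIMIT clause, the expected native query uses a DefaultLimitSpec that orders on the "a0" sum with StringComparators.NUMERIC and sets the limit to Integer.MAX_VALUE, i.e. it applies ordering only, with no effective row cap.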

Example 90 with DefaultLimitSpec

use of org.apache.druid.query.groupby.orderby.DefaultLimitSpec in project druid by druid-io.

the class CalciteMultiValueStringQueryTest method testMultiValueListFilterComposed.

@Test
public void testMultiValueListFilterComposed() throws Exception {
    // Cannot vectorize due to usage of expressions.
    cannotVectorize();
    testQuery("SELECT MV_LENGTH(MV_FILTER_ONLY(dim3, ARRAY['b'])), SUM(cnt) FROM druid.numfoo GROUP BY 1 ORDER BY 2 DESC", ImmutableList.of(GroupByQuery.builder().setDataSource(CalciteTests.DATASOURCE3).setInterval(querySegmentSpec(Filtration.eternity())).setGranularity(Granularities.ALL).setVirtualColumns(expressionVirtualColumn("v0", "array_length(\"v1\")", ColumnType.LONG), new ListFilteredVirtualColumn("v1", DefaultDimensionSpec.of("dim3"), ImmutableSet.of("b"), true)).setDimensions(dimensions(new DefaultDimensionSpec("v0", "_d0", ColumnType.LONG))).setAggregatorSpecs(aggregators(new LongSumAggregatorFactory("a0", "cnt"))).setLimitSpec(new DefaultLimitSpec(ImmutableList.of(new OrderByColumnSpec("a0", OrderByColumnSpec.Direction.DESCENDING, StringComparators.NUMERIC)), Integer.MAX_VALUE)).setContext(QUERY_CONTEXT_DEFAULT).build()), useDefault ? ImmutableList.of(new Object[] { 0, 4L }, new Object[] { 1, 2L }) : ImmutableList.of(// selector, which treats a 0 length array as null instead of an empty array like is produced by filter
    new Object[] { null, 4L }, new Object[] { 1, 2L }));
}
Also used : OrderByColumnSpec(org.apache.druid.query.groupby.orderby.OrderByColumnSpec) DefaultLimitSpec(org.apache.druid.query.groupby.orderby.DefaultLimitSpec) LongSumAggregatorFactory(org.apache.druid.query.aggregation.LongSumAggregatorFactory) ListFilteredVirtualColumn(org.apache.druid.segment.virtual.ListFilteredVirtualColumn) DefaultDimensionSpec(org.apache.druid.query.dimension.DefaultDimensionSpec) Test(org.junit.Test)
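
The two expected result sets correspond to the two null-handling modes: with default-value handling (useDefault), the row without a matching "b" value yields an MV_LENGTH of 0, while in SQL-compatible mode it comes back as null, as the inline comment about zero-length arrays being treated as null indicates.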

Aggregations

DefaultLimitSpec (org.apache.druid.query.groupby.orderby.DefaultLimitSpec) 113
DefaultDimensionSpec (org.apache.druid.query.dimension.DefaultDimensionSpec) 107
Test (org.junit.Test) 105
OrderByColumnSpec (org.apache.druid.query.groupby.orderby.OrderByColumnSpec) 100
LongSumAggregatorFactory (org.apache.druid.query.aggregation.LongSumAggregatorFactory) 79
InitializedNullHandlingTest (org.apache.druid.testing.InitializedNullHandlingTest) 47
MultipleIntervalSegmentSpec (org.apache.druid.query.spec.MultipleIntervalSegmentSpec) 15
GroupByQuery (org.apache.druid.query.groupby.GroupByQuery) 13
FinalizeResultsQueryRunner (org.apache.druid.query.FinalizeResultsQueryRunner) 11
QueryPlus (org.apache.druid.query.QueryPlus) 11
QueryRunner (org.apache.druid.query.QueryRunner) 11
ResponseContext (org.apache.druid.query.context.ResponseContext) 11
ResultRow (org.apache.druid.query.groupby.ResultRow) 11
CountAggregatorFactory (org.apache.druid.query.aggregation.CountAggregatorFactory) 10
GreaterThanHavingSpec (org.apache.druid.query.groupby.having.GreaterThanHavingSpec) 9
QuerySegmentSpec (org.apache.druid.query.spec.QuerySegmentSpec) 8
ArrayList (java.util.ArrayList) 7
Sequence (org.apache.druid.java.util.common.guava.Sequence) 7
QueryDataSource (org.apache.druid.query.QueryDataSource) 7
GroupByQueryRunnerTest (org.apache.druid.query.groupby.GroupByQueryRunnerTest) 7