
Example 26 with SearchQuery

Use of io.druid.query.search.search.SearchQuery in project druid by druid-io.

From the class SearchQueryRunnerTest, method testSearchOnLongColumn:

@Test
public void testSearchOnLongColumn() {
    SearchQuery searchQuery = Druids.newSearchQueryBuilder()
            .dimensions(new DefaultDimensionSpec(Column.TIME_COLUMN_NAME, Column.TIME_COLUMN_NAME, ValueType.LONG))
            .dataSource(QueryRunnerTestHelper.dataSource)
            .granularity(QueryRunnerTestHelper.allGran)
            .intervals(QueryRunnerTestHelper.fullOnInterval)
            .query("1297123200000")
            .build();
    List<SearchHit> expectedHits = Lists.newLinkedList();
    expectedHits.add(new SearchHit(Column.TIME_COLUMN_NAME, "1297123200000", 13));
    checkSearchQuery(searchQuery, expectedHits);
}
Also used: SearchQuery(io.druid.query.search.search.SearchQuery) SearchHit(io.druid.query.search.search.SearchHit) DefaultDimensionSpec(io.druid.query.dimension.DefaultDimensionSpec) Test(org.junit.Test)
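
The checkSearchQuery helper is defined elsewhere in SearchQueryRunnerTest and is not shown on this page. Below is a minimal sketch of what the two-argument overload plausibly does, assuming the test's runner field and JUnit assertions; the body is an illustration, not the project's actual implementation.

// Hypothetical sketch: run the query against the test runner and compare the
// returned hits with the expected ones.
private void checkSearchQuery(SearchQuery searchQuery, List<SearchHit> expectedResults) {
    List<Result<SearchResultValue>> results = Sequences.toList(
            runner.run(searchQuery, ImmutableMap.<String, Object>of()),
            Lists.<Result<SearchResultValue>>newArrayList()
    );
    List<SearchHit> actualHits = Lists.newLinkedList();
    for (Result<SearchResultValue> result : results) {
        for (SearchHit hit : result.getValue()) {
            actualHits.add(hit);
        }
    }
    Assert.assertEquals(expectedResults.size(), actualHits.size());
    Assert.assertTrue(actualHits.containsAll(expectedResults));
}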

Example 27 with SearchQuery

Use of io.druid.query.search.search.SearchQuery in project druid by druid-io.

From the class SearchQueryRunnerTest, method testSearchWithCardinality:

@Test
public void testSearchWithCardinality() {
    final SearchQuery searchQuery = Druids.newSearchQueryBuilder()
            .dataSource(QueryRunnerTestHelper.dataSource)
            .granularity(QueryRunnerTestHelper.allGran)
            .intervals(QueryRunnerTestHelper.fullOnInterval)
            .query("a")
            .build();
    // double the value
    QueryRunner mergedRunner = toolChest.mergeResults(new QueryRunner<Result<SearchResultValue>>() {

        @Override
        public Sequence<Result<SearchResultValue>> run(Query<Result<SearchResultValue>> query, Map<String, Object> responseContext) {
            final Query<Result<SearchResultValue>> query1 = searchQuery.withQuerySegmentSpec(new MultipleIntervalSegmentSpec(Lists.newArrayList(new Interval("2011-01-12/2011-02-28"))));
            final Query<Result<SearchResultValue>> query2 = searchQuery.withQuerySegmentSpec(new MultipleIntervalSegmentSpec(Lists.newArrayList(new Interval("2011-03-01/2011-04-15"))));
            return Sequences.concat(runner.run(query1, responseContext), runner.run(query2, responseContext));
        }
    });
    List<SearchHit> expectedHits = Lists.newLinkedList();
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.qualityDimension, "automotive", 91));
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.qualityDimension, "mezzanine", 273));
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.qualityDimension, "travel", 91));
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.qualityDimension, "health", 91));
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.qualityDimension, "entertainment", 91));
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.marketDimension, "total_market", 182));
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.placementishDimension, "a", 91));
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.partialNullDimension, "value", 182));
    checkSearchQuery(searchQuery, mergedRunner, expectedHits);
}
Also used: SearchQuery(io.druid.query.search.search.SearchQuery) Query(io.druid.query.Query) SearchHit(io.druid.query.search.search.SearchHit) MultipleIntervalSegmentSpec(io.druid.query.spec.MultipleIntervalSegmentSpec) Sequence(io.druid.java.util.common.guava.Sequence) QueryRunner(io.druid.query.QueryRunner) Result(io.druid.query.Result) Interval(org.joda.time.Interval) Test(org.junit.Test)
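
Because mergeResults combines the results of the two per-interval runs into one logical query, counts for the same dimension value are combined across the two sub-intervals, which is why the expected counts above come out roughly doubled. A hedged usage sketch for driving the merged runner directly follows; the response-context map and the printing are illustrative assumptions.

// Hypothetical sketch: execute the merged runner and print each combined hit.
Map<String, Object> responseContext = Maps.newHashMap();
Sequence<Result<SearchResultValue>> merged = mergedRunner.run(searchQuery, responseContext);
for (Result<SearchResultValue> result : Sequences.toList(merged, Lists.<Result<SearchResultValue>>newArrayList())) {
    for (SearchHit hit : result.getValue()) {
        System.out.println(hit.getDimension() + "=" + hit.getValue() + " count=" + hit.getCount());
    }
}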

Example 28 with SearchQuery

Use of io.druid.query.search.search.SearchQuery in project druid by druid-io.

From the class SearchQueryRunnerTest, method testSearchOnFloatColumnWithExFn:

@Test
public void testSearchOnFloatColumnWithExFn() {
    String jsFn = "function(str) { return 'super-' + str; }";
    ExtractionFn jsExtractionFn = new JavaScriptExtractionFn(jsFn, false, JavaScriptConfig.getEnabledInstance());
    SearchQuery searchQuery = Druids.newSearchQueryBuilder()
            .dimensions(new ExtractionDimensionSpec(QueryRunnerTestHelper.indexMetric, QueryRunnerTestHelper.indexMetric, jsExtractionFn))
            .dataSource(QueryRunnerTestHelper.dataSource)
            .granularity(QueryRunnerTestHelper.allGran)
            .intervals(QueryRunnerTestHelper.fullOnInterval)
            .query("100.7")
            .build();
    List<SearchHit> expectedHits = Lists.newLinkedList();
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.indexMetric, "super-100.7060546875", 1));
    expectedHits.add(new SearchHit(QueryRunnerTestHelper.indexMetric, "super-100.77559661865234", 1));
    checkSearchQuery(searchQuery, expectedHits);
}
Also used: SearchQuery(io.druid.query.search.search.SearchQuery) LookupExtractionFn(io.druid.query.lookup.LookupExtractionFn) ExtractionFn(io.druid.query.extraction.ExtractionFn) JavaScriptExtractionFn(io.druid.query.extraction.JavaScriptExtractionFn) TimeFormatExtractionFn(io.druid.query.extraction.TimeFormatExtractionFn) SearchHit(io.druid.query.search.search.SearchHit) ExtractionDimensionSpec(io.druid.query.dimension.ExtractionDimensionSpec) Test(org.junit.Test)
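
The extraction function rewrites every value of the chosen column before the search predicate is applied, so the query string "100.7" is matched against transformed values such as "super-100.7060546875". A minimal sketch of the same extraction function applied in isolation is shown below; the exact float formatting noted in the comment is an assumption about the test data.

// Hypothetical sketch: apply the JavaScript extraction function directly to a raw value.
ExtractionFn fn = new JavaScriptExtractionFn(
        "function(str) { return 'super-' + str; }",
        false,
        JavaScriptConfig.getEnabledInstance()
);
String transformed = fn.apply("100.7060546875");  // "super-100.7060546875"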

Example 29 with SearchQuery

Use of io.druid.query.search.search.SearchQuery in project druid by druid-io.

From the class SearchQueryRunnerTest, method testSearchWithNullValueInDimension:

@Test
public void testSearchWithNullValueInDimension() throws Exception {
    IncrementalIndex<Aggregator> index = new OnheapIncrementalIndex(
            new IncrementalIndexSchema.Builder()
                    .withQueryGranularity(Granularities.NONE)
                    .withMinTimestamp(new DateTime("2011-01-12T00:00:00.000Z").getMillis())
                    .build(),
            true,
            10
    );
    index.add(new MapBasedInputRow(1481871600000L, Arrays.asList("name", "host"), ImmutableMap.<String, Object>of("name", "name1", "host", "host")));
    index.add(new MapBasedInputRow(1481871670000L, Arrays.asList("name", "table"), ImmutableMap.<String, Object>of("name", "name2", "table", "table")));
    SearchQuery searchQuery = Druids.newSearchQueryBuilder()
            .dimensions(new DefaultDimensionSpec("table", "table"))
            .dataSource(QueryRunnerTestHelper.dataSource)
            .granularity(QueryRunnerTestHelper.allGran)
            .intervals(QueryRunnerTestHelper.fullOnInterval)
            .context(ImmutableMap.<String, Object>of("searchStrategy", "cursorOnly"))
            .build();
    QueryRunnerFactory factory = new SearchQueryRunnerFactory(selector, toolChest, QueryRunnerTestHelper.NOOP_QUERYWATCHER);
    QueryRunner runner = factory.createRunner(new QueryableIndexSegment("asdf", TestIndex.persistRealtimeAndLoadMMapped(index)));
    List<SearchHit> expectedHits = Lists.newLinkedList();
    expectedHits.add(new SearchHit("table", "table", 1));
    expectedHits.add(new SearchHit("table", "", 1));
    checkSearchQuery(searchQuery, runner, expectedHits);
}
Also used: SearchQuery(io.druid.query.search.search.SearchQuery) QueryableIndexSegment(io.druid.segment.QueryableIndexSegment) SearchHit(io.druid.query.search.search.SearchHit) OnheapIncrementalIndex(io.druid.segment.incremental.OnheapIncrementalIndex) Aggregator(io.druid.query.aggregation.Aggregator) DateTime(org.joda.time.DateTime) DefaultDimensionSpec(io.druid.query.dimension.DefaultDimensionSpec) QueryRunner(io.druid.query.QueryRunner) QueryRunnerFactory(io.druid.query.QueryRunnerFactory) MapBasedInputRow(io.druid.data.input.MapBasedInputRow) Test(org.junit.Test)
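
The first row carries only the name and host dimensions, so for the searched table dimension it contributes a null value (reported by the search as the empty string), while the second row contributes the literal value "table"; that is why both expected hits have a count of 1. A small sketch illustrating the difference on the input rows themselves (getDimension is part of Druid's InputRow interface; the printed output is an assumption):

// Hypothetical sketch: a row without the "table" dimension yields an empty value list for it.
MapBasedInputRow rowWithTable = new MapBasedInputRow(
        1481871670000L,
        Arrays.asList("name", "table"),
        ImmutableMap.<String, Object>of("name", "name2", "table", "table")
);
MapBasedInputRow rowWithoutTable = new MapBasedInputRow(
        1481871600000L,
        Arrays.asList("name", "host"),
        ImmutableMap.<String, Object>of("name", "name1", "host", "host")
);
System.out.println(rowWithTable.getDimension("table"));    // [table]
System.out.println(rowWithoutTable.getDimension("table")); // []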

Example 30 with SearchQuery

Use of io.druid.query.search.search.SearchQuery in project druid by druid-io.

From the class SearchQueryQueryToolChest, method getCacheStrategy:

@Override
public CacheStrategy<Result<SearchResultValue>, Object, SearchQuery> getCacheStrategy(final SearchQuery query) {
    return new CacheStrategy<Result<SearchResultValue>, Object, SearchQuery>() {

        private final List<DimensionSpec> dimensionSpecs = query.getDimensions() != null ? query.getDimensions() : Collections.<DimensionSpec>emptyList();

        private final List<String> dimOutputNames = dimensionSpecs.size() > 0 ? Lists.transform(dimensionSpecs, new Function<DimensionSpec, String>() {

            @Override
            public String apply(DimensionSpec input) {
                return input.getOutputName();
            }
        }) : Collections.<String>emptyList();

        @Override
        public boolean isCacheable(SearchQuery query, boolean willMergeRunners) {
            return true;
        }

        @Override
        public byte[] computeCacheKey(SearchQuery query) {
            final DimFilter dimFilter = query.getDimensionsFilter();
            final byte[] filterBytes = dimFilter == null ? new byte[] {} : dimFilter.getCacheKey();
            final byte[] querySpecBytes = query.getQuery().getCacheKey();
            final byte[] granularityBytes = query.getGranularity().getCacheKey();
            final List<DimensionSpec> dimensionSpecs = query.getDimensions() != null ? query.getDimensions() : Collections.<DimensionSpec>emptyList();
            final byte[][] dimensionsBytes = new byte[dimensionSpecs.size()][];
            int dimensionsBytesSize = 0;
            int index = 0;
            for (DimensionSpec dimensionSpec : dimensionSpecs) {
                dimensionsBytes[index] = dimensionSpec.getCacheKey();
                dimensionsBytesSize += dimensionsBytes[index].length;
                ++index;
            }
            final byte[] sortSpecBytes = query.getSort().getCacheKey();
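            // Cache key layout: query-type byte, limit (4 bytes), granularity, filter, query spec, sort spec, then each dimension spec.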
            final ByteBuffer queryCacheKey = ByteBuffer
                    .allocate(1 + 4 + granularityBytes.length + filterBytes.length + querySpecBytes.length + dimensionsBytesSize + sortSpecBytes.length)
                    .put(SEARCH_QUERY)
                    .put(Ints.toByteArray(query.getLimit()))
                    .put(granularityBytes)
                    .put(filterBytes)
                    .put(querySpecBytes)
                    .put(sortSpecBytes);
            for (byte[] bytes : dimensionsBytes) {
                queryCacheKey.put(bytes);
            }
            return queryCacheKey.array();
        }

        @Override
        public TypeReference<Object> getCacheObjectClazz() {
            return OBJECT_TYPE_REFERENCE;
        }

        @Override
        public Function<Result<SearchResultValue>, Object> prepareForCache() {
            return new Function<Result<SearchResultValue>, Object>() {

                @Override
                public Object apply(Result<SearchResultValue> input) {
                    return dimensionSpecs.size() > 0 ? Lists.newArrayList(input.getTimestamp().getMillis(), input.getValue(), dimOutputNames) : Lists.newArrayList(input.getTimestamp().getMillis(), input.getValue());
                }
            };
        }

        @Override
        public Function<Object, Result<SearchResultValue>> pullFromCache() {
            return new Function<Object, Result<SearchResultValue>>() {

                @Override
                @SuppressWarnings("unchecked")
                public Result<SearchResultValue> apply(Object input) {
                    List<Object> result = (List<Object>) input;
                    boolean needsRename = false;
                    final Map<String, String> outputNameMap = Maps.newHashMap();
                    if (hasOutputName(result)) {
                        List<String> cachedOutputNames = (List) result.get(2);
                        Preconditions.checkArgument(cachedOutputNames.size() == dimOutputNames.size(), "cache hit, but number of dimensions mismatch");
                        needsRename = false;
                        for (int idx = 0; idx < cachedOutputNames.size(); idx++) {
                            String cachedOutputName = cachedOutputNames.get(idx);
                            String outputName = dimOutputNames.get(idx);
                            if (!cachedOutputName.equals(outputName)) {
                                needsRename = true;
                            }
                            outputNameMap.put(cachedOutputName, outputName);
                        }
                    }
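                    // When the cached output names differ from the query's output names, rebuild each SearchHit
                    // with the renamed dimension; otherwise return the cached hits unchanged.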
                    return !needsRename ? new Result<>(new DateTime(((Number) result.get(0)).longValue()), new SearchResultValue(Lists.transform((List) result.get(1), new Function<Object, SearchHit>() {

                        @Override
                        public SearchHit apply(@Nullable Object input) {
                            if (input instanceof Map) {
                                return new SearchHit((String) ((Map) input).get("dimension"), (String) ((Map) input).get("value"), (Integer) ((Map) input).get("count"));
                            } else if (input instanceof SearchHit) {
                                return (SearchHit) input;
                            } else {
                                throw new IAE("Unknown format [%s]", input.getClass());
                            }
                        }
                    }))) : new Result<>(new DateTime(((Number) result.get(0)).longValue()), new SearchResultValue(Lists.transform((List) result.get(1), new Function<Object, SearchHit>() {

                        @Override
                        public SearchHit apply(@Nullable Object input) {
                            String dim = null;
                            String val = null;
                            Integer cnt = null;
                            if (input instanceof Map) {
                                dim = outputNameMap.get((String) ((Map) input).get("dimension"));
                                val = (String) ((Map) input).get("value");
                                cnt = (Integer) ((Map) input).get("count");
                            } else if (input instanceof SearchHit) {
                                SearchHit cached = (SearchHit) input;
                                dim = outputNameMap.get(cached.getDimension());
                                val = cached.getValue();
                                cnt = cached.getCount();
                            } else {
                                throw new IAE("Unknown format [%s]", input.getClass());
                            }
                            return new SearchHit(dim, val, cnt);
                        }
                    })));
                }
            };
        }

        private boolean hasOutputName(List<Object> cachedEntry) {
            /*
             * A cached entry is a list of two or three objects:
             *  1. timestamp
             *  2. SearchResultValue
             *  3. outputName of each dimension (optional)
             *
             * If a cached entry has three objects, the dimension names in the SearchResultValue
             * must be checked to see whether a rename is needed.
             */
            return cachedEntry.size() == 3;
        }
    };
}
Also used: DimensionSpec(io.druid.query.dimension.DimensionSpec) SearchHit(io.druid.query.search.search.SearchHit) DateTime(org.joda.time.DateTime) Result(io.druid.query.Result) Function(com.google.common.base.Function) List(java.util.List) SearchQuery(io.druid.query.search.search.SearchQuery) IAE(io.druid.java.util.common.IAE) ByteBuffer(java.nio.ByteBuffer) DimFilter(io.druid.query.filter.DimFilter) Map(java.util.Map) CacheStrategy(io.druid.query.CacheStrategy) Nullable(javax.annotation.Nullable)
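
A minimal sketch of how such a CacheStrategy is typically exercised, assuming a toolChest, a query, and a result already exist in the caller (variable names are illustrative):

// Hypothetical round-trip sketch: compute the cache key, condense a result for storage,
// and restore it back into a Result<SearchResultValue>.
CacheStrategy<Result<SearchResultValue>, Object, SearchQuery> strategy = toolChest.getCacheStrategy(query);
byte[] cacheKey = strategy.computeCacheKey(query);            // key used by the cache backend
Object condensed = strategy.prepareForCache().apply(result);  // [timestampMillis, hits, outputNames?]
Result<SearchResultValue> restored = strategy.pullFromCache().apply(condensed);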

Aggregations

SearchQuery (io.druid.query.search.search.SearchQuery): 31
Test (org.junit.Test): 25
SearchHit (io.druid.query.search.search.SearchHit): 22
Result (io.druid.query.Result): 12
SearchResultValue (io.druid.query.search.SearchResultValue): 9
HashMap (java.util.HashMap): 9
QueryRunner (io.druid.query.QueryRunner): 8
DateTime (org.joda.time.DateTime): 8
Druids (io.druid.query.Druids): 5
Set (java.util.Set): 5
DefaultDimensionSpec (io.druid.query.dimension.DefaultDimensionSpec): 4
ExtractionDimensionSpec (io.druid.query.dimension.ExtractionDimensionSpec): 4
Interval (org.joda.time.Interval): 4
TimeFormatExtractionFn (io.druid.query.extraction.TimeFormatExtractionFn): 3
LookupExtractionFn (io.druid.query.lookup.LookupExtractionFn): 3
MultipleIntervalSegmentSpec (io.druid.query.spec.MultipleIntervalSegmentSpec): 3
Map (java.util.Map): 3
Function (com.google.common.base.Function): 2
ImmutableMap (com.google.common.collect.ImmutableMap): 2
ISE (io.druid.java.util.common.ISE): 2