
Example 21 with IntArrayList

Use of com.carrotsearch.hppc.IntArrayList in project elasticsearch by elastic.

From the class FetchSearchPhase, method innerRun.

private void innerRun() throws IOException {
    final int numShards = context.getNumShards();
    final boolean isScrollSearch = context.getRequest().scroll() != null;
    ScoreDoc[] sortedShardDocs = searchPhaseController.sortDocs(isScrollSearch, queryResults);
    String scrollId = isScrollSearch ? TransportSearchHelper.buildScrollId(queryResults) : null;
    List<AtomicArray.Entry<QuerySearchResultProvider>> queryResultsAsList = queryResults.asList();
    final SearchPhaseController.ReducedQueryPhase reducedQueryPhase = resultConsumer.reduce();
    final boolean queryAndFetchOptimization = queryResults.length() == 1;
    final Runnable finishPhase = () -> moveToNextPhase(searchPhaseController, sortedShardDocs, scrollId,
            reducedQueryPhase, queryAndFetchOptimization ? queryResults : fetchResults);
    if (queryAndFetchOptimization) {
        assert queryResults.get(0) == null || queryResults.get(0).fetchResult() != null;
        // query AND fetch optimization
        finishPhase.run();
    } else {
        final IntArrayList[] docIdsToLoad = searchPhaseController.fillDocIdsToLoad(numShards, sortedShardDocs);
        if (sortedShardDocs.length == 0) {
            // no docs to fetch -- sidestep everything and return
            // we have to release contexts here to free up resources
            queryResultsAsList.stream().map(e -> e.value.queryResult())
                    .forEach(this::releaseIrrelevantSearchContext);
            finishPhase.run();
        } else {
            final ScoreDoc[] lastEmittedDocPerShard = isScrollSearch
                    ? searchPhaseController.getLastEmittedDocPerShard(reducedQueryPhase, sortedShardDocs, numShards)
                    : null;
            // we count down every shard in the result no matter if we got any results or not
            final CountedCollector<FetchSearchResult> counter = new CountedCollector<>(fetchResults::set,
                    docIdsToLoad.length, finishPhase, context);
            for (int i = 0; i < docIdsToLoad.length; i++) {
                IntArrayList entry = docIdsToLoad[i];
                QuerySearchResultProvider queryResult = queryResults.get(i);
                if (entry == null) {
                    // no results for this shard ID
                    if (queryResult != null) {
                        // if we got some hits from this shard we have to release the context there
                        // we do this as we go since it will free up resources and passing on the request on the
                        // transport layer is cheap.
                        releaseIrrelevantSearchContext(queryResult.queryResult());
                    }
                    // in any case we count down this result since we don't talk to this shard anymore
                    counter.countDown();
                } else {
                    Transport.Connection connection = context.getConnection(queryResult.shardTarget().getNodeId());
                    ShardFetchSearchRequest fetchSearchRequest = createFetchRequest(queryResult.queryResult().id(), i, entry, lastEmittedDocPerShard);
                    executeFetch(i, queryResult.shardTarget(), counter, fetchSearchRequest, queryResult.queryResult(), connection);
                }
            }
        }
    }
}
Also used : SearchShardTarget(org.elasticsearch.search.SearchShardTarget) InternalSearchResponse(org.elasticsearch.search.internal.InternalSearchResponse) Transport(org.elasticsearch.transport.Transport) ScoreDoc(org.apache.lucene.search.ScoreDoc) AtomicArray(org.elasticsearch.common.util.concurrent.AtomicArray) IOException(java.io.IOException) ShardFetchSearchRequest(org.elasticsearch.search.fetch.ShardFetchSearchRequest) ParameterizedMessage(org.apache.logging.log4j.message.ParameterizedMessage) Function(java.util.function.Function) QuerySearchResultProvider(org.elasticsearch.search.query.QuerySearchResultProvider) List(java.util.List) Logger(org.apache.logging.log4j.Logger) FetchSearchResult(org.elasticsearch.search.fetch.FetchSearchResult) QuerySearchResult(org.elasticsearch.search.query.QuerySearchResult) Supplier(org.apache.logging.log4j.util.Supplier) IntArrayList(com.carrotsearch.hppc.IntArrayList) ActionRunnable(org.elasticsearch.action.ActionRunnable) ActionListener(org.elasticsearch.action.ActionListener)
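The loop above walks per-shard IntArrayList buckets produced by searchPhaseController.fillDocIdsToLoad. A minimal sketch of what such a grouping step does, assuming a simplified Doc stand-in for Lucene's ScoreDoc and plain List<Integer> in place of hppc's IntArrayList (the method name mirrors the call site but is not Elasticsearch's implementation):

```java
import java.util.ArrayList;
import java.util.List;

public class DocIdGrouping {
    // Minimal stand-in for Lucene's ScoreDoc: a doc id plus the shard it came from.
    static final class Doc {
        final int doc;
        final int shardIndex;
        Doc(int doc, int shardIndex) { this.doc = doc; this.shardIndex = shardIndex; }
    }

    // Group globally sorted docs into one id list per shard; entries stay null
    // for shards that contributed no hits, matching the null check in innerRun.
    static List<Integer>[] fillDocIdsToLoad(int numShards, Doc[] sortedDocs) {
        @SuppressWarnings("unchecked")
        List<Integer>[] docIdsToLoad = new List[numShards];
        for (Doc d : sortedDocs) {
            if (docIdsToLoad[d.shardIndex] == null) {
                docIdsToLoad[d.shardIndex] = new ArrayList<>();
            }
            docIdsToLoad[d.shardIndex].add(d.doc);
        }
        return docIdsToLoad;
    }
}
```

Shards whose bucket stays null are only counted down, which is why the fetch loop can skip the transport round trip for them.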

Example 22 with IntArrayList

Use of com.carrotsearch.hppc.IntArrayList in project elasticsearch by elastic.

From the class MultiTermVectorsShardRequest, method readFrom.

@Override
public void readFrom(StreamInput in) throws IOException {
    super.readFrom(in);
    int size = in.readVInt();
    locations = new IntArrayList(size);
    requests = new ArrayList<>(size);
    for (int i = 0; i < size; i++) {
        locations.add(in.readVInt());
        requests.add(TermVectorsRequest.readTermVectorsRequest(in));
    }
    preference = in.readOptionalString();
}
Also used : IntArrayList(com.carrotsearch.hppc.IntArrayList)
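readFrom restores two parallel lists (locations and requests) that were written size-first with variable-length ints. A hedged sketch of that round trip with plain streams; writeVInt/readVInt are hand-rolled stand-ins for Elasticsearch's StreamOutput/StreamInput, and the integer payload list replaces the TermVectorsRequest objects:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.List;

public class VIntParallelLists {
    // Variable-length int: 7 payload bits per byte, high bit set on all but the last byte.
    static void writeVInt(ByteArrayOutputStream out, int value) {
        while ((value & ~0x7F) != 0) {
            out.write((value & 0x7F) | 0x80);
            value >>>= 7;
        }
        out.write(value);
    }

    static int readVInt(ByteArrayInputStream in) {
        int value = 0, shift = 0, b;
        do {
            b = in.read();
            value |= (b & 0x7F) << shift;
            shift += 7;
        } while ((b & 0x80) != 0);
        return value;
    }

    // Size first, then the two lists interleaved pairwise, mirroring how
    // readFrom above refills `locations` and `requests` in lockstep.
    static byte[] write(List<Integer> locations, List<Integer> payloads) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        writeVInt(out, locations.size());
        for (int i = 0; i < locations.size(); i++) {
            writeVInt(out, locations.get(i));
            writeVInt(out, payloads.get(i));
        }
        return out.toByteArray();
    }

    static List<List<Integer>> read(byte[] bytes) {
        ByteArrayInputStream in = new ByteArrayInputStream(bytes);
        int size = readVInt(in);
        List<Integer> locations = new ArrayList<>(size);
        List<Integer> payloads = new ArrayList<>(size);
        for (int i = 0; i < size; i++) {
            locations.add(readVInt(in));
            payloads.add(readVInt(in));
        }
        return List.of(locations, payloads);
    }
}
```

Writing the size up front is what allows readFrom to presize both collections (`new IntArrayList(size)`), avoiding reallocation during the fill loop.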

Example 23 with IntArrayList

Use of com.carrotsearch.hppc.IntArrayList in project graphhopper by graphhopper.

From the class LandmarkStorage, method createLandmarks.

/**
 * This method calculates the landmarks and initial weightings to & from them.
 */
public void createLandmarks() {
    if (isInitialized())
        throw new IllegalStateException("Initialize the landmark storage only once!");
    // fill 'from' and 'to' weights with maximum value
    long maxBytes = (long) graph.getNodes() * LM_ROW_LENGTH;
    this.landmarkWeightDA.create(2000);
    this.landmarkWeightDA.ensureCapacity(maxBytes);
    for (long pointer = 0; pointer < maxBytes; pointer += 2) {
        landmarkWeightDA.setShort(pointer, (short) SHORT_INFINITY);
    }
    String additionalInfo = "";
    // guess the factor
    if (factor <= 0) {
        // A 'factor' is necessary to store the weight in just a short value without losing too much precision.
        // This factor is delicate to pick; we estimate it from the maximum distance across the graph boundaries.
        // For small areas we use max_bounds_dist*X, otherwise a big fixed value for this distance.
        // Picking the distance too big for small areas can lead to (slightly) suboptimal routes because the
        // rounding errors grow too big. But picking it too small hurts performance:
        // e.g. for Germany at least 1500km is important, otherwise speed is at least twice as slow as for just 1000km.
        BBox bounds = graph.getBounds();
        double distanceInMeter = Helper.DIST_EARTH.calcDist(bounds.maxLat, bounds.maxLon, bounds.minLat, bounds.minLon) * 7;
        if (distanceInMeter > 50_000 * 7 || /* for tests and convenience we do for now: */ !bounds.isValid())
            distanceInMeter = 30_000_000;
        double maxWeight = weighting.getMinWeight(distanceInMeter);
        setMaximumWeight(maxWeight);
        additionalInfo = ", maxWeight:" + maxWeight + ", from max distance:" + distanceInMeter / 1000f + "km";
    }
    LOGGER.info("init landmarks for subnetworks with node count greater than " + minimumNodes + " with factor:" + factor + additionalInfo);
    // special subnetwork 0
    int[] empty = new int[landmarks];
    Arrays.fill(empty, UNSET_SUBNETWORK);
    landmarkIDs.add(empty);
    byte[] subnetworks = new byte[graph.getNodes()];
    Arrays.fill(subnetworks, (byte) UNSET_SUBNETWORK);
    EdgeFilter tarjanFilter = new DefaultEdgeFilter(encoder, false, true);
    IntHashSet blockedEdges = new IntHashSet();
    // the ruleLookup splits certain areas from each other but avoids making this a permanent change so that other algorithms still can route through these regions.
    if (ruleLookup != null && ruleLookup.size() > 0) {
        StopWatch sw = new StopWatch().start();
        blockedEdges = findBorderEdgeIds(ruleLookup);
        tarjanFilter = new BlockedEdgesFilter(encoder, false, true, blockedEdges);
        LOGGER.info("Made " + blockedEdges.size() + " edges inaccessible. Calculated country cut in " + sw.stop().getSeconds() + "s, " + Helper.getMemInfo());
    }
    StopWatch sw = new StopWatch().start();
    // we cannot reuse the components calculated in PrepareRoutingSubnetworks as the edgeIds changed in between (called graph.optimize)
    // also calculating subnetworks from scratch makes bigger problems when working with many oneways
    TarjansSCCAlgorithm tarjanAlgo = new TarjansSCCAlgorithm(graph, tarjanFilter, true);
    List<IntArrayList> graphComponents = tarjanAlgo.findComponents();
    LOGGER.info("Calculated tarjan subnetworks in " + sw.stop().getSeconds() + "s, " + Helper.getMemInfo());
    EdgeExplorer tmpExplorer = graph.createEdgeExplorer(new RequireBothDirectionsEdgeFilter(encoder));
    int nodes = 0;
    for (IntArrayList subnetworkIds : graphComponents) {
        nodes += subnetworkIds.size();
        if (subnetworkIds.size() < minimumNodes)
            continue;
        int index = subnetworkIds.size() - 1;
        // ensure start node is reachable from both sides and no subnetwork is associated
        for (; index >= 0; index--) {
            int nextStartNode = subnetworkIds.get(index);
            if (subnetworks[nextStartNode] == UNSET_SUBNETWORK && GHUtility.count(tmpExplorer.setBaseNode(nextStartNode)) > 0) {
                GHPoint p = createPoint(graph, nextStartNode);
                LOGGER.info("start node: " + nextStartNode + " (" + p + ") subnetwork size: " + subnetworkIds.size() + ", " + Helper.getMemInfo() + ((ruleLookup == null) ? "" : " area:" + ruleLookup.lookupRule(p).getId()));
                if (createLandmarksForSubnetwork(nextStartNode, subnetworks, blockedEdges))
                    break;
            }
        }
        if (index < 0)
            LOGGER.warn("next start node not found in big enough network of size " + subnetworkIds.size() + ", first element is " + subnetworkIds.get(0) + ", " + createPoint(graph, subnetworkIds.get(0)));
    }
    int subnetworkCount = landmarkIDs.size();
    // store all landmark node IDs and one int for the factor itself.
    this.landmarkWeightDA.ensureCapacity(maxBytes + /* landmark weights */ subnetworkCount * landmarks);
    // calculate offset to point into landmark mapping
    long bytePos = maxBytes;
    for (int[] landmarks : landmarkIDs) {
        for (int lmNodeId : landmarks) {
            landmarkWeightDA.setInt(bytePos, lmNodeId);
            bytePos += 4L;
        }
    }
    landmarkWeightDA.setHeader(0 * 4, graph.getNodes());
    landmarkWeightDA.setHeader(1 * 4, landmarks);
    landmarkWeightDA.setHeader(2 * 4, subnetworkCount);
    if (factor * DOUBLE_MLTPL > Integer.MAX_VALUE)
        throw new UnsupportedOperationException("landmark weight factor cannot be bigger than Integer.MAX_VALUE " + factor * DOUBLE_MLTPL);
    landmarkWeightDA.setHeader(3 * 4, (int) Math.round(factor * DOUBLE_MLTPL));
    // serialize fast byte[] into DataAccess
    subnetworkStorage.create(graph.getNodes());
    for (int nodeId = 0; nodeId < subnetworks.length; nodeId++) {
        subnetworkStorage.setSubnetwork(nodeId, subnetworks[nodeId]);
    }
    LOGGER.info("Finished landmark creation. Subnetwork node count sum " + nodes + " vs. nodes " + graph.getNodes());
    initialized = true;
}
Also used : IntHashSet(com.carrotsearch.hppc.IntHashSet) TarjansSCCAlgorithm(com.graphhopper.routing.subnetwork.TarjansSCCAlgorithm) GHPoint(com.graphhopper.util.shapes.GHPoint) BBox(com.graphhopper.util.shapes.BBox) IntArrayList(com.carrotsearch.hppc.IntArrayList)
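createLandmarks fills the weight storage with SHORT_INFINITY and later compresses double weights into shorts via the estimated `factor`. A minimal sketch of that short-with-saturation encoding, assuming constants and method names that are illustrative stand-ins rather than GraphHopper's actual API:

```java
public class WeightScaling {
    // Illustrative stand-ins for the constants used by LandmarkStorage.
    static final int SHORT_INFINITY = 65535;      // "unreached" marker in an unsigned-short slot
    static final int SHORT_MAX = SHORT_INFINITY - 1;

    // Compress a double weight into an unsigned-short slot by dividing by the
    // precision factor; weights beyond the representable range saturate to SHORT_MAX
    // so they never collide with the SHORT_INFINITY marker.
    static int weightToShort(double weight, double factor) {
        int scaled = (int) (weight / factor);
        return scaled >= SHORT_INFINITY ? SHORT_MAX : scaled;
    }

    // Decompress: multiply back by the factor (precision below `factor` is lost).
    static double shortToWeight(int stored, double factor) {
        return stored * factor;
    }
}
```

This is why the comment in the example stresses picking the factor carefully: a factor that is too large throws away precision on short routes, while one that is too small saturates long-distance weights.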

Example 24 with IntArrayList

Use of com.carrotsearch.hppc.IntArrayList in project graphhopper by graphhopper.

From the class LocationIndexTree, method prepareAlgo.

void prepareAlgo() {
    // 0.1 meter should count as 'equal'
    equalNormedDelta = distCalc.calcNormalizedDist(0.1);
    // now calculate the necessary maxDepth d for our current bounds
    // if we assume a minimum resolution like 0.5km for a leaf-tile                
    // n^(depth/2) = toMeter(dLon) / minResolution
    BBox bounds = graph.getBounds();
    if (graph.getNodes() == 0)
        throw new IllegalStateException("Cannot create location index of empty graph!");
    if (!bounds.isValid())
        throw new IllegalStateException("Cannot create location index when graph has invalid bounds: " + bounds);
    double lat = Math.min(Math.abs(bounds.maxLat), Math.abs(bounds.minLat));
    double maxDistInMeter = Math.max((bounds.maxLat - bounds.minLat) / 360 * DistanceCalcEarth.C, (bounds.maxLon - bounds.minLon) / 360 * preciseDistCalc.calcCircumference(lat));
    double tmp = maxDistInMeter / minResolutionInMeter;
    tmp = tmp * tmp;
    IntArrayList tmpEntries = new IntArrayList();
    // the last one is always 4 to reduce costs if only a single entry
    tmp /= 4;
    while (tmp > 1) {
        int tmpNo;
        if (tmp >= 64) {
            tmpNo = 64;
        } else if (tmp >= 16) {
            tmpNo = 16;
        } else if (tmp >= 4) {
            tmpNo = 4;
        } else {
            break;
        }
        tmpEntries.add(tmpNo);
        tmp /= tmpNo;
    }
    tmpEntries.add(4);
    initEntries(tmpEntries.toArray());
    int shiftSum = 0;
    long parts = 1;
    for (int i = 0; i < shifts.length; i++) {
        shiftSum += shifts[i];
        parts *= entries[i];
    }
    if (shiftSum > 64)
        throw new IllegalStateException("sum of all shifts does not fit into a long variable");
    keyAlgo = new SpatialKeyAlgo(shiftSum).bounds(bounds);
    parts = Math.round(Math.sqrt(parts));
    deltaLat = (bounds.maxLat - bounds.minLat) / parts;
    deltaLon = (bounds.maxLon - bounds.minLon) / parts;
}
Also used : SpatialKeyAlgo(com.graphhopper.geohash.SpatialKeyAlgo) BBox(com.graphhopper.util.shapes.BBox) IntArrayList(com.carrotsearch.hppc.IntArrayList) GHPoint(com.graphhopper.util.shapes.GHPoint)
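The while-loop in prepareAlgo derives the per-level branching factors (64, 16 or 4, always ending with 4) from the estimated tile count. The same computation extracted into a runnable sketch, with plain ArrayList<Integer> in place of IntArrayList (the method name is an assumption):

```java
import java.util.ArrayList;
import java.util.List;

public class TreeEntries {
    // Split the estimated tile count (maxDist/minResolution squared) into
    // branching factors of 64, 16 or 4 per tree level; the last level is
    // always 4 to reduce costs when only a single entry remains.
    static List<Integer> calcEntries(double maxDistInMeter, double minResolutionInMeter) {
        double tmp = maxDistInMeter / minResolutionInMeter;
        tmp = tmp * tmp;
        List<Integer> entries = new ArrayList<>();
        tmp /= 4; // reserve the final level of 4
        while (tmp > 1) {
            int tmpNo;
            if (tmp >= 64) tmpNo = 64;
            else if (tmp >= 16) tmpNo = 16;
            else if (tmp >= 4) tmpNo = 4;
            else break;
            entries.add(tmpNo);
            tmp /= tmpNo;
        }
        entries.add(4);
        return entries;
    }
}
```

For example, a 1000m extent at 10m resolution yields 10000 estimated tiles and the level layout [64, 16, 4]; the product of the entries then bounds the spatial key length checked against 64 bits in prepareAlgo.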

Example 25 with IntArrayList

Use of com.carrotsearch.hppc.IntArrayList in project graphhopper by graphhopper.

From the class PrepareRoutingSubnetworksTest, method test481.

@Test
public void test481() {
    // 0->1->3->4->5->6
    //  2        7<--/
    GraphHopperStorage g = createStorage(em);
    g.edge(0, 1, 1, false);
    g.edge(1, 2, 1, false);
    g.edge(2, 0, 1, false);
    g.edge(1, 3, 1, false);
    g.edge(3, 4, 1, false);
    g.edge(4, 5, 1, false);
    g.edge(5, 6, 1, false);
    g.edge(6, 7, 1, false);
    g.edge(7, 4, 1, false);
    PrepareRoutingSubnetworks instance = new PrepareRoutingSubnetworks(g, Collections.singletonList(carFlagEncoder))
            .setMinOneWayNetworkSize(2).setMinNetworkSize(4);
    instance.doWork();
    // only one remaining network
    List<IntArrayList> components = instance.findSubnetworks(new PrepEdgeFilter(carFlagEncoder));
    assertEquals(1, components.size());
}
Also used : IntArrayList(com.carrotsearch.hppc.IntArrayList) PrepEdgeFilter(com.graphhopper.routing.subnetwork.PrepareRoutingSubnetworks.PrepEdgeFilter) GraphHopperStorage(com.graphhopper.storage.GraphHopperStorage) Test(org.junit.Test)

Aggregations

IntArrayList (com.carrotsearch.hppc.IntArrayList): 27
GHIntArrayList (com.graphhopper.coll.GHIntArrayList): 6
PrepEdgeFilter (com.graphhopper.routing.subnetwork.PrepareRoutingSubnetworks.PrepEdgeFilter): 4
FlagEncoder (com.graphhopper.routing.util.FlagEncoder): 4
GraphHopperStorage (com.graphhopper.storage.GraphHopperStorage): 4
Test (org.junit.Test): 4
ScoreDoc (org.apache.lucene.search.ScoreDoc): 3
BBox (com.graphhopper.util.shapes.BBox): 2
GHPoint (com.graphhopper.util.shapes.GHPoint): 2
IOException (java.io.IOException): 2
UUID (java.util.UUID): 2
FetchSearchResult (org.elasticsearch.search.fetch.FetchSearchResult): 2
ShardFetchRequest (org.elasticsearch.search.fetch.ShardFetchRequest): 2
QuerySearchResult (org.elasticsearch.search.query.QuerySearchResult): 2
QuerySearchResultProvider (org.elasticsearch.search.query.QuerySearchResultProvider): 2
IntHashSet (com.carrotsearch.hppc.IntHashSet): 1
IntIndexedContainer (com.carrotsearch.hppc.IntIndexedContainer): 1
IntObjectHashMap (com.carrotsearch.hppc.IntObjectHashMap): 1
GHBitSet (com.graphhopper.coll.GHBitSet): 1
GHBitSetImpl (com.graphhopper.coll.GHBitSetImpl): 1