
Example 36 with DataSegment

Use of org.apache.druid.timeline.DataSegment in project druid by druid-io, from the class QueryRunnerBasedOnClusteredClientTestBase, method prepareCluster.

protected void prepareCluster(int numServers) {
    Preconditions.checkArgument(numServers < 25, "Cannot be larger than 24");
    for (int i = 0; i < numServers; i++) {
        final int partitionId = i % 2;
        final int intervalIndex = i / 2;
        final Interval interval = Intervals.of("2000-01-01T%02d/PT1H", intervalIndex);
        final DataSegment segment = newSegment(interval, partitionId, 2);
        addServer(SimpleServerView.createServer(i + 1), segment, generateSegment(segment));
    }
}
Also used : DataSegment(org.apache.druid.timeline.DataSegment) Interval(org.joda.time.Interval)
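The partition arithmetic in the loop above is easy to misread; a standalone sketch (plain Java, no Druid classes, hypothetical class name) of how servers map onto intervals and partitions via the `i % 2` / `i / 2` split:

```java
// Standalone sketch of the server layout produced by prepareCluster:
// two partitions per hourly interval, one server per partition.
public class ClusterLayoutSketch {
    public static void main(String[] args) {
        int numServers = 6;
        for (int i = 0; i < numServers; i++) {
            int partitionId = i % 2;   // alternates 0, 1 between consecutive servers
            int intervalIndex = i / 2; // advances every two servers
            // mirrors Intervals.of("2000-01-01T%02d/PT1H", intervalIndex)
            String interval = String.format("2000-01-01T%02d/PT1H", intervalIndex);
            System.out.printf("server %d -> %s, partition %d%n", i + 1, interval, partitionId);
        }
        // servers 1 and 2 share hour 00, servers 3 and 4 share hour 01, and so on
    }
}
```

This is why the method caps `numServers` at 24: with two partitions per hour, 24 servers fill at most hours 00 through 11 of the single day the interval format covers.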

Example 37 with DataSegment

Use of org.apache.druid.timeline.DataSegment in project druid by druid-io, from the class RetryQueryRunnerTest, method dropSegmentFromServerAndAddNewServerForSegment.

/**
 * Drops a segment from the {@code fromServer} and creates a new server serving the dropped segment.
 * This method updates the server view.
 */
private void dropSegmentFromServerAndAddNewServerForSegment(DruidServer fromServer) {
    final NonnullPair<DataSegment, QueryableIndex> pair = unannounceSegmentFromServer(fromServer);
    final DataSegment segmentToMove = pair.lhs;
    final QueryableIndex queryableIndexToMove = pair.rhs;
    addServer(SimpleServerView.createServer(11), segmentToMove, queryableIndexToMove);
}
Also used : QueryableIndex(org.apache.druid.segment.QueryableIndex) DataSegment(org.apache.druid.timeline.DataSegment)
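The move in dropSegmentFromServerAndAddNewServerForSegment is an unannounce followed by an announce. A minimal self-contained sketch of that two-step server-view update, with plain Java maps standing in for the view (names are illustrative, not Druid's API):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Illustrative sketch, not Druid's API: a "server view" as a map from server
// name to the set of segment ids it serves. Moving a segment is a remove on
// the old server followed by an add on the new one.
public class MoveSegmentSketch {
    public static void main(String[] args) {
        Map<String, Set<String>> serverView = new HashMap<>();
        serverView.put("server1", new HashSet<>(Set.of("segment-2000-01-01")));
        serverView.put("server11", new HashSet<>());

        // unannounce from the old server...
        boolean dropped = serverView.get("server1").remove("segment-2000-01-01");
        // ...then announce on the newly created server
        if (dropped) {
            serverView.get("server11").add("segment-2000-01-01");
        }
        System.out.println(serverView.get("server11")); // now serves the moved segment
    }
}
```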

Example 38 with DataSegment

Use of org.apache.druid.timeline.DataSegment in project druid by druid-io, from the class OmniDataSegmentKillerTest, method testKillSegmentUnknowType.

@Test
public void testKillSegmentUnknowType() {
    final DataSegment segment = Mockito.mock(DataSegment.class);
    Mockito.when(segment.getLoadSpec()).thenReturn(ImmutableMap.of("type", "unknown-type"));
    final Injector injector = createInjector(null);
    final OmniDataSegmentKiller segmentKiller = injector.getInstance(OmniDataSegmentKiller.class);
    Assert.assertThrows("Unknown loader type[unknown-type]. Known types are [explode]", SegmentLoadingException.class, () -> segmentKiller.kill(segment));
}
Also used : Injector(com.google.inject.Injector) DataSegment(org.apache.druid.timeline.DataSegment) Test(org.junit.Test)
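The test exercises OmniDataSegmentKiller's dispatch on the load spec's "type" entry. A minimal sketch of that dispatch pattern, using hypothetical names rather than Druid's actual implementation:

```java
import java.util.Map;

// Hypothetical sketch of type-based dispatch, as the test above exercises:
// look up a handler by the load spec's "type" key and fail loudly otherwise.
public class KillerDispatchSketch {
    static void kill(Map<String, Runnable> killers, Map<String, Object> loadSpec) {
        String type = (String) loadSpec.get("type");
        Runnable killer = killers.get(type);
        if (killer == null) {
            throw new IllegalArgumentException(
                "Unknown loader type[" + type + "]. Known types are " + killers.keySet());
        }
        killer.run();
    }

    public static void main(String[] args) {
        Map<String, Runnable> killers =
            Map.of("local", () -> System.out.println("killed local segment"));
        kill(killers, Map.of("type", "local"));
        try {
            kill(killers, Map.of("type", "unknown-type"));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The assertThrows message in the test follows the same shape: the unknown type is echoed back along with the set of registered types.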

Example 39 with DataSegment

Use of org.apache.druid.timeline.DataSegment in project druid by druid-io, from the class SegmentLocalCacheManagerConcurrencyTest, method testGetSegment.

@Test
public void testGetSegment() throws IOException, ExecutionException, InterruptedException {
    final File localStorageFolder = tmpFolder.newFolder("local_storage_folder");
    final List<DataSegment> segmentsToLoad = new ArrayList<>(4);
    final Interval interval = Intervals.of("2019-01-01/P1D");
    for (int partitionId = 0; partitionId < 4; partitionId++) {
        final String segmentPath = Paths.get(
            localStorageFolder.getCanonicalPath(),
            dataSource,
            StringUtils.format("%s_%s", interval.getStart().toString(), interval.getEnd().toString()),
            segmentVersion,
            String.valueOf(partitionId)
        ).toString();
        // manually create a local segment under localStorageFolder
        final File localSegmentFile = new File(localStorageFolder, segmentPath);
        FileUtils.mkdirp(localSegmentFile);
        final File indexZip = new File(localSegmentFile, "index.zip");
        indexZip.createNewFile();
        final DataSegment segment = newSegment(interval, partitionId)
            .withLoadSpec(ImmutableMap.of("type", "local", "path", localSegmentFile.getAbsolutePath()));
        segmentsToLoad.add(segment);
    }
    final List<Future> futures = segmentsToLoad.stream()
        .map(segment -> executorService.submit(() -> manager.getSegmentFiles(segment)))
        .collect(Collectors.toList());
    expectedException.expect(ExecutionException.class);
    expectedException.expectCause(CoreMatchers.instanceOf(SegmentLoadingException.class));
    expectedException.expectMessage("Failed to load segment");
    for (Future future : futures) {
        future.get();
    }
}
Also used : CoreMatchers(org.hamcrest.CoreMatchers) InjectableValues(com.fasterxml.jackson.databind.InjectableValues) Intervals(org.apache.druid.java.util.common.Intervals) ArrayList(java.util.ArrayList) Interval(org.joda.time.Interval) Future(java.util.concurrent.Future) ImmutableList(com.google.common.collect.ImmutableList) After(org.junit.After) NamedType(com.fasterxml.jackson.databind.jsontype.NamedType) ExpectedException(org.junit.rules.ExpectedException) FileUtils(org.apache.druid.java.util.common.FileUtils) NoopServiceEmitter(org.apache.druid.server.metrics.NoopServiceEmitter) ExecutorService(java.util.concurrent.ExecutorService) Before(org.junit.Before) DateTimes(org.apache.druid.java.util.common.DateTimes) Execs(org.apache.druid.java.util.common.concurrent.Execs) EmittingLogger(org.apache.druid.java.util.emitter.EmittingLogger) ImmutableMap(com.google.common.collect.ImmutableMap) NumberedShardSpec(org.apache.druid.timeline.partition.NumberedShardSpec) ObjectMapper(com.fasterxml.jackson.databind.ObjectMapper) StringUtils(org.apache.druid.java.util.common.StringUtils) Test(org.junit.Test) IOException(java.io.IOException) Collectors(java.util.stream.Collectors) File(java.io.File) DefaultObjectMapper(org.apache.druid.jackson.DefaultObjectMapper) ExecutionException(java.util.concurrent.ExecutionException) List(java.util.List) Rule(org.junit.Rule) Paths(java.nio.file.Paths) DataSegment(org.apache.druid.timeline.DataSegment) TemporaryFolder(org.junit.rules.TemporaryFolder)
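The test relies on every Future.get() rethrowing the task's failure wrapped in an ExecutionException. A self-contained sketch of that propagation, with a throwing Callable standing in for manager.getSegmentFiles:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Standalone sketch: an exception thrown inside a submitted task surfaces
// from Future.get() wrapped in ExecutionException, which is what the
// ExpectedException rules in the test above assert.
public class FuturePropagationSketch {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        List<Future<Void>> futures = new ArrayList<>();
        for (String segment : List.of("segment-0", "segment-1")) {
            futures.add(pool.submit((Callable<Void>) () -> {
                throw new IllegalStateException("Failed to load segment " + segment);
            }));
        }
        for (Future<Void> future : futures) {
            try {
                future.get();
            } catch (ExecutionException e) {
                // the original failure is available as the cause
                System.out.println(e.getCause().getMessage());
            }
        }
        pool.shutdown();
    }
}
```

This matches the test's expectCause/expectMessage pair: the ExecutionException is the wrapper, and the SegmentLoadingException with "Failed to load segment" is its cause.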

Example 40 with DataSegment

Use of org.apache.druid.timeline.DataSegment in project druid by druid-io, from the class SqlSegmentsMetadataManagerTest, method testMarkAsUnusedAllSegmentsInDataSource.

@Test(timeout = 60_000)
public void testMarkAsUnusedAllSegmentsInDataSource() throws IOException, InterruptedException {
    sqlSegmentsMetadataManager.startPollingDatabasePeriodically();
    sqlSegmentsMetadataManager.poll();
    Assert.assertTrue(sqlSegmentsMetadataManager.isPollingDatabasePeriodically());
    final String newDataSource = "wikipedia2";
    final DataSegment newSegment = createNewSegment1(newDataSource);
    publisher.publishSegment(newSegment);
    awaitDataSourceAppeared(newDataSource);
    int numChangedSegments = sqlSegmentsMetadataManager.markAsUnusedAllSegmentsInDataSource(newDataSource);
    Assert.assertEquals(1, numChangedSegments);
    awaitDataSourceDisappeared(newDataSource);
    Assert.assertNull(sqlSegmentsMetadataManager.getImmutableDataSourceWithUsedSegments(newDataSource));
}
Also used : DataSegment(org.apache.druid.timeline.DataSegment) Test(org.junit.Test)
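markAsUnusedAllSegmentsInDataSource returns the number of segments it flipped to unused, and the test then checks that the datasource disappears from the used-segments view. A toy sketch of that contract, with an in-memory map standing in for the SQL metadata store (names here are illustrative, not Druid's implementation):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch (not Druid's implementation): mark every segment of a
// datasource unused and report how many entries changed, mirroring the
// contract asserted in the test above.
public class MarkUnusedSketch {
    // segmentId -> used flag, with the datasource encoded as a key prefix
    static int markAllUnused(Map<String, Boolean> used, String dataSource) {
        int changed = 0;
        for (Map.Entry<String, Boolean> e : used.entrySet()) {
            if (e.getKey().startsWith(dataSource + "_") && e.getValue()) {
                e.setValue(false);
                changed++;
            }
        }
        return changed;
    }

    public static void main(String[] args) {
        Map<String, Boolean> used = new HashMap<>();
        used.put("wikipedia2_2012-01-01", true);
        used.put("wikipedia_2012-01-01", true);
        System.out.println(markAllUnused(used, "wikipedia2")); // prints 1
        // wikipedia2 now has no used segments, so a used-segments view of it is empty
    }
}
```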

Aggregations

DataSegment (org.apache.druid.timeline.DataSegment): 612
Test (org.junit.Test): 386
ArrayList (java.util.ArrayList): 161
Interval (org.joda.time.Interval): 158
File (java.io.File): 138
Map (java.util.Map): 110
List (java.util.List): 108
ImmutableList (com.google.common.collect.ImmutableList): 77
IOException (java.io.IOException): 77
HashMap (java.util.HashMap): 74
ImmutableMap (com.google.common.collect.ImmutableMap): 72
NumberedShardSpec (org.apache.druid.timeline.partition.NumberedShardSpec): 68
HashSet (java.util.HashSet): 58
TaskStatus (org.apache.druid.indexer.TaskStatus): 53
Collectors (java.util.stream.Collectors): 52
Set (java.util.Set): 50
CountDownLatch (java.util.concurrent.CountDownLatch): 50
ISE (org.apache.druid.java.util.common.ISE): 50
SegmentId (org.apache.druid.timeline.SegmentId): 47
LinearShardSpec (org.apache.druid.timeline.partition.LinearShardSpec): 45