Example 26 with DynamicPartitionsSpec

Use of org.apache.druid.indexer.partitions.DynamicPartitionsSpec in project druid by druid-io.

From the class ParallelIndexTuningConfigTest, method testSerdeWithMaxNumSubTasksAndMaxNumConcurrentSubTasks.

@Test
public void testSerdeWithMaxNumSubTasksAndMaxNumConcurrentSubTasks() {
    expectedException.expect(IllegalArgumentException.class);
    expectedException.expectMessage("Can't use both maxNumSubTasks and maxNumConcurrentSubTasks");
    final int maxNumSubTasks = 250;
    final ParallelIndexTuningConfig tuningConfig = new ParallelIndexTuningConfig(
        null, null, null, 10, 1000L, null, null, null, null,
        new DynamicPartitionsSpec(100, 100L),
        new IndexSpec(new RoaringBitmapSerdeFactory(true), CompressionStrategy.UNCOMPRESSED, CompressionStrategy.LZF, LongEncodingStrategy.LONGS),
        new IndexSpec(),
        1, false, true, 10000L,
        OffHeapMemorySegmentWriteOutMediumFactory.instance(),
        maxNumSubTasks, maxNumSubTasks,
        100, 20L, new Duration(3600), 128,
        null, null, false, null, null, null, null, null);
}
Also used : IndexSpec(org.apache.druid.segment.IndexSpec) DynamicPartitionsSpec(org.apache.druid.indexer.partitions.DynamicPartitionsSpec) RoaringBitmapSerdeFactory(org.apache.druid.segment.data.RoaringBitmapSerdeFactory) Duration(org.joda.time.Duration) Test(org.junit.Test)
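
For reference, a minimal, standalone sketch of the partitions spec passed into the tuning config above, showing what the two constructor arguments map to. The getter names follow the Druid source as exercised by these tests; verify them against your version.

import org.apache.druid.indexer.partitions.DynamicPartitionsSpec;

public class DynamicPartitionsSpecSketch {
    public static void main(String[] args) {
        // maxRowsPerSegment = 100, maxTotalRows = 100, matching the spec used in the test above
        DynamicPartitionsSpec spec = new DynamicPartitionsSpec(100, 100L);
        System.out.println(spec.getMaxRowsPerSegment()); // 100
        System.out.println(spec.getMaxTotalRows());      // 100
    }
}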

Example 27 with DynamicPartitionsSpec

Use of org.apache.druid.indexer.partitions.DynamicPartitionsSpec in project druid by druid-io.

From the class ParallelIndexTuningConfigTest, method testConstructorWithDynamicPartitionsSpecAndForceGuaranteedRollupFailToCreate.

@Test
public void testConstructorWithDynamicPartitionsSpecAndForceGuaranteedRollupFailToCreate() {
    expectedException.expect(IllegalArgumentException.class);
    expectedException.expectMessage("cannot be used for perfect rollup");
    final boolean forceGuaranteedRollup = true;
    new ParallelIndexTuningConfig(
        null, null, null, 10, 1000L, null, null, null, null,
        new DynamicPartitionsSpec(100, null),
        new IndexSpec(new RoaringBitmapSerdeFactory(true), CompressionStrategy.UNCOMPRESSED, CompressionStrategy.LZF, LongEncodingStrategy.LONGS),
        new IndexSpec(),
        1, forceGuaranteedRollup, true, 10000L,
        OffHeapMemorySegmentWriteOutMediumFactory.instance(),
        null, 10,
        100, 20L, new Duration(3600), 128,
        null, null, false, null, null, null, null, null);
}
Also used : IndexSpec(org.apache.druid.segment.IndexSpec) DynamicPartitionsSpec(org.apache.druid.indexer.partitions.DynamicPartitionsSpec) RoaringBitmapSerdeFactory(org.apache.druid.segment.data.RoaringBitmapSerdeFactory) Duration(org.joda.time.Duration) Test(org.junit.Test)
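
The failure above comes from the tuning config rejecting best-effort (dynamic) partitioning when forceGuaranteedRollup is set. Below is a minimal sketch of that guard, written with Guava's Preconditions as an illustration rather than quoting Druid's actual validation code; the validate helper and its wording are this sketch's own.

import com.google.common.base.Preconditions;
import org.apache.druid.indexer.partitions.DynamicPartitionsSpec;
import org.apache.druid.indexer.partitions.PartitionsSpec;

public class RollupGuardSketch {
    // Illustrative guard only; the real check lives inside ParallelIndexTuningConfig's constructor.
    static void validate(boolean forceGuaranteedRollup, PartitionsSpec partitionsSpec) {
        if (forceGuaranteedRollup) {
            // Dynamic (best-effort) partitioning cannot guarantee perfect rollup.
            Preconditions.checkArgument(
                !(partitionsSpec instanceof DynamicPartitionsSpec),
                "%s cannot be used for perfect rollup",
                partitionsSpec.getClass().getSimpleName()
            );
        }
    }

    public static void main(String[] args) {
        validate(true, new DynamicPartitionsSpec(100, null)); // throws IllegalArgumentException
    }
}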

Example 28 with DynamicPartitionsSpec

Use of org.apache.druid.indexer.partitions.DynamicPartitionsSpec in project druid by druid-io.

From the class ParallelIndexTuningConfigTest, method testSerdeWithMaxRowsPerSegment.

@Test
public void testSerdeWithMaxRowsPerSegment() throws IOException {
    final ParallelIndexTuningConfig tuningConfig = new ParallelIndexTuningConfig(
        null, null, null, 10, 1000L, null, null, null, null,
        new DynamicPartitionsSpec(100, 100L),
        new IndexSpec(new RoaringBitmapSerdeFactory(true), CompressionStrategy.UNCOMPRESSED, CompressionStrategy.LZF, LongEncodingStrategy.LONGS),
        new IndexSpec(),
        1, false, true, 10000L,
        OffHeapMemorySegmentWriteOutMediumFactory.instance(),
        null, 250,
        100, 20L, new Duration(3600), 128,
        null, null, false, null, null, null, null, null);
    final byte[] json = mapper.writeValueAsBytes(tuningConfig);
    final ParallelIndexTuningConfig fromJson = (ParallelIndexTuningConfig) mapper.readValue(json, TuningConfig.class);
    Assert.assertEquals(fromJson, tuningConfig);
}
Also used : TuningConfig(org.apache.druid.segment.indexing.TuningConfig) IndexSpec(org.apache.druid.segment.IndexSpec) DynamicPartitionsSpec(org.apache.druid.indexer.partitions.DynamicPartitionsSpec) RoaringBitmapSerdeFactory(org.apache.druid.segment.data.RoaringBitmapSerdeFactory) Duration(org.joda.time.Duration) Test(org.junit.Test)
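
The same round trip can be demonstrated on the partitions spec alone. This is a minimal sketch assuming PartitionsSpec carries the usual Jackson polymorphic-type annotations (with the registered type name "dynamic"), so a plain ObjectMapper can resolve the subtype; verify the JSON shape against your Druid version.

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.druid.indexer.partitions.DynamicPartitionsSpec;
import org.apache.druid.indexer.partitions.PartitionsSpec;

public class PartitionsSpecSerdeSketch {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        DynamicPartitionsSpec original = new DynamicPartitionsSpec(100, 100L);
        // Expected shape (assumed): {"type":"dynamic","maxRowsPerSegment":100,"maxTotalRows":100}
        String json = mapper.writeValueAsString(original);
        PartitionsSpec fromJson = mapper.readValue(json, PartitionsSpec.class);
        // DynamicPartitionsSpec implements equals(), so the round trip is value-preserving
        System.out.println(original.equals(fromJson)); // true
    }
}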

Example 29 with DynamicPartitionsSpec

Use of org.apache.druid.indexer.partitions.DynamicPartitionsSpec in project druid by druid-io.

From the class ParallelIndexTuningConfigTest, method testSerdeWithMaxNumConcurrentSubTasks.

@Test
public void testSerdeWithMaxNumConcurrentSubTasks() throws IOException {
    final int maxNumConcurrentSubTasks = 250;
    final ParallelIndexTuningConfig tuningConfig = new ParallelIndexTuningConfig(
        null, null, null, 10, 1000L, null, null, null, null,
        new DynamicPartitionsSpec(100, 100L),
        new IndexSpec(new RoaringBitmapSerdeFactory(true), CompressionStrategy.UNCOMPRESSED, CompressionStrategy.LZF, LongEncodingStrategy.LONGS),
        new IndexSpec(),
        1, false, true, 10000L,
        OffHeapMemorySegmentWriteOutMediumFactory.instance(),
        null, maxNumConcurrentSubTasks,
        100, 20L, new Duration(3600), 128,
        null, null, false, null, null, null, null, null);
    final byte[] json = mapper.writeValueAsBytes(tuningConfig);
    final ParallelIndexTuningConfig fromJson = (ParallelIndexTuningConfig) mapper.readValue(json, TuningConfig.class);
    Assert.assertEquals(fromJson, tuningConfig);
}
Also used : TuningConfig(org.apache.druid.segment.indexing.TuningConfig) IndexSpec(org.apache.druid.segment.IndexSpec) DynamicPartitionsSpec(org.apache.druid.indexer.partitions.DynamicPartitionsSpec) RoaringBitmapSerdeFactory(org.apache.druid.segment.data.RoaringBitmapSerdeFactory) Duration(org.joda.time.Duration) Test(org.junit.Test)
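
Several of the specs above pass maxTotalRows as null. The sketch below shows one way a caller might apply a fallback in that case; the fallback value is arbitrary and chosen here purely for illustration.

import org.apache.druid.indexer.partitions.DynamicPartitionsSpec;

public class MaxTotalRowsFallbackSketch {
    public static void main(String[] args) {
        // maxRowsPerSegment = 100, maxTotalRows left null, as in several specs in these tests
        DynamicPartitionsSpec spec = new DynamicPartitionsSpec(100, null);
        Long maxTotalRows = spec.getMaxTotalRows();
        // Arbitrary illustrative fallback; Druid applies its own defaults internally
        long effective = maxTotalRows == null ? 20_000_000L : maxTotalRows;
        System.out.println(effective); // 20000000
    }
}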

Example 30 with DynamicPartitionsSpec

Use of org.apache.druid.indexer.partitions.DynamicPartitionsSpec in project druid by druid-io.

From the class PartialCompactionTest, method testPartialCompactHashAndDynamicPartitionedSegments.

@Test
public void testPartialCompactHashAndDynamicPartitionedSegments() {
    final Map<Interval, List<DataSegment>> hashPartitionedSegments = SegmentUtils.groupSegmentsByInterval(
        runTestTask(new HashedPartitionsSpec(null, 3, null), TaskState.SUCCESS, false));
    final Map<Interval, List<DataSegment>> linearlyPartitionedSegments = SegmentUtils.groupSegmentsByInterval(
        runTestTask(new DynamicPartitionsSpec(10, null), TaskState.SUCCESS, true));
    // Pick half of each partition list to compact together
    hashPartitionedSegments.values().forEach(
        segmentsInInterval -> segmentsInInterval.sort(Comparator.comparing(segment -> segment.getShardSpec().getPartitionNum())));
    linearlyPartitionedSegments.values().forEach(
        segmentsInInterval -> segmentsInInterval.sort(Comparator.comparing(segment -> segment.getShardSpec().getPartitionNum())));
    final List<DataSegment> segmentsToCompact = new ArrayList<>();
    for (List<DataSegment> segmentsInInterval : hashPartitionedSegments.values()) {
        segmentsToCompact.addAll(segmentsInInterval.subList(segmentsInInterval.size() / 2, segmentsInInterval.size()));
    }
    for (List<DataSegment> segmentsInInterval : linearlyPartitionedSegments.values()) {
        segmentsToCompact.addAll(segmentsInInterval.subList(0, segmentsInInterval.size() / 2));
    }
    final CompactionTask compactionTask = newCompactionTaskBuilder()
        .inputSpec(SpecificSegmentsSpec.fromSegments(segmentsToCompact))
        .tuningConfig(newTuningConfig(new DynamicPartitionsSpec(20, null), 2, false))
        .build();
    final Map<Interval, List<DataSegment>> compactedSegments = SegmentUtils.groupSegmentsByInterval(
        runTask(compactionTask, TaskState.SUCCESS));
    for (List<DataSegment> segmentsInInterval : compactedSegments.values()) {
        final int expectedAtomicUpdateGroupSize = segmentsInInterval.size();
        for (DataSegment segment : segmentsInInterval) {
            Assert.assertEquals(expectedAtomicUpdateGroupSize, segment.getShardSpec().getAtomicUpdateGroupSize());
        }
    }
}
Also used : HashedPartitionsSpec(org.apache.druid.indexer.partitions.HashedPartitionsSpec) DynamicPartitionsSpec(org.apache.druid.indexer.partitions.DynamicPartitionsSpec) CompactionTask(org.apache.druid.indexing.common.task.CompactionTask) ArrayList(java.util.ArrayList) ArrayList(java.util.ArrayList) List(java.util.List) DataSegment(org.apache.druid.timeline.DataSegment) Interval(org.joda.time.Interval) Test(org.junit.Test)
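
The selection logic above keeps the upper half of each hash-partitioned interval and the lower half of each dynamically partitioned interval. Here is a self-contained sketch of that split on plain lists; DataSegment and the shard-spec ordering are left out for brevity, and the helper names are this sketch's own.

import java.util.Arrays;
import java.util.List;

public class HalfSplitSketch {
    // Upper half, mirroring subList(size / 2, size) in the test above
    static <T> List<T> upperHalf(List<T> sorted) {
        return sorted.subList(sorted.size() / 2, sorted.size());
    }

    // Lower half, mirroring subList(0, size / 2) in the test above
    static <T> List<T> lowerHalf(List<T> sorted) {
        return sorted.subList(0, sorted.size() / 2);
    }

    public static void main(String[] args) {
        List<Integer> partitions = Arrays.asList(0, 1, 2, 3, 4); // already sorted by partition number
        System.out.println(upperHalf(partitions)); // [2, 3, 4]
        System.out.println(lowerHalf(partitions)); // [0, 1]
    }
}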

Aggregations

Classes co-occurring with DynamicPartitionsSpec across the matched sources, with usage counts:

DynamicPartitionsSpec (org.apache.druid.indexer.partitions.DynamicPartitionsSpec): 52
Test (org.junit.Test): 34
IndexSpec (org.apache.druid.segment.IndexSpec): 19
List (java.util.List): 15
Map (java.util.Map): 15
ImmutableList (com.google.common.collect.ImmutableList): 13
StringUtils (org.apache.druid.java.util.common.StringUtils): 13
DataSegment (org.apache.druid.timeline.DataSegment): 13
ImmutableMap (com.google.common.collect.ImmutableMap): 12
HashMap (java.util.HashMap): 11
Function (java.util.function.Function): 11
Pair (org.apache.druid.java.util.common.Pair): 11
Closeable (java.io.Closeable): 10
DimensionsSpec (org.apache.druid.data.input.impl.DimensionsSpec): 10
RoaringBitmapSerdeFactory (org.apache.druid.segment.data.RoaringBitmapSerdeFactory): 10
Duration (org.joda.time.Duration): 10
Interval (org.joda.time.Interval): 10
ArrayList (java.util.ArrayList): 9
UUID (java.util.UUID): 9
UniformGranularitySpec (org.apache.druid.segment.indexing.granularity.UniformGranularitySpec): 9