Use of org.apache.druid.query.SegmentDescriptor in project druid by druid-io.
In the class CoordinatorBasedSegmentHandoffNotifierTest, the method testHandoffChecksForPartitionNumber:
@Test
public void testHandoffChecksForPartitionNumber() {
  Interval interval = Intervals.of("2011-04-01/2011-04-02");
  Assert.assertTrue(CoordinatorBasedSegmentHandoffNotifier.isHandOffComplete(
      Collections.singletonList(new ImmutableSegmentLoadInfo(createSegment(interval, "v1", 1), Sets.newHashSet(createHistoricalServerMetadata("a")))),
      new SegmentDescriptor(interval, "v1", 1)
  ));
  Assert.assertFalse(CoordinatorBasedSegmentHandoffNotifier.isHandOffComplete(
      Collections.singletonList(new ImmutableSegmentLoadInfo(createSegment(interval, "v1", 1), Sets.newHashSet(createHistoricalServerMetadata("a")))),
      new SegmentDescriptor(interval, "v1", 2)
  ));
}
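The two assertions above pin down that handoff is tracked per partition: partition 1 being loaded does not complete handoff for partition 2 of the same interval and version. A minimal sketch of that matching rule, written against SegmentDescriptor only (an illustration, not CoordinatorBasedSegmentHandoffNotifier's actual implementation):

import org.apache.druid.query.SegmentDescriptor;

// Illustration only: handoff needs interval, version, and partition number to all line up.
static boolean sameSegment(SegmentDescriptor loaded, SegmentDescriptor requested) {
  return loaded.getInterval().equals(requested.getInterval())
      && loaded.getVersion().equals(requested.getVersion())
      && loaded.getPartitionNumber() == requested.getPartitionNumber();
}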
Use of org.apache.druid.query.SegmentDescriptor in project druid by druid-io.
In the class DataSourcesResourceTest, the method testSegmentLoadChecksForInterval:
@Test
public void testSegmentLoadChecksForInterval() {
  Assert.assertFalse(DataSourcesResource.isSegmentLoaded(
      Collections.singletonList(new ImmutableSegmentLoadInfo(createSegment(Intervals.of("2011-04-01/2011-04-02"), "v1", 1), Sets.newHashSet(createHistoricalServerMetadata("a")))),
      new SegmentDescriptor(Intervals.of("2011-04-01/2011-04-03"), "v1", 1)
  ));
  Assert.assertTrue(DataSourcesResource.isSegmentLoaded(
      Collections.singletonList(new ImmutableSegmentLoadInfo(createSegment(Intervals.of("2011-04-01/2011-04-04"), "v1", 1), Sets.newHashSet(createHistoricalServerMetadata("a")))),
      new SegmentDescriptor(Intervals.of("2011-04-02/2011-04-03"), "v1", 1)
  ));
}
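The difference between the two assertions is consistent with a containment check: the loaded segment's interval has to fully contain the descriptor's interval, and mere overlap (as in the first call, where the descriptor runs through 2011-04-03 but the segment ends at 2011-04-02) is not enough. A small illustration with the same interval helper (not DataSourcesResource code):

// Illustration only: containment vs. overlap for the intervals used above
// (Intervals.of is org.apache.druid.java.util.common.Intervals, returning a Joda-Time Interval).
boolean covered = Intervals.of("2011-04-01/2011-04-04").contains(Intervals.of("2011-04-02/2011-04-03")); // true, matches the assertTrue case
boolean partial = Intervals.of("2011-04-01/2011-04-02").contains(Intervals.of("2011-04-01/2011-04-03")); // false, descriptor extends past the segment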
Use of org.apache.druid.query.SegmentDescriptor in project druid by druid-io.
In the class DataSourcesResourceTest, the method testSegmentLoadChecksForAssignableServer:
@Test
public void testSegmentLoadChecksForAssignableServer() {
  Interval interval = Intervals.of("2011-04-01/2011-04-02");
  Assert.assertTrue(DataSourcesResource.isSegmentLoaded(
      Collections.singletonList(new ImmutableSegmentLoadInfo(createSegment(interval, "v1", 2), Sets.newHashSet(createHistoricalServerMetadata("a")))),
      new SegmentDescriptor(interval, "v1", 2)
  ));
  Assert.assertFalse(DataSourcesResource.isSegmentLoaded(
      Collections.singletonList(new ImmutableSegmentLoadInfo(createSegment(interval, "v1", 2), Sets.newHashSet(createRealtimeServerMetadata("a")))),
      new SegmentDescriptor(interval, "v1", 2)
  ));
}
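In this test the segment and descriptor are identical across both calls; only the type of serving node changes, which pins down that a segment served solely by a realtime node does not count as loaded. A hedged sketch of that kind of server-type check (an assumed shape for illustration, not DataSourcesResource's actual code):

import org.apache.druid.server.coordination.DruidServerMetadata;
import org.apache.druid.server.coordination.ServerType;

// Sketch: a descriptor only counts as loaded if at least one serving node is a historical.
static boolean servedByHistorical(Iterable<DruidServerMetadata> servers) {
  for (DruidServerMetadata server : servers) {
    if (server.getType() == ServerType.HISTORICAL) {
      return true;
    }
  }
  return false;
}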
Use of org.apache.druid.query.SegmentDescriptor in project druid by druid-io.
In the class DataSourcesResourceTest, the method testSegmentLoadChecksForPartitionNumber:
@Test
public void testSegmentLoadChecksForPartitionNumber() {
  Interval interval = Intervals.of("2011-04-01/2011-04-02");
  Assert.assertTrue(DataSourcesResource.isSegmentLoaded(
      Collections.singletonList(new ImmutableSegmentLoadInfo(createSegment(interval, "v1", 1), Sets.newHashSet(createHistoricalServerMetadata("a")))),
      new SegmentDescriptor(interval, "v1", 1)
  ));
  Assert.assertFalse(DataSourcesResource.isSegmentLoaded(
      Collections.singletonList(new ImmutableSegmentLoadInfo(createSegment(interval, "v1", 1), Sets.newHashSet(createHistoricalServerMetadata("a")))),
      new SegmentDescriptor(interval, "v1", 2)
  ));
}
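Each of the DataSourcesResourceTest methods above calls a createSegment(interval, version, partitionNumber) helper that this listing does not show. The following is a hypothetical stand-in, under the assumption that it builds a bare DataSegment with a NumberedShardSpec; the real helper in the Druid test classes may differ.

import org.apache.druid.timeline.DataSegment;
import org.apache.druid.timeline.partition.NumberedShardSpec;
import org.joda.time.Interval;

// Hypothetical helper, not the one from the Druid test classes; the data source name and size are made up.
private static DataSegment createSegment(Interval interval, String version, int partitionNumber) {
  return DataSegment.builder()
                    .dataSource("test_ds")
                    .interval(interval)
                    .version(version)
                    .shardSpec(new NumberedShardSpec(partitionNumber, 0))
                    .size(0)
                    .build();
}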
Use of org.apache.druid.query.SegmentDescriptor in project druid by druid-io.
In the class TestClusterQuerySegmentWalker, the method getSegmentsForTable:
private List<WindowedSegment> getSegmentsForTable(final String dataSource, final Iterable<SegmentDescriptor> specs) {
  final VersionedIntervalTimeline<String, ReferenceCountingSegment> timeline = timelines.get(dataSource);
  if (timeline == null) {
    return Collections.emptyList();
  } else {
    final List<WindowedSegment> retVal = new ArrayList<>();
    for (SegmentDescriptor spec : specs) {
      final PartitionChunk<ReferenceCountingSegment> entry =
          timeline.findChunk(spec.getInterval(), spec.getVersion(), spec.getPartitionNumber());
      retVal.add(new WindowedSegment(entry.getObject(), spec.getInterval()));
    }
    return retVal;
  }
}
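The loop above assumes every descriptor resolves to a chunk; if findChunk returned null for an unknown interval/version/partition combination, the call to entry.getObject() would fail with a NullPointerException. A defensive variant of the loop body (a sketch, not the project's code) could skip unresolved descriptors instead:

for (SegmentDescriptor spec : specs) {
  final PartitionChunk<ReferenceCountingSegment> entry =
      timeline.findChunk(spec.getInterval(), spec.getVersion(), spec.getPartitionNumber());
  if (entry != null) {
    // assumed: a null result means no chunk matches this descriptor
    retVal.add(new WindowedSegment(entry.getObject(), spec.getInterval()));
  }
}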