
Example 6 with RecordPublisher

Use of org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher in the Apache Flink project.

From the class PollingRecordPublisherFactoryTest, method testBuildPollingRecordPublisher.

@Test
public void testBuildPollingRecordPublisher() throws Exception {
    RecordPublisher recordPublisher = factory.create(
            StartingPosition.restartFromSequenceNumber(SENTINEL_LATEST_SEQUENCE_NUM.get()),
            new Properties(),
            createFakeShardConsumerMetricGroup(),
            mock(StreamShardHandle.class));
    assertTrue(recordPublisher instanceof PollingRecordPublisher);
    assertFalse(recordPublisher instanceof AdaptivePollingRecordPublisher);
}
Also used : RecordPublisher(org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher) StreamShardHandle(org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle) Properties(java.util.Properties) Test(org.junit.Test)
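
Context for this test and its adaptive counterpart in Example 8 below: the factory's choice of publisher hinges on a single boolean property, and the adaptive publisher is a subclass of the plain polling one, which is why both instanceof checks behave as asserted. The following is a minimal, hypothetical sketch of that selection logic, not Flink's actual factory code; only the SHARD_USE_ADAPTIVE_READS constant is taken from the connector's ConsumerConfigConstants.

import java.util.Properties;

import static org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants.SHARD_USE_ADAPTIVE_READS;

public class PublisherSelectionSketch {

    // Illustrative stand-in for the factory's decision; the real factory builds
    // the publisher objects rather than returning a flag.
    static boolean useAdaptivePublisher(Properties config) {
        return Boolean.parseBoolean(config.getProperty(SHARD_USE_ADAPTIVE_READS, "false"));
    }

    public static void main(String[] args) {
        Properties plain = new Properties();
        System.out.println(useAdaptivePublisher(plain));     // false -> plain PollingRecordPublisher

        Properties adaptive = new Properties();
        adaptive.setProperty(SHARD_USE_ADAPTIVE_READS, "true");
        System.out.println(useAdaptivePublisher(adaptive));  // true -> AdaptivePollingRecordPublisher
    }
}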

Example 7 with RecordPublisher

Use of org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher in the Apache Flink project.

From the class ShardConsumerTestUtils, method assertNumberOfMessagesReceivedFromKinesis.

public static ShardConsumerMetricsReporter assertNumberOfMessagesReceivedFromKinesis(
        final int expectedNumberOfMessages,
        final RecordPublisherFactory recordPublisherFactory,
        final SequenceNumber startingSequenceNumber,
        final Properties consumerProperties,
        final SequenceNumber expectedLastProcessedSequenceNum,
        final AbstractMetricGroup metricGroup) throws InterruptedException {
    ShardConsumerMetricsReporter shardMetricsReporter = new ShardConsumerMetricsReporter(metricGroup);
    StreamShardHandle fakeToBeConsumedShard = getMockStreamShard("fakeStream", 0);
    LinkedList<KinesisStreamShardState> subscribedShardsStateUnderTest = new LinkedList<>();
    subscribedShardsStateUnderTest.add(
            new KinesisStreamShardState(
                    KinesisDataFetcher.convertToStreamShardMetadata(fakeToBeConsumedShard),
                    fakeToBeConsumedShard,
                    startingSequenceNumber));
    TestSourceContext<String> sourceContext = new TestSourceContext<>();
    KinesisDeserializationSchemaWrapper<String> deserializationSchema = new KinesisDeserializationSchemaWrapper<>(new SimpleStringSchema());
    TestableKinesisDataFetcher<String> fetcher = new TestableKinesisDataFetcher<>(
            Collections.singletonList("fakeStream"),
            sourceContext,
            consumerProperties,
            deserializationSchema,
            10,
            2,
            new AtomicReference<>(),
            subscribedShardsStateUnderTest,
            KinesisDataFetcher.createInitialSubscribedStreamsToLastDiscoveredShardsState(Collections.singletonList("fakeStream")),
            Mockito.mock(KinesisProxyInterface.class),
            Mockito.mock(KinesisProxyV2Interface.class));
    final StreamShardHandle shardHandle = subscribedShardsStateUnderTest.get(0).getStreamShardHandle();
    final SequenceNumber lastProcessedSequenceNum = subscribedShardsStateUnderTest.get(0).getLastProcessedSequenceNum();
    final StartingPosition startingPosition = AWSUtil.getStartingPosition(lastProcessedSequenceNum, consumerProperties);
    final RecordPublisher recordPublisher = recordPublisherFactory.create(
            startingPosition, fetcher.getConsumerConfiguration(), metricGroup, shardHandle);
    int shardIndex = fetcher.registerNewSubscribedShardState(subscribedShardsStateUnderTest.get(0));
    new ShardConsumer<>(fetcher, recordPublisher, shardIndex, shardHandle,
            lastProcessedSequenceNum, shardMetricsReporter, deserializationSchema).run();
    assertEquals(expectedNumberOfMessages, sourceContext.getCollectedOutputs().size());
    assertEquals(expectedLastProcessedSequenceNum, subscribedShardsStateUnderTest.get(0).getLastProcessedSequenceNum());
    return shardMetricsReporter;
}
Also used : StartingPosition(org.apache.flink.streaming.connectors.kinesis.model.StartingPosition) ShardConsumerMetricsReporter(org.apache.flink.streaming.connectors.kinesis.metrics.ShardConsumerMetricsReporter) KinesisDeserializationSchemaWrapper(org.apache.flink.streaming.connectors.kinesis.serialization.KinesisDeserializationSchemaWrapper) LinkedList(java.util.LinkedList) TestSourceContext(org.apache.flink.streaming.connectors.kinesis.testutils.TestSourceContext) RecordPublisher(org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher) StreamShardHandle(org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle) SequenceNumber(org.apache.flink.streaming.connectors.kinesis.model.SequenceNumber) SimpleStringSchema(org.apache.flink.api.common.serialization.SimpleStringSchema) KinesisProxyV2Interface(org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyV2Interface) KinesisStreamShardState(org.apache.flink.streaming.connectors.kinesis.model.KinesisStreamShardState) KinesisProxyInterface(org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface) TestableKinesisDataFetcher(org.apache.flink.streaming.connectors.kinesis.testutils.TestableKinesisDataFetcher)
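
To show how the helper's six parameters line up in practice, here is a schematic caller. Everything in it is a placeholder sketch rather than code from the Flink test suite: a real test would supply a behaviour-defining fake KinesisProxyInterface so the proxy actually serves records, and the expected message count and last sequence number must match whatever that fake is wired to return. The factory wiring shown (a PollingRecordPublisherFactory over a proxy-supplying lambda), the mocked metric group, and the test-utility imports are all assumptions.

public class ExampleShardConsumerTest {

    // Placeholder; a real test would use a behaviour-defining fake rather than a bare mock.
    private final KinesisProxyInterface fakeKinesisProxy = Mockito.mock(KinesisProxyInterface.class);

    @Test
    public void runsShardToCompletion() throws Exception {
        Properties consumerConfig = new Properties();
        consumerConfig.setProperty("aws.region", "us-east-1"); // placeholder region

        // Assumed wiring: a polling factory backed by the fake proxy above.
        RecordPublisherFactory factory =
                new PollingRecordPublisherFactory(props -> fakeKinesisProxy);

        ShardConsumerTestUtils.assertNumberOfMessagesReceivedFromKinesis(
                1000,                                     // expectedNumberOfMessages (depends on the fake)
                factory,
                new SequenceNumber("0"),                  // startingSequenceNumber (placeholder)
                consumerConfig,
                new SequenceNumber("999"),                // expectedLastProcessedSequenceNum (placeholder)
                Mockito.mock(AbstractMetricGroup.class)); // metricGroup
    }
}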

Example 8 with RecordPublisher

Use of org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher in the Apache Flink project.

From the class PollingRecordPublisherFactoryTest, method testBuildAdaptivePollingRecordPublisher.

@Test
public void testBuildAdaptivePollingRecordPublisher() throws Exception {
    Properties properties = new Properties();
    properties.setProperty(SHARD_USE_ADAPTIVE_READS, "true");
    RecordPublisher recordPublisher = factory.create(
            StartingPosition.restartFromSequenceNumber(SENTINEL_LATEST_SEQUENCE_NUM.get()),
            properties,
            createFakeShardConsumerMetricGroup(),
            mock(StreamShardHandle.class));
    assertTrue(recordPublisher instanceof PollingRecordPublisher);
    assertTrue(recordPublisher instanceof AdaptivePollingRecordPublisher);
}
Also used : RecordPublisher(org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher) StreamShardHandle(org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle) Properties(java.util.Properties) Test(org.junit.Test)
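
For reference, this is how a job might opt into the adaptive polling publisher that the test expects; it is an illustrative configuration, not taken from the Flink sources. The consumer constructor and the ConsumerConfigConstants keys are the connector's public API, while the stream name, region, and job name are placeholders (a real job also needs AWS credentials configured).

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;

public class AdaptiveReadsJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties config = new Properties();
        config.setProperty(ConsumerConfigConstants.AWS_REGION, "us-east-1"); // placeholder region
        // The same flag the test sets: switches the polling factory to the adaptive publisher.
        config.setProperty(ConsumerConfigConstants.SHARD_USE_ADAPTIVE_READS, "true");

        env.addSource(new FlinkKinesisConsumer<>("my-stream", new SimpleStringSchema(), config))
                .print();

        env.execute("adaptive-reads-example");
    }
}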

Example 9 with RecordPublisher

Use of org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher in the Apache Flink project.

From the class FanOutRecordPublisherTest, method testAggregatedRecordDurability.

@Test
public void testAggregatedRecordDurability() throws Exception {
    SingleShardFanOutKinesisV2 kinesis = FakeKinesisFanOutBehavioursFactory.boundedShard()
            .withBatchCount(10)
            .withAggregationFactor(5)
            .withRecordsPerBatch(12)
            .build();
    RecordPublisher recordPublisher = createRecordPublisher(kinesis);
    TestConsumer consumer = new TestConsumer();
    int count = 0;
    while (recordPublisher.run(consumer) == INCOMPLETE) {
        if (++count > 5) {
            break;
        }
    }
    List<UserRecord> userRecords = flattenToUserRecords(consumer.getRecordBatches());
    // Should have received 10 * 12 * 5 = 600 records
    assertEquals(600, userRecords.size());
    int sequence = 1;
    long subsequence = 0;
    for (UserRecord userRecord : userRecords) {
        assertEquals(String.valueOf(sequence), userRecord.getSequenceNumber());
        assertEquals(subsequence++, userRecord.getSubSequenceNumber());
        if (subsequence == 5) {
            sequence++;
            subsequence = 0;
        }
    }
}
Also used : RecordPublisher(org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher) UserRecord(com.amazonaws.services.kinesis.clientlibrary.types.UserRecord) SingleShardFanOutKinesisV2(org.apache.flink.streaming.connectors.kinesis.testutils.FakeKinesisFanOutBehavioursFactory.SingleShardFanOutKinesisV2) TestConsumer(org.apache.flink.streaming.connectors.kinesis.testutils.TestUtils.TestConsumer) Test(org.junit.Test)
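
The expected totals follow directly from the fake shard's configuration: 10 batches of 12 aggregated records, each aggregating 5 user records, gives 120 parent sequence numbers and 600 deaggregated user records whose sub-sequence numbers cycle 0 through 4. Below is a self-contained arithmetic sketch of that expectation (plain Java, no Flink types).

public class AggregationArithmeticSketch {

    public static void main(String[] args) {
        int batches = 10, recordsPerBatch = 12, aggregationFactor = 5;

        int parentRecords = batches * recordsPerBatch;                // 120 aggregated Kinesis records
        int expectedUserRecords = parentRecords * aggregationFactor;  // 600 after deaggregation

        // Enumerate the (sequence, subSequence) pairs the test's loop walks over:
        // parent sequence numbers 1..120, each repeated with sub-sequences 0..4.
        int seen = 0;
        for (int sequence = 1; sequence <= parentRecords; sequence++) {
            for (long subSequence = 0; subSequence < aggregationFactor; subSequence++) {
                seen++;
            }
        }
        System.out.println(seen == expectedUserRecords); // true: 600 user records
    }
}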

Example 10 with RecordPublisher

Use of org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher in the Apache Flink project.

From the class FanOutRecordPublisherTest, method testRecordDurability.

@Test
public void testRecordDurability() throws Exception {
    SingleShardFanOutKinesisV2 kinesis = FakeKinesisFanOutBehavioursFactory.boundedShard()
            .withBatchCount(10)
            .withBatchesPerSubscription(3)
            .withRecordsPerBatch(12)
            .build();
    RecordPublisher recordPublisher = createRecordPublisher(kinesis);
    TestConsumer consumer = new TestConsumer();
    int count = 0;
    while (recordPublisher.run(consumer) == INCOMPLETE) {
        if (++count > 4) {
            break;
        }
    }
    List<UserRecord> userRecords = flattenToUserRecords(consumer.getRecordBatches());
    // Should have received 10 * 12 = 120 records
    assertEquals(120, userRecords.size());
    int expectedSequenceNumber = 1;
    for (UserRecord record : userRecords) {
        assertEquals(String.valueOf(expectedSequenceNumber++), record.getSequenceNumber());
    }
}
Also used : RecordPublisher(org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher) UserRecord(com.amazonaws.services.kinesis.clientlibrary.types.UserRecord) SingleShardFanOutKinesisV2(org.apache.flink.streaming.connectors.kinesis.testutils.FakeKinesisFanOutBehavioursFactory.SingleShardFanOutKinesisV2) TestConsumer(org.apache.flink.streaming.connectors.kinesis.testutils.TestUtils.TestConsumer) Test(org.junit.Test)
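
Because the fake shard limits each subscription to 3 batches but holds 10 in total, the publisher has to resubscribe: the loop should see INCOMPLETE three times and a final COMPLETE on the fourth call, with all 120 records delivered in order across the resubscriptions. The following self-contained sketch mimics that run-loop contract with stand-in types; it is not the Flink RecordPublisher API.

import java.util.ArrayList;
import java.util.List;

public class ResubscribingPublisherSketch {

    enum RunResult { INCOMPLETE, COMPLETE }

    static final int TOTAL_BATCHES = 10;
    static final int BATCHES_PER_SUBSCRIPTION = 3;
    static final int RECORDS_PER_BATCH = 12;

    int batchesServed = 0;

    // Serves up to one subscription's worth of batches into out, then reports
    // whether the shard still has more batches (INCOMPLETE) or is exhausted (COMPLETE).
    RunResult run(List<Integer> out) {
        int served = 0;
        while (batchesServed < TOTAL_BATCHES && served < BATCHES_PER_SUBSCRIPTION) {
            for (int i = 0; i < RECORDS_PER_BATCH; i++) {
                out.add(batchesServed * RECORDS_PER_BATCH + i + 1); // sequence numbers 1..120
            }
            batchesServed++;
            served++;
        }
        return batchesServed < TOTAL_BATCHES ? RunResult.INCOMPLETE : RunResult.COMPLETE;
    }

    public static void main(String[] args) {
        ResubscribingPublisherSketch publisher = new ResubscribingPublisherSketch();
        List<Integer> records = new ArrayList<>();
        int runs = 0;
        while (publisher.run(records) == RunResult.INCOMPLETE) {
            runs++;
        }
        runs++; // count the final COMPLETE run as well
        System.out.println(runs);           // 4 subscriptions for 10 batches at 3 per run
        System.out.println(records.size()); // 120 records, matching the test's assertion
    }
}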

Aggregations

RecordPublisher (org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher) 16
Test (org.junit.Test) 15
TestConsumer (org.apache.flink.streaming.connectors.kinesis.testutils.TestUtils.TestConsumer) 12
SingleShardFanOutKinesisV2 (org.apache.flink.streaming.connectors.kinesis.testutils.FakeKinesisFanOutBehavioursFactory.SingleShardFanOutKinesisV2) 8
KinesisProxyV2Interface (org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyV2Interface) 4
UserRecord (com.amazonaws.services.kinesis.clientlibrary.types.UserRecord) 3
StreamShardHandle (org.apache.flink.streaming.connectors.kinesis.model.StreamShardHandle) 3
Date (java.util.Date) 2
Properties (java.util.Properties) 2
RecordPublisherRunResult (org.apache.flink.streaming.connectors.kinesis.internals.publisher.RecordPublisher.RecordPublisherRunResult) 2
SubscriptionErrorKinesisV2 (org.apache.flink.streaming.connectors.kinesis.testutils.FakeKinesisFanOutBehavioursFactory.SubscriptionErrorKinesisV2) 2
SdkInterruptedException (com.amazonaws.http.timers.client.SdkInterruptedException) 1
LinkedList (java.util.LinkedList) 1
SimpleStringSchema (org.apache.flink.api.common.serialization.SimpleStringSchema) 1
ShardConsumerMetricsReporter (org.apache.flink.streaming.connectors.kinesis.metrics.ShardConsumerMetricsReporter) 1
KinesisStreamShardState (org.apache.flink.streaming.connectors.kinesis.model.KinesisStreamShardState) 1
SequenceNumber (org.apache.flink.streaming.connectors.kinesis.model.SequenceNumber) 1
StartingPosition (org.apache.flink.streaming.connectors.kinesis.model.StartingPosition) 1
KinesisProxyInterface (org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxyInterface) 1
KinesisDeserializationSchemaWrapper (org.apache.flink.streaming.connectors.kinesis.serialization.KinesisDeserializationSchemaWrapper)