
Example 1 with EncodingFormatMock

Use of org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock in the Apache Flink project.

From the class KafkaDynamicTableFactoryTest, method testTableSinkWithParallelism:

@Test
public void testTableSinkWithParallelism() {
    final Map<String, String> modifiedOptions = getModifiedOptions(getBasicSinkOptions(), options -> options.put("sink.parallelism", "100"));
    KafkaDynamicSink actualSink = (KafkaDynamicSink) createTableSink(SCHEMA, modifiedOptions);
    final EncodingFormat<SerializationSchema<RowData>> valueEncodingFormat = new EncodingFormatMock(",");
    final DynamicTableSink expectedSink =
            createExpectedSink(
                    SCHEMA_DATA_TYPE, null, valueEncodingFormat,
                    new int[0], new int[] { 0, 1, 2 }, null,
                    TOPIC, KAFKA_SINK_PROPERTIES, new FlinkFixedPartitioner<>(),
                    DeliveryGuarantee.EXACTLY_ONCE, 100, "kafka-sink");
    assertThat(actualSink).isEqualTo(expectedSink);
    final DynamicTableSink.SinkRuntimeProvider provider = actualSink.getSinkRuntimeProvider(new SinkRuntimeProviderContext(false));
    assertThat(provider).isInstanceOf(SinkV2Provider.class);
    final SinkV2Provider sinkProvider = (SinkV2Provider) provider;
    assertThat(sinkProvider.getParallelism().isPresent()).isTrue();
    assertThat((long) sinkProvider.getParallelism().get()).isEqualTo(100);
}
Also used: SinkRuntimeProviderContext (org.apache.flink.table.runtime.connector.sink.SinkRuntimeProviderContext), EncodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock), ConfluentRegistryAvroSerializationSchema (org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroSerializationSchema), AvroRowDataSerializationSchema (org.apache.flink.formats.avro.AvroRowDataSerializationSchema), SerializationSchema (org.apache.flink.api.common.serialization.SerializationSchema), DebeziumAvroSerializationSchema (org.apache.flink.formats.avro.registry.confluent.debezium.DebeziumAvroSerializationSchema), SinkV2Provider (org.apache.flink.table.connector.sink.SinkV2Provider), DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink), Test (org.junit.jupiter.api.Test), ParameterizedTest (org.junit.jupiter.params.ParameterizedTest)
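
The test above relies on getModifiedOptions, a helper defined elsewhere in the test class, to derive a per-test option map from a shared baseline. A minimal sketch of such a helper, assuming it simply copies the baseline map and applies the mutation (the actual helper in the Flink test base may differ in name and signature):

import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// Copy the shared baseline options so each test can mutate its own map
// without affecting other tests.
private static Map<String, String> getModifiedOptions(
        Map<String, String> options, Consumer<Map<String, String>> optionModifier) {
    final Map<String, String> copy = new HashMap<>(options);
    optionModifier.accept(copy);
    return copy;
}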

Example 2 with EncodingFormatMock

Use of org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock in the Apache Flink project.

From the class KafkaDynamicTableFactoryTest, method testTableSinkWithKeyValue:

@Test
public void testTableSinkWithKeyValue() {
    final Map<String, String> modifiedOptions = getModifiedOptions(getKeyValueOptions(), options -> {
        options.put("sink.delivery-guarantee", "exactly-once");
        options.put("sink.transactional-id-prefix", "kafka-sink");
    });
    final DynamicTableSink actualSink = createTableSink(SCHEMA, modifiedOptions);
    final KafkaDynamicSink actualKafkaSink = (KafkaDynamicSink) actualSink;
    // initialize stateful testing formats
    actualKafkaSink.getSinkRuntimeProvider(new SinkRuntimeProviderContext(false));
    final EncodingFormatMock keyEncodingFormat = new EncodingFormatMock("#");
    keyEncodingFormat.consumedDataType = DataTypes.ROW(DataTypes.FIELD(NAME, DataTypes.STRING().notNull())).notNull();
    final EncodingFormatMock valueEncodingFormat = new EncodingFormatMock("|");
    valueEncodingFormat.consumedDataType =
            DataTypes.ROW(
                            DataTypes.FIELD(COUNT, DataTypes.DECIMAL(38, 18)),
                            DataTypes.FIELD(TIME, DataTypes.TIMESTAMP(3)))
                    .notNull();
    final DynamicTableSink expectedSink =
            createExpectedSink(
                    SCHEMA_DATA_TYPE, keyEncodingFormat, valueEncodingFormat,
                    new int[] { 0 }, new int[] { 1, 2 }, null,
                    TOPIC, KAFKA_FINAL_SINK_PROPERTIES, new FlinkFixedPartitioner<>(),
                    DeliveryGuarantee.EXACTLY_ONCE, null, "kafka-sink");
    assertThat(actualSink).isEqualTo(expectedSink);
}
Also used: SinkRuntimeProviderContext (org.apache.flink.table.runtime.connector.sink.SinkRuntimeProviderContext), EncodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock), DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink), Test (org.junit.jupiter.api.Test), ParameterizedTest (org.junit.jupiter.params.ParameterizedTest)
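
The test sets the public consumedDataType field on each mock only after calling getSinkRuntimeProvider, because it is the connector that asks the key and value formats to encode the projected row types (NAME for the key, COUNT and TIME for the value). A simplified stand-in for such a stateful mock (not the actual EncodingFormatMock source, which also carries a delimiter) could look like this:

import org.apache.flink.api.common.serialization.SerializationSchema;
import org.apache.flink.table.connector.ChangelogMode;
import org.apache.flink.table.connector.format.EncodingFormat;
import org.apache.flink.table.connector.sink.DynamicTableSink;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.types.DataType;

// Records the data type the connector asks it to encode, so a test can later
// assert on the key/value projection that was applied.
public class RecordingEncodingFormat implements EncodingFormat<SerializationSchema<RowData>> {

    public DataType consumedDataType;

    @Override
    public SerializationSchema<RowData> createRuntimeEncoder(
            DynamicTableSink.Context context, DataType consumedDataType) {
        this.consumedDataType = consumedDataType;
        // Placeholder serializer; a real mock would produce meaningful bytes.
        return element -> new byte[0];
    }

    @Override
    public ChangelogMode getChangelogMode() {
        return ChangelogMode.insertOnly();
    }
}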

Example 3 with EncodingFormatMock

Use of org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock in the Apache Flink project.

From the class KafkaDynamicTableFactoryTest, method testTableSinkSemanticTranslation:

@Test
public void testTableSinkSemanticTranslation() {
    final List<String> semantics = ImmutableList.of("exactly-once", "at-least-once", "none");
    final EncodingFormat<SerializationSchema<RowData>> valueEncodingFormat = new EncodingFormatMock(",");
    for (final String semantic : semantics) {
        final Map<String, String> modifiedOptions = getModifiedOptions(getBasicSinkOptions(), options -> {
            options.put("sink.semantic", semantic);
            options.put("sink.transactional-id-prefix", "kafka-sink");
        });
        final DynamicTableSink actualSink = createTableSink(SCHEMA, modifiedOptions);
        final DynamicTableSink expectedSink =
                createExpectedSink(
                        SCHEMA_DATA_TYPE, null, valueEncodingFormat,
                        new int[0], new int[] { 0, 1, 2 }, null,
                        TOPIC, KAFKA_SINK_PROPERTIES, new FlinkFixedPartitioner<>(),
                        DeliveryGuarantee.valueOf(semantic.toUpperCase().replace("-", "_")),
                        null, "kafka-sink");
        assertThat(actualSink).isEqualTo(expectedSink);
    }
}
Also used: EncodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock), ConfluentRegistryAvroSerializationSchema (org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroSerializationSchema), AvroRowDataSerializationSchema (org.apache.flink.formats.avro.AvroRowDataSerializationSchema), SerializationSchema (org.apache.flink.api.common.serialization.SerializationSchema), DebeziumAvroSerializationSchema (org.apache.flink.formats.avro.registry.confluent.debezium.DebeziumAvroSerializationSchema), DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink), Test (org.junit.jupiter.api.Test), ParameterizedTest (org.junit.jupiter.params.ParameterizedTest)
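
The loop translates each legacy sink.semantic value into a DeliveryGuarantee constant by upper-casing the string and replacing dashes with underscores, so "exactly-once" becomes EXACTLY_ONCE, "at-least-once" becomes AT_LEAST_ONCE, and "none" becomes NONE. Extracted as a small helper (the same expression the test uses; the enum is assumed to be org.apache.flink.connector.base.DeliveryGuarantee):

import org.apache.flink.connector.base.DeliveryGuarantee;

// "exactly-once" -> EXACTLY_ONCE, "at-least-once" -> AT_LEAST_ONCE, "none" -> NONE
static DeliveryGuarantee fromSemantic(String semantic) {
    return DeliveryGuarantee.valueOf(semantic.toUpperCase().replace("-", "_"));
}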

Example 4 with EncodingFormatMock

Use of org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock in the Apache Flink project.

From the class FactoryUtilTest, method testAllOptions:

@Test
public void testAllOptions() {
    final Map<String, String> options = createAllOptions();
    final DynamicTableSource actualSource = createTableSource(SCHEMA, options);
    final DynamicTableSource expectedSource = new DynamicTableSourceMock("MyTarget", null, new DecodingFormatMock(",", false), new DecodingFormatMock("|", true));
    assertThat(actualSource).isEqualTo(expectedSource);
    final DynamicTableSink actualSink = createTableSink(SCHEMA, options);
    final DynamicTableSink expectedSink = new DynamicTableSinkMock("MyTarget", 1000L, new EncodingFormatMock(","), new EncodingFormatMock("|"));
    assertThat(actualSink).isEqualTo(expectedSink);
}
Also used: EncodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock), DecodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.DecodingFormatMock), DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink), DynamicTableSourceMock (org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSourceMock), DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource), DynamicTableSinkMock (org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSinkMock), Test (org.junit.jupiter.api.Test)
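
All of these tests compare whole sink and source instances with isEqualTo, which only works because the mocks implement equals and hashCode over their configuration. An illustrative sketch (not the actual Flink test class) of a format mock that compares by its delimiter; the real EncodingFormatMock presumably also includes the consumed data type, which is why Example 2 sets that field on the expected mocks:

import java.util.Objects;

// Compares by configuration so that expected and actual factory output can be
// checked with a single equality assertion.
final class DelimiterFormatMock {

    private final String delimiter;

    DelimiterFormatMock(String delimiter) {
        this.delimiter = delimiter;
    }

    @Override
    public boolean equals(Object o) {
        return o instanceof DelimiterFormatMock
                && Objects.equals(delimiter, ((DelimiterFormatMock) o).delimiter);
    }

    @Override
    public int hashCode() {
        return Objects.hash(delimiter);
    }
}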

Example 5 with EncodingFormatMock

Use of org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock in the Apache Flink project.

From the class KafkaDynamicTableFactoryTest, method testTableSink:

@Test
public void testTableSink() {
    final Map<String, String> modifiedOptions = getModifiedOptions(getBasicSinkOptions(), options -> {
        options.put("sink.delivery-guarantee", "exactly-once");
        options.put("sink.transactional-id-prefix", "kafka-sink");
    });
    final DynamicTableSink actualSink = createTableSink(SCHEMA, modifiedOptions);
    final EncodingFormat<SerializationSchema<RowData>> valueEncodingFormat = new EncodingFormatMock(",");
    final DynamicTableSink expectedSink =
            createExpectedSink(
                    SCHEMA_DATA_TYPE, null, valueEncodingFormat,
                    new int[0], new int[] { 0, 1, 2 }, null,
                    TOPIC, KAFKA_SINK_PROPERTIES, new FlinkFixedPartitioner<>(),
                    DeliveryGuarantee.EXACTLY_ONCE, null, "kafka-sink");
    assertThat(actualSink).isEqualTo(expectedSink);
    // Test kafka producer.
    final KafkaDynamicSink actualKafkaSink = (KafkaDynamicSink) actualSink;
    DynamicTableSink.SinkRuntimeProvider provider = actualKafkaSink.getSinkRuntimeProvider(new SinkRuntimeProviderContext(false));
    assertThat(provider).isInstanceOf(SinkV2Provider.class);
    final SinkV2Provider sinkProvider = (SinkV2Provider) provider;
    final Sink<RowData> sinkFunction = sinkProvider.createSink();
    assertThat(sinkFunction).isInstanceOf(KafkaSink.class);
}
Also used: SinkRuntimeProviderContext (org.apache.flink.table.runtime.connector.sink.SinkRuntimeProviderContext), EncodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock), ConfluentRegistryAvroSerializationSchema (org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroSerializationSchema), AvroRowDataSerializationSchema (org.apache.flink.formats.avro.AvroRowDataSerializationSchema), SerializationSchema (org.apache.flink.api.common.serialization.SerializationSchema), DebeziumAvroSerializationSchema (org.apache.flink.formats.avro.registry.confluent.debezium.DebeziumAvroSerializationSchema), DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink), RowData (org.apache.flink.table.data.RowData), SinkV2Provider (org.apache.flink.table.connector.sink.SinkV2Provider), Test (org.junit.jupiter.api.Test), ParameterizedTest (org.junit.jupiter.params.ParameterizedTest)
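
The final assertions exercise the planner-facing contract: the table sink hands back a SinkRuntimeProvider, which in this case is a SinkV2Provider wrapping a KafkaSink. A sketch of how a caller would unwrap it, using only the methods the tests above already rely on (createSink and getParallelism):

import java.util.Optional;

import org.apache.flink.api.connector.sink2.Sink;
import org.apache.flink.table.connector.sink.DynamicTableSink;
import org.apache.flink.table.connector.sink.SinkV2Provider;
import org.apache.flink.table.data.RowData;

static void unwrap(DynamicTableSink tableSink, DynamicTableSink.Context context) {
    final DynamicTableSink.SinkRuntimeProvider provider = tableSink.getSinkRuntimeProvider(context);
    if (provider instanceof SinkV2Provider) {
        final SinkV2Provider v2 = (SinkV2Provider) provider;
        // For the Kafka connector this is a KafkaSink; the parallelism is only
        // present when sink.parallelism was configured, as in Example 1.
        final Sink<RowData> runtimeSink = v2.createSink();
        final Optional<Integer> parallelism = v2.getParallelism();
        // hand runtimeSink and parallelism over to the execution layer
    }
}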

Aggregations

DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink): 8 usages
EncodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.EncodingFormatMock): 8 usages
Test (org.junit.jupiter.api.Test): 8 usages
DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource): 4 usages
DynamicTableSinkMock (org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSinkMock): 4 usages
DynamicTableSourceMock (org.apache.flink.table.factories.TestDynamicTableFactory.DynamicTableSourceMock): 4 usages
DecodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.DecodingFormatMock): 4 usages
ParameterizedTest (org.junit.jupiter.params.ParameterizedTest): 4 usages
SerializationSchema (org.apache.flink.api.common.serialization.SerializationSchema): 3 usages
AvroRowDataSerializationSchema (org.apache.flink.formats.avro.AvroRowDataSerializationSchema): 3 usages
ConfluentRegistryAvroSerializationSchema (org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroSerializationSchema): 3 usages
DebeziumAvroSerializationSchema (org.apache.flink.formats.avro.registry.confluent.debezium.DebeziumAvroSerializationSchema): 3 usages
SinkRuntimeProviderContext (org.apache.flink.table.runtime.connector.sink.SinkRuntimeProviderContext): 3 usages
SinkV2Provider (org.apache.flink.table.connector.sink.SinkV2Provider): 2 usages
RowData (org.apache.flink.table.data.RowData): 1 usage