Example 1 with Builder

use of org.apache.calcite.rex.RexBuilder in project flink by apache.

the class RexWindowBoundSerdeTest method testSerde.

@Test
public void testSerde() throws IOException {
    SerdeContext serdeCtx =
            new SerdeContext(
                    null,
                    new FlinkContextImpl(
                            false,
                            TableConfig.getDefault(),
                            new ModuleManager(),
                            null,
                            CatalogManagerMocks.createEmptyCatalogManager(),
                            null),
                    Thread.currentThread().getContextClassLoader(),
                    FlinkTypeFactory.INSTANCE(),
                    FlinkSqlOperatorTable.instance());
    ObjectReader objectReader = JsonSerdeUtil.createObjectReader(serdeCtx);
    ObjectWriter objectWriter = JsonSerdeUtil.createObjectWriter(serdeCtx);
    assertEquals(RexWindowBounds.CURRENT_ROW, objectReader.readValue(objectWriter.writeValueAsString(RexWindowBounds.CURRENT_ROW), RexWindowBound.class));
    assertEquals(RexWindowBounds.UNBOUNDED_FOLLOWING, objectReader.readValue(objectWriter.writeValueAsString(RexWindowBounds.UNBOUNDED_FOLLOWING), RexWindowBound.class));
    assertEquals(RexWindowBounds.UNBOUNDED_PRECEDING, objectReader.readValue(objectWriter.writeValueAsString(RexWindowBounds.UNBOUNDED_PRECEDING), RexWindowBound.class));
    RexBuilder builder = new RexBuilder(FlinkTypeFactory.INSTANCE());
    RexWindowBound windowBound = RexWindowBounds.following(builder.makeLiteral("test"));
    assertEquals(windowBound, objectReader.readValue(objectWriter.writeValueAsString(windowBound), RexWindowBound.class));
    windowBound = RexWindowBounds.preceding(builder.makeLiteral("test"));
    assertEquals(windowBound, objectReader.readValue(objectWriter.writeValueAsString(windowBound), RexWindowBound.class));
}
Also used : FlinkContextImpl(org.apache.flink.table.planner.calcite.FlinkContextImpl) RexWindowBound(org.apache.calcite.rex.RexWindowBound) ObjectWriter(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectWriter) RexBuilder(org.apache.calcite.rex.RexBuilder) ObjectReader(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectReader) ModuleManager(org.apache.flink.table.module.ModuleManager) Test(org.junit.Test)

Example 2 with Builder

use of org.apache.flink.table.connector.ChangelogMode.Builder in project flink by apache.

the class ChangelogModeJsonDeserializer method deserialize.

@Override
public ChangelogMode deserialize(JsonParser jsonParser, DeserializationContext deserializationContext) throws IOException {
    ChangelogMode.Builder builder = ChangelogMode.newBuilder();
    JsonNode rowKindsNode = jsonParser.readValueAsTree();
    for (JsonNode rowKindNode : rowKindsNode) {
        RowKind rowKind = RowKind.valueOf(rowKindNode.asText().toUpperCase());
        builder.addContainedKind(rowKind);
    }
    return builder.build();
}
Also used : ChangelogMode(org.apache.flink.table.connector.ChangelogMode) RowKind(org.apache.flink.types.RowKind) JsonNode(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode)
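
The deserializer above expects a JSON array of row-kind names. A minimal usage sketch, assuming ChangelogModeJsonDeserializer is accessible from the calling package and using Flink's shaded Jackson; the module registration and the sample JSON input are illustrative, not code from the Flink repository:

import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.module.SimpleModule;
import org.apache.flink.table.connector.ChangelogMode;

public class ChangelogModeDeserializeSketch {
    public static void main(String[] args) throws Exception {
        // Assumption: the deserializer (from the planner's JSON serde package) is
        // visible here; otherwise the registration has to live next to it.
        SimpleModule module = new SimpleModule();
        module.addDeserializer(ChangelogMode.class, new ChangelogModeJsonDeserializer());

        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule(module);

        // The deserializer iterates the array and uppercases each entry before
        // mapping it to a RowKind, so lower-case names are accepted as well.
        ChangelogMode mode =
                mapper.readValue("[\"INSERT\", \"UPDATE_AFTER\", \"DELETE\"]", ChangelogMode.class);
        System.out.println(mode.getContainedKinds());
    }
}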

Example 3 with Builder

use of org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema.Builder in project flink by apache.

the class CsvRowSchemaConverter method convert.

/**
 * Convert {@link RowTypeInfo} to {@link CsvSchema}.
 */
public static CsvSchema convert(RowTypeInfo rowType) {
    final Builder builder = new CsvSchema.Builder();
    final String[] fields = rowType.getFieldNames();
    final TypeInformation<?>[] types = rowType.getFieldTypes();
    for (int i = 0; i < rowType.getArity(); i++) {
        builder.addColumn(new Column(i, fields[i], convertType(fields[i], types[i])));
    }
    return builder.build();
}
Also used : Column(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema.Column) Builder(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema.Builder) TypeInformation(org.apache.flink.api.common.typeinfo.TypeInformation)
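
A hedged usage sketch for the RowTypeInfo variant above. The field names and types are made up for illustration, and it assumes CsvRowSchemaConverter (org.apache.flink.formats.csv) is callable from user code:

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.formats.csv.CsvRowSchemaConverter;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema;

public class CsvSchemaFromRowTypeInfoSketch {
    public static void main(String[] args) {
        // Hypothetical row type: (id INT, name STRING, price DOUBLE).
        RowTypeInfo rowType =
                new RowTypeInfo(
                        new TypeInformation<?>[] {Types.INT, Types.STRING, Types.DOUBLE},
                        new String[] {"id", "name", "price"});

        CsvSchema schema = CsvRowSchemaConverter.convert(rowType);

        // CsvSchema is iterable over its columns; each Column carries the index,
        // name, and the CSV column type produced by convertType(...).
        for (CsvSchema.Column column : schema) {
            System.out.println(column.getIndex() + ": " + column.getName() + " (" + column.getType() + ")");
        }
    }
}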

Example 4 with Builder

use of org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema.Builder in project flink by apache.

the class CsvRowSchemaConverter method convert.

/**
 * Convert {@link RowType} to {@link CsvSchema}.
 */
public static CsvSchema convert(RowType rowType) {
    Builder builder = new CsvSchema.Builder();
    List<RowType.RowField> fields = rowType.getFields();
    for (int i = 0; i < rowType.getFieldCount(); i++) {
        String fieldName = fields.get(i).getName();
        LogicalType fieldType = fields.get(i).getType();
        builder.addColumn(new Column(i, fieldName, convertType(fieldName, fieldType)));
    }
    return builder.build();
}
Also used : Column(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema.Column) Builder(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema.Builder) LogicalType(org.apache.flink.table.types.logical.LogicalType)

Example 5 with Builder

use of the KinesisDataStreamsSink builder in project flink by apache.

the class SinkIntoKinesis method main.

public static void main(String[] args) throws Exception {
    ObjectMapper mapper = new ObjectMapper();
    final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    env.enableCheckpointing(10_000);
    DataStream<String> fromGen =
            env.fromSequence(1, 10_000_000L)
                    .map(Object::toString)
                    .returns(String.class)
                    .map(data -> mapper.writeValueAsString(ImmutableMap.of("data", data)));
    Properties sinkProperties = new Properties();
    sinkProperties.put(AWSConfigConstants.AWS_REGION, "your-region-here");
    KinesisDataStreamsSink<String> kdsSink =
            KinesisDataStreamsSink.<String>builder()
                    .setSerializationSchema(new SimpleStringSchema())
                    .setPartitionKeyGenerator(element -> String.valueOf(element.hashCode()))
                    .setStreamName("your-stream-name")
                    .setMaxBatchSize(20)
                    .setKinesisClientProperties(sinkProperties)
                    .build();
    fromGen.sinkTo(kdsSink);
    env.execute("KDS Async Sink Example Program");
}
Also used : SimpleStringSchema(org.apache.flink.api.common.serialization.SimpleStringSchema) StreamExecutionEnvironment(org.apache.flink.streaming.api.environment.StreamExecutionEnvironment) Properties(java.util.Properties) ObjectMapper(org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper)
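
The region placeholder above must be filled in before running. When no ambient AWS credentials are available, the same Properties object can carry explicit credentials; a small sketch assuming the AWSConfigConstants keys from the Flink AWS connector, with placeholder values:

Properties sinkProperties = new Properties();
sinkProperties.put(AWSConfigConstants.AWS_REGION, "your-region-here");
// Assumption: static credentials are acceptable for the demo; otherwise leave
// these out and rely on the default AWS credentials provider chain.
sinkProperties.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "your-access-key-id");
sinkProperties.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "your-secret-access-key");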

Aggregations

JsonNode (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode) 3
RexBuilder (org.apache.calcite.rex.RexBuilder) 2
ArrayNode (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ArrayNode) 2
Builder (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema.Builder) 2
Column (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.dataformat.csv.CsvSchema.Column) 2
LogicalType (org.apache.flink.table.types.logical.LogicalType) 2
BoundType (com.google.common.collect.BoundType) 1
Builder (com.google.common.collect.ImmutableRangeSet.Builder) 1
Range (com.google.common.collect.Range) 1
ArrayList (java.util.ArrayList) 1
Properties (java.util.Properties) 1
RexWindowBound (org.apache.calcite.rex.RexWindowBound) 1
SimpleStringSchema (org.apache.flink.api.common.serialization.SimpleStringSchema) 1
TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation) 1
ObjectMapper (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper) 1
ObjectReader (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectReader) 1
ObjectWriter (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectWriter) 1
StreamExecutionEnvironment (org.apache.flink.streaming.api.environment.StreamExecutionEnvironment) 1
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier) 1
ChangelogMode (org.apache.flink.table.connector.ChangelogMode) 1