
Example 1 with Write

Use of org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write in project beam by apache.

The class SpannerIOWriteTest, method emptyDatabaseId.

@Test
public void emptyDatabaseId() throws Exception {
    SpannerIO.Write write = SpannerIO.write().withInstanceId("123");
    thrown.expect(NullPointerException.class);
    thrown.expectMessage("requires database id to be set with");
    write.expand(null);
}
Also used: Write (org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write), Test (org.junit.Test)
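
For contrast with this failure case (and the matching instance-id check in Example 3 below), here is a minimal sketch of a Write configured with all three required identifiers, applied to a small bounded pipeline. The table, columns, and project/instance/database IDs are placeholders, not taken from the test:

import com.google.cloud.spanner.Mutation;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.SerializableCoder;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.transforms.Create;

public class SpannerWriteSketch {
    public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();
        // A single insert-or-update mutation against a hypothetical "users" table.
        Mutation mutation = Mutation.newInsertOrUpdateBuilder("users")
                .set("id").to(1L)
                .set("name").to("example")
                .build();
        pipeline
                .apply(Create.of(mutation).withCoder(SerializableCoder.of(Mutation.class)))
                // All three identifiers are set, so expand() passes the checks tested above.
                .apply(SpannerIO.write()
                        .withProjectId("my-project")      // placeholder
                        .withInstanceId("my-instance")    // placeholder
                        .withDatabaseId("my-database"));  // omitting this triggers the NPE above
        pipeline.run().waitUntilFinish();
    }
}

Running this against a real database of course requires the placeholder IDs to point at an existing Cloud Spanner instance; the point here is only that expand() no longer throws.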

Example 2 with Write

Use of org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write in project beam by apache.

The class SpannerIOWriteTest, method streamingWritesWithGroupingWithPriority.

@Test
public void streamingWritesWithGroupingWithPriority() throws Exception {
    // verify that grouping/sorting occurs when set.
    TestStream<Mutation> testStream =
        TestStream.create(SerializableCoder.of(Mutation.class))
            .addElements(m(1L), m(5L), m(2L), m(4L), m(3L), m(6L))
            .advanceWatermarkToInfinity();
    Write write =
        SpannerIO.write()
            .withProjectId("test-project")
            .withInstanceId("test-instance")
            .withDatabaseId("test-database")
            .withServiceFactory(serviceFactory)
            .withGroupingFactor(40)
            .withMaxNumRows(2)
            .withLowPriority();
    pipeline.apply(testStream).apply(write);
    pipeline.run();
    assertEquals(RpcPriority.LOW, write.getSpannerConfig().getRpcPriority().get());
    // Output should be batches of sorted mutations.
    verifyBatches(batch(m(1L), m(2L)), batch(m(3L), m(4L)), batch(m(5L), m(6L)));
}
Also used: Write (org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write), Mutation (com.google.cloud.spanner.Mutation), Test (org.junit.Test)
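
The m(...) and batch(...) helpers belong to SpannerIOWriteTest and are not reproduced in this excerpt. A plausible sketch of m(long), assuming a test table with a single numeric key column (table and column names here are illustrative):

import com.google.cloud.spanner.Mutation;

// Hypothetical helper: one insert-or-update mutation keyed by the given value,
// so that m(1L), m(2L), ... sort naturally by key.
private static Mutation m(long key) {
    return Mutation.newInsertOrUpdateBuilder("test").set("key").to(key).build();
}

batch(...) presumably just bundles the expected mutations, and verifyBatches(...) compares those bundles against the batches recorded through the fake serviceFactory, which is how the test asserts that grouping and sorting actually happened.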

Example 3 with Write

Use of org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write in project beam by apache.

The class SpannerIOWriteTest, method emptyInstanceId.

@Test
public void emptyInstanceId() throws Exception {
    SpannerIO.Write write = SpannerIO.write().withDatabaseId("123");
    thrown.expect(NullPointerException.class);
    thrown.expectMessage("requires instance id to be set with");
    write.expand(null);
}
Also used: Write (org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write), Test (org.junit.Test)
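
Both precondition tests refer to a thrown field, and the streaming tests below refer to a pipeline field; neither declaration appears in this excerpt. In a JUnit 4 test class these are conventionally declared as rules; a sketch of what the declarations likely look like (assumed, not copied from the test class):

import org.apache.beam.sdk.testing.TestPipeline;
import org.junit.Rule;
import org.junit.rules.ExpectedException;

// Assumed JUnit rules backing the `pipeline` and `thrown` references in the tests above.
@Rule
public final transient TestPipeline pipeline = TestPipeline.create();

@Rule
public transient ExpectedException thrown = ExpectedException.none();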

Example 4 with Write

Use of org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write in project beam by apache.

The class SpannerIOWriteTest, method streamingWritesWithPriority.

@Test
public void streamingWritesWithPriority() throws Exception {
    // Verify streaming writes with HIGH RPC priority.
    TestStream<Mutation> testStream =
        TestStream.create(SerializableCoder.of(Mutation.class))
            .addElements(m(1L), m(2L))
            .advanceProcessingTime(Duration.standardMinutes(1))
            .addElements(m(3L), m(4L))
            .advanceProcessingTime(Duration.standardMinutes(1))
            .addElements(m(5L), m(6L))
            .advanceWatermarkToInfinity();
    Write write =
        SpannerIO.write()
            .withProjectId("test-project")
            .withInstanceId("test-instance")
            .withDatabaseId("test-database")
            .withServiceFactory(serviceFactory)
            .withHighPriority();
    pipeline.apply(testStream).apply(write);
    pipeline.run();
    assertEquals(RpcPriority.HIGH, write.getSpannerConfig().getRpcPriority().get());
    verifyBatches(batch(m(1L), m(2L)), batch(m(3L), m(4L)), batch(m(5L), m(6L)));
}
Also used: Write (org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write), Mutation (com.google.cloud.spanner.Mutation), Test (org.junit.Test)
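
As the assertion above shows, withHighPriority() (and withLowPriority() in Example 2) only records an RPC priority on the transform's SpannerConfig, so the setting can be checked without running a pipeline at all. A minimal standalone sketch with placeholder IDs:

import static org.junit.Assert.assertEquals;

import com.google.cloud.spanner.Options.RpcPriority;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

SpannerIO.Write write = SpannerIO.write()
        .withProjectId("my-project")      // placeholder
        .withInstanceId("my-instance")    // placeholder
        .withDatabaseId("my-database")    // placeholder
        .withHighPriority();
// The configured priority travels on the SpannerConfig carried by the transform.
assertEquals(RpcPriority.HIGH, write.getSpannerConfig().getRpcPriority().get());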

Example 5 with Write

Use of org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write in project beam by apache.

The class SpannerIOWriteTest, method displayDataWrite.

@Test
public void displayDataWrite() throws Exception {
    SpannerIO.Write write =
        SpannerIO.write()
            .withProjectId("test-project")
            .withInstanceId("test-instance")
            .withDatabaseId("test-database")
            .withBatchSizeBytes(123)
            .withMaxNumMutations(456)
            .withMaxNumRows(789)
            .withGroupingFactor(100);
    DisplayData data = DisplayData.from(write);
    assertThat(data.items(), hasSize(7));
    assertThat(data, hasDisplayItem("projectId", "test-project"));
    assertThat(data, hasDisplayItem("instanceId", "test-instance"));
    assertThat(data, hasDisplayItem("databaseId", "test-database"));
    assertThat(data, hasDisplayItem("batchSizeBytes", 123));
    assertThat(data, hasDisplayItem("maxNumMutations", 456));
    assertThat(data, hasDisplayItem("maxNumRows", 789));
    assertThat(data, hasDisplayItem("groupingFactor", "100"));
    // check for default grouping value
    write =
        SpannerIO.write()
            .withProjectId("test-project")
            .withInstanceId("test-instance")
            .withDatabaseId("test-database");
    data = DisplayData.from(write);
    assertThat(data.items(), hasSize(7));
    assertThat(data, hasDisplayItem("groupingFactor", "DEFAULT"));
}
Also used: Write (org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write), DisplayData (org.apache.beam.sdk.transforms.display.DisplayData), Test (org.junit.Test)
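
The items asserted here are registered by the Write transform through Beam's standard display-data mechanism. A generic sketch of how any PTransform exposes such items (illustrative only, not the actual SpannerIO implementation):

import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.transforms.display.DisplayData;
import org.apache.beam.sdk.values.PCollection;

// Hypothetical transform showing how items such as "projectId" end up in DisplayData.
static class MyTransform extends PTransform<PCollection<String>, PCollection<String>> {
    private final String projectId;

    MyTransform(String projectId) {
        this.projectId = projectId;
    }

    @Override
    public PCollection<String> expand(PCollection<String> input) {
        return input;
    }

    @Override
    public void populateDisplayData(DisplayData.Builder builder) {
        super.populateDisplayData(builder);
        // Each add(...) becomes one item visible via DisplayData.from(transform).
        builder.add(DisplayData.item("projectId", projectId).withLabel("Output Project"));
    }
}

DisplayData.from(new MyTransform("test-project")) would then contain a projectId item, the same kind of item that hasDisplayItem(...) matches on the Spanner Write above.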

Aggregations

Write (org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write) 6
Test (org.junit.Test) 6
Mutation (com.google.cloud.spanner.Mutation) 2
DisplayData (org.apache.beam.sdk.transforms.display.DisplayData) 1