
Example 1 with NamedTestResult

Use of org.apache.beam.sdk.testutils.NamedTestResult in the apache/beam project.

From the class KafkaIOIT, the method readMetrics:

private Set<NamedTestResult> readMetrics(PipelineResult writeResult, PipelineResult readResult) {
    // Builds a NamedTestResult from the recorded start/end time metrics for the given metric
    // name, converting the elapsed time from milliseconds to seconds.
    BiFunction<MetricsReader, String, NamedTestResult> supplier = (reader, metricName) -> {
        long start = reader.getStartTimeMetric(metricName);
        long end = reader.getEndTimeMetric(metricName);
        return NamedTestResult.create(TEST_ID, TIMESTAMP, metricName, (end - start) / 1e3);
    };
    NamedTestResult writeTime = supplier.apply(new MetricsReader(writeResult, NAMESPACE), WRITE_TIME_METRIC_NAME);
    NamedTestResult readTime = supplier.apply(new MetricsReader(readResult, NAMESPACE), READ_TIME_METRIC_NAME);
    NamedTestResult runTime = NamedTestResult.create(TEST_ID, TIMESTAMP, RUN_TIME_METRIC_NAME, writeTime.getValue() + readTime.getValue());
    return ImmutableSet.of(readTime, writeTime, runTime);
}
Also used : Arrays(java.util.Arrays) BeforeClass(org.junit.BeforeClass) DockerImageName(org.testcontainers.utility.DockerImageName) BiFunction(java.util.function.BiFunction) PipelineResult(org.apache.beam.sdk.PipelineResult) Default(org.apache.beam.sdk.options.Default) MetricsReader(org.apache.beam.sdk.testutils.metrics.MetricsReader) Combine(org.apache.beam.sdk.transforms.Combine) Duration(org.joda.time.Duration) RunWith(org.junit.runner.RunWith) ImmutableMap(org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableMap) Timestamp(com.google.cloud.Timestamp) SimpleFunction(org.apache.beam.sdk.transforms.SimpleFunction) Metrics(org.apache.beam.sdk.metrics.Metrics) Description(org.apache.beam.sdk.options.Description) ImmutableSet(org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableSet) Read(org.apache.beam.sdk.io.Read) IOITHelper(org.apache.beam.sdk.io.common.IOITHelper) ByteArraySerializer(org.apache.kafka.common.serialization.ByteArraySerializer) InfluxDBSettings(org.apache.beam.sdk.testutils.publishing.InfluxDBSettings) Map(java.util.Map) TestPipeline(org.apache.beam.sdk.testing.TestPipeline) NamedTestResult(org.apache.beam.sdk.testutils.NamedTestResult) Nullable(org.checkerframework.checker.nullness.qual.Nullable) DoFn(org.apache.beam.sdk.transforms.DoFn) MapElements(org.apache.beam.sdk.transforms.MapElements) KafkaContainer(org.testcontainers.containers.KafkaContainer) HashingFn(org.apache.beam.sdk.io.common.HashingFn) AfterClass(org.junit.AfterClass) PAssert(org.apache.beam.sdk.testing.PAssert) Counter(org.apache.beam.sdk.metrics.Counter) StreamingOptions(org.apache.beam.sdk.options.StreamingOptions) TimeMonitor(org.apache.beam.sdk.testutils.metrics.TimeMonitor) Set(java.util.Set) IOException(java.io.IOException) SyntheticSourceOptions(org.apache.beam.sdk.io.synthetic.SyntheticSourceOptions) Test(org.junit.Test) UUID(java.util.UUID) JUnit4(org.junit.runners.JUnit4) PCollection(org.apache.beam.sdk.values.PCollection) SyntheticBoundedSource(org.apache.beam.sdk.io.synthetic.SyntheticBoundedSource) IOITMetrics(org.apache.beam.sdk.testutils.metrics.IOITMetrics) Rule(org.junit.Rule) SyntheticOptions.fromJsonString(org.apache.beam.sdk.io.synthetic.SyntheticOptions.fromJsonString) ParDo(org.apache.beam.sdk.transforms.ParDo) Validation(org.apache.beam.sdk.options.Validation) Assert.assertEquals(org.junit.Assert.assertEquals) IOTestPipelineOptions(org.apache.beam.sdk.io.common.IOTestPipelineOptions) MetricsReader(org.apache.beam.sdk.testutils.metrics.MetricsReader) NamedTestResult(org.apache.beam.sdk.testutils.NamedTestResult) SyntheticOptions.fromJsonString(org.apache.beam.sdk.io.synthetic.SyntheticOptions.fromJsonString)
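
For reference, a minimal sketch of the same pattern in isolation. It assumes the constants TEST_ID, TIMESTAMP, NAMESPACE, and WRITE_TIME_METRIC_NAME and an already-configured InfluxDBSettings instance, mirroring the ones used above; it is not part of KafkaIOIT itself.

private void publishWriteTime(PipelineResult result, InfluxDBSettings settings) {
    // Read the start/end time metrics that a TimeMonitor recorded under NAMESPACE.
    MetricsReader reader = new MetricsReader(result, NAMESPACE);
    long start = reader.getStartTimeMetric(WRITE_TIME_METRIC_NAME);
    long end = reader.getEndTimeMetric(WRITE_TIME_METRIC_NAME);
    // NamedTestResult.create takes a test id, a timestamp string, a metric name and a value;
    // the elapsed time is converted from milliseconds to seconds.
    NamedTestResult writeTime =
        NamedTestResult.create(TEST_ID, TIMESTAMP, WRITE_TIME_METRIC_NAME, (end - start) / 1e3);
    IOITMetrics.publishToInflux(TEST_ID, TIMESTAMP, ImmutableSet.of(writeTime), settings);
}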

Example 2 with NamedTestResult

Use of org.apache.beam.sdk.testutils.NamedTestResult in the apache/beam project.

From the class KafkaIOIT, the method testKafkaIOReadsAndWritesCorrectlyInStreaming:

@Test
public void testKafkaIOReadsAndWritesCorrectlyInStreaming() throws IOException {
    // Use batch pipeline to write records.
    writePipeline
        .apply("Generate records", Read.from(new SyntheticBoundedSource(sourceOptions)))
        .apply("Measure write time", ParDo.of(new TimeMonitor<>(NAMESPACE, WRITE_TIME_METRIC_NAME)))
        .apply("Write to Kafka", writeToKafka());
    // Use streaming pipeline to read Kafka records.
    readPipeline.getOptions().as(Options.class).setStreaming(true);
    readPipeline
        .apply("Read from unbounded Kafka", readFromKafka())
        .apply("Measure read time", ParDo.of(new TimeMonitor<>(NAMESPACE, READ_TIME_METRIC_NAME)))
        .apply("Map records to strings", MapElements.via(new MapKafkaRecordsToStrings()))
        .apply("Counting element", ParDo.of(new CountingFn(NAMESPACE, READ_ELEMENT_METRIC_NAME)));
    PipelineResult writeResult = writePipeline.run();
    writeResult.waitUntilFinish();
    PipelineResult readResult = readPipeline.run();
    PipelineResult.State readState = readResult.waitUntilFinish(Duration.standardSeconds(options.getReadTimeout()));
    cancelIfTimeouted(readResult, readState);
    assertEquals(sourceOptions.numRecords, readElementMetric(readResult, NAMESPACE, READ_ELEMENT_METRIC_NAME));
    if (!options.isWithTestcontainers()) {
        Set<NamedTestResult> metrics = readMetrics(writeResult, readResult);
        IOITMetrics.publishToInflux(TEST_ID, TIMESTAMP, metrics, settings);
    }
}
Also used : SyntheticBoundedSource(org.apache.beam.sdk.io.synthetic.SyntheticBoundedSource) TimeMonitor(org.apache.beam.sdk.testutils.metrics.TimeMonitor) StreamingOptions(org.apache.beam.sdk.options.StreamingOptions) SyntheticSourceOptions(org.apache.beam.sdk.io.synthetic.SyntheticSourceOptions) IOTestPipelineOptions(org.apache.beam.sdk.io.common.IOTestPipelineOptions) NamedTestResult(org.apache.beam.sdk.testutils.NamedTestResult) PipelineResult(org.apache.beam.sdk.PipelineResult) Test(org.junit.Test)
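
The cancelIfTimeouted helper is defined elsewhere in KafkaIOIT and is not part of this excerpt. A plausible sketch, relying on waitUntilFinish(Duration) returning null when the timeout elapses before the streaming pipeline terminates:

private void cancelIfTimeouted(PipelineResult readResult, PipelineResult.State readState)
        throws IOException {
    // A null state means the read pipeline was still running when the timeout expired,
    // so cancel it explicitly (PipelineResult.cancel() may throw IOException).
    if (readState == null) {
        readResult.cancel();
    }
}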

Example 3 with NamedTestResult

Use of org.apache.beam.sdk.testutils.NamedTestResult in the apache/beam project.

From the class BigQueryIOIT, the method extractAndPublishTime:

private void extractAndPublishTime(PipelineResult pipelineResult, String writeTimeMetricName) {
    final NamedTestResult metricResult = getMetricSupplier(writeTimeMetricName).apply(new MetricsReader(pipelineResult, NAMESPACE));
    final List<NamedTestResult> listResults = Collections.singletonList(metricResult);
    IOITMetrics.publishToInflux(TEST_ID, TEST_TIMESTAMP, listResults, settings);
}
Also used : NamedTestResult(org.apache.beam.sdk.testutils.NamedTestResult) MetricsReader(org.apache.beam.sdk.testutils.metrics.MetricsReader)
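
getMetricSupplier is defined elsewhere in BigQueryIOIT. A sketch consistent with the BiFunction in Example 1, assuming the same TEST_ID and TEST_TIMESTAMP constants used above and java.util.function.Function:

private static Function<MetricsReader, NamedTestResult> getMetricSupplier(String metricName) {
    return reader -> {
        long start = reader.getStartTimeMetric(metricName);
        long end = reader.getEndTimeMetric(metricName);
        // Convert the elapsed time from milliseconds to seconds, as in Example 1.
        return NamedTestResult.create(TEST_ID, TEST_TIMESTAMP, metricName, (end - start) / 1e3);
    };
}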

Example 4 with NamedTestResult

Use of org.apache.beam.sdk.testutils.NamedTestResult in the apache/beam project.

From the class LoadTest, the method readMetrics:

private List<NamedTestResult> readMetrics(Timestamp timestamp, PipelineResult result, String testId) {
    MetricsReader reader = new MetricsReader(result, metricsNamespace);
    NamedTestResult runtime =
        NamedTestResult.create(
            testId,
            timestamp.toString(),
            buildMetric("runtime_sec"),
            (reader.getEndTimeMetric("runtime") - reader.getStartTimeMetric("runtime")) / 1000D);
    NamedTestResult totalBytes =
        NamedTestResult.create(
            testId,
            timestamp.toString(),
            buildMetric("total_bytes_count"),
            reader.getCounterMetric("totalBytes.count"));
    return Arrays.asList(runtime, totalBytes);
}
Also used : MetricsReader(org.apache.beam.sdk.testutils.metrics.MetricsReader) NamedTestResult(org.apache.beam.sdk.testutils.NamedTestResult)
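
The "totalBytes.count" counter read above has to be incremented somewhere in the pipeline under test. A sketch of a hypothetical byte-counting DoFn that could produce it (the class name and wiring are assumptions, not part of LoadTest; it also assumes org.apache.beam.sdk.values.KV):

private static class ByteCountingFn extends DoFn<KV<byte[], byte[]>, KV<byte[], byte[]>> {
    private final Counter totalBytes;

    ByteCountingFn(String namespace) {
        // The counter name must match the one passed to MetricsReader#getCounterMetric.
        this.totalBytes = Metrics.counter(namespace, "totalBytes.count");
    }

    @ProcessElement
    public void processElement(ProcessContext context) {
        KV<byte[], byte[]> record = context.element();
        totalBytes.inc(record.getKey().length + record.getValue().length);
        context.output(record);
    }
}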

Example 5 with NamedTestResult

Use of org.apache.beam.sdk.testutils.NamedTestResult in the apache/beam project.

From the class LoadTest, the method run:

/**
 * Runs the load test, collects the test results, and publishes them to various data stores and/or the console.
 */
public PipelineResult run() throws IOException {
    final Timestamp timestamp = Timestamp.now();
    loadTest();
    final PipelineResult pipelineResult = pipeline.run();
    pipelineResult.waitUntilFinish(Duration.standardMinutes(options.getLoadTestTimeout()));
    final String testId = UUID.randomUUID().toString();
    final List<NamedTestResult> metrics = readMetrics(timestamp, pipelineResult, testId);
    ConsoleResultPublisher.publish(metrics, testId, timestamp.toString());
    handleFailure(pipelineResult, metrics);
    if (options.getPublishToInfluxDB()) {
        InfluxDBPublisher.publishWithSettings(metrics, settings);
    }
    return pipelineResult;
}
Also used : NamedTestResult(org.apache.beam.sdk.testutils.NamedTestResult) PipelineResult(org.apache.beam.sdk.PipelineResult) SyntheticOptions.fromJsonString(org.apache.beam.sdk.io.synthetic.SyntheticOptions.fromJsonString) Timestamp(com.google.cloud.Timestamp)
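
handleFailure is also defined in LoadTest but not shown here. A plausible sketch, assuming the intent is to fail the run when the pipeline does not finish successfully (the exact check and exception type are assumptions):

private void handleFailure(PipelineResult pipelineResult, List<NamedTestResult> testResults) {
    // Hypothetical check: if the pipeline did not reach DONE, surface a failure so that the
    // metrics collected above are not mistaken for a successful run.
    if (pipelineResult.getState() != PipelineResult.State.DONE) {
        throw new IllegalStateException(
            "Load test pipeline finished in state: " + pipelineResult.getState());
    }
}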

Aggregations

NamedTestResult (org.apache.beam.sdk.testutils.NamedTestResult)6 PipelineResult (org.apache.beam.sdk.PipelineResult)4 SyntheticBoundedSource (org.apache.beam.sdk.io.synthetic.SyntheticBoundedSource)3 SyntheticOptions.fromJsonString (org.apache.beam.sdk.io.synthetic.SyntheticOptions.fromJsonString)3 MetricsReader (org.apache.beam.sdk.testutils.metrics.MetricsReader)3 TimeMonitor (org.apache.beam.sdk.testutils.metrics.TimeMonitor)3 Test (org.junit.Test)3 Timestamp (com.google.cloud.Timestamp)2 HashingFn (org.apache.beam.sdk.io.common.HashingFn)2 IOTestPipelineOptions (org.apache.beam.sdk.io.common.IOTestPipelineOptions)2 SyntheticSourceOptions (org.apache.beam.sdk.io.synthetic.SyntheticSourceOptions)2 StreamingOptions (org.apache.beam.sdk.options.StreamingOptions)2 IOException (java.io.IOException)1 Arrays (java.util.Arrays)1 Map (java.util.Map)1 Set (java.util.Set)1 UUID (java.util.UUID)1 BiFunction (java.util.function.BiFunction)1 Read (org.apache.beam.sdk.io.Read)1 IOITHelper (org.apache.beam.sdk.io.common.IOITHelper)1