
Example 11 with SparkManager

use of io.cdap.cdap.test.SparkManager in project cdap by caskdata.

the class MessagingAppTestRun method testSparkMessaging.

@Test
public void testSparkMessaging() throws Exception {
    ApplicationManager appManager = deployWithArtifact(NAMESPACE, MessagingApp.class, artifactJar);
    final SparkManager sparkManager = appManager.getSparkManager(MessagingSpark.class.getSimpleName()).start();
    final MessageFetcher fetcher = getMessagingContext().getMessageFetcher();
    final AtomicReference<String> messageId = new AtomicReference<>();
    // Wait for the Spark program to create the topic
    final MessagingAdmin messagingAdmin = getMessagingAdmin(NAMESPACE.getNamespace());
    Tasks.waitFor(true, new Callable<Boolean>() {

        @Override
        public Boolean call() throws Exception {
            try {
                messagingAdmin.getTopicProperties(MessagingApp.TOPIC);
                return true;
            } catch (TopicNotFoundException e) {
                return false;
            }
        }
    }, 60, TimeUnit.SECONDS, 100, TimeUnit.MILLISECONDS);
    // Verify that a failed transaction does not publish anything: only the "start" and "block" messages should arrive.
    for (String expected : Arrays.asList("start", "block")) {
        Tasks.waitFor(expected, new Callable<String>() {

            @Override
            public String call() throws Exception {
                try (CloseableIterator<Message> iterator = fetcher.fetch(NAMESPACE.getNamespace(), MessagingApp.TOPIC, 1, messageId.get())) {
                    if (!iterator.hasNext()) {
                        return null;
                    }
                    Message message = iterator.next();
                    messageId.set(message.getId());
                    return message.getPayloadAsString();
                }
            }
        }, 60, TimeUnit.SECONDS, 100, TimeUnit.MILLISECONDS);
    }
    // Publish a control message to unblock the Spark execution
    getMessagingContext().getMessagePublisher().publish(NAMESPACE.getNamespace(), MessagingApp.CONTROL_TOPIC, "go");
    // Expect a result message "result-15", where 15 is the sum of 1,2,3,4,5
    Tasks.waitFor("result-15", new Callable<String>() {

        @Override
        public String call() throws Exception {
            try (CloseableIterator<Message> iterator = fetcher.fetch(NAMESPACE.getNamespace(), MessagingApp.TOPIC, 1, messageId.get())) {
                if (!iterator.hasNext()) {
                    return null;
                }
                Message message = iterator.next();
                messageId.set(message.getId());
                return message.getPayloadAsString();
            }
        }
    }, 60, TimeUnit.SECONDS, 100, TimeUnit.MILLISECONDS);
    sparkManager.waitForRun(ProgramRunStatus.COMPLETED, 60, TimeUnit.SECONDS);
}
Also used : ApplicationManager(io.cdap.cdap.test.ApplicationManager) MessageFetcher(io.cdap.cdap.api.messaging.MessageFetcher) CloseableIterator(io.cdap.cdap.api.dataset.lib.CloseableIterator) SparkManager(io.cdap.cdap.test.SparkManager) Message(io.cdap.cdap.api.messaging.Message) TopicNotFoundException(io.cdap.cdap.api.messaging.TopicNotFoundException) AtomicReference(java.util.concurrent.atomic.AtomicReference) TimeoutException(java.util.concurrent.TimeoutException) MessagingAdmin(io.cdap.cdap.api.messaging.MessagingAdmin) Test(org.junit.Test)
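
The fetch-with-cursor pattern that appears twice in this test can be distilled into a small helper. This is a minimal sketch, not part of the original test: the helper name pollOne is made up, and it assumes the same test-class context and CDAP messaging API shown above (MessageFetcher.fetch(namespace, topic, limit, afterMessageId) returning a CloseableIterator<Message>).

// Hedged sketch, not CDAP source: fetch at most one message published after
// the message id held in the cursor, advancing the cursor on success.
// Returns the payload, or null if nothing new has arrived yet.
private String pollOne(MessageFetcher fetcher, String namespace, String topic,
                       AtomicReference<String> cursor) throws Exception {
    try (CloseableIterator<Message> iterator = fetcher.fetch(namespace, topic, 1, cursor.get())) {
        if (!iterator.hasNext()) {
            // Nothing new yet; a surrounding Tasks.waitFor loop would poll again.
            return null;
        }
        Message message = iterator.next();
        cursor.set(message.getId());
        return message.getPayloadAsString();
    }
}

With such a helper, each Tasks.waitFor callable above reduces to returning pollOne(fetcher, NAMESPACE.getNamespace(), MessagingApp.TOPIC, messageId).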

Example 12 with SparkManager

use of io.cdap.cdap.test.SparkManager in project cdap by caskdata.

the class SparkFileSetTestRun method testSparkWithPartitionedFileSet.

private void testSparkWithPartitionedFileSet(ApplicationManager applicationManager, String sparkProgram) throws Exception {
    DataSetManager<PartitionedFileSet> pfsManager = getDataset("pfs");
    PartitionedFileSet pfs = pfsManager.get();
    PartitionOutput partitionOutput = pfs.getPartitionOutput(PartitionKey.builder().addStringField("x", "nn").build());
    Location location = partitionOutput.getLocation();
    prepareFileInput(location);
    partitionOutput.addPartition();
    pfsManager.flush();
    Map<String, String> inputArgs = new HashMap<>();
    PartitionedFileSetArguments.setInputPartitionFilter(inputArgs, PartitionFilter.builder().addRangeCondition("x", "na", "nx").build());
    Map<String, String> outputArgs = new HashMap<>();
    PartitionKey outputKey = PartitionKey.builder().addStringField("x", "xx").build();
    PartitionedFileSetArguments.setOutputPartitionKey(outputArgs, outputKey);
    Map<String, String> args = new HashMap<>();
    args.putAll(RuntimeArguments.addScope(Scope.DATASET, "pfs", inputArgs));
    args.putAll(RuntimeArguments.addScope(Scope.DATASET, "pfs", outputArgs));
    args.put("input", "pfs");
    args.put("output", "pfs");
    SparkManager sparkManager = applicationManager.getSparkManager(sparkProgram).start(args);
    sparkManager.waitForRun(ProgramRunStatus.COMPLETED, 10, TimeUnit.MINUTES);
    pfsManager.flush();
    PartitionDetail partition = pfs.getPartition(outputKey);
    Assert.assertNotNull(partition);
    validateFileOutput(partition.getLocation());
    // Clean up after the test completes
    pfs.dropPartition(partitionOutput.getPartitionKey());
    pfs.dropPartition(partition.getPartitionKey());
    pfsManager.flush();
}
Also used : SparkManager(io.cdap.cdap.test.SparkManager) PartitionOutput(io.cdap.cdap.api.dataset.lib.PartitionOutput) HashMap(java.util.HashMap) PartitionKey(io.cdap.cdap.api.dataset.lib.PartitionKey) TimePartitionedFileSet(io.cdap.cdap.api.dataset.lib.TimePartitionedFileSet) PartitionedFileSet(io.cdap.cdap.api.dataset.lib.PartitionedFileSet) PartitionDetail(io.cdap.cdap.api.dataset.lib.PartitionDetail) Location(org.apache.twill.filesystem.Location)
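
The RuntimeArguments.addScope calls are what route the input and output arguments to the "pfs" dataset: assuming the standard CDAP prefixing behavior, each key is prefixed with "<scope>.<name>.", and the input and output argument sets use distinct keys, so both can be merged into one map. A minimal sketch; the key "some.key" is made up for illustration:

// Hedged sketch of dataset-scoped runtime arguments. addScope prefixes every
// key as "<scope>.<name>.<key>" -- here "dataset.pfs.some.key" -- so the
// argument is visible only to the dataset named "pfs".
Map<String, String> rawArgs = new HashMap<>();
rawArgs.put("some.key", "some-value");
Map<String, String> scoped = RuntimeArguments.addScope(Scope.DATASET, "pfs", rawArgs);
// scoped now contains {"dataset.pfs.some.key" -> "some-value"}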

Example 13 with SparkManager

use of io.cdap.cdap.test.SparkManager in project cdap by caskdata.

the class SparkFileSetTestRun method testSparkWithTimePartitionedFileSet.

private void testSparkWithTimePartitionedFileSet(ApplicationManager applicationManager, String sparkProgram) throws Exception {
    long customOutputPartitionKey = 123456789L;
    long customInputPartitionKey = 987654321L;
    DataSetManager<TimePartitionedFileSet> tpfsManager = getDataset("tpfs");
    long inputTime = System.currentTimeMillis();
    long outputTime = inputTime + TimeUnit.HOURS.toMillis(1);
    addTimePartition(tpfsManager, inputTime);
    addTimePartition(tpfsManager, customInputPartitionKey);
    Map<String, String> inputArgs = new HashMap<>();
    TimePartitionedFileSetArguments.setInputStartTime(inputArgs, inputTime - 100);
    TimePartitionedFileSetArguments.setInputEndTime(inputArgs, inputTime + 100);
    Map<String, String> outputArgs = new HashMap<>();
    TimePartitionedFileSetArguments.setOutputPartitionTime(outputArgs, outputTime);
    Map<String, String> args = new HashMap<>();
    args.putAll(RuntimeArguments.addScope(Scope.DATASET, "tpfs", inputArgs));
    args.putAll(RuntimeArguments.addScope(Scope.DATASET, "tpfs", outputArgs));
    args.put("input", "tpfs");
    args.put("output", "tpfs");
    args.put("outputKey", String.valueOf(customOutputPartitionKey));
    args.put("inputKey", String.valueOf(customInputPartitionKey));
    SparkManager sparkManager = applicationManager.getSparkManager(sparkProgram).start(args);
    sparkManager.waitForRun(ProgramRunStatus.COMPLETED, 10, TimeUnit.MINUTES);
    tpfsManager.flush();
    TimePartitionedFileSet tpfs = tpfsManager.get();
    PartitionDetail partition = tpfs.getPartitionByTime(outputTime);
    Assert.assertNotNull("Output partition is null while for running without custom dataset arguments", partition);
    validateFileOutput(partition.getLocation());
    PartitionDetail customPartition = tpfs.getPartitionByTime(customOutputPartitionKey);
    Assert.assertNotNull("Output partition is null while for running with custom dataset arguments", customPartition);
    validateFileOutput(customPartition.getLocation());
    // Clean up after running the test
    tpfs.dropPartition(inputTime);
    tpfs.dropPartition(customInputPartitionKey);
    tpfs.dropPartition(partition.getPartitionKey());
    tpfs.dropPartition(customPartition.getPartitionKey());
    tpfsManager.flush();
}
Also used : SparkManager(io.cdap.cdap.test.SparkManager) HashMap(java.util.HashMap) TimePartitionedFileSet(io.cdap.cdap.api.dataset.lib.TimePartitionedFileSet) PartitionDetail(io.cdap.cdap.api.dataset.lib.PartitionDetail)
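
The ±100 ms window around inputTime is what selects the first input partition: in the TPFS API a partition created at time T is matched when startTime <= T < endTime (end exclusive), while the second partition at customInputPartitionKey falls outside the window and is instead resolved by the Spark program itself through the custom "inputKey" argument. A hedged sketch of the two lookup styles on the read-back side, assuming the range method getPartitionsByTime(start, end) from the CDAP TimePartitionedFileSet API:

// Sketch: exact lookup vs. range lookup on a TimePartitionedFileSet.
// getPartitionByTime matches a partition's time exactly, while
// getPartitionsByTime returns every partition with start <= time < end.
PartitionDetail exact = tpfs.getPartitionByTime(outputTime);
Set<PartitionDetail> inWindow = tpfs.getPartitionsByTime(inputTime - 100, inputTime + 100);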

Example 14 with SparkManager

use of io.cdap.cdap.test.SparkManager in project cdap by caskdata.

the class SparkTest method testClassicSpark.

@Test
public void testClassicSpark() throws Exception {
    ApplicationManager appManager = deploy(TestSparkApp.class);
    for (Class<?> sparkClass : Arrays.asList(TestSparkApp.ClassicSpark.class, TestSparkApp.ScalaClassicSpark.class)) {
        final SparkManager sparkManager = appManager.getSparkManager(sparkClass.getSimpleName());
        sparkManager.startAndWaitForGoodRun(ProgramRunStatus.COMPLETED, 5, TimeUnit.MINUTES);
    }
    KeyValueTable resultTable = this.<KeyValueTable>getDataset("ResultTable").get();
    Assert.assertEquals(1L, Bytes.toLong(resultTable.read(ClassicSparkProgram.class.getName())));
    Assert.assertEquals(1L, Bytes.toLong(resultTable.read(ScalaClassicSparkProgram.class.getName())));
}
Also used : ApplicationManager(io.cdap.cdap.test.ApplicationManager) SparkManager(io.cdap.cdap.test.SparkManager) KeyValueTable(io.cdap.cdap.api.dataset.lib.KeyValueTable) TestSparkApp(io.cdap.cdap.spark.app.TestSparkApp) ScalaClassicSparkProgram(io.cdap.cdap.spark.app.ScalaClassicSparkProgram) ClassicSparkProgram(io.cdap.cdap.spark.app.ClassicSparkProgram) ScalaClassicSparkProgram(io.cdap.cdap.spark.app.ScalaClassicSparkProgram) Test(org.junit.Test)
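
startAndWaitForGoodRun is a convenience over the start/waitForRun pair used in the other examples; it additionally fails fast if the run ends in a bad status. A hedged sketch of the roughly equivalent two-step form, reusing names from the example:

// Roughly equivalent two-step form (sketch): start the program, then block
// until a COMPLETED run is recorded. Unlike startAndWaitForGoodRun, a FAILED
// run here would only surface as a timeout rather than an immediate failure.
SparkManager sparkManager = appManager.getSparkManager(TestSparkApp.ClassicSpark.class.getSimpleName());
sparkManager.start();
sparkManager.waitForRun(ProgramRunStatus.COMPLETED, 5, TimeUnit.MINUTES);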

Example 15 with SparkManager

use of io.cdap.cdap.test.SparkManager in project cdap by caskdata.

the class SparkTest method testTransaction.

@Test
public void testTransaction() throws Exception {
    ApplicationManager applicationManager = deploy(TestSparkApp.class);
    // Write some data to a local file
    File inputFile = TEMP_FOLDER.newFile();
    try (PrintWriter writer = new PrintWriter(Files.newBufferedWriter(inputFile.toPath(), StandardCharsets.UTF_8))) {
        writer.println("red fox");
        writer.println("brown fox");
        writer.println("grey fox");
        writer.println("brown bear");
        writer.println("black bear");
    }
    // Run the spark program
    SparkManager sparkManager = applicationManager.getSparkManager(TransactionSpark.class.getSimpleName());
    sparkManager.start(ImmutableMap.of("input.file", inputFile.getAbsolutePath(), "keyvalue.table", "KeyValueTable", "result.all.dataset", "SparkResult", "result.threshold", "2", "result.threshold.dataset", "SparkThresholdResult"));
    // Verify result from dataset before the Spark program terminates
    final DataSetManager<KeyValueTable> resultManager = getDataset("SparkThresholdResult");
    final KeyValueTable resultTable = resultManager.get();
    // Expect the threshold result dataset (threshold >= 2) to contain [brown, fox, bear]
    Tasks.waitFor(ImmutableSet.of("brown", "fox", "bear"), () -> {
        // Flush to start a new transaction so each poll sees freshly committed writes
        resultManager.flush();
        LOG.info("Reading from threshold result");
        try (CloseableIterator<KeyValue<byte[], byte[]>> itor = resultTable.scan(null, null)) {
            return ImmutableSet.copyOf(Iterators.transform(itor, input -> {
                String word = Bytes.toString(input.getKey());
                LOG.info("{}, {}", word, Bytes.toInt(input.getValue()));
                return word;
            }));
        }
    }, 3, TimeUnit.MINUTES, 1, TimeUnit.SECONDS);
    sparkManager.stop();
    sparkManager.waitForRun(ProgramRunStatus.KILLED, 60, TimeUnit.SECONDS);
}
Also used : ApplicationManager(io.cdap.cdap.test.ApplicationManager) SparkManager(io.cdap.cdap.test.SparkManager) KeyValue(io.cdap.cdap.api.dataset.lib.KeyValue) KeyValueTable(io.cdap.cdap.api.dataset.lib.KeyValueTable) File(java.io.File) PrintWriter(java.io.PrintWriter) TransactionSpark(io.cdap.cdap.spark.app.TransactionSpark) Test(org.junit.Test)
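
The resultManager.flush() inside the polling loop is the load-bearing detail: each test transaction sees a fixed snapshot, so without the flush every iteration would re-read the state from when the first transaction opened. A minimal sketch of the same poll-with-flush pattern; the dataset name "WordCounts" and key "fox" are hypothetical:

// Poll a dataset while a program is still writing to it. flush() ends the
// current test transaction so the next read starts a fresh one and can
// observe newly committed writes. Dataset and key names are hypothetical.
DataSetManager<KeyValueTable> manager = getDataset("WordCounts");
KeyValueTable table = manager.get();
Tasks.waitFor(true, () -> {
    manager.flush();
    return table.read("fox") != null;
}, 3, TimeUnit.MINUTES, 1, TimeUnit.SECONDS);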

Aggregations

SparkManager (io.cdap.cdap.test.SparkManager) 84
ApplicationManager (io.cdap.cdap.test.ApplicationManager) 66
Test (org.junit.Test) 60
KeyValueTable (io.cdap.cdap.api.dataset.lib.KeyValueTable) 34
HashMap (java.util.HashMap) 31
StructuredRecord (io.cdap.cdap.api.data.format.StructuredRecord) 26
ApplicationId (io.cdap.cdap.proto.id.ApplicationId) 26
HashSet (java.util.HashSet) 25
Table (io.cdap.cdap.api.dataset.table.Table) 24
AppRequest (io.cdap.cdap.proto.artifact.AppRequest) 24
DataStreamsConfig (io.cdap.cdap.etl.proto.v2.DataStreamsConfig) 22
ETLStage (io.cdap.cdap.etl.proto.v2.ETLStage) 22
Schema (io.cdap.cdap.api.data.schema.Schema) 21
FileSet (io.cdap.cdap.api.dataset.lib.FileSet) 18
Location (org.apache.twill.filesystem.Location) 18
File (java.io.File) 17
WorkflowManager (io.cdap.cdap.test.WorkflowManager) 14
URL (java.net.URL) 14
PrintStream (java.io.PrintStream) 12
HttpURLConnection (java.net.HttpURLConnection) 12