
Example 1 with SparkStreamingPreparer

Usage of io.cdap.cdap.etl.spark.streaming.SparkStreamingPreparer in the project cdap by caskdata, taken from the run method of the class SparkStreamingPipelineDriver:

private JavaStreamingContext run(DataStreamsPipelineSpec pipelineSpec, PipelinePhase pipelinePhase,
                                 JavaSparkExecutionContext sec, @Nullable String checkpointDir,
                                 @Nullable JavaSparkContext context) throws Exception {
    PipelinePluginContext pluginContext = new PipelinePluginContext(sec.getPluginContext(), sec.getMetrics(),
        pipelineSpec.isStageLoggingEnabled(), pipelineSpec.isProcessTimingEnabled());
    PipelineRuntime pipelineRuntime = new SparkPipelineRuntime(sec);
    MacroEvaluator evaluator = new DefaultMacroEvaluator(pipelineRuntime.getArguments(), sec.getLogicalStartTime(),
        sec.getSecureStore(), sec.getServiceDiscoverer(), sec.getNamespace());
    SparkStreamingPreparer preparer = new SparkStreamingPreparer(pluginContext, sec.getMetrics(), evaluator,
        pipelineRuntime, sec);
    try {
        SparkFieldLineageRecorder recorder = new SparkFieldLineageRecorder(sec, pipelinePhase, pipelineSpec, preparer);
        recorder.record();
    } catch (Exception e) {
        LOG.warn("Failed to emit field lineage operations for streaming pipeline", e);
    }
    Set<String> uncombinableSinks = preparer.getUncombinableSinks();
    // the content in the function might not run due to spark checkpointing, currently just have the lineage logic
    // before anything is run
    Function0<JavaStreamingContext> contextFunction = (Function0<JavaStreamingContext>) () -> {
        JavaSparkContext javaSparkContext = context == null ? new JavaSparkContext() : context;
        JavaStreamingContext jssc = new JavaStreamingContext(javaSparkContext,
            Durations.milliseconds(pipelineSpec.getBatchIntervalMillis()));
        SparkStreamingPipelineRunner runner = new SparkStreamingPipelineRunner(sec, jssc, pipelineSpec,
            pipelineSpec.isCheckpointsDisabled());
        // Seems like they should be set at configure time instead of runtime? but that requires an API change.
        try {
            PhaseSpec phaseSpec = new PhaseSpec(sec.getApplicationSpecification().getName(), pipelinePhase,
                Collections.emptyMap(), pipelineSpec.isStageLoggingEnabled(), pipelineSpec.isProcessTimingEnabled());
            boolean shouldConsolidateStages = Boolean.parseBoolean(
                sec.getRuntimeArguments().getOrDefault(Constants.CONSOLIDATE_STAGES, Boolean.TRUE.toString()));
            boolean shouldCacheFunctions = Boolean.parseBoolean(
                sec.getRuntimeArguments().getOrDefault(Constants.CACHE_FUNCTIONS, Boolean.TRUE.toString()));
            runner.runPipeline(phaseSpec, StreamingSource.PLUGIN_TYPE, sec, Collections.emptyMap(), pluginContext,
                Collections.emptyMap(), uncombinableSinks, shouldConsolidateStages, shouldCacheFunctions);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        if (checkpointDir != null) {
            jssc.checkpoint(checkpointDir);
            jssc.sparkContext().hadoopConfiguration().set("fs.defaultFS", checkpointDir);
        }
        return jssc;
    };
    return checkpointDir == null ? contextFunction.call() : JavaStreamingContext.getOrCreate(checkpointDir, contextFunction, context.hadoopConfiguration());
}
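The final line relies on the contract of Spark's JavaStreamingContext.getOrCreate: when the checkpoint directory holds state from a previous run, the context is restored from it and contextFunction is never invoked, which is why the comment above warns that code inside the function "might not run due to spark checkpointing" and the lineage logic is kept outside it. A minimal, Spark-free sketch of that contract, with a hypothetical in-memory CHECKPOINTS map standing in for the checkpoint filesystem (none of these names are Spark or CDAP APIs):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch of the getOrCreate checkpoint-recovery pattern: if a checkpoint exists
// under checkpointDir, restore from it and skip the factory; otherwise build a
// fresh context via the factory and record it for the next run.
public class CheckpointSketch {
    // Stands in for a checkpoint filesystem: checkpointDir -> saved context.
    static final Map<String, String> CHECKPOINTS = new HashMap<>();

    static String getOrCreate(String checkpointDir, Supplier<String> factory) {
        if (checkpointDir != null && CHECKPOINTS.containsKey(checkpointDir)) {
            // Recovery path: the factory is never invoked.
            return CHECKPOINTS.get(checkpointDir);
        }
        String fresh = factory.get();            // no checkpoint: build a new context
        if (checkpointDir != null) {
            CHECKPOINTS.put(checkpointDir, fresh); // checkpoint it for the next run
        }
        return fresh;
    }
}
```

On a second call with the same directory the factory is skipped entirely, which mirrors why side effects placed inside contextFunction may never execute after a recovery.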
Imports used by this example:

import io.cdap.cdap.api.macro.MacroEvaluator;
import io.cdap.cdap.etl.common.DefaultMacroEvaluator;
import io.cdap.cdap.etl.common.PhaseSpec;
import io.cdap.cdap.etl.common.PipelineRuntime;
import io.cdap.cdap.etl.common.plugin.PipelinePluginContext;
import io.cdap.cdap.etl.spark.SparkPipelineRuntime;
import io.cdap.cdap.etl.spark.streaming.SparkStreamingPreparer;
import java.io.IOException;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function0;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
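The snippet reads its shouldConsolidateStages and shouldCacheFunctions flags with Boolean.parseBoolean over Map.getOrDefault, so both features default to true whenever the runtime argument is absent, and any value other than a case-insensitive "true" disables them. A small sketch of that lookup (the map and the key string below are hypothetical, not CDAP's actual Constants values):

```java
import java.util.Map;

// Sketch of the flag lookup: getOrDefault supplies "true" when the key is
// missing, and Boolean.parseBoolean accepts "true" case-insensitively.
public class FlagSketch {
    static boolean flag(Map<String, String> runtimeArgs, String key) {
        return Boolean.parseBoolean(runtimeArgs.getOrDefault(key, Boolean.TRUE.toString()));
    }
}
```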
