Use of io.cdap.cdap.etl.api.Alert in project cdap by caskdata.
The class CombinedEmitter, method emitAlert:
@Override
public void emitAlert(Map<String, String> payload) {
  Alert alert = new Alert(stageName, payload);
  emitted.add(RecordInfo.<Object>builder(alert, stageName, RecordType.ALERT).build());
}
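The pattern above wraps an alert payload with the name of the emitting stage and buffers it for later collection. A minimal self-contained sketch of that buffering pattern is shown below; the `Alert` and `BufferingEmitter` classes here are simplified stand-ins for the CDAP types, not the real API.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Simplified stand-in for io.cdap.cdap.etl.api.Alert: a stage name plus a payload map.
class Alert {
  private final String stageName;
  private final Map<String, String> payload;

  Alert(String stageName, Map<String, String> payload) {
    this.stageName = stageName;
    // Defensive copy, mirroring the ImmutableMap.copyOf(payload) seen in PipeEmitter.
    this.payload = Collections.unmodifiableMap(new LinkedHashMap<>(payload));
  }

  String getStageName() { return stageName; }
  Map<String, String> getPayload() { return payload; }
}

// Sketch of the CombinedEmitter pattern: emitAlert tags the payload with the
// stage name and appends it to an in-memory buffer of emitted records.
class BufferingEmitter {
  private final String stageName;
  private final List<Alert> emitted = new ArrayList<>();

  BufferingEmitter(String stageName) { this.stageName = stageName; }

  void emitAlert(Map<String, String> payload) {
    emitted.add(new Alert(stageName, payload));
  }

  List<Alert> getEmitted() { return emitted; }
}
```

A downstream consumer can then drain `getEmitted()` in one pass, which is how the buffered records reach the rest of the pipeline.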
Use of io.cdap.cdap.etl.api.Alert in project cdap by caskdata.
The class StreamingAlertPublishFunction, method call:
@Override
public void call(JavaRDD<Alert> data, Time batchTime) throws Exception {
  MacroEvaluator evaluator = new DefaultMacroEvaluator(new BasicArguments(sec), batchTime.milliseconds(),
                                                       sec.getSecureStore(), sec.getServiceDiscoverer(),
                                                       sec.getNamespace());
  PluginContext pluginContext = new SparkPipelinePluginContext(sec.getPluginContext(), sec.getMetrics(),
                                                               stageSpec.isStageLoggingEnabled(),
                                                               stageSpec.isProcessTimingEnabled());
  String stageName = stageSpec.getName();
  AlertPublisher alertPublisher = pluginContext.newPluginInstance(stageName, evaluator);
  PipelineRuntime pipelineRuntime = new SparkPipelineRuntime(sec, batchTime.milliseconds());
  AlertPublisherContext alertPublisherContext =
    new DefaultAlertPublisherContext(pipelineRuntime, stageSpec, sec.getMessagingContext(), sec.getAdmin());
  alertPublisher.initialize(alertPublisherContext);
  StageMetrics stageMetrics = new DefaultStageMetrics(sec.getMetrics(), stageName);
  TrackedIterator<Alert> trackedAlerts =
    new TrackedIterator<>(data.collect().iterator(), stageMetrics, Constants.Metrics.RECORDS_IN);
  alertPublisher.publish(trackedAlerts);
  alertPublisher.destroy();
}
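Stripped of the Spark and plugin-context plumbing, the snippet above drives a three-step lifecycle: instantiate the publisher, initialize it with a context, publish the collected batch as an iterator, then destroy it. The sketch below illustrates that sequence with hypothetical stand-in types (`SimpleAlertPublisher`, `LoggingAlertPublisher`, `AlertDriver` are illustrative names, not CDAP APIs).

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Hypothetical minimal publisher contract mirroring the
// initialize / publish / destroy lifecycle driven in call() above.
interface SimpleAlertPublisher {
  void initialize() throws Exception;
  void publish(Iterator<String> alerts) throws Exception;
  void destroy();
}

// A publisher that records what it saw, so the lifecycle is observable.
class LoggingAlertPublisher implements SimpleAlertPublisher {
  final List<String> published = new ArrayList<>();
  boolean initialized;
  boolean destroyed;

  public void initialize() { initialized = true; }

  public void publish(Iterator<String> alerts) {
    // Drain the batch iterator, like publish(trackedAlerts) in the snippet.
    while (alerts.hasNext()) {
      published.add(alerts.next());
    }
  }

  public void destroy() { destroyed = true; }
}

class AlertDriver {
  // Runs the same ordered sequence as StreamingAlertPublishFunction.call().
  static LoggingAlertPublisher run(List<String> batch) throws Exception {
    LoggingAlertPublisher publisher = new LoggingAlertPublisher();
    publisher.initialize();
    publisher.publish(batch.iterator());
    publisher.destroy();
    return publisher;
  }
}
```

Note that the batch is materialized with `data.collect()` before publishing, so the real code hands the publisher a plain local iterator rather than a distributed RDD.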
Use of io.cdap.cdap.etl.api.Alert in project cdap by caskdata.
The class PipeEmitter, method emitAlert:
@Override
public void emitAlert(Map<String, String> payload) {
  Alert alert = new Alert(stageName, ImmutableMap.copyOf(payload));
  RecordInfo<Alert> alertRecord = RecordInfo.builder(alert, stageName, RecordType.ALERT).build();
  for (PipeStage<RecordInfo<Alert>> alertConsumer : alertConsumers) {
    alertConsumer.consume(alertRecord);
  }
}
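Unlike CombinedEmitter, PipeEmitter does not buffer: it builds one alert record and fans it out to every registered downstream consumer immediately. A minimal generic sketch of that fan-out, using a hypothetical `FanOutEmitter` (not a CDAP class), might look like this:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch of PipeEmitter's fan-out: each emitted record is
// delivered to every consumer registered for the stage, in order.
class FanOutEmitter<T> {
  private final List<Consumer<T>> consumers = new ArrayList<>();

  void addConsumer(Consumer<T> consumer) {
    consumers.add(consumer);
  }

  void emit(T record) {
    // Mirrors the for-loop over alertConsumers in emitAlert above.
    for (Consumer<T> consumer : consumers) {
      consumer.accept(record);
    }
  }
}
```

Because every consumer receives the same record instance, the real code copies the payload into an immutable map first (`ImmutableMap.copyOf(payload)`), so no consumer can mutate what the others see.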