
Example 6 with JobExecutionException

Use of org.quartz.JobExecutionException in project spring-framework by spring-projects.

The class QuartzJobBean, method execute.

/**
 * This implementation applies the passed-in job data map as bean property
 * values, and delegates to {@code executeInternal} afterwards.
 * @see #executeInternal
 */
@Override
public final void execute(JobExecutionContext context) throws JobExecutionException {
    try {
        BeanWrapper bw = PropertyAccessorFactory.forBeanPropertyAccess(this);
        MutablePropertyValues pvs = new MutablePropertyValues();
        pvs.addPropertyValues(context.getScheduler().getContext());
        pvs.addPropertyValues(context.getMergedJobDataMap());
        bw.setPropertyValues(pvs, true);
    } catch (SchedulerException ex) {
        throw new JobExecutionException(ex);
    }
    executeInternal(context);
}
Also used : BeanWrapper(org.springframework.beans.BeanWrapper) SchedulerException(org.quartz.SchedulerException) JobExecutionException(org.quartz.JobExecutionException) MutablePropertyValues(org.springframework.beans.MutablePropertyValues)
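The idea behind QuartzJobBean is a template method: the final execute applies the merged job data as bean properties, then hands off to the subclass hook. Below is a minimal, dependency-free sketch of that pattern; all type names (SketchJobBean, SketchJob) are stand-ins invented for illustration, not the real Spring/Quartz API, and the reflective BeanWrapper binding is replaced by an explicit hook.

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for QuartzJobBean: bind job data first, then delegate.
abstract class SketchJobBean {
    // Mirrors the final execute(JobExecutionContext) above.
    public final void execute(Map<String, Object> mergedJobDataMap) {
        applyProperties(mergedJobDataMap);
        executeInternal(mergedJobDataMap);
    }

    // In the real class, a BeanWrapper does this reflectively; here each
    // subclass binds its own properties explicitly.
    protected abstract void applyProperties(Map<String, Object> dataMap);

    // Subclass hook, equivalent to executeInternal(JobExecutionContext).
    protected abstract void executeInternal(Map<String, Object> context);
}

public class SketchJob extends SketchJobBean {
    private String message;

    @Override
    protected void applyProperties(Map<String, Object> dataMap) {
        this.message = (String) dataMap.get("message");
    }

    @Override
    protected void executeInternal(Map<String, Object> context) {
        // job body would run here, with properties already bound
    }

    public String getMessage() {
        return message;
    }

    public static void main(String[] args) {
        Map<String, Object> data = new HashMap<>();
        data.put("message", "hello");
        SketchJob job = new SketchJob();
        job.execute(data);
        assert "hello".equals(job.getMessage());
    }
}
```

The payoff of the pattern is that subclasses never see the binding step: by the time executeInternal runs, every job-data entry is already a field value.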

Example 7 with JobExecutionException

Use of org.quartz.JobExecutionException in project camel by apache.

The class ScheduledJob, method execute.

public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
    SchedulerContext schedulerContext;
    try {
        schedulerContext = jobExecutionContext.getScheduler().getContext();
    } catch (SchedulerException e) {
        throw new JobExecutionException("Failed to obtain scheduler context for job " + jobExecutionContext.getJobDetail().getName(), e);
    }
    ScheduledJobState state = (ScheduledJobState) schedulerContext.get(jobExecutionContext.getJobDetail().getName());
    Action storedAction = state.getAction();
    Route storedRoute = state.getRoute();
    List<RoutePolicy> policyList = storedRoute.getRouteContext().getRoutePolicyList();
    for (RoutePolicy policy : policyList) {
        try {
            if (policy instanceof ScheduledRoutePolicy) {
                ((ScheduledRoutePolicy) policy).onJobExecute(storedAction, storedRoute);
            }
        } catch (Exception e) {
            throw new JobExecutionException("Failed to execute Scheduled Job for route " + storedRoute.getId() + " with trigger name: " + jobExecutionContext.getTrigger().getFullName(), e);
        }
    }
}
Also used : SchedulerException(org.quartz.SchedulerException) JobExecutionException(org.quartz.JobExecutionException) SchedulerContext(org.quartz.SchedulerContext) RoutePolicy(org.apache.camel.spi.RoutePolicy) Route(org.apache.camel.Route)
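The core of this job is a type-filtered dispatch: walk the route's policy list and invoke the callback only on policies of the scheduled subtype. A dependency-free sketch of that loop follows; the interface and class names here (Policy, ScheduledPolicy, RecordingPolicy) are stand-ins for the Camel types, chosen for illustration only.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for org.apache.camel.spi.RoutePolicy.
interface Policy { }

// Stand-in for ScheduledRoutePolicy: the only subtype that gets the callback.
interface ScheduledPolicy extends Policy {
    void onJobExecute(String action, String routeId);
}

// Records every callback so the dispatch behavior can be observed.
class RecordingPolicy implements ScheduledPolicy {
    final List<String> calls = new ArrayList<>();

    @Override
    public void onJobExecute(String action, String routeId) {
        calls.add(action + ":" + routeId);
    }
}

// A policy of an unrelated subtype; the dispatcher must skip it.
class OtherPolicy implements Policy { }

public class PolicyDispatch {
    // Mirrors the for-loop in ScheduledJob.execute: instanceof check,
    // cast, then the hook call.
    static void dispatch(List<Policy> policies, String action, String routeId) {
        for (Policy policy : policies) {
            if (policy instanceof ScheduledPolicy) {
                ((ScheduledPolicy) policy).onJobExecute(action, routeId);
            }
        }
    }

    public static void main(String[] args) {
        RecordingPolicy recording = new RecordingPolicy();
        List<Policy> policies = new ArrayList<>();
        policies.add(new OtherPolicy());
        policies.add(recording);
        dispatch(policies, "START", "route-1");
        assert recording.calls.size() == 1;
        assert "START:route-1".equals(recording.calls.get(0));
    }
}
```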

Example 8 with JobExecutionException

Use of org.quartz.JobExecutionException in project camel by apache.

The class CamelJob, method lookupQuartzEndpoint.

private QuartzEndpoint lookupQuartzEndpoint(CamelContext camelContext, String endpointUri, Trigger trigger) throws JobExecutionException {
    String targetTriggerName = trigger.getName();
    String targetTriggerGroup = trigger.getGroup();
    LOG.debug("Looking up existing QuartzEndpoint with trigger {}.{}", targetTriggerName, targetTriggerGroup);
    try {
        // as we prefer to use the existing endpoint from the routes
        for (Route route : camelContext.getRoutes()) {
            Endpoint endpoint = route.getEndpoint();
            if (endpoint instanceof DelegateEndpoint) {
                endpoint = ((DelegateEndpoint) endpoint).getEndpoint();
            }
            if (endpoint instanceof QuartzEndpoint) {
                QuartzEndpoint quartzEndpoint = (QuartzEndpoint) endpoint;
                String triggerName = quartzEndpoint.getTrigger().getName();
                String triggerGroup = quartzEndpoint.getTrigger().getGroup();
                LOG.trace("Checking route trigger {}.{}", triggerName, triggerGroup);
                if (triggerName.equals(targetTriggerName) && triggerGroup.equals(targetTriggerGroup)) {
                    return (QuartzEndpoint) endpoint;
                }
            }
        }
    } catch (Exception e) {
        throw new JobExecutionException("Error looking up existing QuartzEndpoint with trigger: " + trigger, e);
    }
    // fallback and lookup existing from registry (eg maybe a @Consume POJO with a quartz endpoint, and thus not from a route)
    if (camelContext.hasEndpoint(endpointUri) != null) {
        return camelContext.getEndpoint(endpointUri, QuartzEndpoint.class);
    } else {
        LOG.warn("Cannot find existing QuartzEndpoint with uri: {}. Creating new endpoint instance.", endpointUri);
        return camelContext.getEndpoint(endpointUri, QuartzEndpoint.class);
    }
}
Also used : JobExecutionException(org.quartz.JobExecutionException) Endpoint(org.apache.camel.Endpoint) DelegateEndpoint(org.apache.camel.DelegateEndpoint) Route(org.apache.camel.Route) SchedulerException(org.quartz.SchedulerException)
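The lookup strategy here has two layers: prefer an endpoint already owned by a route (unwrapping any delegate first, then comparing trigger keys), and only fall back to the registry when no route matches. A minimal sketch of that search-then-fallback shape is below; every type (Endpoint, WrappedEndpoint, EndpointLookup) is a stand-in invented for this illustration, not the Camel API.

```java
import java.util.Arrays;
import java.util.List;

// Stand-in for a Camel endpoint keyed by its trigger name.group.
interface Endpoint {
    String triggerKey();
}

class PlainEndpoint implements Endpoint {
    private final String key;
    PlainEndpoint(String key) { this.key = key; }
    public String triggerKey() { return key; }
}

// Stand-in for DelegateEndpoint: must be unwrapped before comparing keys.
class WrappedEndpoint implements Endpoint {
    private final Endpoint inner;
    WrappedEndpoint(Endpoint inner) { this.inner = inner; }
    public Endpoint getEndpoint() { return inner; }
    public String triggerKey() { return inner.triggerKey(); }
}

public class EndpointLookup {
    // Mirrors lookupQuartzEndpoint: scan route endpoints first, unwrap
    // delegates, match on the trigger key, else use the registry fallback.
    static Endpoint lookup(List<Endpoint> routeEndpoints, String targetKey, Endpoint registryFallback) {
        for (Endpoint endpoint : routeEndpoints) {
            if (endpoint instanceof WrappedEndpoint) {
                endpoint = ((WrappedEndpoint) endpoint).getEndpoint();
            }
            if (targetKey.equals(endpoint.triggerKey())) {
                return endpoint;
            }
        }
        // no route owns this trigger, so resolve from the registry instead
        return registryFallback;
    }

    public static void main(String[] args) {
        Endpoint plain = new PlainEndpoint("timer.group1");
        Endpoint wrapped = new WrappedEndpoint(new PlainEndpoint("cron.group2"));
        Endpoint fallback = new PlainEndpoint("fallback");
        List<Endpoint> routes = Arrays.asList(plain, wrapped);
        assert "cron.group2".equals(lookup(routes, "cron.group2", fallback).triggerKey());
        assert lookup(routes, "missing", fallback) == fallback;
    }
}
```

Preferring the route's own endpoint matters because it is already started and configured; the registry fallback covers consumers (such as a @Consume POJO) that never appear in a route.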

Example 9 with JobExecutionException

Use of org.quartz.JobExecutionException in project camel by apache.

The class ScheduledJob, method execute (a later Camel version of Example 7, keyed by JobKey/TriggerKey instead of names).

public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
    LOG.debug("Running ScheduledJob: jobExecutionContext={}", jobExecutionContext);
    SchedulerContext schedulerContext = getSchedulerContext(jobExecutionContext);
    ScheduledJobState state = (ScheduledJobState) schedulerContext.get(jobExecutionContext.getJobDetail().getKey().toString());
    Action storedAction = state.getAction();
    Route storedRoute = state.getRoute();
    List<RoutePolicy> policyList = storedRoute.getRouteContext().getRoutePolicyList();
    for (RoutePolicy policy : policyList) {
        try {
            if (policy instanceof ScheduledRoutePolicy) {
                ((ScheduledRoutePolicy) policy).onJobExecute(storedAction, storedRoute);
            }
        } catch (Exception e) {
            throw new JobExecutionException("Failed to execute Scheduled Job for route " + storedRoute.getId() + " with trigger name: " + jobExecutionContext.getTrigger().getKey(), e);
        }
    }
}
Also used : JobExecutionException(org.quartz.JobExecutionException) SchedulerContext(org.quartz.SchedulerContext) RoutePolicy(org.apache.camel.spi.RoutePolicy) Route(org.apache.camel.Route) SchedulerException(org.quartz.SchedulerException)

Example 10 with JobExecutionException

Use of org.quartz.JobExecutionException in project OpenClinica by OpenClinica.

The class ExampleSpringJob, method executeInternal.

@Override
protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
    // need to generate a Locale so that user beans and other things will
    // generate normally
    Locale locale = new Locale("en", "US");
    ResourceBundleProvider.updateLocale(locale);
    ResourceBundle pageMessages = ResourceBundleProvider.getPageMessagesBundle();
    // logger.debug("--");
    // logger.debug("-- executing a job " + message + " at " + new
    // java.util.Date().toString());
    JobDataMap dataMap = context.getMergedJobDataMap();
    SimpleTrigger trigger = (SimpleTrigger) context.getTrigger();
    try {
        ApplicationContext appContext = (ApplicationContext) context.getScheduler().getContext().get("applicationContext");
        String studySubjectNumber = ((CoreResources) appContext.getBean("coreResources")).getField("extract.number");
        coreResources = (CoreResources) appContext.getBean("coreResources");
        ruleSetRuleDao = (RuleSetRuleDao) appContext.getBean("ruleSetRuleDao");
        dataSource = (DataSource) appContext.getBean("dataSource");
        mailSender = (OpenClinicaMailSender) appContext.getBean("openClinicaMailSender");
        AuditEventDAO auditEventDAO = new AuditEventDAO(dataSource);
        // Scheduler scheduler = context.getScheduler();
        // JobDetail detail = context.getJobDetail();
        // jobDetailBean = (JobDetailBean) detail;
        /*
             * data map here should coincide with the job data map found in
             * CreateJobExportServlet, with the following code: jobDataMap = new
             * JobDataMap(); jobDataMap.put(DATASET_ID, datasetId);
             * jobDataMap.put(PERIOD, period); jobDataMap.put(EMAIL, email);
             * jobDataMap.put(TAB, tab); jobDataMap.put(CDISC, cdisc);
             * jobDataMap.put(SPSS, spss);
             */
        String alertEmail = dataMap.getString(EMAIL);
        String localeStr = dataMap.getString(LOCALE);
        if (localeStr != null) {
            locale = new Locale(localeStr);
            ResourceBundleProvider.updateLocale(locale);
            pageMessages = ResourceBundleProvider.getPageMessagesBundle();
        }
        int dsId = dataMap.getInt(DATASET_ID);
        String tab = dataMap.getString(TAB);
        String cdisc = dataMap.getString(CDISC);
        String cdisc12 = dataMap.getString(CDISC12);
        if (cdisc12 == null) {
            cdisc12 = "0";
        }
        String cdisc13 = dataMap.getString(CDISC13);
        if (cdisc13 == null) {
            cdisc13 = "0";
        }
        String cdisc13oc = dataMap.getString(CDISC13OC);
        if (cdisc13oc == null) {
            cdisc13oc = "0";
        }
        String spss = dataMap.getString(SPSS);
        int userId = dataMap.getInt(USER_ID);
        int studyId = dataMap.getInt(STUDY_ID);
        // String datasetId = dataMap.getString(DATASET_ID);
        // int dsId = new Integer(datasetId).intValue();
        // String userAcctId = dataMap.getString(USER_ID);
        // int userId = new Integer(userAcctId).intValue();
        // why the flip-flop? if one property is set to 'true' we can
        // see jobs in another screen but all properties have to be
        // strings
        logger.debug("-- found the job: " + dsId + " dataset id");
        // for (Iterator it = dataMap.entrySet().iterator(); it.hasNext();)
        // {
        // java.util.Map.Entry entry = (java.util.Map.Entry) it.next();
        // Object key = entry.getKey();
        // Object value = entry.getValue();
        // // logger.debug("-- found datamap property: " + key.toString() +
        // // " : " + value.toString());
        // }
        HashMap<String, Integer> fileName = new HashMap<String, Integer>();
        if (dsId > 0) {
            // trying to not throw an error if there's no dataset id
            DatasetDAO dsdao = new DatasetDAO(dataSource);
            DatasetBean datasetBean = (DatasetBean) dsdao.findByPK(dsId);
            StudyDAO studyDao = new StudyDAO(dataSource);
            UserAccountDAO userAccountDAO = new UserAccountDAO(dataSource);
            // hmm, three lines in the if block DRY?
            String generalFileDir = "";
            String generalFileDirCopy = "";
            String exportFilePath = SQLInitServlet.getField("exportFilePath");
            String pattern = "yyyy" + File.separator + "MM" + File.separator + "dd" + File.separator + "HHmmssSSS" + File.separator;
            SimpleDateFormat sdfDir = new SimpleDateFormat(pattern);
            generalFileDir = DATASET_DIR + datasetBean.getId() + File.separator + sdfDir.format(new java.util.Date());
            if (!"".equals(exportFilePath)) {
                generalFileDirCopy = SQLInitServlet.getField("filePath") + exportFilePath + File.separator;
            }
            // logger.debug("-- created the following dir: " +
            // generalFileDir);
            long sysTimeBegin = System.currentTimeMillis();
            // set up the user bean here, tbh
            // logger.debug("-- gen tab file 00");
            userBean = (UserAccountBean) userAccountDAO.findByPK(userId);
            // needs to also be captured by the servlet, tbh
            // logger.debug("-- gen tab file 00");
            generateFileService = new GenerateExtractFileService(dataSource, coreResources, ruleSetRuleDao);
            // logger.debug("-- gen tab file 00");
            // tbh #5796 - covers a bug when the user changes studies, 10/2010
            StudyBean activeStudy = (StudyBean) studyDao.findByPK(studyId);
            StudyBean parentStudy = new StudyBean();
            logger.debug("active study: " + studyId + " parent study: " + activeStudy.getParentStudyId());
            if (activeStudy.getParentStudyId() > 0) {
                // StudyDAO sdao = new StudyDAO(sm.getDataSource());
                parentStudy = (StudyBean) studyDao.findByPK(activeStudy.getParentStudyId());
            } else {
                parentStudy = activeStudy;
            // covers a bug in tab file creation, tbh 01/2009
            }
            logger.debug("-- found extract bean ");
            ExtractBean eb = generateFileService.generateExtractBean(datasetBean, activeStudy, parentStudy);
            MessageFormat mf = new MessageFormat("");
            StringBuffer message = new StringBuffer();
            StringBuffer auditMessage = new StringBuffer();
            // use resource bundle page messages to generate the email, tbh
            // 02/2009
            // message.append(pageMessages.getString("html_email_header_1")
            // + " " + alertEmail +
            // pageMessages.getString("html_email_header_2") + "<br/>");
            message.append("<p>" + pageMessages.getString("email_header_1") + " " + EmailEngine.getAdminEmail() + " " + pageMessages.getString("email_header_2") + " Job Execution " + pageMessages.getString("email_header_3") + "</p>");
            message.append("<P>Dataset: " + datasetBean.getName() + "</P>");
            message.append("<P>Study: " + activeStudy.getName() + "</P>");
            message.append("<p>" + pageMessages.getString("html_email_body_1") + datasetBean.getName() + pageMessages.getString("html_email_body_2") + SQLInitServlet.getField("sysURL") + pageMessages.getString("html_email_body_3") + "</p>");
            // logger.debug("-- gen tab file 00");
            if ("1".equals(tab)) {
                logger.debug("-- gen tab file 01");
                fileName = generateFileService.createTabFile(eb, sysTimeBegin, generalFileDir, datasetBean, activeStudy.getId(), parentStudy.getId(), generalFileDirCopy, userBean);
                message.append("<p>" + pageMessages.getString("html_email_body_4") + " " + getFileNameStr(fileName) + pageMessages.getString("html_email_body_4_5") + SQLInitServlet.getField("sysURL.base") + "AccessFile?fileId=" + getFileIdInt(fileName) + pageMessages.getString("html_email_body_3") + "</p>");
                // MessageFormat mf = new MessageFormat("");
                // mf.applyPattern(pageMessages.getString(
                // "you_can_access_tab_delimited"));
                // Object[] arguments = { getFileIdInt(fileName) };
                // auditMessage.append(mf.format(arguments));
                // auditMessage.append(
                // "You can access your tab-delimited file <a href='AccessFile?fileId="
                // + getFileIdInt(fileName) + "'>here</a>.<br/>");
                auditMessage.append(pageMessages.getString("you_can_access_tab_delimited") + getFileIdInt(fileName) + pageMessages.getString("access_end"));
            }
            if ("1".equals(cdisc)) {
                String odmVersion = "oc1.2";
                fileName = generateFileService.createODMFile(odmVersion, sysTimeBegin, generalFileDir, datasetBean, activeStudy, generalFileDirCopy, eb, activeStudy.getId(), parentStudy.getId(), studySubjectNumber, true, true, true, null, userBean);
                logger.debug("-- gen odm file");
                message.append("<p>" + pageMessages.getString("html_email_body_4") + " " + getFileNameStr(fileName) + pageMessages.getString("html_email_body_4_5") + SQLInitServlet.getField("sysURL.base") + "AccessFile?fileId=" + getFileIdInt(fileName) + pageMessages.getString("html_email_body_3") + "</p>");
                // MessageFormat mf = new MessageFormat("");
                // mf.applyPattern(pageMessages.getString(
                // "you_can_access_odm_12"));
                // Object[] arguments = { getFileIdInt(fileName) };
                // auditMessage.append(mf.format(arguments));
                // auditMessage.append(
                // "You can access your ODM 1.2 w/OpenClinica Extension XML file <a href='AccessFile?fileId="
                // + getFileIdInt(fileName)
                // + "'>here</a>.<br/>");
                auditMessage.append(pageMessages.getString("you_can_access_odm_12") + getFileIdInt(fileName) + pageMessages.getString("access_end"));
            }
            if ("1".equals(cdisc12)) {
                String odmVersion = "1.2";
                fileName = generateFileService.createODMFile(odmVersion, sysTimeBegin, generalFileDir, datasetBean, activeStudy, generalFileDirCopy, eb, activeStudy.getId(), parentStudy.getId(), studySubjectNumber, true, true, true, null, userBean);
                logger.debug("-- gen odm file 1.2 default");
                message.append("<p>" + pageMessages.getString("html_email_body_4") + " " + getFileNameStr(fileName) + pageMessages.getString("html_email_body_4_5") + SQLInitServlet.getField("sysURL.base") + "AccessFile?fileId=" + getFileIdInt(fileName) + pageMessages.getString("html_email_body_3") + "</p>");
                // mf.applyPattern(pageMessages.getString(
                // "you_can_access_odm_12_xml"));
                // Object[] arguments = { getFileIdInt(fileName) };
                // auditMessage.append(mf.format(arguments));
                // // auditMessage.append(
                // "You can access your ODM 1.2 XML file <a href='AccessFile?fileId="
                // + getFileIdInt(fileName) + "'>here</a>.<br/>");
                auditMessage.append(pageMessages.getString("you_can_access_odm_12_xml") + getFileIdInt(fileName) + pageMessages.getString("access_end"));
            }
            if ("1".equals(cdisc13)) {
                String odmVersion = "1.3";
                fileName = generateFileService.createODMFile(odmVersion, sysTimeBegin, generalFileDir, datasetBean, activeStudy, generalFileDirCopy, eb, activeStudy.getId(), parentStudy.getId(), studySubjectNumber, true, true, true, null, userBean);
                logger.debug("-- gen odm file 1.3");
                message.append("<p>" + pageMessages.getString("html_email_body_4") + " " + getFileNameStr(fileName) + pageMessages.getString("html_email_body_4_5") + SQLInitServlet.getField("sysURL.base") + "AccessFile?fileId=" + getFileIdInt(fileName) + pageMessages.getString("html_email_body_3") + "</p>");
                // MessageFormat mf = new MessageFormat("");
                // mf.applyPattern(pageMessages.getString(
                // "you_can_access_odm_13"));
                // Object[] arguments = { getFileIdInt(fileName) };
                // auditMessage.append(mf.format(arguments));
                // auditMessage.append(
                // "You can access your ODM 1.3 XML file <a href='AccessFile?fileId="
                // + getFileIdInt(fileName) + "'>here</a>.<br/>");
                auditMessage.append(pageMessages.getString("you_can_access_odm_13") + getFileIdInt(fileName) + pageMessages.getString("access_end"));
            }
            if ("1".equals(cdisc13oc)) {
                String odmVersion = "oc1.3";
                fileName = generateFileService.createODMFile(odmVersion, sysTimeBegin, generalFileDir, datasetBean, activeStudy, generalFileDirCopy, eb, activeStudy.getId(), parentStudy.getId(), studySubjectNumber, true, true, true, null, userBean);
                logger.debug("-- gen odm file 1.3 oc");
                message.append("<p>" + pageMessages.getString("html_email_body_4") + " " + getFileNameStr(fileName) + pageMessages.getString("html_email_body_4_5") + SQLInitServlet.getField("sysURL.base") + "AccessFile?fileId=" + getFileIdInt(fileName) + pageMessages.getString("html_email_body_3") + "</p>");
                // MessageFormat mf = new MessageFormat("");
                // mf.applyPattern(pageMessages.getString(
                // "you_can_access_odm_13_xml"));
                // Object[] arguments = { getFileIdInt(fileName) };
                // auditMessage.append(mf.format(arguments));
                // auditMessage.append(
                // "You can access your ODM 1.3 w/OpenClinica Extension XML file <a href='AccessFile?fileId="
                // + getFileIdInt(fileName)
                // + "'>here</a>.<br/>");
                auditMessage.append(pageMessages.getString("you_can_access_odm_13_xml") + getFileIdInt(fileName) + pageMessages.getString("access_end"));
            }
            if ("1".equals(spss)) {
                SPSSReportBean answer = new SPSSReportBean();
                fileName = generateFileService.createSPSSFile(datasetBean, eb, activeStudy, parentStudy, sysTimeBegin, generalFileDir, answer, generalFileDirCopy, userBean);
                logger.debug("-- gen spss file");
                message.append("<p>" + pageMessages.getString("html_email_body_4") + " " + getFileNameStr(fileName) + pageMessages.getString("html_email_body_4_5") + SQLInitServlet.getField("sysURL.base") + "AccessFile?fileId=" + getFileIdInt(fileName) + pageMessages.getString("html_email_body_3") + "</p>");
                // MessageFormat mf = new MessageFormat("");
                // mf.applyPattern(pageMessages.getString(
                // "you_can_access_spss"));
                // Object[] arguments = { getFileIdInt(fileName) };
                // auditMessage.append(mf.format(arguments));
                // auditMessage.append(
                // "You can access your SPSS files <a href='AccessFile?fileId="
                // + getFileIdInt(fileName) + "'>here</a>.<br/>");
                auditMessage.append(pageMessages.getString("you_can_access_spss") + getFileIdInt(fileName) + pageMessages.getString("access_end"));
            }
            // wrap up the message, and send the email
            message.append("<p>" + pageMessages.getString("html_email_body_5") + "</P><P>" + pageMessages.getString("email_footer"));
            try {
                mailSender.sendEmail(alertEmail.trim(), pageMessages.getString("job_ran_for") + " " + datasetBean.getName(), message.toString(), true);
            } catch (OpenClinicaSystemException ose) {
            // Do Nothing, In the future we might want to have an email
            // status added to system.
            }
            TriggerBean triggerBean = new TriggerBean();
            triggerBean.setDataset(datasetBean);
            triggerBean.setUserAccount(userBean);
            triggerBean.setFullName(trigger.getKey().getName());
            auditEventDAO.createRowForExtractDataJobSuccess(triggerBean, auditMessage.toString());
        } else {
            TriggerBean triggerBean = new TriggerBean();
            // triggerBean.setDataset(datasetBean);
            triggerBean.setUserAccount(userBean);
            triggerBean.setFullName(trigger.getKey().getName());
            auditEventDAO.createRowForExtractDataJobFailure(triggerBean);
        // logger.debug("-- made it here for some reason, ds id: "
        // + dsId);
        }
    // logger.debug("-- generated file: " + fileNameStr);
    // dataSource.
    } catch (Exception e) {
        // TODO Auto-generated catch block -- ideally should generate a fail
        // msg here, tbh 02/2009
        logger.debug("-- found exception: " + e.getMessage());
        e.printStackTrace();
    }
}
Also used : Locale(java.util.Locale) HashMap(java.util.HashMap) CoreResources(org.akaza.openclinica.dao.core.CoreResources) DatasetBean(org.akaza.openclinica.bean.extract.DatasetBean) ApplicationContext(org.springframework.context.ApplicationContext) SimpleTrigger(org.quartz.SimpleTrigger) StudyDAO(org.akaza.openclinica.dao.managestudy.StudyDAO) SPSSReportBean(org.akaza.openclinica.bean.extract.SPSSReportBean) JobDataMap(org.quartz.JobDataMap) GenerateExtractFileService(org.akaza.openclinica.service.extract.GenerateExtractFileService) TriggerBean(org.akaza.openclinica.bean.admin.TriggerBean) MessageFormat(java.text.MessageFormat) StudyBean(org.akaza.openclinica.bean.managestudy.StudyBean) AuditEventDAO(org.akaza.openclinica.dao.admin.AuditEventDAO) OpenClinicaSystemException(org.akaza.openclinica.exception.OpenClinicaSystemException) DatasetDAO(org.akaza.openclinica.dao.extract.DatasetDAO) UserAccountDAO(org.akaza.openclinica.dao.login.UserAccountDAO) JobExecutionException(org.quartz.JobExecutionException) ExtractBean(org.akaza.openclinica.bean.extract.ExtractBean) ResourceBundle(java.util.ResourceBundle) SimpleDateFormat(java.text.SimpleDateFormat)
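A recurring idiom in this job is how it reads format flags from the JobDataMap: each flag (tab, cdisc, cdisc12, cdisc13, cdisc13oc, spss) arrives as the string "1" or "0", a missing key is treated as "0", and a file is generated only when the flag is exactly "1". That defaulting can be factored into a small helper, sketched below with plain maps; the helper name `flag` is invented for illustration and does not exist in OpenClinica.

```java
import java.util.HashMap;
import java.util.Map;

public class FlagDefaults {
    // Mirrors the repeated null-check blocks in executeInternal:
    // a missing flag behaves exactly like the string "0".
    static String flag(Map<String, String> dataMap, String key) {
        String value = dataMap.get(key);
        return value == null ? "0" : value;
    }

    public static void main(String[] args) {
        Map<String, String> dataMap = new HashMap<>();
        dataMap.put("cdisc13", "1");
        dataMap.put("spss", "0");
        // present flags keep their value; absent flags default to "0"
        assert "1".equals(flag(dataMap, "cdisc13"));
        assert "0".equals(flag(dataMap, "spss"));
        assert "0".equals(flag(dataMap, "cdisc12"));
        // the export branch fires only on an exact "1"
        assert "1".equals(flag(dataMap, "cdisc13"));
        assert !"1".equals(flag(dataMap, "cdisc12"));
    }
}
```

The original code notes the reason for string flags rather than booleans: all job-data properties have to be strings for the jobs to display correctly in another screen.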

Aggregations

JobExecutionException (org.quartz.JobExecutionException): 33
SchedulerException (org.quartz.SchedulerException): 9
EmailException (org.apache.commons.mail.EmailException): 6
JobDataMap (org.quartz.JobDataMap): 6
ArrayList (java.util.ArrayList): 5
HashMap (java.util.HashMap): 5
SchedulerContext (org.quartz.SchedulerContext): 4
TimeSeriesResponse (com.linkedin.thirdeye.client.timeseries.TimeSeriesResponse): 3
ByteArrayOutputStream (java.io.ByteArrayOutputStream): 3
ExecutionException (java.util.concurrent.ExecutionException): 3
CamelContext (org.apache.camel.CamelContext): 3
Route (org.apache.camel.Route): 3
HtmlEmail (org.apache.commons.mail.HtmlEmail): 3
ThirdEyeAnomalyConfiguration (com.linkedin.thirdeye.anomaly.ThirdEyeAnomalyConfiguration): 2
MetricDimensionReport (com.linkedin.thirdeye.anomaly.alert.template.pojo.MetricDimensionReport): 2
DataReportHelper (com.linkedin.thirdeye.anomaly.alert.util.DataReportHelper): 2
DimensionKey (com.linkedin.thirdeye.api.DimensionKey): 2
MetricTimeSeries (com.linkedin.thirdeye.api.MetricTimeSeries): 2
TimeGranularity (com.linkedin.thirdeye.api.TimeGranularity): 2
MetricExpression (com.linkedin.thirdeye.client.MetricExpression): 2