
Example 26 with TaskDef

use of com.netflix.conductor.common.metadata.tasks.TaskDef in project conductor by Netflix.

Class ExecutionService, method poll:

public List<Task> poll(String taskType, String workerId, String domain, int count, int timeoutInMilliSecond) {
    if (timeoutInMilliSecond > MAX_POLL_TIMEOUT_MS) {
        throw new ApplicationException(ApplicationException.Code.INVALID_INPUT, "Long Poll Timeout value cannot be more than 5 seconds");
    }
    String queueName = QueueUtils.getQueueName(taskType, domain, null, null);
    List<String> taskIds = new LinkedList<>();
    List<Task> tasks = new LinkedList<>();
    try {
        taskIds = queueDAO.pop(queueName, count, timeoutInMilliSecond);
    } catch (Exception e) {
        logger.error("Error polling for task: {} from worker: {} in domain: {}, count: {}", taskType, workerId, domain, count, e);
        Monitors.error(this.getClass().getCanonicalName(), "taskPoll");
        Monitors.recordTaskPollError(taskType, domain, e.getClass().getSimpleName());
    }
    for (String taskId : taskIds) {
        try {
            Task task = getTask(taskId);
            if (task == null || task.getStatus().isTerminal()) {
                // Remove the taskId from the queue if the task no longer exists or is already in a terminal state
                queueDAO.remove(queueName, taskId);
                logger.debug("Removed task: {} from the queue: {}", taskId, queueName);
                continue;
            }
            if (executionDAOFacade.exceedsInProgressLimit(task)) {
                // Postpone this message so that it becomes available to poll again.
                queueDAO.postpone(queueName, taskId, task.getWorkflowPriority(), queueTaskMessagePostponeSeconds);
                logger.debug("Postponed task: {} in queue: {} by {} seconds", taskId, queueName, queueTaskMessagePostponeSeconds);
                continue;
            }
            TaskDef taskDef = task.getTaskDefinition().isPresent() ? task.getTaskDefinition().get() : null;
            if (task.getRateLimitPerFrequency() > 0 && executionDAOFacade.exceedsRateLimitPerFrequency(task, taskDef)) {
                // Postpone this message so that it becomes available to poll again.
                queueDAO.postpone(queueName, taskId, task.getWorkflowPriority(), queueTaskMessagePostponeSeconds);
                logger.debug("RateLimit Execution limited for {}:{}, limit:{}", taskId, task.getTaskDefName(), task.getRateLimitPerFrequency());
                continue;
            }
            task.setStatus(Status.IN_PROGRESS);
            if (task.getStartTime() == 0) {
                task.setStartTime(System.currentTimeMillis());
                Monitors.recordQueueWaitTime(task.getTaskDefName(), task.getQueueWaitTime());
            }
            // reset callbackAfterSeconds when giving the task to the worker
            task.setCallbackAfterSeconds(0);
            task.setWorkerId(workerId);
            task.setPollCount(task.getPollCount() + 1);
            executionDAOFacade.updateTask(task);
            tasks.add(task);
        } catch (Exception e) {
            // db operation failed for dequeued message, re-enqueue with a delay
            logger.warn("DB operation failed for task: {}, postponing task in queue", taskId, e);
            Monitors.recordTaskPollError(taskType, domain, e.getClass().getSimpleName());
            queueDAO.postpone(queueName, taskId, 0, queueTaskMessagePostponeSeconds);
        }
    }
    executionDAOFacade.updateTaskLastPoll(taskType, domain, workerId);
    Monitors.recordTaskPoll(queueName);
    return tasks;
}
Also used : Task(com.netflix.conductor.common.metadata.tasks.Task), ApplicationException(com.netflix.conductor.core.execution.ApplicationException), TaskDef(com.netflix.conductor.common.metadata.tasks.TaskDef), LinkedList(java.util.LinkedList)
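
A worker would normally reach this server-side poll through the Conductor Java client rather than calling ExecutionService directly. Below is a minimal sketch of that worker-side call, assuming the conductor-client TaskClient API; the task type, worker id and root URI are illustrative, and the exact client method names may vary by Conductor version.

import com.netflix.conductor.client.http.TaskClient;
import com.netflix.conductor.common.metadata.tasks.Task;
import java.util.List;

// Illustrative worker-side batch poll; values are made up for the example.
public List<Task> pollBatch() {
    TaskClient taskClient = new TaskClient();
    taskClient.setRootURI("http://localhost:8080/api/");
    // The timeout must stay within the server's MAX_POLL_TIMEOUT_MS, or poll() above rejects it as INVALID_INPUT.
    return taskClient.batchPollTasksInDomain("encode_task", null, "worker-1", 5, 1000);
}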

Example 27 with TaskDef

use of com.netflix.conductor.common.metadata.tasks.TaskDef in project conductor by Netflix.

Class DeciderService, method decide:

private DeciderOutcome decide(final Workflow workflow, List<Task> preScheduledTasks) throws TerminateWorkflowException {
    DeciderOutcome outcome = new DeciderOutcome();
    if (workflow.getStatus().isTerminal()) {
        // you cannot evaluate a terminal workflow
        LOGGER.debug("Workflow {} is already finished. Reason: {}", workflow, workflow.getReasonForIncompletion());
        return outcome;
    }
    checkWorkflowTimeout(workflow);
    if (workflow.getStatus().equals(WorkflowStatus.PAUSED)) {
        LOGGER.debug("Workflow " + workflow.getWorkflowId() + " is paused");
        return outcome;
    }
    // Filter the list of tasks to include only those that have not been retried, have not been executed,
    // are not marked to be skipped, and are not system tasks such as DECISION, FORK and JOIN.
    // This list will be empty for a new workflow being started.
    List<Task> pendingTasks = workflow.getTasks().stream().filter(isNonPendingTask).collect(Collectors.toList());
    // Get all the tasks that have not completed their lifecycle yet
    // This list will be empty for a new workflow
    Set<String> executedTaskRefNames = workflow.getTasks().stream().filter(Task::isExecuted).map(Task::getReferenceTaskName).collect(Collectors.toSet());
    Map<String, Task> tasksToBeScheduled = new LinkedHashMap<>();
    preScheduledTasks.forEach(preScheduledTask -> {
        tasksToBeScheduled.put(preScheduledTask.getReferenceTaskName(), preScheduledTask);
    });
    // A new workflow does not enter this code branch
    for (Task pendingTask : pendingTasks) {
        if (SystemTaskType.is(pendingTask.getTaskType()) && !pendingTask.getStatus().isTerminal()) {
            tasksToBeScheduled.putIfAbsent(pendingTask.getReferenceTaskName(), pendingTask);
            executedTaskRefNames.remove(pendingTask.getReferenceTaskName());
        }
        Optional<TaskDef> taskDefinition = pendingTask.getTaskDefinition();
        if (!taskDefinition.isPresent()) {
            taskDefinition = Optional.ofNullable(workflow.getWorkflowDefinition().getTaskByRefName(pendingTask.getReferenceTaskName())).map(WorkflowTask::getTaskDefinition);
        }
        if (taskDefinition.isPresent()) {
            checkTaskTimeout(taskDefinition.get(), pendingTask);
            checkTaskPollTimeout(taskDefinition.get(), pendingTask);
            // If the task has not been updated for "responseTimeoutSeconds" then mark task as TIMED_OUT
            if (isResponseTimedOut(taskDefinition.get(), pendingTask)) {
                timeoutTask(taskDefinition.get(), pendingTask);
            }
        }
        if (!pendingTask.getStatus().isSuccessful()) {
            WorkflowTask workflowTask = pendingTask.getWorkflowTask();
            if (workflowTask == null) {
                workflowTask = workflow.getWorkflowDefinition().getTaskByRefName(pendingTask.getReferenceTaskName());
            }
            Optional<Task> retryTask = retry(taskDefinition.orElse(null), workflowTask, pendingTask, workflow);
            if (retryTask.isPresent()) {
                tasksToBeScheduled.put(retryTask.get().getReferenceTaskName(), retryTask.get());
                executedTaskRefNames.remove(retryTask.get().getReferenceTaskName());
                outcome.tasksToBeUpdated.add(pendingTask);
            } else {
                pendingTask.setStatus(COMPLETED_WITH_ERRORS);
            }
        }
        if (!pendingTask.isExecuted() && !pendingTask.isRetried() && pendingTask.getStatus().isTerminal()) {
            pendingTask.setExecuted(true);
            List<Task> nextTasks = getNextTask(workflow, pendingTask);
            if (pendingTask.isLoopOverTask() && !TaskType.DO_WHILE.name().equals(pendingTask.getTaskType()) && !nextTasks.isEmpty()) {
                nextTasks = filterNextLoopOverTasks(nextTasks, pendingTask, workflow);
            }
            nextTasks.forEach(nextTask -> tasksToBeScheduled.putIfAbsent(nextTask.getReferenceTaskName(), nextTask));
            outcome.tasksToBeUpdated.add(pendingTask);
            LOGGER.debug("Scheduling Tasks from {}, next = {} for workflowId: {}", pendingTask.getTaskDefName(), nextTasks.stream().map(Task::getTaskDefName).collect(Collectors.toList()), workflow.getWorkflowId());
        }
    }
    // Add all tasks that need to be scheduled to the outcome, excluding those whose reference names have already executed
    List<Task> unScheduledTasks = tasksToBeScheduled.values().stream().filter(task -> !executedTaskRefNames.contains(task.getReferenceTaskName())).collect(Collectors.toList());
    if (!unScheduledTasks.isEmpty()) {
        LOGGER.debug("Scheduling Tasks: {} for workflow: {}", unScheduledTasks.stream().map(Task::getTaskDefName).collect(Collectors.toList()), workflow.getWorkflowId());
        outcome.tasksToBeScheduled.addAll(unScheduledTasks);
    }
    if (containsSuccessfulTerminateTask.test(workflow) || (outcome.tasksToBeScheduled.isEmpty() && checkForWorkflowCompletion(workflow))) {
        LOGGER.debug("Marking workflow: {} as complete.", workflow);
        outcome.isComplete = true;
    }
    return outcome;
}
Also used : TaskUtils(com.netflix.conductor.common.utils.TaskUtils), TaskMapper(com.netflix.conductor.core.execution.mapper.TaskMapper), IDGenerator(com.netflix.conductor.core.utils.IDGenerator), Status(com.netflix.conductor.common.metadata.tasks.Task.Status), LoggerFactory(org.slf4j.LoggerFactory), TaskMapperContext(com.netflix.conductor.core.execution.mapper.TaskMapperContext), HashMap(java.util.HashMap), MetadataDAO(com.netflix.conductor.dao.MetadataDAO), Task(com.netflix.conductor.common.metadata.tasks.Task), StringUtils(org.apache.commons.lang3.StringUtils), LinkedHashMap(java.util.LinkedHashMap), Inject(javax.inject.Inject), SUB_WORKFLOW(com.netflix.conductor.common.metadata.workflow.TaskType.SUB_WORKFLOW), COMPLETED_WITH_ERRORS(com.netflix.conductor.common.metadata.tasks.Task.Status.COMPLETED_WITH_ERRORS), ExternalPayloadStorageUtils(com.netflix.conductor.core.utils.ExternalPayloadStorageUtils), Workflow(com.netflix.conductor.common.run.Workflow), IN_PROGRESS(com.netflix.conductor.common.metadata.tasks.Task.Status.IN_PROGRESS), Map(java.util.Map), SKIPPED(com.netflix.conductor.common.metadata.tasks.Task.Status.SKIPPED), Operation(com.netflix.conductor.common.utils.ExternalPayloadStorage.Operation), Named(javax.inject.Named), LinkedList(java.util.LinkedList), Nullable(javax.annotation.Nullable), TaskDef(com.netflix.conductor.common.metadata.tasks.TaskDef), Logger(org.slf4j.Logger), WorkflowStatus(com.netflix.conductor.common.run.Workflow.WorkflowStatus), Predicate(java.util.function.Predicate), WorkflowDef(com.netflix.conductor.common.metadata.workflow.WorkflowDef), Set(java.util.Set), Collectors(java.util.stream.Collectors), WorkflowTask(com.netflix.conductor.common.metadata.workflow.WorkflowTask), SCHEDULED(com.netflix.conductor.common.metadata.tasks.Task.Status.SCHEDULED), Monitors(com.netflix.conductor.metrics.Monitors), TERMINATE(com.netflix.conductor.common.metadata.workflow.TaskType.TERMINATE), List(java.util.List), TIMED_OUT(com.netflix.conductor.common.metadata.tasks.Task.Status.TIMED_OUT), Optional(java.util.Optional), TaskType(com.netflix.conductor.common.metadata.workflow.TaskType), VisibleForTesting(com.google.common.annotations.VisibleForTesting), Configuration(com.netflix.conductor.core.config.Configuration), PayloadType(com.netflix.conductor.common.utils.ExternalPayloadStorage.PayloadType), Collections(java.util.Collections)
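
To make the result of decide concrete, the sketch below shows roughly how a caller could act on the returned DeciderOutcome: persist the updated tasks, schedule the new ones, and complete the workflow when the decider marks it complete. This is illustrative pseudocode of the flow only; scheduleTask, updateTask and completeWorkflow are hypothetical helpers standing in for whatever queueing and persistence the real WorkflowExecutor uses, and field access is shown for brevity.

// Illustrative consumer of a DeciderService.DeciderOutcome; the helper methods are hypothetical.
public void applyOutcome(Workflow workflow, DeciderService.DeciderOutcome outcome) {
    for (Task task : outcome.tasksToBeScheduled) {
        scheduleTask(workflow, task);   // enqueue and persist the newly decided tasks
    }
    for (Task task : outcome.tasksToBeUpdated) {
        updateTask(task);               // persist status changes (timeouts, retries, executed flags)
    }
    if (outcome.isComplete) {
        completeWorkflow(workflow);     // a successful TERMINATE task, or no remaining work
    }
}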

Example 28 with TaskDef

use of com.netflix.conductor.common.metadata.tasks.TaskDef in project conductor by Netflix.

Class DoWhileTaskMapper, method getMappedTasks:

/**
 * Maps a {@link WorkflowTask} of type {@link TaskType#DO_WHILE} to a {@link Task} of type {@link SystemTaskType#DO_WHILE}
 * with a status of {@link Task.Status#IN_PROGRESS}.
 *
 * @param taskMapperContext A wrapper class containing the {@link WorkflowTask}, {@link WorkflowDef}, {@link Workflow} and a string representation of the TaskId
 * @return A {@link Task} of type {@link SystemTaskType#DO_WHILE}, returned in a List
 */
@Override
public List<Task> getMappedTasks(TaskMapperContext taskMapperContext) {
    logger.debug("TaskMapperContext {} in DoWhileTaskMapper", taskMapperContext);
    WorkflowTask taskToSchedule = taskMapperContext.getTaskToSchedule();
    Workflow workflowInstance = taskMapperContext.getWorkflowInstance();
    Task task = workflowInstance.getTaskByRefName(taskToSchedule.getTaskReferenceName());
    if (task != null && task.getStatus().isTerminal()) {
        // The loop task has already completed, so there is no need to schedule it again.
        return Collections.emptyList();
    }
    String taskId = taskMapperContext.getTaskId();
    List<Task> tasksToBeScheduled = new ArrayList<>();
    int retryCount = taskMapperContext.getRetryCount();
    TaskDef taskDefinition = Optional.ofNullable(taskMapperContext.getTaskDefinition()).orElseGet(() -> Optional.ofNullable(metadataDAO.getTaskDef(taskToSchedule.getName())).orElseGet(TaskDef::new));
    Task loopTask = new Task();
    loopTask.setTaskType(SystemTaskType.DO_WHILE.name());
    loopTask.setTaskDefName(taskToSchedule.getName());
    loopTask.setReferenceTaskName(taskToSchedule.getTaskReferenceName());
    loopTask.setWorkflowInstanceId(workflowInstance.getWorkflowId());
    loopTask.setCorrelationId(workflowInstance.getCorrelationId());
    loopTask.setWorkflowType(workflowInstance.getWorkflowName());
    loopTask.setScheduledTime(System.currentTimeMillis());
    loopTask.setTaskId(taskId);
    loopTask.setIteration(1);
    loopTask.setStatus(Task.Status.IN_PROGRESS);
    loopTask.setWorkflowTask(taskToSchedule);
    loopTask.setRateLimitPerFrequency(taskDefinition.getRateLimitPerFrequency());
    loopTask.setRateLimitFrequencyInSeconds(taskDefinition.getRateLimitFrequencyInSeconds());
    tasksToBeScheduled.add(loopTask);
    List<WorkflowTask> loopOverTasks = taskToSchedule.getLoopOver();
    List<Task> tasks2 = taskMapperContext.getDeciderService().getTasksToBeScheduled(workflowInstance, loopOverTasks.get(0), retryCount);
    tasks2.forEach(t -> {
        t.setReferenceTaskName(TaskUtils.appendIteration(t.getReferenceTaskName(), loopTask.getIteration()));
        t.setIteration(loopTask.getIteration());
    });
    tasksToBeScheduled.addAll(tasks2);
    return tasksToBeScheduled;
}
Also used : Task(com.netflix.conductor.common.metadata.tasks.Task), WorkflowTask(com.netflix.conductor.common.metadata.workflow.WorkflowTask), TaskDef(com.netflix.conductor.common.metadata.tasks.TaskDef), ArrayList(java.util.ArrayList), Workflow(com.netflix.conductor.common.run.Workflow)
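
The mapper above only runs for a workflow task of type DO_WHILE. As a point of reference, a loop task of that shape might be built as below, assuming the standard WorkflowTask setters; the task names, reference names and loop condition are illustrative.

import com.netflix.conductor.common.metadata.workflow.TaskType;
import com.netflix.conductor.common.metadata.workflow.WorkflowTask;
import java.util.Collections;

// Illustrative DO_WHILE definition: getMappedTasks schedules the loop task itself plus the first
// iteration of each task in loopOver, appending the iteration number to their reference names.
public WorkflowTask buildDoWhileTask() {
    WorkflowTask inner = new WorkflowTask();
    inner.setName("process_item");
    inner.setTaskReferenceName("process_item_ref");

    WorkflowTask doWhile = new WorkflowTask();
    doWhile.setType(TaskType.DO_WHILE.name());
    doWhile.setName("loop_task");
    doWhile.setTaskReferenceName("loop_task_ref");
    // Evaluated after each iteration by DoWhile.getEvaluatedCondition (see Example 30).
    doWhile.setLoopCondition("$.loop_task_ref['iteration'] < 3");
    doWhile.setLoopOver(Collections.singletonList(inner));
    return doWhile;
}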

Example 29 with TaskDef

use of com.netflix.conductor.common.metadata.tasks.TaskDef in project conductor by Netflix.

Class DynamicTaskMapper, method getMappedTasks:

/**
 * Maps a dynamic task to a {@link Task} based on the input parameters.
 *
 * @param taskMapperContext A wrapper class containing the {@link WorkflowTask}, {@link WorkflowDef}, {@link Workflow} and a string representation of the TaskId
 * @return A {@link List} containing a single {@link Task} with status {@link Task.Status#SCHEDULED}
 */
@Override
public List<Task> getMappedTasks(TaskMapperContext taskMapperContext) throws TerminateWorkflowException {
    logger.debug("TaskMapperContext {} in DynamicTaskMapper", taskMapperContext);
    WorkflowTask taskToSchedule = taskMapperContext.getTaskToSchedule();
    Map<String, Object> taskInput = taskMapperContext.getTaskInput();
    Workflow workflowInstance = taskMapperContext.getWorkflowInstance();
    int retryCount = taskMapperContext.getRetryCount();
    String retriedTaskId = taskMapperContext.getRetryTaskId();
    String taskNameParam = taskToSchedule.getDynamicTaskNameParam();
    String taskName = getDynamicTaskName(taskInput, taskNameParam);
    taskToSchedule.setName(taskName);
    TaskDef taskDefinition = getDynamicTaskDefinition(taskToSchedule);
    taskToSchedule.setTaskDefinition(taskDefinition);
    Map<String, Object> input = parametersUtils.getTaskInput(taskToSchedule.getInputParameters(), workflowInstance, taskDefinition, taskMapperContext.getTaskId());
    Task dynamicTask = new Task();
    dynamicTask.setStartDelayInSeconds(taskToSchedule.getStartDelay());
    dynamicTask.setTaskId(taskMapperContext.getTaskId());
    dynamicTask.setReferenceTaskName(taskToSchedule.getTaskReferenceName());
    dynamicTask.setInputData(input);
    dynamicTask.setWorkflowInstanceId(workflowInstance.getWorkflowId());
    dynamicTask.setWorkflowType(workflowInstance.getWorkflowName());
    dynamicTask.setStatus(Task.Status.SCHEDULED);
    dynamicTask.setTaskType(taskToSchedule.getType());
    dynamicTask.setTaskDefName(taskToSchedule.getName());
    dynamicTask.setCorrelationId(workflowInstance.getCorrelationId());
    dynamicTask.setScheduledTime(System.currentTimeMillis());
    dynamicTask.setRetryCount(retryCount);
    dynamicTask.setCallbackAfterSeconds(taskToSchedule.getStartDelay());
    dynamicTask.setResponseTimeoutSeconds(taskDefinition.getResponseTimeoutSeconds());
    dynamicTask.setWorkflowTask(taskToSchedule);
    dynamicTask.setTaskType(taskName);
    dynamicTask.setRetriedTaskId(retriedTaskId);
    dynamicTask.setWorkflowPriority(workflowInstance.getPriority());
    return Collections.singletonList(dynamicTask);
}
Also used : Task(com.netflix.conductor.common.metadata.tasks.Task), WorkflowTask(com.netflix.conductor.common.metadata.workflow.WorkflowTask), TaskDef(com.netflix.conductor.common.metadata.tasks.TaskDef), Workflow(com.netflix.conductor.common.run.Workflow)
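
The dynamicTaskNameParam resolved above is declared in the workflow definition, and the concrete task name is supplied through the task input at runtime. A minimal sketch of such a DYNAMIC task follows, assuming the standard WorkflowTask setters; the parameter name and values are illustrative.

import com.netflix.conductor.common.metadata.workflow.TaskType;
import com.netflix.conductor.common.metadata.workflow.WorkflowTask;

// Illustrative DYNAMIC task: getMappedTasks reads dynamicTaskNameParam ("taskToExecute"), then looks up
// the concrete task name under that key in the resolved task input.
public WorkflowTask buildDynamicTask() {
    WorkflowTask dynamic = new WorkflowTask();
    dynamic.setName("dynamic_placeholder");
    dynamic.setTaskReferenceName("dynamic_ref");
    dynamic.setType(TaskType.DYNAMIC.name());
    dynamic.setDynamicTaskNameParam("taskToExecute");
    dynamic.getInputParameters().put("taskToExecute", "${workflow.input.taskToExecute}");
    return dynamic;
}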

Example 30 with TaskDef

use of com.netflix.conductor.common.metadata.tasks.TaskDef in project conductor by Netflix.

Class DoWhile, method getEvaluatedCondition:

@VisibleForTesting
boolean getEvaluatedCondition(Workflow workflow, Task task, WorkflowExecutor workflowExecutor) throws ScriptException {
    TaskDef taskDefinition = null;
    try {
        taskDefinition = workflowExecutor.getTaskDefinition(task);
    } catch (TerminateWorkflowException e) {
    // It is ok to not have a task definition for a DO_WHILE task
    }
    Map<String, Object> taskInput = parametersUtils.getTaskInputV2(task.getWorkflowTask().getInputParameters(), workflow, task.getTaskId(), taskDefinition);
    taskInput.put(task.getReferenceTaskName(), task.getOutputData());
    List<Task> loopOver = workflow.getTasks().stream().filter(t -> (task.getWorkflowTask().has(TaskUtils.removeIterationFromTaskRefName(t.getReferenceTaskName())) && !task.getReferenceTaskName().equals(t.getReferenceTaskName()))).collect(Collectors.toList());
    for (Task loopOverTask : loopOver) {
        taskInput.put(TaskUtils.removeIterationFromTaskRefName(loopOverTask.getReferenceTaskName()), loopOverTask.getOutputData());
    }
    String condition = task.getWorkflowTask().getLoopCondition();
    boolean shouldContinue = false;
    if (condition != null) {
        logger.debug("Condition: {} is being evaluated", condition);
        // Evaluate the expression using the Nashorn-based script evaluator
        shouldContinue = ScriptEvaluator.evalBool(condition, taskInput);
    }
    return shouldContinue;
}
Also used : TaskUtils(com.netflix.conductor.common.utils.TaskUtils), TaskDef(com.netflix.conductor.common.metadata.tasks.TaskDef), Logger(org.slf4j.Logger), ScriptEvaluator(com.netflix.conductor.core.events.ScriptEvaluator), Collection(java.util.Collection), Status(com.netflix.conductor.common.metadata.tasks.Task.Status), LoggerFactory(org.slf4j.LoggerFactory), HashMap(java.util.HashMap), Task(com.netflix.conductor.common.metadata.tasks.Task), TerminateWorkflowException(com.netflix.conductor.core.execution.TerminateWorkflowException), Collectors(java.util.stream.Collectors), ParametersUtils(com.netflix.conductor.core.execution.ParametersUtils), LinkedHashMap(java.util.LinkedHashMap), List(java.util.List), Workflow(com.netflix.conductor.common.run.Workflow), Map(java.util.Map), WorkflowExecutor(com.netflix.conductor.core.execution.WorkflowExecutor), VisibleForTesting(com.google.common.annotations.VisibleForTesting), ScriptException(javax.script.ScriptException)
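
The loop condition itself is just a JavaScript boolean expression evaluated against the assembled taskInput map. The sketch below runs that evaluation in isolation with ScriptEvaluator.evalBool, mirroring how getEvaluatedCondition keys the map by task reference name; the reference name, output key and expression are illustrative.

import com.netflix.conductor.core.events.ScriptEvaluator;
import java.util.HashMap;
import java.util.Map;
import javax.script.ScriptException;

// Standalone sketch of the loop-condition evaluation; the map mirrors the taskInput built above
// (the DO_WHILE task's output plus each loop-over task's output, keyed by reference name).
public boolean evaluateLoopCondition() throws ScriptException {
    Map<String, Object> loopTaskOutput = new HashMap<>();
    loopTaskOutput.put("iteration", 2);

    Map<String, Object> taskInput = new HashMap<>();
    taskInput.put("loop_task_ref", loopTaskOutput);

    // The evaluator binds the input map to "$", so the condition can index into it.
    return ScriptEvaluator.evalBool("$.loop_task_ref['iteration'] < 3", taskInput);
}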

Aggregations

TaskDef (com.netflix.conductor.common.metadata.tasks.TaskDef): 172
Test (org.junit.Test): 128
WorkflowTask (com.netflix.conductor.common.metadata.workflow.WorkflowTask): 121
Task (com.netflix.conductor.common.metadata.tasks.Task): 77
Workflow (com.netflix.conductor.common.run.Workflow): 76
WorkflowDef (com.netflix.conductor.common.metadata.workflow.WorkflowDef): 73
HashMap (java.util.HashMap): 56
ArrayList (java.util.ArrayList): 32
ConstraintViolation (javax.validation.ConstraintViolation): 31
SubWorkflow (com.netflix.conductor.core.execution.tasks.SubWorkflow): 30
UserTask (com.netflix.conductor.tests.utils.UserTask): 28
LinkedList (java.util.LinkedList): 28
ArgumentMatchers.anyString (org.mockito.ArgumentMatchers.anyString): 27
List (java.util.List): 22
Map (java.util.Map): 19
ApplicationException (com.netflix.conductor.core.execution.ApplicationException): 18
Before (org.junit.Before): 14
ExpectedException (org.junit.rules.ExpectedException): 13
ObjectMapper (com.fasterxml.jackson.databind.ObjectMapper): 12
Collectors (java.util.stream.Collectors): 11