
Example 1 with JobRunInfo

Use of com.flink.platform.dao.entity.JobRunInfo in project flink-platform-backend by itinycheng.

In class JobExecuteThread, method waitForComplete:

public StatusInfo waitForComplete(String routeUrl, JobRunInfo jobRunInfo) {
    int retryTimes = 0;
    int errorTimes = 0;
    boolean isRemote = isRemoteUrl(routeUrl);
    while (AppRunner.isRunning()) {
        try {
            StatusInfo statusInfo;
            if (isRemote) {
                HttpHeaders headers = new HttpHeaders();
                headers.setContentType(MediaType.APPLICATION_JSON);
                HttpEntity<JobRunInfo> requestEntity = new HttpEntity<>(jobRunInfo, headers);
                statusInfo = restTemplate.postForObject(routeUrl + REST_GET_STATUS, requestEntity, StatusInfo.class);
            } else {
                statusInfo = processJobStatusService.getStatus(jobRunInfo);
            }
            if (jobRunInfo.getExecMode() == STREAMING) {
                if (jobRunInfo.getCreateTime() == null) {
                    jobRunInfo.setCreateTime(LocalDateTime.now());
                }
                statusInfo = updateAndGetStreamJobStatus(statusInfo, jobRunInfo.getCreateTime());
            }
            if (statusInfo != null) {
                log.info("Job id: {}, name: {}, status: {}", jobRunInfo.getJobId(), jobRunInfo.getName(), statusInfo.getStatus());
                if (statusInfo.getStatus().isTerminalState()) {
                    return statusInfo;
                }
            }
        } catch (Exception e) {
            if (++errorTimes > errorRetries) {
                return new CustomizeStatusInfo(ERROR, LocalDateTime.now(), LocalDateTime.now());
            }
        }
        sleep(++retryTimes);
    }
    return null;
}
Also used: CustomizeStatusInfo(com.flink.platform.web.monitor.CustomizeStatusInfo) HttpHeaders(org.springframework.http.HttpHeaders) HttpEntity(org.springframework.http.HttpEntity) StatusInfo(com.flink.platform.web.monitor.StatusInfo) JobRunInfo(com.flink.platform.dao.entity.JobRunInfo)
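The loop above can be sketched in isolation: poll a status source until it reports a terminal state, tolerate a bounded number of consecutive errors, and back off between attempts. `StatusPoller`, its `Status` enum, and the `maxPolls` bound below are illustrative stand-ins for the project's types, not its actual API.

```java
import java.util.function.Supplier;

/**
 * Minimal sketch of the polling pattern in waitForComplete: keep asking a
 * status supplier until it reports a terminal state, tolerate up to
 * errorRetries failures, and back off between attempts.
 */
public class StatusPoller {
    public enum Status {
        RUNNING, SUCCESS, FAILURE, ERROR;
        boolean isTerminal() { return this != RUNNING; }
    }

    private final int errorRetries;

    public StatusPoller(int errorRetries) { this.errorRetries = errorRetries; }

    /** Polls until terminal, too many errors, or maxPolls attempts are used up. */
    public Status waitForComplete(Supplier<Status> statusSupplier, int maxPolls) {
        int retryTimes = 0;
        int errorTimes = 0;
        while (retryTimes < maxPolls) {
            try {
                Status status = statusSupplier.get();
                if (status != null && status.isTerminal()) {
                    return status;
                }
            } catch (RuntimeException e) {
                if (++errorTimes > errorRetries) {
                    return Status.ERROR; // mirrors the CustomizeStatusInfo(ERROR, ...) branch
                }
            }
            sleep(++retryTimes);
        }
        return null; // gave up without reaching a terminal state
    }

    private static void sleep(int attempt) {
        try {
            Thread.sleep(Math.min(attempt, 5) * 10L); // capped linear backoff
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

Returning `null` after the loop mirrors the original, where the caller treats a missing status as "no result yet" rather than a failure.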

Example 2 with JobRunInfo

Use of com.flink.platform.dao.entity.JobRunInfo in project flink-platform-backend by itinycheng.

In class JobExecuteThread, method call:

@Override
public JobResponse call() {
    Long jobId = jobVertex.getJobId();
    Long jobRunId = jobVertex.getJobRunId();
    try {
        // Step 1: get job info
        JobInfo jobInfo = jobInfoService.getOne(new QueryWrapper<JobInfo>().lambda().eq(JobInfo::getId, jobId).eq(JobInfo::getStatus, JobStatus.ONLINE));
        if (jobInfo == null) {
            log.warn("The job:{} no longer exists or is not in ONLINE status.", jobId);
            return new JobResponse(jobId, jobRunId, NOT_EXIST);
        }
        // Step 2: build route url, set localhost as default url if not specified.
        String routeUrl = jobInfo.getRouteUrl();
        routeUrl = HttpUtil.getUrlOrDefault(routeUrl);
        // Step 3: process job and get jobRun.
        JobRunInfo jobRunInfo;
        if (jobRunId != null) {
            jobRunInfo = jobRunInfoService.getById(jobRunId);
            log.info("Job:{} already submitted, runId = {}.", jobId, jobRunId);
        } else {
            jobRunInfo = processRemoteJob(routeUrl, jobId);
        }
        if (jobRunInfo == null) {
            log.warn("The jobRun:{} no longer exists.", jobRunId);
            return new JobResponse(jobId, jobRunId, NOT_EXIST);
        }
        // Step 4: Update jobRunId in Memory.
        jobRunId = jobRunInfo.getId();
        // Step 5: Wait for job complete and get final status.
        ExecutionStatus status = jobRunInfo.getStatus();
        if (status == null || !status.isTerminalState()) {
            StatusInfo statusInfo = waitForComplete(routeUrl, jobRunInfo);
            if (statusInfo != null) {
                status = statusInfo.getStatus();
                updateJobRunInfo(jobRunId, statusInfo.getStatus(), statusInfo.getEndTime());
            }
        }
        return new JobResponse(jobId, jobRunId, status);
    } catch (Exception e) {
        log.error("Submit job and wait for complete failed.", e);
        updateJobRunInfo(jobRunId, ERROR, LocalDateTime.now());
        return new JobResponse(jobId, jobRunId, ERROR);
    }
}
Also used: JobInfo(com.flink.platform.dao.entity.JobInfo) QueryWrapper(com.baomidou.mybatisplus.core.conditions.query.QueryWrapper) ExecutionStatus(com.flink.platform.common.enums.ExecutionStatus) StatusInfo(com.flink.platform.web.monitor.StatusInfo) CustomizeStatusInfo(com.flink.platform.web.monitor.CustomizeStatusInfo) JobRunInfo(com.flink.platform.dao.entity.JobRunInfo)
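Step 3's reuse-or-submit decision can be sketched with hypothetical stand-ins for `jobRunInfoService.getById` and `processRemoteJob`: if a `jobRunId` is already known, load that run instead of resubmitting; otherwise submit a new one. A `null` result maps to the NOT_EXIST branch, surfaced here as an empty Optional.

```java
import java.util.Optional;
import java.util.function.LongFunction;

/**
 * Sketch of the reuse-or-submit step in call(). The loader and submitter
 * functions are illustrative stand-ins for jobRunInfoService.getById and
 * processRemoteJob; Run is a simplified record, not the DAO entity.
 */
public class RunResolver {
    public record Run(long id, long jobId, boolean terminal) {}

    public static Optional<Run> resolve(Long jobRunId, long jobId,
                                        LongFunction<Run> loadById,
                                        LongFunction<Run> submit) {
        // Reuse an existing run if its id is known; otherwise submit a new one.
        Run run = (jobRunId != null) ? loadById.apply(jobRunId) : submit.apply(jobId);
        return Optional.ofNullable(run); // empty mirrors the NOT_EXIST response
    }
}
```

The point of the split is idempotency: a re-executed vertex that already holds a `jobRunId` must not submit the job a second time.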

Example 3 with JobRunInfo

Use of com.flink.platform.dao.entity.JobRunInfo in project flink-platform-backend by itinycheng.

In class InitJobFlowScheduler, method appendExistedJobFlowRunToScheduler:

public void appendExistedJobFlowRunToScheduler() {
    List<JobFlowRun> unfinishedFlowRunList = jobFlowRunService.list(new QueryWrapper<JobFlowRun>().lambda().eq(JobFlowRun::getHost, Constant.HOST_IP).in(JobFlowRun::getStatus, getNonTerminals()));
    for (JobFlowRun jobFlowRun : unfinishedFlowRunList) {
        DAG<Long, JobVertex, JobEdge> flow = jobFlowRun.getFlow();
        // Update status of JobVertex in flow.
        jobRunInfoService.list(new QueryWrapper<JobRunInfo>().lambda().eq(JobRunInfo::getFlowRunId, jobFlowRun.getId())).forEach(jobRunInfo -> {
            JobVertex vertex = flow.getVertex(jobRunInfo.getJobId());
            vertex.setJobRunId(jobRunInfo.getId());
            vertex.setJobRunStatus(jobRunInfo.getStatus());
        });
        jobFlowScheduleService.registerToScheduler(jobFlowRun);
    }
}
Also used: JobVertex(com.flink.platform.common.model.JobVertex) QueryWrapper(com.baomidou.mybatisplus.core.conditions.query.QueryWrapper) JobEdge(com.flink.platform.common.model.JobEdge) JobRunInfo(com.flink.platform.dao.entity.JobRunInfo) JobFlowRun(com.flink.platform.dao.entity.JobFlowRun)
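The restore step can be sketched with simplified stand-ins for `JobVertex` and `JobRunInfo`: for each persisted run, look up the matching vertex in the flow by job id and copy the run id and status back onto it, so the scheduler resumes from where the flow left off rather than re-running finished vertices.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * Sketch of appendExistedJobFlowRunToScheduler's restore step. Vertex and Run
 * are simplified stand-ins for JobVertex and JobRunInfo; the flow is modeled
 * as a jobId -> Vertex map instead of a full DAG.
 */
public class FlowRestore {
    public static class Vertex {
        final long jobId;
        Long runId;        // null until a run is attached
        String runStatus;  // null until a run is attached
        Vertex(long jobId) { this.jobId = jobId; }
    }

    public static class Run {
        final long id;
        final long jobId;
        final String status;
        Run(long id, long jobId, String status) {
            this.id = id; this.jobId = jobId; this.status = status;
        }
    }

    /** Copies persisted run state onto the matching vertices of the flow. */
    public static void restore(Map<Long, Vertex> flow, List<Run> persistedRuns) {
        for (Run run : persistedRuns) {
            Vertex vertex = flow.get(run.jobId);
            if (vertex != null) {
                vertex.runId = run.id;
                vertex.runStatus = run.status;
            }
        }
    }
}
```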

Example 4 with JobRunInfo

Use of com.flink.platform.dao.entity.JobRunInfo in project flink-platform-backend by itinycheng.

In class ProcessJobService, method processJob:

public JobRunInfo processJob(final long jobId, final long flowRunId) throws Exception {
    JobCommand jobCommand = null;
    JobInfo jobInfo = null;
    try {
        // step 1: get job info
        jobInfo = jobInfoService.getOne(new QueryWrapper<JobInfo>().lambda().eq(JobInfo::getId, jobId).eq(JobInfo::getStatus, JobStatus.ONLINE));
        if (jobInfo == null) {
            throw new JobCommandGenException(String.format("The job: %s no longer exists or is in deleted status.", jobId));
        }
        // step 2: replace variables in the sql statement
        JobInfo finalJobInfo = jobInfo;
        Map<String, Object> variableMap = Arrays.stream(SqlVar.values()).filter(sqlVar -> sqlVar.type == SqlVar.VarType.VARIABLE).filter(sqlVar -> finalJobInfo.getSubject().contains(sqlVar.variable)).map(sqlVar -> Pair.of(sqlVar.variable, sqlVar.valueProvider.apply(finalJobInfo))).collect(toMap(Pair::getLeft, Pair::getRight));
        MapUtils.emptyIfNull(finalJobInfo.getVariables()).forEach((name, value) -> {
            SqlVar sqlVar = SqlVar.matchPrefix(name);
            variableMap.put(name, sqlVar.valueProvider.apply(value));
        });
        // replace variable with actual value
        for (Map.Entry<String, Object> entry : variableMap.entrySet()) {
            String originSubject = jobInfo.getSubject();
            String distSubject = originSubject.replace(entry.getKey(), entry.getValue().toString());
            jobInfo.setSubject(distSubject);
        }
        JobType jobType = jobInfo.getType();
        String version = jobInfo.getVersion();
        // step 3: build job command, create a SqlContext if needed
        jobCommand = jobCommandBuilders.stream().filter(builder -> builder.isSupported(jobType, version)).findFirst().orElseThrow(() -> new JobCommandGenException("No available job command builder")).buildCommand(jobInfo);
        // step 4: submit job
        LocalDateTime submitTime = LocalDateTime.now();
        String commandString = jobCommand.toCommandString();
        JobCallback callback = jobCommandExecutors.stream().filter(executor -> executor.isSupported(jobType)).findFirst().orElseThrow(() -> new JobCommandGenException("No available job command executor")).execCommand(commandString);
        // step 5: write job run info to db
        ExecutionStatus executionStatus = getExecutionStatus(jobType, callback);
        JobRunInfo jobRunInfo = new JobRunInfo();
        jobRunInfo.setName(jobInfo.getName() + "-" + System.currentTimeMillis());
        jobRunInfo.setJobId(jobInfo.getId());
        jobRunInfo.setFlowRunId(flowRunId);
        jobRunInfo.setDeployMode(jobInfo.getDeployMode());
        jobRunInfo.setExecMode(jobInfo.getExecMode());
        jobRunInfo.setSubject(jobInfo.getSubject());
        jobRunInfo.setStatus(executionStatus);
        jobRunInfo.setVariables(JsonUtil.toJsonString(variableMap));
        jobRunInfo.setBackInfo(JsonUtil.toJsonString(callback));
        jobRunInfo.setSubmitTime(submitTime);
        if (executionStatus.isTerminalState()) {
            jobRunInfo.setStopTime(LocalDateTime.now());
        }
        jobRunInfoService.save(jobRunInfo);
        // step 6: print job command info
        log.info("Job: {} submitted, time: {}", jobId, System.currentTimeMillis());
        return jobRunInfo;
    } finally {
        if (jobInfo != null && jobInfo.getType() == JobType.FLINK_SQL && jobCommand != null) {
            try {
                FlinkCommand flinkCommand = (FlinkCommand) jobCommand;
                if (flinkCommand.getMainArgs() != null) {
                    Files.deleteIfExists(Paths.get(flinkCommand.getMainArgs()));
                }
            } catch (Exception e) {
                log.warn("Delete sql context file failed", e);
            }
        }
    }
}
Also used: Arrays(java.util.Arrays) JsonUtil(com.flink.platform.common.util.JsonUtil) FlinkCommand(com.flink.platform.web.command.FlinkCommand) JobCallback(com.flink.platform.web.command.JobCallback) LocalDateTime(java.time.LocalDateTime) Autowired(org.springframework.beans.factory.annotation.Autowired) JobInfoService(com.flink.platform.dao.service.JobInfoService) SqlVar(com.flink.platform.web.enums.SqlVar) Pair(org.apache.commons.lang3.tuple.Pair) Collectors.toMap(java.util.stream.Collectors.toMap) Service(org.springframework.stereotype.Service) Map(java.util.Map) SUCCESS(com.flink.platform.common.enums.ExecutionStatus.SUCCESS) JobStatus(com.flink.platform.common.enums.JobStatus) CommandBuilder(com.flink.platform.web.command.CommandBuilder) JobType(com.flink.platform.common.enums.JobType) MapUtils(org.apache.commons.collections4.MapUtils) QueryWrapper(com.baomidou.mybatisplus.core.conditions.query.QueryWrapper) JobInfo(com.flink.platform.dao.entity.JobInfo) Files(java.nio.file.Files) JobRunInfoService(com.flink.platform.dao.service.JobRunInfoService) JobRunInfo(com.flink.platform.dao.entity.JobRunInfo) CommandExecutor(com.flink.platform.web.command.CommandExecutor) Slf4j(lombok.extern.slf4j.Slf4j) List(java.util.List) ExecutionStatus(com.flink.platform.common.enums.ExecutionStatus) JobCommand(com.flink.platform.web.command.JobCommand) Paths(java.nio.file.Paths) JobCommandGenException(com.flink.platform.common.exception.JobCommandGenException)
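The variable-replacement step (step 2) boils down to a plain string replace of each variable over the job subject, exactly as the loop in the original does. A minimal sketch, with `${current_date}` as an illustrative placeholder rather than the project's actual SqlVar syntax:

```java
import java.util.Map;

/**
 * Sketch of processJob's step 2: replace each collected variable with its
 * actual value in the SQL subject. Placeholder syntax is illustrative; the
 * real keys come from SqlVar and the job's own variables map.
 */
public class SubjectRenderer {
    public static String render(String subject, Map<String, Object> variables) {
        String result = subject;
        for (Map.Entry<String, Object> entry : variables.entrySet()) {
            // Plain literal replace, as in the original; no regex semantics.
            result = result.replace(entry.getKey(), String.valueOf(entry.getValue()));
        }
        return result;
    }
}
```

Because `String.replace` takes the key literally, keys containing regex metacharacters need no escaping, which is one reason the original avoids `replaceAll`.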

Example 5 with JobRunInfo

Use of com.flink.platform.dao.entity.JobRunInfo in project flink-platform-backend by itinycheng.

In class StatusRunner, method execute:

@Override
public void execute(JobExecutionContext context) {
    List<JobRunInfo> jobRunList = jobRunInfoService.list(new QueryWrapper<JobRunInfo>().lambda().in(JobRunInfo::getStatus, NON_TERMINAL_STATUS_LIST));
    Map<String, List<JobRunInfo>> groupedJobRunList = jobRunList.stream().collect(groupingBy(jobRunInfo -> StringUtils.defaultString(jobRunInfo.getRouteUrl())));
    for (Entry<String, List<JobRunInfo>> entry : groupedJobRunList.entrySet()) {
        String routeUrl = HttpUtil.getUrlOrDefault(entry.getKey());
        List<Long> ids = entry.getValue().stream().map(JobRunInfo::getId).collect(toList());
        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.APPLICATION_JSON);
        HttpEntity<List<Long>> requestEntity = new HttpEntity<>(ids, headers);
        ResultInfo<Object> response = restTemplate.exchange(routeUrl + REST_UPDATE_STATUS, HttpMethod.POST, requestEntity, new ParameterizedTypeReference<ResultInfo<Object>>() {
        }).getBody();
        log.info("The job run ids: {} are processed, result: {}", ids, response);
    }
}
Also used: QueryWrapper(com.baomidou.mybatisplus.core.conditions.query.QueryWrapper) JobExecutionContext(org.quartz.JobExecutionContext) ParameterizedTypeReference(org.springframework.core.ParameterizedTypeReference) HttpHeaders(org.springframework.http.HttpHeaders) SpringContext(com.flink.platform.web.common.SpringContext) MediaType(org.springframework.http.MediaType) HttpMethod(org.springframework.http.HttpMethod) Collectors.groupingBy(java.util.stream.Collectors.groupingBy) Job(org.quartz.Job) StringUtils(org.apache.commons.lang3.StringUtils) JobRunInfoService(com.flink.platform.dao.service.JobRunInfoService) JobRunInfo(com.flink.platform.dao.entity.JobRunInfo) HttpUtil(com.flink.platform.web.util.HttpUtil) Slf4j(lombok.extern.slf4j.Slf4j) HttpEntity(org.springframework.http.HttpEntity) List(java.util.List) Collectors.toList(java.util.stream.Collectors.toList) ExecutionStatus(com.flink.platform.common.enums.ExecutionStatus) Map(java.util.Map) Entry(java.util.Map.Entry) ResultInfo(com.flink.platform.web.entity.response.ResultInfo) RestTemplate(org.springframework.web.client.RestTemplate)
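The grouping step can be sketched on its own: partition unfinished runs by route URL, mapping a missing URL to the empty string as `StringUtils.defaultString` does, so that each backend receives a single batched list of ids. `Run` here is a simplified record, not the DAO entity.

```java
import java.util.List;
import java.util.Map;
import static java.util.stream.Collectors.groupingBy;
import static java.util.stream.Collectors.mapping;
import static java.util.stream.Collectors.toList;

/**
 * Sketch of StatusRunner's grouping step: one id batch per route URL, with
 * null URLs collapsed to "" (the default-route bucket).
 */
public class RunGrouper {
    public record Run(long id, String routeUrl) {}

    public static Map<String, List<Long>> groupIdsByRoute(List<Run> runs) {
        return runs.stream().collect(groupingBy(
                run -> run.routeUrl() == null ? "" : run.routeUrl(), // defaultString behavior
                mapping(Run::id, toList())));
    }
}
```

Batching by route URL keeps the status-update traffic to one HTTP POST per backend instead of one per job run.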

Aggregations

JobRunInfo (com.flink.platform.dao.entity.JobRunInfo): 6 usages
QueryWrapper (com.baomidou.mybatisplus.core.conditions.query.QueryWrapper): 4 usages
ExecutionStatus (com.flink.platform.common.enums.ExecutionStatus): 3 usages
JobInfo (com.flink.platform.dao.entity.JobInfo): 2 usages
JobRunInfoService (com.flink.platform.dao.service.JobRunInfoService): 2 usages
CustomizeStatusInfo (com.flink.platform.web.monitor.CustomizeStatusInfo): 2 usages
StatusInfo (com.flink.platform.web.monitor.StatusInfo): 2 usages
List (java.util.List): 2 usages
Map (java.util.Map): 2 usages
Slf4j (lombok.extern.slf4j.Slf4j): 2 usages
HttpEntity (org.springframework.http.HttpEntity): 2 usages
HttpHeaders (org.springframework.http.HttpHeaders): 2 usages
SUCCESS (com.flink.platform.common.enums.ExecutionStatus.SUCCESS): 1 usage
JobStatus (com.flink.platform.common.enums.JobStatus): 1 usage
JobType (com.flink.platform.common.enums.JobType): 1 usage
JobCommandGenException (com.flink.platform.common.exception.JobCommandGenException): 1 usage
JobEdge (com.flink.platform.common.model.JobEdge): 1 usage
JobVertex (com.flink.platform.common.model.JobVertex): 1 usage
JsonUtil (com.flink.platform.common.util.JsonUtil): 1 usage
JobFlowRun (com.flink.platform.dao.entity.JobFlowRun): 1 usage