
Example 76 with Job

use of com.google.cloud.dataproc.v1beta2.Job in project cdap by caskdata.

In the class DataprocRuntimeJobManager, the method launch:

@Override
public void launch(RuntimeJobInfo runtimeJobInfo) throws Exception {
    String bucket = DataprocUtils.getBucketName(this.bucket);
    ProgramRunInfo runInfo = runtimeJobInfo.getProgramRunInfo();
    LOG.debug("Launching run {} with following configurations: cluster {}, project {}, region {}, bucket {}.", runInfo.getRun(), clusterName, projectId, region, bucket);
    // TODO: CDAP-16408 use fixed directory for caching twill, application, artifact jars
    File tempDir = Files.createTempDirectory("dataproc.launcher").toFile();
    // on dataproc bucket the run root will be <bucket>/cdap-job/<runid>/. All the files for this run will be copied
    // under that base dir.
    String runRootPath = getPath(DataprocUtils.CDAP_GCS_ROOT, runInfo.getRun());
    try {
        // step 1: build twill.jar and launcher.jar and add them to files to be copied to gcs
        List<LocalFile> localFiles = getRuntimeLocalFiles(runtimeJobInfo.getLocalizeFiles(), tempDir);
        // step 2: upload all the necessary files to gcs so that those files are available to dataproc job
        List<Future<LocalFile>> uploadFutures = new ArrayList<>();
        for (LocalFile fileToUpload : localFiles) {
            String targetFilePath = getPath(runRootPath, fileToUpload.getName());
            uploadFutures.add(provisionerContext.execute(() -> uploadFile(bucket, targetFilePath, fileToUpload)).toCompletableFuture());
        }
        List<LocalFile> uploadedFiles = new ArrayList<>();
        for (Future<LocalFile> uploadFuture : uploadFutures) {
            uploadedFiles.add(uploadFuture.get());
        }
        // step 3: build the hadoop job request to be submitted to dataproc
        SubmitJobRequest request = getSubmitJobRequest(runtimeJobInfo, uploadedFiles);
        // step 4: submit hadoop job to dataproc
        try {
            Job job = getJobControllerClient().submitJob(request);
            LOG.debug("Successfully submitted hadoop job {} to cluster {}.", job.getReference().getJobId(), clusterName);
        } catch (AlreadyExistsException ex) {
            // the job id already exists, ignore the job.
            LOG.warn("The dataproc job {} already exists. Ignoring resubmission of the job.", request.getJob().getReference().getJobId());
        }
        DataprocUtils.emitMetric(provisionerContext, region, "provisioner.submitJob.response.count");
    } catch (Exception e) {
        // delete all uploaded gcs files in case of exception
        DataprocUtils.deleteGCSPath(getStorageClient(), bucket, runRootPath);
        DataprocUtils.emitMetric(provisionerContext, region, "provisioner.submitJob.response.count", e);
        throw new Exception(String.format("Error while launching job %s on cluster %s", getJobId(runInfo), clusterName), e);
    } finally {
        // delete local temp directory
        deleteDirectoryContents(tempDir);
    }
}
Also used : AlreadyExistsException(com.google.api.gax.rpc.AlreadyExistsException) ApiException(com.google.api.gax.rpc.ApiException) StorageException(com.google.cloud.storage.StorageException) IOException(java.io.IOException) SubmitJobRequest(com.google.cloud.dataproc.v1beta2.SubmitJobRequest) HadoopJob(com.google.cloud.dataproc.v1beta2.HadoopJob) Job(com.google.cloud.dataproc.v1beta2.Job) ArrayList(java.util.ArrayList) Future(java.util.concurrent.Future) File(java.io.File) LocalFile(org.apache.twill.api.LocalFile) DefaultLocalFile(org.apache.twill.internal.DefaultLocalFile) ProgramRunInfo(io.cdap.cdap.runtime.spi.ProgramRunInfo)
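Step 2 above fans the file copies out as futures and then joins them all before building the job request. The same fan-out/join pattern can be sketched with plain CompletableFutures; the upload method here is a local stand-in for the GCS upload, not the CDAP helper:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ParallelUploadSketch {

    // Stand-in for uploadFile(bucket, path, file): pretend to upload and return the target URI.
    static String upload(String bucket, String path) {
        return "gs://" + bucket + "/" + path;
    }

    // Fan out one future per file, then block until every upload finishes, as in step 2.
    public static List<String> uploadAll(String bucket, List<String> paths) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<CompletableFuture<String>> futures = new ArrayList<>();
            for (String path : paths) {
                futures.add(CompletableFuture.supplyAsync(() -> upload(bucket, path), pool));
            }
            List<String> uploaded = new ArrayList<>();
            for (CompletableFuture<String> f : futures) {
                // join() blocks until this upload completes, like uploadFuture.get() above.
                uploaded.add(f.join());
            }
            return uploaded;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        List<String> uris = uploadAll("my-bucket",
                List.of("cdap-job/run1/twill.jar", "cdap-job/run1/launcher.jar"));
        System.out.println(uris);
    }
}
```

Submitting all uploads before waiting on any of them lets the copies overlap, which matters when the launcher localizes several large jars per run.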

Example 77 with Job

use of com.google.cloud.talent.v4beta1.Job in project java-docs-samples by GoogleCloudPlatform.

In the class CustomRankingSearchJobs, the method searchCustomRankingJobs:

// Search Jobs using custom rankings.
public static void searchCustomRankingJobs(String projectId, String tenantId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (JobServiceClient jobServiceClient = JobServiceClient.create()) {
        TenantName parent = TenantName.of(projectId, tenantId);
        String domain = "www.example.com";
        String sessionId = "Hashed session identifier";
        String userId = "Hashed user identifier";
        RequestMetadata requestMetadata = RequestMetadata.newBuilder().setDomain(domain).setSessionId(sessionId).setUserId(userId).build();
        SearchJobsRequest.CustomRankingInfo.ImportanceLevel importanceLevel = SearchJobsRequest.CustomRankingInfo.ImportanceLevel.EXTREME;
        String rankingExpression = "(someFieldLong + 25) * 0.25";
        SearchJobsRequest.CustomRankingInfo customRankingInfo = SearchJobsRequest.CustomRankingInfo.newBuilder().setImportanceLevel(importanceLevel).setRankingExpression(rankingExpression).build();
        String orderBy = "custom_ranking desc";
        SearchJobsRequest request = SearchJobsRequest.newBuilder().setParent(parent.toString()).setRequestMetadata(requestMetadata).setCustomRankingInfo(customRankingInfo).setOrderBy(orderBy).build();
        for (SearchJobsResponse.MatchingJob responseItem : jobServiceClient.searchJobs(request).iterateAll()) {
            System.out.format("Job summary: %s%n", responseItem.getJobSummary());
            System.out.format("Job title snippet: %s%n", responseItem.getJobTitleSnippet());
            Job job = responseItem.getJob();
            System.out.format("Job name: %s%n", job.getName());
            System.out.format("Job title: %s%n", job.getTitle());
        }
    }
}
Also used : SearchJobsRequest(com.google.cloud.talent.v4beta1.SearchJobsRequest) SearchJobsResponse(com.google.cloud.talent.v4beta1.SearchJobsResponse) TenantName(com.google.cloud.talent.v4beta1.TenantName) JobServiceClient(com.google.cloud.talent.v4beta1.JobServiceClient) Job(com.google.cloud.talent.v4beta1.Job) RequestMetadata(com.google.cloud.talent.v4beta1.RequestMetadata)
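The ranking expression "(someFieldLong + 25) * 0.25" is evaluated server-side per job, and the orderBy "custom_ranking desc" sorts results by that score. The effect can be sketched locally; someFieldLong here is a hypothetical numeric custom attribute, as in the sample:

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class CustomRankingSketch {

    // Mirrors the ranking expression "(someFieldLong + 25) * 0.25".
    static double score(long someFieldLong) {
        return (someFieldLong + 25) * 0.25;
    }

    // Sorts values by descending custom-ranking score, as "custom_ranking desc" would.
    public static List<Long> rankDesc(List<Long> someFieldValues) {
        return someFieldValues.stream()
                .sorted(Comparator.<Long>comparingDouble(v -> score(v)).reversed())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Scores are 7.0, 25.0, and 9.0, so the larger attribute values rank first.
        System.out.println(rankDesc(List.of(3L, 75L, 11L))); // prints [75, 11, 3]
    }
}
```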

Example 78 with Job

use of com.google.cloud.talent.v4beta1.Job in project java-docs-samples by GoogleCloudPlatform.

In the class JobSearchCreateJobCustomAttributes, the method createJob:

// Create Job with Custom Attributes.
public static void createJob(String projectId, String tenantId, String companyId, String requisitionId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (JobServiceClient jobServiceClient = JobServiceClient.create()) {
        TenantName parent = TenantName.of(projectId, tenantId);
        // Custom attribute can be string or numeric value, and can be filtered in search queries.
        // https://cloud.google.com/talent-solution/job-search/docs/custom-attributes
        CustomAttribute customAttribute = CustomAttribute.newBuilder().addStringValues("Internship").addStringValues("Apprenticeship").setFilterable(true).build();
        Job job = Job.newBuilder().setCompany(companyId).setTitle("Software Developer I").setDescription("This is a description of this <i>wonderful</i> job!").putCustomAttributes("FOR_STUDENTS", customAttribute).setRequisitionId(requisitionId).setLanguageCode("en-US").build();
        CreateJobRequest request = CreateJobRequest.newBuilder().setParent(parent.toString()).setJob(job).build();
        Job response = jobServiceClient.createJob(request);
        System.out.printf("Created job: %s\n", response.getName());
    }
}
Also used : TenantName(com.google.cloud.talent.v4beta1.TenantName) CustomAttribute(com.google.cloud.talent.v4beta1.CustomAttribute) JobServiceClient(com.google.cloud.talent.v4beta1.JobServiceClient) Job(com.google.cloud.talent.v4beta1.Job) CreateJobRequest(com.google.cloud.talent.v4beta1.CreateJobRequest)
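Because the FOR_STUDENTS attribute is built with setFilterable(true), search queries can filter on its values. A local sketch of that key/value filtering, using an illustrative JobStub record rather than the Talent API types:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class CustomAttributeFilterSketch {

    // A job with a map of custom attributes, each holding string values
    // (mirrors putCustomAttributes("FOR_STUDENTS", ...)).
    record JobStub(String title, Map<String, List<String>> customAttributes) {}

    // Keeps jobs whose named attribute contains the wanted value -- the kind of
    // filtering that setFilterable(true) enables server-side.
    public static List<String> filterByAttribute(List<JobStub> jobs, String key, String value) {
        return jobs.stream()
                .filter(j -> j.customAttributes().getOrDefault(key, List.of()).contains(value))
                .map(JobStub::title)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<JobStub> jobs = List.of(
                new JobStub("Software Developer I",
                        Map.of("FOR_STUDENTS", List.of("Internship", "Apprenticeship"))),
                new JobStub("Staff Engineer", Map.of()));
        System.out.println(filterByAttribute(jobs, "FOR_STUDENTS", "Internship")); // prints [Software Developer I]
    }
}
```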

Example 79 with Job

use of com.google.cloud.talent.v4beta1.Job in project java-docs-samples by GoogleCloudPlatform.

In the class JobSearchGetJob, the method getJob:

// Get Job.
public static void getJob(String projectId, String tenantId, String jobId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (JobServiceClient jobServiceClient = JobServiceClient.create()) {
        JobName name = JobName.ofProjectTenantJobName(projectId, tenantId, jobId);
        GetJobRequest request = GetJobRequest.newBuilder().setName(name.toString()).build();
        Job response = jobServiceClient.getJob(request);
        System.out.format("Job name: %s%n", response.getName());
        System.out.format("Requisition ID: %s%n", response.getRequisitionId());
        System.out.format("Title: %s%n", response.getTitle());
        System.out.format("Description: %s%n", response.getDescription());
        System.out.format("Posting language: %s%n", response.getLanguageCode());
        for (String address : response.getAddressesList()) {
            System.out.format("Address: %s%n", address);
        }
        for (String email : response.getApplicationInfo().getEmailsList()) {
            System.out.format("Email: %s%n", email);
        }
        for (String websiteUri : response.getApplicationInfo().getUrisList()) {
            System.out.format("Website: %s%n", websiteUri);
        }
    }
}
Also used : GetJobRequest(com.google.cloud.talent.v4beta1.GetJobRequest) JobName(com.google.cloud.talent.v4beta1.JobName) JobServiceClient(com.google.cloud.talent.v4beta1.JobServiceClient) Job(com.google.cloud.talent.v4beta1.Job)
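All of these samples wrap the client in try-with-resources so close() runs even when the body throws. A minimal sketch of that guarantee with a stand-in AutoCloseable (FakeClient is illustrative, not the real JobServiceClient):

```java
public class TryWithResourcesSketch {

    static boolean closed = false;

    // Minimal stand-in for JobServiceClient: implements AutoCloseable so that
    // try-with-resources calls close() automatically when the block exits.
    static class FakeClient implements AutoCloseable {
        String getJob(String name) { return "job:" + name; }
        @Override public void close() { closed = true; }
    }

    public static String fetch(String name) {
        try (FakeClient client = new FakeClient()) {
            return client.getJob(name);
        } // client.close() runs here, before fetch returns
    }

    public static void main(String[] args) {
        System.out.println(fetch("jobs/123")); // prints job:jobs/123
        System.out.println(closed);            // prints true
    }
}
```

Without the try-with-resources block, a forgotten close() would leak the client's background threads and channels.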

Example 80 with Job

use of com.google.cloud.video.transcoder.v1.Job in project java-docs-samples by GoogleCloudPlatform.

In the class CreateJobFromAdHoc, the method createJobFromAdHoc:

// Creates a job from an ad-hoc configuration.
public static void createJobFromAdHoc(String projectId, String location, String inputUri, String outputUri) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests.
    try (TranscoderServiceClient transcoderServiceClient = TranscoderServiceClient.create()) {
        VideoStream videoStream0 = VideoStream.newBuilder().setH264(H264CodecSettings.newBuilder().setBitrateBps(550000).setFrameRate(60).setHeightPixels(360).setWidthPixels(640)).build();
        VideoStream videoStream1 = VideoStream.newBuilder().setH264(H264CodecSettings.newBuilder().setBitrateBps(2500000).setFrameRate(60).setHeightPixels(720).setWidthPixels(1280)).build();
        AudioStream audioStream0 = AudioStream.newBuilder().setCodec("aac").setBitrateBps(64000).build();
        JobConfig config = JobConfig.newBuilder().addInputs(Input.newBuilder().setKey("input0").setUri(inputUri)).setOutput(Output.newBuilder().setUri(outputUri)).addElementaryStreams(ElementaryStream.newBuilder().setKey("video_stream0").setVideoStream(videoStream0)).addElementaryStreams(ElementaryStream.newBuilder().setKey("video_stream1").setVideoStream(videoStream1)).addElementaryStreams(ElementaryStream.newBuilder().setKey("audio_stream0").setAudioStream(audioStream0)).addMuxStreams(MuxStream.newBuilder().setKey("sd").setContainer("mp4").addElementaryStreams("video_stream0").addElementaryStreams("audio_stream0").build()).addMuxStreams(MuxStream.newBuilder().setKey("hd").setContainer("mp4").addElementaryStreams("video_stream1").addElementaryStreams("audio_stream0").build()).build();
        var createJobRequest = CreateJobRequest.newBuilder().setJob(Job.newBuilder().setInputUri(inputUri).setOutputUri(outputUri).setConfig(config).build()).setParent(LocationName.of(projectId, location).toString()).build();
        // Send the job creation request and process the response.
        Job job = transcoderServiceClient.createJob(createJobRequest);
        System.out.println("Job: " + job.getName());
    }
}
Also used : TranscoderServiceClient(com.google.cloud.video.transcoder.v1.TranscoderServiceClient) AudioStream(com.google.cloud.video.transcoder.v1.AudioStream) VideoStream(com.google.cloud.video.transcoder.v1.VideoStream) Job(com.google.cloud.video.transcoder.v1.Job) JobConfig(com.google.cloud.video.transcoder.v1.JobConfig)
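In the ad-hoc JobConfig above, each MuxStream references elementary streams by key: "sd" pairs video_stream0 with audio_stream0, and "hd" pairs video_stream1 with the same audio stream. A small sketch of that key-based wiring and a validity check (the record here is illustrative, not a Transcoder API type):

```java
import java.util.List;
import java.util.Set;

public class MuxWiringSketch {

    // A mux output that references elementary streams by key, like addMuxStreams above.
    record MuxStream(String key, List<String> elementaryStreams) {}

    // Checks that every elementary-stream key a mux references was actually defined,
    // mirroring how the Transcoder config ties mux streams to streams by key.
    public static boolean wiringValid(Set<String> defined, List<MuxStream> muxes) {
        return muxes.stream().allMatch(m -> defined.containsAll(m.elementaryStreams()));
    }

    public static void main(String[] args) {
        Set<String> defined = Set.of("video_stream0", "video_stream1", "audio_stream0");
        List<MuxStream> muxes = List.of(
                new MuxStream("sd", List.of("video_stream0", "audio_stream0")),
                new MuxStream("hd", List.of("video_stream1", "audio_stream0")));
        System.out.println(wiringValid(defined, muxes)); // prints true
    }
}
```

Sharing audio_stream0 between both renditions means the audio is encoded once and muxed into each container, which is the usual way to build an ABR ladder.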

Aggregations

Job (org.pentaho.platform.api.scheduler2.Job): 94
Test (org.junit.Test): 89
Job (io.fabric8.kubernetes.api.model.batch.v1.Job): 38
Serializable (java.io.Serializable): 25
ArrayList (java.util.ArrayList): 24
SimpleJobTrigger (org.pentaho.platform.api.scheduler2.SimpleJobTrigger): 21
Job (com.google.cloud.talent.v4beta1.Job): 20
HashMap (java.util.HashMap): 20
JobScheduleRequest (org.pentaho.platform.web.http.api.resources.JobScheduleRequest): 19
ComplexJobTrigger (org.pentaho.platform.api.scheduler2.ComplexJobTrigger): 18
SchedulerException (org.pentaho.platform.api.scheduler2.SchedulerException): 17
JobServiceClient (com.google.cloud.talent.v4beta1.JobServiceClient): 16
Date (java.util.Date): 14
IJobFilter (org.pentaho.platform.api.scheduler2.IJobFilter): 14
Job (com.google.cloud.video.transcoder.v1.Job): 13
TranscoderServiceClient (com.google.cloud.video.transcoder.v1.TranscoderServiceClient): 13
JobBuilder (io.fabric8.kubernetes.api.model.batch.v1.JobBuilder): 13
IJobTrigger (org.pentaho.platform.api.scheduler2.IJobTrigger): 12
Map (java.util.Map): 11
Test (org.junit.jupiter.api.Test): 10