Example 1 with Job

use of io.fabric8.kubernetes.api.model.Job in project jointware by isdream.

the class KubernetesKeyValueStyleGeneratorTest method testKubernetesWithAllKind.

protected static void testKubernetesWithAllKind() throws Exception {
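    // Exercise the key/value style document generator against every Kubernetes resource kind covered by the test.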
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new ServiceAccount());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new ThirdPartyResource());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new ResourceQuota());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new Node());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new ConfigMap());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new NetworkPolicy());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new CustomResourceDefinition());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new Ingress());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new Service());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new Namespace());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new Secret());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new LimitRange());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new Event());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new PersistentVolume());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new StatefulSet());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new PersistentVolumeClaim());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new DaemonSet());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new HorizontalPodAutoscaler());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new Pod());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new ReplicaSet());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new Job());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new ReplicationController());
    info(KUBERNETES_KIND, KubernetesDocumentKeyValueStyleGenerator.class.getName(), new Deployment());
}
Also used : ServiceAccount(io.fabric8.kubernetes.api.model.ServiceAccount) ConfigMap(io.fabric8.kubernetes.api.model.ConfigMap) Pod(io.fabric8.kubernetes.api.model.Pod) ThirdPartyResource(io.fabric8.kubernetes.api.model.extensions.ThirdPartyResource) NetworkPolicy(io.fabric8.kubernetes.api.model.extensions.NetworkPolicy) CustomResourceDefinition(io.fabric8.kubernetes.api.model.apiextensions.CustomResourceDefinition) Node(io.fabric8.kubernetes.api.model.Node) Ingress(io.fabric8.kubernetes.api.model.extensions.Ingress) Service(io.fabric8.kubernetes.api.model.Service) Deployment(io.fabric8.kubernetes.api.model.extensions.Deployment) Namespace(io.fabric8.kubernetes.api.model.Namespace) Secret(io.fabric8.kubernetes.api.model.Secret) LimitRange(io.fabric8.kubernetes.api.model.LimitRange) ResourceQuota(io.fabric8.kubernetes.api.model.ResourceQuota) ReplicationController(io.fabric8.kubernetes.api.model.ReplicationController) HorizontalPodAutoscaler(io.fabric8.kubernetes.api.model.HorizontalPodAutoscaler) Event(io.fabric8.kubernetes.api.model.Event) PersistentVolumeClaim(io.fabric8.kubernetes.api.model.PersistentVolumeClaim) DaemonSet(io.fabric8.kubernetes.api.model.extensions.DaemonSet) PersistentVolume(io.fabric8.kubernetes.api.model.PersistentVolume) StatefulSet(io.fabric8.kubernetes.api.model.extensions.StatefulSet) Job(io.fabric8.kubernetes.api.model.Job) KubernetesDocumentKeyValueStyleGenerator(com.github.isdream.chameleon.docs.KubernetesDocumentKeyValueStyleGenerator) ReplicaSet(io.fabric8.kubernetes.api.model.extensions.ReplicaSet)
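
The test above only needs default-constructed model objects to drive the document generator. When a batch/v1 Job (the type named in the search query and in the Aggregations below) actually has to be populated, it is usually assembled with the fluent JobBuilder; the following is a minimal sketch, with the class name, job name, image, and command chosen purely for illustration:

import io.fabric8.kubernetes.api.model.batch.v1.Job;
import io.fabric8.kubernetes.api.model.batch.v1.JobBuilder;

public class BatchJobSketch {

    // Builds a minimal batch/v1 Job; all names and the container image are placeholders.
    static Job helloJob() {
        return new JobBuilder()
            .withNewMetadata().withName("hello-job").endMetadata()
            .withNewSpec()
                .withNewTemplate()
                    .withNewSpec()
                        .addNewContainer()
                            .withName("hello")
                            .withImage("busybox")
                            .withCommand("sh", "-c", "echo hello")
                        .endContainer()
                        .withRestartPolicy("Never")
                    .endSpec()
                .endTemplate()
            .endSpec()
            .build();
    }
}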

Example 2 with Job

use of io.pravega.test.system.framework.metronome.model.v1.Job in project pravega by pravega.

the class RemoteSequential method newJob.

private Job newJob(String id, String className, String methodName) {
    Map<String, String> labels = new HashMap<>(1);
    labels.put("testMethodName", methodName);
    // This can be used to set environment variables while executing the job on Metronome.
    Map<String, String> env = new HashMap<>(2);
    env.put("masterIP", System.getProperty("masterIP"));
    env.put("env2", "value102");
    Artifact art = new Artifact();
    // Metronome caches artifacts by default; disable caching for now.
    art.setCache(false);
    // The test jar is not executable.
    art.setExecutable(false);
    art.setExtract(false);
    art.setUri(System.getProperty("testArtifactUrl", "InvalidTestArtifactURL"));
    Restart restart = new Restart();
    // The tests are expected to finish in 2 minutes; this can be changed to a higher value if required.
    restart.setActiveDeadlineSeconds(120);
    restart.setPolicy("NEVER");
    Run run = new Run();
    run.setArtifacts(Collections.singletonList(art));
    run.setCmd("docker run --rm -v $(pwd):/data " + System.getProperty("dockerImageRegistry") + "/java:8 java"
            + " -DmasterIP=" + LoginClient.MESOS_MASTER
            + " -DskipServiceInstallation=" + Utils.isSkipServiceInstallationEnabled()
            + " -cp /data/pravega-test-system-" + System.getProperty("testVersion")
            + ".jar io.pravega.test.system.SingleJUnitTestRunner " + className + "#" + methodName
            + " > server.log 2>&1" + "; exit $?");
    // CPU shares.
    run.setCpus(0.5);
    // amount of memory required for running test in MB.
    run.setMem(512.0);
    run.setDisk(50.0);
    run.setEnv(env);
    run.setMaxLaunchDelay(3600);
    run.setRestart(restart);
    run.setUser("root");
    Job job = new Job();
    job.setId(id);
    job.setDescription(id);
    job.setLabels(labels);
    job.setRun(run);
    return job;
}
Also used : HashMap(java.util.HashMap) Run(io.pravega.test.system.framework.metronome.model.v1.Run) Restart(io.pravega.test.system.framework.metronome.model.v1.Restart) Job(io.pravega.test.system.framework.metronome.model.v1.Job) Artifact(io.pravega.test.system.framework.metronome.model.v1.Artifact)
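
newJob is a private helper of RemoteSequential, so it is only invoked from within that class. An invocation would look roughly like the sketch below, where the id and the test coordinates are placeholders; per the code above, the id becomes both the Metronome job id and its description, and the method name is also recorded under the testMethodName label:

Job metronomeJob = newJob("pravega-system-test-" + System.currentTimeMillis(),
        "io.pravega.test.system.SomeSystemTest", "someTestMethod");
// Assuming the usual getters on the Metronome Job model:
// metronomeJob.getId()                           -> "pravega-system-test-<timestamp>"
// metronomeJob.getLabels().get("testMethodName") -> "someTestMethod"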

Example 3 with Job

use of org.pentaho.platform.api.scheduler2.Job in project pentaho-platform by pentaho.

the class SolutionImportHandlerIT method testImportSchedules.

@Test
public void testImportSchedules() throws PlatformImportException, SchedulerException {
    SolutionImportHandler importHandler = new SolutionImportHandler(Collections.emptyList());
    importHandler = spy(importHandler);
    List<JobScheduleRequest> requests = new ArrayList<>(4);
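    // Each request is named after the state the imported job is expected to end up in:
    // schedules imported with COMPLETE or ERROR state are expected to come back as PAUSED.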
    requests.add(createJobScheduleRequest("NORMAL", JobState.NORMAL));
    requests.add(createJobScheduleRequest("PAUSED", JobState.PAUSED));
    requests.add(createJobScheduleRequest("PAUSED", JobState.COMPLETE));
    requests.add(createJobScheduleRequest("PAUSED", JobState.ERROR));
    doReturn(new ArrayList<Job>()).when(importHandler).getAllJobs(any());
    importHandler.importSchedules(requests);
    List<Job> jobs = scheduler.getJobs(job -> true);
    assertEquals(4, jobs.size());
    for (Job job : jobs) {
        assertEquals(job.getJobName(), job.getState().toString());
    }
}
Also used : ArrayList(java.util.ArrayList) Job(org.pentaho.platform.api.scheduler2.Job) JobScheduleRequest(org.pentaho.platform.web.http.api.resources.JobScheduleRequest) Test(org.junit.Test)

Example 4 with Job

use of org.pentaho.platform.api.scheduler2.Job in project pentaho-platform by pentaho.

the class PentahoPlatformExporter method exportSchedules.

protected void exportSchedules() {
    log.debug("export schedules");
    try {
        List<Job> jobs = getScheduler().getJobs(null);
        for (Job job : jobs) {
            if (job.getJobName().equals(EmbeddedVersionCheckSystemListener.VERSION_CHECK_JOBNAME)) {
                // Skip the version check job: it gets created automatically
                // if it doesn't exist and fails if you try to import it due to a null ActionClass
                continue;
            }
            try {
                JobScheduleRequest scheduleRequest = ScheduleExportUtil.createJobScheduleRequest(job);
                getExportManifest().addSchedule(scheduleRequest);
            } catch (IllegalArgumentException e) {
                log.warn(e.getMessage(), e);
            }
        }
    } catch (SchedulerException e) {
        log.error(Messages.getInstance().getString("PentahoPlatformExporter.ERROR_EXPORTING_JOBS"), e);
    }
}
Also used : SchedulerException(org.pentaho.platform.api.scheduler2.SchedulerException) Job(org.pentaho.platform.api.scheduler2.Job) JobScheduleRequest(org.pentaho.platform.web.http.api.resources.JobScheduleRequest)
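
The skip-and-convert logic above can equally be read as a filter-then-map pipeline. The sketch below is a behaviourally similar rewrite built only from identifiers already shown, except that it drops the per-job IllegalArgumentException handling and would still need to sit inside the same try/catch for SchedulerException:

getScheduler().getJobs(null).stream()
        .filter(job -> !EmbeddedVersionCheckSystemListener.VERSION_CHECK_JOBNAME.equals(job.getJobName()))
        .map(ScheduleExportUtil::createJobScheduleRequest)
        .forEach(scheduleRequest -> getExportManifest().addSchedule(scheduleRequest));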

Example 5 with Job

use of org.pentaho.platform.api.scheduler2.Job in project pentaho-platform by pentaho.

the class ScheduleExportUtilTest method testCreateJobScheduleRequest_MultipleTypesJobParam.

@Test
public void testCreateJobScheduleRequest_MultipleTypesJobParam() throws Exception {
    String jobName = "JOB";
    Long l = Long.MAX_VALUE;
    Date d = new Date();
    Boolean b = true;
    Map<String, Serializable> params = new HashMap<>();
    params.put("NumberValue", l);
    params.put("DateValue", d);
    params.put("BooleanValue", b);
    Job job = mock(Job.class);
    CronJobTrigger trigger = mock(CronJobTrigger.class);
    when(job.getJobTrigger()).thenReturn(trigger);
    when(job.getJobName()).thenReturn(jobName);
    when(job.getJobParams()).thenReturn(params);
    JobScheduleRequest jobScheduleRequest = ScheduleExportUtil.createJobScheduleRequest(job);
    for (JobScheduleParam jobScheduleParam : jobScheduleRequest.getJobParameters()) {
        assertTrue(jobScheduleParam.getValue().equals(l) || jobScheduleParam.getValue().equals(d) || jobScheduleParam.getValue().equals(b));
    }
}
Also used : JobScheduleParam(org.pentaho.platform.web.http.api.resources.JobScheduleParam) Serializable(java.io.Serializable) HashMap(java.util.HashMap) Job(org.pentaho.platform.api.scheduler2.Job) JobScheduleRequest(org.pentaho.platform.web.http.api.resources.JobScheduleRequest) Date(java.util.Date) CronJobTrigger(org.pentaho.platform.api.scheduler2.CronJobTrigger) Test(org.junit.Test)
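
The loop above only checks that each exported value is one of the three inputs. A stricter variant keys the parameters by name, assuming JobScheduleParam exposes getName() alongside the getValue() already used above:

Map<String, Object> byName = new HashMap<>();
for (JobScheduleParam param : jobScheduleRequest.getJobParameters()) {
    byName.put(param.getName(), param.getValue());
}
assertEquals(l, byName.get("NumberValue"));
assertEquals(d, byName.get("DateValue"));
assertEquals(b, byName.get("BooleanValue"));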

Aggregations

Job (org.pentaho.platform.api.scheduler2.Job) 94
Test (org.junit.Test) 92
HashMap (java.util.HashMap) 34
Job (io.fabric8.kubernetes.api.model.batch.v1.Job) 33
JobBuilder (io.fabric8.kubernetes.api.model.batch.v1.JobBuilder) 29
Serializable (java.io.Serializable) 25
ArrayList (java.util.ArrayList) 25
Test (org.junit.jupiter.api.Test) 22
SimpleJobTrigger (org.pentaho.platform.api.scheduler2.SimpleJobTrigger) 21
Job (com.google.cloud.talent.v4beta1.Job) 20
IOException (java.io.IOException) 20
KubernetesClient (io.fabric8.kubernetes.client.KubernetesClient) 19
JobScheduleRequest (org.pentaho.platform.web.http.api.resources.JobScheduleRequest) 19
ComplexJobTrigger (org.pentaho.platform.api.scheduler2.ComplexJobTrigger) 18
Pod (io.fabric8.kubernetes.api.model.Pod) 17
SchedulerException (org.pentaho.platform.api.scheduler2.SchedulerException) 17
JobServiceClient (com.google.cloud.talent.v4beta1.JobServiceClient) 16
Date (java.util.Date) 14
Job (com.google.cloud.video.transcoder.v1.Job) 13
TranscoderServiceClient (com.google.cloud.video.transcoder.v1.TranscoderServiceClient) 13
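
Since the Aggregations pair the batch/v1 Job and JobBuilder with KubernetesClient, the typical round trip looks like the fabric8 6.x-style sketch below; the namespace is a placeholder, helloJob() refers to the hypothetical builder sketch under Example 1, and older client versions would use DefaultKubernetesClient and create(job) instead of resource(job).create():

import io.fabric8.kubernetes.api.model.batch.v1.Job;
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;

public class JobRoundTripSketch {
    public static void main(String[] args) {
        try (KubernetesClient client = new KubernetesClientBuilder().build()) {
            // Build the Job as in the sketch under Example 1 (hypothetical helper).
            Job job = BatchJobSketch.helloJob();
            // Submit the Job, then list the Jobs currently present in the namespace.
            client.batch().v1().jobs().inNamespace("default").resource(job).create();
            client.batch().v1().jobs().inNamespace("default").list().getItems()
                    .forEach(j -> System.out.println(j.getMetadata().getName()));
        }
    }
}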