Search in sources:

Example 1 with JobConfiguration

Use of org.pentaho.di.job.JobConfiguration in project pentaho-kettle by pentaho.

From the class RegisterJobServlet, method generateBody:

@Override
WebResult generateBody(HttpServletRequest request, HttpServletResponse response, boolean useXML) throws IOException, KettleException {
    final String xml = IOUtils.toString(request.getInputStream());
    // Parse the XML, create a job configuration
    JobConfiguration jobConfiguration = JobConfiguration.fromXML(xml);
    Job job = createJob(jobConfiguration);
    String message = "Job '" + job.getJobname() + "' was added to the list with id " + job.getContainerObjectId();
    return new WebResult(WebResult.STRING_OK, message, job.getContainerObjectId());
}
Also used : Job(org.pentaho.di.job.Job) JobConfiguration(org.pentaho.di.job.JobConfiguration)
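
The servlet above only parses what the client sends: a job_configuration XML document in the request body. Below is a minimal client-side sketch, not taken from the servlet source, of POSTing such a document to a Carte server with plain HttpURLConnection; the host, port, endpoint path and the cluster/cluster credentials are illustrative assumptions.

// Client-side sketch (assumed host, port, endpoint path and credentials).
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RegisterJobClientSketch {
    public static void main(String[] args) throws Exception {
        // args[0]: the <job_configuration>...</job_configuration> document to register.
        String jobConfigurationXml = args[0];
        URL url = new URL("http://localhost:8081/kettle/registerJob/"); // assumed endpoint path
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
        // Default Carte credentials are an assumption; replace with your own.
        String auth = Base64.getEncoder().encodeToString("cluster:cluster".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        try (OutputStream body = conn.getOutputStream()) {
            body.write(jobConfigurationXml.getBytes(StandardCharsets.UTF_8));
        }
        // The response body is the WebResult XML built by generateBody() above.
        System.out.println("HTTP " + conn.getResponseCode());
    }
}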

Example 2 with JobConfiguration

Use of org.pentaho.di.job.JobConfiguration in project pentaho-kettle by pentaho.

From the class RegisterPackageServlet, method generateBody:

@Override
WebResult generateBody(HttpServletRequest request, HttpServletResponse response, boolean useXML) throws KettleException {
    String archiveUrl = copyRequestToDirectory(request, createTempDirString());
    // the resource to load
    String load = request.getParameter(PARAMETER_LOAD);
    String zipBaseUrl = extract(archiveUrl);
    if (!Utils.isEmpty(load)) {
        String fileUrl = getStartFileUrl(zipBaseUrl, load);
        String resultId;
        if (isJob(request)) {
            Node node = getConfigNode(zipBaseUrl, Job.CONFIGURATION_IN_EXPORT_FILENAME, JobExecutionConfiguration.XML_TAG);
            JobExecutionConfiguration jobExecutionConfiguration = new JobExecutionConfiguration(node);
            JobMeta jobMeta = new JobMeta(fileUrl, jobExecutionConfiguration.getRepository());
            JobConfiguration jobConfiguration = new JobConfiguration(jobMeta, jobExecutionConfiguration);
            Job job = createJob(jobConfiguration);
            resultId = job.getContainerObjectId();
        } else {
            Node node = getConfigNode(zipBaseUrl, Trans.CONFIGURATION_IN_EXPORT_FILENAME, TransExecutionConfiguration.XML_TAG);
            TransExecutionConfiguration transExecutionConfiguration = new TransExecutionConfiguration(node);
            TransMeta transMeta = new TransMeta(fileUrl, transExecutionConfiguration.getRepository());
            TransConfiguration transConfiguration = new TransConfiguration(transMeta, transExecutionConfiguration);
            Trans trans = createTrans(transConfiguration);
            resultId = trans.getContainerObjectId();
        }
        // zip file no longer needed, contents were extracted
        deleteArchive(archiveUrl);
        return new WebResult(WebResult.STRING_OK, fileUrl, resultId);
    }
    return null;
}
Also used : TransExecutionConfiguration(org.pentaho.di.trans.TransExecutionConfiguration) JobMeta(org.pentaho.di.job.JobMeta) Node(org.w3c.dom.Node) TransMeta(org.pentaho.di.trans.TransMeta) Job(org.pentaho.di.job.Job) JobExecutionConfiguration(org.pentaho.di.job.JobExecutionConfiguration) TransConfiguration(org.pentaho.di.trans.TransConfiguration) Trans(org.pentaho.di.trans.Trans) JobConfiguration(org.pentaho.di.job.JobConfiguration)
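
RegisterPackageServlet consumes a zip archive: it extracts it, reads the execution configuration stored under Job.CONFIGURATION_IN_EXPORT_FILENAME (or the transformation equivalent), and loads the entry named by the load parameter. The sketch below shows one way a client might assemble such an archive; the entry name dummy_job.kjb and the helper itself are illustrative assumptions, not part of the servlet.

// Sketch of building the zip package the servlet expects (assumed file names).
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

import org.pentaho.di.job.Job;

public class CartePackageSketch {
    public static void buildPackage(String kjbPath, String executionConfigXml, String zipPath) throws Exception {
        try (ZipOutputStream zip = new ZipOutputStream(new FileOutputStream(zipPath))) {
            // The job itself; the entry name is what the load request parameter should reference.
            zip.putNextEntry(new ZipEntry("dummy_job.kjb"));
            try (FileInputStream in = new FileInputStream(kjbPath)) {
                in.transferTo(zip);
            }
            zip.closeEntry();
            // The execution configuration, stored under the file name the servlet looks for.
            zip.putNextEntry(new ZipEntry(Job.CONFIGURATION_IN_EXPORT_FILENAME));
            zip.write(executionConfigXml.getBytes(StandardCharsets.UTF_8));
            zip.closeEntry();
        }
    }
}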

Example 3 with JobConfiguration

Use of org.pentaho.di.job.JobConfiguration in project pentaho-kettle by pentaho.

From the class RunJobServlet, method doGet:

/**
 * <div id="mindtouch">
 *    <h1>/kettle/runJob</h1>
 *    <a name="GET"></a>
 *    <h2>GET</h2>
 *    <p>Executes a job from the enterprise repository. The repository should be configured in the Carte XML file.
 *  The response contains an <code>ERROR</code> result if an error happened during job execution.</p>
 *
 *    <p><b>Example Request:</b><br />
 *    <pre function="syntax.xml">
 *    GET /kettle/runJob?job=home%2Fadmin%2Fdummy_job&level=Debug
 *    </pre>
 *
 *    </p>
 *    <h3>Parameters</h3>
 *    <table class="pentaho-table">
 *    <tbody>
 *    <tr>
 *      <th>name</th>
 *      <th>description</th>
 *      <th>type</th>
 *    </tr>
 *    <tr>
 *    <td>job</td>
 *    <td>Full path to the job in repository.</td>
 *    <td>query</td>
 *    </tr>
 *    <tr>
 *    <td>level</td>
 *    <td>Logging level to be used for job execution (e.g. Debug).</td>
 *    <td>query</td>
 *    </tr>
 *    </tbody>
 *    </table>
 *
 *  <h3>Response Body</h3>
 *
 *  <table class="pentaho-table">
 *    <tbody>
 *      <tr>
 *        <td align="right">element:</td>
 *        <td>(custom)</td>
 *      </tr>
 *      <tr>
 *        <td align="right">media types:</td>
 *        <td>text/xml</td>
 *      </tr>
 *    </tbody>
 *  </table>
 *    <p>Response contains result of the operation. It is either <code>OK</code> or <code>ERROR</code>.
 *     If an error occurred during job execution, response also contains information about the error.</p>
 *
 *    <p><b>Example Response:</b></p>
 *    <pre function="syntax.xml">
 *    <webresult>
 *      <result>OK</result>
 *      <message>Job started</message>
 *      <id>05d919b0-74a3-48d6-84d8-afce359d0449</id>
 *    </webresult>
 *    </pre>
 *
 *    <h3>Status Codes</h3>
 *    <table class="pentaho-table">
 *  <tbody>
 *    <tr>
 *      <th>code</th>
 *      <th>description</th>
 *    </tr>
 *    <tr>
 *      <td>200</td>
 *      <td>Request was processed.</td>
 *    </tr>
 *    <tr>
 *      <td>400</td>
 *      <td>Bad Request: Mandatory parameter job missing</td>
 *    </tr>
 *    <tr>
 *      <td>401</td>
 *      <td>Unauthorized access to the repository</td>
 *    </tr>
 *    <tr>
 *      <td>404</td>
 *      <td>Not found: Job not found</td>
 *    </tr>
 *    <tr>
 *      <td>500</td>
 *      <td>Internal server error occurs during request processing.</td>
 *    </tr>
 *  </tbody>
 *</table>
 *</div>
 */
public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    if (isJettyMode() && !request.getContextPath().startsWith(CONTEXT_PATH)) {
        return;
    }
    if (log.isDebug()) {
        logDebug(BaseMessages.getString(PKG, "RunJobServlet.Log.RunJobRequested"));
    }
    // Options taken from PAN
    // 
    String[] knownOptions = new String[] { "job", "level" };
    String transOption = request.getParameter("job");
    String levelOption = request.getParameter("level");
    response.setStatus(HttpServletResponse.SC_OK);
    PrintWriter out = response.getWriter();
    SlaveServerConfig serverConfig = transformationMap.getSlaveServerConfig();
    try {
        Repository slaveServerRepository = serverConfig.getRepository();
        if (slaveServerRepository == null || !slaveServerRepository.isConnected()) {
            response.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
            out.println(new WebResult(WebResult.STRING_ERROR, BaseMessages.getString(PKG, "RunJobServlet.Error.UnableToConnectToRepository", serverConfig.getRepositoryId())));
            return;
        }
        if (transOption == null) {
            response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
            out.println(new WebResult(WebResult.STRING_ERROR, BaseMessages.getString(PKG, "RunJobServlet.Error.MissingMandatoryParameterJob")));
            return;
        }
        final JobMeta jobMeta = loadJob(slaveServerRepository, transOption);
        // Set the servlet parameters as variables in the job
        // 
        String[] parameters = jobMeta.listParameters();
        Enumeration<?> parameterNames = request.getParameterNames();
        while (parameterNames.hasMoreElements()) {
            String parameter = (String) parameterNames.nextElement();
            String[] values = request.getParameterValues(parameter);
            // Skip the known options (job, level); everything else is passed on to the job.
            if (Const.indexOfString(parameter, knownOptions) < 0) {
                // If the job does not declare it as a named parameter, set it as a variable instead.
                if (Const.indexOfString(parameter, parameters) < 0) {
                    jobMeta.setVariable(parameter, values[0]);
                } else {
                    jobMeta.setParameterValue(parameter, values[0]);
                }
            }
        }
        JobExecutionConfiguration jobExecutionConfiguration = new JobExecutionConfiguration();
        if (levelOption != null && !isValidLogLevel(levelOption)) {
            response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
            out.println(new WebResult(WebResult.STRING_ERROR, BaseMessages.getString(PKG, "RunJobServlet.Error.InvalidLogLevel")));
            return;
        }
        LogLevel logLevel = LogLevel.getLogLevelForCode(levelOption);
        jobExecutionConfiguration.setLogLevel(logLevel);
        // Create new repository connection for this job
        // 
        final Repository repository = jobExecutionConfiguration.connectRepository(serverConfig.getRepositoryId(), serverConfig.getRepositoryUsername(), serverConfig.getRepositoryPassword());
        JobConfiguration jobConfiguration = new JobConfiguration(jobMeta, jobExecutionConfiguration);
        String carteObjectId = UUID.randomUUID().toString();
        SimpleLoggingObject servletLoggingObject = new SimpleLoggingObject(CONTEXT_PATH, LoggingObjectType.CARTE, null);
        servletLoggingObject.setContainerObjectId(carteObjectId);
        servletLoggingObject.setLogLevel(logLevel);
        // Create the job and store it in the list...
        // 
        final Job job = new Job(repository, jobMeta, servletLoggingObject);
        // Setting variables
        // 
        job.initializeVariablesFrom(null);
        job.getJobMeta().setInternalKettleVariables(job);
        job.injectVariables(jobConfiguration.getJobExecutionConfiguration().getVariables());
        // Also copy the parameters over...
        // 
        job.copyParametersFrom(jobMeta);
        job.clearParameters();
        /*
       * String[] parameterNames = job.listParameters(); for (int idx = 0; idx < parameterNames.length; idx++) { // Grab
       * the parameter value set in the job entry // String thisValue =
       * jobExecutionConfiguration.getParams().get(parameterNames[idx]); if (!Utils.isEmpty(thisValue)) { // Set the
       * value as specified by the user in the job entry // jobMeta.setParameterValue(parameterNames[idx], thisValue); }
       * }
       */
        jobMeta.activateParameters();
        job.setSocketRepository(getSocketRepository());
        JobMap jobMap = getJobMap();
        jobMap.addJob(job.getJobname(), carteObjectId, job, jobConfiguration);
        // Disconnect from the job's repository when the job finishes.
        // 
        job.addJobListener(new JobAdapter() {

            public void jobFinished(Job job) {
                repository.disconnect();
            }
        });
        String message = "Job '" + job.getJobname() + "' was added to the list with id " + carteObjectId;
        logBasic(message);
        try {
            runJob(job);
            WebResult webResult = new WebResult(WebResult.STRING_OK, "Job started", carteObjectId);
            out.println(webResult.getXML());
            out.flush();
        } catch (Exception executionException) {
            response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
            String logging = KettleLogStore.getAppender().getBuffer(job.getLogChannelId(), false).toString();
            out.println(new WebResult(WebResult.STRING_ERROR, BaseMessages.getString(PKG, "RunJobServlet.Error.ErrorExecutingJob", serverConfig.getRepositoryId(), logging)));
        }
    } catch (IdNotFoundException idEx) {
        response.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
        out.println(new WebResult(WebResult.STRING_ERROR, BaseMessages.getString(PKG, "RunJobServlet.Error.UnableToRunJob", serverConfig.getRepositoryId())));
    } catch (Exception ex) {
        if (ex.getMessage().contains(UNAUTHORIZED_ACCESS_TO_REPOSITORY)) {
            response.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
            out.println(new WebResult(WebResult.STRING_ERROR, BaseMessages.getString(PKG, "RunJobServlet.Error.UnableToConnectToRepository", serverConfig.getRepositoryId())));
            return;
        } else if (ex.getMessage().contains(UNABLE_TO_LOAD_JOB)) {
            response.setStatus(HttpServletResponse.SC_NOT_FOUND);
            out.println(new WebResult(WebResult.STRING_ERROR, BaseMessages.getString(PKG, "RunJobServlet.Error.UnableToFindJob", serverConfig.getRepositoryId())));
            return;
        }
        response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
        out.println(new WebResult(WebResult.STRING_ERROR, BaseMessages.getString(PKG, "RunJobServlet.Error.UnexpectedError", Const.CR + Const.getStackTracker(ex))));
    }
}
Also used : JobMeta(org.pentaho.di.job.JobMeta) SimpleLoggingObject(org.pentaho.di.core.logging.SimpleLoggingObject) JobExecutionConfiguration(org.pentaho.di.job.JobExecutionConfiguration) JobAdapter(org.pentaho.di.job.JobAdapter) LogLevel(org.pentaho.di.core.logging.LogLevel) ServletException(javax.servlet.ServletException) KettleException(org.pentaho.di.core.exception.KettleException) IOException(java.io.IOException) IdNotFoundException(org.pentaho.di.core.exception.IdNotFoundException) Repository(org.pentaho.di.repository.Repository) Job(org.pentaho.di.job.Job) JobConfiguration(org.pentaho.di.job.JobConfiguration) IdNotFoundException(org.pentaho.di.core.exception.IdNotFoundException) PrintWriter(java.io.PrintWriter)
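
The method above accepts the documented job and level query parameters and treats every other request parameter as a named parameter or variable of the job. Below is a client-side sketch of the documented GET call; the host, port and cluster/cluster credentials are assumptions, and PARAM_DATE is a hypothetical extra parameter picked up by the loop over getParameterNames().

// Client-side sketch of GET /kettle/runJob (assumed host, port and credentials).
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RunJobClientSketch {
    public static void main(String[] args) throws Exception {
        String jobPath = URLEncoder.encode("home/admin/dummy_job", StandardCharsets.UTF_8);
        String query = "job=" + jobPath + "&level=Debug&PARAM_DATE=2024-01-01";
        URL url = new URL("http://localhost:8081/kettle/runJob/?" + query);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String auth = Base64.getEncoder().encodeToString("cluster:cluster".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth); // assumed default credentials
        try (InputStream in = conn.getInputStream()) {
            // The body is the <webresult> XML shown in the javadoc above.
            System.out.println(new String(in.readAllBytes(), StandardCharsets.UTF_8));
        }
    }
}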

Example 4 with JobConfiguration

Use of org.pentaho.di.job.JobConfiguration in project pentaho-kettle by pentaho.

From the class AddExportServlet, method doGet:

/**
 *    <div id="mindtouch">
 *    <h1>/kettle/addExport</h1>
 *    <a name="POST"></a>
 *    <h2>POST</h2>
 *    <p>Uploads and executes a previously exported job or transformation.
 *    The uploaded zip file contains the job or transformation to be executed, and the method relies
 *    on the input parameters to find the entity to execute. The archive is transferred within the
 *    request body.
 *
 *    The <code>file url of the executed entity</code> will be returned in the response object,
 *    or a <code>message</code> describing the error that occurred. To determine whether the call
 *    was successful, rely on the <code>result</code> parameter in the response.</p>
 *
 *    <p><b>Example Request:</b><br />
 *    <pre function="syntax.xml">
 *    POST /kettle/addExport/?type=job&load=dummy_job.kjb
 *    </pre>
 *    Request body should contain zip file prepared for Carte execution.
 *    </p>
 *    <h3>Parameters</h3>
 *    <table class="pentaho-table">
 *    <tbody>
 *    <tr>
 *      <th>name</th>
 *      <th>description</th>
 *      <th>type</th>
 *    </tr>
 *    <tr>
 *    <td>type</td>
 *    <td>The type of the entity to be executed either <code>job</code> or <code>trans</code>.</td>
 *    <td>query</td>
 *    </tr>
 *    <tr>
 *    <td>load</td>
 *    <td>The name of the entity within archive to be executed.</td>
 *    <td>query</td>
 *    </tr>
 *    </tbody>
 *    </table>
 *
 *  <h3>Response Body</h3>
 *
 *  <table class="pentaho-table">
 *    <tbody>
 *      <tr>
 *        <td align="right">element:</td>
 *        <td>(custom)</td>
 *      </tr>
 *      <tr>
 *        <td align="right">media types:</td>
 *        <td>application/xml</td>
 *      </tr>
 *    </tbody>
 *  </table>
 *    <p>Response wraps file url of the entity that was executed or error stack trace if an error occurred.
 *     Response has <code>result</code> OK if there were no errors. Otherwise it returns ERROR.</p>
 *
 *    <p><b>Example Response:</b></p>
 *    <pre function="syntax.xml">
 *    <?xml version="1.0" encoding="UTF-8"?>
 *    <webresult>
 *      <result>OK</result>
 *      <message>zip&#x3a;file&#x3a;&#x2f;&#x2f;&#x2f;temp&#x2f;export_ee2a67de-6a72-11e4-82c0-4701a2bac6a5.zip&#x21;dummy_job.kjb</message>
 *      <id>74cf4219-c881-4633-a71a-2ed16b7db7b8</id>
 *    </webresult>
 *    </pre>
 *
 *    <h3>Status Codes</h3>
 *    <table class="pentaho-table">
 *  <tbody>
 *    <tr>
 *      <th>code</th>
 *      <th>description</th>
 *    </tr>
 *    <tr>
 *      <td>200</td>
 *      <td>Request was processed and XML response is returned.</td>
 *    </tr>
 *    <tr>
 *      <td>500</td>
 *      <td>Internal server error occurs during request processing.</td>
 *    </tr>
 *  </tbody>
 *</table>
 *</div>
 */
public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    if (isJettyMode() && !request.getRequestURI().startsWith(CONTEXT_PATH)) {
        return;
    }
    if (log.isDebug()) {
        logDebug("Addition of export requested");
    }
    PrintWriter out = response.getWriter();
    // read from the client
    InputStream in = request.getInputStream();
    if (log.isDetailed()) {
        logDetailed("Encoding: " + request.getCharacterEncoding());
    }
    boolean isJob = TYPE_JOB.equalsIgnoreCase(request.getParameter(PARAMETER_TYPE));
    // the resource to load
    String load = request.getParameter(PARAMETER_LOAD);
    response.setContentType("text/xml");
    out.print(XMLHandler.getXMLHeader());
    response.setStatus(HttpServletResponse.SC_OK);
    OutputStream outputStream = null;
    try {
        FileObject tempFile = KettleVFS.createTempFile("export", ".zip", System.getProperty("java.io.tmpdir"));
        outputStream = KettleVFS.getOutputStream(tempFile, false);
        // Pass the input directly to a temporary file
        // 
        int c;
        while ((c = in.read()) != -1) {
            outputStream.write(c);
        }
        outputStream.flush();
        outputStream.close();
        // don't close it twice
        outputStream = null;
        String archiveUrl = tempFile.getName().toString();
        String fileUrl = null;
        String carteObjectId = null;
        SimpleLoggingObject servletLoggingObject = new SimpleLoggingObject(CONTEXT_PATH, LoggingObjectType.CARTE, null);
        // If a specific resource to load was named, open it from inside the archive.
        if (!Utils.isEmpty(load)) {
            fileUrl = "zip:" + archiveUrl + "!" + load;
            if (isJob) {
                // Open the job from inside the ZIP archive
                // 
                KettleVFS.getFileObject(fileUrl);
                // never with a repository
                JobMeta jobMeta = new JobMeta(fileUrl, null);
                // Also read the execution configuration information
                // 
                String configUrl = "zip:" + archiveUrl + "!" + Job.CONFIGURATION_IN_EXPORT_FILENAME;
                Document configDoc = XMLHandler.loadXMLFile(configUrl);
                JobExecutionConfiguration jobExecutionConfiguration = new JobExecutionConfiguration(XMLHandler.getSubNode(configDoc, JobExecutionConfiguration.XML_TAG));
                carteObjectId = UUID.randomUUID().toString();
                servletLoggingObject.setContainerObjectId(carteObjectId);
                servletLoggingObject.setLogLevel(jobExecutionConfiguration.getLogLevel());
                Job job = new Job(null, jobMeta, servletLoggingObject);
                // If the remote job should be expanded, register a delegation handler so child jobs and transformations are tracked on this Carte instance.
                if (jobExecutionConfiguration.isExpandingRemoteJob()) {
                    job.addDelegationListener(new CarteDelegationHandler(getTransformationMap(), getJobMap()));
                }
                // store it all in the map...
                // 
                getJobMap().addJob(job.getJobname(), carteObjectId, job, new JobConfiguration(jobMeta, jobExecutionConfiguration));
                // Apply the execution configuration...
                // 
                log.setLogLevel(jobExecutionConfiguration.getLogLevel());
                job.setArguments(jobExecutionConfiguration.getArgumentStrings());
                jobMeta.injectVariables(jobExecutionConfiguration.getVariables());
                // Also copy the parameters over...
                // 
                Map<String, String> params = jobExecutionConfiguration.getParams();
                for (Map.Entry<String, String> entry : params.entrySet()) {
                    jobMeta.setParameterValue(entry.getKey(), entry.getValue());
                }
            } else {
                // Open the transformation from inside the ZIP archive
                // 
                TransMeta transMeta = new TransMeta(fileUrl);
                // Also read the execution configuration information
                // 
                String configUrl = "zip:" + archiveUrl + "!" + Trans.CONFIGURATION_IN_EXPORT_FILENAME;
                Document configDoc = XMLHandler.loadXMLFile(configUrl);
                TransExecutionConfiguration executionConfiguration = new TransExecutionConfiguration(XMLHandler.getSubNode(configDoc, TransExecutionConfiguration.XML_TAG));
                carteObjectId = UUID.randomUUID().toString();
                servletLoggingObject.setContainerObjectId(carteObjectId);
                servletLoggingObject.setLogLevel(executionConfiguration.getLogLevel());
                Trans trans = new Trans(transMeta, servletLoggingObject);
                // store it all in the map...
                // 
                getTransformationMap().addTransformation(trans.getName(), carteObjectId, trans, new TransConfiguration(transMeta, executionConfiguration));
            }
        } else {
            fileUrl = archiveUrl;
        }
        out.println(new WebResult(WebResult.STRING_OK, fileUrl, carteObjectId));
    } catch (Exception ex) {
        out.println(new WebResult(WebResult.STRING_ERROR, Const.getStackTracker(ex)));
    } finally {
        if (outputStream != null) {
            outputStream.close();
        }
    }
}
Also used : JobMeta(org.pentaho.di.job.JobMeta) InputStream(java.io.InputStream) OutputStream(java.io.OutputStream) TransMeta(org.pentaho.di.trans.TransMeta) SimpleLoggingObject(org.pentaho.di.core.logging.SimpleLoggingObject) Document(org.w3c.dom.Document) JobExecutionConfiguration(org.pentaho.di.job.JobExecutionConfiguration) TransConfiguration(org.pentaho.di.trans.TransConfiguration) ServletException(javax.servlet.ServletException) IOException(java.io.IOException) TransExecutionConfiguration(org.pentaho.di.trans.TransExecutionConfiguration) FileObject(org.apache.commons.vfs2.FileObject) Job(org.pentaho.di.job.Job) Map(java.util.Map) Trans(org.pentaho.di.trans.Trans) JobConfiguration(org.pentaho.di.job.JobConfiguration) PrintWriter(java.io.PrintWriter)
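
doGet above copies the raw request body byte by byte into a temporary zip file before opening the job or transformation stored inside it. The sketch below shows a client streaming a previously exported archive to the endpoint documented in the javadoc; host, port, credentials and the local file name are illustrative assumptions.

// Client-side sketch of POST /kettle/addExport (assumed host, port, credentials, file name).
import java.io.FileInputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class AddExportClientSketch {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost:8081/kettle/addExport/?type=job&load=dummy_job.kjb");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/zip");
        String auth = Base64.getEncoder().encodeToString("cluster:cluster".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth); // assumed default credentials
        try (FileInputStream zip = new FileInputStream("export_dummy_job.zip");
             OutputStream body = conn.getOutputStream()) {
            zip.transferTo(body); // the servlet copies this stream into its temporary file
        }
        // The response is the WebResult XML wrapping the file url and the Carte object id.
        System.out.println("HTTP " + conn.getResponseCode());
    }
}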

Example 5 with JobConfiguration

Use of org.pentaho.di.job.JobConfiguration in project pentaho-kettle by pentaho.

From the class AddJobServlet, method doGet:

/**
 *
 *    <div id="mindtouch">
 *    <h1>/kettle/addJob</h1>
 *    <a name="POST"></a>
 *    <h2>POST</h2>
 *    <p>Uploads and executes a job configuration XML file.
 *  The uploaded XML contains the job and the job_execution_configuration (wrapped in a job_configuration tag)
 *  to be executed. The method relies on the input parameter to determine whether an xml or html
 *  reply should be produced. The job_configuration XML is transferred within the request body.
 *
 *  The <code>name of the executed job</code> will be returned in the response object,
 *  or a <code>message</code> describing the error that occurred. To determine whether the call was
 *  successful, rely on the <code>result</code> parameter in the response.</p>
 *
 *    <p><b>Example Request:</b><br />
 *    <pre function="syntax.xml">
 *    POST /kettle/addJob/?xml=Y
 *    </pre>
 *    <p>Request body should contain xml containing job_configuration (job + job_execution_configuration
 *  wrapped in job_configuration tag).</p>
 *    </p>
 *    <h3>Parameters</h3>
 *    <table class="pentaho-table">
 *    <tbody>
 *    <tr>
 *      <th>name</th>
 *      <th>description</th>
 *      <th>type</th>
 *    </tr>
 *    <tr>
 *    <td>xml</td>
 *    <td>Boolean flag set to either <code>Y</code> or <code>N</code> describing if xml or html reply
 *  should be produced.</td>
 *    <td>boolean, optional</td>
 *    </tr>
 *    </tbody>
 *    </table>
 *
 *  <h3>Response Body</h3>
 *
 *  <table class="pentaho-table">
 *    <tbody>
 *      <tr>
 *        <td align="right">element:</td>
 *        <td>(custom)</td>
 *      </tr>
 *      <tr>
 *        <td align="right">media types:</td>
 *        <td>text/xml, text/html</td>
 *      </tr>
 *    </tbody>
 *  </table>
 *    <p>Response wraps job name that was executed or error stack trace
 *  if an error occurred. Response has <code>result</code> OK if there were no errors. Otherwise it returns ERROR.</p>
 *
 *    <p><b>Example Response:</b></p>
 *    <pre function="syntax.xml">
 *    <?xml version="1.0" encoding="UTF-8"?>
 *    <webresult>
 *      <result>OK</result>
 *      <message>Job &#x27;dummy_job&#x27; was added to the list with id 1e90eca8-4d4c-47f7-8e5c-99ec36525e7c</message>
 *      <id>1e90eca8-4d4c-47f7-8e5c-99ec36525e7c</id>
 *    </webresult>
 *    </pre>
 *
 *    <h3>Status Codes</h3>
 *    <table class="pentaho-table">
 *  <tbody>
 *    <tr>
 *      <th>code</th>
 *      <th>description</th>
 *    </tr>
 *    <tr>
 *      <td>200</td>
 *      <td>Request was processed and XML response is returned.</td>
 *    </tr>
 *    <tr>
 *      <td>500</td>
 *      <td>Internal server error occurs during request processing.</td>
 *    </tr>
 *  </tbody>
 *</table>
 *</div>
 */
public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    if (isJettyMode() && !request.getRequestURI().startsWith(CONTEXT_PATH)) {
        return;
    }
    if (log.isDebug()) {
        logDebug("Addition of job requested");
    }
    boolean useXML = "Y".equalsIgnoreCase(request.getParameter("xml"));
    PrintWriter out = response.getWriter();
    // read from the client
    BufferedReader in = request.getReader();
    if (log.isDetailed()) {
        logDetailed("Encoding: " + request.getCharacterEncoding());
    }
    if (useXML) {
        response.setContentType("text/xml");
        out.print(XMLHandler.getXMLHeader());
    } else {
        response.setContentType("text/html");
        out.println("<HTML>");
        out.println("<HEAD><TITLE>Add job</TITLE></HEAD>");
        out.println("<BODY>");
    }
    response.setStatus(HttpServletResponse.SC_OK);
    try {
        // First read the complete job configuration XML into memory from the request
        int c;
        StringBuilder xml = new StringBuilder();
        while ((c = in.read()) != -1) {
            xml.append((char) c);
        }
        // Parse the XML, create a job configuration
        // 
        JobConfiguration jobConfiguration = JobConfiguration.fromXML(xml.toString());
        JobMeta jobMeta = jobConfiguration.getJobMeta();
        JobExecutionConfiguration jobExecutionConfiguration = jobConfiguration.getJobExecutionConfiguration();
        jobMeta.setLogLevel(jobExecutionConfiguration.getLogLevel());
        jobMeta.injectVariables(jobExecutionConfiguration.getVariables());
        // If there was a repository, we know about it at this point in time.
        // 
        final Repository repository = jobConfiguration.getJobExecutionConfiguration().getRepository();
        String carteObjectId = UUID.randomUUID().toString();
        SimpleLoggingObject servletLoggingObject = new SimpleLoggingObject(CONTEXT_PATH, LoggingObjectType.CARTE, null);
        servletLoggingObject.setContainerObjectId(carteObjectId);
        servletLoggingObject.setLogLevel(jobExecutionConfiguration.getLogLevel());
        // Create the job and store it in the list...
        // 
        final Job job = new Job(repository, jobMeta, servletLoggingObject);
        // Setting variables
        // 
        job.initializeVariablesFrom(null);
        job.getJobMeta().setInternalKettleVariables(job);
        job.injectVariables(jobConfiguration.getJobExecutionConfiguration().getVariables());
        job.setArguments(jobExecutionConfiguration.getArgumentStrings());
        // Also copy the parameters over...
        // 
        job.copyParametersFrom(jobMeta);
        job.clearParameters();
        String[] parameterNames = job.listParameters();
        for (int idx = 0; idx < parameterNames.length; idx++) {
            // Grab the parameter value set in the job entry
            // 
            String thisValue = jobExecutionConfiguration.getParams().get(parameterNames[idx]);
            if (!Utils.isEmpty(thisValue)) {
                // Set the value as specified by the user in the job entry
                // 
                jobMeta.setParameterValue(parameterNames[idx], thisValue);
            }
        }
        jobMeta.activateParameters();
        // Check if there is a starting point specified.
        String startCopyName = jobExecutionConfiguration.getStartCopyName();
        if (startCopyName != null && !startCopyName.isEmpty()) {
            int startCopyNr = jobExecutionConfiguration.getStartCopyNr();
            JobEntryCopy startJobEntryCopy = jobMeta.findJobEntry(startCopyName, startCopyNr, false);
            job.setStartJobEntryCopy(startJobEntryCopy);
        }
        job.setSocketRepository(getSocketRepository());
        // If the remote job should be expanded, register a delegation handler so child jobs and transformations are tracked on this Carte instance.
        if (jobExecutionConfiguration.isExpandingRemoteJob()) {
            job.addDelegationListener(new CarteDelegationHandler(getTransformationMap(), getJobMap()));
        }
        getJobMap().addJob(job.getJobname(), carteObjectId, job, jobConfiguration);
        // Make sure the repository connection is released when the job finishes.
        if (repository != null) {
            job.addJobListener(new JobAdapter() {

                @Override
                public void jobFinished(Job job) {
                    repository.disconnect();
                }
            });
        }
        String message = Encode.forHtml("Job '" + job.getJobname() + "' was added to the list with id " + carteObjectId);
        if (useXML) {
            out.println(new WebResult(WebResult.STRING_OK, message, carteObjectId));
        } else {
            out.println("<H1>" + message + "</H1>");
            out.println("<p><a href=\"" + convertContextPath(GetJobStatusServlet.CONTEXT_PATH) + "?name=" + Encode.forUriComponent(job.getJobname()) + "&id=" + carteObjectId + "\">Go to the job status page</a><p>");
        }
    } catch (Exception ex) {
        if (useXML) {
            out.println(new WebResult(WebResult.STRING_ERROR, Const.getStackTracker(ex)));
        } else {
            out.println("<p>");
            out.println("<pre>");
            ex.printStackTrace(out);
            out.println("</pre>");
        }
    }
    if (!useXML) {
        out.println("<p>");
        out.println("</BODY>");
        out.println("</HTML>");
    }
}
Also used : JobMeta(org.pentaho.di.job.JobMeta) SimpleLoggingObject(org.pentaho.di.core.logging.SimpleLoggingObject) JobExecutionConfiguration(org.pentaho.di.job.JobExecutionConfiguration) JobAdapter(org.pentaho.di.job.JobAdapter) ServletException(javax.servlet.ServletException) IOException(java.io.IOException) Repository(org.pentaho.di.repository.Repository) JobEntryCopy(org.pentaho.di.job.entry.JobEntryCopy) BufferedReader(java.io.BufferedReader) Job(org.pentaho.di.job.Job) JobConfiguration(org.pentaho.di.job.JobConfiguration) PrintWriter(java.io.PrintWriter)
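
The request body parsed by JobConfiguration.fromXML() is the <job> document and the <job_execution_configuration> document wrapped in a single job_configuration element, as the javadoc above describes. The sketch below assembles such a body from two local files before it is POSTed to /kettle/addJob/?xml=Y; the file names are illustrative assumptions, and stripping the XML prolog from the embedded documents is a simplification.

// Sketch of assembling the job_configuration request body (assumed file names).
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class JobConfigurationXmlSketch {
    // The embedded documents must not carry their own <?xml ...?> prolog.
    private static String stripXmlDeclaration(String xml) {
        return xml.replaceFirst("^\\s*<\\?xml[^>]*\\?>", "");
    }

    public static String wrap(String jobXml, String executionConfigXml) {
        return "<job_configuration>\n"
            + stripXmlDeclaration(jobXml) + "\n"
            + stripXmlDeclaration(executionConfigXml) + "\n"
            + "</job_configuration>\n";
    }

    public static void main(String[] args) throws Exception {
        String jobXml = Files.readString(Paths.get("dummy_job.kjb"), StandardCharsets.UTF_8);
        String execXml = Files.readString(Paths.get("job_execution_configuration.xml"), StandardCharsets.UTF_8);
        // POST the result to /kettle/addJob/?xml=Y with Content-Type text/xml.
        System.out.println(wrap(jobXml, execXml));
    }
}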

Aggregations

JobConfiguration (org.pentaho.di.job.JobConfiguration): 12
Job (org.pentaho.di.job.Job): 9
JobExecutionConfiguration (org.pentaho.di.job.JobExecutionConfiguration): 8
SimpleLoggingObject (org.pentaho.di.core.logging.SimpleLoggingObject): 7
JobMeta (org.pentaho.di.job.JobMeta): 7
KettleException (org.pentaho.di.core.exception.KettleException): 6
IOException (java.io.IOException): 5
PrintWriter (java.io.PrintWriter): 5
ServletException (javax.servlet.ServletException): 5
JobAdapter (org.pentaho.di.job.JobAdapter): 4
Repository (org.pentaho.di.repository.Repository): 4
Path (javax.ws.rs.Path): 2
Produces (javax.ws.rs.Produces): 2
LogLevel (org.pentaho.di.core.logging.LogLevel): 2
UnknownParamException (org.pentaho.di.core.parameters.UnknownParamException): 2
JobEntryCopy (org.pentaho.di.job.entry.JobEntryCopy): 2
Trans (org.pentaho.di.trans.Trans): 2
TransConfiguration (org.pentaho.di.trans.TransConfiguration): 2
TransExecutionConfiguration (org.pentaho.di.trans.TransExecutionConfiguration): 2
TransMeta (org.pentaho.di.trans.TransMeta): 2