
Example 61 with ObjectId

Use of org.pentaho.di.repository.ObjectId in project pentaho-kettle by pentaho.

In class RunJobServletTest, the method doGetInvalidLogLevelTest verifies that an invalid level parameter ends with HTTP 400 (Bad Request) being set on the response.

@Test
public void doGetInvalidLogLevelTest() throws Exception {
    HttpServletRequest mockHttpServletRequest = mock(HttpServletRequest.class);
    HttpServletResponse mockHttpServletResponse = mock(HttpServletResponse.class);
    TransformationMap transformationMap = mock(TransformationMap.class);
    SlaveServerConfig slaveServerConfig = mock(SlaveServerConfig.class);
    Repository repository = mock(Repository.class);
    RepositoryDirectoryInterface repDirInterface = mock(RepositoryDirectoryInterface.class);
    JobMeta jobMeta = mock(JobMeta.class);
    ObjectId objId = mock(ObjectId.class);
    Whitebox.setInternalState(runJobServlet, "transformationMap", transformationMap);
    KettleLogStore.init();
    StringWriter out = new StringWriter();
    PrintWriter printWriter = new PrintWriter(out);
    when(mockHttpServletRequest.getParameter("job")).thenReturn("dummyJob");
    when(mockHttpServletRequest.getParameter("level")).thenReturn("SomethingInvalid");
    when(mockHttpServletResponse.getWriter()).thenReturn(printWriter);
    when(transformationMap.getSlaveServerConfig()).thenReturn(slaveServerConfig);
    when(slaveServerConfig.getRepository()).thenReturn(repository);
    when(repository.isConnected()).thenReturn(true);
    when(repository.loadRepositoryDirectoryTree()).thenReturn(repDirInterface);
    when(repDirInterface.findDirectory(anyString())).thenReturn(repDirInterface);
    when(repository.getJobId("dummyJob", repDirInterface)).thenReturn(objId);
    when(repository.loadJob(objId, null)).thenReturn(jobMeta);
    when(mockHttpServletRequest.getParameterNames()).thenReturn(Collections.enumeration(Collections.emptyList()));
    runJobServlet.doGet(mockHttpServletRequest, mockHttpServletResponse);
    verify(mockHttpServletResponse).setStatus(HttpServletResponse.SC_OK);
    verify(mockHttpServletResponse).setStatus(HttpServletResponse.SC_BAD_REQUEST);
}
Also used: HttpServletRequest (javax.servlet.http.HttpServletRequest), RepositoryDirectoryInterface (org.pentaho.di.repository.RepositoryDirectoryInterface), Repository (org.pentaho.di.repository.Repository), JobMeta (org.pentaho.di.job.JobMeta), StringWriter (java.io.StringWriter), ObjectId (org.pentaho.di.repository.ObjectId), HttpServletResponse (javax.servlet.http.HttpServletResponse), PrintWriter (java.io.PrintWriter), Test (org.junit.Test)
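ObjectId is just an interface, so tests like this do not strictly need a Mockito mock for it: the concrete StringObjectId wrapper (used in later examples on this page) works as a drop-in stand-in. A minimal sketch under that assumption, reusing the mocks from the test above plus org.junit.Assert.assertEquals:

// Hedged variant of the stubbing above: a real StringObjectId instead of mock(ObjectId.class).
ObjectId objId = new StringObjectId("dummy-job-id");
when(repository.getJobId("dummyJob", repDirInterface)).thenReturn(objId);
when(repository.loadJob(objId, null)).thenReturn(jobMeta);
// StringObjectId simply wraps the string; the value round-trips through getId().
assertEquals("dummy-job-id", objId.getId());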

Example 62 with ObjectId

Use of org.pentaho.di.repository.ObjectId in project pentaho-kettle by pentaho.

In class RunJobServletTest, the method doGetUnexpectedErrorTest verifies that a KettleException thrown while loading the job ends with HTTP 500 (Internal Server Error) being set on the response.

@Test
public void doGetUnexpectedErrorTest() throws Exception {
    HttpServletRequest mockHttpServletRequest = mock(HttpServletRequest.class);
    HttpServletResponse mockHttpServletResponse = mock(HttpServletResponse.class);
    TransformationMap transformationMap = mock(TransformationMap.class);
    SlaveServerConfig slaveServerConfig = mock(SlaveServerConfig.class);
    Repository repository = mock(Repository.class);
    RepositoryDirectoryInterface repDirInterface = mock(RepositoryDirectoryInterface.class);
    ObjectId objId = mock(ObjectId.class);
    Whitebox.setInternalState(runJobServlet, "transformationMap", transformationMap);
    KettleLogStore.init();
    StringWriter out = new StringWriter();
    PrintWriter printWriter = new PrintWriter(out);
    when(mockHttpServletRequest.getParameter("job")).thenReturn("dummyJob");
    when(mockHttpServletRequest.getParameter("level")).thenReturn("SomethingInvalid");
    when(mockHttpServletResponse.getWriter()).thenReturn(printWriter);
    when(transformationMap.getSlaveServerConfig()).thenReturn(slaveServerConfig);
    when(slaveServerConfig.getRepository()).thenReturn(repository);
    when(repository.isConnected()).thenReturn(true);
    when(repository.loadRepositoryDirectoryTree()).thenReturn(repDirInterface);
    when(repDirInterface.findDirectory(anyString())).thenReturn(repDirInterface);
    when(repository.getJobId("dummyJob", repDirInterface)).thenReturn(objId);
    when(repository.loadJob(objId, null)).thenThrow(new KettleException(""));
    runJobServlet.doGet(mockHttpServletRequest, mockHttpServletResponse);
    verify(mockHttpServletResponse).setStatus(HttpServletResponse.SC_OK);
    verify(mockHttpServletResponse).setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
}
Also used: HttpServletRequest (javax.servlet.http.HttpServletRequest), RepositoryDirectoryInterface (org.pentaho.di.repository.RepositoryDirectoryInterface), KettleException (org.pentaho.di.core.exception.KettleException), Repository (org.pentaho.di.repository.Repository), StringWriter (java.io.StringWriter), ObjectId (org.pentaho.di.repository.ObjectId), HttpServletResponse (javax.servlet.http.HttpServletResponse), PrintWriter (java.io.PrintWriter), Test (org.junit.Test)
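Both tests verify two different setStatus calls on the same response, which can only pass because the servlet apparently sets SC_OK up front and then overwrites it once the failure is detected. If that ordering matters, Mockito's InOrder can assert it explicitly; a sketch, assuming the ordering just described (the excerpt itself does not confirm it):

// Hypothetical ordering check using org.mockito.InOrder and Mockito.inOrder(...).
InOrder inOrder = inOrder(mockHttpServletResponse);
inOrder.verify(mockHttpServletResponse).setStatus(HttpServletResponse.SC_OK);
inOrder.verify(mockHttpServletResponse).setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);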

Example 63 with ObjectId

Use of org.pentaho.di.repository.ObjectId in project pentaho-kettle by pentaho.

In class JobEntryExportRepository, the method execute exports repository content (everything, jobs only, transformations only, a single folder, or folder by folder) to a target file.

public Result execute(Result previousResult, int nr) {
    Result result = previousResult;
    result.setNrErrors(1);
    result.setResult(false);
    String realrepName = environmentSubstitute(repositoryname);
    String realusername = environmentSubstitute(username);
    String realpassword = Encr.decryptPasswordOptionallyEncrypted(environmentSubstitute(password));
    String realfoldername = environmentSubstitute(directoryPath);
    String realoutfilename = environmentSubstitute(targetfilename);
    if (export_type.equals(Export_All) || export_type.equals(Export_Jobs) || export_type.equals(Export_Trans) || export_type.equals(Export_One_Folder)) {
        realoutfilename = buildFilename(realoutfilename);
    }
    NrErrors = 0;
    successConditionBroken = false;
    limitErr = Const.toInt(environmentSubstitute(getNrLimit()), 10);
    try {
        file = KettleVFS.getFileObject(realoutfilename, this);
        if (file.exists()) {
            if (export_type.equals(Export_All) || export_type.equals(Export_Jobs) || export_type.equals(Export_Trans) || export_type.equals(Export_One_Folder)) {
                if (iffileexists.equals(If_FileExists_Fail)) {
                    logError(BaseMessages.getString(PKG, "JobExportRepository.Log.Failing", realoutfilename));
                    return result;
                } else if (iffileexists.equals(If_FileExists_Skip)) {
                    if (log.isDetailed()) {
                        logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.Exit", realoutfilename));
                    }
                    result.setResult(true);
                    result.setNrErrors(0);
                    return result;
                } else if (iffileexists.equals(If_FileExists_Uniquename)) {
                    String parentFolder = KettleVFS.getFilename(file.getParent());
                    String shortFilename = file.getName().getBaseName();
                    shortFilename = buildUniqueFilename(shortFilename);
                    file = KettleVFS.getFileObject(parentFolder + Const.FILE_SEPARATOR + shortFilename, this);
                    if (log.isDetailed()) {
                        logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.NewFilename", file.toString()));
                    }
                }
            } else if (export_type.equals(Export_By_Folder)) {
                if (file.getType() != FileType.FOLDER) {
                    logError(BaseMessages.getString(PKG, "JobExportRepository.Log.NotFolder", "" + file.getName()));
                    return result;
                }
            }
        } else {
            if (export_type.equals(Export_By_Folder)) {
                // create folder?
                if (log.isDetailed()) {
                    logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.FolderNotExists", "" + file.getName()));
                }
                if (!createfolder) {
                    return result;
                }
                file.createFolder();
                if (log.isDetailed()) {
                    logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.FolderCreated", file.toString()));
                }
            } else if (export_type.equals(Export_All) || export_type.equals(Export_Jobs) || export_type.equals(Export_Trans) || export_type.equals(Export_One_Folder)) {
                // create parent folder?
                if (!file.getParent().exists()) {
                    if (log.isDetailed()) {
                        logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.FolderNotExists", "" + file.getParent().toString()));
                    }
                    if (createfolder) {
                        file.getParent().createFolder();
                        if (log.isDetailed()) {
                            logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.FolderCreated", file.getParent().toString()));
                        }
                    } else {
                        return result;
                    }
                }
            }
        }
        realoutfilename = KettleVFS.getFilename(this.file);
        // connect to repository
        connectRep(log, realrepName, realusername, realpassword);
        IRepositoryExporter exporter = repository.getExporter();
        if (export_type.equals(Export_All)) {
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.StartingExportAllRep", realoutfilename));
            }
            exporter.exportAllObjects(null, realoutfilename, null, "all");
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.EndExportAllRep", realoutfilename));
            }
            if (add_result_filesname) {
                addFileToResultFilenames(realoutfilename, log, result, parentJob);
            }
        } else if (export_type.equals(Export_Jobs)) {
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.StartingExportJobsRep", realoutfilename));
            }
            exporter.exportAllObjects(null, realoutfilename, null, "jobs");
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.EndExportJobsRep", realoutfilename));
            }
            if (add_result_filesname) {
                addFileToResultFilenames(realoutfilename, log, result, parentJob);
            }
        } else if (export_type.equals(Export_Trans)) {
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.StartingExportTransRep", realoutfilename));
            }
            exporter.exportAllObjects(null, realoutfilename, null, "trans");
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.EndExportTransRep", realoutfilename));
            }
            if (add_result_filesname) {
                addFileToResultFilenames(realoutfilename, log, result, parentJob);
            }
        } else if (export_type.equals(Export_One_Folder)) {
            RepositoryDirectoryInterface directory = repository.findDirectory(realfoldername);
            if (directory != null) {
                if (log.isDetailed()) {
                    logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.ExpAllFolderRep", directoryPath, realoutfilename));
                }
                exporter.exportAllObjects(null, realoutfilename, directory, "all");
                if (log.isDetailed()) {
                    logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.EndExpAllFolderRep", directoryPath, realoutfilename));
                }
                if (add_result_filesname) {
                    addFileToResultFilenames(realoutfilename, log, result, parentJob);
                }
            } else {
                logError(BaseMessages.getString(PKG, "JobExportRepository.Error.CanNotFindFolderInRep", realfoldername, realrepName));
                return result;
            }
        } else if (export_type.equals(Export_By_Folder)) {
            // User must give a destination folder..
            RepositoryDirectoryInterface directory = this.repository.loadRepositoryDirectoryTree().findRoot();
            // Loop over all the directory id's
            ObjectId[] dirids = directory.getDirectoryIDs();
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.TotalFolders", "" + dirids.length));
            }
            for (int d = 0; d < dirids.length && !parentJob.isStopped(); d++) {
                // Success condition broken?
                if (successConditionBroken) {
                    logError(BaseMessages.getString(PKG, "JobExportRepository.Error.SuccessConditionbroken", "" + NrErrors));
                    throw new Exception(BaseMessages.getString(PKG, "JobExportRepository.Error.SuccessConditionbroken", "" + NrErrors));
                }
                RepositoryDirectoryInterface repdir = directory.findDirectory(dirids[d]);
                if (!processOneFolder(parentJob, result, log, repdir, realoutfilename, d, dirids.length)) {
                    // updateErrors
                    updateErrors();
                }
            }
        // end for
        }
    } catch (Exception e) {
        updateErrors();
        logError(BaseMessages.getString(PKG, "JobExportRepository.UnExpectedError", e.toString()));
        logError("Stack trace: " + Const.CR + Const.getStackTracker(e));
    } finally {
        if (this.repository != null) {
            this.repository.disconnect();
            this.repository = null;
        }
        if (this.repositoryMeta != null) {
            this.repositoryMeta = null;
        }
        if (this.repsinfo != null) {
            this.repsinfo.clear();
            this.repsinfo = null;
        }
        if (this.file != null) {
            try {
                this.file.close();
                this.file = null;
            } catch (Exception e) {
            // Ignore close errors
            }
        }
    }
    // Success Condition
    result.setNrErrors(NrErrors);
    if (getSuccessStatus()) {
        result.setResult(true);
    }
    return result;
}
Also used: RepositoryDirectoryInterface (org.pentaho.di.repository.RepositoryDirectoryInterface), RepositoryDirectory (org.pentaho.di.repository.RepositoryDirectory), ObjectId (org.pentaho.di.repository.ObjectId), IRepositoryExporter (org.pentaho.di.repository.IRepositoryExporter), KettleException (org.pentaho.di.core.exception.KettleException), KettleDatabaseException (org.pentaho.di.core.exception.KettleDatabaseException), KettleXMLException (org.pentaho.di.core.exception.KettleXMLException), Result (org.pentaho.di.core.Result)
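The Export_By_Folder branch shows the core ObjectId pattern of this entry: the directory tree hands out opaque ObjectIds, and each one is resolved back to a folder before exporting. A condensed sketch of just that traversal, assuming a connected Repository named repository (all calls appear in the method above):

// Walk every folder in the repository by its ObjectId.
RepositoryDirectoryInterface root = repository.loadRepositoryDirectoryTree().findRoot();
for (ObjectId dirId : root.getDirectoryIDs()) {
    RepositoryDirectoryInterface repdir = root.findDirectory(dirId);
    System.out.println("Would export folder: " + repdir.getPath());
}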

Example 64 with ObjectId

Use of org.pentaho.di.repository.ObjectId in project pentaho-kettle by pentaho.

In class TransDelegate, the method dataNodeToElement rebuilds a TransMeta from a repository DataNode tree: private databases, steps, error handling, notes, hops, and parameters.

public void dataNodeToElement(final DataNode rootNode, final RepositoryElementInterface element) throws KettleException {
    TransMeta transMeta = (TransMeta) element;
    Set<String> privateDatabases = null;
    // read the private databases
    DataNode privateDbsNode = rootNode.getNode(NODE_TRANS_PRIVATE_DATABASES);
    // BACKLOG-6635
    if (privateDbsNode != null) {
        privateDatabases = new HashSet<String>();
        if (privateDbsNode.hasProperty(PROP_TRANS_PRIVATE_DATABASE_NAMES)) {
            for (String privateDatabaseName : getString(privateDbsNode, PROP_TRANS_PRIVATE_DATABASE_NAMES).split(TRANS_PRIVATE_DATABASE_DELIMITER)) {
                if (!privateDatabaseName.isEmpty()) {
                    privateDatabases.add(privateDatabaseName);
                }
            }
        } else {
            for (DataNode privateDatabase : privateDbsNode.getNodes()) {
                privateDatabases.add(privateDatabase.getName());
            }
        }
    }
    transMeta.setPrivateDatabases(privateDatabases);
    // read the steps...
    // 
    DataNode stepsNode = rootNode.getNode(NODE_STEPS);
    for (DataNode stepNode : stepsNode.getNodes()) {
        StepMeta stepMeta = new StepMeta(new StringObjectId(stepNode.getId().toString()));
        // for tracing, retain hierarchy
        stepMeta.setParentTransMeta(transMeta);
        // Read the basics
        // 
        stepMeta.setName(getString(stepNode, PROP_NAME));
        if (stepNode.hasProperty(PROP_DESCRIPTION)) {
            stepMeta.setDescription(getString(stepNode, PROP_DESCRIPTION));
        }
        stepMeta.setDistributes(stepNode.getProperty(PROP_STEP_DISTRIBUTE).getBoolean());
        DataProperty rowDistributionProperty = stepNode.getProperty(PROP_STEP_ROW_DISTRIBUTION);
        String rowDistributionCode = rowDistributionProperty == null ? null : rowDistributionProperty.getString();
        RowDistributionInterface rowDistribution = PluginRegistry.getInstance().loadClass(RowDistributionPluginType.class, rowDistributionCode, RowDistributionInterface.class);
        stepMeta.setRowDistribution(rowDistribution);
        stepMeta.setDraw(stepNode.getProperty(PROP_STEP_GUI_DRAW).getBoolean());
        int copies = (int) stepNode.getProperty(PROP_STEP_COPIES).getLong();
        String copiesString = stepNode.getProperty(PROP_STEP_COPIES_STRING) != null ? stepNode.getProperty(PROP_STEP_COPIES_STRING).getString() : StringUtils.EMPTY;
        if (!Utils.isEmpty(copiesString)) {
            stepMeta.setCopiesString(copiesString);
        } else {
            // for backward compatibility
            stepMeta.setCopies(copies);
        }
        int x = (int) stepNode.getProperty(PROP_STEP_GUI_LOCATION_X).getLong();
        int y = (int) stepNode.getProperty(PROP_STEP_GUI_LOCATION_Y).getLong();
        stepMeta.setLocation(x, y);
        // Load the group attributes map
        // 
        AttributesMapUtil.loadAttributesMap(stepNode, stepMeta);
        String stepType = getString(stepNode, PROP_STEP_TYPE);
        // Create a new StepMetaInterface object...
        // 
        PluginRegistry registry = PluginRegistry.getInstance();
        PluginInterface stepPlugin = registry.findPluginWithId(StepPluginType.class, stepType);
        StepMetaInterface stepMetaInterface = null;
        if (stepPlugin != null) {
            stepMetaInterface = (StepMetaInterface) registry.loadClass(stepPlugin);
            // revert to the default in case we loaded an alternate version
            stepType = stepPlugin.getIds()[0];
        } else {
            stepMeta.setStepMetaInterface((StepMetaInterface) new MissingTrans(stepMeta.getName(), stepType));
            transMeta.addMissingTrans((MissingTrans) stepMeta.getStepMetaInterface());
        }
        stepMeta.setStepID(stepType);
        // Read the metadata from the repository too...
        // 
        RepositoryProxy proxy = new RepositoryProxy(stepNode.getNode(NODE_STEP_CUSTOM));
        if (!stepMeta.isMissing()) {
            readRepCompatibleStepMeta(stepMetaInterface, proxy, null, transMeta.getDatabases());
            stepMetaInterface.readRep(proxy, transMeta.getMetaStore(), null, transMeta.getDatabases());
            stepMeta.setStepMetaInterface(stepMetaInterface);
        }
        // Get the partitioning as well...
        StepPartitioningMeta stepPartitioningMeta = new StepPartitioningMeta();
        if (stepNode.hasProperty(PROP_PARTITIONING_SCHEMA)) {
            String partSchemaId = stepNode.getProperty(PROP_PARTITIONING_SCHEMA).getRef().getId().toString();
            String schemaName = repo.loadPartitionSchema(new StringObjectId(partSchemaId), null).getName();
            stepPartitioningMeta.setPartitionSchemaName(schemaName);
            String methodCode = getString(stepNode, PROP_PARTITIONING_METHOD);
            stepPartitioningMeta.setMethod(StepPartitioningMeta.getMethod(methodCode));
            if (stepPartitioningMeta.getPartitioner() != null) {
                proxy = new RepositoryProxy(stepNode.getNode(NODE_PARTITIONER_CUSTOM));
                stepPartitioningMeta.getPartitioner().loadRep(proxy, null);
            }
            stepPartitioningMeta.hasChanged(true);
        }
        stepMeta.setStepPartitioningMeta(stepPartitioningMeta);
        stepMeta.getStepPartitioningMeta().setPartitionSchemaAfterLoading(transMeta.getPartitionSchemas());
        // Get the cluster schema name
        String clusterSchemaName = getString(stepNode, PROP_CLUSTER_SCHEMA);
        stepMeta.setClusterSchemaName(clusterSchemaName);
        if (clusterSchemaName != null && transMeta.getClusterSchemas() != null) {
            // Get the cluster schema from the given name
            for (ClusterSchema clusterSchema : transMeta.getClusterSchemas()) {
                if (clusterSchema.getName().equals(clusterSchemaName)) {
                    stepMeta.setClusterSchema(clusterSchema);
                    break;
                }
            }
        }
        transMeta.addStep(stepMeta);
    }
    for (DataNode stepNode : stepsNode.getNodes()) {
        ObjectId stepObjectId = new StringObjectId(stepNode.getId().toString());
        StepMeta stepMeta = StepMeta.findStep(transMeta.getSteps(), stepObjectId);
        // Also load the error handling metadata for this step, if any
        if (stepNode.hasProperty(PROP_STEP_ERROR_HANDLING_SOURCE_STEP)) {
            StepErrorMeta meta = new StepErrorMeta(transMeta, stepMeta);
            meta.setTargetStep(StepMeta.findStep(transMeta.getSteps(), stepNode.getProperty(PROP_STEP_ERROR_HANDLING_TARGET_STEP).getString()));
            meta.setEnabled(stepNode.getProperty(PROP_STEP_ERROR_HANDLING_IS_ENABLED).getBoolean());
            meta.setNrErrorsValuename(getString(stepNode, PROP_STEP_ERROR_HANDLING_NR_VALUENAME));
            meta.setErrorDescriptionsValuename(getString(stepNode, PROP_STEP_ERROR_HANDLING_DESCRIPTIONS_VALUENAME));
            meta.setErrorFieldsValuename(getString(stepNode, PROP_STEP_ERROR_HANDLING_FIELDS_VALUENAME));
            meta.setErrorCodesValuename(getString(stepNode, PROP_STEP_ERROR_HANDLING_CODES_VALUENAME));
            meta.setMaxErrors(getString(stepNode, PROP_STEP_ERROR_HANDLING_MAX_ERRORS));
            meta.setMaxPercentErrors(getString(stepNode, PROP_STEP_ERROR_HANDLING_MAX_PCT_ERRORS));
            meta.setMinPercentRows(getString(stepNode, PROP_STEP_ERROR_HANDLING_MIN_PCT_ROWS));
            // a bit of a trick, I know.
            meta.getSourceStep().setStepErrorMeta(meta);
        }
    }
    // Let every step resolve its info and target step references
    for (int i = 0; i < transMeta.nrSteps(); i++) {
        StepMeta stepMeta = transMeta.getStep(i);
        StepMetaInterface sii = stepMeta.getStepMetaInterface();
        if (sii != null) {
            sii.searchInfoAndTargetSteps(transMeta.getSteps());
        }
    }
    // Read the notes...
    // 
    DataNode notesNode = rootNode.getNode(NODE_NOTES);
    int nrNotes = (int) notesNode.getProperty(PROP_NR_NOTES).getLong();
    for (DataNode noteNode : notesNode.getNodes()) {
        String xml = getString(noteNode, PROP_XML);
        transMeta.addNote(new NotePadMeta(XMLHandler.getSubNode(XMLHandler.loadXMLString(xml), NotePadMeta.XML_TAG)));
    }
    if (transMeta.nrNotes() != nrNotes) {
        throw new KettleException("The number of notes read [" + transMeta.nrNotes() + "] was not the number we expected [" + nrNotes + "]");
    }
    // Read the hops...
    // 
    DataNode hopsNode = rootNode.getNode(NODE_HOPS);
    int nrHops = (int) hopsNode.getProperty(PROP_NR_HOPS).getLong();
    for (DataNode hopNode : hopsNode.getNodes()) {
        String stepFromName = getString(hopNode, TRANS_HOP_FROM);
        String stepToName = getString(hopNode, TRANS_HOP_TO);
        boolean enabled = true;
        if (hopNode.hasProperty(TRANS_HOP_ENABLED)) {
            enabled = hopNode.getProperty(TRANS_HOP_ENABLED).getBoolean();
        }
        StepMeta stepFrom = StepMeta.findStep(transMeta.getSteps(), stepFromName);
        StepMeta stepTo = StepMeta.findStep(transMeta.getSteps(), stepToName);
        // Only add the hop when both source and target steps were found
        if (stepFrom != null && stepTo != null) {
            transMeta.addTransHop(new TransHopMeta(stepFrom, stepTo, enabled));
        }
    }
    if (transMeta.nrTransHops() != nrHops) {
        throw new KettleException("The number of hops read [" + transMeta.nrTransHops() + "] was not the number we expected [" + nrHops + "]");
    }
    // Load the details at the end, to make sure we reference the databases correctly, etc.
    // 
    loadTransformationDetails(rootNode, transMeta);
    loadDependencies(rootNode, transMeta);
    transMeta.eraseParameters();
    DataNode paramsNode = rootNode.getNode(NODE_PARAMETERS);
    int count = (int) paramsNode.getProperty(PROP_NR_PARAMETERS).getLong();
    for (int idx = 0; idx < count; idx++) {
        DataNode paramNode = paramsNode.getNode(TRANS_PARAM_PREFIX + idx);
        String key = getString(paramNode, PARAM_KEY);
        String def = getString(paramNode, PARAM_DEFAULT);
        String desc = getString(paramNode, PARAM_DESC);
        transMeta.addParameterDefinition(key, def, desc);
    }
    transMeta.activateParameters();
}
Also used: KettleException (org.pentaho.di.core.exception.KettleException), StringObjectId (org.pentaho.di.repository.StringObjectId), ObjectId (org.pentaho.di.repository.ObjectId), PluginInterface (org.pentaho.di.core.plugins.PluginInterface), TransMeta (org.pentaho.di.trans.TransMeta), StepMetaInterface (org.pentaho.di.trans.step.StepMetaInterface), StepErrorMeta (org.pentaho.di.trans.step.StepErrorMeta), DataProperty (org.pentaho.platform.api.repository2.unified.data.node.DataProperty), StepPartitioningMeta (org.pentaho.di.trans.step.StepPartitioningMeta), StepMeta (org.pentaho.di.trans.step.StepMeta), DataNode (org.pentaho.platform.api.repository2.unified.data.node.DataNode), PluginRegistry (org.pentaho.di.core.plugins.PluginRegistry), RowDistributionInterface (org.pentaho.di.trans.step.RowDistributionInterface), MissingTrans (org.pentaho.di.trans.steps.missing.MissingTrans), NotePadMeta (org.pentaho.di.core.NotePadMeta), TransHopMeta (org.pentaho.di.trans.TransHopMeta), ClusterSchema (org.pentaho.di.cluster.ClusterSchema)
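The second pass over stepsNode relies on looking steps up by their ObjectId rather than by name. A small sketch of that lookup in isolation, assuming a loaded TransMeta; the id value "step-node-id" is a hypothetical placeholder:

// Resolve a step by its repository ObjectId, as the error handling pass above does.
ObjectId stepObjectId = new StringObjectId("step-node-id");
StepMeta stepMeta = StepMeta.findStep(transMeta.getSteps(), stepObjectId);
if (stepMeta != null) {
    System.out.println("Found step: " + stepMeta.getName());
}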

Example 65 with ObjectId

Use of org.pentaho.di.repository.ObjectId in project pentaho-kettle by pentaho.

In class PurRepository, the method getPartitionSchemaIDs returns the ObjectIds of all partition schemas stored in the repository, optionally including deleted ones.

@Override
public ObjectId[] getPartitionSchemaIDs(boolean includeDeleted) throws KettleException {
    try {
        List<RepositoryFile> children = getAllFilesOfType(null, RepositoryObjectType.PARTITION_SCHEMA, includeDeleted);
        List<ObjectId> ids = new ArrayList<ObjectId>();
        for (RepositoryFile file : children) {
            ids.add(new StringObjectId(file.getId().toString()));
        }
        return ids.toArray(new ObjectId[0]);
    } catch (Exception e) {
        throw new KettleException("Unable to get all partition schema IDs", e);
    }
}
Also used: KettleException (org.pentaho.di.core.exception.KettleException), StringObjectId (org.pentaho.di.repository.StringObjectId), ObjectId (org.pentaho.di.repository.ObjectId), ArrayList (java.util.ArrayList), RepositoryFile (org.pentaho.platform.api.repository2.unified.RepositoryFile), MetaStoreException (org.pentaho.metastore.api.exceptions.MetaStoreException), UnifiedRepositoryCreateFileException (org.pentaho.platform.api.repository2.unified.UnifiedRepositoryCreateFileException), UnifiedRepositoryUpdateFileException (org.pentaho.platform.api.repository2.unified.UnifiedRepositoryUpdateFileException), IdNotFoundException (org.pentaho.di.core.exception.IdNotFoundException), URISyntaxException (java.net.URISyntaxException), MetaStoreNamespaceExistsException (org.pentaho.metastore.api.exceptions.MetaStoreNamespaceExistsException), KettleFileException (org.pentaho.di.core.exception.KettleFileException), SOAPFaultException (javax.xml.ws.soap.SOAPFaultException), KettleSecurityException (org.pentaho.di.core.exception.KettleSecurityException)
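The returned IDs are opaque handles meant to be fed back into the repository. A hedged usage sketch, assuming a connected Repository and the null version label used by the TransDelegate example above to load the current revision:

// Load every partition schema by the IDs collected in getPartitionSchemaIDs.
ObjectId[] ids = repository.getPartitionSchemaIDs(false);
for (ObjectId id : ids) {
    PartitionSchema schema = repository.loadPartitionSchema(id, null);
    System.out.println("Partition schema: " + schema.getName());
}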

Aggregations

ObjectId (org.pentaho.di.repository.ObjectId): 233
KettleException (org.pentaho.di.core.exception.KettleException): 94
LongObjectId (org.pentaho.di.repository.LongObjectId): 76
StringObjectId (org.pentaho.di.repository.StringObjectId): 76
RowMetaAndData (org.pentaho.di.core.RowMetaAndData): 49
RepositoryDirectoryInterface (org.pentaho.di.repository.RepositoryDirectoryInterface): 38
Test (org.junit.Test): 37
ValueMetaString (org.pentaho.di.core.row.value.ValueMetaString): 37
ErrorDialog (org.pentaho.di.ui.core.dialog.ErrorDialog): 34
ValueMetaInteger (org.pentaho.di.core.row.value.ValueMetaInteger): 33
ArrayList (java.util.ArrayList): 30
DatabaseMeta (org.pentaho.di.core.database.DatabaseMeta): 29
MessageBox (org.eclipse.swt.widgets.MessageBox): 26
Repository (org.pentaho.di.repository.Repository): 18
RepositoryFile (org.pentaho.platform.api.repository2.unified.RepositoryFile): 18
SlaveServer (org.pentaho.di.cluster.SlaveServer): 15
KettleDatabaseException (org.pentaho.di.core.exception.KettleDatabaseException): 14
PartitionSchema (org.pentaho.di.partition.PartitionSchema): 14
Matchers.anyString (org.mockito.Matchers.anyString): 13
MetaStoreException (org.pentaho.metastore.api.exceptions.MetaStoreException): 13