
Example 6 with RepositoryDirectory

use of org.pentaho.di.repository.RepositoryDirectory in project pentaho-kettle by pentaho.

the class JobEntryTrans method exportResources.

/**
 * We're going to load the transformation metadata referenced here. Then we're going to give it a new filename and
 * modify that filename in this entry. The parent caller will have made a copy of it, so it should be OK to do so.
 * <p/>
 * Exports the object to a flat-file system, adding content with filename keys to a set of definitions. The supplied
 * resource naming interface allows the object to be named appropriately without having to worry about
 * implementation-specific details.
 *
 * @param space           The variable space to resolve (environment) variables with.
 * @param definitions     The map containing the filenames and content
 * @param namingInterface The resource naming interface that allows the object to be named appropriately
 * @param repository      The repository to load resources from
 * @param metaStore       the metaStore to load external metadata from
 * @return The filename for this object. (also contained in the definitions map)
 * @throws KettleException in case something goes wrong during the export
 */
@Override
public String exportResources(VariableSpace space, Map<String, ResourceDefinition> definitions, ResourceNamingInterface namingInterface, Repository repository, IMetaStore metaStore) throws KettleException {
    // Try to load the transformation from repository or file.
    // Modify this recursively too...
    // 
    // AGAIN: there is no need to clone this job entry because the caller is responsible for this.
    // 
    // First load the transformation metadata...
    // 
    copyVariablesFrom(space);
    TransMeta transMeta = getTransMeta(repository, space);
    // Also go down into the transformation and export the files there. (mapping recursively down)
    // 
    String proposedNewFilename = transMeta.exportResources(transMeta, definitions, namingInterface, repository, metaStore);
    // To get a relative path to it, we inject ${Internal.Entry.Current.Directory}
    // 
    String newFilename = "${" + Const.INTERNAL_VARIABLE_ENTRY_CURRENT_DIRECTORY + "}/" + proposedNewFilename;
    // Set the correct filename inside the XML.
    // 
    transMeta.setFilename(newFilename);
    // exports always reside in the root directory, in case we want to turn this into a file repository...
    // 
    transMeta.setRepositoryDirectory(new RepositoryDirectory());
    // export to filename ALWAYS (this allows the exported XML to be executed remotely)
    // 
    setSpecificationMethod(ObjectLocationSpecificationMethod.FILENAME);
    // change it in the job entry
    // 
    filename = newFilename;
    return proposedNewFilename;
}
Also used : RepositoryDirectory(org.pentaho.di.repository.RepositoryDirectory) TransMeta(org.pentaho.di.trans.TransMeta)
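The Javadoc above describes the general contract: the referenced transformation is renamed through the ResourceNamingInterface, collected into the definitions map, and the job entry is rewired to a ${Internal.Entry.Current.Directory} relative path. Below is a minimal sketch, not taken from the Kettle sources, of a caller that drives this export and writes the collected definitions to a flat directory. The SimpleResourceNaming class, the getFilename()/getContent() accessors on ResourceDefinition, and the /tmp/kettle-export path are assumptions for illustration; standard java.util and java.nio imports are omitted, matching the style of the examples above.

public static void exportJobEntryTrans(JobEntryTrans entry, Repository repository, IMetaStore metaStore) throws Exception {
    Map<String, ResourceDefinition> definitions = new HashMap<>();
    // Assumption: SimpleResourceNaming is available as a concrete ResourceNamingInterface implementation.
    ResourceNamingInterface naming = new SimpleResourceNaming();
    VariableSpace space = new Variables();
    // The entry renames the transformation it references and rewires its own filename to
    // ${Internal.Entry.Current.Directory}/<new name>, exactly as exportResources() above shows.
    entry.exportResources(space, definitions, naming, repository, metaStore);
    // Write every collected definition to a flat export directory.
    Path exportDir = Paths.get("/tmp/kettle-export"); // hypothetical target directory
    Files.createDirectories(exportDir);
    for (ResourceDefinition def : definitions.values()) {
        String name = new File(def.getFilename()).getName(); // strip any directory part
        // Assumption: getContent() returns the exported XML for this resource.
        Files.write(exportDir.resolve(name), def.getContent().getBytes(StandardCharsets.UTF_8));
    }
}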

Example 7 with RepositoryDirectory

use of org.pentaho.di.repository.RepositoryDirectory in project pentaho-kettle by pentaho.

the class TransMeta method exportResources.

/**
 * Exports the specified objects to a flat-file system, adding content with filename keys to a set of definitions. The
 * supplied resource naming interface allows the object to be named appropriately without having to worry about
 * implementation-specific details.
 *
 * @param space
 *          the variable space to use
 * @param definitions
 *          the map containing the filenames and content
 * @param resourceNamingInterface
 *          the resource naming interface that allows the object to be named appropriately
 * @param repository
 *          The repository to optionally load other resources from (to be converted to XML)
 * @param metaStore
 *          the metaStore in which non-kettle metadata could reside.
 *
 * @return the filename of the exported resource
 * @throws KettleException in case something goes wrong during the export
 */
@Override
public String exportResources(VariableSpace space, Map<String, ResourceDefinition> definitions, ResourceNamingInterface resourceNamingInterface, Repository repository, IMetaStore metaStore) throws KettleException {
    try {
        // Handle naming for both repository and XML-based resources...
        // 
        String baseName;
        String originalPath;
        String fullname;
        String extension = "ktr";
        if (Utils.isEmpty(getFilename())) {
            // Assume repository...
            // 
            originalPath = directory.getPath();
            baseName = getName();
            fullname = directory.getPath()
                + (directory.getPath().endsWith(RepositoryDirectory.DIRECTORY_SEPARATOR) ? "" : RepositoryDirectory.DIRECTORY_SEPARATOR)
                + getName() + "." + extension;
        } else {
            // Assume file
            // 
            FileObject fileObject = KettleVFS.getFileObject(space.environmentSubstitute(getFilename()), space);
            originalPath = fileObject.getParent().getURL().toString();
            baseName = fileObject.getName().getBaseName();
            fullname = fileObject.getURL().toString();
        }
        String exportFileName = resourceNamingInterface.nameResource(baseName, originalPath, extension, ResourceNamingInterface.FileNamingType.TRANSFORMATION);
        ResourceDefinition definition = definitions.get(exportFileName);
        if (definition == null) {
            // If we do this once, it will be plenty :-)
            // 
            TransMeta transMeta = (TransMeta) this.realClone(false);
            // Let each step of the cloned transformation export the resources it references itself.
            for (StepMeta stepMeta : transMeta.getSteps()) {
                stepMeta.exportResources(space, definitions, resourceNamingInterface, repository, metaStore);
            }
            // Change the filename, calling this sets internal variables
            // inside of the transformation.
            // 
            transMeta.setFilename(exportFileName);
            // All objects get re-located to the root folder
            // 
            transMeta.setRepositoryDirectory(new RepositoryDirectory());
            // Set a number of parameters for all the data files referenced so far...
            // 
            Map<String, String> directoryMap = resourceNamingInterface.getDirectoryMap();
            if (directoryMap != null) {
                for (String directory : directoryMap.keySet()) {
                    String parameterName = directoryMap.get(directory);
                    transMeta.addParameterDefinition(parameterName, directory, "Data file path discovered during export");
                }
            }
            // At the end, add ourselves to the map...
            // 
            String transMetaContent = transMeta.getXML();
            definition = new ResourceDefinition(exportFileName, transMetaContent);
            // Set the origin of the definition: the repository path or the original filename.
            if (Utils.isEmpty(this.getFilename())) {
                // Repository
                definition.setOrigin(fullname);
            } else {
                definition.setOrigin(this.getFilename());
            }
            definitions.put(fullname, definition);
        }
        return exportFileName;
    } catch (FileSystemException e) {
        throw new KettleException(BaseMessages.getString(PKG, "TransMeta.Exception.ErrorOpeningOrValidatingTheXMLFile", getFilename()), e);
    } catch (KettleFileException e) {
        throw new KettleException(BaseMessages.getString(PKG, "TransMeta.Exception.ErrorOpeningOrValidatingTheXMLFile", getFilename()), e);
    }
}
Also used : KettleException(org.pentaho.di.core.exception.KettleException) FileSystemException(org.apache.commons.vfs2.FileSystemException) KettleFileException(org.pentaho.di.core.exception.KettleFileException) RepositoryDirectory(org.pentaho.di.repository.RepositoryDirectory) ResourceDefinition(org.pentaho.di.resource.ResourceDefinition) FileObject(org.apache.commons.vfs2.FileObject) StepMeta(org.pentaho.di.trans.step.StepMeta)
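The if/else at the top of the method derives originalPath, baseName and fullname differently for repository-based and file-based transformations. The short plain-Java sketch below isolates the repository branch's string handling so the DIRECTORY_SEPARATOR guard is easier to read; the directory path and transformation name are made-up values.

// Plain-Java sketch of the repository-branch naming above; the literals are illustrative only.
String directoryPath = "/public/etl";                      // hypothetical repository directory
String name = "load_customers";                            // hypothetical transformation name
String extension = "ktr";
String separator = RepositoryDirectory.DIRECTORY_SEPARATOR;
String fullname = directoryPath
    + (directoryPath.endsWith(separator) ? "" : separator) // avoid doubling the separator
    + name + "." + extension;
// fullname -> "/public/etl/load_customers.ktr"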

Example 8 with RepositoryDirectory

use of org.pentaho.di.repository.RepositoryDirectory in project pentaho-kettle by pentaho.

the class JobEntryExportRepository method execute.

public Result execute(Result previousResult, int nr) {
    Result result = previousResult;
    result.setNrErrors(1);
    result.setResult(false);
    String realrepName = environmentSubstitute(repositoryname);
    String realusername = environmentSubstitute(username);
    String realpassword = Encr.decryptPasswordOptionallyEncrypted(environmentSubstitute(password));
    String realfoldername = environmentSubstitute(directoryPath);
    String realoutfilename = environmentSubstitute(targetfilename);
    if (export_type.equals(Export_All) || export_type.equals(Export_Jobs) || export_type.equals(Export_Trans) || export_type.equals(Export_One_Folder)) {
        realoutfilename = buildFilename(realoutfilename);
    }
    NrErrors = 0;
    successConditionBroken = false;
    limitErr = Const.toInt(environmentSubstitute(getNrLimit()), 10);
    try {
        file = KettleVFS.getFileObject(realoutfilename, this);
        if (file.exists()) {
            if (export_type.equals(Export_All) || export_type.equals(Export_Jobs) || export_type.equals(Export_Trans) || export_type.equals(Export_One_Folder)) {
                if (iffileexists.equals(If_FileExists_Fail)) {
                    logError(BaseMessages.getString(PKG, "JobExportRepository.Log.Failing", realoutfilename));
                    return result;
                } else if (iffileexists.equals(If_FileExists_Skip)) {
                    if (log.isDetailed()) {
                        logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.Exit", realoutfilename));
                    }
                    result.setResult(true);
                    result.setNrErrors(0);
                    return result;
                } else if (iffileexists.equals(If_FileExists_Uniquename)) {
                    String parentFolder = KettleVFS.getFilename(file.getParent());
                    String shortFilename = file.getName().getBaseName();
                    shortFilename = buildUniqueFilename(shortFilename);
                    file = KettleVFS.getFileObject(parentFolder + Const.FILE_SEPARATOR + shortFilename, this);
                    if (log.isDetailed()) {
                        logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.NewFilename", file.toString()));
                    }
                }
            } else if (export_type.equals(Export_By_Folder)) {
                if (file.getType() != FileType.FOLDER) {
                    logError(BaseMessages.getString(PKG, "JobExportRepository.Log.NotFolder", "" + file.getName()));
                    return result;
                }
            }
        } else {
            if (export_type.equals(Export_By_Folder)) {
                // create folder?
                if (log.isDetailed()) {
                    logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.FolderNotExists", "" + file.getName()));
                }
                if (!createfolder) {
                    return result;
                }
                file.createFolder();
                if (log.isDetailed()) {
                    logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.FolderCreated", file.toString()));
                }
            } else if (export_type.equals(Export_All) || export_type.equals(Export_Jobs) || export_type.equals(Export_Trans) || export_type.equals(Export_One_Folder)) {
                // create parent folder?
                if (!file.getParent().exists()) {
                    if (log.isDetailed()) {
                        logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.FolderNotExists", "" + file.getParent().toString()));
                    }
                    if (createfolder) {
                        file.getParent().createFolder();
                        if (log.isDetailed()) {
                            logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.FolderCreated", file.getParent().toString()));
                        }
                    } else {
                        return result;
                    }
                }
            }
        }
        realoutfilename = KettleVFS.getFilename(this.file);
        // connect to repository
        connectRep(log, realrepName, realusername, realpassword);
        IRepositoryExporter exporter = repository.getExporter();
        if (export_type.equals(Export_All)) {
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.StartingExportAllRep", realoutfilename));
            }
            exporter.exportAllObjects(null, realoutfilename, null, "all");
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.EndExportAllRep", realoutfilename));
            }
            if (add_result_filesname) {
                addFileToResultFilenames(realoutfilename, log, result, parentJob);
            }
        } else if (export_type.equals(Export_Jobs)) {
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.StartingExportJobsRep", realoutfilename));
            }
            exporter.exportAllObjects(null, realoutfilename, null, "jobs");
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.EndExportJobsRep", realoutfilename));
            }
            if (add_result_filesname) {
                addFileToResultFilenames(realoutfilename, log, result, parentJob);
            }
        } else if (export_type.equals(Export_Trans)) {
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.StartingExportTransRep", realoutfilename));
            }
            exporter.exportAllObjects(null, realoutfilename, null, "trans");
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.EndExportTransRep", realoutfilename));
            }
            if (add_result_filesname) {
                addFileToResultFilenames(realoutfilename, log, result, parentJob);
            }
        } else if (export_type.equals(Export_One_Folder)) {
            RepositoryDirectoryInterface directory = new RepositoryDirectory();
            directory = repository.findDirectory(realfoldername);
            if (directory != null) {
                if (log.isDetailed()) {
                    logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.ExpAllFolderRep", directoryPath, realoutfilename));
                }
                exporter.exportAllObjects(null, realoutfilename, directory, "all");
                if (log.isDetailed()) {
                    logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.EndExpAllFolderRep", directoryPath, realoutfilename));
                }
                if (add_result_filesname) {
                    addFileToResultFilenames(realoutfilename, log, result, parentJob);
                }
            } else {
                logError(BaseMessages.getString(PKG, "JobExportRepository.Error.CanNotFindFolderInRep", realfoldername, realrepName));
                return result;
            }
        } else if (export_type.equals(Export_By_Folder)) {
            // User must give a destination folder..
            RepositoryDirectoryInterface directory = new RepositoryDirectory();
            directory = this.repository.loadRepositoryDirectoryTree().findRoot();
            // Loop over all the directory id's
            ObjectId[] dirids = directory.getDirectoryIDs();
            if (log.isDetailed()) {
                logDetailed(BaseMessages.getString(PKG, "JobExportRepository.Log.TotalFolders", "" + dirids.length));
            }
            for (int d = 0; d < dirids.length && !parentJob.isStopped(); d++) {
                // Success condition broken?
                if (successConditionBroken) {
                    logError(BaseMessages.getString(PKG, "JobExportRepository.Error.SuccessConditionbroken", "" + NrErrors));
                    throw new Exception(BaseMessages.getString(PKG, "JobExportRepository.Error.SuccessConditionbroken", "" + NrErrors));
                }
                RepositoryDirectoryInterface repdir = directory.findDirectory(dirids[d]);
                if (!processOneFolder(parentJob, result, log, repdir, realoutfilename, d, dirids.length)) {
                    // updateErrors
                    updateErrors();
                }
            }
        // end for
        }
    } catch (Exception e) {
        updateErrors();
        logError(BaseMessages.getString(PKG, "JobExportRepository.UnExpectedError", e.toString()));
        logError("Stack trace: " + Const.CR + Const.getStackTracker(e));
    } finally {
        if (this.repository != null) {
            this.repository.disconnect();
            this.repository = null;
        }
        if (this.repositoryMeta != null) {
            this.repositoryMeta = null;
        }
        if (this.repsinfo != null) {
            this.repsinfo.clear();
            this.repsinfo = null;
        }
        if (this.file != null) {
            try {
                this.file.close();
                this.file = null;
            } catch (Exception e) {
            // Ignore close errors
            }
        }
    }
    // Success Condition
    result.setNrErrors(NrErrors);
    if (getSuccessStatus()) {
        result.setResult(true);
    }
    return result;
}
Also used : RepositoryDirectoryInterface(org.pentaho.di.repository.RepositoryDirectoryInterface) RepositoryDirectory(org.pentaho.di.repository.RepositoryDirectory) ObjectId(org.pentaho.di.repository.ObjectId) IRepositoryExporter(org.pentaho.di.repository.IRepositoryExporter) KettleException(org.pentaho.di.core.exception.KettleException) KettleDatabaseException(org.pentaho.di.core.exception.KettleDatabaseException) KettleXMLException(org.pentaho.di.core.exception.KettleXMLException) Result(org.pentaho.di.core.Result)
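Most of execute() above is bookkeeping around the target file; the actual work is done by the IRepositoryExporter obtained from the repository. The sketch below reduces the Export_One_Folder branch to its essentials. It assumes an already-connected Repository instance named repository; the folder path and output file are made-up values.

// Minimal sketch of the Export_One_Folder branch above, assuming 'repository' is already connected.
IRepositoryExporter exporter = repository.getExporter();
RepositoryDirectoryInterface folder = repository.findDirectory("/public/etl"); // hypothetical folder
if (folder == null) {
    throw new KettleException("Folder not found in the repository");
}
// "all" exports both jobs and transformations below the folder, as in the job entry above.
exporter.exportAllObjects(null, "/tmp/repo-export.xml", folder, "all");        // hypothetical output file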

Example 9 with RepositoryDirectory

use of org.pentaho.di.repository.RepositoryDirectory in project pentaho-kettle by pentaho.

the class JobEntryJob method exportResources.

/**
 * Exports the object to a flat-file system, adding content with filename keys to a set of definitions. The supplied
 * resource naming interface allows the object to be named appropriately without having to worry about
 * implementation-specific details.
 *
 * @param space
 *          The variable space to resolve (environment) variables with.
 * @param definitions
 *          The map containing the filenames and content
 * @param namingInterface
 *          The resource naming interface that allows the object to be named appropriately
 * @param repository
 *          The repository to load resources from
 * @param metaStore
 *          the metaStore to load external metadata from
 *
 * @return The filename for this object. (also contained in the definitions map)
 * @throws KettleException
 *           in case something goes wrong during the export
 */
@Override
public String exportResources(VariableSpace space, Map<String, ResourceDefinition> definitions, ResourceNamingInterface namingInterface, Repository repository, IMetaStore metaStore) throws KettleException {
    // Try to load the job from repository or file.
    // Modify this recursively too...
    // 
    // AGAIN: there is no need to clone this job entry because the caller is
    // responsible for this.
    // 
    // First load the job meta data...
    // 
    // To make sure variables are available.
    copyVariablesFrom(space);
    JobMeta jobMeta = getJobMeta(repository, metaStore, space);
    // Also go down into the job and export the files there. (going down
    // recursively)
    // 
    String proposedNewFilename = jobMeta.exportResources(jobMeta, definitions, namingInterface, repository, metaStore);
    // To get a relative path to it, we inject
    // ${Internal.Entry.Current.Directory}
    // 
    String newFilename = "${" + Const.INTERNAL_VARIABLE_ENTRY_CURRENT_DIRECTORY + "}/" + proposedNewFilename;
    // Set the filename in the job
    // 
    jobMeta.setFilename(newFilename);
    // exports always reside in the root directory, in case we want to turn this
    // into a file repository...
    // 
    jobMeta.setRepositoryDirectory(new RepositoryDirectory());
    // export to filename ALWAYS (this allows the exported XML to be executed remotely)
    // 
    setSpecificationMethod(ObjectLocationSpecificationMethod.FILENAME);
    // change it in the job entry
    // 
    filename = newFilename;
    return proposedNewFilename;
}
Also used : JobMeta(org.pentaho.di.job.JobMeta) RepositoryDirectory(org.pentaho.di.repository.RepositoryDirectory)
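Both JobEntryTrans and JobEntryJob rewrite their filename to ${Internal.Entry.Current.Directory}/<exported name>, which keeps the exported XML relocatable. The sketch below shows how such a filename resolves once the internal variable is defined at runtime; the directory value is a made-up example.

// Sketch: resolving the rewritten filename once the internal variable is set.
VariableSpace vars = new Variables();
vars.setVariable(Const.INTERNAL_VARIABLE_ENTRY_CURRENT_DIRECTORY, "/tmp/kettle-export"); // illustrative value
String newFilename = "${" + Const.INTERNAL_VARIABLE_ENTRY_CURRENT_DIRECTORY + "}/load_customers.ktr";
String resolved = vars.environmentSubstitute(newFilename);
// resolved -> "/tmp/kettle-export/load_customers.ktr"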

Example 10 with RepositoryDirectory

use of org.pentaho.di.repository.RepositoryDirectory in project pentaho-kettle by pentaho.

the class KettleDatabaseRepositoryDirectoryDelegate method loadPathToRoot.

public RepositoryDirectoryInterface loadPathToRoot(ObjectId id_directory) throws KettleException {
    List<RepositoryDirectory> path = new ArrayList<>();
    ObjectId directoryId = id_directory;
    RowMetaAndData directoryRow = getDirectory(directoryId);
    Long parentId = directoryRow.getInteger(1);
    // Walk up the tree: follow the parent id until we reach a directory that has no parent (the root).
    while (parentId != null && parentId >= 0) {
        RepositoryDirectory directory = new RepositoryDirectory();
        // Name of the directory
        directory.setName(directoryRow.getString(2, null));
        directory.setObjectId(directoryId);
        path.add(directory);
        // System.out.println( "+ dir '" + directory.getName() + "'" );
        directoryId = new LongObjectId(parentId);
        directoryRow = getDirectory(directoryId);
        parentId = directoryRow.getInteger(KettleDatabaseRepository.FIELD_DIRECTORY_ID_DIRECTORY_PARENT);
    }
    RepositoryDirectory root = new RepositoryDirectory();
    root.setObjectId(new LongObjectId(0));
    path.add(root);
    // Link each directory in the path to its parent, from the leaf up to the root.
    for (int i = 0; i < path.size() - 1; i++) {
        RepositoryDirectory item = path.get(i);
        RepositoryDirectory parent = path.get(i + 1);
        item.setParent(parent);
        parent.addSubdirectory(item);
    }
    RepositoryDirectory repositoryDirectory = path.get(0);
    return repositoryDirectory;
}
Also used : RepositoryDirectory(org.pentaho.di.repository.RepositoryDirectory) RowMetaAndData(org.pentaho.di.core.RowMetaAndData) LongObjectId(org.pentaho.di.repository.LongObjectId) ObjectId(org.pentaho.di.repository.ObjectId) ArrayList(java.util.ArrayList) LongObjectId(org.pentaho.di.repository.LongObjectId)
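loadPathToRoot returns the requested directory with its parent chain already linked up to the root, so a caller can read the full path or walk upwards directly. A small usage sketch follows, assuming a KettleDatabaseRepositoryDirectoryDelegate instance named delegate and a valid directory ObjectId.

// Usage sketch for loadPathToRoot; 'delegate' and 'id_directory' are assumed to exist.
RepositoryDirectoryInterface leaf = delegate.loadPathToRoot(id_directory);
// getPath() walks the parent chain built above, yielding e.g. "/public/etl".
System.out.println("Full path: " + leaf.getPath());
// Walking upwards by hand ends at the root directory created with object id 0.
for (RepositoryDirectoryInterface dir = leaf; dir != null; dir = dir.getParent()) {
    System.out.println("id=" + dir.getObjectId() + ", name=" + dir.getName());
}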

Aggregations

RepositoryDirectory (org.pentaho.di.repository.RepositoryDirectory): 57
KettleException (org.pentaho.di.core.exception.KettleException): 23
RepositoryDirectoryInterface (org.pentaho.di.repository.RepositoryDirectoryInterface): 15
Test (org.junit.Test): 14
TransMeta (org.pentaho.di.trans.TransMeta): 11
LongObjectId (org.pentaho.di.repository.LongObjectId): 10
ObjectId (org.pentaho.di.repository.ObjectId): 9
KettleFileException (org.pentaho.di.core.exception.KettleFileException): 8
ValueMetaString (org.pentaho.di.core.row.value.ValueMetaString): 8
Repository (org.pentaho.di.repository.Repository): 8
RepositoryElementMetaInterface (org.pentaho.di.repository.RepositoryElementMetaInterface): 8
ArrayList (java.util.ArrayList): 7
StringObjectId (org.pentaho.di.repository.StringObjectId): 7
FileSystemException (org.apache.commons.vfs2.FileSystemException): 6
KettleXMLException (org.pentaho.di.core.exception.KettleXMLException): 6
ErrorDialog (org.pentaho.di.ui.core.dialog.ErrorDialog): 6
MetaStoreException (org.pentaho.metastore.api.exceptions.MetaStoreException): 6
DatabaseMeta (org.pentaho.di.core.database.DatabaseMeta): 5
JobMeta (org.pentaho.di.job.JobMeta): 5
TreeItem (org.eclipse.swt.widgets.TreeItem): 4