
Example 1 with WorkingTimeLog

Use of net.sourceforge.processdash.log.time.WorkingTimeLog in project processdash by dtuma.

From the class MigrationToolIndiv, method cleanupTimeLog:

@MigrationTask("002 Consolidating time log data")
protected void cleanupTimeLog() throws IOException {
    changesWritten = true;
    timeLog = new WorkingTimeLog(targetDir);
}
Also used: WorkingTimeLog (net.sourceforge.processdash.log.time.WorkingTimeLog)
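The @MigrationTask annotation above carries an ordering/description string, but its definition is not shown in these results. A minimal, hypothetical re-creation of such an annotation, read back via reflection (the real processdash annotation may differ):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

public class MigrationDemo {

    // Hypothetical sketch; retained at runtime so migration tasks
    // can be discovered and ordered by their value string.
    @Retention(RetentionPolicy.RUNTIME)
    @interface MigrationTask {
        String value();
    }

    @MigrationTask("002 Consolidating time log data")
    static void cleanupTimeLog() {
        // migration work would happen here
    }

    // Looks up the annotation on cleanupTimeLog and returns its label.
    static String taskLabel() throws Exception {
        Method m = MigrationDemo.class.getDeclaredMethod("cleanupTimeLog");
        return m.getAnnotation(MigrationTask.class).value();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(taskLabel());
    }
}
```

A migration runner could collect all such annotated methods and sort them by the leading numeric prefix of the label.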

Example 2 with WorkingTimeLog

Use of net.sourceforge.processdash.log.time.WorkingTimeLog in project processdash by dtuma.

From the class DataExtractionScaffold, method init:

public void init() throws Exception {
    DashController.setDataDirectory(dataDirectory);
    String dataDirPath = dataDirectory.getAbsolutePath() + System.getProperty("file.separator");
    // load and initialize settings
    String settingsFilename = dataDirPath + InternalSettings.getSettingsFilename();
    InternalSettings.initialize(settingsFilename);
    InternalSettings.setReadOnly(true);
    InternalSettings.set(SCAFFOLD_MODE_SETTING, "true");
    InternalSettings.set("templates.disableSearchPath", "true");
    InternalSettings.set("export.disableAutoExport", "true");
    InternalSettings.set("slowNetwork", "true");
    for (Map.Entry<String, String> e : extraSettings.entrySet()) {
        InternalSettings.set(e.getKey(), e.getValue());
    }
    extraSettings = null;
    // reset the template loader search path
    TemplateLoader.resetTemplateURLs();
    // setup the defect analyzer
    DefectAnalyzer.setDataDirectory(dataDirPath);
    // possibly initialize external resource mappings
    if (useExternalResourceMappingFile)
        ExternalResourceManager.getInstance().initializeMappings(dataDirectory, ExternalResourceManager.INITIALIZATION_MODE_ARCHIVE);
    // create the data repository.
    data = new DataRepository();
    DashHierarchy templates = TemplateLoader.loadTemplates(data);
    data.setDatafileSearchURLs(TemplateLoader.getTemplateURLs());
    // open and load the user's work breakdown structure
    hierarchy = new DashHierarchy(null);
    String hierFilename = dataDirPath + Settings.getFile("stateFile");
    hierarchy.loadXML(hierFilename, templates);
    data.setNodeComparator(hierarchy);
    // create the time log
    timeLog = new WorkingTimeLog(dataDirectory);
    DashboardTimeLog.setDefault(timeLog);
    // open all the datafiles that were specified in the properties file.
    data.startInconsistency();
    openDataFiles(dataDirPath, PropertyKey.ROOT);
    data.openDatafile("", dataDirPath + "global.dat");
    // import data files
    DataImporter.setDynamic(false);
    ImportManager.init(data);
    data.finishInconsistency();
    // configure the task dependency resolver
    EVTaskDependencyResolver.init(this);
    EVTaskDependencyResolver.getInstance().setDynamic(false);
    if (createWebServer) {
        DashboardURLStreamHandlerFactory.disable();
        try {
            webServer = new WebServer();
            webServer.setDashboardContext(this);
            webServer.setData(data);
            webServer.setProps(hierarchy);
            webServer.setRoots(TemplateLoader.getTemplateURLs());
            WebServer.setOutputCharset(getWebCharset());
        } catch (IOException ioe) {
            // a failure to start the web server is silently ignored here
        }
    }
}
Also used: WebServer (net.sourceforge.processdash.net.http.WebServer), DashHierarchy (net.sourceforge.processdash.hier.DashHierarchy), DataRepository (net.sourceforge.processdash.data.repository.DataRepository), IOException (java.io.IOException), HashMap (java.util.HashMap), Map (java.util.Map), WorkingTimeLog (net.sourceforge.processdash.log.time.WorkingTimeLog)
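The init method above builds the settings-file path by appending System.getProperty("file.separator") to the directory's absolute path. A self-contained sketch of the same construction using the equivalent, more idiomatic java.io.File.separator (the "pspdash.ini" name is a placeholder; the real name comes from InternalSettings.getSettingsFilename()):

```java
import java.io.File;

public class SettingsPathDemo {

    // Builds the settings-file path the way DataExtractionScaffold.init does,
    // but with File.separator instead of System.getProperty("file.separator").
    static String settingsPath(File dataDirectory, String settingsFilename) {
        String dataDirPath = dataDirectory.getAbsolutePath() + File.separator;
        return dataDirPath + settingsFilename;
    }

    public static void main(String[] args) {
        // "pspdash.ini" is a placeholder settings filename for illustration
        System.out.println(settingsPath(new File("dashdata"), "pspdash.ini"));
    }
}
```

Using the two-argument File constructor, new File(dataDirectory, settingsFilename), would avoid manual separator handling entirely, at the cost of no longer having the trailing-separator dataDirPath string that init reuses for other files.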

Aggregations

WorkingTimeLog (net.sourceforge.processdash.log.time.WorkingTimeLog): 2 usages
IOException (java.io.IOException): 1 usage
HashMap (java.util.HashMap): 1 usage
Map (java.util.Map): 1 usage
DataRepository (net.sourceforge.processdash.data.repository.DataRepository): 1 usage
DashHierarchy (net.sourceforge.processdash.hier.DashHierarchy): 1 usage
WebServer (net.sourceforge.processdash.net.http.WebServer): 1 usage