
Example 21 with StudyBean

Use of org.akaza.openclinica.bean.managestudy.StudyBean in project OpenClinica by OpenClinica.

The class StudyDAO, method findAllByUser:

public Collection findAllByUser(String username) {
    this.unsetTypeExpected();
    this.setTypesExpected();
    HashMap variables = new HashMap();
    variables.put(new Integer(1), username);
    ArrayList alist = this.select(digester.getQuery("findAllByUser"), variables);
    ArrayList al = new ArrayList();
    Iterator it = alist.iterator();
    while (it.hasNext()) {
        StudyBean eb = (StudyBean) this.getEntityFromHashMap((HashMap) it.next());
        al.add(eb);
    }
    return al;
}
Also used : HashMap(java.util.HashMap) StudyBean(org.akaza.openclinica.bean.managestudy.StudyBean) ArrayList(java.util.ArrayList) Iterator(java.util.Iterator)
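
A minimal caller sketch (not part of the OpenClinica source): it assumes a configured javax.sql.DataSource, the same dependency the other examples on this page pass to the DAO constructors, with StudyDAO and StudyBean imported as in the Also used line above.

public static void printStudiesForUser(javax.sql.DataSource dataSource, String username) {
    StudyDAO studyDao = new StudyDAO(dataSource);
    // findAllByUser returns a raw Collection of StudyBean, as shown above
    java.util.Collection studies = studyDao.findAllByUser(username);
    for (Object o : studies) {
        StudyBean study = (StudyBean) o;
        System.out.println(study.getId() + " - " + study.getName());
    }
}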

Example 22 with StudyBean

Use of org.akaza.openclinica.bean.managestudy.StudyBean in project OpenClinica by OpenClinica.

The class RandomizeActionProcessor, method mayProceed:

private boolean mayProceed(String studyOid) throws Exception {
    boolean accessPermission = false;
    StudyBean siteStudy = getStudy(studyOid);
    StudyBean study = getParentStudy(studyOid);
    StudyParameterValueDAO spvdao = new StudyParameterValueDAO(ds);
    StudyParameterValueBean pStatus = spvdao.findByHandleAndStudy(study.getId(), "randomization");
    randomizationRegistrar = new RandomizationRegistrar();
    SeRandomizationDTO seRandomizationDTO = randomizationRegistrar.getCachedRandomizationDTOObject(study.getOid().toString(), false);
    String randomizationStatusFromOCUI = seRandomizationDTO.getStatus();
    // enabled , disabled
    String randomizationStatusFromOC = pStatus.getValue().toString();
    // available , pending , frozen , locked
    String studyStatus = study.getStatus().getName().toString();
    // available , pending , frozen , locked
    String siteStatus = siteStudy.getStatus().getName().toString();
    System.out.println("randomizationStatusFromOCUI: " + randomizationStatusFromOCUI + "  randomizationStatusFromOC: " + randomizationStatusFromOC + "   studyStatus: " + studyStatus + "   siteStatus: " + siteStatus);
    logger.info("randomizationStatusFromOCUI: " + randomizationStatusFromOCUI + "  randomizationStatusFromOC: " + randomizationStatusFromOC + "   studyStatus: " + studyStatus + "   siteStatus: " + siteStatus);
    if (randomizationStatusFromOC.equalsIgnoreCase("enabled") && studyStatus.equalsIgnoreCase("available") && siteStatus.equalsIgnoreCase("available") && randomizationStatusFromOCUI.equalsIgnoreCase("ACTIVE")) {
        accessPermission = true;
    }
    return accessPermission;
}
Also used : StudyParameterValueBean(org.akaza.openclinica.bean.service.StudyParameterValueBean) StudyBean(org.akaza.openclinica.bean.managestudy.StudyBean) RandomizationRegistrar(org.akaza.openclinica.service.pmanage.RandomizationRegistrar) StudyParameterValueDAO(org.akaza.openclinica.dao.service.StudyParameterValueDAO) SeRandomizationDTO(org.akaza.openclinica.service.pmanage.SeRandomizationDTO)
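
The gate above requires all four statuses to line up. A sketch of that check in isolation; the method name isRandomizationAllowed is hypothetical and not part of RandomizeActionProcessor.

private static boolean isRandomizationAllowed(String randomizationStatusFromOC, String studyStatus, String siteStatus, String randomizationStatusFromOCUI) {
    // randomization must be enabled in OpenClinica, the study and site must both
    // be available, and the OCUI randomization status must be ACTIVE
    return "enabled".equalsIgnoreCase(randomizationStatusFromOC)
            && "available".equalsIgnoreCase(studyStatus)
            && "available".equalsIgnoreCase(siteStatus)
            && "ACTIVE".equalsIgnoreCase(randomizationStatusFromOCUI);
}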

Example 23 with StudyBean

Use of org.akaza.openclinica.bean.managestudy.StudyBean in project OpenClinica by OpenClinica.

The class RandomizeActionProcessor, method getStudy:

private StudyBean getStudy(String oid) {
    sdao = new StudyDAO(ds);
    StudyBean studyBean = (StudyBean) sdao.findByOid(oid);
    return studyBean;
}
Also used : StudyBean(org.akaza.openclinica.bean.managestudy.StudyBean) StudyDAO(org.akaza.openclinica.dao.managestudy.StudyDAO)
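
A minimal usage sketch, assuming a configured javax.sql.DataSource; the OID "S_EXAMPLE" and the helper name printParentStudy are placeholders.

private static void printParentStudy(javax.sql.DataSource dataSource) {
    StudyDAO studyDao = new StudyDAO(dataSource);
    // "S_EXAMPLE" is a placeholder study OID
    StudyBean site = (StudyBean) studyDao.findByOid("S_EXAMPLE");
    // a site carries a positive parent study id; the parent is then fetched by PK,
    // the same pattern executeInternal uses in the next example
    if (site != null && site.getParentStudyId() > 0) {
        StudyBean parent = (StudyBean) studyDao.findByPK(site.getParentStudyId());
        System.out.println("site " + site.getName() + " belongs to " + parent.getName());
    }
}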

Example 24 with StudyBean

Use of org.akaza.openclinica.bean.managestudy.StudyBean in project OpenClinica by OpenClinica.

The class XsltTransformJob, method executeInternal:

@Override
protected void executeInternal(JobExecutionContext context) {
    logger.info("Job " + context.getJobDetail().getDescription() + " started.");
    initDependencies(context.getScheduler());
    // need to generate a Locale for emailing users with i18n
    // TODO make dynamic?
    Locale locale = new Locale("en-US");
    ResourceBundleProvider.updateLocale(locale);
    ResourceBundle pageMessages = ResourceBundleProvider.getPageMessagesBundle();
    List<File> markForDelete = new LinkedList<File>();
    Boolean zipped = true;
    Boolean deleteOld = true;
    Boolean exceptions = false;
    JobDataMap dataMap = context.getMergedJobDataMap();
    String localeStr = dataMap.getString(LOCALE);
    String[] doNotDeleteUntilExtract = new String[4];
    int cnt = dataMap.getInt("count");
    DatasetBean datasetBean = null;
    if (localeStr != null) {
        locale = new Locale(localeStr);
        ResourceBundleProvider.updateLocale(locale);
        pageMessages = ResourceBundleProvider.getPageMessagesBundle();
    }
    // get the file information from the job
    String alertEmail = dataMap.getString(EMAIL);
    java.io.InputStream in = null;
    FileOutputStream endFileStream = null;
    UserAccountBean userBean = null;
    try {
        // init all fields from the data map
        int userAccountId = dataMap.getInt(USER_ID);
        int studyId = dataMap.getInt(STUDY_ID);
        String outputPath = dataMap.getString(POST_FILE_PATH);
        // get all user info, generate xml
        logger.debug("found output path: " + outputPath);
        String generalFileDir = dataMap.getString(XML_FILE_PATH);
        int dsId = dataMap.getInt(DATASET_ID);
        // JN: Change from earlier versions, cannot get static reference as
        // static references don't work. Reason being for example there could be
        // datasetId as a variable which is different for each dataset and
        // that needs to be loaded dynamically
        ExtractPropertyBean epBean = (ExtractPropertyBean) dataMap.get(EP_BEAN);
        File doNotDelDir = new File(generalFileDir);
        if (doNotDelDir.isDirectory()) {
            doNotDeleteUntilExtract = doNotDelDir.list();
        }
        zipped = epBean.getZipFormat();
        deleteOld = epBean.getDeleteOld();
        long sysTimeBegin = System.currentTimeMillis();
        userBean = (UserAccountBean) userAccountDao.findByPK(userAccountId);
        StudyBean currentStudy = (StudyBean) studyDao.findByPK(studyId);
        StudyBean parentStudy = (StudyBean) studyDao.findByPK(currentStudy.getParentStudyId());
        String successMsg = epBean.getSuccessMessage();
        String failureMsg = epBean.getFailureMessage();
        final long start = System.currentTimeMillis();
        datasetBean = (DatasetBean) datasetDao.findByPK(dsId);
        ExtractBean eb = generateFileService.generateExtractBean(datasetBean, currentStudy, parentStudy);
        // generate file directory for file service
        datasetBean.setName(datasetBean.getName().replaceAll(" ", "_"));
        logger.debug("--> job starting: ");
        HashMap<String, Integer> answerMap = odmFileCreation.createODMFile(epBean.getFormat(), sysTimeBegin, generalFileDir, datasetBean, currentStudy, "", eb, currentStudy.getId(), currentStudy.getParentStudyId(), "99", (Boolean) dataMap.get(ZIPPED), false, (Boolean) dataMap.get(DELETE_OLD), epBean.getOdmType(), userBean);
        // won't save a record of the XML to db
        // won't be a zipped file, so that we can submit it for
        // transformation
        // this will have to be toggled by the export data format? no, the
        // export file will have to be zipped/not zipped
        String ODMXMLFileName = "";
        int fId = 0;
        Iterator<Entry<String, Integer>> it = answerMap.entrySet().iterator();
        while (it.hasNext()) {
            JobTerminationMonitor.check();
            Entry<String, Integer> entry = it.next();
            String key = entry.getKey();
            Integer value = entry.getValue();
            // JN: Since there is a logic to delete all the intermittent
            // files, this file could be a zip file.
            ODMXMLFileName = key;
            Integer fileID = value;
            fId = fileID.intValue();
            logger.debug("found " + fId + " and " + ODMXMLFileName);
        }
        logger.info("Finished ODM generation of job " + context.getJobDetail().getDescription());
        // create dirs
        File output = new File(outputPath);
        if (!output.isDirectory()) {
            output.mkdirs();
        }
        int numXLS = epBean.getFileName().length;
        int fileCntr = 0;
        String xmlFilePath = new File(generalFileDir + ODMXMLFileName).toURI().toURL().toExternalForm();
        String endFile = null;
        File oldFilesPath = new File(generalFileDir);
        while (fileCntr < numXLS) {
            JobTerminationMonitor.check();
            String xsltPath = dataMap.getString(XSLT_PATH) + File.separator + epBean.getFileName()[fileCntr];
            in = new java.io.FileInputStream(xsltPath);
            Transformer transformer = transformerFactory.newTransformer(new StreamSource(in));
            endFile = outputPath + File.separator + epBean.getExportFileName()[fileCntr];
            endFileStream = new FileOutputStream(endFile);
            transformer.transform(new StreamSource(xmlFilePath), new StreamResult(endFileStream));
            // JN...CLOSE THE STREAM...HMMMM
            in.close();
            endFileStream.close();
            fileCntr++;
            JobTerminationMonitor.check();
        }
        if (oldFilesPath.isDirectory()) {
            markForDelete = Arrays.asList(oldFilesPath.listFiles());
        // logic to prevent deleting the file being created.
        }
        final double done = setFormat(new Double(System.currentTimeMillis() - start) / 1000);
        logger.info("--> job completed in " + done + " ms");
        // run post processing
        ProcessingFunction function = epBean.getPostProcessing();
        String subject = "";
        String jobName = dataMap.getString(XsltTriggerService.JOB_NAME);
        StringBuffer emailBuffer = new StringBuffer("");
        emailBuffer.append("<p>" + pageMessages.getString("email_header_1") + " " + EmailEngine.getAdminEmail() + " " + pageMessages.getString("email_header_2") + " Job Execution " + pageMessages.getString("email_header_3") + "</p>");
        emailBuffer.append("<P>Dataset: " + datasetBean.getName() + "</P>");
        emailBuffer.append("<P>Study: " + currentStudy.getName() + "</P>");
        if (function != null && function.getClass().equals(org.akaza.openclinica.bean.service.SqlProcessingFunction.class)) {
            String dbUrl = ((org.akaza.openclinica.bean.service.SqlProcessingFunction) function).getDatabaseUrl();
            int lastIndex = dbUrl.lastIndexOf('/');
            String schemaName = dbUrl.substring(lastIndex);
            int HostIndex = dbUrl.substring(0, lastIndex).indexOf("//");
            String Host = dbUrl.substring(HostIndex, lastIndex);
            emailBuffer.append("<P>Database: " + ((org.akaza.openclinica.bean.service.SqlProcessingFunction) function).getDatabaseType() + "</P>");
            emailBuffer.append("<P>Schema: " + schemaName.replace("/", "") + "</P>");
            emailBuffer.append("<P>Host: " + Host.replace("//", "") + "</P>");
        }
        emailBuffer.append("<p>" + pageMessages.getString("html_email_body_1") + datasetBean.getName() + pageMessages.getString("html_email_body_2_2") + "</p>");
        if (function != null) {
            function.setTransformFileName(outputPath + File.separator + dataMap.getString(POST_FILE_NAME));
            function.setODMXMLFileName(endFile);
            function.setXslFileName(dataMap.getString(XSL_FILE_PATH));
            function.setDeleteOld((Boolean) dataMap.get(POST_PROC_DELETE_OLD));
            function.setZip((Boolean) dataMap.get(POST_PROC_ZIP));
            function.setLocation(dataMap.getString(POST_PROC_LOCATION));
            function.setExportFileName(dataMap.getString(POST_PROC_EXPORT_NAME));
            File[] oldFiles = getOldFiles(outputPath, dataMap.getString(POST_PROC_LOCATION));
            function.setOldFiles(oldFiles);
            File[] intermediateFiles = getInterFiles(dataMap.getString(POST_FILE_PATH));
            ProcessingResultType message = function.run();
            // Delete these files only in case when there is no failure
            if (message.getCode().intValue() != 2) {
                deleteOldFiles(intermediateFiles);
            }
            final long done2 = System.currentTimeMillis() - start;
            logger.info("--> postprocessing completed in " + done2 + " ms, found result type " + message.getCode());
            logger.info("--> postprocessing completed in " + done2 + " ms, found result type " + message.getCode());
            if (!function.getClass().equals(org.akaza.openclinica.bean.service.SqlProcessingFunction.class)) {
                String archivedFile = dataMap.getString(POST_FILE_NAME) + "." + function.getFileType();
                // download the zip file
                if (function.isZip()) {
                    archivedFile = archivedFile + ".zip";
                }
                // post processing as well.
                if (function.getClass().equals(org.akaza.openclinica.bean.service.PdfProcessingFunction.class)) {
                    archivedFile = function.getArchivedFileName();
                }
                ArchivedDatasetFileBean fbFinal = generateFileRecord(archivedFile, outputPath, datasetBean, done, new File(outputPath + File.separator + archivedFile).length(), ExportFormatBean.PDFFILE, userAccountId);
                if (successMsg.contains("$linkURL")) {
                    successMsg = successMsg.replace("$linkURL", "<a href=\"" + CoreResources.getField("sysURL.base") + "AccessFile?fileId=" + fbFinal.getId() + "\">" + CoreResources.getField("sysURL.base") + "AccessFile?fileId=" + fbFinal.getId() + " </a>");
                }
                emailBuffer.append("<p>" + successMsg + "</p>");
                logger.debug("System time begining.." + sysTimeBegin);
                logger.debug("System time end.." + System.currentTimeMillis());
                double sysTimeEnd = setFormat((System.currentTimeMillis() - sysTimeBegin) / 1000);
                logger.debug("difference" + sysTimeEnd);
                if (fbFinal != null) {
                    fbFinal.setFileSize((int) bytesToKilo(new File(archivedFile).length()));
                    fbFinal.setRunTime(sysTimeEnd);
                }
            }
            // otherwise don't do it
            if (message.getCode().intValue() == 1) {
                if (jobName != null) {
                    subject = "Success: " + jobName;
                } else {
                    subject = "Success: " + datasetBean.getName();
                }
            } else if (message.getCode().intValue() == 2) {
                if (jobName != null) {
                    subject = "Failure: " + jobName;
                } else {
                    subject = "Failure: " + datasetBean.getName();
                }
                if (failureMsg != null && !failureMsg.isEmpty()) {
                    emailBuffer.append(failureMsg);
                }
                emailBuffer.append("<P>").append(message.getDescription());
                postErrorMessage(message.getDescription(), context);
            } else if (message.getCode().intValue() == 3) {
                if (jobName != null) {
                    subject = "Update: " + jobName;
                } else {
                    subject = "Update: " + datasetBean.getName();
                }
            }
        } else {
            // extract ran but no post-processing - we send an email with
            // success and url to link to
            // generate archived dataset file bean here, and use the id to
            // build the URL
            String archivedFilename = dataMap.getString(POST_FILE_NAME);
            // the zip file
            if (zipped) {
                archivedFilename = dataMap.getString(POST_FILE_NAME) + ".zip";
            }
            // delete old files now
            List<File> intermediateFiles = generateFileService.getOldFiles();
            String[] dontDelFiles = epBean.getDoNotDelFiles();
            //JN: The following is the code for zipping up the files, in case of more than one xsl being provided.
            if (dontDelFiles.length > 1 && zipped) {
                logger.debug("count =====" + cnt + "dontDelFiles length==---" + dontDelFiles.length);
                logger.debug("Entering this?" + cnt + "dontDelFiles" + dontDelFiles);
                String path = outputPath + File.separator;
                logger.debug("path = " + path);
                logger.debug("zipName?? = " + epBean.getZipName());
                String zipName = epBean.getZipName() == null || epBean.getZipName().isEmpty() ? endFile + ".zip" : path + epBean.getZipName() + ".zip";
                archivedFilename = new File(zipName).getName();
                zipAll(path, epBean.getDoNotDelFiles(), zipName);
                String[] tempArray = { archivedFilename };
                dontDelFiles = tempArray;
                endFile = archivedFilename;
            } else if (zipped) {
                markForDelete = zipxmls(markForDelete, endFile);
                endFile = endFile + ".zip";
                String[] temp = new String[dontDelFiles.length];
                int i = 0;
                while (i < dontDelFiles.length) {
                    temp[i] = dontDelFiles[i] + ".zip";
                    i++;
                }
                dontDelFiles = temp;
                // Actually deleting all the xml files which are produced
                // since its zipped
                FilenameFilter xmlFilter = new XMLFileFilter();
                File tempFile = new File(generalFileDir);
                deleteOldFiles(tempFile.listFiles(xmlFilter));
            }
            ArchivedDatasetFileBean fbFinal = generateFileRecord(archivedFilename, outputPath, datasetBean, done, new File(outputPath + File.separator + archivedFilename).length(), ExportFormatBean.TXTFILE, userAccountId);
            if (jobName != null) {
                subject = "Job Ran: " + jobName;
            } else {
                subject = "Job Ran: " + datasetBean.getName();
            }
            if (successMsg == null || successMsg.isEmpty()) {
                logger.info("email buffer??" + emailBuffer);
            } else {
                if (successMsg.contains("$linkURL")) {
                    successMsg = successMsg.replace("$linkURL", "<a href=\"" + CoreResources.getField("sysURL.base") + "AccessFile?fileId=" + fbFinal.getId() + "\">" + CoreResources.getField("sysURL.base") + "AccessFile?fileId=" + fbFinal.getId() + " </a>");
                }
                emailBuffer.append("<p>" + successMsg + "</p>");
            }
            if (deleteOld) {
                deleteIntermFiles(intermediateFiles, endFile, dontDelFiles);
                deleteIntermFiles(markForDelete, endFile, dontDelFiles);
            }
        }
        // email the message to the user
        emailBuffer.append("<p>" + pageMessages.getString("html_email_body_5") + "</p>");
        try {
            // @pgawade 19-April-2011 Log the event into audit_event table
            if (null != dataMap.get("job_type") && ((String) dataMap.get("job_type")).equalsIgnoreCase("exportJob")) {
                String extractName = (String) dataMap.get(XsltTriggerService.JOB_NAME);
                TriggerBean triggerBean = new TriggerBean();
                triggerBean.setDataset(datasetBean);
                triggerBean.setUserAccount(userBean);
                triggerBean.setFullName(extractName);
                String actionMsg = "You may access the " + (String) dataMap.get(XsltTriggerService.EXPORT_FORMAT) + " file by changing your study/site to " + currentStudy.getName() + " and selecting the Export Data icon for " + datasetBean.getName() + " dataset on the View Datasets page.";
                auditEventDAO.createRowForExtractDataJobSuccess(triggerBean, actionMsg);
            }
            mailSender.sendEmail(alertEmail, EmailEngine.getAdminEmail(), subject, emailBuffer.toString(), true);
        } catch (OpenClinicaSystemException ose) {
            // Do Nothing, In the future we might want to have an email
            // status added to system.
            logger.info("exception sending mail: " + ose.getMessage());
            logger.error("exception sending mail: " + ose.getMessage());
        }
        logger.info("just sent email to " + alertEmail + ", from " + EmailEngine.getAdminEmail());
        if (successMsg == null) {
            successMsg = " ";
        }
        postSuccessMessage(successMsg, context);
    } catch (JobInterruptedException e) {
        logger.info("Job was cancelled by the user");
        exceptions = true;
    } catch (TransformerConfigurationException e) {
        sendErrorEmail(e.getMessage(), context, alertEmail);
        postErrorMessage(e.getMessage(), context);
        logger.error("Error executing extract", e);
        exceptions = true;
    } catch (FileNotFoundException e) {
        sendErrorEmail(e.getMessage(), context, alertEmail);
        postErrorMessage(e.getMessage(), context);
        logger.error("Error executing extract", e);
        exceptions = true;
    } catch (TransformerFactoryConfigurationError e) {
        sendErrorEmail(e.getMessage(), context, alertEmail);
        postErrorMessage(e.getMessage(), context);
        logger.error("Error executing extract", e);
        exceptions = true;
    } catch (TransformerException e) {
        sendErrorEmail(e.getMessage(), context, alertEmail);
        postErrorMessage(e.getMessage(), context);
        logger.error("Error executing extract", e);
        exceptions = true;
    } catch (Exception ee) {
        sendErrorEmail(ee.getMessage(), context, alertEmail);
        postErrorMessage(ee.getMessage(), context);
        logger.error("Error executing extract", ee);
        exceptions = true;
        if (null != dataMap.get("job_type") && ((String) dataMap.get("job_type")).equalsIgnoreCase("exportJob")) {
            TriggerBean triggerBean = new TriggerBean();
            triggerBean.setUserAccount(userBean);
            triggerBean.setFullName((String) dataMap.get(XsltTriggerService.JOB_NAME));
            auditEventDAO.createRowForExtractDataJobFailure(triggerBean);
        }
    } finally {
        if (in != null)
            try {
                in.close();
            } catch (IOException e) {
                logger.error("Error executing extract", e);
            }
        if (endFileStream != null)
            try {
                endFileStream.close();
            } catch (IOException e) {
                logger.error("Error executing extract", e);
            }
        if (exceptions) {
            logger.debug("EXCEPTIONS... EVEN TEHN DELETING OFF OLD FILES");
            String generalFileDir = dataMap.getString(XML_FILE_PATH);
            File oldFilesPath = new File(generalFileDir);
            if (oldFilesPath.isDirectory()) {
                markForDelete = Arrays.asList(oldFilesPath.listFiles());
            }
            logger.debug("deleting the old files reference from archive dataset");
            if (deleteOld) {
                deleteIntermFiles(markForDelete, "", doNotDeleteUntilExtract);
            }
        }
        if (datasetBean != null)
            resetArchiveDataset(datasetBean.getId());
        logger.info("Job " + context.getJobDetail().getDescription() + " finished.");
    }
}
Also used : Locale(java.util.Locale) DatasetBean(org.akaza.openclinica.bean.extract.DatasetBean) FileNotFoundException(java.io.FileNotFoundException) ExtractPropertyBean(org.akaza.openclinica.bean.extract.ExtractPropertyBean) JobDataMap(org.quartz.JobDataMap) TriggerBean(org.akaza.openclinica.bean.admin.TriggerBean) StreamResult(javax.xml.transform.stream.StreamResult) ProcessingFunction(org.akaza.openclinica.bean.service.ProcessingFunction) StudyBean(org.akaza.openclinica.bean.managestudy.StudyBean) OpenClinicaSystemException(org.akaza.openclinica.exception.OpenClinicaSystemException) XMLFileFilter(org.akaza.openclinica.core.util.XMLFileFilter) LinkedList(java.util.LinkedList) ExtractBean(org.akaza.openclinica.bean.extract.ExtractBean) FileOutputStream(java.io.FileOutputStream) ResourceBundle(java.util.ResourceBundle) File(java.io.File) Transformer(javax.xml.transform.Transformer) TransformerConfigurationException(javax.xml.transform.TransformerConfigurationException) FilenameFilter(java.io.FilenameFilter) ZipEntry(java.util.zip.ZipEntry) Entry(java.util.Map.Entry) UserAccountBean(org.akaza.openclinica.bean.login.UserAccountBean) TransformerException(javax.xml.transform.TransformerException) ArchivedDatasetFileBean(org.akaza.openclinica.bean.extract.ArchivedDatasetFileBean) TransformerFactoryConfigurationError(javax.xml.transform.TransformerFactoryConfigurationError) StreamSource(javax.xml.transform.stream.StreamSource) FileInputStream(java.io.FileInputStream) IOException(java.io.IOException) FileNotFoundException(java.io.FileNotFoundException) OpenClinicaSystemException(org.akaza.openclinica.exception.OpenClinicaSystemException) TransformerException(javax.xml.transform.TransformerException) SchedulerException(org.quartz.SchedulerException) TransformerConfigurationException(javax.xml.transform.TransformerConfigurationException) IOException(java.io.IOException) ProcessingResultType(org.akaza.openclinica.bean.service.ProcessingResultType)
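
The job resolves the current study and its parent with two consecutive findByPK calls. A sketch of that pattern pulled into a helper; the method name resolveStudyAndParent is hypothetical.

private static StudyBean[] resolveStudyAndParent(StudyDAO studyDao, int studyId) {
    StudyBean currentStudy = (StudyBean) studyDao.findByPK(studyId);
    // a top-level study has getParentStudyId() == 0, so check the resulting
    // bean's id before treating it as a real parent
    StudyBean parentStudy = (StudyBean) studyDao.findByPK(currentStudy.getParentStudyId());
    return new StudyBean[] { currentStudy, parentStudy };
}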

Example 25 with StudyBean

Use of org.akaza.openclinica.bean.managestudy.StudyBean in project OpenClinica by OpenClinica.

The class ImportDataHelper, method createEventCRF:

public EventCRFBean createEventCRF(HashMap<String, String> importedObject) {
    EventCRFBean eventCrfBean = null;
    int studyEventId = importedObject.get("study_event_id") == null ? -1 : Integer.parseInt(importedObject.get("study_event_id"));
    String crfVersionName = importedObject.get("crf_version_name") == null ? "" : importedObject.get("crf_version_name").toString();
    String crfName = importedObject.get("crf_name") == null ? "" : importedObject.get("crf_name").toString();
    String eventDefinitionCRFName = importedObject.get("event_definition_crf_name") == null ? "" : importedObject.get("event_definition_crf_name").toString();
    String subjectName = importedObject.get("subject_name") == null ? "" : importedObject.get("subject_name").toString();
    String studyName = importedObject.get("study_name") == null ? "" : importedObject.get("study_name").toString();
    logger.info("found the following: study event id " + studyEventId + ", crf version name " + crfVersionName + ", crf name " + crfName + ", event def crf name " + eventDefinitionCRFName + ", subject name " + subjectName + ", study name " + studyName);
    // << tbh
    int eventCRFId = 0;
    EventCRFDAO eventCrfDao = new EventCRFDAO(sm.getDataSource());
    StudyDAO studyDao = new StudyDAO(sm.getDataSource());
    StudySubjectDAO studySubjectDao = new StudySubjectDAO(sm.getDataSource());
    StudyEventDefinitionDAO studyEventDefinistionDao = new StudyEventDefinitionDAO(sm.getDataSource());
    CRFVersionDAO crfVersionDao = new CRFVersionDAO(sm.getDataSource());
    StudyEventDAO studyEventDao = new StudyEventDAO(sm.getDataSource());
    CRFDAO crfdao = new CRFDAO(sm.getDataSource());
    SubjectDAO subjectDao = new SubjectDAO(sm.getDataSource());
    StudyBean studyBean = (StudyBean) studyDao.findByName(studyName);
    // .findByPK(studyId);
    // generate the subject bean first, so that we can have the subject id
    // below...
    // .findByUniqueIdentifierAndStudy(subjectName,
    SubjectBean subjectBean = subjectDao.findByUniqueIdentifier(subjectName);
    StudySubjectBean studySubjectBean = studySubjectDao.findBySubjectIdAndStudy(subjectBean.getId(), studyBean);
    // .findByLabelAndStudy(subjectName, studyBean);
    logger.info("::: found study subject id here: " + studySubjectBean.getId() + " with the following: subject ID " + subjectBean.getId() + " study bean name " + studyBean.getName());
    StudyEventBean studyEventBean = (StudyEventBean) studyEventDao.findByPK(studyEventId);
    // TODO need to replace, can't really replace
    logger.info("found study event status: " + studyEventBean.getStatus().getName());
    // [study] event should be scheduled, event crf should be not started
    CRFVersionBean crfVersion = (CRFVersionBean) crfVersionDao.findByFullName(crfVersionName, crfName);
    // .findByPK(crfVersionId);
    // replaced by findByName(name, version)
    logger.info("found crf version name here: " + crfVersion.getName());
    EntityBean crf = crfdao.findByPK(crfVersion.getCrfId());
    logger.info("found crf name here: " + crf.getName());
    // trying it again up here since down there doesn't seem to work, tbh
    StudyEventDefinitionBean studyEventDefinitionBean = (StudyEventDefinitionBean) studyEventDefinistionDao.findByName(eventDefinitionCRFName);
    if (studySubjectBean.getId() <= 0 && studyEventBean.getId() <= 0 && crfVersion.getId() <= 0 && studyBean.getId() <= 0 && studyEventDefinitionBean.getId() <= 0) {
        logger.info("Throw an Exception, One of the provided ids is not valid");
    }
    // >> tbh repeating items:
    ArrayList eventCrfBeans = eventCrfDao.findByEventSubjectVersion(studyEventBean, studySubjectBean, crfVersion);
    // TODO repeating items here? not yet
    if (eventCrfBeans.size() > 1) {
        logger.info("found more than one");
    }
    if (!eventCrfBeans.isEmpty() && eventCrfBeans.size() == 1) {
        eventCrfBean = (EventCRFBean) eventCrfBeans.get(0);
        logger.info("This EventCrfBean was found");
    }
    if (!eventCrfBeans.isEmpty() && eventCrfBeans.size() > 1) {
        logger.info("Throw a System exception , result should either be 0 or 1");
    }
    if (eventCrfBean == null) {
        StudyBean studyWithSED = studyBean;
        if (studyBean.getParentStudyId() > 0) {
            studyWithSED = new StudyBean();
            studyWithSED.setId(studyBean.getParentStudyId());
        }
        AuditableEntityBean studyEvent = studyEventDao.findByPKAndStudy(studyEventId, studyWithSED);
        if (studyEvent.getId() <= 0) {
            logger.info("Hello Exception");
        }
        eventCrfBean = new EventCRFBean();
        // eventCrfBean.setCrfVersion(crfVersion);
        if (eventCRFId == 0) {
            // ???
            if (studyBean.getStudyParameterConfig().getInterviewerNameDefault().equals("blank")) {
                eventCrfBean.setInterviewerName("");
            } else {
                // default will be event's owner name
                eventCrfBean.setInterviewerName(studyEventBean.getOwner().getName());
            }
            if (!studyBean.getStudyParameterConfig().getInterviewDateDefault().equals("blank")) {
                if (studyEventBean.getDateStarted() != null) {
                    // default
                    eventCrfBean.setDateInterviewed(studyEventBean.getDateStarted());
                // date
                } else {
                    // logger.info("evnet start date is null, so date
                    // interviewed is null");
                    eventCrfBean.setDateInterviewed(null);
                }
            } else {
                eventCrfBean.setDateInterviewed(null);
            }
            eventCrfBean.setAnnotations("");
            eventCrfBean.setCreatedDate(new Date());
            eventCrfBean.setCRFVersionId(crfVersion.getId());
            // eventCrfBean.setCrfVersion((CRFVersionBean)crfVersion);
            eventCrfBean.setOwner(ub);
            // eventCrfBean.setCrf((CRFBean)crf);
            eventCrfBean.setStatus(Status.AVAILABLE);
            eventCrfBean.setCompletionStatusId(1);
            // problem with the line below
            eventCrfBean.setStudySubjectId(studySubjectBean.getId());
            eventCrfBean.setStudyEventId(studyEventId);
            eventCrfBean.setValidateString("");
            eventCrfBean.setValidatorAnnotations("");
            try {
                eventCrfBean = (EventCRFBean) eventCrfDao.create(eventCrfBean);
            // TODO review
            // eventCrfBean.setCrfVersion((CRFVersionBean)crfVersion);
            // eventCrfBean.setCrf((CRFBean)crf);
            } catch (Exception ee) {
                logger.info(ee.getMessage());
                logger.info("throws with crf version id " + crfVersion.getId() + " and study event id " + studyEventId + " study subject id " + studySubjectBean.getId());
            }
        // note that you need to catch an exception if the numbers are
        // bogus, ie you can throw an error here
        // however, putting the try catch allows you to pass which is
        // also bad
        // logger.info("CREATED EVENT CRF");
        } else {
            // there is an event CRF already, only need to update
            // is the status not started???
            logger.info("*** already-started event CRF with msg: " + eventCrfBean.getStatus().getName());
            if (eventCrfBean.getStatus().equals(Status.PENDING)) {
                logger.info("Not Started???");
            }
            eventCrfBean = (EventCRFBean) eventCrfDao.findByPK(eventCRFId);
            eventCrfBean.setCRFVersionId(crfVersion.getId());
            eventCrfBean.setUpdatedDate(new Date());
            eventCrfBean.setUpdater(ub);
            eventCrfBean = (EventCRFBean) eventCrfDao.update(eventCrfBean);
        // eventCrfBean.setCrfVersion((CRFVersionBean)crfVersion);
        // eventCrfBean.setCrf((CRFBean)crf);
        }
        if (eventCrfBean.getId() <= 0) {
            logger.info("error");
        } else {
            // TODO change status here, tbh
            // 2/08 this part seems to work, tbh
            studyEventBean.setSubjectEventStatus(SubjectEventStatus.DATA_ENTRY_STARTED);
            studyEventBean.setUpdater(ub);
            studyEventBean.setUpdatedDate(new Date());
            studyEventDao.update(studyEventBean);
        }
    }
    eventCrfBean.setCrfVersion(crfVersion);
    eventCrfBean.setCrf((CRFBean) crf);
    // repeating?
    return eventCrfBean;
}
Also used : EventCRFDAO(org.akaza.openclinica.dao.submit.EventCRFDAO) CRFDAO(org.akaza.openclinica.dao.admin.CRFDAO) CRFVersionDAO(org.akaza.openclinica.dao.submit.CRFVersionDAO) AuditableEntityBean(org.akaza.openclinica.bean.core.AuditableEntityBean) StudySubjectDAO(org.akaza.openclinica.dao.managestudy.StudySubjectDAO) SubjectDAO(org.akaza.openclinica.dao.submit.SubjectDAO) StudyBean(org.akaza.openclinica.bean.managestudy.StudyBean) ArrayList(java.util.ArrayList) StudyEventDefinitionBean(org.akaza.openclinica.bean.managestudy.StudyEventDefinitionBean) StudyEventBean(org.akaza.openclinica.bean.managestudy.StudyEventBean) StudySubjectDAO(org.akaza.openclinica.dao.managestudy.StudySubjectDAO) Date(java.util.Date) SubjectBean(org.akaza.openclinica.bean.submit.SubjectBean) StudySubjectBean(org.akaza.openclinica.bean.managestudy.StudySubjectBean) EventCRFBean(org.akaza.openclinica.bean.submit.EventCRFBean) StudyEventDefinitionDAO(org.akaza.openclinica.dao.managestudy.StudyEventDefinitionDAO) StudySubjectBean(org.akaza.openclinica.bean.managestudy.StudySubjectBean) StudyEventDAO(org.akaza.openclinica.dao.managestudy.StudyEventDAO) EntityBean(org.akaza.openclinica.bean.core.EntityBean) AuditableEntityBean(org.akaza.openclinica.bean.core.AuditableEntityBean) CRFVersionBean(org.akaza.openclinica.bean.submit.CRFVersionBean) StudyDAO(org.akaza.openclinica.dao.managestudy.StudyDAO) EventCRFDAO(org.akaza.openclinica.dao.submit.EventCRFDAO)
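
createEventCRF falls back to the parent study when the current StudyBean is a site. A sketch of that fallback in isolation; the helper name findStudyEventForStudyOrParent is hypothetical.

private AuditableEntityBean findStudyEventForStudyOrParent(StudyEventDAO studyEventDao, int studyEventId, StudyBean studyBean) {
    StudyBean studyWithSED = studyBean;
    // a site has a positive parent study id; build a stub StudyBean that carries
    // only the parent id so the event lookup runs against the parent study
    if (studyBean.getParentStudyId() > 0) {
        studyWithSED = new StudyBean();
        studyWithSED.setId(studyBean.getParentStudyId());
    }
    return studyEventDao.findByPKAndStudy(studyEventId, studyWithSED);
}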

Aggregations

StudyBean (org.akaza.openclinica.bean.managestudy.StudyBean): 350
ArrayList (java.util.ArrayList): 183
StudyDAO (org.akaza.openclinica.dao.managestudy.StudyDAO): 175
FormProcessor (org.akaza.openclinica.control.form.FormProcessor): 92
HashMap (java.util.HashMap): 84
StudySubjectBean (org.akaza.openclinica.bean.managestudy.StudySubjectBean): 66
StudyEventDefinitionBean (org.akaza.openclinica.bean.managestudy.StudyEventDefinitionBean): 65
StudyEventDefinitionDAO (org.akaza.openclinica.dao.managestudy.StudyEventDefinitionDAO): 62
Date (java.util.Date): 59
StudyEventBean (org.akaza.openclinica.bean.managestudy.StudyEventBean): 58
StudyEventDAO (org.akaza.openclinica.dao.managestudy.StudyEventDAO): 57
UserAccountBean (org.akaza.openclinica.bean.login.UserAccountBean): 56
StudySubjectDAO (org.akaza.openclinica.dao.managestudy.StudySubjectDAO): 54
EventCRFBean (org.akaza.openclinica.bean.submit.EventCRFBean): 52
EventCRFDAO (org.akaza.openclinica.dao.submit.EventCRFDAO): 52
EventDefinitionCRFDAO (org.akaza.openclinica.dao.managestudy.EventDefinitionCRFDAO): 45
Locale (java.util.Locale): 39
UserAccountDAO (org.akaza.openclinica.dao.login.UserAccountDAO): 39
EventDefinitionCRFBean (org.akaza.openclinica.bean.managestudy.EventDefinitionCRFBean): 37
Iterator (java.util.Iterator): 36