
Example 1 with DataMigrationException

Use of com.xpn.xwiki.store.migration.DataMigrationException in the xwiki-platform project by xwiki.

From the class R40000XWIKI6990DataMigration, method getLiquibaseChangeLog:

@Override
public String getLiquibaseChangeLog() throws DataMigrationException {
    final XWikiHibernateBaseStore store = getStore();
    this.configuration = store.getConfiguration();
    final StringBuilder sb = new StringBuilder(12000);
    final List<PersistentClass> classes = new ArrayList<PersistentClass>();
    detectDatabaseProducts(store);
    if (this.logger.isDebugEnabled()) {
        if (this.isOracle) {
            this.logger.debug("Oracle database detected, proceeding to all updates manually with deferred constraints.");
        }
        if (this.isMySQL && !this.isMySQLMyISAM) {
            this.logger.debug("MySQL innoDB database detected, proceeding to simplified updates with cascaded updates.");
        }
        if (this.isMySQLMyISAM) {
            this.logger.debug("MySQL MyISAM database detected, proceeding to all updates manually without constraints.");
        }
        if (this.isMSSQL) {
            this.logger.debug("Microsoft SQL Server database detected, proceeding to simplified updates with cascaded updates. During data type changes, Primary Key constraints and indexes are temporarily dropped.");
        }
    }
    // Build the list of classes to check for updates
    classes.add(getClassMapping(BaseObject.class.getName()));
    for (Class<?> klass : PROPERTY_CLASS) {
        classes.add(getClassMapping(klass.getName()));
    }
    for (Class<?> klass : STATS_CLASSES) {
        classes.add(getClassMapping(klass.getName()));
    }
    // Initialize the counter of Change Logs
    this.logCount = 0;
    // MyISAM tables have no constraints that would prevent type changes, so we skip all this processing for MySQL tables stored using the MyISAM engine.
    if (!this.isMySQLMyISAM) {
        for (PersistentClass klass : classes) {
            this.fkTables.addAll(getForeignKeyTables(klass));
        }
    }
    // Drop all FK constraints
    for (Table table : this.fkTables) {
        appendDropForeignKeyChangeLog(sb, table);
    }
    // Process internal classes
    for (PersistentClass klass : classes) {
        // The same table is mapped for both StringListProperty and LargeStringProperty
        if (klass.getMappedClass() != StringListProperty.class) {
            // Update key types
            appendDataTypeChangeLogs(sb, klass);
        }
    }
    // Process dynamic and custom mapping
    final XWikiContext context = getXWikiContext();
    try {
        processCustomMappings((XWikiHibernateStore) store, new CustomMappingCallback() {

            @Override
            public void processCustomMapping(XWikiHibernateStore store, String name, String mapping, boolean hasDynamicMapping) throws XWikiException {
                if (INTERNAL.equals(mapping) || hasDynamicMapping) {
                    PersistentClass klass = R40000XWIKI6990DataMigration.this.configuration.getClassMapping(name);
                    if (!R40000XWIKI6990DataMigration.this.isMySQLMyISAM) {
                        List<Table> tables = getForeignKeyTables(klass);
                        for (Table table : tables) {
                            if (!R40000XWIKI6990DataMigration.this.fkTables.contains(table)) {
                                // Drop FK constraints for custom mapped class
                                appendDropForeignKeyChangeLog(sb, table);
                                R40000XWIKI6990DataMigration.this.fkTables.add(table);
                            }
                        }
                    }
                    // Update key types for custom mapped class
                    appendDataTypeChangeLogs(sb, klass);
                }
            }
        }, context);
    } catch (XWikiException e) {
        throw new DataMigrationException("Unable to process custom mapped classes during schema update", e);
    }
    // Add FK constraints back, activating cascaded updates
    for (Table table : this.fkTables) {
        appendAddForeignKeyChangeLog(sb, table);
    }
    // Oracle doesn't support cascaded updates, so we still need to manually update each table
    if (this.isOracle) {
        this.fkTables.clear();
    }
    logProgress("%d schema updates required.", this.logCount);
    if (this.logger.isDebugEnabled()) {
        this.logger.debug("About to execute this Liquibase XML: {}", sb.toString());
    }
    return sb.toString();
}
Also used : Table(org.hibernate.mapping.Table) ArrayList(java.util.ArrayList) XWikiContext(com.xpn.xwiki.XWikiContext) XWikiHibernateStore(com.xpn.xwiki.store.XWikiHibernateStore) List(java.util.List) ArrayList(java.util.ArrayList) LinkedList(java.util.LinkedList) XWikiHibernateBaseStore(com.xpn.xwiki.store.XWikiHibernateBaseStore) DataMigrationException(com.xpn.xwiki.store.migration.DataMigrationException) XWikiException(com.xpn.xwiki.XWikiException) PersistentClass(org.hibernate.mapping.PersistentClass)
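The method above accumulates Liquibase changeSet XML in a StringBuilder while counting the change logs it emits. As a rough illustration of that pattern, the following sketch appends a hypothetical dropForeignKeyConstraint changeSet, mirroring what an appendDropForeignKeyChangeLog helper could produce; the id scheme, author and attribute values are illustrative assumptions, not XWiki's actual output.

```java
// Hedged sketch: builds a Liquibase-style changeSet that drops one FK constraint.
// All names (changeSet id prefix, author, attributes) are hypothetical.
public class ChangeLogSketch {
    // Counter of emitted change logs, like the logCount field in the migration.
    private int logCount;

    public String dropForeignKeyChangeSet(String tableName, String constraintName) {
        StringBuilder sb = new StringBuilder(256);
        sb.append("<changeSet id=\"R40000-").append(++logCount)
          .append("\" author=\"xwiki\">")
          .append("<dropForeignKeyConstraint baseTableName=\"").append(tableName)
          .append("\" constraintName=\"").append(constraintName).append("\"/>")
          .append("</changeSet>");
        return sb.toString();
    }

    public int getLogCount() {
        return logCount;
    }
}
```

The real migration appends many such fragments (drops, data type changes, re-added constraints) into one XML document that is handed to Liquibase for execution.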

Example 2 with DataMigrationException

Use of com.xpn.xwiki.store.migration.DataMigrationException in the xwiki-platform project by xwiki.

From the class R40000XWIKI6990DataMigration, method hibernateMigrate:

@Override
public void hibernateMigrate() throws DataMigrationException, XWikiException {
    final Map<Long, Long> docs = new HashMap<Long, Long>();
    final List<String> customMappedClasses = new ArrayList<String>();
    final Map<Long, Long> objs = new HashMap<Long, Long>();
    final Queue<Map<Long, Long>> stats = new LinkedList<Map<Long, Long>>();
    // Get ids conversion list
    getStore().executeRead(getXWikiContext(), new HibernateCallback<Object>() {

        private void fillDocumentIdConversion(Session session, Map<Long, Long> map) {
            String database = getXWikiContext().getWikiId();
            @SuppressWarnings("unchecked") List<Object[]> results = session.createQuery("select doc.id, doc.space, doc.name, doc.defaultLanguage, doc.language from " + XWikiDocument.class.getName() + " as doc").list();
            for (Object[] result : results) {
                long oldId = (Long) result[0];
                String space = (String) result[1];
                String name = (String) result[2];
                String defaultLanguage = (String) result[3];
                String language = (String) result[4];
                // Use a real document, since we need the language to be appended.
                // TODO: Change this when the locale is integrated
                XWikiDocument doc = new XWikiDocument(new DocumentReference(database, space, name));
                doc.setDefaultLanguage(defaultLanguage);
                doc.setLanguage(language);
                long newId = doc.getId();
                if (oldId != newId) {
                    map.put(oldId, newId);
                }
            }
            logProgress("Retrieved %d document IDs to be converted.", map.size());
        }

        private void fillObjectIdConversion(Session session, Map<Long, Long> map) {
            @SuppressWarnings("unchecked") List<Object[]> results = session.createQuery("select obj.id, obj.name, obj.className, obj.number from " + BaseObject.class.getName() + " as obj").list();
            for (Object[] result : results) {
                long oldId = (Long) result[0];
                String docName = (String) result[1];
                String className = (String) result[2];
                Integer number = (Integer) result[3];
                BaseObjectReference objRef = new BaseObjectReference(R40000XWIKI6990DataMigration.this.resolver.resolve(className), number, R40000XWIKI6990DataMigration.this.resolver.resolve(docName));
                long newId = Util.getHash(R40000XWIKI6990DataMigration.this.serializer.serialize(objRef));
                if (oldId != newId) {
                    map.put(oldId, newId);
                }
            }
            logProgress("Retrieved %d object IDs to be converted.", map.size());
        }

        private void fillCustomMappingMap(XWikiHibernateStore store, XWikiContext context) throws XWikiException {
            processCustomMappings(store, new CustomMappingCallback() {

                @Override
                public void processCustomMapping(XWikiHibernateStore store, String name, String mapping, boolean hasDynamicMapping) throws XWikiException {
                    if (INTERNAL.equals(mapping) || hasDynamicMapping) {
                        customMappedClasses.add(name);
                    }
                }
            }, context);
            logProgress("Retrieved %d custom mapped classes to be processed.", customMappedClasses.size());
        }

        private void fillStatsConversionMap(Session session, Class<?> klass, Map<Long, Long> map) {
            @SuppressWarnings("unchecked") List<Object[]> results = session.createQuery("select stats.id, stats.name, stats.number from " + klass.getName() + " as stats").list();
            for (Object[] result : results) {
                long oldId = (Long) result[0];
                String statsName = (String) result[1];
                Integer number = (Integer) result[2];
                // Do not try to convert broken records, which would cause duplicate IDs
                if (statsName != null && !statsName.startsWith(".") && !statsName.endsWith(".")) {
                    long newId = R40000XWIKI6990DataMigration.this.statsIdComputer.getId(statsName, number);
                    if (oldId != newId) {
                        map.put(oldId, newId);
                    }
                } else {
                    R40000XWIKI6990DataMigration.this.logger.debug("Skipping invalid statistical entry [{}] with name [{}]", oldId, statsName);
                }
            }
            String klassName = klass.getName().substring(klass.getName().lastIndexOf('.') + 1);
            logProgress("Retrieved %d %s statistics IDs to be converted.", map.size(), klassName.substring(0, klassName.length() - 5).toLowerCase());
        }

        @Override
        public Object doInHibernate(Session session) throws XWikiException {
            try {
                fillDocumentIdConversion(session, docs);
                fillObjectIdConversion(session, objs);
                // Retrieve custom mapped classes
                if (getStore() instanceof XWikiHibernateStore) {
                    fillCustomMappingMap((XWikiHibernateStore) getStore(), getXWikiContext());
                }
                // Retrieve statistics ID conversion
                for (Class<?> statsClass : STATS_CLASSES) {
                    Map<Long, Long> map = new HashMap<Long, Long>();
                    fillStatsConversionMap(session, statsClass, map);
                    stats.add(map);
                }
                session.clear();
            } catch (Exception e) {
                throw new XWikiException(XWikiException.MODULE_XWIKI_STORE, XWikiException.ERROR_XWIKI_STORE_MIGRATION, getName() + " migration failed", e);
            }
            return null;
        }
    });
    // Cache the configuration and the dialect
    this.configuration = getStore().getConfiguration();
    this.dialect = this.configuration.buildSettings().getDialect();
    // Check configuration for safe mode
    /* True if migration should use safe but slower non-bulk native updates. */
    boolean useSafeUpdates = "1".equals(getXWikiContext().getWiki().Param("xwiki.store.migration." + this.getName() + ".safemode", "0"));
    // Use safe mode if the database has no temporary table support in Hibernate
    useSafeUpdates = useSafeUpdates || !this.configuration.buildSettings().getDialect().supportsTemporaryTables();
    // Proceed to document id conversion
    if (!docs.isEmpty()) {
        if (!useSafeUpdates) {
            // Pairs of (table, key) for tables that need manual updates
            final List<String[]> tableToProcess = new ArrayList<String[]>();
            for (Class<?> docClass : DOC_CLASSES) {
                tableToProcess.addAll(getAllTableToProcess(docClass.getName()));
            }
            for (Class<?> docClass : DOCLINK_CLASSES) {
                tableToProcess.addAll(getAllTableToProcess(docClass.getName(), "docId"));
            }
            logProgress("Converting %d document IDs in %d tables...", docs.size(), tableToProcess.size());
            final long[] times = new long[tableToProcess.size() + 1];
            try {
                getStore().executeWrite(getXWikiContext(), new AbstractBulkIdConversionHibernateCallback() {

                    @Override
                    public void doBulkIdUpdate() {
                        times[this.timer++] += insertIdUpdates(docs);
                        for (String[] table : tableToProcess) {
                            times[this.timer++] += executeSqlIdUpdate(table[0], table[1]);
                        }
                    }
                });
            } catch (Exception e) {
                throw new XWikiException(XWikiException.MODULE_XWIKI_STORE, XWikiException.ERROR_XWIKI_STORE_MIGRATION, getName() + " migration failed", e);
            }
            if (this.logger.isDebugEnabled()) {
                int timer = 0;
                this.logger.debug("Time elapsed for inserts: {} ms", times[timer++] / 1000000);
                for (String[] table : tableToProcess) {
                    this.logger.debug("Time elapsed for {} table: {} ms", table[0], times[timer++] / 1000000);
                }
            }
        } else {
            final List<String[]> docsColl = new ArrayList<String[]>();
            for (Class<?> docClass : DOC_CLASSES) {
                docsColl.addAll(getCollectionProperties(getClassMapping(docClass.getName())));
            }
            for (Class<?> docClass : DOCLINK_CLASSES) {
                docsColl.addAll(getCollectionProperties(getClassMapping(docClass.getName())));
            }
            logProgress("Converting %d document IDs in %d tables and %d collection tables...", docs.size(), DOC_CLASSES.length + DOCLINK_CLASSES.length, docsColl.size());
            final long[] times = new long[DOC_CLASSES.length + DOCLINK_CLASSES.length + docsColl.size()];
            convertDbId(docs, new AbstractIdConversionHibernateCallback() {

                @Override
                public void doSingleUpdate() {
                    for (String[] coll : docsColl) {
                        times[this.timer++] += executeSqlIdUpdate(coll[0], coll[1]);
                    }
                    for (Class<?> doclinkClass : DOCLINK_CLASSES) {
                        times[this.timer++] += executeIdUpdate(doclinkClass, DOCID);
                    }
                    times[this.timer++] += executeIdUpdate(XWikiLink.class, DOCID);
                    times[this.timer++] += executeIdUpdate(XWikiRCSNodeInfo.class, ID + '.' + DOCID);
                    times[this.timer++] += executeIdUpdate(XWikiDocument.class, ID);
                }
            });
            if (this.logger.isDebugEnabled()) {
                int timer = 0;
                for (String[] coll : docsColl) {
                    this.logger.debug("Time elapsed for {} collection: {} ms", coll[0], times[timer++] / 1000000);
                }
                for (Class<?> doclinkClass : DOCLINK_CLASSES) {
                    this.logger.debug("Time elapsed for {} class: {} ms", doclinkClass.getName(), times[timer++] / 1000000);
                }
                this.logger.debug("Time elapsed for {} class: {} ms", XWikiRCSNodeInfo.class.getName(), times[timer++] / 1000000);
                this.logger.debug("Time elapsed for {} class: {} ms", XWikiDocument.class.getName(), times[timer++] / 1000000);
            }
        }
        logProgress("All document IDs have been converted successfully.");
    } else {
        logProgress("No document IDs to convert, skipping.");
    }
    // Proceed to object id conversion
    if (!objs.isEmpty()) {
        if (!useSafeUpdates) {
            // Pairs of (table, key) for tables that need manual updates
            final List<String[]> tableToProcess = new ArrayList<String[]>();
            PersistentClass objklass = getClassMapping(BaseObject.class.getName());
            tableToProcess.addAll(getCollectionProperties(objklass));
            for (Class<?> propertyClass : PROPERTY_CLASS) {
                tableToProcess.addAll(getAllTableToProcess(propertyClass.getName()));
            }
            for (String customClass : customMappedClasses) {
                tableToProcess.addAll(getAllTableToProcess(customClass));
            }
            tableToProcess.add(new String[] { objklass.getTable().getName(), getKeyColumnName(objklass) });
            logProgress("Converting %d object IDs in %d tables...", objs.size(), tableToProcess.size());
            final long[] times = new long[tableToProcess.size() + 1];
            try {
                getStore().executeWrite(getXWikiContext(), new AbstractBulkIdConversionHibernateCallback() {

                    @Override
                    public void doBulkIdUpdate() {
                        times[this.timer++] += insertIdUpdates(objs);
                        for (String[] table : tableToProcess) {
                            times[this.timer++] += executeSqlIdUpdate(table[0], table[1]);
                        }
                    }
                });
            } catch (Exception e) {
                throw new XWikiException(XWikiException.MODULE_XWIKI_STORE, XWikiException.ERROR_XWIKI_STORE_MIGRATION, getName() + " migration failed", e);
            }
            if (this.logger.isDebugEnabled()) {
                int timer = 0;
                this.logger.debug("Time elapsed for inserts: {} ms", times[timer++] / 1000000);
                for (String[] table : tableToProcess) {
                    this.logger.debug("Time elapsed for {} table: {} ms", table[0], times[timer++] / 1000000);
                }
            }
        } else {
            // Name of classes that need manual updates
            final List<String> classToProcess = new ArrayList<String>();
            // Name of custom classes that need manual updates
            final List<String> customClassToProcess = new ArrayList<String>();
            // Pairs of (table, key) for collection tables that need manual updates
            final List<String[]> objsColl = new ArrayList<String[]>();
            objsColl.addAll(getCollectionProperties(getClassMapping(BaseObject.class.getName())));
            for (Class<?> propertyClass : PROPERTY_CLASS) {
                String className = propertyClass.getName();
                PersistentClass klass = getClassMapping(className);
                // Add collection table that will not be updated by cascaded updates
                objsColl.addAll(getCollectionProperties(klass));
                // Skip classes that will be updated by cascaded updates
                if (!this.fkTables.contains(klass.getTable())) {
                    classToProcess.add(className);
                }
            }
            for (String customClass : customMappedClasses) {
                PersistentClass klass = getClassMapping(customClass);
                // Add collection table that will not be updated by cascaded updates
                objsColl.addAll(getCollectionProperties(klass));
                // Skip classes that will be updated by cascaded updates
                if (!this.fkTables.contains(klass.getTable())) {
                    customClassToProcess.add(customClass);
                }
            }
            logProgress("Converting %d object IDs in %d tables, %d custom mapped tables and %d collection tables...", objs.size(), classToProcess.size() + 1, customClassToProcess.size(), objsColl.size());
            final long[] times = new long[classToProcess.size() + 1 + customClassToProcess.size() + objsColl.size()];
            convertDbId(objs, new AbstractIdConversionHibernateCallback() {

                @Override
                public void doSingleUpdate() {
                    for (String[] coll : objsColl) {
                        times[this.timer++] += executeSqlIdUpdate(coll[0], coll[1]);
                    }
                    for (String customMappedClass : customClassToProcess) {
                        times[this.timer++] += executeIdUpdate(customMappedClass, ID);
                    }
                    for (String propertyClass : classToProcess) {
                        times[this.timer++] += executeIdUpdate(propertyClass, IDID);
                    }
                    times[this.timer++] += executeIdUpdate(BaseObject.class, ID);
                }
            });
            if (this.logger.isDebugEnabled()) {
                int timer = 0;
                for (String[] coll : objsColl) {
                    this.logger.debug("Time elapsed for {} collection: {} ms", coll[0], times[timer++] / 1000000);
                }
                for (String customMappedClass : customClassToProcess) {
                    this.logger.debug("Time elapsed for {} custom table: {} ms", customMappedClass, times[timer++] / 1000000);
                }
                for (String propertyClass : classToProcess) {
                    this.logger.debug("Time elapsed for {} property table: {} ms", propertyClass, times[timer++] / 1000000);
                }
                this.logger.debug("Time elapsed for {} class: {} ms", BaseObject.class.getName(), times[timer++] / 1000000);
            }
        }
        logProgress("All object IDs have been converted successfully.");
    } else {
        logProgress("No object IDs to convert, skipping.");
    }
    // Proceed to statistics id conversions
    for (final Class<?> statsClass : STATS_CLASSES) {
        Map<Long, Long> map = stats.poll();
        String klassName = statsClass.getName().substring(statsClass.getName().lastIndexOf('.') + 1);
        klassName = klassName.substring(0, klassName.length() - 5).toLowerCase();
        if (!map.isEmpty()) {
            if (!useSafeUpdates) {
                final List<String[]> tableToProcess = new ArrayList<String[]>();
                final Map<Long, Long> statids = map;
                PersistentClass statklass = getClassMapping(statsClass.getName());
                tableToProcess.addAll(getCollectionProperties(statklass));
                tableToProcess.add(new String[] { statklass.getTable().getName(), getKeyColumnName(statklass) });
                logProgress("Converting %d %s statistics IDs in %d tables...", map.size(), klassName, tableToProcess.size());
                final long[] times = new long[tableToProcess.size() + 1];
                try {
                    getStore().executeWrite(getXWikiContext(), new AbstractBulkIdConversionHibernateCallback() {

                        @Override
                        public void doBulkIdUpdate() {
                            times[this.timer++] += insertIdUpdates(statids);
                            for (String[] table : tableToProcess) {
                                times[this.timer++] += executeSqlIdUpdate(table[0], table[1]);
                            }
                        }
                    });
                } catch (Exception e) {
                    throw new XWikiException(XWikiException.MODULE_XWIKI_STORE, XWikiException.ERROR_XWIKI_STORE_MIGRATION, getName() + " migration failed", e);
                }
                if (this.logger.isDebugEnabled()) {
                    int timer = 0;
                    this.logger.debug("Time elapsed for inserts: {} ms", times[timer++] / 1000000);
                    for (String[] table : tableToProcess) {
                        this.logger.debug("Time elapsed for {} table: {} ms", table[0], times[timer++] / 1000000);
                    }
                }
            } else {
                final List<String[]> statsColl = new ArrayList<String[]>();
                statsColl.addAll(getCollectionProperties(getClassMapping(statsClass.getName())));
                logProgress("Converting %d %s statistics IDs in 1 table and %d collection tables...", map.size(), klassName, statsColl.size());
                final long[] times = new long[statsColl.size() + 1];
                convertDbId(map, new AbstractIdConversionHibernateCallback() {

                    @Override
                    public void doSingleUpdate() {
                        for (String[] coll : statsColl) {
                            times[this.timer++] += executeSqlIdUpdate(coll[0], coll[1]);
                        }
                        times[this.timer++] += executeIdUpdate(statsClass, ID);
                    }
                });
                if (this.logger.isDebugEnabled()) {
                    int timer = 0;
                    for (String[] coll : statsColl) {
                        this.logger.debug("Time elapsed for {} collection: {} ms", coll[0], times[timer++] / 1000000);
                    }
                    this.logger.debug("Time elapsed for {} class: {} ms", statsClass.getName(), times[timer++] / 1000000);
                }
            }
            logProgress("All %s statistics IDs have been converted successfully.", klassName);
        } else {
            logProgress("No %s statistics IDs to convert, skipping.", klassName);
        }
    }
}
Also used : HashMap(java.util.HashMap) ArrayList(java.util.ArrayList) XWikiDocument(com.xpn.xwiki.doc.XWikiDocument) BaseObjectReference(com.xpn.xwiki.objects.BaseObjectReference) List(java.util.List) ArrayList(java.util.ArrayList) LinkedList(java.util.LinkedList) DocumentReference(org.xwiki.model.reference.DocumentReference) XWikiException(com.xpn.xwiki.XWikiException) PersistentClass(org.hibernate.mapping.PersistentClass) XWikiContext(com.xpn.xwiki.XWikiContext) LinkedList(java.util.LinkedList) XWikiException(com.xpn.xwiki.XWikiException) DataMigrationException(com.xpn.xwiki.store.migration.DataMigrationException) HibernateException(org.hibernate.HibernateException) BaseObject(com.xpn.xwiki.objects.BaseObject) XWikiHibernateStore(com.xpn.xwiki.store.XWikiHibernateStore) XWikiRCSNodeInfo(com.xpn.xwiki.doc.rcs.XWikiRCSNodeInfo) BaseObject(com.xpn.xwiki.objects.BaseObject) PersistentClass(org.hibernate.mapping.PersistentClass) Map(java.util.Map) HashMap(java.util.HashMap) Session(org.hibernate.Session)
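Example 2 builds each conversion map by computing the new ID as a 64-bit hash of the serialized entity reference (via Util.getHash) and recording only the pairs that actually changed. The sketch below approximates that idea with an MD5-based 64-bit fold; XWiki's real Util.getHash may use a different algorithm and byte ordering, so treat this as an assumption-laden illustration of the hashing step, not the actual implementation.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hedged sketch of a deterministic 64-bit ID derived from a serialized
// reference string. The MD5 choice and byte order are assumptions.
public class IdHashSketch {
    public static long hash64(String serializedReference) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5")
                .digest(serializedReference.getBytes(StandardCharsets.UTF_8));
            long h = 0;
            // Fold the first 8 digest bytes into a long.
            for (int i = 0; i < 8; i++) {
                h = (h << 8) | (digest[i] & 0xFF);
            }
            return h;
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("MD5 not available", e);
        }
    }
}
```

Because the hash is a pure function of the serialized reference, recomputing it for an unchanged reference yields the same ID, which is why the migration only stores oldId/newId pairs where the two differ.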

Example 3 with DataMigrationException

Use of com.xpn.xwiki.store.migration.DataMigrationException in the xwiki-platform project by xwiki.

From the class WikiUserFromXEMMigration, method upgradeWorkspaceConfiguration:

/**
 * Convert the old WorkspaceManager.WorkspaceClass objects to the new configuration format.
 *
 * @param oldObject old workspace object
 * @param wikiId id of the wiki to upgrade
 * @param oldWikiDescriptor document that holds the old object
 * @throws DataMigrationException if problems occur
 * @throws XWikiException if problems occur
 */
private void upgradeWorkspaceConfiguration(BaseObject oldObject, String wikiId, XWikiDocument oldWikiDescriptor) throws DataMigrationException, XWikiException {
    // Context, XWiki
    XWikiContext context = getXWikiContext();
    XWiki xwiki = context.getWiki();
    // Create the new configuration
    WikiUserConfiguration configuration = new WikiUserConfiguration();
    // No local users
    configuration.setUserScope(UserScope.GLOBAL_ONLY);
    // Set the membershipType value
    if (oldObject != null) {
        // Get the membershipType value
        String membershipTypeValue = oldObject.getStringValue("membershipType");
        MembershipType membershipType;
        try {
            membershipType = MembershipType.valueOf(membershipTypeValue.toUpperCase());
        } catch (Exception e) {
            // Default value
            membershipType = MembershipType.INVITE;
        }
        configuration.setMembershipType(membershipType);
    } else {
        // If there is no workspace object, we put a default value.
        configuration.setMembershipType(MembershipType.INVITE);
    }
    // Save the new configuration
    saveConfiguration(configuration, wikiId);
}
Also used : WikiUserConfiguration(org.xwiki.wiki.user.WikiUserConfiguration) MembershipType(org.xwiki.wiki.user.MembershipType) XWikiContext(com.xpn.xwiki.XWikiContext) XWiki(com.xpn.xwiki.XWiki) XWikiException(com.xpn.xwiki.XWikiException) WikiUserManagerException(org.xwiki.wiki.user.WikiUserManagerException) DataMigrationException(com.xpn.xwiki.store.migration.DataMigrationException)
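The membershipType handling above is a parse-with-fallback pattern: any missing or unparsable value degrades to a default (INVITE) instead of aborting the migration. A minimal self-contained sketch of the same pattern follows; the enum here is a local stand-in with constants mirroring the values used above, not the real org.xwiki.wiki.user.MembershipType.

```java
// Hedged sketch: parse an enum value with a safe fallback, as in Example 3.
public class MembershipParser {
    // Stand-in enum; the actual MembershipType lives in org.xwiki.wiki.user.
    public enum MembershipType { OPEN, REQUEST, INVITE }

    public static MembershipType parse(String value) {
        try {
            return MembershipType.valueOf(value.toUpperCase());
        } catch (Exception e) {
            // Catches both NullPointerException (null value) and
            // IllegalArgumentException (unknown constant): use the default.
            return MembershipType.INVITE;
        }
    }
}
```

Catching the broad Exception, as the original code does, keeps the migration resilient to any malformed legacy data in the old workspace objects.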

Example 4 with DataMigrationException

Use of com.xpn.xwiki.store.migration.DataMigrationException in the xwiki-platform project by xwiki.

From the class WikiTemplateMigration, method hibernateMigrate:

@Override
protected void hibernateMigrate() throws DataMigrationException, XWikiException {
    // XWiki objects
    XWikiContext context = getXWikiContext();
    XWiki xwiki = context.getWiki();
    // WikiManager.WikiTemplateClass reference
    DocumentReference templateClassReference = new DocumentReference(wikiDescriptorManager.getCurrentWikiId(), WikiTemplateClassDocumentInitializer.DOCUMENT_SPACE, WikiTemplateClassDocumentInitializer.DOCUMENT_NAME);
    // XWiki.XWikiServerClass reference
    DocumentReference descriptorClassReference = new DocumentReference(wikiDescriptorManager.getCurrentWikiId(), XWiki.SYSTEM_SPACE, "XWikiServerClass");
    // Superadmin reference
    DocumentReference superAdmin = new DocumentReference(wikiDescriptorManager.getMainWikiId(), XWiki.SYSTEM_SPACE, "superadmin");
    try {
        // Get all the descriptor documents
        String statement = "select distinct doc.fullName " + "from Document doc, doc.object(XWiki.XWikiServerClass) as obj";
        Query query = queryManager.createQuery(statement, Query.XWQL);
        List<String> results = query.execute();
        for (String wikiPage : results) {
            XWikiDocument document = xwiki.getDocument(documentReferenceResolver.resolve(wikiPage), context);
            // Get the "iswikitemplate" value
            BaseObject descriptorObject = document.getXObject(descriptorClassReference);
            int isTemplate = descriptorObject.getIntValue(OLD_TEMPLATE_PROPERTY, 0);
            // We remove the deprecated property from the descriptor
            descriptorObject.removeField(OLD_TEMPLATE_PROPERTY);
            // Add the new WikiManager.WikiTemplateClass object
            BaseObject object = document.getXObject(templateClassReference, true, context);
            // The new object might already exist and have a template property already set
            isTemplate = object.getIntValue(WikiTemplateClassDocumentInitializer.FIELD_ISWIKITEMPLATE, isTemplate);
            // Set the (new) value
            object.setIntValue(WikiTemplateClassDocumentInitializer.FIELD_ISWIKITEMPLATE, isTemplate);
            // The document must have an author
            document.setAuthorReference(superAdmin);
            // Save the document
            xwiki.saveDocument(document, "[UPGRADE] Upgrade the template section.", context);
        }
    } catch (QueryException e) {
        throw new DataMigrationException("Failed to get the list of all existing descriptors.", e);
    } catch (XWikiException e) {
        throw new DataMigrationException("Failed to upgrade a wiki descriptor.", e);
    }
}
Also used : XWikiDocument(com.xpn.xwiki.doc.XWikiDocument) QueryException(org.xwiki.query.QueryException) Query(org.xwiki.query.Query) XWikiContext(com.xpn.xwiki.XWikiContext) XWiki(com.xpn.xwiki.XWiki) DataMigrationException(com.xpn.xwiki.store.migration.DataMigrationException) DocumentReference(org.xwiki.model.reference.DocumentReference) XWikiException(com.xpn.xwiki.XWikiException) BaseObject(com.xpn.xwiki.objects.BaseObject)
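Example 4 translates low-level failures (QueryException, XWikiException) into DataMigrationException with a descriptive message, so callers only have to handle the migration-level exception. Below is a self-contained sketch of that wrapping convention; DataMigrationException and the query interface are stubbed locally for illustration and are not the real XWiki types.

```java
// Hedged sketch of the exception-wrapping convention from Example 4.
public class MigrationWrapping {
    // Local stub; the real class is com.xpn.xwiki.store.migration.DataMigrationException.
    public static class DataMigrationException extends Exception {
        public DataMigrationException(String message, Throwable cause) {
            super(message, cause);
        }
    }

    // Hypothetical stand-in for an org.xwiki.query.Query execution.
    interface DescriptorQuery {
        java.util.List<String> execute() throws Exception;
    }

    public static java.util.List<String> listDescriptors(DescriptorQuery query)
            throws DataMigrationException {
        try {
            return query.execute();
        } catch (Exception e) {
            // Preserve the original failure as the cause, with a readable message.
            throw new DataMigrationException("Failed to get the list of all existing descriptors.", e);
        }
    }
}
```

Keeping the original exception as the cause preserves the full stack trace for debugging while presenting a uniform exception type to the migration framework.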

Example 5 with DataMigrationException

Use of com.xpn.xwiki.store.migration.DataMigrationException in the xwiki-platform project by xwiki.

From the class WikiUserFromXEMMigration, method deleteOldWorkspaceObject:

private void deleteOldWorkspaceObject(BaseObject oldObject, XWikiDocument oldWikiDescriptor) throws DataMigrationException {
    // Context, XWiki
    XWikiContext context = getXWikiContext();
    XWiki xwiki = context.getWiki();
    // Delete the old object
    oldWikiDescriptor.removeXObject(oldObject);
    // Save the document
    try {
        xwiki.saveDocument(oldWikiDescriptor, "[UPGRADE] Remove the old WorkspaceManager.WorkspaceClass object.", context);
    } catch (XWikiException e) {
        throw new DataMigrationException(String.format("Failed to save the document [%s] to remove the WorkspaceManager.WorkspaceClass object.", oldWikiDescriptor.getDocumentReference().toString()), e);
    }
}
Also used : XWikiContext(com.xpn.xwiki.XWikiContext) XWiki(com.xpn.xwiki.XWiki) DataMigrationException(com.xpn.xwiki.store.migration.DataMigrationException) XWikiException(com.xpn.xwiki.XWikiException)

Aggregations

DataMigrationException (com.xpn.xwiki.store.migration.DataMigrationException): 8 usages
XWikiException (com.xpn.xwiki.XWikiException): 7 usages
XWikiContext (com.xpn.xwiki.XWikiContext): 6 usages
XWiki (com.xpn.xwiki.XWiki): 3 usages
HibernateException (org.hibernate.HibernateException): 3 usages
XWikiDocument (com.xpn.xwiki.doc.XWikiDocument): 2 usages
BaseObject (com.xpn.xwiki.objects.BaseObject): 2 usages
XWikiHibernateStore (com.xpn.xwiki.store.XWikiHibernateStore): 2 usages
ArrayList (java.util.ArrayList): 2 usages
LinkedList (java.util.LinkedList): 2 usages
List (java.util.List): 2 usages
PersistentClass (org.hibernate.mapping.PersistentClass): 2 usages
DocumentReference (org.xwiki.model.reference.DocumentReference): 2 usages
Query (org.xwiki.query.Query): 2 usages
QueryException (org.xwiki.query.QueryException): 2 usages
XWikiRCSNodeInfo (com.xpn.xwiki.doc.rcs.XWikiRCSNodeInfo): 1 usage
BaseObjectReference (com.xpn.xwiki.objects.BaseObjectReference): 1 usage
BaseClass (com.xpn.xwiki.objects.classes.BaseClass): 1 usage
ActivityEventImpl (com.xpn.xwiki.plugin.activitystream.impl.ActivityEventImpl): 1 usage
XWikiHibernateBaseStore (com.xpn.xwiki.store.XWikiHibernateBaseStore): 1 usage