Example 71 with Dataset

use of org.apache.asterix.metadata.entities.Dataset in project asterixdb by apache.

the class MetadataBootstrap method insertMetadataDatasets.

/**
 * Inserts the metadata datasets into the physical dataset index.
 * Should only be performed when bootstrapping a new universe.
 *
 * @param mdTxnCtx the metadata transaction context
 * @param indexes  the metadata indexes whose datasets are inserted
 * @throws MetadataException
 */
public static void insertMetadataDatasets(MetadataTransactionContext mdTxnCtx, IMetadataIndex[] indexes)
        throws MetadataException {
    for (int i = 0; i < indexes.length; i++) {
        IDatasetDetails id = new InternalDatasetDetails(FileStructure.BTREE, PartitioningStrategy.HASH,
                indexes[i].getPartitioningExpr(), indexes[i].getPartitioningExpr(), null,
                indexes[i].getPartitioningExprType(), false, null, false);
        MetadataManager.INSTANCE.addDataset(mdTxnCtx,
                new Dataset(indexes[i].getDataverseName(), indexes[i].getIndexedDatasetName(),
                        indexes[i].getDataverseName(), indexes[i].getPayloadRecordType().getTypeName(),
                        indexes[i].getNodeGroupName(), GlobalConfig.DEFAULT_COMPACTION_POLICY_NAME,
                        GlobalConfig.DEFAULT_COMPACTION_POLICY_PROPERTIES, id, new HashMap<String, String>(),
                        DatasetType.INTERNAL, indexes[i].getDatasetId().getId(), MetadataUtil.PENDING_NO_OP));
    }
    if (LOGGER.isLoggable(Level.INFO)) {
        LOGGER.info("Finished inserting initial datasets.");
    }
}
Also used : HashMap(java.util.HashMap) Dataset(org.apache.asterix.metadata.entities.Dataset) InternalDatasetDetails(org.apache.asterix.metadata.entities.InternalDatasetDetails) IDatasetDetails(org.apache.asterix.metadata.IDatasetDetails)
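The bootstrap loop above follows a simple pattern: for each metadata index descriptor, build a dataset entity and register it. A minimal, self-contained sketch of that pattern follows; `IndexDescriptor` and `DatasetEntry` are hypothetical stand-ins for `IMetadataIndex` and `Dataset`, not AsterixDB types:

```java
import java.util.ArrayList;
import java.util.List;

public class BootstrapSketch {
    // Hypothetical stand-ins for IMetadataIndex and Dataset.
    public record IndexDescriptor(String dataverse, String datasetName, int datasetId) {}
    public record DatasetEntry(String dataverse, String name, int id) {}

    // One dataset entry per metadata index, mirroring the shape of
    // MetadataBootstrap#insertMetadataDatasets.
    public static List<DatasetEntry> buildEntries(IndexDescriptor[] indexes) {
        List<DatasetEntry> entries = new ArrayList<>();
        for (IndexDescriptor idx : indexes) {
            entries.add(new DatasetEntry(idx.dataverse(), idx.datasetName(), idx.datasetId()));
        }
        return entries;
    }
}
```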

Example 72 with Dataset

use of org.apache.asterix.metadata.entities.Dataset in project asterixdb by apache.

the class MetadataNode method getDatasetNamesPartitionedOnThisNodeGroup.

public List<String> getDatasetNamesPartitionedOnThisNodeGroup(JobId jobId, String nodegroup) throws MetadataException, RemoteException {
    // Scan all datasets and return those that use the given node group.
    List<String> nodeGroupDatasets = new ArrayList<>();
    List<Dataset> datasets = getAllDatasets(jobId);
    for (Dataset set : datasets) {
        if (set.getNodeGroupName().equals(nodegroup)) {
            nodeGroupDatasets.add(set.getDatasetName());
        }
    }
    return nodeGroupDatasets;
}
Also used : ExtensionMetadataDataset(org.apache.asterix.metadata.api.ExtensionMetadataDataset) Dataset(org.apache.asterix.metadata.entities.Dataset) ArrayList(java.util.ArrayList) AString(org.apache.asterix.om.base.AString) AMutableString(org.apache.asterix.om.base.AMutableString)
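The scan-and-filter loop above can also be expressed with Java streams. A minimal sketch, where `DatasetInfo` is a hypothetical stand-in for the `Dataset` metadata entity:

```java
import java.util.List;
import java.util.stream.Collectors;

public class NodeGroupFilter {
    // Hypothetical stand-in for org.apache.asterix.metadata.entities.Dataset.
    public record DatasetInfo(String datasetName, String nodeGroupName) {}

    // Return the names of all datasets partitioned on the given node group,
    // mirroring MetadataNode#getDatasetNamesPartitionedOnThisNodeGroup.
    public static List<String> datasetsOnNodeGroup(List<DatasetInfo> datasets, String nodeGroup) {
        return datasets.stream()
                .filter(d -> d.nodeGroupName().equals(nodeGroup))
                .map(DatasetInfo::datasetName)
                .collect(Collectors.toList());
    }
}
```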

Example 73 with Dataset

use of org.apache.asterix.metadata.entities.Dataset in project asterixdb by apache.

the class MetadataNode method dropDataset.

@Override
public void dropDataset(JobId jobId, String dataverseName, String datasetName) throws MetadataException, RemoteException {
    Dataset dataset = getDataset(jobId, dataverseName, datasetName);
    if (dataset == null) {
        throw new MetadataException("Cannot drop dataset '" + datasetName + "' because it doesn't exist.");
    }
    try {
        // Delete entry from the 'datasets' dataset.
        ITupleReference searchKey = createTuple(dataverseName, datasetName);
        // Searches the index for the tuple to be deleted. Acquires an S
        // lock on the 'dataset' dataset.
        ITupleReference datasetTuple = null;
        try {
            datasetTuple = getTupleToBeDeleted(jobId, MetadataPrimaryIndexes.DATASET_DATASET, searchKey);
            // Delete entry(s) from the 'indexes' dataset.
            List<Index> datasetIndexes = getDatasetIndexes(jobId, dataverseName, datasetName);
            if (datasetIndexes != null) {
                for (Index index : datasetIndexes) {
                    dropIndex(jobId, dataverseName, datasetName, index.getIndexName());
                }
            }
            if (dataset.getDatasetType() == DatasetType.EXTERNAL) {
                // Delete External Files
                // As a side effect, acquires an S lock on the 'ExternalFile' dataset
                // on behalf of txnId.
                List<ExternalFile> datasetFiles = getExternalFiles(jobId, dataset);
                if (datasetFiles != null && datasetFiles.size() > 0) {
                    // Drop all external files in this dataset.
                    for (ExternalFile file : datasetFiles) {
                        dropExternalFile(jobId, dataverseName, file.getDatasetName(), file.getFileNumber());
                    }
                }
            }
        } catch (HyracksDataException hde) {
            // Ignore missing-key errors so that a dataset left in an
            // inconsistent state can still have its remaining artifacts dropped.
            if (!hde.getComponent().equals(ErrorCode.HYRACKS) || hde.getErrorCode() != ErrorCode.UPDATE_OR_DELETE_NON_EXISTENT_KEY) {
                throw new MetadataException(hde);
            }
        } finally {
            deleteTupleFromIndex(jobId, MetadataPrimaryIndexes.DATASET_DATASET, datasetTuple);
        }
    } catch (HyracksDataException | ACIDException e) {
        throw new MetadataException(e);
    }
}
Also used : ExtensionMetadataDataset(org.apache.asterix.metadata.api.ExtensionMetadataDataset) Dataset(org.apache.asterix.metadata.entities.Dataset) ITupleReference(org.apache.hyracks.dataflow.common.data.accessors.ITupleReference) IMetadataIndex(org.apache.asterix.metadata.api.IMetadataIndex) Index(org.apache.asterix.metadata.entities.Index) AbstractLSMIndex(org.apache.hyracks.storage.am.lsm.common.impls.AbstractLSMIndex) ILSMIndex(org.apache.hyracks.storage.am.lsm.common.api.ILSMIndex) IIndex(org.apache.hyracks.storage.common.IIndex) ExternalFile(org.apache.asterix.external.indexing.ExternalFile) HyracksDataException(org.apache.hyracks.api.exceptions.HyracksDataException) ACIDException(org.apache.asterix.common.exceptions.ACIDException)
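The structure of `dropDataset` is worth noting: dependent artifacts (indexes, external files) are dropped first, tolerated missing-key errors are swallowed, and the `finally` block guarantees the primary tuple is deleted even if cleanup fails part-way. A minimal sketch of that try/catch/finally pattern, with hypothetical names and a simple log in place of real metadata operations:

```java
import java.util.ArrayList;
import java.util.List;

public class CascadingDrop {
    // Records the order in which steps ran (for illustration only).
    public static final List<String> log = new ArrayList<>();

    static void dropDependents() {
        log.add("dependents");
        // Simulate a dependent artifact that is already gone, analogous to
        // an UPDATE_OR_DELETE_NON_EXISTENT_KEY error in dropDataset.
        throw new IllegalStateException("dependent already gone");
    }

    static void deletePrimaryTuple() {
        log.add("primary");
    }

    public static void drop() {
        try {
            dropDependents();
        } catch (IllegalStateException e) {
            // Tolerate a missing dependent, as dropDataset tolerates
            // missing-key errors during cleanup.
        } finally {
            // Runs regardless, so the primary entry is always removed.
            deletePrimaryTuple();
        }
    }
}
```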

Example 74 with Dataset

use of org.apache.asterix.metadata.entities.Dataset in project asterixdb by apache.

the class MetadataNode method dropDataverse.

@Override
public void dropDataverse(JobId jobId, String dataverseName) throws MetadataException, RemoteException {
    try {
        confirmDataverseCanBeDeleted(jobId, dataverseName);
        List<Dataset> dataverseDatasets;
        Dataset ds;
        dataverseDatasets = getDataverseDatasets(jobId, dataverseName);
        // Drop all datasets in this dataverse.
        for (int i = 0; i < dataverseDatasets.size(); i++) {
            ds = dataverseDatasets.get(i);
            dropDataset(jobId, dataverseName, ds.getDatasetName());
        }
        //After dropping datasets, drop datatypes
        List<Datatype> dataverseDatatypes;
        // As a side effect, acquires an S lock on the 'datatype' dataset
        // on behalf of txnId.
        dataverseDatatypes = getDataverseDatatypes(jobId, dataverseName);
        // Drop all types in this dataverse.
        for (int i = 0; i < dataverseDatatypes.size(); i++) {
            forceDropDatatype(jobId, dataverseName, dataverseDatatypes.get(i).getDatatypeName());
        }
        // As a side effect, acquires an S lock on the 'Function' dataset
        // on behalf of txnId.
        List<Function> dataverseFunctions = getDataverseFunctions(jobId, dataverseName);
        // Drop all functions in this dataverse.
        for (Function function : dataverseFunctions) {
            dropFunction(jobId, new FunctionSignature(dataverseName, function.getName(), function.getArity()));
        }
        // As a side effect, acquires an S lock on the 'Adapter' dataset
        // on behalf of txnId.
        List<DatasourceAdapter> dataverseAdapters = getDataverseAdapters(jobId, dataverseName);
        // Drop all adapters in this dataverse.
        for (DatasourceAdapter adapter : dataverseAdapters) {
            dropAdapter(jobId, dataverseName, adapter.getAdapterIdentifier().getName());
        }
        List<Feed> dataverseFeeds;
        List<FeedConnection> feedConnections;
        Feed feed;
        dataverseFeeds = getDataverseFeeds(jobId, dataverseName);
        // Drop all feeds&connections in this dataverse.
        for (int i = 0; i < dataverseFeeds.size(); i++) {
            feed = dataverseFeeds.get(i);
            feedConnections = getFeedConnections(jobId, dataverseName, feed.getFeedName());
            for (FeedConnection feedConnection : feedConnections) {
                dropFeedConnection(jobId, dataverseName, feed.getFeedName(), feedConnection.getDatasetName());
            }
            dropFeed(jobId, dataverseName, feed.getFeedName());
        }
        List<FeedPolicyEntity> feedPolicies = getDataversePolicies(jobId, dataverseName);
        if (feedPolicies != null && feedPolicies.size() > 0) {
            // Drop all feed ingestion policies in this dataverse.
            for (FeedPolicyEntity feedPolicy : feedPolicies) {
                dropFeedPolicy(jobId, dataverseName, feedPolicy.getPolicyName());
            }
        }
        // Delete the dataverse entry from the 'dataverse' dataset.
        ITupleReference searchKey = createTuple(dataverseName);
        // As a side effect, acquires an S lock on the 'dataverse' dataset
        // on behalf of txnId.
        ITupleReference tuple = getTupleToBeDeleted(jobId, MetadataPrimaryIndexes.DATAVERSE_DATASET, searchKey);
        deleteTupleFromIndex(jobId, MetadataPrimaryIndexes.DATAVERSE_DATASET, tuple);
    // TODO: Change this to be a BTree specific exception, e.g.,
    // BTreeKeyDoesNotExistException.
    } catch (HyracksDataException e) {
        if (e.getComponent().equals(ErrorCode.HYRACKS) && e.getErrorCode() == ErrorCode.UPDATE_OR_DELETE_NON_EXISTENT_KEY) {
            throw new MetadataException("Cannot drop dataverse '" + dataverseName + "' because it doesn't exist.", e);
        } else {
            throw new MetadataException(e);
        }
    } catch (ACIDException e) {
        throw new MetadataException(e);
    }
}
Also used : DatasourceAdapter(org.apache.asterix.metadata.entities.DatasourceAdapter) FeedConnection(org.apache.asterix.metadata.entities.FeedConnection) ExtensionMetadataDataset(org.apache.asterix.metadata.api.ExtensionMetadataDataset) Dataset(org.apache.asterix.metadata.entities.Dataset) FunctionSignature(org.apache.asterix.common.functions.FunctionSignature) HyracksDataException(org.apache.hyracks.api.exceptions.HyracksDataException) Datatype(org.apache.asterix.metadata.entities.Datatype) ACIDException(org.apache.asterix.common.exceptions.ACIDException) Function(org.apache.asterix.metadata.entities.Function) FeedPolicyEntity(org.apache.asterix.metadata.entities.FeedPolicyEntity) ITupleReference(org.apache.hyracks.dataflow.common.data.accessors.ITupleReference) Feed(org.apache.asterix.metadata.entities.Feed)

Example 75 with Dataset

use of org.apache.asterix.metadata.entities.Dataset in project asterixdb by apache.

the class MetadataTransactionContext method dropDataset.

public void dropDataset(String dataverseName, String datasetName) {
    Dataset dataset = new Dataset(dataverseName, datasetName, null, null, null, null, null, null, null, null, -1, MetadataUtil.PENDING_NO_OP);
    droppedCache.addDatasetIfNotExists(dataset);
    logAndApply(new MetadataLogicalOperation(dataset, false));
}
Also used : Dataset(org.apache.asterix.metadata.entities.Dataset)
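Note that `dropDataset` here does not touch storage; it records the drop in a transaction-local cache of dropped entities so that later reads within the same metadata transaction see the effect. A minimal sketch of such a cache, using a plain map as a hypothetical simplification of `MetadataTransactionContext`'s dropped-entity cache:

```java
import java.util.HashMap;
import java.util.Map;

public class DroppedCache {
    // Keyed by "dataverse.dataset"; a hypothetical simplification of the
    // per-transaction dropped-entity cache in MetadataTransactionContext.
    private final Map<String, Boolean> dropped = new HashMap<>();

    // Record a drop, mirroring droppedCache.addDatasetIfNotExists(dataset).
    public void markDropped(String dataverseName, String datasetName) {
        dropped.putIfAbsent(dataverseName + "." + datasetName, Boolean.TRUE);
    }

    // Later reads in the same transaction consult the cache first.
    public boolean isDropped(String dataverseName, String datasetName) {
        return dropped.containsKey(dataverseName + "." + datasetName);
    }
}
```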

Aggregations

Dataset (org.apache.asterix.metadata.entities.Dataset): 77 usages
ArrayList (java.util.ArrayList): 33 usages
AlgebricksException (org.apache.hyracks.algebricks.common.exceptions.AlgebricksException): 32 usages
Index (org.apache.asterix.metadata.entities.Index): 25 usages
LogicalVariable (org.apache.hyracks.algebricks.core.algebra.base.LogicalVariable): 23 usages
MetadataException (org.apache.asterix.metadata.MetadataException): 19 usages
ARecordType (org.apache.asterix.om.types.ARecordType): 19 usages
IAType (org.apache.asterix.om.types.IAType): 18 usages
ILogicalExpression (org.apache.hyracks.algebricks.core.algebra.base.ILogicalExpression): 18 usages
List (java.util.List): 17 usages
ILogicalOperator (org.apache.hyracks.algebricks.core.algebra.base.ILogicalOperator): 16 usages
RemoteException (java.rmi.RemoteException): 15 usages
AsterixException (org.apache.asterix.common.exceptions.AsterixException): 15 usages
MetadataProvider (org.apache.asterix.metadata.declared.MetadataProvider): 15 usages
HyracksDataException (org.apache.hyracks.api.exceptions.HyracksDataException): 15 usages
IOException (java.io.IOException): 14 usages
MetadataTransactionContext (org.apache.asterix.metadata.MetadataTransactionContext): 14 usages
CompilationException (org.apache.asterix.common.exceptions.CompilationException): 13 usages
AlgebricksPartitionConstraint (org.apache.hyracks.algebricks.common.constraints.AlgebricksPartitionConstraint): 12 usages
ACIDException (org.apache.asterix.common.exceptions.ACIDException): 11 usages