Example 41 with Index

Use of org.apache.asterix.metadata.entities.Index in project asterixdb by apache.

The class TestNodeController, method getPrimaryResourceFactory:

public IResourceFactory getPrimaryResourceFactory(IHyracksTaskContext ctx, PrimaryIndexInfo primaryIndexInfo, IStorageComponentProvider storageComponentProvider, Dataset dataset) throws AlgebricksException {
    // Wrap the dataset's dataverse in a transient Dataverse entity for the metadata provider.
    Dataverse dataverse = new Dataverse(dataset.getDataverseName(), NonTaggedDataFormat.class.getName(), MetadataUtil.PENDING_NO_OP);
    Index index = primaryIndexInfo.getIndex();
    CcApplicationContext appCtx = (CcApplicationContext) ExecutionTestUtil.integrationUtil.cc.getApplicationContext();
    MetadataProvider mdProvider = new MetadataProvider(appCtx, dataverse, storageComponentProvider);
    try {
        // Let the dataset build the resource factory for its primary index.
        return dataset.getResourceFactory(mdProvider, index, primaryIndexInfo.recordType, primaryIndexInfo.metaType, primaryIndexInfo.mergePolicyFactory, primaryIndexInfo.mergePolicyProperties);
    } finally {
        // Release any metadata locks acquired through the provider.
        mdProvider.getLocks().unlock();
    }
}
Also used : CcApplicationContext(org.apache.asterix.runtime.utils.CcApplicationContext) ICcApplicationContext(org.apache.asterix.common.dataflow.ICcApplicationContext) MetadataProvider(org.apache.asterix.metadata.declared.MetadataProvider) Index(org.apache.asterix.metadata.entities.Index) NonTaggedDataFormat(org.apache.asterix.runtime.formats.NonTaggedDataFormat) Dataverse(org.apache.asterix.metadata.entities.Dataverse)
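The factory returned here is what the test harness hands to the storage layer to create the primary-index LSM resource. A minimal calling sketch, assuming a TestNodeController instance nc plus a task context, PrimaryIndexInfo, storage component provider, and Dataset that the test has already set up (all variable names below are hypothetical, not from the snippet above):

// Hypothetical test fragment; nc, taskCtx, primaryIndexInfo, storageProvider,
// and dataset are assumed to exist from earlier test setup.
IResourceFactory resourceFactory =
        nc.getPrimaryResourceFactory(taskCtx, primaryIndexInfo, storageProvider, dataset);
// resourceFactory can now be used to create the dataset's primary LSM index resource.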

Example 42 with Index

Use of org.apache.asterix.metadata.entities.Index in project asterixdb by apache.

The class MetadataNode, method getIndex:

@Override
public Index getIndex(JobId jobId, String dataverseName, String datasetName, String indexName) throws MetadataException, RemoteException {
    try {
        // Build the search key and scan the metadata's Index dataset for a match.
        ITupleReference searchKey = createTuple(dataverseName, datasetName, indexName);
        IndexTupleTranslator tupleReaderWriter = tupleTranslatorProvider.getIndexTupleTranslator(jobId, this, false);
        IValueExtractor<Index> valueExtractor = new MetadataEntityValueExtractor<>(tupleReaderWriter);
        List<Index> results = new ArrayList<>();
        searchIndex(jobId, MetadataPrimaryIndexes.INDEX_DATASET, searchKey, valueExtractor, results);
        // The (dataverse, dataset, index) key is unique, so at most one entry matches.
        if (results.isEmpty()) {
            return null;
        }
        return results.get(0);
    } catch (HyracksDataException e) {
        throw new MetadataException(e);
    }
}
Also used : MetadataEntityValueExtractor(org.apache.asterix.metadata.valueextractors.MetadataEntityValueExtractor) IndexTupleTranslator(org.apache.asterix.metadata.entitytupletranslators.IndexTupleTranslator) ITupleReference(org.apache.hyracks.dataflow.common.data.accessors.ITupleReference) ArrayList(java.util.ArrayList) IMetadataIndex(org.apache.asterix.metadata.api.IMetadataIndex) Index(org.apache.asterix.metadata.entities.Index) AbstractLSMIndex(org.apache.hyracks.storage.am.lsm.common.impls.AbstractLSMIndex) ILSMIndex(org.apache.hyracks.storage.am.lsm.common.api.ILSMIndex) IIndex(org.apache.hyracks.storage.common.IIndex) HyracksDataException(org.apache.hyracks.api.exceptions.HyracksDataException)
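Note that the lookup returns null rather than throwing when no matching index exists, and it wraps storage-level HyracksDataExceptions in MetadataException. A hedged sketch of the calling pattern, assuming a MetadataNode reference node and a JobId jobId (hypothetical names):

// Hypothetical caller; node and jobId come from the surrounding metadata transaction.
Index idx = node.getIndex(jobId, "MyDataverse", "MyDataset", "myIndex");
if (idx == null) {
    // No such index is registered; the caller decides whether that is an error.
}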

Example 43 with Index

Use of org.apache.asterix.metadata.entities.Index in project asterixdb by apache.

The class MetadataManager, method getDatasetIndexes:

@Override
public List<Index> getDatasetIndexes(MetadataTransactionContext ctx, String dataverseName, String datasetName) throws MetadataException {
    List<Index> datasetIndexes = new ArrayList<>();
    Dataset dataset = findDataset(ctx, dataverseName, datasetName);
    if (dataset == null) {
        return datasetIndexes;
    }
    if (dataset.getDatasetDetails().isTemp()) {
        // for temp datasets
        datasetIndexes = cache.getDatasetIndexes(dataverseName, datasetName);
    } else {
        try {
            // for persistent datasets
            datasetIndexes = metadataNode.getDatasetIndexes(ctx.getJobId(), dataverseName, datasetName);
        } catch (RemoteException e) {
            throw new MetadataException(e);
        }
    }
    return datasetIndexes;
}
Also used : Dataset(org.apache.asterix.metadata.entities.Dataset) ArrayList(java.util.ArrayList) Index(org.apache.asterix.metadata.entities.Index) RemoteException(java.rmi.RemoteException)
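Because the method returns an empty list rather than null when the dataset is missing, callers can iterate the result unconditionally. A minimal usage sketch, assuming the MetadataManager.INSTANCE singleton and an open MetadataTransactionContext mdTxnCtx (hypothetical variable name):

// Hypothetical caller inside an active metadata transaction.
List<Index> indexes = MetadataManager.INSTANCE.getDatasetIndexes(mdTxnCtx, "MyDataverse", "MyDataset");
for (Index idx : indexes) {
    // Inspect each index (primary and secondary) of the dataset.
    System.out.println(idx.getIndexName());
}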

Example 44 with Index

Use of org.apache.asterix.metadata.entities.Index in project asterixdb by apache.

The class MetadataManager, method getIndex:

@Override
public Index getIndex(MetadataTransactionContext ctx, String dataverseName, String datasetName, String indexName) throws MetadataException {
    // First look in the context to see if this transaction created the
    // requested index itself (but the index is still uncommitted).
    Index index = ctx.getIndex(dataverseName, datasetName, indexName);
    if (index != null) {
        // Don't add this index to the cache, since it is still uncommitted.
        return index;
    }
    if (ctx.indexIsDropped(dataverseName, datasetName, indexName)) {
        // The index has been dropped by this transaction but could still be in the cache.
        return null;
    }
    index = cache.getIndex(dataverseName, datasetName, indexName);
    if (index != null) {
        // Index is already in the cache, don't add it again.
        return index;
    }
    try {
        index = metadataNode.getIndex(ctx.getJobId(), dataverseName, datasetName, indexName);
    } catch (RemoteException e) {
        throw new MetadataException(e);
    }
    // We fetched the index from the MetadataNode. Add it to the cache when this transaction commits.
    if (index != null) {
        ctx.addIndex(index);
    }
    return index;
}
Also used : Index(org.apache.asterix.metadata.entities.Index) RemoteException(java.rmi.RemoteException)
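The method is a read-through lookup: transaction-local state first, then the metadata cache, then the remote MetadataNode, with a remote hit registered on the context so the cache is populated when the transaction commits. A sketch of the surrounding transaction discipline, assuming the usual begin/commit/abort helpers on MetadataManager.INSTANCE (checked exception signatures simplified relative to the real API):

// Hedged sketch; helper names and exception handling are assumptions.
MetadataTransactionContext mdTxnCtx = MetadataManager.INSTANCE.beginTransaction();
try {
    Index idx = MetadataManager.INSTANCE.getIndex(mdTxnCtx, "MyDataverse", "MyDataset", "myIndex");
    // idx may be null if the index was dropped or never created.
    MetadataManager.INSTANCE.commitTransaction(mdTxnCtx);
} catch (Exception e) {
    MetadataManager.INSTANCE.abortTransaction(mdTxnCtx);
    throw e;
}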

Example 45 with Index

Use of org.apache.asterix.metadata.entities.Index in project asterixdb by apache.

The class DatasetUtil, method createPrimaryIndexUpsertOp:

/**
 * Creates a primary index upsert operator for a given dataset.
 *
 * @param spec
 *            the job specification.
 * @param metadataProvider
 *            the metadata provider.
 * @param dataset
 *            the dataset to upsert.
 * @param inputRecordDesc
 *            the record descriptor for an input tuple.
 * @param fieldPermutation
 *            the field permutation according to the input.
 * @param missingWriterFactory
 *            the factory for customizing missing value serialization.
 * @return a primary index upsert operator and its location constraints.
 * @throws AlgebricksException
 */
public static Pair<IOperatorDescriptor, AlgebricksPartitionConstraint> createPrimaryIndexUpsertOp(JobSpecification spec, MetadataProvider metadataProvider, Dataset dataset, RecordDescriptor inputRecordDesc, int[] fieldPermutation, IMissingWriterFactory missingWriterFactory) throws AlgebricksException {
    int numKeys = dataset.getPrimaryKeys().size();
    int numFilterFields = DatasetUtil.getFilterField(dataset) == null ? 0 : 1;
    ARecordType itemType = (ARecordType) metadataProvider.findType(dataset);
    ARecordType metaItemType = (ARecordType) metadataProvider.findMetaType(dataset);
    try {
        Index primaryIndex = metadataProvider.getIndex(dataset.getDataverseName(), dataset.getDatasetName(), dataset.getDatasetName());
        Pair<IFileSplitProvider, AlgebricksPartitionConstraint> splitsAndConstraint = metadataProvider.getSplitProviderAndConstraints(dataset);
        // prepare callback
        JobId jobId = ((JobEventListenerFactory) spec.getJobletEventListenerFactory()).getJobId();
        int[] primaryKeyFields = new int[numKeys];
        for (int i = 0; i < numKeys; i++) {
            primaryKeyFields[i] = i;
        }
        boolean hasSecondaries = metadataProvider.getDatasetIndexes(dataset.getDataverseName(), dataset.getDatasetName()).size() > 1;
        IStorageComponentProvider storageComponentProvider = metadataProvider.getStorageComponentProvider();
        IModificationOperationCallbackFactory modificationCallbackFactory = dataset.getModificationCallbackFactory(storageComponentProvider, primaryIndex, jobId, IndexOperation.UPSERT, primaryKeyFields);
        ISearchOperationCallbackFactory searchCallbackFactory = dataset.getSearchCallbackFactory(storageComponentProvider, primaryIndex, jobId, IndexOperation.UPSERT, primaryKeyFields);
        IIndexDataflowHelperFactory idfh = new IndexDataflowHelperFactory(storageComponentProvider.getStorageManager(), splitsAndConstraint.first);
        LSMPrimaryUpsertOperatorDescriptor op;
        ITypeTraits[] outputTypeTraits = new ITypeTraits[inputRecordDesc.getFieldCount() + (dataset.hasMetaPart() ? 2 : 1) + numFilterFields];
        ISerializerDeserializer<?>[] outputSerDes = new ISerializerDeserializer[inputRecordDesc.getFieldCount() + (dataset.hasMetaPart() ? 2 : 1) + numFilterFields];
        // add the previous record first
        int f = 0;
        outputSerDes[f] = FormatUtils.getDefaultFormat().getSerdeProvider().getSerializerDeserializer(itemType);
        f++;
        // add the previous meta second
        if (dataset.hasMetaPart()) {
            outputSerDes[f] = FormatUtils.getDefaultFormat().getSerdeProvider().getSerializerDeserializer(metaItemType);
            outputTypeTraits[f] = FormatUtils.getDefaultFormat().getTypeTraitProvider().getTypeTrait(metaItemType);
            f++;
        }
        // add the previous filter third
        int fieldIdx = -1;
        if (numFilterFields > 0) {
            String filterField = DatasetUtil.getFilterField(dataset).get(0);
            String[] fieldNames = itemType.getFieldNames();
            int i = 0;
            for (; i < fieldNames.length; i++) {
                if (fieldNames[i].equals(filterField)) {
                    break;
                }
            }
            fieldIdx = i;
            outputTypeTraits[f] = FormatUtils.getDefaultFormat().getTypeTraitProvider().getTypeTrait(itemType.getFieldTypes()[fieldIdx]);
            outputSerDes[f] = FormatUtils.getDefaultFormat().getSerdeProvider().getSerializerDeserializer(itemType.getFieldTypes()[fieldIdx]);
            f++;
        }
        for (int j = 0; j < inputRecordDesc.getFieldCount(); j++) {
            outputTypeTraits[j + f] = inputRecordDesc.getTypeTraits()[j];
            outputSerDes[j + f] = inputRecordDesc.getFields()[j];
        }
        RecordDescriptor outputRecordDesc = new RecordDescriptor(outputSerDes, outputTypeTraits);
        op = new LSMPrimaryUpsertOperatorDescriptor(spec, outputRecordDesc, fieldPermutation, idfh, missingWriterFactory, modificationCallbackFactory, searchCallbackFactory, dataset.getFrameOpCallbackFactory(), numKeys, itemType, fieldIdx, hasSecondaries);
        return new Pair<>(op, splitsAndConstraint.second);
    } catch (MetadataException me) {
        throw new AlgebricksException(me);
    }
}
Also used : LSMPrimaryUpsertOperatorDescriptor(org.apache.asterix.runtime.operators.LSMPrimaryUpsertOperatorDescriptor) IFileSplitProvider(org.apache.hyracks.dataflow.std.file.IFileSplitProvider) RecordDescriptor(org.apache.hyracks.api.dataflow.value.RecordDescriptor) Index(org.apache.asterix.metadata.entities.Index) AMutableString(org.apache.asterix.om.base.AMutableString) AString(org.apache.asterix.om.base.AString) MetadataException(org.apache.asterix.metadata.MetadataException) AlgebricksPartitionConstraint(org.apache.hyracks.algebricks.common.constraints.AlgebricksPartitionConstraint) IIndexDataflowHelperFactory(org.apache.hyracks.storage.am.common.dataflow.IIndexDataflowHelperFactory) IndexDataflowHelperFactory(org.apache.hyracks.storage.am.common.dataflow.IndexDataflowHelperFactory) JobId(org.apache.asterix.common.transactions.JobId) Pair(org.apache.hyracks.algebricks.common.utils.Pair) IStorageComponentProvider(org.apache.asterix.common.context.IStorageComponentProvider) ITypeTraits(org.apache.hyracks.api.dataflow.value.ITypeTraits) AlgebricksException(org.apache.hyracks.algebricks.common.exceptions.AlgebricksException) JobEventListenerFactory(org.apache.asterix.runtime.job.listener.JobEventListenerFactory) AlgebricksPartitionConstraint(org.apache.hyracks.algebricks.common.constraints.AlgebricksPartitionConstraint) ISerializerDeserializer(org.apache.hyracks.api.dataflow.value.ISerializerDeserializer) ISearchOperationCallbackFactory(org.apache.hyracks.storage.am.common.api.ISearchOperationCallbackFactory) IIndexDataflowHelperFactory(org.apache.hyracks.storage.am.common.dataflow.IIndexDataflowHelperFactory) IModificationOperationCallbackFactory(org.apache.hyracks.storage.am.common.api.IModificationOperationCallbackFactory) ARecordType(org.apache.asterix.om.types.ARecordType)
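The returned Pair bundles the operator descriptor with the partition constraint that still has to be applied to the job. A hedged wiring sketch, assuming Algebricks' AlgebricksPartitionConstraintHelper and an already-populated JobSpecification (connectors to upstream and downstream operators are elided):

// Hypothetical caller; spec, metadataProvider, dataset, inputRecordDesc,
// fieldPermutation, and missingWriterFactory are assumed to exist.
Pair<IOperatorDescriptor, AlgebricksPartitionConstraint> upsert =
        DatasetUtil.createPrimaryIndexUpsertOp(spec, metadataProvider, dataset,
                inputRecordDesc, fieldPermutation, missingWriterFactory);
AlgebricksPartitionConstraintHelper.setPartitionConstraintInJobSpec(spec, upsert.first, upsert.second);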

Aggregations

Types used in these examples, with occurrence counts:

Index (org.apache.asterix.metadata.entities.Index): 53
AlgebricksException (org.apache.hyracks.algebricks.common.exceptions.AlgebricksException): 26
Dataset (org.apache.asterix.metadata.entities.Dataset): 25
ArrayList (java.util.ArrayList): 24
MetadataException (org.apache.asterix.metadata.MetadataException): 20
AlgebricksPartitionConstraint (org.apache.hyracks.algebricks.common.constraints.AlgebricksPartitionConstraint): 16
ARecordType (org.apache.asterix.om.types.ARecordType): 15
IFileSplitProvider (org.apache.hyracks.dataflow.std.file.IFileSplitProvider): 15
Pair (org.apache.hyracks.algebricks.common.utils.Pair): 14
LogicalVariable (org.apache.hyracks.algebricks.core.algebra.base.LogicalVariable): 14
JobSpecification (org.apache.hyracks.api.job.JobSpecification): 13
IIndexDataflowHelperFactory (org.apache.hyracks.storage.am.common.dataflow.IIndexDataflowHelperFactory): 13
IndexDataflowHelperFactory (org.apache.hyracks.storage.am.common.dataflow.IndexDataflowHelperFactory): 13
AsterixException (org.apache.asterix.common.exceptions.AsterixException): 12
IAType (org.apache.asterix.om.types.IAType): 12
IDataSourceIndex (org.apache.hyracks.algebricks.core.algebra.metadata.IDataSourceIndex): 12
IOException (java.io.IOException): 11
CompilationException (org.apache.asterix.common.exceptions.CompilationException): 11
List (java.util.List): 10
HyracksDataException (org.apache.hyracks.api.exceptions.HyracksDataException): 10