
Example 6 with InferenceEngineException

Use of org.apache.rya.rdftriplestore.inference.InferenceEngineException in project incubator-rya by apache.

From the class AccumuloBatchUpdatePCJ, method connectToRya:

private Sail connectToRya(final String ryaInstanceName) throws RyaClientException {
    try {
        final AccumuloConnectionDetails connectionDetails = super.getAccumuloConnectionDetails();
        final AccumuloRdfConfiguration ryaConf = new AccumuloRdfConfiguration();
        ryaConf.setTablePrefix(ryaInstanceName);
        ryaConf.set(ConfigUtils.CLOUDBASE_USER, connectionDetails.getUsername());
        ryaConf.set(ConfigUtils.CLOUDBASE_PASSWORD, new String(connectionDetails.getUserPass()));
        ryaConf.set(ConfigUtils.CLOUDBASE_ZOOKEEPERS, connectionDetails.getZookeepers());
        ryaConf.set(ConfigUtils.CLOUDBASE_INSTANCE, connectionDetails.getInstanceName());
        // Turn PCJs off so that we will only scan the core Rya tables while building the PCJ results.
        ryaConf.set(ConfigUtils.USE_PCJ, "false");
        return RyaSailFactory.getInstance(ryaConf);
    } catch (SailException | AccumuloException | AccumuloSecurityException | RyaDAOException | InferenceEngineException e) {
        throw new RyaClientException("Could not connect to the Rya instance named '" + ryaInstanceName + "'.", e);
    }
}
Also used : AccumuloException(org.apache.accumulo.core.client.AccumuloException) RyaClientException(org.apache.rya.api.client.RyaClientException) RyaDAOException(org.apache.rya.api.persist.RyaDAOException) AccumuloSecurityException(org.apache.accumulo.core.client.AccumuloSecurityException) InferenceEngineException(org.apache.rya.rdftriplestore.inference.InferenceEngineException) SailException(org.openrdf.sail.SailException) AccumuloRdfConfiguration(org.apache.rya.accumulo.AccumuloRdfConfiguration)
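
The sketch below is illustrative only and is not part of the project: it shows one way the Sail returned by connectToRya could be consumed, by wrapping it in a SailRepository, evaluating a SPARQL query, and then releasing the connection and shutting the Sail down. The query string is a placeholder, and QueryLanguage and TupleQueryResult are assumed to be imported from org.openrdf.query.

// Hypothetical usage sketch; the SPARQL query is a placeholder.
private void querySketch(final Sail sail) throws Exception {
    final SailRepository repo = new SailRepository(sail);
    SailRepositoryConnection conn = null;
    try {
        conn = repo.getConnection();
        // Evaluate a simple SELECT over the Rya instance backing the Sail.
        final TupleQueryResult results = conn.prepareTupleQuery(QueryLanguage.SPARQL,
                "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10").evaluate();
        try {
            while (results.hasNext()) {
                System.out.println(results.next());
            }
        } finally {
            results.close();
        }
    } finally {
        if (conn != null) {
            conn.close();
        }
        // Shut the Sail down once the caller is done with the Rya instance.
        sail.shutDown();
    }
}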

Example 7 with InferenceEngineException

Use of org.apache.rya.rdftriplestore.inference.InferenceEngineException in project incubator-rya by apache.

From the class MongoLoadStatementsFile, method loadStatements:

@Override
public void loadStatements(final String ryaInstanceName, final Path statementsFile, final RDFFormat format) throws InstanceDoesNotExistException, RyaClientException {
    requireNonNull(ryaInstanceName);
    requireNonNull(statementsFile);
    requireNonNull(format);
    // Ensure the Rya Instance exists.
    if (!instanceExists.exists(ryaInstanceName)) {
        throw new InstanceDoesNotExistException(String.format("There is no Rya instance named '%s'.", ryaInstanceName));
    }
    Sail sail = null;
    SailRepositoryConnection sailRepoConn = null;
    try {
        // Get a Sail object that is connected to the Rya instance.
        final MongoDBRdfConfiguration ryaConf = connectionDetails.build(ryaInstanceName);
        sail = RyaSailFactory.getInstance(ryaConf);
        final SailRepository sailRepo = new SailRepository(sail);
        sailRepoConn = sailRepo.getConnection();
        // Load the file.
        sailRepoConn.add(statementsFile.toFile(), null, format);
    } catch (SailException | RyaDAOException | InferenceEngineException | AccumuloException | AccumuloSecurityException e) {
        throw new RyaClientException("Could not load statements into Rya because of a problem while creating the Sail object.", e);
    } catch (RDFParseException | RepositoryException | IOException e) {
        throw new RyaClientException("Could not load the statements into Rya.", e);
    } finally {
        // Close the resources that were opened.
        if (sailRepoConn != null) {
            try {
                sailRepoConn.close();
            } catch (final RepositoryException e) {
                log.error("Couldn't close the SailRepositoryConnection object.", e);
            }
        }
        if (sail != null) {
            try {
                sail.shutDown();
            } catch (final SailException e) {
                log.error("Couldn't close the Sail object.", e);
            }
        }
    }
}
Also used : AccumuloException(org.apache.accumulo.core.client.AccumuloException) RyaClientException(org.apache.rya.api.client.RyaClientException) SailRepository(org.openrdf.repository.sail.SailRepository) InferenceEngineException(org.apache.rya.rdftriplestore.inference.InferenceEngineException) RepositoryException(org.openrdf.repository.RepositoryException) InstanceDoesNotExistException(org.apache.rya.api.client.InstanceDoesNotExistException) SailException(org.openrdf.sail.SailException) IOException(java.io.IOException) SailRepositoryConnection(org.openrdf.repository.sail.SailRepositoryConnection) Sail(org.openrdf.sail.Sail) RyaDAOException(org.apache.rya.api.persist.RyaDAOException) AccumuloSecurityException(org.apache.accumulo.core.client.AccumuloSecurityException) MongoDBRdfConfiguration(org.apache.rya.mongodb.MongoDBRdfConfiguration) RDFParseException(org.openrdf.rio.RDFParseException)
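
A hedged caller sketch for the command above follows; loadStatementsFile stands for an already constructed MongoLoadStatementsFile, the instance name and file path are placeholders, and Paths and RDFFormat are assumed to come from java.nio.file and org.openrdf.rio respectively.

// Hypothetical caller sketch; the instance name and path are placeholders.
final Path turtleFile = Paths.get("/tmp/statements.ttl");
try {
    loadStatementsFile.loadStatements("rya_", turtleFile, RDFFormat.TURTLE);
} catch (final InstanceDoesNotExistException e) {
    log.error("No Rya instance with that name exists.", e);
} catch (final RyaClientException e) {
    log.error("The statements file could not be loaded.", e);
}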

Example 8 with InferenceEngineException

Use of org.apache.rya.rdftriplestore.inference.InferenceEngineException in project incubator-rya by apache.

From the class MongoLoadStatements, method loadStatements:

@Override
public void loadStatements(final String ryaInstanceName, final Iterable<? extends Statement> statements) throws InstanceDoesNotExistException, RyaClientException {
    requireNonNull(ryaInstanceName);
    requireNonNull(statements);
    // Ensure the Rya Instance exists.
    if (!instanceExists.exists(ryaInstanceName)) {
        throw new InstanceDoesNotExistException(String.format("There is no Rya instance named '%s'.", ryaInstanceName));
    }
    Sail sail = null;
    SailRepositoryConnection sailRepoConn = null;
    try {
        // Get a Sail object that is connected to the Rya instance.
        final MongoDBRdfConfiguration ryaConf = connectionDetails.build(ryaInstanceName);
        sail = RyaSailFactory.getInstance(ryaConf);
        final SailRepository sailRepo = new SailRepository(sail);
        sailRepoConn = sailRepo.getConnection();
        // Load the statements.
        sailRepoConn.add(statements);
    } catch (SailException | RyaDAOException | InferenceEngineException | AccumuloException | AccumuloSecurityException e) {
        throw new RyaClientException("Could not load statements into Rya because of a problem while creating the Sail object.", e);
    } catch (final RepositoryException e) {
        throw new RyaClientException("Could not load the statements into Rya.", e);
    } finally {
        // Close the resources that were opened.
        if (sailRepoConn != null) {
            try {
                sailRepoConn.close();
            } catch (final RepositoryException e) {
                log.error("Couldn't close the SailRepositoryConnection object.", e);
            }
        }
        if (sail != null) {
            try {
                sail.shutDown();
            } catch (final SailException e) {
                log.error("Couldn't close the Sail object.", e);
            }
        }
    }
}
Also used : AccumuloException(org.apache.accumulo.core.client.AccumuloException) RyaClientException(org.apache.rya.api.client.RyaClientException) SailRepository(org.openrdf.repository.sail.SailRepository) InferenceEngineException(org.apache.rya.rdftriplestore.inference.InferenceEngineException) RepositoryException(org.openrdf.repository.RepositoryException) InstanceDoesNotExistException(org.apache.rya.api.client.InstanceDoesNotExistException) SailException(org.openrdf.sail.SailException) SailRepositoryConnection(org.openrdf.repository.sail.SailRepositoryConnection) Sail(org.openrdf.sail.Sail) RyaDAOException(org.apache.rya.api.persist.RyaDAOException) AccumuloSecurityException(org.apache.accumulo.core.client.AccumuloSecurityException) MongoDBRdfConfiguration(org.apache.rya.mongodb.MongoDBRdfConfiguration)
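
A similar hedged caller sketch for the Iterable overload is shown below; loadStatementsCommand stands for an already constructed MongoLoadStatements, the URIs are placeholders, and ValueFactory, ValueFactoryImpl, and Collections are assumed to come from org.openrdf.model, org.openrdf.model.impl, and java.util.

// Hypothetical caller sketch; the subject, predicate, and object URIs are placeholders.
final ValueFactory vf = ValueFactoryImpl.getInstance();
final Statement statement = vf.createStatement(
        vf.createURI("urn:example#alice"),
        vf.createURI("urn:example#talksTo"),
        vf.createURI("urn:example#bob"));
try {
    loadStatementsCommand.loadStatements("rya_", Collections.singletonList(statement));
} catch (final InstanceDoesNotExistException e) {
    log.error("No Rya instance with that name exists.", e);
} catch (final RyaClientException e) {
    log.error("The statements could not be loaded.", e);
}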

Example 9 with InferenceEngineException

Use of org.apache.rya.rdftriplestore.inference.InferenceEngineException in project incubator-rya by apache.

From the class StatementPatternStorage, method addInferredRanges:

protected void addInferredRanges(String tablePrefix, Job job) throws IOException {
    logger.info("Adding inferences to statement pattern[subject:" + subject_value + ", predicate:" + predicate_value + ", object:" + object_value + "]");
    // inference engine
    AccumuloRyaDAO ryaDAO = new AccumuloRyaDAO();
    InferenceEngine inferenceEngine = new InferenceEngine();
    try {
        AccumuloRdfConfiguration rdfConf = new AccumuloRdfConfiguration(job.getConfiguration());
        rdfConf.setTablePrefix(tablePrefix);
        ryaDAO.setConf(rdfConf);
        try {
            if (!mock) {
                ryaDAO.setConnector(new ZooKeeperInstance(inst, zookeepers).getConnector(user, userP.getBytes(StandardCharsets.UTF_8)));
            } else {
                ryaDAO.setConnector(new MockInstance(inst).getConnector(user, userP.getBytes(StandardCharsets.UTF_8)));
            }
        } catch (Exception e) {
            throw new IOException(e);
        }
        ryaDAO.init();
        inferenceEngine.setConf(rdfConf);
        inferenceEngine.setRyaDAO(ryaDAO);
        inferenceEngine.setSchedule(false);
        inferenceEngine.init();
        // is it subclassof or subpropertyof
        if (RDF.TYPE.equals(predicate_value)) {
            // try subclassof
            Collection<URI> parents = inferenceEngine.findParents(inferenceEngine.getSubClassOfGraph(), (URI) object_value);
            if (parents != null && parents.size() > 0) {
                // add all relationships
                for (URI parent : parents) {
                    Map.Entry<TABLE_LAYOUT, Range> temp = createRange(subject_value, predicate_value, parent);
                    Range range = temp.getValue();
                    if (logger.isDebugEnabled()) {
                        logger.debug("Found subClassOf relationship [type:" + object_value + " is subClassOf:" + parent + "]");
                    }
                    addRange(range);
                }
            }
        } else if (predicate_value != null) {
            // subpropertyof check
            Set<URI> parents = inferenceEngine.findParents(inferenceEngine.getSubPropertyOfGraph(), (URI) predicate_value);
            for (URI parent : parents) {
                Map.Entry<TABLE_LAYOUT, Range> temp = createRange(subject_value, parent, object_value);
                Range range = temp.getValue();
                if (logger.isDebugEnabled()) {
                    logger.debug("Found subPropertyOf relationship [type:" + predicate_value + " is subPropertyOf:" + parent + "]");
                }
                addRange(range);
            }
        }
    } catch (Exception e) {
        logger.error("Exception in adding inferred ranges", e);
        throw new IOException(e);
    } finally {
        if (inferenceEngine != null) {
            try {
                inferenceEngine.destroy();
            } catch (InferenceEngineException e) {
                logger.error("Exception closing InferenceEngine", e);
            }
        }
        if (ryaDAO != null) {
            try {
                ryaDAO.destroy();
            } catch (RyaDAOException e) {
                logger.error("Exception closing ryadao", e);
            }
        }
    }
}
Also used : AccumuloRyaDAO(org.apache.rya.accumulo.AccumuloRyaDAO) Set(java.util.Set) InferenceEngineException(org.apache.rya.rdftriplestore.inference.InferenceEngineException) IOException(java.io.IOException) Range(org.apache.accumulo.core.data.Range) ByteRange(org.apache.rya.api.query.strategy.ByteRange) AccumuloRdfConfiguration(org.apache.rya.accumulo.AccumuloRdfConfiguration) URI(org.openrdf.model.URI) RyaURI(org.apache.rya.api.domain.RyaURI) MalformedQueryException(org.openrdf.query.MalformedQueryException) RyaDAOException(org.apache.rya.api.persist.RyaDAOException) ZooKeeperInstance(org.apache.accumulo.core.client.ZooKeeperInstance) TABLE_LAYOUT(org.apache.rya.api.RdfCloudTripleStoreConstants.TABLE_LAYOUT) InferenceEngine(org.apache.rya.rdftriplestore.inference.InferenceEngine) MockInstance(org.apache.accumulo.core.client.mock.MockInstance) Map(java.util.Map)
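
The setup and teardown in addInferredRanges can be distilled into the hedged, standalone sketch below, which wires an InferenceEngine to an AccumuloRyaDAO backed by a MockInstance so it can run without a live Accumulo cluster. The table prefix, instance name, credentials, and example type URI are placeholders, URIImpl is assumed to come from org.openrdf.model.impl, and the sketch reuses the class's logger.

// Hypothetical standalone sketch of the InferenceEngine setup and teardown shown above.
private void findParentsSketch() throws Exception {
    final AccumuloRdfConfiguration conf = new AccumuloRdfConfiguration();
    conf.setTablePrefix("rya_");
    final AccumuloRyaDAO dao = new AccumuloRyaDAO();
    dao.setConf(conf);
    // MockInstance keeps everything in memory; the credentials are placeholders.
    dao.setConnector(new MockInstance("instance").getConnector("root", "".getBytes(StandardCharsets.UTF_8)));
    dao.init();
    final InferenceEngine engine = new InferenceEngine();
    engine.setConf(conf);
    engine.setRyaDAO(dao);
    engine.setSchedule(false);
    try {
        engine.init();
        // Look up rdfs:subClassOf ancestors of a placeholder type.
        final Collection<URI> parents = engine.findParents(
                engine.getSubClassOfGraph(), new URIImpl("urn:example#Student"));
        logger.info("Inferred parents: " + parents);
    } finally {
        // destroy() may throw InferenceEngineException / RyaDAOException.
        engine.destroy();
        dao.destroy();
    }
}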

Example 10 with InferenceEngineException

Use of org.apache.rya.rdftriplestore.inference.InferenceEngineException in project incubator-rya by apache.

From the class ParallelEvaluationStrategyImpl, method evaluate:

public CloseableIteration<BindingSet, QueryEvaluationException> evaluate(final StatementPattern sp, Collection<BindingSet> bindings) throws QueryEvaluationException {
    final Var subjVar = sp.getSubjectVar();
    final Var predVar = sp.getPredicateVar();
    final Var objVar = sp.getObjectVar();
    final Var cntxtVar = sp.getContextVar();
    List<Map.Entry<Statement, BindingSet>> stmts = new ArrayList<Map.Entry<Statement, BindingSet>>();
    Iteration<? extends Map.Entry<Statement, BindingSet>, QueryEvaluationException> iter;
    if (sp instanceof FixedStatementPattern) {
        Collection<Map.Entry<Statement, BindingSet>> coll = Lists.newArrayList();
        for (BindingSet binding : bindings) {
            Value subjValue = getVarValue(subjVar, binding);
            Value predValue = getVarValue(predVar, binding);
            Value objValue = getVarValue(objVar, binding);
            Resource contxtValue = (Resource) getVarValue(cntxtVar, binding);
            for (Statement st : ((FixedStatementPattern) sp).statements) {
                if (!((subjValue != null && !subjValue.equals(st.getSubject())) || (predValue != null && !predValue.equals(st.getPredicate())) || (objValue != null && !objValue.equals(st.getObject())))) {
                    coll.add(new RdfCloudTripleStoreUtils.CustomEntry<Statement, BindingSet>(st, binding));
                }
            }
        }
        iter = new IteratorIteration(coll.iterator());
    } else if (sp instanceof TransitivePropertySP && ((subjVar != null && subjVar.getValue() != null) || (objVar != null && objVar.getValue() != null)) && sp.getPredicateVar() != null) {
        // if this is a transitive prop ref, we need to make sure that either the subj or obj is not null
        // TODO: Cannot handle an open-ended transitive property where subj and obj are null
        // TODO: Should one day handle filling in the subj or obj with bindings and working this
        // TODO: a lot of assumptions, and might be a large set returned causing an OME
        Set<Statement> sts = null;
        try {
            sts = inferenceEngine.findTransitiveProperty((Resource) getVarValue(subjVar), (URI) getVarValue(predVar), getVarValue(objVar), (Resource) getVarValue(cntxtVar));
        } catch (InferenceEngineException e) {
            throw new QueryEvaluationException(e);
        }
        Collection<Map.Entry<Statement, BindingSet>> coll = new ArrayList();
        for (BindingSet binding : bindings) {
            for (Statement st : sts) {
                coll.add(new RdfCloudTripleStoreUtils.CustomEntry<Statement, BindingSet>(st, binding));
            }
        }
        iter = new IteratorIteration(coll.iterator());
    } else {
        for (BindingSet binding : bindings) {
            Value subjValue = getVarValue(subjVar, binding);
            Value predValue = getVarValue(predVar, binding);
            Value objValue = getVarValue(objVar, binding);
            Resource contxtValue = (Resource) getVarValue(cntxtVar, binding);
            if ((subjValue != null && !(subjValue instanceof Resource)) || (predValue != null && !(predValue instanceof URI))) {
                continue;
            }
            stmts.add(new RdfCloudTripleStoreUtils.CustomEntry<Statement, BindingSet>(new NullableStatementImpl((Resource) subjValue, (URI) predValue, objValue, contxtValue), binding));
        }
        if (stmts.size() == 0) {
            return new EmptyIteration();
        }
        iter = ((RdfCloudTripleStoreConnection.StoreTripleSource) tripleSource).getStatements(stmts);
    }
    return new ConvertingIteration<Map.Entry<Statement, BindingSet>, BindingSet, QueryEvaluationException>(iter) {

        @Override
        protected BindingSet convert(Map.Entry<Statement, BindingSet> stbs) throws QueryEvaluationException {
            Statement st = stbs.getKey();
            BindingSet bs = stbs.getValue();
            QueryBindingSet result = new QueryBindingSet(bs);
            // Only add a binding when the Var is not constant and the result does not already contain a Value for that Var name.
            if (subjVar != null && !subjVar.isConstant() && !result.hasBinding(subjVar.getName())) {
                result.addBinding(subjVar.getName(), st.getSubject());
            }
            if (predVar != null && !predVar.isConstant() && !result.hasBinding(predVar.getName())) {
                result.addBinding(predVar.getName(), st.getPredicate());
            }
            if (objVar != null && !objVar.isConstant() && !result.hasBinding(objVar.getName())) {
                result.addBinding(objVar.getName(), st.getObject());
            }
            if (cntxtVar != null && !cntxtVar.isConstant() && !result.hasBinding(cntxtVar.getName()) && st.getContext() != null) {
                result.addBinding(cntxtVar.getName(), st.getContext());
            }
            return result;
        }
    };
}
Also used : QueryBindingSet(org.openrdf.query.algebra.evaluation.QueryBindingSet) BindingSet(org.openrdf.query.BindingSet) Set(java.util.Set) StoreTripleSource(org.apache.rya.rdftriplestore.RdfCloudTripleStoreConnection.StoreTripleSource) ConvertingIteration(info.aduna.iteration.ConvertingIteration) Var(org.openrdf.query.algebra.Var) ArrayList(java.util.ArrayList) URI(org.openrdf.model.URI) FixedStatementPattern(org.apache.rya.rdftriplestore.utils.FixedStatementPattern) Statement(org.openrdf.model.Statement) EmptyIteration(info.aduna.iteration.EmptyIteration) Resource(org.openrdf.model.Resource) InferenceEngineException(org.apache.rya.rdftriplestore.inference.InferenceEngineException) TransitivePropertySP(org.apache.rya.rdftriplestore.utils.TransitivePropertySP) RdfCloudTripleStoreUtils(org.apache.rya.api.RdfCloudTripleStoreUtils) NullableStatementImpl(org.apache.rya.api.utils.NullableStatementImpl) QueryEvaluationException(org.openrdf.query.QueryEvaluationException) Value(org.openrdf.model.Value) IteratorIteration(info.aduna.iteration.IteratorIteration) Collection(java.util.Collection) Map(java.util.Map)
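
For context, the hedged sketch below shows one way a caller might invoke evaluate with a fully unbound pattern; strategy stands for an already constructed ParallelEvaluationStrategyImpl, and StatementPattern, Collections, and CloseableIteration are assumed to come from org.openrdf.query.algebra, java.util, and info.aduna.iteration.

// Hypothetical caller sketch; 'strategy' is assumed to already exist.
private void evaluateSketch(final ParallelEvaluationStrategyImpl strategy) throws QueryEvaluationException {
    // A fully unbound pattern streams every matching statement back as bindings.
    final StatementPattern sp = new StatementPattern(new Var("s"), new Var("p"), new Var("o"));
    final Collection<BindingSet> bindings = Collections.<BindingSet>singletonList(new QueryBindingSet());
    final CloseableIteration<BindingSet, QueryEvaluationException> results = strategy.evaluate(sp, bindings);
    try {
        while (results.hasNext()) {
            System.out.println(results.next());
        }
    } finally {
        results.close();
    }
}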

Aggregations

InferenceEngineException (org.apache.rya.rdftriplestore.inference.InferenceEngineException): 12
AccumuloException (org.apache.accumulo.core.client.AccumuloException): 10
AccumuloSecurityException (org.apache.accumulo.core.client.AccumuloSecurityException): 10
RyaDAOException (org.apache.rya.api.persist.RyaDAOException): 10
SailException (org.openrdf.sail.SailException): 10
RyaClientException (org.apache.rya.api.client.RyaClientException): 9
RepositoryException (org.openrdf.repository.RepositoryException): 7
Sail (org.openrdf.sail.Sail): 7
AccumuloRdfConfiguration (org.apache.rya.accumulo.AccumuloRdfConfiguration): 6
InstanceDoesNotExistException (org.apache.rya.api.client.InstanceDoesNotExistException): 6
SailRepository (org.openrdf.repository.sail.SailRepository): 5
SailRepositoryConnection (org.openrdf.repository.sail.SailRepositoryConnection): 5
IOException (java.io.IOException): 4
MongoDBRdfConfiguration (org.apache.rya.mongodb.MongoDBRdfConfiguration): 4
MalformedQueryException (org.openrdf.query.MalformedQueryException): 4
ByteArrayOutputStream (java.io.ByteArrayOutputStream): 2
DecimalFormat (java.text.DecimalFormat): 2
Map (java.util.Map): 2
Set (java.util.Set): 2
RyaDetails (org.apache.rya.api.instance.RyaDetails): 2