
Example 36 with StatementImpl

use of org.openrdf.model.impl.StatementImpl in project incubator-rya by apache.

the class AccumuloTemporalIndexerTest method testDelete.

@Test
public void testDelete() throws IOException, AccumuloException, AccumuloSecurityException, TableNotFoundException, TableExistsException, NoSuchAlgorithmException {
    // count rows expected to store:
    int rowsStoredExpected = 0;
    ValueFactory vf = new ValueFactoryImpl();
    URI pred1_atTime = vf.createURI(URI_PROPERTY_AT_TIME);
    URI pred2_circa = vf.createURI(URI_PROPERTY_CIRCA);
    final String testDate2014InBRST = "2014-12-31T23:59:59-02:00";
    final String testDate2016InET = "2016-12-31T20:59:59-05:00";
    // These should be stored because their predicates are in the predicate list,
    // and both normalize to the same time of day in UTC (01:59:59), two years apart.
    Statement s1 = new StatementImpl(vf.createURI("foo:subj3"), pred1_atTime, vf.createLiteral(testDate2014InBRST));
    Statement s2 = new StatementImpl(vf.createURI("foo:subj4"), pred2_circa, vf.createLiteral(testDate2016InET));
    tIndexer.storeStatement(convertStatement(s1));
    rowsStoredExpected++;
    tIndexer.storeStatement(convertStatement(s2));
    rowsStoredExpected++;
    tIndexer.flush();
    int rowsStoredActual = printTables("junit testing: Temporal entities stored in testDelete before delete", System.out, null);
    // 4 index entries per statement
    Assert.assertEquals("Number of rows stored.", rowsStoredExpected * 4, rowsStoredActual);
    tIndexer.deleteStatement(convertStatement(s1));
    tIndexer.deleteStatement(convertStatement(s2));
    int afterDeleteRowsStoredActual = printTables("junit testing: Temporal entities stored in testDelete after delete", System.out, null);
    Assert.assertEquals("Number of rows stored after delete.", 0, afterDeleteRowsStoredActual);
}
Also used : RyaStatement(org.apache.rya.api.domain.RyaStatement) Statement(org.openrdf.model.Statement) RdfToRyaConversions.convertStatement(org.apache.rya.api.resolver.RdfToRyaConversions.convertStatement) StatementImpl(org.openrdf.model.impl.StatementImpl) ValueFactoryImpl(org.openrdf.model.impl.ValueFactoryImpl) ValueFactory(org.openrdf.model.ValueFactory) URI(org.openrdf.model.URI) Test(org.junit.Test)
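A note on the row counts above: the test relies on its printTables helper to tally index rows. Outside the test harness, a plain Accumulo scan gives the same kind of count; the sketch below is only an illustration and assumes you already hold a Connector and know the temporal index table name (neither is defined in the example above).

import java.util.Map.Entry;

import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.client.Scanner;
import org.apache.accumulo.core.client.TableNotFoundException;
import org.apache.accumulo.core.data.Key;
import org.apache.accumulo.core.data.Value;
import org.apache.accumulo.core.security.Authorizations;

public final class TemporalIndexRowCounter {

    /** Counts every entry in the given index table; 0 is expected after a full delete. */
    public static long countRows(final Connector connector, final String tableName) throws TableNotFoundException {
        long count = 0;
        final Scanner scanner = connector.createScanner(tableName, Authorizations.EMPTY);
        try {
            for (final Entry<Key, Value> entry : scanner) {
                count++;
            }
        } finally {
            scanner.close();
        }
        return count;
    }
}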

Example 37 with StatementImpl

use of org.openrdf.model.impl.StatementImpl in project incubator-rya by apache.

the class AccumuloTemporalIndexerTest method testStoreStatement.

/**
 * Test method for {@link AccumuloTemporalIndexer#storeStatement(org.apache.rya.api.domain.RyaStatement)}.
 *
 * @throws NoSuchAlgorithmException
 */
@Test
public void testStoreStatement() throws IOException, AccumuloException, AccumuloSecurityException, TableNotFoundException, TableExistsException, NoSuchAlgorithmException {
    // count rows expected to store:
    int rowsStoredExpected = 0;
    ValueFactory vf = new ValueFactoryImpl();
    URI pred1_atTime = vf.createURI(URI_PROPERTY_AT_TIME);
    URI pred2_circa = vf.createURI(URI_PROPERTY_CIRCA);
    // Should not be stored because RDFS.LABEL is not in the predicate list.
    String validDateStringWithThirteens = "1313-12-13T13:13:13Z";
    tIndexer.storeStatement(convertStatement(new StatementImpl(vf.createURI("foo:subj1"), RDFS.LABEL, vf.createLiteral(validDateStringWithThirteens))));
    // Test: Should not store an improper date, and log a warning (log warning not tested).
    final String invalidDateString = "ThisIsAnInvalidDate";
    // The indexer now silently logs a warning for bad dates; the commented-out code below expected an exception instead:
    // boolean catchErrorThrownCorrectly = false;
    // try {
    tIndexer.storeStatement(convertStatement(new StatementImpl(vf.createURI("foo:subj2"), pred1_atTime, vf.createLiteral(invalidDateString))));
    // } catch (IllegalArgumentException e) {
    // catchErrorThrownCorrectly = true;
    // Assert.assertTrue(
    // "Invalid date parse error should include the invalid string. message=" + e.getMessage(),
    // e.getMessage().contains(invalidDateString));
    // }
    // Assert.assertTrue("Invalid date parse error should be thrown for this bad date=" + invalidDateString, catchErrorThrownCorrectly);
    // These are distinct datetime instants, expressed in different time zones.
    // BRST (Brazil Summer Time) is an arbitrary zone, chosen so the test does not depend on the local zone.
    // same as "2015-01-01T01:59:59Z"
    final String testDate2014InBRST = "2014-12-31T23:59:59-02:00";
    // two years after the first literal in UTC, same as "2017-01-01T01:59:59Z"
    final String testDate2016InET = "2016-12-31T20:59:59-05:00";
    // These should be stored because their predicates are in the predicate list,
    // and both normalize to the same time of day in UTC (01:59:59), two years apart.
    Statement s3 = new StatementImpl(vf.createURI("foo:subj3"), pred1_atTime, vf.createLiteral(testDate2014InBRST));
    Statement s4 = new StatementImpl(vf.createURI("foo:subj4"), pred2_circa, vf.createLiteral(testDate2016InET));
    tIndexer.storeStatement(convertStatement(s3));
    rowsStoredExpected++;
    tIndexer.storeStatement(convertStatement(s4));
    rowsStoredExpected++;
    // This should not be stored because the object is not a literal
    tIndexer.storeStatement(convertStatement(new StatementImpl(vf.createURI("foo:subj5"), pred1_atTime, vf.createURI("in:valid"))));
    tIndexer.flush();
    int rowsStoredActual = printTables("junit testing: Temporal entities stored in testStoreStatement", null, null);
    // 4 index entries per statement
    assertEquals("Number of rows stored.", rowsStoredExpected * 4, rowsStoredActual);
}
Also used : RyaStatement(org.apache.rya.api.domain.RyaStatement) Statement(org.openrdf.model.Statement) RdfToRyaConversions.convertStatement(org.apache.rya.api.resolver.RdfToRyaConversions.convertStatement) StatementImpl(org.openrdf.model.impl.StatementImpl) ValueFactoryImpl(org.openrdf.model.impl.ValueFactoryImpl) ValueFactory(org.openrdf.model.ValueFactory) URI(org.openrdf.model.URI) Test(org.junit.Test)
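The comments above claim specific UTC equivalents for the two zoned literals. A self-contained sketch using only java.time (no Rya or Accumulo involved) reproduces that normalization.

import java.time.OffsetDateTime;
import java.time.ZoneOffset;

public final class ZonedLiteralNormalization {

    public static void main(final String[] args) {
        // The same literals used in testStoreStatement.
        final OffsetDateTime brst = OffsetDateTime.parse("2014-12-31T23:59:59-02:00");
        final OffsetDateTime easternTime = OffsetDateTime.parse("2016-12-31T20:59:59-05:00");

        // Normalizing to UTC reproduces the values noted in the test comments:
        // 2015-01-01T01:59:59Z and 2017-01-01T01:59:59Z.
        System.out.println(brst.withOffsetSameInstant(ZoneOffset.UTC));
        System.out.println(easternTime.withOffsetSameInstant(ZoneOffset.UTC));
    }
}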

Example 38 with StatementImpl

use of org.openrdf.model.impl.StatementImpl in project incubator-rya by apache.

the class KeyParts method keyPartsForQuery.

/**
 * List all the index keys to scan for a query.  The scan strategy is set via the column qualifier, e.g. CQ_S_P_AT;
 * the column family (CF) carries the context/named graph.
 * @param queryInstant the temporal instant being queried
 * @param contraints subject, predicate, and context constraints on the query
 * @return the key parts to scan for this query
 */
public static List<KeyParts> keyPartsForQuery(final TemporalInstant queryInstant, final StatementConstraints contraints) {
    final List<KeyParts> keys = new LinkedList<KeyParts>();
    final URI urlNull = new URIImpl("urn:null");
    final Resource currentContext = contraints.getContext();
    final boolean hasSubj = contraints.hasSubject();
    if (contraints.hasPredicates()) {
        for (final URI nextPredicate : contraints.getPredicates()) {
            final Text contraintPrefix = new Text();
            final Statement statement = new ContextStatementImpl(hasSubj ? contraints.getSubject() : urlNull, nextPredicate, urlNull, contraints.getContext());
            if (hasSubj) {
                appendSubjectPredicate(statement, contraintPrefix);
            } else {
                appendPredicate(statement, contraintPrefix);
            }
            keys.add(new KeyParts(contraintPrefix, queryInstant, (currentContext == null) ? "" : currentContext.toString(), hasSubj ? CQ_S_P_AT : CQ_P_AT));
        }
    } else if (contraints.hasSubject()) {
        // and no predicates
        final Text contraintPrefix = new Text();
        final Statement statement = new StatementImpl(contraints.getSubject(), urlNull, urlNull);
        appendSubject(statement, contraintPrefix);
        keys.add(new KeyParts(contraintPrefix, queryInstant, (currentContext == null) ? "" : currentContext.toString(), CQ_S_AT));
    } else {
        // No constraints except possibly a context/named-graph, handled by the CF
        keys.add(new KeyParts(null, queryInstant, (currentContext == null) ? "" : currentContext.toString(), CQ_O_AT));
    }
    return keys;
}
Also used : ContextStatementImpl(org.openrdf.model.impl.ContextStatementImpl) Statement(org.openrdf.model.Statement) StatementImpl(org.openrdf.model.impl.StatementImpl) ContextStatementImpl(org.openrdf.model.impl.ContextStatementImpl) Resource(org.openrdf.model.Resource) URIImpl(org.openrdf.model.impl.URIImpl) Text(org.apache.hadoop.io.Text) URI(org.openrdf.model.URI) LinkedList(java.util.LinkedList)
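The method's branch structure amounts to a small decision table: which column qualifier to scan depends only on which constraints are present. The sketch below restates that table as standalone Java, with hypothetical CQ_* string constants standing in for the real Text constants in KeyParts.

public final class TemporalQueryStrategy {

    /** Mirrors the branches of KeyParts.keyPartsForQuery (illustrative constants only). */
    public static String chooseColumnQualifier(final boolean hasSubject, final boolean hasPredicates) {
        if (hasPredicates) {
            // One key is emitted per predicate; the subject sharpens the row prefix when present.
            return hasSubject ? "CQ_S_P_AT" : "CQ_P_AT";
        }
        if (hasSubject) {
            // Subject constraint only, no predicates.
            return "CQ_S_AT";
        }
        // No subject or predicate constraints; a context, if any, is handled by the column family.
        return "CQ_O_AT";
    }
}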

Example 39 with StatementImpl

use of org.openrdf.model.impl.StatementImpl in project incubator-rya by apache.

the class PcjTablesIT method createAndPopulatePcj.

/**
 * Ensure the method that creates a new PCJ table, scans Rya for matches, and
 * stores them in the PCJ table works.
 * <p>
 * The method being tested is: {@link PcjTables#createAndPopulatePcj(RepositoryConnection, Connector, String, String, String[], Optional)}
 */
@Test
public void createAndPopulatePcj() throws RepositoryException, PcjException, TableNotFoundException, BindingSetConversionException, AccumuloException, AccumuloSecurityException {
    // Load some Triples into Rya.
    final Set<Statement> triples = new HashSet<>();
    triples.add(new StatementImpl(new URIImpl("http://Alice"), new URIImpl("http://hasAge"), new NumericLiteralImpl(14, XMLSchema.INTEGER)));
    triples.add(new StatementImpl(new URIImpl("http://Alice"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
    triples.add(new StatementImpl(new URIImpl("http://Bob"), new URIImpl("http://hasAge"), new NumericLiteralImpl(16, XMLSchema.INTEGER)));
    triples.add(new StatementImpl(new URIImpl("http://Bob"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
    triples.add(new StatementImpl(new URIImpl("http://Charlie"), new URIImpl("http://hasAge"), new NumericLiteralImpl(12, XMLSchema.INTEGER)));
    triples.add(new StatementImpl(new URIImpl("http://Charlie"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
    triples.add(new StatementImpl(new URIImpl("http://Eve"), new URIImpl("http://hasAge"), new NumericLiteralImpl(43, XMLSchema.INTEGER)));
    triples.add(new StatementImpl(new URIImpl("http://Eve"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
    for (final Statement triple : triples) {
        ryaConn.add(triple);
    }
    // Create a PCJ table that will include those triples in its results.
    final String sparql = "SELECT ?name ?age " + "{" + "FILTER(?age < 30) ." + "?name <http://hasAge> ?age." + "?name <http://playsSport> \"Soccer\" " + "}";
    final Connector accumuloConn = cluster.getConnector();
    final String pcjTableName = new PcjTableNameFactory().makeTableName(getRyaInstanceName(), "testPcj");
    // Create and populate the PCJ table.
    final PcjTables pcjs = new PcjTables();
    pcjs.createAndPopulatePcj(ryaConn, accumuloConn, pcjTableName, sparql, new String[] { "name", "age" }, Optional.<PcjVarOrderFactory>absent());
    // Make sure the cardinality was updated.
    final PcjMetadata metadata = pcjs.getPcjMetadata(accumuloConn, pcjTableName);
    assertEquals(3, metadata.getCardinality());
    // Scan Accumulo for the stored results.
    final Multimap<String, BindingSet> fetchedResults = loadPcjResults(accumuloConn, pcjTableName);
    // Ensure the expected results match those that were stored.
    final MapBindingSet alice = new MapBindingSet();
    alice.addBinding("name", new URIImpl("http://Alice"));
    alice.addBinding("age", new NumericLiteralImpl(14, XMLSchema.INTEGER));
    final MapBindingSet bob = new MapBindingSet();
    bob.addBinding("name", new URIImpl("http://Bob"));
    bob.addBinding("age", new NumericLiteralImpl(16, XMLSchema.INTEGER));
    final MapBindingSet charlie = new MapBindingSet();
    charlie.addBinding("name", new URIImpl("http://Charlie"));
    charlie.addBinding("age", new NumericLiteralImpl(12, XMLSchema.INTEGER));
    final Set<BindingSet> results = Sets.<BindingSet>newHashSet(alice, bob, charlie);
    final Multimap<String, BindingSet> expectedResults = HashMultimap.create();
    expectedResults.putAll("name;age", results);
    expectedResults.putAll("age;name", results);
    assertEquals(expectedResults, fetchedResults);
}
Also used : Connector(org.apache.accumulo.core.client.Connector) MapBindingSet(org.openrdf.query.impl.MapBindingSet) VisibilityBindingSet(org.apache.rya.api.model.VisibilityBindingSet) BindingSet(org.openrdf.query.BindingSet) Statement(org.openrdf.model.Statement) URIImpl(org.openrdf.model.impl.URIImpl) LiteralImpl(org.openrdf.model.impl.LiteralImpl) NumericLiteralImpl(org.openrdf.model.impl.NumericLiteralImpl) NumericLiteralImpl(org.openrdf.model.impl.NumericLiteralImpl) StatementImpl(org.openrdf.model.impl.StatementImpl) PcjMetadata(org.apache.rya.indexing.pcj.storage.PcjMetadata) MapBindingSet(org.openrdf.query.impl.MapBindingSet) HashSet(java.util.HashSet) Test(org.junit.Test)
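The expected bindings (Alice, Bob, and Charlie, but not Eve) follow from the SPARQL alone, before any PCJ machinery is involved. As a sanity check, the sketch below runs the same query against a plain in-memory Sesame repository; it loads only two of the people to stay short, and it is not part of the Rya test.

import org.openrdf.model.impl.LiteralImpl;
import org.openrdf.model.impl.NumericLiteralImpl;
import org.openrdf.model.impl.StatementImpl;
import org.openrdf.model.impl.URIImpl;
import org.openrdf.model.vocabulary.XMLSchema;
import org.openrdf.query.QueryLanguage;
import org.openrdf.query.TupleQueryResult;
import org.openrdf.repository.sail.SailRepository;
import org.openrdf.repository.sail.SailRepositoryConnection;
import org.openrdf.sail.memory.MemoryStore;

public final class PcjSparqlSanityCheck {

    public static void main(final String[] args) throws Exception {
        final SailRepository repo = new SailRepository(new MemoryStore());
        repo.initialize();
        final SailRepositoryConnection conn = repo.getConnection();
        try {
            // Alice passes the age filter, Eve does not.
            conn.add(new StatementImpl(new URIImpl("http://Alice"), new URIImpl("http://hasAge"), new NumericLiteralImpl(14, XMLSchema.INTEGER)));
            conn.add(new StatementImpl(new URIImpl("http://Alice"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
            conn.add(new StatementImpl(new URIImpl("http://Eve"), new URIImpl("http://hasAge"), new NumericLiteralImpl(43, XMLSchema.INTEGER)));
            conn.add(new StatementImpl(new URIImpl("http://Eve"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
            final String sparql = "SELECT ?name ?age " + "{" + "FILTER(?age < 30) ." + "?name <http://hasAge> ?age." + "?name <http://playsSport> \"Soccer\" " + "}";
            final TupleQueryResult result = conn.prepareTupleQuery(QueryLanguage.SPARQL, sparql).evaluate();
            while (result.hasNext()) {
                // Prints a single binding set, for Alice.
                System.out.println(result.next());
            }
            result.close();
        } finally {
            conn.close();
            repo.shutDown();
        }
    }
}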

Example 40 with StatementImpl

use of org.openrdf.model.impl.StatementImpl in project incubator-rya by apache.

the class PcjDocumentsIntegrationTest method populatePcj.

/**
 * Ensure when results are already stored in Rya, that we are able to populate
 * the PCJ table for a new SPARQL query using those results.
 * <p>
 * The method being tested is: {@link PcjTables#populatePcj(Connector, String, RepositoryConnection, String)}
 */
@Test
public void populatePcj() throws Exception {
    final MongoDBRyaDAO dao = new MongoDBRyaDAO();
    dao.setConf(new StatefulMongoDBRdfConfiguration(conf, getMongoClient()));
    dao.init();
    final RdfCloudTripleStore ryaStore = new RdfCloudTripleStore();
    ryaStore.setRyaDAO(dao);
    ryaStore.initialize();
    final SailRepositoryConnection ryaConn = new RyaSailRepository(ryaStore).getConnection();
    ryaConn.begin();
    try {
        // Load some Triples into Rya.
        final Set<Statement> triples = new HashSet<>();
        triples.add(new StatementImpl(new URIImpl("http://Alice"), new URIImpl("http://hasAge"), new NumericLiteralImpl(14, XMLSchema.INTEGER)));
        triples.add(new StatementImpl(new URIImpl("http://Alice"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
        triples.add(new StatementImpl(new URIImpl("http://Bob"), new URIImpl("http://hasAge"), new NumericLiteralImpl(16, XMLSchema.INTEGER)));
        triples.add(new StatementImpl(new URIImpl("http://Bob"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
        triples.add(new StatementImpl(new URIImpl("http://Charlie"), new URIImpl("http://hasAge"), new NumericLiteralImpl(12, XMLSchema.INTEGER)));
        triples.add(new StatementImpl(new URIImpl("http://Charlie"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
        triples.add(new StatementImpl(new URIImpl("http://Eve"), new URIImpl("http://hasAge"), new NumericLiteralImpl(43, XMLSchema.INTEGER)));
        triples.add(new StatementImpl(new URIImpl("http://Eve"), new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
        for (final Statement triple : triples) {
            ryaConn.add(triple);
        }
        // Create a PCJ table that will include those triples in its results.
        final String sparql = "SELECT ?name ?age " + "{" + "FILTER(?age < 30) ." + "?name <http://hasAge> ?age." + "?name <http://playsSport> \"Soccer\" " + "}";
        final String pcjTableName = "testPcj";
        final MongoPcjDocuments pcjs = new MongoPcjDocuments(getMongoClient(), conf.getRyaInstanceName());
        pcjs.createPcj(pcjTableName, sparql);
        // Populate the PCJ table using a Rya connection.
        pcjs.populatePcj(pcjTableName, ryaConn);
        final Collection<BindingSet> fetchedResults = loadPcjResults(pcjTableName);
        // Make sure the cardinality was updated.
        final PcjMetadata metadata = pcjs.getPcjMetadata(pcjTableName);
        assertEquals(3, metadata.getCardinality());
        // Ensure the expected results match those that were stored.
        final MapBindingSet alice = new MapBindingSet();
        alice.addBinding("name", new URIImpl("http://Alice"));
        alice.addBinding("age", new NumericLiteralImpl(14, XMLSchema.INTEGER));
        final MapBindingSet bob = new MapBindingSet();
        bob.addBinding("name", new URIImpl("http://Bob"));
        bob.addBinding("age", new NumericLiteralImpl(16, XMLSchema.INTEGER));
        final MapBindingSet charlie = new MapBindingSet();
        charlie.addBinding("name", new URIImpl("http://Charlie"));
        charlie.addBinding("age", new NumericLiteralImpl(12, XMLSchema.INTEGER));
        final Set<BindingSet> expected = Sets.<BindingSet>newHashSet(alice, bob, charlie);
        assertEquals(expected, fetchedResults);
    } finally {
        ryaConn.close();
        ryaStore.shutDown();
    }
}
Also used : RdfCloudTripleStore(org.apache.rya.rdftriplestore.RdfCloudTripleStore) MapBindingSet(org.openrdf.query.impl.MapBindingSet) VisibilityBindingSet(org.apache.rya.api.model.VisibilityBindingSet) BindingSet(org.openrdf.query.BindingSet) StatefulMongoDBRdfConfiguration(org.apache.rya.mongodb.StatefulMongoDBRdfConfiguration) Statement(org.openrdf.model.Statement) RyaSailRepository(org.apache.rya.rdftriplestore.RyaSailRepository) URIImpl(org.openrdf.model.impl.URIImpl) SailRepositoryConnection(org.openrdf.repository.sail.SailRepositoryConnection) MongoDBRyaDAO(org.apache.rya.mongodb.MongoDBRyaDAO) LiteralImpl(org.openrdf.model.impl.LiteralImpl) NumericLiteralImpl(org.openrdf.model.impl.NumericLiteralImpl) NumericLiteralImpl(org.openrdf.model.impl.NumericLiteralImpl) StatementImpl(org.openrdf.model.impl.StatementImpl) PcjMetadata(org.apache.rya.indexing.pcj.storage.PcjMetadata) MapBindingSet(org.openrdf.query.impl.MapBindingSet) HashSet(java.util.HashSet) Test(org.junit.Test)
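Examples 39 and 40 build the same eight test triples inline. A small helper, sketched below purely as an illustration (it is not part of the Rya tests), shows one way to keep that setup in a single place.

import java.util.HashSet;
import java.util.Set;

import org.openrdf.model.Statement;
import org.openrdf.model.impl.LiteralImpl;
import org.openrdf.model.impl.NumericLiteralImpl;
import org.openrdf.model.impl.StatementImpl;
import org.openrdf.model.impl.URIImpl;
import org.openrdf.model.vocabulary.XMLSchema;

public final class PcjTestTriples {

    /** Adds the hasAge/playsSport pair used by the PCJ tests for one person. */
    public static void addPerson(final Set<Statement> triples, final String name, final int age) {
        final URIImpl subject = new URIImpl("http://" + name);
        triples.add(new StatementImpl(subject, new URIImpl("http://hasAge"), new NumericLiteralImpl(age, XMLSchema.INTEGER)));
        triples.add(new StatementImpl(subject, new URIImpl("http://playsSport"), new LiteralImpl("Soccer")));
    }

    /** The same people and ages used in the two PCJ examples above. */
    public static Set<Statement> soccerPlayers() {
        final Set<Statement> triples = new HashSet<>();
        addPerson(triples, "Alice", 14);
        addPerson(triples, "Bob", 16);
        addPerson(triples, "Charlie", 12);
        addPerson(triples, "Eve", 43);
        return triples;
    }
}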

Aggregations

StatementImpl (org.openrdf.model.impl.StatementImpl)66 Statement (org.openrdf.model.Statement)40 Test (org.junit.Test)34 URI (org.openrdf.model.URI)32 ValueFactory (org.openrdf.model.ValueFactory)27 ValueFactoryImpl (org.openrdf.model.impl.ValueFactoryImpl)26 Resource (org.openrdf.model.Resource)19 Value (org.openrdf.model.Value)19 URIImpl (org.openrdf.model.impl.URIImpl)19 HashSet (java.util.HashSet)16 LiteralImpl (org.openrdf.model.impl.LiteralImpl)16 NumericLiteralImpl (org.openrdf.model.impl.NumericLiteralImpl)15 ContextStatementImpl (org.openrdf.model.impl.ContextStatementImpl)14 BindingSet (org.openrdf.query.BindingSet)14 QueryEvaluationException (org.openrdf.query.QueryEvaluationException)14 RdfToRyaConversions.convertStatement (org.apache.rya.api.resolver.RdfToRyaConversions.convertStatement)10 PcjTableNameFactory (org.apache.rya.indexing.pcj.storage.accumulo.PcjTableNameFactory)10 QueryBindingSet (org.openrdf.query.algebra.evaluation.QueryBindingSet)10 TupleQuery (org.openrdf.query.TupleQuery)7 RepositoryConnection (org.openrdf.repository.RepositoryConnection)7