
Example 1 with InclusiveStopFilter

Use of org.apache.hadoop.hbase.filter.InclusiveStopFilter in the apache/hbase project.

From the class TestFromClientSide, method testFilterAllRecords:

@Test
public void testFilterAllRecords() throws IOException {
    Scan scan = new Scan();
    scan.setBatch(1);
    scan.setCaching(1);
    // Filter out all records: an empty stop row sorts before every row key, so nothing passes
    scan.setFilter(new FilterList(new FirstKeyOnlyFilter(), new InclusiveStopFilter(new byte[0])));
    try (Table table = TEST_UTIL.getConnection().getTable(TableName.NAMESPACE_TABLE_NAME)) {
        try (ResultScanner s = table.getScanner(scan)) {
            assertNull(s.next());
        }
    }
}
Also used:
FirstKeyOnlyFilter (org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter)
InclusiveStopFilter (org.apache.hadoop.hbase.filter.InclusiveStopFilter)
FilterList (org.apache.hadoop.hbase.filter.FilterList)
Test (org.junit.Test)
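Why does `new InclusiveStopFilter(new byte[0])` filter out everything? The filter keeps a row only while its key is lexicographically less than or equal to the stop row, and every non-empty key sorts after the empty byte array. A minimal sketch in plain Java (outside the HBase API); the `compareUnsigned` helper is a hypothetical stand-in for HBase's `Bytes.compareTo`:

```java
public class InclusiveStopSketch {
    // Unsigned lexicographic byte comparison, analogous to HBase's Bytes.compareTo
    static int compareUnsigned(byte[] a, byte[] b) {
        int len = Math.min(a.length, b.length);
        for (int i = 0; i < len; i++) {
            int cmp = (a[i] & 0xff) - (b[i] & 0xff);
            if (cmp != 0) return cmp;
        }
        return a.length - b.length;
    }

    // InclusiveStopFilter semantics: a row survives while rowKey <= stopRow
    static boolean keepRow(byte[] rowKey, byte[] stopRow) {
        return compareUnsigned(rowKey, stopRow) <= 0;
    }

    public static void main(String[] args) {
        byte[] emptyStop = new byte[0];
        System.out.println(keepRow("row1".getBytes(), emptyStop)); // false: past the empty stop row
        System.out.println(keepRow(new byte[0], emptyStop));       // true: only the empty key would pass
    }
}
```

Since no real table contains an empty row key, the filter rejects every row, which is exactly what the test's `assertNull(s.next())` asserts.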

Example 2 with InclusiveStopFilter

Use of org.apache.hadoop.hbase.filter.InclusiveStopFilter in the apache/hbase project.

From the class TestScanner, method testFilters:

@Test
public void testFilters() throws IOException {
    try {
        this.region = TEST_UTIL.createLocalHRegion(TESTTABLEDESC, null, null);
        HTestConst.addContent(this.region, HConstants.CATALOG_FAMILY);
        byte[] prefix = Bytes.toBytes("ab");
        Filter newFilter = new PrefixFilter(prefix);
        Scan scan = new Scan();
        scan.setFilter(newFilter);
        rowPrefixFilter(scan);
        byte[] stopRow = Bytes.toBytes("bbc");
        newFilter = new WhileMatchFilter(new InclusiveStopFilter(stopRow));
        scan = new Scan();
        scan.setFilter(newFilter);
        rowInclusiveStopFilter(scan, stopRow);
    } finally {
        HBaseTestingUtil.closeRegionAndWAL(this.region);
    }
}
Also used:
PrefixFilter (org.apache.hadoop.hbase.filter.PrefixFilter)
InclusiveStopFilter (org.apache.hadoop.hbase.filter.InclusiveStopFilter)
WhileMatchFilter (org.apache.hadoop.hbase.filter.WhileMatchFilter)
Filter (org.apache.hadoop.hbase.filter.Filter)
Scan (org.apache.hadoop.hbase.client.Scan)
Test (org.junit.Test)
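The `WhileMatchFilter(new InclusiveStopFilter(stopRow))` combination means the scan terminates entirely the first time a row key passes the stop row, rather than merely skipping it. A rough sketch of that short-circuit behavior over an already-sorted list of keys (plain Java, not the HBase API):

```java
import java.util.ArrayList;
import java.util.List;

public class WhileMatchSketch {
    // Emulates WhileMatchFilter wrapping InclusiveStopFilter: rows are returned
    // while rowKey <= stopRow; the first key past stopRow ends the whole scan.
    static List<String> scanInclusive(List<String> sortedRowKeys, String stopRow) {
        List<String> results = new ArrayList<>();
        for (String rowKey : sortedRowKeys) {
            if (rowKey.compareTo(stopRow) > 0) {
                break; // past the stop row: WhileMatch semantics stop the scan here
            }
            results.add(rowKey); // the stop row itself is included
        }
        return results;
    }

    public static void main(String[] args) {
        List<String> rows = List.of("aaa", "abc", "bbc", "bbd", "ccc");
        System.out.println(scanInclusive(rows, "bbc")); // [aaa, abc, bbc]
    }
}
```

In the real test, `rowInclusiveStopFilter(scan, stopRow)` presumably verifies the same property: every returned row key is less than or equal to "bbc", with "bbc" itself included.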

Example 3 with InclusiveStopFilter

Use of org.apache.hadoop.hbase.filter.InclusiveStopFilter in the apache/hbase project.

From the class TestScannersWithFilters, method testInclusiveStopFilter:

@Test
public void testInclusiveStopFilter() throws Exception {
    // Grab rows from group one
    // If we just use start/stop row, we get total/2 - 1 rows
    long expectedRows = (numRows / 2) - 1;
    long expectedKeys = colsPerRow;
    Scan s = new Scan().withStartRow(Bytes.toBytes("testRowOne-0")).withStopRow(Bytes.toBytes("testRowOne-3"));
    verifyScan(s, expectedRows, expectedKeys);
    // Now use start row with inclusive stop filter
    expectedRows = numRows / 2;
    s = new Scan().withStartRow(Bytes.toBytes("testRowOne-0"));
    s.setFilter(new InclusiveStopFilter(Bytes.toBytes("testRowOne-3")));
    verifyScan(s, expectedRows, expectedKeys);
    // Grab rows from group two
    // If we just use start/stop row, we get total/2 - 1 rows
    expectedRows = (numRows / 2) - 1;
    expectedKeys = colsPerRow;
    s = new Scan().withStartRow(Bytes.toBytes("testRowTwo-0")).withStopRow(Bytes.toBytes("testRowTwo-3"));
    verifyScan(s, expectedRows, expectedKeys);
    // Now use start row with inclusive stop filter
    expectedRows = numRows / 2;
    s = new Scan().withStartRow(Bytes.toBytes("testRowTwo-0"));
    s.setFilter(new InclusiveStopFilter(Bytes.toBytes("testRowTwo-3")));
    verifyScan(s, expectedRows, expectedKeys);
}
Also used:
InclusiveStopFilter (org.apache.hadoop.hbase.filter.InclusiveStopFilter)
Scan (org.apache.hadoop.hbase.client.Scan)
Test (org.junit.Test)
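The counting argument behind the test can be checked standalone: a plain `withStopRow` gives the half-open range [startRow, stopRow), while `InclusiveStopFilter` gives the closed range [startRow, stopRow], hence one extra row. The four-element key list below is a hypothetical stand-in for the test table's group-one row keys:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class StopRowSketch {
    // Hypothetical row keys mirroring the test's naming scheme: testRowOne-0..3
    static final List<String> ROWS = IntStream.range(0, 4)
        .mapToObj(i -> "testRowOne-" + i)
        .collect(Collectors.toList());

    // Exclusive stop (plain Scan.withStopRow): keys in [start, stop)
    static long countExclusive(String start, String stop) {
        return ROWS.stream()
            .filter(r -> r.compareTo(start) >= 0 && r.compareTo(stop) < 0)
            .count();
    }

    // Inclusive stop (InclusiveStopFilter): keys in [start, stop]
    static long countInclusive(String start, String stop) {
        return ROWS.stream()
            .filter(r -> r.compareTo(start) >= 0 && r.compareTo(stop) <= 0)
            .count();
    }
}
```

With four keys per group, the exclusive scan returns 3 rows (matching `(numRows / 2) - 1`) and the inclusive scan returns 4 (matching `numRows / 2`).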

Example 4 with InclusiveStopFilter

Use of org.apache.hadoop.hbase.filter.InclusiveStopFilter in the apache/hbase project.

From the class TestFromClientSide5, method testFilterAllRecords:

@Test
public void testFilterAllRecords() throws IOException {
    Scan scan = new Scan();
    scan.setBatch(1);
    scan.setCaching(1);
    // Filter out all records: an empty stop row sorts before every row key, so nothing passes
    scan.setFilter(new FilterList(new FirstKeyOnlyFilter(), new InclusiveStopFilter(new byte[0])));
    try (Table table = TEST_UTIL.getConnection().getTable(TableName.META_TABLE_NAME)) {
        try (ResultScanner s = table.getScanner(scan)) {
            assertNull(s.next());
        }
    }
}
Also used:
FirstKeyOnlyFilter (org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter)
InclusiveStopFilter (org.apache.hadoop.hbase.filter.InclusiveStopFilter)
FilterList (org.apache.hadoop.hbase.filter.FilterList)
Test (org.junit.Test)

Aggregations

InclusiveStopFilter (org.apache.hadoop.hbase.filter.InclusiveStopFilter): 4
Test (org.junit.Test): 4
Scan (org.apache.hadoop.hbase.client.Scan): 2
FilterList (org.apache.hadoop.hbase.filter.FilterList): 2
FirstKeyOnlyFilter (org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter): 2
Filter (org.apache.hadoop.hbase.filter.Filter): 1
PrefixFilter (org.apache.hadoop.hbase.filter.PrefixFilter): 1
WhileMatchFilter (org.apache.hadoop.hbase.filter.WhileMatchFilter): 1