
Example 1 with BinaryComparator

Use of org.apache.hadoop.hbase.filter.BinaryComparator in project hadoop by apache.

In class TimelineFilterUtils, method createFilterForConfsOrMetricsToRetrieve.

/**
   * Create filters for confs or metrics to retrieve. This list includes a
   * configs/metrics family filter and relevant filters for confs/metrics to
   * retrieve, if present.
   *
   * @param <T> Describes the type of column prefix.
   * @param confsOrMetricToRetrieve configs/metrics to retrieve.
   * @param columnFamily config or metric column family.
   * @param columnPrefix config or metric column prefix.
   * @return a filter list.
   * @throws IOException if any problem occurs while creating the filters.
   */
public static <T> Filter createFilterForConfsOrMetricsToRetrieve(TimelineFilterList confsOrMetricToRetrieve, ColumnFamily<T> columnFamily, ColumnPrefix<T> columnPrefix) throws IOException {
    Filter familyFilter = new FamilyFilter(CompareOp.EQUAL, new BinaryComparator(columnFamily.getBytes()));
    if (confsOrMetricToRetrieve != null && !confsOrMetricToRetrieve.getFilterList().isEmpty()) {
        // If confsOrMetricToRetrieve is specified, create a filter list based
        // on it and the family filter.
        FilterList filter = new FilterList(familyFilter);
        filter.addFilter(createHBaseFilterList(columnPrefix, confsOrMetricToRetrieve));
        return filter;
    } else {
        // Only the family filter needs to be added.
        return familyFilter;
    }
}
Also used : FamilyFilter(org.apache.hadoop.hbase.filter.FamilyFilter) QualifierFilter(org.apache.hadoop.hbase.filter.QualifierFilter) Filter(org.apache.hadoop.hbase.filter.Filter) SingleColumnValueFilter(org.apache.hadoop.hbase.filter.SingleColumnValueFilter) FilterList(org.apache.hadoop.hbase.filter.FilterList) BinaryComparator(org.apache.hadoop.hbase.filter.BinaryComparator)
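
For context, here is a minimal standalone sketch of the same pattern: ANDing a FamilyFilter with a qualifier restriction inside a default (MUST_PASS_ALL) FilterList. The family and prefix bytes (CONFIG_FAMILY, CONFIG_PREFIX) are illustrative placeholders, not names from the timeline service schema.

import org.apache.hadoop.hbase.filter.BinaryComparator;
import org.apache.hadoop.hbase.filter.BinaryPrefixComparator;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.FamilyFilter;
import org.apache.hadoop.hbase.filter.Filter;
import org.apache.hadoop.hbase.filter.FilterList;
import org.apache.hadoop.hbase.filter.QualifierFilter;
import org.apache.hadoop.hbase.util.Bytes;

public class ConfigFilterSketch {
    // Illustrative family and qualifier prefix; not part of the timeline schema.
    private static final byte[] CONFIG_FAMILY = Bytes.toBytes("c");
    private static final byte[] CONFIG_PREFIX = Bytes.toBytes("yarn.");

    /** Restrict a read to one column family and to qualifiers with a given prefix. */
    public static Filter buildConfigFilter() {
        // FilterList defaults to MUST_PASS_ALL, so both conditions must hold.
        FilterList list = new FilterList(
            new FamilyFilter(CompareOp.EQUAL, new BinaryComparator(CONFIG_FAMILY)));
        list.addFilter(
            new QualifierFilter(CompareOp.EQUAL, new BinaryPrefixComparator(CONFIG_PREFIX)));
        return list;
    }
}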

Example 2 with BinaryComparator

Use of org.apache.hadoop.hbase.filter.BinaryComparator in project hadoop by apache.

In class TimelineFilterUtils, method createHBaseSingleColValueFilter.

/**
   * Creates an HBase {@link SingleColumnValueFilter}.
   *
   * @param columnFamily Column Family represented as bytes.
   * @param columnQualifier Column Qualifier represented as bytes.
   * @param value Value.
   * @param compareOp Compare operator.
   * @param filterIfMissing This flag decides whether the row should be filtered
   *     out when the specified column is missing. It is based on the filter's
   *     keyMustExist field.
   * @return a {@link SingleColumnValueFilter} object
   * @throws IOException if any problem occurs while creating the filter.
   */
private static SingleColumnValueFilter createHBaseSingleColValueFilter(byte[] columnFamily, byte[] columnQualifier, byte[] value, CompareOp compareOp, boolean filterIfMissing) throws IOException {
    SingleColumnValueFilter singleColValFilter = new SingleColumnValueFilter(columnFamily, columnQualifier, compareOp, new BinaryComparator(value));
    singleColValFilter.setLatestVersionOnly(true);
    singleColValFilter.setFilterIfMissing(filterIfMissing);
    return singleColValFilter;
}
Also used : SingleColumnValueFilter(org.apache.hadoop.hbase.filter.SingleColumnValueFilter) BinaryComparator(org.apache.hadoop.hbase.filter.BinaryComparator)
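
As a usage illustration (not taken from the Hadoop source), a filter built this way would typically be attached to a Scan so that rows without a matching value are dropped. The family, qualifier, and expected value below are assumed application-defined inputs.

import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.BinaryComparator;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter;
import org.apache.hadoop.hbase.util.Bytes;

public class SingleColumnValueScanSketch {
    public static Scan scanWhereColumnEquals(byte[] family, byte[] qualifier, String expected) {
        SingleColumnValueFilter filter = new SingleColumnValueFilter(
            family, qualifier, CompareOp.EQUAL, new BinaryComparator(Bytes.toBytes(expected)));
        // Compare only the newest cell version, and drop rows that lack the column
        // entirely (the equivalent of filterIfMissing=true in the method above).
        filter.setLatestVersionOnly(true);
        filter.setFilterIfMissing(true);
        Scan scan = new Scan();
        scan.setFilter(filter);
        return scan;
    }
}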

Example 3 with BinaryComparator

Use of org.apache.hadoop.hbase.filter.BinaryComparator in project hadoop by apache.

In class ApplicationEntityReader, method constructFilterListBasedOnFields.

@Override
protected FilterList constructFilterListBasedOnFields() throws IOException {
    if (!needCreateFilterListBasedOnFields()) {
        // Fetch all the columns. No need of a filter.
        return null;
    }
    FilterList listBasedOnFields = new FilterList(Operator.MUST_PASS_ONE);
    FilterList infoColFamilyList = new FilterList();
    // By default fetch everything in INFO column family.
    FamilyFilter infoColumnFamily = new FamilyFilter(CompareOp.EQUAL, new BinaryComparator(ApplicationColumnFamily.INFO.getBytes()));
    infoColFamilyList.addFilter(infoColumnFamily);
    if (!isSingleEntityRead() && fetchPartialColsFromInfoFamily()) {
        // We can fetch only some of the columns from info family.
        infoColFamilyList.addFilter(createFilterListForColsOfInfoFamily());
    } else {
        // Exclude column prefixes in info column family which are not required
        // based on fields to retrieve.
        excludeFieldsFromInfoColFamily(infoColFamilyList);
    }
    listBasedOnFields.addFilter(infoColFamilyList);
    updateFilterForConfsAndMetricsToRetrieve(listBasedOnFields);
    return listBasedOnFields;
}
Also used : FilterList(org.apache.hadoop.hbase.filter.FilterList) TimelineFilterList(org.apache.hadoop.yarn.server.timelineservice.reader.filter.TimelineFilterList) FamilyFilter(org.apache.hadoop.hbase.filter.FamilyFilter) BinaryComparator(org.apache.hadoop.hbase.filter.BinaryComparator)
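
A simplified sketch of the same layering, using hypothetical column family names, shows how per-family MUST_PASS_ALL branches are collected under a top-level MUST_PASS_ONE list:

import org.apache.hadoop.hbase.filter.BinaryComparator;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.FamilyFilter;
import org.apache.hadoop.hbase.filter.FilterList;
import org.apache.hadoop.hbase.filter.FilterList.Operator;
import org.apache.hadoop.hbase.util.Bytes;

public class FieldFilterSketch {
    /** OR together one MUST_PASS_ALL branch per requested column family. */
    public static FilterList combineFamilies(byte[]... families) {
        FilterList byField = new FilterList(Operator.MUST_PASS_ONE);
        for (byte[] family : families) {
            // Each branch starts with a family match; qualifier or value filters
            // can be ANDed into the branch later, as the reader code does.
            FilterList branch = new FilterList(
                new FamilyFilter(CompareOp.EQUAL, new BinaryComparator(family)));
            byField.addFilter(branch);
        }
        return byField;
    }

    public static void main(String[] args) {
        // "i" and "m" are placeholder family names for this sketch.
        System.out.println(combineFamilies(Bytes.toBytes("i"), Bytes.toBytes("m")));
    }
}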

Example 4 with BinaryComparator

Use of org.apache.hadoop.hbase.filter.BinaryComparator in project hbase by apache.

In class TestSerialization, method testCompareFilter.

@Test
public void testCompareFilter() throws Exception {
    Filter f = new RowFilter(CompareOp.EQUAL, new BinaryComparator(Bytes.toBytes("testRowOne-2")));
    byte[] bytes = f.toByteArray();
    Filter ff = RowFilter.parseFrom(bytes);
    assertNotNull(ff);
}
Also used : RowFilter(org.apache.hadoop.hbase.filter.RowFilter) PrefixFilter(org.apache.hadoop.hbase.filter.PrefixFilter) Filter(org.apache.hadoop.hbase.filter.Filter) BinaryComparator(org.apache.hadoop.hbase.filter.BinaryComparator) Test(org.junit.Test)
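
The round trip the test exercises can be sketched in isolation: a filter is serialized with toByteArray() and rebuilt with the static parseFrom(), the same protobuf round trip the client RPC layer performs when shipping filters to region servers. The row key reuses the test's "testRowOne-2" value.

import org.apache.hadoop.hbase.filter.BinaryComparator;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.RowFilter;
import org.apache.hadoop.hbase.util.Bytes;

public class FilterRoundTripSketch {
    public static void main(String[] args) throws Exception {
        RowFilter original =
            new RowFilter(CompareOp.EQUAL, new BinaryComparator(Bytes.toBytes("testRowOne-2")));
        // Serialize to the protobuf wire form and reconstruct a new instance.
        byte[] serialized = original.toByteArray();
        RowFilter restored = RowFilter.parseFrom(serialized);
        System.out.println(restored.getOperator());                              // EQUAL
        System.out.println(Bytes.toString(restored.getComparator().getValue())); // testRowOne-2
    }
}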

Example 5 with BinaryComparator

Use of org.apache.hadoop.hbase.filter.BinaryComparator in project hbase by apache.

In class TestServerSideScanMetricsFromClientSide, method testRowsFilteredMetric.

public void testRowsFilteredMetric(Scan baseScan) throws Exception {
    testRowsFilteredMetric(baseScan, null, 0);
    // Row filter doesn't match any row key. All rows should be filtered
    Filter filter = new RowFilter(CompareOp.EQUAL, new BinaryComparator("xyz".getBytes()));
    testRowsFilteredMetric(baseScan, filter, ROWS.length);
    // Filter will return results containing only the first key. Number of entire rows filtered
    // should be 0.
    filter = new FirstKeyOnlyFilter();
    testRowsFilteredMetric(baseScan, filter, 0);
    // Column prefix will find some matching qualifier on each row. Number of entire rows filtered
    // should be 0
    filter = new ColumnPrefixFilter(QUALIFIERS[0]);
    testRowsFilteredMetric(baseScan, filter, 0);
    // Column prefix will NOT find any matching qualifier on any row. All rows should be filtered
    filter = new ColumnPrefixFilter("xyz".getBytes());
    testRowsFilteredMetric(baseScan, filter, ROWS.length);
    // Matching column value should exist in each row. No rows should be filtered.
    filter = new SingleColumnValueFilter(FAMILIES[0], QUALIFIERS[0], CompareOp.EQUAL, VALUE);
    testRowsFilteredMetric(baseScan, filter, 0);
    // No matching column value should exist in any row. Filter all rows
    filter = new SingleColumnValueFilter(FAMILIES[0], QUALIFIERS[0], CompareOp.NOT_EQUAL, VALUE);
    testRowsFilteredMetric(baseScan, filter, ROWS.length);
    List<Filter> filters = new ArrayList<>();
    filters.add(new RowFilter(CompareOp.EQUAL, new BinaryComparator(ROWS[0])));
    filters.add(new RowFilter(CompareOp.EQUAL, new BinaryComparator(ROWS[3])));
    int numberOfMatchingRowFilters = filters.size();
    filter = new FilterList(Operator.MUST_PASS_ONE, filters);
    testRowsFilteredMetric(baseScan, filter, ROWS.length - numberOfMatchingRowFilters);
    filters.clear();
    // Exclude every cell in every row; each row then comes back as an empty result
    // array in RegionScanner#nextInternal, which should be interpreted as a row being filtered.
    for (int family = 0; family < FAMILIES.length; family++) {
        for (int qualifier = 0; qualifier < QUALIFIERS.length; qualifier++) {
            filters.add(new SingleColumnValueExcludeFilter(FAMILIES[family], QUALIFIERS[qualifier], CompareOp.EQUAL, VALUE));
        }
    }
    filter = new FilterList(Operator.MUST_PASS_ONE, filters);
    testRowsFilteredMetric(baseScan, filter, ROWS.length);
}
Also used : ColumnPrefixFilter(org.apache.hadoop.hbase.filter.ColumnPrefixFilter) RowFilter(org.apache.hadoop.hbase.filter.RowFilter) SingleColumnValueFilter(org.apache.hadoop.hbase.filter.SingleColumnValueFilter) SingleColumnValueExcludeFilter(org.apache.hadoop.hbase.filter.SingleColumnValueExcludeFilter) FirstKeyOnlyFilter(org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter) Filter(org.apache.hadoop.hbase.filter.Filter) ArrayList(java.util.ArrayList) FilterList(org.apache.hadoop.hbase.filter.FilterList) BinaryComparator(org.apache.hadoop.hbase.filter.BinaryComparator)
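
To observe the rows-filtered metric outside the test harness, a client can enable scan metrics on the Scan and read them back from the scanner. This is a hedged sketch: it assumes an HBase client version (roughly 1.4+ or 2.x) where ResultScanner#getScanMetrics and the countOfRowsFiltered counter are available, and "testtable" is a placeholder table name.

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.client.metrics.ScanMetrics;
import org.apache.hadoop.hbase.filter.BinaryComparator;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.RowFilter;
import org.apache.hadoop.hbase.util.Bytes;

public class RowsFilteredMetricSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Table table = conn.getTable(TableName.valueOf("testtable"))) {
            Scan scan = new Scan();
            scan.setScanMetricsEnabled(true);
            // A row filter that matches nothing, so every scanned row counts as filtered.
            scan.setFilter(new RowFilter(CompareOp.EQUAL,
                new BinaryComparator(Bytes.toBytes("no-such-row"))));
            try (ResultScanner scanner = table.getScanner(scan)) {
                for (Result r : scanner) {
                    // Drain the scanner so the metrics are fully accumulated.
                }
                // Availability of getScanMetrics on ResultScanner depends on the client version.
                ScanMetrics metrics = scanner.getScanMetrics();
                System.out.println("rows filtered = " + metrics.countOfRowsFiltered.get());
            }
        }
    }
}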

Aggregations

BinaryComparator (org.apache.hadoop.hbase.filter.BinaryComparator): 41 usages
Test (org.junit.Test): 18 usages
Filter (org.apache.hadoop.hbase.filter.Filter): 15 usages
RowFilter (org.apache.hadoop.hbase.filter.RowFilter): 14 usages
Put (org.apache.hadoop.hbase.client.Put): 12 usages
SingleColumnValueFilter (org.apache.hadoop.hbase.filter.SingleColumnValueFilter): 12 usages
Scan (org.apache.hadoop.hbase.client.Scan): 9 usages
QualifierFilter (org.apache.hadoop.hbase.filter.QualifierFilter): 9 usages
FilterList (org.apache.hadoop.hbase.filter.FilterList): 8 usages
FirstKeyOnlyFilter (org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter): 7 usages
PrefixFilter (org.apache.hadoop.hbase.filter.PrefixFilter): 7 usages
ArrayList (java.util.ArrayList): 6 usages
Cell (org.apache.hadoop.hbase.Cell): 5 usages
KeyValue (org.apache.hadoop.hbase.KeyValue): 5 usages
Delete (org.apache.hadoop.hbase.client.Delete): 5 usages
FamilyFilter (org.apache.hadoop.hbase.filter.FamilyFilter): 5 usages
InclusiveStopFilter (org.apache.hadoop.hbase.filter.InclusiveStopFilter): 5 usages
RegexStringComparator (org.apache.hadoop.hbase.filter.RegexStringComparator): 5 usages
Get (org.apache.hadoop.hbase.client.Get): 4 usages
Result (org.apache.hadoop.hbase.client.Result): 4 usages