
Example 1 with Repeating

Use of com.google.code.tempusfugit.concurrency.annotations.Repeating in project hive by apache.

The class TestTimestampTZWritable, method testSeconds.

@Test
@Repeating(repetition = 10)
public void testSeconds() {
    // seconds below Integer.MAX_VALUE: serialized as just 1 VInt
    long seconds = ThreadLocalRandom.current().nextLong(Integer.MAX_VALUE);
    TimestampTZ tstz = new TimestampTZ(seconds, 0, ZoneId.of("UTC"));
    verifyConversion(tstz);
    // seconds above Integer.MAX_VALUE: serialized as 2 VInts
    seconds = ThreadLocalRandom.current().nextLong(Integer.MAX_VALUE) + Integer.MAX_VALUE + 1;
    if (ThreadLocalRandom.current().nextBoolean()) {
        seconds = -seconds;
    }
    tstz.set(seconds, 0, ZoneId.of("UTC"));
    verifyConversion(tstz);
}
Also used : TimestampTZ(org.apache.hadoop.hive.common.type.TimestampTZ) Test(org.junit.Test) Repeating(com.google.code.tempusfugit.concurrency.annotations.Repeating)
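The two random ranges that testSeconds exercises can be checked in isolation. A minimal sketch (the class name SecondsRanges is mine, not Hive's): ThreadLocalRandom.nextLong(bound) returns a value in [0, bound), so the first draw always stays below Integer.MAX_VALUE, while adding Integer.MAX_VALUE + 1 pushes the second draw strictly above it.

```java
import java.util.concurrent.ThreadLocalRandom;

public class SecondsRanges {
    public static void main(String[] args) {
        // first range: [0, Integer.MAX_VALUE), same as the "1 VInt" branch
        long small = ThreadLocalRandom.current().nextLong(Integer.MAX_VALUE);
        // second range: [Integer.MAX_VALUE + 1, 2 * Integer.MAX_VALUE],
        // same as the "2 VInt" branch; the int addends widen to long, so no overflow
        long large = ThreadLocalRandom.current().nextLong(Integer.MAX_VALUE)
                + Integer.MAX_VALUE + 1L;
        System.out.println(small < Integer.MAX_VALUE);  // always true
        System.out.println(large > Integer.MAX_VALUE);  // always true
    }
}
```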

Example 2 with Repeating

Use of com.google.code.tempusfugit.concurrency.annotations.Repeating in project hive by apache.

The class TestTimestampTZWritable, method testSecondsWithNanos.

@Test
@Repeating(repetition = 10)
public void testSecondsWithNanos() {
    // the bound 31556889864403199L equals Instant.MAX.getEpochSecond(),
    // the largest epoch second java.time supports
    long seconds = ThreadLocalRandom.current().nextLong(31556889864403199L);
    if (ThreadLocalRandom.current().nextBoolean()) {
        seconds = -seconds;
    }
    // nanos in [1, 999999999]
    int nanos = ThreadLocalRandom.current().nextInt(999999999) + 1;
    TimestampTZ tstz = new TimestampTZ(seconds, nanos, ZoneId.of("UTC"));
    verifyConversion(tstz);
}
Also used : TimestampTZ(org.apache.hadoop.hive.common.type.TimestampTZ) Test(org.junit.Test) Repeating(com.google.code.tempusfugit.concurrency.annotations.Repeating)
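The magic bound 31556889864403199L in testSecondsWithNanos is the epoch second of java.time's Instant.MAX, which keeps the generated TimestampTZ inside the representable range. A small check (class name MaxEpochSecond is mine):

```java
import java.time.Instant;

public class MaxEpochSecond {
    public static void main(String[] args) {
        // Instant.MAX is 1000000000-12-31T23:59:59.999999999Z;
        // its epoch second matches the test's random bound
        System.out.println(Instant.MAX.getEpochSecond()); // 31556889864403199
    }
}
```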

Example 3 with Repeating

Use of com.google.code.tempusfugit.concurrency.annotations.Repeating in project hive by apache.

The class TestColumnBuffer, method testNullsInSubset.

/**
 * Test if the nulls BitSet is maintained properly when we extract subset from ColumnBuffer.
 * E.g. suppose we have a ColumnBuffer with nulls [0, 0, 1, 0]. When we split it evenly into
 * two subsets, the subsets should have nulls [0, 0] and [1, 0] respectively.
 */
@Test
@Repeating(repetition = 10)
public void testNullsInSubset() {
    prepareNullIndices();
    BitSet nulls = new BitSet(NUM_VARS);
    for (int index : nullIndices) {
        nulls.set(index);
    }
    ColumnBuffer columnBuffer = new ColumnBuffer(type, nulls, vars);
    Random random = ThreadLocalRandom.current();
    int remaining = NUM_VARS;
    while (remaining > 0) {
        int toExtract = random.nextInt(remaining) + 1;
        ColumnBuffer subset = columnBuffer.extractSubset(toExtract);
        verifyNulls(subset, NUM_VARS - remaining);
        remaining -= toExtract;
    }
}
Also used : Random(java.util.Random) ThreadLocalRandom(java.util.concurrent.ThreadLocalRandom) BitSet(java.util.BitSet) Test(org.junit.Test) Repeating(com.google.code.tempusfugit.concurrency.annotations.Repeating)

Aggregations

Repeating (com.google.code.tempusfugit.concurrency.annotations.Repeating): 3
Test (org.junit.Test): 3
TimestampTZ (org.apache.hadoop.hive.common.type.TimestampTZ): 2
BitSet (java.util.BitSet): 1
Random (java.util.Random): 1
ThreadLocalRandom (java.util.concurrent.ThreadLocalRandom): 1