Use of io.dropwizard.metrics.WeightedSnapshot.WeightedSample in project light-4j by networknt.
From the class WeightedSnapshotTest, method worksWithOverestimatedCollections.
@Test
public void worksWithOverestimatedCollections() throws Exception {
    final List<WeightedSample> items = spy(WeightedArray(new long[] { 5, 1, 2, 3, 4 }, new double[] { 1, 2, 3, 2, 2 }));
    when(items.size()).thenReturn(6, 5);
    final Snapshot other = new WeightedSnapshot(items);
    assertThat(other.getValues()).containsOnly(1, 2, 3, 4, 5);
}
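The WeightedArray helper called above is defined elsewhere in WeightedSnapshotTest and is not part of this excerpt. A minimal sketch of such a helper, assuming it only pairs each value with its corresponding weight (the name and signature are inferred from the call site, not taken from the light-4j source):

// Hypothetical helper, sketched to make the snippet above self-contained.
// It simply zips the value and weight arrays into WeightedSample instances.
private static List<WeightedSample> WeightedArray(long[] values, double[] weights) {
    final List<WeightedSample> samples = new ArrayList<>(values.length);
    for (int i = 0; i < values.length; i++) {
        samples.add(new WeightedSample(values[i], weights[i]));
    }
    return samples;
}

Because items is a Mockito spy, when(items.size()).thenReturn(6, 5) makes size() report 6 on its first call and 5 afterwards; the assertion then checks that the snapshot still contains exactly the five real samples even though the collection over-reported its size.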
Use of io.dropwizard.metrics.WeightedSnapshot.WeightedSample in project light-4j by networknt.
From the class ExponentiallyDecayingReservoir, method update.
/**
* Adds an old value with a fixed timestamp to the reservoir.
*
* @param value the value to be added
* @param timestamp the epoch timestamp of {@code value} in seconds
*/
public void update(long value, long timestamp) {
    rescaleIfNeeded();
    lockForRegularUsage();
    try {
        final double itemWeight = weight(timestamp - startTime);
        final WeightedSample sample = new WeightedSample(value, itemWeight);
        final double priority = itemWeight / (1.0d - ThreadLocalRandom.current().nextDouble());
        final long newCount = count.incrementAndGet();
        if (newCount <= size) {
            values.put(priority, sample);
        } else {
            Double first = values.firstKey();
            if (first < priority && values.putIfAbsent(priority, sample) == null) {
                // ensure we always remove an item
                while (values.remove(first) == null) {
                    first = values.firstKey();
                }
            }
        }
    } finally {
        unlockForRegularUsage();
    }
}
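This two-argument update is normally reached through the public one-argument update, which supplies the current time. A minimal usage sketch, assuming the Dropwizard Metrics-style constructor ExponentiallyDecayingReservoir(int size, double alpha) and Snapshot accessors that light-4j's copied package follows:

// Sketch only: constructor arguments and Snapshot accessors are assumed to match
// the upstream Dropwizard Metrics API that this package is derived from.
ExponentiallyDecayingReservoir reservoir = new ExponentiallyDecayingReservoir(1028, 0.015);

for (long v = 1; v <= 100; v++) {
    reservoir.update(v); // stamps each value with the current time internally
}

Snapshot snapshot = reservoir.getSnapshot();
System.out.println(snapshot.getMedian());          // biased toward recent updates
System.out.println(snapshot.get99thPercentile());

The priority itemWeight / (1 - u), with u uniform in [0, 1), implements weighted reservoir sampling: more recent samples carry larger weights, receive stochastically larger keys, and therefore survive longer in the size-bounded map.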
Use of io.dropwizard.metrics.WeightedSnapshot.WeightedSample in project light-4j by networknt.
From the class ExponentiallyDecayingReservoir, method rescale.
/* "A common feature of the above techniques—indeed, the key technique that
* allows us to track the decayed weights efficiently—is that they maintain
* counts and other quantities based on g(ti − L), and only scale by g(t − L)
* at query time. But while g(ti −L)/g(t−L) is guaranteed to lie between zero
* and one, the intermediate values of g(ti − L) could become very large. For
* polynomial functions, these values should not grow too large, and should be
* effectively represented in practice by floating point values without loss of
* precision. For exponential functions, these values could grow quite large as
* new values of (ti − L) become large, and potentially exceed the capacity of
* common floating point types. However, since the values stored by the
* algorithms are linear combinations of g values (scaled sums), they can be
* rescaled relative to a new landmark. That is, by the analysis of exponential
* decay in Section III-A, the choice of L does not affect the final result. We
* can therefore multiply each value based on L by a factor of exp(−α(L′ − L)),
* and obtain the correct value as if we had instead computed relative to a new
* landmark L′ (and then use this new L′ at query time). This can be done with
* a linear pass over whatever data structure is being used."
*/
private void rescale(long now, long next) {
    if (nextScaleTime.compareAndSet(next, now + RESCALE_THRESHOLD)) {
        lockForRescale();
        try {
            final long oldStartTime = startTime;
            this.startTime = currentTimeInSeconds();
            final double scalingFactor = exp(-alpha * (startTime - oldStartTime));
            final ArrayList<Double> keys = new ArrayList<>(values.keySet());
            for (Double key : keys) {
                final WeightedSample sample = values.remove(key);
                final WeightedSample newSample = new WeightedSample(sample.value, sample.weight * scalingFactor);
                values.put(key * scalingFactor, newSample);
            }
            // make sure the counter is in sync with the number of stored samples.
            count.set(values.size());
        } finally {
            unlockForRescale();
        }
    }
}
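The correctness of this linear pass rests on the identity exp(alpha * (t - L)) * exp(-alpha * (L' - L)) = exp(alpha * (t - L')): multiplying every stored weight and priority key by the scaling factor is the same as having computed them against the new landmark L' in the first place. A small stand-alone sketch with arbitrary example numbers (the weight function exp(alpha * t) is not shown in this excerpt and is assumed to follow the Dropwizard convention):

// Illustration only: alpha, the landmarks, and the sample time are example values.
double alpha = 0.015;
long L = 1_000L;               // old landmark (startTime), in seconds
long Lprime = L + 3_600L;      // new landmark after one RESCALE_THRESHOLD interval
long t = 1_800L;               // arrival time of some stored sample

double weightAtOldLandmark = Math.exp(alpha * (t - L));
double scalingFactor = Math.exp(-alpha * (Lprime - L));
double weightAtNewLandmark = Math.exp(alpha * (t - Lprime));

// The two expressions agree up to floating-point rounding.
System.out.println(weightAtOldLandmark * scalingFactor);
System.out.println(weightAtNewLandmark);

Rescaling also keeps the stored weights from overflowing a double, since without it exp(alpha * (t - L)) grows without bound as t moves away from the original landmark, exactly the failure mode the quoted passage describes.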
Use of io.dropwizard.metrics.WeightedSnapshot.WeightedSample in project light-4j by networknt.
From the class WeightedSnapshotTest, method worksWithUnderestimatedCollections.
@Test
public void worksWithUnderestimatedCollections() throws Exception {
    final List<WeightedSample> items = spy(WeightedArray(new long[] { 5, 1, 2, 3, 4 }, new double[] { 1, 2, 3, 2, 2 }));
    when(items.size()).thenReturn(4, 5);
    final Snapshot other = new WeightedSnapshot(items);
    assertThat(other.getValues()).containsOnly(1, 2, 3, 4, 5);
}
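Here size() reports 4 on the first call and 5 afterwards, so the assertion verifies that WeightedSnapshot does not silently drop the fifth sample when the collection initially under-reports its length; together with worksWithOverestimatedCollections above, it pins down the constructor's behaviour for inaccurate size() hints.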