
Example 1 with Gauge

Use of org.apache.kafka.common.metrics.Gauge in the apache/kafka project.

From the class ThreadMetricsTest, method shouldAddTotalBlockedTimeMetric:

@Test
public void shouldAddTotalBlockedTimeMetric() {
    // Given:
    final double startTime = 123.45;
    final StreamThreadTotalBlockedTime blockedTime = mock(StreamThreadTotalBlockedTime.class);
    when(blockedTime.compute()).thenReturn(startTime);
    // When:
    ThreadMetrics.addThreadBlockedTimeMetric("burger", blockedTime, streamsMetrics);
    // Then:
    final ArgumentCaptor<Gauge<Double>> captor = gaugeCaptor();
    verify(streamsMetrics).addThreadLevelMutableMetric(
        eq("blocked-time-ns-total"),
        eq("The total time the thread spent blocked on kafka in nanoseconds"),
        eq("burger"),
        captor.capture());
    assertThat(captor.getValue().value(null, 678L), is(startTime));
}
Also used : StreamThreadTotalBlockedTime(org.apache.kafka.streams.processor.internals.StreamThreadTotalBlockedTime) Gauge(org.apache.kafka.common.metrics.Gauge) Test(org.junit.Test)
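
The test above captures the gauge that ThreadMetrics hands to the metrics layer and checks the value it reports. For context, here is a minimal standalone sketch, not taken from the Kafka sources, of registering a Gauge against a plain Metrics registry; the metric name, group, and constant value are illustrative.

import org.apache.kafka.common.MetricName;
import org.apache.kafka.common.metrics.Gauge;
import org.apache.kafka.common.metrics.Metrics;

public class GaugeRegistrationSketch {
    public static void main(final String[] args) {
        // Metrics implements Closeable, so try-with-resources releases its resources.
        try (Metrics metrics = new Metrics()) {
            final MetricName metricName = metrics.metricName(
                "blocked-time-ns-total",                                    // metric name (illustrative)
                "example-group",                                            // group (illustrative)
                "The total time the thread spent blocked, in nanoseconds"); // description
            // Gauge<T> is a functional interface: (config, nowMs) -> current value,
            // re-evaluated each time the metric is read.
            final Gauge<Double> blockedTime = (config, now) -> 123.45;      // constant value for illustration
            metrics.addMetric(metricName, blockedTime);
        }
    }
}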

Example 2 with Gauge

Use of org.apache.kafka.common.metrics.Gauge in the apache/kafka project.

From the class StreamsMetricsImpl, method addThreadLevelMutableMetric:

public <T> void addThreadLevelMutableMetric(final String name, final String description, final String threadId, final Gauge<T> valueProvider) {
    final MetricName metricName = metrics.metricName(name, THREAD_LEVEL_GROUP, description, threadLevelTagMap(threadId));
    synchronized (threadLevelMetrics) {
        // Remember the metric under the thread's sensor prefix so it can be removed when the thread shuts down.
        threadLevelMetrics.computeIfAbsent(threadSensorPrefix(threadId), tid -> new LinkedList<>()).add(metricName);
        // Register the gauge; its value is recomputed on every metrics read.
        metrics.addMetric(metricName, valueProvider);
    }
}
Also used : Max(org.apache.kafka.common.metrics.stats.Max) RecordingLevel(org.apache.kafka.common.metrics.Sensor.RecordingLevel) Rate(org.apache.kafka.common.metrics.stats.Rate) RocksDBMetricsRecordingTrigger(org.apache.kafka.streams.state.internals.metrics.RocksDBMetricsRecordingTrigger) HashMap(java.util.HashMap) CumulativeCount(org.apache.kafka.common.metrics.stats.CumulativeCount) Deque(java.util.Deque) Supplier(java.util.function.Supplier) ConcurrentMap(java.util.concurrent.ConcurrentMap) LinkedHashMap(java.util.LinkedHashMap) Map(java.util.Map) Metric(org.apache.kafka.common.Metric) MetricName(org.apache.kafka.common.MetricName) WindowedSum(org.apache.kafka.common.metrics.stats.WindowedSum) LinkedList(java.util.LinkedList) Value(org.apache.kafka.common.metrics.stats.Value) Sensor(org.apache.kafka.common.metrics.Sensor) Time(org.apache.kafka.common.utils.Time) MetricConfig(org.apache.kafka.common.metrics.MetricConfig) CumulativeSum(org.apache.kafka.common.metrics.stats.CumulativeSum) ConcurrentHashMap(java.util.concurrent.ConcurrentHashMap) Objects(java.util.Objects) TimeUnit(java.util.concurrent.TimeUnit) Metrics(org.apache.kafka.common.metrics.Metrics) WindowedCount(org.apache.kafka.common.metrics.stats.WindowedCount) Avg(org.apache.kafka.common.metrics.stats.Avg) Min(org.apache.kafka.common.metrics.stats.Min) Gauge(org.apache.kafka.common.metrics.Gauge) StreamsMetrics(org.apache.kafka.streams.StreamsMetrics) Collections(java.util.Collections) MetricName(org.apache.kafka.common.MetricName) LinkedList(java.util.LinkedList)
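
A hedged sketch of a call site for addThreadLevelMutableMetric, assuming the StreamsMetricsImpl constructor used in Example 4 below; the client id, metric name, and gauge value are illustrative.

final Metrics metrics = new Metrics();
final StreamsMetricsImpl streamsMetrics =
        new StreamsMetricsImpl(metrics, "example-client", StreamsConfig.METRICS_LATEST, Time.SYSTEM);
// The gauge is re-evaluated whenever the metric is read, so it always reflects the current value.
final Gauge<Long> pendingTasks = (config, now) -> 0L;   // constant value for illustration
streamsMetrics.addThreadLevelMutableMetric(
        "pending-tasks",                    // metric name (illustrative)
        "Number of tasks waiting to run",   // description (illustrative)
        Thread.currentThread().getName(),   // thread id used to build the thread-level tag map
        pendingTasks);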

Example 3 with Gauge

Use of org.apache.kafka.common.metrics.Gauge in the apache/kafka project.

From the class StreamsMetricsImpl, method addStoreLevelMutableMetric:

public <T> void addStoreLevelMutableMetric(final String taskId, final String metricsScope, final String storeName, final String name, final String description, final RecordingLevel recordingLevel, final Gauge<T> valueProvider) {
    final MetricName metricName = metrics.metricName(name, STATE_STORE_LEVEL_GROUP, description, storeLevelTagMap(taskId, metricsScope, storeName));
    // Register the gauge only once; a second call for the same metric name is a no-op.
    if (metrics.metric(metricName) == null) {
        final MetricConfig metricConfig = new MetricConfig().recordLevel(recordingLevel);
        final String key = storeSensorPrefix(Thread.currentThread().getName(), taskId, storeName);
        metrics.addMetric(metricName, metricConfig, valueProvider);
        // Remember the metric under the store's sensor prefix so it can be removed when the store closes.
        storeLevelMetrics.computeIfAbsent(key, ignored -> new LinkedList<>()).push(metricName);
    }
}
Also used : MetricConfig(org.apache.kafka.common.metrics.MetricConfig) Max(org.apache.kafka.common.metrics.stats.Max) RecordingLevel(org.apache.kafka.common.metrics.Sensor.RecordingLevel) Rate(org.apache.kafka.common.metrics.stats.Rate) RocksDBMetricsRecordingTrigger(org.apache.kafka.streams.state.internals.metrics.RocksDBMetricsRecordingTrigger) HashMap(java.util.HashMap) CumulativeCount(org.apache.kafka.common.metrics.stats.CumulativeCount) Deque(java.util.Deque) Supplier(java.util.function.Supplier) ConcurrentMap(java.util.concurrent.ConcurrentMap) LinkedHashMap(java.util.LinkedHashMap) Map(java.util.Map) Metric(org.apache.kafka.common.Metric) MetricName(org.apache.kafka.common.MetricName) WindowedSum(org.apache.kafka.common.metrics.stats.WindowedSum) LinkedList(java.util.LinkedList) Value(org.apache.kafka.common.metrics.stats.Value) Sensor(org.apache.kafka.common.metrics.Sensor) Time(org.apache.kafka.common.utils.Time) MetricConfig(org.apache.kafka.common.metrics.MetricConfig) CumulativeSum(org.apache.kafka.common.metrics.stats.CumulativeSum) ConcurrentHashMap(java.util.concurrent.ConcurrentHashMap) Objects(java.util.Objects) TimeUnit(java.util.concurrent.TimeUnit) Metrics(org.apache.kafka.common.metrics.Metrics) WindowedCount(org.apache.kafka.common.metrics.stats.WindowedCount) Avg(org.apache.kafka.common.metrics.stats.Avg) Min(org.apache.kafka.common.metrics.stats.Min) Gauge(org.apache.kafka.common.metrics.Gauge) StreamsMetrics(org.apache.kafka.streams.StreamsMetrics) Collections(java.util.Collections) MetricName(org.apache.kafka.common.MetricName) LinkedList(java.util.LinkedList)
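
A corresponding hedged sketch for addStoreLevelMutableMetric, continuing with the streamsMetrics instance from the sketch after Example 2; the task id, metrics scope, store name, and value provider are purely illustrative.

// Store-level metrics in Streams are typically recorded at DEBUG level.
final Gauge<Long> estimatedNumKeys = (config, now) -> 42L;   // placeholder value provider
streamsMetrics.addStoreLevelMutableMetric(
        "0_0",                                    // task id (illustrative)
        "rocksdb-state-id",                       // metrics scope (illustrative)
        "my-store",                               // store name (illustrative)
        "estimate-num-keys",                      // metric name (illustrative)
        "Estimated number of keys in the store",  // description (illustrative)
        RecordingLevel.DEBUG,
        estimatedNumKeys);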

Example 4 with Gauge

Use of org.apache.kafka.common.metrics.Gauge in the apache/kafka project.

From the class StreamsMetricsImplTest, method shouldAddClientLevelMutableMetric:

@Test
public void shouldAddClientLevelMutableMetric() {
    final Metrics metrics = mock(Metrics.class);
    final RecordingLevel recordingLevel = RecordingLevel.INFO;
    final MetricConfig metricConfig = new MetricConfig().recordLevel(recordingLevel);
    final Gauge<String> valueProvider = (config, now) -> "mutable-value";
    expect(metrics.metricName(METRIC_NAME1, CLIENT_LEVEL_GROUP, DESCRIPTION1, clientLevelTags)).andReturn(metricName1);
    metrics.addMetric(EasyMock.eq(metricName1), eqMetricConfig(metricConfig), eq(valueProvider));
    replay(metrics);
    final StreamsMetricsImpl streamsMetrics = new StreamsMetricsImpl(metrics, CLIENT_ID, VERSION, time);
    streamsMetrics.addClientLevelMutableMetric(METRIC_NAME1, DESCRIPTION1, recordingLevel, valueProvider);
    verify(metrics);
}
Also used : RecordingLevel(org.apache.kafka.common.metrics.Sensor.RecordingLevel) MetricConfig(org.apache.kafka.common.metrics.MetricConfig) CoreMatchers.is(org.hamcrest.CoreMatchers.is) RecordingLevel(org.apache.kafka.common.metrics.Sensor.RecordingLevel) MockTime(org.apache.kafka.common.utils.MockTime) Arrays(java.util.Arrays) Rate(org.apache.kafka.common.metrics.stats.Rate) EasyMock.capture(org.easymock.EasyMock.capture) THREAD_LEVEL_GROUP(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.THREAD_LEVEL_GROUP) ImmutableMetricValue(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.ImmutableMetricValue) RATE_SUFFIX(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.RATE_SUFFIX) CoreMatchers.notNullValue(org.hamcrest.CoreMatchers.notNullValue) Utils.mkMap(org.apache.kafka.common.utils.Utils.mkMap) IArgumentMatcher(org.easymock.IArgumentMatcher) ROLLUP_VALUE(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.ROLLUP_VALUE) Duration(java.time.Duration) Map(java.util.Map) AVG_SUFFIX(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.AVG_SUFFIX) MetricName(org.apache.kafka.common.MetricName) EasyMock.eq(org.easymock.EasyMock.eq) Sensor(org.apache.kafka.common.metrics.Sensor) TOTAL_SUFFIX(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.TOTAL_SUFFIX) Time(org.apache.kafka.common.utils.Time) EasyMock.newCapture(org.easymock.EasyMock.newCapture) MetricConfig(org.apache.kafka.common.metrics.MetricConfig) EasyMock.resetToDefault(org.easymock.EasyMock.resetToDefault) List(java.util.List) Metrics(org.apache.kafka.common.metrics.Metrics) Utils.mkEntry(org.apache.kafka.common.utils.Utils.mkEntry) Matchers.equalTo(org.hamcrest.Matchers.equalTo) KafkaMetric(org.apache.kafka.common.metrics.KafkaMetric) CoreMatchers.equalToObject(org.hamcrest.CoreMatchers.equalToObject) Matchers.greaterThan(org.hamcrest.Matchers.greaterThan) CLIENT_LEVEL_GROUP(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.CLIENT_LEVEL_GROUP) PowerMock.createMock(org.powermock.api.easymock.PowerMock.createMock) StreamsConfig(org.apache.kafka.streams.StreamsConfig) Assert.assertThrows(org.junit.Assert.assertThrows) RunWith(org.junit.runner.RunWith) CoreMatchers.not(org.hamcrest.CoreMatchers.not) CLIENT_ID_TAG(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.CLIENT_ID_TAG) EasyMock.mock(org.easymock.EasyMock.mock) StreamsMetricsImpl.addInvocationRateAndCountToSensor(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.addInvocationRateAndCountToSensor) StreamsMetricsImpl.addAvgAndMaxLatencyToSensor(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.addAvgAndMaxLatencyToSensor) PrepareForTest(org.powermock.core.classloader.annotations.PrepareForTest) EasyMock.replay(org.easymock.EasyMock.replay) MatcherAssert.assertThat(org.hamcrest.MatcherAssert.assertThat) PowerMockRunner(org.powermock.modules.junit4.PowerMockRunner) CoreMatchers.nullValue(org.hamcrest.CoreMatchers.nullValue) EasyMock.anyObject(org.easymock.EasyMock.anyObject) Capture(org.easymock.Capture) EasyMock.anyString(org.easymock.EasyMock.anyString) MAX_SUFFIX(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.MAX_SUFFIX) EasyMock.niceMock(org.easymock.EasyMock.niceMock) Assert.assertTrue(org.junit.Assert.assertTrue) Test(org.junit.Test) LATENCY_SUFFIX(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.LATENCY_SUFFIX) Version(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.Version) EasyMock(org.easymock.EasyMock) EasyMock.expect(org.easymock.EasyMock.expect) TimeUnit(java.util.concurrent.TimeUnit) STATE_STORE_LEVEL_GROUP(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.STATE_STORE_LEVEL_GROUP) CaptureType(org.easymock.CaptureType) Gauge(org.apache.kafka.common.metrics.Gauge) StreamsTestUtils(org.apache.kafka.test.StreamsTestUtils) EasyMock.verify(org.easymock.EasyMock.verify) Collections(java.util.Collections) Assert.assertEquals(org.junit.Assert.assertEquals) PROCESSOR_NODE_LEVEL_GROUP(org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.PROCESSOR_NODE_LEVEL_GROUP) Metrics(org.apache.kafka.common.metrics.Metrics) EasyMock.anyString(org.easymock.EasyMock.anyString) PrepareForTest(org.powermock.core.classloader.annotations.PrepareForTest) Test(org.junit.Test)
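
For comparison, a mock-free sketch of the same registration against a real Metrics instance; the metric name, description, and returned string are illustrative, and the constructor arguments follow the pattern used in the test above.

try (Metrics metrics = new Metrics()) {
    final StreamsMetricsImpl streamsMetrics =
            new StreamsMetricsImpl(metrics, "example-client", StreamsConfig.METRICS_LATEST, Time.SYSTEM);
    final Gauge<String> stateDescription = (config, now) -> "mutable-value";
    streamsMetrics.addClientLevelMutableMetric(
            "state-description",                      // metric name (illustrative)
            "A human-readable description of state",  // description (illustrative)
            RecordingLevel.INFO,
            stateDescription);
    // Every registered metric, including the gauge above, is visible through Metrics#metrics().
    metrics.metrics().forEach((metricName, metric) ->
            System.out.println(metricName.name() + " = " + metric.metricValue()));
}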

Aggregations

Gauge (org.apache.kafka.common.metrics.Gauge): 4
Collections (java.util.Collections): 3
Map (java.util.Map): 3
TimeUnit (java.util.concurrent.TimeUnit): 3
MetricName (org.apache.kafka.common.MetricName): 3
MetricConfig (org.apache.kafka.common.metrics.MetricConfig): 3
Metrics (org.apache.kafka.common.metrics.Metrics): 3
Sensor (org.apache.kafka.common.metrics.Sensor): 3
RecordingLevel (org.apache.kafka.common.metrics.Sensor.RecordingLevel): 3
Rate (org.apache.kafka.common.metrics.stats.Rate): 3
Time (org.apache.kafka.common.utils.Time): 3
Deque (java.util.Deque): 2
HashMap (java.util.HashMap): 2
LinkedHashMap (java.util.LinkedHashMap): 2
LinkedList (java.util.LinkedList): 2
Objects (java.util.Objects): 2
ConcurrentHashMap (java.util.concurrent.ConcurrentHashMap): 2
ConcurrentMap (java.util.concurrent.ConcurrentMap): 2
Supplier (java.util.function.Supplier): 2
Metric (org.apache.kafka.common.Metric): 2