
Example 6 with Point

Use of org.influxdb.dto.Point in project cas by apereo.

The class CasMetricsRepositoryConfiguration, method influxDbMetricsWriter:

@ConditionalOnProperty(prefix = "cas.metrics.influxDb", name = "url")
@Bean
@ExportMetricWriter
public GaugeWriter influxDbMetricsWriter() {
    final MetricsProperties.InfluxDb influxDb = casProperties.getMetrics().getInfluxDb();
    final InfluxDbConnectionFactory factory = new InfluxDbConnectionFactory(influxDb);
    return value -> {
        final Point point = Point.measurement(value.getName())
                .time(value.getTimestamp().getTime(), TimeUnit.MILLISECONDS)
                .addField("value", value.getValue())
                .addField("name", value.getName())
                .tag("type", value.getClass().getSimpleName())
                .build();
        factory.write(point, influxDb.getDatabase());
    };
}
Also used : CasConfigurationProperties(org.apereo.cas.configuration.CasConfigurationProperties) Getter(lombok.Getter) MongoDbConnectionFactory(org.apereo.cas.mongo.MongoDbConnectionFactory) Date(java.util.Date) Autowired(org.springframework.beans.factory.annotation.Autowired) MetricsProperties(org.apereo.cas.configuration.model.core.metrics.MetricsProperties) StatsdMetricWriter(org.springframework.boot.actuate.metrics.statsd.StatsdMetricWriter) EnableConfigurationProperties(org.springframework.boot.context.properties.EnableConfigurationProperties) ToString(lombok.ToString) ConditionalOnProperty(org.springframework.boot.autoconfigure.condition.ConditionalOnProperty) MongoTemplate(org.springframework.data.mongodb.core.MongoTemplate) MetricWriter(org.springframework.boot.actuate.metrics.writer.MetricWriter) RedisConnectionFactory(org.springframework.data.redis.connection.RedisConnectionFactory) GaugeWriter(org.springframework.boot.actuate.metrics.writer.GaugeWriter) RedisObjectFactory(org.apereo.cas.redis.core.RedisObjectFactory) RedisMetricRepository(org.springframework.boot.actuate.metrics.repository.redis.RedisMetricRepository) InfluxDbConnectionFactory(org.apereo.cas.influxdb.InfluxDbConnectionFactory) Serializable(java.io.Serializable) ExportMetricWriter(org.springframework.boot.actuate.autoconfigure.ExportMetricWriter) Configuration(org.springframework.context.annotation.Configuration) TimeUnit(java.util.concurrent.TimeUnit) Slf4j(lombok.extern.slf4j.Slf4j) JsonTypeInfo(com.fasterxml.jackson.annotation.JsonTypeInfo) Metric(org.springframework.boot.actuate.metrics.Metric) OpenTsdbGaugeWriter(org.springframework.boot.actuate.metrics.opentsdb.OpenTsdbGaugeWriter) Point(org.influxdb.dto.Point) Bean(org.springframework.context.annotation.Bean)
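The builder chain above is serialized by Point.build() into InfluxDB line protocol. As a rough sketch of that output shape (plain string handling, not the influxdb-java Point class; the measurement and values are illustrative), a point with one tag and one numeric field renders as `measurement,tagKey=tagValue fieldKey=fieldValue timestamp`:

```java
// Minimal sketch of InfluxDB line protocol for one tag and one numeric field.
// Real influxdb-java also escapes commas and spaces in identifiers; omitted here.
public class LineProtocolSketch {
    static String toLineProtocol(String measurement, String tagKey, String tagValue,
                                 String fieldKey, double fieldValue, long timeMillis) {
        return measurement + "," + tagKey + "=" + tagValue
                + " " + fieldKey + "=" + fieldValue
                + " " + timeMillis;
    }

    public static void main(String[] args) {
        // Mirrors the gauge point above: a "value" field and a "type" tag.
        System.out.println(toLineProtocol("gauge.response", "type", "Metric", "value", 123.0, 1500000000000L));
        // → gauge.response,type=Metric value=123.0 1500000000000
    }
}
```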

Example 7 with Point

Use of org.influxdb.dto.Point in project camel by apache.

The class InfluxDbProducer, method doInsert:

private void doInsert(Exchange exchange, String dataBaseName, String retentionPolicy) throws InvalidPayloadException {
    if (!endpoint.isBatch()) {
        Point p = exchange.getIn().getMandatoryBody(Point.class);
        try {
            LOG.debug("Writing point {}", p.lineProtocol());
            connection.write(dataBaseName, retentionPolicy, p);
        } catch (Exception ex) {
            exchange.setException(new CamelInfluxDbException(ex));
        }
    } else {
        BatchPoints batchPoints = exchange.getIn().getMandatoryBody(BatchPoints.class);
        try {
            LOG.debug("Writing BatchPoints {}", batchPoints.lineProtocol());
            connection.write(batchPoints);
        } catch (Exception ex) {
            exchange.setException(new CamelInfluxDbException(ex));
        }
    }
}
Also used : BatchPoints(org.influxdb.dto.BatchPoints) Point(org.influxdb.dto.Point) InvalidPayloadException(org.apache.camel.InvalidPayloadException)
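The batch branch exists because BatchPoints serializes its points into a single payload, one line-protocol line per point, so a single write replaces N round trips. A minimal sketch of that payload shape (plain string handling, not the influxdb-java API):

```java
import java.util.Arrays;
import java.util.List;

// Sketch: a batch payload is just the individual point lines joined by newlines.
public class BatchPayloadSketch {
    static String batchLineProtocol(List<String> pointLines) {
        return String.join("\n", pointLines);
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList("cpu busy=1.0 10", "cpu busy=2.0 20");
        System.out.println(batchLineProtocol(lines));
    }
}
```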

Example 8 with Point

Use of org.influxdb.dto.Point in project camel by apache.

The class CamelInfluxDbConverterTest, method canAddInt:

@Test
public void canAddInt() {
    Map<String, Object> pointInMapFormat = new HashMap<>();
    pointInMapFormat.put(InfluxDbConstants.MEASUREMENT_NAME, "testCPU");
    int value = 99999999;
    pointInMapFormat.put("busy", value);
    Point p = CamelInfluxDbConverters.fromMapToPoint(pointInMapFormat);
    assertNotNull(p);
    String line = p.lineProtocol();
    assertNotNull(line);
    LOG.debug("Int command generated: \"{}\"", line);
    assertTrue(line.contains("busy=99999999"));
}
Also used : HashMap(java.util.HashMap) Point(org.influxdb.dto.Point) Test(org.junit.Test)

Example 9 with Point

Use of org.influxdb.dto.Point in project camel by apache.

The class CamelInfluxDbConverterTest, method canAddByte:

@Test
public void canAddByte() {
    Map<String, Object> pointInMapFormat = new HashMap<>();
    pointInMapFormat.put(InfluxDbConstants.MEASUREMENT_NAME, "testCPU");
    byte value = Byte.MAX_VALUE;
    pointInMapFormat.put("busy", value);
    Point p = CamelInfluxDbConverters.fromMapToPoint(pointInMapFormat);
    assertNotNull(p);
    String line = p.lineProtocol();
    assertNotNull(line);
    LOG.debug("Byte command generated: \"{}\"", line);
    assertTrue(line.contains("busy=127"));
}
Also used : HashMap(java.util.HashMap) Point(org.influxdb.dto.Point) Test(org.junit.Test)
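Both converter tests above assert on the line protocol with contains() rather than equals(). One reason this matters: integral fields (byte/short/int/long) carry a trailing `i` in InfluxDB line protocol, e.g. `busy=127i`, so `contains("busy=127")` still matches. A hypothetical stand-in for the fromMapToPoint conversion, reduced to producing the field set (the constant's string value below is an assumption, not taken from the Camel source):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of CamelInfluxDbConverters.fromMapToPoint: pull the
// measurement name out of the map, treat every remaining entry as a field,
// and suffix integral field values with 'i' as line protocol does.
public class MapToFieldsSketch {
    static final String MEASUREMENT_NAME = "camelInfluxDB.MeasurementName"; // assumed key value

    static String fromMapToFieldSet(Map<String, Object> map) {
        Map<String, Object> copy = new LinkedHashMap<>(map);
        String measurement = (String) copy.remove(MEASUREMENT_NAME);
        StringBuilder sb = new StringBuilder(measurement).append(' ');
        boolean first = true;
        for (Map.Entry<String, Object> e : copy.entrySet()) {
            if (!first) sb.append(',');
            first = false;
            Object v = e.getValue();
            boolean integral = v instanceof Byte || v instanceof Short
                    || v instanceof Integer || v instanceof Long;
            sb.append(e.getKey()).append('=').append(v).append(integral ? "i" : "");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, Object> m = new LinkedHashMap<>();
        m.put(MEASUREMENT_NAME, "testCPU");
        m.put("busy", Byte.MAX_VALUE);
        System.out.println(fromMapToFieldSet(m)); // → testCPU busy=127i
    }
}
```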

Example 10 with Point

Use of org.influxdb.dto.Point in project jmxtrans by jmxtrans.

The class InfluxDbWriter, method doWrite:

/**
	 * <p>
	 * Each {@link Result} is written as a {@link Point} to InfluxDB
	 * </p>
	 *
	 * <p>
	 * The measurement for the {@link Point} is set to {@link Result#getKeyAlias()}.
	 * </p>
	 *
	 * <p>
	 * <a href=
	 * "https://influxdb.com/docs/v0.9/concepts/key_concepts.html#retention-policy">
	 * The retention policy</a> for the measurement is set to "default" unless
	 * overridden in settings:
	 * </p>
	 *
	 * <p>
	 * The write consistency level defaults to "ALL" unless overridden in
	 * settings:
	 * </p>
	 *
	 * <ul>
	 * <li>ALL = Write succeeds only if the write reached all cluster members.</li>
	 * <li>ANY = Write succeeds if the write reached any cluster member.</li>
	 * <li>ONE = Write succeeds if the write reached at least one cluster member.</li>
	 * <li>QUORUM = Write succeeds only if the write reached a quorum of cluster
	 * members.</li>
	 * </ul>
	 *
	 * <p>
	 * The time key for the {@link Point} is set to {@link Result#getEpoch()}
	 * </p>
	 *
	 * <p>
	 * All {@link Result#getValues()} are written as fields to the {@link Point}
	 * </p>
	 *
	 * <p>
	 * The following properties from {@link Result} are written as tags to the
	 * {@link Point} unless overridden in settings:
	 * </p>
	 *
	 * <ul>
	 * <li>{@link Result#getAttributeName()}</li>
	 * <li>{@link Result#getClassName()}</li>
	 * <li>{@link Result#getObjDomain()}</li>
	 * <li>{@link Result#getTypeName()}</li>
	 * </ul>
	 * <p>
	 * {@link Server#getHost()} is set as a tag on every {@link Point}
	 * </p>
	 *
	 */
@Override
public void doWrite(Server server, Query query, Iterable<Result> results) throws Exception {
    // Creates only if it doesn't already exist
    if (createDatabase) {
        influxDB.createDatabase(database);
    }
    BatchPoints.Builder batchPointsBuilder = BatchPoints.database(database)
            .retentionPolicy(retentionPolicy)
            .tag(TAG_HOSTNAME, server.getSource());
    for (Map.Entry<String, String> tag : tags.entrySet()) {
        batchPointsBuilder.tag(tag.getKey(), tag.getValue());
    }
    BatchPoints batchPoints = batchPointsBuilder.consistency(writeConsistency).build();
    for (Result result : results) {
        HashMap<String, Object> filteredValues = newHashMap(Maps.filterValues(result.getValues(), isNotNaN));
        // send the point if filteredValues isn't empty
        if (!filteredValues.isEmpty()) {
            filteredValues.put("_jmx_port", Integer.parseInt(server.getPort()));
            Map<String, String> resultTagsToApply = buildResultTagMap(result);
            Point point = Point.measurement(result.getKeyAlias())
                    .time(result.getEpoch(), MILLISECONDS)
                    .tag(resultTagsToApply)
                    .fields(filteredValues)
                    .build();
            batchPoints.point(point);
        }
    }
    influxDB.write(batchPoints);
}
Also used : BatchPoints(org.influxdb.dto.BatchPoints) Point(org.influxdb.dto.Point) ImmutableMap(com.google.common.collect.ImmutableMap) Maps.newHashMap(com.google.common.collect.Maps.newHashMap) HashMap(java.util.HashMap) TreeMap(java.util.TreeMap) Map(java.util.Map) Result(com.googlecode.jmxtrans.model.Result)
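The Maps.filterValues(result.getValues(), isNotNaN) step in doWrite matters because InfluxDB line protocol has no representation for NaN, so such field values would make the write fail. A Guava-free sketch of that filter (the method name below is illustrative; only double NaN values are dropped, mirroring the predicate above):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Drop NaN field values before building a Point, mirroring the Guava
// filter used in InfluxDbWriter.doWrite.
public class NanFilterSketch {
    static Map<String, Object> filterNaN(Map<String, Object> values) {
        Map<String, Object> out = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : values.entrySet()) {
            Object v = e.getValue();
            boolean drop = v instanceof Double && ((Double) v).isNaN();
            if (!drop) out.put(e.getKey(), v);
        }
        return out;
    }
}
```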

Aggregations

Point (org.influxdb.dto.Point): 14
Test (org.junit.Test): 8
BatchPoints (org.influxdb.dto.BatchPoints): 6
HashMap (java.util.HashMap): 5
TreeMap (java.util.TreeMap): 2
Matchers.anyString (org.mockito.Matchers.anyString): 2
JsonTypeInfo (com.fasterxml.jackson.annotation.JsonTypeInfo): 1
ImmutableMap (com.google.common.collect.ImmutableMap): 1
Maps.newHashMap (com.google.common.collect.Maps.newHashMap): 1
Result (com.googlecode.jmxtrans.model.Result): 1
Serializable (java.io.Serializable): 1
Date (java.util.Date): 1
Map (java.util.Map): 1
TimeUnit (java.util.concurrent.TimeUnit): 1
Getter (lombok.Getter): 1
ToString (lombok.ToString): 1
Slf4j (lombok.extern.slf4j.Slf4j): 1
InvalidPayloadException (org.apache.camel.InvalidPayloadException): 1
CasConfigurationProperties (org.apereo.cas.configuration.CasConfigurationProperties): 1
MetricsProperties (org.apereo.cas.configuration.model.core.metrics.MetricsProperties): 1