Project Name | Description | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
---|---|---|---|---|---|---|---|---|
Graphite Fabric | fabric-graphite is a fabric script to install Graphite and all dependencies on a Debian-based host (with optional Vagrant support) | 181 | 7 years ago | | | 2 | | Nginx |
Docker Graphite | Graphite + Carbon in a Docker image | 82 | 5 years ago | | | 7 | | Nginx |
Jmx2graphite | JMX to Graphite every x seconds in one command line (Docker based; also comes in a Java agent flavour) | 77 | 6 months ago | | | 7 | mit | Java |
Graphite Stresser | A stress testing tool for Graphite | 32 | 7 years ago | | | | | Java |
Graphite Dashgen | Graphite-dashgen automates the creation of Graphite dashboards | 29 | 8 years ago | 7 | November 10, 2014 | | mit | Python |
Diamond Dockercontainercollector | A Diamond collector for Docker containers | 28 | 6 years ago | | | 5 | | Python |
Cadvisor Collectd | Host and container metrics using cAdvisor and collectd | 22 | 5 years ago | | | 1 | apache-2.0 | Python |
Splunk_graphite | Graphite Output for Splunk App: enables Graphite output for Splunk searches, events and alerts | 17 | 5 years ago | | | 1 | other | Python |
Mysql Stats To Graphite | | 16 | 9 years ago | | | 2 | gpl-2.0 | Python |
Graphping | Send ping statistics for multiple hosts to Graphite | 14 | 3 years ago | | | 1 | apache-2.0 | Python |
Send ping statistics for multiple hosts to Graphite
This Python program takes a list of hostnames to collect ping statistics for. It uses the 'fping' program to do the real work and sends the results to Graphite. By default, it detaches itself from the console and continues to run in the background.
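For a sense of what "doing the real work" looks like, here is a minimal sketch (not the project's actual code) of running fping in count mode from Python; the real script's flags and error handling may differ, but '-c' is a standard fping option for sending a fixed number of pings per target:

```python
import subprocess

# Minimal sketch, not graphping's actual code: run fping in count mode
# against a list of targets and return its textual report. fping may write
# the per-target summary to stderr, so both streams are captured.
def run_fping(targets, packets=30, fping='/usr/sbin/fping'):
    cmd = [fping, '-c', str(packets)] + list(targets)
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stdout + result.stderr

# Example: print(run_fping(['www.github.io', 'www.google.com'], packets=3))
```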
Until stopped, the program starts an 'fping' run for all the given targets and sends 30 ICMP ping packets to each. The results (minimum, average and maximum ping round-trip times, as well as the packet loss percentage) are sent to a Graphite server, which by default is expected to be listening on localhost:2003. It then sleeps until the next minute on the system clock before starting another 'fping' run, so you get one measurement per minute. This interval is not configurable at this time.
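For context, delivering a measurement to Graphite only requires Carbon's plaintext protocol: one line of the form "name value timestamp" per metric, sent to TCP port 2003. A minimal sketch (again, not the script's actual code):

```python
import socket
import time

# Minimal sketch: push a single value to a Carbon plaintext listener.
# The default host and port match the script's defaults (localhost:2003).
def send_metric(name, value, host='localhost', port=2003):
    line = '%s %s %d\n' % (name, value, int(time.time()))
    with socket.create_connection((host, port)) as sock:
        sock.sendall(line.encode('ascii'))

# Example: send_metric('ping.www_google_com.avg', 8.01)
```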
By default, it uses 'ping' as a prefix for the metric name in Graphite, followed by the hostname, with dots replaced by underscores. So for a host named 'host.domain.tld', you would get the following metrics in Graphite, all of which are gauges:
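The exact suffixes are defined in the script itself; assuming they simply mirror the statistics listed above, the series would be named along these lines (treat the suffixes as placeholders):

```
ping.host_domain_tld.min
ping.host_domain_tld.avg
ping.host_domain_tld.max
ping.host_domain_tld.loss
```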
The program uses regular expressions to extract the useful numbers from the fping output, so it depends on the specific formatting of that output; a sketch of such a pattern follows the sample output below. The following fping programs are known to be compatible:
The output should look like this (for 2 hosts, 3 packets per host):
```
www.github.io : [0], 84 bytes, 4.26 ms (4.26 avg, 0% loss)
www.google.com : [0], 84 bytes, 8.35 ms (8.35 avg, 0% loss)
www.github.io : [1], 84 bytes, 3.25 ms (3.75 avg, 0% loss)
www.google.com : [1], 84 bytes, 7.84 ms (8.09 avg, 0% loss)
www.github.io : [2], 84 bytes, 3.32 ms (3.61 avg, 0% loss)
www.google.com : [2], 84 bytes, 7.86 ms (8.01 avg, 0% loss)
www.github.io : xmt/rcv/%loss = 3/3/0%, min/avg/max = 3.25/3.61/4.26
www.google.com : xmt/rcv/%loss = 3/3/0%, min/avg/max = 7.84/8.01/8.35
```
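As an illustration of the parsing involved (referenced above), the per-host summary line can be picked apart with a regular expression roughly like the one below; the pattern actually used by graphping may differ in detail:

```python
import re

# Rough sketch of a pattern for fping's per-host summary line. The
# min/avg/max part is optional because it is absent at 100% packet loss.
SUMMARY = re.compile(
    r'^(?P<host>\S+)\s*:\s*xmt/rcv/%loss = (?P<xmt>\d+)/(?P<rcv>\d+)/(?P<loss>[\d.]+)%'
    r'(?:, min/avg/max = (?P<min>[\d.]+)/(?P<avg>[\d.]+)/(?P<max>[\d.]+))?$'
)

line = 'www.google.com : xmt/rcv/%loss = 3/3/0%, min/avg/max = 7.84/8.01/8.35'
m = SUMMARY.match(line)
if m:
    metric_host = m.group('host').replace('.', '_')   # dots become underscores
    print(metric_host, m.group('loss'), m.group('min'), m.group('avg'), m.group('max'))
```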
Set up a Carbon/Whisper retention schema for the ping statistics in Carbon's storage-schemas.conf, keeping one sample per minute for some time, optionally followed by coarser aggregated archives:
```
[ping]
pattern = ^ping\.
retentions = 1m:1d,5m:31d,60m:366d
```
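If you also want the coarser archives to be aggregated sensibly, a matching entry in Carbon's storage-aggregation.conf could look like the sketch below; the aggregation method is a matter of taste (average keeps round-trip times and loss percentages meaningful) and is not part of graphping itself:

```
[ping]
pattern = ^ping\.
xFilesFactor = 0.5
aggregationMethod = average
```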
At the top of the script, there are some configurable items:
```python
# number of ping packets to send
packets = 30
# Graphite
g_host = 'localhost'
g_port = 2003
g_prefix = 'ping'
# System
fping = '/usr/sbin/fping'
logfile = '/tmp/ping-graphite.log'
```
If you want to run the program in the foreground instead of daemonized, find the line in the script that says

```python
self.daemonize = True
```

and change the value to False.
Start the program by passing one or more target hostnames on the command line:

```
./graphping <target> [<target> ...]
```
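For example, with the two hosts from the sample output above (the log path is the default from the configuration section):

```
./graphping www.github.io www.google.com
tail -f /tmp/ping-graphite.log
```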
Graphping is licensed under the Apache License, version 2.0. A copy of the license can be found in the 'COPYING' file and on the web.