chore: clean up all markdown lint errors in processor plugins (#10157)

This commit is contained in:
Joshua Powers 2021-11-24 11:47:11 -07:00 committed by GitHub
parent 97826bdc73
commit 4605c977da
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
24 changed files with 179 additions and 154 deletions

View File

@ -22,7 +22,7 @@ created.
A typical use-case is gathering metrics once and cloning them to simulate
having several hosts (modifying ``host`` tag).
### Configuration:
## Configuration
```toml
# Apply metric modifications using override semantics.

View File

@ -11,7 +11,8 @@ will overwrite one another.
**Note on large strings being converted to numeric types:** When converting a string value to a numeric type, precision may be lost if the number is too large. The largest numeric type this plugin supports is `float64`, and if a string 'number' exceeds its size limit, accuracy may be lost.
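The float64 ceiling described here is easy to demonstrate with a short Go sketch (standalone, not plugin code; `parseLarge` is an illustrative helper):

```go
package main

import (
	"fmt"
	"strconv"
)

// parseLarge converts a string "number" to float64, as the converter
// plugin ultimately must when targeting its widest numeric type.
func parseLarge(s string) float64 {
	f, err := strconv.ParseFloat(s, 64)
	if err != nil {
		panic(err)
	}
	return f
}

func main() {
	s := "9007199254740993" // 2^53 + 1: just past float64's exact-integer range
	fmt.Printf("%s -> %.0f\n", s, parseLarge(s))
	// prints 9007199254740992: the value rounds to the nearest representable float
}
```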
### Configuration
## Configuration
```toml
# Convert values to another metric value type
[[processors.converter]]
@ -46,6 +47,7 @@ will overwrite one another.
### Example
Convert `port` tag to a string field:
```toml
[[processors.converter]]
[processors.converter.tags]
@ -58,6 +60,7 @@ Convert `port` tag to a string field:
```
Convert all `scboard_*` fields to an integer:
```toml
[[processors.converter]]
[processors.converter.fields]
@ -70,6 +73,7 @@ Convert all `scboard_*` fields to an integer:
```
Rename the measurement from a tag value:
```toml
[[processors.converter]]
[processors.converter.tags]

View File

@ -5,11 +5,12 @@ Use the `date` processor to add the metric timestamp as a human readable tag.
A common use is to add a tag that can be used to group by month or year.
A few example use cases include:
1) consumption data for utilities on a per-month basis
2) bandwidth capacity per month
3) compare energy production or sales on a yearly or monthly basis
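The month/year grouping above comes down to formatting the metric timestamp with a Go reference-time layout, which a `date_format = "Jan"`-style configuration would use (sketch; `monthTag` is an illustrative helper):

```go
package main

import (
	"fmt"
	"time"
)

// monthTag formats a Unix timestamp with Go's reference-time layout
// "Jan", yielding a value suitable for a month tag.
func monthTag(unix int64) string {
	return time.Unix(unix, 0).UTC().Format("Jan")
}

func main() {
	fmt.Println(monthTag(1560540094)) // a June 2019 timestamp
}
```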
### Configuration
## Configuration
```toml
[[processors.date]]
@ -37,16 +38,17 @@ A few example usecases include:
# timezone = "UTC"
```
#### timezone
### timezone
On Windows, only the `Local` and `UTC` zones are available by default. To use
other timezones, set the `ZONEINFO` environment variable to the location of
[`zoneinfo.zip`][zoneinfo]:
```
```text
set ZONEINFO=C:\zoneinfo.zip
```
### Example
## Example
```diff
- throughput lower=10i,upper=1000i,mean=500i 1560540094000000000

View File

@ -2,7 +2,7 @@
Filter metrics whose field values are exact repetitions of the previous values.
### Configuration
## Configuration
```toml
[[processors.dedup]]
@ -10,7 +10,7 @@ Filter metrics whose field values are exact repetitions of the previous values.
dedup_interval = "600s"
```
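The dedup check itself can be sketched in a few lines of Go (illustrative only; the real plugin also tracks `dedup_interval`):

```go
package main

import (
	"fmt"
	"reflect"
)

// seen remembers the last field set per series; a metric is a duplicate
// when its fields exactly repeat the previous ones for that series.
var seen = map[string]map[string]interface{}{}

func isDup(series string, fields map[string]interface{}) bool {
	prev, ok := seen[series]
	seen[series] = fields
	return ok && reflect.DeepEqual(prev, fields)
}

func main() {
	f := map[string]interface{}{"time_idle": int64(42)}
	fmt.Println(isDup("cpu,cpu=cpu0", f)) // false: first occurrence passes
	fmt.Println(isDup("cpu,cpu=cpu0", f)) // true: exact repetition would be dropped
}
```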
### Example
## Example
```diff
- cpu,cpu=cpu0 time_idle=42i,time_guest=1i

View File

@ -10,7 +10,8 @@ There are three cases where this processor will insert a configured default fiel
Telegraf minimum version: Telegraf 1.15.0
### Configuration
## Configuration
```toml
## Set default fields on your metric(s) when they are nil or empty
[[processors.defaults]]
@ -22,7 +23,8 @@ Telegraf minimum version: Telegraf 1.15.0
is_error = true
```
### Example
## Example
Ensure a _status\_code_ field with value _N/A_ is inserted when the metric does not already have one:
```toml

View File

@ -9,7 +9,7 @@ used for all values, which are not contained in the value_mappings. The
processor supports explicit configuration of a destination tag or field. By default the
source tag or field is overwritten.
### Configuration:
## Configuration
```toml
[[processors.enum]]
@ -25,7 +25,7 @@ source tag or field is overwritten.
dest = "status_code"
## Default value to be used for all values not contained in the mapping
## table. When unset and no match is found, the original field will remain
## table. When unset and no match is found, the original field will remain
## unmodified and the destination tag or field will not be created.
# default = 0
@ -36,7 +36,7 @@ source tag or field is overwritten.
red = 3
```
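The mapping semantics, including the `default` fallback, amount to a lookup like this Go sketch (`mapStatus` and `valueMappings` are illustrative names, not the plugin's internals):

```go
package main

import "fmt"

// valueMappings mirrors the mapping table in the config above.
var valueMappings = map[string]int{"green": 1, "amber": 2, "red": 3}

// mapStatus returns the mapped value; when nothing matches it falls back
// to def only if a default was configured, otherwise it reports that the
// original field should remain unmodified.
func mapStatus(v string, def int, hasDefault bool) (int, bool) {
	if mapped, ok := valueMappings[v]; ok {
		return mapped, true
	}
	if hasDefault {
		return def, true
	}
	return 0, false
}

func main() {
	fmt.Println(mapStatus("green", 0, true))  // mapped value
	fmt.Println(mapStatus("black", 0, true))  // default applies
	fmt.Println(mapStatus("black", 0, false)) // value left as-is
}
```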
### Example:
## Example
```diff
- xyzzy status="green" 1502489900000000000
@ -44,6 +44,7 @@ source tag or field is overwritten.
```
With unknown value and no default set:
```diff
- xyzzy status="black" 1502489900000000000
+ xyzzy status="black" 1502489900000000000

View File

@ -9,7 +9,7 @@ Program output on standard error is mirrored to the telegraf log.
Telegraf minimum version: Telegraf 1.15.0
### Caveats
## Caveats
- Metrics with tracking will be considered "delivered" as soon as they are passed
to the external process. There is currently no way to match up which metric
@ -20,7 +20,7 @@ Telegraf minimum version: Telegraf 1.15.0
the requirement that it is serialize-parse symmetrical and does not lose any
critical type data.
### Configuration:
## Configuration
```toml
[[processors.execd]]
@ -33,9 +33,9 @@ Telegraf minimum version: Telegraf 1.15.0
# restart_delay = "10s"
```
### Example
## Example
#### Go daemon example
### Go daemon example
This Go daemon reads a metric from stdin, multiplies the "count" field by 2,
and writes the metric back out.
@ -44,55 +44,55 @@ and writes the metric back out.
package main
import (
"fmt"
"os"
"fmt"
"os"
"github.com/influxdata/telegraf/metric"
"github.com/influxdata/telegraf/plugins/parsers/influx"
"github.com/influxdata/telegraf/plugins/serializers"
"github.com/influxdata/telegraf/metric"
"github.com/influxdata/telegraf/plugins/parsers/influx"
"github.com/influxdata/telegraf/plugins/serializers"
)
func main() {
parser := influx.NewStreamParser(os.Stdin)
serializer, _ := serializers.NewInfluxSerializer()
parser := influx.NewStreamParser(os.Stdin)
serializer, _ := serializers.NewInfluxSerializer()
for {
metric, err := parser.Next()
if err != nil {
if err == influx.EOF {
return // stream ended
}
if parseErr, isParseError := err.(*influx.ParseError); isParseError {
fmt.Fprintf(os.Stderr, "parse ERR %v\n", parseErr)
os.Exit(1)
}
fmt.Fprintf(os.Stderr, "ERR %v\n", err)
os.Exit(1)
}
for {
metric, err := parser.Next()
if err != nil {
if err == influx.EOF {
return // stream ended
}
if parseErr, isParseError := err.(*influx.ParseError); isParseError {
fmt.Fprintf(os.Stderr, "parse ERR %v\n", parseErr)
os.Exit(1)
}
fmt.Fprintf(os.Stderr, "ERR %v\n", err)
os.Exit(1)
}
c, found := metric.GetField("count")
if !found {
fmt.Fprintf(os.Stderr, "metric has no count field\n")
os.Exit(1)
}
switch t := c.(type) {
case float64:
t *= 2
metric.AddField("count", t)
case int64:
t *= 2
metric.AddField("count", t)
default:
fmt.Fprintf(os.Stderr, "count is of an unknown type, it's a %T\n", c)
os.Exit(1)
}
b, err := serializer.Serialize(metric)
if err != nil {
fmt.Fprintf(os.Stderr, "ERR %v\n", err)
os.Exit(1)
}
fmt.Fprint(os.Stdout, string(b))
}
c, found := metric.GetField("count")
if !found {
fmt.Fprintf(os.Stderr, "metric has no count field\n")
os.Exit(1)
}
switch t := c.(type) {
case float64:
t *= 2
metric.AddField("count", t)
case int64:
t *= 2
metric.AddField("count", t)
default:
fmt.Fprintf(os.Stderr, "count is of an unknown type, it's a %T\n", c)
os.Exit(1)
}
b, err := serializer.Serialize(metric)
if err != nil {
fmt.Fprintf(os.Stderr, "ERR %v\n", err)
os.Exit(1)
}
fmt.Fprint(os.Stdout, string(b))
}
}
```
@ -103,7 +103,7 @@ to run it, you'd build the binary using go, eg `go build -o multiplier.exe main.
command = ["multiplier.exe"]
```
#### Ruby daemon
### Ruby daemon
- See [Ruby daemon](./examples/multiplier_line_protocol/multiplier_line_protocol.rb)

View File

@ -1,3 +1,5 @@
<!-- markdownlint-disable MD024 -->
# Filepath Processor Plugin
The `filepath` processor plugin maps certain go functions from [path/filepath](https://golang.org/pkg/path/filepath/)
@ -24,7 +26,7 @@ If you plan to apply multiple transformations to the same `tag`/`field`, bear in
Telegraf minimum version: Telegraf 1.15.0
### Configuration
## Configuration
```toml
[[processors.filepath]]
@ -58,9 +60,9 @@ Telegraf minimum version: Telegraf 1.15.0
# tag = "path"
```
### Considerations
## Considerations
#### Clean
### Clean
Even though `clean` is provided as a standalone function, it is also invoked when using the `rel` and `dirname` functions,
so there is no need to use it along with them.
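Because the processor maps these `path/filepath` functions directly, their behavior can be previewed with plain Go (Unix-style paths assumed; the helpers are illustrative):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// Thin wrappers over the stdlib calls the processor exposes.
func dirname(p string) string  { return filepath.Dir(p) }
func basename(p string) string { return filepath.Base(p) }

// stem returns the base name with its extension stripped.
func stem(p string) string {
	return strings.TrimSuffix(filepath.Base(p), filepath.Ext(p))
}

func main() {
	p := "/var/log/batch/ajob.log"
	fmt.Println(filepath.Clean("/var/log/batch/../batch/ajob.log")) // collapses ".."
	fmt.Println(dirname(p))
	fmt.Println(basename(p))
	fmt.Println(stem(p))
}
```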
@ -83,14 +85,14 @@ Is equivalent to:
tag = "path"
```
#### ToSlash
### ToSlash
The effects of this function are only noticeable on Windows platforms, because of the underlying golang implementation.
### Examples
## Examples
### Basename
#### Basename
```toml
[[processors.filepath]]
[[processors.filepath.basename]]
@ -102,7 +104,7 @@ The effects of this function are only noticeable on Windows platforms, because o
+ my_metric,path="ajob.log" duration_seconds=134 1587920425000000000
```
#### Dirname
### Dirname
```toml
[[processors.filepath]]
@ -116,7 +118,7 @@ The effects of this function are only noticeable on Windows platforms, because o
+ my_metric path="/var/log/batch/ajob.log",folder="/var/log/batch",duration_seconds=134 1587920425000000000
```
#### Stem
### Stem
```toml
[[processors.filepath]]
@ -129,7 +131,7 @@ The effects of this function are only noticeable on Windows platforms, because o
+ my_metric,path="ajob" duration_seconds=134 1587920425000000000
```
#### Clean
### Clean
```toml
[[processors.filepath]]
@ -142,7 +144,7 @@ The effects of this function are only noticeable on Windows platforms, because o
+ my_metric,path="/var/log/batch/ajob.log" duration_seconds=134 1587920425000000000
```
#### Rel
### Rel
```toml
[[processors.filepath]]
@ -156,7 +158,7 @@ The effects of this function are only noticeable on Windows platforms, because o
+ my_metric,path="batch/ajob.log" duration_seconds=134 1587920425000000000
```
#### ToSlash
### ToSlash
```toml
[[processors.filepath]]
@ -169,7 +171,7 @@ The effects of this function are only noticeable on Windows platforms, because o
+ my_metric,path="/var/log/batch/ajob.log" duration_seconds=134 1587920425000000000
```
### Processing paths from tail plugin
## Processing paths from tail plugin
This plugin can be used together with the
[tail input plugin](https://github.com/influxdata/telegraf/tree/master/plugins/inputs/tail) to make modifications
@ -181,9 +183,9 @@ Scenario:
written to the log file following this format: `2020-04-05 11:45:21 total time execution: 70 seconds`
* We want to generate a measurement that captures the duration of the script as a field and includes the `path` as a
tag
* We are interested in the filename without its extensions, since it might be enough information for plotting our
* We are interested in the filename without its extensions, since it might be enough information for plotting our
execution times in a dashboard
* Just in case, we don't want to override the original path (if for some reason we end up having duplicates we might
* Just in case, we don't want to override the original path (if for some reason we end up having duplicates we might
want this information)
For this purpose, we will use the `tail` input plugin, the `grok` parser plugin and the `filepath` processor.
@ -199,7 +201,6 @@ For this purpose, we will use the `tail` input plugin, the `grok` parser plugin
[[processors.filepath.stem]]
tag = "path"
dest = "stempath"
```
The resulting output for a job taking 70 seconds for the mentioned log file would look like:

View File

@ -4,7 +4,7 @@ The `ifname` plugin looks up network interface names using SNMP.
Telegraf minimum version: Telegraf 1.15.0
### Configuration:
## Configuration
```toml
[[processors.ifname]]
@ -66,7 +66,7 @@ Telegraf minimum version: Telegraf 1.15.0
# cache_ttl = "8h"
```
### Example processing:
## Example
Example config:

View File

@ -21,7 +21,7 @@ Use-case of this plugin encompass ensuring certain tags or naming conventions
are adhered to irrespective of input plugin configurations, e.g. by
`taginclude`.
### Configuration:
## Configuration
```toml
# Apply metric modifications using override semantics.

View File

@ -4,6 +4,7 @@ This plugin parses defined fields containing the specified data format and
creates new metrics based on the contents of the field.
## Configuration
```toml
[[processors.parser]]
## The name of the fields whose value will be parsed.
@ -23,7 +24,7 @@ creates new metrics based on the contents of the field.
data_format = "influx"
```
### Example:
## Example
```toml
[[processors.parser]]
@ -32,14 +33,14 @@ creates new metrics based on the contents of the field.
data_format = "logfmt"
```
**Input**:
```
### Input
```text
syslog,appname=influxd,facility=daemon,hostname=http://influxdb.example.org\ (influxdb.example.org),severity=info facility_code=3i,message=" ts=2018-08-09T21:01:48.137963Z lvl=info msg=\"Executing query\" log_id=09p7QbOG000 service=query query=\"SHOW DATABASES\"",procid="6629",severity_code=6i,timestamp=1533848508138040000i,version=1i
```
**Output**:
```
### Output
```text
syslog,appname=influxd,facility=daemon,hostname=http://influxdb.example.org\ (influxdb.example.org),severity=info facility_code=3i,log_id="09p7QbOG000",lvl="info",message=" ts=2018-08-09T21:01:48.137963Z lvl=info msg=\"Executing query\" log_id=09p7QbOG000 service=query query=\"SHOW DATABASES\"",msg="Executing query",procid="6629",query="SHOW DATABASES",service="query",severity_code=6i,timestamp=1533848508138040000i,ts="2018-08-09T21:01:48.137963Z",version=1i
```

View File

@ -8,7 +8,7 @@ formats.
To perform the reverse operation use the [unpivot] processor.
### Configuration
## Configuration
```toml
[[processors.pivot]]
@ -18,7 +18,7 @@ To perform the reverse operation use the [unpivot] processor.
value_key = "value"
```
### Example
## Example
```diff
- cpu,cpu=cpu0,name=time_idle value=42i

View File

@ -8,7 +8,7 @@ If the source was found in tag, the service name will be added as a tag. If the
Telegraf minimum version: Telegraf 1.15.0
### Configuration
## Configuration
```toml
[[processors.port_name]]
@ -30,7 +30,7 @@ Telegraf minimum version: Telegraf 1.15.0
# protocol_field = "proto"
```
### Example
## Example
```diff
- measurement,port=80 field=123 1560540094000000000

View File

@ -2,13 +2,13 @@
The printer processor plugin simply prints every metric passing through it.
### Configuration:
## Configuration
```toml
# Print all metrics that pass through this filter.
[[processors.printer]]
```
### Tags:
## Tags
No tags are applied by this processor.

View File

@ -6,7 +6,7 @@ For tags transforms, if `append` is set to `true`, it will append the transforma
For metrics transforms, `key` denotes the element that should be transformed. Furthermore, `result_key` allows control over the behavior applied in case the resulting `tag` or `field` name already exists.
### Configuration:
## Configuration
```toml
[[processors.regex]]
@ -74,11 +74,12 @@ For metrics transforms, `key` denotes the element that should be transformed. Fu
# replacement = "${1}"
```
### Tags:
## Tags
No tags are applied by this processor.
### Example Output:
```
## Example
```text
nginx_requests,verb=GET,resp_code=2xx request="/api/search/?category=plugins&q=regex&sort=asc",method="/search/",category="plugins",referrer="-",ident="-",http_version=1.1,agent="UserAgent",client_ip="127.0.0.1",auth="-",resp_bytes=270i 1519652321000000000
```

View File

@ -2,7 +2,7 @@
The `rename` processor renames measurements, fields, and tags.
### Configuration:
## Configuration
```toml
[[processors.rename]]
@ -24,11 +24,11 @@ The `rename` processor renames measurements, fields, and tags.
dest = "max"
```
### Tags:
## Tags
No tags are applied by this processor, though it can alter them by renaming.
### Example processing:
## Example
```diff
- network_interface_throughput,hostname=backend.example.com lower=10i,upper=1000i,mean=500i 1502489900000000000

View File

@ -5,7 +5,7 @@ IPs in them.
Telegraf minimum version: Telegraf 1.15.0
### Configuration:
## Configuration
```toml
[[processors.reverse_dns]]
@ -55,9 +55,7 @@ Telegraf minimum version: Telegraf 1.15.0
## processors.converter after this one, specifying the order attribute.
```
### Example processing:
## Example
Example config:

View File

@ -4,7 +4,7 @@ Use the `s2geo` processor to add tag with S2 cell ID token of specified [cell le
The tag is used in `experimental/geo` Flux package functions.
The `lat` and `lon` fields values should contain WGS-84 coordinates in decimal degrees.
### Configuration
## Configuration
```toml
[[processors.s2geo]]
@ -20,7 +20,7 @@ The `lat` and `lon` fields values should contain WGS-84 coordinates in decimal d
# cell_level = 9
```
### Example
## Example
```diff
- mta,area=llir,id=GO505_20_2704,status=1 lat=40.878738,lon=-72.517572 1560540094

View File

@ -14,7 +14,7 @@ functions.
Telegraf minimum version: Telegraf 1.15.0
### Configuration
## Configuration
```toml
[[processors.starlark]]
@ -25,7 +25,7 @@ Telegraf minimum version: Telegraf 1.15.0
## Source of the Starlark script.
source = '''
def apply(metric):
return metric
return metric
'''
## File containing a Starlark script.
@ -39,7 +39,7 @@ def apply(metric):
# debug_mode = true
```
### Usage
## Usage
The Starlark code should contain a function called `apply` that takes a metric as
its single argument. The function will be called with each metric, and can
@ -47,7 +47,7 @@ return `None`, a single metric, or a list of metrics.
```python
def apply(metric):
return metric
return metric
```
For a list of available types and functions that can be used in the code, see
@ -90,7 +90,8 @@ While Starlark is similar to Python, there are important differences to note:
- It is not possible to open files or sockets.
- These common keywords are **not supported** in the Starlark grammar:
```
```text
as finally nonlocal
assert from raise
class global try
@ -102,10 +103,10 @@ While Starlark is similar to Python, there are important differences to note:
The ability to load external scripts other than your own is pretty limited. The following libraries are available for loading:
* json: `load("json.star", "json")` provides the following functions: `json.encode()`, `json.decode()`, `json.indent()`. See [json.star](/plugins/processors/starlark/testdata/json.star) for an example. For more details about the functions, please refer to [the documentation of this library](https://pkg.go.dev/go.starlark.net/lib/json).
* log: `load("logging.star", "log")` provides the following functions: `log.debug()`, `log.info()`, `log.warn()`, `log.error()`. See [logging.star](/plugins/processors/starlark/testdata/logging.star) for an example.
* math: `load("math.star", "math")` provides [the following functions and constants](https://pkg.go.dev/go.starlark.net/lib/math). See [math.star](/plugins/processors/starlark/testdata/math.star) for an example.
* time: `load("time.star", "time")` provides the following functions: `time.from_timestamp()`, `time.is_valid_timezone()`, `time.now()`, `time.parse_duration()`, `time.parseTime()`, `time.time()`. See [time_date.star](/plugins/processors/starlark/testdata/time_date.star), [time_duration.star](/plugins/processors/starlark/testdata/time_duration.star) and/or [time_timestamp.star](/plugins/processors/starlark/testdata/time_timestamp.star) for an example. For more details about the functions, please refer to [the documentation of this library](https://pkg.go.dev/go.starlark.net/lib/time).
- json: `load("json.star", "json")` provides the following functions: `json.encode()`, `json.decode()`, `json.indent()`. See [json.star](/plugins/processors/starlark/testdata/json.star) for an example. For more details about the functions, please refer to [the documentation of this library](https://pkg.go.dev/go.starlark.net/lib/json).
- log: `load("logging.star", "log")` provides the following functions: `log.debug()`, `log.info()`, `log.warn()`, `log.error()`. See [logging.star](/plugins/processors/starlark/testdata/logging.star) for an example.
- math: `load("math.star", "math")` provides [the following functions and constants](https://pkg.go.dev/go.starlark.net/lib/math). See [math.star](/plugins/processors/starlark/testdata/math.star) for an example.
- time: `load("time.star", "time")` provides the following functions: `time.from_timestamp()`, `time.is_valid_timezone()`, `time.now()`, `time.parse_duration()`, `time.parseTime()`, `time.time()`. See [time_date.star](/plugins/processors/starlark/testdata/time_date.star), [time_duration.star](/plugins/processors/starlark/testdata/time_duration.star) and/or [time_timestamp.star](/plugins/processors/starlark/testdata/time_timestamp.star) for an example. For more details about the functions, please refer to [the documentation of this library](https://pkg.go.dev/go.starlark.net/lib/time).
If you would like to see support for something else here, please open an issue.
@ -167,7 +168,7 @@ def apply(metric):
**How can I save values across multiple calls to the script?**
Telegraf freezes the global scope, which prevents it from being modified, except for a special shared global dictionary
Telegraf freezes the global scope, which prevents it from being modified, except for a special shared global dictionary
named `state`, which can be used by the `apply` function.
See an example of this in [compare with previous metric](/plugins/processors/starlark/testdata/compare_metrics.star)
@ -194,6 +195,7 @@ def apply(metric):
def failing(metric):
json.decode("non-json-content")
```
**How to reuse the same script but with different parameters?**
In case you have a generic script that you would like to reuse for different instances of the plugin, you can use constants as input parameters of your script.

View File

@ -3,6 +3,7 @@
The `strings` plugin maps certain go string functions onto measurement, tag, and field values. Values can be modified in place or stored in another key.
Implemented functions are:
- lowercase
- uppercase
- titlecase
@ -22,9 +23,9 @@ Specify the `measurement`, `tag`, `tag_key`, `field`, or `field_key` that you wa
If you'd like to apply the change to every `tag`, `tag_key`, `field`, `field_key`, or `measurement`, use the value `"*"` for each respective field. Note that the `dest` field will be ignored if `"*"` is used.
If you'd like to apply multiple processings to the same `tag_key` or `field_key`, note the process order stated above. See [Example 2]() for an example.
If you'd like to apply multiple processings to the same `tag_key` or `field_key`, note the process order stated above; see the second example below.
### Configuration:
## Configuration
```toml
[[processors.strings]]
@ -87,16 +88,16 @@ If you'd like to apply multiple processings to the same `tag_key` or `field_key`
# replacement = ""
```
#### Trim, TrimLeft, TrimRight
### Trim, TrimLeft, TrimRight
The `trim`, `trim_left`, and `trim_right` functions take an optional parameter: `cutset`. This value is a string containing the characters to remove from the value.
#### TrimPrefix, TrimSuffix
### TrimPrefix, TrimSuffix
The `trim_prefix` and `trim_suffix` functions remove the given `prefix` or `suffix`
respectively from the string.
#### Replace
### Replace
The `replace` function does a substring replacement across the entire
string to allow for different conventions between various input and output
@ -106,8 +107,10 @@ Can also be used to eliminate unneeded chars that were in metrics.
If the entire name would be deleted, it will refuse to perform
the operation and keep the old name.
### Example
**Config**
## Example
A sample configuration:
```toml
[[processors.strings]]
[[processors.strings.lowercase]]
@ -122,18 +125,22 @@ the operation and keep the old name.
dest = "cs-host_normalised"
```
**Input**
```
Sample input:
```text
iis_log,method=get,uri_stem=/API/HealthCheck cs-host="MIXEDCASE_host",http_version=1.1 1519652321000000000
```
**Output**
```
Sample output:
```text
iis_log,method=get,uri_stem=healthcheck cs-host="MIXEDCASE_host",http_version=1.1,cs-host_normalised="MIXEDCASE_HOST" 1519652321000000000
```
### Example 2
**Config**
### Second Example
A sample configuration:
```toml
[[processors.strings]]
[[processors.strings.lowercase]]
@ -145,12 +152,14 @@ iis_log,method=get,uri_stem=healthcheck cs-host="MIXEDCASE_host",http_version=1.
new = "_"
```
**Input**
```
Sample input:
```text
iis_log,URI-Stem=/API/HealthCheck http_version=1.1 1519652321000000000
```
**Output**
```
Sample output:
```text
iis_log,uri_stem=/API/HealthCheck http_version=1.1 1519652321000000000
```

View File

@ -8,7 +8,7 @@ This can be useful when dealing with output systems (e.g. Stackdriver) that
impose hard limits on the number of tags/labels per metric or where high
levels of cardinality are computationally and/or financially expensive.
### Configuration
## Configuration
```toml
[[processors.tag_limit]]
@ -19,7 +19,7 @@ levels of cardinality are computationally and/or financially expensive.
keep = ["environment", "region"]
```
### Example
## Example
```diff
+ throughput month=Jun,environment=qa,region=us-east1,lower=10i,upper=1000i,mean=500i 1560540094000000000

View File

@ -10,7 +10,7 @@ timestamp using the [interface in `/template_metric.go`](template_metric.go).
Read the full [Go Template Documentation][].
### Configuration
## Configuration
```toml
[[processors.template]]
@ -23,9 +23,10 @@ Read the full [Go Template Documentation][].
template = '{{ .Tag "hostname" }}.{{ .Tag "level" }}'
```
### Example
## Example
Combine multiple tags to create a single tag:
```toml
[[processors.template]]
tag = "topic"
@ -38,6 +39,7 @@ Combine multiple tags to create a single tag:
```
Add measurement name as a tag:
```toml
[[processors.template]]
tag = "measurement"
@ -50,6 +52,7 @@ Add measurement name as a tag:
```
Add the year as a tag, similar to the date processor:
```toml
[[processors.template]]
tag = "year"

View File

@ -4,17 +4,18 @@ The TopK processor plugin is a filter designed to get the top series over a peri
This processor goes through these steps when processing a batch of metrics:
1. Groups measurements in buckets based on their tags and name
2. Every N seconds, for each bucket, for each selected field: aggregate all the measurements using a given aggregation function (min, sum, mean, etc) and the field.
3. For each computed aggregation: order the buckets by the aggregation, then returns all measurements in the top `K` buckets
1. Groups measurements in buckets based on their tags and name
2. Every N seconds, for each bucket, for each selected field: aggregate all the measurements using a given aggregation function (min, sum, mean, etc) and the field.
3. For each computed aggregation: order the buckets by the aggregation, then returns all measurements in the top `K` buckets
Notes:
* The processor deduplicates metrics
* The name of the measurement is always used when grouping it
* Depending on the amount of metrics on each bucket, more than `K` series may be returned
* If a measurement does not have one of the selected fields, it is dropped from the aggregation
### Configuration:
* The processor deduplicates metrics
* The name of the measurement is always used when grouping it
* Depending on the amount of metrics on each bucket, more than `K` series may be returned
* If a measurement does not have one of the selected fields, it is dropped from the aggregation
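Steps 2–3 reduce to ordering buckets by their aggregate and keeping the first `K`; a minimal Go sketch (`topK` is an illustrative helper, not the plugin's implementation):

```go
package main

import (
	"fmt"
	"sort"
)

// topK returns the names of the K buckets with the largest aggregate,
// assuming the per-bucket aggregation has already been computed.
func topK(agg map[string]float64, k int) []string {
	names := make([]string, 0, len(agg))
	for n := range agg {
		names = append(names, n)
	}
	// Order buckets by aggregate, descending.
	sort.Slice(names, func(i, j int) bool { return agg[names[i]] > agg[names[j]] })
	if k > len(names) {
		k = len(names)
	}
	return names[:k]
}

func main() {
	agg := map[string]float64{"Xorg": 7.29, "ibus-engine-simple": 0, "chrome": 15.1}
	fmt.Println(topK(agg, 2))
}
```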
## Configuration
```toml
[[processors.topk]]
@ -60,18 +61,18 @@ Notes:
# add_aggregate_fields = []
```
### Tags:
### Tags
This processor does not add tags by default, but the setting `add_groupby_tag` will add a tag if set to anything other than ""
### Fields:
### Fields
This processor does not add fields by default, but the settings `add_rank_fields` and `add_aggregate_fields` will add one or several fields if set to anything other than ""
### Example
**Config**
Below is an example configuration:
```toml
[[processors.topk]]
period = 20
@ -80,7 +81,8 @@ This processor does not add fields by default. But the settings `add_rank_fields
fields = ["cpu_usage"]
```
**Output difference with topk**
Output difference with topk:
```diff
< procstat,pid=2088,process_name=Xorg cpu_usage=7.296576662282613 1546473820000000000
< procstat,pid=2780,process_name=ibus-engine-simple cpu_usage=0 1546473820000000000

View File

@ -4,7 +4,7 @@ You can use the `unpivot` processor to rotate a multi field series into single v
To perform the reverse operation use the [pivot] processor.
### Configuration
## Configuration
```toml
[[processors.unpivot]]
@ -14,7 +14,7 @@ To perform the reverse operation use the [pivot] processor.
value_key = "value"
```
### Example
## Example
```diff
- cpu,cpu=cpu0 time_idle=42i,time_user=43i
@ -23,4 +23,3 @@ To perform the reverse operation use the [pivot] processor.
```
[pivot]: /plugins/processors/pivot/README.md