Mutate string processors
The `substitute_string` processor matches a key's value against a regular expression (regex) and replaces all matches with a replacement string. You can configure the `substitute_string` processor with the following options.
Usage
To get started, create the following `pipeline.yaml` file:
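The example pipeline configuration is missing from this section. A minimal sketch consistent with the before-and-after output shown below, modeled on the parallel examples on this page (the exact path and `from`/`to` values are assumptions), is:

```yaml
# Hypothetical reconstruction: replace every ":" in the "message" key
# with "-", matching the before/after example that follows.
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - substitute_string:
        entries:
          - source: "message"
            from: ":"
            to: "-"
  sink:
    - stdout:
```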
Next, create a log file named `logs_json.log`. After that, replace the `path` of the file source in your `pipeline.yaml` file with your file path. For more detailed information, see Configuring Data Prepper.
Before you run Data Prepper, the source appears in the following format:

```json
{"message": "ab:cd:ab:cd"}
```

After you run Data Prepper, the source is converted to the following format:

```json
{"message": "ab-cd-ab-cd"}
```
`from` defines which string is replaced, and `to` defines the string that replaces the `from` string. In the preceding example, the string `ab:cd:ab:cd` becomes `ab-cd-ab-cd`. If the `from` regex does not return a match, the key is returned without any changes.
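As an illustration only (a Python sketch, not Data Prepper code), the `from`/`to` pair behaves like an ordinary regex substitution:

```python
import re

# The "from" pattern is applied as a regex; every match is replaced
# with the "to" string.
event = {"message": "ab:cd:ab:cd"}
event["message"] = re.sub(r":", "-", event["message"])
print(event)  # {'message': 'ab-cd-ab-cd'}

# If the pattern does not match, the value is returned unchanged.
print(re.sub(r"xyz", "-", "ab:cd"))  # ab:cd
```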
The `split_string` processor splits a field into an array using a delimiter character.
Configuration
You can configure the `split_string` processor with the following options.
To get started, create the following `pipeline.yaml` file:
```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - split_string:
        entries:
          - source: "message"
            delimiter: ","
  sink:
    - stdout:
```
Next, create a log file named `logs_json.log`. After that, replace the `path` in the file source of your `pipeline.yaml` file with your file path. For more detailed information, see Configuring Data Prepper.
Before you run Data Prepper, the source appears in the following format:

```json
{"message": "hello,world"}
```

After you run Data Prepper, the source is converted to the following format:

```json
{"message": ["hello", "world"]}
```
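For illustration only (a Python sketch, not part of the pipeline), the transformation is equivalent to splitting the value on the configured delimiter:

```python
# split_string semantics: split the value of the source key on the
# delimiter, producing an array.
event = {"message": "hello,world"}
event["message"] = event["message"].split(",")
print(event)  # {'message': ['hello', 'world']}
```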
The `uppercase_string` processor converts the value (a string) of a key from its current case to uppercase.
Configuration
You can configure the `uppercase_string` processor with the following options.
Usage
To get started, create the following `pipeline.yaml` file:
```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - uppercase_string:
        with_keys:
          - "uppercaseField"
  sink:
    - stdout:
```
Next, create a log file named `logs_json.log`. After that, replace the `path` in the file source of your `pipeline.yaml` file with the correct file path. For more detailed information, see Configuring Data Prepper.
Before you run Data Prepper, the source appears in the following format:

```json
{"uppercaseField": "hello"}
```

After you run Data Prepper, the source is converted to the following format:

```json
{"uppercaseField": "HELLO"}
```
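As a Python illustration only (not Data Prepper code), the processor's effect on a configured key is equivalent to a simple uppercase conversion:

```python
# uppercase_string semantics: convert the value of each key listed in
# with_keys to uppercase.
event = {"uppercaseField": "hello"}
event["uppercaseField"] = event["uppercaseField"].upper()
print(event)  # {'uppercaseField': 'HELLO'}
```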
The `lowercase_string` processor converts a string to lowercase.
Usage
To get started, create the following `pipeline.yaml` file:
```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - lowercase_string:
        with_keys:
          - "lowercaseField"
  sink:
    - stdout:
```
Next, create a log file named `logs_json.log`. After that, replace the `path` in the file source of your `pipeline.yaml` file with the correct file path. For more detailed information, see Configuring Data Prepper.
Before you run Data Prepper, the source appears in the following format:

```json
{"lowercaseField": "TESTmessage"}
```

After you run Data Prepper, the source is converted to the following format:

```json
{"lowercaseField": "testmessage"}
```
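As a Python illustration only (not Data Prepper code), the effect is equivalent to a lowercase conversion of the configured key's value:

```python
# lowercase_string semantics: convert the value of each key listed in
# with_keys to lowercase.
event = {"lowercaseField": "TestMessage"}
event["lowercaseField"] = event["lowercaseField"].lower()
print(event)  # {'lowercaseField': 'testmessage'}
```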
The `trim_string` processor removes whitespace from the beginning and end of a key's value.
Configuration
You can configure the `trim_string` processor with the following options.
To get started, create the following `pipeline.yaml` file:
```yaml
pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - trim_string:
        with_keys:
          - "trimField"
  sink:
    - stdout:
```
Next, create a log file named `logs_json.log`. After that, replace the `path` in the file source of your `pipeline.yaml` file with the correct file path. For more detailed information, see Configuring Data Prepper.
Before you run Data Prepper, the source appears in the following format:

```json
{"trimField": " Space Ship "}
```

After you run Data Prepper, the source is converted to the following format:

```json
{"trimField": "Space Ship"}
```
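As a Python illustration only (not Data Prepper code), the trim is equivalent to stripping leading and trailing whitespace:

```python
# trim_string semantics: strip whitespace from both ends of the value
# of each key listed in with_keys.
event = {"trimField": " Space Ship "}
event["trimField"] = event["trimField"].strip()
print(event)  # {'trimField': 'Space Ship'}
```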