23 Feb 2026 10:00 AM - last edited on 24 Feb 2026 07:33 AM by MaciejNeumann
Hi,
We have a dashboard with a column named "content". Each row of that column holds one JSON log line like this: {"level":"INFO","message":"hello world","ts":"2026-01-13T08:50:37","mac_id":"server1"}
The dashboard is built from log ingestion, so we cannot open the tile in data explorer mode. The requirement is to split that JSON line into separate columns: "level" should become a new column whose first row has the value INFO, "message" should become a new column whose first row has the value "hello world", and so on through "mac_id", repeated for every row. I have already gone to Settings > Log monitoring > Custom attributes and defined these attributes there, but it does not work, because to Dynatrace they are not attributes, just content. As far as I can tell, I need a log processing rule here, consisting of a matcher expression and a processor definition. Please check and help; I need these rules.
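Put another way, the transformation I am after is equivalent to this plain Python sketch (just to illustrate; the field names are from my log lines, nothing Dynatrace-specific here):

```python
import json

# One raw "content" cell per dashboard row, each holding a JSON log line.
rows = [
    '{"level":"INFO","message":"hello world","ts":"2026-01-13T08:50:37","mac_id":"server1"}',
]

# Parse each row so that every JSON key becomes its own column.
columns = [json.loads(row) for row in rows]

print(columns[0]["level"])    # INFO
print(columns[0]["message"])  # hello world
print(columns[0]["mac_id"])   # server1
```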
23 Feb 2026 10:24 AM
@jasper23 I suppose you use Dynatrace Managed with Log Monitoring Classic, not Dynatrace SaaS, where this is accomplished differently. In that case you need a log processing rule, which you have already figured out.
Here is an example for parsing JSON: https://docs.dynatrace.com/managed/shortlink/log-monitoring-log-processing-examples#lpexample4
As for the matcher, it depends on your log source and which logs you want the rule to apply to. You don't necessarily need to define log attributes if you just want to see the fields in the table.
23 Feb 2026 12:07 PM
Thanks, Julius, for replying.
I think I forgot to mention a few things.
This is SaaS, not Managed. The Dynatrace version is 1.332.56.20260219-200703, but we are not on the new Dynatrace.
I had already gone to the link you posted: https://docs.dynatrace.com/managed/shortlink/log-monitoring-log-processing-examples#lpexample4
Something is not right, though, and I am seeing errors. I need help with that.
Example 4 in that documentation shows a sample log and this processing rule:
PARSE(content, "
JSON{
STRING:stringField,
JSON {STRING:nestedStringField1}:nested
}:parsedJson")
| FIELDS_ADD(top_level.attribute1: parsedJson["stringField"], top_level.attribute2: parsedJson["nested"]["nestedStringField1"])
| FIELDS_REMOVE(parsedJson)
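For my own understanding, that rule boils down to roughly this Python sketch (the names are taken from the documentation example, not from my data):

```python
import json

# The sample record from example 4: content is a string containing JSON.
record = {
    "content": '{"intField": 13, "stringField": "someValue", '
               '"nested": {"nestedStringField1": "someNestedValue1", '
               '"nestedStringField2": "someNestedValue2"}}'
}

# PARSE(content, "JSON{...}:parsedJson") - parse the content string as JSON.
parsed_json = json.loads(record["content"])

# FIELDS_ADD(...) - promote selected values to top-level attributes.
record["top_level.attribute1"] = parsed_json["stringField"]
record["top_level.attribute2"] = parsed_json["nested"]["nestedStringField1"]

# FIELDS_REMOVE(parsedJson) - the intermediate object is then dropped.
print(record["top_level.attribute1"])  # someValue
print(record["top_level.attribute2"])  # someNestedValue1
```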
The use case shown in example 4 is not exactly mine, but taking inspiration from that processing rule, I entered something like this in the processor definition field:
PARSE(content, "
JSON{
STRING:level,
}:parsedJson")
| FIELDS_ADD(level_code: parsedJson["level"])
| FIELDS_REMOVE(parsedJson)
When I put the above in the processor definition, I get the error below and the "Test the rule" button is greyed out.
Screenshot attached.
So, two things here:
1.) The matcher expression surely needs an update: dt.entity.host = "HOST-xxxxxxxxx" (I suspect something more belongs here).
2.) The error shown above needs to go away. The processor definition expression is evidently not correct, even though I took my inspiration from example 4 at the link. See attachment.
I have already spent 7-8 hours on this in total.
23 Feb 2026 12:22 PM
Thanks, Julius, for replying. This is SaaS, but we are not on the new Dynatrace.
Yes, there is no option but to parse this.
I already looked at the link you gave last week and tried to use example 4 from it, though the use case is not entirely the same as mine.
Specifically, I used this part of example 4:
{
"content": "{"intField": 13, "stringField": "someValue", "nested": {"nestedStringField1": "someNestedValue1", "nestedStringField2": "someNestedValue2"} }"
}
Processing rule:
PARSE(content, "
JSON{
STRING:stringField,
JSON {STRING:nestedStringField1}:nested
}:parsedJson")
| FIELDS_ADD(top_level.attribute1: parsedJson["stringField"], top_level.attribute2: parsedJson["nested"]["nestedStringField1"])
| FIELDS_REMOVE(parsedJson)
I replicated the above in my GUI, without the nested part (no nesting is needed in my case).
My log content, as shown on the processing rule page under the "log sample" field, is:
{
"event.type": "LOG",
"content": "{\"level\":\"INFO\",\"message\":\"hello world...\",\"ts\":\"2025-12-14T07:50:37.084703142Z\",\"mac_id\":\"server1\"}",
"status": "INFO",
"timestamp": "1771846247556",
"loglevel": "INFO",
"log.source": "/opt/dynatrace/logs/*.log",
"dt.entity.host": "HOST-xxxxxxxx",
"dt.smartscape.host": "HOST-xxxxxxxxx"
}
First question: why has it put backslashes into the content shown under the log sample field?
That is, this: "content": "{\"level\":\"INFO\",\"message\":\"hello world...\",\"ts\":\"2025-12-14T07:50:37.084703142Z\",\"mac_id\":\"server1\"}",
The original JSON log file's content column does not have these backslashes. Does it make a difference?
Then:
The matcher expression (automatically created by Dynatrace) is dt.entity.host = "HOST-xxxxxxxx". Yes, this is the correct host the logs were fetched from; I reached this page by selecting the log line and clicking the "Create processing rule" button.
I feel the matcher expression is not complete yet; something more needs to go here.
Taking inspiration from example 4, I pasted the following into the "processor definition" field:
PARSE(content, "
JSON{
STRING:level,
}:parsedJson")
| FIELDS_ADD(level_code: parsedJson["level"])
| FIELDS_REMOVE(parsedJson)
And I get the error below.
Screenshot attached as well. Kindly help; I feel the expression is not correct.
23 Feb 2026 12:23 PM
Replied above, please check.
24 Feb 2026 12:36 PM
This looks like a trailing comma in your JSON pattern (STRING:level,). In DPL, the last JSON member must not end with a comma, which matches the “mismatched input '}'” error.
Try this:
PARSE(content, "JSON{ STRING:level }:parsedJson")
| FIELDS_ADD(level_code: parsedJson["level"])
| FIELDS_REMOVE(parsedJson)
The backslashes you see in the sample are expected: the UI renders the full log record as JSON, and content is a string containing JSON.
I couldn't verify it directly in your environment, but IMO the trailing comma is the root cause, and removing it should make the rule testable.
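To illustrate the backslashes point: when the whole record is rendered as JSON, the inner quotes of the content string have to be escaped. A minimal Python illustration (field values shortened for brevity):

```python
import json

# content is a string whose value happens to be JSON itself.
record = {"content": '{"level":"INFO","mac_id":"server1"}'}

# Rendering the whole record as JSON escapes the inner quotes,
# which is exactly what the "log sample" field in the UI shows.
print(json.dumps(record))
# {"content": "{\"level\":\"INFO\",\"mac_id\":\"server1\"}"}

# Parsing it back removes the escapes again, so nothing is lost.
assert json.loads(json.dumps(record))["content"] == record["content"]
```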