03 Nov 2022 04:04 AM - last edited on 03 Nov 2022 09:47 AM by MaciejNeumann
I have a custom logfile and want to use log processing, but it doesn't split the log lines as it should. Is there a way to tell the system to do that?
The log is JSON-formatted, so every {} is one log line.
example of the content:
"content": "{\"servertime\":\"2022-11-02T23:44:48.3637831-04:00\",\"action\":\"Start US\",\"region\":\"US\",\"time\":0}{\"region\":\"US\",\"action\":\"Start US\",\"time\":0,\"servertime\":\"2022-11-02T23:45:17.8491586-04:00\"}{\"region\":\"US\",\"action\":\"Start US\",\"time\":0,\"servertime\":\"2022-11-02T23:46:00.7540443-04:00\"}"
03 Nov 2022 08:05 AM
Hi mferstl,
It all depends on how you parse your JSON with all its attributes and fields.
This is a useful link: https://www.dynatrace.com/support/help/how-to-use-dynatrace/log-monitoring/acquire-log-data/log-proc...
You can use, for instance:
PARSE(content,"JSON:parsedJson")
| FIELDS_ADD (resultDescription:parsedJson["resultDescription"])
| PARSE(resultDescription,"SPACE? JSON:resultDescriptionFormated")
or
USING(INOUT content)
| PARSE(content,"JSON:parsedJson")
| FIELDS_ADD (resultDescription:parsedJson["resultDescription"])
to avoid the character limitation.
Hope it helps,
Thanks
03 Nov 2022 08:59 AM - edited 03 Nov 2022 09:22 AM
I tried both; it seems it only takes the first JSON object and ignores the others. Is there a way to load each JSON object into an array and loop through the array?
Maybe it takes longer to take effect; I will monitor for a longer time and then give feedback.
Thanks a lot for the moment.
03 Nov 2022 09:29 AM - last edited on 23 Mar 2023 11:32 AM by MaciejNeumann
You can now try to concatenate like this:
FIELDS_ADD(content: attribute1 + attribute2)
Log processing functions - String | Dynatrace Docs
Moreover, you can parse out the other attributes, for example the region and action values from the example log line, like this:
PARSE(content, "LD '\"region\":' SPACE? DQS:region")
or with the JSON matcher:
PARSE(content,"JSON{STRING:region, STRING:action}:parsed")
| FIELDS_ADD(region: parsed["region"], action: parsed["action"])
| FIELDS_REMOVE(parsed)
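As an aside, the same per-attribute extraction can be illustrated in plain Python with regular expressions. This is a rough, illustrative analogue only, not Dynatrace syntax; the region and action key names come from the example log line in the question:

```python
import re

# One log line from the example in the question.
line = ('{"region":"US","action":"Start US","time":0,'
        '"servertime":"2022-11-02T23:45:17.8491586-04:00"}')

# Grab the quoted value that follows each key, similar in spirit to
# matching a literal key and then a double-quoted string (DQS).
region = re.search(r'"region":"([^"]*)"', line).group(1)
action = re.search(r'"action":"([^"]*)"', line).group(1)
print(region, action)  # US Start US
```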
03 Nov 2022 10:03 AM
No, it's still happening.
03 Nov 2022 10:41 AM
It seems to work; let's hope it continues to do so.
22 Jan 2023 10:28 AM
Hi,
Automatic parsing is done when Dynatrace finds a timestamp inside the log line; I believe yours is the servertime field. You should add a timestamp configuration for your logs and the splitting will then happen automatically. If the log lines are very long and you can't control where the timestamp will be in the string, you can also configure how far into each line Dynatrace looks to identify the timestamp.
Everything suggested in the previous comments only takes effect after the JSON has entered Dynatrace, so first make sure the timestamp is defined correctly; otherwise you might be missing some lines.
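To sanity-check this outside Dynatrace, a small illustrative Python snippet can confirm that the servertime values are ISO-8601 timestamps with a UTC offset and show how far into the line they sit (the pattern below is an assumption based on the example values, not anything Dynatrace-specific):

```python
import re

# One log line from the example in the question.
line = ('{"servertime":"2022-11-02T23:44:48.3637831-04:00",'
        '"action":"Start US","region":"US","time":0}')

# ISO-8601 date-time with optional fractional seconds and a UTC offset,
# matching the shape of the servertime field.
TS_PATTERN = re.compile(r'\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?[+-]\d{2}:\d{2}')

m = TS_PATTERN.search(line)
print(m.group(0))  # 2022-11-02T23:44:48.3637831-04:00
print(m.start())   # 15 -> the timestamp starts 15 characters into the line
```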