
Log processing not splitting lines

mferstl
Participant

I have a custom log file and want to use log processing, but it doesn't split the log lines as it should. Is there a way to tell the system to do that?

The log is JSON formatted, so every {} is one log line.
Example of the content:
"content": "{\"servertime\":\"2022-11-02T23:44:48.3637831-04:00\",\"action\":\"Start US\",\"region\":\"US\",\"time\":0}{\"region\":\"US\",\"action\":\"Start US\",\"time\":0,\"servertime\":\"2022-11-02T23:45:17.8491586-04:00\"}{\"region\":\"US\",\"action\":\"Start US\",\"time\":0,\"servertime\":\"2022-11-02T23:46:00.7540443-04:00\"}"

6 REPLIES

uros_djukic1
Dynatrace Mentor

Hi mferstl,

It all depends on how you parse your JSON with its attributes and fields.
This is a useful link: https://www.dynatrace.com/support/help/how-to-use-dynatrace/log-monitoring/acquire-log-data/log-proc...

You can use, for instance:

PARSE(content,"JSON:parsedJson")
| FIELDS_ADD(resultDescription:parsedJson["resultDescription"])
| PARSE(resultDescription,"SPACE? JSON:resultDescriptionFormated")

or

USING(INOUT content)
| PARSE(content,"JSON:parsedJson")
| FIELDS_ADD(resultDescription:parsedJson["resultDescription"])

to avoid the character limitation.

Hope it helps,
Thanks

mferstl
Participant

 

I tried both, but it seems it only takes the first JSON and ignores the others. Is there a way to load each JSON into an array and loop through it?
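Outside of Dynatrace, the "load each JSON into an array" idea can be sketched in Python with `json.JSONDecoder.raw_decode`, e.g. as a preprocessing step before the log is ingested (a hypothetical helper, not a Dynatrace feature):

```python
import json

def split_concatenated_json(raw: str) -> list[dict]:
    """Split a string of back-to-back JSON objects into a list of dicts."""
    decoder = json.JSONDecoder()
    objects, pos = [], 0
    while pos < len(raw):
        # raw_decode parses one JSON value and reports the index where it ended
        obj, end = decoder.raw_decode(raw, pos)
        objects.append(obj)
        pos = end
        # skip any whitespace between objects
        while pos < len(raw) and raw[pos].isspace():
            pos += 1
    return objects

line = ('{"region":"US","action":"Start US","time":0}'
        '{"region":"CN","action":"Start CN","time":0}')
print(split_concatenated_json(line))  # one dict per JSON object
```

A preprocessing step like this could emit each object as its own line before shipping the log; within Dynatrace itself, log processing rules only operate on the content as it was stored.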

 

Maybe it takes longer to take effect; I will monitor for a while longer and then give feedback.

 

Thanks a lot for the moment.

 

You can now try to concatenate like this:
FIELDS_ADD(content: attribute1 + attribute2)
Log processing functions - String | Dynatrace Docs

Moreover, you can parse out the other attributes in this manner:

PARSE(content, "LD '\"US\":' SPACE? DQS:US")

or extract attributes such as "US\",\"action\" with:

PARSE(content,"JSON{STRING:US, STRING:action}:parsed")
| FIELDS_ADD(US: parsed[US], action: parsed[action])
| FIELDS_REMOVE(parsed)

mferstl
Participant

No, it's still happening:

Message:
{"region":"CN","servertime":"2022-11-03T05:50:07.1373187-04:00","action":"Start CN","time":0} {"time":291.5652348,"region":"ROW","servertime":"2022-11-03T05:50:09.5621787-04:00","action":"ROW total runtime"}

Other attributes:
action: Start CN
process.technology: .NET, CLR
region: CN
time: 0

mferstl
Participant

Seems to work, let's hope it continues to do so.

gilgi
DynaMight Champion

Hi, 

Automatic line splitting is done when Dynatrace finds a timestamp inside the log line; I believe in your case that is the servertime field. If you add a timestamp configuration for your logs, the splitting happens automatically. If the log lines are very long and you can't control where the timestamp will be in the string, you can also control the number of strings examined in each row to identify the field.

All the previous suggestions only apply after the JSON has entered Dynatrace, so first make sure the timestamp is defined correctly; otherwise you might be missing some lines.
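As a quick sanity check on the data above, counting servertime timestamps in a stored log line tells you how many records were merged into it (a hypothetical offline check in Python, not a Dynatrace API):

```python
import re

# One ISO-8601 "servertime" value appears per JSON record, so counting the
# matches reveals how many records were concatenated into one stored line.
TIMESTAMP = re.compile(r'\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}[\d.]*[+-]\d{2}:\d{2}')

def record_count(line: str) -> int:
    return len(TIMESTAMP.findall(line))

line = ('{"region":"CN","servertime":"2022-11-03T05:50:07.1373187-04:00","time":0}'
        ' {"time":291.5,"region":"ROW","servertime":"2022-11-03T05:50:09.5621787-04:00"}')
print(record_count(line))  # → 2; a correctly split line would report 1
```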
