23 Apr 2026 04:40 PM - last edited on 24 Apr 2026 07:08 AM by MaciejNeumann
Sometimes your application logs contain timestamps in non-standard formats buried deep inside long, unstructured log lines. The timestamp isn't at the beginning of the line and doesn't follow ISO 8601 with separators, so toTimestamp() can't parse it directly.
Example: Your logs look like this (simplified):
{hostName: app-server-01.example.com,level: WARN,message: PaymentService#validateAccount input: source=batch account=123456,serverId: node1,userId: svc-user,threadName: default task-1024,contextMap: [{traceid:a1b2c3d4e5f6},{correlationId:f47ac10b-58cc-4372-a567-0e02b2c3d479}],applicationName: payments,timestamp: 20260423T122121.131-0300}
The timestamp field 20260423T122121.131-0300 is:
- in ISO 8601 basic format, with no date or time separators
- followed by a numeric UTC offset without a colon (-0300)
- buried near the end of the line rather than at the start
Instead of trying to parse the entire log line, we use a surgical extraction approach:
fetch logs
// ... your filters here ...
| fieldsAdd index = indexOf(content, "timestamp")
| fieldsAdd timestamp_log = substring(content, from:index)
| parse timestamp_log, "LD TIMESTAMP('yyyyMMdd\\'T\\'HHmmss.SSSZ'):parsed_ts LD"
| fieldsAdd timestamp = parsed_ts
| fieldsRemove index, timestamp_log, parsed_ts
Step 1 - Find the position
| fieldsAdd index = indexOf(content, "timestamp")
indexOf returns the character position where "timestamp" starts in the content field. This avoids the need to parse the entire log structure.
Step 2 - Cut the string
| fieldsAdd timestamp_log = substring(content, from:index)
This gives us a short fragment like:
timestamp: 20260423T122121.131-0300}
Now we have a manageable string to parse.
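The index arithmetic behind steps 1 and 2 can be sanity-checked outside DQL. This Python sketch (using a hypothetical, shortened sample line) mirrors indexOf with str.find and substring with slicing:

```python
# Hypothetical simplified log line for illustration
content = "{level: WARN,applicationName: payments,timestamp: 20260423T122121.131-0300}"

# DQL indexOf(content, "timestamp") ~ Python str.find
index = content.find("timestamp")

# DQL substring(content, from:index) ~ Python slicing from that position
timestamp_log = content[index:]
print(timestamp_log)  # timestamp: 20260423T122121.131-0300}
```

The same idea applies regardless of what precedes the timestamp field, as long as the literal "timestamp" appears only once in the line.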
Step 3 - Parse with TIMESTAMP pattern
| parse timestamp_log, "LD TIMESTAMP('yyyyMMdd\\'T\\'HHmmss.SSSZ'):parsed_ts LD"
Breaking down the DPL pattern:
- LD (line data) matches the arbitrary characters before and after the timestamp
- TIMESTAMP('yyyyMMdd\\'T\\'HHmmss.SSSZ') matches the compact value: yyyyMMdd for the date, the escaped \\'T\\' for the literal T separator, HHmmss.SSS for the time with milliseconds, and Z for the numeric offset
- :parsed_ts exports the matched value into the parsed_ts field
Step 4 - Override the timestamp
| fieldsAdd timestamp = parsed_ts
This replaces the record's timestamp field with the correctly parsed value. Dynatrace will now use this timestamp for time-based filtering, sorting, and visualization.
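As a quick cross-check of the format string itself, the same pattern can be expressed with Python's strptime (the %-directives below correspond to yyyyMMdd'T'HHmmss.SSSZ; this is a verification aid, not part of the DQL query):

```python
from datetime import datetime

# %Y%m%d -> yyyyMMdd, literal T, %H%M%S -> HHmmss,
# .%f -> fractional seconds (.SSS), %z -> numeric offset like -0300 (Z)
parsed_ts = datetime.strptime("20260423T122121.131-0300", "%Y%m%dT%H%M%S.%f%z")
print(parsed_ts.isoformat())  # 2026-04-23T12:21:21.131000-03:00
```

If strptime accepts the sample value, the corresponding DPL directives are at least structurally consistent with your data.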
Step 5 - Clean up
| fieldsRemove index, timestamp_log, parsed_ts
Remove the auxiliary fields to keep the result clean.
You can validate this approach without any log data using data record():
data record(content = "{hostName: app-server-01.example.com,level: WARN,message: PaymentService#validateAccount input: source=batch account=123456,serverId: node1,userId: svc-user,threadName: default task-1024,applicationName: payments,timestamp: 20260423T122121.131-0300}")
| fieldsAdd index = indexOf(content, "timestamp")
| fieldsAdd timestamp_log = substring(content, from:index)
| parse timestamp_log, "LD TIMESTAMP('yyyyMMdd\\'T\\'HHmmss.SSSZ'):parsed_ts LD"
| fieldsAdd timestamp = parsed_ts
| fields timestamp, type(timestamp)
Expected result:
| timestamp | type(timestamp) |
| 2026-04-23T12:21:21.131-03:00 | timestamp |
Note: the original -0300 offset is correctly interpreted, so the value resolves to the right underlying UTC instant even though it is displayed with the offset.
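To convince yourself that the -0300 offset is honored, you can compute the equivalent UTC instant in Python (again a cross-check outside DQL):

```python
from datetime import datetime, timezone

# Parse the compact timestamp, then normalize to UTC
local_ts = datetime.strptime("20260423T122121.131-0300", "%Y%m%dT%H%M%S.%f%z")
utc_ts = local_ts.astimezone(timezone.utc)
print(utc_ts.isoformat())  # 2026-04-23T15:21:21.131000+00:00
```

12:21 at UTC-3 is 15:21 UTC, so time-based filtering in Dynatrace will line up with other sources once the offset is parsed correctly.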
If the Z pattern doesn't parse your specific offset format, you can try replacing it with XX (which explicitly matches offsets without colons like -0300):
| parse timestamp_log, "LD TIMESTAMP('yyyyMMdd\\'T\\'HHmmss.SSSXX'):parsed_ts LD"
If the DPL TIMESTAMP matcher gives you trouble, you can manually reconstruct an ISO 8601 string and use toTimestamp():
fetch logs
// ... your filters here ...
| fieldsAdd index = indexOf(content, "timestamp")
| fieldsAdd raw_ts = substring(content, from:index+11, to:indexOf(content, "}", from:index)) // 11 = length of "timestamp: "
| fieldsAdd iso_ts = concat(
substring(raw_ts, from:0, to:4), "-",
substring(raw_ts, from:4, to:6), "-",
substring(raw_ts, from:6, to:8), "T",
substring(raw_ts, from:9, to:11), ":",
substring(raw_ts, from:11, to:13), ":",
substring(raw_ts, from:13)
)
| fieldsAdd timestamp = toTimestamp(iso_ts)
| fieldsRemove index, raw_ts, iso_ts
This transforms 20260423T122121.131-0300 into 2026-04-23T12:21:21.131-0300, which is standard ISO 8601 that toTimestamp() handles natively.
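The slice offsets in the concat above, including the from:index+11 skip over the "timestamp: " prefix, can be verified with plain Python string slicing on a hypothetical sample line:

```python
# Hypothetical simplified log line for illustration
content = "{applicationName: payments,timestamp: 20260423T122121.131-0300}"

index = content.find("timestamp")
# 11 = len("timestamp: ") -- 9 characters plus colon and space
raw_ts = content[index + 11 : content.find("}", index)]

# Re-insert the ISO 8601 separators; position 8 holds the literal 'T',
# so it is skipped when slicing out the hour
iso_ts = (raw_ts[0:4] + "-" + raw_ts[4:6] + "-" + raw_ts[6:8] + "T"
          + raw_ts[9:11] + ":" + raw_ts[11:13] + ":" + raw_ts[13:])
print(iso_ts)  # 2026-04-23T12:21:21.131-0300
```

If your prefix differs (e.g. "timestamp=" instead of "timestamp: "), adjust the +11 offset accordingly.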
This indexOf + substring strategy is useful when:
- the timestamp sits in the middle of a long, unstructured line
- the overall line format is too irregular to parse with a single pattern
- you only need one field and don't want to model the entire record structure