Custom Log ingestion Dynatrace

Piermarco
Frequent Guest

Hi all,

My team and I have successfully processed a custom JSON and sent it to Dynatrace using the log ingestion POST API. We need this data for offline monitoring of an application.

We are now starting to use Grail to create dashboards. However, we've encountered a problem: when the data is ingested into log monitoring, even the numeric values in the JSON are recognized as strings.

Captured POST ingestion: only string detection

This is the passed json example:

 

{
        "userAgent": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
        "language": "it-IT",
        "cookieEnabled": true,
        "name": "xxxxxxx",
        "timestamp": 1718614669096,
        "session": "1718614523449",
        "userName": "xxxxxx",
        "online": false,
        "deviceMemory": 8,
        "hardwareConcurrency": 8,
        "connectionDownlink": 0,
        "connectionType": "4g",
        "connectionRoundTripTime": 0,
        "duration": 1017
    }

 


To address this problem, we could either:

  1. Modify the data at query time using built-in Dynatrace functions like `toLong(...)` or `toDuration(...)`.
  2. Parse it with DPL using the `parse` command and the `JSON:ingest` matcher. This also does not seem to work as we expected (only `duration` is recognised).

Parsing using the parse command for JSON

Our question concerns performance. Every time we use these commands to change a field type or parse our JSON with DPL, it costs us at query time. Is there a way to have our JSON data arrive with the correct field types already set, so we don't have to convert them later and can just use them?
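For context, the custom ingestion described above is a plain HTTP POST to the log ingest endpoint (`/api/v2/logs/ingest`). Below is a minimal Python sketch of how such a request could be built; the tenant URL, the token value, and the sample attributes are placeholders, not the poster's actual setup. Note that the JSON body genuinely carries numbers and booleans; the string typing only appears on the log-monitoring side after ingestion.

```python
import json
import urllib.request

def build_ingest_request(tenant_url, api_token, record):
    """Build (but do not send) a POST request for the Dynatrace log ingest
    API. tenant_url and api_token are placeholders; record is a dict whose
    numeric and boolean values stay JSON numbers/booleans in the body."""
    body = json.dumps([record]).encode("utf-8")
    return urllib.request.Request(
        url=f"{tenant_url}/api/v2/logs/ingest",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Api-Token {api_token}",
            "Content-Type": "application/json; charset=utf-8",
        },
    )

record = {
    "content": "offline monitoring sample",
    "timestamp": 1718614669096,   # epoch milliseconds, sent as a JSON number
    "online": False,
    "deviceMemory": 8,
    "duration": 1017,
}
req = build_ingest_request("https://abc12345.live.dynatrace.com", "dt0c01.XXX", record)
# urllib.request.urlopen(req)  # not executed here: needs a real tenant and token
```

Since the payload itself is correctly typed, any string conversion observed in Grail happens during ingestion, which is why a processing rule (as suggested below) is the place to fix it.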

Thank you for your help.

Piermarco

5 REPLIES

jaume_reverte
Dynatrace Helper

Hello, 

My recommendation is to create a log processing rule. There is no better way to ensure that the data is captured in the form you want. 

This way, the data inside Dynatrace is already the correct one, so you don't need to make changes later. 

Hope this helps. Wish you good monitoring! 

Piermarco
Frequent Guest

Hi,

Where can I find it?

Thanks for the reply.



jaume_reverte
Dynatrace Helper

Hello, 

Here is an example of what you want to perform and the documentation page. 

Piermarco
Frequent Guest

Hi,
As you suggested, I have created a custom rule to capture the logs in Settings > Log Monitoring > Processing.
This is the fetch log:
Fetch log

So I have created a new processing rule:

  1. I used the same matcher as in the log ingester:

matchesValue(TagDynatarce, "Offline_log_ingestion")

  2. And used this rule to pass my parameters with the correct types:

 

USING(
    IN userAgent: STRING?,
    IN language: STRING?, 
    IN cookieEnabled: BOOLEAN?,
    INOUT timestampX: TIMESTAMP,
    IN session: STRING,
    IN online: BOOLEAN?,
    IN deviceMemory: INTEGER?,
    IN hardwareConcurrency: INTEGER?,
    IN connectionDownlink: INTEGER?,
    IN connectionType: STRING,
    IN connectionRoundTripTime: INTEGER?,
    INOUT durations: INTEGER,
    INOUT name: STRING,
    INOUT userName: STRING,
    INOUT TagDynatarce: STRING
)
| FIELDS_RENAME(
    duration: durations,
    actionName: name,
    timestamp: timestampX
)
| FIELDS_REMOVE(durations, name, timestampX, TagDynatarce)

 

  3. After that, I tested it and it is able to find and change the content.

Test the rule
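For illustration, the effect the typed USING declaration is meant to have on a stringly-typed record can be sketched in Python. This is only an approximation of the intended coercion, with field names taken from the rule above; it is not how Dynatrace processes the rule internally.

```python
# Declared types from the USING block above (TIMESTAMP modelled here as
# an epoch-milliseconds int). The conversion logic is an illustration.
SCHEMA = {
    "timestamp": int,
    "online": bool,
    "cookieEnabled": bool,
    "deviceMemory": int,
    "hardwareConcurrency": int,
    "connectionDownlink": int,
    "connectionRoundTripTime": int,
    "duration": int,
}

def coerce(record):
    """Return a copy of record with string values cast per SCHEMA;
    already-typed values are left untouched."""
    out = dict(record)
    for field, typ in SCHEMA.items():
        value = out.get(field)
        if isinstance(value, str):
            if typ is bool:
                out[field] = value.lower() == "true"
            else:
                out[field] = typ(value)
    return out

# Example: attributes that arrived as strings become typed values.
ingested = {"timestamp": "1718614669096", "online": "false", "duration": "1017"}
typed = coerce(ingested)
# typed["timestamp"] == 1718614669096, typed["online"] is False, typed["duration"] == 1017
```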

 

Now, after defining this rule, I expected the JSON to be ingested into Grail with the correct field types, but it does not work.

GRAIL

Am I doing something wrong?

Thanks for your time.

jaume_reverte
Dynatrace Helper

Hello, 

I don't see any error. 

In the last screenshot I can see one new entry with the correct values. The rule only applies to newly ingested logs, not to logs that were ingested before the rule was created.
