27 Nov 2023 09:30 AM
Has anyone noticed this:
I created a log processing rule to remove parts of the content and keep only the relevant data (remove everything after '[ERROR]' from the content and store it in a new field, "message").
The rule works fine when tested, but when applied at runtime it cuts off parts of the content.
Here, querying Grail after the processing has been applied (note the cut-off "message" field):
Are there any settings/limit I should be aware of for such cleanup rules?
29 Nov 2023 06:24 PM
Hi @r_weber
the LD matcher by default matches up to the first 4096 characters or until the next line break. You might have a stray line break in your content, so I would suggest going with the DATA matcher instead and specifying a higher character limit.
I would also suggest trying the pattern in Notebooks first to see whether it behaves the same way as in the processing definition:
fetch logs
| filter <your custom filter for the relevant records>
| parse content, """LD
TIMESTAMP('yyyy-mm-dd HH:mm:ss +S', locale='de')
LD ('[Error]'|'[Warning]')
DATA:message"""
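For example, raising the matcher's character limit with a quantifier might look like this in the notebook query (a sketch; the 8192 upper bound is just an example value, pick one that fits your log lines):

fetch logs
| filter <your custom filter for the relevant records>
| parse content, """LD
TIMESTAMP('yyyy-mm-dd HH:mm:ss +S', locale='de')
LD ('[Error]'|'[Warning]')
DATA{1,8192}:message"""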
Best,
Sini
14 Dec 2023 07:44 AM
Hello @r_weber
I've seen the same behavior with a processing rule that parses JSON. When I test the rule everything is OK, but when the log is actually processed, the extracted field seems to be limited to 250 characters.
Have you found any information or a solution?
14 Dec 2023 09:51 AM
Hi @GerardJ ,
I solved it as @sinisa_zubic suggested, by adding a quantifier to the DATA/LD matcher like this:
USING(INOUT content:STRING) |
PARSE(content, "
LD
TIMESTAMP('yyyy-mm-dd HH:mm:ss +S', locale='de')
LD
('[Error]'|'[Warning]')
DATA{1,8192}:content
")
Hope that helps,
Reinhard