19 Sep 2025 05:27 PM (last edited on 22 Sep 2025 07:50 AM by MaciejNeumann)
Now, I'm not talking about extracting metrics from a log or an event. I know how to do that very well.
What I'm talking about is that, in OpenPipeline, you can create pipelines for Metrics and apply processors to them (for example, to add, remove, or change dimensions or values).
What I need to understand is how that should be done in OpenPipeline.
With a logs pipeline, for example, if you add a DQL processor, the pipeline already supplies the beginning of the DQL query for you:
fetch logs
| filter <whatever your match condition is>
|
and, in the DQL box, you enter only what you would put after that final |.
But you don't fetch metrics in DQL; you use the timeseries command. So, if I want to add a new dimension based on the value of an existing metric dimension: what do I need to put in the Match condition to make sure I'm processing the desired metric, and what do I need to put in the DQL box to read the existing dimension values and add the new dimension?
Say I have a metric called:
metric_a
And it has a dimension called order_type with possible values of "M" and "L".
I want to add a new dimension called order_description that has a value of "Market Order" where order_type = "M" and "Limit Order" where order_type = "L".
I know how to do that in Notebooks DQL: use the timeseries command with metric_a, include the order_type dimension in the by:{} clause, and then use a fieldsAdd command with an if clause.
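For reference, the Notebooks query I'm describing looks roughly like this (avg() is just an example aggregation, not important to the question):

```dql
timeseries avg(metric_a), by: { order_type }
| fieldsAdd order_description = if(order_type == "M", "Market Order",
    else: if(order_type == "L", "Limit Order"))
```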
But, since I don't know what OpenPipeline has already done ahead of time for me, I don't know where I'm picking up, for both the filter and the actual adding of the dimension...
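If I had to guess, I'd try something like the following, but this is purely a guess on my part: I don't know whether metric.key is the right field for the matching condition, or whether the DQL box operates on single metric records (so that no timeseries prefix is needed):

```dql
// Match condition (metric.key is an assumed field name, purely illustrative):
matchesValue(metric.key, "metric_a")

// DQL box (assuming the processor sees individual metric records
// and order_type is available as a plain field):
fieldsAdd order_description = if(order_type == "M", "Market Order",
    else: if(order_type == "L", "Limit Order"))
```

If someone can confirm or correct either half of that guess, that's exactly what I'm after.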
Can somebody share an example of an OpenPipeline processor for Metrics that adds a new dimension based on an existing metric dimension?