
Calculating a Rolling Summary in DQL

pmurray30
Observer

Hello,

I have the following DQL query, which returns a time series of the metric "incoming count in last minute" from a custom extension/generic db query.

timeseries query = sum(custom.db.query), interval:1m,
  filter: { in(dt.entity.custom_device, classicEntitySelector("type(custom_device),entityName.equals(\"Incoming count in last 1 mins\")")) },
  by: { dt.entity.custom_device }

What I'm after is, for every minute, the incoming count over the last 60 minutes, i.e. a rolling 60-minute total recalculated every minute.

I could define another db query and metric to calculate this, but we're currently having performance/capacity issues, so I really don't want to add another custom db query when I should be able to derive the answer from the metric I already have.

P.S. Using bins or increasing the interval doesn't give me the granularity I need to compare against existing charts.

Thanks

 

2 REPLIES

ChadTurner
DynaMight Legend

@pmurray30 - were you able to get a query for this, or a solution? 

-Chad

Hi @ChadTurner,

At the time, no, but with another year of experience behind me, it took me a couple of hours to come up with this:

 

// Current cumulative reading per minute; start() captures each bucket's start timestamp for alignment
timeseries { QueryRslt = Max(custom.db.query), timestamp = start() }, by: { dt.entity.custom_device },
  interval: 1m,
  filter: { in(dt.entity.custom_device,
    classicEntitySelector("type(custom_device),entityName.equals(\"Incremental Count\")")) }
// Key each data point by its offset (in seconds) from the start of the series
| fieldsAdd key = tolong(timestamp[] - arrayFirst(timestamp)) / 1000000000
// Join the same series shifted back 10 minutes, keyed the same way, so each point lines up with its earlier value
| join on: { key },
  [ timeseries { QueryRslt = Max(custom.db.query), timestamp = start() }, by: { dt.entity.custom_device },
      interval: 1m, shift: -10m,
      filter: { in(dt.entity.custom_device,
        classicEntitySelector("type(custom_device),entityName.equals(\"Incremental Count\")")) }
    | fieldsAdd key = tolong(timestamp[] - arrayFirst(timestamp)) / 1000000000 ]
// Difference between the current and shifted cumulative counts = count accrued over the shift window
| fieldsAdd delta = QueryRslt[] - right.QueryRslt[]
| fields timeframe, interval, delta

 


This works with an incremental (cumulative) count, but it should be adaptable to a per-minute reading by adding more joins and summing the shifted readings.
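
For the 60-minute total from the original question, the same shift-and-subtract pattern should work by shifting the joined series back a full hour instead of 10 minutes. A minimal, untested sketch, assuming the count is exposed as a cumulative counter (the rolling60m field name is just illustrative):

timeseries { QueryRslt = Max(custom.db.query), timestamp = start() }, by: { dt.entity.custom_device },
  interval: 1m,
  filter: { in(dt.entity.custom_device,
    classicEntitySelector("type(custom_device),entityName.equals(\"Incremental Count\")")) }
| fieldsAdd key = tolong(timestamp[] - arrayFirst(timestamp)) / 1000000000
| join on: { key },
  [ timeseries { QueryRslt = Max(custom.db.query), timestamp = start() }, by: { dt.entity.custom_device },
      interval: 1m, shift: -60m,
      filter: { in(dt.entity.custom_device,
        classicEntitySelector("type(custom_device),entityName.equals(\"Incremental Count\")")) }
    | fieldsAdd key = tolong(timestamp[] - arrayFirst(timestamp)) / 1000000000 ]
// Count accrued over the last 60 minutes, recalculated every minute
| fieldsAdd rolling60m = QueryRslt[] - right.QueryRslt[]
| fields timeframe, interval, rolling60m

The granularity stays at one minute, so it should still line up with the existing charts.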

Maybe there is a cleaner way of doing it, but at the time I ended up just adding another custom query.
