17 Feb 2026 10:41 PM
We have a log metric configured in classic settings with a very high number of dimensions (tuples). The "dt.sfm.server.metrics.rejections" metric gives the following error for the log metric we configured:
"Couldn't save ingested data. This metric key has reached the maximum number of tuples for a single metric for the last 30 days."
Do I understand correctly that this means we are only ingesting data for tuples (a tuple being a unique combination of dimension values for a metric, correct?) that were already ingested before we hit the limit?
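To make sure I'm using the term correctly, here is a small Python illustration (not Dynatrace code, just my understanding) of what I mean by a tuple: each unique combination of dimension values counts once toward the limit, no matter how many data points carry it.

```python
# Hypothetical ingested data points for one metric key.
data_points = [
    {"host": "web-1", "status": "200"},
    {"host": "web-1", "status": "500"},
    {"host": "web-2", "status": "200"},
    {"host": "web-1", "status": "200"},  # repeat of an existing tuple
]

# Each unique dimension-value combination is one "tuple".
unique_tuples = {tuple(sorted(p.items())) for p in data_points}
print(len(unique_tuples))  # 3 distinct tuples from 4 data points
```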
According to the documentation, the limit for dimension tuples is 1 million for classic metrics and unlimited (excluding highly volatile dimensions) on Grail. I didn't think the classic metric limit applied since we do use Grail, but is that incorrect?
Does the log metric have to be configured in OpenPipeline in order to benefit from the unlimited dimensions on Grail? And while there are a large number of tuples for our log metric, I don't believe this to be highly volatile as they describe.
I've also created a DQL query to attempt to count the number of unique tuples within the timeframe of the query and it's showing a cumulative count of around 1.7 million by the end of the last 30 days, which is higher than the 1 million classic limit. So why are we getting that error message? Is data really getting dropped?
Here is the DQL query I mentioned for reference:
timeseries {count = sum(<log metric key>)}, by:{<split by all available dimensions>}, interval: 6h
// get the index of the first occurrence of each tuple
| fieldsAdd first_index = arrayIndexOf(count, arrayFirst(count))
// only count the first occurrence
| fields timeframe, count = if(iIndex() == first_index, count[]/count[]), interval
// count of all new tuples over time
| summarize {timeframe = takeFirst(timeframe), dimension_count = sum(count[]), interval = takeFirst(interval)}
// add cumulative count over time and total count
| fieldsAdd cumulative_dimension_count = arrayCumulativeSum(dimension_count), total = arraySum(dimension_count)
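For anyone checking my logic, here is a hedged Python sketch of what the query is meant to do (the data and dimension names are made up for illustration): for each tuple's per-interval series, count the tuple only at the index of its first non-null value, then sum those per interval and take the cumulative sum.

```python
from itertools import accumulate

# Hypothetical per-tuple series over four 6h intervals (None = no data),
# standing in for the timeseries output the DQL query operates on.
series_by_tuple = {
    ("web-1", "200"): [5, None, 2, 1],
    ("web-1", "500"): [None, 3, None, None],
    ("web-2", "200"): [None, None, 4, 4],
}

n_intervals = 4
new_tuples_per_interval = [0] * n_intervals
for counts in series_by_tuple.values():
    # analogous to arrayIndexOf(count, arrayFirst(count)):
    # index of the first non-null value in the series
    first_index = next(i for i, v in enumerate(counts) if v is not None)
    new_tuples_per_interval[first_index] += 1  # count each tuple once

cumulative = list(accumulate(new_tuples_per_interval))
print(new_tuples_per_interval)  # [1, 1, 1, 0]
print(cumulative)               # [1, 2, 3, 3] -> last value = total unique tuples
```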
05 Mar 2026 07:37 PM
I ended up opening a support case for this. Here's what we found: