Extension metric error


Hello everyone,

Recently I've been editing an existing script that POSTs data into Dynatrace. As you can see in the code below, we insert data from a .csv file. The original code, with its metrics already created in Dynatrace, works fine. However, when I change the timeseriesId name, I get the following error:


"2022-03-04 14:58:12,648 Erro ao incluir o payload: {'error': {'code': 400, 'message': 'Constraints violated.', 'constraintViolations': [{'path': 'series[0].timeseriesId', 'message': 'No configuration found for timeseries id.', 'parameterLocation': 'PAYLOAD_BODY', 'location': None}]}}, Json: {"displayName": "Inclus\u00e3o Proposta", "group": "XXXXXX", "type": "DMPS", "properties": {"u_name_abbreviation": "TZ", "path": "D:\\DynatraceExtension", "exe": "", "ServerExtension": "n2431237"}, "configUrl": "XXXXXX", "tags": ["TZ"], "series": [{"timeseriesId": "custom:motors_dmps_response_time_inc", "dimensions": {}, "dataPoints": "


import pandas
import sys
import os
import json
import requests
import logging
from datetime import datetime
from pathlib import Path
import platform
import numpy as np

HEADER = {'Authorization': 'Api-Token XXXXX', 'Content-Type': 'application/json; charset=utf-8'}
logging.basicConfig(filename='readCSV.log', level=logging.INFO, format='%(asctime)s %(message)s')
directory = Path(r'XXXXXXX')
lockfile_history = 'lastline_history.txt'

def eventValidation(payload):'Metodo eventValidation')
    response ='https://dynatracewebpre.dcbr01.corp/e/XXXXXX/api/v1/events',
                             headers=HEADER, data=json.dumps(payload), verify=False)
    if response.status_code != 200:
        logging.error('Erro ao incluir o payload: %s, Json: %s', response.json(), json.dumps(payload))

def sendMetrics(timeseriesId, dimension, datapoints):'Metodo sendMetrics')
    payload = {
        'displayName': 'Inclusão Proposta',
        'group': 'XXXXXXX',
        'type': 'DMPS',
        'properties': {'u_name_abbreviation': 'TZ', 'path': r'D:\dynatrace', 'exe': '',
                       'ServerExtension': platform.uname()[1]},
        'tags': ['TZ'],
        'series': [{
            'timeseriesId': 'custom:' + timeseriesId,
            'dimensions': dimension,
            'dataPoints': datapoints,
        }],
    }
    response ='https://dynatracewebpre.dcbr01.corp/e/XXXXXXXXXXX/api/v1/entity/infrastructure/custom/dmps_motorsInc',
                             headers=HEADER, data=json.dumps(payload), verify=False)
    if response.status_code != 200:
        logging.error('Erro ao incluir o payload: %s, Json: %s', response.json(), json.dumps(payload))

def cd_situ_prop_flux(group_name, group_filter, group_returncode):'Metodo cd_situ_prop_flux')
    timeseriesId = 'motors_requests_inc'
    dimension = {'CD_ERRO_PROC': str(group_returncode), 'CD_MOTO_PROC': group_name}
    datapoints = []
    data_groups = df[group_filter].groupby('DH_INCL_MOVI')
    for timestamps_group, metrics in data_groups:
        timestamp = str(datetime.strptime(timestamps_group, '%Y-%m-%d-%H:%M').timestamp()).replace('.', '')
        # reconstructed: one [timestamp-ms, count] data point per minute; the original line was lost in the post
        datapoints.append([int(timestamp[0:10] + '000'), len(metrics)])
    sendMetrics(timeseriesId, dimension, datapoints)

def send_responseTime(motor, metric):
    dimension = {}'Metodo send_responseTime, motor:%s', motor)
    if motor == 'ADT':
        timeseriesId = 'motors_dmps_response_time_inc'
        metricColumn = 'SS_RSPT_DMPS'
    if motor == 'RCS':
        timeseriesId = 'motors_cntg_response_time_inc'
        metricColumn = 'SS_RSPT_CNTG'
    datapoints = []
    metricList = metric[['DH_INCL_MOVI', metricColumn]].values.tolist()
    for datametrics in metricList:
        timestamp = str(datetime.strptime(datametrics[0], '%Y-%m-%d-%H:%M').timestamp()).replace('.', '')
        # reconstructed: [timestamp-ms, response-time value]; the original line was lost in the post
        datapoints.append([int(timestamp[0:10] + '000'), datametrics[1]])
    sendMetrics(timeseriesId, dimension, datapoints)

def select_lastFiles():'Metodo select_lastFiles')
    data_criacao = lambda f: f.stat().st_ctime      # unused, kept from the original
    data_modificacao = lambda f: f.stat().st_mtime
    files = directory.glob('*.csv')
    return sorted(files, key=data_modificacao, reverse=True)

def delete_oldfiles(last_files):'Metodo delete_oldfiles')
    if'%Y%m%d') not in last_files:'Deletando arquivos no repositorio')'Deletando Arquivos do dia anterior...')
        for filecsv in os.listdir(directory):
            csvfile = f'{directory}/{filecsv}'
            if csvfile != last_files:
                os.remove(csvfile)  # reconstructed: the deletion itself was lost in the post

def main():
    global df
    try:
        List_last_files = select_lastFiles()  # select the most recent file
        if len(List_last_files) != 0:
  'Iniciando')
            df = pandas.read_csv(List_last_files[0], delimiter=';')
            df['DH_INCL_MOVI_1'] = df['DH_INCL_MOVI']
            df['DH_INCL_MOVI'] = pandas.to_datetime(df['DH_INCL_MOVI'], format='%Y-%m-%d-%H.%M.%S.%f').dt.strftime('%Y-%m-%d-%H:%M')
            df['CD_MOTO_PROC'] = df['CD_MOTO_PROC'].replace(' ', '', regex=True)
            df['CD_MOTO_PROC'] = df['CD_MOTO_PROC'].replace('', 'Unprocessed')
            motor_list = []
            counts = df['CD_MOTO_PROC'].groupby(df['CD_MOTO_PROC']).count()
            percentMotores = (counts / counts.sum()) * 100
            if 'RCS' in percentMotores.index and percentMotores['RCS'] > 3:
                payload = {
                    'eventType': 'ERROR_EVENT',
                    'attachRules': {
                        'entityIds': []  # entity IDs were truncated in the post
                    },
                    'customProperties': {
                        'Mensagem de Erro': 'Execucao no motor de Contingencia Acima de 3% nos ultimos 15min'
                    },
                    'source': 'PropostaMF',
                    'description': 'Baixo desempenho na execucao do motor do ADT',
                    'title': 'Alto volume no Motor Contingencia',
                }
                eventValidation(payload)  # reconstructed: the call was lost in the post
            # dif_incl_envidmps(df)
            for motor_name, metricMotors in df.groupby('CD_MOTO_PROC'):
                if motor_name != 'Unprocessed':
                    motor_list.append(motor_name)  # reconstructed: motor_list is used below
                    send_responseTime(motor_name, metricMotors)  # reconstructed
                    for metricList in metricMotors[['DH_INCL_MOVI', 'CD_PROP', 'DS_MESG_ERRO', 'CD_ERRO_PROC', 'CD_ERRO_SERV']].values.tolist():
                        payload = {
                            'start': int(str(datetime.strptime(metricList[0], '%Y-%m-%d-%H:%M').timestamp()).replace('.', '')[0:10] + '000'),
                            'attachRules': {
                                'entityIds': []  # entity IDs were truncated in the post
                            },
                            'customProperties': {
                                'Mensagem de Erro': metricList[2]
                            },
                            'source': 'PropostaMF',
                            'annotationType': str(metricList[3]),
                            'annotationDescription': metricList[4],
                        }
                        eventValidation(payload)  # reconstructed
            for retcode, datails in df.groupby('CD_ERRO_PROC'):
                for motor in motor_list:
                    filter_group = (df['CD_MOTO_PROC'] == motor) & (df['CD_ERRO_PROC'] == retcode)
                    cd_situ_prop_flux(motor, filter_group, retcode)  # reconstructed
        else:
  'Nao possui arquivos a serem lidos')
        delete_oldfiles(str(List_last_files[0]))  # delete old files'Processo concluido com sucesso!')
    except Exception:
        logging.exception('Unexpected error: %s', sys.exc_info()[0])

if __name__ == '__main__':
    main()


I know we must create an extension metric, something like "ext:motors_dmps_response_time_inc". However, I'm not sure how to do that.


I would like to know how to do this.


Dynatrace Guru

Hi, this is the API to create 1.0 metrics in Dynatrace.
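For readers following along: with the v1 API, a custom metric definition has to be registered with a PUT to `/api/v1/timeseries/{id}` before any data referencing it is accepted, which is why a renamed timeseriesId produces "No configuration found for timeseries id." A minimal sketch, reusing the tenant URL and token placeholders from the post; the helper names and the `Count` unit are my own assumptions:

```python
import json
import requests

# Placeholders copied from the post; replace with your tenant and token.
TENANT = 'https://dynatracewebpre.dcbr01.corp/e/XXXXXX'
HEADER = {'Authorization': 'Api-Token XXXXX',
          'Content-Type': 'application/json; charset=utf-8'}

def build_timeseries_config(display_name, dimensions):
    """Build the v1 metric definition body (hypothetical helper)."""
    return {
        'displayName': display_name,
        'unit': 'Count',           # assumption: pick the unit that fits your data
        'dimensions': dimensions,  # dimension keys you will send later
        'types': ['DMPS'],         # must include the custom device type
    }

def register_timeseries(timeseries_id, display_name, dimensions=()):
    """PUT the definition; the metric must exist before the custom
    device POST references it."""
    url = f'{TENANT}/api/v1/timeseries/custom:{timeseries_id}'
    body = json.dumps(build_timeseries_config(display_name, list(dimensions)))
    return requests.put(url, headers=HEADER, data=body, verify=False)

# e.g. register_timeseries('motors_dmps_response_time_inc', 'Inclusao Proposta')
```

After this PUT succeeds, the existing sendMetrics payload with `'timeseriesId': 'custom:motors_dmps_response_time_inc'` should be accepted.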

I'd recommend moving over to 2.0 with the metric ingest API, though, as it doesn't require declaring the metric first.

It would require you to use generic entities instead of custom devices, though, so it'd be a bigger rewrite.



This is exactly what I needed. However, I'd like to know how to define the dimensions in the payload. I tried to fill them in, but I ended up getting the following error:


"Erro ao incluir o payload: {'error': {'code': 400, 'message': 'Constraints violated.', 'constraintViolations': [{'path': 'series[0].dimensions', 'message': "Dimensions don't match dimensions specified in configuration. Allowed extra dimensions from the configuration are: Inclusao_Proposta", 'parameterLocation': 'PAYLOAD_BODY', 'location': None}]}}, Json: {"displayName": "Inclus\u00e3o Proposta", "group": "TZ - Administracao de Propostas Santander Financiamentos", "type": "DMPS", "properties": {"u_name_abbreviation": "TZ", "path": "D:\\DynatraceExtension", "exe": "", "ServerExtension": "n2431237"}, "configUrl": "https://confluence.santanderbr.corp/pages/viewpage.action?pageId=409685379", "tags": ["TZ"], "series": [{"timeseriesId":"


So, could you tell me what I should fill it out with, based on my code above?



Hi @Gustavo_Godinho,


You need to make sure the types list in the custom metric definition also contains the type of the custom device you are pushing your metric to. You may need to update the custom metric definition.

Also, in order to push metrics to custom devices, you will need to go through the timeseries field on the custom device payload, essentially refreshing the custom device as well.
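If a concrete sketch helps: the "Dimensions don't match dimensions specified in configuration" error means the keys sent in `series[].dimensions` are not declared on the metric. One way out is to re-PUT the definition with the dimension keys the script actually sends. The tenant URL, token, unit, and helper names below are placeholders/assumptions, not taken from the thread:

```python
import json
import requests

# Placeholders copied from the post; replace with your tenant and token.
TENANT = 'https://dynatracewebpre.dcbr01.corp/e/XXXXXX'
HEADER = {'Authorization': 'Api-Token XXXXX',
          'Content-Type': 'application/json; charset=utf-8'}

def dimensions_match(declared, series_dimensions):
    """Every key sent in series[].dimensions must be among the declared
    dimensions of the metric, otherwise the 400 above is returned."""
    return set(series_dimensions) <= set(declared)

def update_metric_definition(timeseries_id, dimension_keys):
    """Re-PUT the definition so 'dimensions' and 'types' match what the
    script sends (display name and unit are assumptions)."""
    config = {
        'displayName': 'Inclusao Proposta',
        'unit': 'Count',
        'dimensions': dimension_keys,  # e.g. ['CD_ERRO_PROC', 'CD_MOTO_PROC']
        'types': ['DMPS'],             # same type as the custom device
    }
    url = f'{TENANT}/api/v1/timeseries/custom:{timeseries_id}'
    requests.put(url, headers=HEADER, data=json.dumps(config), verify=False)
    return config
```

With the definition updated, the `dimension` dict built in cd_situ_prop_flux (`CD_ERRO_PROC`, `CD_MOTO_PROC`) would pass the check.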


Hope this helps.


