Automations
All questions related to Workflow Automation, AutomationEngine, and EdgeConnect, as well as integrations with various tools.

How to create a workflow to ingest Apigee proxy metadata JSON logs as BizEvents and store them in Grail

tmehta3
Visitor

Hello Community,

I’m working on a use case where I need to:

  • Ingest JSON logs containing Apigee proxy metadata (e.g., proxy name, revision, environment, target server, etc.).

  • Transform this JSON into BizEvents inside a Dynatrace Workflow.

  • Save these BizEvents into Grail for further analysis and visualization.

My questions are:

  1. What is the recommended way to set up a Dynatrace Workflow that takes JSON input and maps it into BizEvents?

  2. Can I use the built-in “Ingest BizEvents” action directly, or should I first transform the JSON using a JavaScript step before ingestion?

  3. What fields are required in the BizEvents JSON (e.g., eventType, timestamp, properties) to ensure successful ingestion into Grail?

  4. Are there any best practices or examples for handling Apigee proxy metadata specifically in this workflow?

Any guidance, examples, or references would be greatly appreciated.

Thanks in advance!


MaximilianoML
Champion

Hello @tmehta3,

Yes, you can do this with the built-in Ingest business event action. The recommended approach is:

  1. Use an HTTP Request, trigger payload, or previous workflow step to retrieve the Apigee JSON.
  2. If the JSON is already flat and contains the fields you want, pass it directly to Ingest business event.
  3. If the JSON is nested, inconsistent, or needs enrichment, add a Run JavaScript step first.
  4. Return an array of normalized objects from JavaScript.
  5. Use {{ result("your_js_step") | to_json }} in the BizEvent ingestion step.
  6. Query the data from Grail with fetch bizevents.

For Apigee proxy metadata, I would strongly recommend a small JavaScript normalization step. It gives you a stable schema, avoids nested JSON problems, and makes later DQL queries much easier. Use event.type = "apigee.proxy.metadata" and event.provider = "apigee" as stable identifiers, then keep proxy name, revision, environment, target server, and deployment state as top-level fields.
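Once the events are in Grail, a DQL query along these lines should confirm the ingestion (the field names here assume the flat schema suggested above):

```
fetch bizevents
| filter event.type == "apigee.proxy.metadata"
| fields timestamp, proxy.name, proxy.revision, environment
```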

I hope this points you in the right direction 😊

Max Lopes

Thanks for the detailed explanation @MaximilianoML — I followed the steps you outlined (HTTP Request → Run JavaScript → Ingest BizEvents). The workflow executes successfully through the JavaScript action, but the Run JavaScript step shows “No logs available” in the execution details. The HTTP Request step does generate the JSON data, yet it also shows “No logs available”.

Does this indicate that my JavaScript step isn’t returning the normalized objects correctly, or is it expected behavior when no explicit console.log is used? I want to confirm whether the BizEvents are actually being passed downstream to the ingestion step, or if I need to adjust the way I’m returning the array.

Could you clarify how to verify that the BizEvents are being generated and ingested into Grail, especially when the workflow UI shows “No logs available”?

Thanks again for your guidance — I just want to make sure the ingestion pipeline is working end-to-end.

 

Hello again!

Could you kindly share some details of the workflow you built, please?

For example, the actual JS code and an overview of the workflow itself, thanks 😁

Max Lopes

[Attachments: three screenshots showing the workflow, the JavaScript step, and the "No logs available" execution details]

 

I have attached images of the workflow and the script I used, which shows "No logs available".
Can you help me with this, @MaximilianoML?

Hi, yes! Of course I'll try to help you, always feel free to ask 😊

“No logs available” in the workflow execution details does not automatically mean that the JavaScript action failed or that nothing was returned.

In this case, it usually only means that the action did not write anything to the log output, for example with:

console.log(something)
console.error(error_something)

The important thing to check is the Result of the JavaScript task, not only the Logs tab. Dynatrace’s Run JavaScript action can return a custom object or array, and that returned value is available to subsequent workflow tasks. The result can be inspected in the execution details and used by later tasks.
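The distinction can be sketched in plain Node.js (this runs outside the Dynatrace runtime; the proxy name is made up for illustration):

```javascript
// Plain Node.js sketch of the difference between a step's log output
// and its returned result.
function stepWithoutLogs() {
  // Nothing is written to the log output here, so a Logs tab would
  // show "No logs available", yet the step still returns a result
  // that downstream tasks can consume.
  return [{ "event.type": "apigee.proxy.metadata", "proxy.name": "orders-v1" }];
}

const result = stepWithoutLogs();
console.log("Returned result:", JSON.stringify(result));
```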

Looking at your JavaScript, the main point is that you are currently returning this structure:

return { bizEvents, debug: apis };

So your JavaScript result is not the BizEvent array itself. It is an object that contains the array under the bizEvents property.

That means that, in the Ingest BizEvents step, you need to reference the nested array, something like this:

{{ result("run_javascript_task_name").bizEvents | to_json }}
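To see why the nested reference is needed, here is a small standalone sketch (the sample values are made up):

```javascript
// Standalone sketch: returning { bizEvents, debug } wraps the array
// in an object, so the ingestion step cannot treat the result itself
// as the BizEvent array.
const bizEvents = [{ "event.type": "apigee.proxy.metadata" }];
const apis = { proxies: [] };
const stepResult = { bizEvents, debug: apis };

console.log(Array.isArray(stepResult));            // false: the result is a wrapper object
console.log(Array.isArray(stepResult.bizEvents));  // true: the array is one level down
```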

Alternatively, and this is my personal preference, simplify the JavaScript and return the array directly:

function transform(input) {
  // The HTTP Request step may deliver the body as a raw string or as an
  // already-parsed object, so handle both.
  let body = input.http_request_1?.body;
  let apis;

  try {
    apis = typeof body === "string" ? JSON.parse(body) : body;
  } catch (e) {
    console.error("JSON parse error:", e);
    return [];
  }

  const proxies = apis?.proxies || [];

  // Normalize each proxy entry into a flat BizEvent with stable identifiers.
  const bizEvents = proxies.map(p => ({
    "event.type": "apigee.proxy.metadata",
    "event.provider": "apigee",
    "event.category": "api-management",

    "source.system": "apigee",
    "apigee.organization": "jlr-ddc1-prod",

    "proxy.name": p.name,
    "proxy.revision": String(p.revision ?? ""),

    "debug.ingestion_test_id": "apigee-test-001"
  }));

  // Explicit logging, so the execution details no longer show "No logs available".
  console.log("Generated BizEvent count:", bizEvents.length);
  console.log("First BizEvent:", JSON.stringify(bizEvents[0] ?? {}, null, 2));

  return bizEvents;
}

Also, I would recommend following the Dynatrace conventions for event fields, such as:

"event.type": "apigee.proxy.metadata",
"event.provider": "apigee"

 

Let me know if it worked and what the next step is, please 😉

Also, if it helped you, could you gently give Kudos? I'm aiming to be Member of the Month someday 🤗

 

Max Lopes
