OpenPipeline best practices?

calfanonerey
Contributor

Hey Everyone, looking for some advice here.

I want to build out OpenPipeline for logs and spans, and I'm wondering what the best practice is.
The end goal is to have buckets at a per-service level, as well as to extract metrics from spans/logs and process logs as needed.

 

To organize things, would it be best to do:

a. 1 dynamic route per service -> 1 pipeline per service 

b. 1 dynamic route per stackenv -> 1 pipeline per stackenv

 

I like option (a) because it gives us more control, but option (b) seems easier to manage. Looking for advice; thanks in advance!


JeanBlanc
Advisor

Hi @calfanonerey,

From what I’ve seen in the docs and setups I’ve worked with, the choice really depends on how similar your services are in terms of log and span processing:

  • Option (a) – one dynamic route and one pipeline per service – gives you fine-grained control (different extraction rules, retention, metrics, etc.) but can quickly become hard to maintain at scale.

  • Option (b) – grouping by stackenv – is easier to manage and works well if the services in a stack share similar characteristics.

Personally, I’d probably go for a hybrid approach: start with stackenv-level pipelines and split out only the services that need specific processing logic. Automating pipeline creation via the Settings API can help a lot here.
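To make that automation idea a bit more concrete, here's a rough Python sketch of how I'd script route creation against the Settings 2.0 objects endpoint. Treat it as a sketch only: the schemaId, the shape of the "value" payload, and the stack.env attribute are placeholders I haven't verified, so check the schemas exposed in your own tenant (GET /api/v2/settings/schemas) before adapting it.

    # Sketch: create one OpenPipeline route per stackenv via the Settings 2.0 API.
    # The schemaId and the "value" fields are PLACEHOLDERS -- look up the real
    # OpenPipeline schema in your environment before relying on any of this.
    import os
    import requests

    DT_ENV = "https://example.live.dynatrace.com"      # hypothetical tenant URL
    API_TOKEN = os.environ["DT_API_TOKEN"]             # token needs the settings.write scope

    STACKENVS = ["payments-prod", "payments-staging"]  # example grouping for option (b)

    def build_route(stackenv: str) -> dict:
        """Build one Settings 2.0 object for a stackenv-level route (placeholder schema)."""
        return {
            "schemaId": "builtin:openpipeline.logs.routing",  # placeholder -- verify the real schemaId
            "scope": "environment",
            "value": {
                "routeName": f"route-{stackenv}",
                # Placeholder DQL matcher; assumes your records carry a 'stack.env' attribute.
                "matcher": f'matchesValue(stack.env, "{stackenv}")',
                "pipelineId": f"pipeline-{stackenv}",
            },
        }

    def create_routes() -> None:
        # POST /api/v2/settings/objects accepts a list of settings objects
        # and returns a per-object result for each one.
        payload = [build_route(env) for env in STACKENVS]
        resp = requests.post(
            f"{DT_ENV}/api/v2/settings/objects",
            headers={"Authorization": f"Api-Token {API_TOKEN}"},
            json=payload,
            timeout=30,
        )
        resp.raise_for_status()
        print(resp.json())

    if __name__ == "__main__":
        create_routes()

The nice part of scripting it this way is that adding a new stackenv, or splitting a service out into its own pipeline later, becomes a one-line change in a list rather than clicking through the UI.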

That said, this is just my take based on current documentation and experience — I’d be very interested to hear how others have structured their OpenPipeline deployments and if they found a cleaner pattern!

Warm regards,

Jean
