Hello Dynatrace Community,
Recently, we've noticed a number of replies on the Community created by LLM chatbots, such as ChatGPT or Bing Answers. We’d love to clarify the Dynatrace Community’s stance on using such tools.
We're adding a new rule to the Dynatrace Community Guidelines:
Mark AI-generated content
All AI-generated content posted in the Dynatrace Community must be clearly marked as such, so other users know its origin. AI content may be used only as a reference and must not be the sole content of a message. If the AI chatbot you're using provides sources, please include them in your answer. AI-generated answers must be verified and tested by the author of the question before being accepted as solutions.
Posts that don't follow these rules will be deleted.
Dynatrace Community focuses on user-to-user interaction; it’s a place where we can exchange information and learn from each other to become better professionals. While AI tools can be incredibly useful, they currently can’t provide solutions for the more complex questions. AI-generated answers often lead to confusion instead of a solution, and they also harm the Community itself: they lower the site's SEO score and prevent us from making the Dynatrace AI feature more knowledgeable, as we can’t feed it content from other AI chatbots.
This is why, while we don’t want to ban AI content completely, we want to ensure that it is clearly marked and provided with context.
Let us know if you have any questions, and we’ll be happy to answer them.
Thanks @MaciejNeumann. Great policy!
Dynatrace Community focuses on user-to-user interaction; it’s a place where we can exchange information and learn from each other to become better professionals.
100% agree with your statement.
Is there a new tag to identify such content?
For now, we won't introduce a special label for AI-generated content, but we'll monitor the situation and add one if needed.
And, as always, all users can report inappropriate content to us if they notice something breaking the Community rules.