
Otel Metrics Not Received #10070

Open
stephen-galea-weavr opened this issue Apr 24, 2024 · 2 comments
Assignees
Labels
kind/bug A bug triage/needs-reproducing Someone else should try to reproduce this

Comments

@stephen-galea-weavr

stephen-galea-weavr commented Apr 24, 2024

What happened?

I am using Grafana Agent as an OTel collector, and it works for other OTel sources. However, when I set up a MeshMetric policy following the Kuma OTel metrics guide (https://kuma.io/docs/2.7.x/guides/otel-metrics/), I am not receiving any metrics.

This is my config:


apiVersion: kuma.io/v1alpha1
kind: MeshMetric
metadata:
  name: otel-metrics
  namespace: kuma-system
  labels:
    kuma.io/mesh: default
spec:
  targetRef:
    kind: Mesh
  default:
    # sidecar:
    #   profiles:
    #     appendProfiles:
    #     - name: None
    #     include:
    #     - type: Regex
    #       match: envoy_cluster_external_upstream_rq_.*
    backends:
      - type: OpenTelemetry
        openTelemetry:
          endpoint: grafana-agent-alloy.grafana.svc:4317
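
To narrow down whether the problem is on the sidecar side or the collector side, one option (a sketch based on the Kuma MeshMetric docs, not part of the reporter's setup; the port and path values are the documented defaults and are assumptions here) is to temporarily add a Prometheus backend alongside the OpenTelemetry one, so the sidecar's metrics can be scraped or curled directly:

```yaml
spec:
  targetRef:
    kind: Mesh
  default:
    backends:
      # Existing OTel backend from the config above
      - type: OpenTelemetry
        openTelemetry:
          endpoint: grafana-agent-alloy.grafana.svc:4317
      # Hypothetical addition for debugging: expose metrics locally so
      # you can `curl http://<pod-ip>:5670/metrics` from inside the pod
      - type: Prometheus
        prometheus:
          port: 5670
          path: /metrics
```

If metrics show up on the Prometheus endpoint but never reach the collector, the issue is likely in the OTLP export path or network policy rather than in metric collection itself.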

I can also confirm that the sidecar is being configured to send metrics:

2024-04-24T15:52:48.293Z	INFO	mesh-metric-config-fetcher	Starting OpenTelemetry exporter	{"backend": "grafana-agent-alloy.grafana.svc-4317"}
@stephen-galea-weavr stephen-galea-weavr added kind/bug A bug triage/pending This issue will be looked at on the next triage meeting labels Apr 24, 2024
@Automaat
Contributor

@stephen-galea-weavr could you share more info about your Grafana agent setup?

I've just tested it with the basic setup from the Grafana docs, and it works for me.

@jakubdyszkiewicz jakubdyszkiewicz added triage/needs-information Reviewed and some extra information was asked to the reporter and removed triage/pending This issue will be looked at on the next triage meeting labels Apr 29, 2024
@stephen-galea-weavr
Author

Hi, sure, this is my setup. It is working fine for all the other microservices in the cluster.

otelcol.receiver.otlp "default" {
  grpc {
    endpoint = "0.0.0.0:4317"
  }
  http {
    endpoint = "0.0.0.0:4318"
  }
  output {
    traces  = [otelcol.processor.batch.default.input, otelcol.connector.servicegraph.default.input]
    metrics = [otelcol.processor.batch.default.input]
  }
}

otelcol.processor.batch "default" {
  output {
    traces  = [otelcol.exporter.otlp.default.input]
    metrics = [otelcol.exporter.prometheus.default.input]
  }
}

otelcol.connector.servicegraph "default" {
  dimensions = ["http.method"]
  output {
    metrics = [otelcol.exporter.prometheus.default.input]
  }
}

otelcol.exporter.prometheus "default" {
  forward_to = [prometheus.remote_write.mimir.receiver]
  add_metric_suffixes = false
}
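
One way to check whether the sidecar's OTLP data ever reaches this receiver (a sketch, not part of the setup above; recent Alloy versions ship an `otelcol.exporter.debug` component, and the component name `probe` is an assumption) is to tee the receiver's metrics into a debug exporter and watch the Alloy logs:

```
// Hypothetical debugging aid: print every received metric batch to the
// Alloy log so you can see whether the Kuma sidecar's data arrives at all.
otelcol.exporter.debug "probe" {
  verbosity = "detailed"
}
```

Then add `otelcol.exporter.debug.probe.input` to the `metrics` list in the receiver's `output` block. If nothing is logged, the data is being dropped before it reaches the collector; if batches appear, the problem is further down the pipeline (e.g. in the Prometheus exporter or remote write).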

@lahabana lahabana added triage/needs-reproducing Someone else should try to reproduce this and removed triage/needs-information Reviewed and some extra information was asked to the reporter labels May 2, 2024
@jakubdyszkiewicz jakubdyszkiewicz assigned lukidzi and unassigned slonka May 20, 2024