Component(s)
receiver/kafka
What happened?
Description
Metrics emitted by the kafka receiver carry an empty name label. This causes problems in downstream systems that reject these metrics due to the empty label value.
Example from the debug output of our pipeline showing the metrics with empty name labels:
otelcol_kafka_receiver_current_offset{name=,partition=2} 3095809
otelcol_kafka_receiver_current_offset{name=,partition=3} 3277479
otelcol_kafka_receiver_current_offset{name=,partition=0} 25849
otelcol_kafka_receiver_current_offset{name=,partition=1} 25939
otelcol_kafka_receiver_messages{name=,partition=1} 13057
otelcol_kafka_receiver_messages{name=,partition=2} 2384874
otelcol_kafka_receiver_messages{name=,partition=3} 2518850
otelcol_kafka_receiver_messages{name=,partition=0} 12304
otelcol_kafka_receiver_offset_lag{name=,partition=2} 0
otelcol_kafka_receiver_offset_lag{name=,partition=3} 0
otelcol_kafka_receiver_offset_lag{name=,partition=0} 0
otelcol_kafka_receiver_offset_lag{name=,partition=1} 0
otelcol_kafka_receiver_partition_close{name=} 6
otelcol_kafka_receiver_partition_start{name=} 9
These are rejected in our Thanos backend with the following logs:
ts=2025-04-17T21:21:57.778242899Z caller=writer_errors.go:127 level=warn component=receive component=receive-writer tenant=centralinfrastructure msg="Error on series with empty label name or value" numDropped=2
ts=2025-04-17T21:21:57.778323812Z caller=writer_errors.go:127 level=warn component=receive component=receive-writer tenant=centralinfrastructure msg="Error on series with empty label name or value" numDropped=3
ts=2025-04-17T21:22:07.873682383Z caller=writer_errors.go:127 level=warn component=receive component=receive-writer tenant=centralinfrastructure msg="Error on series with empty label name or value" numDropped=2
Steps to Reproduce
This should be a minimal configuration to reproduce:
receivers:
  kafka:
    brokers:
      - ${env:KAFKA_BROKER}
    client_id: ${env:POD_NAME}
    encoding: otlp_proto
    group_id: ${env:KAFKA_CONSUMER_GROUP}
    protocol_version: 3.0.0
    topic: ${env:KAFKA_TOPIC}
  otlp:
    protocols:
      http:
        endpoint: localhost:4318
exporters:
  debug:
    verbosity: normal
service:
  pipelines:
    logs:
      receivers:
        - kafka
      exporters:
        - debug
    metrics:
      receivers:
        - otlp
      exporters:
        - debug
  telemetry:
    metrics:
      address: 0.0.0.0:8888
      level: detailed
      readers:
        - periodic:
            exporter:
              otlp:
                endpoint: http://localhost:4318
                protocol: http/protobuf
Expected Result
The name label should be populated with the ID of the receiver.
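For illustration only (assuming the label is meant to carry the receiver's component ID, e.g. kafka for the minimal config above), the series would then look like:
otelcol_kafka_receiver_messages{name=kafka,partition=1} 13057
otelcol_kafka_receiver_current_offset{name=kafka,partition=1} 25939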
Actual Result
The name label is empty.
Collector version
v0.124.0
Environment information
Environment
OS: Microsoft Azure Linux 3.0
Kernel version: 6.6.78.1-3.azl3
Container runtime: containerd://2.0.0
Kubernetes: v1.32.3
OpenTelemetry Collector configuration
Log output
Additional context
I think this happens due to the way logsConsumerGroup, metricsConsumerGroup, and tracesConsumerGroup are created.
The ID field is present in the struct here, but it is never actually set in the function where the consumer group handlers are instantiated (here). Because the id is never populated, I suspect this leads to the empty name label when c.id.Name() is called here.
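As a rough illustration of the suspected failure mode, here is a minimal Go sketch. The consumerGroupHandler struct below is only a stand-in for the receiver's actual consumer group handlers, not the real code; it just shows that an unset component.ID yields an empty Name(), which would surface as an empty name label:

package main

import (
	"fmt"

	"go.opentelemetry.io/collector/component"
)

// Stand-in for the receiver's consumer group handler: it has an id field
// that is expected to be populated with the receiver's component ID.
type consumerGroupHandler struct {
	id component.ID
}

func main() {
	// If the constructor never assigns the id field, it keeps its zero
	// value, so Name() returns "" and the telemetry attribute is empty.
	h := consumerGroupHandler{}
	fmt.Printf("name label value: %q\n", h.id.Name()) // ""

	// If the receiver's ID were propagated (e.g. a receiver configured as
	// kafka/test), the ID would carry a non-empty name.
	id := component.NewIDWithName(component.MustNewType("kafka"), "test")
	fmt.Printf("name label value: %q (full ID: %s)\n", id.Name(), id)
}

Note that component.ID.Name() only returns the part after the slash, so a receiver configured as plain kafka would still yield an empty name even with the ID set; the full ID (e.g. kafka or kafka/test) is available via String().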