
[Python] LLM span missing system message in gen_ai.input.messages when using the instructions parameter #3163

@claude89757

Description


Summary

When using OpenTelemetry instrumentation (@use_instrumentation) for ChatClient, the LLM span's gen_ai.input.messages attribute does not include the system message when the system prompt is passed via the instructions parameter (or ChatOptions(instructions=...)).
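
A minimal reproduction sketch. The import path for ChatOptions and the exact get_response call shape (plain string plus a chat_options keyword) are assumptions; substitute the ChatClient implementation and provider you actually use:

import asyncio
from agent_framework import ChatOptions  # assumed import location

async def reproduce(client):  # client: any instrumented ChatClient
    # The system prompt goes in via instructions rather than as a message:
    response = await client.get_response(
        "What is machine learning?",
        chat_options=ChatOptions(instructions="You are a helpful assistant."),
    )
    # The model receives the system message, but the emitted LLM span's
    # gen_ai.input.messages attribute records only the user message.
    return response

# asyncio.run(reproduce(make_client()))  # make_client: build your provider's client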

Expected Behavior

The gen_ai.input.messages attribute should contain the complete message list that is actually sent to the LLM, including the system message:

{
  "input": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is machine learning?"}
  ]
}

Actual Behavior

The system message is missing from the recorded input:

{
  "input": [
    {"role": "user", "content": "What is machine learning?"}
  ]
}

The gen_ai.system_instructions attribute is likewise never set.
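
One way to observe both gaps with the standard OpenTelemetry SDK, assuming the framework emits through the globally configured TracerProvider (the exact wiring in agent-framework may differ):

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.sdk.trace.export.in_memory_span_exporter import InMemorySpanExporter

exporter = InMemorySpanExporter()
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# ... run the instrumented get_response() call from the Summary ...

for span in exporter.get_finished_spans():
    attrs = span.attributes or {}
    if "gen_ai.input.messages" in attrs:
        print(attrs["gen_ai.input.messages"])           # user message only
        print(attrs.get("gen_ai.system_instructions"))  # None (never set)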

Root Cause

The issue is in observability.py: the tracing decorator _trace_get_response captures the incoming messages parameter before get_response() adds the system message derived from instructions.

Execution Flow

1. _trace_get_response() captures original messages ← Does NOT contain system message
                ↓
2. _capture_messages(messages=original_messages)    ← Records without system message
                ↓
3. Original get_response() adds system_msg          ← System message added here
   (in _clients.py:573-577)
                ↓
4. _inner_get_response(prepped_messages)            ← Actual LLM call includes system
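
The ordering problem, reduced to a self-contained runnable sketch. This is illustrative only, not the framework's actual code; DemoClient and capture_messages are stand-ins:

import asyncio
import functools

def capture_messages(messages):  # stand-in for the span attribute capture
    print("recorded roles:", [m["role"] for m in messages])

def trace_get_response(func):
    @functools.wraps(func)
    async def wrapper(self, messages, **kwargs):
        capture_messages(messages)  # step 2: runs before the system message exists
        return await func(self, messages, **kwargs)
    return wrapper

class DemoClient:
    @trace_get_response
    async def get_response(self, messages, *, instructions=None):
        prepped = list(messages)
        if instructions:  # step 3: system message added after capture
            prepped.insert(0, {"role": "system", "content": instructions})
        return prepped  # step 4: what the model would actually receive

messages = [{"role": "user", "content": "What is machine learning?"}]
sent = asyncio.run(DemoClient().get_response(messages, instructions="You are a helpful assistant."))
print("sent roles:", [m["role"] for m in sent])
# recorded roles: ['user']
# sent roles: ['system', 'user']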

Code Location

observability.py:1099-1100 (_trace_get_response):

if OBSERVABILITY_SETTINGS.SENSITIVE_DATA_ENABLED and messages:
    _capture_messages(span=span, provider_name=provider_name, messages=messages)
    # Missing: system_instructions parameter

observability.py:1188-1193 (_trace_get_streaming_response):

if OBSERVABILITY_SETTINGS.SENSITIVE_DATA_ENABLED and messages:
    _capture_messages(
        span=span,
        provider_name=provider_name,
        messages=messages,
    )
    # Missing: system_instructions parameter

Note: The Agent-level tracing (_trace_agent_run, lines 1356-1362) correctly passes system_instructions, but the ChatClient-level tracing does not.

Suggested Fix

Pass the instructions value from chat_options to _capture_messages as system_instructions:

# In _trace_get_response (line ~1099)
if OBSERVABILITY_SETTINGS.SENSITIVE_DATA_ENABLED and messages:
    chat_options = kwargs.get("chat_options")
    _capture_messages(
        span=span,
        provider_name=provider_name,
        messages=messages,
        system_instructions=getattr(chat_options, "instructions", None) if chat_options else None,
    )

The same fix should be applied to _trace_get_streaming_response.
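
For completeness, the analogous change in the streaming path would look like this (same assumption as above that chat_options is available in kwargs):

# In _trace_get_streaming_response (line ~1188)
if OBSERVABILITY_SETTINGS.SENSITIVE_DATA_ENABLED and messages:
    chat_options = kwargs.get("chat_options")
    _capture_messages(
        span=span,
        provider_name=provider_name,
        messages=messages,
        system_instructions=getattr(chat_options, "instructions", None) if chat_options else None,
    )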

Impact

This affects all users relying on OpenTelemetry-based observability (e.g., Langfuse, Jaeger) to monitor their LLM calls. The missing system prompt makes it difficult to:

  • Debug prompt-related issues
  • Audit complete LLM interactions
  • Attribute token usage accurately

Environment

  • Package: agent-framework (Python)
  • File: python/packages/core/agent_framework/observability.py

Labels

observability (Issues related to observability or telemetry), python

Status

Done
