Conversation

@constantinius
Contributor

@constantinius constantinius requested a review from a team as a code owner January 14, 2026 09:36
linear bot commented Jan 14, 2026

@github-actions

Semver Impact of This PR

🟡 Minor (new features)

📋 Changelog Preview

This is how your changes will appear in the changelog.
Entries from this PR are highlighted with a left border (blockquote style).


New Features ✨

  • feat(asyncio): Add on-demand way to enable AsyncioIntegration by sentrivana in #5288
  • feat(integration): add gen_ai.conversation.id if available by constantinius in #5307

Bug Fixes 🐛

  • fix(ai): redact message parts content of type blob by constantinius in #5243
  • fix(clickhouse): Guard against module shadowing by alexander-alderman-webb in #5250
  • fix(gql): Revert signature change of patched gql.Client.execute by alexander-alderman-webb in #5289
  • fix(grpc): Derive interception state from channel fields by alexander-alderman-webb in #5302
  • fix(litellm): Guard against module shadowing by alexander-alderman-webb in #5249
  • fix(pure-eval): Guard against module shadowing by alexander-alderman-webb in #5252
  • fix(ray): Guard against module shadowing by alexander-alderman-webb in #5254
  • fix(threading): Handle channels shadowing by sentrivana in #5299
  • fix(typer): Guard against module shadowing by alexander-alderman-webb in #5253

Documentation 📚

  • docs(metrics): Remove experimental notice by alexander-alderman-webb in #5304
  • docs: Update Python versions banner in README by sentrivana in #5287

Internal Changes 🔧

Release

  • ci(release): Bump Craft version to fix issues by BYK in #5305
  • ci(release): Switch from action-prepare-release to Craft by BYK in #5290

Other

  • chore(gen_ai): add auto-enablement for google genai by shellmayr in #5295

🤖 This preview updates automatically when you update the PR.

Comment on lines +119 to +124
        )

        _set_response_model_on_agent_span(agent, response_model)
        update_ai_client_span(span, streaming_response, agent=agent)
    finally:
        span.__exit__(*sys.exc_info())


Bug: The _sentry_conversation_id is not propagated to secondary agents during handoffs, causing subsequent spans for that agent to lack the conversation ID and breaking trace continuity.
Severity: HIGH

Suggested Fix

Propagate the _sentry_conversation_id to handoff agents before they run: either set the attribute on the handoff agent object before it is used in _run_single_turn, or store the conversation_id in a context that persists across handoffs, so that every agent in the chain has access to it.

Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI
agent.
Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not
valid.

Location: sentry_sdk/integrations/openai_agents/patches/models.py#L119-L124

Potential issue: The `_sentry_conversation_id` attribute is set on the initial agent at
the start of an execution run. However, when this agent hands off control to a secondary
agent, the `_sentry_conversation_id` is not propagated. As a result, any subsequent
spans created for the secondary agent, such as `invoke_agent_span`, `ai_client_span`,
and `execute_tool_span`, will be missing the conversation ID. This leads to incomplete
trace data, as the spans from the secondary agent cannot be linked to the overall
conversation.

