Exploring MCP Observability: Bridging the Blind Spot in AI Agents

Alex Volkov, AI evangelist at Weights & Biases, and Benjamin Eckert, co-founder and CTO of Dylibso (the company behind mcp.run), discuss the observability blind spot emerging in the growing ecosystem of AI agents built on the Model Context Protocol (MCP), highlighting the need for vendor-neutral solutions that integrate cleanly with existing tooling.

  • 1. Alex Volkov, AI evangelist at Weights & Biases, and Benjamin Eckert, co-founder and CTO of Dylibso, discuss MCP observability.
  • 2. The rise of MCP (Model Context Protocol) in AI agents is creating an observability blind spot.
  • 3. As AI agents become more prevalent, the blind spot is compounded when agents chain multiple tools through MCP.
  • 4. Developers lose end-to-end visibility into what happens inside their agents as calls cross MCP boundaries.
  • 5. mcp.run has both clients and servers in production but lacked proper observability tooling, which led to a bespoke MCP-tracing solution in the Weave Python SDK.
  • 6. A TypeScript agent was built alongside the Python SDK, demonstrating the language-agnostic nature of MCP servers and clients.
  • 7. The TypeScript agent's traces were adapted into Weave via the OpenTelemetry Protocol (OTLP) within minutes.
  • 8. MCP stories highlight agents learning from other agents or tools to improve functionality, as seen with Claude Opus 4 and W&B's Weave support bot.
  • 9. mcp.run will export telemetry to OTel-compatible sinks for both servers and clients, allowing for more robust observability in the future.
  • 10. For agent developers:
  • * Start thinking about observability via MCP tooling.
  • * Determine whether you're getting end-to-end observability across your execution chain.
  • 11. For tool builders and platform providers:
  • * Work on higher-level SDKs, such as OpenInference, to help with instrumentation for clients using bespoke SDKs.
  • * Collaborate on conventions together.
  • 12. Semantic conventions are essential for understanding user-defined span attributes and creating agreement among observability platforms.
  • 13. Platform builders like mcp.run can contribute by adding OpenTelemetry support, reviewing RFCs, and engaging in discussions about ideas.
  • 14. The AI Engineer track at the conference provides opportunities to learn from implementors and discuss new developments.
  • 15. Alex Volkov encourages attendees to check out Weights & Biases (W&B), MCPOP for tracing MCP with OpenTelemetry, and the observability tooling initiative.
  • 16. Ben Eckert invites attendees to visit the booth for surprises and to catch up on AI news via the ThursdAI podcast.
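The bespoke tracing approach described in the talk boils down to recording a span around every MCP tool call and shipping those spans to a backend. The sketch below illustrates that span-per-tool-call pattern in plain Python, with an in-memory list standing in for a real OTLP exporter; the names (`call_tool`, `mcp.tool_call`) are illustrative assumptions, not the Weave SDK's actual API.

```python
import time
import uuid
from contextlib import contextmanager

# Stand-in for an exporter that would batch spans to an OTel-compatible sink.
spans = []

@contextmanager
def span(name, **attributes):
    """Record a timed span with arbitrary attributes (hypothetical API)."""
    record = {
        "span_id": uuid.uuid4().hex,
        "name": name,
        "attributes": attributes,
        "start": time.time(),
    }
    try:
        yield record
    finally:
        record["end"] = time.time()
        spans.append(record)

def call_tool(tool_name, arguments):
    """Hypothetical MCP tool invocation, wrapped in a span."""
    with span("mcp.tool_call", tool=tool_name, args=str(arguments)) as s:
        result = {"ok": True}  # stand-in for the real MCP request/response
        s["attributes"]["ok"] = result["ok"]
        return result

result = call_tool("search", {"q": "observability"})
```

In a real setup, the `spans` list would be replaced by an OTLP exporter, so any OTel-compatible backend could receive the same data without code changes on the agent side.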

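The semantic conventions mentioned in item 12 amount to a shared vocabulary of span attribute keys that every observability platform agrees to recognize; anything outside that vocabulary is opaque to other vendors' tooling. A minimal sketch of why that matters, assuming a hypothetical set of MCP attribute names (illustrative only, not an official convention):

```python
# Hypothetical agreed-upon attribute keys for MCP spans. Real conventions
# would be negotiated across observability platforms, as the talk suggests.
MCP_SEMCONV = {"mcp.tool.name", "mcp.server.name", "mcp.transport"}

def unknown_attributes(span_attrs: dict) -> set:
    """Return attribute keys a convention-aware platform would not recognize."""
    return set(span_attrs) - MCP_SEMCONV

# A span mixing a conventional key with an ad-hoc, vendor-specific one:
attrs = {"mcp.tool.name": "search", "my_custom_latency_bucket": "p99"}
leftover = unknown_attributes(attrs)  # {"my_custom_latency_bucket"}
```

Conventional keys travel across platforms intact; the leftover ad-hoc keys are exactly the ones each vendor must special-case, which is the interoperability gap shared conventions close.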
Source: AI Engineer via YouTube

❓ What do you think? What are your thoughts on the ideas shared in this video? Feel free to share your thoughts in the comments!