Standardizing on MCP for Internal Integration: Streamlining AI Service Development at Anthropic

Implementing MCP clients and standardizing on message protocols can simplify integrations, reduce complexity, and unlock future features. Let me share my experience at Anthropic and how we achieved scalability and ease of use with our internal integrations.

  • 1. John has spent 20 years building large-scale systems and is currently a member of technical staff at Anthropic, focusing on tool calling and integration.
  • 2. Interest in MCP (Model Context Protocol) clients exploded when models became good at calling tools last year.
  • 3. Teams started moving fast, creating custom endpoints and services for various use cases, leading to integration chaos.
  • 4. Many endpoints began to resemble MCP over time, with similar features like get tools, get resources, and elicitation of details.
  • 5. MCP has two main components: a JSON-RPC specification for the messages exchanged, and a transport standard covering streamable HTTP, OAuth 2.1, and session management.
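To make the JSON-RPC half of point 5 concrete, a `tools/list` exchange is just JSON-RPC 2.0 on the wire. The method name comes from the MCP specification; the example tool (`get_weather`) and its schema are made up for illustration:

```python
import json

# A minimal MCP message: a JSON-RPC 2.0 request asking a server for its tools.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# An abbreviated response shape: a result carrying tool descriptors.
# The "get_weather" tool here is a made-up example.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "get_weather",
                "description": "Fetch the current weather for a city.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            }
        ]
    },
}

# What actually travels over whatever transport you choose.
wire = json.dumps(request)
```

Everything else in the protocol (resources, elicitation, and so on) is carried the same way: plain JSON-RPC messages over some byte stream.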
  • 6. Anthropic decided to use MCP for everything involved in providing model context to models, allowing for a unified approach to integration.
  • 7. Standardizing on MCP internally has several benefits:
  • * It's not a competitive advantage to be good at plumbing integrations.
  • * A single approach makes things faster for engineers.
  • * Each new integration might clean up the field for the next person.
  • 8. MCP solves problems before they become apparent, such as handling different billing models and token limits in integrations.
  • 9. Anthropic is seeing requirements converge, with external remote MCP services popping up and a proliferation of internal agents such as PR review bots.
  • 10. To address these concerns, Anthropic introduced the MCP gateway, a single point of entry for all integrations that returns an MCP SDK client session.
  • 11. The MCP gateway handles credential management, rate limiting, and observability, making it easier for engineers to build integrations.
  • 12. Client libraries have been developed to simplify the connection process by passing a URL, org ID, account ID, and signed token for authentication.
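A rough sketch of what such a connection call could look like. `McpSession`, `connect`, and all parameter names below are hypothetical stand-ins for illustration, not Anthropic's actual internal client library:

```python
from dataclasses import dataclass


@dataclass
class McpSession:
    """Hypothetical stand-in for the MCP SDK client session the gateway returns."""
    server_url: str
    org_id: str

    def list_tools(self) -> list:
        # A real session would issue a JSON-RPC "tools/list" request here.
        return []


def connect(url: str, org_id: str, account_id: str, signed_token: str) -> McpSession:
    """Hypothetical gateway connection call: the gateway would resolve
    credentials, rate limits, and observability hooks, then hand back
    an SDK session the caller can use immediately."""
    if not signed_token:
        raise ValueError("a signed token is required for authentication")
    return McpSession(server_url=url, org_id=org_id)


session = connect(
    url="https://mcp-gateway.internal/example",  # illustrative URL
    org_id="org-123",
    account_id="acct-456",
    signed_token="signed-jwt-here",
)
```

The point of this shape is that the caller only supplies identity; everything operational (auth, limits, routing) lives behind the gateway.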
  • 13. Because the connection call returns a standard MCP SDK object, organizations can access new protocol features simply by updating their packages.
  • 14. Organizations can choose the best transport method for their internal setup when connecting to external MCP servers.
  • 15. Anthropic uses websockets for their internal transport, but other options like gRPC, Unix sockets, or even enterprise-grade email transports can be used.
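Since the messages are just JSON, swapping transports is mostly a framing question. The sketch below pipes a JSON-RPC message over a local socket pair standing in for a websocket, gRPC channel, or Unix socket; the newline-delimited framing is an illustrative choice, not what any particular MCP transport mandates:

```python
import json
import socket


def send_message(sock: socket.socket, message: dict) -> None:
    """Frame a JSON-RPC message as one newline-terminated line of JSON."""
    sock.sendall(json.dumps(message).encode() + b"\n")


def recv_message(sock: socket.socket) -> dict:
    """Read bytes until a full newline-terminated message arrives."""
    buf = b""
    while not buf.endswith(b"\n"):
        buf += sock.recv(4096)
    return json.loads(buf)


# A local socket pair stands in here for whatever transport you prefer.
client, server = socket.socketpair()
send_message(client, {"jsonrpc": "2.0", "id": 1, "method": "ping"})
received = recv_message(server)
```

Nothing about the message changes when the byte stream underneath does, which is why the transport choice can be an internal implementation detail.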
  • 16. The MCP gateway handles authentication, allowing consumers to focus on building features without worrying about the complexity of OAuth.
  • 17. A unified authentication model enables portable credentials for batch jobs and consistent handling of tokens for internal services.
  • 18. Centralizing context for models provides a single place for enforcing policy and processing tools or resources in a standardized format.
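To make point 18 concrete, a gateway that owns the context pipeline can filter tool lists in a single place before they ever reach a model. The prefix-based policy below is a made-up example, not an actual Anthropic rule:

```python
# Illustrative only: a gateway-side policy hook. The blocked prefixes
# and tool names are invented for this sketch.
BLOCKED_PREFIXES = ("admin_", "internal_")


def enforce_tool_policy(tools: list) -> list:
    """Drop tools a caller is not allowed to see, in one central place,
    rather than re-implementing the check in every integration."""
    return [t for t in tools if not t["name"].startswith(BLOCKED_PREFIXES)]


tools = [
    {"name": "get_weather"},
    {"name": "admin_delete_user"},
]
allowed = enforce_tool_policy(tools)
```

Because every integration flows through the same choke point, a policy change like this lands everywhere at once.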
  • 19. Adding MCP support to new services is as simple as importing a package, regardless of the language used.
  • 20. Standardizing on something like MCP offers operational simplicity with a single point of ingress/egress and standardized message formats.
  • 21. Future features are automatically included as the protocol evolves, saving time and resources.
  • 22. The key takeaways from this talk include:
  • * MCP is essentially JSON streams.
  • * How you pipe those streams around your infrastructure is a minor implementation detail.
  • * Standardizing on something like MCP makes life easier for future engineers.
  • 23. Building "pits of success" helps ensure that the right way to do things is also the easiest way.
  • 24. Centralizing shared problems at the correct layer allows organizations to focus on more interesting and valuable issues.

Source: AI Engineer via YouTube
