Saturday, December 27, 2025

Agentic AI Frameworks - Crew AI, LangChain, and LangGraph

  

LangChain - LangChain is a modular framework for building applications powered by LLMs (Large Language Models). It helps developers connect language models with external data sources and tools.

Key Features:

  • Chains: Combine multiple steps (e.g., prompt → model → output) into a workflow.
  • Agents: Enable LLMs to make decisions and utilize tools dynamically (e.g., calculators, search engines).
  • Memory: Store context across interactions for more personalized responses.
  • Integrations: Connects with APIs, databases, files, and other services.
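The chain idea above can be sketched in plain Python. This is a conceptual sketch, not the real LangChain API; the `PromptTemplate`, `FakeLLM`, and `Chain` names here are invented for illustration.

```python
# Conceptual sketch of a "prompt -> model -> output parser" chain.
# PromptTemplate, FakeLLM, and Chain are invented names for illustration;
# the real LangChain API differs.

class PromptTemplate:
    def __init__(self, template):
        self.template = template

    def format(self, **kwargs):
        return self.template.format(**kwargs)

class FakeLLM:
    """Stands in for a real model call."""
    def invoke(self, prompt):
        return f"LLM answer to: {prompt}"

class Chain:
    """Combines steps into a workflow: prompt -> model -> output parser."""
    def __init__(self, prompt, llm, parser):
        self.prompt, self.llm, self.parser = prompt, llm, parser

    def invoke(self, **inputs):
        text = self.prompt.format(**inputs)   # step 1: render the prompt
        raw = self.llm.invoke(text)           # step 2: call the model
        return self.parser(raw)               # step 3: parse the output

chain = Chain(PromptTemplate("Summarize: {topic}"), FakeLLM(), str.strip)
print(chain.invoke(topic="MCP"))  # -> LLM answer to: Summarize: MCP
```

The real framework adds memory and tool integrations on top of this same pipeline shape.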

 

LangGraph - LangGraph is built on top of LangChain and is designed for creating stateful, multi-step workflows using a graph-based architecture. It extends LangChain with graph-based workflows, which allow for non-linear, stateful, and dynamic execution paths.

Key Features:

  • Graph-based execution: Nodes represent steps, and edges define transitions.
  • State management: Keeps track of the conversation or task state across steps.
  • Looping and branching: Supports complex logic like retries, conditionals, and loops.
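The graph idea can be sketched in plain Python. This is a conceptual sketch, not the LangGraph API; the node names (`draft`, `review`) and state fields are invented for illustration.

```python
# Conceptual sketch of graph-based execution: nodes are steps, an edge
# function defines transitions, and a dict carries state across steps.
# Not the real LangGraph API; names are invented for illustration.

def draft(state):
    state["attempts"] += 1
    state["text"] = "draft v" + str(state["attempts"])
    return state

def review(state):
    state["approved"] = state["attempts"] >= 2  # reject the first attempt
    return state

nodes = {"draft": draft, "review": review}

def next_node(name, state):
    """Edges: draft -> review; review loops back to draft until approved."""
    if name == "draft":
        return "review"
    if name == "review" and not state["approved"]:
        return "draft"   # retry loop
    return None          # end of graph

state = {"attempts": 0}
node = "draft"
while node is not None:          # walk the graph, carrying state along
    state = nodes[node](state)
    node = next_node(node, state)

print(state)  # -> {'attempts': 2, 'text': 'draft v2', 'approved': True}
```

The loop-until-approved edge is exactly the kind of retry/conditional logic a plain linear chain cannot express.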

 

LangSmith - LangSmith is a developer platform created by the makers of LangChain to help you debug, test, and monitor applications built with language models.

Key Features: LangSmith provides tools for:

  • Tracing: See how your LLM app executes step-by-step (prompts, responses, tool calls).
  • Debugging: Identify where issues arise in complex chains or agent workflows.
  • Evaluation: Run automated or manual tests to assess quality and reliability.
  • Monitoring: Track performance and usage over time in production environments.



Wednesday, December 24, 2025

MCP Server Architecture and Internal Components

What is MCP 

MCP (Model Context Protocol) is a way for AI models to interact with external tools or systems in a structured way.

MCP Components:

  • Host: Think of this as the "middleman" or platform that runs the AI model and manages communication.
  • Client: This is the user-facing application (like your chat interface) that sends requests to the host.
  • Server: This is the tool or service that provides extra functionality (like a database, calculator, or image generator). The host talks to the server when the client needs something special.

Communication mechanism between MCP Components:

  • MCP Client→ sends a request (e.g., "generate an image") to the Host.
    • What it is: A software component (usually code) that runs in the user-facing application (like a chat UI, IDE plugin, or CLI).
    • Role: Sends structured requests to the MCP Host and receives responses.
    • Not an agent by itself—it’s typically implemented in Python, JavaScript, or other languages as part of the app.
    • Think of it as: The “bridge” between the user and the MCP Host.
    • Client → Host: Protocol => JSON-RPC 2.0 over WebSocket or HTTP
  • MCP Host→ interprets the request and decides if it needs help from a Server.
    • What it is: A runtime environment (code) that runs the AI model and orchestrates communication with MCP Servers.
    • Role: Receives requests from the Client, interprets them, and calls MCP Servers when tools or external data are needed.
    • Not an agent in isolation—it’s usually part of the AI platform (like OpenAI’s MCP implementation).
    • Think of it as: The “brain” that routes requests and aggregates results.
    • Host ↔ Server: Protocol =>  JSON-RPC 2.0 over WebSocket or HTTP
  • MCP Server → does the job (e.g., creates the image) and sends the result back to the Host.
    • What it is: A service or process (code) that exposes tools/resources via the MCP protocol.
    • Role: Provides access to external systems (DB, APIs, file system) through standardized MCP endpoints.
    • Implemented as: Python, Node.js, or any language that supports MCP spec.
    • Think of it as: The “toolbox” behind the Host.
    • Server ↔ tools: Protocol => HTTP, SQL, File I/O to backend
  • Host → returns the result to the MCP Client
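The exchange described above travels as JSON-RPC 2.0 messages. A minimal sketch of what such messages might look like, built as Python dicts (the "generate_image" tool name and all field values are invented for illustration; exact MCP fields vary):

```python
import json

# Illustrative JSON-RPC 2.0 request the Client/Host sends to call a tool.
# The tool name "generate_image" and its arguments are invented.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "generate_image", "arguments": {"prompt": "a cat"}},
}

# Illustrative response the Server sends back; "id" must match the request
# so the caller can pair responses with requests.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "image created"}]},
}

wire = json.dumps(request)  # serialized form sent over WS/HTTP/stdio
print(json.loads(wire)["method"])              # -> tools/call
print(response["result"]["content"][0]["text"])  # -> image created
```

The same envelope shape is reused on every hop (Client → Host, Host → Server); only the transport underneath changes.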
Protocols & Message Flow:
  • Requestor: The human or external app that initiates a request (e.g., you in a chat UI).
    • Requestor → MCP Client: UI calls or app method invocations (or HTTP if separate service).
  • MCP Client: The user-facing application (chat app, IDE extension, etc.). It sends requests to the Host.
    • MCP Client → MCP Host: JSON-RPC 2.0 over WebSocket or HTTP.
  • MCP Host: The orchestrator/platform that runs the AI model and speaks MCP. It routes requests to MCP servers.
    • MCP Host → MCP Server(s): JSON-RPC 2.0 over WebSocket or HTTP.
  • MCP Server: One or more services that expose tools/resources via MCP (e.g., file system, DB, APIs).
    • MCP Server → Tool/DB/API: HTTP/REST, SQL, File I/O, etc.
  • Tools / DB Access: The actual backends the MCP server uses (databases, APIs, search, compute, etc.).
MCP and Architecture View:
  • Layer 1: User Layer
    • Component: Requestor (User or external app)
    • Role: Initiates commands or queries.
  • Layer 2: Interface Layer
    • Component: MCP Client
    • Role: Converts user input into structured MCP requests.
    • Protocol: JSON-RPC over WebSocket/HTTP to Host.
  • Layer 3: Orchestration Layer
    • Component: MCP Host
    • Role: Runs the AI model, interprets requests, and routes to MCP Servers.
    • Protocol: JSON-RPC over WebSocket/HTTP to Servers.
  • Layer 4: Integration Layer
    • Components: MCP Servers (API Tools, DB Adapter, File System)
    • Role: Wrap external resources and expose them via MCP.
    • Protocol: HTTP, SQL, File I/O to backends.
  • Layer 5: Backend Layer
    • Components: External APIs, Databases, File Storage

Requestor, MCP Client Stub, MCP Skeleton, MCP Host, MCP Server:

The following summarizes how MCP (Model Context Protocol) works across its components.

1. Requestor

What it is

    • The user, UI, or external application that initiates a request.
    • Examples: Chat UI, automation script, agent, backend service.

What it does

    • Sends a command like:
      • “Get customer 123 summary.”
      • “Buy 10 shares of AAPL.”
    • It never talks directly to MCP Servers.

Who it talks to

    • Always → MCP Host (or indirectly through the Stub inside the Host).

2. MCP Client Stub

What it is

    • A client-side proxy that lives inside the MCP Host.
    • It is created by the MCP Client SDK.
    • It is NOT a separate service.
    • It is NOT per-server code — it is dynamically generated.

What it does

    • Serializes requests into MCP protocol messages.
    • Sends them to the correct MCP Server.
    • Receives responses and returns them to the Host.

Key point

There is ONE MCP Client SDK, but one Stub instance per MCP Server connection.

3. MCP Skeleton

What it is

    • The server-side dispatcher inside each MCP Server.
    • Automatically generated by FastMCP or Node MCP.

What it does

    • Receives MCP protocol messages from the Stub.
    • Maps them to your Python/JS functions.
    • Calls your @mcp.tool() or @mcp.resource() functions.
    • Returns results back to the Stub.

Where it lives

    • Inside each MCP Server process.

Key point

If you have 5 MCP Servers → you have 5 Skeletons (one per server).

4. MCP Host

What it is

    • The orchestrator and “brain” of the system.
    • Lives in the main application (Copilot, IDE, agent runtime, backend).

What it does

    • Reads configuration (mcp.json).
    • Starts MCP Servers.
    • Creates MCP Client Stub instances.
    • Routes requests to the correct server.
    • Manages connections, retries, auth, and capabilities.

Who it talks to

    • Requestor (UI/app)
    • MCP Client Stub (inside itself)
    • MCP Servers (via transport)

Key point

The Host—not the Stub—decides which server handles a request.

5. MCP Server

What it is

    • A standalone process that exposes tools/resources.
    • It runs outside the MCP Host.
    • Example: Your accounts server is built with FastMCP.

What it contains

    • The Skeleton
    • Your @mcp.tool() functions
    • Your @mcp.resource() functions
    • Business logic (e.g., Account.get())

What it does

    • Executes tools
    • Reads resources
    • Accesses DB, APIs, files
    • Returns results to the Host via the Stub

Key point

Each MCP Server is independent. Each has its own Skeleton. Each has its own transport connection.

Discovery of MCP (Model Context Protocol) components:

How these components are arranged and talk to each other depends on whether the MCP Client is embedded in your app or runs as a separate process/service. The patterns below describe the data flow and where each piece lives.

MCP deployment pattern

  1. Embedded MCP Client (in the same app)
  2. Separate MCP Client (decoupled from the app)
1) Embedded MCP Client (in the same app)

  • The Requestor app (web/desktop/IDE plugin) includes the MCP Client library directly.
  • The MCP Host runs separately; the client communicates with it using JSON‑RPC over WebSocket or HTTP.
  • Where the client lives: Inside your application process (same memory space).
  • Transport: Direct function calls from app to stub (or, in some SDKs, stdio pipes if the host is a child process).
  • Flow:
    • The Requestor app (web app, desktop app, IDE plugin, agent runtime) initiates requests by calling the MCP Client Stub directly through function calls.
    • The MCP Client Stub lives inside the app. It converts the app’s function calls into JSON‑RPC messages and sends them to the MCP Host over WebSocket or HTTP.
    • The Stub is the actual MCP Client: it is not a separate service, it is not inside the Host, and it is part of your app’s codebase.
    • The MCP Host lives in a separate process or service. It receives JSON‑RPC from the MCP Client, runs the model, orchestrates tool calls, decides which MCP Server to call, sends MCP protocol messages to the servers, and sends results back to the Client Stub.
    • The MCP Server is a separate process. It exposes tools and resources, receives MCP protocol messages from the Host, and returns structured results.
    • MCP Skeletons live inside each MCP Server and dispatch incoming MCP calls to the correct tool function.
    • The MCP Tool lives inside the MCP Server; when the Skeleton calls it, the tool executes and returns results.

        This is the simplest pattern: the app-to-client hop involves no network, giving low latency and easy debugging.

2) Client as a Separate Service - Separate MCP Client (decoupled from the app)

  • The Requestor app calls the MCP Client over HTTP/WebSocket.
  • The MCP Client then talks to the MCP Host.
  • Pros: Reusable client across multiple requestors; separation of concerns.
  • Cons: More components to operate.
        There are variants:
2A) The MCP Client is separate (not inside the host).
    • Where the client lives: A separate service/process that implements the MCP Client role.  
    • Your App → MCP Client over HTTP/WebSocket → MCP Host.
    • Transport:
      • Your app calls the MCP Client using HTTP/WebSocket (like an API).
      • The MCP Client talks to the MCP Host using the MCP transport (WebSocket/HTTP/stdio), per your deployment.
    • Flow:
      • The Requestor/app sends an HTTP or WebSocket request to the MCP Client Stub.
      • The MCP Client Stub runs as a separate service in the Client Layer and converts the request into MCP protocol messages.
      • The MCP Host receives MCP messages from the Client Stub; it lives in the Host Layer, runs the model, orchestrates the tool call, and calls the correct MCP Server.
      • The MCP Server lives in the Server Layer as an independent process that exposes tools and accesses the backend.
      • The MCP Skeleton inside the MCP Server dispatches MCP calls to the actual tool functions and executes the tool.
      • The Backend lives in the Backend Layer: real systems like databases, APIs, and file systems.
      • The response flows back up the chain to the Requestor.


2B) Your App embeds MCP Client → talks directly (HTTP/WebSocket) to MCP Host

    • Where the client lives: Inside your app (as a library/SDK), but the host is a separate process/service.
    • Transport: The embedded client uses WebSocket/HTTP to reach the host.
    • Flow: 
      • Requestor/app → MCP Client (in-process) → MCP Host (remote/local service) → tools/resources → response
      • The Requestor lives in the Request Layer. It calls the MCP Client Stub using in‑process function calls and does NOT speak MCP directly.
      • The MCP Client Stub lives in the Client Layer, embedded inside the requestor app as a library. It converts function calls into MCP protocol messages and sends them to the MCP Host.
      • The MCP Host receives MCP messages from the Client Stub, runs the model, orchestrates tool calls, decides which MCP Server to call, and routes the request to it.
      • The MCP Server lives in the Server Layer as an independent process. It exposes tools and resources and accesses the backend.
      • The MCP Skeleton lives inside each MCP Server; it dispatches MCP calls to the actual tool functions, handles execution, and returns results.
      • The Backend lives in the Backend Layer: real systems like databases, APIs, file systems, and external services.
      • The response flows back up the chain to the Requestor.

            This avoids a separate “client service” tier, but still keeps the host decoupled.

2C) Both client and host are embedded (host-spawned)

    • Transport:
      • Your app calls the MCP Client using stdio/HTTP/WebSocket (like an API).
      • The MCP Client talks to the MCP Host using the MCP transport (WebSocket/HTTP/stdio), per your deployment.
    • Flow:
      • Your app calls the MCP Client Stub using in‑process function calls. No network is needed.
      • The MCP Client Stub lives in the Client Layer, embedded inside your app. It sends MCP messages to the MCP Host (in‑process or over stdio).
      • The MCP Host is also embedded inside your app OR spawned as a child process. It runs the model, orchestrates tool calls, and routes directly to the MCP Server.
      • The MCP Server is a separate process or is spawned by the Host. It exposes tools and resources and accesses the backend.
      • The MCP Skeleton lives inside each MCP Server; it dispatches MCP calls to the actual tool functions and executes the tool.
      • The Backend lives in the Backend Layer: real systems like databases, APIs, file systems, and external services.
      • The response flows back up the chain to the Requestor.

3) Host-Managed Clients

  • Some platforms offer a thin client inside host-side SDKs; the client remains logically separate.
  • Transport:
    • Your app calls the MCP Client using stdio/HTTP/WebSocket (like an API).
    • The MCP Client talks to the MCP Host using the MCP transport (WebSocket/HTTP/stdio), per your deployment.

  • Flow:
    • The Requestor (in your app) sends a request directly to the MCP Host.
    • The Host (embedded) decides a tool is needed and uses its internal MCP Client Stub.
    • The MCP Client Stub lives inside the MCP Host process but is logically separate. The Host uses it to connect to external MCP Servers.
    • The MCP Skeleton lives inside each MCP Server and dispatches MCP protocol calls to the actual tool functions.
    • The MCP Server is an independent process that exposes tools and resources. Inside it, the Skeleton dispatches each call to the right tool.
    • The Backend/Tool lives in the Backend Layer: real systems like databases, APIs, file systems, and external services.
    • The Skeleton wraps the result and sends it back to the Host’s Stub.
    • The Stub returns it to the Host, which returns it to the Requestor.

THE UNIFIED TABLE (All Categories)

| Category                          | Client Stub          | Host             | Server           | Skeleton  | Requestor | Transport         |
|-----------------------------------|----------------------|------------------|------------------|-----------|-----------|-------------------|
| 1 Embedded Client                 | Embedded in app      | Separate         | Separate         | In server | In app    | MCP WS/HTTP/stdio |
| 2A Client Service                 | Separate service     | Separate         | Separate         | In server | In app    | HTTP/WS + MCP     |
| 2B Client Embedded, Host Separate | Embedded in app      | Separate         | Separate         | In server | In app    | MCP WS/HTTP/stdio |
| 2C Client + Host Embedded         | Embedded in app      | Embedded/spawned | Separate/spawned | In server | In app    | in‑proc + stdio   |
| 3 Host‑Managed Client             | Embedded inside Host | Embedded         | Separate         | In server | In app    | in‑proc + MCP     |

MCP Implementation and Explanation

  • SDKs:
    • FastMCP - Python SDK for building MCP servers
    • Node MCP - Node.js SDK for building MCP servers
    • Fetch Server - a reference MCP server implemented in Python
    • The SDK loads each server independently
    • The SDK generates one stub per server
    • The SDK manages all stubs in one runtime
  • Creating an MCP Server:

from mcp.server.fastmcp import FastMCP
from accounts import Account

mcp = FastMCP("accounts_server")

@mcp.tool()
async def get_balance(name: str) -> float:
    """Return the cash balance of the account."""
    return Account.get(name).balance

@mcp.tool()
async def get_holdings(name: str) -> dict[str, int]:
    """Return the stock holdings of the account."""
    return Account.get(name).holdings

@mcp.tool()
async def buy_shares(name: str, symbol: str, quantity: int, rationale: str) -> float:
    return Account.get(name).buy_shares(symbol, quantity, rationale)

@mcp.tool()
async def sell_shares(name: str, symbol: str, quantity: int, rationale: str) -> float:
    return Account.get(name).sell_shares(symbol, quantity, rationale)

@mcp.tool()
async def change_strategy(name: str, strategy: str) -> str:
    return Account.get(name).change_strategy(strategy)

@mcp.resource("accounts://accounts_server/{name}")
async def read_account_resource(name: str) -> str:
    account = Account.get(name.lower())
    return account.report()

@mcp.resource("accounts://strategy/{name}")
async def read_strategy_resource(name: str) -> str:
    account = Account.get(name.lower())
    return account.get_strategy()

if __name__ == "__main__":
    mcp.run(transport='stdio')

  • Creating MCP Client (elided function bodies below are filled in following the same session pattern as list_accounts_tools):

import json
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from agents import FunctionTool

params = StdioServerParameters(command="uv", args=["run", "accounts_server.py"], env=None)

async def list_accounts_tools():
    async with stdio_client(params) as streams:
        async with ClientSession(*streams) as session:
            await session.initialize()
            tools_result = await session.list_tools()
            return tools_result.tools

async def call_accounts_tool(tool_name, tool_args):
    async with stdio_client(params) as streams:
        async with ClientSession(*streams) as session:
            await session.initialize()
            result = await session.call_tool(tool_name, tool_args)
            return result

async def read_accounts_resource(name):
    async with stdio_client(params) as streams:
        async with ClientSession(*streams) as session:
            await session.initialize()
            result = await session.read_resource(f"accounts://accounts_server/{name}")
            return result.contents[0].text

async def read_strategy_resource(name):
    async with stdio_client(params) as streams:
        async with ClientSession(*streams) as session:
            await session.initialize()
            result = await session.read_resource(f"accounts://strategy/{name}")
            return result.contents[0].text

async def get_accounts_tools_openai():
    openai_tools = []
    for tool in await list_accounts_tools():
        schema = {**tool.inputSchema, "additionalProperties": False}
        openai_tool = FunctionTool(
            name=tool.name,
            description=tool.description,
            params_json_schema=schema,
            on_invoke_tool=lambda ctx, args, toolname=tool.name: call_accounts_tool(toolname, json.loads(args)))
        openai_tools.append(openai_tool)
    return openai_tools
  • Using MCP Server:

params = {"command": "uv", "args": ["run", "accounts_server.py"]}

async with MCPServerStdio(params=params, client_session_timeout_seconds=30) as server:
    mcp_tools = await server.list_tools()

instructions = "answer questions about the ???."
request = "??????"
model = "gpt-4.1-mini"

async with MCPServerStdio(params=params, client_session_timeout_seconds=30) as mcp_server:
    agent = Agent(name="account_manager", instructions=instructions, model=model, mcp_servers=[mcp_server])
    result = await Runner.run(agent, request)
  • Using MCP Client:

from accounts_client import get_accounts_tools_openai, read_accounts_resource, list_accounts_tools

mcp_tools = await list_accounts_tools()
print(mcp_tools)
openai_tools = await get_accounts_tools_openai()
print(openai_tools)

request = "????????"
with trace("account_mcp_client"):
    agent = Agent(name="account_manager", instructions=instructions, model=model, tools=openai_tools)
    result = await Runner.run(agent, request)
    context = await read_accounts_resource("ed")
    print(context)
  • MCP Client — loads servers and uses stubs
  • The FastMCP client automatically generates stubs
  • Each stub corresponds to one server
  • The SDK loads all stubs
  • Creating the skeleton:

MCP internally creates something like:

class AccountsSkeleton:
    def handle_call_accounts_tool(self, message): ...
    def handle_read_accounts_resource(self, message): ...

           FastMCP generates it internally.

  • MCP Client — what the client stub looks like internally

            FastMCP dynamically generates something like:

class AccountsStub:
    def call_accounts_tool(self, name, args):
        return rpc_call("tools/call", {"name": name, "arguments": args})
    def read_accounts_resource(self, uri):
        return rpc_call("resources/read", {"uri": uri})

Step-by-step detailed information:

Step 1: Defining an MCP server

  • Defining a single MCP server using FastMCP. Just one MCP server is instantiated
    • mcp = FastMCP("accounts_server")
  • This server exposes tools and resources that interact with an Account class to manage financial operations
  • All tools and resources are registered under this single server.
  • Tools Defined: These are callable functions exposed via MCP:

| Tool Name       | Purpose                                  |
|-----------------|------------------------------------------|
| get_balance     | Returns the cash balance of an account   |
| get_holdings    | Returns the stock holdings of an account |
| buy_shares      | Buys shares of a stock with rationale    |
| sell_shares     | Sells shares of a stock with rationale   |
| change_strategy | Updates the investment strategy          |

    • Each tool uses Account.get(name) to retrieve the account object and perform the operation.
  • Resources Defined: These are read-only endpoints exposed via URI:

| Resource URI Pattern              | Returns                     |
|-----------------------------------|-----------------------------|
| accounts://accounts_server/{name} | Full account report         |
| accounts://strategy/{name}        | Current investment strategy |

     At a high level:

  • MCP Server = the thing that provides tools/resources (your accounts_server).
  • MCP Client = the thing that calls those tools/resources programmatically (e.g., an IDE plugin, a script).
  • MCP Host = the environment that loads and manages MCP clients and servers (e.g., “the runtime” that starts MCP servers, manages config, handles connections).
  • Stub (client side) = the generated/provided proxy that makes calling MCP tools feel like calling local functions.
  • Skeleton (server side) = the dispatcher that receives protocol messages and calls your Python functions like get_balance.

Step 2: MCP Components

  •     MCP server
    • A process that implements tools/resources. Usually its own process (Python, Node, etc.).
    • It might run on a local machine, or as a remote service (over TCP, HTTP, etc.), depending on the transport.
    • Example: your accounts_server process is running:
      • mcp = FastMCP("accounts_server")
        ...
        if __name__ == "__main__":
            mcp.run(transport='stdio')
  • MCP client (stub)
    • Code that speaks MCP (JSON-RPC-style messages): it serializes a “call tool X with params Y” request into protocol messages and parses responses.
    • It lets the host “call a tool by name with arguments.”
    • It lives inside the host process.
    • It’s typically a library or module loaded by the host.
    • It holds client stubs/proxies for each tool/resource.
    • It holds connection state (e.g., stdio pipes, sockets).
  • MCP host
    • It’s usually the main application process: Host = main app; Client = protocol driver inside the host; Server = external tool provider process.
    • The orchestrator that loads server definitions (from config, registry, etc.).
    • Starts/stops MCP servers (processes).
    • Manages connections and routing.
    • Ensures the servers start correctly and that clients are wired to the right server.
  • Skeleton (on the server)
    • Lives inside the MCP server process (the same process as your Python code); it’s part of the FastMCP runtime.
    • The server-side dispatcher that:
      • Receives protocol messages from the client, maps them to the right Python/TypeScript function, and handles serialization, errors, and responses.
      • In this code you don’t see a “skeleton” explicitly; FastMCP generates and handles it.
      • When a message “call tool get_balance with args {name: "alice"}” arrives, the skeleton in FastMCP:
        • Finds the registered function get_balance,
        • Calls it with the correct arguments,
        • Wraps the return value into an MCP response.

Step 3: How they communicate

A client wants to call your buy_shares tool.

Step a: The host decides to use a server

  • Host (e.g.,  runtime):
    • Reads config: “There is an MCP server named accounts_server.”
    • Starts it:
      • Might spawn python accounts_server.py
      • Or connect to an already-running instance.
    • Connection is established over the configured transport (stdio in your case).
    • It lives:
      • Host process: main app / IDE/runtime.
      • Server process: Python process running FastMCP.
    • The Host loads a configuration file such as:
      • mcp.json
      • mcp.config.json
      • VS Code extension settings
      •  workspace config
      • A registry of installed MCP servers
    • This config contains entries like: json

      {
        "servers": {
          "accounts_server": {              // What the server is called
            "command": "python",            // How to start the server
            "args": ["accounts_server.py"], // Where it lives
            "transport": "stdio"            // What transport to use
          }
        }
      }

    • The Host uses this config to start the MCP Server
    • The Host then creates the MCP Client stub and connects it to the server

Step b: Client discovers tools/resources 

  • The MCP Client gets informed through the Host
  • MCP client inside the host performs capability discovery:
    • Asks the server: “What tools/resources do you have?”
    • Server responds with:
      • Tools: get_balance, get_holdings, buy_shares, sell_shares, change_strategy
      • Resources: accounts://accounts_server/{name}, accounts://strategy/{name}
    • The client builds stubs (proxies) for each tool/resource:
      • “I now know how to call buy_shares remotely.”
    • It lives:
      • Client stub code: inside host.
      • Capability response: from the server skeleton in your Python process.
    • Once the Host starts the server, the MCP Client:
      • Opens the transport (stdio, socket, etc.)
      • Sends an “initialize” message
      • Receives the server’s capabilities:
        • Tools, resources, and metadata
      • This is where the MCP Client learns:
        • This server has a tool called buy_shares
        • This server exposes a resource accounts://strategy/{name}”
    • The MCP Client is informed by the Host, not by the requester
    • The MCP Client learns capabilities from the server, not from config
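The initialize/discovery handshake described above can be sketched as JSON-RPC message bodies built as Python dicts. These are illustrative shapes only; exact MCP fields vary by protocol version, and the clientInfo value is invented.

```python
# Illustrative capability-discovery exchange between client stub and server.
# Exact MCP message fields vary by version; these shapes are for illustration.

# Step 1: after the transport opens, the client sends "initialize".
initialize = {"jsonrpc": "2.0", "id": 1, "method": "initialize",
              "params": {"clientInfo": {"name": "host-embedded-client"}}}

# Step 2: the client asks which tools the server exposes.
list_tools = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

# Step 3: illustrative server reply listing the accounts_server tools.
tools_reply = {
    "jsonrpc": "2.0", "id": 2,
    "result": {"tools": [
        {"name": "get_balance"}, {"name": "get_holdings"},
        {"name": "buy_shares"}, {"name": "sell_shares"},
        {"name": "change_strategy"},
    ]},
}

# The client now builds one stub per discovered tool.
names = [t["name"] for t in tools_reply["result"]["tools"]]
print(names)
```

After this exchange the client knows, for example, that this server has a `buy_shares` tool, without anything being hard-coded in config.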

Step c: Host/client decides to call a tool

  • The Host routes the request to the correct server
  • An AI agent or UI action decides:
    • Call buy_shares for name="alice", symbol="AAPL", quantity=10, rationale="Long-term growth".
    • Looks at the available servers, checks which server exposes a tool named buy_shares
    • Chooses the correct server (accounts_server)
  • Client stub builds a request message:
      • method: "tools/call"
      • params: { "name": "buy_shares", "arguments": { ... } }
    • This is serialized to the MCP protocol (JSON).
    • It sends this over the connection (stdio / socket) to the server.
    • It lives:
      • Request creation: client stub in host.
      • Wire transmission: transport channel between host and server processes.
    • The Host decides
    • The Client executes
    • The Server performs the action

Step d: The MCP Server skeleton handles the call

  • The skeleton in your FastMCP server: accounts_server
    • mcp = FastMCP("accounts_server")
      ...
      if __name__ == "__main__":
          mcp.run(transport='stdio')

    • Receives the protocol message. Looks up the registered tool handler:
      • @mcp.tool()
        async def buy_shares(...):

            ...
      • @mcp.resource().
    • Deserializes the arguments to Python types.
      • result = await buy_shares(name, symbol, quantity, rationale)
    • The function executes:
      • return Account.get(name).buy_shares(symbol, quantity, rationale)
    • It lives:
      • Skeleton & tool function: inside the server process.
      • Business logic (Account): same server process (Python code).
    • Inside your Python process:
      • The skeleton receives the message
      • It finds the correct function (buy_shares)
      • Calls your Python code
      • Returns the result back to the client stub
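The dispatch logic the skeleton performs can be sketched like this. It is a simplified stand-in for what FastMCP does internally; the `TOOLS` registry, `tool` decorator, and `handle_message` names are invented for illustration.

```python
import asyncio

# Simplified stand-in for FastMCP's server-side dispatcher (skeleton).
# The registry mirrors what @mcp.tool() registration builds up internally.
TOOLS = {}

def tool(fn):
    """Toy replacement for @mcp.tool(): register fn under its name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
async def buy_shares(name, symbol, quantity, rationale):
    # Stands in for Account.get(name).buy_shares(...)
    return f"{name} bought {quantity} {symbol}"

async def handle_message(message):
    """Skeleton: look up the registered function, call it, wrap the result."""
    fn = TOOLS[message["params"]["name"]]                # find the handler
    result = await fn(**message["params"]["arguments"])  # deserialize + call
    return {"jsonrpc": "2.0", "id": message["id"], "result": result}

msg = {"jsonrpc": "2.0", "id": 7, "method": "tools/call",
       "params": {"name": "buy_shares",
                  "arguments": {"name": "alice", "symbol": "AAPL",
                                "quantity": 10, "rationale": "growth"}}}
reply = asyncio.run(handle_message(msg))
print(reply["result"])  # -> alice bought 10 AAPL
```

The real skeleton additionally validates arguments against the tool's schema and converts errors into protocol-level error responses.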

Step e: Response back to the client (The MCP Client returns the result to the Host)

  • Once the function returns a result (say, updated balance as float):
    • Skeleton wraps it in an MCP response message. Serializes it (JSON). 
    • Sends it back over the same transport channel.
  • Client stub:
    • Receives the response and deserializes it to a native type (e.g., float or structured object).
    • Returns it to the caller (agent, UI, or higher-level logic).
  • It lives:
    • Response handling: client stub in host process
    • Response generation: server process/skeleton
  • The Host then:
    • Gives the result to the agent
    • Or displays it to the user
    • Or uses it in a chain of reasoning
Summary:
  • The MCP Client gets informed through the Host, not directly from the requester.
  • The Host learns about servers through configuration.
  • The Client learns about tools/resources from the server during initialization.
  • The Host routes requests to the correct server.



Use Cases:
  • Use Case 1: KYC / AML Automation (Identity Verification & Risk Checks)
    • MCP allows AI agents to securely access customer documents, transaction history, sanctions lists, and risk scoring tools.
    • AI can maintain context across multiple systems and steps — something legacy automation cannot do.
    • MCP enforces policy boundaries so AI can only perform allowed actions.
  • Use Case 2: Payment Reconciliation & Exception Handling
    • MCP lets AI agents connect to payment systems, ledger APIs, transaction logs, and exception queues.
    • AI can maintain context across multiple systems and automatically resolve mismatches.
    • MCP ensures actions follow strict policy boundaries.
  • Some other use cases
| Banking Need           | Why MCP Helps                                                                                        |
|------------------------|------------------------------------------------------------------------------------------------------|
| KYC / AML Automation   | Secure AI access to documents, risk systems, sanctions data; context persistence; policy enforcement |
| Payment Reconciliation | AI agents can take authorized actions across multiple systems with auditability                      |
| General AI Integration | MCP is a universal standard bridging AI and fragmented banking systems                               |
| Real‑time Decisioning  | MCP enables real‑time, context‑aware AI across financial data                                        |

MCP server Integral Components:

Remote Procedure Call (RPC):

Remote Procedure Call (RPC) is a method that enables a program to execute a function on another computer in a network as if it were local. The client sends the request (with arguments) to the server, the server executes the function, and the result is sent back. RPC hides the details of networking.

Step by Step - RPC Works:
  • Client Calls Stub: The client calls a local procedure (stub) as if it were a normal procedure.
  • Marshalling: The stub packs (marshals) all input parameters into a message.
  • Send to Server: The message is sent across the network to the server.
  • Server Stub: The server stub unpacks the message and calls the actual server procedure.
  • Execution & Return: The server runs the procedure and returns the result to the stub.
  • Back to Client: The server stub sends the result back, and the client stub unpacks it.
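The six steps above can be sketched end to end with a toy in-memory "network" (the `add` procedure, `server_stub`, and `add_remote` names are invented for illustration):

```python
import json

# Server side: the actual procedure plus a toy server stub.
def add(a, b):
    return a + b

def server_stub(wire_bytes):
    """Server stub: unmarshal the message, call the real procedure,
    marshal the result back into a reply message."""
    msg = json.loads(wire_bytes.decode())
    result = {"add": add}[msg["proc"]](*msg["args"])
    return json.dumps({"result": result}).encode()

# Client side: a stub that looks like an ordinary local function call.
def add_remote(a, b):
    """Client stub: marshal the args, 'send' them, unmarshal the reply."""
    wire = json.dumps({"proc": "add", "args": [a, b]}).encode()  # marshalling
    reply = server_stub(wire)   # stands in for the network round trip
    return json.loads(reply.decode())["result"]

print(add_remote(2, 3))  # -> 5
```

The caller of `add_remote` never sees the serialization or the transport, which is exactly the transparency RPC aims for.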

 Notes:

  • RPC Runtime: A library that manages the communication in RPC. It handles binding, sending/receiving data, selecting the protocol, and handling errors.

Inter-Process Communication, or IPC:
Inter-Process Communication (IPC) is a mechanism that coordinates communication between different running processes (applications).
It helps processes synchronize their activities, share information, and avoid conflicts while accessing shared resources.

Transport: 
How messages flow—function call (in-process), stdio (same machine), WebSocket / HTTP (network).
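A stdio transport can be sketched with a child process exchanging one JSON line over its stdin/stdout. This is a minimal sketch using only the standard library; the one-shot echo "server" is invented for illustration.

```python
import json
import subprocess
import sys

# Minimal stdio-transport sketch: spawn a child "server" that reads one
# JSON-RPC-style line from stdin and echoes a result line to stdout.
SERVER_CODE = (
    "import sys, json\n"
    "req = json.loads(sys.stdin.readline())\n"
    "print(json.dumps({'id': req['id'], 'result': 'pong'}))\n"
)

proc = subprocess.Popen([sys.executable, "-c", SERVER_CODE],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                        text=True)
# Write the request line, close stdin, and collect the child's stdout.
out, _ = proc.communicate(json.dumps({"id": 1, "method": "ping"}) + "\n")
reply = json.loads(out)
print(reply)  # -> {'id': 1, 'result': 'pong'}
```

This is the same shape an MCP Host uses when it spawns a server with `transport='stdio'`: messages are newline-delimited JSON over the child's pipes, with no sockets involved.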