A Step-by-Step Coding Guide to Defining Custom Model Context Protocol (MCP) Server and Client Tools with FastMCP and Integrating Them into Google Gemini 2.0’s Function‑Calling Workflow

capernaum
Last updated: 2025-04-21 19:03

In this Colab‑ready tutorial, we demonstrate how to integrate Google’s Gemini 2.0 generative AI with an in‑process Model Context Protocol (MCP) server, using FastMCP. Starting with an interactive getpass prompt to capture your GEMINI_API_KEY securely, we install and configure all necessary dependencies: the google‑genai Python client for calling the Gemini API, fastmcp for defining and hosting our MCP tools in‑process, httpx for making HTTP requests to the Open‑Meteo weather API, and nest_asyncio to patch Colab’s already‑running asyncio event loop. The workflow proceeds by spinning up a minimal FastMCP “weather” server with two tools, get_weather(latitude, longitude) for a three‑day forecast and get_alerts(state) for state‑level weather alerts, then creating a FastMCPTransport to connect an MCP client to that server. Finally, using the Gemini function‑calling feature, we send a natural‑language prompt to Gemini, have it emit a function call based on our explicit JSON schemas, and then execute that call via the MCP client, returning structured weather data into our notebook.

from getpass import getpass
import os


api_key = getpass("Enter your GEMINI_API_KEY: ")
os.environ["GEMINI_API_KEY"] = api_key

We securely prompt you to enter your Gemini API key (without displaying it on the screen) and then store it in the GEMINI_API_KEY environment variable, allowing the rest of your notebook to authenticate with Google’s API.

!pip install -q google-genai mcp fastmcp httpx nest_asyncio

We install all the core dependencies needed for our Colab notebook in one go—google‑genai for interacting with the Gemini API, mcp and fastmcp for building and hosting our Model Context Protocol server and client, httpx for making HTTP requests to external APIs, and nest_asyncio to patch the event loop so our async code runs smoothly.

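Next comes the event-loop patch; a minimal sketch of this cell, assuming the standard import-and-apply nest_asyncio idiom:

import nest_asyncio

# Patch Colab's already-running asyncio event loop so that
# run_until_complete() can be called later in the notebook.
nest_asyncio.apply()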

We apply the nest_asyncio patch to the notebook’s existing event loop, allowing us to run asyncio coroutines (like our MCP client interactions) without encountering “event loop already running” errors.

from fastmcp import FastMCP
import httpx


mcp_server = FastMCP("weather")


@mcp_server.tool()
def get_weather(latitude: float, longitude: float) -> str:
    """3‑day min/max temperature forecast via Open‑Meteo."""
    # Ask Open-Meteo for exactly 3 days of daily min/max temperatures
    # (without forecast_days the API defaults to a 7-day forecast).
    url = (
        "https://api.open-meteo.com/v1/forecast"
        f"?latitude={latitude}&longitude={longitude}"
        "&daily=temperature_2m_min,temperature_2m_max"
        "&forecast_days=3&timezone=UTC"
    )
    resp = httpx.get(url, timeout=10)
    daily = resp.json()["daily"]
    return "n".join(
        f"{date}: low {mn}°C, high {mx}°C"
        for date, mn, mx in zip(
            daily["time"],
            daily["temperature_2m_min"],
            daily["temperature_2m_max"],
        )
    )


@mcp_server.tool()
def get_alerts(state: str) -> str:
    """Dummy US‑state alerts."""
    return f"No active weather alerts for {state.upper()}."

We create an in‑process FastMCP server named “weather” and register two tools: get_weather(latitude, longitude), which fetches and formats a 3‑day temperature forecast from the Open‑Meteo API using httpx, and get_alerts(state), which returns a placeholder message for U.S. state weather alerts.
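
The get_alerts tool is deliberately a stub. As a sketch of how it could return real data, the public National Weather Service alerts endpoint (assumed here: https://api.weather.gov/alerts/active?area=<STATE>) can be queried and its alert headlines returned; the tool name get_alerts_nws and the User-Agent string below are illustrative:

@mcp_server.tool()
def get_alerts_nws(state: str) -> str:
    """Active U.S. state alerts from the National Weather Service (sketch)."""
    url = f"https://api.weather.gov/alerts/active?area={state.upper()}"
    # The NWS API asks callers to identify themselves via a User-Agent header.
    resp = httpx.get(url, headers={"User-Agent": "mcp-weather-demo"}, timeout=10)
    features = resp.json().get("features", [])
    if not features:
        return f"No active weather alerts for {state.upper()}."
    return "\n".join(
        f["properties"].get("headline") or f["properties"]["event"]
        for f in features
    )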

import asyncio
from google import genai
from google.genai import types
from fastmcp import Client as MCPClient
from fastmcp.client.transports import FastMCPTransport

We import the core libraries for our MCP‑Gemini integration: asyncio to run asynchronous code, google‑genai and its types module for calling Gemini and defining function‑calling schemas, and FastMCP’s Client (aliased as MCPClient) with its FastMCPTransport to connect our in‑process weather server to the MCP client.

client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
MODEL = "gemini-2.0-flash"
transport = FastMCPTransport(mcp_server)

We initialize the Google Gemini client using the GEMINI_API_KEY from your environment, specify the gemini-2.0-flash model for function‑calling, and set up a FastMCPTransport that connects the in‑process mcp_server to the MCP client.
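
Before involving Gemini, it can be useful to confirm that the client actually sees both tools over this transport. A minimal sketch using fastmcp's list_tools and call_tool client methods (the helper name check_mcp_server is ours):

async def check_mcp_server():
    # Open a client session over the in-process transport and list the tools.
    async with MCPClient(transport) as mcp_client:
        tools = await mcp_client.list_tools()
        print("Registered tools:", [t.name for t in tools])

        # Call a tool directly, bypassing Gemini, as a quick smoke test.
        alerts = await mcp_client.call_tool("get_alerts", {"state": "ca"})
        print("Direct get_alerts result:", alerts)


asyncio.get_event_loop().run_until_complete(check_mcp_server())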

function_declarations = [
    {
        "name": "get_weather",
        "description": "Return a 3‑day min/max temperature forecast for given coordinates.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {
                    "type": "number",
                    "description": "Latitude of target location."
                },
                "longitude": {
                    "type": "number",
                    "description": "Longitude of target location."
                }
            },
            "required": ["latitude", "longitude"]
        }
    },
    {
        "name": "get_alerts",
        "description": "Return any active weather alerts for a given U.S. state.",
        "parameters": {
            "type": "object",
            "properties": {
                "state": {
                    "type": "string",
                    "description": "Two‑letter U.S. state code, e.g. 'CA'."
                }
            },
            "required": ["state"]
        }
    }
]


tool_defs = types.Tool(function_declarations=function_declarations)

We manually define the JSON schema specifications for our two MCP tools: get_weather, which accepts latitude and longitude as numbers, and get_alerts, which accepts a U.S. state code as a string, each with a name, description, data types, and required properties. We then wrap these declarations in a types.Tool object (tool_defs), which tells Gemini how to generate and validate the corresponding function calls.

async def run_gemini(lat: float, lon: float):
    async with MCPClient(transport) as mcp_client:
        prompt = f"Give me a 3‑day weather forecast for latitude={lat}, longitude={lon}."
        response = client.models.generate_content(
            model=MODEL,
            contents=[prompt],
            config=types.GenerateContentConfig(
                temperature=0,
                tools=[tool_defs]
            )
        )


        call = response.candidates[0].content.parts[0].function_call
        if not call:
            print("No function call; GPT said:", response.text)
            return


        print("🔧 Gemini wants:", call.name, call.args)


        result = await mcp_client.call_tool(call.name, call.args)
        print("n📋 Tool result:n", result)


asyncio.get_event_loop().run_until_complete(run_gemini(37.7749, -122.4194))

Finally, the async function run_gemini opens an MCP client session over our in‑process transport and sends a natural‑language prompt to Gemini asking for a 3‑day forecast at the given coordinates. It captures the resulting function call (if any), invokes the corresponding MCP tool via the client, and prints the structured weather data; the whole flow is kicked off in the notebook's event loop with run_until_complete.
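
The same pattern covers the second tool: a prompt about alerts should make Gemini emit a get_alerts call instead of get_weather. A minimal sketch (the helper name run_gemini_alerts is ours; it reuses the same client, tool_defs, and transport):

async def run_gemini_alerts(state: str):
    async with MCPClient(transport) as mcp_client:
        prompt = f"Are there any active weather alerts for the US state {state}?"
        response = client.models.generate_content(
            model=MODEL,
            contents=[prompt],
            config=types.GenerateContentConfig(temperature=0, tools=[tool_defs]),
        )

        call = response.candidates[0].content.parts[0].function_call
        if not call:
            print("No function call; Gemini said:", response.text)
            return

        # Route the emitted call through the MCP client, exactly as run_gemini does.
        result = await mcp_client.call_tool(call.name, call.args)
        print(f"\n📋 Alerts for {state}:\n", result)


asyncio.get_event_loop().run_until_complete(run_gemini_alerts("CA"))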

In conclusion, we have a fully contained pipeline that showcases how to define custom MCP tools in Python, expose them via FastMCP, and seamlessly integrate them with Google’s Gemini 2.0 model using the google‑genai client. The key pieces (FastMCP for MCP hosting, FastMCPTransport and MCPClient for transport and invocation, httpx for external API access, and nest_asyncio for Colab compatibility) work together to enable real‑time function calling without external processes or stdio pipes. This pattern simplifies local development and testing of MCP integrations in Colab and provides a template for building more advanced agentic applications that combine LLM reasoning with specialized domain tools.

