
Build an MCP GitHub Agent in Less Than 50 Lines of Code

A fully functional AI agent using MCP, with step-by-step instructions (100% open source)

There's been quite a debate about whether the Model Context Protocol (MCP) really brings anything new to the table compared to traditional APIs. While some say it just adds another layer of complexity, others point to its standardization benefits. In this tutorial, we'll show you how MCP can actually simplify things when building AI agents that talk to external services like GitHub.

Today, we'll build a GitHub agent that uses MCP to let you query repositories with natural language. You'll track issues, analyze PRs, and check repo activity—all without leaving your chat interface or wrestling with complex API code.

MCP, for those living under a rock, is an open protocol that standardizes how applications provide context to LLMs. Think of MCP as a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to peripherals, MCP provides a standardized way to connect AI models to different data sources and tools.
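Under the hood, MCP is built on JSON-RPC 2.0: every tool invocation is the same standardized message shape, no matter which service sits behind the server. As a simplified sketch (the tool name and arguments below are hypothetical examples, not taken from the GitHub server), a `tools/call` request looks roughly like this:

```python
import json

# A simplified sketch of an MCP "tools/call" request (JSON-RPC 2.0).
# The tool name and arguments are hypothetical illustrations.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_issues",             # a tool exposed by the server
        "arguments": {"query": "label:bug"}, # tool-specific arguments
    },
}

# The same envelope works for any MCP server -- GitHub, a database, a
# filesystem -- which is the standardization benefit discussed above.
wire_message = json.dumps(request)
print(wire_message)
```

Because every server speaks this envelope, a client that can call one MCP server can call them all; only the tool names and arguments differ.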

We'll use Agno, a lightweight framework for building multi-modal AI agents with a focus on simplicity, performance, and flexibility. With Agno's MCP integration, you can create agents that connect to any MCP-compatible service with minimal code. OpenAI's GPT-4o is used as the LLM.

Don’t forget to share this tutorial on your social channels and tag Unwind AI (X, LinkedIn, Threads, Facebook) to support us!

What We’re Building

This Streamlit application allows you to explore and analyze GitHub repositories through natural language queries using the Model Context Protocol (MCP).

Features:

  • Natural Language Interface: Ask questions about repositories in plain English

  • Comprehensive Analysis: Explore issues, pull requests, repository activity, and code statistics

  • Interactive UI: User-friendly interface with example queries and custom input

  • MCP Integration: Leverages the Model Context Protocol to interact with GitHub's API

  • Real-time Results: Get immediate insights on repository activity and health

Prerequisites

Before we begin, make sure you have the following:

  1. Python installed on your machine (version 3.10 or higher is recommended)

  2. Node.js and npm installed (for running the MCP GitHub server)

    • This is a critical requirement! The app uses npx to run the MCP GitHub server

    • Download and install from nodejs.org

  3. GitHub Personal Access Token with appropriate permissions

  4. Your OpenAI API key

  5. A code editor of your choice (we recommend VS Code or PyCharm for their excellent Python support)

  6. Basic familiarity with Python programming

Step-by-Step Instructions

Setting Up the Environment

First, let's get our development environment ready:

  1. Clone the GitHub repository:

git clone https://github.com/Shubhamsaboo/awesome-llm-apps.git
  2. Go to the github_mcp_agent folder and install the dependencies:

cd mcp_ai_agents/github_mcp_agent
pip install -r requirements.txt
  3. Verify Node.js and npm are installed:

node --version
npm --version
npx --version

All of these commands should return version numbers. If they don't, please install Node.js.
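If you'd rather check these prerequisites from Python before launching the app, here is a small sketch using only the standard library (this helper is not part of the tutorial code, just a convenience):

```python
import shutil

def check_prerequisites():
    """Return a dict mapping each required command to whether it's on PATH."""
    required = ["node", "npm", "npx"]
    return {cmd: shutil.which(cmd) is not None for cmd in required}

status = check_prerequisites()
missing = [cmd for cmd, found in status.items() if not found]
if missing:
    print(f"Missing commands: {', '.join(missing)} -- install Node.js from nodejs.org")
else:
    print("All Node.js prerequisites found")
```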

  4. Set up your API keys:

    • Set OpenAI API Key as an environment variable:

export OPENAI_API_KEY=your-openai-api-key

Your GitHub token will be entered directly in the app interface.

  5. Create a GitHub Personal Access Token: go to github.com/settings/tokens, generate a new token with the repo scope, and keep it handy — you'll paste it into the app's sidebar.

Creating the Streamlit App

Let’s create our app. Create a new file github_agent.py and add the following code:

  1. Let's import our libraries:

import asyncio
import os
import streamlit as st
from textwrap import dedent
from agno.agent import Agent
from agno.tools.mcp import MCPTools
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
  2. Set up the Streamlit page configuration:

# Page config
st.set_page_config(page_title="🐙 GitHub MCP Agent", page_icon="🐙", layout="wide")

# Title and description
st.markdown("<h1 class='main-header'>🐙 GitHub MCP Agent</h1>", unsafe_allow_html=True)
st.markdown("Explore GitHub repositories with natural language using the Model Context Protocol")
  3. Create the sidebar for API key input and example queries:

# Setup sidebar for API key
with st.sidebar:
    st.header("🔑 Authentication")
    github_token = st.text_input("GitHub Token", type="password",
                                help="Create a token with repo scope at github.com/settings/tokens")
    if github_token:
        os.environ["GITHUB_TOKEN"] = github_token
    
    st.markdown("---")
    st.markdown("### Example Queries")
    st.markdown("**Issues**")
    st.markdown("- Show me issues by label")
    st.markdown("- What issues are being actively discussed?")
    st.markdown("**Pull Requests**")
    st.markdown("- What PRs need review?")
    st.markdown("- Show me recent merged PRs")
    st.markdown("**Repository**")
    st.markdown("- Show repository health metrics")
    st.markdown("- Show repository activity patterns")
    st.markdown("---")
    st.caption("Note: Always specify the repository in your query if not already selected in the main input.")
  4. Build the main query interface:

# Query input
col1, col2 = st.columns([3, 1])
with col1:
    repo = st.text_input("Repository", value="Shubhamsaboo/awesome-llm-apps", help="Format: owner/repo")
with col2:
    query_type = st.selectbox("Query Type", [
        "Issues", "Pull Requests", "Repository Activity", "Custom"
    ])

# Create predefined queries based on type
if query_type == "Issues":
    query_template = f"Find issues labeled as bugs in {repo}"
elif query_type == "Pull Requests":
    query_template = f"Show me recent merged PRs in {repo}"
elif query_type == "Repository Activity":
    query_template = f"Analyze code quality trends in {repo}"
else:
    query_template = ""

query = st.text_area("Your Query", value=query_template,
                    placeholder="What would you like to know about this repository?")
  5. Implement the GitHub agent function using MCP:

# Main function to run agent
async def run_github_agent(message):
    if not os.getenv("GITHUB_TOKEN"):
        return "Error: GitHub token not provided"
    
    try:
        server_params = StdioServerParameters(
            command="npx",
            args=["-y", "@modelcontextprotocol/server-github"],
        )
        
        # Create client session
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                # Initialize MCP toolkit
                mcp_tools = MCPTools(session=session)
                await mcp_tools.initialize()
                
                # Create agent
                agent = Agent(
                    tools=[mcp_tools],
                    instructions=dedent("""\
                    You are a GitHub assistant. Help users explore repositories and their activity.
                    - Provide organized, concise insights about the repository
                    - Focus on facts and data from the GitHub API
                    - Use markdown formatting for better readability
                    - Present numerical data in tables when appropriate
                    - Include links to relevant GitHub pages when helpful
                    """),
                    markdown=True,
                    show_tool_calls=True,
                )
                
                # Run agent
                response = await agent.arun(message)
                return response.content
                
    except Exception as e:
        return f"Error: {str(e)}"
  6. Create the run button and display results:

# Run button
if st.button("🚀 Run Query", type="primary", use_container_width=True):
    if not github_token:
        st.error("Please enter your GitHub token in the sidebar")
    elif not query:
        st.error("Please enter a query")
    else:
        with st.spinner("Analyzing GitHub repository..."):
            # Ensure the repository is explicitly mentioned in the query
            if repo and repo not in query:
                full_query = f"{query} in {repo}"
            else:
                full_query = query
            
            result = asyncio.run(run_github_agent(full_query))
            
            # Display results in a nice container
            st.markdown("### Results")
            st.markdown(result)
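Streamlit re-runs the script synchronously on each interaction, which is why the async agent function is driven with asyncio.run rather than awaited. A minimal standalone illustration of the same bridging pattern (fake_agent is a stand-in, not the tutorial's run_github_agent):

```python
import asyncio

async def fake_agent(message: str) -> str:
    # Stand-in for run_github_agent: any awaitable work goes here.
    await asyncio.sleep(0)
    return f"Processed: {message}"

# Synchronous code (like a Streamlit button handler) drives the coroutine:
result = asyncio.run(fake_agent("Show me recent merged PRs"))
print(result)  # Processed: Show me recent merged PRs
```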
  7. Add help text for first-time users:

# Display help text for first-time users
if 'result' not in locals():
    st.markdown(
        """<div class='info-box'>
        <h4>How to use this app:</h4>
        <ol>
        <li>Enter your GitHub token in the sidebar</li>
        <li>Specify a repository (e.g., Shubhamsaboo/awesome-llm-apps)</li>
        <li>Select a query type or write your own</li>
        <li>Click 'Run Query' to see results</li>
        </ol>
        <p><strong>Important Notes:</strong></p>
        <ul>
        <li>The Model Context Protocol (MCP) provides real-time access to GitHub repositories</li>
        <li>Queries work best when they focus on specific aspects like issues, PRs, or repository info</li>
        <li>More specific queries yield better results</li>
        <li>This app requires Node.js to be installed (for the npx command)</li>
        </ul>
        </div>""",
        unsafe_allow_html=True
    )

# Footer
st.markdown("---")
st.write("Built with Streamlit, Agno, and Model Context Protocol ❤️")

Running the App

With our code in place, it's time to launch the app.

  • In your terminal, navigate to the project folder and run the following command:

streamlit run github_agent.py
  • Streamlit will provide a local URL (typically http://localhost:8501). Open this in your web browser.

  • Enter your GitHub token in the sidebar.

  • Specify a repository to analyze (default is Shubhamsaboo/awesome-llm-apps).

  • Select a query type or write your own.

  • Click "Run Query" to see the results.

How The App Works

When you run this app, the following workflow happens behind the scenes:

  1. User Input: The user provides a GitHub token and a natural language query about a repository.

  2. MCP Server: The app launches an MCP GitHub server using npx, which acts as a standardized interface to GitHub's API.

  3. Connection: The Agno agent connects to this MCP server, establishing a bidirectional communication channel.

  4. Query Processing: The agent processes the natural language query and translates it into appropriate MCP tool calls.

  5. Data Retrieval: The MCP server fetches the requested data from GitHub's API and returns it in a standardized format.

  6. Response Generation: The agent processes this information and generates a human-readable response, formatted with markdown for better readability.

This design abstracts away the complexity of direct GitHub API integration, allowing you to focus on the user experience rather than API specifics.
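For contrast, here's a rough sketch of what talking to GitHub's REST API directly involves — constructing authenticated requests, choosing the right endpoint, then paginating and parsing JSON yourself. The request is built but deliberately not sent; the endpoint and headers follow GitHub's documented REST API conventions:

```python
import os
import urllib.request

def build_issues_request(repo: str) -> urllib.request.Request:
    """Build (but don't send) an authenticated request for a repo's open issues."""
    url = f"https://api.github.com/repos/{repo}/issues?state=open"
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {os.environ.get('GITHUB_TOKEN', '')}",
            "Accept": "application/vnd.github+json",
        },
    )

req = build_issues_request("Shubhamsaboo/awesome-llm-apps")
print(req.full_url)
# Sending this, handling pagination, and mapping the JSON response back to the
# user's question is exactly the plumbing the MCP server handles for the agent.
```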

Working Application Demo

Conclusion

You've just built a GitHub Agent using MCP that can answer natural language queries about repositories, all in less than 50 lines of code. This shows how MCP can simplify API integrations and create a smooth interface between LLMs and external services.

Here are some ways you could enhance this project:

  1. Expand the Analysis: Add support for code analysis, commit history visualization, or contributor statistics.

  2. Add Visualizations: Integrate charts and graphs to show repository activity trends.

  3. Compare Repositories: Enable comparing metrics across multiple repositories.

  4. Schedule Reports: Set up automatic report generation for repository health and activity.

Keep experimenting with different agent configurations and features to build more sophisticated AI applications.

We share hands-on tutorials like this 2-3 times a week to help you stay ahead in the world of AI. If you're serious about leveling up your AI skills, subscribe now and be the first to access our latest tutorials.

