MCP Tools Registry for AI Agents
PLUS: Mistral Small 3.1 beats Gemma 3, Home for building and running AI agents
Today’s top AI Highlights:
App Store for AI tools that work with Model Context Protocol (MCP)
End-to-end platform to explore, develop, test, and deploy AI agents
Mistral Small 3.1 outperforms last week’s SOTA Gemma 3
Claude will get full access to read and edit your local files
Serve 1000s of fine-tuned LLMs on a single GPU
& so much more!
Read time: 3 mins
AI Tutorials
In this tutorial, we'll show you how to create your own powerful Deep Research Agent that performs in minutes what might take human researchers hours or even days—all without the hefty subscription fees. Using OpenAI's Agents SDK and Firecrawl, you'll build a multi-agent system that searches the web, extracts content, and synthesizes comprehensive reports through a clean Streamlit interface.
OpenAI's Agents SDK is a lightweight framework for building AI applications with specialized agents that work together. It provides primitives like agents, handoffs, and guardrails that make it easy to coordinate tasks between multiple AI assistants.
Firecrawl’s new deep-research endpoint enables our agent to autonomously explore the web, gather relevant information, and synthesize findings into comprehensive insights.
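The multi-agent handoff pattern the tutorial builds can be sketched in plain Python. This is an illustrative skeleton only, not the Agents SDK or Firecrawl API; every function and name below is hypothetical:

```python
# Illustrative multi-agent pipeline: a coordinator hands off to
# specialist "agents" (here just functions) for search, extraction,
# and synthesis. Names are hypothetical, not the Agents SDK API.

def search_agent(query: str) -> list[str]:
    # In the real app this step would call Firecrawl's deep-research endpoint.
    return [f"https://example.com/result-for-{query.replace(' ', '-')}"]

def extraction_agent(urls: list[str]) -> list[str]:
    # In the real app this step would fetch and clean page content.
    return [f"content from {u}" for u in urls]

def synthesis_agent(chunks: list[str]) -> str:
    # In the real app an LLM would write the report from the chunks.
    return "REPORT:\n" + "\n".join(f"- {c}" for c in chunks)

def deep_research(query: str) -> str:
    # The coordinator runs the handoff chain end to end.
    urls = search_agent(query)
    chunks = extraction_agent(urls)
    return synthesis_agent(chunks)

print(deep_research("open source llms"))
```

The real system replaces each function with an LLM-backed agent, but the control flow — search, extract, synthesize — is the same.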
We share hands-on tutorials like this 2-3 times a week, designed to help you stay ahead in the world of AI. If you're serious about leveling up your AI skills and staying ahead of the curve, subscribe now and be the first to access our latest tutorials.
Latest Developments

The Model Context Protocol (MCP) now has a central hub: mcp.run, a registry for AI tools designed to be easily integrated into any AI application. This platform allows you to share and discover tools, called servlets, that extend the capabilities of AI models and agents.
mcp.run emphasizes security and portability, with servlets running in isolated WebAssembly environments. With features for team collaboration and built-in integrations, mcp.run helps you easily connect your AI to real-world data and services.
Key Highlights:
Open Tool Registry - mcp.run provides a registry of pre-built AI tools ("servlets") and supports the development of new ones in 5+ languages (TypeScript, Go, Python, Rust, C#, C++, Zig). This allows for collaborative development and sharing of tools within the AI community, using a portable, WebAssembly-based format.
Secure and Isolated Tool Execution - Servlets run within a secure WebAssembly sandbox, providing strong isolation and resource control. You can specify granular permissions for network and filesystem access, ensuring that tools operate within defined boundaries. This offers an enterprise-grade security and isolation approach.
Integration via MCP and mcpx - All tools on mcp.run implement the MCP, ensuring model-agnostic access. The mcpx technology simplifies integration by presenting multiple servlets as a single MCP server, streamlining the connection process for client applications.
Tasks for Event-Driven AI Automation - Define prompts that use your installed AI tools and trigger on events. Triggers can be fired through HTTP requests or run on a schedule, letting you integrate LLMs into agentic apps and custom workflows: handle a webhook to automate your work, or periodically run a smart assistant.
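Under the hood, MCP tool invocations are JSON-RPC 2.0 messages. Here is a minimal sketch of the request a client sends to call a tool; the `tools/call` method comes from the MCP spec, while the tool name and arguments are made up for illustration:

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    # MCP is layered on JSON-RPC 2.0; a tool invocation uses the
    # "tools/call" method with the tool's name and its arguments.
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

# Hypothetical servlet tool: fetch current weather for a city.
print(make_tool_call(1, "get_weather", {"city": "Berlin"}))
```

Because every servlet speaks this same wire format, mcpx can aggregate many of them behind one server without the client noticing.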

Agentverse by Fetch.ai is an end-to-end platform for the entire lifecycle of AI agents. This new SaaS offering lets you build, deploy, and manage autonomous agents with minimal setup. Its cloud-based IDE provides everything needed to create intelligent agents that can communicate with each other and perform complex tasks.
Agentverse offers secure agent hosting with near 100% uptime, flexible scaling based on message volume, and seamless integration with the AI Engine that can understand natural language requests and connect users with appropriate agent functions. You can even register your agents' functions for use by Fetch.ai's AI Engine, opening possibilities for revenue generation.
Key Highlights:
Streamlined Development - Agentverse provides a cloud-based IDE, initially focused on Python, for building and deploying agents. To get you started quickly, it also includes a library of pre-built agent templates for various use cases.
Continuous Agent Hosting - Once deployed, your agents keep running with a targeted 100% uptime regardless of whether your browser is open. The platform handles scaling based on message volume automatically.
Multi-Agent Communication - Agents can talk to each other through various methods including direct messaging, REST APIs, and natural language via the DeltaV interface.
AI Engine - The built-in AI Engine converts natural language requests into specific tasks, connecting users to the most appropriate agent functions to complete their objectives.
Agent Marketplace - Register your agent's functions in the marketplace to make them discoverable. When others use your functions through the AI Engine, you can earn compensation for those interactions.
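The agent-to-agent messaging pattern behind platforms like this can be sketched conceptually in plain asyncio. To be clear, this is not the Fetch.ai/Agentverse API — just the request/reply pattern two agents follow, with made-up agent names:

```python
import asyncio

# Conceptual sketch: two "agents" exchanging messages over queues.
# NOT the Fetch.ai/Agentverse API - only the messaging pattern.

async def alice(inbox: asyncio.Queue, outbox: asyncio.Queue) -> str:
    await outbox.put("ping")   # send a request to the other agent
    return await inbox.get()   # wait for its reply

async def bob(inbox: asyncio.Queue, outbox: asyncio.Queue) -> None:
    msg = await inbox.get()           # receive alice's message
    await outbox.put(msg + "/pong")   # respond

async def main() -> None:
    a_to_b: asyncio.Queue = asyncio.Queue()
    b_to_a: asyncio.Queue = asyncio.Queue()
    reply, _ = await asyncio.gather(alice(b_to_a, a_to_b), bob(a_to_b, b_to_a))
    print(f"alice received: {reply}")

asyncio.run(main())
```

A hosted platform adds the pieces this sketch omits: durable mailboxes so agents receive messages while offline, identity, and discovery.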
Quick Bites
Mistral has released Mistral Small 3.1, a 24B-parameter model that is the best in its weight class. Open-sourced under Apache 2.0, the new model comes with improved text performance, multimodal understanding, and a context window of up to 128k tokens. It outperforms Gemma 3 27B and GPT-4o-mini across benchmarks including SimpleQA, MMLU Pro, and document understanding, while delivering inference speeds of 150 tokens/second.
Anthropic is soon releasing a new "Harmony" feature for Claude that gives it full access to your local directory. Harmony enables Claude to analyze, index, edit, and search files within the directory, even creating new files and showing diffs for user approval. Early testing showed Claude analyzing an entire codebase for vulnerability detection, as well as interacting with files directly for operations. Its full capabilities remain to be seen.
We spotted another interesting update in the Cursor changelog. Anthropic is releasing a new Claude mode called “3.7 Sonnet MAX with and without thinking” and Cursor has already added support for it.
Tools of the Trade
LoRAX: Serve 1000s of fine-tuned LLMs on a single GPU. LoRAX dynamically loads LoRA adapters just-in-time while maintaining performance through heterogeneous continuous batching. It dramatically reduces serving costs without compromising on throughput or latency.
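The key to LoRAX's multi-tenancy is selecting a LoRA adapter per request rather than per server. A rough sketch of building such a request payload — the `inputs`/`parameters`/`adapter_id` field names reflect our reading of LoRAX's generate API (it derives from text-generation-inference), and the adapter repo names are hypothetical, so verify against your LoRAX version:

```python
import json

def lorax_generate_payload(prompt: str, adapter_id: str,
                           max_new_tokens: int = 64) -> str:
    # LoRAX serves one base model; the LoRA adapter to apply is chosen
    # per request via "adapter_id". Field names are an assumption based
    # on LoRAX's docs - check them against your deployed version.
    payload = {
        "inputs": prompt,
        "parameters": {
            "adapter_id": adapter_id,   # e.g. a hub repo with LoRA weights
            "max_new_tokens": max_new_tokens,
        },
    }
    return json.dumps(payload)

# Two requests, two different fine-tunes, one base model on one GPU:
print(lorax_generate_payload("Summarize this ticket:", "acme/support-lora"))
print(lorax_generate_payload("Route this ticket:", "acme/router-lora"))
```

Because adapters are loaded just-in-time and batched heterogeneously, both requests above can share the same GPU and the same base weights.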
Cursor Talk to Figma MCP: Figma now has an MCP server for Cursor. The plugin lets Cursor communicate with Figma to read and modify UI designs programmatically. Completely free and open source.
Basic Memory: Build a persistent semantic graph from conversations with AI assistants like Claude so you don’t have to give it context of the same thing repeatedly. All knowledge is stored in standard Markdown files on your computer. Uses MCP to enable any compatible LLM to read and write to your local knowledge base.
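Because Basic Memory keeps everything in plain Markdown, the knowledge base stays inspectable and editable with ordinary tools. A rough sketch of what writing such a note could look like — the file layout and frontmatter fields here are illustrative, not Basic Memory's actual schema:

```python
import tempfile
from pathlib import Path

def write_note(root: Path, title: str, body: str, tags: list[str]) -> Path:
    # Illustrative note format: YAML-style frontmatter + Markdown body.
    # Basic Memory's real schema may differ - this just shows the idea
    # of a local, human-readable store an LLM can read and write.
    slug = title.lower().replace(" ", "-")
    path = root / f"{slug}.md"
    frontmatter = f"---\ntitle: {title}\ntags: [{', '.join(tags)}]\n---\n"
    path.write_text(frontmatter + body + "\n")
    return path

root = Path(tempfile.mkdtemp())
note = write_note(root, "MCP Servers", "Notes on my local MCP setup.", ["mcp"])
print(note.read_text())
```

Plain files also mean the "semantic graph" degrades gracefully: even without the tool, your notes remain ordinary Markdown.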
SWE-agent: Enables LLMs to autonomously use tools to fix issues in real GitHub repositories, perform tasks on the web, find cybersecurity vulnerabilities, and more. It does so by using configurable agent-computer interfaces to interact with isolated computer environments.
Awesome LLM Apps: Build awesome LLM apps with RAG, AI agents, and more to interact with data sources like GitHub, Gmail, PDFs, and YouTube videos, and automate complex work.

Hot Takes
Some people today are discouraging others from learning programming on the grounds AI will automate it. This advice will be seen as some of the worst career advice ever given. I disagree with the Turing Award and Nobel prize winner who wrote, “It is far more likely that the programming occupation will become extinct [...] than that it will become all-powerful. More and more, computers will program themselves.” Statements discouraging people from learning to code are harmful! ~ Andrew Ng
I’m starting a new business.
After you fire your developers and start vibe coding everything, we’ll come in and fix all the bugs and security issues with your AI-generated code.
We’ll take what you have and make it work.
The service will start at $1,000/hour. ~
Santiago
That’s all for today! See you tomorrow with more such AI-filled content.
Don’t forget to share this newsletter on your social channels and tag Unwind AI to support us!
PS: We curate this AI newsletter every day for FREE, your support is what keeps us going. If you find value in what you read, share it with at least one, two (or 20) of your friends 😉