
Quick Start with MCP-Use 🚀

Welcome to the MCP-Use universe – the easiest way to plug any LLM into powerful, real-world tools! In this short guide you'll go from zero to a running agent in under five minutes.

This tutorial is the perfect place to start if you've just installed MCP-Use or if you want a refresher on the basics. When you're done, jump to the other tutorials in this folder for deeper dives.

0. Why MCP before MCP-Use?

Before diving into code, let's understand the playing field.

MCP (Model Context Protocol) is an open standard that lets an AI model call external tools in a safe, typed way – similar to how your smartphone apps request permissions. A tool can be anything:

  • open a web page and click a button
  • run a shell command
  • query a SQL database
  • generate a Blender mesh

MCP defines a common language so the model (host) says "I want to call browser.search with query=best ramen berlin" and an MCP server executes it, returning JSON results.

👉 In short: MCP turns any black-box LLM into a programmable agent that can actually get stuff done.
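Concretely, that tool call travels over the wire as a JSON-RPC 2.0 message. A hedged sketch of its shape, reusing the ramen example above (field names follow the MCP spec, but consult the spec for the authoritative schema):

```python
# Illustrative MCP tool-call request (JSON-RPC 2.0 envelope).
# The host sends this; the MCP server executes the tool and replies with JSON.
tool_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "browser.search",
        "arguments": {"query": "best ramen berlin"},
    },
}
```

The key point is that the tool name and its typed arguments are plain data, so any model that can emit JSON can drive any MCP server.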

Real-world Stories

| Persona | Goal | How MCP helps |
| --- | --- | --- |
| ✈️ Travel hacker | Compare Airbnb prices, then check Skyscanner flights, stash results in Notion. | Servers: airbnb, playwright, filesystem |
| 📰 Research analyst | Daily crawl of news sites, summarize top stories, dump into Slack. | Servers: browser, filesystem, custom slack |
| 🛠️ 3D artist | Procedurally create Blender scenes with an LLM, render thumbnails. | Server: blender |

1. Enter MCP-Use 🛠️

MCP-Use is the Python glue that makes all of the above dead-simple:

  • Spin up or connect to many MCP servers ✔️
  • Wrap them in an ergonomic async client ✔️
  • Feed tool schemas to your favourite LangChain LLM ✔️
  • Provide guard-rails like max_steps, allowed tools, automatic server routing ✔️

If MCP is the protocol, MCP-Use is the power socket adapter.
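Connecting to several servers at once is just more entries in the same "mcpServers" config dict shown later in this guide. A sketch (the filesystem package name is the official @modelcontextprotocol one, but double-check the exact package and version before relying on it):

```python
# Hypothetical multi-server config: one browser server, one filesystem server.
# Each entry tells MCP-Use how to launch (or reach) that server.
MULTI_CONFIG = {
    "mcpServers": {
        "playwright": {
            "command": "npx",
            "args": ["@playwright/mcp@latest"],
        },
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        },
    }
}
```

The agent then routes each tool call to whichever server exposes that tool.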


2. Prerequisites

| ✅ Requirement | Why you need it |
| --- | --- |
| Python 3.11+ | MCP-Use targets modern Python for async-await bliss. |
| LLM provider keys | Any model that supports tool/function calling via LangChain chat models works. Examples: OPENAI_API_KEY, ANTHROPIC_API_KEY. |
| Node 18+ (optional but recommended) | Many official MCP servers are distributed as npx <package>, so having Node installed unlocks instant servers like Playwright or Airbnb. |
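The Python requirement is easy to verify programmatically; a tiny guard you can drop at the top of a script (the helper name is ours, not part of MCP-Use):

```python
import sys

def python_ok(min_version=(3, 11)):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info >= min_version

if not python_ok():
    print("Warning: MCP-Use needs Python 3.11 or newer")
```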

3. Installation

# 1. Core library
pip install mcp-use

# 2. Pick an LLM provider – here we use OpenAI
pip install langchain-openai

# 3. (Optional) Playwright browser MCP server
npm install -g @playwright/mcp   # or just use npx in the config later

Add your API keys to a .env file in your project root:

OPENAI_API_KEY="sk-…"

MCP-Use will automatically pick them up via python-dotenv.
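Under the hood, python-dotenv just reads KEY=VALUE pairs from that file into the process environment. A simplified sketch of the parsing step (illustrative only; use the real library in practice):

```python
def parse_dotenv(text):
    """Minimal sketch of .env parsing: KEY=VALUE lines, comments ignored."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env

print(parse_dotenv('OPENAI_API_KEY="sk-test"'))
```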


4. Your First Agent – "Find me coffee ☕"

Create a new file called hello_mcp.py and paste the following code:

import asyncio
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

load_dotenv()  # 🔑 Load API keys

# 1๏ธโƒฃ Describe which MCP servers you want.  Here we spin up Playwright in a headless browser.
CONFIG = {
    "mcpServers": {
        "playwright": {
            "command": "npx",
            "args": ["@playwright/mcp@latest"],
            "env": {"DISPLAY": ":1"}  # required if you run inside Xvfb / CI
        }
    }
}

async def main():
    client = MCPClient.from_dict(CONFIG)
    llm = ChatOpenAI(model="gpt-4o")

    # 2๏ธโƒฃ Wire the LLM to the client
    agent = MCPAgent(llm=llm, client=client, max_steps=20)

    # 3๏ธโƒฃ Ask something that requires real web browsing
    result = await agent.run("Find the best specialty coffee in Berlin using Google Search")
    print("\n๐Ÿ”ฅ Result:", result)

    # 4๏ธโƒฃ Always clean up running MCP sessions
    await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(main())

Run it:

python hello_mcp.py

If everything is set up correctly you'll watch MCP-Use boot the Playwright server, let the LLM pick browser actions, and finally print a human-readable answer.


5. How it Works (TL;DR)

  1. MCPClient starts the external server in a separate process.
  2. The server exposes tools (e.g. browser.search, browser.click).
  3. MCPAgent sends the available tools to your LLM.
  4. The LLM decides which tool to call, returns a JSON tool invocation, and MCP-Use executes it.
  5. Steps 3-4 repeat until the agent decides it's "done" or hits max_steps.
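The loop in steps 1–5 can be sketched in plain Python. This is a toy stand-in, not the real MCPAgent internals; run_agent and fake_llm are hypothetical names for illustration:

```python
def run_agent(decide, execute, max_steps=20):
    """Toy agent loop: decide(history) returns ("call", tool, args)
    or ("done", answer); execute(tool, args) returns the tool's result."""
    history = []
    for _ in range(max_steps):
        action = decide(history)
        if action[0] == "done":
            return action[1]          # the model declared itself finished
        _, tool, args = action
        history.append((tool, args, execute(tool, args)))
    return "max_steps reached"        # guard-rail: stop runaway loops

# A scripted "LLM" that searches once, then answers.
def fake_llm(history):
    if not history:
        return ("call", "browser.search", {"query": "best ramen berlin"})
    return ("done", f"Found it via {history[-1][0]}")

print(run_agent(fake_llm, lambda tool, args: {"results": []}))
```

Swapping fake_llm for a real LLM that reads the tool schemas is essentially what MCPAgent automates, including the max_steps guard-rail.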

6. Next Steps ▶️

  • Build your own agent – check out first-agent.md for a fully-commented walkthrough.
  • Multiple servers – want to combine Airbnb + Browser + Filesystem? Hop over to multi-server.md.
  • Debugging – learn pro tips for logging and tracing in debugging.md.

Happy hacking – and don't forget to ⭐ the project on GitHub if this saved you time!