MCP-Use

What is mcp_use?

mcp_use is an open source library that enables developers to connect any LLM to any MCP server, allowing the creation of custom agents with tool access without relying on closed-source or application-specific clients.

Just like this:

import asyncio
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main():
    # Configure a Playwright MCP server spawned over stdio
    config = {
        "mcpServers": {
            "playwright": {
                "command": "npx",
                "args": ["@playwright/mcp@latest"],
                "env": {"DISPLAY": ":1"},
            }
        }
    }
    client = MCPClient(config=config)
    # Create LLM
    llm = ChatOpenAI(model="gpt-4o", api_key=...)
    # Create agent with tools
    agent = MCPAgent(llm=llm, client=client, max_steps=30)
    # Run the query
    result = await agent.run("Find the best restaurant in San Francisco")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
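An inline dict works for quick tests, but the same configuration can also live in a JSON file and be loaded at startup. The sketch below shows only the file layout; the file name `browser_mcp.json` is an example, and the exact mcp_use constructor for loading a config file should be checked against the API reference.

```python
import json

# Example MCP server configuration, mirroring the inline dict above.
config = {
    "mcpServers": {
        "playwright": {
            "command": "npx",
            "args": ["@playwright/mcp@latest"],
            "env": {"DISPLAY": ":1"},
        }
    }
}

# Write the config once...
with open("browser_mcp.json", "w") as f:
    json.dump(config, f, indent=2)

# ...and load it wherever an agent is created. With mcp_use this is done
# via a config-file constructor on MCPClient (see the API reference for
# the exact name), which receives the same structure shown here.
with open("browser_mcp.json") as f:
    loaded = json.load(f)

print(loaded["mcpServers"]["playwright"]["command"])
```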

Key Features

- Open Source: Connect any LLM to any MCP server without vendor lock-in
- Flexible Configuration: Support for any MCP server through a simple configuration system
- Easy Setup: Simple JSON-based configuration for MCP server integration
- Universal LLM Support: Compatible with any LangChain-supported LLM provider
- HTTP Connection: Connect to MCP servers running on specific HTTP ports for web-based integrations
- Dynamic Server Selection: Agents can dynamically choose the most appropriate MCP server for the task
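The HTTP Connection and Dynamic Server Selection features combine naturally: a single config can mix locally spawned servers with servers already running over HTTP, and the agent can pick between them per task. This is a hedged sketch; the `url` key for HTTP servers, the port, and the `use_server_manager` flag are assumptions drawn from common mcp_use usage and should be verified against the current docs.

```python
import json

# A config mixing a locally spawned server with an HTTP one.
# Assumption: HTTP servers are declared with a "url" key (e.g. an SSE
# endpoint); verify the exact key and URL shape in the mcp_use docs.
config = {
    "mcpServers": {
        "playwright": {
            "command": "npx",
            "args": ["@playwright/mcp@latest"],
        },
        "remote_tools": {
            "url": "http://localhost:8931/sse",
        },
    }
}

print(json.dumps(config, indent=2))

# With dynamic server selection enabled, the agent chooses which server
# to use for each step. Assumed parameter name:
#   agent = MCPAgent(llm=llm, client=MCPClient(config=config),
#                    use_server_manager=True)
```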

Getting Started

Chat with your MCP servers with zero setup time

Spin up, test, and interact with MCP instantly using our AI-powered chat — no config, no deploys.
