smolagents Integration
Integrate the LookC MCP server with smolagents, Hugging Face's Python library for building AI agents with native MCP support.
Overview
smolagents is an open-source Python library from Hugging Face designed to make it easy to build and run AI agents. It provides first-class support for the Model Context Protocol (MCP), allowing seamless integration with the LookC MCP server.
Native MCP Support
Built-in MCPClient for direct connection to MCP servers without custom adapters.
Code Agents
Agents write Python code to invoke tools, enabling natural composability with loops and conditionals.
Model Agnostic
Works with OpenAI, Anthropic, Hugging Face models, Ollama, and more.
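To illustrate the "Code Agents" point above: a CodeAgent solves multi-step tasks by writing ordinary Python that calls tools inside loops and conditionals. The sketch below shows the kind of snippet an agent might generate; `get_employee_count` and the numbers are stand-ins for illustration, not real LookC tools or data:

```python
def get_employee_count(name: str) -> int:
    """Stand-in for an MCP tool; a real CodeAgent would call the server's tool here."""
    sample = {"Anthropic": 1000, "Stripe": 8000}  # illustrative numbers only
    return sample.get(name, 0)

# The agent composes tool calls with plain control flow: loop, compare, branch.
companies = ["Anthropic", "Stripe"]
counts = {name: get_employee_count(name) for name in companies}
largest = max(counts, key=counts.get)
print(largest)  # → Stripe
```

Because the agent's output is code rather than a single JSON tool call, intermediate results can be aggregated and branched on without extra round trips to the model.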
Installation
Install smolagents with MCP support using pip:
```bash
pip install 'smolagents[mcp]'
```
Basic Usage
Connect to the LookC MCP server using the MCPClient context manager:
Basic smolagents usage:

```python
from smolagents import MCPClient, CodeAgent, InferenceClientModel

# Initialize your model (uses the Hugging Face Inference API by default)
model = InferenceClientModel()

# Connect to the LookC MCP server
with MCPClient({
    "url": "https://api.lookc.io/mcp",
    "transport": "streamable-http",
    "headers": {
        "Authorization": "YOUR_LOOKC_API_KEY"
    }
}) as tools:
    agent = CodeAgent(tools=tools, model=model)
    result = agent.run("Find information about Anthropic's engineering team")
    print(result)
```
The MCPClient automatically discovers all available tools from the LookC MCP server and makes them available to your agent. When the context manager exits, the connection is properly closed.
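Because `tools` is a plain list of smolagents tool objects, you can inspect or narrow it before building the agent. The sketch below assumes each tool exposes a `name` attribute; the `select_tools` helper and the tool names are illustrative, not part of smolagents or LookC:

```python
from collections import namedtuple

def select_tools(tools, allowed_names):
    """Keep only the discovered tools whose name is in `allowed_names` (hypothetical helper)."""
    return [tool for tool in tools if tool.name in allowed_names]

# Stand-in objects; in practice the list comes from `with MCPClient(...) as tools:`.
Tool = namedtuple("Tool", ["name"])
discovered = [Tool("search_organisations"), Tool("get_employee_count"), Tool("other_tool")]

subset = select_tools(discovered, {"search_organisations", "get_employee_count"})
print([t.name for t in subset])  # → ['search_organisations', 'get_employee_count']
```

Passing a curated subset to `CodeAgent(tools=...)` keeps the agent's tool space small, which tends to improve tool-selection accuracy.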
Authentication
The LookC MCP server requires API key authentication. Pass your API key in the headers configuration:
Authentication configuration:

```python
import os

from smolagents import MCPClient, CodeAgent, InferenceClientModel

model = InferenceClientModel()

# Load API key from environment variable
mcp_config = {
    "url": "https://api.lookc.io/mcp",
    "transport": "streamable-http",
    "headers": {
        "Authorization": os.getenv("LOOKC_API_KEY")
    }
}

with MCPClient(mcp_config) as tools:
    agent = CodeAgent(tools=tools, model=model)
    result = agent.run("Get the employee count for LinkedIn")
    print(result)
```
Advanced Usage
Manual Connection Management
For more control over the connection lifecycle, you can manually manage the MCP client:
Manual connection management:

```python
from smolagents import MCPClient, CodeAgent, InferenceClientModel

model = InferenceClientModel()

mcp_config = {
    "url": "https://api.lookc.io/mcp",
    "transport": "streamable-http",
    "headers": {
        "Authorization": "YOUR_LOOKC_API_KEY"
    }
}

# Create the client before the try block so `disconnect()` in the finally
# clause never references a variable that was never assigned.
mcp_client = MCPClient(mcp_config)
try:
    tools = mcp_client.get_tools()
    agent = CodeAgent(tools=tools, model=model)
    result = agent.run("Search for organisations in the technology industry")
    print(result)
finally:
    mcp_client.disconnect()
```
Multiple MCP Servers
You can connect to multiple MCP servers simultaneously by passing a list of configurations:
Multiple servers:

```python
from smolagents import MCPClient, CodeAgent, InferenceClientModel

model = InferenceClientModel()

# Connect to LookC and another MCP server
server_configs = [
    {
        "url": "https://api.lookc.io/mcp",
        "transport": "streamable-http",
        "headers": {"Authorization": "YOUR_LOOKC_API_KEY"}
    },
    {
        "url": "http://localhost:8000/mcp",
        "transport": "streamable-http"
    }
]

with MCPClient(server_configs) as tools:
    agent = CodeAgent(tools=tools, model=model)
    result = agent.run("Research companies and analyze the data")
    print(result)
```
Tool Calling Agent
If you prefer JSON-based tool calling over code generation, use ToolCallingAgent:
Tool calling agent:

```python
from smolagents import MCPClient, ToolCallingAgent, InferenceClientModel

model = InferenceClientModel()

with MCPClient({
    "url": "https://api.lookc.io/mcp",
    "transport": "streamable-http",
    "headers": {"Authorization": "YOUR_LOOKC_API_KEY"}
}) as tools:
    # ToolCallingAgent uses JSON-based tool calling instead of generated code
    agent = ToolCallingAgent(tools=tools, model=model)
    result = agent.run("Find senior engineers at Stripe")
    print(result)
```
Using Different Models
smolagents supports various LLM providers. Here are examples for popular options:
Model configurations:

```python
# Requires: pip install 'smolagents[litellm]'
from smolagents import LiteLLMModel, MCPClient, CodeAgent

model = LiteLLMModel(model_id="gpt-4o")

with MCPClient({
    "url": "https://api.lookc.io/mcp",
    "transport": "streamable-http",
    "headers": {"Authorization": "YOUR_LOOKC_API_KEY"}
}) as tools:
    agent = CodeAgent(tools=tools, model=model)
    result = agent.run("Research the leadership team at Microsoft")
    print(result)
```
Best Practices
Use Context Managers
Always use the `with` statement to ensure the MCP connection is closed and resources are released, even when the agent run raises an exception.
Environment Variables
Store API keys in environment variables rather than hardcoding them in your code.
Choose the Right Agent
Use CodeAgent for complex workflows with loops and conditionals; use ToolCallingAgent for simpler tasks.
Error Handling
Wrap agent runs in try/except blocks to handle API errors and connection issues gracefully.
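The environment-variable and error-handling practices above can be combined in a small wrapper. This is a sketch: `require_api_key` and `run_with_retries` are hypothetical helpers (not part of smolagents), and which exception types are worth retrying depends on your model provider:

```python
import os
import time

def require_api_key(env_var: str = "LOOKC_API_KEY") -> str:
    """Fail fast with a clear message if the key is missing from the environment."""
    key = os.getenv(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} before running the agent.")
    return key

def run_with_retries(run, attempts: int = 3, delay: float = 1.0):
    """Call `run` (e.g. a lambda wrapping agent.run), retrying transient failures."""
    last_error = None
    for attempt in range(attempts):
        try:
            return run()
        except (ConnectionError, TimeoutError) as exc:
            last_error = exc
            time.sleep(delay * (attempt + 1))  # simple linear backoff
    raise last_error
```

With these in place, the basic example becomes `run_with_retries(lambda: agent.run("..."))`, with the key read once via `require_api_key()` and passed into the `headers` configuration.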
For more information on smolagents, see the official documentation. For other integration options, see our guides for Claude Desktop, Development Tools, and Custom Agents.
