## Documentation Index

Fetch the complete documentation index at: https://docs.stackone.com/llms.txt

Use this file to discover all available pages before exploring further.
## Overview

Pydantic AI includes native MCP integration via the `mcp_servers` parameter, enabling direct connection to StackOne's MCP server.
See the official Pydantic AI MCP documentation for details.
## Installation
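The install command for this section did not survive extraction; the standard way to install Pydantic AI (which includes the MCP client in recent releases) is via pip:

```shell
pip install pydantic-ai
```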
## Quick Start

Connect to StackOne MCP and create an agent:
```python
import os
import base64

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

# Configure StackOne account
STACKONE_ACCOUNT_ID = "<account_id>"  # Your StackOne account ID

# Encode API key for Basic auth
auth_token = base64.b64encode(
    f"{os.getenv('STACKONE_API_KEY')}:".encode()
).decode()

# Create agent with StackOne MCP server
agent = Agent(
    model="openai:gpt-5",
    mcp_servers=[
        MCPServerStreamableHTTP(
            url="https://api.stackone.com/mcp",
            headers={
                "Authorization": f"Basic {auth_token}",
                "x-account-id": STACKONE_ACCOUNT_ID,
            },
        )
    ],
)

# Run agent with StackOne tools
result = agent.run_sync("Search recent calls in Gong")
print(result.data)
```
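The Basic auth value above follows the usual `base64("username:password")` scheme, with the StackOne API key as the username and an empty password (hence the trailing colon). A minimal standalone sketch of just that step, using a placeholder key:

```python
import base64


def basic_auth_header(api_key: str) -> str:
    """Build a Basic auth header value from a StackOne API key.

    Basic auth encodes base64("username:password"); here the API key is
    the username and the password is empty, hence the trailing colon.
    """
    token = base64.b64encode(f"{api_key}:".encode()).decode()
    return f"Basic {token}"


# Placeholder key for illustration only
print(basic_auth_header("my_api_key"))
```

Decoding the resulting token yields `my_api_key:`, which is exactly what the server checks against.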
## Environment Variables

```bash
STACKONE_API_KEY=<stackone_api_key>
OPENAI_API_KEY=<openai_api_key>
```
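Since both variables must be set before the agent can authenticate, it can help to fail fast on a missing one. A small sketch (the variable names match the list above; the helper is illustrative, not part of any SDK):

```python
import os

REQUIRED_VARS = ["STACKONE_API_KEY", "OPENAI_API_KEY"]


def missing_env_vars(required: list[str] = REQUIRED_VARS) -> list[str]:
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.getenv(name)]


if __name__ == "__main__":
    for name in missing_env_vars():
        print(f"warning: {name} is not set")
```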
Avoid naming conflicts when using multiple MCP servers by setting a `tool_prefix`:

```python
agent = Agent(
    model="openai:gpt-5",
    mcp_servers=[
        MCPServerStreamableHTTP(
            url="https://api.stackone.com/mcp",
            headers={
                "Authorization": f"Basic {auth_token}",
                "x-account-id": STACKONE_ACCOUNT_ID,
            },
            tool_prefix="stackone",  # Tools become stackone_gong_crm_search_calls, etc.
        )
    ],
)
```

Note that Pydantic AI inserts the underscore separator itself, so the prefix should be `"stackone"` rather than `"stackone_"`.
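The collision-avoidance effect can be illustrated without any network calls. This sketch assumes only that each server's tools end up named `{tool_prefix}_{tool_name}`; the tool names are hypothetical:

```python
def prefixed_tool_names(tool_prefix: str, tool_names: list[str]) -> list[str]:
    """Mimic MCP tool-name prefixing: each tool becomes '<prefix>_<name>'."""
    return [f"{tool_prefix}_{name}" for name in tool_names]


# Two servers that both expose a tool called "gong_crm_search_calls" would
# collide; distinct prefixes keep the combined tool set unambiguous.
stackone_tools = prefixed_tool_names("stackone", ["gong_crm_search_calls"])
other_tools = prefixed_tool_names("other", ["gong_crm_search_calls"])

combined = stackone_tools + other_tools
assert len(set(combined)) == len(combined)  # no name clashes
print(combined)
```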
## Resources

## Next Steps

- **LangChain**: Build agents with LangChain MCP adapters
- **CrewAI**: Create multi-agent systems with CrewAI