Why Multiple Protocols?
StackOne’s Falcon Connector engine handles authentication, rate limiting, data transformation, and error recovery for every action. All protocols (direct API, MCP, A2A, and SDKs) route through this same engine, so you get the same reliability regardless of how you call actions.

Actions are the atomic operations in StackOne: salesforce_list_contacts, gmail_send_message, google_drive_list_files, etc. Every protocol is just a different way to invoke the same underlying actions.

Quick Decision Guide
Actions API
Direct RPC calls for traditional applications, scripts, or when you need full control. Jump to Actions API →
AI Toolset (SDK)
Native TypeScript/Python with framework integrations (LangChain, CrewAI, OpenAI). Jump to SDKs →
MCP Servers
Open standard for AI tools. Works with Claude, Cursor, Vercel AI, n8n. Jump to MCP →
A2A Protocol
Agent-to-agent communication for multi-agent orchestration. Jump to A2A →
Comparison Table
| Feature | Actions API | AI Toolset (SDK) | MCP Servers | A2A Protocol |
|---|---|---|---|---|
| Use case | Traditional apps, scripts | AI agents with frameworks | MCP-compatible clients | Multi-agent systems |
| Requires code | Yes | Yes | No (config only) | Yes |
| Best for | Full control, non-agentic | Production AI apps | Quick setup, IDE agents | Agent orchestration |
| Works with | Any HTTP client | LangChain, CrewAI, OpenAI | Claude, Cursor, Vercel AI | Any A2A orchestrator |
Actions API
The Actions API is the foundational protocol: a direct RPC endpoint to invoke any StackOne action programmatically. MCP and A2A both use this API internally. A minimal request sketch follows the list below.

When to Use
- Building traditional applications (not AI agents)
- Writing scripts or automation that call connectors
- Need direct HTTP calls without SDK dependencies
- Want full control over request/response handling
- Integrating from languages without SDK support
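
For illustration, here is a minimal TypeScript call to an action over plain HTTP. The endpoint path, request body shape, and Basic-auth scheme shown here are assumptions for the sketch (only the x-account-id header appears on this page); take the exact contract from the Actions API reference linked below.

```typescript
// Minimal sketch of invoking a StackOne action over plain HTTP.
// ASSUMPTIONS: the endpoint path and body shape below are illustrative,
// not the documented contract — see the Actions API reference.
const apiKey = process.env.STACKONE_API_KEY ?? "";

async function executeAction() {
  const response = await fetch(
    "https://api.stackone.com/actions/salesforce_list_contacts/execute", // placeholder URL
    {
      method: "POST",
      headers: {
        // API keys are commonly sent as Basic auth; verify the scheme in the docs.
        Authorization: `Basic ${Buffer.from(`${apiKey}:`).toString("base64")}`,
        "x-account-id": "your-linked-account-id", // selects which connected account to act on
        "Content-Type": "application/json",
      },
      // Hypothetical parameter shape for the sketch.
      body: JSON.stringify({ parameters: { limit: 10 } }),
    }
  );

  if (!response.ok) {
    throw new Error(`Action failed: ${response.status} ${await response.text()}`);
  }
  return response.json();
}

executeAction().then(console.log).catch(console.error);
```
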
Execute Actions API →
AI Toolset SDKs
Native TypeScript and Python libraries with framework integrations for OpenAI, LangChain, CrewAI, Vercel AI SDK, and more. A short TypeScript sketch follows the list below.

When to Use
- Building a custom AI agent or product
- Need programmatic control over which tools to load
- Using LangChain, CrewAI, OpenAI SDK, or similar frameworks
- Need to dynamically select tools based on context or user
- Want type safety and IDE autocomplete
- Building production applications with custom error handling
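
As a sketch of how the TypeScript toolset typically plugs into a framework, the snippet below loads a filtered set of tools and hands them to the OpenAI SDK. The StackOneToolSet class, getTools filter, and toOpenAI() adapter names are assumptions for this sketch; the OpenAI call itself is standard. Consult the SDK reference for the real surface.

```typescript
import OpenAI from "openai";
// ASSUMPTION: the class and method names below (StackOneToolSet, getTools, toOpenAI)
// are illustrative; check the StackOne AI SDK docs for the actual API.
import { StackOneToolSet } from "@stackone/ai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function main() {
  // Load only the Salesforce actions for one linked account (illustrative filter).
  const toolset = new StackOneToolSet(); // assumed to read STACKONE_API_KEY
  const tools = toolset.getTools("salesforce_*", { accountId: "your-linked-account-id" });

  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "List my five most recent Salesforce contacts." }],
    tools: tools.toOpenAI(), // assumed adapter to OpenAI's function-calling format
  });

  console.log(completion.choices[0].message);
}

main().catch(console.error);
```

Loading a filtered subset rather than every action keeps the model's tool list small, which is the usual reason to pick the SDK over a zero-code setup.
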
MCP Servers
Model Context Protocol (MCP) is an open standard for connecting AI models to external tools. StackOne hosts MCP servers that expose all actions. These servers call the same Actions API. A client-side connection sketch follows the list below.

When to Use
- Using an MCP-compatible client: Claude SDK, Vercel AI SDK, Cursor, Windsurf, n8n
- Want zero-code setup: just configure and go
- Building with frameworks that support MCP natively
- Adding StackOne tools to existing MCP setups
- Using IDEs or agent builders like Cursor or Flowise
- Need to switch between accounts dynamically (just change x-account-id)
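
Because MCP is an open standard, any MCP client can connect. The sketch below uses the official TypeScript MCP SDK over the streamable HTTP transport; the server URL and auth headers are placeholders, so take the real values from the MCP Quickstart.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  // PLACEHOLDERS: the server URL and auth header below are illustrative,
  // not StackOne's documented values — see the MCP Quickstart.
  const transport = new StreamableHTTPClientTransport(
    new URL("https://mcp.stackone.com/mcp"),
    {
      requestInit: {
        headers: {
          Authorization: `Bearer ${process.env.STACKONE_API_KEY ?? ""}`,
          "x-account-id": "your-linked-account-id", // switch accounts by changing this header
        },
      },
    }
  );

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Discover the actions exposed as MCP tools, then call one.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  const result = await client.callTool({
    name: "salesforce_list_contacts", // illustrative action name from this page
    arguments: { limit: 5 },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```
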
MCP Quickstart →
A2A Protocol
Agent-to-Agent (A2A) is Google’s open protocol for autonomous agent communication. StackOne exposes A2A-compatible agents for each integration. These agents invoke the same underlying Actions API. A message sketch follows the list below.

When to Use
- Building multi-agent systems with specialized agents
- Want autonomous agent communication (agents talking to agents)
- Need context isolation to protect your top-level agent’s context window
- Using an A2A-compatible orchestrator
- Need long-running tasks with async completion
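
To make the protocol concrete, the sketch below fetches an agent's card and sends it a task as plain JSON-RPC, which is how A2A messages travel on the wire. The agent base URL is a placeholder and the field names follow the public A2A spec as of this writing; the A2A Quickstart has the real endpoints and auth details.

```typescript
// Sketch of talking to an A2A agent with plain JSON-RPC over HTTP.
// PLACEHOLDER: the agent base URL below is illustrative — see the A2A Quickstart.
import { randomUUID } from "node:crypto";

const agentBaseUrl = "https://a2a.stackone.com/agents/salesforce";

async function main() {
  // 1. Discover the agent's capabilities via its agent card.
  const card = await fetch(`${agentBaseUrl}/.well-known/agent.json`).then((r) => r.json());
  console.log(card.name, card.skills?.map((s: any) => s.id));

  // 2. Send the agent a task with a JSON-RPC "message/send" request (per the A2A spec).
  const response = await fetch(agentBaseUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "message/send",
      params: {
        message: {
          role: "user",
          messageId: randomUUID(),
          parts: [{ kind: "text", text: "List my five most recent contacts." }],
        },
      },
    }),
  });

  // The agent replies with a task (possibly still running) or a direct message.
  console.log(await response.json());
}

main().catch(console.error);
```
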
A2A Quickstart →
Can I Use Multiple?
Yes! Many production setups combine protocols:

- Actions API for backend jobs (scheduled syncs, data pipelines)
- AI Toolset (SDK) for your product (customer-facing AI features)
- MCP for internal tools (Cursor, Claude Desktop for your team)
- A2A for complex workflows (multi-agent orchestration)