Connect An MCP Client
The Automation Suite is the local part of Daeda AI.
It runs as an MCP server (@daeda/mcp-pro) on your machine and syncs portal data into a local DuckDB database. Your AI client then reads that local database through MCP tools and proposes Write Plans for review inside Daeda AI.
Before You Start
| Requirement | Why it matters |
|---|---|
| A HubSpot portal with Daeda AI installed | Daeda AI is the admin surface for the Automation Suite |
| An MCP license key from Daeda AI | The local client uses it to register |
| Node.js 20 or newer | @daeda/mcp-pro requires Node 20+ |
| An MCP-compatible client | Claude Code, Cursor, and other MCP clients work |
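The Node requirement above can be checked before installing. A minimal sketch (the helper name is illustrative, not part of @daeda/mcp-pro) that validates `node --version` output against the Node 20 minimum:

```python
import re

MIN_NODE_MAJOR = 20  # @daeda/mcp-pro requires Node 20 or newer

def node_version_ok(version_string: str) -> bool:
    """Return True if a `node --version` string like 'v20.11.1' meets the minimum."""
    match = re.match(r"v?(\d+)\.", version_string.strip())
    if not match:
        return False
    return int(match.group(1)) >= MIN_NODE_MAJOR

print(node_version_ok("v20.11.1"))  # True
print(node_version_ok("v18.19.0"))  # False
```

Run `node --version` in your terminal and compare against the check above.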
What You Set Up In HubSpot First
1. Open Daeda AI
Go to the Daeda AI settings page in HubSpot.
2. Create An MCP License Key
Open Master Settings.
Create a license key in the MCP License Keys section.
3. Connect Any Client Portals You Need
If you want to query a different HubSpot portal, connect it first from Master Settings.
Select the data scopes you want.
Generate the install link.
Open that link on the target portal and approve the scopes.
Scope Planning
Choose scopes based on the work you want the Automation Suite to do.
| Scope type | Needed for |
|---|---|
| Read scopes | Querying CRM objects and metadata |
| Write scopes | Write Plans that create, update, delete, or merge records |
| Schema write scopes | Property, property-group, and custom-schema changes |
| Workflow and sequence scopes | Workflow metadata, triggers, and automation-related operations |
If you request too few scopes, reads may be partial and some Write Plans will pause for re-authorisation.
Install In Claude Code
Run:

```sh
claude mcp add --transport stdio --env DAEDA_LICENSE_KEY=your-license-key mcp-pro -- npx -y @daeda/mcp-pro
```

Replace `your-license-key` with the key from Daeda AI.
Install In Other MCP Clients
Use the same package:
| Setting | Value |
|---|---|
| Command | npx |
| Args | -y @daeda/mcp-pro |
| Env var | DAEDA_LICENSE_KEY=your-license-key |
Example JSON:
```json
{
  "mcpServers": {
    "mcp-pro": {
      "command": "npx",
      "args": ["-y", "@daeda/mcp-pro"],
      "env": { "DAEDA_LICENSE_KEY": "your-license-key" }
    }
  }
}
```

What Happens On First Connection
| Step | What happens |
|---|---|
| 1 | The local MCP client starts over stdio |
| 2 | It registers with your license key |
| 3 | The server returns the connected HubSpot portals and the local encryption key |
| 4 | If this is the first real use and no paid plan is active, the free trial can start now |
| 5 | The client creates a local per-portal DuckDB database |
| 6 | It requests the latest artifact inventory for the selected portal |
| 7 | It downloads and ingests artifact-backed data |
| 8 | It starts live watch behaviour for the selected portal |
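Steps 6 and 7 can take a while on a large portal. As a hypothetical sketch of how you might wait for the initial ingest from your own tooling, using a stand-in `check_schema_ready` callable (in practice you would re-run `status(section="schema")` from your AI client):

```python
import time

def wait_for_initial_sync(check_schema_ready, timeout_s=300, poll_s=1.0):
    """Poll a readiness check with capped exponential backoff until it passes."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if check_schema_ready():
            return True
        time.sleep(poll_s)
        poll_s = min(poll_s * 2, 30.0)  # back off, capped at 30 seconds
    return False

# Demo with a stub that reports ready on the second check.
state = {"calls": 0}
def fake_ready():
    state["calls"] += 1
    return state["calls"] >= 2

print(wait_for_initial_sync(fake_ready, timeout_s=10, poll_s=0.01))  # True
```

The backoff cap keeps polling cheap while the first artifact download runs.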
Local Storage
By default, local portal data is stored under:
| Path | Purpose |
|---|---|
| `~/.daeda-mcp/portals/<portalId>/hubspot.duckdb` | Live local DuckDB database |
| `~/.daeda-mcp/portals/<portalId>/hubspot.replica.duckdb` | Published read replica |
| `~/.daeda-mcp/portals/<portalId>/portal_data.json` | Local artifact and plugin sync state |
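If you need to locate these files from a script, the layout in the table above can be expressed as a small helper. A sketch (the function name is illustrative; only the paths come from the table):

```python
from pathlib import Path

# Default storage root used by the Automation Suite, per the table above.
DAEDA_ROOT = Path.home() / ".daeda-mcp" / "portals"

def portal_paths(portal_id: str) -> dict:
    """Map the per-portal files documented above for a given portal id."""
    base = DAEDA_ROOT / portal_id
    return {
        "live_db": base / "hubspot.duckdb",
        "read_replica": base / "hubspot.replica.duckdb",
        "sync_state": base / "portal_data.json",
    }

for name, path in portal_paths("12345678").items():  # hypothetical portal id
    print(f"{name}: {path}")
```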
First Verification
After the MCP client connects, verify in this order.
1. Confirm The Server Is Connected
In Claude Code, run `/mcp`.
You should see `mcp-pro` connected.
2. Check Connection State
Ask your AI client to call:
```
status(section="connection")
```

Look for:
| Field | What you want |
|---|---|
| `connectionState` | Connected |
| `selectedPortalId` | A real portal ID |
| `portals` | The portals you expect to see |
3. Select The Portal
If you have more than one connected portal, call:
```
set_portal(portalId=<your portal id>)
```

If only one portal is connected, it is usually auto-selected.
4. Check Schema Status
Call:
```
status(section="schema")
```

This is the main health check.
It shows the local database structure, table availability, plugin freshness, and refresh jobs.
5. Run A Simple Query
Call:
```
query(sql="SELECT id, json_extract_string(properties, '$.firstname') AS firstname FROM contacts LIMIT 5")
```

If your portal has contacts and the contact scope was granted, this should return rows.
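The `json_extract_string(properties, '$.firstname')` call reflects how the local mirror stores HubSpot properties: as a JSON document per row. To make the extraction concrete, here is the equivalent operation in plain Python on a made-up row (the sample data and helper name are illustrative):

```python
import json

# Hypothetical contents of one row's `properties` column.
row_properties = '{"firstname": "Ada", "lastname": "Lovelace", "email": "ada@example.com"}'

def extract_string(properties_json, field):
    """Illustrative equivalent of DuckDB's json_extract_string for a top-level field."""
    return json.loads(properties_json).get(field)

print(extract_string(row_properties, "firstname"))  # Ada
```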
The Most Important Tools
| Tool | What it does |
|---|---|
| `set_portal` | Choose the default portal |
| `status` | Inspect connection, schema, freshness, and metadata |
| `query` | Run read-only SQL against the local DuckDB mirror |
| `chart` | Turn query results into a chart |
| `configure_sync_controls` | Enable or disable sync by portal or artifact |
| `refresh_plugins` | Refresh lightweight plugin-backed metadata |
| `describe_operations` | Discover write operation types |
| `build_plan` | Draft a Write Plan |
| `submit_plan` | Validate or submit a Write Plan |
| `get_plans` | List prior Write Plans |
Multi-Portal Behaviour
The Automation Suite supports more than one connected client portal.
| Situation | What to do |
|---|---|
| One connected portal | It is usually auto-selected |
| Multiple connected portals | Use `set_portal` before querying |
| Need to query many portals | Use tool calls with explicit portal IDs |
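For the multi-portal case, a sketch of fanning one read query out across several portals by attaching an explicit portal id to each tool-call payload. The argument name `portalId` on the `query` tool is an assumption here (the document only shows it on `set_portal`); check the actual tool signature from your client:

```python
PORTAL_IDS = ["11111111", "22222222"]  # hypothetical portal ids
SQL = "SELECT COUNT(*) AS n FROM contacts"

# Build one query tool call per portal instead of switching the default portal.
calls = [
    {"tool": "query", "arguments": {"portalId": pid, "sql": SQL}}
    for pid in PORTAL_IDS
]

for call in calls:
    print(call["tool"], call["arguments"]["portalId"])
```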
Sync Controls
Portal sync can be enabled or disabled per portal.
Artifact sync can also be disabled for specific object types.
| State | Meaning |
|---|---|
| Portal sync enabled | The Automation Suite keeps artifact-backed data fresh for that portal |
| Portal sync disabled | Reads use stale local data only, if local data already exists |
| Artifact disabled | That object type stops receiving new artifact work |
How Freshness Works
The local database holds two kinds of data.
| Data type | Freshness model |
|---|---|
| Artifact-backed data | Managed automatically for normal reads |
| Lightweight plugin-backed data | Not real-time; refresh it manually when current values matter |
Read this next: How The Local Read Database Works.
Common First-Time Issues
| Problem | Likely cause | Fix |
|---|---|---|
| `mcp-pro` does not connect | Node version or bad config | Check Node 20+, command, args, and env |
| Invalid license key | Wrong key or revoked key | Generate a fresh key in Daeda AI |
| No portals returned | No connected client portals | Connect a portal in Daeda AI |
| Query says no data yet | Initial artifact sync still running | Wait, then re-run `status(section="schema")` |
| Query returns stale metadata | Plugin-backed table was not refreshed | Run `refresh_plugins`, then poll `status(section="schema")` |
| Registration blocked | Trial expired and no paid plan | Choose a paid plan in Daeda AI |
Team Setup
For shared project configs, store the MCP config in the repo but keep the key in an environment variable.
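Whether `${VAR}` placeholders in a shared config are expanded depends on your MCP client. As an illustrative sketch of the substitution the placeholder implies (not part of @daeda/mcp-pro):

```python
import os
import re

# A fragment shaped like the shared config, with an unexpanded placeholder.
CONFIG_SNIPPET = '{"env": {"DAEDA_LICENSE_KEY": "${DAEDA_LICENSE_KEY}"}}'

def expand_placeholders(text):
    """Replace each ${NAME} with the value of the NAME environment variable."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), text)

os.environ["DAEDA_LICENSE_KEY"] = "demo-key"  # stand-in for a real key
print(expand_placeholders(CONFIG_SNIPPET))
```

Each teammate sets `DAEDA_LICENSE_KEY` in their own environment, so the key never lands in the repo.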
Example:
```json
{
  "mcpServers": {
    "mcp-pro": {
      "command": "npx",
      "args": ["-y", "@daeda/mcp-pro"],
      "env": { "DAEDA_LICENSE_KEY": "${DAEDA_LICENSE_KEY}" }
    }
  }
}
```

Next Steps
| Next guide | Use it for |
|---|---|
| How The Local Read Database Works | Understand artifacts, plugins, freshness, and refresh rules |
| Troubleshooting | Diagnose setup, sync, and refresh issues |
| Custom Objects | The supported workflow for creating and using custom objects |