servers/agents — @leadmetrics/server-agents
A dedicated Node.js background service that runs all AI agent BullMQ workers. It consumes jobs from agent queues, dispatches them to the appropriate adapter (Claude / OpenAI / Gemini), streams results back, and updates the database.
Source: apps/servers/agents/
Package: @leadmetrics/server-agents
Agent worker logic lives in: packages/agents/
Why a Separate Service
| Concern | Reason |
|---|---|
| Isolation | Agent failures never affect the API or notification service |
| Independent scaling | Scale by replica count — run 3 agent servers for 3× throughput |
| Resource separation | Agent processes are memory- and CPU-intensive; separate from API |
| Adapter env vars | All LLM credentials live in the agents server only |
Scaling note: Concurrency is set per-worker in packages/agents/src/workers/. To scale, run multiple replicas of this server — each replica picks up jobs from the shared BullMQ queues. No code changes needed.
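The scaling model above can be summarized in one line: effective parallelism for a queue is the per-worker concurrency multiplied by the replica count, since every replica runs the same worker set against the shared queues. A minimal sketch (function name is illustrative, not from the codebase):

```typescript
// Effective parallelism for one agent queue across the deployment.
// `workerConcurrency` is the value set per worker in
// packages/agents/src/workers/; `replicaCount` is how many copies
// of this server are running.
function effectiveParallelism(
  workerConcurrency: number,
  replicaCount: number,
): number {
  return workerConcurrency * replicaCount;
}

// e.g. a worker with concurrency 2 across 3 replicas can process
// up to 6 jobs from that queue at once.
```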
Workers
Each worker corresponds to one agent role and listens on agent__{role}:
| Worker | Queue | Adapter | Description |
|---|---|---|---|
| Setup | agent__setup | claude_local | Runs client-researcher → competitor → context-file-writer chain |
| Strategy Writer | agent__strategy_writer | claude_local | Generates marketing strategy document |
| Deliverable Planner | agent__deliverable_planner | claude_local | Plans goals + monthly deliverable volumes |
| Activity Planner | agent__activity_planner | claude_local | Schedules activity pipeline from deliverable plan |
| Blog Writer | agent__blog_writer | claude_local | Drafts full blog posts from SEO briefs |
| Social Post Designer | agent__social_post_designer | codex_local | Designs social post images via Azure OpenAI |
| Client Researcher | agent__client_researcher | codex_local | Researches client domain → notes |
| RAG Ingestion | agent__rag_ingestion | — | Ingests documents into Qdrant vector store |
Queues are shared across all tenants — tenantId is in the job payload, not the queue name.
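The naming convention and tenant-scoped payload can be sketched as follows (the real types live in packages/queue; these names are illustrative):

```typescript
// Illustrative sketch of the agent__{role} queue naming convention.
type AgentRole =
  | "setup"
  | "strategy_writer"
  | "deliverable_planner"
  | "activity_planner"
  | "blog_writer"
  | "social_post_designer"
  | "client_researcher"
  | "rag_ingestion";

// Tenancy travels in the job payload, not the queue name —
// all tenants share the same agent__{role} queues.
interface AgentJobPayload {
  tenantId: string;
  // ...role-specific fields
}

function agentQueueName(role: AgentRole): string {
  return `agent__${role}`;
}
```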
Package Split
| Package | Contents |
|---|---|
| packages/agents | All worker code, agent-events.ts, skills.ts, lib/dependency-resolver.ts |
| packages/queue | Connection, enqueueNotification, enqueueAgentJob, queue names, types only |
apps/servers/agents imports workers from @leadmetrics/agents/src/workers/*.
Required Config (apps/servers/agents/.env)
| Variable | Required | Description |
|---|---|---|
| DATABASE_URL | yes | Prisma |
| REDIS_URL | yes | BullMQ |
| DASHBOARD_URL | yes | Used in email links from agent events |
| DO_SPACES_KEY | yes | DigitalOcean Spaces — all 6 vars required |
| DO_SPACES_SECRET | yes | |
| DO_SPACES_REGION | yes | |
| DO_SPACES_ENDPOINT | yes | e.g. https://sgp1.digitaloceanspaces.com — commonly missing |
| DO_SPACES_BUCKET | yes | |
| DO_SPACES_CDN_URL | yes | |
| AZURE_IMAGE_API_KEY | yes | Azure OpenAI for GPT Image (social post designer) |
| AZURE_IMAGE_ENDPOINT | yes | |
| PIXABAY_API_KEY | yes | Image search fallback |
| UNSPLASH_ACCESS_KEY | yes | Image search fallback |
DO_SPACES_ENDPOINT is the most commonly missing variable. Without it, the social post designer worker fails silently and resets posts to dm_review.
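Because a missing variable can surface as a silent mid-job failure rather than a startup error, a fail-fast check is worth having. A dependency-free sketch of that check (the actual service validates via Zod in src/config.ts; this helper name is hypothetical):

```typescript
// All variables the agents server requires, per the table above.
const REQUIRED_VARS = [
  "DATABASE_URL", "REDIS_URL", "DASHBOARD_URL",
  "DO_SPACES_KEY", "DO_SPACES_SECRET", "DO_SPACES_REGION",
  "DO_SPACES_ENDPOINT", "DO_SPACES_BUCKET", "DO_SPACES_CDN_URL",
  "AZURE_IMAGE_API_KEY", "AZURE_IMAGE_ENDPOINT",
  "PIXABAY_API_KEY", "UNSPLASH_ACCESS_KEY",
];

// Returns the names of any required variables that are unset or empty.
function missingVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Fail fast at boot instead of failing silently inside a worker:
// const missing = missingVars(process.env);
// if (missing.length) throw new Error(`Missing env vars: ${missing.join(", ")}`);
```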
Startup Flow (src/index.ts)
- Validate config via Zod (src/config.ts).
- Create shared IORedis connection.
- Call start*Workers() for each agent role.
- Graceful shutdown on SIGTERM/SIGINT: call stop*Workers() then redis.quit().
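The shutdown step above can be sketched as a small wiring function. This is a simplified illustration with stand-in callbacks for the real stop*Workers() and redis.quit() calls:

```typescript
type AsyncFn = () => Promise<void>;

// Wire graceful shutdown: drain workers first so in-flight jobs
// finish cleanly, then close the shared Redis connection.
function registerShutdown(stopWorkers: AsyncFn, quitRedis: AsyncFn): void {
  const shutdown = async (signal: string) => {
    console.log(`${signal} received, draining workers...`);
    await stopWorkers(); // stand-in for the stop*Workers() calls
    await quitRedis();   // stand-in for redis.quit()
    process.exit(0);
  };
  for (const sig of ["SIGTERM", "SIGINT"] as const) {
    process.once(sig, () => void shutdown(sig));
  }
}
```

Ordering matters here: quitting Redis before the workers drain would strand jobs mid-stream, which is why the stop calls run first.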
Related Docs
- Agent Hierarchy — 4-tier pipeline (Setup → Strategy → Planner → Workers)
- Agent Execution Engine — adapter dispatch, streaming, cost tracking
- Task Queue & Orchestration — BullMQ queue design, job payloads