A user sends `@helper what’s our refund policy?`, and the agent responds with the correct info without interrupting the ongoing human conversation.
What You’ll Build
- A Mastra agent that can participate in group conversations.
- Triggered only when explicitly mentioned (e.g., `@agent`).
- Can fetch knowledge, trigger tools, or summarize discussions.
- Optional: integrate into CometChat group chats.
Prerequisites
- A Mastra project (`npx create-mastra@latest my-mastra-app`).
- Node.js installed.
- OpenAI API key in `.env` as `OPENAI_API_KEY`.
- A CometChat app with group chat enabled.
Step 1
Create the Agent
`src/agents/group-chat-agent.ts`:
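The original agent file is not shown, so here is a minimal sketch using Mastra's standard `Agent` API; the model choice and instruction wording are illustrative assumptions:

```typescript
// src/agents/group-chat-agent.ts
import { openai } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

export const groupChatAgent = new Agent({
  name: "group-chat",
  instructions: `
    You are a helpful assistant embedded in a group chat.
    Respond ONLY when a message explicitly mentions you (e.g., "@agent" or "@helper").
    Otherwise stay silent so you do not interrupt the human conversation.
    Keep answers concise; fetch knowledge or summarize the discussion when asked.
  `,
  model: openai("gpt-4o-mini"), // assumed model; any supported model works
});
```

Putting the "respond only when mentioned" rule directly in the instructions is what keeps the agent quiet during ordinary human chatter.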
Step 2
Register the Agent in Mastra
`src/mastra/index.ts`:
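The registration file is also omitted; a typical sketch looks like the following, assuming the agent file from Step 1. The registration key is what the API routes use as the agent ID:

```typescript
// src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { groupChatAgent } from "../agents/group-chat-agent";

export const mastra = new Mastra({
  // The key "group-chat" becomes the agent ID in /api/agents/group-chat/...
  agents: { "group-chat": groupChatAgent },
});
```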
Step 3
Run the Agent
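With the agent registered, you can start the local dev server and smoke-test the endpoint. The port shown is Mastra's default; the message body is an example:

```shell
# Start the local dev server (defaults to http://localhost:4111)
npm run dev

# Send a message that explicitly mentions the agent
curl -X POST http://localhost:4111/api/agents/group-chat/generate \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "@agent what is our refund policy?"}]}'
```

If the instructions from Step 1 are working, the same request without the `@agent` mention should produce little or no reply.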
Step 4
Deploy & Connect
- Deploy the API (`/api/agents/group-chat/generate`) using Render, Railway, Vercel, or any host.
- In CometChat Dashboard → AI Agents, create an agent with:
  - Provider: Mastra
  - Agent ID: `group-chat`
  - Deployment URL: public endpoint from your host
Troubleshooting
- Agent talks too much: Tighten instructions to only respond when invoked.
- Doesn’t respond to mentions: Ensure the input message contains the `@agent` keyword.
- Not visible in chat: Verify the agent is added as a user in CometChat and enabled in the Dashboard.
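For the first two issues, relying on the prompt alone is fragile; a deterministic gate in front of the agent call is more reliable. This is a hypothetical helper (the function name and mention pattern are illustrative, not part of Mastra):

```typescript
// Hypothetical pre-filter: decide whether a group message is addressed to the agent.
// The @ must start the message or follow whitespace, so emails like foo@agent.com don't match.
const MENTION_PATTERN = /(^|\s)@(agent|helper)\b/i;

export function shouldRespond(message: string): boolean {
  return MENTION_PATTERN.test(message);
}
```

Call `shouldRespond` before forwarding a message to the `/generate` endpoint, so ambient chatter never reaches the model at all.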
Next Steps
- Extend the agent with tools (e.g., `summarize-discussion`, `fetch-policy`).
- Add guardrails so the agent only responds to whitelisted topics.
- Use in combination with a Multi-agent Orchestration (relay) agent to query multiple humans/agents.
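As a starting point for the first suggestion, here is a sketch of a `fetch-policy` tool using Mastra's `createTool`; the in-memory policy table and schemas are made up for illustration:

```typescript
// src/tools/fetch-policy.ts
import { createTool } from "@mastra/core/tools";
import { z } from "zod";

// Illustrative in-memory store; replace with your real knowledge base.
const POLICIES: Record<string, string> = {
  refund: "Full refunds within 30 days of purchase.",
  shipping: "Orders ship within 2 business days.",
};

export const fetchPolicy = createTool({
  id: "fetch-policy",
  description: "Look up a company policy by topic (e.g., refund, shipping)",
  inputSchema: z.object({ topic: z.string() }),
  outputSchema: z.object({ policy: z.string() }),
  execute: async ({ context }) => ({
    policy:
      POLICIES[context.topic.toLowerCase()] ??
      "No policy found for that topic.",
  }),
});
```

Attach it to the agent from Step 1 via its `tools` option (e.g., `tools: { fetchPolicy }`) so the model can call it when a policy question comes in.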