Tutorial · February 2, 2026 · 12 min

Matrix AI Bot: Decentralized Chat Assistant

Build an AI bot for Matrix protocol. End-to-end encryption, federation, and bridging support for decentralized messaging.

matrix · decentralized · encryption · ai chatbot · self-hosted

Molted Team

Molted.cloud

Matrix is an open, decentralized protocol for secure, real-time communication. Unlike Slack or Teams, no single company controls it. If you value federation, encryption, and open standards, Matrix is your messaging platform. This guide shows how to add an AI assistant to it.

Why Matrix for AI?

  • Decentralized - Run your own server, federate with others
  • End-to-end encryption - Optional but robust
  • Open protocol - No vendor lock-in
  • Self-hostable - Complete control over your data
  • Bridging - Connect to Slack, Discord, Telegram, etc.

Matrix basics

Matrix uses these concepts:

  • Homeserver - Where accounts live (like matrix.org or your own)
  • User ID - @username:homeserver.com
  • Room - Where conversations happen
  • Bot - Just a user account that's automated

Prerequisites

  • A Matrix account for your bot (create on matrix.org or your homeserver)
  • Node.js 18+
  • Anthropic or OpenAI API key

Create a bot account

Register a new account on your homeserver. For matrix.org:

  1. Go to Element
  2. Create account with a bot-specific username
  3. Note the user ID (e.g., @myaibot:matrix.org)
  4. Get an access token from Settings → Help & About → Access Token
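If you prefer to obtain the token programmatically, the Matrix client-server API exposes a password login endpoint (`POST /_matrix/client/v3/login`) that returns an access token. A minimal sketch (the homeserver URL and credentials are placeholders; assumes Node 18+ for the global `fetch`):

```javascript
// Build the request body for the Matrix client-server password login.
function buildLoginRequest(username, password) {
  return {
    type: 'm.login.password',
    identifier: { type: 'm.id.user', user: username },
    password,
  };
}

// Exchange credentials for an access token via the login endpoint.
async function getAccessToken(homeserver, username, password) {
  const res = await fetch(`${homeserver}/_matrix/client/v3/login`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildLoginRequest(username, password)),
  });
  if (!res.ok) throw new Error(`Login failed: ${res.status}`);
  const json = await res.json();
  return json.access_token;
}
```

Store the returned token in your `.env` rather than logging in on every start; Matrix access tokens are long-lived by default.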

Project setup

mkdir matrix-ai-bot
cd matrix-ai-bot
npm init -y
npm install matrix-bot-sdk @anthropic-ai/sdk dotenv

Create .env:

MATRIX_HOMESERVER=https://matrix.org
MATRIX_ACCESS_TOKEN=your-access-token
MATRIX_BOT_USER_ID=@yourbot:matrix.org
ANTHROPIC_API_KEY=your-anthropic-key
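Before wiring up the client, it can help to fail fast when configuration is missing. A small sketch (the variable names match the `.env` file above):

```javascript
// Return the names of any required environment variables that are unset.
function missingEnvVars(env, required) {
  return required.filter((name) => !env[name]);
}

// At startup, throw early instead of failing mid-request later:
const missing = missingEnvVars(process.env, [
  'MATRIX_HOMESERVER',
  'MATRIX_ACCESS_TOKEN',
  'MATRIX_BOT_USER_ID',
  'ANTHROPIC_API_KEY',
]);
if (missing.length) {
  console.error(`Missing environment variables: ${missing.join(', ')}`);
}
```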

Basic Matrix AI bot

require('dotenv').config();
const {
  MatrixClient,
  SimpleFsStorageProvider,
  AutojoinRoomsMixin,
} = require('matrix-bot-sdk');
const Anthropic = require('@anthropic-ai/sdk');

const homeserverUrl = process.env.MATRIX_HOMESERVER;
const accessToken = process.env.MATRIX_ACCESS_TOKEN;
const botUserId = process.env.MATRIX_BOT_USER_ID;

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
const conversations = new Map();

// Storage for sync state
const storage = new SimpleFsStorageProvider('bot-storage.json');

// Create client
const client = new MatrixClient(homeserverUrl, accessToken, storage);

// Auto-join rooms when invited
AutojoinRoomsMixin.setupOnClient(client);

// Handle messages
client.on('room.message', async (roomId, event) => {
  // Ignore our own messages
  if (event.sender === botUserId) return;

  // Only handle text messages
  if (event.content?.msgtype !== 'm.text') return;

  const text = event.content.body;

  // In rooms with multiple users, only respond when mentioned
  const members = await client.getJoinedRoomMembers(roomId);
  if (members.length > 2) {
    // Check if bot is mentioned
    const botDisplayName = 'AI'; // Or get from profile
    if (!text.toLowerCase().includes(botDisplayName.toLowerCase()) &&
        !text.includes(botUserId)) {
      return;
    }
  }

  // Clean text (remove mentions)
  const cleanText = text
    .replace(new RegExp(botUserId, 'g'), '')
    .replace(/@\S+/g, '')
    .trim();

  if (!cleanText) return;

  // Get or create conversation
  if (!conversations.has(roomId)) {
    conversations.set(roomId, []);
  }
  const history = conversations.get(roomId);

  history.push({ role: 'user', content: cleanText });

  if (history.length > 20) {
    history.splice(0, history.length - 20);
  }

  try {
    // Send typing indicator
    await client.setTyping(roomId, true, 30000);

    const response = await anthropic.messages.create({
      model: 'claude-sonnet-4-20250514',
      max_tokens: 1024,
      system: 'You are a helpful AI assistant in a Matrix chat room. Keep responses concise.',
      messages: history,
    });

    const reply = response.content[0].text;
    history.push({ role: 'assistant', content: reply });

    // Stop typing
    await client.setTyping(roomId, false);

    // Send reply
    await client.sendText(roomId, reply);
  } catch (error) {
    console.error('Error:', error);
    await client.sendText(roomId, 'Sorry, I encountered an error.');
  }
});

// Start the bot
client.start().then(() => {
  console.log('Matrix bot started!');
});
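One subtlety with the splice-based trimming above: the Anthropic Messages API expects the history to start with a user turn, and a naive cut can leave an assistant message first. A sketch of trimming that preserves this invariant (a drop-in alternative, not part of the SDK):

```javascript
// Trim history to at most `max` entries, then drop leading entries
// until the first remaining one is a user turn. The Anthropic
// Messages API rejects histories that begin with an assistant message.
function trimHistory(history, max = 20) {
  let trimmed = history.slice(-max);
  while (trimmed.length && trimmed[0].role !== 'user') {
    trimmed = trimmed.slice(1);
  }
  return trimmed;
}
```

In the handler you would then write `conversations.set(roomId, trimHistory(history))` and pass the trimmed array to the API call.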

Matrix AI without the code

OpenClaw supports Matrix as a plugin. Configure and go.

Start free trial

Encrypted rooms (E2EE)

For end-to-end encrypted rooms, you need additional setup:

const {
  MatrixClient,
  SimpleFsStorageProvider,
  AutojoinRoomsMixin,
  RustSdkCryptoStorageProvider,
} = require('matrix-bot-sdk');

// Crypto storage for E2EE
const cryptoStorage = new RustSdkCryptoStorageProvider('./crypto-store');

const client = new MatrixClient(
  homeserverUrl,
  accessToken,
  storage,
  cryptoStorage
);

// The rest is the same - the SDK handles encryption transparently

Note: E2EE support in bots can be complex. The matrix-bot-sdk handles most of it, but key verification and cross-signing require additional work.

Slash commands

client.on('room.message', async (roomId, event) => {
  if (event.sender === botUserId) return;
  if (event.content?.msgtype !== 'm.text') return;

  const text = event.content.body;

  // Handle commands
  if (text.startsWith('!')) {
    const [command, ...args] = text.slice(1).split(' ');

    switch (command) {
      case 'help':
        await client.sendText(roomId, `Available commands:
!help - Show this message
!clear - Clear conversation history
!model - Show current AI model
!ask <question> - Ask a one-off question`);
        return;

      case 'clear':
        conversations.delete(roomId);
        await client.sendText(roomId, 'Conversation cleared.');
        return;

      case 'ask': {
        // One-off question without conversation history
        const question = args.join(' ');
        if (!question) {
          await client.sendText(roomId, 'Usage: !ask <question>');
          return;
        }
        const response = await anthropic.messages.create({
          model: 'claude-sonnet-4-20250514',
          max_tokens: 1024,
          messages: [{ role: 'user', content: question }],
        });
        await client.sendText(roomId, response.content[0].text);
        return;
      }
    }
  }

  // Regular message handling...
});
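As the command list grows, it can be cleaner to pull the parsing out of the switch into a small helper. A sketch (hypothetical helper, not part of matrix-bot-sdk):

```javascript
// Parse a "!command arg1 arg2" message into { command, args }.
// Returns null for messages that are not commands.
function parseCommand(text) {
  if (!text.startsWith('!')) return null;
  const [command, ...args] = text.slice(1).trim().split(/\s+/);
  if (!command) return null;
  return { command: command.toLowerCase(), args };
}
```

Lower-casing the command name makes `!Help` and `!help` behave the same, which matters in bridged rooms where clients may auto-capitalize.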

Rich formatting

// Send formatted message (HTML)
await client.sendHtmlText(
  roomId,
  '<strong>Bold</strong> and <em>italic</em> text',
  '**Bold** and *italic* text' // Fallback plain text
);

// Send code block
const code = `function hello() {
  console.log('Hello!');
}`;

await client.sendHtmlText(
  roomId,
  `<pre><code>${code}</code></pre>`,
  code
);
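Note that the code-block example interpolates raw text into HTML. Matrix clients render the formatted body as HTML, so any untrusted content (including AI output containing `<` or `&`) should be escaped first. A minimal sketch:

```javascript
// Escape characters with special meaning in HTML before embedding
// text in a formatted (org.matrix.custom.html) message body.
function escapeHtml(text) {
  return text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}
```

With this helper, the code-block example becomes `client.sendHtmlText(roomId, `<pre><code>${escapeHtml(code)}</code></pre>`, code)`.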

File and image handling

client.on('room.message', async (roomId, event) => {
  // Handle images
  if (event.content?.msgtype === 'm.image') {
    const mxcUrl = event.content.url; // mxc://server/media-id

    // Download via the SDK, which resolves the mxc URI and handles
    // the homeserver's media endpoints for us
    const { data } = await client.downloadContent(mxcUrl);
    const base64 = data.toString('base64');

    const aiResponse = await anthropic.messages.create({
      model: 'claude-sonnet-4-20250514',
      max_tokens: 1024,
      messages: [{
        role: 'user',
        content: [
          {
            type: 'image',
            source: {
              type: 'base64',
              media_type: event.content.info?.mimetype || 'image/jpeg',
              data: base64,
            },
          },
          {
            type: 'text',
            text: 'What is in this image?',
          },
        ],
      }],
    });

    await client.sendText(roomId, aiResponse.content[0].text);
  }
});
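It can also be worth validating the `mxc://` URI before attempting a download, since malformed events do occur in the wild. An mxc URI is just a server name plus a media ID, so a small parser suffices (a sketch, not part of the SDK):

```javascript
// Split an mxc:// URI into its homeserver name and media ID.
// Returns null if the URI is not a valid mxc URI.
function parseMxcUrl(mxcUrl) {
  const match = /^mxc:\/\/([^/]+)\/([^/]+)$/.exec(mxcUrl);
  if (!match) return null;
  return { serverName: match[1], mediaId: match[2] };
}
```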

Decentralized AI, zero hassle

OpenClaw supports Matrix with E2EE. Install it as a plugin on Molted, or self-host.

Try free for 24 hours

Running your own homeserver

For maximum control, run your own Matrix homeserver:

# Using Synapse (reference implementation)
docker run -d --name synapse \
  -v synapse-data:/data \
  -e SYNAPSE_SERVER_NAME=matrix.yourdomain.com \
  -e SYNAPSE_REPORT_STATS=no \
  -p 8008:8008 \
  matrixdotorg/synapse:latest

# Or Conduit for lighter resource usage
# Or Dendrite for the newer Go implementation

Bridging to other platforms

Matrix bridges let you connect to other platforms. Your AI bot can respond to messages bridged from Slack, Discord, or Telegram.

  • mautrix-slack - Bridge Slack workspaces
  • mautrix-discord - Bridge Discord servers
  • mautrix-telegram - Bridge Telegram chats

With bridges, one Matrix bot can effectively serve multiple platforms.
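If your bot should treat bridged users differently (say, skipping the mention requirement for them), you can often recognize them by the "ghost" user ID prefix the bridge creates. The prefixes below are the common mautrix defaults, but they are configurable per deployment, so adjust them to match your bridges:

```javascript
// Localpart prefixes that mautrix bridges use for ghost users by
// default. These are deployment-specific; check your bridge config.
const BRIDGE_PREFIXES = ['slack_', 'discord_', 'telegram_'];

// Return true if a Matrix user ID looks like a bridge ghost user.
function isBridgedUser(userId) {
  const localpart = userId.slice(1).split(':')[0]; // strip '@' and domain
  return BRIDGE_PREFIXES.some((prefix) => localpart.startsWith(prefix));
}
```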

OpenClaw for Matrix

OpenClaw supports Matrix as a plugin:

  • Automatic E2EE handling
  • Works with any homeserver
  • Bridging-aware (recognizes bridged users)
  • Admin controls for room access


AI for the decentralized web

OpenClaw on Matrix. Your server, your rules, your AI.

Start free trial

24-hour free trial · No credit card required · Cancel anytime
