Introduction

AWS EC2 is one of the most popular platforms for running OpenClaw in production. It offers reliable 24/7 hosting, flexible instance sizing, global regions for data residency, and seamless integration with AWS services like Secrets Manager and CloudWatch. Whether you're running a single personal agent or scaling to multiple business agents, EC2 provides the control and reliability you need.

This guide walks through a complete OpenClaw-on-EC2 deployment, from choosing the right instance type to configuring webhooks for Telegram and WhatsApp, securing your deployment, and keeping it running reliably. Follow along step-by-step, or use the video walkthrough for a visual guide.

Quick Reference

  • Recommended instance: t3.small (2 vCPU, 2 GB) for cloud LLM; t3.medium (2 vCPU, 4 GB) for Ollama
  • Region: us-east-1 (cheapest, low latency to OpenAI); eu-central-1 for EU data
  • Cost: ~$20–30/mo infrastructure + $30–100/mo API
  • Setup time: 45–90 minutes first run

Video Walkthrough

For a visual, step-by-step walkthrough of the entire EC2 setup process, watch this comprehensive video guide:

The video covers instance launch, SSH connection, OpenClaw installation, configuration, and first-run verification. Use it alongside this guide for the best experience.

Prerequisites

Before you begin, ensure you have:

  • AWS account — Sign up at aws.amazon.com if needed. Free tier includes 750 hours/month of t2.micro (not recommended for OpenClaw — too small).
  • LLM API key — OpenAI, Anthropic, or Google. Set a $20–50 spending limit before running.
  • Telegram bot token — Create via @BotFather. Easiest first channel. See Telegram setup.
  • SSH key pair — You'll create one during EC2 launch, or use an existing one.
  • Basic terminal familiarity — You'll run commands via SSH. No advanced Linux experience required.
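
A quick way to confirm the bot token works before touching EC2 is Telegram's getMe endpoint (the token below is a placeholder):

```shell
# Placeholder token; substitute the one @BotFather gave you.
TELEGRAM_BOT_TOKEN="123456789:AAH-your-token-here"

# A valid token returns {"ok":true,"result":{...}}; a bad one returns {"ok":false,...}.
curl -s "https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/getMe"
```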

EC2 Instance Selection

Choosing the right instance type and region is critical for performance and cost.

Instance Type

Instance     vCPU   RAM    Cost/mo   Best For
t3.micro     2      1 GB   ~$8       Too small; avoid
t3.small     2      2 GB   ~$15      Single agent, cloud LLM only
t3.medium    2      4 GB   ~$30      Ollama + OpenClaw, or heavier use
t3.large     2      8 GB   ~$60      Multiple agents, larger Ollama models

Recommendation: Start with t3.small for cloud LLM. Upgrade to t3.medium if you add Ollama or run multiple agents.

AMI (Operating System)

Ubuntu 22.04 LTS — Most popular. Excellent Docker support, familiar to developers. Use "Ubuntu Server 22.04 LTS" from the Quick Start list.

Amazon Linux 2023 — Native AWS integration. Slightly different package names (dnf vs apt). Also works well.

Region

us-east-1 (N. Virginia) — Cheapest, lowest latency to OpenAI (they run on AWS). Default choice for most users.

eu-central-1 (Frankfurt) — EU data residency. Required for GDPR-sensitive deployments.

ap-southeast-1 (Singapore) — APAC. Low latency for Asian users.

ap-south-1 (Mumbai) — India data residency.

Storage

20–30 GB gp3 — Config and memory are small (under 1 GB typically). Logs can grow. 30 GB gives headroom. gp3 is faster and often cheaper than gp2.

Launching Your EC2 Instance

  1. Log into AWS Console → EC2 → Launch Instance.
  2. Name: e.g., "openclaw-production".
  3. AMI: Ubuntu Server 22.04 LTS (64-bit).
  4. Instance type: t3.small (or t3.medium for Ollama).
  5. Key pair: Create new or select existing. Download the .pem file. Store securely. You need this for SSH.
  6. Network settings: Create security group (see Security Hardening below). Allow SSH (22) from your IP only.
  7. Storage: 30 GB gp3.
  8. Launch.

Wait 1–2 minutes for the instance to reach "running" state. Note the public IP address (or use Elastic IP — see below).

Connecting via SSH

# Fix key permissions (required)
chmod 400 ~/Downloads/your-key.pem

# Connect (Amazon Linux uses ec2-user; Ubuntu uses ubuntu)
ssh -i ~/Downloads/your-key.pem ubuntu@YOUR_EC2_PUBLIC_IP

Replace YOUR_EC2_PUBLIC_IP with your instance's public IPv4 address from the EC2 console. On first connect, you may see a host key verification prompt — type yes.

You should now see a terminal prompt. You're connected to your EC2 instance.

Installing Dependencies

Update the system and install Node.js, Git, and Docker (optional but recommended):

# Update packages
sudo apt update && sudo apt upgrade -y

# Install Node.js 20.x
curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
sudo apt install -y nodejs

# Verify
node --version   # Should show v20.x
npm --version

# Install Git
sudo apt install -y git

# Install Docker (recommended for production)
sudo apt install -y docker.io
sudo systemctl start docker
sudo systemctl enable docker
sudo usermod -aG docker ubuntu
# Log out and back in for docker group to take effect, or run: newgrp docker

Installing OpenClaw

Option A: npm (direct install)

sudo npm install -g openclaw
openclaw --version

Option B: From source (for latest)

git clone https://github.com/openclaw-foundation/openclaw.git
cd openclaw
npm install
npm run build
# Run with: npm start

Option C: Docker (recommended — see Docker Deployment section)

docker pull openclaw/openclaw:latest

Configuring OpenClaw

Create config directory and run setup:

mkdir -p ~/.openclaw
cd ~/.openclaw

# Create config.yaml (or run: openclaw setup)
nano config.yaml

Minimum config for Telegram + OpenAI:

llm:
  default_provider: openai
  providers:
    openai:
      api_key: "${OPENAI_API_KEY}"
      model: "gpt-4o-mini"

channels:
  telegram:
    enabled: true
    bot_token: "${TELEGRAM_BOT_TOKEN}"
    allowed_user_ids:
      - YOUR_TELEGRAM_USER_ID

gateway:
  host: "127.0.0.1"
  port: 18789

Use environment variables for secrets. Create ~/.openclaw/.env:

OPENAI_API_KEY=sk-your-key-here
TELEGRAM_BOT_TOKEN=123456789:AAH...
ANTHROPIC_API_KEY=sk-ant-...   # Optional

Set permissions: chmod 600 ~/.openclaw/.env. Never commit .env to Git.

Find your Telegram user ID: message @userinfobot on Telegram. Add the numeric ID to allowed_user_ids.
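
If you'd rather not message @userinfobot, your own bot can tell you: send it any message, then query getUpdates (placeholder token below):

```shell
# Placeholder token; message your bot first so there is at least one update.
TELEGRAM_BOT_TOKEN="123456789:AAH-your-token-here"

# The sender appears as "from":{"id":123456789,...}; that numeric id
# is what goes in allowed_user_ids.
curl -s "https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/getUpdates"
```

Note that getUpdates returns nothing while a webhook is registered, so do this before webhook setup.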

Docker Deployment (Recommended)

Docker provides isolation, easier updates, and a consistent runtime environment. See OpenClaw Docker for full details.

# Create directories
mkdir -p ~/openclaw/config ~/openclaw/memory

# Copy your config
cp ~/.openclaw/config.yaml ~/openclaw/config/
cp ~/.openclaw/.env ~/openclaw/  # Or use -e flags

# Run OpenClaw in Docker
docker run -d \
  --name openclaw \
  --restart unless-stopped \
  -v ~/openclaw/config:/app/config \
  -v ~/openclaw/memory:/app/memory \
  --env-file ~/openclaw/.env \
  -p 127.0.0.1:18789:18789 \
  openclaw/openclaw:latest

# Check logs
docker logs -f openclaw

Important: Bind Gateway to 127.0.0.1 only. Never expose port 18789 to the public internet. See security guide.
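
The docker run command above can also be captured in a compose file, so the flags live in version control. A sketch assuming the same image and paths, saved as ~/openclaw/docker-compose.yml:

```yaml
services:
  openclaw:
    image: openclaw/openclaw:latest
    restart: unless-stopped
    env_file: .env
    volumes:
      - ./config:/app/config
      - ./memory:/app/memory
    ports:
      # Loopback only; never publish 18789 on all interfaces
      - "127.0.0.1:18789:18789"
```

Start it with docker compose up -d from ~/openclaw; docker compose logs -f replaces docker logs -f openclaw.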

Running as a Persistent Service

For non-Docker installs, use systemd so OpenClaw starts on boot and restarts on crash:

sudo nano /etc/systemd/system/openclaw.service

[Unit]
Description=OpenClaw AI Agent
After=network.target
Wants=network-online.target

[Service]
Type=simple
User=ubuntu
Group=ubuntu
WorkingDirectory=/home/ubuntu
EnvironmentFile=/home/ubuntu/.openclaw/.env
ExecStart=/usr/bin/openclaw start
Restart=on-failure
RestartSec=10s
StandardOutput=journal
StandardError=journal

[Install]
WantedBy=multi-user.target

sudo systemctl daemon-reload
sudo systemctl enable openclaw
sudo systemctl start openclaw
sudo systemctl status openclaw
sudo journalctl -u openclaw -f   # View logs

Webhook Setup for Telegram & WhatsApp

Telegram and WhatsApp use webhooks to push messages to your agent. Your EC2 must be reachable via HTTPS. Two options:

Option 1: Public IP + HTTPS Tunnel (ngrok)

For testing, use ngrok to expose localhost:

# Install ngrok
curl -s https://ngrok-agent.s3.amazonaws.com/ngrok.asc | sudo tee /etc/apt/trusted.gpg.d/ngrok.asc >/dev/null
echo "deb https://ngrok-agent.s3.amazonaws.com buster main" | sudo tee /etc/apt/sources.list.d/ngrok.list
sudo apt update && sudo apt install ngrok

# Run tunnel (requires ngrok account + auth token)
ngrok http 18789

Use the ngrok HTTPS URL as your webhook. Note: ngrok URLs change on restart unless you have a paid plan.

Option 2: Domain + Reverse Proxy (Production)

For production, use a domain and Nginx/Caddy with SSL:

  1. Point a subdomain (e.g., openclaw.yourdomain.com) to your Elastic IP.
  2. Install Caddy: add the official Caddy apt repository first (it isn't in the default Ubuntu repos; see caddyserver.com/docs/install), then sudo apt install caddy.
  3. Configure Caddy to proxy to 127.0.0.1:18789 with automatic HTTPS.
  4. Open port 443 in security group. Restrict to webhook IPs if possible.
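
Step 3's Caddy config is only a few lines. A minimal Caddyfile sketch (the domain is a placeholder; Caddy obtains and renews the TLS certificate automatically):

```
# /etc/caddy/Caddyfile
openclaw.yourdomain.com {
    reverse_proxy 127.0.0.1:18789
}
```

Apply it with sudo systemctl reload caddy.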

Telegram webhook: https://openclaw.yourdomain.com/webhook/telegram. Configure in OpenClaw or via Telegram API.
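
Registering that URL via the Telegram API is a single call (token and domain below are placeholders):

```shell
TELEGRAM_BOT_TOKEN="123456789:AAH-your-token-here"
WEBHOOK_URL="https://openclaw.yourdomain.com/webhook/telegram"

# Register the webhook; Telegram replies {"ok":true,...} on success.
curl -s "https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/setWebhook" \
  -d "url=${WEBHOOK_URL}"

# Verify what Telegram currently has on file.
curl -s "https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/getWebhookInfo"
```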

See Telegram setup and WhatsApp setup for channel-specific details.

Elastic IP for Stable Addressing

EC2 public IPs change when you stop/start the instance. For webhooks, you need a stable address.

  1. EC2 → Elastic IPs → Allocate Elastic IP address.
  2. Associate with your instance.
  3. Update your DNS or webhook config to use the Elastic IP.

Cost: Elastic IPs are free while associated with a running instance; you're charged a small hourly fee for one that's allocated but not attached.

Security Hardening

EC2 deployments require careful security. See Is OpenClaw Safe? for full guidance.

  • Security group: Inbound: 22 (SSH) from your IP only. 443 (HTTPS) from 0.0.0.0/0 only if you need webhooks — consider restricting to Telegram/WhatsApp IP ranges. Outbound: 443 for API calls. No port 18789 to 0.0.0.0/0.
  • Gateway binding: Always bind to 127.0.0.1. Never 0.0.0.0.
  • Secrets: Use .env or AWS Secrets Manager. Never hardcode API keys in config.
  • SSH: Use key-based auth. Disable password auth: PasswordAuthentication no in /etc/ssh/sshd_config.
  • Updates: sudo apt update && sudo apt upgrade -y regularly. Enable unattended-upgrades.
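
The SSH hardening above amounts to a couple of lines in /etc/ssh/sshd_config (restart with sudo systemctl restart ssh afterwards):

```
# /etc/ssh/sshd_config
PasswordAuthentication no
# Optional extra hardening beyond the checklist above:
PermitRootLogin prohibit-password
```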

Monitoring & Logging

CloudWatch: EC2 sends basic metrics (CPU, network) to CloudWatch by default. Add custom metrics if needed.

OpenClaw logs: journalctl -u openclaw -f (systemd) or docker logs -f openclaw (Docker).

Disk space: Monitor with df -h. Logs can grow. Configure log rotation.
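
If OpenClaw writes file logs, a logrotate drop-in keeps them bounded. A sketch assuming a hypothetical log directory (adjust the path to wherever your logs actually land):

```
# /etc/logrotate.d/openclaw
/home/ubuntu/openclaw/logs/*.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}
```

For systemd installs, journald handles its own rotation; tune it in /etc/systemd/journald.conf if needed.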

Alerts: Use CloudWatch Alarms for CPU, disk, or instance status. SNS for email/SMS alerts.

Backup & Data Persistence

Your OpenClaw memory and config are critical. Back them up.

  • EBS snapshots: EC2 → Volumes → Select volume → Create snapshot. Schedule weekly with AWS Backup or Lambda.
  • S3 sync: aws s3 sync ~/openclaw s3://your-bucket/openclaw-backup/. Run via cron.
  • Config in Git: Store config.yaml (without secrets) in a private repo. Secrets stay in .env or Secrets Manager.
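
The S3 sync above can run nightly from cron (the bucket name is a placeholder; this assumes the AWS CLI is installed and configured with credentials that can write to the bucket):

```
# crontab -e
# Nightly backup at 03:00 server time
0 3 * * * aws s3 sync /home/ubuntu/openclaw s3://your-bucket/openclaw-backup/
```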

Cost Breakdown

Item                         Monthly Cost
EC2 t3.small                 ~$15
EBS 30 GB gp3                ~$3
Data transfer (typical)      ~$1–2
OpenAI API (moderate use)    $30–80
Total                        ~$50–100

Use Reserved Instances or Savings Plans for 30–40% EC2 discount. See OpenClaw pricing for API cost optimization.

Troubleshooting

Agent doesn't respond on Telegram: Verify bot token, allowed_user_ids, and that webhook URL is reachable. Check firewall and security group. Test with curl from outside.

Out of memory: t3.small has 2 GB. OpenClaw + browser automation can exceed that. Upgrade to t3.medium or reduce concurrent tasks.

High API costs: Enable two-tier processing. Set spending limits. Use cheaper models for Heartbeat.

Connection refused on webhook: With a reverse proxy on the same host, the Gateway can stay bound to 127.0.0.1; verify the proxy is forwarding to 127.0.0.1:18789 and that the security group allows 443.

Instance unreachable: Check security group. Verify Elastic IP is associated. Ensure instance is running.

Frequently Asked Questions

Can I use AWS Free Tier? t2.micro (1 GB RAM) is too small for OpenClaw. t3.micro might work for very light use but will be slow. t3.small is the minimum recommended.

What about Spot Instances? 60–70% cheaper. Can be interrupted. OpenClaw persists to disk — restart and resume. Good for non-critical workloads. Use with caution for production.

Can I use AWS Bedrock instead of OpenAI? Yes. OpenClaw supports Bedrock as an LLM provider. Keeps inference in AWS. Check OpenClaw docs for Bedrock config.

How do I update OpenClaw? Docker: docker pull openclaw/openclaw:latest && docker restart openclaw. npm: sudo npm update -g openclaw. Restart the service.

Can I run multiple OpenClaw agents on one EC2? Yes. Use different config directories and ports. Or use ECS for orchestration. See OpenClaw on AWS for ECS patterns.

Wrapping Up

Deploying OpenClaw on EC2 gives you a reliable, scalable, 24/7 AI agent in the cloud. Follow this guide step-by-step, or use the video walkthrough for a visual guide. Start with t3.small, secure your deployment, and scale up as needed.

For more: OpenClaw on AWS (ECS, regions), Docker deployment, HEARTBEAT.md (proactive tasks), and security best practices.