MCPWorks

Build an Ethereum Price Tracker Agent with MCPWorks

Simon Carr

MCPWorks Agents are autonomous containers with their own AI brain, persistent state, scheduled functions, and communication channels. You configure them by talking to your AI assistant. The agent runs independently after that.

This post walks through building one from scratch: an Ethereum price tracker that monitors ETH/USD, stores price history in a free Turso database, detects significant price movements, and sends AI-generated market summaries to a Discord channel — all using an OpenRouter LLM for analysis.

No dashboard clicks. No YAML files. Every step happens through natural conversation with Claude Code (or any MCP client connected to your MCPWorks namespace).

What are MCPWorks Agents?

An MCPWorks Agent is a named, containerized entity on MCPWorks infrastructure. Each agent gets its own subdomain (eth-tracker.agent.mcpworks.io), encrypted state storage, scheduled triggers, webhook endpoints, and an optional AI brain that can reason about function outputs and decide what to do next.

The key concepts:

Functions: Python code that runs in a secure sandbox — your agent's capabilities
Schedules: Cron expressions that trigger functions on a timer
AI brain: An LLM (any provider) that analyzes outputs and makes decisions
Channels: Discord, Slack, or email — where the agent sends reports
State: Encrypted key-value store for persistent data between runs
Orchestration: Controls how the AI interacts with function execution

Agents follow the BYOAI principle. MCPWorks provides the infrastructure — compute, state, scheduling, channels. You bring the AI engine and API key. Switch from Claude to GPT to Gemini whenever you want. The agent keeps running.

Step 1: Create the agent and service

Start by creating the agent and a service to organize its functions. Services are logical groupings — think of them as folders for related capabilities.

make_agent(
  name="eth-tracker",
  display_name="Ethereum Price Tracker"
)

make_service(
  name="price-monitor",
  description="ETH price tracking, storage, and alerting"
)

The agent is now live at eth-tracker.agent.mcpworks.io but idle. It has no functions, no schedule, and no AI. We'll add each piece.

Step 2: Write the price fetching function

This function hits the CoinGecko API (free, no key required), grabs the current ETH/USD price and 24-hour change, and writes it to a Turso database.

Turso gives you 500 databases and 9 GB of storage on the free tier. Create a database at turso.tech, grab your database URL and auth token, and store them in agent state so your functions can access them securely.

set_agent_state("eth-tracker", "turso_url",
  "libsql://your-db-name-your-org.turso.io")
set_agent_state("eth-tracker", "turso_token",
  "your-turso-auth-token")
set_agent_state("eth-tracker", "alert_threshold_pct", "3.0")

Now create the function:

make_function(
  service="price-monitor",
  name="fetch-eth-price",
  backend="code_sandbox",
  language="python",
  code="""
import httpx
import json
from datetime import datetime, timezone

turso_url = context["state"]["turso_url"]
turso_token = context["state"]["turso_token"]
headers = {"Authorization": f"Bearer {turso_token}",
           "Content-Type": "application/json"}

# Create the table on first run (a no-op once it exists)
httpx.post(f"{turso_url}/v2/pipeline", headers=headers,
  json={"requests": [{"type": "execute", "stmt": {
    "sql": "CREATE TABLE IF NOT EXISTS eth_prices "
           "(id INTEGER PRIMARY KEY AUTOINCREMENT, "
           "price REAL, change_24h REAL, volume REAL, "
           "market_cap REAL, recorded_at TEXT)"
  }}, {"type": "close"}]})

# Fetch current price from CoinGecko
resp = httpx.get(
  "https://api.coingecko.com/api/v3/simple/price",
  params={"ids": "ethereum",
          "vs_currencies": "usd",
          "include_24hr_change": "true",
          "include_24hr_vol": "true",
          "include_market_cap": "true"})
data = resp.json()["ethereum"]

price = data["usd"]
change = data["usd_24h_change"]
volume = data["usd_24h_vol"]
mcap = data["usd_market_cap"]
now = datetime.now(timezone.utc).isoformat()

# Store in Turso
httpx.post(f"{turso_url}/v2/pipeline", headers=headers,
  json={"requests": [{"type": "execute", "stmt": {
    "sql": "INSERT INTO eth_prices "
           "(price, change_24h, volume, market_cap, recorded_at) "
           "VALUES (?, ?, ?, ?, ?)",
    "args": [{"type": "float", "value": str(price)},
             {"type": "float", "value": str(change)},
             {"type": "float", "value": str(volume)},
             {"type": "float", "value": str(mcap)},
             {"type": "text", "value": now}]
  }}, {"type": "close"}]})

# Check for alert condition
threshold = float(context["state"].get("alert_threshold_pct", "3.0"))
alert = abs(change) >= threshold

return {
  "price": price,
  "change_24h_pct": round(change, 2),
  "volume_24h": round(volume, 2),
  "market_cap": round(mcap, 2),
  "recorded_at": now,
  "alert_triggered": alert,
  "threshold_pct": threshold
}
"""
)

Every execution writes a row to Turso and returns structured data. The alert_triggered flag tells the AI brain whether this price movement is worth reporting.
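The alert check itself is simple enough to sketch in isolation. Here is a pure-Python version of the decision the function makes; the payload shape mimics CoinGecko's response, and the numbers are illustrative:

```python
def alert_from_quote(quote: dict, threshold_pct: float = 3.0) -> dict:
    """Mirror of the alert check in fetch-eth-price, as a pure
    function over a CoinGecko-style payload."""
    change = quote["usd_24h_change"]
    return {
        "price": quote["usd"],
        "change_24h_pct": round(change, 2),
        "alert_triggered": abs(change) >= threshold_pct,
    }

# A 4.2% move crosses the default 3% threshold and triggers an alert
sample = {"usd": 2847.32, "usd_24h_change": 4.217}
decision = alert_from_quote(sample)
```

Because the threshold is read from agent state at run time, changing alert_threshold_pct changes this decision without touching the function code.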

Step 3: Build the market summary function

This function queries Turso for recent price history and returns it in a format the AI can analyze. We keep this separate from the fetch function so the AI can call it independently when composing reports.

make_function(
  service="price-monitor",
  name="get-price-history",
  backend="code_sandbox",
  language="python",
  code="""
import httpx

turso_url = context["state"]["turso_url"]
turso_token = context["state"]["turso_token"]
headers = {"Authorization": f"Bearer {turso_token}",
           "Content-Type": "application/json"}

hours = input.get("hours", 24)

resp = httpx.post(f"{turso_url}/v2/pipeline", headers=headers,
  json={"requests": [{"type": "execute", "stmt": {
    "sql": "SELECT price, change_24h, volume, market_cap, "
           "recorded_at FROM eth_prices "
           "WHERE datetime(recorded_at) >= datetime('now', ?) "
           "ORDER BY recorded_at DESC",
    "args": [{"type": "text", "value": f"-{hours} hours"}]
  }}, {"type": "close"}]})

result = resp.json()
rows = result["results"][0].get("response", {}).get(
  "result", {}).get("rows", [])

prices = [{"price": r[0]["value"], "change_24h": r[1]["value"],
           "volume": r[2]["value"], "market_cap": r[3]["value"],
           "time": r[4]["value"]} for r in rows]

if prices:
  price_vals = [float(p["price"]) for p in prices]
  return {
    "count": len(prices),
    "latest": prices[0],
    "high": max(price_vals),
    "low": min(price_vals),
    "spread_pct": round(
      (max(price_vals) - min(price_vals))
      / min(price_vals) * 100, 2),
    "prices": prices[:20]
  }
return {"count": 0, "message": "No data yet"}
""",
  input_schema={
    "type": "object",
    "properties": {
      "hours": {
        "type": "integer",
        "description": "Hours of history to retrieve",
        "default": 24
      }
    }
  }
)
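The summary math in that function is worth seeing on its own. This is the same high/low/spread calculation, lifted out as a standalone function over a plain list of floats:

```python
def summarize_prices(price_vals: list[float]) -> dict:
    """High, low, and spread as a percentage of the low —
    the trend context get-price-history hands to the AI."""
    high, low = max(price_vals), min(price_vals)
    return {
        "count": len(price_vals),
        "high": high,
        "low": low,
        "spread_pct": round((high - low) / low * 100, 2),
    }
```

A 24-hour range of $2,712 to $2,851 works out to a spread of about 5.1%, which is the single number that tells the AI whether the day was quiet or volatile.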

Step 4: Configure the AI brain with OpenRouter

Now we give the agent an LLM that can reason about price data and compose reports. OpenRouter provides a unified API across dozens of models. Pick whichever suits your budget — Google's Gemini 2.5 Flash is a strong choice for structured analysis at low cost, or use Anthropic's Claude Sonnet 4 for more nuanced market commentary.

configure_agent_ai(
  agent_name="eth-tracker",
  engine="openrouter",
  model="google/gemini-2.5-flash",
  api_key="your-openrouter-api-key",
  system_prompt="""You are an Ethereum market analyst agent.

When you receive price data from fetch-eth-price:
- If alert_triggered is true, compose a concise market alert
  with the price, percentage change, and a brief analysis of
  what might be driving the movement.
- If alert_triggered is false, acknowledge the data silently.
  Do not send routine updates to Discord unless asked.

When composing reports, call get-price-history to include
trend context (high, low, spread over the last 24 hours).

Format Discord messages with markdown. Use bold for the price
and percentage. Keep alerts under 280 characters. Keep daily
summaries under 800 characters.

You track Ethereum. Do not speculate on future prices. Report
what happened and provide context, not predictions.""",
  auto_channel="discord"
)

The auto_channel parameter tells the agent to automatically route AI-generated messages to Discord. When the AI decides something is worth reporting, it goes straight to your channel.

Step 5: Connect Discord

Create a Discord bot at the Discord Developer Portal, add it to your server with "Send Messages" permission, and grab the bot token and guild (server) ID.

add_channel(
  agent_name="eth-tracker",
  channel_type="discord",
  config={
    "bot_token": "your-discord-bot-token",
    "guild_id": "your-discord-guild-id"
  }
)

The agent can now post to your Discord server. Messages sent by the AI brain (when auto_channel is set) or explicitly by functions will appear in the configured channel.

Step 6: Schedule the price checks

Set a cron schedule to fetch prices every 5 minutes. The run_then_reason orchestration mode executes the function first, then passes the result to the AI brain for analysis. The AI only composes a Discord message if the alert threshold is triggered.

add_schedule(
  agent_name="eth-tracker",
  function_name="price-monitor.fetch-eth-price",
  cron_expression="*/5 * * * *",
  orchestration_mode="run_then_reason",
  failure_policy={"strategy": "continue"},
  timezone="America/Toronto"
)
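If cron syntax isn't second nature, here is a minimal matcher for the subset used in this post (bare values, *, and */N fields only; a real cron implementation also handles ranges, lists, and names). It exists purely to sanity-check expressions like the one above:

```python
from datetime import datetime

def matches_simple_cron(expr: str, dt: datetime) -> bool:
    """Check a datetime against a 5-field cron expression.
    Minimal sketch: supports '*', '*/N', and bare numbers only."""
    fields = expr.split()
    # Field order: minute, hour, day-of-month, month, day-of-week (0 = Sunday)
    values = [dt.minute, dt.hour, dt.day, dt.month, dt.isoweekday() % 7]
    for spec, value in zip(fields, values):
        if spec == "*":
            continue
        if spec.startswith("*/"):
            if value % int(spec[2:]) != 0:
                return False
        elif int(spec) != value:
            return False
    return True
```

So "*/5 * * * *" fires whenever the minute is divisible by 5, and "0 9 * * *" fires at minute 0 of hour 9 — once a day.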

You can also add a daily summary schedule that runs once per day:

add_schedule(
  agent_name="eth-tracker",
  function_name="price-monitor.get-price-history",
  cron_expression="0 9 * * *",
  orchestration_mode="run_then_reason",
  failure_policy={"strategy": "continue"},
  timezone="America/Toronto"
)

Every morning at 9 AM, the agent fetches 24 hours of history, passes it to the AI, and the AI sends a daily market summary to Discord.

Step 7: Add a webhook for manual triggers

Sometimes you want an on-demand report without waiting for the schedule. Add a webhook so you (or another service) can trigger a summary by hitting a URL.

add_webhook(
  agent_name="eth-tracker",
  path="report",
  handler_function_name="price-monitor.get-price-history",
  orchestration_mode="run_then_reason"
)

Now a POST to eth-tracker.agent.mcpworks.io/webhook/report triggers a full analysis cycle. Call it from a Discord slash command, a bookmark, or another agent.
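A manual trigger is just an HTTP POST. Here is a quick sketch with Python's standard library; the request is constructed but not sent, since the URL only resolves once your agent is live, and the hours payload is an assumption about what the webhook forwards to the function's input:

```python
import json
import urllib.request

# Manual-trigger request for the webhook added above.
# The body is assumed to be forwarded as the function's input.
url = "https://eth-tracker.agent.mcpworks.io/webhook/report"
req = urllib.request.Request(
    url,
    data=json.dumps({"hours": 24}).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would fire the full analysis cycle
```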

How orchestration modes work

The orchestration mode controls the relationship between function execution and AI reasoning. This is the core design decision for any agent.

direct: Execute the function and return results, no AI involved. Best for data collection and simple writes.
reason_first: Send the trigger to the AI and let it decide which functions to call. Best for complex decision-making.
run_then_reason: Execute the function, then pass the output to the AI for analysis. Best for monitoring with conditional alerts.

For this tracker, run_then_reason is the right choice. The price fetch runs unconditionally (we always want data in the database). The AI only activates afterward to decide whether the result warrants a Discord message. This keeps AI token costs low — most 5-minute checks don't trigger an alert, so the AI gets a quick structured input and responds with nothing.

What a Discord alert looks like

When ETH moves 3% or more, your Discord channel receives something like:

ETH Alert: $2,847.32 (+4.2%)

Ethereum up 4.2% in the last 24h. 24h range: $2,712 – $2,851. Volume is elevated at $18.4B, roughly 40% above the 7-day average. The move coincides with broader market strength across major L1s.

The AI writes this based on the structured data from your functions. No templates, no hardcoded formats. Change the system prompt and the tone changes with it.
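To see how tight the 280-character budget is, here is a hypothetical formatter that applies the same rules the system prompt asks the AI to follow (bold price and change, range context, hard length cap). The real agent writes its text freely rather than from a template; this just shows the constraints are easy to satisfy:

```python
def format_alert(price: float, change_pct: float,
                 high: float, low: float) -> str:
    """Hypothetical alert template matching the system prompt's
    rules: bold price and percentage, under 280 characters."""
    sign = "+" if change_pct >= 0 else ""
    msg = (f"**ETH Alert: ${price:,.2f} ({sign}{change_pct:.1f}%)**\n"
           f"24h range: ${low:,.0f} – ${high:,.0f}.")
    assert len(msg) <= 280  # the prompt's hard cap for alerts
    return msg
```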

Adjusting the alert threshold

The alert threshold lives in agent state. Change it any time without modifying functions:

set_agent_state("eth-tracker", "alert_threshold_pct", "5.0")

Now the agent only alerts on 5%+ moves. You can also add new state keys that your functions read — support/resistance levels, comparison assets, reporting preferences. State is an encrypted key-value store that persists across all function executions.

Why Turso instead of agent state?

MCPWorks agents have built-in encrypted state storage (set_agent_state / get_agent_state), which is great for configuration and small datasets. But price history is time-series data that grows continuously. You'll want to query it — "what was the high over the last 48 hours?" or "show me every 3%+ move this month."

That's a database problem, not a key-value problem. Turso fits because:

  • Free tier is generous — 500 databases, 9 GB storage, 1 billion row reads per month
  • HTTP API — works from the MCPWorks code sandbox without drivers or TCP connections
  • SQLite semantics — datetime() functions, window queries, aggregations all work
  • Edge replicas — if you later need low-latency reads from multiple regions

Other free options work too. Neon offers free PostgreSQL with 0.5 GB storage. PlanetScale has a free MySQL tier. Any database with an HTTP API is accessible from the MCPWorks sandbox.
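Because Turso speaks SQLite, you can prototype queries against Python's built-in sqlite3 module before pointing them at the hosted database. A local sketch of the two example questions above, using the same schema as the agent's functions (the rows are illustrative):

```python
import sqlite3

# Same schema the fetch function creates, in an in-memory SQLite db
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE eth_prices (id INTEGER PRIMARY KEY AUTOINCREMENT, "
           "price REAL, change_24h REAL, volume REAL, market_cap REAL, "
           "recorded_at TEXT)")
rows = [(2712.0, -1.2, 1.1e10, 3.2e11, "2025-06-01T08:00:00"),
        (2790.5,  2.9, 1.4e10, 3.3e11, "2025-06-01T12:00:00"),
        (2851.0,  4.2, 1.8e10, 3.4e11, "2025-06-01T16:00:00")]
db.executemany("INSERT INTO eth_prices (price, change_24h, volume, "
               "market_cap, recorded_at) VALUES (?, ?, ?, ?, ?)", rows)

# "What was the high and low?" — an aggregation key-value state can't answer
high, low = db.execute("SELECT MAX(price), MIN(price) FROM eth_prices").fetchone()

# "Show me every 3%+ move" — filtering over the whole history
moves = db.execute("SELECT recorded_at, change_24h FROM eth_prices "
                   "WHERE ABS(change_24h) >= 3.0").fetchall()
```

Once a query behaves locally, the same SQL drops into a Turso v2/pipeline request unchanged.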

Extending the agent

Once the base tracker is running, you can layer on capabilities without rebuilding anything.

Multi-asset tracking. Add a fetch-btc-price function to the same service. Reuse the same Turso database with a separate table. The AI brain sees all functions and can correlate movements across assets.

Trade execution. Add a function that calls a CEX API (Coinbase, Kraken) to place limit orders when conditions are met. Lock the function with lock_function so only you can modify it — the AI can call it but can't change the code.

External MCP servers. Use configure_mcp_servers to connect your agent to other MCP services — news feeds, sentiment analysis, on-chain data — and the AI can pull from all of them when composing reports.

Talk to your agent. Use chat_with_agent to have a conversation with the AI brain directly. Ask it "what happened overnight?" and it'll call get-price-history, analyze the data, and respond in natural language.

Start building

Every step in this post is a tool call to your MCPWorks namespace. Open Claude Code, connect to your namespace, and start talking. The entire agent — functions, AI, Discord, schedule — can be configured in a single conversation.

MCPWorks Agents are available now in Developer Preview. Sign up at mcpworks.io and start building.

Questions? Reach out at [email protected] or find us on Bluesky.

Frequently asked questions

What does it cost to run this agent? MCPWorks infrastructure costs depend on your tier — the Developer Preview Builder tier is free for 90 days. The only external cost is your OpenRouter API key. With run_then_reason orchestration, the AI only processes tokens when an alert triggers, keeping costs to a few cents per day even at 5-minute intervals.

Can I use a different LLM instead of OpenRouter? Yes. MCPWorks agents support Anthropic, OpenAI, Google, and OpenRouter as AI engines. Use configure_agent_ai with engine="anthropic" and your Claude API key, or engine="openai" for GPT. OpenRouter is convenient because it gives you access to dozens of models through a single API key.

Do I need to know Python to build an agent? The function code in this post is Python, but you don't have to write it yourself. Tell Claude Code what you want the function to do and it will write the code, create the function, and configure the schedule — all through conversation. That's the point of MCP-native infrastructure.

Can the agent trade automatically? Yes, but with guardrails. Create a function that calls your exchange's API, then lock it with lock_function so the AI can invoke it but can't modify the code. Set orchestration limits with configure_orchestration_limits to cap how many functions the AI can call per cycle. Start with paper trading before going live.

How do I stop the agent if something goes wrong? Call stop_agent("eth-tracker"). This immediately pauses all schedules and webhooks. The agent's state and functions are preserved — call start_agent to resume. For permanent removal, destroy_agent deletes everything including state and function history.

Further reading

MCPWorks is open source.

Self-host free forever, or try MCPWorks Cloud — 14-day Pro trial, no credit card.

View on GitHub
Cloud Trial — Coming Soon