MillionScan is an on-chain perpetual futures trader analytics platform. We score and surface public on-chain trader performance data, continuously monitor it, and refresh it in real time. Built for developers, researchers, and informed observers. Information only. Not investment advice.

Cookbook

Drop-in recipes for the MillionScan API

Worked Python and TypeScript examples for building research, monitoring, and analytical tools on top of the on-chain perpetual futures trader API. Designed to paste straight into Claude Code, Cursor, Cline, Aider, or any AI-assisted coding workflow — every snippet runs as-is once you have an API key.

Get a key from /developers; full reference at /developers/docs. Informational and research use only — what you build with the data is your own software.

Recipe 1

Stream OPEN events into a research notebook

Run a Jupyter or Python script that listens to live OPEN events for the curated trader pool and writes each one to a pandas DataFrame for ad-hoc analysis.

Python · runs as-is
# pip install millionscan pandas
import os
import pandas as pd
from millionscan import Client

client = Client(api_key=os.environ["MILLIONSCAN_KEY"])

rows: list[dict] = []

# Subscribe to OPEN events across the entire curated pool.
# Pass trader_ids=[...] to scope to a watchlist instead.
for event in client.stream.subscribe(events=["OPEN"]):
    rows.append({
        "timestamp": event.timestamp,
        "public_id": event.public_id,
        "score": event.score,
        "coin": event.coin,
        "side": event.side,
        "size": event.size,
        "entry_price": event.entry_price,
        "leverage": event.leverage,
    })
    log = pd.DataFrame(rows)
    print(f"[{event.timestamp}] #{event.public_id} {event.action} {event.side} {event.coin} @ {event.entry_price}")

    if len(rows) % 50 == 0:
        log.to_parquet("opens.parquet")
        print(f"snapshot saved ({len(rows)} rows)")

Why this works. WebSocket streaming delivers events within seconds of on-chain confirmation. The DataFrame becomes a living research log you can query, plot, or join against price / funding data later.
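Once `opens.parquet` exists you can query it offline in a later session. A minimal sketch, using a few hand-built rows in place of `pd.read_parquet("opens.parquet")` so it runs standalone:

```python
import pandas as pd

# Stand-in for: log = pd.read_parquet("opens.parquet")
log = pd.DataFrame([
    {"public_id": "77271", "coin": "BTC", "side": "LONG", "size": 2.0, "entry_price": 60_000.0},
    {"public_id": "77271", "coin": "ETH", "side": "SHORT", "size": 10.0, "entry_price": 3_000.0},
    {"public_id": "14567", "coin": "BTC", "side": "LONG", "size": 0.5, "entry_price": 61_000.0},
])

# Total notional opened per trader, largest first.
log["notional"] = (log["size"] * log["entry_price"]).abs()
per_trader = log.groupby("public_id")["notional"].sum().sort_values(ascending=False)
print(per_trader)
```

The same groupby pattern works for per-coin or per-side breakdowns of the saved log.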

Recipe 2

Build a daily digest of biggest position changes

Pull the last 24 hours of position events for your watchlist, rank them by notional, and email yourself the top 10 each morning.

Python · runs as-is
# pip install millionscan
import os
from datetime import datetime, timedelta, timezone
from millionscan import Client

client = Client(api_key=os.environ["MILLIONSCAN_KEY"])

WATCHLIST = ["77271", "14567", "88123"]  # the public_ids you observe
since = datetime.now(timezone.utc) - timedelta(hours=24)

events = []
for pid in WATCHLIST:
    trades = client.traders.trades(public_id=pid, since=since)
    events.extend(trades)

# Rank by absolute notional. Notional = size × entry_price.
events.sort(key=lambda e: abs(e.size * e.entry_price), reverse=True)
top10 = events[:10]

print("=== MillionScan daily digest ===")
for e in top10:
    notional = abs(e.size * e.entry_price)
    # `is not None` so a break-even close still shows its 0 realized PnL
    pnl = f" realized {e.realized_pnl:+,.0f}" if e.realized_pnl is not None else ""
    print(f"  #{e.public_id}  {e.action:5s}  {e.coin:6s}  {e.side:5s}  ${notional:>12,.0f}{pnl}")

# Drop the digest into your own mailer / Slack / Discord webhook.
# That step is your software, not MillionScan's — see /terms.

Why this works. REST + simple aggregation is enough — no streaming required. The script is cron-friendly and shows how scoring + filtering composes inside your own pipeline.
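The composition step generalizes: a small pure function can gate events on score and minimum notional before ranking, so the digest and any other consumer share one filter. A sketch on plain dicts standing in for SDK trade objects (the field names follow the recipe above; `min_score` and the sample values are illustrative):

```python
def top_events(events, min_notional=0.0, min_score=None, n=10):
    """Rank trade events by absolute notional, optionally gated on score."""
    def notional(e):
        return abs(e["size"] * e["entry_price"])

    keep = [
        e for e in events
        if notional(e) >= min_notional
        and (min_score is None or (e.get("score") or 0) >= min_score)
    ]
    keep.sort(key=notional, reverse=True)
    return keep[:n]

sample = [
    {"public_id": "77271", "score": 70, "size": 1.0, "entry_price": 100.0},
    {"public_id": "14567", "score": 60, "size": 5.0, "entry_price": 100.0},
    {"public_id": "88123", "score": 80, "size": 2.0, "entry_price": 100.0},
]
# Highest notional among score >= 65 first.
print([e["public_id"] for e in top_events(sample, min_score=65)])
```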

Recipe 3

Telegram alert when a top-scored trader opens a position

Listen for OPEN events from any trader scoring 65+ and ping a Telegram chat. Useful as a personal research feed for the cohort you trust most.

Python · runs as-is
# pip install millionscan requests
import os
import requests
from millionscan import Client

client = Client(api_key=os.environ["MILLIONSCAN_KEY"])
TG_TOKEN = os.environ["TELEGRAM_BOT_TOKEN"]
TG_CHAT = os.environ["TELEGRAM_CHAT_ID"]

def notify(text: str) -> None:
    requests.post(
        f"https://api.telegram.org/bot{TG_TOKEN}/sendMessage",
        json={"chat_id": TG_CHAT, "text": text, "parse_mode": "Markdown"},
        timeout=10,
    )

# Stream OPEN events for the entire curated pool, then keep only
# top-scoring traders. Filtering on the client side means a trader's
# score change mid-stream takes effect immediately, no resubscribe needed.
for ev in client.stream.subscribe(events=["OPEN"]):
    if ev.score is None or ev.score < 65:
        continue
    notional = abs(ev.size * ev.entry_price)
    msg = (
        f"*Top-scored OPEN* — score {ev.score}\n"
        f"Trader `#{ev.public_id}` "
        f"{ev.side} {ev.coin} @ {ev.entry_price}\n"
        f"Notional ${notional:,.0f} · "
        f"[trader page](https://millionscan.com/trader/{ev.public_id})"
    )
    notify(msg)

Why this works. The stream is a single subscription; filtering on score client-side keeps the logic simple and picks up score changes instantly. The Telegram bot side is plain HTTP — no MillionScan SDK extension needed.
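The `notify()` call above is fire-and-forget; Telegram occasionally rate-limits or times out, and an unhandled exception would kill the stream loop. A generic retry wrapper — nothing MillionScan-specific, with illustrative attempt counts and delays:

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Return a wrapper that retries fn with exponential backoff."""
    def wrapped(*args, **kwargs):
        for i in range(attempts):
            try:
                return fn(*args, **kwargs)
            except Exception:
                if i == attempts - 1:
                    raise  # out of attempts: surface the error
                time.sleep(base_delay * (2 ** i))
    return wrapped

# In the recipe: notify = with_retries(notify)
```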

Recipe 4

Score-aware dashboard component (Next.js / TypeScript)

Render a live grid of the top 20 traders by score in a Next.js app, refreshed every 30 seconds via a server component fetch.

TypeScript · runs as-is
// app/dashboard/page.tsx (Next.js 14 server component)
import { MillionScan } from "@millionscan/sdk";

export const revalidate = 30;

const ms = new MillionScan({ apiKey: process.env.MS_KEY! });

export default async function ScoreboardPage() {
  const traders = await ms.traders.list({
    sort: "score",
    limit: 20,
    tier: "active",
  });

  return (
    <main>
      <h1>Top 20 by score</h1>
      <ul>
        {traders.map((t) => (
          <li key={t.public_id}>
            <a href={`/trader/${t.public_id}`}>
              #{t.public_id}
            </a>
            <span>{t.score} / 68</span>
            <span>{t.roi_30d?.toFixed(1)}% (30D ROI)</span>
            <span>${t.account_value?.toLocaleString()}</span>
          </li>
        ))}
      </ul>
    </main>
  );
}

Why this works. Server components let you hold the API key on the server, avoiding any client-side credential exposure. The 30 s refresh matches the API's snapshot cadence so you never poll faster than the data updates.
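The same never-poll-faster-than-the-cadence idea applies outside Next.js. A hedged Python sketch of a 30-second snapshot cache: the `fetch` callable is an assumption standing in for something like `lambda: client.traders.list(sort="score", limit=20)`, and the injectable clock just makes the TTL easy to test:

```python
import time

class SnapshotCache:
    """Serve one cached API response until `ttl` seconds have passed."""

    def __init__(self, fetch, ttl=30.0, clock=time.monotonic):
        self.fetch = fetch   # zero-arg callable that hits the API
        self.ttl = ttl       # match the API's snapshot cadence
        self.clock = clock
        self._value = None
        self._at = None      # monotonic time of the last fetch

    def get(self):
        now = self.clock()
        if self._at is None or now - self._at >= self.ttl:
            self._value, self._at = self.fetch(), now
        return self._value
```

Wrap your fetch once at startup and call `get()` freely; the API sees at most one request per TTL window.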

Recipe 5

Backtest a watchlist's full event history

Pull every event your watchlist has produced and replay them in chronological order against an analytical model — counting outcomes, plotting cumulative PnL, or anything else your research workflow needs.

Python · runs as-is
# pip install millionscan pandas matplotlib
import os
import pandas as pd
import matplotlib.pyplot as plt
from millionscan import Client

client = Client(api_key=os.environ["MILLIONSCAN_KEY"])

WATCHLIST = ["77271", "14567", "88123"]

frames = []
for pid in WATCHLIST:
    trades = client.traders.trades(public_id=pid, limit=1_000)
    df = pd.DataFrame([t.dict() for t in trades])
    df["public_id"] = pid
    frames.append(df)

events = pd.concat(frames, ignore_index=True)
events["timestamp"] = pd.to_datetime(events["timestamp"])
events.sort_values("timestamp", inplace=True)

# Cumulative realized PnL per trader.
closes = events[events["action"].isin(["CLOSE", "REDUCE", "FLIP"])].copy()
closes["cum_pnl"] = closes.groupby("public_id")["realized_pnl"].cumsum()

fig, ax = plt.subplots(figsize=(10, 5))
for pid, grp in closes.groupby("public_id"):
    ax.plot(grp["timestamp"], grp["cum_pnl"], label=f"#{pid}")
ax.legend(title="public_id")
plt.title("Cumulative realized PnL — research backtest")
plt.tight_layout()
plt.savefig("backtest.png")
print("backtest.png saved")

Why this works. REST `traders/{id}/trades` returns the full chronologically ordered event list, so backtests don't need to reconstruct the timeline themselves. Pandas + matplotlib are enough; no execution side effects, no broker integration, no live state.
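The `closes` frame also supports quick summary statistics before any plotting. A sketch on synthetic rows (swap in the real frame from the recipe); the column names match the recipe, the numbers are made up:

```python
import pandas as pd

# Synthetic stand-in for the `closes` frame built in the recipe.
closes = pd.DataFrame({
    "public_id": ["77271", "77271", "14567", "14567", "14567"],
    "realized_pnl": [120.0, -40.0, -10.0, 55.0, 30.0],
})

# Per-trader close count, total realized PnL, and share of winning closes.
stats = closes.groupby("public_id")["realized_pnl"].agg(
    trades="count",
    total_pnl="sum",
    win_rate=lambda s: (s > 0).mean(),
)
print(stats)
```

The named-aggregation form keeps the output columns self-describing when you join them back against score data.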

Using these in Claude Code, Cursor, Cline, or Aider

The recipes above are self-contained — paste a snippet into your AI agent's context, ask it to adapt the watchlist / threshold / output destination to your project, and you should be running in under a minute. The Python SDK exposes typed methods (client.traders.list, client.stream.subscribe) so an AI assistant gets full autocomplete on every field. For OpenAPI-aware tooling, the spec lives at /developers/docs.

MillionScan provides on-chain data analytics for informational and research purposes only. Content does not constitute financial advice, investment recommendation, or solicitation. We don't custody funds, execute trades, or operate any trading service. Any execution or automation a builder writes against the API is the builder's own software running on their own venue, with their own decisions.

Back to /developers