Plans sized around real managed RPC limits, not vanity seat counts

Ever Green's hosted limits should track the actual cost drivers: live chain connectivity, watch execution, query volume, delivery fan-out, and historical bootstrap. This page is the current source of truth for plan design and future enforcement.

Managed RPC baseline

Initial hosted sizing assumes an Alchemy-class shared lane

Reviewed March 21, 2026
30M compute units per month on the free plan
About 500 CU/s and 25 req/s on the free lane
100 open WebSocket connections per app
1,000 subscriptions per WebSocket connection
200 concurrent requests per WebSocket connection

Practical implication: free hosted usage must stay sandbox-sized, and heavy workloads either upgrade into a managed paid tier or move to self-hosted / BYO RPC.
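To see why free hosted usage must stay sandbox-sized, it helps to run the arithmetic against the lane above. A minimal sketch, assuming a ~20 CU average cost per per-block RPC call (provider pricing varies by method, so that figure is an assumption, not a published Alchemy rate):

```python
# Rough headroom math for a 30M CU/month shared lane.
SECONDS_PER_MONTH = 30 * 24 * 3600
BLOCK_TIME_S = 12            # Ethereum mainnet post-merge block time
CU_BUDGET = 30_000_000       # free-plan monthly compute units
CU_PER_POLL = 20             # ASSUMPTION: avg CU per per-block RPC call

blocks_per_month = SECONDS_PER_MONTH // BLOCK_TIME_S  # 216,000 blocks
cu_per_watch = blocks_per_month * CU_PER_POLL         # 4,320,000 CU

# Always-on watches one workspace could sustain before exhausting the lane:
max_watches = CU_BUDGET // cu_per_watch               # 6
```

Even under this generous assumption, a handful of always-on watches exhausts the entire free lane, before counting queries, deliveries, or backfill.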

A model that matches the product we actually have today

Teams that want full control or already run infra

Self-hosted

Bring your own RPC
$0
+ your own infra

The open-source lane. You run the daemon, bring the RPC, and own all chain and delivery costs.

RPC model
Customer-managed RPC and infra
Limits
  • Active watches: Your hardware budget
  • API keys: Unlimited
  • Delivery attempts: Your queue budget
  • Bootstrap / backfill: Your provider budget
Feature shape
  • HTTP API, dashboard, and MCP
  • Webhook, Redis, and Kafka delivery
  • Historical bootstrap and replay
  • Best fit for advanced teams and regulated deployments
Control model
  • We do not enforce hosted quotas in self-hosted mode
  • The customer is responsible for RPC rate limits and uptime
Sandbox usage and early evaluation

Cloud Free

Best effort
$0
/ month

A managed trial lane sized to stay well within the limits of a shared Alchemy-class lane. Good for trying the workflow, not for production guarantees.

RPC model
Managed Ethereum mainnet RPC included
Limits
  • Workspaces: 1
  • Live demo watches: 1 at a time
  • Project API keys: 2
  • Query API + MCP calls: 10k / month
  • Delivery attempts: 100k / month
  • Bootstrap / backfill: Disabled
Feature shape
  • One short-lived live demo watch per workspace
  • Demo deliveries stay internal and auto-stop after a few results
  • Project API keys and MCP access
  • Shared managed Ethereum mainnet lane
  • No Kafka, no BYO RPC, no historical backfill
Control model
  • Warn at 80% usage
  • Hard-stop new watch creation and query API at 100%
  • Pause new outbound delivery after a short grace window
Small teams shipping bots, agents, and alerting

Builder

Recommended
$79
/ month

The first real paid tier. Enough room for always-on watches, customer API keys, and agent workflows without forcing BYO infrastructure.

RPC model
Managed Ethereum mainnet RPC included
Limits
  • Workspaces: 1
  • Active watches: 25
  • Project API keys: 25
  • Query API + MCP calls: 250k / month
  • Delivery attempts: 1M / month
  • Bootstrap / backfill: 100k blocks / month
Feature shape
  • Webhook, Redis, and Kafka delivery
  • Managed mainnet lane with headroom for real usage
  • API keys for bots, services, and MCP agents
  • Historical bootstrap within quota
Control model
  • Warn at 80% and 100%
  • Block new watches and backfills at quota
  • Allow short delivery grace period before pausing
Teams that need serious volume or their own RPC lane

Scale

Managed + BYO
$299
/ month

For teams where query volume, delivery fan-out, or backfill load would otherwise consume the shared managed lane.

RPC model
Managed mainnet lane plus optional BYO RPC
Limits
  • Workspaces: 3
  • Active watches: 100
  • Project API keys: 100
  • Query API + MCP calls: 2M / month
  • Delivery attempts: 10M / month
  • Bootstrap / backfill: 1M blocks / month
Feature shape
  • BYO RPC support for heavy customers
  • Kafka, webhook, and Redis delivery
  • More aggressive backfills and higher query ceilings
  • Best path before dedicated enterprise infrastructure
Control model
  • Warn at 80% and 100%
  • Throttle query API and MCP before delivery is interrupted
  • Recommend BYO RPC when managed-lane headroom is low

Features to gate by plan instead of pretending every workload costs the same

| Feature | Cloud Free | Builder | Scale | Self-hosted |
|---|---|---|---|---|
| Managed Ethereum mainnet lane | Included, best-effort | Included | Included | Bring your own |
| Webhook delivery | Included | Included | Included | Included |
| Redis delivery | Included | Included | Included | Included |
| Kafka delivery | No | Included | Included | Included |
| Historical bootstrap / replay | No | 100k blocks / month | 1M blocks / month | Bring your own budget |
| Project API keys | 2 keys | 25 keys | 100 keys | Unlimited |
| MCP access | Included | Included | Included | Included |
| BYO RPC endpoint | No | No | Optional | Required |

Meter the load we pay for in the backend, not vanity metrics

active_watches

Active watches

Count of enabled watches per workspace

Why it matters

Each active watch runs on every incoming block, so this is the cleanest predictor of compute pressure inside the daemon.

Enforcement

Hard-cap create/enable actions by plan.
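The hard cap can sit as a pre-flight check in the create/enable path. A minimal sketch, assuming plan limits from this page; the function and exception names are illustrative, not the daemon's actual API:

```python
# Plan-aware hard cap on watch creation. Self-hosted has no entry,
# so no hosted cap is enforced there (per the control model above).
PLAN_WATCH_LIMITS = {"cloud_free": 1, "builder": 25, "scale": 100}

class WatchQuotaExceeded(Exception):
    """Raised when enabling one more watch would exceed the plan cap."""

def check_can_enable_watch(plan: str, active_watches: int) -> None:
    """Call before creating/enabling a watch; raises if over the cap."""
    limit = PLAN_WATCH_LIMITS.get(plan)
    if limit is not None and active_watches >= limit:
        raise WatchQuotaExceeded(f"plan {plan!r} allows {limit} active watches")
```

The same pattern applies to any hard-capped resource, such as project API keys or managed chain slots.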

query_requests

Query API + MCP calls

Successful and rejected query executions per workspace

Why it matters

Queries are the customer-visible path that can become expensive quickly under automation, especially once MCP agents are looping.

Enforcement

Per-minute throttles plus monthly quota with 429 responses.
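The per-minute throttle plus monthly quota can be sketched as a fixed-window limiter that returns an HTTP-style status. The 60 req/min figure is illustrative; the 250k/month default matches the Builder quota; the class and method names are assumptions:

```python
import time

class QueryLimiter:
    """Fixed-window per-minute throttle plus a monthly quota for
    Query API + MCP traffic. check() returns 200 or 429."""

    def __init__(self, per_minute=60, per_month=250_000, clock=time.time):
        self.per_minute = per_minute
        self.per_month = per_month
        self.clock = clock
        self.window_start = 0.0
        self.window_count = 0
        self.month_count = 0

    def check(self) -> int:
        now = self.clock()
        if now - self.window_start >= 60:
            self.window_start, self.window_count = now, 0
        # Rejected calls still count toward the month, matching the
        # "successful and rejected executions" meter definition above.
        self.month_count += 1
        if self.month_count > self.per_month:
            return 429
        self.window_count += 1
        if self.window_count > self.per_minute:
            return 429
        return 200
```

A sliding-window or token-bucket variant smooths bursts better; the fixed window is simply the smallest thing that enforces both budgets.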

delivery_attempts

Delivery attempts

Every publish or webhook attempt, including retries

Why it matters

Retries are a real cost. Metering attempts instead of only successful matches captures queue, webhook, and incident-driven fan-out.

Enforcement

Warn at 80%, then pause new outbound delivery after the grace window.
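The warn/grace/pause ladder is a pure function of the meter reading. A sketch, assuming a 5% over-quota grace allowance; this page does not fix the grace window's size, so that fraction is an illustration:

```python
def delivery_decision(attempts: int, quota: int,
                      grace_fraction: float = 0.05) -> str:
    """Classify the workspace's delivery state: deliver, warn, or pause."""
    if attempts >= quota * (1 + grace_fraction):
        return "pause"    # stop new outbound delivery until upgrade/reset
    if attempts >= quota * 0.8:
        return "warn"     # surface the meter in-product, keep delivering
    return "deliver"
```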

bootstrap_blocks

Bootstrap / backfill blocks

Historical blocks scanned on behalf of a workspace

Why it matters

This is the fastest way to burn managed RPC budget and is the main operation that should stay off the free tier.

Enforcement

Reject new bootstrap jobs when quota is exhausted.
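Because a job's block range is known up front, admission can check the whole job against the remaining monthly budget before any scanning starts, rather than metering mid-scan. A sketch with an illustrative function name:

```python
def admit_bootstrap_job(blocks_used: int, job_blocks: int, quota: int) -> bool:
    """True only if the entire backfill fits in the remaining monthly budget.
    Rejecting up front keeps backfills from starving live ingestion."""
    return blocks_used + job_blocks <= quota
```

For example, on Builder's 100k-block quota, a second 60k-block backfill in the same month would be rejected outright.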

api_keys

Project API keys

Active non-revoked keys per workspace

Why it matters

This is the most direct proxy for external integrations, bots, and MCP agents attached to a project.

Enforcement

Hard-cap creation until the workspace upgrades or revokes unused keys.

managed_chains

Managed chain slots

Distinct managed chain lanes assigned to a workspace

Why it matters

The hosted provider lane should be treated as scarce infrastructure. Additional chains should map directly to a plan upgrade.

Enforcement

Free and Builder stay on a single managed mainnet lane; Scale unlocks more lanes or BYO RPC.

Recommended product behavior when customers approach their limits

1. Warn in-product at 80% of any monthly quota, and surface the exact meter that is close to exhaustion.
2. Apply hard caps to creation and enable flows first. It is much better to stop creating new load than to silently drop live signals.
3. Rate-limit the query API and MCP with 429s the moment a minute-level budget is exceeded.
4. Reject new bootstrap jobs when the historical scan quota is exhausted instead of letting backfills starve live ingestion.
5. If delivery attempts exceed the monthly allowance, keep a short grace window, then pause outbound delivery until upgrade or reset.
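These behaviors collapse into one ordered decision per meter: warn below quota, then apply a meter-specific action at exhaustion. A consolidated sketch; the action names are illustrative labels, not daemon states:

```python
# Meter-specific action once the monthly quota is exhausted.
EXHAUSTED_ACTION = {
    "active_watches":    "block_create",       # hard cap on create/enable
    "query_requests":    "reject_429",         # throttle query API + MCP
    "bootstrap_blocks":  "reject_job",         # refuse new backfills
    "delivery_attempts": "grace_then_pause",   # short grace, then pause
}

def action_for(meter: str, used: int, quota: int) -> str:
    """Map a meter reading to allow / warn / its exhaustion action."""
    if used >= quota:
        return EXHAUSTED_ACTION[meter]
    if used >= 0.8 * quota:
        return "warn"
    return "allow"
```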

Make MCP and every other surface feel like a real customer product

| Customer | How they use Ever Green | Interface | What we control |
|---|---|---|---|
| Operators | Create watches, inspect retries, and debug delivery health. | Web dashboard | Meter active watches and backfill. Keep read-only inspection available even when write quotas are exhausted. |
| Developers and backend teams | Create watches programmatically and consume webhook or queue output. | HTTP API + webhook / Redis / Kafka | Meter API keys, query traffic, delivery attempts, and bootstrap usage. |
| Agents and copilots | Run queries, create watches, and inspect current state through tool calls. | MCP | Treat MCP calls as first-class API traffic. Rate-limit queries and write operations the same way as HTTP. |
| Advanced teams | Run dedicated high-volume workloads or regulated setups. | Self-hosted or BYO RPC | Move upstream provider cost to the customer and relax hosted-lane quotas. |

Start with self-hosted or sandbox, then upgrade when the lane gets real

The hosted product should feel turnkey for customers, but the control loop needs to protect the shared managed RPC lane. This plan gives us a clean story for pricing now and quota enforcement later.