Run compute · earn USDC

Your hardware,
on the job.

Host inference for the agent economy. Phones, GPUs, or clusters — the Ghola router pays you every time an agent picks your node.

Any hardware. Any scale.

Big or small, the router finds work for it.

Phones

Android devices with Ghola installed can run small-model inference when idle. Best for low-latency edge jobs.

~4–8 GB RAM · on-device models

Consumer GPUs

Desktops with an RTX 3090/4090/5090 or an Apple Silicon Max/Ultra chip. Great for serving mid-size and fine-tuned models.

24+ GB VRAM · 7B–70B models

Data-center hardware

H100/H200 clusters, MI300X, anything you'd rent on Lambda or Vast. Serve frontier models to the network.

80+ GB VRAM · 100B+ models

Four steps to live.

1

Install the Ghola node

One command on Linux/macOS/Windows. For phones, the Ghola app handles it.

2

Register on-chain

Your node gets a SAID identity. Agents discover it through the registry, matched on the models you serve.

3

Set pricing

Price per 1M tokens or per call. The router matches agents by price, latency, and reputation.
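To make per-token pricing concrete, here is a minimal sketch of how a per-1M-token rate translates into a per-call charge. The rate, function name, and token counts are hypothetical examples, not Ghola defaults:

```python
# Hypothetical pricing sketch: turn a per-1M-token rate into a per-call charge.
PRICE_PER_1M_TOKENS_USDC = 0.50  # example rate, not a Ghola default

def call_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Charge for one call at a flat per-token rate."""
    total_tokens = prompt_tokens + completion_tokens
    return total_tokens / 1_000_000 * PRICE_PER_1M_TOKENS_USDC

# A 2,000-token call at $0.50 per 1M tokens costs $0.001.
print(call_cost(1500, 500))
```

Per-call pricing is simpler still: a flat fee per request, regardless of token count.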

4

Serve and earn

Agents route calls to your node. USDC settles to your wallet every hour: 85% to you, 15% to the protocol.
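The 85/15 split and hourly cycle come straight from this page; the revenue figure below is made up for illustration:

```python
# Sketch of one hourly settlement at the 85/15 revenue split.
OPERATOR_SHARE = 0.85  # 85% to the operator
PROTOCOL_SHARE = 0.15  # 15% to the protocol

def settle(hourly_revenue_usdc: float) -> tuple[float, float]:
    """Split one hour of gross revenue between operator and protocol."""
    return (hourly_revenue_usdc * OPERATOR_SHARE,
            hourly_revenue_usdc * PROTOCOL_SHARE)

# Hypothetical: a node that grossed $12 this hour keeps $10.20.
operator_cut, protocol_cut = settle(12.00)
print(round(operator_cut, 2), round(protocol_cut, 2))
```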

85%

Operator revenue

1 hr

Settlement cycle

USDC

Paid in stables

24/7

Router coverage

Ready to route?

Start a node