What you capture when you integrate.
A position in the router.
Every meaningful routing improvement your data produces mints tokens to you. Hold them, or redeem for USDC anytime.
Lower inference cost.
A smarter router picks the right model for the task. You spend less on inference before counting any rewards.
Configurable value share.
Decide at integration whether to keep token flow, pass it through to your users, or split. Make Hokusai your monetization layer or your user-acquisition feature.
Compounding ownership.
Your early contributions keep paying as more harnesses route through the same model. You're not selling data; you're buying into an asset.
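As a sketch of how a value-share split might be expressed at integration time. The `ValueShare` shape and `validateValueShare` helper are our illustrative assumptions, not documented Hokusai API:

```typescript
// Hypothetical integration config; the shape and helper below are
// assumptions for illustration, not confirmed Hokusai API.
interface ValueShare {
  harness: number // fraction of minted tokens kept by the integrator
  users: number   // fraction passed through to end users
}

function validateValueShare(share: ValueShare): ValueShare {
  // Fractions must be non-negative and sum to 1.
  if (share.harness < 0 || share.users < 0) {
    throw new Error('shares must be non-negative')
  }
  if (Math.abs(share.harness + share.users - 1) > 1e-9) {
    throw new Error('shares must sum to 1')
  }
  return share
}

// Keep 40% as a monetization layer, pass 60% through
// to users as an acquisition feature.
const share = validateValueShare({ harness: 0.4, users: 0.6 })
```

The same shape covers all three modes from the copy above: keep it all (`harness: 1`), pass it all through (`users: 1`), or split.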
Coding / Multi-Model Routing
Coding Task Router
The first decision layer built on the Hokusai protocol, turning real coding tasks into a shared router that learns, compounds, and pays contributors back.
+3.2 pts
Cost-adjusted task success (illustrative)
12,400
Tasks routed, last 7d (illustrative)
27
Contributors (illustrative)
184,000
Tokens minted to date (illustrative)
Router activity
Live demo
Incoming task
Refactor auth middleware to support scoped API keys.
Context: 6 failing tests, Node 20, existing harness policies attached.
Candidate
Claude
Selected for refactor + test repair
Candidate
GPT
Available for fallback or critique
Candidate
Haiku
Available for fallback or critique
Candidate
Gemini
Available for fallback or critique
Candidate
OpenModel
Available for fallback or critique
Outcome
Tests passed · cost $0.012 · 1.4s
Router improvement: +0.04 DeltaOne, meaning 0.04 percentage points of cost-adjusted task success on the shared coding benchmark (illustrative).
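DeltaOne, as defined above, is a change in cost-adjusted task success expressed in percentage points. A minimal sketch of that arithmetic, assuming scores are rates in [0, 1] (the exact scoring formula behind the shared benchmark is not specified here):

```typescript
// DeltaOne: the change in cost-adjusted task success between a baseline
// router and an updated one, in percentage points. Inputs are assumed
// to be scores in [0, 1]; the benchmark's exact scoring is not public.
function deltaOne(baselineScore: number, newScore: number): number {
  return (newScore - baselineScore) * 100
}

// A routing change that moves the benchmark score from 0.712 to 0.7124
// contributes roughly +0.04 DeltaOne.
const lift = deltaOne(0.712, 0.7124)
```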
Integration
Drop-in middleware.
Route tasks through Hokusai, execute them in your harness, then report the result back so the shared router can keep improving.
```ts
import { route } from '@hokusai/router'

const { model, reasoning } = await route({
  task: userTask,
  context: harnessContext,
})

const result = await models[model].run(userTask)

// Mints tokens proportional to performance lift
await route.reportOutcome(result)
```

Read the integration guide

Where does your routing data go today?
| | Lab-owned auto-routing | Hokusai |
|---|---|---|
| Who captures the optimization signal | The lab | You and the contributors |
| Who keeps the inference cost savings | The lab keeps margin | You |
| What you build over time | Nothing transferable | A token position in the router |
| Portability across harnesses | Locked in | Take your position with you |
| Auditability | Opaque | On-chain attribution |
What else the factory makes.
The router is the first decision layer built on Hokusai. The same protocol applies anywhere shared, incentive-aligned learning beats siloed effort.
Tool-selection router
Same factory, same upside structure: which tool/MCP to call, learned from outcomes.
Memory retrieval optimizer
A learned policy for what to surface from a user's context; contributors share in the lift.
Code-review critic router
Which reviewer model catches which class of bug, improved by outcome data from real PRs.
Prompt-policy optimizer
A shared model for prompt strategies, owned by the engineers who improve it.