In the decentralized finance ecosystem, price action remains the primary focus of retail sentiment. However, sophisticated market participants understand that price is often a lagging indicator of a far more potent force: **Supply Velocity.** Information asymmetry regarding token unlocks, vesting cliffs, and emissions schedules creates a significant disadvantage for the average investor. SupplySide was architected to bridge this gap through deterministic data modeling and transparent risk scoring.
## The Problem: Invisible Sell Pressure
Traditional market aggregators prioritize "Circulating Supply" and "Total Supply"—static metrics that offer zero insight into temporal risk. An asset might appear stable today, only to face a 5% dilution event next Tuesday due to a venture capital vesting cliff. Most of this data is buried in technical whitepapers or PDF legal disclosures, making it functionally invisible to real-time screening.
## Core Architecture: The Normalization Engine
The primary technical challenge in building SupplySide was the lack of a standardized schema for tokenomics. Every project uses a different emission model: some are linear daily (e.g., Celestia), some are monthly cliffs (e.g., Starknet), and some are hybrid models with "ecosystem" pools that emit based on opaque governance triggers.
I engineered a **TypeScript-based Normalization Engine** that transforms these disparate inputs into a unified temporal flow. This involved creating a custom type system to handle the different "supply event" categories:
```typescript
type EmissionModel = 'LINEAR' | 'CLIFF' | 'HYBRID' | 'STOCHASTIC';

interface TokenomicsProfile {
  assetId: string;
  totalCap: number;
  initialCirculating: number;
  schedules: {
    type: EmissionModel;
    startDate: number; // Unix timestamp (ms)
    duration: number;  // emission window, for LINEAR
    amount: number;    // tokens released by this schedule
    interval?: number; // time between tranches, for CLIFF
  }[];
}
```
The engine runs forward-looking simulations for each asset, projecting the supply curve 24 months into the future. This lets users visualize not just *whether* dilution is coming, but precisely *when* it hits the market.
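The projection loop can be sketched roughly as follows. This is a simplified model, not the production engine: the monthly sampling step, the tranche semantics for cliffs, and the `projectSupply` name are illustrative, and `HYBRID`/`STOCHASTIC` handling is elided. The types are repeated from the engine above so the snippet is self-contained.

```typescript
type EmissionModel = 'LINEAR' | 'CLIFF' | 'HYBRID' | 'STOCHASTIC';

interface TokenomicsProfile {
  assetId: string;
  totalCap: number;
  initialCirculating: number;
  schedules: {
    type: EmissionModel;
    startDate: number; // Unix timestamp (ms)
    duration: number;
    amount: number;
    interval?: number;
  }[];
}

const MONTH_MS = 30 * 24 * 60 * 60 * 1000;

// Returns one circulating-supply sample per month, from "now" to `months` ahead.
function projectSupply(profile: TokenomicsProfile, months = 24): number[] {
  const now = Date.now();
  const points: number[] = [];
  for (let m = 0; m <= months; m++) {
    const t = now + m * MONTH_MS;
    let supply = profile.initialCirculating;
    for (const s of profile.schedules) {
      if (t < s.startDate) continue; // schedule not live yet
      if (s.type === 'LINEAR') {
        // Pro-rata release over the emission window.
        const elapsed = Math.min(t - s.startDate, s.duration);
        supply += s.amount * (elapsed / s.duration);
      } else if (s.type === 'CLIFF' && s.interval) {
        // One tranche of `amount` at startDate, then one per elapsed interval.
        const tranches = Math.floor((t - s.startDate) / s.interval) + 1;
        supply += s.amount * tranches;
      }
    }
    points.push(Math.min(supply, profile.totalCap)); // never exceed the hard cap
  }
  return points;
}
```

The cap clamp at the end matters: aggressive cliff schedules can otherwise project past `totalCap`, which is an artifact of the model rather than real dilution.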
## The 4-Factor Risk Model
SupplySide doesn't rely on "black-box" AI for its rankings. Instead, it uses a deterministic 4-factor scoring model that rewards transparency and punishes aggressive emissions:
- **Forward Supply Growth:** The percentage of new supply entering the market over the next 180 days.
- **Unlock Concentration:** The ratio of "cliff" events (high impact) to "linear" events (low impact) in the near term.
- **Emissions Velocity:** The rate of daily issuance relative to the current circulating base.
- **Historical Absorption:** A calculated metric of how the market has performed during comparable past dilution events.
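A deterministic combination of these four factors might look like the sketch below. The field names, weights, annualization, and clamping are illustrative assumptions, not SupplySide's production values; the point is that the score is a fixed, auditable formula rather than a learned model.

```typescript
// Hypothetical 4-factor score; weights are illustrative, not production values.
interface RiskFactors {
  forwardSupplyGrowth: number;  // fraction of new supply over the next 180 days (0–1)
  unlockConcentration: number;  // share of near-term unlocks that are cliffs (0–1)
  emissionsVelocity: number;    // daily issuance / current circulating supply
  historicalAbsorption: number; // 0 (poorly absorbed) … 1 (well absorbed)
}

const WEIGHTS = { growth: 0.35, concentration: 0.25, velocity: 0.25, absorption: 0.15 };

// Higher score = higher dilution risk; result is clamped to [0, 100].
function riskScore(f: RiskFactors): number {
  const raw =
    WEIGHTS.growth * f.forwardSupplyGrowth +
    WEIGHTS.concentration * f.unlockConcentration +
    WEIGHTS.velocity * Math.min(f.emissionsVelocity * 365, 1) + // annualize, cap at 100%
    WEIGHTS.absorption * (1 - f.historicalAbsorption);          // poor absorption raises risk
  return Math.round(Math.min(Math.max(raw, 0), 1) * 100);
}
```

Because every input and weight is visible, two analysts computing the score for the same asset will always get the same number, which is the transparency property the model is built around.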
## Frontend Orchestration with Next.js & Recharts
Visualizing 24 months of multi-asset data requires a highly optimized frontend. I utilized **Next.js 15 Server Components** to pre-process the heavy mathematical projections, serving small, hydration-ready JSON bundles to the client. For the visualization, I built custom interactive compositions using **Recharts**, allowing users to scrub across a timeline and see exactly how market capitalization must grow just to maintain the current price as supply expands.
```tsx
// Supply projection chart; the timeline-scrubbing interaction hooks into
// Recharts' Tooltip as the user moves across the series.
import { AreaChart, Area, Tooltip } from 'recharts';

const SupplyProjectionChart = ({ data }: { data: { projectedSupply: number }[] }) => (
  <AreaChart width={800} height={300} data={data}>
    <defs>
      <linearGradient id="colorSupply" x1="0" y1="0" x2="0" y2="1">
        <stop offset="5%" stopColor="#5eead4" stopOpacity={0.1} />
        <stop offset="95%" stopColor="#5eead4" stopOpacity={0} />
      </linearGradient>
    </defs>
    <Area
      type="monotone"
      dataKey="projectedSupply"
      stroke="#5eead4"
      fillOpacity={1}
      fill="url(#colorSupply)"
    />
    <Tooltip content={<CustomTooltip />} />
  </AreaChart>
);
```
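The "market cap must grow to hold price" overlay reduces to a simple identity: at a constant price, required market capitalization scales 1:1 with circulating supply. A minimal sketch (`requiredMarketCaps` is an illustrative name, not a real SupplySide API):

```typescript
// At constant price P, market cap M(t) = P * supply(t): the overlay is
// just the projected supply curve rescaled by today's price.
function requiredMarketCaps(currentPrice: number, projectedSupply: number[]): number[] {
  return projectedSupply.map((s) => currentPrice * s);
}
```

So if supply is projected to grow 20% over the horizon, the market must absorb 20% more capitalization just to keep the price flat, which is exactly the line the scrubber traces.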
## Conclusion: Engineering for Market Honesty
SupplySide is more than just a dashboard; it's a tool for market honesty. By transforming obscure whitepaper math into clear, interactive engineering, we can eliminate the "surprises" that lead to retail losses. The project remains a study in how full-stack engineering can be used to democratize complex financial data.