We turn fragmented onchain data into standardized, decision-ready financial metrics for Web3 assets.
Built for decision-making: manually verified data and an interpretation framework with no external dependencies.

Core Bottleneck
Interpretation, backed by verified context attribution, is the key bottleneck for decisions:
The economic path from protocol fees to token capture is often obscured and inconsistent.
Without a unified attribution model, comparing assets across different protocols remains speculative.
Visualizing data is only the first step; interpreting it for allocation requires a deeper analytical layer.
What drives this token's value?
Is this yield real or subsidized?
Which asset is better — and why?
How should different signals impact portfolio allocation?
Architecture
Layer 01
Proprietary data: real-time indexing, human verification & structuring across tokens, contracts, and revenue sources.
No dependencies on external data providers


Layer 02
Proprietary attribution and interpretation models for Web3 protocols and their tokens.
Layer 03
Interfaces designed around investment workflows:

Decision tools
What if a portfolio bought specific assets on specific dates?
Simulate PnL based on token price plus claimable cashflows where available, with optional compounding assumptions.
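As an illustration only, here is a minimal sketch of such a simulation in Python; the `Purchase` type, the field names, and the compounding rule are assumptions for this example, not the product's actual API:

```python
from dataclasses import dataclass

@dataclass
class Purchase:
    asset: str       # token symbol
    date: str        # ISO purchase date, e.g. "2024-01-01"
    quantity: float  # tokens bought

def simulate_pnl(purchase, price_history, cashflows, compound=False):
    """Backtest PnL = price change plus claimable cashflows.

    price_history maps ISO date -> token price; cashflows maps ISO date ->
    claimable amount per token. With compound=True, each cashflow is
    reinvested into the token at that day's price.
    """
    qty = purchase.quantity
    cost = qty * price_history[purchase.date]
    cash = 0.0
    for date in sorted(d for d in cashflows if d >= purchase.date):
        flow = qty * cashflows[date]
        if compound:
            qty += flow / price_history[date]  # buy more tokens with the claim
        else:
            cash += flow                       # hold claims as cash
    final_price = price_history[max(price_history)]
    return qty * final_price + cash - cost

prices = {"2024-01-01": 2.0, "2024-06-01": 2.5, "2024-12-31": 3.0}
flows = {"2024-06-01": 0.10}  # $0.10 claimable per token
print(simulate_pnl(Purchase("XYZ", "2024-01-01", 1000), prices, flows))  # 1100.0
```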
What outcomes can be expected from buying assets now?
An AI-assisted tool that selects assets by multiplier range, sector exposure, and historical performance, then runs simulations on top of human-verified datasets.
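A rough sketch of the selection step, with invented field names and example data (the real tool layers AI assistance and verified datasets on top of this kind of filter):

```python
def screen_assets(assets, min_mult, max_mult, sectors):
    """Filter assets by multiplier range and sector, rank by historical return.

    `assets` is a list of dicts with illustrative keys:
    symbol, multiplier, sector, hist_return.
    """
    picks = [
        a for a in assets
        if min_mult <= a["multiplier"] <= max_mult and a["sector"] in sectors
    ]
    return sorted(picks, key=lambda a: a["hist_return"], reverse=True)

universe = [
    {"symbol": "AAA", "multiplier": 3.1, "sector": "DeFi", "hist_return": 0.42},
    {"symbol": "BBB", "multiplier": 7.8, "sector": "Gaming", "hist_return": 0.95},
    {"symbol": "CCC", "multiplier": 4.5, "sector": "DeFi", "hist_return": 0.61},
]
print(screen_assets(universe, 2.0, 5.0, {"DeFi"}))  # CCC first, then AAA
```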
Who it's for
Primary audience
Teams that need a consistent analytical basis for evaluating tokenized assets.
Consistent token screening
Faster and deeper due diligence
Cleaner asset comparison across sectors
Portfolio management, backtesting, and projections
Automatic memos and performance reporting
Alerts for metric updates and tokenomic changes
Embedded use
Teams that want structured token intelligence inside an existing interface.
Infrastructure for investment vehicles
Data for institutional-grade research
Structured inputs for signals
Widgets for wallets, trading terminals, and exchanges
Delivery modes
Product surface

Asset detail
Track token balances and DeFi positions, analyze rewards history, and calculate PnL — your single source of truth when every protocol claims the highest APY.
Balance history and reward history in one frame
Token-level metrics tied to actual capture logic
A clean basis for comparing strategies over time

Rewards history
How much net reward does each strategy generate? What if you separate points from token rewards? How do you compare pools across protocols? Valueverse provides the answers with detailed rewards history.
Net reward per strategy breakdown
Points vs. token rewards separation
Cross-protocol pool comparison
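A simplified sketch of the aggregation behind those views; the event shape here is an assumption for illustration, not the actual data model:

```python
from collections import defaultdict

def net_rewards(events):
    """Sum reward events per strategy, keeping points and token rewards apart.

    Each event is a dict with illustrative keys: strategy, kind
    ('points' or 'token'), amount, and usd_value (None for points).
    """
    summary = defaultdict(lambda: {"points": 0.0, "token_usd": 0.0})
    for e in events:
        if e["kind"] == "points":
            summary[e["strategy"]]["points"] += e["amount"]
        else:
            summary[e["strategy"]]["token_usd"] += e["usd_value"]
    return dict(summary)

events = [
    {"strategy": "Pool A", "kind": "token", "amount": 12.0, "usd_value": 30.0},
    {"strategy": "Pool A", "kind": "points", "amount": 450.0, "usd_value": None},
    {"strategy": "Pool B", "kind": "token", "amount": 5.0, "usd_value": 11.5},
]
print(net_rewards(events))
# {'Pool A': {'points': 450.0, 'token_usd': 30.0},
#  'Pool B': {'points': 0.0, 'token_usd': 11.5}}
```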
Early product
Get access to the product before public launch.
Direct feedback loop with the team
Shape the product alongside the team building it.
Priority influence on roadmap
Your workflows directly inform what gets built next.
Apply for access
5 required fields
This takes under a minute. We only ask for what is required to review the request properly.
Capabilities

Turning complex token designs into a value graph showcasing economic mechanisms
We invented a foundational model for understanding the elementary economic functions, drivers, and revenue sources that form an asset's fundamental value. Original academic research and implementation, awarded the TE Prize 2024.
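As a toy illustration of the idea (the nodes, edges, and weights below are invented, not taken from the research):

```python
# Directed value graph: each edge carries the share of value passed along.
# Mechanisms and weights are invented for illustration.
value_graph = {
    "swap_fees":        [("protocol_revenue", 1.0)],
    "protocol_revenue": [("buyback", 0.3), ("treasury", 0.7)],
    "buyback":          [("token_value", 1.0)],  # value reaching tokenholders
    "treasury":         [],                       # retained at protocol level
}

def token_capture(graph, source, target="token_value"):
    """Share of value originating at `source` that reaches the token."""
    if source == target:
        return 1.0
    return sum(w * token_capture(graph, nxt, target) for nxt, w in graph[source])

print(token_capture(value_graph, "swap_fees"))  # 0.3
```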
Effective P/FCF (adjusted for eligibility)
Protocol Revenue ≠ Token Revenue in most cases, which makes P/F, P/S, and P/E protocol-specific, not token-specific, metrics. We adapted P/E to Web3 revenue flow structures, distinguishing protocol-level cashflows from what tokenholders actually capture.
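Numerically, the adjustment scales the denominator by the share of cashflow that is actually eligible to tokenholders; the figures below are invented for illustration:

```python
def effective_p_fcf(market_cap, protocol_fcf, capture_share):
    """P/FCF computed on tokenholder-eligible cashflow only.

    capture_share: fraction of protocol free cashflow that reaches the
    token (e.g. via buybacks or distributions), per the attribution model.
    """
    return market_cap / (protocol_fcf * capture_share)

# Illustrative: $500M cap, $50M protocol FCF, 20% reaches tokenholders.
print(500e6 / 50e6)                        # naive P/FCF: 10.0x
print(effective_p_fcf(500e6, 50e6, 0.20))  # effective P/FCF: 50.0x
```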
Structured dataset & attribution model for an entire protocol
Lack of context and verification limits the usefulness of AI-driven hypothesis testing. To help close this AI productivity gap, we build datasets that map all key protocol contracts, variables, and logic with deep explanatory context. These models are constructed and verified manually, making them reliable inputs for future AI-powered analysis and calculations.
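A sketch of what one such record could look like; the schema below is a hypothetical shape for illustration, not the actual dataset format:

```python
from dataclasses import dataclass, field

@dataclass
class ContractRecord:
    """One verified entry in a protocol dataset (illustrative schema)."""
    address: str                                   # onchain contract address
    role: str                                      # e.g. "fee collector"
    variables: dict = field(default_factory=dict)  # variable name -> meaning
    context: str = ""                              # verified explanatory notes

record = ContractRecord(
    address="0x0000000000000000000000000000000000000000",  # placeholder
    role="fee collector",
    variables={"totalFees": "cumulative swap fees, denominated in wei"},
    context="Fees accrue here before the periodic buyback.",
)
```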