TE Prize 2024 Winner

The decision layer
for tokenized assets

We turn fragmented onchain data into standardized, decision-ready financial metrics for Web3 assets.

Built for decision-making: manually verified data and an interpretation framework with no external dependencies.

Product overview
Valueverse product — Balance History and Rewards History dashboards
Uniswap · Aave · Lido · Curve · MakerDAO · Morpho · Pendle

Core Bottleneck

Raw blockchain data is available to everyone.

Interpretation and verified context attribution are the key bottleneck for decisions:

01

Protocol revenue ≠ tokenholder revenue

The economic path from protocol fees to token capture is often obscured and inconsistent.

02

Comparable assets are hard to evaluate without a common basis

Without a unified attribution model, comparing assets across different protocols remains speculative.

03

Existing dashboards help monitor metrics, but rarely help make decisions

Visualizing data is only the first step; interpreting it for allocation requires a deeper analytical layer.

Key questions

What drives this token's value?

Is this yield real or subsidized?

Which asset is better — and why?

How should different signals impact portfolio allocation?

Architecture

How it works

Layer 01

Data Layer: the foundation of Valueverse

Proprietary data: real-time indexing, human verification & structuring across tokens, contracts, and revenue sources.

No dependencies on external data providers

Deep Context Attribution Model — data flow from Protocol and Token through Indexing, Labeling, and Deep Context Metrics to Decisions
Token Value Attribution — Curve $CRV and Aerodrome $AERO value mechanism graphs showing Cashflow, Governance, and Conditional Action relationships

Layer 02

Attribution Layer: what data means

Proprietary attribution and interpretation models for Web3 protocols and their tokens:

  • Mapping protocol architecture & native asset design
  • Value flows & revenue attribution
  • Interpretation into value drivers and multipliers
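The value-flow mapping described above can be pictured as a small weighted graph. The sketch below is a hypothetical illustration, not Valueverse's actual schema: nodes are protocol mechanisms, edge weights are the share of value flowing between them, and a traversal answers how much of a given revenue source ultimately reaches tokenholders.

```python
# Hypothetical token value graph: each source maps to (destination, share) pairs.
value_graph = {
    "swap_fees":         [("protocol_treasury", 0.5), ("lp_rewards", 0.5)],
    "protocol_treasury": [("token_buyback", 0.3), ("grants", 0.7)],
    "token_buyback":     [("tokenholders", 1.0)],
}

def tokenholder_share(graph, source, target="tokenholders"):
    """Total fraction of value originating at `source` that reaches `target`."""
    if source == target:
        return 1.0
    # Sum over outgoing edges; sinks absent from the graph contribute nothing.
    return sum(weight * tokenholder_share(graph, nxt, target)
               for nxt, weight in graph.get(source, []))
```

In this toy graph, only 15% of swap fees reach tokenholders (half goes to LPs, and the treasury routes 70% of its half to grants), which is exactly the kind of gap between protocol revenue and token revenue the attribution layer is meant to expose.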

Layer 03

Decision Layer: Pre-Investment & Post-Investment Analytics

Interfaces designed around investment workflows:

  • Assets Screening & Due Diligence
  • Portfolio construction & rebalancing
  • Performance analysis
Valueverse Decision Layer — Balance History and Rewards History dashboards

Decision tools

From Attribution Models to Full Decision System

01 · Simulation

Past

What if a portfolio bought specific assets on specific dates?

Simulate PnL based on token price plus claimable cashflows where available, with optional compounding assumptions.
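The backward-looking simulation above can be sketched in a few lines. This is a minimal illustration with hypothetical names, assuming per-period price and claimable-cashflow series, with a flag controlling whether cashflows are reinvested at the prevailing price.

```python
def simulate_pnl(prices, cashflows, units, compound=False):
    """PnL for a position of `units` tokens bought at prices[0].

    prices    -- token price per period
    cashflows -- claimable cashflow per token per period (0.0 where none)
    compound  -- if True, reinvest each payout at that period's price
    """
    cost = units * prices[0]
    claimed = 0.0
    for price, cf in zip(prices, cashflows):
        payout = units * cf
        if compound and price > 0:
            units += payout / price   # buy more tokens with the payout
        else:
            claimed += payout         # hold payouts as cash
    final_value = units * prices[-1] + claimed
    return final_value - cost
```

For example, 10 tokens bought at 1.00 that double in price while paying 0.5 per token in cashflow yield a PnL of 15.0 whether or not the payout is compounded, since the payout arrives in the final period.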

02 · Simulation

Future

What outcomes can be expected from buying assets now?

  • Growth models
  • Dilution models
  • Revenue vs. token price models
  • Discounting frameworks
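A simplified sketch of how the growth, dilution, and discounting models above can combine into one forward projection; all parameters and the terminal-multiple approach are illustrative assumptions, not Valueverse's actual models.

```python
def project_value_per_token(annual_revenue, revenue_growth, supply,
                            emission_rate, multiple, years, discount_rate):
    """Discounted per-token value under simple growth/dilution assumptions.

    Revenue compounds at `revenue_growth` per year; supply inflates at
    `emission_rate` per year; terminal value = revenue * multiple,
    discounted back at `discount_rate`.
    """
    future_revenue = annual_revenue * (1 + revenue_growth) ** years
    future_supply = supply * (1 + emission_rate) ** years  # dilution
    terminal_value = future_revenue * multiple
    per_token = terminal_value / future_supply
    return per_token / (1 + discount_rate) ** years
```

Even this toy version makes the trade-off visible: a protocol growing revenue 20% a year while emitting 20% new supply leaves per-token value flat before discounting.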
03 · AI-Assisted

Portfolio Composer

An AI-assisted tool for selecting assets based on multiplier range, sector exposure, and historical performance, then applying simulations on top of human-verified datasets.

Who it's for

Use cases

Primary audience

Investors and organizations

Teams that need a consistent analytical basis for evaluating tokenized assets.

01

Consistent token screening

02

Faster and deeper due diligence

03

Cleaner asset comparison across sectors

04

Portfolio management, backtesting, and projections

05

Automatic memos and performance reporting

06

Alerts for metric updates and tokenomic changes

Embedded use

Products, agents, and integrations

Teams that want structured token intelligence inside an existing interface.

A1

Infrastructure for investment vehicles

A2

Data for institutional-grade research

A3

Structured inputs for signals

A4

Widgets for wallets, trading terminals, and exchanges

Delivery modes

API · MCP · MPP

Product surface

Product views built for interpretation, not screenshot theater.

Asset detail
Valueverse — Yield Basis asset detail with ERM multiplier history and revenue vs effective market cap charts

Asset detail

Track DeFi returns, not just positions

Track token balances and DeFi positions, analyze rewards history, and calculate PnL — your single source of truth when every protocol claims the highest APY.

01

Balance history and reward history in one frame

02

Token-level metrics tied to actual capture logic

03

A clean basis for comparing strategies over time

Rewards history
Valueverse Rewards History — detailed reward breakdown across protocols and strategies

Rewards history

Get a clear view of your rewards over time

How much net reward does each strategy generate? What if you separate points from token rewards? How do you compare pools across protocols? Valueverse provides the answers with detailed rewards history.

01

Net reward per strategy breakdown

02

Points vs. token rewards separation

03

Cross-protocol pool comparison
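The separation described above can be sketched as a small aggregation: a hypothetical illustration (not Valueverse's actual data model) where token rewards are valued in USD while points, which have no market price, are tallied separately per strategy.

```python
from collections import defaultdict

def summarize_rewards(events):
    """Aggregate reward events into per-strategy totals.

    Each event: {"strategy": str, "kind": "token" | "points",
                 "amount": float, "usd_price": float | None}
    Points carry no price, so they never mix into the USD column.
    """
    summary = defaultdict(lambda: {"token_usd": 0.0, "points": 0.0})
    for e in events:
        row = summary[e["strategy"]]
        if e["kind"] == "points":
            row["points"] += e["amount"]
        else:
            row["token_usd"] += e["amount"] * e["usd_price"]
    return dict(summary)
```

Keeping points in their own column avoids the common mistake of assigning them a speculative USD value and inflating a strategy's apparent net reward.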

Early access intake

Request access to:

01

Early product

Get access to the product before public launch.

02

Direct feedback loop with the team

Shape the product alongside the team building it.

03

Priority influence on roadmap

Your workflows directly inform what gets built next.

Apply for access

5 required fields

Tell us who is applying

This takes under a minute. We only ask for what is required to review the request properly.

We use this information to qualify and route the request. No mass outreach, no newsletter dump.

TE Prize 2024 Winner

Built on top of our original inventions

TE Prize 2024 · 01

Token Value Attribution Model

Turning complex token designs into a value graph showcasing economic mechanisms

We invented a foundational model for understanding the elementary economic functions, drivers, and revenue sources that form an asset's fundamental value. Original academic research and implementation, awarded the TE Prize 2024.

Token Revenue Attribution · 02

Effective Revenue Multiplier (ERM)

Effective P/FCF (adjusted for eligibility)

Protocol revenue ≠ token revenue in most cases, which means that P/F, P/S, and P/E are protocol-specific, not token-specific, metrics. We adapted P/E to Web3 revenue-flow structures, distinguishing protocol-level cashflows from what tokenholders actually capture.
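One way to read the eligibility adjustment above is as a multiple computed over an effective market cap and the cashflow the token actually captures. The sketch below is a hedged illustration with hypothetical parameter names, not the published ERM formula.

```python
def effective_revenue_multiplier(price, eligible_supply, protocol_fcf,
                                 capture_share):
    """Effective P/FCF: effective market cap over tokenholder-captured FCF.

    eligible_supply -- tokens actually eligible for cashflow (e.g. staked/locked)
    protocol_fcf    -- protocol free cashflow over the period
    capture_share   -- fraction of that FCF routed to tokenholders
    """
    effective_market_cap = price * eligible_supply
    token_fcf = protocol_fcf * capture_share
    if token_fcf <= 0:
        raise ValueError("token captures no cashflow; multiple is undefined")
    return effective_market_cap / token_fcf
```

Under these assumptions, a token priced at 2.00 with 1,000 eligible tokens, against a protocol generating 400 in FCF of which half reaches holders, trades at an effective multiple of 10x, even if the headline protocol-level P/FCF looks far cheaper.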

Deep Context Attribution · 03

Deep Context Attribution Model

Structured dataset & attribution model for an entire protocol

Lack of context and verification limits the usefulness of AI-driven hypothesis testing. To help close this AI productivity gap, we build datasets that map all key protocol contracts, variables, and logic with deep explanatory context. These models are constructed and verified manually, making them reliable inputs for future AI-powered analysis and calculations.

Capabilities

  • Track performance, revenue dynamics, and structural changes over time
  • Make allocation decisions based on value drivers
  • Compare tokens using consistent, interpretable metrics
  • Generate investment memos and performance reports from structured data
  • Track material changes in tokenomics, emissions, and revenue composition