How I Evaluate Crypto Projects: A Framework Built From Institutional Research
A practical framework for evaluating crypto projects the way institutions do
For the past five years, my work has revolved around evaluating crypto projects in professional settings — across research, tokenization, and portfolio-level decision making. Over that time, I’ve reviewed hundreds of protocols, assessed their fundamentals, and helped shape frameworks used to decide which assets deserve deeper attention and which should be filtered out early.
After recently relocating to the US and stepping into a more independent phase of my work, I’m starting by publishing the analytical framework I’ve developed over the years. Not as a checklist, and not as investment advice, but as a way of thinking about crypto more clearly, more structurally, and with far less noise.
This framework is not about predicting prices. It’s about understanding what is real, what is durable, and what is structurally sound in an industry that often rewards narratives before fundamentals.
What follows is the mental model I use to evaluate crypto projects — the same one that guided my work across institutional research, portfolio evaluation, and asset selection.
Why a Framework Matters
Most crypto analysis today lives at two extremes.
On one side, you have surface-level commentary driven by price action, narratives, and social media momentum. On the other, deeply technical research that’s inaccessible unless you already live inside protocol codebases.
Institutional evaluation sits in between.
It focuses on structure rather than hype, on durability rather than excitement, and on understanding why something should work — not just whether it might go up.
A good framework helps you:
filter noise quickly
avoid emotionally driven decisions
compare projects objectively
understand long-term viability
recognize weak ideas early
The goal isn’t certainty. It’s clarity.
1. Start With the Documentation
Every serious evaluation begins with primary sources: the whitepaper, technical documentation, or official research posts.
Not skimming — actually reading.
The purpose here is simple: understand what problem the project believes exists.
Good projects articulate this clearly. Weak ones hide behind abstractions.
When reading documentation, I look for three things:
Clarity of the problem: Is the problem concrete, real, and well-scoped?
Soundness of the solution: Does the proposed mechanism actually solve that problem, or just describe an outcome?
Intellectual honesty: Are trade-offs acknowledged, or does everything sound “revolutionary”?
Strong documentation creates focus. It tells you what not to analyze further. Weak documentation already signals future issues.
When I first studied yield tokenization protocols, for example, the strong ones clearly articulated why yield markets were fragmented and how their architecture addressed that fragmentation. That clarity made further analysis worth the effort.
2. Understand the Competitive Landscape
Once the problem is clear, the next question is obvious:
Who else is trying to solve this?
If a project claims to have no competition, one of two things is true:
you are extremely early, or
the problem isn’t real
I use tools like DeFiLlama, Artemis, and CoinGecko to map comparable projects. The goal isn’t to analyze every competitor deeply, but to understand positioning.
At this stage, I ask:
Is this meaningfully differentiated?
Or is it a variation of something that already exists?
What trade-offs does it make relative to incumbents?
Many projects default to vague claims like “faster” or “cheaper” without explaining why those advantages are sustainable.
With experience, patterns emerge quickly. You begin to recognize recycled ideas, copy-paste architectures, and narratives that resurface every cycle under new branding.
True differentiation is usually obvious — and rare.
3. Market Fundamentals
Even strong ideas fail in the wrong market.
This step is about zooming out and asking whether the environment supports the project’s success.
I look at:
market size and growth potential
current adoption and usage
regulatory exposure
maturity of surrounding infrastructure
Data matters here. I rely on:
Dune dashboards
project analytics dashboards
public research from firms like Glassnode, Kaiko, and Messari
What matters most is evidence of real usage.
Are people actually using the product?
Is activity growing?
Is revenue being generated?
Is demand organic or artificially incentivized?
A polished whitepaper without usage is not “early” — it’s unproven.
This step acts as a reality check. It prevents over-indexing on elegant ideas that lack economic gravity.
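To make the “organic or incentivized” question concrete, here is a minimal sketch of the kind of sanity check I mean. The function name, field choices, and every number below are hypothetical, not taken from any real dashboard; real inputs would come from sources like Dune or project analytics.

```python
# Hedged sketch: a simple "usage reality check" on hypothetical protocol data.
# All series and numbers are illustrative, not from any real protocol.

def usage_check(daily_active_users, daily_fees_usd, daily_incentives_usd):
    """Compute sanity metrics from 30-day series (most recent value last)."""
    growth = daily_active_users[-1] / daily_active_users[0] - 1
    total_fees = sum(daily_fees_usd)
    total_incentives = sum(daily_incentives_usd)
    # If incentives paid out dwarf fees earned, demand may be subsidized.
    organic_ratio = total_fees / total_incentives if total_incentives else float("inf")
    return {
        "30d_user_growth": round(growth, 3),
        "fees_vs_incentives": round(organic_ratio, 2),
    }

# Hypothetical protocol: users grew 20% in a month, but incentive
# spending is four times the fees generated.
users = [1000 + 200 * d / 29 for d in range(30)]
fees = [500.0] * 30
incentives = [2000.0] * 30
print(usage_check(users, fees, incentives))
```

Growth alone looks healthy here; the fees-to-incentives ratio is what exposes that the demand is being bought, not earned.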
4. Tokenomics and Value Accrual
This is where many retail participants lose money.
The key question is simple but uncomfortable:
Does the success of the project translate into value for the token?
If the answer is unclear, that’s already a warning sign.
I break token analysis into five components:
Token utility: What does the token actually do? Governance alone is usually weak.
Value accrual mechanism: How does protocol usage benefit token holders? Fees, burns, staking, or revenue participation?
Supply distribution: How much goes to insiders versus the community?
Vesting schedules: Are unlocks short-term or spread over multiple years?
Inflation dynamics: Is supply expanding faster than demand?
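Vesting and inflation interact in a way that’s easy to underestimate, so here is a minimal sketch of the math, under the assumption of a simple linear unlock. The schedule and all token amounts are hypothetical; real numbers would come from tools like Token Terminal or TokenUnlocks.

```python
# Hedged sketch: projecting circulating supply under a hypothetical
# linear vesting schedule. All quantities are illustrative.

def circulating_supply(month, initial_float, locked, vest_months):
    """Locked tokens unlock linearly over vest_months."""
    unlocked = locked * min(month, vest_months) / vest_months
    return initial_float + unlocked

initial_float = 100_000_000   # tokens liquid at launch
locked = 400_000_000          # insider/treasury tokens still vesting
vest_months = 24              # linear unlock over two years

# By month 12 the float has tripled: demand must triple in a year
# just to absorb the new supply at a constant price.
for month in (0, 12, 24):
    print(month, int(circulating_supply(month, initial_float, locked, vest_months)))
```

This is why a small initial float with a large locked allocation can make early price action meaningless: the real supply test arrives months later.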
Tools like Token Terminal and TokenUnlocks help quantify this, but interpretation matters more than numbers.
Many projects design tokenomics to optimize for launch optics — airdrops, incentives, and early activity — rather than long-term sustainability.
The core test is simple:
If this protocol succeeds, does the token meaningfully benefit?
If not, incentives will eventually break.
5. Team and Backing
Projects fail far more often from execution than from ideas.
So I spend time understanding who is actually building.
I look for:
public, verifiable founders
relevant domain experience
full-time commitment
continuity over time
If the team is anonymous, I look for institutional backing as a proxy for diligence. Serious investors do not allocate capital without understanding who they’re backing.
I also look for coherence: does the team’s background actually match the problem they’re solving?
Strong teams don’t guarantee success, but weak or opaque teams dramatically increase risk.
6. Second-Order Due Diligence
Once the core fundamentals pass, I go deeper.
This layer includes:
security audits and how issues were resolved
GitHub activity and development cadence
roadmap execution history
ecosystem integrations
wallet and infrastructure support
quality of community discussion
None of these alone decide an investment. But patterns matter.
One unresolved audit issue might be acceptable.
Multiple unresolved issues across audits, development, and partnerships usually are not.
This stage separates surface-level research from serious analysis.
7. Market Sense (The Non-Checklist Part)
Some judgment can’t be reduced to metrics.
Market sense develops from long exposure: reading research, tracking narratives, understanding incentives, and watching how cycles evolve.
To give a concrete example: I currently spend time following research on post-quantum cryptography.
Not because it’s a hot narrative — it isn’t — but because long-term blockchain security relies on cryptographic assumptions that may eventually be challenged by quantum computing.
Bitcoin and Ethereum today rely on elliptic curve cryptography. Over long horizons, the ecosystem will likely need to transition toward post-quantum systems such as lattice-based, multivariate, or hash-based cryptography.
Most people aren’t thinking about this yet. That’s precisely the point.
Market sense is about understanding what will matter before it becomes a narrative. It’s not prediction; it’s preparation.
This kind of thinking comes from sustained exposure — reading research, following technical discussions, watching where capital and talent slowly accumulate.
Putting It All Together
No single step in this framework is sufficient on its own.
Strong fundamentals with poor timing are risky.
Perfect timing with weak fundamentals is worse.
But when structure, incentives, execution, and timing align, you get clarity.
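One way to see why no single step is sufficient is to sketch how the steps might combine. The weights, the veto threshold, and the example scores below are my own illustrative judgment calls, not a formula from the framework itself; in practice this remains qualitative.

```python
# Hedged sketch: combining the framework's steps into a rough composite
# score. Weights and thresholds are hypothetical judgment calls.

WEIGHTS = {
    "documentation": 0.15,
    "competition": 0.15,
    "market": 0.20,
    "tokenomics": 0.25,
    "team": 0.15,
    "diligence": 0.10,
}

def composite(scores):
    """scores: dict mapping each step to 0..1. A structural failure
    in any single step (below 0.3) vetoes the whole project."""
    if min(scores.values()) < 0.3:
        return 0.0
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

example = {"documentation": 0.8, "competition": 0.6, "market": 0.7,
           "tokenomics": 0.5, "team": 0.9, "diligence": 0.6}
print(round(composite(example), 3))
```

The veto rule is the important design choice: averaging would let a strong team paper over broken tokenomics, which is exactly the failure mode the framework is meant to catch.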
If you’re not willing to do this kind of work, you’re not investing — you’re speculating.
What Comes Next
I’ll be using this framework to analyze real crypto projects in upcoming posts — applying each step transparently so you can see how the process works in practice.
These evaluations will be published in two formats:
Written deep dives here on Substack, for structured analysis
Video breakdowns on YouTube, where I walk through the framework visually and live (link below)
If there’s a project you’d like me to analyze next, leave it in the comments.
This is The Crypto Story — where crypto finally starts to make sense.
YouTube: https://www.youtube.com/@rishabhcryptostory
