
We analyze AI tools through a structural lens, based on business impact and competitive durability. Not feature lists, not user reviews, not hype cycles.
Every tool is plotted on Business Impact × Competitive Defensibility. This reveals strategic position, not surface quality.
We ask one question: Will this tool still matter in 3 years? The answer depends on ecosystem capture and switching costs.
Most AI tools will not survive the next platform cycle. We use a structural framework to identify which ones have lasting power — and which are noise.
Most of these tools solve a temporary friction that platforms will absorb. Without structural differentiation, they become features, not products.
GitHub, Google, Microsoft — major platforms are integrating AI capabilities at speed. Tools without ecosystem lock-in face existential risk.
Durable AI tools become infrastructure. They create switching costs, own workflows, and compound value over time. That is where real value lives.
A visual 2×2 matrix to position AI tools and evaluate their long-term strategic strength. This is not a gimmick — it is the core intellectual framework of this site.
AI Noise
Low impact, low defensibility. Likely short-lived.
Niche Engineering
Technically strong, economically limited.
Platform Target
High impact but vulnerable to absorption.
Category Infrastructure
High impact, high defensibility. Durable.
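As a rough illustration, the four quadrants above can be encoded as a simple classifier. The 0-to-1 scales and the 0.5 cutoffs are illustrative assumptions, not the site's actual scoring method:

```python
# Hypothetical sketch of the Business Impact x Competitive Defensibility matrix.
# Score ranges and thresholds are assumptions for illustration only.

def quadrant(business_impact: float, defensibility: float) -> str:
    """Map scores in [0, 1] on each axis to one of the four quadrants."""
    high_impact = business_impact >= 0.5
    high_defense = defensibility >= 0.5
    if high_impact and high_defense:
        return "Category Infrastructure"  # high impact, high defensibility
    if high_impact:
        return "Platform Target"          # high impact, vulnerable to absorption
    if high_defense:
        return "Niche Engineering"        # strong moat, limited economics
    return "AI Noise"                     # low impact, low defensibility

print(quadrant(0.9, 0.8))  # Category Infrastructure
```

The point of the encoding is only that position on the matrix is a function of two independent axes; a tool's surface quality appears on neither.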
Deep structural evaluations of AI products — analyzing their positioning, durability, and strategic trajectory.
FansOfAI is a positioning lab where we publicly stress-test AI products — including our own — through a structural lens.
Transparency Statement: Some of the tools evaluated early on are developed internally. The framework is applied to them consistently and critically, and external tools will be added over time using the same methodology.
We do not claim neutrality. We claim rigor. Every assessment follows the same structural framework — regardless of origin.
