DeepSeek Shatters the Last Bubble in the Agent Track, DeFAI May Foster New Growth, and Industry Financing Methods Are Set to Change. This article is authored by Kevin, Researcher at BlockBooster, and republished by Foresight News.
(Background Summary: Binance Report: How is DeFAI Reshaping the DeFi Interaction Experience?)
(Additional Background: Legendary Short Seller: Signs of a U.S. Stock Market Bubble Seen, the Biggest Risk in the Coming Year is the Deepseek Effect)
TLDR:
The emergence of DeepSeek has shattered the computational power moat, with open-source models leading the way in computational optimization as a new direction;
DeepSeek benefits the model and application layers in the industry’s upstream and downstream, while negatively impacting computational power protocols in the infrastructure layer;
DeepSeek inadvertently bursts the last bubble in the Agent track, with DeFAI likely to give birth to new growth;
The zero-sum game of project financing may come to an end, with community launches and a small number of VC investments becoming the norm.
The impact of DeepSeek will profoundly affect the upstream and downstream of the AI industry this year. It has made it feasible for consumer-grade graphics cards to handle large-model training tasks that previously required high-end GPUs. The primary moat in AI development, computational power, is beginning to collapse: algorithm efficiency is improving at roughly 68% per year, while hardware performance climbs along the steadier curve of Moore’s Law. The entrenched valuation models of the past three years are no longer applicable, and the next chapter of AI will be opened by open-source models.
Although the AI protocols in Web3 differ entirely from those in Web2, they inevitably bear the influence of DeepSeek, which will give rise to entirely new use cases across the upstream and downstream of Web3 AI: infrastructure layer, middleware layer, model layer, and application layer.
Analyzing the Collaborative Relationships Among Upstream and Downstream Protocols
Through the analysis of technical architecture, functional positioning, and actual use cases, I have categorized the entire ecosystem into: infrastructure layer, middleware layer, model layer, and application layer, outlining their dependencies:
Infrastructure Layer
The infrastructure layer provides decentralized underlying resources (computational power, storage, L1), with computational power protocols including: Render, Akash, io.net, etc.; storage protocols including: Arweave, Filecoin, Storj, etc.; and L1 including: NEAR, Olas, Fetch.ai, etc.
The computational power layer protocols support model training, inference, and framework execution; storage protocols store training data, model parameters, and on-chain interaction records; L1 optimizes data transmission efficiency and reduces latency through specialized nodes.
Middleware Layer
The middleware layer serves as a bridge connecting infrastructure with upper-layer applications, providing framework development tools, data services, and privacy protection, with data labeling protocols including: Grass, Masa, Vana, etc.; development framework protocols including: Eliza, ARC, Swarms, etc.; and privacy computing protocols including: Phala, etc.
The data service layer fuels model training, the development framework relies on the computational power and storage from the infrastructure layer, and the privacy computing layer ensures data security during training/inference.
Model Layer
The model layer is used for model development, training, and distribution, with open-source model training platforms such as: Bittensor.
The model layer depends on the computational power of the infrastructure layer and the data from the middleware layer; models are deployed on-chain via development frameworks; the model market delivers training results to the application layer.
Application Layer
The application layer comprises AI products aimed at end users, with agents including: GOAT, AIXBT, etc.; and DeFAI protocols including: Griffain, Buzz, etc.
The application layer calls pre-trained models from the model layer; relies on privacy computing from the middleware layer; and complex applications require real-time computational power from the infrastructure layer.
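The dependency chain described above can be sketched as a simple data structure. The layer names and example protocols come from the text; the resolution logic below is purely illustrative and does not represent any project's actual API:

```python
# Sketch of the four-layer Web3 AI stack described above.
# Layer names and example protocols are taken from the text;
# the dependency-resolution logic is illustrative only.

LAYERS = {
    "infrastructure": {
        "depends_on": [],
        "examples": ["Render", "Akash", "io.net", "Arweave", "Filecoin", "NEAR"],
    },
    "middleware": {
        "depends_on": ["infrastructure"],
        "examples": ["Grass", "Masa", "Vana", "Eliza", "ARC", "Phala"],
    },
    "model": {
        "depends_on": ["infrastructure", "middleware"],
        "examples": ["Bittensor"],
    },
    "application": {
        "depends_on": ["model", "middleware", "infrastructure"],
        "examples": ["GOAT", "AIXBT", "Griffain", "Buzz"],
    },
}

def resolution_order(layers):
    """Return layers ordered so each appears after all its dependencies."""
    ordered, seen = [], set()

    def visit(name):
        if name in seen:
            return
        for dep in layers[name]["depends_on"]:
            visit(dep)
        seen.add(name)
        ordered.append(name)

    for name in layers:
        visit(name)
    return ordered

print(resolution_order(LAYERS))
# ['infrastructure', 'middleware', 'model', 'application']
```

The ordering makes the article's point concrete: nothing in the application layer can function unless the three layers beneath it are already in place.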
DeepSeek May Have a Negative Impact on Decentralized Computational Power
According to a survey, approximately 70% of Web3 AI projects actually call OpenAI or centralized cloud platforms, with only 15% using decentralized GPUs (such as Bittensor subnet models), and the remaining 15% employing a hybrid architecture (sensitive data processed locally, general tasks on the cloud).
The actual usage rate of decentralized computational power protocols is far below expectations and does not match their market valuations. There are three reasons for the low usage rate: Web2 developers carry over their existing toolchains when migrating to Web3; decentralized GPU platforms have yet to achieve a price advantage; and some projects evade data compliance reviews under the guise of “decentralization” while still relying on centralized clouds for computational power.
AWS/GCP holds over 90% of the AI computational power market share, while Akash’s available computational power is only 0.2% of AWS’s. Centralized cloud platforms enjoy moats such as cluster management, RDMA high-speed networking, and elastic scaling. Decentralized cloud platforms have Web3-adapted versions of these technologies but still suffer from unaddressed deficiencies: latency, with communication delays among distributed nodes running six times those of centralized clouds; and toolchain fragmentation, since PyTorch/TensorFlow do not natively support decentralized scheduling.
DeepSeek reduces computational power consumption by 50% through Sparse Training, enabling consumer-grade GPUs to train models with billions of parameters via dynamic model pruning. Market expectations for high-end GPUs in the short term have been significantly downgraded, and the market potential for edge computing has been revalued. As shown in the figure above, prior to the emergence of DeepSeek, the vast majority of protocols and applications in the industry utilized platforms like AWS, with only a handful of use cases deployed on decentralized GPU networks. These use cases focused on the latter’s price advantage in consumer-grade computational power while disregarding the impact of latency.
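The text attributes DeepSeek's savings to sparse training and dynamic model pruning. As a rough illustration of why sparsity cuts compute, here is a toy magnitude-pruning sketch; this is a generic illustration of the principle, not DeepSeek's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(512, 512))  # a dense weight matrix

# Magnitude pruning: zero out the 50% of weights with the smallest
# absolute value. A sparse kernel can then skip the zeroed
# multiply-accumulates entirely.
threshold = np.quantile(np.abs(W), 0.5)
mask = np.abs(W) >= threshold
W_sparse = W * mask

dense_macs = W.size            # multiplies for one dense matvec
sparse_macs = int(mask.sum())  # only nonzero weights contribute

print(f"kept {sparse_macs / dense_macs:.0%} of weights")
# With 50% sparsity, the multiply count (and, given suitable sparse
# kernels, the compute cost) is roughly halved.
```

This is the mechanism behind the headline claim: halving the active weights roughly halves the arithmetic per forward pass, which is what brings training within reach of consumer-grade GPUs.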
This situation may deteriorate further with the advent of DeepSeek. DeepSeek frees long-tail developers from resource constraints, and low-cost, efficient inference models will proliferate at an unprecedented rate. Many centralized cloud platforms and countries have in fact already begun deploying DeepSeek, and the substantial reduction in inference costs will spawn numerous front-end applications with massive demand for consumer-grade GPUs.

Faced with this impending market, centralized cloud platforms will embark on a new round of user competition, against not only the top platforms but also countless small centralized clouds. The most direct form of competition is price cuts: it can be anticipated that RTX 4090 prices on centralized platforms will drop, which would be catastrophic for Web3’s computational power platforms. When price is no longer their only moat, and the industry’s own computational power platforms are forced to cut prices as well, the result is unsustainable for io.net, Render, Akash, and others. The price war will destroy their last valuation ceiling, and the downward spiral of declining revenue and user loss may force decentralized computational power protocols to pivot in a new direction.
Specific Implications of DeepSeek on Industry Upstream and Downstream Protocols
As shown in the figure, I believe DeepSeek will have different impacts on the infrastructure layer, model layer, and application layer, with positive impacts including:
The application layer will benefit from the substantial reduction in inference costs, allowing more products to keep Agents online for extended periods at low cost and to complete tasks in real time;
At the same time, low-cost models like DeepSeek will enable DeFAI protocols to form more complex Swarms, with thousands of Agents serving a single use case. Each Agent’s role will be granular and precise, significantly improving the user experience and preventing user input from being misinterpreted and mis-executed by the model;
Developers in the application layer can fine-tune models, feeding DeFi-related AI applications with price data, on-chain data and analysis, and governance data, without having to pay exorbitant licensing fees.
The value of the open-source model layer has been demonstrated by DeepSeek’s arrival: high-end models are now available to long-tail developers, stimulating a broad development boom;
The computational power wall built around high-end GPUs over the past three years has been entirely dismantled, giving developers more choices and setting a direction for open-source models. The future competition among AI models will be fought over algorithms, not computational power, and that shift in belief will become the cornerstone of confidence for open-source model developers;
Specific sub-networks around DeepSeek will continue to emerge, with model parameters increasing at equivalent computational power, attracting more developers into the open-source community.
On the negative side:
The latency inherent in the infrastructure layer’s computational power protocols cannot be optimized away;
Moreover, a hybrid network of A100s and 4090s places higher demands on coordination algorithms, which is not a strength of decentralized platforms.
DeepSeek Shatters the Last Bubble in the Agent Track, DeFAI May Foster New Growth, and Industry Financing Methods Are Set to Change
Agents are the industry’s last hope for AI. The emergence of DeepSeek has lifted computational power restrictions and painted a future of exploding applications. While this is a significant boon for the Agent track, its bubble has been punctured by the industry’s strong correlation with the U.S. stock market and Federal Reserve policy, sending the track’s market value plummeting.
In the wave of integration between AI and the industry, technological breakthroughs and market games have always gone hand in hand. The chain reaction triggered by NVIDIA’s market value fluctuations acts like a funhouse mirror, reflecting the deep dilemmas of the AI narrative within the industry: from on-chain Agents to the DeFAI engine, beneath the seemingly complete ecological map lies the harsh reality of weak technological infrastructure, hollow value logic, and capital dominance.

The superficially prosperous on-chain ecosystem conceals hidden ailments: large numbers of high-FDV tokens competing for limited liquidity, obsolete assets clinging to life through FOMO sentiment, and developers trapped in PVP games that exhaust their innovative momentum. When incremental capital and user growth hit a ceiling, the entire industry falls into the “innovator’s dilemma”: longing for narrative breakthroughs yet struggling to shake off path dependency. This state of tension provides a historic opportunity for AI Agents: not merely an upgrade of the technical toolbox, but a restructuring of the value-creation paradigm.
Over the past year, an increasing number of teams in the industry have discovered that traditional financing models are failing: the strategy of giving small shares to VCs while keeping tight control has become hard to sustain. With VCs tightening their purse strings, retail investors refusing to take over, and large exchanges setting high listing thresholds, a new approach better suited to a bear market is emerging: uniting top KOLs with a small number of VCs, launching projects with a large proportion of community allocation, and cold-starting at low market caps.
Innovators represented by Soon and Pump Fun are paving new paths through “community launches”: securing top-KOL endorsements, distributing 40%-60% of tokens directly to the community, launching projects at valuations as low as $10 million FDV, and raising millions of dollars. This model builds consensus FOMO through KOL influence, allowing teams to lock in profits early while exchanging high liquidity for market depth. Although it sacrifices short-term control, it allows tokens to be repurchased at low prices in bear markets through compliant market-making mechanisms. In essence, this is a shift in power structure: from the VC-led game of hot potato (institutions take over, exchanges sell off, retail buys in) to transparent community-consensus pricing, forming a new symbiotic relationship between project teams and their communities through liquidity premiums. As the industry enters a transparency-revolution cycle, projects that cling to traditional control logic may become mere shadows of the past amid the wave of power shifts.
The short-term pain in the market only confirms the irreversibility of the long-term technological tide. When AI Agents reduce on-chain interaction costs by two orders of magnitude, and adaptive models continuously optimize the capital efficiency of DeFi protocols, the industry may finally welcome the long-awaited mass adoption. This transformation relies neither on conceptual hype nor on capital maturation but is rooted in the penetrating power of technology driven by genuine needs: just as the electric revolution did not stall because light bulb companies went bankrupt, Agents will become the true golden track after the bubble bursts.

DeFAI may be the fertile ground for this new growth. As low-cost inference becomes routine, we may soon see use cases in which hundreds of Agents are combined into a Swarm. At equivalent computational power, the significant increase in model parameters means Agents in the open-source model era can be fine-tuned more thoroughly, so that even complex user commands can be broken down into task pipelines that individual Agents execute effectively. Each Agent optimizing on-chain operations could boost the overall activity and liquidity of DeFi protocols. More complex DeFi products led by DeFAI will emerge, and that is precisely where new opportunities arise after the last round of bubbles bursts.
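The swarm idea described above, breaking a complex user command into a pipeline of narrow, single-purpose agents, can be sketched minimally. All agent roles, names, and the dispatch logic here are hypothetical illustrations, not any DeFAI protocol's actual API:

```python
# Minimal sketch of a DeFAI-style agent swarm: a coordinator runs a
# user request through specialist agents, each with one narrow role.
# Every name and value below is a hypothetical stub for illustration.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    role: str
    handle: Callable[[dict], dict]

def price_agent(ctx: dict) -> dict:
    ctx["price"] = 1.02  # stub: a real agent would query a price oracle
    return ctx

def risk_agent(ctx: dict) -> dict:
    ctx["risk_ok"] = ctx["price"] < 1.05  # stub risk check
    return ctx

def execution_agent(ctx: dict) -> dict:
    ctx["executed"] = ctx["risk_ok"]  # stub: would submit the on-chain tx
    return ctx

# The pipeline into which each agent's narrow role composes.
SWARM = [
    Agent("fetch-price", price_agent),
    Agent("check-risk", risk_agent),
    Agent("execute-swap", execution_agent),
]

def run_swarm(request: dict) -> dict:
    """Pass the request context through each agent in order."""
    ctx = dict(request)
    for agent in SWARM:
        ctx = agent.handle(ctx)
    return ctx

result = run_swarm({"intent": "swap 100 USDC for ETH"})
print(result["executed"])  # True in this stubbed example
```

The design point matches the article's argument: because each agent's role is granular, a misread of the user's intent at one step (for example, a failed risk check) stops the pipeline rather than producing a wrong on-chain execution.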