
IBM loses $40 billion, Block lays off half its staff but its stock price rises: In the AI era, what assets are worth tokenizing?

2026/03/19 09:20
13 min read

On February 23, 2026, what should have been a peaceful Monday turned into a brutal single-day plunge for IBM's stock, the worst since October 2000. The stock closed down 13.2%, wiping out approximately $40 billion in market capitalization within hours. The trigger wasn't a disastrous earnings report or regulatory crackdown, but rather a product announcement: AI startup Anthropic announced that its Claude Code tool could modernize the COBOL programming language running on IBM systems—COBOL being a core and highly profitable business for IBM.

Three days later, a similar scenario unfolded in a completely opposite way. On February 26, Block, Jack Dorsey's fintech company, announced layoffs of approximately 4,000 employees, nearly 50% of its workforce, citing the same reason: AI-driven efficiency improvements. However, the market reaction was drastically different—Block's stock price jumped more than 24% in after-hours trading. In his letter to shareholders, Dorsey frankly stated, "I believe that in the coming year, most companies will come to the same conclusion and make similar structural adjustments."


Two events, one driving factor—AI; two drastically different market reactions—one a sharp drop, the other a surge. What exactly happened behind the scenes? The answer may point to a deeper proposition: AI is redefining "what constitutes a valuable asset." For listed company executives, investors, and decision-makers in traditional enterprises, understanding this revaluation logic is no longer a forward-looking strategic consideration, but a matter of immediate survival.

I. Same AI, Different Market Judgments

To understand the contrast between these two events, it is necessary to first examine their respective asset structures.

IBM's stock price plunge, ostensibly triggered by the threat of the Claude Code tool, was in reality the market repricing IBM's core asset model. COBOL, a programming language born in the late 1950s, still powers approximately 95% of ATM transactions globally, along with numerous core systems in critical sectors such as finance, aviation, and government. Anthropic wrote in its blog, "Hundreds of billions of lines of COBOL code run in production environments every day, powering critical systems. Despite this, the number of people who understand COBOL is decreasing year by year."

Modernizing COBOL systems has long been a complex and costly undertaking, and a cornerstone of IBM's lucrative services business. However, Anthropic claims, "With the power of AI, teams can modernize the COBOL codebase in just a few quarters, without spending years." The implication is that IBM's labor-intensive system-maintenance and mainframe-service revenue is being eroded by AI.

Interestingly, IBM's stock price rebounded 2.68% the following day. Wall Street firms such as Wedbush and Evercore ISI quickly came to the stock's defense, calling the plunge an "unfounded overreaction." Their reasoning went straight to the heart of the matter: enterprise customers cannot simply abandon their mainframe systems just because a new AI tool can translate legacy code. There is a significant gap between translating code syntax and modernizing systems with deep hardware-software integration.

IBM itself responded the same day with a key argument: the modernization challenge is not a COBOL-language problem but an IBM Z platform problem. Translated code cannot capture the system's actual complexity, and the platform's value comes from decades of hardware-software integration that code translation cannot transfer.

Let's look at the Block case. It also involved large-scale layoffs and was driven by AI, yet the market's reaction was a 24% increase. The key lies in the changing asset structure of Block. Since 2024, Block has been restructuring its business model and staffing, while heavily investing in AI tools to improve operational efficiency, including developing its own tool called Goose.

In explaining the layoffs, Block's CFO, Amrita Ahuja, emphasized, "We are taking bold and decisive actions, but we are building on our strengths." That "strength" is backed by full-year 2025 gross profit of $10.36 billion, up 17% year over year, a financial performance that gives the company the buffer to undertake a restructuring of this scale.

The market's interpretation is clear: Block is not passively shrinking under the impact of AI, but proactively optimizing its asset structure, trading lower "human capital" headcount for higher "technological capital" output efficiency. Cutting half the workforce while simultaneously raising full-year guidance signals that the output value per unit of human capital is being amplified by AI.

II. In the AI Era, Four Types of Assets Are Being Repriced

These two cases reveal an emerging trend: AI is becoming a "repricing machine" for asset values. Different types of assets exhibit drastically different value curves under the AI evaluation framework.

The first category is human capital-intensive assets. The value of IBM's COBOL maintenance team, traditional analysts, programmers, and other "information processors" is being diluted by AI. Anthropic, in introducing Claude Code, mentioned that the tool can identify "risks that would normally take human analysts months to discover." This doesn't mean humans are no longer important, but rather that the value of jobs relying on information asymmetry and procedural knowledge is being compressed by technology.

There is an important caveat, however: AI replaces "information processing," not "value creation." Mitch Ashley, an analyst at Futurum Group, points out in a research report that successful COBOL modernization projects span multiple dimensions, including business scope definition, technology assessment, data migration planning, behavioral equivalence verification, observability, and organizational change management; code translation is only one part. The human ability to navigate complex systems, understand the essence of a business, and make strategic judgments remains scarce.

The second category is data assets, which are becoming the high-value assets of the AI era. With the rapid development of generative AI, the value attributes of data are being reshaped. Tang et al., in research published in PLOS One, point out that generative AI has changed how data is acquired, processed, and utilized: the value of a data asset depends not only on its intrinsic quality and relevance, but also on its application scenarios, transformation capabilities, and market demand within the generative AI framework.

This means that the uniqueness, continuity, and governability of data are becoming core value dimensions. A dataset may be extremely valuable in one scenario but useless in another. Companies that can provide exclusive, continuous, and high-quality data for AI model training are gaining new pricing power.

The third category is algorithm and model assets. EVMbench, a benchmark built by OpenAI and Paradigm to evaluate AI's ability to detect, patch, and exploit smart contract vulnerabilities, demonstrates that algorithmic capability is becoming quantifiable. Model weights, algorithmic frameworks, and training methodologies are becoming identifiable, controllable, and monetizable intangible assets.

The fourth category is traditional tangible assets, which are diverging. Physical assets reliant on "information asymmetry" and "human intermediaries" face depreciation pressure, while physical assets with "AI-resistant" attributes—such as energy facilities, scarce resources, and core infrastructure—remain relatively stable in value. The reason is simple: AI can analyze and optimize how these assets operate, but it cannot replace their physical existence and value-carrying function.

III. From "Asset Revaluation" to "AI Immunity"

Based on the above analysis, enterprises need a systematic framework to determine whether their assets will appreciate or depreciate in the AI era. The RWA Research Institute proposes an "AI-immune" asset identification framework with three core characteristics.

The first characteristic is non-encodability: value elements that AI finds difficult to fully learn or replicate. While COBOL code itself can be translated by AI, the transaction-processing capability, quantum-safe encryption, and 99.999999% reliability that Z-series mainframes build in at the chip level are things AI tools cannot replicate. Research from Futurum Group points out that "code translation cannot capture the actual complexity; the platform's value comes from decades of hardware and software integration." Similarly, offline control of scenarios, tacit industry knowledge, and complex relationship networks—elements that resist "encoding"—constitute an asset's first line of defense.

The second characteristic is the data moat. Does the company possess exclusive, sustainable, and governable data assets? Does it merely use publicly available data, or can it generate data that others cannot access? CITIC Bank has begun exploring the use of large models to assess the value of data assets and is attempting to "include data assets on its balance sheet." The logic is that in the AI era, data is not only a raw material of production but an asset in itself. Not all data confers a moat, however: publicly available online data will soon be "digested" by AI models, and only companies with exclusive data sources can command a premium under the AI valuation framework.

The third characteristic is AI-enabled resilience: can the asset itself be enhanced rather than replaced by AI? This is the key difference between an IBM-style disruption and a Block-style transformation. IBM's core business of maintaining legacy COBOL systems is what AI "replaces," while Block's business model of payments and financial services can be "empowered" by AI. In fact, IBM has itself released watsonx Code Assistant for Z, a dedicated tool that lets customers securely refactor and modernize legacy code directly on the platform while preserving enterprise-grade security. When assets can collaborate with AI rather than compete against it, their value rises.

Conversely, AI-vulnerable assets also exhibit three characteristics: they rely on "information processing" as their core value, they can be replaced by standardized processes, and they lack the ability to generate and accumulate data. By comparing these three characteristics, companies can conduct "stress tests" on their asset portfolios.
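The three-characteristic stress test described above can be sketched as a simple scoring exercise. This is an illustrative sketch only: the 0-10 rating scale, the equal weighting, the threshold, and the example assets and scores are all hypothetical assumptions for demonstration, not figures from the article or any real company.

```python
# Hypothetical "AI immunity" stress test: rate each business unit 0-10 on the
# three characteristics, then average. All names and numbers are illustrative.

def ai_immunity_score(non_encodability: int, data_moat: int, ai_resilience: int) -> float:
    """Average of three 0-10 ratings; higher means more 'AI-immune'."""
    return (non_encodability + data_moat + ai_resilience) / 3

# Example portfolio with made-up ratings for each characteristic.
portfolio = {
    "legacy COBOL maintenance": (3, 2, 1),   # information processing, easily standardized
    "payments platform":        (6, 8, 9),   # exclusive transaction data, AI-enhanceable
    "energy infrastructure":    (9, 4, 6),   # physical existence AI cannot replace
}

for asset, ratings in portfolio.items():
    score = ai_immunity_score(*ratings)
    # Arbitrary illustrative threshold: 5 out of 10.
    verdict = "retain / invest" if score >= 5 else "transform / divest"
    print(f"{asset}: {score:.1f} -> {verdict}")
```

In this toy run, the maintenance business scores 2.0 and falls into the "transform / divest" bucket, while the other two clear the threshold, mirroring the IBM-versus-Block contrast the article draws.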

IV. New Opportunities for RWA: What Assets Are Worthy of Tokenization?

Extending this framework to RWA (real-world asset tokenization), a clear conclusion follows: RWA is not about putting "any asset on the chain," but about screening, amid the wave of AI revaluation, for hard assets that can survive the AI cycle.

In March 2026, the total value of on-chain RWAs exceeded $25 billion, nearly quadrupling from the previous year. However, the Hong Kong Web3.0 Standardization Association stated plainly in its RWA industry white paper released in August 2025 that "the idea that everything can be RWA is a false proposition." Assets that successfully achieve large-scale deployment must clear three major hurdles: value stability, clear legal ownership, and verifiability of off-chain data.

Combining this with the "AI immunity" framework, we can refine the criterion further: assets worthy of tokenization are primarily those whose value remains stable through AI revaluation.

The first category consists of physical assets with "AI immunity" characteristics, including energy assets, infrastructure, and scarce resources. Their value depends not on information processing but on physical existence and practical utility. New-energy RWAs (such as charging stations and photovoltaic assets) and computing-power assets like GPUs, both mentioned in the white paper, fall into this category. Among them, GPU computing-power assets, with "rigid demand" from the AI industry and a trustworthy "digital DNA," are becoming ideal anchor assets for RWA.

The second category is programmable data assets: assets with exclusive data sources that can be automatically monetized through smart contracts, combining a "data moat" with "AI-enabled resilience." The white paper groups data with intellectual property and carbon credits as intangible assets. Note, however, that not all data can become an asset: only data that is continuously generated, clearly owned and licensed, and verifiable has the foundation for tokenization.

The third type is hybrid assets, which combine "non-encodable" physical control with "programmable" digital rights. For example, ownership of commercial real estate can be tokenized while actual operation, maintenance, and leasing—control of the offline scenario—remains in the hands of professional institutions. This two-tier "physical + digital" structure captures the liquidity advantages of blockchain while retaining an "AI-immune" offline value anchor.

Conversely, two types of assets warrant caution before tokenizing in the AI era: financial assets that rely heavily on human intermediaries, whose value AI can easily compress, and standardized assets without a data moat, which lack bargaining power under the AI valuation framework.

V. Action Guidelines: From Cognition to Decision Making

IBM's $40 billion loss signals an era—assets reliant on information asymmetry and manpower are being revalued by AI. Block's counter-trend rise heralds another era—companies that embrace AI and optimize their asset structure are being re-priced by the market.

For decision-makers in listed companies and traditional enterprises, this is not just about technological anxiety, but a fundamental restructuring of the asset value system. CEOs need to answer an unavoidable question: How much is my asset portfolio worth in the eyes of AI?

Based on the analysis in this article, three actionable suggestions can be proposed.

First, immediately run an "AI stress test" on your assets. Evaluate each core business unit against the three characteristics of the "AI immunity" framework—non-encodability, data moat, and AI-enabled resilience. Identify which businesses are most exposed to value depreciation under AI, and which may benefit from AI's amplifying effect.

Second, establish a dynamic asset portfolio management mechanism. In the context of AI revaluation, asset allocation is no longer a static "buy and hold" strategy. Companies need to consciously increase the proportion of "AI-immune" assets while developing transformation or divestiture plans for those AI-vulnerable assets. This is not solely the responsibility of the finance department; it requires collaboration between the strategy, technology, and business departments.

Third, re-examine the RWA strategy. Before considering asset tokenization, screen underlying assets with the "AI immunity" framework. The core value of RWA is not being "on-chain" per se, but achieving better liquidity and pricing efficiency for high-quality assets through tokenization. If the underlying assets are themselves depreciating in the AI era, tokenization will only accelerate the loss of value.

Finally, note that under Document No. 42, jointly issued by eight Chinese government departments, any form of token issuance and tokenized trading is strictly prohibited within mainland China. The RWA tokenization discussed in this article refers only to asset digitization practices within an overseas compliance framework. Enterprises exploring related businesses must strictly observe the regulatory red line: "strictly prohibited domestically, registration required overseas."

When AI starts pricing assets, the only sense of security comes from things that AI cannot price—not code, not data, but the human ability to judge value itself.


(This article is based on publicly available information and data, sourced from authoritative media and research institutions including Nasdaq, Tencent News, Futurum Group, PLOS One, 21st Century Business Herald, and Commercial Times. The views expressed herein do not constitute any investment advice.)
