Oracle

Oracles are essential infrastructure components that feed real-time, off-chain data (such as price feeds, weather, or sports results) into blockchain smart contracts. Decentralized oracle networks such as Chainlink and Pyth underpin most of DeFi, which could not function without reliable external data. In 2026, oracles have evolved to support verifiable randomness and cross-chain data synchronization. This tag covers the technical evolution of data availability, tamper-proof price feeds, and the critical role oracles play in supplying external data to complex decentralized applications, whose on-chain execution must remain deterministic.

5122 Articles
Created: 2026/02/02 18:52
Updated: 2026/02/02 18:52
Chainlink (LINK) Price Jumps as Grayscale Launches First U.S. Chainlink ETF on NYSE Arca

TLDR: Grayscale launched the first U.S. Chainlink ETF (GLNK) on NYSE Arca on Tuesday, converting it from a private trust that had operated since 2021. LINK rose to $12.68, a 10% gain, with strong support at $12.00 and resistance at $13.50 and $15.00. The ETF uses a cash-only model and became the first product [...]

Author: Coincentral
How can Google, with its market cap nearing $4 trillion, use the TPU to make a significant impact in the blockchain field?

Author: Eli5DeFi
Compiled by: Tim, PANews

PANews Editor's Note: On November 25, Google's total market capitalization reached a record high of $3.96 trillion. Factors behind the surge included the newly released Gemini 3, its most powerful AI model to date, and its self-developed TPU chips. Beyond AI, the TPU may also play a significant role in blockchain technology.

The hardware narrative of modern computing has largely been defined by the rise of the GPU. From gaming to deep learning, NVIDIA's parallel architecture has become a de facto industry standard, gradually pushing CPUs into a coordinating role. However, as AI models hit scaling bottlenecks and blockchain technology moves toward complex cryptographic applications, a new contender has emerged: the Tensor Processing Unit (TPU). Although the TPU is usually discussed within the framework of Google's AI strategy, its architecture happens to align with the core needs of post-quantum cryptography, the next milestone in blockchain technology. By reviewing the evolution of hardware and comparing architectural features, this article explains why TPUs, rather than GPUs, are better suited to the intensive mathematical operations that post-quantum cryptography demands when building decentralized networks resistant to quantum attacks.

Hardware evolution: from serial processing to systolic arrays

To understand why the TPU matters, first consider the problems it solves.

Central Processing Unit (CPU): the all-rounder. It excels at serial processing and branching logic, but contributes little when massive numbers of mathematical operations must run simultaneously.

Graphics Processing Unit (GPU): the parallel-processing specialist. Originally designed to render pixels, it excels at executing large numbers of identical tasks at once (SIMD: Single Instruction, Multiple Data), which made it the mainstay of the early AI boom.

Tensor Processing Unit (TPU): a specialized chip designed by Google specifically for neural network workloads.

Advantages of the systolic array

The fundamental difference between GPUs and TPUs lies in how they move data. A GPU must repeatedly access memory (registers, caches) during computation, while a TPU employs a systolic array: like a heart pumping blood, data flows through a large grid of compute cells in regular pulses.

[Diagram source: https://www.ainewshub.org/post/ai-inference-costs-tpu-vs-gpu-2025]

Each intermediate result is passed directly to the next compute unit without being written back to memory. This design greatly alleviates the von Neumann bottleneck, the latency caused by shuttling data back and forth between memory and processor, and thereby achieves an order-of-magnitude increase in throughput for certain mathematical operations.

The key to post-quantum cryptography: why does blockchain need the TPU?

The TPU's most important application in blockchain is not mining but cryptographic security. Current blockchain systems rely on elliptic curve cryptography or RSA, both of which are fatally weak against Shor's algorithm. Once a sufficiently powerful quantum computer exists, an attacker could derive private keys from public keys, potentially wiping out all crypto assets on Bitcoin or Ethereum. The solution lies in post-quantum cryptography (PQC).
Currently, the mainstream PQC standard algorithms (such as Kyber and Dilithium) are all based on lattice cryptography.

The mathematical fit of the TPU

This is precisely where TPUs hold an advantage over GPUs. Lattice cryptography leans heavily on dense operations over large matrices and vectors, chiefly:

Matrix-vector multiplication: computing As + e, where A is a public matrix and s and e are vectors.

Polynomial operations: algebraic operations over rings, usually implemented with number theoretic transforms (NTTs).

A general-purpose GPU treats these computations as generic parallel tasks, whereas a TPU accelerates them with fixed-function matrix units in hardware. The mathematical structure of lattice cryptography and the physical layout of the TPU's systolic array form an almost seamless topological mapping.
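To make the shape of that workload concrete, here is a minimal sketch of an LWE-style sample b = As + e in Python with jax.numpy, the TPU-native stack mentioned later in this article. The parameters are toy values of my choosing, not the real Kyber or Dilithium parameter sets; the point is simply that the dominant cost is one dense matrix-vector product.

```python
import jax
import jax.numpy as jnp

q = 3329   # Kyber's modulus, used here only for flavor
n = 64     # toy dimension, kept small so int32 sums cannot overflow

key = jax.random.PRNGKey(0)
k_a, k_s, k_e = jax.random.split(key, 3)
A = jax.random.randint(k_a, (n, n), 0, q)    # public matrix, entries in [0, q)
s = jax.random.randint(k_s, (n,), 0, q)      # secret vector
e = jax.random.randint(k_e, (n,), -2, 3)     # small error term in [-2, 2]

@jax.jit
def lwe_sample(A, s, e):
    # One dense matrix-vector product dominates the cost; on a TPU,
    # XLA lowers this jnp.dot onto the systolic matrix unit (MXU).
    return (jnp.dot(A, s) + e) % q

b = lwe_sample(A, s, e)
print(b.shape)   # (64,)
```

The same code runs unchanged on CPU, GPU, or TPU; only the compiler backend changes, which is exactly the hardware-mapping argument the article is making.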
The technical battle between TPU and GPU

While the GPU remains the industry's universal workhorse, the TPU holds a clear edge on specific math-intensive tasks. In short: GPUs win on versatility and ecosystem, while TPUs win on the efficiency of dense linear algebra, the core mathematics on which both AI and modern advanced cryptography rely.

TPU extends the narrative: zero-knowledge proofs and decentralized AI

Beyond post-quantum cryptography, TPUs show application potential in two other key areas of Web3.

Zero-knowledge proofs

ZK-rollups (such as Starknet or zkSync), as scaling solutions for Ethereum, require massive computation during proof generation, mainly:

Fast Fourier Transforms (FFTs): rapid conversion between data representations.

Multi-scalar multiplication (MSM): point operations on elliptic curves.

The FRI protocol: a cryptographic proof system for verifying polynomials.

These are not the hash computations at which ASICs excel, but polynomial mathematics. Compared with general-purpose CPUs, TPUs can significantly accelerate FFTs and polynomial commitment operations; and because these algorithms have predictable dataflow, TPUs can often achieve better acceleration than GPUs.
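As a toy illustration of that transform-heavy pattern, the sketch below multiplies two polynomials via FFT with jax.numpy. This is my own floating-point stand-in rather than any prover's actual code: production systems use number theoretic transforms over finite fields, but the dataflow (transform, pointwise multiply, inverse transform) has the same shape.

```python
import jax.numpy as jnp

def poly_mul_fft(a, b):
    # Coefficient vectors in, coefficient vector of the product out.
    n = a.shape[0] + b.shape[0] - 1
    size = 1 << (n - 1).bit_length()        # pad to a power of two
    fa = jnp.fft.rfft(a, size)              # forward transform
    fb = jnp.fft.rfft(b, size)
    c = jnp.fft.irfft(fa * fb, size)[:n]    # pointwise multiply, invert
    return jnp.rint(c).astype(jnp.int32)

# (1 + 2x + 3x^2)(4 + 5x) = 4 + 13x + 22x^2 + 15x^3
print(poly_mul_fft(jnp.array([1., 2., 3.]), jnp.array([4., 5.])))  # [ 4 13 22 15]
```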
Decentralized AI

With the rise of decentralized AI networks such as Bittensor, nodes need the ability to run AI model inference. Running a large language model is, at its core, massive matrix multiplication. Compared with GPU clusters, TPUs let decentralized nodes serve inference requests at lower energy cost, improving the commercial viability of decentralized AI.

The TPU ecosystem

Although most projects still rely on GPUs thanks to CUDA's entrenchment, the following areas are candidates for TPU integration, especially within the post-quantum cryptography and zero-knowledge proof narratives.

Zero-knowledge proofs and scaling

Why a TPU? Because ZK proof generation requires massively parallel polynomial operations, and under certain configurations TPUs handle such workloads far more efficiently than general-purpose GPUs.

Starknet (layer-2 scaling): STARK proofs lean heavily on Fast Fourier Transforms and Fast Reed-Solomon Interactive Oracle Proofs, computationally intensive operations that map well onto the TPU.

zkSync (layer-2 scaling): its Airbender prover must handle large-scale FFTs and polynomial operations, the core bottleneck a TPU could crack.

Scroll (layer-2 scaling): built on the Halo2 and Plonk proof systems; its core operations, KZG commitment verification and multi-scalar multiplication, suit the TPU's systolic architecture.

Aleo (privacy-preserving chain): focuses on zk-SNARK proof generation, whose polynomial mathematics fits the TPU's dedicated compute throughput.

Mina (lightweight blockchain): uses recursive SNARKs; continuously regenerating proofs means repeated polynomial work, which highlights the TPU's efficiency.

Zcash (privacy coin): the classic Groth16 proof system relies on polynomial operations; though an older design, it would still benefit significantly from high-throughput hardware.

Filecoin (DePIN, storage): its proof-of-replication mechanism verifies stored data through zero-knowledge proofs and polynomial encoding techniques.

Decentralized AI and agent computing

Why a TPU? This is the TPU's native application: it was designed specifically to accelerate neural network workloads.

Bittensor (decentralized AI): decentralized inference at its core, a direct match for the TPU's tensor computing.

Fetch.ai (AI agents): autonomous agents rely on continuous neural network inference, and TPUs can run these models at lower latency.

SingularityNET (AI service marketplace): a marketplace for AI services whose underlying model execution gains speed and cost-efficiency from TPU integration.

NEAR (blockchain, AI pivot): its move toward on-chain AI and trusted-execution-environment agents depends on tensor operations that TPU acceleration serves well.

Post-quantum cryptography networks

Why a TPU? Core PQC operations often reduce to lattice problems such as finding the shortest vector; these dense matrix and vector workloads closely resemble AI workloads in computational shape.

Algorand (blockchain): adopts quantum-safe hashing and vector-operation schemes well matched to the TPU's parallel math.

QAN (quantum-resistant chain): uses lattice cryptography whose polynomial and vector operations are nearly isomorphic to the math the TPU specializes in.

Nexus (compute platform, zkVM): its quantum-resistant groundwork involves polynomial and lattice-basis algorithms that map efficiently onto the TPU.

Cellframe (quantum-resistant chain): its lattice and hash cryptography involves tensor-like operations, an ideal candidate for TPU acceleration.

Abelian (privacy token): centered on post-quantum lattice operations; like QAN, its architecture benefits fully from TPU vector throughput.

Quantus (blockchain): post-quantum signatures rely on large-scale vector operations, which TPUs parallelize far better than standard CPUs.

Pauli (compute platform): quantum-safe computing involves heavy matrix work, precisely the TPU's core strength.

Development bottlenecks: why hasn't the TPU been widely adopted?

If TPUs are so efficient at post-quantum cryptography and zero-knowledge proofs, why is the industry still scrambling for H100s?

The CUDA moat: NVIDIA's CUDA libraries are the de facto industry standard, and the vast majority of cryptography engineers build on CUDA. Porting code to the JAX or XLA stacks the TPU requires is technically demanding and resource-intensive.

Cloud gatekeeping: high-end TPUs are available almost exclusively through Google Cloud. A decentralized network that leans on a single centralized cloud provider faces censorship risk and a single point of failure.

Architectural rigidity: if a cryptographic algorithm needs irregular control flow (branching logic), TPU performance drops sharply; GPUs handle such irregularity far better.

Hash limitations: TPUs cannot replace Bitcoin mining rigs. SHA-256 is built from bit-level operations, not matrix math, so TPUs are useless there.

Conclusion: a layered architecture is the future

The future of Web3 hardware is not a winner-takes-all competition but an evolution toward layers. GPUs will keep leading general computing, graphics rendering, and tasks with complex branching logic. TPUs (and similar ASIC-style accelerators) will gradually become standard equipment for the Web3 "mathematics layer," purpose-built to generate zero-knowledge proofs and verify post-quantum signatures. As blockchains migrate to post-quantum security standards, the massive matrix operations behind transaction signing and verification will make the systolic array not an option but essential infrastructure for scalable, quantum-safe decentralized networks.

Author: PANews
BlackRock warns AI boom could drive U.S. borrowing costs sharply higher

Investment giant BlackRock has changed its tune on long-term U.S. government bonds, saying a flood of spending on artificial intelligence could make borrowing more expensive. The firm's research division said Tuesday it is now bearish on these bonds after previously sitting on the fence. The outlook covers the next six to 12 months.

Here's the issue: tech companies are getting ready to borrow hundreds of billions of dollars to pay for AI projects. Their balance sheets look solid, but this new debt is piling on top of what the U.S. government already owes, more than $38 trillion as previously reported by Cryptopolitan.

Rising leverage creates vulnerabilities

"Higher borrowing across public and private sectors is likely to keep upward pressure on interest rates," the BlackRock Investment Institute wrote in its 2026 outlook report. The institute gathered views from senior investment managers at the world's largest asset management company, and they are seeing warning signs. "A structurally higher cost of capital raises the cost of AI-related investment and affects the broader economy," the report said.

There is also the problem of more debt making things fragile. The system becomes vulnerable "to shocks such as bond yield spikes tied to fiscal concerns or policy tensions between managing inflation and debt servicing costs."

AI investment still drives stock optimism

Still, BlackRock hasn't soured on U.S. stocks. The firm thinks AI investment will keep pushing stock prices higher next year. Revenue gains from AI should lift the broader economy, though not every company will cash in equally. "Entirely new AI-created revenue streams are likely to develop. How those revenues are shared is likely to evolve – and we don't yet know how. Finding winners will be an active investment story," the institute said.

The report admitted AI might eventually help government finances through better productivity and more tax revenue, but that will take time. Major tech firms like Oracle, Meta, and Alphabet have already issued massive bond sales this year to fund AI infrastructure. The borrowing wave comes as AI spending has become a backbone of U.S. economic growth.

BlackRock also turned more negative on Japanese government bonds, pointing to higher interest rates ahead and more bonds hitting the market. There was one bright spot: the firm warmed up to debt from developing countries, flipping to a positive view from a negative one, thanks to fewer new bond issues and healthier government finances in those places.

Author: Coinstats
Grayscale Launches Chainlink Trust ETF (GLNK) With Zero Fees as LINK Infrastructure Demand Surges

Grayscale has taken another major step into the tokenized economy. The asset manager has officially launched the Grayscale Chainlink Trust ETF (ticker: GLNK) on NYSE Arca, giving U.S. investors a new way to access Chainlink's expanding role across decentralized finance, tokenized markets, and enterprise blockchain adoption.

The launch marks the first-ever Chainlink ETF available in the United States, a move that signals growing institutional recognition of Web3 infrastructure, especially the oracle networks powering on-chain data, interoperability, and smart contract automation. The ETF goes live with zero management fees, a notable choice at a time when digital asset exchange-traded products face increasing competition for investor attention.

"Grayscale Chainlink Trust ETF (Ticker: $GLNK) with 0% fees is now trading. The first @chainlink ETP in the U.S. — from Grayscale, the world's largest crypto-focused asset manager. Gain exposure to $LINK, the core infrastructure for connecting blockchains to the real world.…" — Grayscale (@Grayscale), December 2, 2025

A New Access Point for LINK Exposure

GLNK provides investment exposure to LINK, the native token of Chainlink. But Grayscale emphasizes a critical point: this is not a direct investment in LINK. The fund is not registered under the Investment Company Act of 1940, which means:

It carries significant investment risks.
It behaves differently from traditional ETFs.
It may not be suitable for all retail or institutional investors.

Despite these caveats, the new ETF offers something traditional financial markets have been slowly warming up to: secure, regulated exposure to digital asset infrastructure without requiring self-custody or direct crypto trading.

From Private Trust to Public ETF

GLNK has a long history before its official exchange debut. It was originally launched in February 2021 as a private placement limited to accredited investors, and it entered OTC trading in May 2022, giving broader market participants indirect access to LINK.…

Author: BitcoinEthereumNews
Prompt Engineering Urges ‘Legal Clearance Prompting’ As A Vital Technique To Protect From Getting Jammed Up By AI Unlawful Responses

Prompt engineering welcomes a new technique that helps you stay legally mindful and could be a bit of a lifesaver.

In today's column, I examine a new technique in prompt engineering that provides a powerful way to keep out of trouble when relying on generative AI and large language models (LLMs) as your overall oracle for answers to all manner of questions. This has to do with being legally mindful as a prudent personal strategy. The idea is that you give the AI an initiator prompt that gets it to identify potential legal ramifications of the responses being generated. Doing so can be a bit of a lifesaver, or at least a handy heads-up that otherwise might not have been top of mind.

You see, lots of answers can contain unstated legal implications, and you might never think to ask whether there are lawful ramifications associated with the matter at hand. For example, suppose you innocently ask the AI for instructions on how to fly a drone. The AI will undoubtedly provide them. Meanwhile, unbeknownst to you, flying your drone in certain circumstances and jurisdictions might be against the law (e.g., rules against flying after midnight or over school grounds). All you would have in hand are instructions for physically flying the drone.

By giving the AI a special prompt, the LLM will inform you about legal aspects that might be pertinent to the questions and answers of your AI-based dialogue. I provide you with a template for this special prompting that you can readily use whenever desired. It is known as the Legal Clearance prompt. Let's talk about it.

This analysis of AI breakthroughs is part of my ongoing Forbes column coverage on the latest…
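The column's actual template is truncated in this excerpt, so the sketch below is a hypothetical reconstruction of the general pattern rather than the author's Legal Clearance prompt: an initiator message asking the model to flag legal considerations before answering, wrapped in a small Python helper. The wording, helper name, and message format are all illustrative assumptions.

```python
# Hypothetical initiator prompt in the spirit of "Legal Clearance
# prompting"; the article's real template is not shown in this excerpt.
LEGAL_CLEARANCE_PROMPT = (
    "For the rest of this conversation, before giving your final answer, "
    "briefly note any potential legal considerations or jurisdiction-"
    "dependent rules that might apply to my question, then answer it."
)

def with_legal_clearance(user_question: str) -> list[dict]:
    # Prepend the initiator as a system-style message; the exact message
    # schema depends on whichever LLM API you happen to use.
    return [
        {"role": "system", "content": LEGAL_CLEARANCE_PROMPT},
        {"role": "user", "content": user_question},
    ]

messages = with_legal_clearance("How do I fly my new drone?")
```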

Author: BitcoinEthereumNews
Inside the gigawatt supercomputers reshaping our world

We are all talking about data centers, but few of us really know what they are. We hear about them consuming large amounts of energy, driving up electricity bills, and somehow being the engines of the AI revolution. But what exactly is going on inside these large facilities? And why are tech companies such as Meta, Microsoft, and OpenAI competing to build structures that could cover Manhattan, consuming enough electricity to power millions of American homes? Let's break it down with the most ambitious data center projects ever planned and the staggering numbers behind them.

What actually is a data center?

At its core, a data center is simple infrastructure: a facility with thousands of computer servers, storage systems, and networking equipment organized in racks and rows. But modern hyperscale data centers are essentially giant supercomputers, industrial facilities where hundreds of thousands of processors work in a synchronized way to train AI models, run cloud services, and power the digital infrastructure on which we are all increasingly dependent.

The key components:

Servers containing CPUs and GPUs handle the computational work, with GPUs accounting for roughly 60% of total electricity consumption in AI-focused facilities.
Cooling systems prevent the equipment from overheating, consuming 7% to 30% of total power; some say that number can be as high as 45%.
Power infrastructure includes uninterruptible supplies and backup generators ensuring continuous operation, because when you're training a billion-dollar AI model, power interruptions cannot happen.

The age of the gigawatt cluster

Colossus: speed at an unprecedented scale

When Elon Musk's xAI announced plans to build the world's largest AI training cluster in Memphis, Tennessee, most experts predicted it would take 18 to 24 months. Instead, xAI accomplished it in 122 days. That is how Elon Musk gets stuff done. Colossus 1 launched in September 2024 with 100,000 Nvidia H100 GPUs, then…
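To put those GPU counts and percentages on one scale, here is a rough back-of-envelope (my own arithmetic, not from the article), assuming roughly 700 W per H100, Nvidia's rated SXM TDP, and the roughly 60% GPU share of facility power cited above.

```python
# Back-of-envelope cluster power draw. 700 W per GPU is the H100 SXM
# TDP; the 60% GPU share of facility power is the figure cited above.
gpus = 100_000                          # Colossus 1 at launch
gpu_watts = 700
gpu_power_mw = gpus * gpu_watts / 1e6   # 70 MW for the GPUs alone
facility_mw = gpu_power_mw / 0.60       # ~117 MW total facility draw
print(f"GPUs: {gpu_power_mw:.0f} MW, facility: ~{facility_mw:.0f} MW")
```

Even that is still an order of magnitude below the gigawatt scale of the title, which the newest planned campuses aim for.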

Author: BitcoinEthereumNews
Taurus–Everstake Launch Institutional Staking Alliance to Unlock Billions in PoS Yields Worldwide

Key Takeaways:

Taurus has integrated Everstake's enterprise staking infrastructure into its regulated digital-asset platform, giving banks access to compliant PoS rewards.
Institutions can now stake major assets including SOL, NEAR, …

Author: Crypto Ninjas
Flow has transitioned to DeFi; the confidence and predicament of the former NFT leader.

Author: Nancy, PANews

After the brutal baptism of market cycles, few survivors remain in the NFT sector. Even Flow, once a top performer, could not escape the changing times and has begun seeking new growth. On December 2, Flow announced its transformation into a democratized, consumer-grade DeFi platform, a strategic shift that has attracted significant market attention. Leveraging its large user base and distinctive technology, Flow is attempting to adapt to market changes and save itself. Whether it can secure a place in the fiercely competitive DeFi arena, however, remains a huge question mark.

Launching DeFi lending and wealth management products, and upgrading to a deflationary token

"Today's DeFi is hostile; users must possess advanced technical skills to survive, with issues like slippage, MEV, and cascading liquidations constantly emerging. Every interface is designed for experts, forcing the rest to the margins. This is precisely the gap we aim to fill," wrote Roham, CEO of Dapper Labs. In response, Flow's new goal is consumer-oriented DeFi: letting ordinary users enjoy the benefits of the crypto world without needing to be technical experts, with an experience genuinely easy enough for mainstream users.

Flow is building a series of network-level components called "built-in protocols," which function like public financial infrastructure embedded directly in the network layer. In DeFi, built-in protocols can provide shared liquidity across the entire ecosystem and integrate liquidity pools from different verticals, avoiding liquidity fragmentation and sparing new projects the cold-start problem.

Flow Credit Market (FCM), an automated lending protocol, is the first built-in protocol developed by the Flow Foundation. It uses Flow's native on-chain scheduling system to set periodic triggers without external oracles, significantly reducing liquidation risk while raising loan-to-value (LTV) ratios, and thereby delivering higher natural returns to both lenders and borrowers. Roham pointed out that traditional DeFi lending is typically punitive, acting only to liquidate and charge penalties once a user's position nears liquidation. FCM instead employs proactive risk management, continuously and automatically monitoring each position on-chain and rebalancing it before risks materialize (a toy sketch of this idea follows at the end of this section). Internal risk simulations show that FCM protected user deposits from liquidation through numerous major market crashes, while cutting costs by up to 99.9% compared with lending protocols on other networks.

To accelerate the rollout of FCM-based services, Dapper Labs has launched Peak Money, a consumer-grade financial app aimed at becoming the crypto gateway for the next 100 million users. According to Roham, users can deposit cash or crypto assets (such as Bitcoin, Ethereum, and FLOW) into Peak Money and earn higher returns than any bank (APY up to 25% for cryptocurrencies and 10% for cash), with funds accessible at any time. The product has no minimum investment, no gatekeepers, no seed phrase, and no liquidation risk; Peak Money will publish details of coverage for specific loss events at official launch. Currently, Peak Money has an open waiting list.
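Here is a minimal sketch of the proactive-rebalancing idea described above. The thresholds, helper names, and rebalancing formula are all illustrative assumptions; FCM's actual logic (which runs on Flow's own runtime and scheduler) is not published in this article.

```python
from dataclasses import dataclass

# Illustrative thresholds; FCM's real parameters are not given here.
LIQUIDATION_LTV = 0.85   # hard limit where punitive liquidation would hit
TARGET_LTV = 0.75        # early threshold at which we rebalance instead

@dataclass
class Position:
    collateral: float    # collateral value, USD
    debt: float          # borrowed value, USD

    @property
    def ltv(self) -> float:
        return self.debt / self.collateral

def scheduled_check(pos: Position) -> str:
    """Runs on a periodic on-chain trigger (the role the article says
    Flow's native scheduler plays, removing keeper bots and oracles)."""
    if pos.ltv < TARGET_LTV:
        return "healthy"
    # Sell collateral worth x and repay x of debt so the new LTV hits
    # the target: (debt - x) / (collateral - x) = TARGET_LTV. Ignores
    # fees and slippage for simplicity.
    x = (pos.debt - TARGET_LTV * pos.collateral) / (1 - TARGET_LTV)
    pos.debt -= x
    pos.collateral -= x
    return f"rebalanced early: repaid {x:.2f} well before LTV {LIQUIDATION_LTV}"

print(scheduled_check(Position(collateral=100.0, debt=80.0)))  # repays 20.00
```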
Furthermore, Flow's built-in protocols may later expand to perpetual contracts, prediction markets, and other applications, bringing more user-friendly DeFi to mainstream consumers.

To achieve sustainable value capture, Flow has also upgraded its token toward a deflationary model. The Flow Foundation's FLIP-351 proposal links network usage directly to network value: each transaction burns tokens, creating scarcity through network activity and thus supporting token value. When the network consistently operates at approximately 250 TPS, FLOW reaches net deflation. Even so, Flow's transaction costs remain lower than those of mainstream networks like Solana and Base. It is worth noting that the FLOW token currently trades more than 90% below its all-time high.

What gives Flow confidence in its cross-sector move into DeFi, and what stands in the way?

The DeFi market is in a phase of rapid growth and fierce competition. As the regulatory environment turns more favorable, leading protocols are using their first-mover advantage to entrench their positions, while traditional institutions with both compliance and funding advantages are accelerating their entry, continuously raising the barriers to entry. As one of the few crypto sectors with proven product-market fit (PMF), DeFi still has enormous growth potential. For Flow, attempting to transform from a consumer-grade layer-1 into DeFi infrastructure, this is both an opportunity for strategic restructuring and a challenging reboot.

As a newcomer to DeFi, Flow brings a certain degree of confidence to the move. For one, Flow is not starting from scratch; its experience in NFTs gives it a distinctive starting line. With the phenomenal application NBA Top Shot, Flow amassed a large user base, and although its popularity has fallen sharply from its peak, the accumulated traffic remains substantial. According to official data, Flow has over 41 million total accounts and over 1.1 million monthly active users. Meanwhile, DeFiLlama data shows Flow's TVL at $107 million as of December 3, up 187.1% since the beginning of the year.

Flow also holds technological advantages, having been designed specifically for large-scale consumer applications: its low-barrier, low-cost, high-throughput on-chain environment naturally suits DeFi's high-frequency trading needs. In October of this year, Flow shipped two key upgrades, Forte and Crescendo, aimed at scalability, deeper DeFi innovation, and cross-chain interoperability, further underpinning the ecosystem transformation. Forte's core goal is to eliminate reliance on off-chain bots or centralized custody services for complex on-chain financial logic: all automation (limit orders, dynamic interest rates, strategy vaults, and so on) runs securely on-chain, making it easier for developers to build complex financial applications. Crescendo brings Flow Ethereum Virtual Machine (EVM) equivalence, enabling seamless interoperability with Ethereum-based applications and protocols. Flow claims to be one of the few blockchains capable of supporting millions of daily active users (DAU) without high or unpredictable gas fees.

However, Flow's transformation still faces considerable challenges. First, every new public chain faces a liquidity cold start. Although Flow has a significant user base, it consists mainly of NFT users, most of whom have already left the market; whether they can be re-attracted and converted into DeFi users remains highly uncertain. Second, the ecosystems of leading public chains are already rich and well-defended; Flow must quickly attract high-quality developers and ship innovative, market-recognized applications to build a sustainable, self-reinforcing ecosystem. More importantly, Flow has long been pigeonholed by the market as an NFT chain; to break that stereotype, it must produce a successful DeFi application that proves its suitability for finance.

Overall, the technical architecture and user base add a measure of certainty to Flow's "re-entrepreneurial" endeavor. But the success of the transformation hinges on Flow's ability to activate dormant NFT users with a compelling DeFi narrative and break down liquidity barriers.

Author: PANews
Oracle’s bond-market stress climbs to a 16-year high

Oracle just landed at the center of the storm of AI debt fears. On Tuesday in New York, a key credit-risk gauge tied to Oracle's debt closed at its highest point since the global financial crisis. The cost to protect Oracle debt against default climbed to about 1.28 percentage points per year, the highest since March 2009, based on end-of-day credit derivative pricing from ICE Data Services. That reading jumped almost 0.03 percentage point in a single day and has now more than tripled from 0.36 percentage point in June.

The spike followed a heavy wave of bond sales across the tech sector, with Oracle standing out for both its volume of issuance and its weaker credit rating versus rival cloud giants. The company has issued tens of billions of dollars in bonds in recent months, through its own note sales and through large projects it is backing. That combination has turned Oracle's credit default swaps into a frontline hedge for investors positioning for a possible AI market crash.

Debt sales surge and CDS trading explodes

The rising price of default protection tracks growing fear over the wide gap between how much cash has already been poured into AI and when real gains in productivity and profits will actually show up. Hans Mikkelsen, a strategist at TD Securities, said the current surge carries echoes of past market manias. "We've had these kinds of cycles before," he said in an interview. "I can't prove that it's the same, but it seems like what we've seen, for example, during the dot-com bubble."

Morgan Stanley raised fresh alarms in late November, warning that Oracle's growing debt pile could push its credit default swaps closer to 2 percentage points, just above the company's 2008 record high. Tuesday's reading marked the highest close since March 2009,…
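For a sense of scale, here is what those spread figures mean in dollar terms under the standard CDS convention (the spread is the annual cost of protection as a fraction of the notional insured). The $10 million notional below is an arbitrary example, not a figure from the article.

```python
# Annual cost of insuring a notional amount of Oracle debt at the CDS
# spreads cited above. Notional is an illustrative example figure.
notional = 10_000_000      # $10M of bonds to protect (example)
spread_june = 0.0036       # 0.36 percentage point in June
spread_now = 0.0128        # 1.28 percentage points on Tuesday

cost_june = notional * spread_june   # $36,000 per year
cost_now = notional * spread_now     # $128,000 per year
print(f"annual protection cost: ${cost_june:,.0f} -> ${cost_now:,.0f}")
```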

Author: BitcoinEthereumNews
Quant crypto price forecast as exchange supply crashes

Quant's crypto price has staged a strong recovery in the past few weeks, soaring from a low of $69.12 on November 21 to $95 today. So, will the QNT token continue rising as whales buy and exchange reserves dip?

Quant crypto price has strong fundamentals

Third-party data shows that the QNT network is doing well as whales buy and supply on exchanges dips. According to Nansen, the top 100 addresses have boosted their holdings to 17.34 million tokens, up from 16.19 million in June this year, a sign that the biggest investors believe it has more upside over time.

Most importantly, the amount of QNT held on exchanges has been in a strong downward trend in the past few months, confirming strong demand. Nansen data shows exchange balances dropped to 3.06 million, the lowest level this year, down from 3.5 million in June, a sign that the trend is continuing. On X, analyst Tokenicer made the same point, noting that "QNT withdrawals across spot CEXs are at the highest levels we've seen in 2025."

[Chart: Quant exchange balances | Source: Nansen]

Falling exchange balances are an important signal in crypto analysis, as they show that investors are withdrawing and accumulating tokens rather than positioning to sell.

Why QNT demand is rising

There are several reasons for this trend. First, the Quant network has made several major deals this year. Its most significant is with Oracle, which is using Quant's technology to build the Oracle Blockchain Platform Digital Assets Edition (OBP DA). OBP DA is a financial solution designed to help companies streamline tokenization, unify ledgers, and enable cross-ledger orchestration for digital assets; in other words, it creates a unified ledger framework for digital assets, with Quant providing the interoperability and cross-ledger orchestration capabilities.

Most recently, the developers launched Quant Fusion, an interoperability and liquidity solution designed to unify fragmented digital asset markets by connecting public and private blockchains. Its goal is to let institutions trade and move assets seamlessly across multiple networks; it eliminates the need for wrapped tokens and introduces the concept of a "layer 2.5." Analysts believe that Quant Fusion transactions, Fusion gateways, and Overledger licensing lockups will help boost demand for the QNT token.

Meanwhile, Quant hopes to benefit from the ongoing growth of the real-world asset (RWA) tokenization industry, which analysts believe is in its infancy. RWA assets under management have grown to over $35 billion, and analysts expect the figure to reach the trillions within the next decade. Quant plays a role through its Overledger product, which facilitates cross-chain communication across multiple chains and off-chain platforms and is often seen as a more advanced product than Chainlink's CCIP.

Quant price technical analysis

[Chart: QNT price chart | Source: TradingView]

The daily timeframe chart shows that the Quant token price has rebounded in the past few weeks, moving from the November low of $69.12 to the current $96. QNT formed a double-bottom pattern at $69.12 with a neckline at $97.50; a double bottom is one of the most common bullish reversal signs in technical analysis.

The token has moved above the 50-day and 200-day Exponential Moving Averages (EMA) and is hovering at the major S/R pivot of the Murrey Math Lines tool. There are signs that the ongoing pullback will be temporary. If so, the token will likely rebound in the coming weeks and possibly retest the ultimate resistance of the Murrey Math Lines tool at $125, about 30% above the current level.
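As a quick arithmetic check on that target: the standard measured-move heuristic for a double bottom projects the neckline plus the pattern's depth, and with the article's own levels it lands almost exactly on the $125 resistance cited.

```python
# Measured-move target for a double bottom: neckline + (neckline - bottom).
bottom = 69.12      # November low, the double bottom
neckline = 97.50    # pattern neckline
target = neckline + (neckline - bottom)
print(f"measured-move target: ${target:.2f}")   # $125.88, near the $125 level
```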

Author: Coinstats