IBM Breakthroughs: Quantum Error Correction and Digital Asset Security
By Cam Sivesind
Tue | Oct 28, 2025 | 4:26 PM PDT

IBM delivered a one-two punch of major announcements today, reinforcing its position at the intersection of future computing and modern enterprise security. The news spans two seemingly disparate fields: the esoteric challenge of quantum error correction and the immediate, practical need for digital asset security in regulated industries.

For cybersecurity professionals, these developments are a masterclass in horizon scanning. One signals the acceleration of a future threat (a cryptographically relevant quantum computer, and with it the need for post-quantum cryptography), and the other addresses a current, high-value risk (securing blockchain and cryptocurrency assets).

The most scientifically significant news is the breakthrough published by an IBM research team, as reported by Reuters and detailed in the team's arXiv paper. The researchers successfully ran a real-time quantum error correction (QEC) decoding algorithm on a conventional AMD FPGA (Field-Programmable Gate Array).

This is a massive leap toward creating reliable, scalable quantum computers. The biggest obstacle to quantum computing has been decoherence: qubits are fragile and error-prone, and errors accumulate so rapidly that long calculations become useless.

  • Solving the noise problem: QEC is the necessary mathematical and engineering discipline to detect and fix these errors. By demonstrating this process on a conventional AMD chip, IBM shows that the control and correction systems for quantum processors can be integrated with existing, scalable classical hardware.

  • The hybrid future: As AMD's Mark Papermaster noted on LinkedIn, this collaboration is a step toward an integrated, hybrid quantum-classical computing future. The quantum processor performs the complex calculation, while the classical processor (the AMD FPGA) handles the real-time error correction and control logic.
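
To make that division of labor concrete, here is a rough, purely illustrative sketch of the feedback loop in Python. None of this reflects IBM's actual implementation; the "quantum" side is a stand-in noise simulator and the "classical" side is a trivial decoder, but the shape of the loop (measure a syndrome, decode it, feed the correction back before the next round) is the part the FPGA work makes practical in real time.

```python
# Conceptual sketch of the hybrid quantum-classical QEC control loop.
# All names are illustrative; IBM's real decoder (Relay-BP on an AMD FPGA)
# is far more sophisticated and runs under nanosecond latency budgets.
import random

def measure_syndrome(n_checks: int, p_error: float) -> list[int]:
    """Stand-in for the quantum processor: one round of stabilizer
    (parity-check) measurement outcomes, with random check flips as noise."""
    return [1 if random.random() < p_error else 0 for _ in range(n_checks)]

def decode(syndrome: list[int]) -> list[int]:
    """Stand-in for the classical decoder running on the FPGA: maps a
    syndrome to a proposed correction. A real decoder runs belief
    propagation over the code's Tanner graph."""
    return syndrome  # toy rule: correct wherever a check fired

def qec_cycle(rounds: int = 12, n_checks: int = 8, p_error: float = 1e-3) -> None:
    """The tight feedback loop: measure, decode, apply the correction."""
    for t in range(rounds):
        syndrome = measure_syndrome(n_checks, p_error)
        correction = decode(syndrome)
        # In hardware, `correction` is fed back to the control electronics
        # before the next syndrome round completes.
        print(f"round {t}: syndrome={syndrome} correction={correction}")

if __name__ == "__main__":
    qec_cycle()
```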

Papermaster, CTO and EVP at AMD, wrote: "We are very excited to be part of IBM's breakthrough in error correction for Quantum computing. Real-time quantum error correction on AMD VU19P FPGAs shows how tight control loops and efficient algorithms drove an impressive step toward fault-tolerant quantum computing." 

The academic reaction, particularly within the physics and computer science communities, is one of validation. The paper, accessible via the arXiv link, demonstrates a practical method for running QEC protocols, specifically a real-time decoder for IBM's qLDPC "gross" code. This shifts the QEC conversation from purely theoretical models to demonstrable, scalable engineering implementations. It confirms that the path to a fault-tolerant quantum computer lies in a tight feedback loop between the fragile quantum core and robust classical control systems.

For CISOs, this news is the sound of a countdown timer accelerating:

  • PQC urgency: QEC breakthroughs bring the reality of a cryptographically relevant quantum computer (CRQC) closer. A CRQC would be able to break current public-key encryption (RSA and ECC), which means data protected by those algorithms today is already exposed to harvest-now, decrypt-later attacks.

  • Actionable mandate: This is a definitive signal to accelerate the post-quantum cryptography (PQC) migration roadmap. CISOs must finalize their cryptographic inventories, prioritize systems using vulnerable algorithms, and prepare for the deployment of NIST-standardized PQC algorithms. The window to prepare for this "quantum threat" (as referenced in reports like the Nokia Threat Intelligence Report 2025) is closing.
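
As a concrete starting point for that inventory work, the sketch below shows one narrow slice of it: scanning a directory of PEM certificates and flagging quantum-vulnerable key types with Python's cryptography library. The directory path is hypothetical, and a real inventory would also have to cover TLS endpoints, SSH and code-signing keys, VPNs, and embedded libraries.

```python
# Minimal sketch of one step in a cryptographic inventory: scan a folder
# of PEM certificates and flag keys that a CRQC could eventually break.
# The "./certs" path is illustrative only.
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

def classify(cert: x509.Certificate) -> str:
    pub = cert.public_key()
    if isinstance(pub, rsa.RSAPublicKey):
        return f"RSA-{pub.key_size} (quantum-vulnerable)"
    if isinstance(pub, ec.EllipticCurvePublicKey):
        return f"ECC/{pub.curve.name} (quantum-vulnerable)"
    return type(pub).__name__  # other key types, including future PQC certs

def inventory(cert_dir: str = "./certs") -> None:
    for path in sorted(Path(cert_dir).glob("*.pem")):
        cert = x509.load_pem_x509_certificate(path.read_bytes())
        print(f"{path.name}: {cert.subject.rfc4514_string()} -> {classify(cert)}")

if __name__ == "__main__":
    inventory()
```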

[RELATED: NIST Unveils Groundbreaking Post-Quantum Cryptography Standards]

"This development is concerning, as it showcases a path towards relevancy by addressing the stubbornly persistent engineering challenge of error-handling for quantum computing. The critical difference here is the potential solution utilizes FPGA chips that are largely available and do not require new fabrication processes," said Philip George, Executive Technical Strategist at Merlin Cyber.

George continued, "If this method proves to be viable, we could be looking at an even closer deadline for cryptographically relevant quantum computers, which are available for both nation-states and commercial usage alike. This means both the government and industry could be, in essence, out of time to make meaningful progress towards adopting the new quantum safe standards, FIPS 203, 204, and 205, respectively."

"Organizations should move their plans to execute a comprehensive automated cryptographic inventory forward today, while ensuring the subsequent remediation and migration follows in short order," George said. "Due to the nature of regular technological change and improvement, delaying remediation and migration actions after the completion of an automated inventory lessens the accuracy and overall effectiveness of remediation efforts; thus, it is critical to keep both actions closely paired with one another."

Here's what Jay Gambetta, Director of IBM Research and IBM Fellow, had to say about his company's latest quantum news on LinkedIn:

"A few months ago, we shared with you our progress on developing novel decoding algorithms for qLDPC codes. That effort resulted in the Relay-BP algorithm, which surpassed prior state-of-the-art qLDPC decoders in terms of logical error rate while simultaneously removing barriers toward real-time implementation. In particular, we showed that a novel variation of the belief propagation (BP) algorithm was sufficient for accurate decoding of our gross code without the need of an expensive second-stage decoder to fix cases where BP failed to converge.

I'm excited to tell you about some of the progress we've made on taking the first steps towards implementing a real-time decoder in hardware. Our initial effort has focused on FPGAs because they are very flexible and allow for very low-latency integration into our quantum control system.

FPGAs' flexibility in supporting custom logic and user-defined numerical formats allowed us to evaluate the performance of Relay-BP across a range of floating-point, fixed-point, and integer precisions. Encouragingly, we observe a high tolerance to reduced precision. Our experiments show that even 6-bit arithmetic is sufficient to maintain decoding performance.

We explored the speed limits of an FPGA Relay-BP implementation in a maximally-parallel computational architecture. Like traditional BP, the Relay-BP algorithm is a message-passing algorithm where messages are exchanged between nodes on a decoding graph. Our maximally parallel implementation assigns a unique compute resource to every node in this graph, allowing a full BP iteration to be computed on every clock cycle.

This decoder architecture is resource-intensive, but we succeeded in building a Relay-BP decoder for the gross code and fit it within a single AMD VU19P FPGA. Our implementation is limited to split X/Z decoding of the gross code syndrome cycle (we decode windows of 12 cycles), a simpler implementation than we'd need for Starling. That being said, it is extremely fast, an absolute requirement for practical implementation.

In fact, we can execute a Relay-BP iteration in 24ns. As physical error rates drop below 1e-3, Relay-BP typically converges in less than 20 iterations. This means we can complete the decoding task in about 480ns. This is significantly faster than what is possible with NVIDIA's DGX-Quantum solution, which requires a 4000ns start-up cost before decoding begins."
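
Relay-BP itself is not spelled out here in implementable detail, so the toy below is only a rough illustration of the message-passing structure Gambetta describes: every check node and every bit node can be updated independently in each iteration, which is the per-node parallelism the FPGA exploits to finish a full iteration per clock cycle (24 ns per iteration, or roughly 480 ns for 20 iterations). The sketch uses a hard-decision bit-flipping decoder, a much simpler cousin of belief propagation, on a tiny made-up parity-check code rather than IBM's gross code.

```python
import numpy as np

# Toy parity-check matrix (3 checks x 6 bits); not IBM's gross code.
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
], dtype=np.uint8)

def bit_flip_decode(received: np.ndarray, max_iters: int = 20) -> np.ndarray:
    """Hard-decision bit-flipping: in each iteration every check node and
    every bit node updates independently, the same per-node parallelism
    that lets an FPGA complete one full iteration per clock cycle."""
    word = received.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2            # all check nodes evaluated "in parallel"
        if not syndrome.any():
            break                          # decoder has converged
        unsatisfied = H.T @ syndrome       # per-bit count of failed checks
        word ^= (unsatisfied == unsatisfied.max()).astype(np.uint8)
    return word

codeword = np.zeros(6, dtype=np.uint8)     # all-zero codeword for simplicity
noisy = codeword.copy()
noisy[2] ^= 1                              # inject a single bit error
print(bit_flip_decode(noisy))              # error removed: [0 0 0 0 0 0]
```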

New platform for digital asset security

In parallel to the quantum news, IBM announced a "New Platform for Financial Institutions and Regulated Enterprises Entering the Digital Asset Economy" (per a PR Newswire release).

This move is IBM's strategic response to the increasing institutional interest in blockchain technology, cryptocurrencies, and tokenized assets. Regulated entities—banks, insurance companies, and asset managers—need platforms that meet compliance, security, and governance standards before they can participate in the digital asset market.

The platform's focus is on providing:

  • Security and compliance: Ensuring that digital asset transactions meet the stringent regulatory requirements of global financial bodies.

  • Scalability and resilience: Offering enterprise-grade performance and recovery capabilities that consumer-grade crypto platforms often lack.

The platform's existence means that CISOs in finance and other regulated sectors will soon face the direct security challenge of digital assets:

  • Custody and key management: The biggest security risk in digital assets is the loss or theft of the private keys that control the assets. CISOs must mandate the use of multi-signature schemes, hardware security modules (HSMs), and rigorous key management policies to protect these high-value cryptographic keys (see the sketch after this list).

  • Regulatory scrutiny: Security controls surrounding this platform will be subject to intense regulatory scrutiny. Teams must be prepared to demonstrate auditable proof of control over access, transaction integrity, and compliance with anti-money laundering (AML) and know-your-customer (KYC) requirements.

  • Smart contract auditing: Any interaction with decentralized finance (DeFi) or tokenized assets requires rigorous security auditing of the underlying smart contracts to prevent costly logical flaws and exploits.
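
To ground the custody point above, here is a minimal, hypothetical sketch of an m-of-n approval check for a digital asset transaction, using Ed25519 keys from Python's cryptography library. Real custody platforms perform this inside HSMs or MPC services and bind it to on-chain multi-signature or policy engines; this only illustrates the quorum logic.

```python
# Hypothetical 2-of-3 approval check over a transaction payload.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def quorum_met(tx: bytes, signatures: list, approvers: list, threshold: int) -> bool:
    """Return True if at least `threshold` approvers produced a valid
    signature over the transaction payload (None = approver did not sign)."""
    valid = 0
    for pub, sig in zip(approvers, signatures):
        if sig is None:
            continue
        try:
            pub.verify(sig, tx)            # raises InvalidSignature on failure
            valid += 1
        except InvalidSignature:
            pass                           # ignore bad or forged signatures
    return valid >= threshold

# Example policy: 2-of-3 signing officers, and only two of them sign.
keys = [Ed25519PrivateKey.generate() for _ in range(3)]
tx = b'{"transfer": "1000 USDC", "dest": "0xABC..."}'
sigs = [keys[0].sign(tx), None, keys[2].sign(tx)]
print(quorum_met(tx, sigs, [k.public_key() for k in keys], threshold=2))  # True
```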

We asked a few more cybersecurity vendor SMEs for their thoughts on the quantum breakthrough news.

Jason Soroko, Senior Fellow at Sectigo, said:

  • "Running a real-time quantum error handling loop on off the shelf AMD FPGAs signals that the classical control stack for quantum systems is maturing and getting cheaper. That lowers barriers for scale and reproducibility, pulls these systems closer to regular data center practices, and spreads the hardware supply chain across widely used components. It is encouraging for short term progress toward more stable qubits and for IBM hitting its roadmap, yet it does not change the near term risk picture for breaking today’s encryption."

  • "The security story, however, shifts quickly once control moves to commodity gear, because the attack surface grows beyond bespoke electronics and into firmware, drivers, orchestration software, and the physical interfaces that bind racks to cryogenic devices."

  • "Which protections secure the FPGA bitstream and configuration flow, including secure boot, code signing, and JTAG lockdown?"

  • "How is the supply chain for boards, IP cores, and toolchains vetted and monitored for tampering?"

  • "What network boundaries isolate the quantum control plane from user workloads and from vendor remote access?"

  • "How is time synchronization enforced and audited so that an attacker cannot inject timing jitter into feedback loops?"

  • "What telemetry is collected to detect drift, fault injection, or abnormal calibration patterns and how long is it retained?"

  • "Are side channel risks from power, RF emissions, or temperature sensors assessed and mitigated in shared facilities?"

  • "Do the systems use post quantum cryptography for management traffic and for data at rest, and how is key material protected?"

  • "What is the patch and rollback plan for FPGA firmware, drivers, and orchestration services that must meet strict latency guarantees?"

  • "How are physical ports and maintenance modes controlled during install and service windows?"

  • "What incident scenarios have been tabletop tested, including denial of service on real time loops, spurious pulse generation, and compromise of vendor tooling?"

Dr. Adam Everspaugh, Cryptography Expert at Keeper Security, said:

  • "This latest quantum breakthrough marks another milestone in a race to fundamentally upend computer security as we know it. Quantum computers will render the public-key encryption that currently safeguards personal data, financial transactions, healthcare systems, cloud platforms, government operations, and critical infrastructure obsolete once they reach sufficient scale."

  • "The immediate concern isn't what quantum systems can do today, but what they will be capable of in the near future—a scenario that cybercriminals are actively preparing for. Sensitive information stolen today will be exposed and weaponized years from now if organizations fail to prepare."

  • "Transitioning to quantum-resistant cryptography is not a theoretical exercise, it's a strategic imperative. Governments and regulatory bodies are increasingly recognizing the urgency of the threat. Across industries, organizations are being directed to audit their cryptographic assets, plan migration paths, and implement crypto-agility frameworks that enable rapid adaptation to the new standards."

Tim Mackey, Head of Software Supply Chain Risk Strategy at Black Duck, said:

  • "The promise of quantum computing to decrypt harvested data may become a reality, however, the value that an attacker might get from older harvested data is only justifiable for the most valuable and targeted data. This is one reason why various governments have quantum resilient efforts underway rather than 'quantum-proof' solutions."

  • "Since we are talking about a future state for cryptographic capabilities in applications, performing a risk assessment focused on cryptographic usage within an application should be a priority for any organization working with the most sensitive of PII. At a minimum, that risk assessment should focus on what the impact to the system might be if weak encryption were used. Such an assessment would then become a gap analysis covering where sensitive data isn't being properly managed and help identify where quantum resilient approaches to system design and deployment should be employed."

Casey Ellis, Founder at Bugcrowd, said:

  • "Quantum will force organizations to embrace cryptoagility—essentially, the ability to swap out cryptographic algorithms quickly and efficiently. Humans write algorithms and software, and just as cryptographic algorithms seen as unbreakable for 30 years have since been found to be flawed, it's reasonable to assume that this trend will exist in QRC algorithms, as well. This isn't just a quantum problem, it's a broader resilience strategy. The shift to post-quantum cryptography (PQC) will highlight the importance of flexible, automated cryptographic management systems."

  • "The biggest hurdles are awareness, cost, and complexity. Many organizations underestimate the threat or lack the resources to inventory and update their cryptographic infrastructure. Standards bodies like NIST are making progress with PQC algorithms, but adoption will require significant investment and coordination."

  • "In the short term, quantum readiness builds trust with customers and partners. Medium-term, it reduces the risk of catastrophic breaches. Long-term, it ensures operational continuity in a post-quantum world. The cost of inaction far outweighs the investment in preparation."
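
Ellis's call for crypto-agility is, at bottom, an architectural pattern: application code signs and verifies through a generic interface, and the concrete algorithm is selected by configuration so it can be swapped when standards change. The hypothetical sketch below shows that pattern with two classical algorithms from Python's cryptography library standing in; a NIST PQC signer (e.g., ML-DSA per FIPS 204) would register the same way once library support is in place. The class and registry names are invented for illustration.

```python
# Crypto-agility sketch: callers depend on the Signer interface, never on
# a specific algorithm, so swapping schemes is a configuration change.
from abc import ABC, abstractmethod
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

class Signer(ABC):
    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...

class Ed25519Signer(Signer):
    def __init__(self):
        self._key = Ed25519PrivateKey.generate()
    def sign(self, message: bytes) -> bytes:
        return self._key.sign(message)

class RsaPssSigner(Signer):
    def __init__(self):
        self._key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    def sign(self, message: bytes) -> bytes:
        return self._key.sign(
            message,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )

# Algorithm choice lives in configuration; a PQC signer registers here later.
REGISTRY = {"ed25519": Ed25519Signer, "rsa-pss": RsaPssSigner}

signer = REGISTRY["ed25519"]()   # switch entries without touching callers
print(len(signer.sign(b"invoice-42")))
```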
