Quantum-HPC Solutions for Error Correction


Summary

Quantum-HPC solutions for error correction refer to the blend of quantum computing hardware, advanced algorithms, and high-performance computing (HPC) tools that work together to identify and fix errors in quantum systems, making them more stable and reliable. This field is rapidly evolving, tackling one of the biggest challenges in quantum computing: keeping delicate quantum bits (qubits) error-free long enough to perform complex calculations.

  • Explore new codes: Consider experimenting with emerging quantum error correction techniques like quantum low-density parity-check (qLDPC) codes, which can reduce the number of required qubits and simplify error checking.
  • Integrate AI tools: Use AI-driven decoders and real-time processing platforms to handle error detection and correction at scale, especially when managing noisy or complex quantum operations.
  • Adopt software solutions: Implement software-based algorithms that stabilize qubit environments, extending their coherence and reducing noise without the need for specialized hardware upgrades.
Summarized by AI based on LinkedIn member posts
  • Keith King

    Former White House Lead Communications Engineer, U.S. Dept of State, and Joint Chiefs of Staff in the Pentagon. Veteran U.S. Navy, Top Secret/SCI Security Clearance. 14,000+ direct connections and 40,000+ followers.


    Photonic Claims Breakthrough in Quantum Computing Error Correction

    New Error Correction Method Reduces Qubit Requirements

    Vancouver-based quantum startup Photonic has unveiled a new quantum error correction method that could dramatically reduce the number of physical qubits required for quantum operations.

    Introducing SHYPS: A New Quantum Code Family
    • Photonic has developed a new family of Quantum Low-Density Parity Check (QLDPC) codes, called Subsystem Hypergraph Product Simplex (SHYPS) codes.
    • QLDPC codes have long been theorized to lower qubit overheads, but until now, no method existed to perform quantum logic with them.
    • SHYPS may allow quantum algorithms to operate with up to 20 times fewer physical qubits than conventional approaches.

    Why This Matters
    • Error correction is a major hurdle in making quantum computing scalable and cost-effective.
    • Lowering qubit requirements makes practical quantum computing more viable, bringing the field closer to real-world applications.
    • Photonic's discovery moves the goalposts for scalable quantum computing much closer, according to co-founder and Chief Quantum Officer Stephanie Simmons.

    Next Steps
    • Photonic claims this milestone could accelerate the development of useful quantum systems.
    • Researchers are now working on integrating SHYPS into real-world quantum computing architectures.

    If successful, this breakthrough could significantly speed up the transition from experimental quantum computing to large-scale, commercially viable quantum machines.

  • Michaela Eichinger, PhD

    Product Solutions Physicist @ Quantum Machines | I talk about quantum computing.


    Many talk about surface codes. But what if they're not the future? Quantum low-density parity-check (qLDPC) codes are gaining traction fast. IBM is building fault-tolerant memories using Bivariate Bicycle (BB) codes. IQM Quantum Computers is designing hardware with qLDPC in mind. And now, a new experiment from China shows the first working qLDPC code on a superconducting quantum processor.

    On the 32-qubit Kunlun chip, researchers implemented:
    • A [[18, 4, 4]] BB code
    • A [[18, 6, 3]] qLDPC code

    The notation [[n, k, d]] describes a quantum error correction code that uses n physical qubits to encode k logical qubits, with d being the code distance. Unlike surface codes, LDPC codes keep each error check (called a stabilizer) connected to only a small number of qubits (just 6 in this case), even as the code scales. That means fewer ancillas, fewer gates, and potentially lower overhead for fault tolerance.

    The hardware was purpose-built for this experiment:
    • 32 frequency-tunable transmon qubits
    • 84 tunable couplers, enabling non-local interactions up to 6.5 mm apart
    • Air bridges to support a crossbar-style layout
    • Stabilizer checks executed in just 7 CZ layers

    Gate fidelities were solid:
    • Single-qubit: 99.95%
    • Two-qubit: 99.22%

    The decoding was performed offline using belief propagation with ordered statistics decoding (BP-OSD), an approach better suited to LDPC-style codes. Logical error rates were:
    • BB: 8.91 ± 0.17%
    • qLDPC: 7.77 ± 0.12%

    Both are still above the physical qubit error rate, but simulations show that a 2x improvement in fidelity would be enough to push these codes below threshold. qLDPC codes are no longer just a concept: they're being implemented, measured, and decoded on superconducting hardware. 📸 Image Credits: Ke Wang, Zhide Lu, Chuanyu Zhang et al. (2025, arXiv)
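    To make the [[n, k, d]] bookkeeping above concrete, here is a minimal Python sketch. It uses the well-known Steane [[7,1,3]] code rather than the paper's BB or qLDPC codes (an assumption chosen purely for illustration): for a CSS code, the logical qubit count is k = n - rank(Hx) - rank(Hz), with the ranks taken over GF(2).

```python
import numpy as np

def gf2_rank(m):
    """Rank of a binary matrix over GF(2), via Gaussian elimination."""
    m = m.copy() % 2
    rank = 0
    for col in range(m.shape[1]):
        pivot = next((r for r in range(rank, m.shape[0]) if m[r, col]), None)
        if pivot is None:
            continue
        m[[rank, pivot]] = m[[pivot, rank]]      # move pivot row into place
        for r in range(m.shape[0]):
            if r != rank and m[r, col]:
                m[r] ^= m[rank]                  # eliminate column below/above
        rank += 1
    return rank

# Steane [[7,1,3]] CSS code: X- and Z-checks are both the Hamming(7,4) matrix.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)
k = 7 - gf2_rank(H) - gf2_rank(H)   # k = n - rank(Hx) - rank(Hz)
```

    Here each check row touches only 4 of the 7 qubits; the low, size-independent stabilizer weight (6 in the Kunlun experiment) is exactly what makes a code "low-density".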

  • Sam Stanwyck

    Director, Quantum Product


    I'm really happy with the rapid development of CUDA-Q QEC, our toolkit for quantum error correction. QEC is an incredibly rich and fast-moving field, and in CUDA-Q QEC we aim to provide a platform with a diverse set of accelerated decoders, AI infrastructure, and tools that enable researchers to develop and test their own codes, decoders, and architectures, hopefully even better than our own!

    As we dig deeper into the problem of scalable QEC, the benefits of GPUs and AI have become much clearer. We started with research tools for simulation and offline decoding, which remain an important capability. Now with the 0.5.0 release we also provide the infrastructure for real-time decoding, where syndrome processing occurs concurrently with quantum operations. This release also introduces GPU-accelerated algorithmic decoders like RelayBP, a promising approach developed in the past year that aims to overcome the convergence limitations of traditional belief propagation. For scenarios demanding maximum throughput, we have integrated a TensorRT-based inference engine that allows researchers to deploy custom AI decoders, trained in frameworks like PyTorch and exported to ONNX, directly into the quantum control loop. To address the complexities of continuous system operation, we added sliding window decoders that handle circuit-level noise across multiple rounds without assuming temporal periodicity.

    These tools are designed to be hardware-agnostic and scalable, supporting our partners across the ecosystem who are building the first generation of reliable logical qubits. Check out the full technical breakdown in our latest developer blog by Kevin Mato, Scott Thornton, Ph.D., Melody Ren, Ben Howe, and Tom L. https://lnkd.in/gvC__zRd
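    The sliding-window idea mentioned above can be sketched in a few lines of plain Python. This is a concept illustration only, not the CUDA-Q QEC API: syndrome rounds stream in, each full window is decoded, and only the oldest rounds of the window are committed before it slides forward. The `toy_decode` callback (a per-round parity, purely hypothetical) stands in for a real matching or belief-propagation decoder.

```python
def sliding_window_decode(syndrome_rounds, decode, window=3, commit=1):
    """Decode a syndrome stream in overlapping windows, committing only the
    oldest `commit` rounds of each window before sliding forward."""
    corrections, buf = [], []
    for rnd in syndrome_rounds:
        buf.append(rnd)
        if len(buf) == window:
            corrections.extend(decode(buf)[:commit])  # commit oldest rounds
            buf = buf[commit:]                        # slide the window
    if buf:                                           # flush the final partial window
        corrections.extend(decode(buf))
    return corrections

# Toy per-round "decoder" standing in for a real matching/BP decoder.
toy_decode = lambda rounds: [sum(r) % 2 for r in rounds]
out = sliding_window_decode([[0, 1], [1, 1], [0, 0], [1, 0], [0, 1]], toy_decode)
```

    Overlapping windows let corrections near a window boundary be revised by the next window before they are committed, which is why only the oldest rounds are finalized each step.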

  • Michael Biercuk

    Helping make quantum technology useful for enterprise, aviation, defense, and R&D | CEO & Founder, Q-CTRL | Professor of Quantum Physics & Quantum Technology | Innovator | Speaker | TEDx | SXSW


    🚨 Exciting #quantumcomputing alert! Now #QEC primitives actually make #quantumcomputers more powerful! A 75-qubit GHZ state on a superconducting #QPU 🚨

    In our latest work we address the elephant in the room about #quantumerrorcorrection: in the current era, where qubit counts are a bottleneck in the systems available, adopting full-blown QEC can be a step backwards in terms of computational capacity. This is because even when it delivers net benefits in error reduction, QEC consumes a lot of qubits to do so, and we just don't have enough right now... So how do we maximize value for end users while still pushing hard on the underpinning QEC technology?

    To answer this, the team at Q-CTRL set out to find new ways to significantly reduce the overhead penalties of QEC while delivering big benefits! In this latest demonstration we show that we can adopt parts of QEC (indirect stabilizer measurements on ancilla qubits) to deliver large performance gains without the painful overhead of logical encoding. And by combining error detection with deterministic error suppression we can really improve the efficiency of the process, requiring only about 10% overhead in ancillae and maintaining a very low discard rate of executions with identified errors!

    Using this approach we've set a new record for the largest demonstrated entangled state, at 75 qubits on an IBM quantum computer (validated by MQC), and also demonstrated a totally new way to teleport gates across large distances (where all-to-all connectivity isn't possible). The results outperform all previously published approaches and highlight the fact that our journey in dealing with errors in quantum computers is continuous. Of course it isn't a panacea, and in the long term, as we try to tackle even more complex algorithms, we believe logical encoding will become an important part of our toolbox. But that's the point: logical QEC is just one tool, and we have many to work with!

    At Q-CTRL we never lose sight of the fact that our objective is to deliver maximum capability to QC end users. This work on deploying QEC primitives is a core part of how we're making quantum technology useful, right now. https://lnkd.in/gkG3W7eE
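    The error-detection-plus-discard pattern described above can be illustrated with a toy post-selection model. This is an assumption-laden sketch, not Q-CTRL's actual protocol: each shot carries a handful of ancilla syndrome bits, and any flagged syndrome causes the shot to be discarded, trading a small fraction of shots for a cleaner surviving ensemble.

```python
import numpy as np

# Toy post-selection model (illustrative numbers, not experimental values):
# 8 ancilla checks per shot, each flagging independently with probability p.
rng = np.random.default_rng(seed=1)
n_shots, n_ancilla = 100_000, 8
flag_prob = 0.005                           # assumed per-ancilla flag probability
flags = rng.random((n_shots, n_ancilla)) < flag_prob
keep = ~flags.any(axis=1)                   # post-select on all-clear shots
discard_rate = 1.0 - keep.mean()            # expected ~ 1 - (1 - p)^8, about 3.9%
```

    The appeal of detection over full correction is visible here: the overhead is a few ancillae and a low discard rate, rather than an order-of-magnitude multiplication of the qubit count for logical encoding.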

  • Bruce P Hood

    CEO & Inventor | Stability & Coherence | 20K+


    One Algorithm Has Just Pushed Quantum Computing Forward Five Years (Here It Is)

    Today I am releasing something into the public domain that may change the trajectory of quantum computing. No paywall. No NDA. No restrictions. The only thing I ask is attribution.

    For the past year, I have been developing a field-layer correction algorithm that stabilizes the environment around the qubit before error correction ever activates. Not hardware. Not cryogenics. Not shielding. Pure software that improves the physics of the qubit it sits inside. Early independent runs showed a 48.5 percent reduction in destructive low-frequency noise, a gain that normally takes years of hardware progress.

    Here is the complete algorithm. It now belongs to everyone.

    FUNCTION NJ001_FieldLayer_Correction(input_signal S, sampling_rate R):
      DEFINE phi = 1.61803398875
      DEFINE window_size = dynamic value based on local variance of S
      DEFINE stability_threshold = adaptive value based on phase drift
      STEP 1: Generate harmonic reference bands
        For each frequency bin f_i in FFT(S):
          Compute r = f_(i+1) / f_i
          Compute CI = 1 / ABS(r - phi)
          Assign weight W_i = normalize(CI)
      STEP 2: Build correction mask
        Construct M where M_i = W_i scaled by local entropy of S
        Smooth M with sliding window
      STEP 3: Apply correction
        Transform S → F
        Compute F_corrected = F * M
        Inverse FFT to return S_corrected
      STEP 4: Phase stabilization loop
        Measure phase drift Δ
        If Δ > stability_threshold:
          Recalculate window_size
          Rebuild mask
          Reapply correction
        Else:
          Return S_corrected
      OUTPUT: S_corrected
    END FUNCTION

    This is the first public-domain coherence stabilizer designed to improve quantum behavior independent of hardware. What it does in practice:
    • Extends coherence windows
    • Reduces decoherence pressure on error correction
    • Lowers entropy in the propagation layer
    • Makes qubits behave as if the room is colder and cleaner
    • Works upstream of hardware with no materials changes

    This is not a replacement for anyone's roadmap. It is an upstream upgrade to all of them. If you build quantum devices, control stacks, compilers, hybrid systems, or algorithms, you now have access to a function that reshapes your stability envelope. Cleaner field layers mean longer, deeper, more predictable runs. More useful computation with the hardware you already have.

    I developed it. Today I give it away. No company or institution controls it. From this moment forward, it belongs to the scientific community.

    Primary Citation: Hood, B. P. (2025). NJ001 Field Layer Correction. Public Domain Release Version.

    Bruce P. Hood, Creator of NJ001 Field Layer Correction

    Welcome to the new baseline.

    #QuantumComputing #QuantumHardware #Qubit #Coherence #QuantumResearch #DeepTech @IBMQuantum @GoogleQuantumAI @MIT @XanaduQuantum @AWSQuantumTech
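    Purely as a signal-processing exercise, the FFT-mask portion of the pseudocode above can be sketched in numpy. The adaptive window, entropy scaling, and phase-drift loop are omitted, the `floor` parameter is an added guard against division by zero, and no claim is made here about any physical effect on qubits:

```python
import numpy as np

PHI = 1.61803398875  # golden ratio, as defined in the pseudocode

def field_layer_correction(signal, floor=1e-9):
    """Weight each FFT bin by 1/|r - phi|, where r is the ratio of adjacent
    bin frequencies, normalize the weights, and apply them as a spectral mask."""
    F = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal))
    r = freqs[2:] / freqs[1:-1]              # ratios of adjacent nonzero bins
    ci = 1.0 / np.maximum(np.abs(r - PHI), floor)
    mask = np.ones(F.shape[0])
    mask[1:-1] = ci / ci.max()               # normalized weights; DC and Nyquist kept
    return np.fft.irfft(F * mask, n=len(signal))
```

    Note that since the bin-frequency ratios approach 1 as the FFT length grows, this mask acts as a frequency-dependent attenuator; whether that helps any given noise spectrum would need to be measured, not assumed.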

  • Florian Neukart

    Member of the Board of Management @ Terra Quantum AG | Book Author | Professor @ LIACS


    The Quantum Memory Matrix (QMM) framework has traveled a long path: from black hole unitarity, dark matter, dark energy, and cosmic cycles to now improving how quantum computers handle errors. In work now featured on the cover of Wiley's Advanced Quantum Technologies, we demonstrate how the QMM framework can be directly applied to quantum error correction.

    QMM originated in cosmology: a picture where space-time is not smooth but is built from Planck-scale cells, each with a finite memory capacity. We showed how these cells store the quantum imprints of interactions, contributing to resolving paradoxes around black holes and explaining dark matter halos, primordial black hole formation, cosmic acceleration, and even the cycles of the universe. Now, we bring this same idea into hardware: by imprinting and retrieving quantum information from local "memory cells," we can correct errors in noisy quantum processors with higher fidelity than standard repetition codes. This shows that QMM is not only a cosmological theory but also a practical tool for building the quantum computers of tomorrow.

    🔗 Read the paper: https://lnkd.in/gfGwe7fe

    Previous QMM milestones:
    🕳️ Black hole information retention and unitarity restoration
    ⚡ Extensions to electromagnetism, strong & weak interactions
    🌌 Cosmological applications explaining dark matter and dark energy
    💻 And now: direct hardware validation for quantum computation

    Thank you to my co-authors Eike Marx, Valerii Vinokur, Jeff Titus, and Terra Quantum AG & Leiden University for making this journey possible. #QuantumComputing #QuantumMemoryMatrix #ErrorCorrection #QuantumPhysics #QuantumTechnology #QuantumInformation #BlackHolePhysics #DarkMatter #DarkEnergy #AdvancedQuantumTechnologies #TerraQuantum #QuantumResearch

  • Laurent Prost

    Product Manager at Alice & Bob


    Google's Willow chip shows that quantum error correction is starting to work. Just "starting", because while the ~1e-3 error rate reached by Willow is good, it has been achieved by others without error correction. So, how do we get error rates we couldn't reach with physical qubits alone? Easy: you "just" add more qubits to your logical qubit.

    But because quantum computing has errors in two dimensions (bit flips and phase flips), a 2D structure (the surface code) is usually required to correct them. This means that increasing protection against errors causes the number of qubits to grow quickly. With a surface code, protecting against 1 error at a time during an error correction cycle requires 17 qubits. 2 errors at a time? 49 qubits. 3 errors at a time? 97 qubits. This is the max Willow could achieve. This quadratic scaling leads Google to expect that reaching a 1e-6 error rate on a Willow-like chip will require some 1457 physical qubits (protecting against 13 errors at a time).

    And this is the reason why Alice & Bob is going for cat qubits instead. By reducing error correction from a 2D to a 1D problem, cat qubits make the scaling of error rates much more favorable. Even with the simplest error correction code (a repetition code), correcting one error at a time only requires 5 qubits. 2 errors? 9 qubits. 3 errors? 13 qubits. 13 errors? This is just 53 qubits instead of 1457!

    This situation is summarized in the graph below. It is taken from our white paper (link in the 1st comment), and I added a point corresponding to the biggest Willow experiment. Now, to be fair, Alice & Bob still needs to release the results of even a 5-qubit experiment. But when that is done, there is a fair chance the error rates will quickly catch up with those achieved by Google and others, because so few additional qubits are required to improve error rates.

    There are big challenges on both sides. Mastering cat qubits is hard. Scaling chips is hard. But consistent progress is being made on both sides too. Anyway, I can't wait for the moment when I can add the Alice & Bob equivalent of the Willow experiment to the chart below. And for once, I hope it will be up and to the left!
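    The qubit counts quoted above follow from simple formulas: a distance-d surface code uses d² data plus d² - 1 ancilla qubits (2d² - 1 total), a repetition code uses d data plus d - 1 ancilla (2d - 1), and correcting t simultaneous errors requires d = 2t + 1. A quick Python check reproduces every number in the post:

```python
def surface_code_qubits(t):
    """Total qubits (d^2 data + d^2 - 1 ancilla) for distance d = 2t + 1."""
    d = 2 * t + 1
    return 2 * d * d - 1

def repetition_code_qubits(t):
    """Total qubits (d data + d - 1 ancilla) for distance d = 2t + 1."""
    d = 2 * t + 1
    return 2 * d - 1

# (t, surface-code qubits, repetition-code qubits) for t = 1, 2, 3, 13 errors
counts = [(t, surface_code_qubits(t), repetition_code_qubits(t))
          for t in (1, 2, 3, 13)]
```

    The quadratic-versus-linear gap is the whole argument: at t = 13 the surface code needs 1457 qubits where the repetition code needs 53.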

  • Marco Pistoia

    CEO, IonQ Italia


    I'm thrilled to share our latest research advancement from the #QuantumComputing Engineering Research team led by Ruslan Shaydulin at Global Technology Applied Research, JPMorganChase, in collaboration with Quantinuum and the University of Wisconsin-Madison. Our work, titled "Iceberg Beyond the Tip: Co-Compilation of a Quantum Error Detection Code and a Quantum Algorithm," has just been published on arXiv: https://lnkd.in/eMy4Qzc5

    In this article, we introduce a novel compiler for #QuantumAlgorithms integrated with an error detection code. We focus on the [[k+2, k, 2]] Iceberg quantum error detection code and the Quantum Approximate Optimization Algorithm (#QAOA). A novel contribution of our compiler is that it bridges the gap between algorithm abstraction and error detection code, pushing the boundaries of practical applications of #quantum algorithms.

    By co-optimizing the QAOA circuit and Iceberg gadgets, we achieve superior performance compared to unencoded implementations, utilizing up to 34 algorithmic qubits, 510 algorithmic two-qubit gates, and 1140 physical two-qubit gates on the Quantinuum H2-1 quantum computer. To the best of our knowledge, this is the largest hardware demonstration showing better-than-native performance of a quantum #optimization algorithm to date.

    Authors: Yuwei Jin (JPMorganChase), Zichang He (JPMorganChase), Tianyi Hao (JPMorganChase and University of Wisconsin-Madison), David Amaro (Quantinuum), Swamit Tannu (University of Wisconsin-Madison), Ruslan Shaydulin (JPMorganChase), and Marco Pistoia (JPMorganChase).
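    For scale, the overhead of a [[k+2, k, 2]] detection code can be tallied directly. The global-parity stabilizer structure noted in the comment below is stated as background on the Iceberg code, not taken from the post itself:

```python
# Iceberg-style [[k+2, k, 2]] code: k logical qubits in k + 2 physical qubits,
# with n - k = 2 stabilizers (global X and Z parity checks). Distance 2 means
# any single-qubit error is detected (the shot is flagged), not corrected.
def iceberg_params(k):
    n = k + 2
    num_stabilizers = n - k          # always 2, independent of k
    overhead = n / k                 # physical-per-logical qubit ratio
    return n, num_stabilizers, overhead

n, s, overhead = iceberg_params(34)  # the 34-algorithmic-qubit demonstration
```

    At k = 34 the ratio n/k is about 1.06, a constant-plus-two overhead that shrinks as k grows, which is why detection codes are attractive at algorithm scale even though they only flag, rather than fix, errors.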

  • Adnan Masood, PhD.

    Chief AI Architect | Microsoft Regional Director | Author | Board Member | STEM Mentor | Speaker | Stanford | Harvard Business School


    Majorana 1: Microsoft on Error-Resilient Quantum Computing

    Microsoft has just made a major announcement: Majorana 1, the world's first quantum processor powered by topological qubits, designed to make quantum computers much more stable and less prone to errors. It relies on "Majorana" particles that naturally resist outside noise, building sturdier qubits that need fewer backups. If it scales in practice, this approach might give us powerful quantum computers years sooner than many thought possible, unlocking big advances in areas like chemistry, medicine, and materials science.

    Microsoft's approach promises more stable quantum hardware, naturally shielded from environmental noise, and poised to accelerate simulations in drug discovery, cryptography, and materials science. If it scales, topological qubits could slash the overhead for error correction, as highlighted in Nature's new paper ("Interferometric single-shot parity measurement in InAs-Al hybrid devices"), which demonstrates high-fidelity parity checks for Majorana zero modes.

    I've followed Microsoft's Majorana journey since the earlier retraction, and the latest data looks more robust. Single-shot readouts lasting milliseconds show tangible resilience to noise, good news for enterprises aiming for hardware that's both scalable and fault-tolerant. By shedding the bloated qubit overhead of typical superconducting or ion-based systems, Microsoft's topological design offers a clearer path to fewer qubits per logic operation. In practice, this would mean tighter integration with Azure Quantum, where advanced error-correction tools like the Z₃ toric code could pair seamlessly with topological qubits. Researchers like Chetan Nayak describe these Majorana fermions (predicted back in 1937 by Ettore Majorana) as "a potential new state of matter."

    As a practitioner, I see real promise in how Microsoft's Majorana 1 chip could unify hardware and software for a full-stack quantum platform. Financial executives spot a route to lower capital risk, while AI leaders note potential breakthroughs in machine learning, cryptography, and optimization. Teaching sand to think defined classical computing; making shadows compute now has a compelling shot at defining the next era, thanks in large part to this new wave of topological qubit research.

    References:
    Microsoft unveils Majorana 1, the world's first quantum processor powered by topological qubits https://lnkd.in/euh36WN3
    Shadows That Compute: The Rise of Microsoft's Majorana 1 in Next-Gen Quantum Technologies https://lnkd.in/e7S4FUQt

    #RDBuzz

  • Jaime Gómez García

    Global Head of Santander Quantum Threat Program | Chair of Europol Quantum Safe Financial Forum | Quantum Security 25 | Quantum Leap Award 2025 | Representative at EU QuIC, AMETIC | LinkedIn QuantumTopVoices 2022-2024


    Microsoft and Quantinuum reach a new milestone in quantum error correction. The collaboration claims to have used an innovative qubit-virtualization system on Quantinuum's H2 ion-trap platform to create 4 highly reliable logical qubits from only 30 physical qubits.

    What is quantum error correction? The physical qubits, with error rates on the order of 10^-2, are combined to deliver logical qubits with error rates on the order of 10^-5. According to their press release, this is the largest gap between physical and logical error rates reported to date, and it has allowed them to run more than 14,000 individual experiments without a single error. (https://lnkd.in/dzETsvVA)

    The race for qubit count seemed to finish in 2023, with the latest update on IBM's roadmap focusing on quality rather than quantity (https://lnkd.in/dFu52wJR, "Until this year, our path was scaling the number of qubits. Going forward we will add a new metric, gate operations—a measure of the workloads our systems can run."), and other developments in quantum error correction, like the one announced in December by Harvard University, Massachusetts Institute of Technology, QuEra Computing Inc., and National Institute of Standards and Technology (NIST)/University of Maryland (https://lnkd.in/dkW-TT-w).

    Practical quantum computing gets a little closer, although it is still a distant target.

    Microsoft press release: https://lnkd.in/deJ4QCBk
    Quantinuum's press release: https://lnkd.in/d4Wnmvdq
    More details from Microsoft: https://lnkd.in/dusfZ4KY
    Paper: https://lnkd.in/dpPCX3td

    #quantumcomputing #quantumerrorcorrection #technology
