I’m excited to share this new work from our IBM Quantum team in collaboration with Oak Ridge National Laboratory. This is a major demonstration of what we mean by realizing useful quantum-centric supercomputing.

Building on the chemistry work developed with RIKEN (https://lnkd.in/eK8jW-Wp) last year, and on the previous Krylov demonstration with the University of Tokyo (https://lnkd.in/eae_8zGc), the IBM Quantum and ORNL teams developed a quantum algorithm for ground states with convergence guarantees similar to phase estimation, while retaining the error-mitigation properties of sample-based methods. Because it combines sample-based approaches with Krylov methods, we call it sample-based Krylov quantum diagonalization (SKQD). The algorithm can compute ground-state energies of quantum systems for many lattice models relevant in materials science and high-energy physics.

SKQD was demonstrated experimentally on IBM quantum processors using 85 qubits and circuits with 6,000 two-qubit gates, targeting the ground state of the Anderson impurity model and obtaining high accuracies for problem sizes beyond the reach of exact diagonalization. This marks one of the largest implementations of quantum diagonalization to date, and points to how quantum computing, combined with classical computation in quantum-centric supercomputing environments, will enable us to push beyond classical methods for interesting applications.

These new results also show once more how essential algorithmic discovery is, especially for quantum-centric supercomputing architectures. Classical algorithms for materials science have made impressive progress over the last decades. However, by designing quantum-classical workflows where quantum delivers value that classical cannot match, we will move closer to demonstrating quantum advantage.

Congratulations again to the team on this achievement. Check out the paper here: https://lnkd.in/epwCrG5R.
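For readers who want to see how the pieces fit together, below is a minimal classical emulation of the SKQD idea in Python. It is a sketch under stated assumptions, not the paper's implementation: a toy transverse-field Ising chain stands in for the lattice models studied, NumPy plays the role of the quantum processor that would prepare and sample the time-evolved Krylov states, and all sizes and parameters (8 sites, step size, shot counts) are illustrative.

```python
import numpy as np
from functools import reduce
from scipy.linalg import expm, eigh

rng = np.random.default_rng(7)

# --- Toy transverse-field Ising chain (stand-in for a lattice model) ---
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op(site_ops, n):
    """Tensor product acting with site_ops[i] on site i, identity elsewhere."""
    return reduce(np.kron, [site_ops.get(i, I2) for i in range(n)])

n = 8                                    # toy size; the experiment used 85 qubits
H = sum(-op({i: Z, i + 1: Z}, n) for i in range(n - 1))
H = H + sum(-0.7 * op({i: X}, n) for i in range(n))

# --- Krylov states via time evolution (the quantum part in the real setup) ---
dim = 2 ** n
psi = np.zeros(dim, dtype=complex)
psi[0] = 1.0                             # reference state |00...0>
U = expm(-1j * 0.1 * H)                  # one time step exp(-i*dt*H)

# --- Sample configurations from each Krylov state, collect their union ---
support = set()
for _ in range(6):                       # Krylov dimension (illustrative)
    probs = np.abs(psi) ** 2
    shots = rng.choice(dim, size=100, p=probs / probs.sum())
    support.update(shots.tolist())
    psi = U @ psi                        # next Krylov state exp(-i*k*dt*H)|psi0>

# --- Classical step: diagonalize H projected onto sampled configurations ---
idx = sorted(support)
E0_skqd = eigh(H[np.ix_(idx, idx)], eigvals_only=True)[0]
E0_exact = eigh(H, eigvals_only=True)[0]
print(f"SKQD subspace energy: {E0_skqd:.4f}   exact: {E0_exact:.4f}")
```

The point of hardware in the real algorithm is that the time evolution generating the Krylov states is classically intractable at scale; the projection and diagonalization remain cheap classical steps because only the sampled configurations enter.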
Quantum Statistics in Modern Computing Applications
Explore top LinkedIn content from expert professionals.
Summary
Quantum statistics in modern computing applications refers to the use of principles from quantum mechanics and statistical methods to solve complex problems in fields like cybersecurity, material science, and artificial intelligence. These advanced techniques allow quantum computers to analyze and simulate data in ways that traditional computers cannot, unlocking new possibilities for scientific discovery and secure computing.
- Explore hybrid models: Combine quantum and classical computing methods to tackle problems that are too challenging for traditional algorithms alone.
- Prioritize data security: Utilize quantum-generated randomness and privacy-preserving workflows to ensure data integrity in sensitive applications like cryptography and fraud detection.
- Scale with innovation: Seek out opportunities to expand quantum computing resources and apply quantum statistics to larger datasets as hardware continues to advance.
-
Google’s 69-Qubit Quantum Simulator Outperforms Supercomputers in Key Calculations

Researchers from Google and the PSI Center for Scientific Computing have developed a 69-qubit quantum simulator that can outperform the fastest classical supercomputers in studying complex quantum systems. This breakthrough brings unprecedented accuracy in modeling quantum processes, unlocking new possibilities in materials science, magnetism, and thermodynamics.

Key Features of Google’s Quantum Simulator
• Combines Digital & Analog Quantum Computing: The simulator supports both universal quantum gates (digital mode) and high-fidelity analog evolution, providing superior performance in cross-entropy benchmarking experiments (a small illustration of this score follows after this post).
• Beyond Classical Computational Limits: This hybrid approach enables calculations that classical supercomputers cannot efficiently simulate, especially in quantum material and energy research.
• Specialized for Quantum Simulations: Unlike general-purpose quantum computers, this simulator is optimized for modeling quantum interactions, making it a powerful tool for scientific discovery.

Digital vs. Analog Quantum Computing
• Digital Quantum Computing:
 • Uses quantum gates to manipulate qubits, similar to logic gates in classical computing.
 • Best suited for algorithms, machine learning, and cryptography applications.
• Analog Quantum Computing:
 • Models physical quantum systems directly, simulating real-world interactions with fewer computational steps.
 • Ideal for studying materials science, condensed matter physics, and quantum thermodynamics.

Why This Matters
• Accelerating Scientific Research: The simulator could help discover new materials, improve energy storage, and refine magnetism-based technologies.
• Advancing Quantum Supremacy: By achieving results beyond classical computation, this simulator cements Google’s lead in quantum research.
• Potential for Quantum AI Integration: Combining digital and analog approaches may enhance machine learning models and optimize large-scale computations.

What’s Next?
• Expanding Qubit Count: Google may scale up its hybrid quantum simulations, pushing closer to full-scale quantum supremacy.
• Exploring More Applications: Future research could apply these simulations to biophysics, drug discovery, and nuclear physics.
• Potential Industry Collaborations: Google’s breakthrough may lead to partnerships in materials engineering and quantum-enhanced AI systems.

This 69-qubit quantum simulator represents a major leap in computational power, proving that quantum systems can now surpass supercomputers in specialized scientific tasks and bringing us closer to practical quantum applications.
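As a rough illustration of the cross-entropy benchmarking mentioned above, here is a short Python sketch of the linear XEB score, F_XEB = 2^n * <p_ideal(x)> - 1. Everything in it is synthetic: the "ideal" distribution is a stand-in with Porter-Thomas-like statistics rather than anything from Google's device, and the method only works while ideal probabilities remain classically computable, which is precisely what fails in the beyond-classical regime.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
dim = 2 ** n

# Stand-in for an ideal circuit output distribution (Porter-Thomas-like:
# |amplitude|^2 of complex-Gaussian amplitudes, then normalized).
amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
p_ideal = np.abs(amps) ** 2
p_ideal /= p_ideal.sum()

def linear_xeb(samples, p_ideal, n):
    """F_XEB = 2^n * <p_ideal(x)> - 1, averaged over observed bitstrings x."""
    return (2 ** n) * p_ideal[samples].mean() - 1.0

# A perfect device samples from p_ideal; a fully depolarized one is uniform.
good = rng.choice(dim, size=50_000, p=p_ideal)
noisy = rng.integers(dim, size=50_000)
print(f"ideal-sampler XEB  ~ {linear_xeb(good, p_ideal, n):.3f}")   # close to 1
print(f"uniform-sampler XEB ~ {linear_xeb(noisy, p_ideal, n):.3f}")  # close to 0
```

A score near 1 means the sampler tracks the ideal distribution; a score near 0 means the outputs are indistinguishable from uniform noise, which is how such experiments quantify fidelity.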
-
Interesting approach alert! QUBO-based SVM tested on a QPU (neutral atoms).

A recent study, "QUBO-based SVM for credit card fraud detection on a real QPU," explores the application of a novel quantum approach to a critical cybersecurity challenge: credit card fraud detection. Here are some of the key findings:

* QUBO-based SVM model: The study successfully implemented a Support Vector Machine (SVM) model whose training is reformulated as a Quadratic Unconstrained Binary Optimization (QUBO) problem, an approach that can leverage the capabilities of quantum processors (a toy sketch of this construction follows after this post).
* Performance: The results demonstrate that a version of the QUBO SVM model, particularly when used in a stacked ensemble configuration, achieves high performance with low error rates. The stacked configuration uses the QUBO SVM as a meta-model, trained on the outputs of other models.
* Noise robustness: Surprisingly, the study observed that a certain amount of noise can lead to enhanced results, a phenomenon that is new in quantum machine learning although it has been seen in other contexts. The models were robust to noise both in simulations and on the real QPU.
* Scalability: Experiments were extended up to 24 atoms on the real QPU, and performance increased with the size of the training set. This suggests that even better results are possible with larger QPUs.
* Practical implications: This research highlights the potential of quantum machine learning for real-world applications, using a hybrid approach where training is performed on a QPU and testing on classical hardware. This makes the model applicable on current NISQ devices, and advantageous because it uses the QPU only for training, reducing costs and allowing the trained model to be reused.
* Ideal for cybersecurity and regulatory settings: The study also observed that the model preserves data privacy, because only the atomic coordinates and laser parameters reach the QPU and the model is tested locally.

Read the article here: https://lnkd.in/d5Vfhq2G

#quantumcomputing #machinelearning #cybersecurity #frauddetection #neutralatoms #QPU #NISQ #quantumml #fintech #datascience
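For intuition on how SVM training becomes a QUBO, here is a hedged Python sketch of the standard construction, not the paper's code: each Lagrange multiplier of the SVM dual is encoded in a few binary variables, the dual objective and the equality constraint (as a quadratic penalty) are folded into one QUBO matrix, and that matrix goes to a solver. Brute force stands in for the neutral-atom QPU, and the dataset, kernel, bit depth, and penalty weight are all toy choices.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Tiny two-class toy dataset (stand-in for fraud/non-fraud features).
X = np.vstack([rng.normal(-1, 0.3, (4, 2)), rng.normal(1, 0.3, (4, 2))])
y = np.array([-1] * 4 + [1] * 4, dtype=float)
N = len(y)

K_bits, base, lam = 2, 2.0, 5.0   # bits per alpha, encoding base, penalty weight
gamma = 1.0
kernel = np.exp(-gamma * np.sum((X[:, None] - X[None]) ** 2, axis=-1))  # RBF

# Binary encoding alpha_i = sum_k base^k * a_{i,k}, with a_{i,k} in {0, 1}.
nvars = N * K_bits
enc = np.zeros((N, nvars))
for i in range(N):
    for k in range(K_bits):
        enc[i, i * K_bits + k] = base ** k

# Negated SVM dual plus penalty, as a single QUBO:
#   min_a  (1/2) alpha^T (yy^T o K) alpha - sum_i alpha_i + lam*(sum_i alpha_i y_i)^2
Q = 0.5 * enc.T @ ((y[:, None] * y[None] * kernel) @ enc)
Q += lam * enc.T @ np.outer(y, y) @ enc
Q -= np.diag(enc.sum(axis=0))     # linear term folded into the diagonal (a_i^2 = a_i)

# Brute-force the 2^16 binary assignments; a QPU/annealer would do this step.
best_val, best_a = np.inf, None
for bits in itertools.product([0, 1], repeat=nvars):
    a = np.array(bits, dtype=float)
    val = a @ Q @ a
    if val < best_val:
        best_val, best_a = val, a

alpha = enc @ best_a
print("recovered Lagrange multipliers:", alpha)
```

On hardware, the 16 binary variables would map to atoms and the minimization to the device's native dynamics; the resulting alphas then define a purely classical decision function, matching the paper's train-on-QPU, test-classically split.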
-
Is this the first real-world use case for quantum computers?

True randomness is hard to come by. And in a world where cryptography and fairness rely on it, "close enough" just doesn't cut it.

A new paper in Nature claims to present a demonstrated, certified application of quantum computing, not in theory or simulation, but in the real world. Led by Quantinuum, JPMorganChase, Argonne National Laboratory, Oak Ridge National Laboratory, and The University of Texas at Austin, the team successfully ran a certified randomness expansion protocol on Quantinuum's 56-qubit H2 quantum computer and validated the results using over 1.1 exaflops of classical computing power.

TL;DR: certified randomness, the kind of true, verifiable unpredictability that's essential to cryptography and security, was generated by a quantum computer and validated by the world's fastest supercomputers.

Here's why that matters: true randomness is anything but trivial. Classical systems can simulate randomness, but they're still deterministic at the core. And for high-stakes environments such as finance, national security, or fairness in elections, you don't want pseudo-anything. You want cold, hard entropy that no adversary can predict or reproduce.

Quantum mechanics is probabilistic by nature. But just generating randomness with a quantum system isn't enough; you need to certify that it's truly random and not spoofed. That's where this experiment comes in. Using a method called random circuit sampling, the team:

⚇ sent quantum circuits to Quantinuum's 56-qubit H2 processor,
⚇ had it return outputs fast enough to make classical simulation infeasible,
⚇ verified the randomness mathematically using the Frontier supercomputer,
⚇ all while accessing the quantum device remotely, pointing to a future where secure, certifiable entropy doesn't require trusting the hardware in front of you.

The result? Over 71,000 certifiably random bits, generated in a way that proves they couldn't have come from a classical machine. And it's commercially viable.

Certified randomness may sound niche, but it's highly relevant to modern cryptography. This could be the start of the earliest true "quantum advantage" that actually matters in practice. And later this year, Quantinuum plans to make it a product.

It's a shift:
from demos to deployment,
from supremacy claims to measurable utility,
from the theoretical to the trustworthy.

Read more from Matt Swayne at The Quantum Insider here --> https://lnkd.in/gdkGMVRb
Peer-reviewed paper --> https://lnkd.in/g96FK7ip

#QuantumComputing #CertifiedRandomness #Cryptography
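One piece of such protocols that is easy to show concretely is the final classical post-processing: a seeded extractor compresses the raw, partially random device output into nearly uniform bits. The Python sketch below uses Toeplitz hashing, a common choice for this step; the sizes, the RNG stand-ins for device output and public seed, and the absence of any security-parameter accounting are all simplifications, and none of it is drawn from the Quantinuum pipeline.

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(42)

n, m = 1024, 256   # raw input bits, extracted output bits (illustrative sizes)
raw = rng.integers(0, 2, n)           # stand-in for certified device output
seed = rng.integers(0, 2, n + m - 1)  # stand-in for a public uniform seed

# Toeplitz matrix from the seed: first column (m bits) and first row (n bits),
# sharing the corner element. Row i is a shifted window into the seed.
T = toeplitz(seed[:m], np.concatenate(([seed[0]], seed[m:])))

# Extraction is a matrix-vector product over GF(2).
extracted = (T @ raw) % 2
print(extracted[:16])
```

The heavy lifting in the experiment is the certification itself, the XEB verification on Frontier, which is what justifies the min-entropy assumption that makes an extraction step like this one sound.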