This image is from an Amazon Braket slide deck that just did the rounds of all the Deep Tech conferences I've been at recently (this one from Eric Kessler). It's more profound than it might seem.

As technical leaders, we're constantly evaluating how emerging technologies will reshape our computational strategies. Quantum computing is prominent in these discussions, but clarity on its practical integration is... emerging. It's becoming clear, however, that the path forward isn't about quantum versus classical, but about how quantum and classical work together. This will be a core theme for the year ahead.

As someone now on the implementation partner side of this work, getting the chance to work on specific implementations of quantum-classical hybrid workloads, I think of it this way: Quantum Processing Units (QPUs) are specialised engines capable of tackling calculations that are currently intractable for even the largest supercomputers. That's the "quantum 101" explanation you've heard over and over. What's missing from that usual story is that they require significant classical infrastructure for:
- Control and calibration
- Data preparation and readout
- Error mitigation and correction frameworks
- Executing the parts of algorithms not suited for quantum speedup

Therefore, the near-to-medium-term future involves integrating QPUs as accelerators within a broader classical computing environment. Much like GPUs accelerate specific AI/graphics tasks alongside CPUs, QPUs are a promising resource for accelerating specific quantum-suited operations within larger applications.

What does this mean for technical decision-makers?

Focus on Integration: Strategic planning should center on identifying how and where quantum capabilities can be integrated into existing or future HPC workflows, not on replacing them entirely.
Identify Target Problems: The key is pinpointing high-value business or research problems where the unique capabilities of quantum computation could provide a substantial advantage.

Prepare for Hybrid Architectures: Consider architectures and software platforms designed explicitly to manage these complex hybrid workflows efficiently.

PS: Some companies, like Quantum Brilliance, have focused on this space from the hardware side from the outset, working with Pawsey Supercomputing Research Centre and Oak Ridge National Laboratory. On the software side there's the likes of Q-CTRL, Classiq Technologies, Haiqu and Strangeworks, all tackling the challenge of managing actual workloads (with different levels of abstraction). Speaking to these teams will give you a good feel for the topic and the approaches. Get to it.

#QuantumComputing #HybridComputing #HPC
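The accelerator pattern described above can be sketched as a classical outer loop that dispatches only the quantum-suited evaluation to a device. This is an illustrative sketch under stated assumptions, not any vendor's API: `submit_to_qpu` and `hybrid_optimize` are hypothetical names, and the "QPU" is simulated classically with added noise, standing in for a real Braket or Qiskit job submission.

```python
import random

random.seed(0)

def submit_to_qpu(params):
    """Stand-in for a real QPU job (e.g. a Braket or Qiskit submission).
    Returns a noisy cost estimate, mimicking shot noise on hardware."""
    ideal = sum((p - 0.5) ** 2 for p in params)  # pretend cost landscape
    return ideal + random.gauss(0, 0.01)         # shot-noise stand-in

def hybrid_optimize(n_params=3, iters=200, lr=0.1, h=0.1):
    """Classical outer loop (data prep, optimisation, readout) wrapped
    around the quantum-suited inner evaluation."""
    params = [random.random() for _ in range(n_params)]
    for _ in range(iters):
        # Classical side: finite-difference gradient from noisy QPU evaluations.
        grad = []
        for i in range(n_params):
            shifted = params.copy()
            shifted[i] += h
            grad.append((submit_to_qpu(shifted) - submit_to_qpu(params)) / h)
        params = [p - lr * g for p, g in zip(params, grad)]
    return params

best = hybrid_optimize()
```

Note that everything except the cost evaluation stays classical, mirroring the CPU/GPU analogy in the post: the QPU is one accelerated call inside a larger classical workflow.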
Accelerating Results With Quantum Computing Solutions
Explore top LinkedIn content from expert professionals.
Summary
Accelerating results with quantum computing solutions means using the unique strengths of quantum computers to tackle problems that are too complex for today’s supercomputers, especially in areas like data analysis, scientific research, and optimization. Quantum computers work by processing information in completely new ways, enabling breakthroughs in fields such as finance, machine learning, and materials science when combined with traditional computing methods.
- Explore hybrid approaches: Consider integrating quantum and classical computing so you can tackle challenging tasks more efficiently than with either technology alone.
- Pinpoint key problems: Identify complex business or research challenges, such as multi-objective optimization or advanced data modeling, where quantum solutions can deliver faster or previously unreachable results.
- Prepare your infrastructure: Plan for new software platforms and architectures that support quantum-classical workflows, so your organization is ready as these technologies become mainstream.
-
Sometimes when you set out to solve something small you end up delivering something huge. The team at Q-CTRL just did that with our partners NVIDIA and Oxford Quantum Circuits (OQC), achieving a totally new #GPU-optimized algorithm for the subgraph-isomorphism problem.

One of the toughest challenges when it comes to practical scaling of #quantumcomputing is how to map the problem of interest onto the device at hand. Which qubits are best? Which connectivity is most efficient? How can you use mathematical tricks to reduce the number of operations (and hence reduce opportunities for error)? Which parts of the process can be sped up with classical techniques? These questions are all part of a task called compilation, and even though it's less sexy than other areas, it's an actual performance bottleneck for most users in the real world.

We set out to investigate how to speed up certain subroutines with #GPUs, and in the process achieved something even more profound. The underlying problem is called the subgraph-isomorphism problem, which is key to a range of #AI / #machinelearning tasks. There are tons of algorithms for solving this problem, but most are stubbornly resistant to parallelization, rendering GPUs much less useful than in other areas. Until now.

Working with NVIDIA and OQC, we developed a novel solution that combines insights from the graph database and analytics community with data science techniques, and leverages well-established open-source software. Our new approach, named Δ-Motif, replaces traditional backtracking strategies with a data-centric approach: it decomposes the graphs into fundamental motifs (small, reusable building blocks like paths and cycles), represents them in tabular formats, and models graph processing with relational database operations like merges and filters.

This shift transforms an inherently sequential problem into one that can be executed in parallel at scale, unlocking new levels of efficiency in graph processing. In an implementation on NVIDIA GPUs we achieved up to 600X speedups in wall-clock time using test graphs, quantum-algorithm benchmarks (QASMBench) and classical ML benchmarks (SuiteSparse Matrix Collection).

This is an amazing example of how pushing the frontiers of PRACTICAL #quantumcomputing can deliver huge outcomes of much broader appeal. Our team is proud of this development and excited to continue expanding our partnerships with NVIDIA and OQC as we help deliver true #hybridcompute to the #datacenter.

Jin-Sung Kim Oded Green Gerald Mullally Jamie Friel Jensen Huang Atsushi Sugiura

Read more at our blog post: https://lnkd.in/gn-Bsurp
Technical manuscript: https://lnkd.in/gxaJsumG
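The post describes Δ-Motif only at a high level, so the following is just a toy sketch of the general data-centric idea it names: represent the graph as an edge table and find a small motif (here a two-edge path a-b-c) with a relational self-join instead of backtracking search. Plain Python comprehensions stand in for the GPU dataframe operations the real implementation uses; the variable names are my own.

```python
# Toy sketch of the data-centric idea behind approaches like Δ-Motif:
# the graph is a table of edges, and motif matching becomes a join.

edges = [(0, 1), (1, 2), (1, 3), (2, 3)]

# Undirected graph: store both orientations, as a relational table would.
table = edges + [(v, u) for (u, v) in edges]

# "Self-join" on the shared middle vertex: rows (a, b) joined with rows (b, c).
# Each candidate pair is checked independently, which is what makes this
# formulation parallel-friendly, unlike a sequential backtracking search.
paths = sorted(
    (a, b, c)
    for (a, b) in table
    for (b2, c) in table
    if b == b2 and a != c      # join condition; exclude the trivial a-b-a walk
)
```

In a dataframe engine the inner double loop would be a single merge on the middle-vertex column followed by a filter, which is exactly the kind of operation GPUs batch well.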
-
A new paper, now published in Nature Computational Science, introduces "Quantum Approximate Multi-Objective Optimization," a breakthrough from researchers at IBM, Los Alamos National Laboratory, and Zuse Institute Berlin. This work represents one of the most promising proposals for near-term demonstrations of quantum advantage in combinatorial optimization, with enormous relevance across industry and science: https://lnkd.in/ew7Pe2K5

Multi-objective optimization is a branch of mathematical optimization that deals with problems involving multiple, often conflicting, goals (e.g., constructing financial portfolios that minimize risk while maximizing returns). These problems can be extremely challenging for classical methods as the number of objective functions increases, even in cases where the single-objective version of the problem is easily solvable.

The study demonstrates how quantum computers can approximate the optimal Pareto front, i.e., the set of all optimal trade-offs between conflicting objectives, showing better scaling than classical algorithms. Sampling good solutions from vast solution spaces is a task at which quantum computers excel, and the researchers take full advantage of that in their work.

This marks an important step toward practical quantum advantage in optimization, and shows the value of exploring quantum capabilities beyond conventional problem classes. The paper is the latest outcome from our quantum optimization technical working group, and I encourage you to have a look.
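To make "Pareto front" concrete, here is a purely classical sketch of the object the paper's quantum algorithm approximates: given candidates scored on two conflicting objectives, keep only the non-dominated ones. The candidate data and names are invented for illustration; nothing here reproduces the quantum algorithm itself.

```python
# Candidates scored as (risk to minimise, return to maximise) -- toy data.
candidates = {
    "A": (0.10, 0.04),
    "B": (0.20, 0.09),
    "C": (0.15, 0.03),  # dominated by A: more risk AND less return
    "D": (0.30, 0.10),
}

def dominates(x, y):
    """x dominates y if it is no worse on both objectives and not identical."""
    return x[0] <= y[0] and x[1] >= y[1] and x != y

# The Pareto front: every candidate that no other candidate dominates.
pareto = sorted(
    name for name, score in candidates.items()
    if not any(dominates(other, score) for other in candidates.values())
)
```

Each point on the front is a distinct optimal trade-off; the classical difficulty the paper targets is that enumerating this set scales badly as objectives and candidates multiply.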
-
Excited to announce a new #QuantumComputing result from JPMorganChase's Global Technology Applied Research, titled "Fast Convex Optimization with Quantum Gradient Descent," which has just appeared on arXiv!

Convex #optimization is a fundamental subroutine in #MachineLearning, engineering, and #DataScience, with many applications in financial engineering. We develop new #QuantumAlgorithms in the "derivative-free" setting, where the algorithm only uses the function value and not its gradient. We show that #quantum algorithms without gradient access can match the convergence of classical gradient-descent methods, which do assume gradient access! In the derivative-free setting, this translates to an exponential speedup in terms of the dimension.

Our results also have applications outside the black-box setting. By leveraging a connection between semidefinite programming and eigenvalue optimization, we develop algorithms that exhibit the best known quantum or classical runtimes for semidefinite programming, linear programming, and zero-sum games, which are the three most well-studied classes of structured convex optimization problems. These classes model many practical problems of interest, including portfolio optimization and least-squares regression.

Coauthors: Brandon Augustino, Dylan Herman, Enrico Fontana, Junhyung Lyle Kim, Jacob Watkins, Shouvanik Chakrabarti, and Marco Pistoia.

Link to the article: https://lnkd.in/eMtqXM-r
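For context on the dimension dependence at stake: a classical derivative-free method has to rebuild the gradient from function values, costing roughly d extra evaluations per step in dimension d, as in the finite-difference sketch below. The paper's claim is that quantum algorithms avoid that dimension-dependent overhead; only the classical baseline is shown here, and `f`, `fd_gradient` are illustrative names.

```python
def f(x):
    """A simple convex objective: f(x) = sum((x_i - 1)^2)."""
    return sum((xi - 1.0) ** 2 for xi in x)

def fd_gradient(f, x, h=1e-6):
    """Derivative-free gradient estimate: d + 1 evaluations of f,
    the per-step cost that grows with the dimension d."""
    fx = f(x)
    grad = []
    for i in range(len(x)):
        xh = list(x)
        xh[i] += h
        grad.append((f(xh) - fx) / h)
    return grad

# Gradient descent in d = 5 using only function values.
x = [0.0] * 5
for _ in range(100):
    g = fd_gradient(f, x)
    x = [xi - 0.1 * gi for xi, gi in zip(x, g)]
```

The iteration converges to the minimiser at (1, ..., 1); the point is the evaluation budget, which the quantum derivative-free setting improves exponentially in d according to the paper.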
-
Google’s 69-Qubit Quantum Simulator Outperforms Supercomputers in Key Calculations

Researchers from Google and the PSI Center for Scientific Computing have developed a 69-qubit quantum simulator that can outperform the fastest classical supercomputers in studying complex quantum systems. This breakthrough brings unprecedented accuracy in modeling quantum processes, unlocking new possibilities in materials science, magnetism, and thermodynamics.

Key Features of Google’s Quantum Simulator
• Combines Digital & Analog Quantum Computing: The simulator supports both universal quantum gates (digital mode) and high-fidelity analog evolution, providing superior performance in cross-entropy benchmarking experiments.
• Beyond Classical Computational Limits: This hybrid approach enables calculations that classical supercomputers cannot efficiently simulate, especially in quantum materials and energy research.
• Specialized for Quantum Simulations: Unlike general-purpose quantum computers, this simulator is optimized for modeling quantum interactions, making it a powerful tool for scientific discovery.

Digital vs. Analog Quantum Computing
• Digital Quantum Computing: uses quantum gates to manipulate qubits, similar to logic gates in classical computing; best suited for algorithms, machine learning, and cryptography applications.
• Analog Quantum Computing: models physical quantum systems directly, simulating real-world interactions with fewer computational steps; ideal for studying materials science, condensed-matter physics, and quantum thermodynamics.

Why This Matters
• Accelerating Scientific Research: The simulator could help discover new materials, improve energy storage, and refine magnetism-based technologies.
• Advancing Quantum Supremacy: By achieving results beyond classical computation, this simulator cements Google’s lead in quantum research.
• Potential for Quantum AI Integration: Combining digital and analog approaches may enhance machine learning models and optimize large-scale computations.

What’s Next?
• Expanding Qubit Count: Google may scale up its hybrid quantum simulations, pushing closer to full-scale quantum supremacy.
• Exploring More Applications: Future research could apply these simulations to biophysics, drug discovery, and nuclear physics.
• Potential Industry Collaborations: Google’s breakthrough may lead to partnerships in materials engineering and quantum-enhanced AI systems.

This 69-qubit quantum simulator represents a major leap in computational power, proving that quantum systems can now surpass supercomputers in specialized scientific tasks, bringing us closer to practical quantum applications.
-
Is Quantum Machine Learning (QML) Closer Than We Think?

Select areas within quantum computing are beginning to shift from long-term aspiration to practical impact. One of the most promising developments is Quantum Machine Learning, where early pilots are uncovering advantages that classical systems are unable to match.

🔷 The Quantum Advantage: Quantum computers operate on qubits, which can represent multiple states simultaneously. This enables them to process complex, interdependent variables at a scale and speed that classical machines cannot. While current hardware still faces limitations, consistent progress in simulation and optimization is confirming the technology’s potential.

🔷 Why QML Matters: QML combines quantum circuits with classical models to unlock performance improvements in targeted, data-intensive domains. Early-stage experimentation is already showing promise:
• Accelerated training for complex models
• More effective handling of high-dimensional and sparse datasets
• Greater accuracy with smaller sample sizes

🔷 The Timeline Is Shortening: Quantum systems are inherently probabilistic, aligning well with generative AI and modeling under uncertainty. Just as classical computing advanced despite hardware imperfections, current-generation quantum systems are producing measurable results in narrow but high-value use cases. As these outcomes become more consistent, enterprise adoption will follow.

🔷 What Enterprises Can Do Today: Quantum hardware does not need to be perfect for companies to begin exploring value. Practical entry points include:
• Simulating rare or complex risk scenarios in finance and operations
• Using quantum-inspired sampling for better forecasting and sensitivity analysis
• Generating synthetic datasets in regulated or data-scarce environments
• Targeting challenges where classical AI struggles, such as subtle anomalies or low-signal environments
• Exploring use cases in fraud detection, claims forecasting, patient risk stratification, drug efficacy modeling, and portfolio optimization

🔷 Final Thought: Quantum Machine Learning is no longer confined to research. It is becoming a tool with real strategic potential. Organizations that begin investing in awareness, experimentation, and talent today will be better positioned to lead as the ecosystem matures.

#QuantumMachineLearning #QuantumComputing #AI
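The "quantum circuits plus classical models" pattern can be made concrete with a minimal sketch: a classical optimiser tunes the parameter of a one-qubit circuit whose expectation value is computed exactly in closed form here (cos θ for RY(θ)|0⟩ measured in Z). The parameter-shift rule used for the gradient is a standard QML technique, not something claimed by the post; `expectation_z` and `train` are illustrative names, and real QML would evaluate the circuit on hardware or a simulator.

```python
import math

def expectation_z(theta):
    """<Z> after preparing RY(theta)|0> -- exactly cos(theta) for one qubit."""
    return math.cos(theta)

def train(theta=0.1, lr=0.2, steps=100):
    """Hybrid loop: classical gradient descent on a quantum expectation value,
    using the parameter-shift rule (two circuit evaluations per gradient)."""
    for _ in range(steps):
        grad = (expectation_z(theta + math.pi / 2)
                - expectation_z(theta - math.pi / 2)) / 2
        theta -= lr * grad
    return theta

theta = train()  # converges toward theta = pi, where <Z> = -1 is minimal
```

The division of labour is the point: the circuit evaluation is the only "quantum" step, while data handling and optimisation remain entirely classical.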
-
⚛️ Sequential Quantum Computing

📑 We propose and experimentally demonstrate sequential quantum computing (SQC), a paradigm that utilizes multiple homogeneous or heterogeneous quantum processors in hybrid classical-quantum workflows. In this manner, we are able to overcome the limitations of each type of quantum computer by combining their complementary strengths.

Current quantum devices, including analog quantum annealers and digital quantum processors, offer distinct advantages, yet face significant practical constraints when used individually. SQC addresses this through efficient inter-processor transfer of information via bias fields: measurement outcomes from one quantum processor are encoded in the initial-state preparation of the subsequent quantum computer.

We experimentally validate SQC by solving a combinatorial optimization problem with interactions up to three-body terms. A D-Wave quantum annealer utilizing 678 qubits approximately solves the problem, and an IBM 156-qubit digital quantum processor subsequently refines the obtained solutions. This is possible via the digital introduction of non-stoquastic counterdiabatic terms unavailable to the analog quantum annealer. The experiment shows a substantial reduction in computational resources and an improvement in solution quality compared to the standalone operation of the individual quantum processors.

These results highlight SQC as a powerful and versatile approach for addressing complex combinatorial optimization problems, with potential applications in quantum simulation of many-body systems and quantum chemistry, among others.

ℹ️ Romero et al - 2025
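The coarse-then-refine handoff at the heart of SQC can be illustrated with a purely classical analogy: a rough first stage (standing in for the annealer) seeds a refining second stage (standing in for the digital processor). The bias-field encoding is a hardware mechanism not reproduced here; `stage_one`, `stage_two`, and the toy cost function are all invented for illustration.

```python
import random

random.seed(1)

def energy(bits):
    """Toy Ising-style cost: each 0 bit costs 1, so the optimum is all ones."""
    return sum(1 - b for b in bits)

def stage_one(n=20, samples=50):
    """Coarse sampler: best of a batch of random bitstrings (annealer stand-in)."""
    return min((tuple(random.randint(0, 1) for _ in range(n))
                for _ in range(samples)), key=energy)

def stage_two(bits):
    """Refiner seeded by stage one (digital-processor stand-in):
    greedy single-bit flips until no flip improves the cost."""
    bits = list(bits)
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            flipped = bits.copy()
            flipped[i] ^= 1
            if energy(flipped) < energy(bits):
                bits, improved = flipped, True
    return bits

seed = stage_one()          # approximate solution from the first stage
refined = stage_two(seed)   # second stage starts from that outcome
```

The structural point matches the experiment described above: neither stage alone is as effective as the pipeline, because the second stage inherits the first stage's output as its starting state.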
-
Quantum computing for financial mathematics

A key paper published in 2023 by Jack Jacquier, Oleksiy Kondratyev, Gordon Lee, and Mugad Oumgari reviews the state of quantum computing in financial mathematics and leaves a clear message: the value is not in waiting for the perfect machine, but in how we manage the transition with what we already have.

Three application lines highlighted by the authors:
- Portfolio optimization with variational algorithms (QAOA, VQE), where hybrid approaches already help explore scenarios that scale poorly in the classical world.
- Quantum Machine Learning, with generative and discriminative models (QGANs, QNNs, Quantum Circuit Born Machines) applied to market data generation, credit scoring, and detection of distribution shifts.
- Quantum Monte Carlo, with algorithms achieving a quadratic speedup in expectation estimation, useful for high-dimensional derivative pricing.

Other areas mentioned: the paper also points to the potential of Quantum Semidefinite Programming (QSDP) for robust risk management and portfolio optimization under uncertainty.

The key takeaway, as the authors emphasize: it’s not just about speed, it’s about thinking differently.
- Use quantum algorithms to accelerate critical steps of classical pipelines.
- Develop hybrid and quantum-inspired schemes.
- Prepare data structures and methodologies that can scale once hardware matures.

Ultimately: the real race lies in turning current limitations into opportunities for integration and new value models, while technological acceleration follows its own path.

Link: https://lnkd.in/d-CPDkN9
Imperial College London Abu Dhabi Investment Authority (ADIA) Lloyds Banking Group
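The "quadratic speedup" for Quantum Monte Carlo refers to estimating an expectation: classical Monte Carlo error shrinks as 1/√N in the number of samples, while quantum amplitude estimation shrinks as 1/N in the number of queries. Only the classical side is demonstrated below, on a toy payoff with a known mean; `mc_estimate` is an illustrative name, not from the paper.

```python
import random

random.seed(0)

def mc_estimate(n):
    """Monte Carlo estimate of E[max(X, 0)] for X uniform on [-1, 1].
    The true value is 0.25, so the estimation error is directly observable."""
    return sum(max(random.uniform(-1, 1), 0.0) for _ in range(n)) / n

# Classical error scales like 1/sqrt(N): 100x more samples, ~10x less error.
# Quantum amplitude estimation would achieve the same error with ~sqrt(N) queries.
errors = {n: abs(mc_estimate(n) - 0.25) for n in (100, 10_000, 1_000_000)}
```

In derivative pricing the expectation is a discounted payoff over simulated paths, so halving the error bar classically quadruples the compute; the quantum routine's 1/N rate is where the claimed advantage lives.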
-
🚨 Quantum Computing Breakthrough in Finance 🚨

HSBC just announced a world-first. By using IBM’s Heron quantum processor, the bank achieved a 34% improvement in predicting bond trading probabilities. This marks the first time a bank has applied quantum computing to real financial trading data at scale, moving beyond theory and into production-level application.

Some are calling this a “Sputnik moment” for quantum. That is not a perfect analogy, given the geopolitical nature of Sputnik and the corporate implications of HSBC's use of quantum computing. But I am not surprised to see a big leap forward for quantum in the world of finance. In fact, when I wrote Quantum: Computing Nouveau back in 2018, I predicted this exact trajectory: that quantum would move from academic labs to financial markets and other industries where optimization, forecasting, and massive data challenges are prevalent.

In my 2018 book, I outlined:
- Why finance would be among the earliest adopters of quantum, thanks to its reliance on complex risk management, forecasting, and trading models.
- How quantum computing could deliver step-change improvements in processing power, tackling problems classical computing struggles with, including NP problems. In computer science, NP (nondeterministic polynomial-time) problems are problems where it’s easy to verify a solution once you have it, but extremely hard to calculate the solution in the first place.
- The looming arms race for quantum advantage, not only among tech companies, but also in financial services, energy, logistics, and governments.

HSBC’s milestone confirms that we’re crossing the threshold from theory to practice. Quantum computing isn’t just “new math”. It’s new computing, with profound implications for markets, cybersecurity, and global competition.

🔮 Back in 2018, I wrote that quantum computing is not just optional. It is a conditio sine qua non for the future of finance and data-driven industries. Today, we’re watching that future unfold.

#Quantum #QuantumComputing #Future #Finance https://lnkd.in/gMNc2M9b
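The easy-to-verify, hard-to-solve asymmetry of NP problems described above can be shown concretely with subset-sum, a standard NP-complete problem (my choice of example, not one from the book): checking a proposed subset is a one-line sum, while finding one by brute force means scanning up to 2^n subsets.

```python
from itertools import combinations

def verify(numbers, subset, target):
    """Polynomial-time check: is the certificate drawn from the input
    and does it actually hit the target?"""
    return set(subset) <= set(numbers) and sum(subset) == target

def solve_brute_force(numbers, target):
    """Exponential-time search: try all 2^n subsets until one works."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None

numbers = [3, 9, 8, 4, 5, 7]
solution = solve_brute_force(numbers, 15)
```

With six numbers the search is instant; at a few hundred numbers the 2^n scan is hopeless, while verifying any proposed answer stays trivial. That gap is what "easy to verify, hard to solve" means.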
-
Quantum computing in finance suggests a transition where the complexity of simulations and risk models becomes an asset, enabling strategies that adapt faster, forecast with more accuracy, and design portfolios with refined precision.

Financial markets generate enormous amounts of data every second. Traditional systems reach limits when they face highly volatile scenarios or extreme events. Quantum approaches introduce a different perspective, because they can process massive sets of possibilities in parallel and make pattern recognition more effective.

This means that forecasting models can reflect subtle market shifts with greater reliability. Risk simulations can extend to rare and extreme cases that were harder to anticipate before. Portfolio design can benefit from deeper optimization across multiple constraints, leading to more efficient allocations. The reduction of computational costs adds another dimension, allowing firms to achieve complex results with fewer resources and less time.

This is where strategy and technology meet: the capacity to transform uncertainty into structured decision-making. The real question is how financial leaders will embrace these tools. Will they remain experimental, or will they guide the next wave of financial optimization?

#QuantumComputing #FinancialOptimization #RiskManagement #FutureOfFinance
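For scale, the portfolio-design task mentioned above is easy classically in the smallest case: minimum-variance weighting of two assets can be found by grid search (or in closed form). The combinatorial explosion quantum approaches target appears only with many assets and discrete constraints. The volatilities and correlation below are toy inputs of my own.

```python
s1, s2, rho = 0.20, 0.10, 0.3   # asset volatilities and correlation (toy inputs)

def portfolio_variance(w):
    """Variance of a portfolio with weight w in asset 1, (1 - w) in asset 2."""
    return (w * s1) ** 2 + ((1 - w) * s2) ** 2 + 2 * w * (1 - w) * rho * s1 * s2

# Grid search over weights; the two-asset closed form is
# w* = (s2^2 - rho*s1*s2) / (s1^2 + s2^2 - 2*rho*s1*s2) ~= 0.105 here.
best_w = min((i / 1000 for i in range(1001)), key=portfolio_variance)
```

With n assets, integer lot sizes, and cardinality constraints, the search space grows combinatorially and this brute-force approach collapses, which is the regime the quantum-optimization pitch addresses.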