Many of you will have seen the news about HSBC’s world-first application of quantum computing in algorithmic bond trading. Today, I’d like to highlight the technical paper that explains the research behind this milestone.

In collaboration with IBM, our teams investigated how quantum feature maps can enhance statistical learning methods for predicting the likelihood that a trade is filled at a quoted price in the European corporate bond market. Using production-scale, real trading data, we ran quantum circuits on IBM quantum computers to generate transformed data representations, which were then used as inputs to established models including logistic regression, gradient boosting, random forests, and neural networks.

The results:
• Up to a 34% improvement in predictive performance over classical baselines.
• Demonstrated on real, production-scale trading data, not synthetic datasets.
• Evidence that quantum-enhanced feature representations can capture complex market patterns beyond those typically learned by classical-only methods.

This marks the first known application of quantum-enhanced statistical learning in algorithmic trading. For full technical details, please see our published paper:
📄 Technical paper: https://lnkd.in/eKBqs3Y7
📰 Press release: https://lnkd.in/euMRbbJG

Congratulations to Philip Intallura, Ph.D., Joshua Freeland, and all HSBC colleagues involved, and huge thanks to IBM for their partnership.
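The paper's actual circuits and feature maps are not reproduced here, but the overall pattern, encoding classical features into quantum states and handing a fidelity-based representation to a classical model, can be sketched with a small classical simulation. A minimal numpy sketch, assuming a simple one-qubit-per-feature angle encoding (an illustrative choice, not HSBC's actual feature map):

```python
import numpy as np

def angle_encode(x):
    """Map a classical feature vector to a product state:
    feature x_i becomes the single-qubit state cos(x_i)|0> + sin(x_i)|1>."""
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, np.array([np.cos(xi), np.sin(xi)]))
    return state

def quantum_kernel(X):
    """Fidelity kernel K[i, j] = |<phi(x_i)|phi(x_j)>|^2.
    On hardware these overlaps are estimated from circuit measurements;
    here they are computed exactly from the simulated statevectors."""
    states = [angle_encode(x) for x in X]
    n = len(states)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = abs(states[i] @ states[j]) ** 2
    return K

# Invented toy features: rows 0 and 1 are similar, row 2 is distant.
X = np.array([[0.1, 0.5], [0.2, 0.4], [1.2, 2.0]])
K = quantum_kernel(X)
```

The Gram matrix K (or raw measurement outcomes from the circuits) is the kind of transformed representation that can then be fed into logistic regression, gradient boosting, or a neural network as in the paper's workflow.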
Quantum Computing Applications in Predictive Modeling
Explore top LinkedIn content from expert professionals.
Summary
Quantum computing applications in predictive modeling use the unique abilities of quantum computers to solve complex prediction and classification problems that are challenging for traditional computers. By harnessing quantum mechanics, these applications bring faster processing and more nuanced analysis to fields like finance, healthcare, language modeling, and cybersecurity.
- Explore new algorithms: Try quantum-inspired approaches such as quantum feature mapping or Hamiltonian-based frameworks to tackle tasks that require sophisticated pattern recognition.
- Combine resources: Use hybrid quantum-classical workflows to maximize the strengths of both computing types for projects in areas like biotechnology, fraud detection, and financial modeling.
- Protect data: Take advantage of quantum models that support privacy by sharing only minimal information during training, making them suitable for sensitive applications.
-
I’d like to draw your attention to a new paper on arXiv, “Shallow-circuit Supervised Learning on a Quantum Processor”, from IBM and Qognitive, which develops a Hamiltonian-based framework for quantum machine learning. Instead of the fixed amplitude or angle encodings used in many prior approaches, our method learns a local Hamiltonian embedding for classical data. https://lnkd.in/ejcxYstW

We are very interested in new approaches to QML because we keep running into recurring bottlenecks such as expensive classical data loading and difficult training dynamics in parameterized circuit models. Here, both the feature operators and the label operator are learned during training, with predictions obtained from measurements on an approximate ground state. This aims to avoid those bottlenecks.

A key enabler is Sample-based Krylov Quantum Diagonalization (SKQD), which approximates low-energy states by sampling from time-evolved Krylov states and then diagonalizing the Hamiltonian in the sampled subspace. SKQD was recently employed to estimate low-energy properties of impurity models (https://lnkd.in/epwCrG5R). In our setting, restricting to 2-local Hamiltonian embeddings keeps the required time-evolution circuits relatively shallow, which helps make the approach practical on current quantum processors.

The team demonstrates end-to-end training on an IBM Heron processor with up to 50 qubits, with non-vanishing gradients and strong proof-of-concept performance on a binary classification task. There are many exciting next steps here, including testing on broader datasets, using more expressive operator ansätze, and performing systematic comparisons against strong classical baselines to pinpoint when Hamiltonian-based encodings offer the right inductive bias. I encourage the community to try out this approach and explore where it can be extended in meaningful ways.
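The SKQD idea can be seen in miniature: sample computational-basis states from time-evolved Krylov vectors, then diagonalize the Hamiltonian restricted to the span of the sampled states. The sketch below substitutes a random symmetric toy matrix for a learned 2-local Hamiltonian and uses exact time evolution via eigendecomposition, both purely illustrative stand-ins for the hardware procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in Hamiltonian: a random sparse symmetric matrix on a
# 256-dimensional computational basis (not the paper's learned embedding).
n = 256
H = np.diag(rng.normal(size=n))
rows, cols = rng.integers(0, n, 400), rng.integers(0, n, 400)
H[rows, cols] += 0.1
H = (H + H.T) / 2

# Exact spectrum, used below only to check the subspace estimate.
w, V = np.linalg.eigh(H)
e_exact = w[0]

# Krylov states |psi_k> = exp(-i k t H)|psi_0>; sample basis states from each.
t = 0.3
psi = np.ones(n, dtype=complex) / np.sqrt(n)
U = (V * np.exp(-1j * t * w)) @ V.conj().T  # exp(-i t H) via eigendecomposition
sampled = set()
for _ in range(5):
    probs = np.abs(psi) ** 2
    probs /= probs.sum()
    sampled.update(rng.choice(n, size=30, p=probs).tolist())
    psi = U @ psi

# Diagonalize H restricted to the sampled computational-basis subspace.
idx = sorted(sampled)
e_sub = np.linalg.eigvalsh(H[np.ix_(idx, idx)])[0]
```

Because the restriction to a subspace is variational, `e_sub` upper-bounds the true ground energy; the sampled subspace is far smaller than the full basis, which is what makes the approach tractable on hardware.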
-
Quantum Probability × LLM Intelligence
Quantum amplitudes refine language prediction
Phase alignment enriches contextual nuance

Classical probability treats token likelihoods as isolated scalars, but quantum computation reimagines them as amplitude vectors whose phases encode latent context. By mapping transformer outputs onto Hilbert spaces, we unlock interference patterns that selectively amplify coherent meanings while cancelling noise, yielding sharper posteriors with fewer samples. Variational quantum circuits further permit gradient-based training of unitary operators, allowing language models to entangle distant dependencies without the quadratic memory overhead of classical self-attention. The result is not simply faster or smaller models, but a fundamentally richer probabilistic grammar where superposition captures ambiguity and measurement collapses it into actionable insight. As qubit counts rise and error rates fall, the convergence of quantum linear algebra and deep semantics promises a new era in which language understanding is limited less by data volume than by our willingness to rethink probability itself. #quantum #ai #llm
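The post is speculative, but the core contrast it draws, amplitudes with phases versus bare scalar probabilities, is easy to demonstrate: represent a token distribution as complex amplitudes sqrt(p_i)·exp(i·phi_i) and superpose two contexts, and out-of-phase components cancel while in-phase ones reinforce. A toy numpy illustration (the probabilities and phases are invented, and no actual language model is involved):

```python
import numpy as np

# Two contexts assign the same token probabilities but different phases.
p = np.array([0.5, 0.3, 0.2])
phi_a = np.array([0.0, 0.0, 0.0])
phi_b = np.array([0.0, np.pi, 0.0])  # token 1 is out of phase

a = np.sqrt(p) * np.exp(1j * phi_a)
b = np.sqrt(p) * np.exp(1j * phi_b)

s = (a + b) / np.linalg.norm(a + b)  # superpose and renormalize
interfered = np.abs(s) ** 2          # measurement probabilities
```

A classical mixture of the two contexts would leave the distribution unchanged at [0.5, 0.3, 0.2]; the superposition instead cancels token 1 entirely and concentrates probability on the in-phase tokens, which is the interference effect the post alludes to.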
-
Interesting approach alert! QUBO-based SVM tested on a QPU (neutral atoms).

A recent study, "QUBO-based SVM for credit card fraud detection on a real QPU," explores the application of a novel quantum approach to a critical cybersecurity challenge: credit card fraud detection. Here are some of the key findings:

* QUBO-based SVM model: The study successfully implemented a Support Vector Machine (SVM) whose training is reformulated as a Quadratic Unconstrained Binary Optimization (QUBO) problem, an approach that can leverage the capabilities of quantum processors.
* Performance: A version of the QUBO SVM, particularly when used in a stacked ensemble configuration, achieves high performance with low error rates. The stacked configuration uses the QUBO SVM as a meta-model, trained on the outputs of other models.
* Noise robustness: Surprisingly, the study observed that a certain amount of noise can lead to enhanced results, a phenomenon new to quantum machine learning but already observed in other contexts. The models were robust to noise both in simulations and on the real QPU.
* Scalability: Experiments were extended up to 24 atoms on the real QPU, and performance increased with the size of the training set, suggesting that even better results are possible with larger QPUs.
* Practical implications: The work uses a hybrid approach in which training is performed on the QPU and testing on classical hardware, making the model applicable on current NISQ devices. Using the QPU only for training also reduces costs and allows the trained model to be reused.
* Ideal for cybersecurity and regulatory settings: The model preserves data privacy, because only the atomic coordinates and laser parameters reach the QPU, while the model test is done locally.

Read the article here: https://lnkd.in/d5Vfhq2G

#quantumcomputing #machinelearning #cybersecurity #frauddetection #neutralatoms #QPU #NISQ #quantumml #fintech #datascience
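The study's exact QUBO construction is not given here, but the standard recipe binary-encodes the SVM dual variables and lets an optimizer, quantum or classical, minimize the negated dual objective. A toy sketch, with brute-force enumeration standing in for the QPU and an illustrative 2-bit encoding per dual variable (dataset and discretization are invented):

```python
import numpy as np
from itertools import product

# Tiny linearly separable toy dataset (invented for illustration).
X = np.array([[1.0, 1.0], [2.0, 1.5], [-1.0, -1.0], [-2.0, -1.5]])
y = np.array([1, 1, -1, -1])
K = X @ X.T  # linear kernel

B = 2        # bits per dual variable: alpha_i in {0, 1, 2, 3} * step
step = 0.25
n = len(y)

def alphas_from_bits(bits):
    """Decode a flat bitstring into discretized dual variables alpha_i."""
    bits = np.asarray(bits).reshape(n, B)
    return step * (bits @ (2 ** np.arange(B)))

def dual_objective(alpha):
    """Standard SVM dual (to maximize); a QUBO solver minimizes its negative."""
    return alpha.sum() - 0.5 * np.sum(np.outer(alpha * y, alpha * y) * K)

# Brute force over all 2^(n*B) bitstrings, in place of annealing on a QPU.
best_bits = max(product([0, 1], repeat=n * B),
                key=lambda b: dual_objective(alphas_from_bits(b)))
alpha = alphas_from_bits(best_bits)
w = (alpha * y) @ X  # primal weight vector recovered from the dual solution
```

As in the study's hybrid setup, only the (here simulated) optimization step would run on quantum hardware; classifying new points with `w` is purely classical.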
-
We’ve all known #Moderna for some years for their role in transforming mRNA science to rapidly develop life-saving vaccines and therapeutics. But what you might not know is that behind almost every mRNA innovation lies an incredibly hard problem: figuring out how each sequence folds.

Each mRNA strand can twist and loop into an astronomical number of secondary structures, yet only a handful of those make sense given the physical laws governing molecular behavior. Predicting which ones are biologically plausible means solving a complex combinatorial optimization problem, which turns out to be a sweet spot for quantum computing: exactly where purely classical approaches hit a wall.

So the team began creating and testing novel quantum algorithms, such as CVaR-VQE, and benchmarking them against classical solvers to predict mRNA folding. And the results? The quantum-enabled pipeline already matches classical solvers and is expected to push beyond what is within reach of classical computers today.

‼️ You can read all the details here: https://lnkd.in/ex5gxDCn. You will learn about:
🔹 A near-term quantum-enabled biotechnology pipeline.
🔹 Massive scale: last year, we ran the largest variational quantum algorithm yet, 80 qubits modeling 60-nucleotide mRNA strands (targeting 156 qubits and 950-gate circuits this year).
🔹 A clever algorithmic boost: a lightweight classical Conditional Value at Risk (CVaR) post-processing step that reduces sensitivity to noisy outliers.
🔹 Record-matching performance: the quantum-enhanced simulations now reach the same quality as top classical solvers and aim to go beyond, proving what a powerful platform quantum computing is for science.

To me, this case study perfectly shows two vectors we are fully committed to at IBM:
1. Quantum-classical workflows: the future of computing will be full of hybrid approaches that combine the most efficient use of quantum and classical resources in a joint quantum high-performance computing (HPC) environment.
2. Algorithmic innovation: when you represent your problem in mathematical terms, abstracting from the domain, it becomes much easier to borrow ideas from other domains and boost innovation (CVaR, or Conditional Value at Risk, comes from finance).

IBM IBM Research IBM Quantum #innovationthatmatters #Science #FutureOfComputing
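The CVaR trick mentioned above is simple to state: instead of minimizing the mean energy over all measurement shots, minimize the mean of the best alpha-fraction of shots, which discounts noisy high-energy outliers. A minimal sketch (the shot energies below are invented for illustration):

```python
import numpy as np

def cvar_energy(energies, alpha=0.1):
    """CVaR_alpha objective: the mean of the lowest alpha-fraction of
    sampled energies, rather than the mean over all shots. This makes
    a variational loop less sensitive to noisy high-energy outliers."""
    e = np.sort(np.asarray(energies))
    k = max(1, int(np.ceil(alpha * len(e))))
    return e[:k].mean()

# Invented per-shot energies: mostly near 1.0, with two noisy outliers.
shots = np.array([1.0, 1.2, 0.9, 5.0, 1.1, 0.8, 6.0, 1.0])
```

With alpha = 0.25 the objective averages only the two lowest shots, so the outliers at 5.0 and 6.0 contribute nothing; with alpha = 1.0 it reduces to the ordinary mean.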
-
Quantum computing in finance suggests a transition where the complexity of simulations and risk models becomes an asset, enabling strategies that adapt faster, forecast more accurately, and design portfolios with refined precision.

Financial markets generate enormous amounts of data every second, and traditional systems reach their limits when they face highly volatile scenarios or extreme events. Quantum approaches introduce a different perspective, because they can process massive sets of possibilities in parallel and make pattern recognition more effective.

This means that forecasting models can reflect subtle market shifts with greater reliability. Risk simulations can extend to rare and extreme cases that were harder to anticipate before. Portfolio design can benefit from deeper optimization across multiple constraints, leading to more efficient allocations. The reduction of computational costs adds another dimension, allowing firms to achieve complex results with fewer resources and in less time.

This is where strategy and technology meet: the capacity to transform uncertainty into structured decision-making. The real question is how financial leaders will embrace these tools. Will they remain experimental, or will they guide the next wave of financial optimization?

#QuantumComputing #FinancialOptimization #RiskManagement #FutureOfFinance
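As a concrete, deliberately tiny example of "optimization across multiple constraints": selecting an equal-weight portfolio under a cardinality constraint while trading expected return against variance. All numbers below are invented, and brute-force enumeration stands in for whatever quantum or classical optimizer would be used at scale:

```python
import numpy as np
from itertools import product

# Invented toy universe: expected returns and covariance for 4 assets.
mu = np.array([0.08, 0.12, 0.10, 0.07])
cov = np.array([[0.04, 0.01, 0.00, 0.00],
                [0.01, 0.09, 0.02, 0.00],
                [0.00, 0.02, 0.06, 0.01],
                [0.00, 0.00, 0.01, 0.05]])
risk_aversion = 2.0
max_assets = 2  # cardinality constraint

def score(x):
    """Mean-variance utility of an equal-weight portfolio over the
    selected assets; infeasible selections score -inf."""
    x = np.asarray(x)
    if x.sum() == 0 or x.sum() > max_assets:
        return -np.inf
    w = x / x.sum()
    return mu @ w - risk_aversion * (w @ cov @ w)

# Enumerate all binary selections; a QUBO solver would do this implicitly.
best = max(product([0, 1], repeat=len(mu)), key=score)
```

The binary selection vector is exactly the shape of problem that quantum annealers and variational optimizers target once the constraints are folded into a QUBO objective.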