Many of you will have seen the news about HSBC’s world-first application of quantum computing in algorithmic bond trading. Today, I’d like to highlight the technical paper that explains the research behind this milestone.

In collaboration with IBM, our teams investigated how quantum feature maps can enhance statistical learning methods for predicting the likelihood that a trade is filled at a quoted price in the European corporate bond market. Using production-scale, real trading data, we ran quantum circuits on IBM quantum computers to generate transformed data representations. These were then used as inputs to established models including logistic regression, gradient boosting, random forest, and neural networks.

The results:
• Up to 34% improvement in predictive performance over classical baselines.
• Demonstrated on real, production-scale trading data, not synthetic datasets.
• Evidence that quantum-enhanced feature representations can capture complex market patterns beyond those typically learned by classical-only methods.

This marks the first known application of quantum-enhanced statistical learning in algorithmic trading. For full technical details, please see our published paper:
📄 Technical paper: https://lnkd.in/eKBqs3Y7
📰 Press release: https://lnkd.in/euMRbbJG

Congratulations to Philip Intallura Ph.D, Joshua Freeland and all HSBC colleagues involved, and huge thanks to IBM for their partnership.
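The paper ran its feature maps on IBM quantum hardware; the circuits and data are not public here, but the core idea can be sketched classically. A minimal NumPy illustration, assuming a toy two-qubit angle-encoding circuit (the gates, angles, and feature dimension are illustrative, not those used by HSBC): inputs are encoded as rotation angles, an entangling gate mixes the qubits, and the Born-rule measurement probabilities become the transformed features fed to a classical model.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT on two qubits (control = first qubit)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def quantum_features(x):
    """Map a 2-dim input x to 4 features: the Born-rule probabilities
    of a 2-qubit circuit with angle encoding and one entangling gate."""
    state = np.zeros(4)
    state[0] = 1.0                                # start in |00>
    state = np.kron(ry(x[0]), ry(x[1])) @ state   # angle-encode the inputs
    state = CNOT @ state                          # entangle the qubits
    return np.abs(state) ** 2                     # measurement probabilities

feats = quantum_features(np.array([0.7, 1.3]))
```

The four features form a probability distribution over the computational basis states, so they sum to one; in the paper's pipeline, representations like these are what get passed to logistic regression, gradient boosting, and the other classical models.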
Quantum Computing Applications in Probability and Data Science
Explore top LinkedIn content from expert professionals.
Summary
Quantum computing applications in probability and data science use the unique properties of quantum mechanics to solve complex statistical problems and analyze data more efficiently than traditional computers. These innovations are transforming fields like finance, machine learning, and language processing by offering faster computations and deeper insights.
- Explore quantum circuits: Try using quantum circuits to simulate random events and estimate probabilities, which can speed up tasks like risk analysis and data modeling.
- Integrate quantum methods: Incorporate quantum algorithms into machine learning workflows to capture patterns and relationships that classical approaches might miss.
- Experiment with quantum probability: Apply quantum probability techniques to enrich predictions and contextual understanding, especially in areas such as language modeling and portfolio optimization.
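The first tip, using a quantum circuit to simulate random events, can be previewed without hardware: a Hadamard gate puts a qubit into an equal superposition, and repeated measurement reproduces a fair coin flip. A minimal statevector simulation in NumPy (illustrative only; real workloads would use a quantum SDK and hardware backend):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def sample_circuit(shots, rng=np.random.default_rng(0)):
    """Prepare |0>, apply H, and measure `shots` times via the Born rule."""
    state = H @ np.array([1.0, 0.0])   # (|0> + |1>) / sqrt(2)
    probs = np.abs(state) ** 2         # [0.5, 0.5]
    return rng.choice([0, 1], size=shots, p=probs)

outcomes = sample_circuit(10_000)
p1 = outcomes.mean()  # empirical probability of measuring 1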
-
-
Excited to share another new #QuantumComputing result from Global Technology Applied Research at JPMorganChase. We have justed posted a new arXiv preprint titled "On Speedups for Convex Optimization via Quantum Dynamics" (https://lnkd.in/e2sRz_my), which follows our recent work on “Fast Convex Optimization with Quantum Gradient Methods”(https://lnkd.in/eMtqXM-r). Convex optimization is a fundamental subroutine in #machinelearning, #engineering, and #datascience with many applications in #FinancialEngineering, and understanding the full potential for #quantum speedup is of great interest. Complementing our previous research on quantum gradient methods, we now consider a natural optimization algorithm inspired by physics, namely, the simulation of a quantum particle subject to a potential defined by the objective function. Specifically, we study discrete simulations of the Quantum Hamiltonian Descent (QHD) framework (https://lnkd.in/e9xw_DDb) and establish the first rigorous query complexity bounds for this approach. Our findings reveal that, while the simulation of QHD probably does not improve upon classical algorithms for exact objective functions, it in fact offers a super-quadratic speedup over all known classical algorithms in the high-dimensional regime for noisy or stochastic convex optimization! These settings are common in machine learning, #reinforcementlearning, and #portfoliooptimization with empirically calibrated parameters. Our research highlights the potential for large quantum speedups on such problems. Together with our previous work, this illustrates that gradient-based and dynamical methods for quantum convex optimization are complementary: with quantum gradient methods providing large speedups in the noiseless setting, and the dynamical approach providing large speedups in the noisy and stochastic setting. Co-authors: Shouvanik Chakrabarti, Dylan Herman, Jacob Watkins, Enrico Fontana, Brandon Augustino, Junhyung Lyle Kim, and Marco Pistoia.
-
𝗤𝘂𝗮𝗻𝘁𝘂𝗺 𝗣𝗿𝗼𝗯𝗮𝗯𝗶𝗹𝗶𝘁𝘆 × 𝗟𝗟𝗠 𝗜𝗻𝘁𝗲𝗹𝗹𝗶𝗴𝗲𝗻𝗰𝗲 𝖰𝗎𝖺𝗇𝗍𝗎𝗆 𝖺𝗆𝗉𝗅𝗂𝗍𝗎𝖽𝖾𝗌 𝗋𝖾𝖿𝗂𝗇𝖾 𝗅𝖺𝗇𝗀𝗎𝖺𝗀𝖾 𝗉𝗋𝖾𝖽𝗂𝖼𝗍𝗂𝗈𝗇 𝖯𝗁𝖺𝗌𝖾 𝖺𝗅𝗂𝗀𝗇𝗆𝖾𝗇𝗍 𝖾𝗇𝗋𝗂𝖼𝗁𝖾𝗌 𝖼𝗈𝗇𝗍𝖾𝗑𝗍𝗎𝖺𝗅 𝗇𝗎𝖺𝗇𝖼𝖾 Classical probability treats token likelihoods as isolated scalars, but quantum computation reimagines them as amplitude vectors whose phases encode latent context. By mapping transformer outputs onto Hilbert spaces, we unlock interference patterns that selectively amplify coherent meanings while cancelling noise, yielding sharper posteriors with fewer samples. Variational quantum circuits further permit gradient‑based training of unitary operators, allowing language models to entangle distant dependencies without the quadratic memory overhead of classical self‑attention. The result is not simply faster or smaller models, but a fundamentally richer probabilistic grammar where superposition captures ambiguity and measurement collapses it into actionable insight. As qubit counts rise and error rates fall, the convergence of quantum linear algebra and deep semantics promises a new era in which language understanding is limited less by data volume than by our willingness to rethink probability itself. #quantum #ai #llm
-
In finance, Monte Carlo simulations help us to measure risks like VaR or price derivatives, but they’re often painfully slow because you need to generate millions of scenarios. Matsakos and Nield suggest something different: they build everything directly into a quantum circuit. Instead of precomputing probability distributions classically, they simulate the future evolution of equity, interest rate, and credit variables inside the quantum computer, including binomial trees for stock prices, models for rates, and credit migration or default models. All that is done within the circuit, and then quantum amplitude estimation is used to extract risk metrics without any offline preprocessing. This means you keep the quadratic speedup of quantum MC while also removing the bottleneck of classical distribution generation. If you want to explore the topic further, here is the paper: https://lnkd.in/dMHeAGnS #physics #markets #physicsinfinance #derivativespricing #quant #montecarlo #simulation #finance #quantitativefinance #financialengineering #modeling #quantum