As quantum computers enter the utility era, with users executing circuits on 100 or more qubits, the performance of quantum computing software plays an increasingly prominent role. With this in mind, starting in 2020 Qiskit began the move from a mainly Python-based package to one utilizing the Rust programming language. What began with a highly optimized graph library in Rust (https://lnkd.in/eUdwqiMU) has now culminated in most of the circuit creation, manipulation, and transpilation code being fully ported over in the upcoming Qiskit 1.3. The fruits of this labor are easy to verify, with Qiskit outperforming competing SDKs in terms of runtime by an order of magnitude or more, as measured by rigorous benchmarks (https://lnkd.in/e98wniXY). However, algorithmic improvements also play a critical role in Qiskit's continued success. The team recently released a paper highlighting 18 months of effort optimizing the routing of circuits to match the topology of a target quantum device. This new LightSABRE method (https://lnkd.in/eMgm3TMG) is 200x faster than previous implementations and reduces the number of two-qubit gates by nearly 20% compared to the original SABRE algorithm. In addition, LightSABRE supports complex quantum architectures, disjoint connectivity graphs, and classical flow control. The work the team puts into optimizing and enhancing Qiskit is one of the primary reasons why nearly 70% of quantum developers select Qiskit as their go-to quantum computing SDK.
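For intuition about what a routing pass like SABRE actually does, here is a minimal, hypothetical sketch of a greedy SWAP-insertion router on an invented 5-qubit line topology. This is purely illustrative; it is not Qiskit's LightSABRE (which uses a lookahead cost heuristic rather than naive shortest-path swapping), and all names and the coupling map are assumptions made up for the example.

```python
from collections import deque

# A hypothetical 5-qubit device with linear connectivity: 0-1-2-3-4.
COUPLING = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}

def shortest_path(graph, src, dst):
    """BFS shortest path between two physical qubits."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            break
        for v in graph[u]:
            if v not in prev:
                prev[v] = u
                queue.append(v)
    path = [dst]
    while prev[path[-1]] is not None:
        path.append(prev[path[-1]])
    return path[::-1]

def route(gates, initial_layout, graph=COUPLING):
    """Greedy router: for each two-qubit gate, SWAP one operand along the
    shortest path until the pair sits on coupled physical qubits.
    `initial_layout` must map every logical qubit to a physical qubit."""
    phys = dict(initial_layout)  # logical qubit -> physical qubit
    ops = []
    for a, b in gates:
        path = shortest_path(graph, phys[a], phys[b])
        for step in path[1:-1]:
            ops.append(("swap", phys[a], step))
            # Whichever logical qubit sat on `step` trades places with `a`.
            displaced = next(l for l, p in phys.items() if p == step)
            phys[a], phys[displaced] = step, phys[a]
        ops.append(("cx", phys[a], phys[b]))
    return ops

# A CX between qubits 0 and 4 on a line needs three SWAPs first.
ops = route([(0, 4), (1, 2)], {i: i for i in range(5)})
```

SABRE-family routers improve on this sketch by scoring candidate SWAPs against upcoming gates instead of committing to one shortest path, which is where the two-qubit-gate savings come from.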
Quantum Programming Trends Since 2018
Explore top LinkedIn content from expert professionals.
Summary
Quantum programming has evolved rapidly since 2018, shifting from theoretical research to practical, hybrid systems that combine quantum and classical computing. Quantum programming refers to using specialized software and algorithms to harness the unique capabilities of quantum computers, solving problems that are too complex for traditional systems.
- Embrace hybrid models: Explore the integration of quantum and classical computing workflows, as they are increasingly becoming the norm for tackling complex challenges in fields like chemistry and artificial intelligence.
- Prioritize algorithmic advances: Focus on the latest quantum algorithms and programming frameworks, which are improving at a much faster rate than hardware and enabling breakthroughs in simulation and optimization.
- Monitor hardware evolution: Stay aware of new quantum hardware developments and data structures, since progress in photonics, error correction, and device connectivity is driving the transition toward scalable, practical quantum systems.
-
NVIDIA CEO Jensen Huang recently claimed that practical quantum computing is still 15 to 30 years away and will require NVIDIA #GPUs to build hybrid quantum/classical supercomputers. But both the timeline and the hardware assumption are off the mark. Quantum computing is progressing much faster than many realize. Google’s #Willow device has demonstrated that scaling up quantum systems can exponentially reduce errors, and it achieved a benchmark in minutes that would take classical supercomputers billions of years. While not yet commercially useful, it shows that both quantum supremacy and fault tolerance are possible. PsiQuantum, a company building large-scale photonic quantum computers, plans to bring two commercial machines online well before the end of the decade. These will be 10,000 times larger than Willow and will not use GPUs, but rather custom high-speed hardware specifically designed for error correction. Meanwhile, quantum algorithms are advancing rapidly. PsiQuantum recently collaborated with Boehringer Ingelheim to achieve over a 200-fold improvement in simulating molecular systems. Phasecraft, the leading quantum algorithms company, has developed quantum-enhanced algorithms for simulating materials, publishing results that threaten to outperform classical methods even on current quantum hardware. Algorithms are improving thousands of times faster than hardware, and with huge leaps in hardware from PsiQuantum, useful quantum computing is inevitable and increasingly imminent. This progress is essential because our existing tools for simulating nature, particularly in chemistry and materials science, are limited. Density Functional Theory, or DFT, is widely used to model the electronic structure of materials but fails on many of the most interesting highly correlated quantum systems. 
When researchers tried to evaluate the purported room-temperature superconductor LK-99, #DFT failed entirely, and researchers were forced to fall back on trial-and-error "cook-and-look" experimentation to get answers. Even cutting-edge #AI models like DeepMind’s GNoME depend on DFT for training data, which limits their usefulness in domains where DFT breaks down. Without more accurate quantum simulations, AI cannot meaningfully explore the full complexity of quantum systems. To overcome these barriers, we need large-scale quantum computers. Building machines with millions of qubits is a significant undertaking, requiring advances in photonics, cryogenics, and systems engineering. But the transition is already underway, moving from theoretical possibility to construction. Quantum computing offers a path from discovery to design. It will allow us to understand and engineer materials and molecules that are currently beyond our reach. Like the transition from the Stone Age to the ages of metal, electricity, and semiconductors, the arrival of quantum computing will mark a new chapter in our mastery of the physical world.
-
#QuantumTuesday What if the key to unlocking quantum computing's full potential lies not in brute force but in elegant simplicity? As the GoTo Fractional Quantum Chief Intellectual Property Officer, I constantly explore the intersection of innovation, strategy, and disruptive technologies. Today, I’m thrilled to share insights from an extraordinary paper: "Tensor Quantum Programming" by A. Termanova et al. This work brilliantly merges tensor networks (TNs) and quantum computing, opening doors to solving some of the most complex computational problems of our time. Imagine tackling partial differential equations, quantum chemistry simulations, or machine learning models not with overwhelming computational resources but by leveraging tensor efficiency and the unique strengths of quantum circuits. This hybrid approach - classical for simplicity, quantum for complexity - redefines the rules of computation. Key takeaways from this breakthrough: 🔑 Efficiency Redefined: TNs are mapped to quantum circuits, creating a paradigm where high-dimensional problems scale linearly in complexity. Yes, you read that right - linear scalability in quantum circuits for problems that traditionally overwhelmed classical systems. 🔑 Applications Everywhere: - Simulating Hamiltonians for quantum systems. - Optimizing black-box functions with precision. - Revolutionizing quantum chemistry, from molecular dynamics to electron correlations. - Enhancing machine learning models by encoding TN architectures directly onto quantum platforms. 🔑 The Future Is Here: By bridging the gap between classical and quantum resources, Tensor Quantum Programming paves the way for solving real-world problems, from innovation-driven industries to fundamental research. This paper highlights an important truth: quantum computing isn't about doing more of the same; it’s about doing what was previously impossible. 
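To make the parameter-count argument concrete: a matrix product state (MPS), one of the simplest tensor networks, stores an n-qubit state in O(n·χ²) numbers (χ is the bond dimension) instead of 2^n raw amplitudes. Below is a stdlib-only toy showing the GHZ state at χ = 2; this is a standard textbook construction for illustration, not the construction from the Termanova et al. paper.

```python
import itertools
import math

CHI = 2  # bond dimension

# Site tensors for a GHZ-like MPS: one CHI x CHI matrix per physical value s.
# The diagonal structure forces all qubits to agree.
SITE = {
    0: [[1.0, 0.0], [0.0, 0.0]],
    1: [[0.0, 0.0], [0.0, 1.0]],
}

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(CHI)) for j in range(CHI)]
            for i in range(CHI)]

def amplitude(bits):
    """Contract the chain left-to-right for one computational-basis state."""
    prod = [[1.0, 0.0], [0.0, 1.0]]  # start from the identity
    for s in bits:
        prod = matmul(prod, SITE[s])
    # (1,1) boundary vectors on both ends, 1/sqrt(2) normalisation:
    return sum(prod[i][j] for i in range(CHI) for j in range(CHI)) / math.sqrt(2)

n = 4
amps = {bits: amplitude(bits) for bits in itertools.product((0, 1), repeat=n)}
# Storage: 2 * n * CHI**2 numbers for the MPS vs 2**n raw amplitudes;
# at n = 50 that is 400 numbers vs ~10**15.
```

Only the two all-equal bitstrings get amplitude 1/√2; everything else contracts to zero. The linear-in-n storage is the efficiency the post's "linear scalability" claim rests on, and it holds whenever the state's entanglement keeps χ small.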
For those of us in the business of strategy and intellectual property, such breakthroughs represent not just scientific progress but entirely new frontiers for value creation. As an IP Alchemist, this inspires me to think about how we can protect and leverage these innovations to shape industries and fuel growth. How do we ensure that the architectures we build today are not just protected but optimized for tomorrow’s quantum future? What are your thoughts on the role of hybrid approaches like this in quantum computing? Let’s connect and dive into the possibilities. 🚀 #QuantumComputing #TensorNetworks #InnovationStrategy #IPManagement #DeepTechDisruption Terra Quantum AG Markus Pflitsch Artem Melnikov Aleksandr Berezutskii Roman Ellerbrock Michael Perelshtein
-
I walked into Quantum Developer Day 2025 in Chicago with a simple question: "When will quantum computing actually matter for the work many people do today?" The breakthrough isn't coming from quantum computers working alone; it's happening, right now, in the space where quantum and classical computing meet. 𝗣𝗶𝗰𝘁𝘂𝗿𝗲 𝘁𝗵𝗶𝘀: AI agents 𝗼𝗿𝗰𝗵𝗲𝘀𝘁𝗿𝗮𝘁𝗶𝗻𝗴 𝗾𝘂𝗮𝗻𝘁𝘂𝗺-𝗰𝗹𝗮𝘀𝘀𝗶𝗰𝗮𝗹 𝘄𝗼𝗿𝗸𝗳𝗹𝗼𝘄𝘀, automatically 𝗼𝗽𝘁𝗶𝗺𝗶𝘇𝗶𝗻𝗴 𝗰𝗶𝗿𝗰𝘂𝗶𝘁 𝗱𝗲𝘀𝗶𝗴𝗻𝘀 while classical systems handle the heavy lifting they do best. It's not science fiction. Teams at IBM Quantum, Xanadu, qBraid, IonQ, and Quantum Rings are building it today. 𝗛𝗲𝗿𝗲'𝘀 𝘄𝗵𝗮𝘁 𝘀𝘁𝗼𝗽𝗽𝗲𝗱 𝗺𝗲 𝗶𝗻 𝗺𝘆 𝘁𝗿𝗮𝗰𝗸𝘀: First, researchers from the University of Washington explained how quantum-classical data structures are becoming the bridge we've needed. Imagine seamlessly passing data between classical and quantum without the integration nightmare developers face today. These new data structures would simplify the handoff, making hybrid workflows feel natural rather than forced. Then Laura Gagliardi, along with Mario Motta and Qiaohong (Joanna) Wang, shared how these hybrid systems are already changing chemistry timelines. Faster, smarter molecular simulations. 𝗕𝘂𝘁 𝘁𝗵𝗲 𝗵𝗶𝗴𝗵𝗹𝗶𝗴𝗵𝘁 𝗼𝗳 𝗺𝘆 𝗱𝗮𝘆? Sitting on a panel with brilliant minds Sanket Panda and Jordan Sullivan, discussing how quantum computing will impact developers across industry and academia. The questions from the audience reminded me why events like this matter: developers are ready to build with quantum tools, but they need the right abstractions, the right data structures, and the right integration patterns to make it practical. 𝗧𝗵𝗲 𝗿𝗲𝗮𝗹 𝗿𝗲𝘃𝗲𝗹𝗮𝘁𝗶𝗼𝗻? This isn't about replacing our current tech stack. It's about augmentation. Quantum computing has the potential to excel at specific optimization problems. Classical computing handles everything else. Together, they're unlocking solutions neither could achieve alone. 
Massive thanks to Kenny Heitritter and Brian Pearson for creating a space where these ideas didn't just feel possible, they felt inevitable. If you're working in chemistry, life sciences, ML, or any field where complex simulations and optimization are bottlenecks, this convergence deserves your attention. The developers building quantum-classical data structures today are paving the way for breakthroughs tomorrow. What's one computational challenge in your field that seems to hit a wall, whether it's molecular dynamics, materials discovery, or complex optimization? I'm curious if quantum-classical hybrid systems might be the breakthrough we've been waiting for. #QuantumComputing #AI #Innovation #MachineLearning #TechLeadership #ChicagoQuantumExchange #LifeSciences #Chemistry #ChicagoQuantumSummit #CQS2025 #MidwestQuantumWeek #qBraidDeveloperDay
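The hybrid pattern described above, a classical optimizer steering repeated quantum evaluations, can be sketched in a few lines. In this toy, the "quantum" step is faked with the closed-form expectation ⟨Z⟩ = cos(θ) for RY(θ)|0⟩ and the gradient uses the parameter-shift rule; every name here is an illustrative assumption, not any vendor's API.

```python
import math

def expectation_z(theta):
    """Stand-in for the quantum step: <Z> after RY(theta)|0> is cos(theta).
    On real hardware this would be estimated from measurement shots."""
    return math.cos(theta)

def parameter_shift_grad(theta, shift=math.pi / 2):
    """Exact gradient of <Z> from two extra 'circuit' evaluations
    (the parameter-shift rule): d/dtheta cos(theta) = -sin(theta)."""
    return 0.5 * (expectation_z(theta + shift) - expectation_z(theta - shift))

def minimize(theta=0.1, lr=0.4, steps=100):
    """Classical outer loop driving the quantum evaluations: plain
    gradient descent on the measured expectation value."""
    for _ in range(steps):
        theta -= lr * parameter_shift_grad(theta)
    return theta

best = minimize()  # drives theta toward pi, where <Z> = -1
```

The point of the sketch is the division of labor: the quantum side only ever answers "evaluate this circuit at these parameters," while the classical side owns the optimization loop, which is exactly the handoff the new quantum-classical data structures aim to make seamless.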
-
Quantum whispers in the GPU roar For Wall Street, more AI means more GPUs, more datacenters, more cloud contracts. And the OpenAI–NVIDIA $100B deal locks it in. But quieter signals from research point to a second axis of scaling: not just more metal, but smarter math. It’s about quantum. Let me give you some notable examples from last week’s research:

1. Compression: QKANs and quantum activation functions
Paper: Quantum Variational Activation Functions Empower Kolmogorov-Arnold Networks
It proposes replacing fixed nonlinearities with single-qubit variational circuits (DARUANs). These tiny activations generate exponentially richer frequency spectra → the same expressive power with exponentially fewer parameters. Quantum KANs (QKANs), built on this idea, already outperformed MLPs and KANs with 30% fewer parameters.

2. Exactness: Coset sampling for lattice algorithms
Paper: Exact Coset Sampling for Quantum Lattice Algorithms
It proposes a subroutine that cancels unknown offsets and produces exact, uniform cosets, making subsequent Fourier sampling provably correct. Injecting mathematically guaranteed steps into probabilistic workflows means precision: fewer wasted tokens, fewer dead-end paths, less variance in cost per query.

3. Hybridization: quantum-classical models in practice
Paper: Hybrid Quantum-Classical Model for Image Classification
This work drops small quantum layers into classical CNNs, showing that they can train faster and use fewer parameters than their fully classical counterparts.

▪️ What does this mean for inference scaling? Scaling won’t only mean bigger clusters for bigger models. It might also be about:
- extracting more from each parameter
- cutting errors at the source
- and blending quantum and classical strengths.

Notably, this direction is not lost on companies like NVIDIA. There are several signs:
• NVIDIA's CUDA-Q – an open software platform for hybrid quantum-classical programming.
• NVIDIA also launched DGX Quantum, a reference architecture linking quantum control systems directly into AI supercomputers.
• They are opening a dedicated quantum research center with hardware partners.
• Jensen Huang is aggressively investing in quantum startups like PsiQuantum (which just raised $1B and says its computer will be ready in two years), Quantinuum, and QuEra through NVentures – a major strategic shift in 2025, validating quantum's commercial timeline.

▪️ So what will we see? GPUs will remain central, but quantum ideas will keep slipping into the story of inference scaling. They are still early, but this is a new axis worth paying attention to. What do you think about it?
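The "richer frequency spectra" claim behind DARUANs comes from data re-uploading: each time the input x is re-encoded into the circuit, the output gains higher Fourier frequencies. Here is a stdlib-only single-qubit toy of that general mechanism (the angles and ansatz are invented for illustration and are not the paper's exact construction).

```python
import cmath
import math

def ry(t):
    """Single-qubit Y rotation as a 2x2 matrix."""
    c, s = math.cos(t / 2), math.sin(t / 2)
    return [[c, -s], [s, c]]

def rz(t):
    """Single-qubit Z rotation (diagonal phases)."""
    return [[cmath.exp(-1j * t / 2), 0], [0, cmath.exp(1j * t / 2)]]

def apply(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def activation(x, thetas):
    """Data re-uploading 'activation': RY(t0), then one RZ(x)*RY(tk) block
    per remaining angle. With L re-uploads of x, the output <Z> is a
    degree-L Fourier series in x, so depth buys frequency content."""
    v = apply(ry(thetas[0]), [1.0 + 0j, 0j])
    for th in thetas[1:]:
        v = apply(rz(x), v)
        v = apply(ry(th), v)
    return abs(v[0]) ** 2 - abs(v[1]) ** 2  # <Z> expectation

def fourier_coeff(k, thetas, n=64):
    """k-th Fourier coefficient of x -> activation(x), via a DFT on a grid."""
    return sum(activation(2 * math.pi * j / n, thetas)
               * cmath.exp(-1j * k * 2 * math.pi * j / n)
               for j in range(n)) / n

THETAS = [0.9, 0.6, 1.2, 0.3]  # 3 re-uploads -> frequencies up to 3
c3 = abs(fourier_coeff(3, THETAS))
c4 = abs(fourier_coeff(4, THETAS))
```

With 3 re-uploads the frequency-3 coefficient is nonzero while frequency 4 vanishes; one qubit with L re-uploads spans L frequencies, which is the exponential-in-depth spectral richness the QKAN work builds on.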
-
Quantum Dawn: IBM and Google Push Quantum Computing Toward Practical Supremacy

Introduction
Quantum computing is rapidly transitioning from theoretical exploration to applied capability. Recent advances from IBM and Google indicate that quantum advantage in real-world problems may arrive sooner than many expected, reshaping multiple industries.

Key breakthroughs driving momentum
IBM unveiled two major processors at its Quantum Developer Conference. Nighthawk delivers 120 qubits with 218 tunable couplers, enabling deeper circuits, higher connectivity, and reduced error rates. IBM targets verified quantum advantage by 2026. Loon focuses on fault tolerance, a prerequisite for scalable systems, supporting IBM’s goal of fault-tolerant quantum computing by 2029, reinforced by continued Qiskit software improvements. Google reported parallel progress with its Willow chip and the Quantum Echoes algorithm. Quantum Echoes reportedly outperforms leading classical supercomputers by up to 13,000 times on select workloads. Demonstrations in molecular modeling suggest early quantum utility rather than purely symbolic benchmarks.

Emerging real-world applications
Automotive and aerospace firms such as BMW and Airbus are applying quantum simulations to fuel cells, materials science, and aerodynamics. Financial institutions are testing quantum approaches for complex market simulations and risk analysis. Pharmaceutical and materials research could see dramatic reductions in development timelines.

Challenges, risks, and implications
Error correction and qubit stability remain core hurdles, though tunable architectures and hybrid quantum-classical methods are accelerating progress. Advances heighten concerns around cryptography, with growing urgency to adopt quantum-resistant security standards. Investment and policy attention are increasing as quantum shifts from speculative research to strategic infrastructure.
Why it matters
IBM and Google are no longer asking whether quantum advantage is possible, but when it becomes operational. The convergence of hardware, algorithms, and industry adoption marks an inflection point where quantum computing begins to deliver tangible economic and strategic impact within this decade.

I share daily insights with 37,000+ followers across defense, tech, and policy. If this topic resonates, I invite you to connect and continue the conversation. Keith King https://lnkd.in/gHPvUttw