Current Trends in Quantum Computing Development

Explore top LinkedIn content from expert professionals.

Summary

Quantum computing development is advancing rapidly, with researchers and major tech companies racing to build machines that use quantum bits (qubits) to solve problems far beyond the reach of classical computers. These breakthroughs promise to transform industries by unlocking new capabilities in areas like materials science, medicine, and artificial intelligence, thanks to quantum computers’ ability to explore multiple possibilities at once.

  • Monitor hardware progress: Keep an eye on innovations in quantum chips and qubit stability, as these are accelerating the timeline toward practical, large-scale quantum computers.
  • Explore real-world applications: Start considering how quantum solutions could address foundational challenges in fields such as chemistry, logistics, finance, and cybersecurity.
  • Engage with accessible platforms: Take advantage of cloud-based quantum computing services to experiment with quantum algorithms and prepare for future advances (see the sketch below).
Summarized by AI based on LinkedIn member posts
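
To make the third recommendation concrete, here is a minimal sketch of a first experiment, assuming the open-source Qiskit SDK (one of several options; no specific platform is named above). It builds the two-qubit Bell state that most cloud tutorials start from, and the same circuit object can later be submitted to a hosted backend once a provider account is configured:

```python
# Minimal sketch of a first quantum-circuit experiment, assuming the
# open-source Qiskit SDK (pip install qiskit); cloud providers accept
# essentially the same circuit objects.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # put qubit 0 into superposition
qc.cx(0, 1)  # entangle qubit 1 with qubit 0 (Bell state)

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # ~{'00': 0.5, '11': 0.5}
```
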
  • Rajat Taneja

    President, Technology at Visa

    123,616 followers

    We may be standing at a moment in time for Quantum Computing that mirrors the 2017 breakthrough on transformers – a spark that ignited the generative AI revolution 5 years later. With recent advancements from Google, Microsoft, IBM and Amazon in developing more powerful and stable quantum chips, the trajectory of QC is accelerating faster than many of us expected.

    Google's Sycamore and next-gen Willow chips are demonstrating increasing fidelity. Microsoft's pursuit of topological qubits using Majorana particles promises longer coherence times, and IBM's roadmap is pushing toward modular, error-corrected systems. These aren't just incremental steps; they are setting the stage for scalable, fault-tolerant quantum machines.

    Quantum systems excel at simulating the behavior of molecules and materials at atomic scale, solving optimization problems with exponentially large solution spaces, and modeling complex probabilistic systems – tasks that could take classical supercomputers millennia. For example, accurately simulating protein folding or discovering new catalysts for carbon capture are well within quantum's potential reach.

    If scalable QC is just five years away, now is the time to ask: what would you do differently today if quantum was real tomorrow? That question isn't hypothetical – it's an invitation to start rethinking foundational problems in chemistry, logistics, finance, AI and cryptography.

    Of course, building quantum systems is notoriously hard. Fragile qubits, error correction and decoherence remain formidable challenges. But globally, public and private institutions are pouring resources into cracking these problems. I was in LA today visiting the famous USC Information Sciences Institute, where cutting-edge work on QC is underway and the energy is palpable.

    This feels like a pivotal moment. One where future-shaping ideas are being tested in real labs. Just as with AI, the future belongs to those preparing for it now. QC is an area of emphasis at Visa Research, and I hope it is part of how other organizations are thinking about the future too.
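
A back-of-envelope sketch (not from the post) of why those solution spaces are called exponentially large: describing an n-qubit state classically takes 2^n complex amplitudes, which is what makes exhaustive classical simulation intractable at even modest n.

```python
# Back-of-envelope sketch (not from the post): the classical description
# of an n-qubit state is 2**n complex amplitudes (~16 bytes each at
# double precision), which is why classical simulation breaks down fast.
for n in (10, 30, 50, 100):
    amplitudes = 2 ** n
    print(f"{n:>3} qubits -> {amplitudes:.2e} amplitudes, "
          f"~{amplitudes * 16:.2e} bytes")
```
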

  • Keith King

    Former White House Lead Communications Engineer, U.S. Dept of State, and Joint Chiefs of Staff in the Pentagon. Veteran U.S. Navy, Top Secret/SCI Security Clearance. Over 14,000+ direct connections & 39,000+ followers.

    39,029 followers

    IBM to Launch the Largest Quantum Computer Yet in 2025

    Overview: IBM plans to build the largest quantum computer to date by linking multiple smaller quantum chips in parallel. The project, set for 2025, aims to shatter existing records for qubit count, marking a significant leap in quantum computing capabilities. IBM's goal is to more than triple the size of the largest current quantum machine while advancing practical quantum computing applications.

    Key Details:
    1. IBM's Quantum Roadmap:
       • IBM's largest current quantum chip, Condor, contains 1,121 qubits.
       • By 2025, IBM plans to interconnect multiple chips to exceed this number, ultimately aiming to triple the largest existing system.
    2. Milestone Achievements:
       • The company has successfully demonstrated linking two quantum chips, a key step toward building larger, interconnected systems.
       • This modular approach allows IBM to scale quantum systems beyond the physical and error-correction limits of single chips.
    3. Quantum Computing Use Cases:
       • IBM provides cloud access to its quantum systems, with most users currently utilizing about 100 qubits for practical tasks.
       • The expansion to larger systems will enable more complex computations in fields like drug discovery, materials science, and logistics optimization.

    The Significance of More Qubits:
    1. Increased Computational Power: More qubits enable quantum systems to solve certain problems exponentially faster than classical computers.
    2. Error Correction: Scaling qubit counts allows for improved quantum error correction, a critical barrier to achieving reliable quantum computations.
    3. Broader Accessibility: Larger systems will allow more researchers and industries to access practical quantum applications via IBM's cloud platform.

    IBM's Competition in Quantum Computing:
    1. Atom Computing Holds the Current Record: Start-up Atom Computing currently holds the record for the largest quantum system, slightly surpassing IBM.
    2. Tech Industry Quantum Race: Competitors like Google, Rigetti, and IonQ are also racing to scale up their quantum systems.
    3. IBM's Modular Strategy: IBM's approach focuses on scaling through chip interconnection, which could sidestep the limitations of monolithic single-chip systems.

    The Takeaway: IBM's 2025 quantum computer project aims to break new ground by creating the largest quantum system ever built, leveraging interconnected quantum chips to scale qubit counts. While significant technical challenges remain—particularly around error correction and chip interconnectivity—the initiative marks a critical step toward practical, large-scale quantum computing. With competitors like Atom Computing and Google also advancing rapidly, the race for quantum supremacy intensifies, promising transformative impacts across science, industry, and technology in the near future.
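
The error-correction point invites a tiny worked example. A minimal sketch, assuming independent bit-flip noise and the simplest possible code (a distance-d repetition code with majority-vote decoding; production codes such as IBM's are far more sophisticated), showing how the logical error rate falls as more physical qubits are spent per logical qubit:

```python
# Minimal sketch (assumption: independent bit-flip noise) of why more
# physical qubits can mean fewer logical errors: a distance-d repetition
# code fails only when a majority of its d copies flip.
import random

def logical_error_rate(p, d, trials=100_000):
    """Estimate the failure rate of a d-qubit repetition code under
    physical flip probability p, decoded by majority vote."""
    fails = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(d))
        fails += flips > d // 2
    return fails / trials

for d in (1, 3, 5, 7):
    # rate drops sharply with d because p = 0.01 is below threshold
    print(d, logical_error_rate(0.01, d))
```
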

  • Ross Dawson

    Futurist | Board advisor | Global keynote speaker | Founder: AHT Group - Informivity - Bondi Innovation | Humans + AI Leader | Bestselling author | Podcaster | LinkedIn Top Voice

    34,780 followers

    The last two days have seen two extremely interesting breakthroughs announced in quantum computing. There is a long path ahead, but both point to the potential for dramatically upscaling ambitions for what's possible in relatively short timeframes.

    The most prominent advance was Microsoft's announcement of Majorana 1, a chip powered by "topological qubits" using a new material. This enables hardware-protected qubits that are more stable and fault-tolerant. The chip currently contains eight topological qubits, but it is designed to house one million. This is many orders of magnitude larger than current systems. DARPA has selected the system for its utility-scale quantum computing program. Microsoft believes it can create a fault-tolerant quantum computer prototype in years.

    The other breakthrough is extraordinary: quantum gate teleportation, linking two quantum processes using quantum teleportation. Instead of packing millions of qubits into a single machine—which is exceptionally challenging—this approach allows smaller quantum devices to be connected via optical fibers, working together as one system. Oxford University researchers proved that distributed quantum computing can perform powerful calculations more efficiently than classical systems. This could not only create a pathway to workable quantum computers, but also a quantum internet, enabling ultra-secure communication and advanced computational capabilities.

    It certainly seems that the pace of scientific progress is increasing. Some of the applications – such as in quantum computing – could have massive implications, including in turn accelerating science across domains.
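
For readers who want to see the primitive underneath gate teleportation, here is a sketch of standard single-qubit state teleportation in Qiskit (an illustration assumed for context; the Oxford work extends the idea to gates across separate modules). The deferred-measurement form replaces mid-circuit classical feedback with controlled corrections, so it runs on a plain statevector simulator:

```python
# Sketch of the textbook single-qubit teleportation circuit in Qiskit,
# written in deferred-measurement form (controlled corrections instead
# of measure-then-feed-forward) so a statevector simulator can run it.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, partial_trace

qc = QuantumCircuit(3)
qc.ry(0.9, 0)   # arbitrary state |psi> to teleport, on qubit 0
qc.h(1)         # create a Bell pair shared by
qc.cx(1, 2)     #   Alice (qubit 1) and Bob (qubit 2)
qc.cx(0, 1)     # Alice interferes |psi> with her half of the pair
qc.h(0)         #   (Bell-basis rotation)
qc.cx(1, 2)     # Bob's X correction (deferred measurement)
qc.cz(0, 2)     # Bob's Z correction (deferred measurement)

bob = partial_trace(Statevector.from_instruction(qc), [0, 1])
print(bob)      # density matrix of qubit 2: the ry(0.9) state, teleported
```
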

  • Anthony Massobrio

    Deep Tech Evangelist | Quantum & AI & CFD

    9,512 followers

    Dear Prof Feynman,

    Since your 1982 paper “Simulating Physics with Computers,” quantum computing has developed from speculation into experimental reality. Here’s where we stand in June 2025.

    Your insight that classical computers cannot efficiently simulate quantum systems proved correct - this became the foundation for building quantum computers. Ion trapping techniques developed in the 1980s now control dozens of trapped ions as quantum bits, enabling high accuracy in single quantum operations and extended coherence times. Josephson junctions became artificial atoms: superconducting circuits that manipulate quantum states at millikelvin temperatures. Current superconducting processors include Google’s Willow chip and IBM’s advanced systems. Two-qubit gate accuracies approach 99%, though environmental noise still limits algorithmic applications to dozens of useful qubits working together.

    Shor’s factoring algorithm works on small numbers but would need millions of error-corrected quantum bits for practical cryptography. Google’s 2019 quantum demonstration solved a sampling problem faster than classical computers, though the practical advantage is close to nil.

    Scientists have built logical quantum bits that actually last longer and make fewer errors than the physical quantum bits they’re made from. However, fault-tolerant computation requires significant overhead, necessitating many physical quantum bits per logical quantum bit. IBM plans to develop 200-logical-qubit systems by 2029, utilizing advanced error correction codes.

    Your original challenge persists. Quantum many-body systems remain exponentially hard to simulate classically, yet building quantum simulators requires controlling thousands of quantum components with extraordinary precision.
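
The small-number claim can be made concrete. A sketch of Shor's algorithm for N = 15, with the period found by brute force, which is exactly the step a quantum computer would accelerate via phase estimation; everything else is classical post-processing:

```python
# Sketch of Shor's algorithm for N = 15. The period search below is
# brute force; that is the only part a quantum computer would speed up.
from math import gcd

N, a = 15, 7                 # a must be coprime to N
r = next(r for r in range(1, N) if pow(a, r, N) == 1)  # period of a^x mod N
assert r % 2 == 0 and pow(a, r // 2, N) != N - 1       # a "good" period
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(r, p, q)               # r = 4 -> factors 3 and 5
```
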

  • Peter Barrett

    Founder and General Partner at Playground Global

    7,938 followers

    NVIDIA CEO Jensen Huang recently claimed that practical quantum computing is still 15 to 30 years away and will require NVIDIA #GPUs to build hybrid quantum/classical supercomputers. But both the timeline and the hardware assumption are off the mark.

    Quantum computing is progressing much faster than many realize. Google’s #Willow device has demonstrated that scaling up quantum systems can exponentially reduce errors, and it achieved a benchmark in minutes that would take classical supercomputers countless billions of years. While not yet commercially useful, it shows that both quantum supremacy and fault tolerance are possible.

    PsiQuantum, a company building large-scale photonic quantum computers, plans to bring two commercial machines online well before the end of the decade. These will be 10,000 times larger than Willow and will not use GPUs, but rather custom high-speed hardware specifically designed for error correction.

    Meanwhile, quantum algorithms are advancing rapidly. PsiQuantum recently collaborated with Boehringer Ingelheim to achieve over a 200-fold improvement in simulating molecular systems. Phasecraft, the leading quantum algorithms company, has developed quantum-enhanced algorithms for simulating materials, publishing results that threaten to outperform classical methods even on current quantum hardware. Algorithms are improving thousands of times faster than hardware, and with huge leaps in hardware from PsiQuantum, useful quantum computing is inevitable and increasingly imminent.

    This progress is essential because our existing tools for simulating nature, particularly in chemistry and materials science, are limited. Density Functional Theory, or DFT, is widely used to model the electronic structure of materials but fails on many of the most interesting highly correlated quantum systems. When researchers tried to evaluate the purported room-temperature superconductor LK-99, #DFT failed entirely, and researchers were forced to revert to cook-and-look experimentation to get answers. Even cutting-edge #AI models like DeepMind’s GNoME depend on DFT for training data, which limits their usefulness in domains where DFT breaks down. Without more accurate quantum simulations, AI cannot meaningfully explore the full complexity of quantum systems.

    To overcome these barriers, we need large-scale quantum computers. Building machines with millions of qubits is a significant undertaking, requiring advances in photonics, cryogenics, and systems engineering. But the transition is already underway, moving from theoretical possibility to construction.

    Quantum computing offers a path from discovery to design. It will allow us to understand and engineer materials and molecules that are currently beyond our reach. Like the transition from the stone age to ages of metal, electricity, and semiconductors, the arrival of quantum computing will mark a new chapter in our mastery of the physical world.

  • Mrukant Popat

    🤖 BuildYantra.AI | CTO | AI / ML / Video / Computer Vision, OS - operating system, Platform firmware | 100M+ devices running my firmware

    5,326 followers

    𝗤𝘂𝗮𝗻𝘁𝘂𝗺 𝗖𝗼𝗺𝗽𝘂𝘁𝗶𝗻𝗴: 𝗔 𝗥𝗲𝘃𝗼𝗹𝘂𝘁𝗶𝗼𝗻 𝗼𝗻 𝘁𝗵𝗲 𝗛𝗼𝗿𝗶𝘇𝗼𝗻 🚀

    Quantum computing represents a paradigm shift in how we approach computation. Unlike classical computers that use bits (0 or 1), quantum computers leverage qubits. Qubits can exist in multiple states simultaneously due to superposition, allowing quantum computers to explore countless possibilities and solve certain complex problems exponentially faster. This opens doors to breakthroughs in fields ranging from medicine and materials science to finance and artificial intelligence.

    𝗪𝗶𝗹𝗹𝗼𝘄 (𝗚𝗼𝗼𝗴𝗹𝗲)
    Google's "Willow" chip showcases substantial progress in both quantum error correction and performance. Willow has achieved "below threshold" error rates, meaning that as the number of qubits scales up, errors decrease exponentially. It also achieved a standard benchmark computation in under five minutes that would take one of today's fastest supercomputers an unfathomable amount of time. Google's strategy revolves around improving qubit quality and error correction to achieve practical quantum advantage, with a clear focus on demonstrating real-world applications.

    𝗠𝗮𝗷𝗼𝗿𝗮𝗻𝗮 𝟭 (𝗠𝗶𝗰𝗿𝗼𝘀𝗼𝗳𝘁)
    Microsoft is taking a bold step with its "Majorana 1" chip, built upon a Topological Core architecture. This innovative design harnesses topoconductors to control Majorana particles, creating more stable and scalable qubits. Microsoft envisions this as the "transistor for the quantum age," paving the way for million-qubit systems capable of tackling industrial-scale challenges like breaking down microplastics or designing self-healing materials. Their strategy is to focus on creating inherently stable qubits that require less error correction, a significant hurdle in quantum computing.

    𝗢𝗰𝗲𝗹𝗼𝘁 (𝗔𝗺𝗮𝘇𝗼𝗻 𝗪𝗲𝗯 𝗦𝗲𝗿𝘃𝗶𝗰𝗲𝘀)
    Amazon Web Services (AWS) is addressing quantum error correction directly with their "Ocelot" chip. Ocelot employs a novel architecture utilizing 'cat qubits' that are designed to reduce error correction costs significantly. This is a crucial advancement as quantum computers are incredibly sensitive to noise, and error correction is essential for reliable computation. AWS's strategy is to lower the barrier to entry for quantum computing through its Amazon Braket service, providing access to diverse quantum hardware and tools while focusing on making quantum computing more cost-effective and accessible.

    𝗤𝘂𝗮𝗻𝘁𝘂𝗺 𝗖𝗼𝗺𝗽𝘂𝘁𝗶𝗻𝗴 𝗮𝗻𝗱 𝗔𝗜: 𝗕𝗲𝘆𝗼𝗻𝗱 𝘁𝗵𝗲 𝗟𝗶𝗺𝗶𝘁𝘀 𝗼𝗳 𝗚𝗣𝗨𝘀
    While GPUs have revolutionized AI by accelerating the training of complex models, quantum computing offers the potential for an even greater leap in AI capabilities. Quantum computers, by harnessing superposition and entanglement, can potentially solve optimization, machine learning, and simulation problems that are intractable for even the most powerful GPUs.

    #QuantumComputing #AI #GPU
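
A one-qubit sketch of the superposition claim above, using Qiskit's statevector utilities as an assumed tool: a Hadamard gate places the qubit in an equal superposition, yet each measurement still yields a single classical bit.

```python
# One-qubit sketch of superposition (assumes Qiskit is installed):
# H puts the qubit into an equal superposition of |0> and |1>; each
# sampled measurement still returns one classical bit, 0 or 1.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)
qc.h(0)
counts = Statevector.from_instruction(qc).sample_counts(shots=1000)
print(counts)  # ~{'0': 500, '1': 500}
```
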

  • Andreas Horn

    Head of AIOps @ IBM || Speaker | Lecturer | Advisor

    234,825 followers

    𝗘𝘃𝗲𝗿𝘆𝗼𝗻𝗲'𝘀 𝘁𝗮𝗹𝗸𝗶𝗻𝗴 𝗮𝗯𝗼𝘂𝘁 𝗔𝗜, 𝗟𝗟𝗠𝘀, 𝗮𝗻𝗱 𝗚𝗣𝗨𝘀 𝘁𝗵𝗲𝘀𝗲 𝗱𝗮𝘆𝘀! But there’s another technology quietly advancing — one that could make today’s AI systems look primitive: 𝗤𝘂𝗮𝗻𝘁𝘂𝗺 𝗰𝗼𝗺𝗽𝘂𝘁𝗶𝗻𝗴.

    Last week, IBM revealed its roadmap to build the world’s first large-scale, fault-tolerant quantum computer — IBM Quantum Starling — targeted for delivery by 2029. This system is designed to perform 100 million quantum operations using 200 logical qubits, scaling far beyond current quantum machines. To represent its quantum state would require more memory than 10⁴⁸ classical supercomputers combined.

    𝗪𝗵𝗮𝘁 𝗺𝗮𝗸𝗲𝘀 𝘁𝗵𝗶𝘀 𝘀𝗼 𝗱𝗶𝗳𝗳𝗲𝗿𝗲𝗻𝘁 𝗳𝗿𝗼𝗺 𝘁𝗼𝗱𝗮𝘆’𝘀 𝗰𝗼𝗺𝗽𝘂𝘁𝗲𝗿𝘀? ⬇️
    - Quantum computers use qubits, which can represent multiple states at once — enabling exponential computational power.
    - They have the potential to transform industries like drug development, materials discovery, and optimization.
    - At the same time, their power threatens to break current encryption protocols, prompting urgent work on quantum-safe security.
    - The field is still experimental, requiring extreme conditions like temperatures close to absolute zero — but the trajectory is clear.

    𝗜𝗕𝗠’𝘀 𝗮𝗽𝗽𝗿𝗼𝗮𝗰𝗵 𝗶𝘀 𝗴𝗿𝗼𝘂𝗻𝗱𝗲𝗱 𝗶𝗻 𝗿𝗶𝗴𝗼𝗿𝗼𝘂𝘀 𝗲𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝗶𝗻𝗴: ⬇️
    It’s building toward fault-tolerant quantum computing through a stepwise hardware roadmap:
    1. Loon (2025) will test new chip components for error correction using quantum LDPC codes — the foundation of scalable quantum computing.
    2. Kookaburra (2026) introduces IBM’s first modular quantum processor, combining memory and logic to build systems beyond a single chip.
    3. Cockatoo (2027) will entangle multiple Kookaburra modules, connecting chips like nodes in a distributed quantum system.
    All of this leads to Starling (2029) — IBM’s planned breakthrough system capable of running 100 million quantum operations on 200 logical qubits. These are tightly integrated hardware milestones — solving problems like error correction, interconnects, and scalability — that make large-scale quantum computing actually achievable.

    𝗪𝗮𝘁𝗰𝗵 𝘁𝗵𝗲 𝘃𝗶𝗱𝗲𝗼 𝗯𝗲𝗹𝗼𝘄 𝘁𝗼 𝘀𝗲𝗲 𝗵𝗼𝘄 𝘁𝗵𝗶𝘀 𝗿𝗼𝗮𝗱𝗺𝗮𝗽 𝘂𝗻𝗳𝗼𝗹𝗱𝘀 — 𝗮𝗻𝗱 𝘄𝗵𝘆 𝘁𝗵𝗶𝘀 𝗰𝗼𝘂𝗹𝗱 𝗯𝗲𝗰𝗼𝗺𝗲 𝗼𝗻𝗲 𝗼𝗳 𝘁𝗵𝗲 𝗺𝗼𝘀𝘁 𝗶𝗺𝗽𝗼𝗿𝘁𝗮𝗻𝘁 𝗰𝗼𝗺𝗽𝘂𝘁𝗶𝗻𝗴 𝗺𝗶𝗹𝗲𝘀𝘁𝗼𝗻𝗲𝘀 𝗼𝗳 𝘁𝗵𝗲 𝗱𝗲𝗰𝗮𝗱𝗲.
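
The memory figure can be sanity-checked with quick arithmetic (a back-of-envelope sketch; the bytes-per-amplitude and per-machine capacities below are assumptions, not from the post). Even with a generous 10 PB per supercomputer, the count of machines needed is astronomically large, broadly consistent with the post's 10⁴⁸ figure under smaller per-machine assumptions:

```python
# Back-of-envelope check (assumptions: 16 bytes per double-precision
# complex amplitude; ~1e16 bytes, i.e. 10 PB, per supercomputer).
bytes_needed = 2 ** 200 * 16        # full state vector of 200 qubits
print(f"{bytes_needed:.2e} bytes")                       # ~2.57e+61
print(f"{bytes_needed / 1e16:.1e} machines of 10 PB")    # ~2.6e+45
```
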

  • 🚀 Quantum Computing: Transitioning from Lab Theory to Operational Reality

    2025 marks a definitive shift as the UN celebrates the International Year of Quantum Science and Technology. We are moving past speculative demos toward productive and operational utility, integrating quantum into core workflows to solve "intractable" problems.

    🏆 Top 10 Quantum Achievements of 2025:
    1. Verifiable Quantum Advantage: Google’s Willow chip achieved a 13,000x speedup over the world’s fastest supercomputers using the "Quantum Echoes" algorithm to model real physical experiments.
    2. Topological Stability: Microsoft unveiled Majorana 1, achieving a 1,000-fold reduction in error rates using hardware-protected topological qubits.
    3. The "Four-Nines" Barrier: IonQ reached a world-record 99.99% two-qubit gate fidelity, dramatically reducing the physical qubits needed for fault-tolerant operations.
    4. Operational Scaling: Caltech researchers assembled a 6,100-qubit neutral-atom array, maintaining superposition for 13 seconds without compromising quality.
    5. Extended Coherence: Alice & Bob created "cat qubits" that resisted bit-flip errors for more than one hour, essential for long-running operational algorithms.
    6. Quantum Internet Breakthrough: T-Labs demonstrated high-fidelity (99%) transmission of entangled photons across 30 km of commercial fiber for 17 days.
    7. GPS-Denied Navigation: Q-CTRL achieved the first commercial advantage in sensing, using quantum magnetometers for navigation 100x more accurate than conventional systems without GPS.
    8. Continuous Operation: Harvard and QuEra ran a 3,000-qubit array for over two hours by replenishing atoms mid-computation.
    9. Standard Hardware Integration: IBM successfully ran quantum error correction algorithms on commercially available AMD chips, accelerating practical scalability.
    10. Modular Interconnects: Oxford University achieved quantum gate teleportation between separate modules, proving that distributed quantum computing is viable.

    🛠️ How to Prepare for the Operational Transition:
    • Operational Resilience: The timeline for cyber-resilience has accelerated; research shows that 1 million physical qubits could break RSA-2048 encryption in just one week. Experts recommend deprecating vulnerable systems by 2030, making the migration to Post-Quantum Cryptography (PQC) a current operational priority.
    • Infrastructure Integration: Utilize hybrid cloud-quantum architectures via Amazon Braket or Azure Quantum to test readiness without heavy capital investment (see the sketch below).
    • Logistics Optimization: Organizations like D-Wave are already delivering an 80% reduction in scheduling effort for complex supply chains.

    Quantum is no longer a "future" tech; it is an operational differentiator for the next decade. #QuantumComputing #Innovation #SupplyChain #CyberSecurity #CloudComputing #FutureOfTech
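
For the infrastructure-integration item, a minimal readiness sketch using the Amazon Braket SDK's free local simulator (chosen here as an assumed entry point; Azure Quantum offers an equivalent path). The same Circuit object can later be pointed at AWS-hosted simulators or hardware:

```python
# Minimal sketch of a hybrid-readiness test with the Amazon Braket SDK
# (pip install amazon-braket-sdk); the local simulator runs the same
# Circuit objects that AWS-hosted devices accept.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

circuit = Circuit().h(0).cnot(0, 1)   # Bell pair
result = LocalSimulator().run(circuit, shots=1000).result()
print(result.measurement_counts)      # ~{'00': 500, '11': 500}
```
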

  • Joel F. K.

    CEO Qubic Quantum Computing

    12,268 followers

    🚀 Quantum Computing is no longer a future promise for Pharma — 2026 is the inflection point.

    After years of experimentation, we are witnessing a decisive transition: quantum computing is moving from theoretical potential to measurable business impact in pharmaceutical R&D. As highlighted in my latest blueprint “Quantum Computing in Pharmaceutical Innovation – 2026” (available for download on my LinkedIn profile), the industry is facing a structural challenge:
    🔹 $2.6B per approved drug
    🔹 10–15 years of development
    🔹 ~90% failure rate, often driven by inaccurate molecular predictions
    These are not chemistry problems. They are computational limitations.

    🧠 Why quantum computing changes the equation
    Drug discovery is inherently quantum:
    • Electron correlation in complex molecules
    • Protein folding state spaces
    • Binding energy precision
    • ADME & toxicity prediction
    Classical HPC systems rely on approximations. Quantum systems don’t.

    Early pilots already demonstrate tangible value:
    ✅ Up to 20x faster binding energy calculations
    ✅ Chemical accuracy (~0.5 kcal/mol) on real targets
    ✅ 83% reduction in virtual screening cycles
    This is not hype — it’s validated acceleration.

    📊 2026–2030: a strategic window
    The most effective approaches today are hybrid quantum–HPC workflows, enabling pharma leaders to:
    • Compress R&D timelines by 30–50%
    • Increase lead success rates by 2–3x
    • Unlock portfolio-level ROI exceeding 2,000%
    The real differentiator is no longer if you explore quantum computing — it’s how early and how strategically you build capability.

    🔑 My conviction
    Quantum computing will not replace classical methods. It will redefine what is computable in drug discovery. Organizations that invest now in literacy, pilots, and integration will compound advantage — scientifically, economically, and competitively.

    📥 The full report is available for download on my LinkedIn profile.
    💬 I’m actively working with pharma executives, R&D leaders, and investors to translate quantum physics into actionable, ROI-driven roadmaps. Let’s move from experimentation to execution.

    #QuantumComputing #PharmaInnovation #DrugDiscovery #DeepTech #RAndDStrategy #QuantumChemistry #ThoughtLeadership

  • Aamer Baig

    Senior Partner and Global Leader, McKinsey Technology

    7,634 followers

    Like AI, the speed to at-scale deployment in quantum is accelerating. If companies and governments are not readying themselves now, they will be in for a turbulent catch-up against the competition and markets.

    For the first time since McKinsey began monitoring the quantum technology (QT) market four years ago, we see a shift from development to deployment. Additionally, our new research shows that the three core pillars of QT (in order of the most current impact and progress: quantum computing, quantum sensing, and quantum communications) could together generate up to $97 billion in revenue worldwide by 2035.

    So what does this mean for organizations in the private and public sectors? Let’s break it down into the three categories.

    Quantum Computing: The QT start-up ecosystem is fertile ground for potential breakthroughs, but leading technology companies drove the bulk of change in 2024. The likes of Amazon, Google, IBM, and Microsoft continued to progress in quantum innovation, unveiling key breakthroughs that signal a new era for the industry. The innovation on error correction is a major driver. As the number of qubits grows, effective error correction is simply not optional. Ensuring QT systems are less prone to error is essential for achieving the stability and accuracy needed to deploy quantum applications at scale.

    Quantum Sensing: We also see leaders investing in getting the hardware ready for when the software side catches up. Real-world application development will be central to unlocking full potential. We saw significant breakthroughs in 2024 and early 2025, particularly in use cases across defense and semiconductors.

    Quantum Communications: The total quantum communication market size was $1.2B in 2024, and we project it will reach $10.5B to $14.9B by 2035—representing a CAGR of 22 to 25 percent over the next 10 years. Governments are currently the largest purchasers of quantum communication technologies, at approximately 57% of all purchases in 2024, but the private sector is increasingly adopting the technology.

    While QT will affect many industries, the chemicals, life sciences, finance, mobility, and telecommunications industries will see the most growth. Further, we see four innovation domains: AI and machine learning, robotics, sustainability and climate tech, and cryptography and cybersecurity.

    Check out the full use cases and research by our quantum experts, Henning Soller, Martina Gschwendtner, Sara Shabani, and Waldemar Svejstrup. https://lnkd.in/gxdTsHxW
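
A quick check of the quoted growth figures (a sketch; it assumes the 2024-to-2035 span means eleven compounding years, which appears consistent with the stated 22 to 25 percent range):

```python
# Quick verification of the quoted CAGR range (sketch; assumes eleven
# compounding years between the 2024 base and the 2035 estimates).
base, years = 1.2, 11
for target in (10.5, 14.9):
    cagr = (target / base) ** (1 / years) - 1
    print(f"${target}B by 2035 -> {cagr:.1%} CAGR")  # ~21.8% and ~25.7%
```
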
