Researchers Use Quantum-Centric Supercomputing to Break Protein Simulation Records
An international team of researchers has simulated the electronic structure of a protein complex containing 12,635 atoms, a landmark result for computational chemistry and quantum information science. The milestone, announced on May 5, 2026, marks a major advance in the ability of quantum computers to tackle physiologically relevant problems at a scale previously thought to be years away.
The research was a collaboration between Cleveland Clinic, RIKEN, and IBM, built on a “quantum-centric supercomputing” (QCSC) architecture that tightly couples quantum processors with some of the world’s most powerful classical supercomputers. By simulating the proteins T4 lysozyme and trypsin in liquid water, the scientists demonstrated a 40-fold increase in system size over state-of-the-art results from just four months earlier.
You can also read IBM quantum centric supercomputing powers next-gen chemistry
A Novel Approach to Drug Development
For many years, scientists have envisioned a “digital laboratory” in which the behavior of new medicines and materials could be predicted accurately before a single physical experiment is run. But the universe is governed by quantum mechanics, and no classical computer, however large, can fully replicate the intricate entanglement of electrons in large molecules.
“If we want another order-of-magnitude-or-two bump, quantum computing is probably the way to go,” said Dr. Kenneth Merz, the study’s lead author and head of the Merz Lab at Cleveland Clinic. While progress in conventional computing has stalled, Merz said, quantum technology is advancing at a pace that could transform the pharmaceutical industry, delivering better, life-saving medicines faster.
You can also read Cleveland Clinic News Today: Protein Simulation With IBM
Breaking the Scaling Barrier

The simulation’s enormous size, 12,635 atoms and 30,000 orbitals, was made possible by a heterogeneous approach. Rather than relying on a quantum computer alone, the scientists used an embedded wave function (EWF) technique, which breaks a large molecule into smaller, more manageable “clusters.”
In this architecture, quantum processors (QPUs) handle the most complicated clusters, those with the strongest electron entanglement, while classical supercomputers handle the simpler parts of the molecule. To run the trypsin simulation, the researchers refined these techniques to break through a “computational wall”: without those improvements, computing a molecule the size of trypsin would have cost 24 million times more than the earlier mini-protein models.
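The division of labor described above can be sketched in a few lines. This is a purely illustrative Python sketch, not the study’s actual code: the cluster data, the entanglement proxy, the threshold, and both solver placeholders are assumptions made for the example.

```python
# Illustrative sketch of wave-function-based embedding: partition a large
# system into clusters and route each cluster to a quantum or classical
# solver depending on how strongly correlated it is. All names, numbers,
# and solvers here are hypothetical placeholders.

def estimate_entanglement(cluster):
    """Toy proxy: treat clusters with more orbitals as more correlated."""
    return len(cluster["orbitals"])

def solve_on_qpu(cluster):
    # Placeholder standing in for circuit sampling on a quantum processor.
    return -1.0 * len(cluster["orbitals"])

def solve_classically(cluster):
    # Placeholder standing in for a cheap classical electronic-structure method.
    return -0.5 * len(cluster["orbitals"])

def partition_and_solve(clusters, qpu_threshold=8):
    """Send strongly correlated clusters to the QPU, the rest to classical HPC."""
    total = 0.0
    for cluster in clusters.values():
        if estimate_entanglement(cluster) >= qpu_threshold:
            total += solve_on_qpu(cluster)
        else:
            total += solve_classically(cluster)
    # A real embedding would also add cluster-coupling corrections (omitted).
    return total

clusters = {
    "active_site": {"orbitals": list(range(12))},  # strongly entangled -> QPU
    "backbone":    {"orbitals": list(range(4))},   # weakly entangled -> classical
}
print(partition_and_solve(clusters))  # -14.0 (-12.0 from the QPU cluster, -2.0 classical)
```

The key design point is that only the hardest clusters ever touch the quantum hardware; everything else stays on the classical side, which is what keeps the overall cost tractable.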
The group also used “linear-scaling methods,” exploiting the fact that electrons in these proteins are “localized.” According to IBM researcher and co-author Mario Motta, entanglement essentially “dies” beyond a distance of 7–10 angstroms, which let the researchers confine the most intricate computations to a sphere around each atom.
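The locality idea is simple to sketch: only atom pairs within the cutoff need an explicit correlated treatment, so the work grows roughly linearly with system size at fixed density. The atoms, the cutoff value (chosen inside the reported 7–10 angstrom window), and the pair criterion below are illustrative assumptions, not the paper’s implementation.

```python
# Minimal sketch of distance-based screening behind "linear-scaling methods":
# pairs of atoms farther apart than a cutoff are assumed unentangled and
# skipped. Coordinates are in angstroms; everything here is a toy example.
import math

CUTOFF_ANGSTROM = 8.0  # within the 7-10 angstrom window quoted in the article

def correlated_pairs(positions, cutoff=CUTOFF_ANGSTROM):
    """Return index pairs close enough to require explicit correlation."""
    pairs = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if math.dist(positions[i], positions[j]) <= cutoff:
                pairs.append((i, j))
    return pairs

# Three atoms: two near each other, one far away.
atoms = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (50.0, 0.0, 0.0)]
print(correlated_pairs(atoms))  # [(0, 1)] -- the distant atom drops out
```

Because each atom has a bounded number of neighbors inside its sphere, the number of surviving pairs scales with the atom count rather than its square.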
You can also read IBM Unveils QCSC with QPU Quantum Processing Units
Technology: Fugaku and Heron
The record-breaking achievement was backed by equally remarkable hardware. The researchers ran quantum sampling on two 156-qubit IBM Quantum Heron r2 processors, one at RIKEN (ibm_kobe) and one at Cleveland Clinic (ibm_cleveland). Over the course of 100 hours, these QPUs executed 9,200 circuits and produced 1.3 billion measurement results.
The classical processing was split between the Fugaku supercomputer at RIKEN and Miyabi-G, a GPU-accelerated system operated by the University of Tokyo and the University of Tsukuba. Compared with earlier quantum-centric methods, this tight integration of QPUs, CPUs, and GPUs delivered a 210-fold improvement in accuracy.
“Trimming” the Noise
One of the paper’s main algorithmic breakthroughs is TrimSQD. The technique improves Sample-based Quantum Diagonalization (SQD) by helping the computer identify the most significant electronic configurations, what theoretical chemists call “livewood,” while discarding the unimportant “deadwood.”
Motta likened the procedure to assembling a challenging jigsaw puzzle such as The Coronation of Napoleon by Jacques-Louis David. In this analogy, noise on a quantum device mixes in “deadwood” pieces from other paintings, a Van Gogh, say, or a Kahlo. By efficiently sorting the pieces into smaller piles, TrimSQD makes the original puzzle’s “livewood” faces easier to spot.
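The general shape of a sample-then-diagonalize workflow with trimming can be sketched as follows. To be clear, this is a hedged illustration of the idea, not TrimSQD itself: the count-based trimming rule, the toy Hamiltonian, and the configuration strings are all invented for the example.

```python
# Toy sketch of sample-based diagonalization with trimming: quantum
# measurements yield a bag of candidate electronic configurations
# (bitstrings); rarely seen, noise-induced ones ("deadwood") are trimmed
# away, and the Hamiltonian is diagonalized in the surviving ("livewood")
# subspace. The selection rule and Hamiltonian are illustrative only.
from collections import Counter
import numpy as np

def trim_configurations(samples, min_count=3):
    """Keep only configurations observed at least `min_count` times."""
    counts = Counter(samples)
    return [cfg for cfg, n in counts.items() if n >= min_count]

def diagonalize_in_subspace(hamiltonian, basis, index_of):
    """Project the Hamiltonian onto the trimmed basis and diagonalize."""
    idx = [index_of[cfg] for cfg in basis]
    h_sub = hamiltonian[np.ix_(idx, idx)]
    return np.linalg.eigvalsh(h_sub)[0]  # lowest eigenvalue = energy estimate

# Toy model: four configurations and a small symmetric Hamiltonian. In the
# samples, "0011" and "0101" dominate while "1111" appears once as noise.
configs = ["0011", "0101", "0110", "1111"]
index_of = {c: i for i, c in enumerate(configs)}
H = np.array([[-2.0, 0.5, 0.0, 0.0],
              [ 0.5, -1.5, 0.3, 0.0],
              [ 0.0,  0.3, -1.0, 0.0],
              [ 0.0,  0.0, 0.0, 5.0]])
samples = ["0011"] * 6 + ["0101"] * 5 + ["0110"] * 3 + ["1111"]

basis = trim_configurations(samples)
print(basis)  # ['0011', '0101', '0110'] -- the noisy "1111" is trimmed away
print(diagonalize_in_subspace(H, basis, index_of))  # ground-state estimate
```

Trimming matters because every spurious configuration enlarges the subspace the classical diagonalization must handle; discarding the deadwood keeps the classical step cheap while preserving the configurations that carry the physics.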
You can also read IBM Quantum-Centric Supercomputing: Pairing CPU-GPU-QPU
The Path to Come
The trend is clear, even though the researchers acknowledge that this approach does not yet surpass the best classical-only methods. The work shows that quantum computing is now a practical instrument for scientific investigation rather than a purely theoretical pursuit.
The approaches used for this protein simulation are “future-proof,” meaning they can be carried over directly to the next generation of fault-tolerant quantum systems, such as IBM Quantum Starling, which is expected to launch in 2029. Hoping to encourage other chemists to apply these techniques in new ways, the team is now turning to fresh problems in materials science and drug development.
“I think this study may get people off the sidelines,” Merz remarked, noting that these results arrived years earlier than he had anticipated. As quantum technology matures, the goal of replicating the full complexity of life at the atomic level is rapidly coming within reach.
You can also read First Half-Möbius Electronic Molecule Built by IBM Team