Haiqu Demonstrates Quantum Machine Learning Efficiency on IBM Hardware, Signaling Near-Term Advantage in Anomaly Detection
Haiqu Quantum
A noteworthy demonstration by Haiqu Inc., a young quantum software company, strongly suggests that Quantum Machine Learning (QML) could soon deliver practical benefits. The company showed experimentally that modern quantum computers can be more effective than conventional classical systems at identifying patterns and detecting anomalies in large, complicated datasets. The work focuses on anomaly detection, a critical and resource-intensive operation across industries worldwide, and was made possible by IBM’s powerful Quantum Heron processor.
The successful use of quantum systems to handle the most complex stage of data analysis, delivering higher accuracy and faster preprocessing than purely classical methods, is the most convincing empirical indication to date that quantum advantage in data processing is approaching in the near term.
The Bottleneck: Classical Limits and the Curse of Dimensionality
Anomaly detection, the search for the proverbial needle in the digital haystack, is essential to modern infrastructure. It underpins detecting financial fraud, flagging anomalous stock market trading, spotting subtle variations in patients’ vital signs, and predicting unusual weather patterns.
But in the Big Data era, the volume and complexity of data pose a crushing challenge to traditional algorithms. Real-world data is frequently “high-dimensional,” meaning that hundreds or even thousands of features can describe a single data point. The “curse of dimensionality” refers to the exponential rise in the computational resources classical systems need to detect meaningful patterns or subtle outliers as the number of features grows.
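One effect behind the curse of dimensionality can be illustrated with a short, purely classical sketch (not part of Haiqu’s work; the function name and parameters are illustrative): as the number of features grows, distances between random points concentrate, so the contrast between inliers and outliers that classical detectors rely on fades.

```python
import math
import random

def farthest_nearest_ratio(n_points: int, dim: int, seed: int = 0) -> float:
    """Ratio of the farthest to the nearest distance from the origin for
    random points in the unit cube [0, 1]^dim. As dim grows, the ratio
    approaches 1: distances concentrate and contrast between points fades."""
    rng = random.Random(seed)
    dists = [
        math.sqrt(sum(rng.random() ** 2 for _ in range(dim)))
        for _ in range(n_points)
    ]
    return max(dists) / min(dists)

for dim in (2, 20, 200, 2000):
    print(f"dim={dim:5d}  far/near ratio={farthest_nearest_ratio(500, dim):.3f}")
```

As the dimension increases, the ratio falls toward 1: “nearest” and “farthest” become nearly indistinguishable, which is why classical anomaly detectors need ever more resources, or cleverer embeddings, in high dimensions.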
This problem often causes operational bottlenecks, notably in high-frequency trading and real-time health monitoring, where analysis must happen in real time. The result can be costly false positives or, worse, missed detections. QML seeks to exploit quantum computing’s fundamentally different way of representing and processing information to extract these intricate patterns more efficiently than classical techniques.
Haiqu’s Solution: Scaling QML with Quantum Embedding
Haiqu’s success hinges on a novel and highly effective quantum embedding method. This bridging technology converts complex classical data into a format a quantum computer can work with, condensing a large classical dataset into a single quantum circuit.
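Haiqu has not published the details of its embedding, but the general idea of a quantum feature map can be sketched with the simplest textbook scheme, angle encoding, where each feature sets the rotation angle of one qubit. This toy version (one feature per qubit, simulated classically) is far less compact than Haiqu’s method, which packs 500+ features into 128 qubits:

```python
import math

def angle_encode(features, max_val):
    """Toy angle encoding: scale each feature into [0, pi] and use it as
    an RY rotation angle on its own qubit. Because the qubits stay
    unentangled, the state is a product of single-qubit amplitude pairs
    (cos(theta/2), sin(theta/2)), which we can store directly."""
    state = []
    for f in features:
        theta = math.pi * f / max_val      # scale feature into [0, pi]
        state.append((math.cos(theta / 2), math.sin(theta / 2)))
    return state

# Three features on three qubits; each pair is a normalised qubit state.
qubits = angle_encode([0.0, 1.5, 3.0], max_val=3.0)
```

Here a feature of 0.0 maps to the |0⟩ state and the maximum feature to |1⟩. A practical embedding goes further: it adds entangling gates so the circuit can capture correlations between features, and compresses many features into each qubit rather than one.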
What sets this demonstration apart from previous proofs-of-concept is its scale. Haiqu successfully encoded more than 500 features from a complicated financial dataset onto the 128 qubits of the IBM Quantum Heron processor. This marks a significant turning point: the previous practical limitation of QML on existing NISQ (Noisy Intermediate-Scale Quantum) hardware was the inability to load enough high-dimensional data to have a meaningful practical impact.
The technical significance was emphasized by Oleksandr Kyriienko, Professor and Chair in Quantum Technologies at the University of Sheffield. He pointed out that since quantum embedding defines the complexity and performance of the models, understanding and exploiting it is crucial when analysing data on quantum devices. Professor Kyriienko said he was “very happy to see this implemented at an unprecedented scale,” adding that anomaly detection is an ideal target, since even a slight increase in scores can result in important detections or the removal of false positives.
Haiqu’s CTO and co-founder, Mykola Maksymenko, confirmed that this efficient translation allows quantum applications to operate at a far larger scale. Based on their anomaly-detection research, Maksymenko believes this is where quantum data processing can have a real impact.
Hybrid Performance: Faster Preprocessing and Improved Accuracy
The experiment used a hybrid quantum-classical approach. The quantum computer handled the most data-intensive step, preprocessing, converting the raw, high-dimensional financial data into a refined, higher-quality feature set. This quantum-enhanced feature set was then fed into a conventional machine learning method for the final classification and anomaly detection.
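The hybrid pipeline can be sketched end to end. Everything here is illustrative: `quantum_preprocess` is a classical stand-in for the quantum feature map (in the real experiment those features would come from measurements on the Heron processor), and the classical stage is a minimal distance-from-the-mean anomaly scorer, not the method Haiqu actually used.

```python
import math
import random

def quantum_preprocess(sample):
    """Stand-in for the quantum feature map. In the real pipeline these
    features would be expectation values measured from an embedding
    circuit; here a classical transform lets the sketch run end to end."""
    return [math.tanh(x) for x in sample]

def anomaly_scores(features_list):
    """Classical stage: score each sample by its Euclidean distance from
    the mean feature vector (a minimal anomaly detector)."""
    dim = len(features_list[0])
    n = len(features_list)
    mean = [sum(f[i] for f in features_list) / n for i in range(dim)]
    return [
        math.sqrt(sum((f[i] - mean[i]) ** 2 for i in range(dim)))
        for f in features_list
    ]

rng = random.Random(1)
data = [[rng.gauss(0, 1) for _ in range(8)] for _ in range(100)]
data.append([6.0] * 8)                        # planted outlier
scores = anomaly_scores([quantum_preprocess(s) for s in data])
print(scores.index(max(scores)))              # the planted outlier ranks highest
```

Swapping the stand-in preprocessing for real hardware calls leaves the classical stage untouched, which is the appeal of the hybrid design: only the most data-intensive step moves to the quantum device.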
The results showed a consistent trend in favour of quantum-enhanced preprocessing when compared against a baseline that used purely classical embeddings with random parameters to enable a fair comparison. The quantum approach demonstrated higher accuracy in identifying irregularities in the intricate, real-world financial datasets.
The scientists also examined computing speed and found that preprocessing on the real IBM Quantum Heron device was faster than simulating the identical operations classically. This striking observation raises the possibility of immediate time savings for data-preparation tasks.
The capacity to encode high-dimensional data with hundreds or even thousands of features enables applications at a new scale, according to IBM Research Director Jay Gambetta, who praised the study. “Advances like this are what push the industry towards achieving a quantum advantage in the near term,” Gambetta said.
A Signal, Not a Claim: The Road Ahead
Despite the strong results, Haiqu’s leadership is carefully managing expectations around claiming a clear quantum advantage. Haiqu’s CEO and co-founder, Richard Givhan, summed up the current situation by saying the company is “not claiming quantum advantage just yet.” Nonetheless, he said they are offering the most convincing empirical evidence to date that (1) high-dimensional real-world data can already be loaded onto a quantum computer and (2) QML may soon prove beneficial for processing such data.
In addition to providing more scalable embeddings and storing more classical data in quantum states, this latest research validates earlier findings and shows more reliable, controlled, and repeatable results. The work was successfully tested across multiple machine learning models, both in ideal simulation and on actual hardware.
This technology has enormous potential to transform industries. Beyond finance (better fraud detection, risk modelling), applications include:
- Healthcare: Early identification of health problems by monitoring minute changes in medical readings.
- Industrial: Predictive maintenance through the detection of malfunctioning machine sensors.
- Environmental Monitoring: More precise and rapid identification of anomalous seismic data, such as earthquakes.
Haiqu is currently accepting beta-tester applications to explore the applicability of its quantum feature embedding technique to these broader analysis challenges. The company projects that the race for a clear quantum advantage will accelerate once its quantum technique scales to problems with tens of thousands of features on near-term quantum processors.