September 15, 2022 – Quantum computers hold huge promise in our world of big data. If researchers can harness their potential, these devices can perform complex calculations on a large scale at lightning speed.
Classical computers like our laptops store information in bits, which exist in one of two physical states: 0 or 1. Qubits, their quantum counterparts, work differently because their nature is probabilistic rather than deterministic: a qubit can exist as 0 and 1 at the same time, which is what gives quantum computers their power. As the number of qubits grows, the amount of information a quantum computer can handle increases far faster than it would for a conventional computer with the same number of bits.
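To see the gap in scale, here is a minimal sketch (an illustration, not part of the study): a classical n-bit register holds one of 2**n values at a time, while describing an n-qubit register takes 2**n complex amplitudes. The function names are hypothetical.

```python
import numpy as np

def classical_register(n):
    """A classical n-bit register holds exactly one concrete value."""
    return 0  # e.g. all n bits set to 0 -- a single integer

def quantum_register(n):
    """An n-qubit state is a vector of 2**n complex amplitudes.
    Here: an equal superposition of all 2**n basis states."""
    dim = 2 ** n
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = quantum_register(10)
print(len(state))  # 1024 amplitudes to describe just 10 qubits
print(np.isclose(np.vdot(state, state).real, 1.0))  # probabilities sum to 1
```

Ten qubits already require 1,024 amplitudes; each extra qubit doubles that count, which is why simulating quantum systems classically becomes intractable so quickly.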
But there is a downside. Qubits are fragile: their states are easily disturbed, for example by environmental factors such as temperature, which leads to frequent errors. Researchers have struggled to develop an effective way to correct these errors in real time. Methods for correcting such quantum errors are known as quantum error correction (QEC) schemes.
“If we can figure out exactly how QEC performs, we may have usable quantum computers very soon,” says Dr. Sangkha Bora, a postdoctoral researcher in the Quantum Machines Unit, led by Professor Jason Twamley, at the Okinawa Institute of Science and Technology (OIST).
Now, Dr. Bora and colleagues at OIST and their collaborators at Trinity College in Dublin, Ireland, and the University of Queensland in Brisbane, Australia, have proposed a new error-correction method, recently published in Physical Review Research.
QEC involves encoding information across an array of multiple qubits using a property of quantum mechanics called entanglement. To detect errors that occur in the qubits, a QEC scheme applies a series of measurements known as symmetry measurements, which assess whether two nearest-neighboring qubits are aligned in the same direction. The results of these measurements are called syndromes, and based on them, errors in the qubits can be detected and then corrected.
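A toy illustration of how syndromes pinpoint errors (the textbook 3-qubit bit-flip repetition code, not the scheme from the paper): a logical bit is stored as 000 or 111, and two parity checks on nearest-neighbor pairs reveal which single qubit flipped without ever reading the data itself.

```python
def syndrome(qubits):
    """Parity checks on pairs (q0,q1) and (q1,q2): 0 = aligned, 1 = not."""
    q0, q1, q2 = qubits
    return (q0 ^ q1, q1 ^ q2)

# Each syndrome pattern points at the single qubit that must have flipped
# (None means no error was detected).
DECODE = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(qubits):
    """Detect a single bit-flip from the syndrome and undo it."""
    flipped = DECODE[syndrome(qubits)]
    if flipped is not None:
        qubits[flipped] ^= 1  # flip the faulty qubit back
    return qubits

print(correct([0, 1, 0]))  # -> [0, 0, 0]: middle qubit flagged and fixed
```

Note that the parity checks never reveal whether the stored logical bit is 0 or 1, only where the two neighbors disagree; that is what lets the error be corrected without destroying the quantum information.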
Commonly used QEC schemes are typically slow, so the information stored in the qubits is rapidly lost to errors they fail to detect and correct in real time. In addition, these methods rely on a traditional quantum measurement technique called projective measurement to obtain the syndromes. Projective measurement requires several additional qubits, which makes it very resource-intensive.
Instead, Dr. Bora and colleagues used an approach called continuous measurement. Such measurements can be made much more quickly than traditional projective measurements, and in a far more resource-efficient manner. They developed a QEC scheme called the Measurement-Based Estimator scheme for Continuous Quantum Error Correction (MBE-CQEC), which can quickly and efficiently detect and correct errors from noisy partial syndrome measurements. They set up a powerful classical computer to act as an external controller (or estimator) that estimates the errors in the quantum system, filters out the measurement noise, and applies feedback to correct the errors.
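A rough sketch of the idea behind continuous measurement and classical estimation (a simplified illustration with a basic averaging filter, not the paper's MBE-CQEC estimator): instead of one clean projective syndrome readout, the controller receives a continuous stream of weak, noisy parity readings and must infer the true syndrome by filtering the noise over time. All names here are hypothetical.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def noisy_readings(true_parity, n, noise=1.5):
    """Weak measurement record: the true parity, encoded as a +1/-1
    signal, buried in Gaussian noise at every time step."""
    signal = 1.0 if true_parity == 0 else -1.0
    return [signal + random.gauss(0, noise) for _ in range(n)]

def estimate_parity(readings):
    """Classical estimator: average the noisy record, then threshold.
    The per-step noise is large, but the mean converges on the signal."""
    mean = sum(readings) / len(readings)
    return 0 if mean > 0 else 1

record = noisy_readings(true_parity=1, n=200)
print(estimate_parity(record))  # recovers parity 1 from the noisy record
```

A single reading here is almost useless (the noise is larger than the signal), yet 200 of them suffice; the real scheme replaces this crude average with a proper estimator running on a classical computer, which is what makes real-time feedback possible.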
Dr. Bora explains that the new QEC scheme is based on a theoretical model that still needs to be validated experimentally on a quantum computer. It also has an important limitation: as the number of qubits in the system increases, the real-time estimation performed by the classical controller becomes significantly slower.
“We are working on it, and we hope that others in the field will deal with the problem as well,” Dr. Bora concluded.
Source: Alla Katznelson, Okinawa Institute of Science and Technology