
New study uses machine learning to bridge the reality gap in quantum devices

A study led by the University of Oxford has used the power of machine learning to overcome a key challenge affecting quantum devices. For the first time, the findings reveal a way to close the ‘reality gap’: the difference between predicted and observed behaviour from quantum devices. The results have been published in Physical Review X.

Quantum computing could supercharge a wealth of applications, from climate modelling and financial forecasting to drug discovery and artificial intelligence. But this will require effective ways to scale and combine individual quantum devices (also called qubits). A major barrier to this is inherent variability: even apparently identical units exhibit different behaviours.

Functional variability is presumed to be caused by nanoscale imperfections in the materials that quantum devices are made from. Since there is no way to measure these directly, this internal disorder cannot be captured in simulations, leading to the gap between predicted and observed outcomes.

To address this, the research group used a “physics-informed” machine learning approach to infer these disorder characteristics indirectly, based on how the internal disorder affects the flow of electrons through the device.

Lead researcher Associate Professor Natalia Ares (Department of Engineering Science, University of Oxford) said: ‘As an analogy, when we play “crazy golf” the ball may enter a tunnel and exit with a speed or direction that doesn’t match our predictions. But with a few more shots, a crazy golf simulator, and some machine learning, we might get better at predicting the ball’s movements and narrow the reality gap.’

The researchers measured the output current for different voltage settings across an individual quantum dot device. The data was fed into a simulation, which calculated the difference between the measured current and the theoretical current that would be expected if no internal disorder were present. By measuring the current at many different voltage settings, the simulation was constrained to find an arrangement of internal disorder that could explain the measurements at all voltage settings. This combined mathematical and statistical techniques with deep learning.
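To make the logic of this procedure concrete, here is a minimal sketch in Python. It is not the authors' code: the 1D transport model, the low-order parameterisation of the disorder potential, and the use of a simple least-squares fit (in place of the paper's deep-learning machinery) are all illustrative assumptions. The point it demonstrates is the constraint structure described above: a single hidden disorder profile must simultaneously explain the current measured at many voltage settings.

```python
# Minimal sketch (not the paper's actual model): infer a hidden disorder
# potential from current-vs-voltage measurements by minimising the mismatch
# between a toy transport simulation and the measured data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)  # position along a hypothetical 1D channel

def disorder_potential(coeffs):
    """Illustrative low-order parameterisation of the unknown disorder."""
    return sum(c * np.sin((k + 1) * np.pi * x) for k, c in enumerate(coeffs))

def simulated_current(gate_voltage, coeffs):
    """Toy transport model: current decreases with the effective barrier
    formed by the disorder potential minus the applied gate voltage."""
    barrier = np.maximum(disorder_potential(coeffs) - gate_voltage, 0.0)
    return np.exp(-barrier.sum() / len(x))

# Synthetic "measurements" from a device with a hidden, true disorder profile.
true_coeffs = np.array([0.8, -0.3, 0.5])
gate_voltages = np.linspace(-0.5, 1.5, 40)
measured = np.array([simulated_current(v, true_coeffs) for v in gate_voltages])
measured += 0.01 * rng.standard_normal(measured.shape)  # measurement noise

def mismatch(coeffs):
    """Squared difference between simulated and measured currents, summed
    over many voltage settings so the disorder profile is well constrained."""
    sim = np.array([simulated_current(v, coeffs) for v in gate_voltages])
    return np.sum((sim - measured) ** 2)

result = minimize(mismatch, x0=np.zeros(3), method="Nelder-Mead")
print("inferred disorder coefficients:", np.round(result.x, 2))
print("true disorder coefficients:    ", true_coeffs)
```

Once a disorder profile that explains all the measurements has been inferred, the same forward model can be run at new voltage settings, which is the sense in which the approach can predict device behaviour in regimes that were not directly measured.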

Associate Professor Ares added: ‘In the crazy golf analogy, it would be equivalent to placing a series of sensors along the tunnel, so that we could take measurements of the ball’s speed at different points. Although we still can’t see inside the tunnel, we can use the data to inform better predictions of how the ball will behave when we take the shot.’

Not only did the new model find internal disorder profiles that describe the measured current values, it also accurately predicted the voltage settings required for specific device operating regimes.

Crucially, the model provides a new method to quantify the variability between quantum devices. This could enable more accurate predictions of how devices will perform and help to engineer optimal materials for quantum devices. It could also inform compensation approaches that mitigate the unwanted effects of material imperfections in quantum devices.

Co-author David Craig, a PhD student at the Department of Materials, University of Oxford, added, ‘Similar to how we cannot observe black holes directly but we infer their presence from their effect on surrounding matter, we have used simple measurements as a proxy for the internal variability of nanoscale quantum devices. Although the real device still has greater complexity than the model can capture, our study has demonstrated the utility of using physics-aware machine learning to narrow the reality gap.’


Source: Computers & Math - www.sciencedaily.com
