A research team from the Department of Physics at the University of Hong Kong (HKU) has developed a new algorithm to measure entanglement entropy, advancing the exploration of more comprehensive laws in quantum mechanics and moving a step closer to practical applications of quantum materials.
This pivotal research work has recently been published in Physical Review Letters.
Quantum materials play a vital role in propelling human advancement. The search for novel quantum materials with exceptional properties is a pressing task for the scientific and technological community.
2D moiré materials such as twisted bilayer graphene play a far-reaching role in the research of novel quantum states such as superconductivity, in which electric current flows with no resistance. They also figure in the development of “quantum computers” that could vastly outperform the best supercomputers in existence.
But materials only reach a “quantum state”, i.e. a regime in which thermal effects can no longer hinder the quantum fluctuations that trigger quantum phase transitions between different quantum states or quantum phases, at extremely low temperatures (near absolute zero, -273.15°C) or under exceptionally high pressure. Experiments probing when and how the atoms and subatomic particles of different substances “communicate and interact with each other freely through entanglement” in the quantum state are therefore prohibitively costly and difficult to execute.
The study is further complicated by the failure of the classical Landau-Ginzburg-Wilson (LGW) framework to describe certain quantum phase transitions, dubbed deconfined quantum critical points (DQCP). The question then arises whether realistic lattice models hosting DQCP can be found to resolve the inconsistencies between DQCP and conventional quantum critical points (QCP). Dedicated exploration of the topic has produced copious numerical and theoretical works with conflicting results, and a solution remains elusive.
Mr Jiarui ZHAO, Dr Zheng YAN and Dr Zi Yang MENG from the Department of Physics, HKU, have taken a significant step towards resolving the issue through the study of quantum entanglement, which marks the fundamental difference between quantum and classical physics.
The research team developed a new and more efficient quantum Monte Carlo algorithm for measuring the Rényi entanglement entropy of quantum many-body systems. With this new tool, they measured the Rényi entanglement entropy at the DQCP and found that its scaling behaviour, i.e. how the entropy changes with system size, is in sharp contrast with the behaviour at conventional LGW-type phase transitions.
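For readers unfamiliar with the quantity involved, the sketch below gives the standard textbook definition of the second Rényi entanglement entropy and the generic scaling ansatz commonly used to analyse it at two-dimensional critical points. These expressions are not quoted from the paper; the fit parameters a, b, c are placeholders.

```latex
% Second Renyi entanglement entropy of a subregion A, where rho_A is
% the reduced density matrix obtained by tracing out the complement:
S^{(2)}_A = -\ln \operatorname{Tr}\!\left(\rho_A^{2}\right),
\qquad
\rho_A = \operatorname{Tr}_{\bar{A}}\, |\psi\rangle\langle\psi| .

% Generic scaling ansatz at a 2D quantum critical point, for a
% subregion of linear size l: an area-law term plus a subleading
% logarithmic correction whose coefficient helps distinguish
% different kinds of criticality (a, b, c are fit parameters):
S^{(2)}_A(l) = a\,l + b\ln l + c .
```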
“Our findings help confirm a revolutionary understanding of phase transition theory by ruling out the possibility of a single theory describing DQCP. The questions raised by our work will contribute to further breakthroughs in the search for a comprehensive understanding of this uncharted territory,” said Dr Zheng Yan.
“The finding has changed our understanding of traditional phase transition theory and raises many intriguing questions about deconfined quantum criticality. This new tool we developed will hopefully help unlock the enigma of quantum phase transitions that has perplexed the scientific community for two decades,” said Mr Jiarui Zhao, the first author of the journal paper and a PhD student who devised the final fixes to the algorithm.
“This discovery will lead to a more general characterisation of the critical behaviour of novel quantum materials, and is a step closer towards realising applications of quantum materials, which play a vital role in propelling human advancement,” remarked Dr Zi Yang Meng.
The models
To test the efficiency and power of the algorithm, and to demonstrate the distinct difference between the entanglement entropy at a normal QCP and at a DQCP, the research team chose two representative models: the J1-J2 model, which hosts a normal O(3) QCP, and the J-Q3 model, which hosts a DQCP, as shown in Image 2.
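For concreteness, the commonly used textbook forms of the two Hamiltonians are sketched below. The specific lattice geometry and parameter values used in the study are given in the paper and may differ in detail.

```latex
% Columnar dimerised spin-1/2 J1-J2 Heisenberg model on the square
% lattice: weak bonds J1 and strong bonds J2; tuning J2/J1 drives an
% O(3) transition between Neel and dimer-singlet phases.
H_{J_1\text{-}J_2} = J_1 \sum_{\langle ij\rangle \in \text{weak}}
\mathbf{S}_i \cdot \mathbf{S}_j
+ J_2 \sum_{\langle ij\rangle \in \text{strong}}
\mathbf{S}_i \cdot \mathbf{S}_j .

% J-Q3 model: the Q term multiplies three singlet projectors on
% adjacent parallel bonds; tuning Q/J drives the system through a
% DQCP between Neel and valence-bond-solid phases.
H_{J\text{-}Q_3} = -J \sum_{\langle ij\rangle} P_{ij}
- Q \sum_{\langle ij\,kl\,mn\rangle} P_{ij} P_{kl} P_{mn},
\qquad
P_{ij} = \tfrac{1}{4} - \mathbf{S}_i \cdot \mathbf{S}_j .
```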
Nonequilibrium increment algorithm
Building on previous methods, the research team created a highly parallelised increment algorithm. As illustrated in Image 3, the main idea of the algorithm is to divide the whole simulation task into many smaller tasks and to execute these smaller tasks in parallel on a large number of CPUs, greatly decreasing the simulation time. This improved method helped the team simulate the two models mentioned above with high efficiency and better data quality.
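The following is a minimal sketch of this divide-and-parallelise idea, not the team's actual code: the real quantum Monte Carlo updates are replaced by a random-number placeholder, and all function names, parameters, and values are illustrative assumptions.

```python
# Hypothetical sketch of the parallelised increment strategy.
# Each "increment" is an independent small simulation whose output
# contributes one additive piece of the Renyi entanglement entropy.
from multiprocessing import Pool

import numpy as np

N_INCREMENTS = 64   # number of small, independent pieces (assumed)
N_SWEEPS = 10_000   # Monte Carlo sweeps per piece (assumed)

def run_increment(piece_index: int) -> float:
    """Simulate one slice of the nonequilibrium process and return its
    increment of the entanglement entropy. A placeholder random sample
    stands in for the real QMC work measurement."""
    rng = np.random.default_rng(piece_index)
    # Placeholder: a real code would update the QMC configuration while
    # ramping the replica coupling over this piece's sub-interval,
    # accumulating the nonequilibrium work W for each sweep.
    work_samples = rng.normal(loc=0.01, scale=0.001, size=N_SWEEPS)
    # Jarzynski-style estimator for this increment: -ln <exp(-W)>.
    return -np.log(np.mean(np.exp(-work_samples)))

if __name__ == "__main__":
    # The increments are independent, so they can run on separate CPUs
    # simultaneously; the total entropy is the sum of all increments.
    with Pool() as pool:
        increments = pool.map(run_increment, range(N_INCREMENTS))
    renyi_entropy = sum(increments)
    print(f"Estimated 2nd Renyi entropy: {renyi_entropy:.4f}")
```

Because each piece carries its own statistical error and the pieces are uncorrelated, splitting the simulation this way shortens wall-clock time roughly in proportion to the number of CPUs used.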
Story Source:
Materials provided by The University of Hong Kong. Note: Content may be edited for style and length.