Transfer learning for scalability of neural-network quantum states

Abstract: Neural-network quantum states have shown great potential for the study of many-body quantum systems. In statistical machine learning, transfer learning designates protocols that reuse features of a machine-learning model trained on one problem to solve a different but possibly related problem. We propose to evaluate the potential of transfer learning to improve the scalability of neural-network quantum states. We devise and present physics-inspired transfer learning protocols that reuse the features of neural-network quantum states learned in computing the ground state of a small system to initialize the computation for larger systems. We implement different protocols for restricted Boltzmann machines on general-purpose graphics processing units. This implementation alone yields a speedup over existing implementations on multi-core and distributed central processing units in comparable settings. We empirically and comparatively evaluate the efficiency (time) and effectiveness (accuracy) of different transfer learning protocols as the system size is scaled, across different models and different quantum phases. Specifically, we consider the transverse-field Ising and Heisenberg XXZ models in one dimension, and the latter also in two dimensions, with system sizes up to 128 and 8 × 8 spins, respectively. We empirically demonstrate that some of the transfer learning protocols we have devised can be far more effective and efficient than starting from neural-network quantum states with randomly initialized parameters.
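The abstract describes transferring the parameters of a restricted Boltzmann machine trained on a small system to initialize a larger one. The paper's actual protocols are not detailed on this record page, so the sketch below is only an illustration of one plausible scheme under stated assumptions: a "tiling" transfer that replicates the small-system weight matrix and biases block-diagonally to build an initial guess for a system k times larger. The function name `tile_rbm_params` and the block-diagonal layout are hypothetical, not taken from the paper.

```python
import numpy as np

def tile_rbm_params(W, a, b, k=2):
    """Hypothetical 'tiling' transfer protocol (illustration only).

    W : (n_visible, n_hidden) weight matrix learned on the small system
    a : (n_visible,) visible biases
    b : (n_hidden,) hidden biases
    k : how many copies of the small system make up the large one

    Returns block-diagonally replicated parameters for a system with
    k * n_visible spins, to be used as the starting point of a new
    variational optimization instead of a random initialization.
    """
    n, m = W.shape
    W_big = np.zeros((k * n, k * m))
    for i in range(k):
        # Copy the learned small-system block onto the diagonal;
        # cross-block couplings start at zero and are learned later.
        W_big[i * n:(i + 1) * n, i * m:(i + 1) * m] = W
    a_big = np.tile(a, k)  # repeat visible biases per block
    b_big = np.tile(b, k)  # repeat hidden biases per block
    return W_big, a_big, b_big
```

In such a scheme the subsequent optimization on the large system only has to learn the inter-block correlations, which is the intuition behind the speedups the abstract reports over random initialization.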
Document type: Preprints, Working Papers, ...
Contributor: Christian Miniatura
Submitted on: Sunday, November 10, 2019 - 7:55:26 AM
Last modification on: Wednesday, November 13, 2019 - 1:09:09 AM



  • HAL Id : hal-02357370, version 1
  • arXiv: 1908.09883



Remmy Zen, Long My, Ryan Tan, Frédéric Hébert, Mario Gattobigio, et al. Transfer learning for scalability of neural-network quantum states. 2019. ⟨hal-02357370⟩
