Journal of Hebei University (Natural Science Edition), 2020, Vol. 40, Issue (3): 328-336. DOI: 10.3969/j.issn.1000-1565.2020.03.015


Bidirectional learned inference based on Wasserstein distance

HUA Qiang, LIU Yigong, ZHANG Feng, DONG Chunru   

  1. Key Laboratory of Machine Learning and Computational Intelligence of Hebei Province, Hebei University, Baoding 071002, China
  Received: 2019-07-02    Online: 2020-05-25    Published: 2020-05-25

Abstract: Embedding an encoder into Generative Adversarial Networks (GAN) can enhance the learning ability of the generative model. However, using the Kullback-Leibler (KL) divergence to measure the difference between two distributions in the optimization objective leads to vanishing or exploding gradients during training and reduces the robustness of the model. To tackle this problem, a Wasserstein-distance-based Bidirectional Learned Inference (WBLI) model is proposed in this paper. A bidirectional network is first established to learn the distributions of the real data and the latent variables, where the difference between the joint probability distributions is measured by the Wasserstein distance. Based on this distance, a tractable loss function is redesigned and an iterative training algorithm is proposed accordingly. Experimental results show that the WBLI model overcomes the defects of the traditional GAN and its variants: it effectively alleviates mode collapse in the generative model, improves the robustness of training, and raises the recognition rate of classifiers built on the learned representations.
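To make the bidirectional objective concrete: with encoder E: x -> z, generator G: z -> x, and a 1-Lipschitz critic C(x, z) on joint pairs, the Wasserstein distance between the two joint distributions is W(q, p) = sup_C E_{q(x,z)}[C(x, E(x))] - E_{p(x,z)}[C(G(z), z)]. The sketch below is a minimal illustration of this setup in the style of ALI/BiGAN with a WGAN critic; all module names, layer sizes, learning rates, and the weight-clipping constant are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a Wasserstein bidirectional (ALI/BiGAN-style) objective.
# Shapes, names, and hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn

DIM_X, DIM_Z = 784, 64  # assumed data / latent dimensions

E = nn.Sequential(nn.Linear(DIM_X, 256), nn.ReLU(), nn.Linear(256, DIM_Z))      # encoder x -> z
G = nn.Sequential(nn.Linear(DIM_Z, 256), nn.ReLU(), nn.Linear(256, DIM_X))      # generator z -> x
C = nn.Sequential(nn.Linear(DIM_X + DIM_Z, 256), nn.ReLU(), nn.Linear(256, 1))  # critic on joint (x, z)

opt_c = torch.optim.RMSprop(C.parameters(), lr=5e-5)
opt_eg = torch.optim.RMSprop(list(E.parameters()) + list(G.parameters()), lr=5e-5)

def critic_scores(x, z):
    return C(torch.cat([x, z], dim=1))

for step in range(1000):
    x = torch.randn(128, DIM_X)   # stand-in for a real data batch
    z = torch.randn(128, DIM_Z)   # sample from the latent prior

    # Critic step: maximize the Wasserstein gap between the joint
    # distributions of (x, E(x)) and (G(z), z).
    loss_c = critic_scores(G(z).detach(), z).mean() - critic_scores(x, E(x).detach()).mean()
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()
    for p in C.parameters():      # weight clipping keeps C roughly 1-Lipschitz
        p.data.clamp_(-0.01, 0.01)

    # Encoder/generator step: minimize the estimated Wasserstein distance.
    loss_eg = critic_scores(x, E(x)).mean() - critic_scores(G(z), z).mean()
    opt_eg.zero_grad(); loss_eg.backward(); opt_eg.step()
```

Because the critic scores joint pairs rather than samples alone, the encoder and generator are pushed toward mutually consistent mappings, which is the bidirectional inference the abstract describes; the Wasserstein objective replaces the KL term that causes the gradient pathologies noted above.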

Key words: generative adversarial networks, KL divergence, Wasserstein distance, variational auto-encoder
