Unsupervised Image-to-Image Translation With Self-Attention and Relativistic Discriminator Adversarial Networks

Publication Type:
Journal Article
Citation:
Zidonghua Xuebao/Acta Automatica Sinica, 2021, 47(9), pp. 2226-2237
Issue Date:
2021-09-01
Unsupervised image-to-image translation trained on unpaired data can accomplish a variety of translation tasks, such as object transfiguration, season transfer, and aerial-photo-to-map conversion. However, existing image-to-image translation methods based on generative adversarial networks (GANs) remain unsatisfactory: training is unstable, image regions unrelated to the target domain change excessively, and the output images are blurred in detail and low in realism. This paper proposes an unsupervised image-to-image translation method built on dual learning that combines self-attention with a relativistic discriminator. First, in the generator, a self-attention mechanism is designed to model both long- and short-range dependencies for image generation, and skip connections between low-level and high-level convolutional layers help reduce the loss of feature information in domain-irrelevant regions. Second, in the discriminator, spectral normalization is applied to prevent vanishing gradients caused by abrupt changes in discrimination ability, improving training stability. Finally, in the loss function, a self-reconstruction consistency term is added on top of cycle reconstruction so that the model focuses on target-domain changes, and a relativistic adversarial loss is designed to guide the zero-sum game between the generator and the discriminator. Experimental results on the Horse & Zebra, Summer & Winter, and Aerial Photo & Map datasets demonstrate that, compared with current image translation methods, the proposed method establishes a more faithful mapping between image domains and improves the quality of the translated images.
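
The abstract contains no formulas or code; as a rough, non-authoritative sketch of the components it names, the PyTorch snippet below shows a SAGAN-style self-attention block, a spectrally normalized discriminator convolution, and a relativistic average discriminator loss. The names (SelfAttention, relativistic_d_loss, sn_conv) and hyperparameters are illustrative assumptions, not taken from the paper's implementation.

```python
# A minimal sketch, assuming PyTorch; names and shapes are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttention(nn.Module):
    """SAGAN-style self-attention over spatial positions (generator side)."""

    def __init__(self, in_channels: int):
        super().__init__()
        self.query = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.key = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)  # (b, hw, c//8)
        k = self.key(x).view(b, -1, h * w)                      # (b, c//8, hw)
        attn = torch.softmax(torch.bmm(q, k), dim=-1)           # (b, hw, hw)
        v = self.value(x).view(b, -1, h * w)                    # (b, c, hw)
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x                             # residual output


# Spectral normalization wrapped around a discriminator convolution,
# the stabilization technique the abstract describes.
sn_conv = nn.utils.spectral_norm(
    nn.Conv2d(3, 64, kernel_size=4, stride=2, padding=1))


def relativistic_d_loss(real_logits: torch.Tensor,
                        fake_logits: torch.Tensor) -> torch.Tensor:
    """Relativistic average discriminator loss: real samples should score
    higher than the average fake, and fakes lower than the average real."""
    loss_real = F.binary_cross_entropy_with_logits(
        real_logits - fake_logits.mean(), torch.ones_like(real_logits))
    loss_fake = F.binary_cross_entropy_with_logits(
        fake_logits - real_logits.mean(), torch.zeros_like(fake_logits))
    return 0.5 * (loss_real + loss_fake)
```

In a full pipeline of this kind, the attention block would sit on intermediate generator feature maps, the generator's relativistic loss would mirror the discriminator's with the target labels swapped, and the cycle-reconstruction and self-reconstruction consistency terms described in the abstract would be added with their own weights.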