
Perceptual image quality using Dual generative adversarial network

journal contribution
posted on 27.06.2019, 15:18 by Masoumeh Zareapoor, Huiyu Zhou, Jie Yang
Generative adversarial networks have achieved remarkable success in many computer vision applications thanks to their ability to learn complex data distributions. In particular, they can generate realistic images from a latent space with a simple and intuitive structure. The main focus of existing models has been improving performance; little attention has been paid to making the models robust. In this paper, we investigate solutions to super-resolution problems, in particular perceptual quality, by proposing a robust GAN. Unlike the standard GAN, the proposed model employs two generators and two discriminators: one discriminator determines whether samples come from the real data or a generator, while the other acts as a classifier that returns wrongly generated samples to their corresponding generators. The generators learn a mixture of many distributions, from the prior to the complex data distribution. The model is trained with a feature-matching loss, which allows wrong samples to be returned to their corresponding generators so that realistic-looking samples can be regenerated. Experimental results on various datasets show the superiority of the proposed model over state-of-the-art methods.
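The abstract names two mechanisms: a feature-matching loss between real and generated batches, and a second discriminator that routes rejected samples back to the generator that produced them. The following is a minimal, framework-free sketch of both ideas; the function names (`feature_matching_loss`, `route_rejected`) and the simplified batch representation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def feature_matching_loss(real_feats, fake_feats):
    """Squared L2 distance between the mean discriminator features of
    a real batch and a generated batch (the usual feature-matching form).

    Note: these are hypothetical discriminator features, stacked as
    (batch, feature_dim) arrays, not the paper's exact formulation."""
    return float(np.sum((real_feats.mean(axis=0) - fake_feats.mean(axis=0)) ** 2))

def route_rejected(samples, gen_ids, rejected):
    """Group samples flagged as 'wrong' by the classifier-discriminator,
    keyed by the generator that produced them, so each generator can be
    updated on (and regenerate) its own failures."""
    buckets = {}
    for sample, gid, bad in zip(samples, gen_ids, rejected):
        if bad:
            buckets.setdefault(gid, []).append(sample)
    return buckets

# Usage sketch: two generators (ids 0 and 1), four generated samples,
# of which the classifier-discriminator rejects three.
samples = ["s0", "s1", "s2", "s3"]
gen_ids = [0, 1, 0, 1]
rejected = [True, False, True, True]
print(route_rejected(samples, gen_ids, rejected))
# → {0: ['s0', 's2'], 1: ['s3']}
```

Identical feature statistics give a loss of zero, so minimizing this term pushes each generator's batch statistics toward those of real data.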

Funding

This research is partly supported by NSFC, China (U1803261, 61876107, 61572315); 973 Plan, China (2015CB856004). H. Zhou was supported by UK EPSRC under Grant EP/N011074/1, Royal Society-Newton Advanced Fellowship under Grant NA160342 and European Union’s Horizon 2020 research and innovation program under the Marie-Sklodowska-Curie grant agreement No 720325.

History

Citation

Neural Computing and Applications, 2019

Author affiliation

College of Science and Engineering, Department of Informatics

Version

AM (Accepted Manuscript)

Published in

Neural Computing and Applications

Publisher

Springer (part of Springer Nature)

eISSN

1433-3058

Acceptance date

09/05/2019

Copyright date

2019

Publisher version

https://link.springer.com/article/10.1007/s00521-019-04239-0

Notes

The file associated with this record is under embargo until 12 months after publication, in accordance with the publisher's self-archiving policy. The full text may be available through the publisher links provided above.

Language

en