"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

January 22, 2023

GAN - Observations

Reading about GANs is the easy part; it is disappointing to get no usable output even after 2000 epochs.

CycleGAN - CycleGAN is a model that aims to solve the image-to-image translation problem without requiring paired training examples.
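As a rough sketch (not the architecture from the linked implementation), CycleGAN pairs two generators, one per translation direction, with two discriminators, one per domain. The layer sizes and builder functions below are assumptions for illustration only.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_generator():
    # Simplified encoder-decoder generator; real CycleGAN generators
    # use residual blocks or U-Net style skip connections.
    return tf.keras.Sequential([
        layers.Input(shape=(128, 128, 3)),
        layers.Conv2D(64, 4, strides=2, padding="same", activation="relu"),
        layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="tanh"),
    ])

def build_discriminator():
    # PatchGAN-style discriminator that scores image patches as real/fake.
    return tf.keras.Sequential([
        layers.Input(shape=(128, 128, 3)),
        layers.Conv2D(64, 4, strides=2, padding="same"),
        layers.LeakyReLU(0.2),
        layers.Conv2D(1, 4, padding="same"),
    ])

generator_g = build_generator()      # translates domain X -> Y
generator_f = build_generator()      # translates domain Y -> X
discriminator_x = build_discriminator()
discriminator_y = build_discriminator()
```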


After 2000 epochs :( :(. Knowing is 10%, experimenting is 50%, and mastering is 40%. Always experiment.

MnistGANLoss in CycleGAN



How is the loss calculated during training?

Adversarial Loss: We apply an adversarial loss to both generators. Each generator tries to generate images of its target domain, while the corresponding discriminator distinguishes between the translated samples and real samples.
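A minimal sketch of the adversarial loss, assuming an LSGAN-style formulation (mean squared error on the discriminator scores) as used in many Keras CycleGAN implementations; the names discriminator_y and fake_y are placeholders, not from the post.

```python
import tensorflow as tf

mse = tf.keras.losses.MeanSquaredError()

def generator_adversarial_loss(discriminator_y, fake_y):
    # The generator is rewarded when the discriminator scores its
    # translated samples as "real" (label 1).
    pred_fake = discriminator_y(fake_y, training=True)
    return mse(tf.ones_like(pred_fake), pred_fake)

def discriminator_adversarial_loss(discriminator_y, real_y, fake_y):
    # The discriminator pushes real samples toward 1 and translated
    # (fake) samples toward 0.
    pred_real = discriminator_y(real_y, training=True)
    pred_fake = discriminator_y(fake_y, training=True)
    return 0.5 * (mse(tf.ones_like(pred_real), pred_real) +
                  mse(tf.zeros_like(pred_fake), pred_fake))
```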

Cycle Consistency Loss: It captures the intuition that if we translate an image from one domain to the other and back again, we should arrive back where we started. Hence, it computes the L1 loss between the original image and the final reconstructed image.
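A minimal sketch of the cycle consistency term, assuming the common L1 formulation with a weighting factor (lambda_cycle = 10.0 is an assumed value, not taken from the post):

```python
import tensorflow as tf

def cycle_consistency_loss(real_x, reconstructed_x, lambda_cycle=10.0):
    # Mean absolute error (L1) between the input image and the image
    # recovered after a full X -> Y -> X round trip.
    return lambda_cycle * tf.reduce_mean(tf.abs(real_x - reconstructed_x))

# Usage sketch: fake_y = generator_g(real_x)
#               reconstructed_x = generator_f(fake_y)
#               loss = cycle_consistency_loss(real_x, reconstructed_x)
```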

CycleGAN Experiments - Implementing CycleGAN

Image-to-Image Translation using CycleGANs with a Keras implementation

Experimented with this code - Code Example

Keras-GAN, Conditional GAN

6 GAN Architectures

  • Transforming an image from one domain to another (CycleGAN)
  • Generating an image from a textual description (text-to-image)
  • Generating very high-resolution images (ProgressiveGAN), and many more

Loss Notes, pixelwise MSE loss
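For reference, a pixelwise MSE loss simply averages the squared per-pixel difference between the target and generated images; a tiny sketch, assuming float image tensors of the same shape:

```python
import tensorflow as tf

def pixelwise_mse(target, generated):
    # Squared error per pixel, averaged over all pixels and the batch.
    return tf.reduce_mean(tf.square(target - generated))
```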

Keep Exploring!!!
