Face Generation

In this project, you'll use generative adversarial networks to generate new images of faces.

Get the Data

You'll be using two datasets in this project:

  • MNIST
  • CelebA

Since the CelebA dataset is complex and this is your first GAN project, we want you to test your neural network on MNIST before moving on to CelebA. Running the GAN on MNIST will let you see how well your model trains sooner.

If you're using FloydHub, set data_dir to "/input" and use the FloydHub data ID "R5KrjnANiKVhLWAkpXhNBe".

In [1]:
#data_dir = './data'

# FloydHub - Use with data ID "R5KrjnANiKVhLWAkpXhNBe"
data_dir = '/input'


"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper

helper.download_extract('mnist', data_dir)
helper.download_extract('celeba', data_dir)
Found mnist Data
Found celeba Data

Explore the Data

MNIST

As you're aware, the MNIST dataset contains images of handwritten digits. You can view the first few examples by changing show_n_images.

In [2]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
import os
from glob import glob
from matplotlib import pyplot

mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'mnist/*.jpg'))[:show_n_images], 28, 28, 'L')
pyplot.imshow(helper.images_square_grid(mnist_images, 'L'), cmap='gray')
Out[2]:
<matplotlib.image.AxesImage at 0x7f045ffc30b8>

CelebA

The CelebFaces Attributes (CelebA) dataset contains over 200,000 celebrity images with annotations. Since you're going to be generating faces, you won't need the annotations. You can view the first few examples by changing show_n_images.

In [3]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
celeba_images = helper.get_batch(glob(os.path.join(data_dir, 'img_align_celeba/*.jpg'))[:show_n_images], 28, 28, 'RGB')
pyplot.imshow(helper.images_square_grid(celeba_images, 'RGB'))
Out[3]:
<matplotlib.image.AxesImage at 0x7f045ff7fcc0>

Preprocess the Data

Since the project's main focus is building the GAN, we'll preprocess the data for you. Each MNIST and CelebA image is 28x28, with pixel values in the range -0.5 to 0.5. The CelebA images are cropped to remove the parts of each image that don't include a face, then resized down to 28x28.

The MNIST images are grayscale with a single color channel, while the CelebA images have three color channels (RGB).
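The scaling matters later: the generator's tanh output lies in [-1, 1], so the real images (in [-0.5, 0.5]) are doubled before being fed to the discriminator during training. A minimal sketch of that rescaling (the function name rescale_for_tanh is hypothetical, not part of helper):

```python
import numpy as np

def rescale_for_tanh(batch):
    """Map a batch of images from [-0.5, 0.5] to [-1, 1]."""
    return batch * 2.0

# Pixel extremes map to the endpoints of the tanh range
batch = np.array([-0.5, 0.0, 0.5])
print(rescale_for_tanh(batch))  # [-1.  0.  1.]
```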

Build the Neural Network

You'll build the components of the GAN by implementing the following functions:

  • model_inputs
  • discriminator
  • generator
  • model_loss
  • model_opt
  • train

Check the Version of TensorFlow and Access to GPU

This cell checks that you have the correct version of TensorFlow and access to a GPU.

In [4]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer.  You are using {}'.format(tf.__version__)
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
TensorFlow Version: 1.0.0
Default GPU Device: /gpu:0

Input

Implement the model_inputs function to create TF Placeholders for the Neural Network. It should create the following placeholders:

  • Real input images placeholder with rank 4 using image_width, image_height, and image_channels.
  • Z input placeholder with rank 2 using z_dim.
  • Learning rate placeholder with rank 0.

Return the placeholders in the following tuple: (tensor of real input images, tensor of z data, learning rate).

In [5]:
import problem_unittests as tests

def model_inputs(image_width, image_height, image_channels, z_dim):
    """
    Create the model inputs
    :param image_width: The input image width
    :param image_height: The input image height
    :param image_channels: The number of image channels
    :param z_dim: The dimension of Z
    :return: Tuple of (tensor of real input images, tensor of z data, learning rate)
    """
    # TODO: Implement Function
    real_inputs = tf.placeholder(tf.float32, (None, image_width, image_height, image_channels), "input_real")
    z_inputs = tf.placeholder(tf.float32, (None, z_dim), "input_z")
    learning_rate = tf.placeholder(tf.float32, name="lr")

    return real_inputs, z_inputs, learning_rate



"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_inputs(model_inputs)
Tests Passed

Discriminator

Implement discriminator to create a discriminator neural network that discriminates on images. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "discriminator" to allow the variables to be reused. The function should return a tuple of (tensor output of the discriminator, tensor logits of the discriminator).

In [6]:
def discriminator(images, reuse=False):
    """
    Create the discriminator network
    :param images: Tensor of input image(s)
    :param reuse: Boolean if the weights should be reused
    :return: Tuple of (tensor output of the discriminator, tensor logits of the discriminator)
    """
    # TODO: Implement Function
    alpha = 0.1
    keep_prob = 0.9

    with tf.variable_scope('discriminator', reuse=reuse):
        x1 = tf.layers.conv2d(images, 64, 5, strides=2, padding='same', activation=None)
        x1 = tf.maximum(alpha * x1, x1)
        
        x2 = tf.layers.conv2d(x1, 128, 5, strides=2, padding='same', activation=None)
        x2 = tf.layers.batch_normalization(x2, training=True)
        x2 = tf.maximum(alpha * x2, x2)
        x2 = tf.nn.dropout(x2, keep_prob=keep_prob)
        
        x3 = tf.layers.conv2d(x2, 256, 5, strides=2, padding='same', activation=None)
        x3 = tf.layers.batch_normalization(x3, training=True)
        x3 = tf.maximum(alpha * x3, x3)
        x3 = tf.nn.dropout(x3, keep_prob=keep_prob)
        
        # 28x28 input is downsampled to 4x4x256 after three stride-2 convolutions
        flat = tf.reshape(x3, (-1, 4 * 4 * 256))
        logits = tf.layers.dense(flat, 1)
        out = tf.sigmoid(logits)
        
        return out, logits



"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_discriminator(discriminator, tf)
Tests Passed

Generator

Implement generator to generate an image using z. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "generator" to allow the variables to be reused. The function should return the generated 28 x 28 x out_channel_dim images.

In [7]:
def generator(z, out_channel_dim, is_train=True):
    """
    Create the generator network
    :param z: Input z
    :param out_channel_dim: The number of channels in the output image
    :param is_train: Boolean if generator is being used for training
    :return: The tensor output of the generator
    """
    # TODO: Implement Function
    alpha = 0.1
    keep_prob = 0.9
    
    with tf.variable_scope('generator', reuse=not is_train):
        g1 = tf.layers.dense(z, 7*7*512)
        g1 = tf.reshape(g1, (-1, 7, 7, 512))
        g1 = tf.layers.batch_normalization(g1, training=is_train)
        g1 = tf.maximum(alpha * g1, g1)
        # shape = 7x7x512
        
        g2 = tf.layers.conv2d_transpose(g1, 256, 5, strides=2, padding='same', activation=None)
        g2 = tf.layers.batch_normalization(g2, training=is_train)
        g2 = tf.maximum(alpha * g2, g2)
        g2 = tf.nn.dropout(g2, keep_prob=keep_prob)
        # shape = 14x14x256
        
        g3 = tf.layers.conv2d_transpose(g2, 128, 5, strides=2, padding='same', activation=None)
        g3 = tf.layers.batch_normalization(g3, training=is_train)
        g3 = tf.maximum(alpha * g3, g3)   
        g3 = tf.nn.dropout(g3, keep_prob=keep_prob)
        # shape = 28x28x128
        
        logits = tf.layers.conv2d_transpose(g3, out_channel_dim, 3, strides=1, padding='same', activation=None)
        # shape = 28x28xout_channel_dim
        
        out = tf.tanh(logits)
        
        return out


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_generator(generator, tf)
Tests Passed

Loss

Implement model_loss to build the GAN for training and calculate the loss. The function should return a tuple of (discriminator loss, generator loss). Use the following functions you implemented:

  • discriminator(images, reuse=False)
  • generator(z, out_channel_dim, is_train=True)
In [8]:
def model_loss(input_real, input_z, out_channel_dim):
    """
    Get the loss for the discriminator and generator
    :param input_real: Images from the real dataset
    :param input_z: Z input
    :param out_channel_dim: The number of channels in the output image
    :return: A tuple of (discriminator loss, generator loss)
    """
    # TODO: Implement Function
    smooth = 0.1
    g_output = generator(input_z, out_channel_dim)
    disc_output_real, disc_logits_real = discriminator(input_real)
    disc_output_fake, disc_logits_fake = discriminator(g_output, reuse=True)
    
    disc_loss_real = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits = disc_logits_real,
            labels = tf.ones_like(disc_output_real) * (1 - smooth)
        )
    )
    disc_loss_fake = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits = disc_logits_fake,
            labels = tf.zeros_like(disc_output_fake)
        )
    )
    gen_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits = disc_logits_fake,
            labels = tf.ones_like(disc_output_fake)
        )
    )
    
    disc_loss = disc_loss_real + disc_loss_fake
    return disc_loss, gen_loss


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_loss(model_loss)
Tests Passed

Optimization

Implement model_opt to create the optimization operations for the GAN. Use tf.trainable_variables to get all the trainable variables, then filter them by the discriminator and generator scope names. The function should return a tuple of (discriminator training operation, generator training operation).

In [9]:
def model_opt(d_loss, g_loss, learning_rate, beta1):
    """
    Get optimization operations
    :param d_loss: Discriminator loss Tensor
    :param g_loss: Generator loss Tensor
    :param learning_rate: Learning Rate Placeholder
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :return: A tuple of (discriminator training operation, generator training operation)
    """
    # TODO: Implement Function
    train_vars = tf.trainable_variables()
    disc_vars = [var for var in train_vars if var.name.startswith('discriminator')]
    gen_vars = [var for var in train_vars if var.name.startswith('generator')]
    
    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)

    with tf.control_dependencies(update_ops):
        gen_train_opt = tf.train.AdamOptimizer(learning_rate=learning_rate, beta1=beta1).minimize(g_loss, var_list=gen_vars)
        disc_train_opt = tf.train.AdamOptimizer(learning_rate=learning_rate, beta1=beta1).minimize(d_loss, var_list=disc_vars)

        return disc_train_opt, gen_train_opt


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_opt(model_opt, tf)
Tests Passed

Neural Network Training

Show Output

Use this function to show the current output of the generator during training. It will help you determine how well the GAN is training.

In [10]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

def show_generator_output(sess, n_images, input_z, out_channel_dim, image_mode):
    """
    Show example output for the generator
    :param sess: TensorFlow session
    :param n_images: Number of Images to display
    :param input_z: Input Z Tensor
    :param out_channel_dim: The number of channels in the output image
    :param image_mode: The mode to use for images ("RGB" or "L")
    """
    cmap = None if image_mode == 'RGB' else 'gray'
    z_dim = input_z.get_shape().as_list()[-1]
    example_z = np.random.uniform(-1, 1, size=[n_images, z_dim])

    samples = sess.run(
        generator(input_z, out_channel_dim, False),
        feed_dict={input_z: example_z})

    images_grid = helper.images_square_grid(samples, image_mode)
    pyplot.imshow(images_grid, cmap=cmap)
    pyplot.show()

Train

Implement train to build and train the GANs. Use the following functions you implemented:

  • model_inputs(image_width, image_height, image_channels, z_dim)
  • model_loss(input_real, input_z, out_channel_dim)
  • model_opt(d_loss, g_loss, learning_rate, beta1)

Use show_generator_output to show generator output while you train. Running show_generator_output for every batch will drastically increase training time and the size of the notebook. It's recommended to print the generator output every 100 batches.

In [11]:
def train(epoch_count, batch_size, z_dim, learning_rate, beta1, get_batches, data_shape, data_image_mode):
    """
    Train the GAN
    :param epoch_count: Number of epochs
    :param batch_size: Batch Size
    :param z_dim: Z dimension
    :param learning_rate: Learning Rate
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :param get_batches: Function to get batches
    :param data_shape: Shape of the data
    :param data_image_mode: The image mode to use for images ("RGB" or "L")
    """
    # TODO: Build Model
    _, image_width, image_height, image_channels = data_shape
    input_real, input_z, lr = model_inputs(image_width, image_height, image_channels, z_dim)
    disc_loss, gen_loss = model_loss(input_real, input_z, image_channels)
    disc_opt, gen_opt = model_opt(disc_loss, gen_loss, lr, beta1)

    
    saver = tf.train.Saver()
    losses = []
    steps = 0

    
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch_i in range(epoch_count):
            for batch_images in get_batches(batch_size):
                # TODO: Train Model
                steps += 1
                # Rescale real images from [-0.5, 0.5] to [-1, 1] to match the generator's tanh output
                batch_images *= 2
                batch_z = np.random.uniform(-1, 1, size=(batch_size, z_dim))
                
                _ = sess.run(
                    disc_opt, 
                    feed_dict={
                        input_real: batch_images, input_z: batch_z, lr: learning_rate 
                    }
                )
                _ = sess.run(
                    gen_opt, 
                    feed_dict={
                        input_real: batch_images, 
                        input_z: batch_z, 
                        lr:learning_rate
                    }
                )
                if steps == 1:
                    print('initial output:')
                    show_generator_output(sess, 16, input_z, image_channels, data_image_mode)
                    
                if steps % 10 == 0:
                    # Every 10 batches, get the losses and print them out
                    train_loss_d = disc_loss.eval({input_z: batch_z, input_real: batch_images})
                    train_loss_g = gen_loss.eval({input_z: batch_z})

                    print("Epoch {}/{}...".format(epoch_i+1, epoch_count),
                          "Generator Loss: {:.4f}".format(train_loss_g),
                          "Discriminator Loss: {:.4f}...".format(train_loss_d))
                    # Save losses to view after training
                    losses.append((train_loss_d, train_loss_g))

                if steps % 100 == 0:
                    show_generator_output(sess, 16, input_z, image_channels, data_image_mode)
                    

        saver.save(sess, './generator.ckpt')
                
    return losses
                
                

MNIST

Test your GAN architecture on MNIST. After 2 epochs, the GAN should be able to generate images that look like handwritten digits. Make sure the generator's loss is lower than the discriminator's loss or close to 0.

In [12]:
batch_size = 128
z_dim = 100
learning_rate = 0.0002
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 2

mnist_dataset = helper.Dataset('mnist', glob(os.path.join(data_dir, 'mnist/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, mnist_dataset.get_batches,
          mnist_dataset.shape, mnist_dataset.image_mode)
initial output:
Epoch 1/2... Generator Loss: 2.0398 Discriminator Loss: 0.5408...
Epoch 1/2... Generator Loss: 3.7924 Discriminator Loss: 0.3954...
Epoch 1/2... Generator Loss: 3.8809 Discriminator Loss: 0.3809...
Epoch 1/2... Generator Loss: 2.0178 Discriminator Loss: 1.1645...
Epoch 1/2... Generator Loss: 3.7360 Discriminator Loss: 0.4077...
Epoch 1/2... Generator Loss: 4.4128 Discriminator Loss: 0.3857...
Epoch 1/2... Generator Loss: 0.1067 Discriminator Loss: 3.4446...
Epoch 1/2... Generator Loss: 1.7442 Discriminator Loss: 1.8674...
Epoch 1/2... Generator Loss: 1.1248 Discriminator Loss: 0.9605...
Epoch 1/2... Generator Loss: 0.8258 Discriminator Loss: 1.3821...
Epoch 1/2... Generator Loss: 1.0475 Discriminator Loss: 1.1753...
Epoch 1/2... Generator Loss: 0.5087 Discriminator Loss: 1.6412...
Epoch 1/2... Generator Loss: 0.6678 Discriminator Loss: 1.2840...
Epoch 1/2... Generator Loss: 0.7642 Discriminator Loss: 1.2314...
Epoch 1/2... Generator Loss: 1.1231 Discriminator Loss: 1.1970...
Epoch 1/2... Generator Loss: 0.7543 Discriminator Loss: 1.3923...
Epoch 1/2... Generator Loss: 1.2204 Discriminator Loss: 1.1017...
Epoch 1/2... Generator Loss: 0.7797 Discriminator Loss: 1.1886...
Epoch 1/2... Generator Loss: 1.0678 Discriminator Loss: 1.1098...
Epoch 1/2... Generator Loss: 0.7354 Discriminator Loss: 1.3548...
Epoch 1/2... Generator Loss: 0.6137 Discriminator Loss: 1.3456...
Epoch 1/2... Generator Loss: 0.9832 Discriminator Loss: 1.0328...
Epoch 1/2... Generator Loss: 1.4482 Discriminator Loss: 0.9992...
Epoch 1/2... Generator Loss: 1.5615 Discriminator Loss: 1.0679...
Epoch 1/2... Generator Loss: 0.9128 Discriminator Loss: 1.0422...
Epoch 1/2... Generator Loss: 1.2510 Discriminator Loss: 0.9663...
Epoch 1/2... Generator Loss: 1.9572 Discriminator Loss: 1.1960...
Epoch 1/2... Generator Loss: 1.6584 Discriminator Loss: 0.8312...
Epoch 1/2... Generator Loss: 0.8652 Discriminator Loss: 0.9842...
Epoch 1/2... Generator Loss: 0.7334 Discriminator Loss: 1.1871...
Epoch 1/2... Generator Loss: 1.0391 Discriminator Loss: 1.0062...
Epoch 1/2... Generator Loss: 1.1700 Discriminator Loss: 0.8671...
Epoch 1/2... Generator Loss: 1.9766 Discriminator Loss: 1.0297...
Epoch 1/2... Generator Loss: 1.1478 Discriminator Loss: 0.9955...
Epoch 1/2... Generator Loss: 1.4359 Discriminator Loss: 0.8176...
Epoch 1/2... Generator Loss: 0.7970 Discriminator Loss: 1.1479...
Epoch 1/2... Generator Loss: 2.0901 Discriminator Loss: 1.1230...
Epoch 1/2... Generator Loss: 1.4292 Discriminator Loss: 1.0880...
Epoch 1/2... Generator Loss: 1.2128 Discriminator Loss: 0.9615...
Epoch 1/2... Generator Loss: 0.6310 Discriminator Loss: 1.2347...
Epoch 1/2... Generator Loss: 0.5343 Discriminator Loss: 1.5652...
Epoch 1/2... Generator Loss: 0.4754 Discriminator Loss: 1.5536...
Epoch 1/2... Generator Loss: 2.1569 Discriminator Loss: 1.5572...
Epoch 1/2... Generator Loss: 0.7945 Discriminator Loss: 1.1603...
Epoch 1/2... Generator Loss: 1.5442 Discriminator Loss: 1.3887...
Epoch 1/2... Generator Loss: 1.2444 Discriminator Loss: 1.2241...
Epoch 1/2... Generator Loss: 1.0032 Discriminator Loss: 1.2977...
Epoch 1/2... Generator Loss: 1.4105 Discriminator Loss: 1.0874...
Epoch 1/2... Generator Loss: 2.7575 Discriminator Loss: 1.7096...
Epoch 1/2... Generator Loss: 0.9204 Discriminator Loss: 1.1305...
Epoch 1/2... Generator Loss: 1.6362 Discriminator Loss: 1.3259...
Epoch 1/2... Generator Loss: 1.1579 Discriminator Loss: 1.0592...
Epoch 1/2... Generator Loss: 0.5111 Discriminator Loss: 1.4194...
Epoch 1/2... Generator Loss: 0.9444 Discriminator Loss: 1.0174...
Epoch 1/2... Generator Loss: 0.8396 Discriminator Loss: 1.2238...
Epoch 1/2... Generator Loss: 2.2396 Discriminator Loss: 1.2221...
Epoch 1/2... Generator Loss: 1.1359 Discriminator Loss: 1.1002...
Epoch 1/2... Generator Loss: 0.7148 Discriminator Loss: 1.3051...
Epoch 1/2... Generator Loss: 0.8847 Discriminator Loss: 1.0522...
Epoch 1/2... Generator Loss: 0.8032 Discriminator Loss: 1.0687...
Epoch 1/2... Generator Loss: 1.1387 Discriminator Loss: 1.1521...
Epoch 1/2... Generator Loss: 0.6984 Discriminator Loss: 1.2514...
Epoch 1/2... Generator Loss: 0.9395 Discriminator Loss: 1.0727...
Epoch 1/2... Generator Loss: 1.8273 Discriminator Loss: 1.0763...
Epoch 1/2... Generator Loss: 0.5754 Discriminator Loss: 1.4550...
Epoch 1/2... Generator Loss: 0.8305 Discriminator Loss: 1.2516...
Epoch 1/2... Generator Loss: 0.8365 Discriminator Loss: 1.2724...
Epoch 1/2... Generator Loss: 1.0344 Discriminator Loss: 0.9259...
Epoch 1/2... Generator Loss: 0.5766 Discriminator Loss: 1.3515...
Epoch 1/2... Generator Loss: 1.2805 Discriminator Loss: 1.0215...
Epoch 1/2... Generator Loss: 1.5033 Discriminator Loss: 1.0761...
Epoch 1/2... Generator Loss: 0.4663 Discriminator Loss: 1.6200...
Epoch 1/2... Generator Loss: 1.7536 Discriminator Loss: 1.0799...
Epoch 1/2... Generator Loss: 1.3100 Discriminator Loss: 1.0728...
Epoch 1/2... Generator Loss: 1.3725 Discriminator Loss: 1.0403...
Epoch 1/2... Generator Loss: 0.7248 Discriminator Loss: 1.4349...
Epoch 1/2... Generator Loss: 1.4275 Discriminator Loss: 1.2448...
Epoch 1/2... Generator Loss: 0.7440 Discriminator Loss: 1.2478...
Epoch 1/2... Generator Loss: 1.1864 Discriminator Loss: 1.2013...
Epoch 1/2... Generator Loss: 1.1590 Discriminator Loss: 1.0338...
Epoch 1/2... Generator Loss: 1.0371 Discriminator Loss: 0.9849...
Epoch 1/2... Generator Loss: 1.0527 Discriminator Loss: 0.9124...
Epoch 1/2... Generator Loss: 0.7506 Discriminator Loss: 1.2524...
Epoch 1/2... Generator Loss: 1.9116 Discriminator Loss: 1.0718...
Epoch 1/2... Generator Loss: 1.2074 Discriminator Loss: 1.0441...
Epoch 1/2... Generator Loss: 0.6330 Discriminator Loss: 1.4394...
Epoch 1/2... Generator Loss: 1.3971 Discriminator Loss: 0.9686...
Epoch 1/2... Generator Loss: 1.1367 Discriminator Loss: 1.0621...
Epoch 1/2... Generator Loss: 1.2183 Discriminator Loss: 1.1655...
Epoch 1/2... Generator Loss: 1.0775 Discriminator Loss: 1.0555...
Epoch 1/2... Generator Loss: 0.4632 Discriminator Loss: 1.5235...
Epoch 1/2... Generator Loss: 1.2474 Discriminator Loss: 0.8954...
Epoch 1/2... Generator Loss: 1.5099 Discriminator Loss: 0.9936...
Epoch 1/2... Generator Loss: 1.3557 Discriminator Loss: 1.0553...
Epoch 1/2... Generator Loss: 0.7502 Discriminator Loss: 1.0536...
Epoch 1/2... Generator Loss: 1.4562 Discriminator Loss: 1.1178...
Epoch 1/2... Generator Loss: 1.7402 Discriminator Loss: 1.3840...
Epoch 1/2... Generator Loss: 0.7412 Discriminator Loss: 1.2250...
Epoch 1/2... Generator Loss: 0.9091 Discriminator Loss: 1.2076...
Epoch 1/2... Generator Loss: 0.8683 Discriminator Loss: 1.1748...
Epoch 1/2... Generator Loss: 1.4336 Discriminator Loss: 1.1080...
Epoch 1/2... Generator Loss: 1.3027 Discriminator Loss: 0.9744...
Epoch 1/2... Generator Loss: 0.9123 Discriminator Loss: 1.3128...
Epoch 1/2... Generator Loss: 0.8826 Discriminator Loss: 1.1067...
Epoch 1/2... Generator Loss: 0.8893 Discriminator Loss: 1.2461...
Epoch 1/2... Generator Loss: 1.0646 Discriminator Loss: 1.1117...
Epoch 1/2... Generator Loss: 0.9265 Discriminator Loss: 1.1730...
Epoch 1/2... Generator Loss: 1.3300 Discriminator Loss: 1.2033...
Epoch 1/2... Generator Loss: 1.3859 Discriminator Loss: 1.1336...
Epoch 1/2... Generator Loss: 0.4917 Discriminator Loss: 1.5543...
Epoch 1/2... Generator Loss: 1.1295 Discriminator Loss: 1.0996...
Epoch 1/2... Generator Loss: 0.6756 Discriminator Loss: 1.2644...
Epoch 1/2... Generator Loss: 0.6865 Discriminator Loss: 1.2089...
Epoch 1/2... Generator Loss: 0.6665 Discriminator Loss: 1.1758...
Epoch 1/2... Generator Loss: 1.4966 Discriminator Loss: 1.0574...
Epoch 1/2... Generator Loss: 0.7805 Discriminator Loss: 1.1891...
Epoch 1/2... Generator Loss: 1.3008 Discriminator Loss: 0.8711...
Epoch 1/2... Generator Loss: 1.2144 Discriminator Loss: 1.0155...
Epoch 1/2... Generator Loss: 0.7388 Discriminator Loss: 1.0963...
Epoch 1/2... Generator Loss: 1.1069 Discriminator Loss: 1.0571...
Epoch 1/2... Generator Loss: 1.1242 Discriminator Loss: 1.0829...
Epoch 1/2... Generator Loss: 0.9486 Discriminator Loss: 1.0402...
Epoch 1/2... Generator Loss: 0.9475 Discriminator Loss: 1.1467...
Epoch 1/2... Generator Loss: 1.2283 Discriminator Loss: 0.9430...
Epoch 1/2... Generator Loss: 0.9999 Discriminator Loss: 0.9724...
Epoch 1/2... Generator Loss: 0.8599 Discriminator Loss: 1.0500...
Epoch 1/2... Generator Loss: 1.0655 Discriminator Loss: 1.0385...
Epoch 1/2... Generator Loss: 1.1696 Discriminator Loss: 0.9385...
Epoch 1/2... Generator Loss: 0.6968 Discriminator Loss: 1.2579...
Epoch 1/2... Generator Loss: 1.3063 Discriminator Loss: 0.9469...
Epoch 1/2... Generator Loss: 1.2141 Discriminator Loss: 1.0112...
Epoch 1/2... Generator Loss: 1.5699 Discriminator Loss: 1.0053...
Epoch 1/2... Generator Loss: 0.7633 Discriminator Loss: 1.2353...
Epoch 1/2... Generator Loss: 0.6936 Discriminator Loss: 1.3087...
Epoch 1/2... Generator Loss: 0.7960 Discriminator Loss: 1.1271...
Epoch 1/2... Generator Loss: 0.8197 Discriminator Loss: 1.1098...
Epoch 1/2... Generator Loss: 1.3790 Discriminator Loss: 1.2369...
Epoch 1/2... Generator Loss: 1.8329 Discriminator Loss: 1.1630...
Epoch 1/2... Generator Loss: 0.7788 Discriminator Loss: 1.1702...
Epoch 1/2... Generator Loss: 0.6878 Discriminator Loss: 1.1611...
Epoch 1/2... Generator Loss: 1.5138 Discriminator Loss: 1.0692...
Epoch 1/2... Generator Loss: 1.0700 Discriminator Loss: 1.0208...
Epoch 1/2... Generator Loss: 1.2979 Discriminator Loss: 0.8545...
Epoch 1/2... Generator Loss: 0.6755 Discriminator Loss: 1.2367...
Epoch 1/2... Generator Loss: 0.7471 Discriminator Loss: 1.1699...
Epoch 1/2... Generator Loss: 1.3795 Discriminator Loss: 1.0773...
Epoch 1/2... Generator Loss: 0.9524 Discriminator Loss: 1.0650...
Epoch 1/2... Generator Loss: 0.9524 Discriminator Loss: 0.9772...
Epoch 1/2... Generator Loss: 0.4952 Discriminator Loss: 1.3890...
Epoch 1/2... Generator Loss: 1.2926 Discriminator Loss: 1.0665...
Epoch 1/2... Generator Loss: 1.3763 Discriminator Loss: 0.9860...
Epoch 1/2... Generator Loss: 1.0986 Discriminator Loss: 0.9608...
Epoch 1/2... Generator Loss: 0.8245 Discriminator Loss: 1.1314...
Epoch 1/2... Generator Loss: 1.0524 Discriminator Loss: 1.0401...
Epoch 1/2... Generator Loss: 0.7652 Discriminator Loss: 1.1262...
Epoch 1/2... Generator Loss: 1.5099 Discriminator Loss: 0.7982...
Epoch 1/2... Generator Loss: 1.0490 Discriminator Loss: 1.0331...
Epoch 1/2... Generator Loss: 0.6099 Discriminator Loss: 1.3294...
Epoch 1/2... Generator Loss: 1.3814 Discriminator Loss: 0.9809...
Epoch 1/2... Generator Loss: 1.0924 Discriminator Loss: 1.0372...
Epoch 1/2... Generator Loss: 0.8759 Discriminator Loss: 1.0857...
Epoch 1/2... Generator Loss: 1.2765 Discriminator Loss: 0.8751...
Epoch 1/2... Generator Loss: 1.7403 Discriminator Loss: 0.8682...
Epoch 1/2... Generator Loss: 1.6782 Discriminator Loss: 0.8541...
Epoch 1/2... Generator Loss: 0.7071 Discriminator Loss: 1.1890...
Epoch 1/2... Generator Loss: 1.2455 Discriminator Loss: 0.9306...
Epoch 1/2... Generator Loss: 0.8732 Discriminator Loss: 1.0911...
Epoch 1/2... Generator Loss: 0.6241 Discriminator Loss: 1.2360...
Epoch 1/2... Generator Loss: 0.8828 Discriminator Loss: 1.2432...
Epoch 1/2... Generator Loss: 0.9831 Discriminator Loss: 1.1229...
Epoch 1/2... Generator Loss: 0.7863 Discriminator Loss: 1.1890...
Epoch 1/2... Generator Loss: 0.7805 Discriminator Loss: 1.1402...
Epoch 1/2... Generator Loss: 1.6056 Discriminator Loss: 0.8154...
Epoch 1/2... Generator Loss: 0.7420 Discriminator Loss: 1.1592...
Epoch 1/2... Generator Loss: 1.0283 Discriminator Loss: 0.9623...
Epoch 1/2... Generator Loss: 1.3784 Discriminator Loss: 1.0556...
Epoch 1/2... Generator Loss: 1.7704 Discriminator Loss: 0.8210...
Epoch 1/2... Generator Loss: 1.3207 Discriminator Loss: 0.9028...
Epoch 1/2... Generator Loss: 1.0300 Discriminator Loss: 1.1677...
Epoch 1/2... Generator Loss: 1.1075 Discriminator Loss: 1.0751...
Epoch 1/2... Generator Loss: 0.4779 Discriminator Loss: 1.7303...
Epoch 1/2... Generator Loss: 1.4116 Discriminator Loss: 0.8743...
Epoch 1/2... Generator Loss: 1.3794 Discriminator Loss: 0.9071...
Epoch 1/2... Generator Loss: 0.9176 Discriminator Loss: 1.1387...
Epoch 1/2... Generator Loss: 1.0492 Discriminator Loss: 0.9790...
Epoch 1/2... Generator Loss: 1.1555 Discriminator Loss: 1.0126...
Epoch 1/2... Generator Loss: 1.8748 Discriminator Loss: 1.2823...
Epoch 2/2... Generator Loss: 1.3481 Discriminator Loss: 0.9169...
Epoch 2/2... Generator Loss: 1.1434 Discriminator Loss: 0.8797...
Epoch 2/2... Generator Loss: 1.3400 Discriminator Loss: 0.8255...
Epoch 2/2... Generator Loss: 0.9175 Discriminator Loss: 1.0556...
Epoch 2/2... Generator Loss: 1.8345 Discriminator Loss: 0.8857...
Epoch 2/2... Generator Loss: 1.0202 Discriminator Loss: 1.0145...
Epoch 2/2... Generator Loss: 1.3121 Discriminator Loss: 0.8291...
Epoch 2/2... Generator Loss: 1.2140 Discriminator Loss: 0.8952...
Epoch 2/2... Generator Loss: 1.9358 Discriminator Loss: 0.8262...
Epoch 2/2... Generator Loss: 1.0510 Discriminator Loss: 1.0220...
Epoch 2/2... Generator Loss: 1.7060 Discriminator Loss: 0.8747...
Epoch 2/2... Generator Loss: 1.1988 Discriminator Loss: 0.9848...
Epoch 2/2... Generator Loss: 0.7821 Discriminator Loss: 1.1644...
Epoch 2/2... Generator Loss: 1.7214 Discriminator Loss: 0.7487...
Epoch 2/2... Generator Loss: 1.4533 Discriminator Loss: 0.7376...
Epoch 2/2... Generator Loss: 2.1632 Discriminator Loss: 0.8757...
Epoch 2/2... Generator Loss: 1.2961 Discriminator Loss: 0.8360...
Epoch 2/2... Generator Loss: 0.5857 Discriminator Loss: 1.7435...
Epoch 2/2... Generator Loss: 1.1379 Discriminator Loss: 1.0621...
Epoch 2/2... Generator Loss: 1.4459 Discriminator Loss: 0.8831...
Epoch 2/2... Generator Loss: 1.1201 Discriminator Loss: 0.9123...
Epoch 2/2... Generator Loss: 1.0493 Discriminator Loss: 0.9268...
Epoch 2/2... Generator Loss: 0.8949 Discriminator Loss: 1.0951...
Epoch 2/2... Generator Loss: 1.7389 Discriminator Loss: 0.8866...
Epoch 2/2... Generator Loss: 0.5828 Discriminator Loss: 1.3460...
Epoch 2/2... Generator Loss: 1.3987 Discriminator Loss: 0.8827...
Epoch 2/2... Generator Loss: 0.8494 Discriminator Loss: 0.9870...
Epoch 2/2... Generator Loss: 0.9786 Discriminator Loss: 1.0143...
Epoch 2/2... Generator Loss: 1.0822 Discriminator Loss: 1.0448...
Epoch 2/2... Generator Loss: 0.6357 Discriminator Loss: 1.4624...
Epoch 2/2... Generator Loss: 0.9415 Discriminator Loss: 1.0787...
Epoch 2/2... Generator Loss: 1.3921 Discriminator Loss: 0.9495...
Epoch 2/2... Generator Loss: 1.2774 Discriminator Loss: 0.9059...
Epoch 2/2... Generator Loss: 2.3611 Discriminator Loss: 1.2335...
Epoch 2/2... Generator Loss: 1.0331 Discriminator Loss: 0.9866...
Epoch 2/2... Generator Loss: 1.1330 Discriminator Loss: 0.9644...
Epoch 2/2... Generator Loss: 1.6018 Discriminator Loss: 0.7993...
Epoch 2/2... Generator Loss: 1.1330 Discriminator Loss: 1.0248...
Epoch 2/2... Generator Loss: 1.0709 Discriminator Loss: 0.9453...
Epoch 2/2... Generator Loss: 0.7918 Discriminator Loss: 1.2797...
Epoch 2/2... Generator Loss: 1.7171 Discriminator Loss: 0.7912...
Epoch 2/2... Generator Loss: 0.9585 Discriminator Loss: 1.0096...
Epoch 2/2... Generator Loss: 1.0977 Discriminator Loss: 1.0241...
Epoch 2/2... Generator Loss: 0.4674 Discriminator Loss: 1.4435...
Epoch 2/2... Generator Loss: 1.4528 Discriminator Loss: 0.8615...
Epoch 2/2... Generator Loss: 0.7951 Discriminator Loss: 1.0590...
Epoch 2/2... Generator Loss: 1.6647 Discriminator Loss: 0.9695...
Epoch 2/2... Generator Loss: 1.8839 Discriminator Loss: 0.7950...
Epoch 2/2... Generator Loss: 1.0775 Discriminator Loss: 0.9067...
Epoch 2/2... Generator Loss: 2.0159 Discriminator Loss: 1.1359...
Epoch 2/2... Generator Loss: 1.1964 Discriminator Loss: 0.9031...
Epoch 2/2... Generator Loss: 2.3263 Discriminator Loss: 1.1100...
Epoch 2/2... Generator Loss: 0.8299 Discriminator Loss: 1.1223...
Epoch 2/2... Generator Loss: 1.6368 Discriminator Loss: 0.9538...
Epoch 2/2... Generator Loss: 1.3404 Discriminator Loss: 0.8470...
Epoch 2/2... Generator Loss: 1.3991 Discriminator Loss: 0.9234...
Epoch 2/2... Generator Loss: 1.1873 Discriminator Loss: 1.0511...
Epoch 2/2... Generator Loss: 1.1441 Discriminator Loss: 0.9460...
Epoch 2/2... Generator Loss: 0.8973 Discriminator Loss: 1.0196...
Epoch 2/2... Generator Loss: 1.3138 Discriminator Loss: 0.8243...
Epoch 2/2... Generator Loss: 1.0794 Discriminator Loss: 1.1229...
Epoch 2/2... Generator Loss: 1.6006 Discriminator Loss: 0.8186...
Epoch 2/2... Generator Loss: 1.1887 Discriminator Loss: 0.9813...
Epoch 2/2... Generator Loss: 1.2635 Discriminator Loss: 1.2654...
Epoch 2/2... Generator Loss: 1.3978 Discriminator Loss: 0.7988...
Epoch 2/2... Generator Loss: 1.4892 Discriminator Loss: 0.8127...
Epoch 2/2... Generator Loss: 1.1759 Discriminator Loss: 0.9115...
Epoch 2/2... Generator Loss: 1.2632 Discriminator Loss: 0.9384...
Epoch 2/2... Generator Loss: 0.9874 Discriminator Loss: 0.9818...
Epoch 2/2... Generator Loss: 1.4048 Discriminator Loss: 0.8869...
Epoch 2/2... Generator Loss: 0.8255 Discriminator Loss: 0.9509...
Epoch 2/2... Generator Loss: 1.0432 Discriminator Loss: 0.8605...
Epoch 2/2... Generator Loss: 1.0868 Discriminator Loss: 1.0082...
Epoch 2/2... Generator Loss: 1.9151 Discriminator Loss: 0.8697...
Epoch 2/2... Generator Loss: 0.9232 Discriminator Loss: 1.0644...
Epoch 2/2... Generator Loss: 1.1619 Discriminator Loss: 0.9793...
Epoch 2/2... Generator Loss: 2.5830 Discriminator Loss: 1.1122...
Epoch 2/2... Generator Loss: 1.0688 Discriminator Loss: 0.8385...
Epoch 2/2... Generator Loss: 1.4841 Discriminator Loss: 0.7774...
Epoch 2/2... Generator Loss: 0.6948 Discriminator Loss: 1.3098...
Epoch 2/2... Generator Loss: 1.4609 Discriminator Loss: 0.8412...
Epoch 2/2... Generator Loss: 1.0003 Discriminator Loss: 0.9806...
Epoch 2/2... Generator Loss: 1.6287 Discriminator Loss: 0.8954...
Epoch 2/2... Generator Loss: 1.3760 Discriminator Loss: 0.9540...
Epoch 2/2... Generator Loss: 0.9757 Discriminator Loss: 0.8851...
Epoch 2/2... Generator Loss: 0.9491 Discriminator Loss: 0.9408...
Epoch 2/2... Generator Loss: 1.9409 Discriminator Loss: 0.9030...
Epoch 2/2... Generator Loss: 1.7975 Discriminator Loss: 0.7528...
Epoch 2/2... Generator Loss: 1.5099 Discriminator Loss: 0.8892...
Epoch 2/2... Generator Loss: 0.6869 Discriminator Loss: 1.2001...
Epoch 2/2... Generator Loss: 1.0288 Discriminator Loss: 0.8720...
Epoch 2/2... Generator Loss: 0.7531 Discriminator Loss: 1.3649...
Epoch 2/2... Generator Loss: 1.2097 Discriminator Loss: 1.0826...
Epoch 2/2... Generator Loss: 1.7463 Discriminator Loss: 0.8524...
Epoch 2/2... Generator Loss: 1.7236 Discriminator Loss: 0.7269...
Epoch 2/2... Generator Loss: 1.2176 Discriminator Loss: 0.9144...
Epoch 2/2... Generator Loss: 1.3574 Discriminator Loss: 0.8990...
Epoch 2/2... Generator Loss: 1.4840 Discriminator Loss: 0.7849...
Epoch 2/2... Generator Loss: 1.3308 Discriminator Loss: 0.8026...
Epoch 2/2... Generator Loss: 0.6081 Discriminator Loss: 1.4443...
Epoch 2/2... Generator Loss: 1.3473 Discriminator Loss: 0.9279...
Epoch 2/2... Generator Loss: 1.0420 Discriminator Loss: 1.2024...
Epoch 2/2... Generator Loss: 1.0286 Discriminator Loss: 1.0083...
Epoch 2/2... Generator Loss: 0.7759 Discriminator Loss: 1.0871...
Epoch 2/2... Generator Loss: 1.4562 Discriminator Loss: 0.7685...
Epoch 2/2... Generator Loss: 1.4622 Discriminator Loss: 0.7688...
Epoch 2/2... Generator Loss: 0.9857 Discriminator Loss: 0.9343...
Epoch 2/2... Generator Loss: 0.6556 Discriminator Loss: 1.2248...
Epoch 2/2... Generator Loss: 3.3023 Discriminator Loss: 1.3899...
Epoch 2/2... Generator Loss: 1.5936 Discriminator Loss: 0.7365...
Epoch 2/2... Generator Loss: 1.3177 Discriminator Loss: 0.9100...
Epoch 2/2... Generator Loss: 1.7941 Discriminator Loss: 1.3487...
Epoch 2/2... Generator Loss: 1.5479 Discriminator Loss: 0.8636...
Epoch 2/2... Generator Loss: 1.6201 Discriminator Loss: 0.7815...
Epoch 2/2... Generator Loss: 1.1571 Discriminator Loss: 1.0128...
Epoch 2/2... Generator Loss: 0.9718 Discriminator Loss: 1.0144...
Epoch 2/2... Generator Loss: 1.1508 Discriminator Loss: 0.8069...
Epoch 2/2... Generator Loss: 1.1804 Discriminator Loss: 1.0030...
Epoch 2/2... Generator Loss: 1.1574 Discriminator Loss: 0.9717...
Epoch 2/2... Generator Loss: 0.7308 Discriminator Loss: 1.2364...
Epoch 2/2... Generator Loss: 1.1027 Discriminator Loss: 0.8774...
Epoch 2/2... Generator Loss: 1.6028 Discriminator Loss: 0.8575...
Epoch 2/2... Generator Loss: 1.4957 Discriminator Loss: 0.7280...
Epoch 2/2... Generator Loss: 1.4089 Discriminator Loss: 0.9683...
Epoch 2/2... Generator Loss: 1.0824 Discriminator Loss: 0.9339...
Epoch 2/2... Generator Loss: 1.0426 Discriminator Loss: 0.9700...
Epoch 2/2... Generator Loss: 2.0061 Discriminator Loss: 0.8975...
Epoch 2/2... Generator Loss: 1.0596 Discriminator Loss: 0.9205...
Epoch 2/2... Generator Loss: 1.1758 Discriminator Loss: 0.8183...
Epoch 2/2... Generator Loss: 1.0631 Discriminator Loss: 0.9898...
Epoch 2/2... Generator Loss: 1.4373 Discriminator Loss: 0.7888...
Epoch 2/2... Generator Loss: 1.4202 Discriminator Loss: 0.8771...
Epoch 2/2... Generator Loss: 1.6930 Discriminator Loss: 0.9293...
Epoch 2/2... Generator Loss: 0.5859 Discriminator Loss: 1.4171...
Epoch 2/2... Generator Loss: 0.9739 Discriminator Loss: 1.1283...
Epoch 2/2... Generator Loss: 1.8162 Discriminator Loss: 0.7998...
Epoch 2/2... Generator Loss: 1.4895 Discriminator Loss: 0.8308...
Epoch 2/2... Generator Loss: 1.4615 Discriminator Loss: 0.8166...
Epoch 2/2... Generator Loss: 1.1478 Discriminator Loss: 0.8312...
Epoch 2/2... Generator Loss: 1.2538 Discriminator Loss: 0.8640...
Epoch 2/2... Generator Loss: 1.6380 Discriminator Loss: 0.7462...
Epoch 2/2... Generator Loss: 1.7150 Discriminator Loss: 0.9316...
Epoch 2/2... Generator Loss: 0.8174 Discriminator Loss: 1.0846...
Epoch 2/2... Generator Loss: 1.0978 Discriminator Loss: 0.9813...
Epoch 2/2... Generator Loss: 1.3557 Discriminator Loss: 1.0264...
Epoch 2/2... Generator Loss: 0.7163 Discriminator Loss: 1.0615...
Epoch 2/2... Generator Loss: 1.4485 Discriminator Loss: 0.8374...
Epoch 2/2... Generator Loss: 0.8047 Discriminator Loss: 1.0074...
Epoch 2/2... Generator Loss: 0.5532 Discriminator Loss: 1.3079...
Epoch 2/2... Generator Loss: 1.2570 Discriminator Loss: 0.7827...
Epoch 2/2... Generator Loss: 1.7514 Discriminator Loss: 0.7777...
Epoch 2/2... Generator Loss: 1.2382 Discriminator Loss: 0.8534...
Epoch 2/2... Generator Loss: 1.5318 Discriminator Loss: 0.6997...
Epoch 2/2... Generator Loss: 1.1713 Discriminator Loss: 0.8230...
Epoch 2/2... Generator Loss: 1.5984 Discriminator Loss: 0.7478...
Epoch 2/2... Generator Loss: 1.7895 Discriminator Loss: 0.7907...
Epoch 2/2... Generator Loss: 1.6136 Discriminator Loss: 0.8544...
Epoch 2/2... Generator Loss: 1.4304 Discriminator Loss: 0.8173...
Epoch 2/2... Generator Loss: 2.2621 Discriminator Loss: 0.6591...
Epoch 2/2... Generator Loss: 0.7169 Discriminator Loss: 1.4143...
Epoch 2/2... Generator Loss: 1.1502 Discriminator Loss: 1.0114...
Epoch 2/2... Generator Loss: 1.4532 Discriminator Loss: 0.8498...
Epoch 2/2... Generator Loss: 0.9749 Discriminator Loss: 0.9116...
Epoch 2/2... Generator Loss: 1.3295 Discriminator Loss: 0.8638...
Epoch 2/2... Generator Loss: 1.4563 Discriminator Loss: 0.8649...
Epoch 2/2... Generator Loss: 1.9306 Discriminator Loss: 1.7812...
Epoch 2/2... Generator Loss: 1.4216 Discriminator Loss: 0.8107...
Epoch 2/2... Generator Loss: 1.2392 Discriminator Loss: 0.8637...
Epoch 2/2... Generator Loss: 1.4651 Discriminator Loss: 0.8392...
Epoch 2/2... Generator Loss: 1.6476 Discriminator Loss: 0.7495...
Epoch 2/2... Generator Loss: 1.8651 Discriminator Loss: 0.9037...
Epoch 2/2... Generator Loss: 1.7518 Discriminator Loss: 0.8212...
Epoch 2/2... Generator Loss: 1.7153 Discriminator Loss: 0.7647...
Epoch 2/2... Generator Loss: 1.1080 Discriminator Loss: 0.9752...
Epoch 2/2... Generator Loss: 1.6446 Discriminator Loss: 0.6902...
Epoch 2/2... Generator Loss: 1.4345 Discriminator Loss: 0.8767...
Epoch 2/2... Generator Loss: 1.5103 Discriminator Loss: 0.9072...
Epoch 2/2... Generator Loss: 0.9313 Discriminator Loss: 1.1836...
Epoch 2/2... Generator Loss: 1.5484 Discriminator Loss: 0.8194...
Epoch 2/2... Generator Loss: 1.8638 Discriminator Loss: 0.5907...
Epoch 2/2... Generator Loss: 0.9304 Discriminator Loss: 1.0008...
Epoch 2/2... Generator Loss: 1.5436 Discriminator Loss: 0.6586...
Epoch 2/2... Generator Loss: 1.2446 Discriminator Loss: 0.9192...
Epoch 2/2... Generator Loss: 1.0972 Discriminator Loss: 0.8900...
Epoch 2/2... Generator Loss: 1.1289 Discriminator Loss: 0.8969...
Epoch 2/2... Generator Loss: 1.3474 Discriminator Loss: 0.9247...
Epoch 2/2... Generator Loss: 1.1831 Discriminator Loss: 0.8899...
Epoch 2/2... Generator Loss: 1.1793 Discriminator Loss: 0.8488...

CelebA

Run your GANs on CelebA. On an average GPU, one epoch takes around 20 minutes. You can run the whole epoch or stop once the model starts generating realistic faces.

In [ ]:
batch_size = 64
z_dim = 100
learning_rate = 0.0004
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Initial output:
Epoch 1/1... Generator Loss: 3.1880 Discriminator Loss: 0.5312...
Epoch 1/1... Generator Loss: 5.8659 Discriminator Loss: 0.4101...
Epoch 1/1... Generator Loss: 1.9841 Discriminator Loss: 0.5600...
Epoch 1/1... Generator Loss: 4.7590 Discriminator Loss: 1.0462...
Epoch 1/1... Generator Loss: 1.7000 Discriminator Loss: 0.7501...
Epoch 1/1... Generator Loss: 5.5154 Discriminator Loss: 0.8921...
Epoch 1/1... Generator Loss: 4.4758 Discriminator Loss: 0.9301...
Epoch 1/1... Generator Loss: 1.8632 Discriminator Loss: 0.6037...
Epoch 1/1... Generator Loss: 8.4192 Discriminator Loss: 2.3092...
Epoch 1/1... Generator Loss: 4.5047 Discriminator Loss: 0.4116...
Epoch 1/1... Generator Loss: 3.4781 Discriminator Loss: 0.5590...
Epoch 1/1... Generator Loss: 4.4806 Discriminator Loss: 0.7846...
Epoch 1/1... Generator Loss: 5.5086 Discriminator Loss: 1.8908...
Epoch 1/1... Generator Loss: 1.4214 Discriminator Loss: 0.7168...
Epoch 1/1... Generator Loss: 1.6244 Discriminator Loss: 0.8169...
Epoch 1/1... Generator Loss: 1.2269 Discriminator Loss: 0.9598...
Epoch 1/1... Generator Loss: 1.1946 Discriminator Loss: 0.9586...
Epoch 1/1... Generator Loss: 1.5818 Discriminator Loss: 1.0988...
Epoch 1/1... Generator Loss: 1.6489 Discriminator Loss: 0.9728...
Epoch 1/1... Generator Loss: 1.2185 Discriminator Loss: 0.8719...
Epoch 1/1... Generator Loss: 0.6783 Discriminator Loss: 1.3058...
Epoch 1/1... Generator Loss: 1.2046 Discriminator Loss: 0.9103...
Epoch 1/1... Generator Loss: 1.6319 Discriminator Loss: 0.9143...
Epoch 1/1... Generator Loss: 1.7103 Discriminator Loss: 0.7602...
Epoch 1/1... Generator Loss: 0.5872 Discriminator Loss: 1.3465...
Epoch 1/1... Generator Loss: 1.1258 Discriminator Loss: 0.9747...
Epoch 1/1... Generator Loss: 2.4265 Discriminator Loss: 0.9894...
Epoch 1/1... Generator Loss: 1.9659 Discriminator Loss: 1.2932...
Epoch 1/1... Generator Loss: 0.4765 Discriminator Loss: 1.5346...
Epoch 1/1... Generator Loss: 1.6336 Discriminator Loss: 1.0220...
Epoch 1/1... Generator Loss: 0.6532 Discriminator Loss: 1.3027...
Epoch 1/1... Generator Loss: 0.5206 Discriminator Loss: 1.4070...
Epoch 1/1... Generator Loss: 2.9789 Discriminator Loss: 1.4708...
Epoch 1/1... Generator Loss: 1.2069 Discriminator Loss: 1.2999...
Epoch 1/1... Generator Loss: 1.3433 Discriminator Loss: 1.0615...
Epoch 1/1... Generator Loss: 2.0278 Discriminator Loss: 1.4691...
Epoch 1/1... Generator Loss: 0.8304 Discriminator Loss: 1.3113...
Epoch 1/1... Generator Loss: 1.3234 Discriminator Loss: 1.0839...
Epoch 1/1... Generator Loss: 1.6088 Discriminator Loss: 1.2751...
Epoch 1/1... Generator Loss: 1.5374 Discriminator Loss: 1.1723...
Epoch 1/1... Generator Loss: 1.0386 Discriminator Loss: 1.1139...
Epoch 1/1... Generator Loss: 1.1841 Discriminator Loss: 1.0517...
Epoch 1/1... Generator Loss: 0.8367 Discriminator Loss: 1.1522...
Epoch 1/1... Generator Loss: 0.7242 Discriminator Loss: 1.2626...
Epoch 1/1... Generator Loss: 2.0105 Discriminator Loss: 0.9907...
Epoch 1/1... Generator Loss: 1.2306 Discriminator Loss: 0.9442...
Epoch 1/1... Generator Loss: 0.6270 Discriminator Loss: 1.4249...
Epoch 1/1... Generator Loss: 0.9325 Discriminator Loss: 1.1622...
Epoch 1/1... Generator Loss: 1.4577 Discriminator Loss: 1.2725...
Epoch 1/1... Generator Loss: 1.2509 Discriminator Loss: 1.2220...
Epoch 1/1... Generator Loss: 0.8667 Discriminator Loss: 1.1699...
Epoch 1/1... Generator Loss: 0.6983 Discriminator Loss: 1.2240...
Epoch 1/1... Generator Loss: 0.7512 Discriminator Loss: 1.2440...
Epoch 1/1... Generator Loss: 0.5337 Discriminator Loss: 1.3740...
Epoch 1/1... Generator Loss: 0.8784 Discriminator Loss: 1.1375...
Epoch 1/1... Generator Loss: 1.4197 Discriminator Loss: 1.1282...
Epoch 1/1... Generator Loss: 1.2432 Discriminator Loss: 1.1738...
Epoch 1/1... Generator Loss: 0.7206 Discriminator Loss: 1.2188...
Epoch 1/1... Generator Loss: 0.8080 Discriminator Loss: 1.2256...
Epoch 1/1... Generator Loss: 2.4920 Discriminator Loss: 1.8209...
Epoch 1/1... Generator Loss: 0.7319 Discriminator Loss: 1.3214...
Epoch 1/1... Generator Loss: 1.2491 Discriminator Loss: 1.0963...
Epoch 1/1... Generator Loss: 1.0356 Discriminator Loss: 1.0759...
Epoch 1/1... Generator Loss: 0.5164 Discriminator Loss: 1.3666...
Epoch 1/1... Generator Loss: 0.6928 Discriminator Loss: 1.2093...
Epoch 1/1... Generator Loss: 1.1585 Discriminator Loss: 1.1902...
Epoch 1/1... Generator Loss: 0.5746 Discriminator Loss: 1.3059...
Epoch 1/1... Generator Loss: 0.6710 Discriminator Loss: 1.2243...
Epoch 1/1... Generator Loss: 0.9909 Discriminator Loss: 1.0835...
Epoch 1/1... Generator Loss: 2.6387 Discriminator Loss: 1.5098...
Epoch 1/1... Generator Loss: 1.0885 Discriminator Loss: 1.2217...
Epoch 1/1... Generator Loss: 1.0379 Discriminator Loss: 1.2541...
Epoch 1/1... Generator Loss: 1.2140 Discriminator Loss: 1.2609...
Epoch 1/1... Generator Loss: 0.7954 Discriminator Loss: 1.1861...
Epoch 1/1... Generator Loss: 2.6049 Discriminator Loss: 2.0296...
Epoch 1/1... Generator Loss: 1.4362 Discriminator Loss: 1.1586...
Epoch 1/1... Generator Loss: 0.6119 Discriminator Loss: 1.2794...
Epoch 1/1... Generator Loss: 1.1147 Discriminator Loss: 1.2238...
Epoch 1/1... Generator Loss: 1.3081 Discriminator Loss: 1.1558...
Epoch 1/1... Generator Loss: 0.9159 Discriminator Loss: 1.2173...
Epoch 1/1... Generator Loss: 0.5973 Discriminator Loss: 1.4560...
Epoch 1/1... Generator Loss: 0.8129 Discriminator Loss: 1.2589...
Epoch 1/1... Generator Loss: 0.5596 Discriminator Loss: 1.4163...
Epoch 1/1... Generator Loss: 1.0900 Discriminator Loss: 1.2533...
Epoch 1/1... Generator Loss: 0.9320 Discriminator Loss: 1.1738...
Epoch 1/1... Generator Loss: 0.7543 Discriminator Loss: 1.3056...
Epoch 1/1... Generator Loss: 1.0426 Discriminator Loss: 1.3906...
Epoch 1/1... Generator Loss: 1.1604 Discriminator Loss: 1.3348...
Epoch 1/1... Generator Loss: 0.8972 Discriminator Loss: 1.1914...
Epoch 1/1... Generator Loss: 1.0818 Discriminator Loss: 1.2376...
Epoch 1/1... Generator Loss: 0.8831 Discriminator Loss: 1.1440...
Epoch 1/1... Generator Loss: 0.7145 Discriminator Loss: 1.2866...
Epoch 1/1... Generator Loss: 1.0098 Discriminator Loss: 1.3003...
Epoch 1/1... Generator Loss: 1.3429 Discriminator Loss: 1.3759...
Epoch 1/1... Generator Loss: 0.4986 Discriminator Loss: 1.4708...
Epoch 1/1... Generator Loss: 0.9463 Discriminator Loss: 1.1214...
Epoch 1/1... Generator Loss: 0.8645 Discriminator Loss: 1.1789...
Epoch 1/1... Generator Loss: 0.8907 Discriminator Loss: 1.2477...
Epoch 1/1... Generator Loss: 0.8938 Discriminator Loss: 1.1382...
Epoch 1/1... Generator Loss: 0.2985 Discriminator Loss: 1.7909...
Epoch 1/1... Generator Loss: 0.7058 Discriminator Loss: 1.3421...
Epoch 1/1... Generator Loss: 0.5444 Discriminator Loss: 1.4187...
Epoch 1/1... Generator Loss: 0.7369 Discriminator Loss: 1.3510...
Epoch 1/1... Generator Loss: 0.7584 Discriminator Loss: 1.3660...
Epoch 1/1... Generator Loss: 0.9565 Discriminator Loss: 1.2018...
Epoch 1/1... Generator Loss: 0.8374 Discriminator Loss: 1.4089...
Epoch 1/1... Generator Loss: 1.0304 Discriminator Loss: 1.2269...
Epoch 1/1... Generator Loss: 0.7206 Discriminator Loss: 1.3186...
Epoch 1/1... Generator Loss: 0.6313 Discriminator Loss: 1.3380...
Epoch 1/1... Generator Loss: 0.8683 Discriminator Loss: 1.3320...
Epoch 1/1... Generator Loss: 0.5633 Discriminator Loss: 1.4105...
Epoch 1/1... Generator Loss: 0.9152 Discriminator Loss: 1.2167...
Epoch 1/1... Generator Loss: 0.9219 Discriminator Loss: 1.2365...
Epoch 1/1... Generator Loss: 0.8458 Discriminator Loss: 1.4121...
Epoch 1/1... Generator Loss: 0.9446 Discriminator Loss: 1.3792...
Epoch 1/1... Generator Loss: 0.7957 Discriminator Loss: 1.3617...
Epoch 1/1... Generator Loss: 0.9812 Discriminator Loss: 1.2270...
Epoch 1/1... Generator Loss: 0.8302 Discriminator Loss: 1.3710...
Epoch 1/1... Generator Loss: 0.6912 Discriminator Loss: 1.3501...
Epoch 1/1... Generator Loss: 0.8170 Discriminator Loss: 1.3890...
Epoch 1/1... Generator Loss: 0.7054 Discriminator Loss: 1.3197...
Epoch 1/1... Generator Loss: 1.1091 Discriminator Loss: 1.1280...
Epoch 1/1... Generator Loss: 0.8886 Discriminator Loss: 1.2071...
Epoch 1/1... Generator Loss: 1.1621 Discriminator Loss: 1.3449...
Epoch 1/1... Generator Loss: 0.9474 Discriminator Loss: 1.3468...
Epoch 1/1... Generator Loss: 0.9627 Discriminator Loss: 1.2661...
Epoch 1/1... Generator Loss: 0.8655 Discriminator Loss: 1.2049...
Epoch 1/1... Generator Loss: 0.7247 Discriminator Loss: 1.3585...
Epoch 1/1... Generator Loss: 1.1131 Discriminator Loss: 1.2751...
Epoch 1/1... Generator Loss: 0.9865 Discriminator Loss: 1.2210...
Epoch 1/1... Generator Loss: 0.7510 Discriminator Loss: 1.3752...
Epoch 1/1... Generator Loss: 0.7537 Discriminator Loss: 1.3358...
Epoch 1/1... Generator Loss: 0.9477 Discriminator Loss: 1.1303...
Epoch 1/1... Generator Loss: 0.9277 Discriminator Loss: 1.1720...
Epoch 1/1... Generator Loss: 0.8082 Discriminator Loss: 1.3504...
Epoch 1/1... Generator Loss: 0.7956 Discriminator Loss: 1.3489...
Epoch 1/1... Generator Loss: 0.9548 Discriminator Loss: 1.2064...
Epoch 1/1... Generator Loss: 0.9622 Discriminator Loss: 1.3163...
Epoch 1/1... Generator Loss: 0.6277 Discriminator Loss: 1.3942...
Epoch 1/1... Generator Loss: 0.7951 Discriminator Loss: 1.2008...
Epoch 1/1... Generator Loss: 0.8518 Discriminator Loss: 1.2941...
Epoch 1/1... Generator Loss: 0.7274 Discriminator Loss: 1.2982...
Epoch 1/1... Generator Loss: 0.6976 Discriminator Loss: 1.3470...
Epoch 1/1... Generator Loss: 0.7456 Discriminator Loss: 1.2853...
Epoch 1/1... Generator Loss: 0.7649 Discriminator Loss: 1.3557...
Epoch 1/1... Generator Loss: 0.6637 Discriminator Loss: 1.4190...
Epoch 1/1... Generator Loss: 0.5582 Discriminator Loss: 1.4484...
Epoch 1/1... Generator Loss: 1.0468 Discriminator Loss: 1.2111...
Epoch 1/1... Generator Loss: 1.0041 Discriminator Loss: 1.3143...
Epoch 1/1... Generator Loss: 0.6457 Discriminator Loss: 1.3580...
Epoch 1/1... Generator Loss: 0.8290 Discriminator Loss: 1.3114...
Epoch 1/1... Generator Loss: 0.7470 Discriminator Loss: 1.3926...
Epoch 1/1... Generator Loss: 0.8682 Discriminator Loss: 1.3354...
Epoch 1/1... Generator Loss: 0.9708 Discriminator Loss: 1.2631...
Epoch 1/1... Generator Loss: 0.7109 Discriminator Loss: 1.3778...
Epoch 1/1... Generator Loss: 0.6963 Discriminator Loss: 1.3755...
Epoch 1/1... Generator Loss: 0.7051 Discriminator Loss: 1.2737...
Epoch 1/1... Generator Loss: 0.7523 Discriminator Loss: 1.3269...
Epoch 1/1... Generator Loss: 0.8369 Discriminator Loss: 1.1761...
Epoch 1/1... Generator Loss: 0.9356 Discriminator Loss: 1.1891...
Epoch 1/1... Generator Loss: 0.7568 Discriminator Loss: 1.4590...
Epoch 1/1... Generator Loss: 0.5727 Discriminator Loss: 1.3562...
Epoch 1/1... Generator Loss: 0.8622 Discriminator Loss: 1.4289...
Epoch 1/1... Generator Loss: 0.9287 Discriminator Loss: 1.4986...
Epoch 1/1... Generator Loss: 0.8836 Discriminator Loss: 1.3031...
Epoch 1/1... Generator Loss: 0.5656 Discriminator Loss: 1.4378...
Epoch 1/1... Generator Loss: 0.5973 Discriminator Loss: 1.4323...
Epoch 1/1... Generator Loss: 0.8966 Discriminator Loss: 1.2418...
Epoch 1/1... Generator Loss: 0.7382 Discriminator Loss: 1.4102...
Epoch 1/1... Generator Loss: 0.7563 Discriminator Loss: 1.3360...
Epoch 1/1... Generator Loss: 0.6936 Discriminator Loss: 1.3488...
Epoch 1/1... Generator Loss: 0.8696 Discriminator Loss: 1.2249...
Epoch 1/1... Generator Loss: 0.8376 Discriminator Loss: 1.2350...
Epoch 1/1... Generator Loss: 0.8542 Discriminator Loss: 1.3201...
Epoch 1/1... Generator Loss: 0.7885 Discriminator Loss: 1.3878...
Epoch 1/1... Generator Loss: 0.8739 Discriminator Loss: 1.1242...
Epoch 1/1... Generator Loss: 1.0630 Discriminator Loss: 1.2583...
Epoch 1/1... Generator Loss: 0.8110 Discriminator Loss: 1.3492...
Epoch 1/1... Generator Loss: 0.7620 Discriminator Loss: 1.3345...
Epoch 1/1... Generator Loss: 0.8052 Discriminator Loss: 1.2799...
Epoch 1/1... Generator Loss: 0.7156 Discriminator Loss: 1.4314...
Epoch 1/1... Generator Loss: 0.6987 Discriminator Loss: 1.3078...
Epoch 1/1... Generator Loss: 0.9264 Discriminator Loss: 1.2913...
Epoch 1/1... Generator Loss: 1.0684 Discriminator Loss: 1.2276...
Epoch 1/1... Generator Loss: 0.7873 Discriminator Loss: 1.1203...
Epoch 1/1... Generator Loss: 1.0369 Discriminator Loss: 1.3388...
Epoch 1/1... Generator Loss: 0.7617 Discriminator Loss: 1.2232...
Epoch 1/1... Generator Loss: 0.7346 Discriminator Loss: 1.3511...
Epoch 1/1... Generator Loss: 0.9367 Discriminator Loss: 1.2823...
Epoch 1/1... Generator Loss: 1.0510 Discriminator Loss: 1.2632...
Epoch 1/1... Generator Loss: 0.8758 Discriminator Loss: 1.3112...
Epoch 1/1... Generator Loss: 0.8723 Discriminator Loss: 1.2766...
Epoch 1/1... Generator Loss: 0.7122 Discriminator Loss: 1.2994...
Epoch 1/1... Generator Loss: 0.8296 Discriminator Loss: 1.2918...
Epoch 1/1... Generator Loss: 0.9944 Discriminator Loss: 1.2616...
Epoch 1/1... Generator Loss: 0.9117 Discriminator Loss: 1.4026...
Epoch 1/1... Generator Loss: 0.8607 Discriminator Loss: 1.3105...
Epoch 1/1... Generator Loss: 0.9826 Discriminator Loss: 1.2575...
Epoch 1/1... Generator Loss: 0.8065 Discriminator Loss: 1.2191...
Epoch 1/1... Generator Loss: 0.8858 Discriminator Loss: 1.3444...
Epoch 1/1... Generator Loss: 0.8755 Discriminator Loss: 1.2901...
Epoch 1/1... Generator Loss: 0.6112 Discriminator Loss: 1.4476...
Epoch 1/1... Generator Loss: 1.2506 Discriminator Loss: 1.1799...
Epoch 1/1... Generator Loss: 0.7287 Discriminator Loss: 1.4366...
Epoch 1/1... Generator Loss: 0.5446 Discriminator Loss: 1.4161...
Epoch 1/1... Generator Loss: 0.9733 Discriminator Loss: 1.2567...
Epoch 1/1... Generator Loss: 0.9914 Discriminator Loss: 1.1275...
Epoch 1/1... Generator Loss: 0.8235 Discriminator Loss: 1.3742...
Epoch 1/1... Generator Loss: 0.7694 Discriminator Loss: 1.4195...
Epoch 1/1... Generator Loss: 1.0994 Discriminator Loss: 1.3624...
Epoch 1/1... Generator Loss: 0.6758 Discriminator Loss: 1.3137...
Epoch 1/1... Generator Loss: 0.8886 Discriminator Loss: 1.2213...
Epoch 1/1... Generator Loss: 0.9135 Discriminator Loss: 1.2744...
Epoch 1/1... Generator Loss: 0.9895 Discriminator Loss: 1.2376...
Epoch 1/1... Generator Loss: 0.6518 Discriminator Loss: 1.3967...
Epoch 1/1... Generator Loss: 0.6952 Discriminator Loss: 1.4191...
Epoch 1/1... Generator Loss: 0.7682 Discriminator Loss: 1.3622...
Epoch 1/1... Generator Loss: 0.7581 Discriminator Loss: 1.4277...
Epoch 1/1... Generator Loss: 0.7674 Discriminator Loss: 1.3712...
Epoch 1/1... Generator Loss: 0.7661 Discriminator Loss: 1.3267...
Epoch 1/1... Generator Loss: 0.6760 Discriminator Loss: 1.3658...
Epoch 1/1... Generator Loss: 0.9652 Discriminator Loss: 1.2144...
Epoch 1/1... Generator Loss: 0.8785 Discriminator Loss: 1.2331...
Epoch 1/1... Generator Loss: 0.8559 Discriminator Loss: 1.3218...
Epoch 1/1... Generator Loss: 0.7325 Discriminator Loss: 1.3266...
Epoch 1/1... Generator Loss: 0.8307 Discriminator Loss: 1.2938...
Epoch 1/1... Generator Loss: 0.6974 Discriminator Loss: 1.3726...
Epoch 1/1... Generator Loss: 0.6865 Discriminator Loss: 1.2818...
Epoch 1/1... Generator Loss: 0.9450 Discriminator Loss: 1.3728...
Epoch 1/1... Generator Loss: 0.8622 Discriminator Loss: 1.3300...
Epoch 1/1... Generator Loss: 0.7501 Discriminator Loss: 1.3513...
Epoch 1/1... Generator Loss: 0.7340 Discriminator Loss: 1.4291...
Epoch 1/1... Generator Loss: 0.7421 Discriminator Loss: 1.4660...
Epoch 1/1... Generator Loss: 0.7828 Discriminator Loss: 1.3756...
Epoch 1/1... Generator Loss: 0.9387 Discriminator Loss: 1.3153...
Epoch 1/1... Generator Loss: 0.7144 Discriminator Loss: 1.3329...
Epoch 1/1... Generator Loss: 0.7566 Discriminator Loss: 1.3751...
Epoch 1/1... Generator Loss: 0.7741 Discriminator Loss: 1.1986...
Epoch 1/1... Generator Loss: 0.7461 Discriminator Loss: 1.3215...
Epoch 1/1... Generator Loss: 0.7656 Discriminator Loss: 1.3422...
Epoch 1/1... Generator Loss: 0.8741 Discriminator Loss: 1.2502...
Epoch 1/1... Generator Loss: 0.8782 Discriminator Loss: 1.2967...
Epoch 1/1... Generator Loss: 0.7273 Discriminator Loss: 1.4212...
Epoch 1/1... Generator Loss: 0.6844 Discriminator Loss: 1.3636...
Epoch 1/1... Generator Loss: 0.8746 Discriminator Loss: 1.2018...
Epoch 1/1... Generator Loss: 0.9322 Discriminator Loss: 1.2616...
Epoch 1/1... Generator Loss: 0.7470 Discriminator Loss: 1.3608...
Epoch 1/1... Generator Loss: 0.9325 Discriminator Loss: 1.3048...
Epoch 1/1... Generator Loss: 0.9030 Discriminator Loss: 1.3039...
Epoch 1/1... Generator Loss: 0.9363 Discriminator Loss: 1.2599...
Epoch 1/1... Generator Loss: 0.7104 Discriminator Loss: 1.3815...
Epoch 1/1... Generator Loss: 0.7369 Discriminator Loss: 1.3691...
Epoch 1/1... Generator Loss: 0.7057 Discriminator Loss: 1.4310...
Epoch 1/1... Generator Loss: 0.8413 Discriminator Loss: 1.2620...
Epoch 1/1... Generator Loss: 0.6156 Discriminator Loss: 1.5326...
Epoch 1/1... Generator Loss: 0.9339 Discriminator Loss: 1.2898...
Epoch 1/1... Generator Loss: 0.6084 Discriminator Loss: 1.4562...
Epoch 1/1... Generator Loss: 0.7674 Discriminator Loss: 1.3417...
Epoch 1/1... Generator Loss: 0.7891 Discriminator Loss: 1.2634...
Epoch 1/1... Generator Loss: 0.6960 Discriminator Loss: 1.3043...
Epoch 1/1... Generator Loss: 0.7922 Discriminator Loss: 1.3376...
Epoch 1/1... Generator Loss: 0.7329 Discriminator Loss: 1.3319...
Epoch 1/1... Generator Loss: 0.7193 Discriminator Loss: 1.3468...
Epoch 1/1... Generator Loss: 0.7009 Discriminator Loss: 1.3374...
Epoch 1/1... Generator Loss: 1.0823 Discriminator Loss: 1.2562...
Epoch 1/1... Generator Loss: 0.8936 Discriminator Loss: 1.3808...
Epoch 1/1... Generator Loss: 0.9341 Discriminator Loss: 1.3196...
Epoch 1/1... Generator Loss: 0.7756 Discriminator Loss: 1.2852...
Epoch 1/1... Generator Loss: 0.8532 Discriminator Loss: 1.2480...
Epoch 1/1... Generator Loss: 0.7608 Discriminator Loss: 1.3899...
Epoch 1/1... Generator Loss: 0.8705 Discriminator Loss: 1.2013...
Epoch 1/1... Generator Loss: 0.9635 Discriminator Loss: 1.2811...
Epoch 1/1... Generator Loss: 0.9634 Discriminator Loss: 1.1362...
Epoch 1/1... Generator Loss: 0.6958 Discriminator Loss: 1.3363...
Epoch 1/1... Generator Loss: 0.6179 Discriminator Loss: 1.5403...
Epoch 1/1... Generator Loss: 0.8855 Discriminator Loss: 1.2114...
Epoch 1/1... Generator Loss: 0.6857 Discriminator Loss: 1.3453...
Epoch 1/1... Generator Loss: 0.7580 Discriminator Loss: 1.3672...
Epoch 1/1... Generator Loss: 0.7391 Discriminator Loss: 1.3466...
Epoch 1/1... Generator Loss: 0.8287 Discriminator Loss: 1.3069...
Epoch 1/1... Generator Loss: 0.6510 Discriminator Loss: 1.4318...
Epoch 1/1... Generator Loss: 0.8495 Discriminator Loss: 1.1784...
Epoch 1/1... Generator Loss: 0.9858 Discriminator Loss: 1.4482...
Epoch 1/1... Generator Loss: 0.7742 Discriminator Loss: 1.2778...
Epoch 1/1... Generator Loss: 0.8205 Discriminator Loss: 1.3330...
Epoch 1/1... Generator Loss: 0.7417 Discriminator Loss: 1.3836...
Epoch 1/1... Generator Loss: 0.8809 Discriminator Loss: 1.2286...
Epoch 1/1... Generator Loss: 0.7610 Discriminator Loss: 1.4525...
Epoch 1/1... Generator Loss: 0.9175 Discriminator Loss: 1.3360...
Epoch 1/1... Generator Loss: 0.7725 Discriminator Loss: 1.3547...
Epoch 1/1... Generator Loss: 0.7529 Discriminator Loss: 1.3186...
Epoch 1/1... Generator Loss: 0.8696 Discriminator Loss: 1.2653...
Epoch 1/1... Generator Loss: 0.8728 Discriminator Loss: 1.2939...
Epoch 1/1... Generator Loss: 0.8408 Discriminator Loss: 1.2752...
Epoch 1/1... Generator Loss: 0.9110 Discriminator Loss: 1.2495...
Epoch 1/1... Generator Loss: 0.6371 Discriminator Loss: 1.4350...
Epoch 1/1... Generator Loss: 0.8236 Discriminator Loss: 1.2514...
Epoch 1/1... Generator Loss: 0.8167 Discriminator Loss: 1.4529...
Epoch 1/1... Generator Loss: 0.9778 Discriminator Loss: 1.2094...
Epoch 1/1... Generator Loss: 0.9016 Discriminator Loss: 1.3269...
Epoch 1/1... Generator Loss: 0.8962 Discriminator Loss: 1.2203...
Epoch 1/1... Generator Loss: 0.7226 Discriminator Loss: 1.3987...
Epoch 1/1... Generator Loss: 0.7628 Discriminator Loss: 1.2769...
Epoch 1/1... Generator Loss: 0.9906 Discriminator Loss: 1.2233...
Epoch 1/1... Generator Loss: 0.8311 Discriminator Loss: 1.2493...
Epoch 1/1... Generator Loss: 0.7857 Discriminator Loss: 1.3166...
Epoch 1/1... Generator Loss: 0.8290 Discriminator Loss: 1.2806...
Epoch 1/1... Generator Loss: 1.0062 Discriminator Loss: 1.3424...
Epoch 1/1... Generator Loss: 0.7648 Discriminator Loss: 1.3851...
Epoch 1/1... Generator Loss: 0.7871 Discriminator Loss: 1.3207...
Epoch 1/1... Generator Loss: 0.7840 Discriminator Loss: 1.2678...
Epoch 1/1... Generator Loss: 1.0036 Discriminator Loss: 1.2333...
Epoch 1/1... Generator Loss: 0.9987 Discriminator Loss: 1.1325...
Epoch 1/1... Generator Loss: 0.7526 Discriminator Loss: 1.3356...
Epoch 1/1... Generator Loss: 0.7681 Discriminator Loss: 1.3283...
Epoch 1/1... Generator Loss: 0.7332 Discriminator Loss: 1.3671...

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_face_generation.ipynb" and also save it as an HTML file via "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.