
Minor change in loss function is not working as expected! #1

Open

Description

@padmaksha18

Hi there! Thank you for sharing this code. I am trying to add an entropy term for density estimation, in addition to the reconstruction loss and the kernel alignment loss:

$ \mathcal{L} = \frac{1}{N} \sum_{n=1}^{N} L(X_n, Z_n) + \min_{X} \left( H(X) + \lambda\, D_{KL}(X \| Z) \right) $
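Concretely, in the code below the entropy term $H$ is the Shannon entropy of the prior kernel matrix (prior_K_norm, written $K$ here) normalized into a probability distribution:

$ p_{ij} = \frac{K_{ij}}{\sum_{k,l} K_{kl}}, \qquad H = -\sum_{i,j} p_{ij} \log p_{ij} $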

I have made the following changes to the code:

# Normalize the prior kernel matrix into a probability distribution
# and compute its Shannon entropy.
prob = prior_K_norm / tf.reduce_sum(prior_K_norm)
tf_entrpy = -tf.reduce_sum(prob * tf.log(prob))

# Total loss: reconstruction + weight regularization + kernel alignment + entropy.
tot_loss = reconstruct_loss + args.w_reg * reg_loss + args.a_reg * (k_loss + tf_entrpy)

# Training step: run the optimizer and fetch the individual loss terms.
_, train_loss, train_kloss, train_entropy_loss = sess.run(
    [update_step, reconstruct_loss, k_loss, tf_entrpy], fdtr)
entrpyloss_track.append(train_entropy_loss)

# Validation: evaluate the losses and write the summaries.
outvs, lossvs, klossvs, entrpy_loss_vs, vs_code_K, summary = sess.run(
    [dec_out, reconstruct_loss, k_loss, tf_entrpy, code_K, merged_summary], fdvs)
train_writer.add_summary(summary, ep)
print('VS r_loss=%.3f, k_loss=%.3f, entrpy_loss=%.3f -- TR r_loss=%.3f, k_loss=%.3f, entrpy_loss=%.3f' % (
    lossvs, klossvs, entrpy_loss_vs,
    np.mean(loss_track[-100:]), np.mean(kloss_track[-100:]), np.mean(entrpyloss_track[-100:])))

But the entropy loss is stuck at the same value throughout training. I am not sure whether gradients are actually being backpropagated through the entropy term. Could you kindly have a look? Thank you in advance!
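For reference, here is how one could check whether gradients flow through the entropy term at all (a minimal sketch, assuming TensorFlow 1.x as used in this repo; tf_entrpy is the op defined above):

# Ask TF1 for the gradients of the entropy term w.r.t. the trainable variables.
grads = tf.gradients(tf_entrpy, tf.trainable_variables())
# If every entry is None, tf_entrpy does not depend on any trainable variable
# (e.g. because prior_K_norm is a fixed prior), so the optimizer has nothing
# to update and the logged entropy would stay constant.
print([g is not None for g in grads])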
