Hi there! Thank you for sharing this code. I am trying to add an entropy term as a density-estimation regularizer, in addition to the reconstruction loss and the kernel alignment loss:
$\mathcal{L} = \frac{1}{N} \sum_{i=1}^{N} L(x_i, z_i) + \min_X \big( H(X) + \lambda\, D_{KL}(X \,\|\, Z) \big)$
I have made the following changes to the code:
```python
# Entropy of the normalized prior kernel, added as a density-estimation term
prob = prior_K_norm / tf.reduce_sum(prior_K_norm)
tf_entrpy = -tf.reduce_sum(prob * tf.log(prob + 1e-12))  # small epsilon guards against log(0)

# Total loss: reconstruction + weight regularization + (kernel alignment + entropy)
tot_loss = reconstruct_loss + args.w_reg * reg_loss + args.a_reg * (k_loss + tf_entrpy)

# Training step
_, train_loss, train_kloss, train_entropy_loss = sess.run(
    [update_step, reconstruct_loss, k_loss, tf_entrpy], fdtr)
entrpyloss_track.append(train_entropy_loss)

# Validation step
outvs, lossvs, klossvs, entrpy_loss_vs, vs_code_K, summary = sess.run(
    [dec_out, reconstruct_loss, k_loss, tf_entrpy, code_K, merged_summary], fdvs)
train_writer.add_summary(summary, ep)
print('VS r_loss=%.3f, k_loss=%.3f, entrpy_loss=%.3f -- TR r_loss=%.3f, k_loss=%.3f, entrpy_loss=%.3f' % (
    lossvs, klossvs, entrpy_loss_vs, np.mean(loss_track[-100:]),
    np.mean(kloss_track[-100:]), np.mean(entrpyloss_track[-100:])))
```
But the entropy loss is stuck at a constant value, and I am not sure whether backpropagation is actually reaching the entropy term (a small gradient-check sketch follows below). Can you kindly have a look? Thank you in advance!
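For what it's worth, here is a minimal, self-contained TF 1.x sketch of the gradient check I have in mind (the names `kernel_entropy`, `Z`, `code_K`, and `prior_K` are illustrative, not taken from your code): if the tensor the entropy is computed on depends on trainable variables, `tf.gradients` returns a real gradient, whereas for a fixed prior kernel it returns `None`, which would explain the term staying constant.

```python
import numpy as np
import tensorflow as tf  # TF 1.x graph mode, as in the repo

def kernel_entropy(K):
    """Entropy of a kernel matrix renormalized to sum to 1 (mirrors my tf_entrpy above)."""
    prob = K / tf.reduce_sum(K)
    return -tf.reduce_sum(prob * tf.log(prob + 1e-12))

# Toy latent codes backed by a trainable variable, so gradients can flow through them.
Z = tf.Variable(tf.random_normal([8, 4]), name='codes')
sq_dist = tf.reduce_sum(tf.square(Z[:, None, :] - Z[None, :, :]), axis=-1)
code_K = tf.exp(-sq_dist)  # RBF kernel computed on the codes

# A fixed prior kernel: it has no path to any trainable variable.
prior_K = tf.constant(np.eye(8, dtype=np.float32))

grad_from_codes = tf.gradients(kernel_entropy(code_K), [Z])[0]  # real tensor -> term is trainable
grad_from_prior = tf.gradients(kernel_entropy(prior_K), [Z])[0]  # None -> term cannot change
print(grad_from_codes, grad_from_prior)
```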