```python
for epoch in range(EPOCH):
    for step, (x, b_label) in enumerate(train_loader):
        b_x = x.view(-1, 28*28)   # batch x, shape (batch, 28*28)
        b_y = x.view(-1, 28*28)   # batch y, shape (batch, 28*28)

        encoded, decoded = autoencoder(b_x)

        loss = loss_func(decoded, b_y)   # mean square error
        optimizer.zero_grad()            # clear gradients for this training step
        loss.backward()                  # backpropagation, compute gradients
        optimizer.step()                 # apply gradients
```
Maybe `b_y = b_label.view(-1, 28*28)` would be more appropriate?
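For context, here is a minimal sketch of the definitions the loop above assumes (`EPOCH`, `train_loader`, `autoencoder`, `optimizer`, `loss_func`). The architecture, latent size, and hyperparameters below are illustrative assumptions, not taken from the original:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Assumed model: a small MLP autoencoder whose forward pass returns both
# the latent code and the reconstruction, matching
# `encoded, decoded = autoencoder(b_x)` in the loop above.
class AutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(28 * 28, 128), nn.Tanh(),
            nn.Linear(128, 3),                      # 3-D latent code (assumed size)
        )
        self.decoder = nn.Sequential(
            nn.Linear(3, 128), nn.Tanh(),
            nn.Linear(128, 28 * 28), nn.Sigmoid(),  # pixel values in [0, 1]
        )

    def forward(self, x):
        encoded = self.encoder(x)
        decoded = self.decoder(encoded)
        return encoded, decoded

# Assumed data and training setup (MNIST; batch size and lr are placeholders).
EPOCH = 10
train_data = datasets.MNIST(root='./mnist', train=True,
                            transform=transforms.ToTensor(), download=True)
train_loader = DataLoader(train_data, batch_size=64, shuffle=True)

autoencoder = AutoEncoder()
optimizer = torch.optim.Adam(autoencoder.parameters(), lr=5e-3)
loss_func = nn.MSELoss()   # mean square error between reconstruction and target
```

One point worth weighing for the question above: in this (assumed) MNIST setup, `b_label` holds integer class indices of shape `(batch,)`, so it could not be viewed as `(batch, 28*28)`; an autoencoder's reconstruction target is the input itself, which is why `b_y` is a second view of `x`.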