Conversation

tiberiu44
Contributor

This update fixes breaking changes introduced by newer numpy and torch releases, and publishes updated pre-trained models compatible with the new torch versions.

if params.resume:
    model.load('{0}.last'.format(params.output_base))
-   optimizer = torch.optim.Adam(model.parameters())
+   optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
Collaborator


I believe the learning rate should be specified in params, with a default value.
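
A minimal sketch of what the reviewer is suggesting, assuming params comes from argparse (the flag name --lr and the default of 1e-4 are assumptions based on the hard-coded value in the diff):

```python
import argparse

# Sketch of the suggested change: expose the learning rate as a
# configurable parameter with a default, instead of hard-coding
# lr=1e-4 at the optimizer call site.
parser = argparse.ArgumentParser()
parser.add_argument('--lr', type=float, default=1e-4,
                    help='learning rate for the Adam optimizer')

params = parser.parse_args([])                 # no flag: falls back to default
assert params.lr == 1e-4

params = parser.parse_args(['--lr', '5e-5'])   # user override from the CLI
assert params.lr == 5e-5

# The training script would then build the optimizer from params, e.g.:
# optimizer = torch.optim.Adam(model.parameters(), lr=params.lr)
```

This keeps the resume path and the fresh-start path on the same configuration source, so a resumed run can also pick a different learning rate without editing code.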
