I'm confused about whether we need to update the VGG network. The example in the PyTorch tutorial keeps the parameters of vgg16 untouched, but as far as I understand, your implementation updates them. Did I miss something important?
Oh, my silly question. Parameters are only updated by optimizers. So the content loss is backpropagated through VGG and then into the Generator, since one input to the content-loss criterion, the content_input, comes from the network input (which requires gradient). Do I get the point? Thanks for your reply.
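To sanity-check that reasoning, here is a minimal sketch (not the repository's actual code; the tiny conv net is a hypothetical stand-in for VGG): the feature extractor's parameters are frozen with `requires_grad_(False)`, yet gradients still flow *through* it into the input that requires gradient.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-in for a frozen VGG feature extractor.
feature_net = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
for p in feature_net.parameters():
    p.requires_grad_(False)  # frozen: the optimizer would never touch these

# The input we *do* want gradients for (e.g. the generator's output).
x = torch.randn(1, 3, 16, 16, requires_grad=True)
target_features = torch.randn(1, 8, 16, 16)

loss = F.mse_loss(feature_net(x), target_features)
loss.backward()  # backprop passes through the frozen net into x

assert x.grad is not None                                      # input gets a gradient
assert all(p.grad is None for p in feature_net.parameters())   # VGG stays untouched
```

So freezing the VGG weights and backpropagating the content loss through it are not in conflict: the chain rule still uses VGG's (fixed) weights to route gradients back to whatever upstream tensor requires them.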
Edit: Your implementation is different from others I've seen, where the total loss is usually formed as total_loss = loss1 + loss2 + ... and then backpropagated once. It was a little difficult for me to understand your implementation at first glance because of my limited knowledge of PyTorch, but it is quite great. Thanks again!
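For what it's worth, the two styles are equivalent in the gradients they produce, because `backward()` accumulates into `.grad`. A toy sketch (generic tensors, not the repository's losses):

```python
import torch

w = torch.tensor([1.0, 2.0], requires_grad=True)

# Style A: sum the losses, then one backward call.
loss1 = (w ** 2).sum()   # d/dw = 2w
loss2 = (3 * w).sum()    # d/dw = 3
(loss1 + loss2).backward()
grad_summed = w.grad.clone()

# Style B: backpropagate each loss separately; gradients accumulate in w.grad.
w.grad = None
loss1 = (w ** 2).sum()
loss2 = (3 * w).sum()
loss1.backward()
loss2.backward()

assert torch.allclose(grad_summed, w.grad)  # identical accumulated gradients
```

Calling `backward()` per loss can save memory (each subgraph is freed after its own pass), which is one reason an implementation might prefer it over summing everything first.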