
Why vgg19 updates? #11

Open
AlbertZhangHIT opened this issue Jun 6, 2018 · 2 comments

AlbertZhangHIT commented Jun 6, 2018

I'm confused about whether we need to update the VGG network. The example in the PyTorch tutorial keeps the parameters of vgg16 untouched, but as far as I understand, your implementation updates the parameters. Did I miss something important?

twtygqyy (Owner) commented Jun 7, 2018

Hi @AlbertZhangHIT, I didn't update the parameters of the VGG network; I just apply the loss to backpropagate through the SR network.
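The pattern described above can be sketched roughly as follows. This is not the repository's actual code: tiny `nn.Linear` layers stand in for the truncated VGG19 feature extractor and the SR network, just to show that a frozen feature extractor still lets gradients flow through it into the generator.

```python
# Sketch: perceptual/content loss with a frozen feature extractor.
# The "VGG" weights are never updated, but backward() still propagates
# gradients through it into the SR (generator) network.
import torch
import torch.nn as nn

torch.manual_seed(0)

feature_extractor = nn.Linear(4, 4)   # stand-in for the truncated VGG network
for p in feature_extractor.parameters():
    p.requires_grad = False           # freeze VGG: no gradients for its weights

generator = nn.Linear(4, 4)           # stand-in for the SR network
optimizer = torch.optim.SGD(generator.parameters(), lr=0.01)

lr_input = torch.randn(2, 4)
hr_target = torch.randn(2, 4)

sr_output = generator(lr_input)
# Content loss compares features of the SR output and the HR target.
content_loss = nn.functional.mse_loss(
    feature_extractor(sr_output), feature_extractor(hr_target)
)
content_loss.backward()               # flows *through* VGG into the generator
optimizer.step()                      # only the generator's weights change

# Frozen extractor accumulated no gradients; the generator did.
assert all(p.grad is None for p in feature_extractor.parameters())
assert all(p.grad is not None for p in generator.parameters())
```

Since the optimizer only receives `generator.parameters()`, even accidental gradients on the extractor would never be applied; setting `requires_grad = False` additionally avoids computing and storing them.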

AlbertZhangHIT (Author) commented Jun 19, 2018

Ah, my silly question. The parameters are only updated by the optimizer. So the content_loss is backpropagated through VGG and then into the generator, since one input to the content-loss criterion, the content_input, comes from the generator's output (which requires gradients). Do I have it right? Thanks for your reply.

Edit: Your implementation is different from others I have seen, where the losses are usually combined as total_loss = loss1 + loss2 + ... and then backpropagated together. It was a little difficult for me to follow at first glance because of my limited PyTorch knowledge, but your implementation is quite good. Thanks again!
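For what it's worth, the two styles mentioned above are equivalent in the gradients they produce: calling `backward()` on each loss separately accumulates into `.grad`, which yields the same result as summing the losses and calling `backward()` once. A minimal sketch with a scalar parameter:

```python
# Sketch: summed-loss backward vs. per-loss backward give identical gradients,
# because PyTorch accumulates gradients into .grad across backward() calls.
import torch

w = torch.tensor(2.0, requires_grad=True)

# Style 1: total_loss = loss1 + loss2, one backward() call.
loss1 = (w - 1.0) ** 2
loss2 = 3.0 * w
(loss1 + loss2).backward()
grad_summed = w.grad.item()           # d/dw[(w-1)^2 + 3w] at w=2 -> 2*(2-1)+3 = 5

# Style 2: backpropagate each loss separately; gradients accumulate.
w.grad = None
loss1 = (w - 1.0) ** 2
loss2 = 3.0 * w
loss1.backward()
loss2.backward()
grad_separate = w.grad.item()

assert grad_summed == grad_separate == 5.0
```

The per-loss style can be convenient when different losses should drive different subnetworks, as in this repository's content-loss setup.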
