no backward pass? #6

Open
etienne87 opened this issue Jul 5, 2018 · 2 comments

Comments

etienne87 commented Jul 5, 2018

I'm printing the weights of the network and they are not changing. This makes sense, since all the binarization happens only on the data (not in the graph), so the weights will not update.

How can this code train a network from scratch with binarization?
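For reference, here is a minimal sketch (not this repo's actual code) of the usual BinaryNet-style approach: apply the binarization inside the forward pass with a straight-through estimator, so the full-precision latent weights still receive gradients even though only their signs are used in the forward computation:

```python
import torch
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    @staticmethod
    def forward(ctx, weight):
        ctx.save_for_backward(weight)
        return weight.sign()  # forward uses only the binarized weights

    @staticmethod
    def backward(ctx, grad_output):
        (weight,) = ctx.saved_tensors
        # Straight-through estimator: pass gradients through, gated to
        # |w| <= 1 (the hard-tanh clipping from the BinaryNet paper).
        return grad_output * (weight.abs() <= 1).float()

class BinaryLinear(torch.nn.Linear):
    def forward(self, x):
        # Binarize *inside* the graph: the latent full-precision
        # self.weight keeps receiving gradients and keeps updating.
        return F.linear(x, BinarizeSTE.apply(self.weight), self.bias)
```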

@jafermarq

I think you are right. We can see that the network learns because the batchnorm parameters do change, but that, of course, is not enough to reach high accuracy.
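One quick way to check which parameters actually move across a training step (a hypothetical diagnostic, not code from this repo):

```python
import torch

def changed_params(model, loss, optimizer):
    # Snapshot every parameter, run one optimization step, and report
    # the names of the parameters whose values actually changed.
    before = {n: p.detach().clone() for n, p in model.named_parameters()}
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return [n for n, p in model.named_parameters()
            if not torch.equal(before[n], p.detach())]
```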

@appleleaves

In the file main_binary.py, line 252, there is a loss.backward() call, so I think the backward pass is already done.
Also, are you really sure that the weights are not changing?
Maybe you should try a higher learning rate to make sure the code is OK.
I think the gradients may be too small to change the signs of the weights.
@etienne87 @jafermarq @itayhubara
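To illustrate that last point with made-up numbers: SGD keeps moving the latent full-precision weight, but the binarized value sign(w) only flips once w crosses zero, so printing binarized weights can make training look frozen:

```python
import torch

w = torch.tensor(0.30)   # latent full-precision weight
lr, grad = 0.01, 0.05    # small learning rate and gradient (made-up values)

for step in range(5):
    w -= lr * grad       # w decreases by only 0.0005 per step
    print(step, round(w.item(), 4), int(torch.sign(w)))
# sign(w) stays +1 on every step; at this rate it would take 600 steps
# for w to cross zero and the binarized weight to flip.
```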
