additional details on ResNet20 (low rank) #29

Open
akamaster opened this issue May 2, 2018 · 4 comments

@akamaster

Would you please share the speed-up ratios and final ranks for ResNet20 (CIFAR) obtained in the "Coordinating Filters for Faster Deep Neural Networks" paper?

@wenwei202
Owner

Ranks per conv layer (layers 1–19):

Conv layer     1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19
Baseline       8 13 13 14 14 14 13 23 26 27 28 28 28 47 54 56 55 56 31
L2-norm force  8  9 10 10 10  9 10  8  9  9 10  9  8  2  1  1  1  1  1
Full rank     16 16 16 16 16 16 16 32 32 32 32 32 32 64 64 64 64 64 64
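
A rough way to turn these ranks into per-layer speed-up estimates is sketched below. This is only a sketch under my own assumptions: it uses the cross-filter low-rank factorization (a 3x3 conv producing r channels followed by a 1x1 conv back to N channels) and the standard CIFAR ResNet-20 layer shapes; it is not the measurement from the paper.

```python
# Rough per-layer speedup estimate from the ranks above (a sketch, not the
# paper's exact measurement). Assumes the cross-filter low-rank decomposition:
# a conv with N output channels, C input channels and k x k kernels,
# approximated at rank r, costs r*(C*k*k + N) multiply-adds per output
# position instead of N*C*k*k. The CIFAR ResNet-20 layer shapes below are
# my assumption, not confirmed in this thread.

baseline  = [8, 13, 13, 14, 14, 14, 13, 23, 26, 27, 28, 28, 28, 47, 54, 56, 55, 56, 31]
forced    = [8,  9, 10, 10, 10,  9, 10,  8,  9,  9, 10,  9,  8,  2,  1,  1,  1,  1,  1]
full_rank = [16] * 7 + [32] * 6 + [64] * 6   # output channels N of each conv layer

in_ch = [3] + full_rank[:-1]                 # input channels C of each conv layer
k = 3                                        # all convs are 3x3

def speedup(rank, n_out, n_in, k=3):
    """FLOP ratio: full conv vs. its rank-r two-stage factorization."""
    full = n_out * n_in * k * k
    low  = rank * (n_in * k * k + n_out)
    return full / low

for i, (rb, rf, n, c) in enumerate(zip(baseline, forced, full_rank, in_ch), start=1):
    print(f"conv{i:2d}: baseline rank {rb:2d} -> x{speedup(rb, n, c):4.2f}, "
          f"forced rank {rf:2d} -> x{speedup(rf, n, c):5.2f}")
```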

@akamaster
Author

akamaster commented May 4, 2018

Thanks a lot. What is the accuracy of this model? Is it the 7.97% you report in the paper? Or is it something else, i.e. the plot in the paper was for one particular regularization value and the reported accuracy was for another? Could you please elaborate?

And one more question: you gave 19 values, but the network has 22 layers. Does that mean no parameterization happens for the other layers?

@wenwei202
Owner

It's the one in Figure 5: 8.82% for the baseline and 9.57% for ours. Force regularization is applied only to the conv layers, which are indexed in order in the table.
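
For anyone mapping the 19 table columns to layers, here is a small sketch of how the conv layers of a standard CIFAR ResNet-20 are ordered. The layer names are illustrative (my own, not taken from the repo's prototxt), and the final FC layer, plus any projection-shortcut convolutions if the model uses them, would not appear among the 19 entries.

```python
# Sketch: enumerate the 19 conv layers of a standard CIFAR ResNet-20 in
# forward order, matching the column order of the rank table above.
# Layer names here are illustrative, not taken from the repository.
layers = ["conv1 (3->16, 3x3)"]               # stem conv
for stage, width in enumerate([16, 32, 64], start=1):
    for block in range(1, 4):                 # 3 residual blocks per stage
        for conv in (1, 2):                   # 2 convs per basic block
            layers.append(f"stage{stage}.block{block}.conv{conv} ({width} ch)")

for idx, name in enumerate(layers, start=1):
    print(f"{idx:2d}: {name}")
# 1 + 3*3*2 = 19 conv layers; the final FC layer (and any projection
# shortcuts, if present) are not among the 19 entries.
```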

@akamaster
Author

What lambda value did you use to obtain that result? I don't see a specific value set in any of the scripts.
