Added LearningRateScheduler functionality for Image Classification #4340
Conversation
Code with LearningRateScheduler class and GradientDescentOptimizer class for allowing learning rate as Tensor.
Codecov Report

|          | master | #4340   | +/- |
|----------|--------|---------|-----|
| Coverage | ?      | 74.66%  |     |
| Files    | ?      | 883     |     |
| Lines    | ?      | 155117  |     |
| Branches | ?      | 16931   |     |
| Hits     | ?      | 115824  |     |
| Misses   | ?      | 34549   |     |
| Partials | ?      | 4744    |     |
Installed Tensorflow.NET version 0.11.7 for GradientDescentOptimizer to take learning rate as a tensor. Addressed Zeeshan's comments. Added linear scale rule LR decay method for learning rate scheduling.
…to LRSchedulerCodeOnly
1. Updated TensorFlow.NET NuGet to 0.11.8.1, which fixes all the issues with GradientDescentOptimizer for Tensor input. 2. Added Exponential decay and Linear Scaling Decay for learning rate scheduling. Removed BasicLR class. 3. Added a sample for testing linear scaling rule and LR decay for the Cifar dataset with resnet_v2_101. 4. Added a unit test to test Exponential decay.
…to LRSchedulerCodeOnly
Changed the LearningRateScheduler interface to an abstract class as discussed with Eric. Added more comments for the learning rate functions.
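The exponential decay mentioned in these commits is the standard schedule lr = initialLr · decayRate^(globalStep / decaySteps). Below is a minimal, self-contained C# sketch of that computation; the class and member names are illustrative assumptions, not the ones the PR adds to ML.NET.

```csharp
using System;

// Minimal sketch of an exponential-decay learning rate schedule.
// Names (ExponentialDecaySketch, Compute, ...) are illustrative and do not
// correspond to the actual ML.NET LearningRateScheduler API.
public sealed class ExponentialDecaySketch
{
    private readonly float _initialLearningRate;
    private readonly float _decayRate;
    private readonly float _decaySteps;
    private readonly bool _staircase;

    public ExponentialDecaySketch(float initialLearningRate, float decayRate,
        float decaySteps, bool staircase = false)
    {
        _initialLearningRate = initialLearningRate;
        _decayRate = decayRate;
        _decaySteps = decaySteps;
        _staircase = staircase;
    }

    // lr = initialLr * decayRate ^ (globalStep / decaySteps)
    public float Compute(long globalStep)
    {
        double exponent = globalStep / (double)_decaySteps;
        if (_staircase)
            exponent = Math.Floor(exponent); // decay in discrete intervals
        return (float)(_initialLearningRate * Math.Pow(_decayRate, exponent));
    }
}
```

For example, `new ExponentialDecaySketch(0.01f, 0.96f, 1000f).Compute(2000)` yields roughly 0.01 × 0.96² ≈ 0.0092.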
@harshithapv Can you please update the title and description of the PR? #Resolved
scoreColumnName = ctx.Reader.ReadString();
predictedColumnName = ctx.Reader.ReadString();
learningRate = ctx.Reader.ReadFloat();
useLearningRateScheduling = ctx.Reader.ReadBoolByte();
useLearningRateScheduling
why are we serializing this? It is only needed at training and not inferencing, right?
We should not be serializing this. This was discussed with Zeeshan and will be addressed by him in his PR.
In reply to: 337825750
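For context, the loader above reads these fields in a fixed order, so the corresponding save path has to write them in the same order. Here is a library-agnostic sketch of that pairing, using plain BinaryReader/BinaryWriter rather than ML.NET's model save/load context, with hypothetical type and field names; per the review, a training-only flag like useLearningRateScheduling would simply be left out of the serialized state.

```csharp
using System.IO;

// Library-agnostic sketch only: save and load must write and read fields in
// the same order. Names here are hypothetical, not the PR's actual types.
internal sealed class ClassifierStateSketch
{
    public string ScoreColumnName;
    public string PredictedColumnName;
    public float LearningRate;

    public void Save(BinaryWriter writer)
    {
        writer.Write(ScoreColumnName);
        writer.Write(PredictedColumnName);
        writer.Write(LearningRate);
        // Intentionally no useLearningRateScheduling here: per the review,
        // it only affects training, not inference, so it belongs in the
        // trainer options rather than the serialized model.
    }

    public static ClassifierStateSketch Load(BinaryReader reader)
    {
        return new ClassifierStateSketch
        {
            ScoreColumnName = reader.ReadString(),
            PredictedColumnName = reader.ReadString(),
            LearningRate = reader.ReadSingle()
        };
    }
}
```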
Added LearningRateSchedulerItem struct to represent the epoch-scaling factor pairs required for the Linear scale rule and read them as an IReadOnlyList. Addressed Zeeshan's comments.
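As a rough illustration of that idea (the member names below are guesses, not the PR's actual LearningRateSchedulerItem definition), each item pairs an epoch with a scaling factor, and the linear scale rule picks the factor for the current epoch from an IReadOnlyList of items:

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only: a list of (epoch, scale) pairs that a linear
// scale rule consults to multiply the base learning rate.
public readonly struct LrScaleItemSketch
{
    public LrScaleItemSketch(int epoch, float scale)
    {
        Epoch = epoch;
        Scale = scale;
    }

    public int Epoch { get; }   // epoch at which this scaling factor kicks in
    public float Scale { get; } // multiplier applied to the base learning rate
}

public static class LinearScaleRuleSketch
{
    // Returns the learning rate for a given epoch: the latest item whose
    // Epoch <= epoch determines the scaling factor.
    public static float GetLearningRate(
        float baseLearningRate, int epoch, IReadOnlyList<LrScaleItemSketch> items)
    {
        float scale = 1f;
        foreach (var item in items.OrderBy(i => i.Epoch))
        {
            if (item.Epoch <= epoch)
                scale = item.Scale;
        }
        return baseLearningRate * scale;
    }
}
```

With items {(0, 1f), (10, 0.1f), (20, 0.01f)} and a base rate of 0.1, epochs 0–9 train at 0.1, epochs 10–19 at 0.01, and epoch 20 onward at 0.001.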
codemzs left a comment
Added LearningRateScheduler functionality for Image Classification (dotnet#4340)

* Code with LearningRateScheduler class and GradientDescentOptimizer class for allowing learning rate as Tensor.
* Installed Tensorflow.NET version 0.11.7 for GradientDescentOptimizer to take learning rate as a tensor. Addressed Zeeshan's comments. Added linear scale rule LR decay method for learning rate scheduling.
* Synced with master and edited a few comments.
* 1. Updated TensorFlow.NET NuGet to 0.11.8.1, which fixes all the issues with GradientDescentOptimizer for Tensor input. 2. Added Exponential decay and Linear Scaling Decay for learning rate scheduling. Removed BasicLR class. 3. Added a sample for testing linear scaling rule and LR decay for the Cifar dataset with resnet_v2_101. 4. Added a unit test to test Exponential decay.
* Fixed a bug that occurs while loading in-memory images.
* Changed the LearningRateScheduler interface to an abstract class as discussed with Eric. Added more comments for the learning rate functions.
* Reverted LearningRateSchedulingCifarResnetTransferLearning.cs.
* Fixed unit test. Addressed Eric's comments.
* Added an internal constructor in the LearningRateScheduler class.
* Added LearningRateSchedulerItem struct to represent the epoch-scaling factor pairs required for the Linear scale rule and read them as an IReadOnlyList. Addressed Zeeshan's comments.