Torch Anderson

Merged: Reshniak, Viktor requested to merge torch_Anderson into master

The list of changes:

  • reimplemented utils/anderson_acceleration.py (see the Anderson sketch after this list)
  • removed modules/AccelerationModule.py:
    • in my opinion, it is unnecessary
    • moved all logic to modules/optimizers.py
  • updated modules/optimizers.py:
    • removed the abstract Optimizers class
    • the training loop is now implemented only in the FixedPointIteration class
    • DeterministicAcceleration now inherits from FixedPointIteration and only overrides its accelerate method (see the class-layout sketch after this list)
    • acceleration is now performed on every parameter update rather than once per epoch (not sure about this choice)
    • added with torch.no_grad() blocks to avoid memory leaks (autograd would otherwise keep building the computation graph)
    • the history of updates is now a collections.deque instead of a list (see the deque example below)
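
For context, here is a minimal sketch of the kind of update a file like utils/anderson_acceleration.py computes, assuming the standard Type-II Anderson(m) scheme; the function name anderson_step, the beta and reg parameters, and the deque-based interface are illustrative assumptions, not the MR's actual API:

```python
import torch
from collections import deque

def anderson_step(X, F, beta=1.0, reg=1e-8):
    """One Anderson(m) update from histories of iterates and residuals.

    X[i] is the iterate x_i and F[i] = g(x_i) - x_i its residual, both
    1-D tensors. Returns the accelerated next iterate. (Hypothetical
    helper; the interface in the MR may differ.)
    """
    xk, fk = X[-1], F[-1]
    if len(X) < 2:                 # not enough history: plain mixing step
        return xk + beta * fk
    # Columns of residual/iterate differences, shape (n, m)
    dF = torch.stack([F[i + 1] - F[i] for i in range(len(F) - 1)], dim=1)
    dX = torch.stack([X[i + 1] - X[i] for i in range(len(X) - 1)], dim=1)
    # Mixing coefficients gamma = argmin ||f_k - dF @ gamma||, solved via
    # regularized normal equations for robustness to rank deficiency
    A = dF.T @ dF + reg * torch.eye(dF.shape[1], dtype=dF.dtype, device=dF.device)
    gamma = torch.linalg.solve(A, dF.T @ fk)
    return xk + beta * fk - (dX + beta * dF) @ gamma

# Toy usage on the contraction x -> cos(x), fixed point ~0.739
m = 5
X, F = deque(maxlen=m + 1), deque(maxlen=m + 1)
x = torch.zeros(3)
for _ in range(20):
    X.append(x)
    F.append(torch.cos(x) - x)
    x = anderson_step(X, F)
```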
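Similarly, a rough sketch of the class layout the bullets above describe, assuming a supervised training loop: the names FixedPointIteration and DeterministicAcceleration come from the MR, but the constructor arguments, the train signature, and the trivial two-point extrapolation used as a stand-in for the real acceleration step are all assumptions:

```python
import torch
from collections import deque

class FixedPointIteration:
    """Owns the whole training loop; subclasses customize via accelerate()."""

    def __init__(self, model, loss_fn, optimizer, history_depth=5):
        self.model, self.loss_fn, self.optimizer = model, loss_fn, optimizer
        # bounded history of flattened parameter vectors (a deque, see below)
        self.history = deque(maxlen=history_depth)

    def accelerate(self):
        """Hook called after every optimizer step; a no-op for plain iteration."""

    def train(self, loader, epochs):
        for _ in range(epochs):
            for xb, yb in loader:
                self.optimizer.zero_grad()
                self.loss_fn(self.model(xb), yb).backward()
                self.optimizer.step()
                # record state and accelerate on EVERY update, not once per epoch
                with torch.no_grad():
                    flat = torch.cat([p.flatten() for p in self.model.parameters()])
                self.history.append(flat)
                self.accelerate()


class DeterministicAcceleration(FixedPointIteration):
    """Same training loop; only the accelerate() hook is overridden."""

    def accelerate(self, beta=0.5):
        if len(self.history) < 2:
            return
        # no_grad: the extrapolated write-back must stay out of autograd,
        # otherwise the retained graph grows on every step (the memory leak)
        with torch.no_grad():
            # stand-in two-point extrapolation; the MR presumably calls into
            # utils/anderson_acceleration.py here instead
            x_new = self.history[-1] + beta * (self.history[-1] - self.history[-2])
            offset = 0
            for p in self.model.parameters():
                n = p.numel()
                p.copy_(x_new[offset:offset + n].view_as(p))
                offset += n
```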
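On the last point, a deque with maxlen gives the bounded history for free: appending to a full deque drops the oldest entry in O(1), whereas a plain list would need an explicit pop(0), which is O(n):

```python
from collections import deque

hist = deque(maxlen=3)
for k in range(5):
    hist.append(k)    # once full, the oldest entry is evicted automatically
print(list(hist))     # [2, 3, 4]
```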
