
[R] Unraveling the Mysteries: Why is AdamW Often Superior to Adam+L2 in Practice?

Hello, ML enthusiasts! 🚀🤖 We analyzed rotational equilibria in our latest work, ROTATIONAL EQUILIBRIUM: HOW WEIGHT DECAY BALANCES LEARNING ACROSS NEURAL NETWORKS

💡 Our Findings: Balanced average rotational updates (effective learning rate) across all network components may play a key role in the effectiveness of AdamW.
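To make the "rotational update" idea concrete, here is a minimal sketch (not the paper's code) of how one could measure the average angle between a weight vector before and after each optimizer step, for AdamW (decoupled weight decay) versus Adam with L2 regularization added to the gradient. The toy gradients, the two "layer" gradient scales, and all hyperparameters below are illustrative assumptions, chosen only to show the measurement itself:

```python
import numpy as np

def angle(w_old, w_new):
    """Angle in radians between two weight vectors, i.e. the rotational update."""
    cos = np.dot(w_old, w_new) / (np.linalg.norm(w_old) * np.linalg.norm(w_new))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def adam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8,
              wd=0.1, decoupled=True):
    """One Adam-style step. decoupled=True -> AdamW; False -> Adam + L2."""
    if not decoupled:
        g = g + wd * w                      # L2: penalty flows through the adaptive statistics
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    w_new = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    if decoupled:
        w_new = w_new - lr * wd * w         # AdamW: decay applied directly to the weights
    return w_new, m, v

rng = np.random.default_rng(0)
for decoupled in (True, False):
    label = "AdamW (decoupled)" if decoupled else "Adam + L2"
    # Two synthetic "network components" whose stochastic gradients differ in scale,
    # so we can compare their average rotational updates under each variant.
    for grad_scale in (1.0, 10.0):
        w = rng.normal(size=256)
        m = np.zeros_like(w)
        v = np.zeros_like(w)
        angles = []
        for t in range(1, 5001):
            g = grad_scale * rng.normal(size=256)   # stand-in for a noisy gradient
            w_new, m, v = adam_step(w, g, m, v, t, decoupled=decoupled)
            angles.append(angle(w, w_new))
            w = w_new
        mean_deg = np.degrees(np.mean(angles[-1000:]))
        print(f"{label}, grad scale {grad_scale:>4.1f}: "
              f"mean rotational update (last 1000 steps) = {mean_deg:.3f} deg")
```

The printed per-component angles are what we mean by the effective learning rate here; comparing how similar they are across components under each variant is one way to probe the balancing effect discussed in the paper.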

🔗 ROTATIONAL EQUILIBRIUM: HOW WEIGHT DECAY BALANCES LEARNING ACROSS NEURAL NETWORKS

Looking forward to hearing your thoughts! Let's discuss this fascinating topic together!
