ICLR 2024
Stochastic Re-weighted Gradient Descent via Distributionally Robust Optimization
TL;DR
We design simple, explicit, and flexible per-sample re-weighting techniques, inspired by distributionally robust optimization, for training deep neural networks.
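A minimal sketch of what DRO-inspired per-sample re-weighting can look like in a training step, assuming a KL-divergence-style DRO objective in which per-sample weights are proportional to exp(loss_i / temperature); the function name `reweighted_loss` and the `temperature` parameter are illustrative, not taken from the paper.

```python
# Hypothetical sketch of per-sample re-weighting (KL-DRO style), not the paper's exact method.
import torch
import torch.nn.functional as F

def reweighted_loss(logits: torch.Tensor, targets: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    """Re-weight per-sample losses with a softmax over the losses themselves.

    Weights w_i ∝ exp(loss_i / temperature) up-weight harder examples in the batch.
    detach() keeps gradients from flowing through the weights.
    """
    per_sample = F.cross_entropy(logits, targets, reduction="none")      # one loss per example
    weights = torch.softmax(per_sample.detach() / temperature, dim=0)    # weights sum to 1 over the batch
    return (weights * per_sample).sum()                                  # weighted loss replaces the plain mean
```

In a training loop, this weighted loss would simply replace the usual mean cross-entropy before calling `backward()`; the temperature controls how aggressively hard examples are emphasized.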
Abstract
Keywords
Deep Learning, Robustness, Per-sample weighting
Reviews and Discussion
No reviews available yet