ICLR 2024
Mitigating Accumulated Distribution Divergence in Batch Normalization for Unsupervised Domain Adaptation
Abstract
Batch Normalization (BN) is widely used in modern deep neural networks and has proven effective for cross-domain tasks such as Unsupervised Domain Adaptation (UDA). However, existing BN variants tend to aggregate source- and target-domain knowledge in the same channels, which can lead to suboptimal transferability because features from the two domains remain unaligned. To address this issue, we propose a new normalization method called Refined Batch Normalization (RBN), which leverages the estimated shift to quantify the difference between the estimated population statistics and the expected statistics. Our key finding is that the estimated shift accumulates as BN layers are stacked in the network, which can degrade performance on the target domain. We further demonstrate that RBN prevents this accumulation and improves overall performance. To implement this technique, we introduce the RBNBlock, which replaces a BN layer with RBN in the bottleneck block of a residual network. Our comprehensive experiments on cross-domain benchmarks confirm the effectiveness of RBN in improving transferability across domains.
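The abstract does not specify the exact RBN formulation or which BN layer the RBNBlock replaces. As an illustration only, the minimal PyTorch-style sketch below swaps the middle BN of a ResNet bottleneck for a hypothetical `RBN` module that normalizes with current-batch statistics even at inference time, one plausible way to stop errors in the running (population) estimates from accumulating across stacked normalization layers. The paper's actual correction of the estimated shift, and its choice of which BN to replace, may differ.

```python
import torch
import torch.nn as nn


class RBN(nn.Module):
    """Hypothetical stand-in for Refined Batch Normalization (illustration only).

    It normalizes each channel with statistics computed from the current batch,
    rather than with accumulated running estimates, so estimation errors cannot
    build up across stacked normalization layers. The paper's actual RBN may
    correct the estimated shift differently.
    """

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))

    def forward(self, x):
        # Per-channel statistics over the current batch (input shape: N, C, H, W).
        mean = x.mean(dim=(0, 2, 3), keepdim=True)
        var = x.var(dim=(0, 2, 3), unbiased=False, keepdim=True)
        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        return x_hat * self.weight.view(1, -1, 1, 1) + self.bias.view(1, -1, 1, 1)


class RBNBlock(nn.Module):
    """ResNet-style bottleneck in which one BN (here, the middle one) is replaced by RBN."""

    def __init__(self, in_channels, mid_channels, out_channels):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, mid_channels, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(mid_channels)
        self.conv2 = nn.Conv2d(mid_channels, mid_channels, 3, padding=1, bias=False)
        self.rbn = RBN(mid_channels)  # BN -> RBN swap (assumed position)
        self.conv3 = nn.Conv2d(mid_channels, out_channels, 1, bias=False)
        self.bn3 = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace=True)
        self.shortcut = (nn.Identity() if in_channels == out_channels
                         else nn.Conv2d(in_channels, out_channels, 1, bias=False))

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.relu(self.rbn(self.conv2(out)))
        out = self.bn3(self.conv3(out))
        return self.relu(out + self.shortcut(x))
```

This sketch only shows the structural BN-to-RBN swap inside one block; a full implementation would also have to define how the estimated shift is measured and corrected during training.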
Keywords
Optimized Batch Normalization, Distribution Divergence, Unsupervised Domain Adaptation
Reviews and Discussion
No review records yet.