No rating data available
ICLR 2025
DynMoLE: Boosting Mixture of LoRA Experts Fine-Tuning with a Hybrid Routing Mechanism
Abstract
Keywords
Large Language Model, Self-supervised Fine-Tune, Parameter-Efficient Fine-Tuning, Mixture of LoRA Experts
Reviews and Discussion
Desk Rejected by the Program Chairs
Desk Rejection Reason
The submission uses smaller margins to fit within the page limit.