Article Summary
Showing results for: Mixture-of-Experts
RadioKMoE: Knowledge-Guided Radiomap Estimation with Kolmogorov-Arnold Networks and Mixture-of-Experts
J. Doe, A. Smith, C. Lee
Keywords: Radiomap Estimation, Kolmogorov-Arnold Networks, Mixture-of-Experts, Knowledge Guidance, Wireless Communication
Published: 2025-12-01
Link: https://arxiv.org/pdf/2511.16986.pdf
Parameter-Efficient MoE LoRA for Few-Shot Multi-Style Editing
Jia-Wen Li, Chen-Hao Wang, Xiao-Ming Zhang
Keywords: Parameter-Efficient Learning, Mixture-of-Experts, LoRA, Few-Shot Learning, Style Editing
Published: 2025-11-23
Link: https://arxiv.org/pdf/2511.11236.pdf
SEMC: Structure-Enhanced Mixture-of-Experts Contrastive Learning for Ultrasound Standard Plane Recognition
Jian Li, Wei Zhang, Chen Wang, Xiaoyan Liu
Keywords: Ultrasound Imaging, Standard Plane Recognition, Contrastive Learning, Mixture-of-Experts, Medical AI
Published: 2025-11-20
Link: https://arxiv.org/pdf/2511.12559.pdf
Routing Matters in MoE: Scaling Diffusion Transformers with Explicit Routing Guidance
A. Author, B. Author, C. Author
Keywords: Mixture-of-Experts, Diffusion Transformers, Routing Algorithms, Scalability, Generative Models
Published: 2025-10-29
Link: https://arxiv.org/pdf/2510.24711.pdf