LWM-Spectro: A Foundation Model for Wireless Baseband Signal Spectrograms
arXiv preprint (cs.IT; eess.SP), 2026
Abstract
We present LWM-Spectro, a transformer-based foundation model pretrained on large-scale wireless I/Q data represented as time-frequency spectrograms. The model combines masked modeling, contrastive learning, and a mixture-of-experts (MoE) architecture to learn general-purpose representations that transfer effectively to downstream wireless tasks, including modulation classification and joint SNR/mobility recognition, even with limited supervision.
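The pretraining recipe above combines a masked-reconstruction term with a contrastive term. A minimal NumPy sketch of such a combined objective is shown below; the function names, the InfoNCE formulation, and the weighting `alpha` are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def masked_recon_loss(pred, target, mask):
    """MSE computed only over masked spectrogram patches.
    pred, target: (batch, patches, dim); mask: (batch, patches) in {0, 1}."""
    diff = (pred - target) ** 2
    return (diff * mask[:, :, None]).sum() / max(mask.sum(), 1.0)

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE contrastive loss: matching rows of z1/z2 are positives,
    all other rows in the batch are negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature
    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))                # positives on diagonal

def pretrain_loss(pred, target, mask, z1, z2, alpha=0.5):
    """Combined objective: masked reconstruction + weighted contrastive term."""
    return masked_recon_loss(pred, target, mask) + alpha * info_nce(z1, z2)
```

In practice `pred` would come from a transformer decoding masked spectrogram patches and `z1`/`z2` from two augmented views of the same I/Q capture; here they are stand-ins to show how the two terms compose.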
Key Contributions
- Wireless foundation model: Transformer-based representation learning for I/Q spectrograms
- Self-supervised pretraining: Masked modeling + contrastive learning to reduce label needs
- MoE scaling: Mixture-of-experts architecture for capacity/compute trade-offs
- Transfer learning: Strong performance on downstream tasks in few-shot and data-rich regimes
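The MoE scaling point trades parameter capacity against per-token compute by routing each token to only a few experts. The toy top-k routed layer below is a generic illustration of that mechanism in NumPy; the expert count, gating scheme, and renormalization are common conventions, not LWM-Spectro's reported architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class TopKMoE:
    """Toy top-k mixture-of-experts layer: each token is routed to its
    k highest-scoring expert projections, whose outputs are gate-weighted.
    Total parameters grow with num_experts; per-token compute grows with k."""

    def __init__(self, dim, num_experts=4, k=2, seed=0):
        rng = np.random.default_rng(seed)
        self.gate = rng.normal(scale=0.02, size=(dim, num_experts))
        self.experts = [rng.normal(scale=0.02, size=(dim, dim))
                        for _ in range(num_experts)]
        self.k = k

    def __call__(self, x):                         # x: (tokens, dim)
        scores = softmax(x @ self.gate)            # (tokens, num_experts)
        topk = np.argsort(scores, axis=1)[:, -self.k:]
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            g = scores[t, topk[t]]
            g = g / g.sum()                        # renormalize over top-k
            for w, e in zip(g, topk[t]):
                out[t] += w * (x[t] @ self.experts[e])
        return out
```

Doubling `num_experts` while holding `k` fixed roughly doubles capacity at unchanged per-token FLOPs, which is the capacity/compute trade-off the contribution refers to.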
Status
Preprint - arXiv:2601.08780 (Jan. 2026)
Keywords
Foundation models, wireless I/Q, spectrograms, self-supervised learning, contrastive learning, mixture-of-experts, transfer learning
