Filtered not Mixed: Stochastic Filtering-Based Online Gating for Mixture of Large Language Models

Bibliographic Details
Title: Filtered not Mixed: Stochastic Filtering-Based Online Gating for Mixture of Large Language Models
Authors: Saqur, Raeid, Kratsios, Anastasis, Krach, Florian, Limmer, Yannick, Tian, Jacob-Junqi, Willes, John, Horvath, Blanka, Rudzicz, Frank
Publication Year: 2024
Collection: Computer Science; Quantitative Finance
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Computation and Language, Quantitative Finance - Computational Finance, Quantitative Finance - Mathematical Finance, 60J05, 60G35, 68T20, 68T42, 68T50, I.2.6, I.2.7, G.3
Description: We propose MoE-F -- a formalised mechanism for combining $N$ pre-trained expert Large Language Models (LLMs) in online time-series prediction tasks by adaptively forecasting the best weighting of LLM predictions at every time step. Our mechanism leverages the conditional information in each expert's running performance to forecast the best combination of LLMs for predicting the time series in its next step. Diverging from static (learned) Mixture of Experts (MoE) methods, MoE-F employs time-adaptive stochastic filtering techniques to combine experts. By framing the expert selection problem as a finite state-space, continuous-time Hidden Markov model (HMM), we can leverage the Wonham-Shiryaev filter. Our approach first constructs $N$ parallel filters corresponding to each of the $N$ individual LLMs. Each filter proposes its best combination of LLMs, given the information that it has access to. Subsequently, the $N$ filter outputs are aggregated to optimize a lower bound for the loss of the aggregated LLMs, which can be optimized in closed form, thus generating our ensemble predictor. Our contributions are: (I) the MoE-F algorithm -- deployable as a plug-and-play filtering harness, (II) theoretical optimality guarantees of the proposed filtering-based gating algorithm, and (III) empirical evaluation and ablative results using state-of-the-art foundation and MoE LLMs on a real-world Financial Market Movement task, where MoE-F attains a remarkable 17% absolute and 48.5% relative F1-measure improvement over the next best performing individual LLM expert. A minimal illustrative sketch of the filtering step follows this record.
Comment: 29 pages, 5 Appendix sections
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2406.02969
Accession Number: edsarx.2406.02969
Database: arXiv
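
Illustrative sketch (not part of the record): the abstract frames expert gating as filtering a finite-state, continuous-time hidden Markov chain with a Wonham-Shiryaev filter. The minimal Python sketch below shows a single discretized Wonham filter whose posterior over the $N$ hidden "best expert" states can serve as gating weights. It is a simplification under stated assumptions: the paper runs $N$ parallel filters and aggregates them via a closed-form lower-bound optimization, which is omitted here; the generator Q, drift vector h, unit observation noise, and the Euler discretization are illustrative choices, and wonham_step is a hypothetical helper, not the authors' code.

    import numpy as np

    def wonham_step(pi, Q, h, dY, dt):
        # One Euler step of a discretized Wonham filter.
        #   pi : posterior over the N hidden states ("which expert is best now")
        #   Q  : N x N generator (rate) matrix of the hidden chain (rows sum to 0)
        #   h  : per-state observation drifts (each expert's performance signal)
        #   dY : scalar observation increment, modeled as dY = h[X] dt + dW
        # Prediction: propagate the posterior with the chain dynamics d(pi)/dt = pi Q.
        pred = np.maximum(pi + pi @ Q * dt, 0.0)
        # Correction: reweight by the Gaussian likelihood of the increment
        # (unit noise intensity assumed), then renormalize to a probability vector.
        post = pred * np.exp(h * dY - 0.5 * h ** 2 * dt)
        return post / post.sum()

    # Toy run: 3 experts; the hidden "best expert" is held at index 2 for simplicity.
    rng = np.random.default_rng(0)
    N, dt, steps = 3, 0.01, 2000
    Q = np.full((N, N), 0.05) - N * 0.05 * np.eye(N)  # slow uniform switching
    h = np.array([0.0, 0.0, 2.0])                     # drift favors expert 2
    pi = np.full(N, 1.0 / N)
    for _ in range(steps):
        dY = h[2] * dt + np.sqrt(dt) * rng.standard_normal()
        pi = wonham_step(pi, Q, h, dY, dt)
    print(pi)  # the gating weights should concentrate on expert 2

In the paper's setting, such weights would be recomputed at every time step from each expert's running performance, and the $N$ per-filter proposals then aggregated; this sketch only illustrates the filtering recursion itself.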