Paper Title

Mixture of experts models for multilevel data: modelling framework and approximation theory

Paper Authors

Fung, Tsz Chai; Tseung, Spark C.

Abstract

Multilevel data are prevalent in many real-world applications. However, it remains an open research problem to identify and justify a class of models that flexibly captures a wide range of multilevel data. Motivated by the versatility of mixture of experts (MoE) models in fitting regression data, in this article we extend the MoE and study a class of mixed MoE (MMoE) models for multilevel data. Under some regularity conditions, we prove that the MMoE is dense in the space of continuous mixed effects models in the sense of weak convergence. As a result, the MMoE has the potential to accurately resemble almost all characteristics inherent in multilevel data, including the marginal distributions, dependence structures, regression links, random intercepts and random slopes. In the particular case where the multilevel data are hierarchical, we further show that a nested version of the MMoE universally approximates a broad range of dependence structures of the random effects among different factor levels.
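As a rough illustration of the framework described above (not the authors' implementation), a mixed MoE density can be sketched as a gating network whose mixing weights depend on both covariates and a group-level random effect, combined with a set of component "expert" distributions. The sketch below assumes Gaussian experts and a softmax gating function; all parameter names (`alpha`, `beta`, `mu`, `sigma`) are illustrative.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mmoe_density(y, x, u, alpha, beta, mu, sigma):
    """
    Illustrative mixed MoE (MMoE) density for one observation.

    y     : response (scalar)
    x     : covariate vector, shape (p,)
    u     : random effect (scalar intercept shared within a group)
    alpha : gating weights on covariates, shape (g, p) for g experts
    beta  : gating loadings on the random effect, shape (g,)
    mu    : expert mean coefficients, shape (g, p)
    sigma : expert standard deviations, shape (g,)
    """
    # Gating network: mixing weights vary with covariates and the random effect,
    # which is how the mixture captures within-group dependence.
    logits = alpha @ x + beta * u
    pi = softmax(logits)
    # Gaussian experts; other expert distributions could be substituted.
    means = mu @ x
    dens = np.exp(-0.5 * ((y - means) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    # Mixture density: weighted sum of expert densities.
    return float(pi @ dens)
```

Conditional on the random effect `u`, this is an ordinary MoE; integrating `u` out over its distribution yields the marginal multilevel model whose approximation properties the paper studies.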
