Paper Title
Compressed Hierarchical Representations for Multi-Task Learning and Task Clustering
Paper Authors
Paper Abstract
In this paper, we frame homogeneous-feature multi-task learning (MTL) as a hierarchical representation learning problem, with one task-agnostic and multiple task-specific latent representations. Drawing inspiration from the information bottleneck principle and assuming an additive independent noise model between the task-agnostic and task-specific latent representations, we limit the information contained in each task-specific representation. We show that the resulting representations yield competitive performance on several MTL benchmarks. Furthermore, in certain setups, we show that the trained parameters of the additive noise model are closely related to the similarity of different tasks. This indicates that our approach yields a task-agnostic representation that is disentangled in the sense that its individual dimensions may be interpretable from a task-specific perspective.
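To make the additive-noise construction concrete, below is a minimal PyTorch sketch, assuming Gaussian noise with a learnable per-task, per-dimension scale and a Gaussian-channel bound as the information penalty. The names (`CompressedHierarchicalMTL`, `rate_penalty`), the encoder width, and the penalty weight are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CompressedHierarchicalMTL(nn.Module):
    """Sketch: a shared (task-agnostic) encoder whose output is corrupted by
    task-specific additive Gaussian noise before each task head. The class
    and parameter names are hypothetical, not from the paper's code."""

    def __init__(self, in_dim, latent_dim, out_dims):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, latent_dim))
        # Learnable log noise scale per task and latent dimension: a large
        # scale effectively masks that dimension for the task, compressing
        # the task-specific representation.
        self.log_sigma = nn.Parameter(torch.zeros(len(out_dims), latent_dim))
        self.heads = nn.ModuleList(nn.Linear(latent_dim, d) for d in out_dims)

    def forward(self, x, task):
        z = self.encoder(x)                       # task-agnostic representation
        sigma = self.log_sigma[task].exp()
        z_task = z + sigma * torch.randn_like(z)  # additive independent noise
        return self.heads[task](z_task), z, sigma

def rate_penalty(z, sigma):
    # Gaussian-channel upper bound on the information reaching a task head:
    # 0.5 * sum_d log(1 + Var(z_d) / sigma_d^2). Using the batch variance of
    # z as a plug-in estimate is an assumption of this sketch.
    return 0.5 * torch.log1p(z.var(dim=0) / sigma.pow(2)).sum()

# Illustrative training step for one task (toy dimensions and penalty weight).
model = CompressedHierarchicalMTL(in_dim=16, latent_dim=8, out_dims=[3, 3])
x, y = torch.randn(32, 16), torch.randint(0, 3, (32,))
logits, z, sigma = model(x, task=0)
loss = F.cross_entropy(logits, y) + 1e-2 * rate_penalty(z, sigma)
loss.backward()
```

In this sketch, a large learned scale in a given dimension effectively hides that dimension of the shared representation from the corresponding task head; comparing which dimensions each task retains is one plausible way to read off the task-similarity and clustering structure the abstract describes.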