Paper Title
Learning-To-Embed: Adopting Transformer based models for E-commerce Products Representation Learning
Paper Authors
Paper Abstract
Learning low-dimensional representations for the large number of products present in an e-commerce catalogue plays a vital role, as these representations are helpful in tasks like product ranking, product recommendation, finding similar products, modelling user behaviour, etc. Recently, many tasks in the NLP field are being tackled using Transformer-based models, and these deep models are widely applicable in industrial settings to solve various problems. With this motivation, we apply Transformer-based models to learn contextual representations of products in an e-commerce setting. In this work, we propose a novel approach of pre-training a Transformer-based model on a user-generated sessions dataset obtained from a large fashion e-commerce platform to obtain latent product representations. Once pre-trained, we show that a low-dimensional representation of a product can be obtained given the product's attribute information as a textual sentence. We pre-train the BERT, RoBERTa, ALBERT and XLNET variants of the Transformer model and present a quantitative analysis of the product representations obtained from these models on the Next Product Recommendation (NPR) and Content Ranking (CR) tasks. For both tasks, we collect evaluation data from the fashion e-commerce platform and observe that the XLNET model outperforms the other variants, with an MRR of 0.5 for NPR and an NDCG of 0.634 for CR. The XLNET model also outperforms a Word2Vec-based non-Transformer baseline on both downstream tasks. To the best of our knowledge, this is the first work to pre-train Transformer-based models on user-generated session data containing products represented with rich attribute information for adoption in an e-commerce setting. These models can be further fine-tuned to solve various downstream tasks in e-commerce, thereby eliminating the need to train a model from scratch.
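The abstract reports MRR for the Next Product Recommendation task and NDCG for the Content Ranking task. As a minimal sketch of how these two metrics are typically computed (the exact evaluation protocol, relevance grades, and cutoffs used in the paper are not specified here, so this is an illustrative implementation, not the authors' code):

```python
import math

def mrr(first_relevant_ranks):
    """Mean Reciprocal Rank over a set of queries.

    first_relevant_ranks: one entry per query, the 1-based rank of the
    first relevant item in the returned list, or None if no relevant
    item was retrieved (contributing 0 to the mean).
    """
    total = sum(0.0 if r is None else 1.0 / r for r in first_relevant_ranks)
    return total / len(first_relevant_ranks)

def ndcg(relevances):
    """NDCG for a single ranked list of graded relevance scores.

    DCG discounts each gain by log2(position + 1); the ideal DCG is
    obtained by sorting relevances in descending order.
    """
    dcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))
    ideal = sum(rel / math.log2(i + 2)
                for i, rel in enumerate(sorted(relevances, reverse=True)))
    return dcg / ideal if ideal > 0 else 0.0
```

For example, three queries whose first relevant items appear at ranks 1, (not retrieved), and 2 give `mrr([1, None, 2]) == 0.5`, and a ranked list already sorted by relevance gives an NDCG of 1.0.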