Paper Title

Comparison of Data Representations and Machine Learning Architectures for User Identification on Arbitrary Motion Sequences

Paper Authors

Christian Schell, Andreas Hotho, Marc Erich Latoschik

Paper Abstract

Reliable and robust user identification and authentication are important and often necessary requirements for many digital services. It becomes paramount in social virtual reality (VR) to ensure trust, specifically in digital encounters with lifelike realistic-looking avatars as faithful replications of real persons. Recent research has shown that the movements of users in extended reality (XR) systems carry user-specific information and can thus be used to verify their identities. This article compares three different potential encodings of the motion data from head and hands (scene-relative, body-relative, and body-relative velocities), and the performance of five different machine learning architectures (random forest, multi-layer perceptron, fully recurrent neural network, long short-term memory, gated recurrent unit). We use the publicly available dataset "Talking with Hands" and publish all code to allow reproducibility and to provide baselines for future work. After hyperparameter optimization, the combination of a long short-term memory architecture and body-relative data outperformed competing combinations: the model correctly identifies any of the 34 subjects with an accuracy of 100% within 150 seconds. Altogether, our approach provides an effective foundation for behaviometric-based identification and authentication to guide researchers and practitioners. Data and code are published under https://go.uniwue.de/58w1r.
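To make the abstract's pipeline concrete, below is a minimal sketch (not the authors' code) of how head and hand motion might be converted into a body-relative velocity encoding and fed to an LSTM classifier over the 34 enrolled subjects. It assumes "body-relative" means hand positions expressed in a head-centred frame and "velocity" means frame-to-frame differences; all tensor shapes, window lengths, and hyperparameters are illustrative placeholders, and the paper's actual preprocessing and tuned models may differ.

```python
# Minimal sketch (not the authors' implementation): body-relative velocity
# encoding of head/hand motion plus an LSTM subject classifier.
# Assumptions: "body-relative" = hand positions in a head-centred frame
# (translation only); "velocity" = frame-to-frame difference.
import numpy as np
import torch
import torch.nn as nn


def body_relative_velocities(head_pos, left_pos, right_pos):
    """head_pos, left_pos, right_pos: (T, 3) scene-relative positions.
    Returns a (T-1, 6) array of per-frame velocities of both hands,
    expressed relative to the head."""
    left_rel = left_pos - head_pos      # left hand in head-centred frame
    right_rel = right_pos - head_pos    # right hand in head-centred frame
    feats = np.concatenate([left_rel, right_rel], axis=1)  # (T, 6)
    return np.diff(feats, axis=0)       # frame-to-frame differences


class MotionLSTM(nn.Module):
    """LSTM over a motion window, followed by a linear layer scoring each
    enrolled subject (34 subjects in the 'Talking with Hands' setup)."""
    def __init__(self, n_features=6, hidden=64, n_subjects=34):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_subjects)

    def forward(self, x):                # x: (batch, time, n_features)
        _, (h_n, _) = self.lstm(x)       # h_n: (1, batch, hidden)
        return self.head(h_n[-1])        # per-subject logits


# Toy usage: one random 90-frame window of head/hand trajectories.
T = 90
head, left, right = (np.random.randn(T, 3).astype(np.float32) for _ in range(3))
window = body_relative_velocities(head, left, right)       # (89, 6)
model = MotionLSTM()
logits = model(torch.from_numpy(window).unsqueeze(0))       # (1, 34)
print("predicted subject id:", logits.argmax(dim=1).item())
```

In practice such a classifier would be trained on many labeled windows per subject and evaluated on held-out sequences; the published code at the link above provides the authors' actual baselines.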
