About

Abstract

Artificial Intelligence has become a fundamental ally in improving the reliability, efficiency, and effectiveness of Human-Machine Interaction (HMI) systems. This is especially true considering the latest advancements in wearable and sensing technologies, which accompany us in everyday life and are exploited in applications ranging from continuous patient monitoring to immersive video-game experiences.

While these novel applications can positively impact a user’s life, work, education, health, and free time, their development should be rigorous and consider possible real-time operation, the quality and quantity of available data, the portability of the employed technologies, the scalability of AI strategies, and the ethical and regulatory aspects concerning the use of personal data, to name just a few challenges.

The AIxHMI workshop aims to connect researchers and practitioners from different fields and to collect multidisciplinary contributions on topics concerning HMI, with a particular focus on the influence that AI has on the interaction between humans and machines. Contributions from universities, research institutes, and industry are most welcome and are not limited to technological advancements in hardware and software; they may also discuss cognitive aspects, ethical and legal concerns, ergonomic issues, and user experience across a variety of application fields.

Description

The basic definition of Human-Machine Interaction (HMI) involves the bidirectional communication between humans and machines by means of user interfaces. This definition has expanded to include the user’s demands, the characteristics of new user-centered systems, and the available technologies pervading real, virtual, and augmented environments.

Recent advances in wearable and sensing technologies aim to provide more flexible, comfortable, and personalized wearable HMI systems that users can accept more easily and that deliver reliable data collection and feedback.

However, it can be questioned whether these technologies are really up to this challenge. Beyond the technological concerns in the design and development of such sensors and devices, it is necessary to consider how the interaction can be made effective. This also means understanding how Artificial Intelligence (AI) influences the development of HMI systems and what kinds of challenges arise when dealing with wearable devices and sensing technologies in real time, as opposed to wired ones that are usually handled off-line. Let us consider wearable Brain-Computer Interfaces (BCIs) as a practical example.

Human users’ neural data are acquired through wearable sensors and translated into commands for a specific real-time application, which provides direct feedback to the BCI user. The wireless transmission of data and the online configuration of the application present a series of issues that differ from those related to wired, off-line devices and systems. For example, data transmission should be guarded against possible dual use, the resulting data are usually of lower quantity and quality with respect to their wired counterparts, the system needs to provide instantaneous and appropriate feedback, and, in general, it should follow ergonomic principles related to usability, such as the satisfaction, efficiency, and effectiveness of the BCI.
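As an illustration of this acquire-decode-feedback loop, the following minimal Python sketch simulates one possible online BCI pipeline. It is only a sketch under stated assumptions: the sampling rate, window length, band-power decoder, and threshold are placeholder values, the acquisition function returns simulated data rather than reading from a headset's streaming interface, and a real system would replace the threshold rule with a trained classifier.

import numpy as np

FS = 250           # assumed sampling rate (Hz); placeholder value
WINDOW = FS        # one-second decision window

def acquire_window(n_channels=8, n_samples=WINDOW):
    # Placeholder for wireless acquisition: returns simulated EEG samples.
    # A real system would read packets from the headset's streaming interface.
    return np.random.randn(n_channels, n_samples)

def band_power(window, fs=FS, band=(8.0, 12.0)):
    # Average spectral power in a frequency band (here, the alpha band).
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window, axis=1)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean()

def decode_command(window, threshold=500.0):
    # Toy decoder: map band power to one of two commands.
    # The threshold is arbitrary; a real BCI would use a trained classifier.
    return "MOVE" if band_power(window) > threshold else "REST"

def send_feedback(command):
    # Placeholder for the application-side feedback channel.
    print("command:", command)

if __name__ == "__main__":
    for _ in range(5):                  # a few iterations of the online loop
        eeg = acquire_window()          # wireless, possibly lossy, lower-quality data
        command = decode_command(eeg)   # translate neural data into a command
        send_feedback(command)          # immediate feedback to the BCI user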

These issues can be extended to systems employing other control and sensing devices, which can be configured in a multi-sensory and multimodal fashion, require the integration of heterogeneous data, and consider the user’s environment as part of the information to be used. Moreover, a key aspect is the emotional involvement of users when dealing with HMI systems, which opens space for the fields of emotional intelligence and affective computing. In fact, machines that are able to adapt to the emotional states of their users can communicate with them more effectively. For example, being able to detect frustration could allow a specific control system to be adapted to the needs of an individual user.

This observation highlights the need to move towards human-centered computing and sensing, ensuring a better user experience. It is again necessary to ensure good data quality, organization, and management, considering that these data come from multiple sources.

Therefore, the AIxHMI workshop is open to multidisciplinary contributions that pertain, but are not limited, to the fields of HMI, BCI, control systems, wearable sensing and devices, virtual and augmented reality, emotional intelligence, affective computing, human-centered sensing and computing, human factors and ergonomics, user experience, interface and sensor design, and ethics and security in AI, given that AI is a transversal discipline that influences all of these aspects.

Topics of Interest

The AIxHMI workshop welcomes submissions including, but not limited to:

Submission 

The workshop invites two types of submissions:


Please also consider submitting position papers (as short papers), experimental protocols, ongoing research projects, and pilot studies.

All papers will be peer-reviewed (single-blind) by the program committee members, and their camera-ready versions will be included in the conference proceedings published on CEUR in the AI*IA Series (Scopus indexed).

Papers will be evaluated considering the relevance of their content to the workshop’s main topics (i.e., human-machine interaction and artificial intelligence), novelty, technical soundness, and quality of presentation.


Note that papers with fewer than 25,000 characters will be considered short papers in the CEUR proceedings.

Manuscripts should be formatted using the 1-column CEUR-ART Style, which is available as:


Paper submission is electronic through EasyChair, at the following link:
https://easychair.org/conferences/?conf=aixhmi2024