Human-Computer Interaction

The design and evaluation of user interfaces and interactive systems, including usability, user experience, accessibility, mobile computing, augmented reality, and social aspects of computing.

7 papers

Papers

Scholar Inbox: Personalized Paper Recommendations for Scientists

The paper presents Scholar Inbox, a new open-access platform providing personalized paper recommendations and research tools for scientists. Utilizing a content-based model trained on a large dataset of explicit user ratings, the platform aims to combat information overload and improve research efficiency. A comprehensive evaluation, including a large user study, demonstrates high user satisfaction and retention, though the system's ability to explicitly model multiple diverse research interests is still an area for future improvement.
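
The summary above does not spell out the recommendation model itself, so the following is only a minimal sketch of a generic content-based recommender driven by explicit ratings: papers are embedded with TF-IDF and candidates are scored by rating-weighted similarity to previously rated papers. All data and names here are hypothetical, not Scholar Inbox's actual pipeline.

```python
# Minimal content-based recommender sketch (illustrative only; not the
# actual Scholar Inbox model). Papers are embedded with TF-IDF and new
# papers are scored against a profile built from explicitly rated ones.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
import numpy as np

# Hypothetical data: abstracts the user has rated, and unrated candidates.
rated_abstracts = [
    "Transformer models for paper recommendation",
    "A survey of information overload in academic search",
]
ratings = np.array([1.0, -1.0])          # +1 = liked, -1 = disliked
candidate_abstracts = [
    "Personalized ranking of preprints with explicit user feedback",
    "Soil moisture sensing with low-cost hardware",
]

vectorizer = TfidfVectorizer()
rated_vecs = vectorizer.fit_transform(rated_abstracts)
candidate_vecs = vectorizer.transform(candidate_abstracts)

# Score each candidate by rating-weighted similarity to the rated papers.
scores = cosine_similarity(candidate_vecs, rated_vecs) @ ratings
for abstract, score in sorted(zip(candidate_abstracts, scores),
                              key=lambda x: -x[1]):
    print(f"{score:+.3f}  {abstract}")
```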

Human-Computer Interaction Nov 05, 06:24 PM

shinyUMAP: an online tool for promoting understanding of single cell omics data visualization

This paper introduces shinyUMAP, an online tool that allows users to interactively explore and adjust parameters for UMAP, a common dimensionality reduction technique used for visualizing single-cell omics data. By manipulating parameters and observing their impact on data visualization, users can gain a deeper understanding of UMAP's limitations and avoid misinterpretations.
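
To illustrate the kind of parameter exploration shinyUMAP exposes, here is a minimal sketch using the Python umap-learn package (the tool itself is a web app, and the dataset and parameter grid below are arbitrary stand-ins): varying n_neighbors and min_dist shows how strongly the resulting layout depends on these choices.

```python
# Illustrative parameter sweep with umap-learn; the digits dataset is a
# stand-in for a single-cell expression matrix, and the grid is arbitrary.
import umap
from sklearn.datasets import load_digits

X = load_digits().data

# n_neighbors trades local vs. global structure; min_dist controls how
# tightly points are packed. The same data can look quite different.
for n_neighbors in (5, 15, 50):
    for min_dist in (0.0, 0.5):
        embedding = umap.UMAP(n_neighbors=n_neighbors,
                              min_dist=min_dist,
                              random_state=42).fit_transform(X)
        print(n_neighbors, min_dist, embedding.shape)
```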

Human-Computer Interaction Sep 09, 12:35 PM

Not All Moods Are Created Equal! Exploring Human Emotional States in Social Media

The study found that the expression of different moods on Twitter is not uniform: negative moods are expressed more frequently than positive ones. Furthermore, "sociality" (the ratio of followers to followees) was positively correlated with the expression of positive moods, and users expressing positive moods also showed higher activity levels.
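
As a self-contained illustration of the "sociality" metric described above, the sketch below computes followers-to-followees ratios and their correlation with the share of positive-mood posts; the numbers are invented for demonstration and are not the study's data.

```python
# Hedged illustration of the "sociality" metric (followers / followees)
# and a simple correlation with the share of positive-mood posts; all
# values below are made up for demonstration.
import numpy as np

followers      = np.array([120, 4500,  80, 900, 30])
followees      = np.array([300,  400, 250, 310, 95])
positive_share = np.array([0.22, 0.61, 0.18, 0.45, 0.25])  # fraction of posts

sociality = followers / followees
r = np.corrcoef(sociality, positive_share)[0, 1]
print(f"sociality ratios: {np.round(sociality, 2)}")
print(f"Pearson r with positive-mood share: {r:.2f}")
```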

Human-Computer Interaction Jul 14, 11:23 AM

Accelerating eye movement research via accurate and affordable smartphone eye tracking

This paper introduces a machine learning-based eye tracking method using a smartphone's front-facing camera, achieving accuracy comparable to specialized mobile eye trackers at a fraction of the cost. The researchers validated their method by replicating key findings from previous eye movement studies and demonstrated its potential for assessing reading comprehension difficulty and other applications.
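
The summary does not detail the model, so the snippet below is only a generic sketch of smartphone gaze estimation: a small convolutional network regressing on-screen (x, y) coordinates from an eye-region crop. The input size, architecture, and training setup are assumptions, not the paper's method.

```python
# Generic gaze-estimation sketch (not the paper's architecture): a small
# CNN that regresses on-screen (x, y) coordinates from an eye-region crop
# taken by the front-facing camera. Input size and layers are assumptions.
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)  # predicted (x, y) screen position

    def forward(self, eye_crop):
        return self.head(self.features(eye_crop).flatten(1))

model = GazeNet()
dummy_crop = torch.randn(1, 3, 64, 64)  # one 64x64 RGB eye crop
print(model(dummy_crop).shape)          # torch.Size([1, 2])
```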

Human-Computer Interaction Jul 14, 11:23 AM

AI enabled sign language recognition and VR space bidirectional communication using triboelectric smart glove

Researchers developed smart gloves with triboelectric sensors to capture sign language gestures, which are then processed by a deep learning model for recognition. The system achieves high accuracy in recognizing 50 words and 20 sentences from American Sign Language and translates them into text and speech within a virtual reality interface, allowing for interactive communication between signers and non-signers.
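
As an illustration of the general recipe described above (multichannel glove signals fed to a deep classifier), here is a minimal sketch of a 1-D convolutional network over fixed-length sensor windows. The channel count, window length, and architecture are assumptions; only the 50-word class count comes from the summary.

```python
# Illustrative sketch only: a small 1-D CNN classifying fixed-length windows
# of multichannel glove-sensor signals into sign classes. Channel count,
# window length, and layers are assumptions, not the paper's setup.
import torch
import torch.nn as nn

NUM_CHANNELS = 15   # assumed triboelectric sensors on the glove
WINDOW_LEN   = 200  # assumed samples per gesture window
NUM_CLASSES  = 50   # 50 recognized words, per the summary

model = nn.Sequential(
    nn.Conv1d(NUM_CHANNELS, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.MaxPool1d(2),
    nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(64, NUM_CLASSES),
)

batch = torch.randn(8, NUM_CHANNELS, WINDOW_LEN)  # 8 gesture windows
logits = model(batch)
print(logits.shape)  # torch.Size([8, 50])
```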

Human-Computer Interaction Jul 14, 11:23 AM