Accurate detection of unconscious daily gestures and movement patterns has many potential applications in habit tracking, hygiene, sports, self-improvement, and, in particular, healthcare and disease prevention. Key habits for limiting the transmission of infectious diseases, such as during the coronavirus disease 2019 (COVID-19) pandemic, are avoiding face touching, especially of the mucosal membranes (eyes, nose, and mouth), and regular hand washing. It has been estimated that face touching occurs, on average, 23 times per hour [6]. However, although efforts have been made to detect various hand-body interactions [7–9], face-touching detection remains challenging because hand-to-face proximity (in gestures such as removing glasses, eating, drinking, smoking, hair brushing, or toothbrushing) must be differentiated from actual contact. Moreover, differentiating high-risk mucosal membrane contact from contact with skin, glasses, or clothes is of paramount importance, as some contacts are relevant for disease transmission and others for hygiene or other applications. Researchers have reported encouraging results but are limited either by technical constraints, such as multiple or impractical sensors and body instrumentation with multiple devices, or by artificial constraints, such as fixed predetermined gestures [8,10,11]. Some have achieved promising results, up to accurately predicting the specific area of the face that was touched, but without detecting actual contact, resulting in a high proportion of false positives [12]. Although acceptable at preliminary stages of development, in real-life use an increased rate of false positives will undoubtedly lead to user fatigue and prevent long-term use of such devices. It is therefore key to further improve face-touching detection and the specificity of notifications while preserving user-friendliness and ease of use, and avoiding hardware encumbrance.