nerdculture.de is one of the many independent Mastodon servers you can use to participate in the fediverse.

#haptics


RFC 9695: The 'haptics' Top-level Media Type

Registering a new top-level media type is rather rare (the last one was font/ in 2017). Our #RFC introduces the haptics/ type for the file formats of haptic interfaces, the ones you touch and that touch you back (a force-feedback video game controller, for example).

bortzmeyer.org/9695.html
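A new top-level type mostly matters to software that labels files on the wire. A minimal sketch of how a server might use it, assuming a hypothetical extension-to-subtype table (the subtype names below are my assumption, not quoted from the post or the RFC text here):

```python
from pathlib import Path

# Hypothetical mapping from file extension to haptics/ media types;
# the subtype names are assumptions for illustration only.
HAPTIC_TYPES = {
    ".hjif": "haptics/hjif",  # JSON-based haptics interchange (assumption)
    ".hmpg": "haptics/hmpg",  # binary haptics stream (assumption)
    ".ivs":  "haptics/ivs",   # vibration stream format (assumption)
}

def media_type_for(filename: str) -> str:
    """Return the Content-Type for a haptics file, with a generic fallback."""
    return HAPTIC_TYPES.get(Path(filename).suffix.lower(),
                            "application/octet-stream")

# media_type_for("rumble.ivs") yields "haptics/ivs";
# unknown extensions fall back to "application/octet-stream".
```

The point of the top-level registration is exactly this: receivers can dispatch on the `haptics/` prefix without knowing every subtype in advance.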

www.nature.com — Wearable multi-sensory haptic devices | Nature Reviews Bioengineering

Haptic devices enable communication via touch, augmenting visual and auditory displays, or offering alternative channels of communication when vision and hearing are unavailable. Because of the different types of haptic stimuli perceivable by users (vibration, skin stretch, pressure, and temperature, among others), devices can be designed to communicate complex information by delivering multiple types of haptic stimuli simultaneously. These multi-sensory haptic devices are often designed to be wearable and have been developed for a wide variety of applications, including communication, entertainment, and rehabilitation. They present unique challenges to designers because human perceptual acuity can vary widely depending on the wearable location on the body and on the heterogeneity in human perceptual performance, particularly when multiple cues are presented simultaneously. Additionally, packaging haptic systems in a wearable form factor raises its own engineering challenges, such as cue masking, device mounting, and actuator capabilities. This Review discusses the state of the art and the specific obstacles to producing multi-sensory devices that enhance the human capacity for haptic interaction and information transmission, focusing on body-worn devices that convey multiple types of cutaneous haptic feedback.

(thesis, November 2024) Using multimodal attention to design sensory substitution devices: Basic research and application, by Ivan Makarov, on #haptics #SSD: opinvisindi.is/handle/20.500.1

The research group also conducted research for the Sound of Vision (SOV) project, so it is probably a competitive compliment that there is no mention of The vOICe anywhere in this thesis. ;-)


Optimality of multisensory integration while compensating for uncertain visual target information with artificial vibrotactile cues during reach planning jneuroengrehab.biomedcentral.c #haptics

BioMed Central — Optimality of multisensory integration while compensating for uncertain visual target information with artificial vibrotactile cues during reach planning | Journal of NeuroEngineering and Rehabilitation

Background: Planning and executing movements requires the integration of different sensory modalities, such as vision and proprioception. However, neurological diseases like stroke can lead to full or partial loss of proprioception, resulting in impaired movements. Recent advances have focused on providing additional sensory feedback to patients to compensate for the sensory loss, and vibrotactile stimulation has proved a viable option, as it is inexpensive and easy to implement. Here, we test how such vibrotactile information can be integrated with visual signals to estimate the spatial location of a reach target.

Methods: We used a center-out reach paradigm with 31 healthy human participants to investigate how artificial vibrotactile stimulation can be integrated with visual-spatial cues indicating target location. Specifically, we provided multisite vibrotactile stimulation to the moving dominant arm using eccentric rotating mass (ERM) motors. As the integration of inputs across multiple sensory modalities becomes especially relevant when one of them is uncertain, we additionally modulated the reliability of the visual cues. We then compared the weighting of vibrotactile and visual inputs as a function of visual uncertainty to predictions from the maximum likelihood estimation (MLE) framework to decide whether participants achieve quasi-optimal integration.

Results: Participants could estimate target locations based on vibrotactile instructions. After short training, combined visual and vibrotactile cues led to higher hit rates and reduced reach errors when visual cues were uncertain. Additionally, we observed lower reaction times in trials with low visual uncertainty when vibrotactile stimulation was present. Using MLE predictions, we found that integration of vibrotactile and visual cues was optimal when the vibrotactile cues required the detection of one or two active motors. However, if estimating the location of a target required discriminating the intensities of two cues, integration violated MLE predictions.

Conclusion: Participants can quickly learn to integrate visual and artificial vibrotactile information. Additional vibrotactile stimulation may therefore serve as a promising way to improve rehabilitation or the control of prosthetic devices by patients suffering loss of proprioception.
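The MLE benchmark the abstract compares against is the standard minimum-variance cue-combination rule: each cue is weighted by its inverse variance (its reliability), so the less reliable cue contributes less. A small illustrative sketch, with made-up numbers rather than the paper's data:

```python
import numpy as np

def mle_combine(estimates, variances):
    """Optimally combine independent Gaussian cue estimates.

    Weights are normalized inverse variances; the combined variance
    is never larger than that of the most reliable single cue.
    """
    inv_var = 1.0 / np.asarray(variances, dtype=float)
    weights = inv_var / inv_var.sum()
    combined_estimate = float(np.dot(weights, estimates))
    combined_variance = float(1.0 / inv_var.sum())
    return combined_estimate, combined_variance

# Illustrative numbers only: an uncertain visual cue at 10 cm
# (variance 4) and a reliable vibrotactile cue at 14 cm (variance 1).
# The combined estimate is pulled toward the tactile cue, and the
# combined variance falls below either single cue's variance.
est, var = mle_combine([10.0, 14.0], [4.0, 1.0])
```

Testing whether observed cue weights shift with visual uncertainty exactly as this rule predicts is what the abstract means by checking "optimal integration" against the MLE framework.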