WEARHAP - WEARable HAPtics for humans and robots

Funded by the European Union's Seventh Framework Programme FP7/2007-2013 under grant agreement n° 601165. Information and Communication Technologies, Collaborative Large-scale Integrating Project (IP), FP7-ICT-2011-9-2.1: Cognitive Systems and Robotics. 01.03.2013 – 31.08.2017.

The complexity of the world around us is creating a demand for cognition-enabled interfaces that will simplify and enhance the way we interact with the environment.

Project WEARHAP aims at laying the scientific and technological foundations for wearable haptics, a novel concept for the systematic exploration of haptics in advanced cognitive systems and robotics that will redefine the way humans cooperate with robots.
The challenge of this new paradigm stems from the need for wearability, which is a key element for natural interaction.
This paradigm shift will enable novel forms of human intention recognition through haptic signals and novel forms of communication and cooperation between humans and robots.
Wearable haptics will enable robots to observe humans during natural interaction with their shared environment. The research challenges are ambitious and cross the traditional boundaries between robotics, cognitive science and neuroscience.
Research findings from distributed robotics, biomechanical modeling, multisensory tracking, underactuated control and cognitive systems will be integrated to address the scientific and technological challenges involved in creating effective wearable haptic interaction.
To highlight the enabling nature, the versatility and the potential for industrial exploitation of WEARHAP, the research challenges will be guided by representative application scenarios. These applications cover robotics, health and social scenarios, stretching from human-robot interaction and cooperation for search and rescue, to human-human communication, and interaction with virtual worlds through interactive games.

Within the WEARHAP project, Prof. Frisoli has been responsible for WP4 on “wearable devices”. Within the project, wearable haptic devices conveying tactile feedback at parts of the upper limb other than the fingertips have been devised, exploring four dimensions: cutaneous feedback by skin stretch, curvature display, transient information by dynamic stimulation, and thermal feedback.
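These four dimensions can be thought of as independent channels of a single per-frame command sent to a wearable device. The sketch below is only illustrative: the class, field names and value ranges are assumptions, not the project's actual software interface.

    from dataclasses import dataclass
    from typing import Tuple


    @dataclass
    class TactileCommand:
        """One per-frame command grouping the four feedback dimensions."""
        skin_stretch_mm: Tuple[float, float]  # tangential displacement of the fingerpad platform (x, y)
        curvature_1_per_m: float              # curvature of the surface displayed to the skin
        transient_amplitude: float            # 0..1 burst amplitude marking contact transitions
        temperature_c: float                  # target temperature of a thermal module


    def clamp(value: float, lo: float, hi: float) -> float:
        """Limit a command to an actuator's physical range."""
        return max(lo, min(hi, value))


    def make_safe(cmd: TactileCommand) -> TactileCommand:
        """Return a copy of the command limited to plausible (assumed) device ranges."""
        x, y = cmd.skin_stretch_mm
        return TactileCommand(
            skin_stretch_mm=(clamp(x, -4.0, 4.0), clamp(y, -4.0, 4.0)),
            curvature_1_per_m=clamp(cmd.curvature_1_per_m, 0.0, 200.0),
            transient_amplitude=clamp(cmd.transient_amplitude, 0.0, 1.0),
            temperature_c=clamp(cmd.temperature_c, 20.0, 40.0),
        )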

Fingertip haptic device

New version of the fingertip device for rendering interaction forces (skin stretch) and contact transitions (see the rendering sketch after the specifications below)

  • More compact actuators
  • New spherical and revolute joints (improving stiffness and precision)
  • Multi-finger configuration
  • Wireless version (battery at the wrist) under test
  • Size: 18 × 30 × 32 mm
  • Mass: 18 g
  • Displacement (max): ±4 mm
  • Force (max): 3 N
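As a concrete illustration of how these specifications constrain force rendering, the sketch below maps a desired fingerpad contact force to a platform displacement under an assumed linear skin-stiffness model. It is not the controller described in the reference below, and the stiffness value is a placeholder.

    # Assumed linear fingerpad stiffness; the real value is device- and subject-dependent.
    FINGERPAD_STIFFNESS_N_PER_MM = 0.75
    MAX_DISPLACEMENT_MM = 4.0  # from the specifications above
    MAX_FORCE_N = 3.0          # from the specifications above


    def displacement_for_force(desired_force_n: float) -> float:
        """Platform displacement (mm) commanded to render a normal contact force.

        A zero or negative force keeps the platform away from the skin, so the
        device can also display the contact/no-contact transition.
        """
        if desired_force_n <= 0.0:
            return 0.0  # no contact: platform retracted from the fingerpad
        force = min(desired_force_n, MAX_FORCE_N)
        return min(force / FINGERPAD_STIFFNESS_N_PER_MM, MAX_DISPLACEMENT_MM)


    if __name__ == "__main__":
        for f in (0.0, 0.5, 1.5, 5.0):
            print(f"{f:.1f} N -> {displacement_for_force(f):.2f} mm")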

Reference: D. Leonardis, M. Solazzi, I. Bortone and A. Frisoli, “A 3-RSR Haptic Wearable Device for Rendering Fingertip Contact Forces,” in IEEE Transactions on Haptics, vol. 10, no. 3, pp. 305-316, 1 July-Sept. 2017.
doi: 10.1109/TOH.2016.2640291 [pdf]

Pacchierotti, C., Sinclair, S., Solazzi, M., Frisoli, A., Hayward, V., & Prattichizzo, D. (2017). Wearable haptic systems for the fingertip and the hand: taxonomy, review, and perspectives. IEEE Transactions on Haptics, 10(4), 580-600. [pdf]

Virtual rehabilitation with fingertip haptics in developmental age

Scientific goals

Children with neuromotor disorders (Cerebral Palsy (CP) and Developmental Dyspraxia (DD)) show difficulties in:

  • Motor planning and execution
  • Understanding movements and body limits
  • Proprioception and sensory afferents from interaction with the environment

The therapeutic intervention was based on:

  • Motor tasks requiring motor planning and coordination
  • Repetition of targeted motor tasks (brain plasticity)
  • Enhancement of proprioception and sensory-motor functions (tactile and visual feedback during interaction)

 

The rehabilitation game scenario

A gaming scenario was developed, consisting of two rehabilitation serious games:

  1. The first game presents a groove shaped like a labyrinth, displayed to the user through a head-mounted display. The child wears on his/her finger a cutaneous device, the 3-DoF prototype by SSSA, that can exert and modulate forces at the level of the fingerpad (see the rendering sketch after this list).
  2. The second game exercises different manipulation and movement abilities, in particular upper limb orientation and wrist pronation-supination. The child wears the fingertip devices on two fingers in order to perform grasping with thumb and index.
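The sketch below illustrates, under simplifying assumptions, the haptic rendering idea behind the labyrinth game of item 1: when the tracked fingertip moves past a groove wall, a contact force proportional to the penetration is computed and passed to the fingertip device. The geometry, stiffness value and function names are hypothetical, not the project's implementation.

    import math

    WALL_STIFFNESS_N_PER_MM = 0.5  # assumed virtual wall stiffness
    GROOVE_HALF_WIDTH_MM = 10.0    # assumed half-width of the labyrinth groove


    def wall_force(lateral_offset_mm: float) -> float:
        """Contact force (N) rendered when the fingertip leaves the groove.

        The sign points back toward the groove centerline; inside the groove no
        force is displayed, so entering a wall is felt as a contact transition.
        """
        penetration = abs(lateral_offset_mm) - GROOVE_HALF_WIDTH_MM
        if penetration <= 0.0:
            return 0.0  # fingertip is inside the groove: free motion
        return -math.copysign(WALL_STIFFNESS_N_PER_MM * penetration, lateral_offset_mm)


    if __name__ == "__main__":
        # A force appears only once the fingertip crosses a wall of the groove.
        for offset in (0.0, 8.0, 12.0, -15.0):
            print(f"offset {offset:+.1f} mm -> force {wall_force(offset):+.2f} N")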