Haptic technology

Rumble packs for controllers, such as this Dreamcast Jump Pack, provide haptic feedback through a user's hands

Haptic or kinesthetic communication recreates the sense of touch by applying forces, vibrations, or motions to the user.[1] This mechanical stimulation can be used to assist in the creation of virtual objects in a computer simulation, to control such virtual objects, and to enhance the remote control of machines and devices (telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface.

Most researchers distinguish three sensory systems related to sense of touch in humans: cutaneous, kinesthetic and haptic.[2][3] All perceptions mediated by cutaneous and/or kinesthetic sensibility are referred to as tactual perception. The sense of touch may be classified as passive and active,[4] and the term "haptic" is often associated with active touch to communicate or recognize objects.[5]

Haptic technology has made it possible to investigate how the human sense of touch works by allowing the creation of carefully controlled haptic virtual objects.

The word haptic, from the Greek: ἁπτικός (haptikos), means "pertaining to the sense of touch" and comes from the Greek verb ἅπτεσθαι haptesthai, meaning "to contact" or "to touch".

History

One of the earliest applications of haptic technology was in large aircraft that use servomechanism systems to operate control surfaces.[citation needed] Such systems tend to be "one-way", meaning external forces applied aerodynamically to the control surfaces are not perceived at the controls; the missing normal forces are instead simulated with springs and weights. In lighter aircraft without servo systems, aerodynamic buffeting (vibration) is felt in the pilot's controls as the aircraft approaches a stall, a useful warning of a dangerous flight condition. This control shake is not felt when servo control systems are used. To replace this missing sensory cue, the angle of attack is measured, and when it approaches the critical stall point a stick shaker is engaged to simulate the response of a simpler control system. Alternatively, the servo force may be measured and the signal directed to a servo system on the control, a technique known as force feedback. Force feedback has been implemented experimentally in some excavators and is useful when excavating mixed material such as large rocks embedded in silt or clay: it allows the operator to "feel" and work around unseen obstacles, enabling significant increases in productivity and less risk of damage to the machine.

The first US patent for a tactile telephone was granted to Thomas D. Shannon in 1973.[6] An early tactile man-machine communication system was constructed by A. Michael Noll at Bell Telephone Laboratories, Inc. in the early 1970s[7] and a patent issued for his invention in 1975.[8]

Aura Interactor vest

In 1994, Aura Systems launched the Interactor Vest, a wearable force-feedback device that monitors an audio signal and uses Aura's patented electromagnetic actuator technology to convert bass sound waves into vibrations that can represent such actions as a punch or kick. The Interactor Vest plugs into the audio output of a stereo, TV, or VCR, and the user is provided with controls for adjusting the intensity of vibration and filtering out high-frequency sounds. The vest is worn over the upper torso, and the audio signal is reproduced through a speaker embedded in it. After selling 400,000 Interactor Vests, Aura began shipping the Interactor Cushion, a device that operates like the Vest but, instead of being worn, is placed against a seat back so the user can lean against it. Both the Vest and the Cushion were launched with a price tag of $99.[citation needed]

Jensen's Tap-in device

In 1995, the Norwegian Geir Jensen described a wristwatch haptic device with a skin-tap mechanism, termed Tap-in. It would connect to a mobile phone via Bluetooth; tapping-frequency patterns would identify callers and enable the wearer to respond with selected short messages. It was submitted to a governmental innovation contest, received no award, and was not pursued or published until it was recovered in 2015.[9] Jensen's Tap-in device was designed to face the user so the wrist would not have to be twisted (see image), and was intended to work across mobile phone and watch brands. In 2015, Apple began selling a wristwatch that conveys notifications and alerts from the wearer's mobile phone by tapping the skin.

Commercial applications

Tactile electronic displays

A tactile electronic display is a display device that presents information in tactile form.

Teleoperators and simulators

Teleoperators are remote controlled robotic tools—when contact forces are reproduced to the operator, it is called haptic teleoperation. The first electrically actuated teleoperators were built in the 1950s at the Argonne National Laboratory by Raymond Goertz to remotely handle radioactive substances. Since then, the use of force feedback has become more widespread in other kinds of teleoperators such as remote controlled underwater exploration devices.

When such devices are simulated using a computer (as they are in operator training devices), it is useful to provide the force feedback that would be felt in actual operations. Since the objects being manipulated do not exist in a physical sense, the forces are generated using haptic (force-generating) operator controls. Data representing touch sensations may be saved or played back using such haptic technologies. Haptic simulators are used in medical simulators and flight simulators for pilot training. Exerting a force of the proper magnitude on the user is critical, and requires taking human force sensitivity into account.[10]
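
At its core, both haptic teleoperation and a training simulator run a bilateral loop: the operator's motion drives the remote or simulated tool, and the contact force measured or computed there is scaled, limited, and reflected back to the operator's device. A minimal sketch of one such control cycle is shown below; the ideal position tracking, the single virtual wall, and the gains are simplifying assumptions for illustration, not the design of any particular simulator.

# Minimal sketch of haptic teleoperation with force feedback: the operator's
# (master) position is sent to the remote/simulated slave, and the contact
# force at the slave is scaled and reflected back to the operator's device.
# Gains, limits, and the simulated environment are illustrative assumptions.

def slave_contact_force(slave_pos, wall_pos=0.10, stiffness=1000.0):
    """Simulated environment: a stiff wall at wall_pos (meters along one axis)."""
    penetration = slave_pos - wall_pos
    return stiffness * penetration if penetration > 0.0 else 0.0

def teleop_step(master_pos, force_scale=0.5, max_force=8.0):
    """One control cycle: command the slave to follow the master, reflect force back."""
    slave_pos = master_pos                          # idealized position tracking
    f_env = slave_contact_force(slave_pos)          # force sensed at the remote site
    f_master = min(force_scale * f_env, max_force)  # scaled, clamped feedback to operator
    return slave_pos, f_master

# Operator pushes 5 mm "into" the wall: the device pushes back with about 2.5 N
print(teleop_step(0.105))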

Video games

Haptic feedback is commonly used in arcade games, especially racing video games. In 1976, Sega's motorbike game Moto-Cross,[11] also known as Fonz,[12] was the first game to use haptic feedback, causing the handlebars to vibrate during a collision with another vehicle.[13] Tatsumi's TX-1 introduced force feedback to car driving games in 1983.[14] In 1989, Earthshaker! became the first pinball machine with haptic feedback.[15]

Simple haptic devices are common in the form of game controllers, joysticks, and steering wheels. Early implementations were provided through optional components, such as the Nintendo 64 controller's Rumble Pak in 1997. In the same year, the Microsoft SideWinder Force Feedback Pro, with built-in feedback from Immersion Corporation, was released.[16] Many newer-generation console controllers and joysticks feature built-in feedback devices as well, including Sony's DualShock technology. Some automobile steering wheel controllers, for example, are programmed to provide a "feel" of the road: as the user makes a turn or accelerates, the steering wheel responds by resisting turns or slipping out of control.
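
As a rough illustration of how such a "road feel" effect can be produced, the following sketch commands a centering torque that stiffens with vehicle speed and goes light when tire grip is lost. The linear model and the constants are assumptions for illustration; commercial racing games typically derive the torque from their tire and vehicle physics model instead.

# Illustrative sketch of a self-centering force-feedback effect for a steering
# wheel controller (hypothetical constants; real games compute the torque from
# their physics simulation rather than a fixed spring law).

def wheel_torque(angle_rad, speed_m_s, grip=1.0,
                 k_center=2.0,      # centering stiffness, N*m per radian
                 k_speed=0.05):     # extra resistance with vehicle speed
    """Torque commanded to the wheel motor; negative torque pulls back to center."""
    centering = -k_center * angle_rad * (1.0 + k_speed * speed_m_s)
    return centering * grip          # grip < 1.0 lightens the wheel when sliding

print(wheel_torque(0.3, 30.0))            # firm pull back toward center at speed
print(wheel_torque(0.3, 30.0, grip=0.2))  # wheel goes light when traction is lost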

In 2007, Novint released the Falcon, the first consumer 3D touch device with high resolution three-dimensional force feedback; this allowed the haptic simulation of objects, textures, recoil, momentum, and the physical presence of objects in games.[17][18]

In 2013, Valve announced a line of Steam Machines microconsoles, including a new Steam Controller unit that uses weighted electromagnets capable of delivering a wide range of haptic feedback via the unit's trackpads.[19] These controllers' feedback systems are open to the user, allowing the feedback to be configured for a nearly limitless range of situations. Moreover, because of the community orientation of the controller, the ways in which games can interact with the controller's feedback system are limited only by the game's design.

Personal computers

In 2008, Apple's MacBook and MacBook Pro started incorporating a "Tactile Touchpad" design[20][21] with button functionality and haptic feedback incorporated into the tracking surface.[22] Products such as the Synaptics ClickPad[23] followed thereafter.

Mobile devices

Tactile haptic feedback is common in cellular devices. Handset manufacturers such as Nokia, LG, and Motorola include different types of haptic technology in their devices; in most cases, this takes the form of vibration in response to touch. Alpine Electronics uses a haptic feedback technology named PulseTouch on many of their touch-screen car navigation and stereo units.[24] The Nexus One features haptic feedback, according to its specifications.[25] Samsung first launched a phone with haptics in 2007.[26]

In February 2013, Apple Inc. was awarded a patent for a more accurate haptic feedback system suitable for multitouch surfaces. Apple's U.S. patent for a "Method and apparatus for localization of haptic feedback"[27] describes a system in which at least two actuators are positioned beneath a multitouch input device to provide vibratory feedback when a user makes contact with the unit. More specifically, the patent provides for one actuator to induce a feedback vibration, while at least one other actuator creates a second vibration that suppresses the first from propagating to unwanted regions of the device, thereby "localizing" the haptic experience. While the patent gives the example of a "virtual keyboard", the language specifically notes that the invention can be applied to any multitouch interface.[28]
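
As a rough illustration of the general localization idea (and not the specific mechanism claimed in the patent), the sketch below drives one actuator with a feedback vibration and a second actuator with an inverted, scaled copy intended to suppress propagation elsewhere. The waveform shape, frequency, and suppression gain are assumptions chosen only for illustration.

# Conceptual sketch: a primary feedback vibration plus an anti-phase, scaled
# drive signal for a second actuator. All parameter values are hypothetical.

import math

def actuator_waveforms(duration_s=0.05, rate_hz=8000, freq_hz=200, suppress_gain=0.6):
    n = int(duration_s * rate_hz)
    primary, canceling = [], []
    for i in range(n):
        t = i / rate_hz
        s = math.sin(2 * math.pi * freq_hz * t)  # vibration under the touched region
        primary.append(s)
        canceling.append(-suppress_gain * s)     # anti-phase drive for the second actuator
    return primary, canceling

p, c = actuator_waveforms()
print(len(p), round(p[10], 3), round(c[10], 3))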

Virtual reality

Haptics are gaining widespread acceptance as a key part of virtual reality systems, adding the sense of touch to previously visual-only interfaces. Most of these systems use stylus-based haptic rendering, in which the user interacts with the virtual world via a tool or stylus, giving a form of interaction that is computationally realistic on today's hardware. Systems are being developed to use haptic interfaces for 3D modeling and design, intended to give artists a virtual experience of real interactive modeling. Researchers at the University of Tokyo have developed 3D holograms that can be "touched" through haptic feedback, using "acoustic radiation" to create a pressure sensation on a user's hands (see the Future applications section). The researchers, led by Hiroyuki Shinoda, had the technology on display at SIGGRAPH 2009 in New Orleans.[29] Several companies are making full-body or torso haptic vests or haptic suits for use in immersive virtual reality, so that explosions and bullet impacts can be felt.
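
A minimal sketch of stylus-based haptic rendering is shown below, assuming a single virtual sphere and a simple penalty (spring) force pushing the stylus tip back to the surface. The geometry, stiffness, and lack of friction are illustrative simplifications rather than any particular system's implementation; real systems typically run such a loop at around 1 kHz.

# Penalty-based haptic rendering against a virtual sphere (illustrative values).

import math

SPHERE_CENTER = (0.0, 0.0, 0.0)
SPHERE_RADIUS = 0.05          # 5 cm virtual sphere
STIFFNESS = 500.0             # N/m

def render_force(stylus_pos):
    """Return the 3-D force to command when the stylus tip is at stylus_pos (meters)."""
    dx = [stylus_pos[i] - SPHERE_CENTER[i] for i in range(3)]
    dist = math.sqrt(sum(d * d for d in dx))
    penetration = SPHERE_RADIUS - dist
    if penetration <= 0.0 or dist == 0.0:
        return (0.0, 0.0, 0.0)                 # outside the object: no force
    normal = [d / dist for d in dx]            # push outward along the surface normal
    return tuple(STIFFNESS * penetration * n for n in normal)

# Stylus tip 1 cm inside the sphere along +x: roughly 5 N pushing it back out
print(render_force((0.04, 0.0, 0.0)))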

Research

Research has been done to simulate different kinds of taction by means of high-speed vibrations or other stimuli. One device of this type uses a pad with an array of pins, where the pins vibrate to simulate a surface being touched. While this does not have a realistic feel, it does provide useful feedback, allowing discrimination between various shapes, textures, and resiliencies. Several haptics APIs have been developed for research applications, such as Chai3D, OpenHaptics, and the open-source H3DAPI.

Medicine

Haptic interfaces for medical simulation may prove especially useful for training in minimally invasive procedures such as laparoscopy and interventional radiology,[30] as well as for performing remote surgery. A particular advantage of this type of work is that surgeons can perform more operations of a similar type with less fatigue.

Tactile images for breast lesions. A – two cysts; B – invasive ductal carcinoma.

Tactile imaging, as a medical imaging modality, translates the sense of touch into a digital image. The tactile image is a function P(x,y,z), where P is the pressure on the soft tissue surface under applied deformation and x, y, z are the coordinates at which the pressure P was measured. Tactile imaging closely mimics manual palpation: the probe, with a pressure sensor array mounted on its face, acts similarly to human fingers during clinical examination, deforming the soft tissue and detecting the resulting changes in the pressure pattern. Clinical applications include imaging of the prostate,[31][32] breast,[33][34] elasticity assessment of the vagina and pelvic floor support structures,[35] muscle functional imaging of the female pelvic floor,[36] and myofascial trigger points in muscle.[37]
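
A hypothetical sketch of how per-frame pressure maps from such a sensor array might be combined into a tactile image is given below; the synthetic data and the simple per-pixel maximum projection are assumptions for illustration, not the reconstruction algorithm of any clinical device.

# Each frame is a 2-D pressure map P(x, y) recorded at one probe deformation;
# a harder inclusion shows up as a local pressure peak in the combined image.

def tactile_image(frames):
    """Combine per-frame pressure maps into one image by taking the per-pixel maximum."""
    rows, cols = len(frames[0]), len(frames[0][0])
    image = [[0.0] * cols for _ in range(rows)]
    for frame in frames:
        for r in range(rows):
            for c in range(cols):
                image[r][c] = max(image[r][c], frame[r][c])
    return image

# Two synthetic 3x3 frames: background pressure 1.0, a stiffer spot reaching 3.0
frame_a = [[1.0, 1.0, 1.0], [1.0, 2.5, 1.0], [1.0, 1.0, 1.0]]
frame_b = [[1.0, 1.2, 1.0], [1.0, 3.0, 1.2], [1.0, 1.0, 1.0]]
for row in tactile_image([frame_a, frame_b]):
    print(row)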

Mechanical imaging, a modality of medical diagnostics using mechanical sensors, was introduced in the mid-1990s.[38][39]

A Virtual Haptic Back (VHB) was successfully integrated in the curriculum at the Ohio University College of Osteopathic Medicine.[40]

Robotics

The Shadow Hand uses the sense of touch, pressure, and position to reproduce the strength, delicacy, and complexity of the human grip.[41] The SDRH was developed by Richard Greenhill and his team of engineers in London as part of The Shadow Project, now known as the Shadow Robot Company, an ongoing research and development program whose goal is to complete the first convincing artificial humanoid. An early prototype can be seen in NASA's collection of humanoid robots, or robonauts.[42] The Shadow Hand has haptic sensors embedded in every joint and finger pad, which relay information to a central computer for processing and analysis. Carnegie Mellon University in Pennsylvania and Bielefeld University in Germany found The Shadow Hand to be an invaluable tool in advancing the understanding of haptic awareness, and in 2006 they were involved in related research.[citation needed] The first PHANTOM, which allows one to interact with objects in virtual reality through touch, was developed by Thomas Massie while a student of Ken Salisbury at MIT.[43]

Arts and design

Touching is not limited to passive feeling; it allows real-time interaction with virtual objects. Haptics are therefore used in the virtual arts, such as sound synthesis, graphic design, and animation.[citation needed] A haptic device allows the artist to have direct contact with a virtual instrument that produces real-time sound or images. For instance, a simulation of a violin string produces real-time vibrations of the string under the pressure and expressiveness of the bow (the haptic device) held by the artist. This can be done with physical modeling synthesis.
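
As a small, self-contained example of physical modeling synthesis, the sketch below implements the classic Karplus-Strong plucked-string model, in which the sound is computed from a physical abstraction of the string itself. A bowed violin string driven by a haptic bow requires a more elaborate friction model, but the underlying principle is the same; all parameter values here are illustrative.

# Karplus-Strong plucked string: a delay line of noise, repeatedly averaged
# and damped, yields a decaying string-like tone at the chosen pitch.

import random

def karplus_strong(freq_hz=440.0, rate_hz=44100, duration_s=0.5, damping=0.996):
    n = int(rate_hz / freq_hz)                            # delay-line length sets the pitch
    line = [random.uniform(-1.0, 1.0) for _ in range(n)]  # initial "pluck" (noise burst)
    out = []
    for _ in range(int(rate_hz * duration_s)):
        first = line[0]
        avg = damping * 0.5 * (line[0] + line[1])         # low-pass + decay models string losses
        out.append(first)
        line = line[1:] + [avg]
    return out

samples = karplus_strong()
print(len(samples), round(max(samples), 3))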

Designers and modellers may use high-degree-of-freedom input devices that give touch feedback relating to the "surface" they are sculpting or creating, allowing faster and more natural workflow than traditional methods.[44]

Artists working with haptic technology such as vibrotactile effectors are Christa Sommerer, Laurent Mignonneau, and Stahl Stenslie.[citation needed]

Future applications

Future applications of haptic technology cover a wide spectrum of human interaction with technology. Current[timeframe?] research focuses on the mastery of tactile interaction with holograms and distant objects, which, if successful, may result in applications and advancements in gaming, movies, manufacturing, medicine, and other industries.[citation needed] The medical industry stands to gain from virtual and telepresence surgeries, which provide new options for medical care. The clothing retail industry could gain from haptic technology by allowing users to "feel" the texture of clothes for sale on the internet.[45] Future advances in haptic technology may create new industries that were previously not feasible or realistic.

Holographic interaction

Researchers at the University of Tokyo are working on adding haptic feedback to holographic projections.[46][47][48] The feedback allows the user to interact with a hologram and receive tactile responses as if the holographic object were real. The research uses ultrasound waves to create acoustic radiation pressure, which provides tactile feedback as users interact with the holographic object. The haptic technology does not affect the hologram, or the interaction with it, only the tactile response that the user perceives. The researchers posted a video displaying what they call the Airborne Ultrasound Tactile Display.[29] As of 2008, the technology was not ready for mass production or mainstream application in industry, but was quickly progressing, and industrial companies showed a positive response to the technology. This example of possible future application is the first in which the user does not have to be outfitted with a special glove or use a special control—they can "just walk up and use [it]".[49]
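
For a sense of the physics involved, the acoustic radiation pressure exerted on a (roughly) reflecting surface such as skin is approximately P ≈ 2I/c, where I is the sound intensity and c is the speed of sound in air. The back-of-the-envelope sketch below uses this relation with an assumed intensity and focal-spot size; the numbers are illustrative and are not taken from the researchers' published figures.

# Acoustic radiation pressure and force on a small focal spot (illustrative inputs).

SPEED_OF_SOUND_AIR = 343.0        # m/s

def radiation_force(intensity_w_m2, spot_area_m2, reflection=2.0):
    pressure_pa = reflection * intensity_w_m2 / SPEED_OF_SOUND_AIR
    return pressure_pa, pressure_pa * spot_area_m2

# Assumed 100 W/m^2 focused onto a 1 cm^2 spot
p, f = radiation_force(100.0, 1e-4)
print(round(p, 3), "Pa", round(f * 1000, 3), "mN")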

Future medical applications

One currently developing[timeframe?] medical innovation is a central workstation used by surgeons to perform operations remotely. Local nursing staff set up the machine and prepare the patient, and rather than travel to an operating room, the surgeon becomes a telepresence. This allows expert surgeons to operate from across the country, increasing availability of expert medical care. Haptic technology provides tactile and resistance feedback to surgeons as they operate the robotic device. As the surgeon makes an incision, they feel ligaments as if working directly on the patient.[50]

As of 2003, researchers at Stanford University were developing technology to simulate surgery for training purposes. Simulated operations allow surgeons and surgical students to practice and train more. Haptic technology aids in the simulation by creating a realistic environment of touch. Much like telepresence surgery, surgeons feel simulated ligaments, or the pressure of a virtual incision as if it were real. The researchers, led by J. Kenneth Salisbury Jr., professor of computer science and surgery, hope to be able to create realistic internal organs for the simulated surgeries, but Salisbury stated that the task will be difficult.[45] The idea behind the research is that "just as commercial pilots train in flight simulators before they're unleashed on real passengers, surgeons will be able to practice their first incisions without actually cutting anyone".[45]

According to a Boston University paper published in The Lancet, "Noise-based devices, such as randomly vibrating insoles, could also ameliorate age-related impairments in balance control."[51] If effective, affordable haptic insoles were available, perhaps many injuries from falls in old age or due to illness-related balance-impairment could be avoided.

In February 2013, an inventor in the United States built a "spider-sense" bodysuit equipped with ultrasonic sensors and haptic feedback systems that alert the wearer to incoming threats, allowing the wearer to respond to attackers even when blindfolded.[52]

Laparoscopic Tactile Imaging probe (design concept).

During laparoscopic surgery the video camera becomes the surgeon's eyes, since the surgeon uses the image from the video camera positioned inside the patient's body to perform the procedure. Visual feedback is similar or often superior to that of open procedures. The greatest limitation of these minimally invasive approaches is the impairment (in the case of traditional laparoscopy) or complete lack (in the case of robotic laparoscopy) of the tactile sensation normally used to assist in surgical dissection and decision making. Despite multiple attempts, no tactile imaging device or probe is currently commercially available for laparoscopic surgery.[53][54][55][56] The figure on the right presents one of the proposed devices, which is in the development phase.[57]

References

  1. Lua error in package.lua at line 80: module 'strict' not found.
  2. Srinivasan, M. A., & LaMotte, R. H. (1995). Tactual discrimination of softness. Journal of Neurophysiology, 73, 88–101.
  3. Freyberger, F. K. B., & Färber, B. Compliance discrimination of deformable objects by squeezing with one and two fingers. In Proceedings of EuroHaptics 2006 (pp. 271–276).
  4. Bergmann Tiest, W. M., & Kappers, A. M. L. (2009a). Cues for haptic perception of compliance. IEEE Transactions on Haptics, 2, 189–199.
  5. Tiest WM. Tactual perception of material properties. Vision Res 2010; 50(24): 2775-82.
  6. Lua error in package.lua at line 80: module 'strict' not found.
  7. "Man-Machine Tactile Communication," SID Journal (The Official Journal of the Society for Information Display), Vol. 1, No. 2, (July/August 1972), pp. 5-11.
  8. Lua error in package.lua at line 80: module 'strict' not found.
  9. Lua error in package.lua at line 80: module 'strict' not found.
  10. Feyzabadi, S.; Straube, S.; Folgheraiter, M.; Kirchner, E.A.; Su Kyoung Kim; Albiez, J.C., "Human Force Discrimination during Active Arm Motion for Force Feedback Design," Haptics, IEEE Transactions on , vol.6, no.3, pp.309,319, July-Sept. 2013
  11. Moto-Cross at the Killer List of Videogames
  12. Fonz at the Killer List of Videogames
  13. Mark J. P. Wolf (2008), The video game explosion: a history from PONG to PlayStation and beyond, p. 39, ABC-CLIO, ISBN 0-313-33868-X
  14. TX-1 at the Killer List of Videogames
  15. http://thepinballreview.com/2013/06/15/1989-williams-earthshaker-overview/
  16. http://news.microsoft.com/1998/02/03/microsoft-and-immersion-continue-joint-efforts-to-advance-future-development-of-force-feedback-technology/
  17. Lua error in package.lua at line 80: module 'strict' not found.
  18. Lua error in package.lua at line 80: module 'strict' not found.
  19. Lua error in package.lua at line 80: module 'strict' not found.
  20. Lua error in package.lua at line 80: module 'strict' not found.
  21. Lua error in package.lua at line 80: module 'strict' not found.
  22. Lua error in package.lua at line 80: module 'strict' not found.
  23. Lua error in package.lua at line 80: module 'strict' not found.
  24. [1] Archived November 17, 2008 at the Wayback Machine
  25. [2]
  26. Lua error in package.lua at line 80: module 'strict' not found.
  27. U.S. Patent No. 8,378,797
  28. Lua error in package.lua at line 80: module 'strict' not found.
  29. 29.0 29.1 Lua error in package.lua at line 80: module 'strict' not found.
  30. Jacobus, C., et al., Method and system for simulating medical procedures including virtual reality and control method and system, US Patent 5,769,640
  31. Egorov V, Ayrapetyan S, Sarvazyan AP. Prostate Mechanical Imaging: 3-D image composition and feature calculations. IEEE Trans Med Imaging 2006; 25(10): 1329-40.
  32. Weiss RE, Egorov V, Ayrapetyan S, Sarvazyan N, Sarvazyan A. Prostate mechanical imaging: a new method for prostate assessment. Urology 2008; 71(3):425-429.
  33. Egorov V, Sarvazyan AP. Mechanical Imaging of the Breast. IEEE Transactions on Medical Imaging 2008; 27(9):1275-87.
  34. Egorov V, Kearney T, Pollak SB, Rohatgi C, Sarvazyan N, Airapetian S, Browning S, Sarvazyan A. Differentiation of benign and malignant breast lesions by mechanical imaging. Breast Cancer Research and Treatment 2009; 118(1): 67-80.
  35. Egorov V, van Raalte H, Sarvazyan A. Vaginal Tactile Imaging. IEEE Transactions on Biomedical Engineering 2010; 57(7):1736-44.
  36. van Raalte H, Egorov V. Characterizing female pelvic floor conditions by tactile imaging. International Urogynecology Journal 2015; 26(4): 6097-7, with Video Supplement.
  37. Turo D, Otto P, Egorov V, Sarvazyan A, Gerber LH, Sikdar S. Elastography and tactile imaging for mechanical characterization of superficial muscles. J Acoust Soc Am 2012; 132(3):1983.
  38. Lua error in package.lua at line 80: module 'strict' not found.
  39. Sarvazyan AP, Skovoroda AR. June 1996. Method and apparatus for elasticity imaging. U.S. Patent 5,524,636; 1996.
  40. Lua error in package.lua at line 80: module 'strict' not found.
  41. Lua error in package.lua at line 80: module 'strict' not found.
  42. Lua error in package.lua at line 80: module 'strict' not found.
  43. Lua error in package.lua at line 80: module 'strict' not found.
  44. Lua error in package.lua at line 80: module 'strict' not found.
  45. 45.0 45.1 45.2 Lua error in package.lua at line 80: module 'strict' not found.
  46. Mary-Ann Russon (2016). Holograms you can reach out and touch developed by Japanese scientists. IBTimes
  47. Makino, Y., Furuyama, Y., & Shinoda, H. (2015, August). HaptoClone (Haptic-Optical Clone): Mid-air Haptic-Optical Human-Human Interaction with Perfect Synchronization. In Proceedings of the 3rd ACM Symposium on Spatial User Interaction (pp. 139-139). ACM.
  48. Shinoda, H. (2015, November). Haptoclone as a test bench of weak force haptic interaction. In SIGGRAPH Asia 2015 Haptic Media And Contents Design (p. 3). ACM.
  49. Lua error in package.lua at line 80: module 'strict' not found.
  50. [3] Archived September 15, 2008 at the Wayback Machine
  51. Attila A Priplata, James B Niemi, Jason D Harry, Lewis A Lipsitz, James J Collins. "Vibrating insoles and balance control in elderly people" The Lancet, Vol 362, October 4, 2003.
  52. Lua error in package.lua at line 80: module 'strict' not found.
  53. Talasaz A, Patel RV. Integration of force reflection with tactile sensing for minimally invasive robotics-assisted tumor localization. IEEE Trans Haptics. 2013; 6(2): 217-28.
  54. Hollenstein M1, Bugnard G, Joos R, Kropf S, Villiger P, Mazza E. Towards laparoscopic tissue aspiration. Med Image Anal. 2013; 17(8): 1037-45.
  55. Beccani M, Di Natali C, Sliker LJ, Schoen JA, Rentschler ME, Valdastri P. Wireless tissue palpation for intraoperative detection of lumps in the soft tissue. IEEE Trans Biomed Eng. 2014; 61(2): 353-61.
  56. Pacchierotti C, Prattichizzo D, Kuchenbecker K. Cutaneous Feedback of Fingertip Deformation and Vibration for Palpation in Robotic Surgery. IEEE Trans Biomed Eng 2015 Jul 13. [Epub ahead of print]
  57. Egorov V, Sarvazyan AP. Method and probe for providing tactile feedback in laparoscopic surgery. USA Provisional Patent Application, No. 62199899; July 31, 2015.

Further reading

  • Klein. D, D. Rensink, H. Freimuth, G. J. Monkman, S. Egersdörfer, H. Böse & M. Baumann. Modelling the Response of a Tactile Array using an Electrorheological Fluids. Journal of Physics D: Applied Physics, vol 37, no. 5, pp794–803, 2004.
  • Klein. D, H. Freimuth, G. J. Monkman, S. Egersdörfer, A. Meier, H. Böse M. Baumann, H. Ermert & O. T. Bruhns. Electrorheological Tactile Elements. Mechatronics Vol 15, No 7, pp883–897. Pergamon, September 2005.
  • Monkman. G. J. An Electrorheological Tactile Display. Presence (Journal of Teleoperators and Virtual Environments) Vol. 1, issue 2, pp. 219–228, MIT Press, July 1992.
  • Robles-De-La-Torre G. Principles of Haptic Perception in Virtual Environments. In Grunwald M (Ed.), Human Haptic Perception, Birkhäuser Verlag, 2008.
