Curve shape and curvature perception through interactive sonification

Published: 22 October 2012

Abstract

In this article we present an approach that uses sound to communicate geometrical data related to a virtual object. The approach was developed in the framework of a multimodal interface for product design. The interface allows a designer to evaluate the quality of a 3-D shape using touch, vision, and sound. Two important considerations addressed in this article are the nature of the data that is sonified and the haptic interaction between the user and the interface, which triggers the sound and influences its characteristics. Based on these considerations, we present a number of sonification strategies designed to map the geometrical data of interest into sound. The fundamental frequency of various sounds was used to convey the curve shape or the curvature to the listeners. Two evaluation experiments are described: one involved participants with varied backgrounds; the other involved the intended users, i.e., participants with a background in industrial design. The results show that the sonification was quite successful, independent of the sonification method used and of whether the curve shape or the curvature was sonified. In the first experiment, participants had a success rate of about 80% in a multiple-choice task; in the second experiment, participants took on average less than 20 seconds to find the maximum, minimum, or inflection points of the curvature of a test curve.
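The core mapping the abstract describes, curvature conveyed through fundamental frequency, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the parabolic test curve, the linear curvature-to-pitch mapping, and the 220–880 Hz frequency range are all assumptions chosen for the example.

```python
def curvature_parabola(x):
    """Curvature of the test curve y = x^2: kappa = |y''| / (1 + y'^2)^(3/2)."""
    return 2.0 / (1.0 + 4.0 * x * x) ** 1.5

def kappa_to_frequency(kappa, kappa_min, kappa_max, f_min=220.0, f_max=880.0):
    """Linearly map a curvature value into a fundamental-frequency range (Hz)."""
    if kappa_max == kappa_min:
        return f_min
    ratio = (kappa - kappa_min) / (kappa_max - kappa_min)
    return f_min + ratio * (f_max - f_min)

# Sample the curve at the positions a user's haptic probe might touch.
xs = [i / 50.0 - 1.0 for i in range(101)]          # x in [-1, 1]
kappas = [curvature_parabola(x) for x in xs]
k_lo, k_hi = min(kappas), max(kappas)
freqs = [kappa_to_frequency(k, k_lo, k_hi) for k in kappas]

# Curvature of y = x^2 peaks at the apex (x = 0), so the pitch peaks there too,
# which is how a listener could locate the curvature maximum by ear.
print(freqs[xs.index(0.0)])  # 880.0, the top of the pitch range
```

In an interactive setting the sampled position would come from the haptic device rather than a precomputed grid, and the frequency would drive a continuous synthesizer instead of being printed.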



Published In

ACM Transactions on Applied Perception, Volume 9, Issue 4
October 2012
109 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/2355598
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 22 October 2012
Accepted: 01 May 2012
Revised: 01 May 2012
Received: 01 January 2012
Published in TAP Volume 9, Issue 4


Author Tags

  1. sonification
  2. haptics
  3. modal synthesis
  4. sound synthesis

Qualifiers

  • Research-article
  • Research
  • Refereed

Article Metrics

  • Downloads (last 12 months): 18
  • Downloads (last 6 weeks): 2
Reflects downloads up to 14 Sep 2024


Cited By

  • (2024) Hearing a circle: An exploratory study of accessible sonification for young children with blindness and low vision. British Journal of Visual Impairment. DOI: 10.1177/02646196241253534. Online publication date: 21-May-2024.
  • (2024) Interactive Shape Sonification for Tumor Localization in Breast Cancer Surgery. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3613904.3642257. Online publication date: 11-May-2024.
  • (2024) Open Your Ears and Take a Look: A State-of-the-Art Report on the Integration of Sonification and Visualization. Computer Graphics Forum 43:3. DOI: 10.1111/cgf.15114. Online publication date: 10-Jun-2024.
  • (2023) ShapeSonic: Sonifying Fingertip Interactions for Non-Visual Virtual Shape Perception. SIGGRAPH Asia 2023 Conference Papers, 1-9. DOI: 10.1145/3610548.3618246. Online publication date: 10-Dec-2023.
  • (2022) Slide-Tone and Tilt-Tone: 1-DOF Haptic Techniques for Conveying Shape Characteristics of Graphs to Blind Users. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3491102.3517790. Online publication date: 29-Apr-2022.
  • (2022) Feasibility Study on Interactive Geometry Sonification. 2022 International Conference on Cyberworlds (CW), 159-162. DOI: 10.1109/CW55638.2022.00036. Online publication date: Sep-2022.
  • (2021) MovEcho. ACM Transactions on Applied Perception 18:3, 1-19. DOI: 10.1145/3464692. Online publication date: 20-Aug-2021.
  • (2020) Immersive sonification of protein surface. 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 380-383. DOI: 10.1109/VRW50115.2020.00082. Online publication date: Mar-2020.
  • (2018) Immersive sonification of protein surface. Proceedings of the 30th Conference on l'Interaction Homme-Machine, 149-155. DOI: 10.1145/3286689.3286713. Online publication date: 23-Oct-2018.
  • (2017) Evaluating the Use of Sound in Static Program Comprehension. ACM Transactions on Applied Perception 15:1, 1-20. DOI: 10.1145/3129456. Online publication date: 6-Oct-2017.
