People are now familiar with, and even fond of, multi touch, so it is time to think about what the next multi touch will be. One good way to find an answer is to study HCI research presented at venues such as SIGCHI, MobileHCI, and the IEEE Tabletop conference.
Among the many interesting touch UI research projects, this post introduces an intriguing article entitled “New mobile UI with hand-grip recognition,” shown at the SIGCHI 2009 Video Showcase [1]. Unlike other multi touch interaction techniques, it applies multi touch sensing in a different way: according to the paper, the device recognizes how the user holds it in order to infer, and offer, what they want to do with it.
The proposed user interface is based on the assumption that the device can detect how a user holds the device. By analyzing the user's grip-pattern, the device recognizes the user's intention and adjusts itself to meet the specific needs of the user such as accessing an application. The concept of the user interface is presented through several use-case scenarios. In addition, the technical feasibility of the proposed interface is validated by implementing a working prototype system. [1]
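To make the grip-to-intent step concrete, here is a minimal sketch of the idea. It assumes the grip arrives as a flattened vector of capacitive readings and uses a simple nearest-neighbor match against stored grip templates; the actual recognizer described in [2, 3] is more sophisticated, and every name below (GRIP_TEMPLATES, recognize_grip, and so on) is hypothetical.

```python
# Illustrative sketch only: the grip is assumed to arrive as a flattened vector
# of capacitive readings, and a nearest-neighbor match against stored grip
# templates decides which application to launch. Templates and values are toy data.

import numpy as np

# Hypothetical grip templates: one averaged sensor snapshot per known grip.
GRIP_TEMPLATES = {
    "phone_call": np.array([0.9, 0.8, 0.1, 0.0, 0.7, 0.9]),  # gripped like a handset
    "camera":     np.array([0.2, 0.9, 0.9, 0.2, 0.1, 0.1]),  # held landscape with two hands
    "text_entry": np.array([0.6, 0.1, 0.1, 0.6, 0.8, 0.8]),  # cradled, both thumbs free
}

GRIP_TO_APP = {
    "phone_call": "launch dialer",
    "camera":     "launch camera",
    "text_entry": "launch messaging",
}


def recognize_grip(sensor_vector, threshold=0.8):
    """Return the best-matching grip label, or None if nothing is close enough."""
    best_label, best_dist = None, float("inf")
    for label, template in GRIP_TEMPLATES.items():
        dist = np.linalg.norm(sensor_vector - template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist < threshold else None


if __name__ == "__main__":
    # A noisy reading that resembles the "camera" grip.
    reading = np.array([0.25, 0.85, 0.95, 0.15, 0.05, 0.15])
    grip = recognize_grip(reading)
    print(grip, "->", GRIP_TO_APP.get(grip, "do nothing"))
```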
The video clip includes several use-case scenarios and a prototype demo:

[Use-case scenarios]

[Prototype demo]
After some googling, more detailed versions of the work were found [2-4]. The figures below illustrate the motivation, interaction concept, and technologies behind the work.

[Different tools, different grip-patterns]

[All surfaces of the device are covered with capacitive multi touch sensors to sense the user’s hand shape]
[Apparatus and method for controlling portable terminal, US20060111093 [4]]
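As the captions above indicate, every surface of the device is tiled with capacitive sensors so that the hand shape can be captured. The sketch below shows one assumed way to turn such per-face readings into the single feature vector a recognizer like the one sketched earlier would consume; the face layout and grid sizes are invented for illustration and are not taken from the paper or the patent.

```python
# Rough sketch of the sensing side, under the simplifying assumption that each
# face of the device carries a small grid of capacitive electrodes and the grip
# "shape" is just the concatenation of all per-face readings.

import numpy as np

# face name -> (rows, cols) of capacitive electrodes on that face (assumed sizes)
FACE_GRIDS = {
    "front": (8, 4), "back": (8, 4),
    "left": (8, 1), "right": (8, 1),
    "top": (1, 4), "bottom": (1, 4),
}


def flatten_grip(per_face_readings):
    """Concatenate per-face sensor grids into one grip feature vector."""
    parts = []
    for face in sorted(FACE_GRIDS):  # fixed face order keeps features aligned
        grid = np.asarray(per_face_readings[face], dtype=float)
        assert grid.shape == FACE_GRIDS[face], f"unexpected grid shape on {face}"
        parts.append(grid.ravel())
    return np.concatenate(parts)


if __name__ == "__main__":
    # Fake a reading: strong contact on the right edge and back (a right-handed grip).
    rng = np.random.default_rng(0)
    readings = {face: rng.random(shape) * 0.1 for face, shape in FACE_GRIDS.items()}
    readings["right"] += 0.8
    readings["back"][:, 2:] += 0.7
    vector = flatten_grip(readings)
    print(vector.shape)  # (88,) with the assumed grid sizes
```

In practice the stored grip templates in the earlier sketch would have the same length as this vector, rather than the toy six-element examples used there.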
Of course, there is plenty of related research as well. Graspables [5] and HandSense [6] are the most similar works, while back-of-device interaction [7, 8], double-side interaction [9], and sphere, curved, and cubic interaction [10-14] are also worth studying.
References
[1] H. Lee, W. Chang, J. Park, and J. Shim, "New mobile UI with hand-grip recognition," in Proceedings of the 27th international conference extended abstracts on Human factors in computing systems, Boston, MA, USA, 2009, pp. 3521-3522.
[2] W. Chang, K. E. Kim, H. Lee, J. K. Cho, B. S. Soh, J. H. Shim, G. Yang, S.-J. Cho, and J. Park, "Recognition of Grip-Patterns by Using Capacitive Touch Sensors," in 2006 IEEE International Symposium on Industrial Electronics, 2006, pp. 2936-2941.
[3] K. E. Kim, W. Chang, S. J. Cho, J. Shim, H. Lee, J. Park, Y. Lee, and S. Kim, "Hand Grip Pattern Recognition for Mobile User Interfaces," in Proceedings of AAAI/IAAI-2006, 2006.
[4] J.-H. Shim, K.-e. Kim, H.-j. Kim, J.-a. Park, H.-y. Lee, H.-j. Lee, W. Chang, S.-n. Chung, and S.-j. Cho, "Apparatus and method for controlling portable terminal," US20060111093: Samsung Electronics, Nov. 8, 2005.
[5] B. T. Taylor and V. M. Bove, Jr., "Graspables: grasp-recognition as a user interface," in Proceedings of the 27th international conference on Human factors in computing systems, Boston, MA, USA: ACM, 2009.
[6] R. Wimmer and S. Boring, "HandSense: discriminating different ways of grasping and holding a tangible user interface," in Proceedings of the 3rd International Conference on Tangible and Embedded Interaction, Cambridge, United Kingdom: ACM, 2009.
[7] P. Baudisch and G. Chu, "Back-of-device interaction allows creating very small touch devices," in Proceedings of the 27th international conference on Human factors in computing systems, Boston, MA, USA: ACM, 2009.
[8] D. Wigdor, C. Forlines, P. Baudisch, J. Barnwell, and C. Shen, "Lucid touch: a see-through mobile device," in Proceedings of the 20th annual ACM symposium on User interface software and technology, Newport, Rhode Island, USA: ACM, 2007.
[9] E.-l. E. Shen, S.-s. D. Tsai, H.-h. Chu, Y.-j. J. Hsu, and C.-w. E. Chen, "Double-side multi-touch input for mobile devices," in Proceedings of the 27th international conference extended abstracts on Human factors in computing systems, Boston, MA, USA, 2009, pp. 4339-4344.
[10] D. K. Pai, E. W. VanDerLoo, S. Sadhukhan, and P. G. Kry, "The Tango: a tangible tangoreceptive whole-hand human interface," in First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (World Haptics 2005), 2005, pp. 141-147.
[11] H. Benko, A. D. Wilson, and R. Balakrishnan, "Sphere: Multi-touch interactions on a spherical display," in UIST 2008 - Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, Monterey, CA, 2008, pp. 77-86.
[12] J. Lee, Y. Hu, and T. Selker, "iSphere: A free-hand 3D modeling interface," International journal of Architectural Computing, vol. 4, pp. 19-31, 2006.
[13] H. Benko, "Beyond flat surface computing: challenges of depth-aware and curved interfaces," in Proceedings of the seventeenth ACM international conference on Multimedia, Beijing, China: ACM, 2009.
[14] J.-B. de la Rivière, C. Kervegant, E. Orvain, and N. Dittlo, "CubTile: A multi-touch cubic interface," in Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST), Bordeaux, 2008, pp. 69-72.