Thursday, September 29, 2011
Wednesday, September 28, 2011
Tuesday, September 27, 2011
Scientists probe connection between sight and touch in the brain
USC scientists have discovered that as you look at an object, your brain not only processes what the object looks like but also recalls what it feels like to touch it. This connection is so strong that a computer examining data coming only from the part of your brain that processes touch can predict which object you are actually looking at.
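The idea of "predicting" the viewed object from touch-region activity is essentially pattern classification. As a minimal sketch (not the study's actual analysis), the toy example below treats each object as a characteristic activity pattern across a few hypothetical "voxels" and decodes a noisy measurement by finding the nearest template:

```python
import random

random.seed(0)

# Hypothetical templates: each object evokes a characteristic activity
# pattern across "voxels" in the touch-processing region. The objects
# and values here are invented for illustration.
OBJECTS = {"mug": [1.0, 0.2, 0.1], "brush": [0.1, 1.0, 0.3], "key": [0.2, 0.1, 1.0]}

def noisy(pattern, sigma=0.1):
    """Simulate a measured activity pattern with Gaussian noise."""
    return [v + random.gauss(0, sigma) for v in pattern]

def decode(measurement):
    """Nearest-centroid decoding: which object's template is closest?"""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(OBJECTS, key=lambda name: dist(measurement, OBJECTS[name]))

print(decode(noisy(OBJECTS["mug"])))
```

Real fMRI decoding works on thousands of voxels with cross-validated classifiers, but the principle is the same: distinct stimuli leave distinguishable activity patterns.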
Robotics team finds artificial fingerprints improve tactile abilities
Over the past couple of decades, many people in and out of the science community have watched the steady progress being made in robotics. The field is exceptionally interesting because of the anthropomorphic nature of its results: each new step brings such machines closer to emulating us, even as we look forward to the next. One interesting thing about robotics is that certain areas seem to advance faster than others. Robot arms, for example, are old news; newer research focuses on hand movements, and as advances in hand movements have been made, attention has shifted to finger movements and finally to tactile sensation. Now, in a paper published on the preprint server arXiv, a trio of researchers from the National University of Singapore describe how affixing artificial fingerprints to robot fingers can increase tactile "sensation," allowing such a robot to discern differences in the curvature of objects.
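One intuition for why surface curvature is recoverable from touch: a sharply curved object pressed to a fixed depth contacts a narrower patch of the sensor than a gently curved one. The sketch below (a toy model, not the paper's method; all parameters are invented) presses spheres of different radii onto a 1-D array of tactile elements and ranks them by contact width:

```python
import math

def contact_profile(radius_mm, depth_mm=0.5, taxel_pitch_mm=1.0, n_taxels=21):
    """Binary contact reading on a 1-D taxel array when a sphere of the
    given radius is pressed into the sensor by depth_mm (toy geometry:
    contact half-width of a sphere indented by d is sqrt(2*R*d - d^2))."""
    half_width = math.sqrt(max(2 * radius_mm * depth_mm - depth_mm ** 2, 0))
    center = n_taxels // 2
    return [
        1.0 if abs((i - center) * taxel_pitch_mm) <= half_width else 0.0
        for i in range(n_taxels)
    ]

def rank_by_curvature(profiles):
    """Order surfaces from most to least curved: a smaller-radius (sharper)
    surface activates fewer taxels."""
    return sorted(profiles, key=lambda r: sum(profiles[r]))

profiles = {r: contact_profile(r) for r in (5, 20, 80)}  # radii in mm
print(rank_by_curvature(profiles))  # prints [5, 20, 80]: sharpest first
```

Fingerprint-like ridges help in practice because they convert fine surface geometry into richer spatial and vibratory signal patterns than a smooth sensor skin would produce.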
Monday, September 26, 2011
Hasso-Plattner-Institut: Understanding Touch
A study participant targeting crosshairs using different finger angles. Can you guess how this user is conceptualizing touch, i.e., what geometric relationship between finger and crosshairs the user is trying to maintain independent of how the finger is held? Our findings suggest that users indeed target as suggested by this illustration, i.e., by aligning finger features and outlines in a hypothesized top-down perspective.
Intel Labs Seattle | Publications: HeatWave: Exploring the Feasibility of Thermal Imaging for Surface User Interaction
Abstract: With increasing interest in natural interaction (such as gesture or body motion), researchers are exploring sensors and algorithms that more accurately detect these user interactions unobtrusively to create opportunities for new applications. The HeatWave system describes our exploration of digital thermal imaging cameras to detect, track, and support user interaction on arbitrary surfaces. Thermal sensing has had limited examination in the HCI research community and is generally under-explored outside of law enforcement and energy auditing applications. This paper describes HCI applications for this technology, our algorithm implementation, and highlights how this new imaging technology might complement or augment more traditional RGB (video) and/or depth cameras. We identify unique contributions and opportunities afforded by thermal imaging and its limitations as a practical sensor for naturalistic user interaction.
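The core signal HeatWave exploits is simple: a warm fingertip leaves a heat trace on the touched surface, visible to a thermal camera as pixels warmer than the background. A minimal sketch of that detection step (my own illustration, not the paper's implementation; temperatures and threshold are invented) is just background subtraction with a threshold:

```python
def detect_touches(background, frame, delta_threshold=1.5):
    """Return (x, y) pixels whose temperature rose noticeably above the
    background frame, i.e., candidate heat residue left by fingertips."""
    touches = []
    for y, (brow, frow) in enumerate(zip(background, frame)):
        for x, (b, f) in enumerate(zip(brow, frow)):
            if f - b >= delta_threshold:
                touches.append((x, y))
    return touches

background = [[22.0] * 4 for _ in range(3)]  # ambient surface, deg C
frame = [row[:] for row in background]
frame[1][2] = 26.5                           # warm spot left by a finger
print(detect_touches(background, frame))     # prints [(2, 1)]
```

A practical system would also track blobs over time and compensate for drift in ambient temperature, but thresholded temperature deltas are the starting point.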
Visualization Lab | FingerGlass: Efficient Multiscale Interaction on Multitouch Screens
Abstract: Many tasks in graphical user interfaces require users to interact with elements at various levels of precision. We present FingerGlass, a bimanual technique designed to improve the precision of graphical tasks on multitouch screens. It enables users to quickly navigate to different locations and across multiple scales of a scene using a single hand. The other hand can simultaneously interact with objects in the scene. Unlike traditional pan-zoom interfaces, FingerGlass retains contextual information during the interaction. We evaluated our technique in the context of precise object selection and translation and found that FingerGlass significantly outperforms three state-of-the-art baseline techniques in both objective and subjective measurements: users acquired and translated targets more than 50% faster than with the second-best technique in our experiment.
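The precision gain of a magnifier-style technique comes from a simple coordinate transform: finger motion inside the zoomed callout is divided by the zoom factor when mapped back to the scene. A minimal sketch of that mapping (my own illustration, with invented coordinates, not FingerGlass's actual geometry):

```python
def magnifier_to_scene(touch, region_center, zoom):
    """Map a point touched inside a magnified callout back to underlying
    scene coordinates: offsets from the magnified region's center shrink
    by the zoom factor."""
    tx, ty = touch
    cx, cy = region_center
    return (cx + (tx - cx) / zoom, cy + (ty - cy) / zoom)

# With a 4x magnifier centered at (100, 100), a touch 20 px right of
# center in the callout corresponds to only 5 px in the scene.
print(magnifier_to_scene((120, 100), (100, 100), zoom=4))  # (105.0, 100.0)
```

Dividing motion by the zoom factor is what lets coarse finger movements produce sub-finger-width precision in the scene.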
Sunday, September 25, 2011
Medusa: A Proximity-Aware Multi-touch Tabletop - Publications - Autodesk Research
Medusa: A Proximity-Aware Multi-touch Tabletop
(2011)
Abstract
We present Medusa, a proximity-aware multi-touch tabletop. Medusa uses 138 inexpensive proximity sensors to: detect a user's presence and location, determine body and arm locations, distinguish between the right and left arms, and map touch points to specific users and specific hands. Our tracking algorithms and hardware designs are described. Exploring this unique design, we develop and report on a collection of interactions enabled by Medusa in support of multi-user collaborative design, specifically within the context of Proxi-Sketch, a multi-user UI prototyping tool. We discuss design issues, system implementation, limitations, and generalizable concepts throughout the paper.
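Once arm positions are tracked around the table rim, mapping a touch to a user can be as simple as a nearest-neighbor assignment. The sketch below illustrates that step only (a toy version with invented coordinates, not Medusa's actual tracking pipeline):

```python
import math

def assign_touches(touches, user_arms):
    """Assign each touch point to the user whose tracked arm position is
    nearest, a toy version of touch-to-user mapping on a tabletop."""
    assignments = {}
    for touch in touches:
        best_user, _ = min(
            ((user, arm) for user, arms in user_arms.items() for arm in arms),
            key=lambda ua: math.dist(touch, ua[1]),
        )
        assignments[touch] = best_user
    return assignments

# Two users at opposite table edges, each with two tracked arm positions.
user_arms = {"alice": [(0, 50), (0, 70)], "bob": [(200, 50), (200, 70)]}
print(assign_touches([(30, 60), (180, 55)], user_arms))
# {(30, 60): 'alice', (180, 55): 'bob'}
```

Medusa goes further by distinguishing left from right arms, which is what lets it attribute a touch not just to a user but to a specific hand.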
The Magic Desk - YouTube
I really like this kind of stuff!
Link
* http://www.autodeskresearch.com/projects/multitouch
* http://www.autodeskresearch.com/publications/magicdesk#
Saturday, September 24, 2011
Saturday, September 3, 2011
Friday, September 2, 2011