Wednesday, December 28, 2011
Friday, November 11, 2011
Wednesday, November 2, 2011
Saturday, October 29, 2011
Friday, October 28, 2011
Wednesday, October 26, 2011
Monday, October 24, 2011
Tuesday, October 18, 2011
Monday, October 17, 2011
Thursday, October 13, 2011
Wednesday, October 5, 2011
Thursday, September 29, 2011
Wednesday, September 28, 2011
Tuesday, September 27, 2011
USC scientists have discovered that as you look at an object, your brain not only processes what the object looks like, but remembers what it feels like to touch it as well. This connection is so strong that a computer examining data coming only from the part of your brain that processes touch can predict which object you are actually looking at.
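The "predict which object you are looking at from touch-cortex data" claim is essentially a pattern-classification problem. Here is a minimal, purely illustrative sketch using a nearest-centroid classifier on fabricated activation vectors; the real study used fMRI voxel patterns and cross-validated multivariate decoders, and every name and number below is invented:

```python
import numpy as np

# Hypothetical sketch: decode which object a subject is viewing from
# (synthetic) somatosensory-cortex activation vectors with a simple
# nearest-centroid classifier. All data here is fabricated.

rng = np.random.default_rng(0)
objects = ["mug", "key", "sponge"]

# Pretend each object evokes a characteristic 50-dimensional pattern.
prototypes = {name: rng.normal(size=50) for name in objects}
train = {name: prototypes[name] + rng.normal(scale=0.3, size=(20, 50))
         for name in objects}

# Nearest-centroid decoder: the mean training pattern per object.
centroids = {name: trials.mean(axis=0) for name, trials in train.items()}

def decode(pattern):
    """Return the object whose centroid is closest to the pattern."""
    return min(centroids, key=lambda n: np.linalg.norm(pattern - centroids[n]))

# A new (synthetic) trial evoked by looking at the mug:
trial = prototypes["mug"] + rng.normal(scale=0.3, size=50)
print(decode(trial))
```

The point is only the shape of the idea: if touch-related activity carries object identity, a classifier trained on that activity alone can recover what is being seen.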
Over the past couple of decades, many people in and out of the science community have watched the steady progress being made in robotics. It's an exceptionally interesting field due to the anthropomorphic nature of the results: each new step brings such machines closer to emulating us, even as we look forward to the next. One interesting thing about robotics is that certain areas seem to advance faster than others. Robot arms, for example, are old news; newer research focuses on hand movements. And as advances in hand movements have been made, more research has come to focus on finger movements and, finally, tactile sensation. Now, new work by a trio of researchers from the National University of Singapore, described in their paper published on the preprint server arXiv, shows how affixing artificial fingerprints to robot fingers can increase tactile "sensation," allowing such a robot to discern differences in the curvature of objects.

Monday, September 26, 2011
A study participant targeting crosshairs using different finger angles. Can you guess how this user is conceptualizing touch, i.e., what geometric relationship between finger and crosshairs the user is trying to maintain independent of how the finger is held? Our findings suggest that users indeed target as suggested by this illustration, i.e., by aligning finger features and outlines in a hypothesized top-down perspective.
Intel Labs Seattle | Publications: HeatWave: Exploring the Feasibility of Thermal Imaging for Surface User Interaction
Abstract: With increasing interest in natural interaction (such as gesture or body motion), researchers are exploring sensors and algorithms that more accurately detect these user interactions unobtrusively to create opportunities for new applications. The HeatWave system describes our exploration of digital thermal imaging cameras to detect, track, and support user interaction on arbitrary surfaces. Thermal sensing has had limited examination in the HCI research community and is generally under-explored outside of law enforcement and energy auditing applications. This paper describes HCI applications for this technology, our algorithm implementation, and highlights how this new imaging technology might complement or augment more traditional RGB (video) and/or depth cameras. We identify unique contributions and opportunities afforded by thermal imaging and its limitations as a practical sensor for naturalistic user interaction.
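One property that makes thermal imaging attractive for surface interaction is that a fingertip leaves a brief residual heat trace after it lifts off. A toy sketch of that detection step, not the HeatWave implementation (the frame data, threshold, and function names are my own), might look like this:

```python
import numpy as np

# Illustrative sketch (not the HeatWave algorithm): find residual heat
# traces left by a fingertip by subtracting a background thermal frame
# and thresholding the warm residue. Values are fabricated temperatures
# in degrees C on a tiny 8x8 thermal image.

background = np.full((8, 8), 22.0)     # surface at room temperature
frame = background.copy()
frame[3:5, 3:5] = 27.0                 # warm patch where a finger touched

def heat_touches(frame, background, delta=2.0):
    """Return (row, col) pixels at least `delta` degrees above background."""
    mask = (frame - background) >= delta
    return list(zip(*np.nonzero(mask)))

print(heat_touches(frame, background))
```

A real system would also need background adaptation and blob tracking over time, but the frame-differencing core is this simple.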
Abstract: Many tasks in graphical user interfaces require users to interact with elements at various levels of precision. We present FingerGlass, a bimanual technique designed to improve the precision of graphical tasks on multitouch screens. It enables users to quickly navigate to different locations and across multiple scales of a scene using a single hand. The other hand can simultaneously interact with objects in the scene. Unlike traditional pan-zoom interfaces, FingerGlass retains contextual information during the interaction. We evaluated our technique in the context of precise object selection and translation and found that FingerGlass significantly outperforms three state-of-the-art baseline techniques in both objective and subjective measurements: users acquired and translated targets more than 50% faster than with the second-best technique in our experiment.
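The precision gain in a magnifying-lens technique like this comes from a simple coordinate mapping: a touch inside the enlarged view is divided by the zoom factor before being applied to the scene, so screen-space jitter shrinks proportionally. A sketch of that mapping (the names and parameters are my own, not from the paper):

```python
# Sketch of the coordinate mapping behind a bimanual magnifying lens:
# one hand defines a magnified region of the scene; touches in the
# enlarged view map back to precise scene coordinates.

def lens_to_scene(touch, lens_origin, scene_origin, zoom):
    """Map a touch (x, y) inside the magnified view to scene coordinates."""
    tx, ty = touch
    lx, ly = lens_origin        # top-left of the enlarged view on screen
    sx, sy = scene_origin       # top-left of the magnified scene region
    return (sx + (tx - lx) / zoom, sy + (ty - ly) / zoom)

# A touch 40 px into a 4x lens over the scene region starting at (100, 100)
# resolves to only a 10 px offset in scene space:
print(lens_to_scene((340, 240), (300, 200), (100, 100), 4.0))  # (110.0, 110.0)
```

At 4x zoom, a 4-pixel finger wobble on screen moves the scene-space point by just one pixel, which is where the reported selection-precision advantage plausibly comes from.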
Sunday, September 25, 2011
Medusa: A Proximity-Aware Multi-touch Tabletop
We present Medusa, a proximity-aware multi-touch tabletop. Medusa uses 138 inexpensive proximity sensors to: detect a user's presence and location, determine body and arm locations, distinguish between the right and left arms, and map touch points to specific users and specific hands. Our tracking algorithms and hardware designs are described. Exploring this unique design, we develop and report on a collection of interactions enabled by Medusa in support of multi-user collaborative design, specifically within the context of Proxi-Sketch, a multi-user UI prototyping tool. We discuss design issues, system implementation, limitations, and generalizable concepts throughout the paper.
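The touch-to-user mapping step can be pictured as a nearest-arm assignment: given arm positions recovered from the proximity sensors, each touch is attributed to the closest tracked arm. A toy sketch of that idea (the layout, data, and simple nearest-arm rule are my assumptions, not the paper's actual algorithm):

```python
import math

# Toy sketch of mapping a touch point to a (user, hand) pair by finding
# the nearest tracked arm position. Arm coordinates here are invented;
# in Medusa they would come from the ring of proximity sensors.

arms = {
    ("alice", "left"):  (10.0, 40.0),
    ("alice", "right"): (30.0, 42.0),
    ("bob",   "right"): (80.0, 12.0),
}

def owner_of(touch):
    """Return the (user, hand) whose arm is closest to the touch point."""
    tx, ty = touch
    return min(arms, key=lambda a: math.hypot(arms[a][0] - tx, arms[a][1] - ty))

print(owner_of((78.0, 15.0)))  # ('bob', 'right')
```

Attributing touches to individual users is what enables the per-user interactions the paper describes, such as personal tool palettes in a shared prototyping session.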
I really like this kind of stuff!
Saturday, September 24, 2011
Saturday, September 3, 2011
Friday, September 2, 2011
Wednesday, June 29, 2011
Friday, February 25, 2011
Tuesday, February 22, 2011
Thursday, February 17, 2011
Wednesday, February 16, 2011
Tuesday, February 15, 2011
Monday, February 14, 2011
Friday, February 11, 2011
Thursday, February 10, 2011
Sunday, February 6, 2011
NoteSlate is a low-cost tablet device with a true one-colour display, a real paper-look design, a long-life battery (180 h!), and a very handy, simple interface for pen and paper. This easy, compact and portable gadget can be used anywhere you want to make notes, drafts, sketches, or capture ideas for future reference. Paper for everyone! Write a note and check it later, save it, or delete it. Maybe send it afterwards. Just one colour is enough to express the basics. Keep your life simple.
Tuesday, January 25, 2011
Sunday, January 23, 2011
Wednesday, January 19, 2011
Dan Saffer has made some nice illustrations of "activity zones" on touchscreen phones and tablets, based on which areas are easiest to reach given normal ways of holding the devices. Put frequent actions in the "easy" zones and infrequent or dangerous ones in the "reach" zones.
See his post for the rest of the drawings: Activity Zones for Touchscreen Tablets and Phones (Kicker Studio).
For some research that validates this idea of zones, I'd recommend the publications from Amy Karlson of Microsoft Research. She has done a lot of research on one-handed thumb use of mobile phones in particular.
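If you wanted to sanity-check a layout against this guideline programmatically, a tiny helper could classify where each control's tap target lands. The zone rectangles below are invented for illustration; Saffer's drawings define the actual regions:

```python
# Hypothetical sketch of the "activity zones" idea: classify where a tap
# target sits on a phone screen so frequent actions can be checked
# against the easy-to-reach region. Zone rectangles are made up here.

ZONES = {
    # (x0, y0, x1, y1) in a 320x480 portrait space, origin at top-left
    "easy":  (0, 240, 320, 480),   # lower half: near the thumb's resting arc
    "reach": (0, 0, 320, 240),     # upper half: requires stretching
}

def zone_of(x, y):
    """Return the first zone whose rectangle contains the point."""
    for name, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return "offscreen"

print(zone_of(160, 400))  # "easy": a good spot for a frequent action
print(zone_of(160, 50))   # "reach": better for rare or risky actions
```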
Thursday, January 6, 2011
With the latest technologies opening new horizons in the consumer electronic market, futuristic gadgets promise better looks and user interfaces to enhance the experience of the user.
Conceived by American designer Andrew Namminga, based in Orange County, California, the "Flexible Mobile" is a conceptual mobile device that uses emerging technology to be built around a soft surface, ideally a fabric and/or mesh, which would house the main components and mount the screen.
Featuring a touch-sensitive, flexible screen that allows the device to fold out and become a tablet, the new mobile device also uses the outside screen to display a graphic treatment/background picked by the user, allowing quick and easy customization and unlimited CFM options.