Thursday, March 26, 2009

DIY Capacitive Touchpad

DIY engineers have now gone one step further. MAKE magazine reports that a Russian engineer has implemented a DIY capacitive touchpad. The system was built by referencing a whitepaper from Quantum Research Group (now acquired by Atmel). For some unknown reason, I can’t open the linked PDF file, so I have put another link to the paper here. All of Quantum’s touch-sensing ICs use the company’s proprietary charge transfer sensing technique; you can easily understand its operating principles by consulting the charge transfer sensing whitepaper.
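To get a feel for the charge transfer principle, here is a minimal simulation (my own sketch, not Quantum's firmware, and the component values are illustrative assumptions): each cycle charges the electrode capacitance Cx to Vdd, then dumps that charge into a much larger sampling capacitor Cs. The number of cycles needed to push Cs past a threshold falls when a finger adds a few picofarads to Cx.

```python
# Toy simulation of charge-transfer capacitive sensing (a sketch,
# not any vendor's actual firmware; Cx, Cs, Vdd, Vth are assumed values).

def transfer_cycles(cx_pf, cs_nf=10.0, vdd=3.3, vth=2.0):
    """Count charge-transfer cycles until the sampling cap reaches vth."""
    cx = cx_pf * 1e-12
    cs = cs_nf * 1e-9
    vs = 0.0          # voltage on the sampling capacitor
    cycles = 0
    while vs < vth:
        # Charge sharing: Cx charged to Vdd dumps into Cs sitting at vs.
        vs = (cs * vs + cx * vdd) / (cs + cx)
        cycles += 1
    return cycles

baseline = transfer_cycles(cx_pf=10.0)          # untouched electrode
touched  = transfer_cycles(cx_pf=10.0 + 8.0)    # finger adds ~8 pF

# A touch is flagged when the count drops well below the baseline.
is_touch = touched < 0.8 * baseline
print(baseline, touched, is_touch)
```

Real controllers compare the count against a slowly drifting baseline rather than a fixed one, to tolerate temperature and humidity changes.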

Saturday, March 21, 2009

Concept Designs

PMP With Gasp! No Touchscreen


Phone with Optimus LED keyboards


Elan demonstrates Smart Remote Controller, the touchpad pair desperately seeking a home


Touch Screen Programming & Development

IdentityMine Introduces the IdentityMine Gesture Engine to Support Advanced Multi-Touch Development

Published Thursday, March 19, 2009 5:10 AM

IdentityMine announces the IdentityMine Gesture Engine, a new solution accelerator to assist clients looking to develop multi-touch applications for Windows 7.

LAS VEGAS, NV (March 19, 2009) - IdentityMine, an expert user-experience company, introduces the IdentityMine Gesture Engine to bring expanded Multi-Touch capabilities to applications for Windows 7.

With the new range of hardware and software available, it is now possible to create interfaces that respond directly to touch. Specifically these new paradigms allow more than one touch to be processed at the same time – allowing for more natural navigation and commands. Examples of these emerging technologies include Windows 7 and Microsoft Surface.

The IdentityMine Gesture Engine implements a strategy where gestures can be placed at any point of the user interface, and work seamlessly together to interpret the intended navigation or command. The technology created to implement this strategy is known as multi-level gesturing.  By supporting multi-level cooperative gesturing, designers and developers are able to implement grouping of gestures to accomplish a very natural user experience.

In addition to multi-level gesturing, the IdentityMine Gesture Engine supports an expanded set of gestures, including support for user-defined gestures.
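IdentityMine has not published the engine's internals, so purely as an illustration of what "multi-level cooperative gesturing" could mean, here is a toy dispatcher in which recognizers attached at different levels of a UI tree take turns claiming a touch event:

```python
# A toy sketch of multi-level gesture routing (my own illustration,
# not IdentityMine's engine). A touch event walks from the deepest hit
# node up to the root; the first recognizer that claims it wins, so
# gestures at different levels cooperate without conflicting.

class Node:
    def __init__(self, name, parent=None, recognizers=()):
        self.name, self.parent = name, parent
        self.recognizers = list(recognizers)  # callables: event -> gesture or None

def dispatch(node, event):
    """Bubble the event from `node` toward the root."""
    while node is not None:
        for recognize in node.recognizers:
            gesture = recognize(event)
            if gesture is not None:
                return node.name, gesture
        node = node.parent
    return None, None

# Example: a photo recognizes two-finger pinches; its container handles pans.
root  = Node("canvas", recognizers=[lambda e: "pan" if e["fingers"] == 1 else None])
photo = Node("photo", parent=root,
             recognizers=[lambda e: "pinch" if e["fingers"] == 2 else None])

print(dispatch(photo, {"fingers": 2}))  # pinch handled by the photo itself
print(dispatch(photo, {"fingers": 1}))  # single touch bubbles up to the canvas
```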

"The IdentityMine Gesture Engine is a major step forward in allowing us to push the envelope in natural user interface (NUI) development,” said Mark Brown, CEO at IdentityMine.  “We are extremely excited about the new technologies coming in Windows 7 and are ready to support clients who are gearing up for the upcoming OS launch.”

For more information on the IdentityMine Gesture Engine and their solutions and services focused on Windows 7, please visit

Struktable Multi-Touch Interface


Struktable Multitouch Installation from Gregor Hofbauer on Vimeo.

Multi-Touch Developer Journal


Richard Monson-Haefel has launched the Multi-Touch Developer Journal. He also runs the MultiTouch blog.

DIY 3D Capacitive Touch Sensing

DIY 3D Interface: Tic Tac Toe from Kyle McDonald on Vimeo.


Source: DIY 3D Controller, instructables


See Also:

The Stribe - A DIY LED touch Interface

Source: The Stribe - A DIY LED touch Interface

Touch Paper: Thracker - Using Capacitive Sensing for Gesture Recognition


R. Wimmer et al., “Thracker - Using Capacitive Sensing for Gesture Recognition,” in Proceedings of the 26th IEEE International Conference on Distributed Computing Systems Workshops (IEEE Computer Society, Washington, DC, USA, 2006). [PDF]

Touch Paper: Extending 2D object arrangement with pressure-sensitive layering cues

Common Layering Gestures


Use of a peel-back gesture to allow layering with occluded elements

Philip L. Davidson and Jefferson Y. Han, “Extending 2D object arrangement with pressure-sensitive layering cues,” in Proceedings of the 21st annual ACM symposium on User interface software and technology (Monterey, CA, USA: ACM, 2008), 87-90.

Co-creation using multitouch surfaces

The team at Future Workspaces is working on two ‘co-creation’ scenarios using multi-touch:

Mitsubishi Demos ‘3D Touch Panel’ — Tech-On!

Mitsubishi Electric Corp prototyped a capacitive touch panel that can detect the distance between a finger and the panel and demonstrated it at Interaction 2009, which took place from March 5 to 6, 2009, in Tokyo.

The prototyped "3D touch panel." A white circle is displayed on the area approached by a finger. The circle becomes larger as the finger moves closer to the panel. When a finger approaches the panel, icons pop up around it. One of the thumbnail pictures enlarges when approached by a finger.


  • Display: 5.7", 640x480 pixels
  • Touch panel: projected capacitive
  • Measurement range:
    1. contact state: 8 ~ 19 pF change
    2. proximity state: 0.3 pF change
  • Measurements: x, y, z positions and the speed of the approaching finger
  • Resolution:
    1. contact state: 0.2 mm
    2. proximity state: 10 mm (x and y), 256 steps (z axis, up to 20 mm)
  • Response time:
    1. contact state: 10 ms
    2. proximity state: 50 ms

Source: Mitsubishi Demos '3D Touch Panel', Tech-On!
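The two measurement regimes in the spec suggest a simple firmware-side classification of finger state. The thresholds below are my own illustrative values derived from the quoted deltas (8 ~ 19 pF for contact, ~0.3 pF for proximity), not Mitsubishi's:

```python
# Sketch of separating the panel's two detection regimes by the size of
# the capacitance change. Threshold values are assumptions for
# illustration, loosely based on the deltas Mitsubishi quotes.

def finger_state(delta_pf, contact_min=8.0, proximity_min=0.2):
    if delta_pf >= contact_min:
        return "contact"
    if delta_pf >= proximity_min:
        return "proximity"   # hovering finger: feeds the z-axis estimate
    return "none"

print(finger_state(12.0))  # contact
print(finger_state(0.3))   # proximity
print(finger_state(0.05))  # none
```

A real controller would also have to filter noise aggressively, since a 0.3 pF proximity signal is tiny compared with the contact signal.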

As expected, Mitsubishi is not the first to work on this kind of technology:

Two Handed FieldMouse

  • Sony CSL: SmartSkin can be considered an ancestor of the iPhone’s multi-touch screen. CSL has also presented an interaction architecture called PreSense, which shows a preview before the user executes a command.



Updated Dec. 11, 2009


US2009/0183931A1 Touch Panel Device -

Wednesday, March 18, 2009

DigiDesk: Desktop for Future Information Workers


MS Surface is undoubtedly one of the coolest gadgets Microsoft has shown off. Because Surface is so famous, people forget another promising project from Microsoft – DigiDesk. It was originally introduced at Convergence 2007. From an HCI viewpoint, both Surface and DigiDesk can be categorized as tabletop interfaces. However, their target applications are slightly different: Surface aims at the public and home entertainment markets, while DigiDesk is designed for information workers. First, watch the introductory videos of DigiDesk below:

The future beneath your fingertips

The success of the iPhone has given rise to a new grammar of touch control while the advent of multi-touch in Windows 7 will further accelerate the evolution of human computer interfaces, the South by SouthWest festival has been told.

Source: The future beneath your fingertips

Touch Paper: A Robust Blob Recognition and Tracking Method in Vision-Based Multi-touch Technique

F. Wang, X. Ren, and Z. Liu, “A Robust Blob Recognition and Tracking Method in Vision-Based Multi-touch Technique,” Parallel and Distributed Processing with Applications, 2008. ISPA'08. International Symposium on, 2008, pp. 971-974.


The fundamental technique underlying vision-based multi-touch is high-performance image processing. In this paper, two algorithms are presented to solve the problems of blob recognition and tracking. An image contour transformation algorithm is adopted to detect finger contact areas, and a Minimum Distance First (MDF) algorithm is designed to identify and track corresponding blobs across two sequential images. The results show that the performance of both algorithms fully meets the requirements of a real-time multi-touch system.
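Reading only the abstract, the MDF idea can be sketched as a greedy matcher that always pairs the globally closest blobs across two frames first. This is my reconstruction, not the authors' code, and the distance cutoff is an assumed parameter:

```python
# Sketch of a "Minimum Distance First" blob matcher: sort all
# (previous, current) blob pairs by distance and greedily accept the
# closest unmatched pairs. My reconstruction from the abstract.
import math

def mdf_match(prev, curr, max_dist=50.0):
    """Return {prev_index: curr_index} matches, closest pairs first."""
    pairs = [(math.dist(p, c), i, j)
             for i, p in enumerate(prev) for j, c in enumerate(curr)]
    pairs.sort()
    matches, used_p, used_c = {}, set(), set()
    for d, i, j in pairs:
        if d > max_dist:
            break                       # remaining pairs are even farther
        if i not in used_p and j not in used_c:
            matches[i] = j
            used_p.add(i)
            used_c.add(j)
    return matches

prev = [(10, 10), (100, 100)]
curr = [(102, 98), (12, 11)]            # same fingers, moved slightly
print(mdf_match(prev, curr))            # {0: 1, 1: 0}
```

Unmatched previous blobs correspond to lifted fingers, and unmatched current blobs to new touches.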

Tuesday, March 17, 2009

Multi-Touch Humanoid



Japanese researchers showed off a whole new HRP-4C female fashion-model robot at Fashion Week in Tokyo. It (she?) has 42 motion motors to mimic the movements and facial motions of human fashion models.



Although its walking and facial motions are still far from those of real humans, there is absolutely no question that those problems will be solved in the near future. But what about touch? Current humanoid robots have many sensors to control their walking: accelerometers, gyroscopes, cameras, encoders, and so on. Touch sensors (in robotics, "tactile" is the more appropriate term, but let me keep the term "touch" in the following) are also used, but only to deliver foot pressure to a central motion controller.

Robots should feel touch because they are designed to operate among us, which means that literally the whole surface of the robot should be covered with large, flexible (or at least curved) touch sensors. Many researchers have been working on such touch sensors. Robots with a sense of touch will interact with humans very naturally and harmlessly.


Mechanics, AI, control, and robot touch sensors have been studied for a long time. I think there is one missing link among these research areas: interpreting touch data. Touch data from robot skins will be very different from ordinary sensor data: it is distributed and large. A robot's brain has to process it very fast and control the body accordingly. Lumelsky et al. [1] pointed out exactly this:

Sensitive skin devices will include thousands and millions of elements that generate and process tremendous amounts of information, in parallel and in real time. This will hence be a new physical basis of information technology. With the eventual ubiquity of the sensing skin on various machinery, it is likely to bring the biggest leap in information technology hardware since the introduction of computers.

Because I totally agree with their viewpoint, I quote some more paragraphs from their paper:

A. Sensing and Dynamic Control

Consider our home Helper mentioned above. When its arm senses an obstacle, the control system must analyze it and modify the motion accordingly. Here, the arm dynamics, sensing resolution, and the allowed speed of motion are all tied in some relationship. For example, if the sensing is “myopic” and the arm is heavy, the Helper will move slower, “fearing” a collision, no matter how good its control system is. Since the Helper’s arm is a highly nonlinear system, realizing good real-time control is a challenging problem of control theory.

B. Need for New Control Theory

Note, for example, that the admittedly complex control system of today’s flying aircraft focuses primarily on achieving desired properties of motion at a single point of the aircraft — say, its center of gravity. Other characteristics, such as accounting for body dynamics, appear as constraints on control. However, when controlling a sensitive skin-equipped machine, the control system should be able to focus intermittently on various single and multiple points of potential collision on the machine’s body, and modify the control accordingly, all within the draconian constraints of real-time operation and changing kinematics of the body. Perhaps a better analogy is the control of a bat flying among tree branches, or attempts of reconfigurable control for the changing shape of a jet fighter in battle. These complications call for novel, exciting control theory.

C. Motion Planning Based on Sensitive Skin Data

This research is likely to make use of tools from graph theory, search algorithms, computational geometry, differential geometry, and topology. One serious issue, largely not addressed today, is the symbiosis of real-time motion planning algorithms with control of the machine’s body dynamics and with nonholonomic constraints on the machine motion. That is, based on the continuous stream of data from the sensitive skin, the planning algorithm not only has to produce collision-free motion for every point of the machine’s body, but it has to do it within the system’s dynamic constraints (masses, accelerations, etc.) and the design constraints of its actuators (e.g., an automobile cannot turn on the spot and must move along a curve).

E. Man–Machine Systems

Human and machine intelligence could be merged in real-time motion planning. Envision, for example, a pilot trying to keep his helicopter hovering low above the ground to collect samples from the rain forest without colliding with the underbrush. Besides informing the pilot of any undesirable contacts with plants below, data from the sensitive skin-covered helicopter underbody can be directly used for automatic control of the hovering height to avoid collision. Prototypes of control systems that combine human and machine intelligence in real time have already been demonstrated.

Though not directly related to touch, Zbikowski [2] makes an interesting comparison between a jet fighter and a fly.

Whereas the F-35 Joint Strike Fighter, the most advanced fighter plane in the world, takes a few measurements—airspeed, rate of climb, rotations, and so on—and then plugs them into complex equations, which it must solve in real time, the fly relies on many measurements from a variety of sensors but does relatively little computation.

The fly attains remarkable performance, yet is computationally quite simple, relying on extensive, distributed measurement of parameters of interest. From an engineering viewpoint, this opens up new possibilities in control, as well as in sensors and instrumentation.

Multi-touch is not just for touch screens…



[1] V. Lumelsky, M. Shur, and S. Wagner, “Sensitive skin,” Sensors Journal, IEEE, vol. 1, pp. 41-51, 2001.

[2] R. Zbikowski, “Fly like a fly [micro-air vehicle],” IEEE Spectrum, vol. 42, pp. 46-51, 2005.


Links on HRP-4C

Monday, March 16, 2009

Adobe and the future of multitouch

Link: Adobe and the future of multi-touch

Multitouch and Museums

Pete at NEW CURATOR posted an opinion about applying multi-touch screens to museum applications: notes on multi-touch and museums

Samsung Selects Cypress TrueTouch™ Touchscreen Solution To Power Innovative Interface on Award-Winning P3 Portable Media Player

SAN JOSE, Calif. - (Business Wire) Cypress Semiconductor Corp. (NYSE:CY) today announced that Samsung Electronics Co. Ltd. has selected Cypress’s TrueTouch™ touchscreen solution to implement the touchscreen interface in its new award-winning P3 portable media player. With its dynamic interface and innovative design, the P3 won a CES Innovations 2009 Design and Engineering Award. The TrueTouch solution, based on the PSoC® programmable system-on-chip architecture, enabled Samsung to develop customized multi-touch gestures, such as swiping a finger across the screen to switch audio tracks, or holding down a digital button to fast-forward video.

Cypress’s TrueTouch™ touchscreen solution enables OEMs to utilize a full portfolio of Cypress-developed gestures, from single-touch to multi-touch gestures, such as tapping an application to open, panning through photo albums, scrolling down an e-mail, and rotating and pinching pictures. The unique PSoC architecture allows designers to implement differentiated features and make last-minute design iterations without board changes. Additional information about the TrueTouch solution is available at

Featuring a 3-inch WQVGA TFT-LCD touchscreen, the P3 delivers widescreen video and photos at a 16:9 aspect ratio without the need for letterboxing. The P3’s EmoTure interface offers true haptic feedback, reacting to every command gesture with a variety of physical sensations for a more intimate user experience versus other media players. It also features a new “Music Hot Touch Key” that allows users to instantly access favorite music features and selections without cycling through multiple menus.

“We wanted the touchscreen on the P3 to offer a new level of responsiveness for the end-user,” said Hugh Hyung-uk Choi, senior engineer at Samsung. “Cypress’s TrueTouch touchscreen solution gave us the customization and responsiveness we needed.”

“In the P3, our TrueTouch solution helps Samsung push the envelope of touchscreen interfaces,” said Darrin Vallis, Director of the TrueTouch Touchscreen Business Unit at Cypress. “TrueTouch is gaining rapid adoption because its programmability allows designers to implement unique features and its high-precision delivers the responsiveness consumers demand.”

About TrueTouch

Cypress's TrueTouch touchscreen solution is based on projected capacitive touchscreen technology, offering numerous benefits over touchscreens based on resistive technology. These advantages include optical clarity, durability, reliability and cost-effective implementation of multi-touch features. The TrueTouch family, based on the PSoC® programmable system-on-chip architecture, is the industry's broadest touchscreen offering, including single-touch, multi-touch gesture, and multi-touch all-point solutions. Because of the flexible and programmable PSoC-based TrueTouch architecture, customers can also choose to work with a wide variety of touchscreen vendors and/or LCD module vendors to create their designs. Other touchscreen solutions are not programmable and require designers to implement a fixed solution with limited materials choices. The TrueTouch solution also enables designers to integrate additional functions such as CapSense™ touch-sensing buttons and sliders, driving LEDs, backlight control and I/O expansion. These functions, in conjunction with flexible communication options (I2C and SPI), allow for unparalleled system integration for touchscreen systems. Additional information about the TrueTouch solution is available at

About Samsung Electronics America, Inc.

Headquartered in Ridgefield Park, NJ, Samsung Electronics America, Inc. (SEA), a wholly owned subsidiary of Samsung Electronics Co., Ltd., markets a broad range of award-winning, advanced digital consumer electronics and home appliance products, including HDTVs, home theater systems, MP3 players, refrigerators and laundry machines. A recognized innovation leader in consumer electronics design and technology, Samsung is the HDTV market leader in the U.S. and is the only manufacturer that produces all four major digital television technologies. Please visit for more information.

About Cypress

Cypress delivers high-performance, mixed-signal, programmable solutions that provide customers with rapid time-to-market and exceptional system value. Cypress offerings include the PSoC® programmable system-on-chip, USB controllers, general-purpose programmable clocks and memories. Cypress also offers wired and wireless connectivity technologies ranging from its CyFi™ Low-Power RF solution, to West Bridge® and EZ-USB® FX2LP controllers that enhance connectivity and performance in multimedia handsets. Cypress serves numerous markets including consumer, computation, data communications, automotive, and industrial. Cypress trades on the NYSE under the ticker symbol CY. Visit Cypress online at

Cypress, the Cypress logo and PSoC are registered trademarks and TrueTouch and CapSense are trademarks of Cypress Semiconductor Corp. All other trademarks are property of their owners.

Cypress Public Relations
Samer Bahou, 408-544-1081

Source: The Earth Times

Surface as a business card reader

C. Davies, “Microsoft Surface interactive business card reader, plus gaming,” Slash Gear, Mar. 11, 2009.

Australia-based digital agency Amnesia have been playing with their multitouch Microsoft Surface table, and have come up with perhaps the best way to make sure people actually look through the business cards they get showered with at corporate events.  Each of the specially-designed Amnesia cards has a code which the Surface recognizes, calling up contact details, social networking feeds and more.

Amnesia Razorfish - Staff Directory on Surface from Razorfish - Emerging Experiences on Vimeo.

Immersive: a multi-touch company

Immersive is a multi-touch interface company that provides hardware and integration services to fit various customers. Also, read their story at Rapid Growth.

Immersive Multi-touch Showcase Demo from Jason Sosa on Vimeo.

MMF - Max Multitouch Framework

MMF (Max Multitouch Framework) is designed to control Max user interface directly within a multi-touch screen.

Tuesday, March 10, 2009

The surface surfaced

Tobias E. Cordsen at nesdroc posted a long review about his experience with the MS Surface.



Stantum Secures $13 Million in Series B Funding Round

Co-Led by CDC Innovation and Auriga

BORDEAUX, France, March 10, 2009 – Stantum Technologies, a pioneering developer of multi-touch sensing technologies, announced today it has secured $13 million in Series B funding.  CDC Innovation and Auriga Partners co-led the round, with XAnge Private Equity participating as historical investor (2007).

Valery Huot, managing partner of CDC Innovation, and Philippe Granger, partner at Auriga Partners, have joined Stantum’s board of directors, alongside Nicolas Rose, partner at XAnge Private Equity.

According to Stantum CEO Etienne Paillard, the new funding will be used to develop a worldwide sales and marketing organization in the U.S., Europe and Asia; increase R&D capacity for next-generation sensing technologies and new products; and establish and increase mass manufacturing capabilities through partners.

Stantum, originally called JazzMutant, was founded in 2002 with an aim to develop new human-machine interface standards for the creative industry.  In 2003, JazzMutant produced the world’s first multi-touch screen that could track an unlimited number of fingers at once.  In 2005, JazzMutant launched its first multi-touch product, the first such product on the market.  Facing ever-growing demand from various OEMs, the company began working in 2006 to make its patented multi-touch technology available to third-party integrators.  In 2007, after a round of financing led by XAnge Private Equity, JazzMutant became Stantum and officially launched its OEM activity.

“Stantum has assembled an unparalleled engineering and top management team with a proven track record of bringing leading products to market,” said Huot.  “The company has already demonstrated cutting edge multi-touch solutions that are attracting strong partner and customer interest for mobile phones, navigation devices, and other consumer electronics equipment.”

“We are very excited to be supporting Stantum and are impressed by the progress the company has made to date.  Stantum’s focused strategy and unique multi-touch technologies provide a very strong foundation for future success,” added Granger.

Vadis Ventures, a corporate finance boutique focused on the high-tech sector, advised Stantum through this round of financing.

Touch Screen for DJs

KRE8 Phone For DJs

Source: KRE8 by Jose Tomas DeLuna, Yanko Design

Monday, March 9, 2009

How capacitive multi-touch screen works

This is a one-page illustration by G. Retseck at STMicroelectronics showing how capacitive multi-touch screens work.

How capacitive multi-touch screen works

Source: Smart Phones: Touch Screens Redefine the Market, Scientific American

STMicroelectronics Unveils Microcontroller-Based Sensing Capability, Placing Touch Control within Easy Reach

STMicroelectronics has released an open-source capacitive touch sensing library for its STM8 microcontrollers. Compare it with Freescale’s proximity sensing software.

Links: STMicroelectronics Unveils Microcontroller-Based Sensing Capability, Placing Touch Control within Easy Reach; Touch Sensing - Technical Literature and Support Files, STMicroelectronics

Sunday, March 8, 2009

Multitouch 7 and 12-inch panels in 2H09 say EETI



Touch Patent US7202859: Capacitive Sensing Pattern

Popular sensing patterns for capacitive touch screens are two-layered matrix arrays on which diamond- or rectangle-shaped electrodes are printed.

As you might expect, there are other kinds of sensing electrode patterns. Here, I post an exemplary patent on touch sensing patterns. Yes, reading this kind of patent is somewhat boring compared with reading cute UI patents from companies such as Apple. However, the sensing pattern is also an important factor in designing a capacitive touch screen.

The patent US7202859 explains a sensor (or sensing pattern) as shown in the figure below. The X traces and Y traces are arranged in an intertwined pattern around each crossing.


Don A. Speck, et al. (Synaptics), Capacitive Sensing Pattern, US7202859, Apr. 10, 2007.
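Whatever the trace geometry, the readout side is broadly similar: measure the capacitance change at each X/Y crossing, then interpolate around the peak to locate the finger between electrode centers. A generic sketch (my own illustration, not from the patent; the electrode pitch is an assumed value):

```python
# Sketch of locating a finger on an X-Y trace matrix: find the crossing
# with the largest capacitance change, then take a weighted centroid over
# its 3x3 neighborhood for sub-electrode resolution.

def locate_touch(delta, pitch_mm=5.0):
    """delta[y][x]: capacitance change at each crossing; return (x_mm, y_mm)."""
    peak_y = max(range(len(delta)), key=lambda y: max(delta[y]))
    peak_x = max(range(len(delta[0])), key=lambda x: delta[peak_y][x])
    num_x = num_y = total = 0.0
    for y in range(max(0, peak_y - 1), min(len(delta), peak_y + 2)):
        for x in range(max(0, peak_x - 1), min(len(delta[0]), peak_x + 2)):
            w = delta[y][x]
            num_x += w * x
            num_y += w * y
            total += w
    return num_x / total * pitch_mm, num_y / total * pitch_mm

grid = [[0, 0, 0, 0],
        [0, 2, 6, 1],
        [0, 1, 3, 1],
        [0, 0, 0, 0]]
x_mm, y_mm = locate_touch(grid)   # finger lands between crossings, not on one
```

This is also why trace geometry matters: the interpolation only works if a finger reliably couples to several neighboring crossings at once.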


The typical construction of the sensor matrix is shown below. In this case, the X and Y traces are printed on separate layers. In other embodiments, the two trace patterns are printed on opposite faces, or on the same face, of the insulating layer.


Don A. Speck, et al. (Synaptics), Capacitive Sensing Pattern, US7202859, Apr. 10, 2007.


In order to reduce capacitive coupling between the X and Y traces, two methods are presented: 1) the traces become very thin around each crossing, or 2) either the X or Y traces are connected through via holes, as shown in Fig. 5 of the patent. The latter configuration is quite useful when the X and Y traces are disposed on the same face of the insulating layer.

Don A. Speck, et al. (Synaptics), Capacitive Sensing Pattern, US7202859, Apr. 10, 2007.


Many variations of sensing patterns are also introduced (see Fig. 6 and Fig. 7 of the patent). Possible advantages of the pattern in Fig. 6 of the patent are:

Grouping traces in this manner can allow individual traces of the group to be arbitrarily narrow relative to the size of the spiral, which may be desirable for reasons including, and not limited to: cost, ease of manufacture, availability of fabrication expertise or equipment, availability of material and components, and specific sensor design. For example, one may want to design a touch-sensor which glows, or a touch screen through which a display can be viewed. One desirable property of a grouping of thin traces is to enable the overall trace matrix to pass light around individual traces, while still allowing the group as a whole to have sufficient surface area to achieve the desired sensitivity.


Fig. 6: Don A. Speck, et al. (Synaptics), Capacitive Sensing Pattern, US7202859, Apr. 10, 2007.

Don A. Speck, et al. (Synaptics), Capacitive Sensing Pattern, US7202859, Apr. 10, 2007.


Source: Don A. Speck, et al. (Synaptics), Capacitive Sensing Pattern, US7202859, Apr. 10, 2007.

Tuesday, March 3, 2009

Two Dual-Touchscreen Laptop Designs

ASUS V12 Design
Asus Concept Laptop Boasts Dual Multitouch Screens, Ditches Keyboard

Microsoft Surface Computing Crosses the Pond

Microsoft Surface, the company's first commercial example of surface computing, has “crossed the pond” and is available in no less than 12 markets over the Atlantic. The Redmond company announced at CeBIT 2009 that Surface would be made available in the Europe, Middle East and Africa (EMEA) regions. Until the start of CeBIT 2009 only companies in the United States and Canada were able to purchase Microsoft Surface. Now the tabletop computer will be made available in Austria, Belgium, France, Germany, Ireland, Italy, Netherlands, Qatar, Spain, Sweden, the United Arab Emirates and the U.K.

Apple multi-touch patent explains Palm threats, Pocket-lint

Palm shares down 10% on the news Apple owns tech Pre uses

Ultraportable with dual multitouch AMOLED screens One Ultraportable, Many Ways to Interact, PC World

The MacBook Air, the Lenovo X301, the Asus Eee PC--dinosaurs! At least, they are compared with the ideal ultraportable laptop we have in our heads. This month we took a stab at imagining the perfect ultraportable, and here are the features we came up with. Of course, we're not all geniuses (just some of us are), so if you have some better ideas for our mashed-up machine, tell us in the comments section.

N-trig Challenges ISV to Develop New Hands-on Computing™ Applications for SID Display Week 2009, Business Wire

N-trig is seeking application developers who can creatively integrate hands-on input into new application software, utilizing full multi-touch capabilities. The new beta application must run over Windows 7. N-trig is looking for developers to be creative, push the boundaries and further help to break down the barriers between people and their computers for a true Hands-on computing™ experience.

Vision Objects and Stantum enhance user experience with a unique solution, Media Syndicate

Stantum’s technology can be used by all mobile electronic devices, from the PDA and smartphone through to MP3 players, as long as they use resistive multi-touch sensors. This exciting new technology is able to detect several finger movements simultaneously and contains a series of applications that let you maneuver several objects on the screen at the same time.
In association with this technology, Vision Objects integrates MyScript, its handwriting recognition solution. Backed up by a simple, intuitive user interface, users can write a message effortlessly with either their fingertips or a stylus and see their text instantly transcribed into digital text with excellent accuracy.
MyScript not only recognizes all handwriting styles but also intuitive gestures which allow users to write naturally, to insert spaces and line breaks, and to easily correct text by editing gestures (from simple backspace gestures to natural scratch outs).

Silicon Labs Expands MCU Portfolio with High Pin-Count, Touch-Sensing Device

C8051F700 Enables Robust, Cost-Effective Capacitive Touch Sensing

Monday, March 2, 2009

Microsoft Surface in Entertainment Area

A test of Microsoft's flashy computer-as-table prompts adoption when the technology helps the gaming company sell more drinks.

Source: Casino Puts Microsoft Surface to Work (and Play), PC World

Experimental 'Multi-touch' on nokia 5800, game implementation

Source: Technology Preview: Experimental Multitouch UI for S60 5th Ed, Symbian Freak

Sunday, March 1, 2009

Napkin PC concept and flexible touch screen display

I found a Napkin PC concept design. The article says it has a flexible multi-touch display which responds to human touch as well as a pen.


Napkin PC Concept by Avery Holleman

[Pictures from]


Almost at the same time, I came across news articles about the world’s first (?) flexible touch screen.

The concept and the supporting technology are ready; the next step is waiting for some brave person or company to turn them into a commercial product.

Flexible touchscreen debuts

[Photo from EETimes]


For more details on the flexible touch screen, consult the following videos and links.


Enterprise Touch UI

Touch is not just for fun stuff. Lachlan Cash at Microsoft has introduced an example application using touch interface technology.

Link: Using Touch in Line-of-Business Apps

Touch Enhancements in Windows 7 Release Candidate

With the Windows 7 Release Candidate about to roll out, Chaitanya Sareen on the Engineering Windows 7 blog is detailing some of the changes that have been made since the public beta. Notable among the 36 features highlighted are some enhancements for touch and multi-touch. Check them out after the jump.

Full Article:

See also: Windows 7 RC improvements for touch. Good enough or not?

Microsoft Office Labs vision 2019

Wow, the video is full of touch!!

Link: istartedsomething

Realistic Physics Engine for Multi-Touch

Multi-touch has been a hot issue in the touch UI field since the iPhone’s surprising debut at MacWorld 2007. At first, multi-touch alone was enough to differentiate electronic products. But now, ideas are coming from many people and fields to improve the usability of touch UIs, and realistic interaction seems to be one of them.


The following video is a demonstration of a physics engine combined with multi-touch display.

Via Mahmoud Thoughts
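The mechanics behind such demos are easy to sketch: each touch point pulls its grabbed object through a damped spring, so objects carry momentum and settle naturally instead of snapping to the finger. A minimal hand-rolled example (my own illustration, no real physics engine assumed; the spring and damping constants are arbitrary):

```python
# Sketch of touch-driven "realistic" dragging: a damped spring connects
# the finger to the object, integrated with semi-implicit Euler steps.

def step(pos, vel, touch, dt=1/60, k=120.0, damping=10.0, mass=1.0):
    """Advance one physics tick; `touch` is the finger position or None."""
    fx = fy = 0.0
    if touch is not None:
        fx = k * (touch[0] - pos[0])   # spring pulls object toward finger
        fy = k * (touch[1] - pos[1])
    fx -= damping * vel[0]             # damping bleeds off oscillation
    fy -= damping * vel[1]
    vel = (vel[0] + fx / mass * dt, vel[1] + fy / mass * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

pos, vel = (0.0, 0.0), (0.0, 0.0)
for _ in range(120):                   # two simulated seconds of holding
    pos, vel = step(pos, vel, touch=(100.0, 50.0))
# The object converges toward the finger rather than jumping there;
# releasing the touch (touch=None) would let it coast and slow down.
```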

A physics engine is not just for entertainment or demo applications. BumpTop is a 3D desktop utility combining pen touch with a physics engine.



The final video is of gameplay on MS Surface. I’m not sure how to categorize it: a mixed-reality natural touch user interface?

Table Toss (Microsoft Surface Game) from Razorfish - Emerging Experiences on Vimeo.

Via Point & Do

Multi-touch on resistive touchscreens - possible, at least

Steve Litchfield at Telephone Issues has posted some opinions on the need for resistive touch-screen technology to build a more efficient virtual full-screen keyboard.

In addition, I found the following video at the same post. The video is a more detailed view of the Stantum multi-touch screen.

