DIY engineers now go one step further. MAKE magazine reports that a Russian engineer has built a DIY capacitive touch pad by referencing a whitepaper from Quantum Research Group (now acquired by Atmel). For some unknown reason, I can’t open the linked PDF file, so I have put another link to the paper here. All of Quantum’s touch sensing ICs use the company’s proprietary charge transfer sensing technique; you can easily understand its operating principles by consulting the charge transfer sensing whitepaper.
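The principle is easy to see in a toy simulation (all component values below are my own hypothetical choices, not taken from the whitepaper): the unknown sense capacitance Cx is charged to the supply, and its charge is repeatedly dumped into a much larger sampling capacitor Cs. The number of transfer cycles needed to raise Cs past a threshold shrinks when a finger increases Cx.

```python
# Toy simulation of charge-transfer capacitive sensing (values are hypothetical).
# Each cycle: charge Cx to VDD, then share its charge with the sampling cap Cs.
# The cycle count needed to bring Cs up to VTH falls when a finger raises Cx.

VDD, VTH = 3.3, 2.0          # supply and comparator threshold, volts
CS = 10e-9                   # 10 nF sampling capacitor

def cycles_to_threshold(cx):
    vs = 0.0                 # voltage on the sampling capacitor
    cycles = 0
    while vs < VTH:
        # charge sharing: Cx at VDD connected in parallel with Cs at vs
        vs = (cx * VDD + CS * vs) / (cx + CS)
        cycles += 1
    return cycles

no_touch = cycles_to_threshold(10e-12)        # 10 pF electrode alone
touch = cycles_to_threshold(10e-12 + 2e-12)   # finger adds ~2 pF
print(no_touch, touch)  # touch produces a noticeably smaller count
```

A real controller compares the count against a calibrated baseline; here the drop of roughly 15% between the two counts is what a detection threshold would latch onto.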
Thursday, March 26, 2009
Saturday, March 21, 2009
Published Thursday, March 19, 2009 5:10 AM
IdentityMine announces the IdentityMine Gesture Engine, a new solution accelerator to assist clients looking to develop multi-touch applications for Windows 7.
LAS VEGAS, NV (March 19, 2009) - IdentityMine, an expert user-experience company, introduces the IdentityMine Gesture Engine to bring expanded Multi-Touch capabilities to applications for Windows 7.
With the new range of hardware and software available, it is now possible to create interfaces that respond directly to touch. Specifically these new paradigms allow more than one touch to be processed at the same time – allowing for more natural navigation and commands. Examples of these emerging technologies include Windows 7 and Microsoft Surface.
The IdentityMine Gesture Engine implements a strategy where gestures can be placed at any point of the user interface, and work seamlessly together to interpret the intended navigation or command. The technology created to implement this strategy is known as multi-level gesturing. By supporting multi-level cooperative gesturing, designers and developers are able to implement grouping of gestures to accomplish a very natural user experience.
In addition to multi-level gesturing, the IdentityMine Gesture Engine supports an expanded set of gestures, including the ability to define user-defined gestures.
"The IdentityMine Gesture Engine is a major step forward in allowing us to push the envelope in natural user interface (NUI) development,” said Mark Brown, CEO at IdentityMine. “We are extremely excited about the new technologies coming in Windows 7 and are ready to support clients who are gearing up for the upcoming OS launch.”
For more information on the IdentityMine Gesture Engine and their solutions and services focused on Windows 7 please visit www.identitymine.com/windows7.
Source: DIY 3D Controller, instructables
R. Wimmer et al., “Thracker-using capacitive sensing for gesture recognition,” in Proceedings of the 26th IEEE International ConferenceWorkshops on Distributed Computing Systems, vol. 64 (IEEE Computer Society Washington, DC, USA, 2006). [PDF]
Philip L. Davidson and Jefferson Y. Han, “Extending 2D object arrangement with pressure-sensitive layering cues,” in Proceedings of the 21st annual ACM symposium on User interface software and technology (Monterey, CA, USA: ACM, 2008), 87-90.
Mitsubishi Electric Corp prototyped a capacitive touch panel that can detect the distance between a finger and the panel and demonstrated it at Interaction 2009, which took place from March 5 to 6, 2009, in Tokyo.
- Display: 5.7", 640x480 pixels
- Touch panel: projected capacitive touch panel
- Measurement range:
- contact state: 8 ~ 19 pF change
- proximity state: 0.3 pF change
- Measurements: x, y, z positions and the speed of the approaching finger
- Resolution:
- contact state: 0.2 mm
- proximity state: 10 mm (x and y), 256 stages (z axis, up to 20 mm)
- Response time:
- contact state: 10 ms
- proximity state: 50 ms
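Out of curiosity, here is a toy quantizer for a 256-stage z output like the one above. The inverse-distance coupling model and the constant K are my own assumptions; Mitsubishi’s actual calibration is not disclosed.

```python
# Hypothetical quantizer for a 256-stage proximity (z) output up to 20 mm.
# Assumes a simple inverse model C = K / d for finger-to-panel coupling;
# the real panel's calibration is not public, so K and the model are made up.

K = 6.0          # pF*mm, assumed coupling constant (C = 0.3 pF at d = 20 mm)
Z_MAX = 20.0     # mm, maximum sensed distance
STAGES = 256

def z_stage(c_pf):
    """Map a measured capacitance change (pF) to a 0..255 proximity stage."""
    d = min(K / max(c_pf, K / Z_MAX), Z_MAX)   # clamp to the sensing range
    return round((Z_MAX - d) / Z_MAX * (STAGES - 1))

print(z_stage(0.3))   # finger at ~20 mm -> stage 0
print(z_stage(6.0))   # finger at ~1 mm  -> near the top stage
```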
Source: Mitsubishi Demos '3D Touch Panel', Tech-On!
As expected, Mitsubishi is not the first with this kind of technology:
- EtherTouch: the company offers the AD7103 3D touch sensing IC (they appear to have nearly ceased operations)
- MIT Media Lab: they have made several interesting 3D input devices using electric field imaging.
- Sony CSL: SmartSkin can be considered an ancestor of the iPhone’s multi-touch screen. CSL has also presented an interaction architecture called PreSense, which shows a preview before a user executes a command.
Updated Dec. 11, 2009
US2009/0183931A1 Touch Panel Device -
Wednesday, March 18, 2009
MS Surface is undoubtedly one of the coolest gadgets Microsoft has shown off. Because Surface is so famous, people forget another promising project from Microsoft – DigiDesk. It was originally introduced at Convergence 2007. From an HCI viewpoint, both Surface and DigiDesk can be categorized as tabletop interfaces. However, their applications are slightly different: Surface aims at public and home entertainment markets, while DigiDesk is designed for information workers. Let’s first see the introductory videos of DigiDesk below:
The success of the iPhone has given rise to a new grammar of touch control, while the advent of multi-touch in Windows 7 will further accelerate the evolution of human-computer interfaces, the South by Southwest festival has been told.
F. Wang, X. Ren, and Z. Liu, “A Robust Blob Recognition and Tracking Method in Vision-Based Multi-touch Technique,” Parallel and Distributed Processing with Applications, 2008. ISPA'08. International Symposium on, 2008, pp. 971-974.
The foundation of vision-based multi-touch is high-performance image processing. In this paper, two algorithms are presented to solve the problems of blob recognition and tracking. An image contour transformation algorithm is adopted to detect finger contact areas, and a Minimum Distance First (MDF) algorithm is designed to identify and track corresponding blobs across two sequential images. The results show that the performance of the two algorithms fully meets the requirements of a real-time multi-touch system.
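The Minimum Distance First idea can be sketched as a greedy matcher. This is my own simplified reconstruction of the approach, not the authors’ code: compute all pairwise centroid distances between two consecutive frames, then claim the closest unmatched pairs first.

```python
# Sketch of a Minimum-Distance-First blob matcher between two frames.
# A simplified reconstruction of the idea, not the paper's implementation:
# greedily pair the closest (previous, current) blob centroids first.

from math import hypot

def mdf_match(prev_blobs, cur_blobs, max_dist=50.0):
    """Return {prev_index: cur_index} for blob centroids given as (x, y)."""
    pairs = sorted(
        (hypot(px - cx, py - cy), i, j)
        for i, (px, py) in enumerate(prev_blobs)
        for j, (cx, cy) in enumerate(cur_blobs)
    )
    matched, used_prev, used_cur = {}, set(), set()
    for dist, i, j in pairs:
        if dist > max_dist:
            break                      # remaining pairs are too far apart
        if i not in used_prev and j not in used_cur:
            matched[i] = j             # closest unclaimed pair wins
            used_prev.add(i)
            used_cur.add(j)
    return matched

prev = [(10, 10), (100, 100)]
cur = [(98, 103), (12, 11)]            # fingers moved slightly
print(mdf_match(prev, cur))            # {0: 1, 1: 0}
```

Blobs left unmatched would be treated as finger lift-offs or new touch-downs; `max_dist` is an assumed cap tied to how far a finger can plausibly move in one frame.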
Tuesday, March 17, 2009
Japanese researchers showed off a whole new HRP-4C female fashion model robot at Fashion Week in Tokyo. It (she?) has 42 motion motors to mimic the movements and facial motions of human fashion models.
Although its walking and facial motions are still far from those of real humans, there's absolutely no question that those problems will be solved in the near future. But what about touch? Current humanoid robots have many sensors to control their walking: accelerometers, gyroscopes, cameras, encoders, and so on. Touch sensors (in robotics, "tactile" is the more appropriate term, but let me keep "touch" in the following) are also used, but they merely deliver foot pressure to a central motion controller.
Robots should feel touch because they are designed to operate among us. That means literally the whole surface of a robot should be covered with large, flexible (or at least curved) touch sensors. Many researchers have been working on making those kinds of touch sensors. Robots with touch will interact with humans very naturally and harmlessly.
Mechanics, AI, control, and robot touch sensors have been studied for a long time, but I think there is one missing link among these research areas: interpreting touch data. Touch data from robot skins will be very different from ordinary sensor data: it is distributed and large. The robot's brain has to process it very fast and control the body accordingly. Lumelsky et al. pointed out exactly this:
Sensitive skin devices will include thousands and millions of elements that generate and process tremendous amounts of information, in parallel and in real time. This will hence be a new physical basis of information technology. With the eventual ubiquity of the sensing skin on various machinery, it is likely to bring the biggest leap in information technology hardware since the introduction of computers.
Because I totally agree with their viewpoints, I quote some more paragraphs from their paper:
A. Sensing and Dynamic Control
Consider our home Helper mentioned above. When its arm senses an obstacle, the control system must analyze it and modify the motion accordingly. Here, the arm dynamics, sensing resolution, and the allowed speed of motion are all tied in some relationship. For example, if the sensing is “myopic” and the arm is heavy, the Helper will move slower, “fearing” a collision, no matter how good its control system is. Since the Helper’s arm is a highly nonlinear system, realizing good real-time control is a challenging problem of control theory.
B. Need for New Control Theory
Note, for example, that the admittedly complex control system of today’s flying aircraft focuses primarily on achieving desired properties of motion at a single point of the aircraft — say, its center of gravity. Other characteristics, such as accounting for body dynamics, appear as constraints on control. However, when controlling a sensitive skin-equipped machine, the control system should be able to focus intermittently on various single and multiple points of potential collision on the machine’s body, and modify the control accordingly, all within the draconian constraints of real-time operation and changing kinematics of the body. Perhaps a better analogy is the control of a bat flying among tree branches, or attempts of reconfigurable control for the changing shape of a jet fighter in battle. These complications call for novel, exciting control theory.
C. Motion Planning Based on Sensitive Skin Data
This research is likely to make use of tools from graph theory, search algorithms, computational geometry, differential geometry, and topology. One serious issue, largely not addressed today, is the symbiosis of real-time motion planning algorithms with control of the machine’s body dynamics and with nonholonomic constraints on the machine motion. That is, based on the continuous stream of data from the sensitive skin, the planning algorithm not only has to produce collision-free motion for every point of the machine’s body, but it has to do it within the system’s dynamic constraints (masses, accelerations, etc.) and the design constraints of its actuators (e.g., an automobile cannot turn on the spot and must move along a curve).
E. Man–Machine Systems
Human and machine intelligence could be merged in real-time motion planning. Envision, for example, a pilot trying to keep his helicopter hovering low above the ground to collect samples from the rain forest without colliding with the underbrush. Besides informing the pilot of any undesirable contacts with plants below, data from the sensitive skin-covered helicopter underbody can be directly used for automatic control of the hovering height to avoid collision. Prototypes of control systems that combine human and machine intelligence in real time have already been demonstrated.
Though not directly related to touch, Zbikowski made an interesting comparison between a jet fighter and a fly.
Whereas the F-35 Joint Strike Fighter, the most advanced fighter plane in the world, takes a few measurements—airspeed, rate of climb, rotations, and so on—and then plugs them into complex equations, which it must solve in real time, the fly relies on many measurements from a variety of sensors but does relatively little computation.
The fly attains remarkable performance, yet is computationally quite simple, relying on extensive, distributed measurement of parameters of interest. From an engineering viewpoint, this opens up new possibilities in control, as well as in sensors and instrumentation.
Multi-touch is not just for touch screens…
V. Lumelsky, M. Shur, and S. Wagner, “Sensitive skin,” IEEE Sensors Journal, vol. 1, pp. 41-51, 2001.
R. Zbikowski, “Fly like a fly [micro-air vehicle],” IEEE Spectrum, vol. 42, pp. 46-51, 2005.
Links on HRP-4C
- HRP-4C fashion model robot, Boing Boing
- HRP-4C Female Fashion Robot, Aerospace and Paranormal and UFOs News
- HRP-4C Robot Woman Is A Cybernetic Humanoid, Technovelgy.com
- HRP-4C robot ready to put human fashion models out of work, DVICE
Monday, March 16, 2009
Samsung Selects Cypress TrueTouch™ Touchscreen Solution To Power Innovative Interface on Award-Winning P3 Portable Media Player
SAN JOSE, Calif. - (Business Wire) Cypress Semiconductor Corp. (NYSE:CY) today announced that Samsung Electronics Co. Ltd. has selected Cypress’s TrueTouch™ touchscreen solution to implement the touchscreen interface in its new award-winning P3 portable media player. With its dynamic interface and innovative design, the P3 won a CES Innovations 2009 Design and Engineering Award. The TrueTouch solution, based on the PSoC® programmable system-on-chip architecture, enabled Samsung to develop customized multi-touch gestures, such as swiping a finger across the screen to switch audio tracks, or holding down a digital button to fast-forward video.
Cypress’s TrueTouch™ touchscreen solution enables OEMs to utilize a full portfolio of Cypress-developed gestures, from single-touch to multi-touch gestures, such as tapping an application to open, panning through photo albums, scrolling down an e-mail, and rotating and pinching pictures. The unique PSoC architecture allows designers to implement differentiated features and make last-minute design iterations without board changes. Additional information about the TrueTouch solution is available at www.cypress.com/go/pr/TrueTouch.
Featuring a 3-inch WQVGA TFT-LCD touchscreen, the P3 delivers widescreen video and photos at a 16:9 aspect ratio without the need for letterboxing. The P3’s EmoTure interface offers true haptic feedback, reacting to every command gesture with a variety of physical sensations for a more intimate user experience versus other media players. It also features a new “Music Hot Touch Key” that allows users to instantly access favorite music features and selections without cycling through multiple menus.
“We wanted the touchscreen on the P3 to offer a new level of responsiveness for the end-user,” said Hugh Hyung-uk Choi, senior engineer at Samsung. “Cypress’s TrueTouch touchscreen solution gave us the customization and responsiveness we needed.”
“In the P3, our TrueTouch solution helps Samsung push the envelope of touchscreen interfaces,” said Darrin Vallis, Director of the TrueTouch Touchscreen Business Unit at Cypress. “TrueTouch is gaining rapid adoption because its programmability allows designers to implement unique features and its high-precision delivers the responsiveness consumers demand.”
Cypress's TrueTouch touchscreen solution is based on projected capacitive touchscreen technology, offering numerous benefits over touchscreens based on resistive technology. These advantages include optical clarity, durability, reliability and cost-effective implementation of multi-touch features. The TrueTouch family, based on the PSoC® programmable system-on-chip architecture, is the industry's broadest touchscreen offering, including single-touch, multi-touch gesture, and multi-touch all-point solutions. Because of the flexible and programmable PSoC-based TrueTouch architecture, customers can also choose to work with a wide variety of touchscreen vendors and/or LCD module vendors to create their designs. Other touchscreen solutions are not programmable and require designers to implement a fixed solution with limited materials choices. The TrueTouch solution also enables designers to integrate additional functions such as CapSense™ touch-sensing buttons and sliders, driving LEDs, backlight control and I/O expansion. These functions, in conjunction with flexible communication options (I2C and SPI), allow for unparalleled system integration for touchscreen systems. Additional information about the TrueTouch solution is available at www.cypress.com/go/pr/TrueTouch.
About Samsung Electronics America, Inc.
Headquartered in Ridgefield Park, NJ, Samsung Electronics America, Inc. (SEA), a wholly owned subsidiary of Samsung Electronics Co., Ltd., markets a broad range of award-winning, advanced digital consumer electronics and home appliance products, including HDTVs, home theater systems, MP3 players, refrigerators and laundry machines. A recognized innovation leader in consumer electronics design and technology, Samsung is the HDTV market leader in the U.S. and is the only manufacturer that produces all four major digital television technologies. Please visit www.samsung.com for more information.
Cypress delivers high-performance, mixed-signal, programmable solutions that provide customers with rapid time-to-market and exceptional system value. Cypress offerings include the PSoC® programmable system-on-chip, USB controllers, general-purpose programmable clocks and memories. Cypress also offers wired and wireless connectivity technologies ranging from its CyFi™ Low-Power RF solution, to West Bridge® and EZ-USB® FX2LP controllers that enhance connectivity and performance in multimedia handsets. Cypress serves numerous markets including consumer, computation, data communications, automotive, and industrial. Cypress trades on the NYSE under the ticker symbol CY. Visit Cypress online at www.cypress.com.
Cypress, the Cypress logo and PSoC are registered trademarks and TrueTouch and CapSense are trademarks of Cypress Semiconductor Corp. All other trademarks are property of their owners.
Cypress Public Relations
Samer Bahou, 408-544-1081
Source: The Earth Times
C. Davies, “Microsoft Surface interactive business card reader, plus gaming,” Slash Gear, Mar. 11, 2009.
Australia-based digital agency Amnesia have been playing with their multitouch Microsoft Surface table, and have come up with perhaps the best way to make sure people actually look through the business cards they get showered with at corporate events. Each of the specially designed Amnesia cards has a code which the Surface recognizes, calling up contact details, social networking feeds and more.
Amnesia Razorfish - Staff Directory on Surface from Razorfish - Emerging Experiences on Vimeo.
Tuesday, March 10, 2009
Co-Led by CDC Innovation and Auriga
BORDEAUX, France, March 10, 2009 – Stantum Technologies (www.stantum.com), a pioneer developer of multi-touch sensing technologies, announced today it has secured $13 million in Series B funding. CDC Innovation and Auriga Partners co-led the round with XAnge Private Equity as historical investor (2007).
Valery Huot, managing partner of CDC Innovation, and Philippe Granger, partner at Auriga Partners, have joined Stantum’s board of directors, alongside Nicolas Rose, partner at XAnge Private Equity.
According to Stantum CEO Etienne Paillard, the new funding will be used to develop a worldwide sales and marketing organization in the U.S., Europe and Asia; increase R&D capacity for next-generation sensing technologies and new products; and establish and increase mass manufacturing capabilities through partners.
Stantum, originally called JazzMutant, was founded in 2002 with an aim to develop new human-machine interface standards for the creative industry. In 2003, JazzMutant produced the world’s first multi-touch screen that could track an unlimited number of fingers at once. In 2005, JazzMutant launched its first multi-touch product, the first of its kind on the market. Facing ever-growing demand from various OEMs, the company began working in 2006 to make its patented multi-touch technology available to third-party integrators. In 2007, after a round of financing led by XAnge Private Equity, JazzMutant became Stantum and officially launched its OEM activity.
“Stantum has assembled an unparalleled engineering and top management team with a proven track record of bringing leading products to market,” said Huot. “The company has already demonstrated cutting edge multi-touch solutions that are attracting strong partner and customer interest for mobile phones, navigation devices, and other consumer electronics equipment.”
“We are very excited to be supporting Stantum and are impressed by the progress the company has made to date. Stantum’s focused strategy and unique multi-touch technologies provide a very strong foundation for future success,” added Granger.
Vadis Ventures, a corporate finance boutique focused on the high-tech sector, advised Stantum through this round of financing.
Monday, March 9, 2009
This is a one-page illustration by G. Retseck at STMicroelectronics showing how capacitive multi-touch screens work.
Source: Smart Phones: Touch Screens Redefine the Market, Scientific American
STMicroelectronics Unveils Microcontroller-Based Sensing Capability, Placing Touch Control within Easy Reach
STMicroelectronics releases an open-source capacitive touch sensing library for its STM8 microcontrollers. Compare it with Freescale’s proximity sensing software.
Sunday, March 8, 2009
As expected, there is another kind of sensing electrode pattern. Here, I am posting an exemplary patent on touch sensing patterns. Yes, reading this kind of patent is somewhat boring compared with reading the cute UI patents from companies such as Apple. However, the sensing pattern is also an important factor in designing a capacitive touch screen.
Patent US7202859 describes a sensor (or sensing pattern) as shown in the figure below. The X and Y traces are arranged in an intertwined pattern around each crossing.
The typical construction of the sensor matrix is shown below. In this case, the X and Y traces are printed on separate layers. In other embodiments, the two trace patterns are printed on opposite faces, or on the same face, of the insulating layer.
In order to reduce capacitive coupling between the X and Y traces, two methods are presented: 1) the traces become very thin around each crossing, or 2) either the X or the Y traces are connected through via holes, as shown in Fig. 5 of the patent. This configuration is quite useful when the X and Y traces are disposed on the same face of the insulating layer.
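A trace matrix like this is typically scanned one X/Y crossing at a time. The sketch below is a generic illustration with made-up readings and thresholds, not something taken from the patent: a finger near a crossing shunts field away and reduces the measured X-to-Y coupling there.

```python
# Toy scan of an X/Y capacitive trace matrix (readings are hypothetical).
# A finger near a crossing shunts some field away, reducing the measured
# X-to-Y coupling there; flag crossings whose coupling drops past a threshold.

BASELINE = 100       # arbitrary counts with no touch present
THRESHOLD = 15       # minimum drop in counts to register a touch

def find_touches(readings):
    """readings[y][x] -> coupling counts; return list of touched (x, y)."""
    return [
        (x, y)
        for y, row in enumerate(readings)
        for x, value in enumerate(row)
        if BASELINE - value > THRESHOLD
    ]

# 3x4 matrix with a simulated finger near crossing (2, 1)
scan = [
    [100,  99, 100, 100],
    [ 99,  95,  70,  98],
    [100, 100,  96, 100],
]
print(find_touches(scan))   # [(2, 1)]
```

A real controller would also track a drifting baseline and interpolate between neighboring crossings for sub-pitch position, but the per-crossing scan is the core of it.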
Many variations of sensing patterns are also introduced (see Fig. 6 and Fig. 7 of the patent). Possible advantages of the pattern in Fig. 6 are:
Grouping traces in this manner can allow individual traces of the group to be arbitrarily narrow relative to the size of the spiral, which may be desirable for reasons including, and not limited to: cost, ease of manufacture, availability of fabrication expertise or equipment, availability of material and components, and specific sensor design. For example, one may want to design a touch-sensor which glows, or a touch screen through which a display can be viewed. One desirable property of a grouping of thin traces is to enable the overall trace matrix to pass light around individual traces, while still allowing the group as a whole to have sufficient surface area to achieve the desired sensitivity.
Source: Don A. Speck, et al. (Synaptics), Capacitive Sensing Pattern, US7202859, Apr. 10, 2007.
Tuesday, March 3, 2009
Microsoft Surface, the company's first commercial example of surface computing, has “crossed the pond” and is available in no less than 12 markets over the Atlantic. The Redmond company announced at CeBIT 2009 that Surface would be made available in the Europe, Middle East and Africa (EMEA) regions. Until the start of CeBIT 2009 only companies in the United States and Canada were able to purchase Microsoft Surface. Now the tabletop computer will be made available in Austria, Belgium, France, Germany, Ireland, Italy, Netherlands, Qatar, Spain, Sweden, the United Arab Emirates and the U.K.
Apple multi-touch patent explains Palm threats, Pocket-lint
Palm shares down 10% on the news Apple owns tech Pre uses
One Ultraportable, Many Ways to Interact, PC World
The MacBook Air, the Lenovo X301, the Asus Eee PC--dinosaurs! At least, they are compared with the ideal ultraportable laptop we have in our heads. This month we took a stab at imagining the perfect ultraportable, and here are the features we came up with. Of course, we're not all geniuses (just some of us are), so if you have some better ideas for our mashed-up machine, tell us in the comments section.
N-trig Challenges ISV to Develop New Hands-on Computing™ Applications for SID Display Week 2009, Business Wire
N-trig is seeking application developers who can creatively integrate hands-on input into new application software, utilizing full multi-touch capabilities. The new beta application must run over Windows 7. N-trig is looking for developers to be creative, push the boundaries and further help to break down the barriers between people and their computers for a true Hands-on computing™ experience.
Stantum’s technology can be used by all mobile electronic devices, from PDAs and smartphones through to MP3 players, as long as they use resistive multi-touch sensors. This exciting new technology is able to detect several finger movements simultaneously and comes with a series of applications that let you maneuver several objects on the screen at the same time.
In association with this technology, Vision Objects integrates MyScript, its handwriting recognition solution. Backed by a simple, intuitive user interface, MyScript lets users write a message effortlessly with either their fingertips or a stylus and see their text instantly transcribed into digital text with excellent accuracy.
MyScript not only recognizes all handwriting styles but also intuitive gestures which allow users to write naturally, to insert spaces and line breaks, and to easily correct text by editing gestures (from simple backspace gestures to natural scratch outs).
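As an illustration of how an editing gesture might be spotted (this is a toy heuristic of my own, not MyScript’s algorithm): a scratch-out stroke zigzags back and forth, so counting horizontal direction reversals along the stroke is enough to separate it from ordinary writing in this simple case.

```python
# Toy scratch-out detector (not MyScript's algorithm; thresholds assumed).
# A scratch-out stroke zigzags, so count horizontal direction reversals.

def is_scratch_out(points, min_reversals=3):
    """points: list of (x, y) samples along one pen/finger stroke."""
    dxs = [b[0] - a[0] for a, b in zip(points, points[1:]) if b[0] != a[0]]
    reversals = sum(1 for a, b in zip(dxs, dxs[1:]) if a * b < 0)
    return reversals >= min_reversals

zigzag = [(0, 0), (30, 2), (2, 4), (28, 6), (4, 8), (26, 10)]
line = [(0, 0), (10, 0), (20, 1), (30, 1)]
print(is_scratch_out(zigzag), is_scratch_out(line))  # True False
```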
C8051F700 Enables Robust, Cost-Effective Capacitive Touch Sensing
Monday, March 2, 2009
A test of Microsoft's flashy computer-as-table prompts adoption when the technology helps the gaming company sell more drinks.
Source: Casino Puts Microsoft Surface to Work (and Play), PC World
Sunday, March 1, 2009
[Pictures from TheNark.com]
Almost at the same time, I came across news articles about the world’s first (?) flexible touch screen.
The concept and supporting technology are ready. The next step may be waiting for some brave person or company to make a commercial product.
[Photo from EETimes]
For more details on the flexible touch screen, consult the following videos and links.
With the Windows 7 Release Candidate about to roll out, Chaitanya Sareen is detailing on the Engineering Windows 7 Blog some of the changes that have been made since the public beta. Notable among the 36 features highlighted are some enhancements for touch and multi-touch. Check them out after the jump.
Full Article: GottaBeMobile.com
Multi-touch has been a hot issue in the touch UI field since the iPhone’s surprising debut at MacWorld 2007. At first, multi-touch alone was enough to differentiate electronic products. But now, ideas are coming from various people and fields to improve the usability of touch UIs, and realistic interaction seems to be one of them.
The following video is a demonstration of a physics engine combined with a multi-touch display.
Via Mahmoud Thoughts
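A common way to wire touch input into a physics engine is to pull a grabbed object toward the finger with a damped spring rather than setting its position directly, so momentum and collisions stay believable. A minimal one-dimensional sketch, with made-up constants:

```python
# Minimal sketch of touch-driven physics: a touch point drags an object
# through a damped spring instead of teleporting it. Constants are made up.

K = 30.0      # spring stiffness
DAMP = 8.0    # velocity damping
DT = 1 / 60   # 60 Hz simulation step
MASS = 1.0

def step(pos, vel, touch):
    """Advance one frame; `touch` is the finger position pulling the object."""
    force = K * (touch - pos) - DAMP * vel   # spring toward finger + damping
    vel += force / MASS * DT
    pos += vel * DT
    return pos, vel

pos, vel = 0.0, 0.0
for _ in range(300):                  # hold a finger at x = 10 for 5 seconds
    pos, vel = step(pos, vel, 10.0)
print(round(pos, 2))                  # object settles near the finger
```

Releasing the finger simply removes the spring force, so the object coasts away with whatever velocity it had, which is exactly the "throw" behavior seen in these demos.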
A physics engine is not just for entertainment or demo applications. BumpTop is a 3D desktop utility combining pen touch and a physics engine.
The final video shows game play on MS Surface. I’m not sure how to categorize it – a mixed-reality natural touch user interface?
Via Point & Do
In addition, I found the following video in the same post. It gives a more detailed view of the Stantum multi-touch screen.