Wednesday, February 25, 2009

Touch Screen and UI Patents, Feb. 25, 2009

J. Peterson (Cypress), “Method for extending the life of touch screens,” US2008/0231604A1, Sep. 25, 2008.

In an embodiment, a signature area and a virtual keypad, among other display elements, are displayed at more than one location on a touch screen display. As a result, wear may be distributed evenly across the touch screen rather than concentrated at fixed locations, increasing the touch screen's useful lifetime. In a novel embodiment, display degradation is detected from the same physical parameters conventionally used for the touch screen's touch sensing. By detecting degradation as a function of display location, display elements can be strategically placed to extend the life of the touch screen.
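
The wear-leveling idea is easy to sketch in code: keep a per-region touch count (data the controller already gathers for sensing) and draw high-use elements such as the keypad at the least-worn candidate location. The Python below is an illustrative sketch only; the class, method, and region names are mine, not the patent's.

```python
# Minimal sketch of the wear-leveling idea: track touches per screen
# region and place a high-use UI element (e.g., a virtual keypad) at
# the least-worn candidate location. All names are illustrative
# assumptions, not taken from the patent.
from collections import defaultdict

class WearLeveler:
    def __init__(self, candidate_positions):
        # Candidate top-left corners where the keypad may be drawn.
        self.candidates = candidate_positions
        self.touch_counts = defaultdict(int)  # region -> accumulated touches

    def record_touch(self, region):
        # Called by the touch controller; reuses data it already gathers.
        self.touch_counts[region] += 1

    def place_keypad(self):
        # Choose the candidate region with the least accumulated wear.
        return min(self.candidates, key=lambda r: self.touch_counts[r])

leveler = WearLeveler(candidate_positions=[(0, 0), (0, 240), (160, 120)])
leveler.record_touch((0, 0))
leveler.record_touch((0, 0))
leveler.record_touch((0, 240))
print(leveler.place_keypad())  # -> (160, 120), the untouched region
```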

J. J. Troy, et al. (The Boeing Company), “Systems and Methods for Haptics-Enabled Teleoperation of Vehicles and Other Devices,” US2008/0103639A1, May 1, 2008.

Systems and methods are disclosed for haptics-enabled teleoperation of vehicles and other devices, including remotely-controlled air, water, and land-based vehicles, manufacturing robots, and other suitable teleoperable devices. In one embodiment, a system for teleoperation of a vehicle comprises a control component configured to provide position and orientation control with haptic force feedback of the vehicle based on a position measurement of the vehicle and configured to function in a closed-loop feedback manner. In a particular embodiment, the position measurement may include six degree-of-freedom position data provided by a motion capture system to the control and/or haptic I/O components of the application. The system may also use differences in position and/or velocity between the vehicle and a haptic I/O device for feedback control.
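
The position/velocity-difference coupling the abstract describes can be pictured as a virtual spring-damper between the haptic device and the vehicle. The sketch below is a one-dimensional, PD-style illustration under my own assumed gains; it is not the patent's controller, just the shape of the feedback law.

```python
# Illustrative sketch of coupling a haptic device to a vehicle through
# position and velocity differences, rendered as a spring-damper force.
# 1-D state and the gains k_p, k_d are assumptions for brevity.
def haptic_feedback_force(x_haptic, v_haptic, x_vehicle, v_vehicle,
                          k_p=50.0, k_d=5.0):
    """Force rendered on the haptic device.

    The same error terms, with opposite sign and different gains, would
    drive the vehicle's position controller, closing the feedback loop.
    """
    position_error = x_vehicle - x_haptic
    velocity_error = v_vehicle - v_haptic
    return k_p * position_error + k_d * velocity_error

# Vehicle lags 0.2 m behind the operator's commanded position, so the
# stylus is pulled back toward the vehicle's actual position:
print(haptic_feedback_force(x_haptic=1.0, v_haptic=0.0,
                            x_vehicle=0.8, v_vehicle=0.0))  # -10.0
```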

A. M. Platzer, et al. (Apple), “Swapping User-Interface Objects by Drag-and-Drop Finger Gestures on a Touch Screen Display,” WO/2008/086305, Jul. 17, 2008.

A portable multifunction device (100) displays a first user interface object (4350) and a second user interface object (4310) on a touch screen display (112). Upon detecting a finger-down event (4346-2) at the first user interface object (4350) and one or more finger-dragging events (4365) on the touch screen display (112), the device (100) moves the first user interface object (4350) on the touch screen display (112) along a path determined by the finger-dragging events (4365) until the first user interface object (4350) at least in part overlaps the second user interface object (4310). Upon detecting a finger-up event (4346-3) at the second user interface object (4310), the device (100) visually replaces the second user interface object (4310) with the first user interface object (4350).
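
The finger-down / finger-drag / finger-up sequence maps naturally onto three event handlers plus a rectangle-overlap test. The Python below is a rough illustration of that sequence; the class names, icon geometry, and snap-to-target behavior are my assumptions, not details from the application.

```python
# Illustrative event handlers for the drag-and-drop swap gesture:
# finger-down grabs an object, finger-drag moves it along the touch
# path, finger-up over another object replaces that object.
from dataclasses import dataclass

@dataclass
class UIObject:
    name: str
    x: float
    y: float
    w: float = 60
    h: float = 60

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    def overlaps(self, other):
        # Axis-aligned rectangle overlap ("at least in part overlaps").
        return not (self.x + self.w < other.x or other.x + other.w < self.x or
                    self.y + self.h < other.y or other.y + other.h < self.y)

class DragSwapController:
    def __init__(self, objects):
        self.objects = objects
        self.dragged = None

    def finger_down(self, px, py):
        self.dragged = next((o for o in self.objects if o.contains(px, py)), None)

    def finger_drag(self, px, py):
        if self.dragged:  # move the dragged object along the finger path
            self.dragged.x = px - self.dragged.w / 2
            self.dragged.y = py - self.dragged.h / 2

    def finger_up(self):
        if self.dragged:
            target = next((o for o in self.objects
                           if o is not self.dragged and o.overlaps(self.dragged)), None)
            if target:  # visually replace the target with the dragged object
                self.dragged.x, self.dragged.y = target.x, target.y
                self.objects.remove(target)
        self.dragged = None

dock = [UIObject("stocks", 0, 0), UIObject("weather", 200, 0)]
ctl = DragSwapController(dock)
ctl.finger_down(30, 30)        # grab "stocks"
ctl.finger_drag(220, 30)       # drag it over "weather"
ctl.finger_up()                # "weather" is replaced
print([o.name for o in dock])  # -> ['stocks']
```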
