Abstract:
One aspect of the invention is a system for multi-touch gesture processing. The system includes a multi-touch display and processing circuitry coupled to the multi-touch display. The processing circuitry is configured to detect a gesture on a gesture target area of a panel toolbar associated with a panel displayed on the multi-touch display. The panel includes panel content displayed in a content area. The gesture target area includes an empty area absent one or more command icons. Based on detection of the gesture, additional content associated with the panel content is displayed on the multi-touch display.
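The empty-area detection described above can be sketched as a simple hit test; the function name, rectangle representation, and coordinates below are illustrative assumptions, not the claimed implementation:

```python
# Hypothetical sketch: a touch counts as a gesture on the "gesture target
# area" when it lands on the panel toolbar but not on any command icon.
# Rectangles are (x, y, width, height) tuples; all names are illustrative.

def hit_empty_toolbar_area(toolbar_rect, icon_rects, px, py):
    """Return True when the touch point (px, py) falls on the toolbar's
    empty area, i.e. inside the toolbar but outside every command icon."""
    def in_rect(r):
        x, y, w, h = r
        return x <= px <= x + w and y <= py <= y + h

    return in_rect(toolbar_rect) and not any(in_rect(r) for r in icon_rects)
```

A gesture recognized by this test would then trigger display of the additional content associated with the panel.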
Abstract:
One aspect of the invention is a system for providing a multi-touch inspection tool. The system includes a multi-touch display and processing circuitry configured to display an inspection tool for a chart on a user interface on the multi-touch display. The inspection tool includes a multiplier-scale control and a precision control. The processing circuitry is also configured to determine a base level of scaling to apply to the chart based on a current value of the multiplier-scale control and detect a touch-based input on the precision control for a precision adjustment of the chart. The precision adjustment is based on linear steps dynamically defined with respect to the base level of scaling. The chart is adjusted in response to the touch-based input on the precision control as a combination of the base level of scaling determined by the multiplier-scale control and the precision adjustment of the precision control.
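The combination of the coarse multiplier-scale and the fine precision adjustment could be sketched as follows; `combined_scale`, `steps_per_level`, and the linear step definition are assumptions for illustration, not the patented formula:

```python
# Hypothetical sketch of combining a base level of scaling (from the
# multiplier-scale control) with a precision adjustment whose linear steps
# are defined relative to that base, as the abstract describes.

def combined_scale(multiplier_value: float, precision_steps: int,
                   steps_per_level: int = 10) -> float:
    """Return the total scale factor applied to the chart.

    multiplier_value is the coarse base level of scaling (e.g. 1x, 2x, 4x).
    precision_steps is the signed number of steps selected on the precision
    control; each step's size is derived from the base level, so the fine
    adjustment remains proportional at any zoom.
    """
    base = multiplier_value
    step_size = base / steps_per_level  # linear steps defined w.r.t. the base
    return base + precision_steps * step_size
```

With this sketch, one precision step at a 4x base moves the scale by 0.4x, while the same step at a 1x base moves it by only 0.1x, matching the idea of steps dynamically defined with respect to the base level of scaling.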
Abstract:
Certain embodiments herein relate to systems and methods for moving display objects based on user gestures. In one embodiment, a system can include at least one memory configured to store computer-executable instructions and at least one control device configured to access the at least one memory and execute the computer-executable instructions. The instructions may be configured to detect a first user gesture adjacent to an output device in order to identify a display object displayed on the output device. The instructions may be configured to detect a second user gesture adjacent to the output device in order to identify a location to move the display object. The instructions may be configured to update the output device to display the display object at the identified location on the output device.
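The two-gesture sequence described above can be sketched in a few lines; the class and method names, and the rectangle-based hit test, are illustrative assumptions rather than the disclosed embodiment:

```python
# Hypothetical sketch: a first gesture identifies a display object, a second
# gesture identifies the destination, and the display is updated accordingly.

from dataclasses import dataclass

@dataclass
class DisplayObject:
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

class GestureMover:
    def __init__(self, objects):
        self.objects = objects
        self.selected = None

    def first_gesture(self, px, py):
        """Identify the display object under the first gesture, if any."""
        self.selected = next((o for o in self.objects if o.contains(px, py)), None)
        return self.selected

    def second_gesture(self, px, py):
        """Move the previously identified object to the second gesture's
        location; a real system would then redraw the output device."""
        if self.selected is not None:
            self.selected.x, self.selected.y = px, py
        return self.selected
```

Separating selection and placement into two gestures, as sketched here, avoids requiring a continuous drag across a potentially large display.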
Abstract:
A system includes a processor configured to cause a display to display a graphical visualization of an industrial system, detect a user input corresponding to an area of the display, perform a semantic zoom of the area of the display, and display a first level of information based on the semantic zoom. The first level of information includes data that was not previously displayed in the area of the display.
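The semantic zoom described above could be sketched as stepping through levels of detail rather than magnifying pixels; the level contents and names below are purely illustrative assumptions:

```python
# Hypothetical sketch: zooming into an area of an industrial visualization
# reveals a deeper level of information, including data not previously shown.

LEVELS = [
    {"view": "plant overview", "data": None},
    {"view": "machine detail", "data": {"temperature_C": 78, "rpm": 1450}},
]

def semantic_zoom(level: int) -> dict:
    """Return the next, more detailed level of information for the zoomed
    area, clamped to the deepest level available."""
    return LEVELS[min(level + 1, len(LEVELS) - 1)]
```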