US11079915B2 - System and method of using multiple touch inputs for controller interaction in industrial control systems - Google Patents

System and method of using multiple touch inputs for controller interaction in industrial control systems

Info

Publication number
US11079915B2
Authority
US
United States
Prior art keywords
touch
touch input
processor
graphical element
screen display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/145,095
Other versions
US20170322721A1
Inventor
Pavan Kumar Singh Thakur
Jagadeesh Jinka
Chaithanya Guttikonda
Vibhoosh Gupta
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intelligent Platforms LLC
Original Assignee
Intelligent Platforms LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intelligent Platforms LLC
Priority to US15/145,095
Assigned to GENERAL ELECTRIC COMPANY. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GUTTIKONDA, CHAITHANYA; JINKA, JAGADEESH; THAKUR, PAVAN KUMAR SINGH; GUPTA, VIBHOOSH
Publication of US20170322721A1
Assigned to INTELLIGENT PLATFORMS, LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GENERAL ELECTRIC COMPANY
Application granted
Publication of US11079915B2
Legal status: Active
Adjusted expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/18 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/36 Nc in input of data, input key till input tape
    • G05B2219/36168 Touchscreen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104 Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • FIG. 5 graphically depicts an example multiple touch input (e.g., a “Land and Tap”) according to an embodiment of the present invention.
  • a first input is received at a first position, shown as a land area 408 (i.e., ‘A’), and then a second input is received at a second position, shown as a tap area 501 (i.e., ‘B’), on the HMI screen.
  • the HMI is configured such that the ‘set point’ virtual control 405, which coincides with the second input 501, would activate (i.e., react to an input from an operator) only when a multiple touch input is received.
  • the tap area 501 is the area used for the operation (i.e., the area that would be touched in a single-touch input), while the land area 408 enables the operation of the tap area 501.
  • the widget associated with a control function is associated with a land area.
  • the HMI would activate the widget when a first input is received at the widget (e.g., a button) and a second input is received at the HMI at a second location that enables the operation of the widget.
  • the widget is associated with a tap area. To this end, the HMI would activate the widget when a first input is received at a land area associated with enabling the operation of the tap area, which is associated with the widget.
  • the HMI may present a visual indicator to the operator that the control widget is touched and activated.
  • the screen may change color, or the HMI may generate sound, or provide other visual, tactile (e.g., vibration), or acoustic notification.
  • the HMI may present a visual indicator to the operator that the control widget is touched, but not activated, e.g., by an incorrect position of the input corresponding to the tap area.
  • the HMI may present instructions (e.g., textual description) of the tap area relative to the land area.
  • the HMI may graphically display the tap area, for example, to highlight the sequence necessary to activate the control widget.
  • the HMI may display on the screen, “To activate the Control Button, please place the thumb of your right hand on the Control Button, and tap the highlighted region,” where the highlighted region corresponds to area 501.
  • the land area and tap area may have the same spatial size. In other embodiments, the tap area may have an area smaller than the land area. In another embodiment, the tap area may have an area larger than the land area. In some embodiments, the tap area may change based on a failed attempt and/or a presentation of instructions of the tap area to the operator.
  • the land area corresponds in spatial size to a presented widget associated with a control function.
  • the activation sequence for the multiple touch inputs may include a predefined spatial and geometric component.
  • the tap area 501 may have a predefined distance (shown as offset 503), or a range of distances, from the land area 408.
  • the tap area 501 may have a predefined angle offset (shown as angle 504), or a range of angle offsets (e.g., shown as range 504b), from the land area 408.
  • the HMI may have a single land area 408 that is common to each available touch area on the display.
  • the land input corresponding to a control widget may be specified for any position on the touch screen.
  • This region may be specified, for example, at one of the four corners of the touch screen or any arbitrary area on the screen that may be specified, via a configuration panel of the HMI.
  • both the land area and the tap area are visible to the user.
  • the area to receive the second input (i.e., the tap area) is not presented on the HMI.
  • the HMI provides feedback to a user that the user has landed on a land area (e.g., a sound, a change in screen color, a touch area highlighted, etc.).
  • the land area may also enable touch gestures besides a tap.
  • for example, the land area may be touched to activate gesture control of virtual knobs or sliders that require movements of the finger on the touch screen (i.e., movement other than a tap).
  • FIG. 6 graphically depicts the timing and operation of a land-and-tap gesture for controlling a human-machine interface according to an embodiment of the present invention.
  • a first input 606 corresponding to an input at the land area is received at contact time 601 and is maintained until contact removal time 605 .
  • an input 602a received prior to the contact time 601 associated with the land input 606 does not result in an activation of a widget associated with the control operation.
  • the land input is associated with the widget.
  • the tap input is associated with the widget.
  • an input received prior (shown as input 602b) to the contact time 601 does not result in an activation of the widget. That is, the land input has to be initiated prior to the tap input.
  • an input (shown as input 602e) received following, and not concurrent with, the land input 606 does not result in an activation of the widget.
  • a tap input (shown as input 602d) that overlaps only in part with the land input 606 does not result in an activation of the widget.
  • activation occurs when the tap input 602c is entirely overlapped by the land input 606, the land input 606 being received prior to the tap input 602c.
  • the land input 606 may be rejected if the duration of the input exceeds a pre-defined maximum time value.
  • the maximum time value may be modified via a configuration panel of the HMI. In some embodiments, the maximum time value may be between 10 and 30 seconds. In some embodiments, the maximum time value may be between 5 and 10 seconds.
  • the HMI may only cause activation of the control widget if the tap input 602c is received within a predefined time (shown as time 605) from the contact time 601 of the land input 606. In some embodiments, this activation time is between 1 and 5 seconds. A sketch combining these timing checks with the geometric constraints of FIG. 5 follows this list.
  • the GUI may receive touch input via a touch-input class, e.g., the classes of the System.Windows.Input namespace in PresentationCore.dll (for Windows).
  • the GUI may receive touch input via the libinput library in Linux.
  • the GUI may operate in conjunction with a multi-touch gesture program such as Touchegg, or other multi-touch gesture programs, that runs in the background as a user process and adds multi-touch support to the window manager.
  • FIG. 7 illustrates an exemplary HMI controller in an industrial automation system.
  • the terms “HMI” and “HMI controller” may include a computer or a plurality of computers.
  • the HMI controller may include one or more hardware components such as, for example, a processor 721 , a random access memory (RAM) module 722 , a read-only memory (ROM) module 723 , a storage 724 , a database 725 , one or more input/output (I/O) devices 726 , and an interface 727 .
  • HMI controller 720 may include one or more software components such as, for example, a computer-readable medium including computer executable instructions for performing a method associated with the exemplary embodiments. It is contemplated that one or more of the hardware components listed above may be implemented using software.
  • storage 724 may include a software partition associated with one or more other hardware components. It is understood that the components listed above are exemplary only and not intended to be limiting.
  • Processor 721 may include one or more processors, each configured to execute instructions and process data to perform one or more functions associated with a computer for indexing images.
  • Processor 721 may be communicatively coupled to RAM 722 , ROM 723 , storage 724 , database 725 , I/O devices 726 , and interface 727 .
  • Processor 721 may be configured to execute sequences of computer program instructions to perform various processes.
  • the computer program instructions may be loaded into RAM 722 for execution by processor 721 .
  • processor refers to a physical hardware device that executes encoded instructions for performing functions on inputs and creating outputs.
  • RAM 722 and ROM 723 may each include one or more devices for storing information associated with operation of processor 721 .
  • ROM 723 may include a memory device configured to access and store information associated with HMI controller 720 , including information for identifying, initializing, and monitoring the operation of one or more components and subsystems.
  • RAM 722 may include a memory device for storing data associated with one or more operations of processor 721 .
  • ROM 723 may load instructions into RAM 722 for execution by processor 721 .
  • Storage 724 may include any type of mass storage device configured to store information that processor 721 may need to perform processes consistent with the disclosed embodiments.
  • storage 724 may include one or more magnetic and/or optical disk devices, such as hard drives, CD-ROMs, DVD-ROMs, or any other type of mass media device.
  • Database 725 may include one or more software and/or hardware components that cooperate to store, organize, sort, filter, and/or arrange data used by HMI controller 720 and/or processor 721 .
  • database 725 may store hardware and/or software configuration data associated with input-output hardware devices and controllers, as described herein. It is contemplated that database 725 may store additional and/or different information than that listed above.
  • I/O devices 726 may include one or more components configured to communicate information with a user associated with HMI controller 720 .
  • I/O devices may include a console with an integrated keyboard and mouse to allow a user to maintain a database of images, update associations, and access digital content.
  • I/O devices 726 may also include a display including a graphical user interface (GUI) for outputting information on a monitor.
  • I/O devices 726 may also include peripheral devices such as, for example, a printer for printing information associated with HMI controller 720 , a user-accessible disk drive (e.g., a USB port, a floppy, CD-ROM, or DVD-ROM drive, etc.) to allow a user to input data stored on a portable media device, a microphone, a speaker system, or any other suitable type of interface device.
  • Interface 727 may include one or more components configured to transmit and receive data via a communication network, such as the Internet, a local area network, a workstation peer-to-peer network, a direct link network, a wireless network, or any other suitable communication platform.
  • interface 727 may include one or more modulators, demodulators, multiplexers, demultiplexers, network communication devices, wireless devices, antennas, modems, and any other type of device configured to enable data communication via a communication network.
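Tying together the geometric constraints described above for FIG. 5 (offset distance and angle between the land and tap positions) and the timing constraints of FIG. 6 (ordering, containment, maximum land duration, and the activation window), the following Python sketch validates a land-and-tap gesture. Every threshold, name, and structural choice here is an assumption for illustration, not the patented implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Touch:
    x: float
    y: float
    down_t: float   # time the finger made contact (seconds)
    up_t: float     # time the finger was lifted (seconds)

# Geometric constraints between the land position and the tap position (illustrative values).
MIN_OFFSET_PX, MAX_OFFSET_PX = 40.0, 200.0
MIN_ANGLE_DEG, MAX_ANGLE_DEG = -60.0, 60.0

# Timing constraints on the gesture (illustrative values).
MAX_LAND_DURATION_S = 10.0   # land input rejected if held longer than this
ACTIVATION_WINDOW_S = 3.0    # tap must begin within this time of the land contact

def land_and_tap_valid(land: Touch, tap: Touch) -> bool:
    # Geometry: the tap must fall within a distance range and an angular range
    # measured from the land point.
    dx, dy = tap.x - land.x, tap.y - land.y
    offset = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(dy, dx))
    if not (MIN_OFFSET_PX <= offset <= MAX_OFFSET_PX):
        return False
    if not (MIN_ANGLE_DEG <= angle <= MAX_ANGLE_DEG):
        return False

    # Ordering: the land input must be initiated before the tap input.
    if tap.down_t <= land.down_t:
        return False
    # Containment: the tap must be entirely overlapped by the land input.
    if tap.up_t > land.up_t:
        return False
    # The land input is rejected if held beyond the configured maximum.
    if land.up_t - land.down_t > MAX_LAND_DURATION_S:
        return False
    # The tap must arrive within the activation window after the land contact.
    if tap.down_t - land.down_t > ACTIVATION_WINDOW_S:
        return False
    return True

# Example: a tap 80 px to the right of the land point, starting 1 s after the
# land contact and released before the land finger lifts, would activate.
land = Touch(x=100, y=300, down_t=0.0, up_t=4.0)
tap = Touch(x=180, y=300, down_t=1.0, up_t=1.3)
print(land_and_tap_valid(land, tap))   # True
```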

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Exemplified methods and systems provide a graphical HMI having an interface that mitigates or prevents touch errors and/or inadvertent touches through the use of multiple touch inputs, at a graphical user interface of a touch-screen input device, to trigger an associated user interface command. In some embodiments, the multiple touch inputs comprise inputs at two locations, one in relative association with a displayed interface command, to trigger the command.

Description

FIELD OF THE INVENTION
Embodiments of the disclosure generally relate to controls of industrial systems, and more particularly to methods and systems for interfacing with controllers.
BACKGROUND
In industrial distributed control systems, local controllers with human-machine interfaces (HMIs) may be placed near individual subsystems, to which they provide associated control, management, supervision, and operation functions, either individually or in groups. Examples of applications that include such controllers are those that interface with the machinery and components of power plants, factories, refineries, power distribution sites, and wind or solar farms, among others.
One class of local controllers is embodied as a touch-screen HMI that displays data and control features graphically. Because of the harsh, crowded, and tumultuous physical conditions associated with industrial environments, ruggedized HMIs are often used. These ruggedized HMIs often have impact-resistant designs with limited display areas, resulting in densely arranged graphics, data displays, and controls that are selectable and controllable by an operator. Because of this dense arrangement of displays and controls, and the physical conditions of industrial environments, controls may be mistakenly or inadvertently touched by the operator, causing inconvenience, loss, and improper controller conditions in many circumstances.
What are needed are devices, systems and methods that overcome challenges in the present art, some of which are described above.
SUMMARY
Exemplified methods and systems provide a ruggedized graphical HMI having an interface that mitigates or prevents touch errors and/or inadvertent touches through the use of multiple touch inputs, at a graphical user interface of a touch-screen input device, to trigger an associated user interface command. In some embodiments, the multiple touch inputs comprise inputs at two locations, one in relative association with a displayed interface command, to trigger the command. The multiple touch inputs may be invoked via two fingers placed at the HMI by the operator, e.g., one finger landed on the touch screen and another finger tapped on the user control on the touch screen to trigger an operation associated with that control. This may be referred to as a “Land and Tap” input. The command-invocation multiple touch inputs beneficially provide a mistake-proofing mechanism against unintended triggering of a command or operation due to an unintentional finger tap on a user control such as a button.
In some embodiments, the HMI presents a “Set Point” button for triggering the setting of a parameter value on a field device. This Set-Point button is associated with a critical operation of industrial machinery or a subsystem in an industrial control application. If the HMI display is cluttered, or densely arranged, with several user controls on one HMI screen, which often occurs due to the number of controllable inputs associated with such industrial machinery and subsystems, there is always a risk that the operator may mistakenly or inadvertently touch the ‘set point’ button. The exemplified “Land and Tap” input may be invoked via the thumb of the operator being placed on the screen near the Set-Point button, without touching the Set-Point button, while the index finger is simultaneously placed on the Set-Point button. To this end, a single input received at the Set-Point button does not invoke or trigger the operation associated with the button.
According to an aspect, a method is disclosed of receiving multiple touch inputs, at a graphical user interface of a touch-screen input device, in an industrial automation system, to trigger an associated user interface command (e.g., a graphical user interface command). The method includes presenting, by a processor, via a touch-screen display, a graphical element (e.g., an application icon or a control set-point) associated with execution of an application or a control command; and, either, i) upon receipt, via the touch-screen display, of a first input at a first position corresponding to the graphical element, determining, by the processor, receipt of a second associated touch input at a second position associated with the activation of the graphical element, or ii) upon receipt, via the touch-screen display, of the second associated touch input at the second position associated with the activation of the graphical element, determining receipt of the first input at the first position associated with the graphical element; in response to the first input and the second associated touch input not being concurrently received, maintaining, by the processor, the graphical element associated with execution of the application or the control command in a non-activated state; and in response to the first input and the second associated touch input being concurrently received, causing, by the processor, activation of the graphical element associated with execution of the application or the control command.
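As a concrete illustration of this concurrency requirement, the following is a minimal sketch in Python; the names (TouchPoint, Rect, LandTapGuard), coordinates, and structure are illustrative assumptions rather than the patented implementation. The guarded element activates only while one touch is on the element and a second touch is concurrently in its enabling region; a single touch, or a third concurrent touch, leaves it non-activated.

```python
# Illustrative sketch only: names and regions are assumptions, not the patented implementation.
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float
    y: float

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, p: TouchPoint) -> bool:
        return self.left <= p.x <= self.right and self.top <= p.y <= self.bottom

class LandTapGuard:
    """Gates a control so it activates only on a concurrent land-and-tap input."""

    def __init__(self, control_area: Rect, enable_area: Rect):
        self.control_area = control_area  # region of the guarded widget (first input)
        self.enable_area = enable_area    # pre-defined virtual region (second input)

    def should_activate(self, active_touches: list[TouchPoint]) -> bool:
        # Exactly two concurrent touches are required: one on the widget and one
        # in the enabling region.  A single touch, or a third concurrent touch,
        # leaves the widget in its non-activated state.
        if len(active_touches) != 2:
            return False
        a, b = active_touches
        return ((self.control_area.contains(a) and self.enable_area.contains(b)) or
                (self.control_area.contains(b) and self.enable_area.contains(a)))

# Example: a lone tap on a guarded 'Set Point' widget is ignored; a concurrent
# land-and-tap activates it.
guard = LandTapGuard(control_area=Rect(100, 100, 180, 140),
                     enable_area=Rect(100, 150, 180, 190))
print(guard.should_activate([TouchPoint(120, 110)]))                        # False
print(guard.should_activate([TouchPoint(120, 110), TouchPoint(130, 170)]))  # True
```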
In some embodiments, the method includes presenting, by the processor, via the touch-screen display, a second graphical element for receipt of the second associated touch input.
In some embodiments, the second associated touch input comprises a point-based input received at one or more pre-defined virtual regions (e.g., lower right or lower left of the icons, for each of right-handed and left-handed operators) located proximal to the graphical element associated with execution of the application or control command.
In some embodiments, the method includes presenting, by the processor, via the touch-screen display, a graphical element associated with selection of a location for the pre-defined virtual region (e.g., to select left hand control or right hand control).
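Building on the Rect type from the sketch above, the following hedged example shows one way such a handedness-dependent virtual region could be computed relative to the guarded widget; the offsets, the handedness flag, and the function name are assumptions for illustration only.

```python
def enable_region_for(widget: Rect, handedness: str = "right", gap: float = 10.0) -> Rect:
    """Return a tap-enable region of the same size as the widget, placed just
    below it and toward the operator's preferred side (lower right for a
    right-handed operator, lower left for a left-handed operator), so that the
    enabling finger does not obscure the widget itself."""
    width = widget.right - widget.left
    height = widget.bottom - widget.top
    top = widget.bottom + gap
    if handedness == "right":
        left = widget.right + gap          # lower right of the widget
    else:
        left = widget.left - gap - width   # lower left of the widget
    return Rect(left, top, left + width, top + height)

# Example: place the enabling region for a left-handed operator.
region = enable_region_for(Rect(100, 100, 180, 140), handedness="left")
```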
In some embodiments, the second associated touch input comprises a point-based input received for a minimum time parameter.
In some embodiments, the second associated touch input comprises a point-based input received between a minimum time parameter and a maximum time parameter, wherein inputs received outside the minimum and maximum time parameters are ignored as non-activating inputs.
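The hold-time requirement could be expressed as a simple duration check, as in the sketch below; the specific threshold values are assumptions, since the minimum and maximum time parameters are described as configurable.

```python
# Hold-time window for the second (enabling) touch; threshold values are
# illustrative assumptions, not values prescribed by the patent.
MIN_HOLD_S = 0.5   # enabling touch must be held at least this long
MAX_HOLD_S = 5.0   # anything held longer than this is ignored

def enabling_touch_valid(down_time_s: float, up_time_s: float) -> bool:
    """Treat the enabling touch as valid only if its duration falls inside the
    configured [MIN_HOLD_S, MAX_HOLD_S] window; inputs outside the window are
    ignored as non-activating."""
    held_for = up_time_s - down_time_s
    return MIN_HOLD_S <= held_for <= MAX_HOLD_S
```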
In some embodiments, the method includes presenting, by the processor, via the touch-screen display, a second graphical element for receipt of the second associated touch input at one of a lower or upper corner (e.g., lower left hand or lower right hand corner) of the touch-screen display (e.g., to require two hands—one to touch the unlock button and one to activate a command).
In some embodiments, the graphical element is displayed in a dense matrix of graphical elements.
In some embodiments, the method includes presenting, by the processor, via the touch-screen display, an indication (e.g., a change in screen color) that the second associated touch input has been received.
In some embodiments, the method includes, in response to a third touch input concurrently received with the first input and the second input, maintaining, by the processor, the graphical element associated with execution of the application or control command in the non-activated state.
According to another aspect, a system is disclosed (e.g., for use in an industrial automation system) to trigger an associated user interface command using multiple concurrently-received touch inputs at a graphical user interface of a touch-screen input device. The system includes a touch-screen display; a processor operatively coupled to the touch-screen display; and a memory operatively coupled to the processor, the memory having instructions stored thereon, wherein execution of the instructions causes the processor to: present, via the touch-screen display, a graphical element (e.g., an application icon or a control setpoint) associated with execution of an application or a control command; either i) upon receipt, via the touch-screen display, of a first input at a first position associated with the graphical element, determine a second associated touch input at a second position associated with the activation of the graphical element, or ii) upon receipt, via the touch-screen display, of the second associated touch input at the second position associated with the activation of the graphical element, determine the first input at the first position associated with the graphical element; in response to the first and second touch inputs not being concurrently received, maintain the graphical element associated with execution of the application or the control command in a non-activated state; and in response to the first and second touch inputs being concurrently received, cause activation of the graphical element associated with execution of the application or the control command.
In some embodiments, the instructions, when executed by the processor, further cause the processor to: present, via the touch-screen display, a second graphical element for receipt of the second associated touch input.
In some embodiments, the second associated touch input comprises a point-based input received at one or more pre-defined virtual regions (e.g., lower right or lower left of the icons, for each of right-handed and left-handed operators) located proximal to the graphical element associated with execution of the application or control command.
In some embodiments, the instructions, when executed by the processor, further cause the processor to: present, via the touch-screen display, a graphical element associated with selection of a location for the pre-defined virtual region (e.g., to select left hand control or right hand control).
In some embodiments, the second associated touch input comprises a point-based input received for a minimum time parameter.
In some embodiments, the second associated touch input comprises a point-based input received between a minimum time parameter and a maximum time parameter, wherein inputs received outside the minimum and maximum time parameters are ignored as non-activating inputs.
In some embodiments, the instructions, when executed by the processor, further cause the processor to: present, via the touch-screen display, a second graphical element for receipt of the second associated touch input at one of a lower or upper corner of the touch-screen display.
In some embodiments, the graphical element is displayed in a dense matrix of graphical elements.
In some embodiments, the instructions, when executed by the processor, further cause the processor to: in response to a third touch input concurrently received with the first input and the second input, maintain the graphical element associated with execution of the application or control command in the non-activated state.
According to another aspect, a non-transitory computer-readable medium to trigger an associated user interface command using multiple concurrently-received touch inputs, at a graphical user interface of a touch-screen input device, is disclosed. The computer-readable medium has instructions stored thereon, wherein execution of the instructions causes a processor to: present, via a touch-screen display associated with a computing device, a graphical element (e.g., an application icon or a control set-point) associated with execution of an application or a control command; either i) upon receipt, via the touch-screen display, of a first input at a first position corresponding to the graphical element, determine a second associated touch input at a second position associated with the activation of the graphical element; or ii) upon receipt, via the touch-screen display, of the second associated touch input at the second position associated with the activation of the graphical element, determine the first input at the first position associated with the graphical element; in response to the first and second touch inputs not being concurrently received, maintain the graphical element associated with execution of the application or the control command in a non-activated state; and in response to the first and second touch inputs being concurrently received, cause activation of the graphical element associated with execution of the application or the control command.
The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 schematically depicts an industrial automation system with distributed control via a plurality of human-machine interfaces (HMIs) that are each located proximate to individual subsystems according to an illustrative embodiment.
FIG. 2 schematically depicts an example implementation of a distributed control system (DCS), in accordance with the illustrative embodiment.
FIG. 3 graphically depicts an example human-machine interface (HMI) and a user interacting therewith, in accordance with the illustrative embodiment.
FIG. 4 graphically depicts an HMI configured for two-finger touch operation, in accordance with the illustrative embodiment.
FIG. 5 graphically depicts a land area and a tap area on an HMI, in accordance with the illustrative embodiment.
FIG. 6 graphically depicts the timing and operation of a land-and-tap gesture for controlling a human-machine interface, in accordance with the illustrative embodiment.
FIG. 7 illustrates an exemplary HMI controller in an industrial automation system.
DETAILED DESCRIPTION
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. Methods and materials similar or equivalent to those described herein can be used in the practice or testing of the present disclosure.
As used in the specification and the appended claims, the singular forms “a,” “an” and “the” include plural referents unless the context clearly dictates otherwise. Ranges may be expressed herein as from “about” one particular value, and/or to “about” another particular value. When such a range is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent “about,” it will be understood that the particular value forms another embodiment. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint, and independently of the other endpoint.
“Optional” or “optionally” means that the subsequently described event or circumstance may or may not occur, and that the description includes instances where said event or circumstance occurs and instances where it does not.
Throughout the description and claims of this specification, the word “comprise” and variations of the word, such as “comprising” and “comprises,” means “including but not limited to,” and is not intended to exclude, for example, other additives, components, integers or steps. “Exemplary” means “an example of” and is not intended to convey an indication of a preferred or ideal embodiment. “Such as” is not used in a restrictive sense, but for explanatory purposes.
Disclosed are components that can be used to perform the disclosed methods and systems. These and other components are disclosed herein, and it is understood that when combinations, subsets, interactions, groups, etc. of these components are disclosed that while specific reference of each various individual and collective combinations and permutation of these may not be explicitly disclosed, each is specifically contemplated and described herein, for all methods and systems. This applies to all aspects of this application including, but not limited to, steps in disclosed methods. Thus, if there are a variety of additional steps that can be performed it is understood that each of these additional steps can be performed with any specific embodiment or combination of embodiments of the disclosed methods.
It is understood that throughout this specification the identifiers “first”, “second”, “third”, and such, are used solely to aid in distinguishing the various components and steps of the disclosed subject matter. The identifiers “first”, “second”, “third”, and such, are not intended to imply any particular order, sequence, amount, preference, or importance to the components or steps modified by these terms.
The present methods and systems may be understood more readily by reference to the following detailed description of preferred embodiments and the Examples included therein and to the Figures and their previous and following description.
FIG. 1 schematically depicts an industrial automation system with distributed control via a plurality of human-machine interfaces (HMIs) that are each located proximate to individual subsystems according to an embodiment of the present invention. As shown in FIG. 1, industrial systems (e.g., power generation/distribution, oil refinement, water treatment, chemical processing, etc.) may have a plurality of subsystems. Each subsystem in the plurality may be different, each performing a particular aspect of an overall process. Alternatively, a system may use a plurality of duplicate subsystems, each performing a duplicate process. These subsystems may be collocated or separated. Separated subsystems may be located at the same facility or may be located in different facilities that are separated by large distances. Rather than controlling/monitoring the subsystems in a central location, the control/monitoring of the various subsystems is often distributed (e.g., geographically, functionally, etc.), wherein a plurality of controllers, each controlling/monitoring a subsystem, are utilized. The distributed controllers may be interconnected and may communicate (e.g., connected in a hierarchy) in order to function and perform.
Referring still to FIG. 1, a local network 101 (e.g., remote stations) may be communicatively coupled (wired or wirelessly) 106 to a network 105 and thus may send and receive information to/from other local networks. Each local network 101 may include at least a subsystem 102, a controller 103, and a human machine interface 104, though many other possible configurations can be envisioned (e.g., one subsystem having many controllers and many HMIs). The subsystem, controller(s), and HMI may be interconnected and may communicate using standard communication protocols. The subsystem 102 typically includes the mechanical (e.g., valves, pumps, pneumatic devices, etc.) and/or electrical components (e.g., sensors, actuators, switches, breakers, etc.) necessary to perform all or part of an industrial process. These components are controlled and/or monitored by at least one controller (i.e., field device) 103. Various types of controllers may be used (e.g., programmable logic controller (PLC), programmable automation controller (PAC), and supervisory control and data acquisition (SCADA) systems). The controller 103 may generate the signals necessary to control the subsystem 102 and may receive signals from the subsystem for monitoring or control (e.g., feedback signals). A controller 103 may include processors, digital/analog I/O ports, timers, and/or memory, and may also be programmed to execute logic sequences and/or respond to remote/local directives from a user and/or from another subsystem/component/controller.
FIG. 2 schematically depicts an example implementation of a distributed control system (DCS) 200, in accordance with the illustrative embodiment. As shown in FIG. 2, a distributed control system 200 for a wind turbine generator may include a first local network 202 located at the base of the wind turbine connected to a second local network 204 located at the turbine cab. The first local network 202 includes a network device 206 having a communication link (e.g., via Profinet, Profibus, InterCAD) and communicates with a controller 208 (shown as “Mark Vie 2308”), a SCADA system 210 to connect to other wind turbine generators, and a controller 212 for monitoring conditions at the base of the tower. The second local network 204 includes a second network device 214 having a communication link (e.g., via Profinet, Profibus, InterCAD) and communicates with controllers 216 for each pitch axis (e.g., regulating control of the pitch, yaw, and rotation of one of the multiple blades of the turbine), and a controller 218 for monitoring conditions at the nacelle of the tower. The controllers 216 connect to controllers 220 a, 220 b, 220 c for each of the rotatable blade axes. An HMI may be located proximate to the first local network 202 and the second local network 204. Because of the confined environment inside the turbine nacelle or at the base of the turbine, an operator therein may inadvertently or incorrectly activate control widgets that are associated with the controller and presented on the HMI.
FIG. 3 graphically depicts an example human-machine interface (HMI) and a user interacting therewith, in accordance with the illustrative embodiment. The HMI 104 includes a touch screen 301 (e.g., resistive, capacitive) that displays information and accepts touch inputs from a user 302. The touch inputs trigger various operations (e.g., change a virtual control, start an application, trigger a function or communication, interact with data, etc.). Traditionally, a user 302 interacts with the controls presented on the screen by simply touching a “touch area” designated on the screen.
As illustrated in FIG. 3, the amount of information presented on an HMI touch screen 301 may be dense. The HMI may display a plurality of data from the subsystem (e.g., system status, operating values, etc.). Further, it may be necessary to display the data in a variety of formats (e.g., graphs, charts, animation showing operation, etc.) and to perform some analysis on the data for a worker to control the subsystem properly. In addition, the controls may be displayed in a way that orients the operator by graphically illustrating the subsystem. As shown in FIG. 3, the HMI may present inputs (e.g., switches, slides, buttons, dials, etc.) to control various elements or various aspects of the subsystem.
Because of the dense presentation of widgets on an HMI for an industrial controller, an operator may make mistakes by touching the wrong area of the HMI. In addition, the HMI may be located in space-confined areas that increase the risk of inadvertent touching of the graphical inputs of the HMI.
To mitigate errors in input or unintentional touches, the exemplified system and method uses multiple touch inputs, which may be required in a given sequence and for a given duration, to activate an operation associated with a widget presented on a control screen of the HMI.
FIG. 4 illustrates an example HMI touch screen 301 in an industrial automation system. As shown in FIG. 4, the control screen includes a graphical widget associated with a ‘Set Point’ control 405. The widget, when activated via the HMI screen, is configured to trigger the setting of a parameter value of the subsystem (e.g., of the field device) from the touch screen of the HMI. The set point control may be critical to the operation of the subsystem and should only be touched when necessary. If the HMI touch screen (i.e., screen) 301 is cluttered with many user controls, is located in a crowded environment, or is in an awkward position for a user, there is a risk that the user could tap the ‘Set Point’ virtual control 405 by mistake or inadvertently. Such a mistake may cause damage or loss, create a safety/regulatory violation, or cause harm to a user or the environment.
FIG. 5 graphically depicts an example multiple touch input (e.g., a “Land and Tap”) according to an embodiment of the present invention. As shown in FIG. 5, a first input is received at a first position shown as a land area 408 (i.e., ‘A’) and then a second input is received at a second position shown as a tap area 501 (i.e., ‘B’) on an HMI screen (i.e., HMI). The HMI is configured such that the ‘set point’ virtual control 405, which coincides with the second input 501, would activate (i.e., react to an input from an operator) only when a multiple touch input is received. To this end, only if a first finger remains in contact with the land area 408 (not on the real estate of the ‘Set Point’ button) while a second finger gesture (e.g., tapping, sliding, etc.) is received would the operation be triggered. As a result, a single finger tap (or slide) on the ‘Set Point’ virtual control 405 does not invoke the corresponding operation associated with the presented widget.
As shown in FIG. 5, the tap area 501 is the area used for the operation (i.e., the area that would be touched in a single touch input), while the land area 408 enables the operation of the tap area 501.
In some embodiments, the widget associated with a control function is associated with a land area. To this end, the HMI would activate the widget when a first input is received at the widget (e.g., a button) and a second input is received at a second location on the HMI that enables the operation of the widget. In other embodiments, the widget is associated with a tap area. To this end, the HMI would activate the widget when a first input is received at a land area that enables the operation of a tap area, which is associated with the widget.
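By way of illustration only, the following is a minimal sketch, in Python, of the land-and-tap rule described above with reference to FIG. 5. The coordinates, class names, and the activation callback are illustrative assumptions and are not part of the disclosed implementation; the geometric and timing gates discussed further below are omitted here for brevity.

```python
# Minimal sketch (illustrative assumptions, not the patented implementation):
# the 'Set Point' widget activates only while a first finger is held on a
# separate land area; a single tap on the widget alone is ignored.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

LAND_AREA = Rect(20, 300, 80, 80)           # area 'A' (land area 408), enables the control
SET_POINT_BUTTON = Rect(160, 300, 120, 60)  # area 'B' (tap area 501 / control 405)

class LandAndTapRecognizer:
    def __init__(self) -> None:
        self.land_touch_id = None  # id of the finger currently held on the land area

    def on_touch_down(self, touch_id: int, x: float, y: float) -> None:
        if self.land_touch_id is None and LAND_AREA.contains(x, y):
            self.land_touch_id = touch_id      # first finger "lands"
        elif SET_POINT_BUTTON.contains(x, y) and self.land_touch_id is not None:
            self.activate_set_point()          # land is held, so the tap is accepted
        # a tap on the widget without a held land input is silently ignored

    def on_touch_up(self, touch_id: int, x: float, y: float) -> None:
        if touch_id == self.land_touch_id:
            self.land_touch_id = None          # enabling finger lifted

    def activate_set_point(self) -> None:
        print("Set Point command sent to the controller")
```

In this sketch the widget is bound to the tap area and the land area is a separate enabling region, matching the FIG. 5 arrangement; the alternative embodiment described above, in which the widget is associated with the land area, would simply swap the roles of the two rectangles.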
In some embodiments, the HMI may present a visual indicator to the operator that the control widget is touched and activated. In some embodiments, the screen may change color, or the HMI may generate sound, or provide other visual, tactile (e.g., vibration), or acoustic notification.
Referring back to FIG. 5, in some embodiments, the HMI may present a visual indicator to the operator that the control widget is touched, but not activated, e.g., by an incorrect position of the input corresponding to the tap area. In some embodiments, the HMI may present instructions (e.g., textual description) of the tap area relative to the land area. In some embodiments, the HMI may graphically display the tap area, for example, to highlight the sequence necessary to activate the control widget. For example, if a first input at the area at 408 is received, and no second input at the area 501 is received, the HMI may display on the screen, “To activate the Control Button, please place your thumb of your right hand on the Control Button, and tab the highlighted region” where the highlighted region corresponds to area 501.
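As a hedged illustration of the guidance behavior just described, the short sketch below highlights the tap area and shows the instruction text when a land input is received without a corresponding tap. The HmiScreen class and its methods are placeholders standing in for whatever drawing API the HMI actually uses; they are not taken from the disclosure.

```python
# Illustrative only: placeholder HMI drawing API used to show the guidance
# behavior (highlight the tap area, display an instruction) when the land
# input is held but no tap input has been received.
class HmiScreen:
    def highlight(self, region_name: str) -> None:
        print(f"[HMI] highlighting region: {region_name}")

    def show_message(self, text: str) -> None:
        print(f"[HMI] {text}")

def on_land_without_tap(screen: HmiScreen) -> None:
    screen.highlight("tap area 501")
    screen.show_message(
        "To activate the Control Button, please place the thumb of your "
        "right hand on the Control Button, and tap the highlighted region."
    )

on_land_without_tap(HmiScreen())
```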
In some embodiments, the land area and tap area may have the same spatial size. In other embodiments, the tap area may have an area smaller than the land area. In another embodiment, the tap area may have an area larger than the land area. In some embodiments, the tap area may change based on a failed attempt and/or a presentation of instructions of the tap area to the operator.
In some embodiments, the land area corresponds in spatial size to a presented widget associated with a control function.
Referring still to FIG. 5, in some embodiments, the activation sequence for the multiple touch inputs may include a predefined spatial and geometric component. For example, the tap area 501 may have a predefined distance (shown as offset 503), or a range of distances, from the land area 408. In addition, the tap area 501 may have a predefined angle offset (shown as angle 504), or a range of angle offsets (e.g., shown as range 504 b), relative to the land area 408.
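This geometric gate can be expressed compactly. The sketch below checks a candidate tap position against an allowed distance range (offset 503) and angle range (angle 504 / range 504 b) measured from the land position; the numeric ranges are illustrative assumptions, not values taken from the disclosure.

```python
# Illustrative geometric gate: accept the tap only if its distance and angle
# from the land position fall within configured ranges (values are examples).
import math

OFFSET_RANGE = (60.0, 220.0)   # allowed distance (offset 503), in pixels
ANGLE_RANGE = (-30.0, 30.0)    # allowed angle (angle 504 / range 504 b), in degrees

def tap_geometry_ok(land_xy, tap_xy) -> bool:
    dx = tap_xy[0] - land_xy[0]
    dy = tap_xy[1] - land_xy[1]
    distance = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(dy, dx))   # 0 degrees = directly to the right of the land area
    return (OFFSET_RANGE[0] <= distance <= OFFSET_RANGE[1]
            and ANGLE_RANGE[0] <= angle <= ANGLE_RANGE[1])

# Example: a tap 150 px to the right of the land position passes the gate.
assert tap_geometry_ok((100.0, 400.0), (250.0, 400.0))
```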
Referring back to FIG. 4, in some embodiments, the HMI may have a single land area 408 that is common to each available touch area on the display. To this end, the land input corresponding to a control widget may be specified for any position on the touch screen. This region may be specified, for example, at one of the four corners of the touch screen or at any arbitrary area on the screen, via a configuration panel of the HMI.
As shown in FIG. 4, in some embodiments, both the land area and the tap area are visible to the user.
In other embodiments, the area to receive the second input (i.e., the tap area) is not presented on the HMI.
In other embodiments, the HMI provides feedback to a user that the user has landed on a land area (e.g., a sound, a change in screen color, a touch area highlighted, etc.).
In some embodiments, the land area may enable touch gestures other than a tap. For example, the land area may be touched to activate gesture control of virtual knobs or sliders that may require movement of the finger on the touch screen (i.e., movement other than a tap).
FIG. 6 graphically depicts the timing and operation of a land-and-tap gesture for controlling a human-machine interface according to an embodiment of the present invention. As shown in FIG. 6, a first input 606 corresponding to an input at the land area is received at contact time 601 and is maintained until contact removal time 605. To this end, an input 602 a received prior to the contact time 601 associated with the land input 606 does not result in an activation of a widget associated with the control operation. In some embodiments, the land input is associated with the widget. In other embodiments, the tap input is associated with the widget.
Referring still to FIG. 6, an input (shown as input 602 b) received prior to the contact time 601 does not result in an activation of the widget. That is, the land input has to be initiated prior to the tap input. In addition, an input (shown as input 602 e) received following, and not concurrent with, the land input 606 does not result in an activation of the widget.
Referring still to FIG. 6, a tap input (shown as input 602 d) that overlaps in part with the land input 606 does not result in an activation of the widget.
Thus, to activate the control widget, the land input 606 and the tap input (shown as input 602 c) must entirely overlap, with the land input 606 received prior to the tap input 602 c.
In some embodiments, the land input 606 may be rejected if the duration time for the input exceeds a pre-defined maximum time value. The maximum time value may be modified via a configuration panel of the HMI. In some embodiments, the maximum time value may be between 10 and 30 seconds. In some embodiments, the maximum time value may be between 5 and 10 seconds.
In some embodiments, the HMI may only cause activation of the control widget if the tap input 602 c is received within a predefined time (shown as time 605) from the contact time 601 of the land input 606. In some embodiments, this activation time is between 1 and 5 seconds.
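Taken together, the timing rules of FIG. 6 can be summarized as a post-hoc validity check over the land and tap timestamps, as in the sketch below. The two time limits are illustrative values chosen from within the ranges mentioned above and are not fixed by the disclosure.

```python
# Illustrative timing gate for the land-and-tap sequence of FIG. 6:
#  - the land input is rejected if held longer than MAX_LAND_S,
#  - the tap must start after the land contact time and within ACTIVATION_WINDOW_S,
#  - the tap must end before the land input is released (entire overlap).
MAX_LAND_S = 10.0            # example maximum land duration (e.g., 5-10 s)
ACTIVATION_WINDOW_S = 3.0    # example activation window (e.g., 1-5 s)

def land_and_tap_valid(land_down: float, land_up: float,
                       tap_down: float, tap_up: float) -> bool:
    if land_up - land_down > MAX_LAND_S:
        return False                          # land held too long: input rejected
    if tap_down <= land_down:
        return False                          # tap must begin after the land contact time
    if tap_down - land_down > ACTIVATION_WINDOW_S:
        return False                          # tap arrived too late after the land contact
    if tap_up >= land_up:
        return False                          # tap must end while the land is still held
    return True                               # entirely overlapping: activate the widget

# A land from t=0 s to t=4 s with a tap from t=1.0 s to t=1.2 s activates;
# a tap that only partially overlaps the land input does not.
assert land_and_tap_valid(0.0, 4.0, 1.0, 1.2)
assert not land_and_tap_valid(0.0, 4.0, 2.0, 4.5)
```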
In some embodiments, the GUI receives input via a touch class, e.g., the classes in the System.Windows.Input namespace in PresentationCore.dll (for Windows). In some embodiments, the GUI receives input via the libinput library in Linux. In some embodiments, the GUI may operate in conjunction with a multi-touch gesture program, such as Touchegg or other multi-touch gesture programs, that runs in the background as a user-level process and adds multi-touch support to the window manager.
Example HMI
FIG. 7 illustrates an exemplary HMI controller in an industrial automation system. As used herein, “HMI” and “HMI controller” may include a computer or a plurality of computers. The HMI controller may include one or more hardware components such as, for example, a processor 721, a random access memory (RAM) module 722, a read-only memory (ROM) module 723, a storage 724, a database 725, one or more input/output (I/O) devices 726, and an interface 727. Alternatively and/or additionally, HMI controller 720 may include one or more software components such as, for example, a computer-readable medium including computer executable instructions for performing a method associated with the exemplary embodiments. It is contemplated that one or more of the hardware components listed above may be implemented using software. For example, storage 724 may include a software partition associated with one or more other hardware components. It is understood that the components listed above are exemplary only and not intended to be limiting.
Processor 721 may include one or more processors, each configured to execute instructions and process data to perform one or more functions associated with HMI controller 720. Processor 721 may be communicatively coupled to RAM 722, ROM 723, storage 724, database 725, I/O devices 726, and interface 727. Processor 721 may be configured to execute sequences of computer program instructions to perform various processes. The computer program instructions may be loaded into RAM 722 for execution by processor 721. As used herein, processor refers to a physical hardware device that executes encoded instructions for performing functions on inputs and creating outputs.
RAM 722 and ROM 723 may each include one or more devices for storing information associated with operation of processor 721. For example, ROM 723 may include a memory device configured to access and store information associated with HMI controller 720, including information for identifying, initializing, and monitoring the operation of one or more components and subsystems. RAM 722 may include a memory device for storing data associated with one or more operations of processor 721. For example, instructions from ROM 723 may be loaded into RAM 722 for execution by processor 721.
Storage 724 may include any type of mass storage device configured to store information that processor 721 may need to perform processes consistent with the disclosed embodiments. For example, storage 724 may include one or more magnetic and/or optical disk devices, such as hard drives, CD-ROMs, DVD-ROMs, or any other type of mass media device.
Database 725 may include one or more software and/or hardware components that cooperate to store, organize, sort, filter, and/or arrange data used by HMI controller 720 and/or processor 721. For example, database 725 may store hardware and/or software configuration data associated with input-output hardware devices and controllers, as described herein. It is contemplated that database 725 may store additional and/or different information than that listed above.
I/O devices 726 may include one or more components configured to communicate information with a user associated with HMI controller 720. For example, I/O devices may include a console with an integrated keyboard and mouse to allow a user to maintain a database of images, update associations, and access digital content. I/O devices 726 may also include a display including a graphical user interface (GUI) for outputting information on a monitor. I/O devices 726 may also include peripheral devices such as, for example, a printer for printing information associated with HMI controller 720, a user-accessible disk drive (e.g., a USB port, a floppy, CD-ROM, or DVD-ROM drive, etc.) to allow a user to input data stored on a portable media device, a microphone, a speaker system, or any other suitable type of interface device.
Interface 727 may include one or more components configured to transmit and receive data via a communication network, such as the Internet, a local area network, a workstation peer-to-peer network, a direct link network, a wireless network, or any other suitable communication platform. For example, interface 727 may include one or more modulators, demodulators, multiplexers, demultiplexers, network communication devices, wireless devices, antennas, modems, and any other type of device configured to enable data communication via a communication network.

Claims (17)

What is claimed is:
1. A method of receiving multiple touch inputs, via a touch-screen display, at a graphical user interface of a control application executing in an industrial automation system, to trigger an associated user interface command within the control application, the method comprising:
presenting, by a processor, via the touch-screen display, a graphical depiction of the industrial automation system, or a portion of the industrial automation system including a plurality of physical components performing an industrial process;
presenting, by the processor, via the touch-screen display, a plurality of graphical elements, wherein each graphical element of the plurality of graphical elements is linked to execution of an associated user interface command in the control application, the user interface command operable to cause a controller in the industrial automation system to effect a change to a set-point of one or more of the physical components performing the industrial process, and wherein each of the plurality of graphical elements linked to execution of the associated user interface command is depicted adjacent to the associated one or more of the physical components;
receiving, via the touch-screen display, a first touch input at a first position, the first touch point enabling activation of at least one of the plurality of graphical elements;
receiving, via the touch-screen display, a second touch input at a second position, the second position associated with a selected graphical element of the at least one of the plurality of graphical elements, the second touch input received after the first touch input;
causing the processor to activate the selected graphical element associated with the second position if the first touch input is maintained during a period in which the second touch input is received, the activation of the selected graphical element causing execution of the user interface command associated with the selected graphical element and effecting a change in the set-point of the one or more physical components performing the industrial process, and
causing the processor to maintain in an inactivated state the selected graphical element associated with the second position if the first touch input is not maintained during the period in which the second touch input is received,
wherein the processor is configured to reject the first touch input at the first position if the first touch input is maintained for more than a predetermined maximum time value and to ignore the second touch input at the second position if the second touch input is received after the first touch input is rejected.
2. The method of claim 1, comprising:
presenting, by the processor, via the touch-screen display, a second graphical element associated with the first position.
3. The method of claim 1, wherein the first touch input comprises a point-based input received at one or more pre-defined virtual regions located proximal to the determined graphical element linked to execution of the associated user interface command.
4. The method of claim 3, comprising:
presenting, by the processor, via the touch-screen display, an additional graphical element associated with selection of a location for the one or more pre-defined virtual regions relative to the graphical element.
5. The method of claim 1, wherein the first touch input comprises a point-based input received for a minimum time parameter.
6. The method of claim 1, wherein the first touch input comprises a point-based input received between a minimum time parameter and a maximum time parameter, wherein receipt of inputs outside the minimum and maximum time parameters are ignored as a non-activated input.
7. The method of claim 1, comprising:
presenting, by the processor, via the touch-screen display, a second graphical element for receipt of the first touch input at one of a lower or upper corner of the touch-screen display.
8. The method of claim 1, comprising:
in response to a third touch input concurrently received with the first touch input and the second associated touch input, maintaining, by the processor, the selected graphical element linked to execution of the user interface command in the non-activated state.
9. A system in an industrial automation system, the system receiving, at a graphical user interface of a control application executing in the industrial automation system, via a touch-screen display, multiple concurrently-received touch inputs to trigger an associated user interface command, the system comprising:
the touch-screen display;
a processor operatively coupled to the touch-screen display and to a controller in the industrial automation system, the controller controlling a plurality of physical components performing an industrial process; and
a memory operatively coupled to the processor, the memory having instructions stored thereon, wherein execution of the instructions, cause the processor to:
present, via the touch-screen display, a graphical depiction of the industrial automation system, or a portion of the industrial automation system including a plurality of physical components performing an industrial process;
presenting, by the processor, via the touch-screen display, a plurality of graphical elements, wherein each graphical element of the plurality of graphical elements linked to execution of an associated user interface command operable to cause the controller in the industrial automation system to effect a change to a set-point of one or more of the physical components performing the industrial process, and wherein each of the plurality of graphical elements linked to execution of the associated user interface command is depicted adjacent to the associated one or more of the physical components;
receive, via the touch-screen display, a first touch input at a first position, the first touch point enabling activation of at least one of the plurality of graphical elements;
receive, via the touch-screen display, a second touch input at a second position, the second position associated with a selected graphical element of the at least one of the plurality of graphical elements, the second touch input received after the first touch input;
cause the processor to activate the selected graphical element associated with the second position if the first touch input is maintained during a period in which the second touch input is received, the activation of the selected graphical element causing execution of the user interface command associated with the selected graphical element and effecting a change in the set-point of the one or more physical components performing the industrial process, and
cause the processor to maintain in an inactivated state the selected graphical element associated with the second position if the first touch input is not maintained during the period in which the second touch input is received,
reject the first touch input at the first position if the first touch input is maintained for more than a predetermined maximum time value and to ignore the second touch input at the second position if the second touch input is received after the first touch input is rejected.
10. The system of claim 9, wherein the instructions, when executed by the processor, further cause the processor to:
present via the touch-screen display, a second graphical element associated with the first position.
11. The system of claim 9, wherein the first touch input comprises a point-based input received at one or more pre-defined virtual regions located proximal to the determined graphical element associated with execution of the associated user interface command.
12. The system of claim 11, wherein the instructions, when executed by the processor, further cause the processor to:
present, via the touch-screen display, an additional graphical element associated with selection of a location for the one or more pre-defined virtual regions relative to the graphical element.
13. The system of claim 9, wherein the first touch input comprises a point-based input received for a minimum time parameter.
14. The system of claim 9, wherein the first touch input comprises a point-based input received between a minimum time parameter and a maximum time parameter, wherein receipt of inputs outside the minimum and maximum time parameters are ignored as non-activated input.
15. The system of claim 9, wherein the instructions, when executed by the processor, further cause the processor to:
present, via the touch-screen display, a second graphical element for receipt of the first touch input at one of a lower or upper corner of the touch-screen display.
16. The system of claim 9, wherein the instructions, when executed by the processor, further cause the processor to:
in response to a third touch input concurrently received with the first touch input and the second associated touch input, maintain the determined graphical element linked to execution of the associated user interface command in the non-activated state.
17. A non-transitory computer readable medium to trigger, within a control application executing in an industrial automation system, an associated user interface command using multiple concurrently-received touch inputs received at a graphical user interface of a touch-screen display the computer readable medium having instructions stored thereon, wherein when executed by a processor, cause the processor to:
present, via the touch-screen display associated with a computing device, a graphical depiction of the industrial automation system, or a portion of the industrial automation system including a plurality of physical components performing an industrial process;
presenting, by the computing device, via the touch-screen display, a plurality of graphical elements, wherein each graphical element is linked to execution of an associated user interface command operable to cause a controller in the industrial automation system to effect a change to a set-point of one or more of the physical components performing the industrial process, and wherein each of the plurality of graphical elements linked to execution of the associated user interface command is depicted adjacent to the associated one or more of the physical components;
receive, via the touch-screen display, a first touch input at a first position, the first touch point enabling activation of at least one of the plurality of graphical elements;
receive, via the touch-screen display, a second touch input at a second position, the second position associated with a selected graphical element of the at least one of the plurality of graphical elements, the second touch input received after the first touch input;
cause the processor to activate the selected graphical element associated with the second position if the first touch input is maintained during a period in which the second touch input is received, the activation of the selected graphical element causing execution of the user interface command associated with the selected graphical element and effecting a change in the set-point of the one or more physical components performing the industrial process, and
cause the processor to maintain in an inactivated state the selected graphical element associated with the second position if the first touch input is not maintained during the period in which the second touch input is received,
cause the processor to reject the first touch input at the first position if the first touch input is maintained for more than a predetermined maximum time value and to ignore the second touch input at the second position if the second touch input is received after the first touch input is rejected.
US15/145,095 2016-05-03 2016-05-03 System and method of using multiple touch inputs for controller interaction in industrial control systems Active 2036-11-02 US11079915B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/145,095 US11079915B2 (en) 2016-05-03 2016-05-03 System and method of using multiple touch inputs for controller interaction in industrial control systems

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/145,095 US11079915B2 (en) 2016-05-03 2016-05-03 System and method of using multiple touch inputs for controller interaction in industrial control systems

Publications (2)

Publication Number Publication Date
US20170322721A1 US20170322721A1 (en) 2017-11-09
US11079915B2 true US11079915B2 (en) 2021-08-03

Family

ID=60242567

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/145,095 Active 2036-11-02 US11079915B2 (en) 2016-05-03 2016-05-03 System and method of using multiple touch inputs for controller interaction in industrial control systems

Country Status (1)

Country Link
US (1) US11079915B2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102118293B1 (en) * 2019-12-10 2020-06-02 주식회사 아진엑스텍 Robot controlling method using portable device including touchscreen
US11068221B1 (en) * 2020-06-08 2021-07-20 Schweitzer Engineering Laboratories, Inc. Remote monitoring systems and related methods
EP4343524A1 (en) * 2022-09-22 2024-03-27 Schneider Electric Industries Sas Industrial touchscreen

Citations (140)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4099247A (en) 1974-02-04 1978-07-04 Canon Kabushiki Kaisha Electronic instrument with non-volatile display
WO1997021204A1 (en) 1995-12-05 1997-06-12 Schneider Automation Display device for a programmable controller
US5900877A (en) 1995-05-15 1999-05-04 Sybase, Inc. Method and apparatus for multilevel software controls
US20020054120A1 (en) 2000-07-17 2002-05-09 International Business Machines Corporation Computer system, on-screen keyboard generation method, power-on-password checking method and memory
US20020109677A1 (en) 2000-12-21 2002-08-15 David Taylor Touchpad code entry system
US20020140688A1 (en) 1999-11-19 2002-10-03 Steinberg Robert B. Low information content display
US20020167500A1 (en) 1998-09-11 2002-11-14 Visible Techknowledgy, Llc Smart electronic label employing electronic ink
US20040003036A1 (en) * 2002-06-04 2004-01-01 Eagle Scott G. Identifying the source of messages presented in a computer system
US20040156170A1 (en) 2001-04-10 2004-08-12 Gerhard Mager Household appliance with a display device
US7062716B2 (en) 1999-08-19 2006-06-13 National Instruments Corporation System and method for enhancing the readability of a graphical program
US20060284852A1 (en) * 2005-06-15 2006-12-21 Microsoft Corporation Peel back user interface to show hidden functions
US20070177803A1 (en) 2006-01-30 2007-08-02 Apple Computer, Inc Multi-touch gesture dictionary
US20070285385A1 (en) 1998-11-02 2007-12-13 E Ink Corporation Broadcast system for electronic ink signs
US20080136587A1 (en) 2006-12-08 2008-06-12 Research In Motion Limited System and method for locking and unlocking access to an electronic device
CN101251884A (en) 2008-03-14 2008-08-27 福建伊时代信息科技有限公司 Path password input method based on contacts
US20080303637A1 (en) 2003-09-03 2008-12-11 Metrologic Instruments, Inc. Updateable electronic-ink based display label device
US20090051648A1 (en) 2007-08-20 2009-02-26 Gesturetek, Inc. Gesture-based mobile interaction
EP2042955A1 (en) 2007-09-28 2009-04-01 Siemens Aktiengesellschaft Support for service actions on a programmable logic controler (PLC)
US20090089701A1 (en) 2007-09-27 2009-04-02 Rockwell Automation Technologies, Inc. Distance-wise presentation of industrial automation data as a function of relevance to user
US20090122018A1 (en) * 2007-11-12 2009-05-14 Leonid Vymenets User Interface for Touchscreen Device
US20090135147A1 (en) * 2007-11-27 2009-05-28 Wistron Corporation Input method and content displaying method for an electronic device, and applications thereof
US20090195496A1 (en) 2003-12-16 2009-08-06 Seiko Epson Corporation Information display having separate and detachable units
US20090225023A1 (en) 2008-03-07 2009-09-10 Szolyga Thomas H Panel With Non-volatile Display Media
US7593000B1 (en) 2008-05-17 2009-09-22 David H. Chin Touch-based authentication of a mobile device through user generated pattern creation
US20090262379A1 (en) 2008-03-03 2009-10-22 Sharp Kabushiki Kaisha Image forming apparatus providing user support in sleep mode
US20090278807A1 (en) 2008-05-12 2009-11-12 Sony Corporation Password input using touch duration code
US20090322700A1 (en) 2008-06-30 2009-12-31 Tyco Electronics Corporation Method and apparatus for detecting two simultaneous touches and gestures on a resistive touchscreen
US20090327975A1 (en) * 2008-06-27 2009-12-31 Stedman Roy W Multi-Touch Sorting Gesture
US20100020025A1 (en) 2008-07-25 2010-01-28 Intuilab Continuous recognition of multi-touch gestures
US20100031344A1 (en) 2008-08-01 2010-02-04 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. Touch-screen based password input system and electronic device having same
US20100031200A1 (en) 2008-07-30 2010-02-04 Arima Communications Corp. Method of inputting a hand-drawn pattern password
US20100073303A1 (en) * 2008-09-24 2010-03-25 Compal Electronics, Inc. Method of operating a user interface
US20100115473A1 (en) 2008-10-31 2010-05-06 Sprint Communications Company L.P. Associating gestures on a touch screen with characters
US20100138764A1 (en) 2004-09-08 2010-06-03 Universal Electronics, Inc. System and method for flexible configuration of a controlling device
US20100162182A1 (en) 2008-12-23 2010-06-24 Samsung Electronics Co., Ltd. Method and apparatus for unlocking electronic appliance
US20100177660A1 (en) 2009-01-13 2010-07-15 Metrologic Instruments, Inc. Wireless network devices for use in a wireless communication network
US20100194702A1 (en) * 2009-02-04 2010-08-05 Mstar Semiconductor Inc. Signal processing apparatus, signal processing method and selection method of uer interface icon for multi-touch panel
US20100245102A1 (en) 2009-03-31 2010-09-30 Brother Kogyo Kabushiki Kaisha Information display device
US20100245341A1 (en) 2009-03-30 2010-09-30 Brother Kogyo Kabushiki Kaisha Display device having non-volatile display unit driven with power supplied from battery
US20100322485A1 (en) 2009-06-18 2010-12-23 Research In Motion Limited Graphical authentication
US20110041102A1 (en) 2009-08-11 2011-02-17 Jong Hwan Kim Mobile terminal and method for controlling the same
US20110069018A1 (en) * 2007-05-11 2011-03-24 Rpo Pty Limited Double Touch Inputs
US20110078568A1 (en) 2009-09-30 2011-03-31 Jin Woo Park Mobile terminal and method for controlling the same
US20110156867A1 (en) 2009-12-30 2011-06-30 Carlos Carrizo Gesture-based signature authentication
US20110157375A1 (en) 2009-12-28 2011-06-30 Brother Kogyo Kabushiki Kaisha Display device and program for display device
US20110175839A1 (en) 2008-09-24 2011-07-21 Koninklijke Philips Electronics N.V. User interface for a multi-point touch sensitive device
US20110242022A1 (en) * 2010-04-01 2011-10-06 Mstar Semiconductor, Inc. Touch Determining Method and Determining Method of Touch Gesture on a Touch Panel
US20110260829A1 (en) 2010-04-21 2011-10-27 Research In Motion Limited Method of providing security on a portable electronic device having a touch-sensitive display
US20110273388A1 (en) 2010-05-10 2011-11-10 Samsung Electronics Co., Ltd. Apparatus and method for receiving gesture-based input in a mobile device
US8059101B2 (en) 2007-06-22 2011-11-15 Apple Inc. Swipe gestures for touch screen keyboards
US20110285645A1 (en) 2010-05-19 2011-11-24 Sunghyun Cho Mobile terminal and control method thereof
US20110320978A1 (en) 2010-06-29 2011-12-29 Horodezky Samuel J Method and apparatus for touchscreen gesture recognition overlay
US20120023574A1 (en) 2006-05-24 2012-01-26 Vidoop, Llc Graphical Image Authentication And Security System
EP2416308A1 (en) 2010-08-03 2012-02-08 Koninklijke Philips Electronics N.V. Display device
US20120066650A1 (en) 2010-09-10 2012-03-15 Motorola, Inc. Electronic Device and Method for Evaluating the Strength of a Gestural Password
CN102592524A (en) 2011-12-30 2012-07-18 鸿富锦精密工业(深圳)有限公司 Electronic tag and content updating method and content updating system thereof
US20120184368A1 (en) * 2011-01-19 2012-07-19 Konami Digital Entertainment Co., Ltd. Gaming device and recording medium
US20120206474A1 (en) 2011-02-14 2012-08-16 Holland Peter F Blend Equation
US8255867B1 (en) 2010-07-29 2012-08-28 The Boeing Company Methods and systems for use in splitting wiring diagrams
US8286102B1 (en) 2010-05-27 2012-10-09 Adobe Systems Incorporated System and method for image processing using multi-touch gestures
US20120256863A1 (en) 2009-12-28 2012-10-11 Motorola, Inc. Methods for Associating Objects on a Touch Screen Using Input Gestures
US20120291120A1 (en) 2011-05-09 2012-11-15 Research In Motion Limited Touchscreen password entry
US20120306793A1 (en) 2009-09-21 2012-12-06 Xiangtao Liu Electronic Device and Method, Cell Phone, Program to Achieve Preset Operation Command Thereof
US20130033436A1 (en) * 2011-02-17 2013-02-07 Htc Corporation Electronic device, controlling method thereof and computer program product
US20130057070A1 (en) 2011-09-01 2013-03-07 Seiko Epson Corporation Circuit device, electronic apparatus, and ic card
US8405616B2 (en) 2005-10-31 2013-03-26 Samsung Electronics Co., Ltd. Method of using a touch screen and user interface apparatus employing the same
US20130104065A1 (en) 2011-10-21 2013-04-25 International Business Machines Corporation Controlling interactions via overlaid windows
US8445793B2 (en) 2008-12-08 2013-05-21 Apple Inc. Selective input signal rejection and modification
US20130135178A1 (en) * 2010-09-27 2013-05-30 Nec Corporation Information processing terminal and control method thereof
US8458485B2 (en) 2009-06-17 2013-06-04 Microsoft Corporation Image-based unlock functionality on a computing device
US20130227496A1 (en) 2012-02-29 2013-08-29 Fuji Xerox Co., Ltd. Image processing device, non-transitory computer readable medium, and image processing method
US8525799B1 (en) 2007-04-24 2013-09-03 Cypress Semiconductor Conductor Detecting multiple simultaneous touches on a touch-sensor device
US8536978B2 (en) 2010-11-19 2013-09-17 Blackberry Limited Detection of duress condition at a communication device
CN203204640U (en) 2013-03-22 2013-09-18 上海斐讯数据通信技术有限公司 Electronic label system, reader and electronic-label information processing system
US20130241844A1 (en) 2012-03-13 2013-09-19 Yao-Tsung Chang Method of Touch Command Integration and Touch System Using the Same
US20130268900A1 (en) 2010-12-22 2013-10-10 Bran Ferren Touch sensor gesture recognition for operation of mobile devices
US20130298071A1 (en) 2012-05-02 2013-11-07 Jonathan WINE Finger text-entry overlay
US8619052B2 (en) 2006-04-19 2013-12-31 Microsoft Corporation Precise selection techniques for multi-touch screens
US20140026055A1 (en) 2012-07-20 2014-01-23 Barnesandnoble.Com Llc Accessible Reading Mode Techniques For Electronic Devices
US8638939B1 (en) 2009-08-20 2014-01-28 Apple Inc. User authentication on an electronic device
US20140035853A1 (en) 2012-08-06 2014-02-06 Samsung Electronics Co., Ltd. Method and apparatus for providing user interaction based on multi touch finger gesture
US8686958B2 (en) 2011-01-04 2014-04-01 Lenovo (Singapore) Pte. Ltd. Apparatus and method for gesture input in a dynamically zoned environment
US20140092031A1 (en) 2012-09-28 2014-04-03 Synaptics Incorporated System and method for low power input object detection and interaction
US20140109018A1 (en) 2012-10-12 2014-04-17 Apple Inc. Gesture entry techniques
US20140123080A1 (en) 2011-06-07 2014-05-01 Beijing Lenovo Software Ltd. Electrical Device, Touch Input Method And Control Method
US20140143859A1 (en) 2012-11-16 2014-05-22 Mario Linge Unlock touch screen using touch password
US20140149922A1 (en) * 2012-11-29 2014-05-29 Jasper Reid Hauser Infinite Bi-Directional Scrolling
US20140173529A1 (en) * 2012-12-14 2014-06-19 Barnesandnoble.Com Llc Circular gesture for touch sensitive ui control feature
US20140189855A1 (en) 2012-12-31 2014-07-03 Conduit, Ltd. Gestures for Unlocking a Mobile Device
US20140223381A1 (en) 2011-05-23 2014-08-07 Microsoft Corporation Invisible control
US20140223549A1 (en) 2013-02-07 2014-08-07 Dell Products L. P. Passwords for Touch-Based Platforms Using Time-Based Finger Taps
US20140245203A1 (en) * 2013-02-26 2014-08-28 Samsung Electronics Co., Ltd. Portable device and method for operating multi-application thereof
US8823642B2 (en) 2011-07-04 2014-09-02 3Divi Company Methods and systems for controlling devices using gestures and related 3D sensor
US8824040B1 (en) 2012-07-03 2014-09-02 Brian K. Buchheit Enhancing low light usability of electrophoretic displays
US8830072B2 (en) 2006-06-12 2014-09-09 Intelleflex Corporation RF systems and methods for providing visual, tactile, and electronic indicators of an alarm condition
US20140267015A1 (en) 2013-03-15 2014-09-18 Cellco Partnership D/B/A Verizon Wireless Non-volatile display accessory controlled and powered by a mobile device
US20140277753A1 (en) * 2013-03-12 2014-09-18 Trane International Inc. Events Management
US20140298237A1 (en) 2013-03-27 2014-10-02 Texas Instruments Incorporated Radial Based User Interface on Touch Sensitive Screen
US20140372896A1 (en) 2013-06-14 2014-12-18 Microsoft Corporation User-defined shortcuts for actions above the lock screen
US20150007308A1 (en) 2013-07-01 2015-01-01 Blackberry Limited Password by touch-less gesture
US20150029095A1 (en) 2012-01-09 2015-01-29 Movea Command of a device by gesture emulation of touch gestures
WO2015012789A1 (en) 2013-07-22 2015-01-29 Hewlett-Packard Development Company, L.P. Multi-region touchpad
US20150038072A1 (en) * 2013-08-01 2015-02-05 Mattel, Inc. Bidirectional Communication between an Infant Receiving System and a Remote Device
US20150046885A1 (en) 2012-02-23 2015-02-12 Zte Corporation Method and device for unlocking touch screen
US20150067578A1 (en) 2013-09-04 2015-03-05 Samsung Electronics Co., Ltd Apparatus and method for executing function in electronic device
US20150072784A1 (en) 2012-06-08 2015-03-12 Intellectual Discovery Co., Ltd. Method and apparatus for controlling character by inputting pattern
US9001061B2 (en) 2012-09-14 2015-04-07 Lenovo (Singapore) Pte. Ltd. Object movement on small display screens
US20150121314A1 (en) 2013-10-24 2015-04-30 Jens Bombolowsky Two-finger gestures
US9030418B2 (en) 2008-06-24 2015-05-12 Lg Electronics Inc. Mobile terminal capable of sensing proximity touch
US20150135129A1 (en) 2013-11-13 2015-05-14 Samsung Electronics Co., Ltd. Electronic device having touchscreen and input processing method thereof
US20150138142A1 (en) * 2013-11-15 2015-05-21 Mediatek Inc. Method for performing touch communications control of an electronic device by using location detection with aid of touch panel, and an associated apparatus
US20150153932A1 (en) 2013-12-04 2015-06-04 Samsung Electronics Co., Ltd. Mobile device and method of displaying icon thereof
US20150169216A1 (en) 2013-12-13 2015-06-18 Samsung Electronics Co., Ltd. Method of controlling screen of portable electronic device
US20150169141A1 (en) 2013-12-16 2015-06-18 Samsung Electronics Co., Ltd. Method for controlling screen and electronic device thereof
US20150169502A1 (en) 2013-12-16 2015-06-18 Microsoft Corporation Touch-based reorganization of page element
US20150188970A1 (en) 2013-12-31 2015-07-02 Personify, Inc. Methods and Systems for Presenting Personas According to a Common Cross-Client Configuration
US20150220182A1 (en) * 2013-11-07 2015-08-06 Daniel Avrahami Controlling primary and secondary displays from a single touchscreen
US20150227943A1 (en) * 2014-02-07 2015-08-13 Qvolve, LLC System and Method for Documenting Regulatory Compliance
US20150294096A1 (en) 2014-04-10 2015-10-15 Bank Of America Corporation Rhythm-based user authentication
US9165159B1 (en) 2013-01-10 2015-10-20 Marvell International Ltd. Encryption based on touch gesture
US9189614B2 (en) 2013-09-23 2015-11-17 GlobalFoundries, Inc. Password entry for double sided multi-touch display
US20150331399A1 (en) * 2012-11-15 2015-11-19 Keba Ag Method for the secure and intentional activation of functions and/or movements of controllable industrial equipment
US20150355805A1 (en) * 2014-06-04 2015-12-10 Quantum Interface, Llc Dynamic environment for object and attribute display and interaction
US20150365492A1 (en) 2014-06-13 2015-12-17 Rockwell Automation Technologies, Inc. Systems and methods for adapting a user interface based on a profile
US9262603B2 (en) 2011-10-21 2016-02-16 International Business Machines Corporation Advanced authentication technology for computing devices
US20160054851A1 (en) * 2014-08-22 2016-02-25 Samsung Electronics Co., Ltd. Electronic device and method for providing input interface
US9357391B1 (en) 2015-06-25 2016-05-31 International Business Machines Corporation Unlocking electronic devices with touchscreen input gestures
US9460575B2 (en) 2013-12-05 2016-10-04 Lg Electronics Inc. Vehicle control apparatus and method thereof
US20170039691A1 (en) 2015-08-05 2017-02-09 Ricoh Company, Ltd. Image processing apparatus, image processing method and storage medium
US9600103B1 (en) * 2012-12-31 2017-03-21 Allscripts Software, Llc Method for ensuring use intentions of a touch screen device
US20170090463A1 (en) * 2014-05-08 2017-03-30 Beet, Llc Automation interface
US9703392B2 (en) 2009-04-03 2017-07-11 Sony Corporation Methods and apparatus for receiving, converting into text, and verifying user gesture input from an information input device
US20170230378A1 (en) * 2016-02-08 2017-08-10 Rockwell Automation Technologies, Inc. Beacon-based industrial automation access authorization
US20180004386A1 (en) * 2016-06-30 2018-01-04 Microsoft Technology Licensing, Llc Pre-touch sensing for mobile interaction
US9983664B2 (en) * 2011-11-16 2018-05-29 Samsung Electronics Co., Ltd. Mobile device for executing multiple applications and method for same
US20180267690A1 (en) * 2017-03-20 2018-09-20 Georgia Tech Research Corporation Control system for a mobile manipulation device
US20190005958A1 (en) * 2016-08-17 2019-01-03 Panasonic Intellectual Property Management Co., Ltd. Voice input device, translation device, voice input method, and recording medium
US20190095075A1 (en) 2015-06-11 2019-03-28 Beijing Kingsoft Internet Security Software Co. Ltd. Method and apparatus for setting background picture of unlocking interface of application, and electronic device
US20190174069A1 (en) * 2016-03-18 2019-06-06 Kenneth L. Poindexter, JR. System and Method for Autonomously Recording a Visual Media
US10320789B1 (en) * 2014-03-26 2019-06-11 Actioneer, Inc. Fast and secure way to fetch or post data and display it temporarily to a user

Patent Citations (143)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4099247A (en) 1974-02-04 1978-07-04 Canon Kabushiki Kaisha Electronic instrument with non-volatile display
US5900877A (en) 1995-05-15 1999-05-04 Sybase, Inc. Method and apparatus for multilevel software controls
WO1997021204A1 (en) 1995-12-05 1997-06-12 Schneider Automation Display device for a programmable controller
US20020167500A1 (en) 1998-09-11 2002-11-14 Visible Techknowledgy, Llc Smart electronic label employing electronic ink
US20070024551A1 (en) 1998-09-11 2007-02-01 Alexander Gelbman Smart electronic label employing electronic ink
US20070285385A1 (en) 1998-11-02 2007-12-13 E Ink Corporation Broadcast system for electronic ink signs
US7062716B2 (en) 1999-08-19 2006-06-13 National Instruments Corporation System and method for enhancing the readability of a graphical program
US20020140688A1 (en) 1999-11-19 2002-10-03 Steinberg Robert B. Low information content display
US20020054120A1 (en) 2000-07-17 2002-05-09 International Business Machines Corporation Computer system, on-screen keyboard generation method, power-on-password checking method and memory
US20020109677A1 (en) 2000-12-21 2002-08-15 David Taylor Touchpad code entry system
US20040156170A1 (en) 2001-04-10 2004-08-12 Gerhard Mager Household appliance with a display device
US20040003036A1 (en) * 2002-06-04 2004-01-01 Eagle Scott G. Identifying the source of messages presented in a computer system
US20080303637A1 (en) 2003-09-03 2008-12-11 Metrologic Instruments, Inc. Updateable electronic-ink based display label device
US20090195496A1 (en) 2003-12-16 2009-08-06 Seiko Epson Corporation Information display having separate and detachable units
US20100138764A1 (en) 2004-09-08 2010-06-03 Universal Electronics, Inc. System and method for flexible configuration of a controlling device
US20060284852A1 (en) * 2005-06-15 2006-12-21 Microsoft Corporation Peel back user interface to show hidden functions
US8405616B2 (en) 2005-10-31 2013-03-26 Samsung Electronics Co., Ltd. Method of using a touch screen and user interface apparatus employing the same
US20070177803A1 (en) 2006-01-30 2007-08-02 Apple Computer, Inc Multi-touch gesture dictionary
US8619052B2 (en) 2006-04-19 2013-12-31 Microsoft Corporation Precise selection techniques for multi-touch screens
US20120023574A1 (en) 2006-05-24 2012-01-26 Vidoop, Llc Graphical Image Authentication And Security System
US8830072B2 (en) 2006-06-12 2014-09-09 Intelleflex Corporation RF systems and methods for providing visual, tactile, and electronic indicators of an alarm condition
US20080136587A1 (en) 2006-12-08 2008-06-12 Research In Motion Limited System and method for locking and unlocking access to an electronic device
US8125312B2 (en) 2006-12-08 2012-02-28 Research In Motion Limited System and method for locking and unlocking access to an electronic device
US8525799B1 (en) 2007-04-24 2013-09-03 Cypress Semiconductor Conductor Detecting multiple simultaneous touches on a touch-sensor device
US20110069018A1 (en) * 2007-05-11 2011-03-24 Rpo Pty Limited Double Touch Inputs
US8059101B2 (en) 2007-06-22 2011-11-15 Apple Inc. Swipe gestures for touch screen keyboards
US20090051648A1 (en) 2007-08-20 2009-02-26 Gesturetek, Inc. Gesture-based mobile interaction
US20090089701A1 (en) 2007-09-27 2009-04-02 Rockwell Automation Technologies, Inc. Distance-wise presentation of industrial automation data as a function of relevance to user
EP2042955A1 (en) 2007-09-28 2009-04-01 Siemens Aktiengesellschaft Support for service actions on a programmable logic controler (PLC)
US20090122018A1 (en) * 2007-11-12 2009-05-14 Leonid Vymenets User Interface for Touchscreen Device
US20090135147A1 (en) * 2007-11-27 2009-05-28 Wistron Corporation Input method and content displaying method for an electronic device, and applications thereof
US20090262379A1 (en) 2008-03-03 2009-10-22 Sharp Kabushiki Kaisha Image forming apparatus providing user support in sleep mode
US20090225023A1 (en) 2008-03-07 2009-09-10 Szolyga Thomas H Panel With Non-volatile Display Media
CN101251884A (en) 2008-03-14 2008-08-27 福建伊时代信息科技有限公司 Path password input method based on contacts
US20090278807A1 (en) 2008-05-12 2009-11-12 Sony Corporation Password input using touch duration code
US7593000B1 (en) 2008-05-17 2009-09-22 David H. Chin Touch-based authentication of a mobile device through user generated pattern creation
US9030418B2 (en) 2008-06-24 2015-05-12 Lg Electronics Inc. Mobile terminal capable of sensing proximity touch
US20090327975A1 (en) * 2008-06-27 2009-12-31 Stedman Roy W Multi-Touch Sorting Gesture
US20090322700A1 (en) 2008-06-30 2009-12-31 Tyco Electronics Corporation Method and apparatus for detecting two simultaneous touches and gestures on a resistive touchscreen
US20100020025A1 (en) 2008-07-25 2010-01-28 Intuilab Continuous recognition of multi-touch gestures
US20100031200A1 (en) 2008-07-30 2010-02-04 Arima Communications Corp. Method of inputting a hand-drawn pattern password
US20100031344A1 (en) 2008-08-01 2010-02-04 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. Touch-screen based password input system and electronic device having same
US20100073303A1 (en) * 2008-09-24 2010-03-25 Compal Electronics, Inc. Method of operating a user interface
US20110175839A1 (en) 2008-09-24 2011-07-21 Koninklijke Philips Electronics N.V. User interface for a multi-point touch sensitive device
US20100115473A1 (en) 2008-10-31 2010-05-06 Sprint Communications Company L.P. Associating gestures on a touch screen with characters
US8445793B2 (en) 2008-12-08 2013-05-21 Apple Inc. Selective input signal rejection and modification
US20100162182A1 (en) 2008-12-23 2010-06-24 Samsung Electronics Co., Ltd. Method and apparatus for unlocking electronic appliance
US20100177660A1 (en) 2009-01-13 2010-07-15 Metrologic Instruments, Inc. Wireless network devices for use in a wireless communication network
US20100194702A1 (en) * 2009-02-04 2010-08-05 Mstar Semiconductor Inc. Signal processing apparatus, signal processing method and selection method of uer interface icon for multi-touch panel
US20100245341A1 (en) 2009-03-30 2010-09-30 Brother Kogyo Kabushiki Kaisha Display device having non-volatile display unit driven with power supplied from battery
US20100245102A1 (en) 2009-03-31 2010-09-30 Brother Kogyo Kabushiki Kaisha Information display device
US9703392B2 (en) 2009-04-03 2017-07-11 Sony Corporation Methods and apparatus for receiving, converting into text, and verifying user gesture input from an information input device
US8458485B2 (en) 2009-06-17 2013-06-04 Microsoft Corporation Image-based unlock functionality on a computing device
US20100322485A1 (en) 2009-06-18 2010-12-23 Research In Motion Limited Graphical authentication
US20110041102A1 (en) 2009-08-11 2011-02-17 Jong Hwan Kim Mobile terminal and method for controlling the same
US8638939B1 (en) 2009-08-20 2014-01-28 Apple Inc. User authentication on an electronic device
US20120306793A1 (en) 2009-09-21 2012-12-06 Xiangtao Liu Electronic Device and Method, Cell Phone, Program to Achieve Preset Operation Command Thereof
US20110078568A1 (en) 2009-09-30 2011-03-31 Jin Woo Park Mobile terminal and method for controlling the same
US20120256863A1 (en) 2009-12-28 2012-10-11 Motorola, Inc. Methods for Associating Objects on a Touch Screen Using Input Gestures
US20110157375A1 (en) 2009-12-28 2011-06-30 Brother Kogyo Kabushiki Kaisha Display device and program for display device
US20110156867A1 (en) 2009-12-30 2011-06-30 Carlos Carrizo Gesture-based signature authentication
US20110242022A1 (en) * 2010-04-01 2011-10-06 Mstar Semiconductor, Inc. Touch Determining Method and Determining Method of Touch Gesture on a Touch Panel
US20110260829A1 (en) 2010-04-21 2011-10-27 Research In Motion Limited Method of providing security on a portable electronic device having a touch-sensitive display
US20110273388A1 (en) 2010-05-10 2011-11-10 Samsung Electronics Co., Ltd. Apparatus and method for receiving gesture-based input in a mobile device
US20110285645A1 (en) 2010-05-19 2011-11-24 Sunghyun Cho Mobile terminal and control method thereof
US8286102B1 (en) 2010-05-27 2012-10-09 Adobe Systems Incorporated System and method for image processing using multi-touch gestures
US20110320978A1 (en) 2010-06-29 2011-12-29 Horodezky Samuel J Method and apparatus for touchscreen gesture recognition overlay
US8255867B1 (en) 2010-07-29 2012-08-28 The Boeing Company Methods and systems for use in splitting wiring diagrams
EP2416308A1 (en) 2010-08-03 2012-02-08 Koninklijke Philips Electronics N.V. Display device
US20120066650A1 (en) 2010-09-10 2012-03-15 Motorola, Inc. Electronic Device and Method for Evaluating the Strength of a Gestural Password
US20130135178A1 (en) * 2010-09-27 2013-05-30 Nec Corporation Information processing terminal and control method thereof
US8536978B2 (en) 2010-11-19 2013-09-17 Blackberry Limited Detection of duress condition at a communication device
US20130268900A1 (en) 2010-12-22 2013-10-10 Bran Ferren Touch sensor gesture recognition for operation of mobile devices
US8686958B2 (en) 2011-01-04 2014-04-01 Lenovo (Singapore) Pte. Ltd. Apparatus and method for gesture input in a dynamically zoned environment
US20120184368A1 (en) * 2011-01-19 2012-07-19 Konami Digital Entertainment Co., Ltd. Gaming device and recording medium
US20120206474A1 (en) 2011-02-14 2012-08-16 Holland Peter F Blend Equation
US20130033436A1 (en) * 2011-02-17 2013-02-07 Htc Corporation Electronic device, controlling method thereof and computer program product
US20120291120A1 (en) 2011-05-09 2012-11-15 Research In Motion Limited Touchscreen password entry
US20140223381A1 (en) 2011-05-23 2014-08-07 Microsoft Corporation Invisible control
US20140123080A1 (en) 2011-06-07 2014-05-01 Beijing Lenovo Software Ltd. Electrical Device, Touch Input Method And Control Method
US8823642B2 (en) 2011-07-04 2014-09-02 3Divi Company Methods and systems for controlling devices using gestures and related 3D sensor
US20130057070A1 (en) 2011-09-01 2013-03-07 Seiko Epson Corporation Circuit device, electronic apparatus, and ic card
US9262603B2 (en) 2011-10-21 2016-02-16 International Business Machines Corporation Advanced authentication technology for computing devices
US20130104065A1 (en) 2011-10-21 2013-04-25 International Business Machines Corporation Controlling interactions via overlaid windows
US9983664B2 (en) * 2011-11-16 2018-05-29 Samsung Electronics Co., Ltd. Mobile device for executing multiple applications and method for same
CN102592524A (en) 2011-12-30 2012-07-18 鸿富锦精密工业(深圳)有限公司 Electronic tag and content updating method and content updating system thereof
US20150029095A1 (en) 2012-01-09 2015-01-29 Movea Command of a device by gesture emulation of touch gestures
US20150046885A1 (en) 2012-02-23 2015-02-12 Zte Corporation Method and device for unlocking touch screen
US20130227496A1 (en) 2012-02-29 2013-08-29 Fuji Xerox Co., Ltd. Image processing device, non-transitory computer readable medium, and image processing method
US20130241844A1 (en) 2012-03-13 2013-09-19 Yao-Tsung Chang Method of Touch Command Integration and Touch System Using the Same
US20130298071A1 (en) 2012-05-02 2013-11-07 Jonathan WINE Finger text-entry overlay
US20150072784A1 (en) 2012-06-08 2015-03-12 Intellectual Discovery Co., Ltd. Method and apparatus for controlling character by inputting pattern
US8824040B1 (en) 2012-07-03 2014-09-02 Brian K. Buchheit Enhancing low light usability of electrophoretic displays
US20140026055A1 (en) 2012-07-20 2014-01-23 Barnesandnoble.Com Llc Accessible Reading Mode Techniques For Electronic Devices
US20140035853A1 (en) 2012-08-06 2014-02-06 Samsung Electronics Co., Ltd. Method and apparatus for providing user interaction based on multi touch finger gesture
US9001061B2 (en) 2012-09-14 2015-04-07 Lenovo (Singapore) Pte. Ltd. Object movement on small display screens
US20140092031A1 (en) 2012-09-28 2014-04-03 Synaptics Incorporated System and method for low power input object detection and interaction
US20140109018A1 (en) 2012-10-12 2014-04-17 Apple Inc. Gesture entry techniques
US20150331399A1 (en) * 2012-11-15 2015-11-19 Keba Ag Method for the secure and intentional activation of functions and/or movements of controllable industrial equipment
US20140143859A1 (en) 2012-11-16 2014-05-22 Mario Linge Unlock touch screen using touch password
US20140149922A1 (en) * 2012-11-29 2014-05-29 Jasper Reid Hauser Infinite Bi-Directional Scrolling
US20140173529A1 (en) * 2012-12-14 2014-06-19 Barnesandnoble.Com Llc Circular gesture for touch sensitive ui control feature
US20140189855A1 (en) 2012-12-31 2014-07-03 Conduit, Ltd. Gestures for Unlocking a Mobile Device
US9600103B1 (en) * 2012-12-31 2017-03-21 Allscripts Software, Llc Method for ensuring use intentions of a touch screen device
US10452259B1 (en) * 2012-12-31 2019-10-22 Allscripts Software, Llc Method for ensuring use intentions of a touch screen device
US9165159B1 (en) 2013-01-10 2015-10-20 Marvell International Ltd. Encryption based on touch gesture
US20140223549A1 (en) 2013-02-07 2014-08-07 Dell Products L. P. Passwords for Touch-Based Platforms Using Time-Based Finger Taps
US20140245203A1 (en) * 2013-02-26 2014-08-28 Samsung Electronics Co., Ltd. Portable device and method for operating multi-application thereof
US20140277753A1 (en) * 2013-03-12 2014-09-18 Trane International Inc. Events Management
US20140267015A1 (en) 2013-03-15 2014-09-18 Cellco Partnership D/B/A Verizon Wireless Non-volatile display accessory controlled and powered by a mobile device
CN203204640U (en) 2013-03-22 2013-09-18 上海斐讯数据通信技术有限公司 Electronic label system, reader and electronic-label information processing system
US20140298237A1 (en) 2013-03-27 2014-10-02 Texas Instruments Incorporated Radial Based User Interface on Touch Sensitive Screen
US20140372896A1 (en) 2013-06-14 2014-12-18 Microsoft Corporation User-defined shortcuts for actions above the lock screen
US20150007308A1 (en) 2013-07-01 2015-01-01 Blackberry Limited Password by touch-less gesture
WO2015012789A1 (en) 2013-07-22 2015-01-29 Hewlett-Packard Development Company, L.P. Multi-region touchpad
US20150038072A1 (en) * 2013-08-01 2015-02-05 Mattel, Inc. Bidirectional Communication between an Infant Receiving System and a Remote Device
US20150067578A1 (en) 2013-09-04 2015-03-05 Samsung Electronics Co., Ltd Apparatus and method for executing function in electronic device
US9189614B2 (en) 2013-09-23 2015-11-17 GlobalFoundries, Inc. Password entry for double sided multi-touch display
US20150121314A1 (en) 2013-10-24 2015-04-30 Jens Bombolowsky Two-finger gestures
US20150220182A1 (en) * 2013-11-07 2015-08-06 Daniel Avrahami Controlling primary and secondary displays from a single touchscreen
US20150135129A1 (en) 2013-11-13 2015-05-14 Samsung Electronics Co., Ltd. Electronic device having touchscreen and input processing method thereof
US20150138142A1 (en) * 2013-11-15 2015-05-21 Mediatek Inc. Method for performing touch communications control of an electronic device by using location detection with aid of touch panel, and an associated apparatus
US20150153932A1 (en) 2013-12-04 2015-06-04 Samsung Electronics Co., Ltd. Mobile device and method of displaying icon thereof
US9460575B2 (en) 2013-12-05 2016-10-04 Lg Electronics Inc. Vehicle control apparatus and method thereof
US20150169216A1 (en) 2013-12-13 2015-06-18 Samsung Electronics Co., Ltd. Method of controlling screen of portable electronic device
US20150169502A1 (en) 2013-12-16 2015-06-18 Microsoft Corporation Touch-based reorganization of page element
US20150169141A1 (en) 2013-12-16 2015-06-18 Samsung Electronics Co., Ltd. Method for controlling screen and electronic device thereof
US20150188970A1 (en) 2013-12-31 2015-07-02 Personify, Inc. Methods and Systems for Presenting Personas According to a Common Cross-Client Configuration
US20150227943A1 (en) * 2014-02-07 2015-08-13 Qvolve, LLC System and Method for Documenting Regulatory Compliance
US10320789B1 (en) * 2014-03-26 2019-06-11 Actioneer, Inc. Fast and secure way to fetch or post data and display it temporarily to a user
US20150294096A1 (en) 2014-04-10 2015-10-15 Bank Of America Corporation Rhythm-based user authentication
US20170090463A1 (en) * 2014-05-08 2017-03-30 Beet, Llc Automation interface
US20150355805A1 (en) * 2014-06-04 2015-12-10 Quantum Interface, Llc Dynamic environment for object and attribute display and interaction
US20150365492A1 (en) 2014-06-13 2015-12-17 Rockwell Automation Technologies, Inc. Systems and methods for adapting a user interface based on a profile
US20160054851A1 (en) * 2014-08-22 2016-02-25 Samsung Electronics Co., Ltd. Electronic device and method for providing input interface
US20190095075A1 (en) 2015-06-11 2019-03-28 Beijing Kingsoft Internet Security Software Co. Ltd. Method and apparatus for setting background picture of unlocking interface of application, and electronic device
US9357391B1 (en) 2015-06-25 2016-05-31 International Business Machines Corporation Unlocking electronic devices with touchscreen input gestures
US20170039691A1 (en) 2015-08-05 2017-02-09 Ricoh Company, Ltd. Image processing apparatus, image processing method and storage medium
US20170230378A1 (en) * 2016-02-08 2017-08-10 Rockwell Automation Technologies, Inc. Beacon-based industrial automation access authorization
US20190174069A1 (en) * 2016-03-18 2019-06-06 Kenneth L. Poindexter, JR. System and Method for Autonomously Recording a Visual Media
US20180004386A1 (en) * 2016-06-30 2018-01-04 Microsoft Technology Licensing, Llc Pre-touch sensing for mobile interaction
US20190005958A1 (en) * 2016-08-17 2019-01-03 Panasonic Intellectual Property Management Co., Ltd. Voice input device, translation device, voice input method, and recording medium
US20180267690A1 (en) * 2017-03-20 2018-09-20 Georgia Tech Research Corporation Control system for a mobile manipulation device

Non-Patent Citations (18)

* Cited by examiner, † Cited by third party
Title
Copending U.S. Appl. No. 15/145,073, filed May 3, 2016, and the prosecution history thereof.
Copending U.S. Appl. No. 15/145,087, filed May 3, 2016, and the prosecution history thereof.
Decision on Rejection for Chinese Application No. 201480080514.0, dated Sep. 30, 2019.
European Search Report and Opinion issued in connection with related EP Application No. 16168865.0 dated Jul. 12, 2016.
European Search Report and Opinion issued in connection with related EP Application No. 16168865.0 dated Oct. 17, 2016.
Final Office Action issued in connection with related U.S. Appl. No. 14/713,467 dated Apr. 19, 2017.
First Office Action for Chinese Application No. 201480080514.0, dated Jun. 29, 2018.
International Preliminary Report on Patentability for Application No. PCT/US2014/069247, dated Jan. 10, 2017.
International Search Report and Written Opinion for Application No. PCT/US2014/069247, dated Jun. 23, 2015.
Jiao, et al., "An Investigation of Two-Handed Manipulation and Related Techniques in Multi-touch Interaction", Machine Vision and Human-Machine Interface (MVHI), 2010, 565-568.
Lee, et al., "Access to an Automated Security System Using Gesture-Based Passwords", Network-Based Information Systems (NBiS), 2012 15th International Conference, 2012, 760-765.
Niu, Yuan et al., "Gesture Authentication with Touch Input for Mobile Devices," Third International ICST Conference, MobiSec 2011, Aalborg, Denmark, May 17-19, 2011, pp. 13-24.
Non-Final Office Action issued in connection with related U.S. Appl. No. 14/713,467 dated Oct. 4, 2016.
Office Action, European patent application No. 14824987.3, dated Jul. 9, 2019.
Sae-Bae, et al., "Multitouch Gesture-Based Authentication", Information Forensics and Security, IEEE Transactions, 2014, 568-582.
Third Office Action for Chinese Application No. 201480080514.0, dated Apr. 12, 2019.
Tsagaris, et al., "Methodology for finger gesture control of mechatronic systems", MECHATRONIKA, 2012, 1-6.
Wang, et al., "VirtualTouch: A finger glove to simulate touch screen commands", Sensors, 2012 IEEE, 2012, 1-4.

Also Published As

Publication number Publication date
US20170322721A1 (en) 2017-11-09

Similar Documents

Publication Publication Date Title
CN100549879C (en) Hybrid user interface with prominent basic presentation information and variable-carrying side information
EP2713257B1 (en) Touch-enabled complex data entry
EP3000013B1 (en) Interactive multi-touch remote control
CN102239451B (en) User interface for a portable communicator for use in a process control environment
US20170323092A1 (en) Method and system of using spatially-defined and pattern-defined gesturing passwords
EP2699972B1 (en) Method and system for controlling an industrial system
EP3026813B1 (en) Frequency converter
EP3036615B1 (en) Dynamic contextual menu for touch-sensitive devices
US11567571B2 (en) Remote control of a device via a virtual interface
US11079915B2 (en) System and method of using multiple touch inputs for controller interaction in industrial control systems
EP1770461A2 (en) System and method for identifying particularized equipment information of interest to varied users in an industrial automation environment
US20160328133A1 (en) Systems and Methods for Controlling Power Generation Plant Operations via a Human-Machine Interface
WO2020146145A1 (en) Techniques for multi-finger typing in mixed-reality
EP2076823A1 (en) Data structure&associated method for automation control system management
JP2017525000A (en) Operating device and control system
US20140132555A1 (en) Method for making secure a control on a visualization device with a tactile surface, and associated system
US20100083110A1 (en) Human-machine interface having multiple touch display navigation capabilities
US20150338837A1 (en) Method and device for managing and configuring field devices in an automation installation
US10088822B2 (en) Method for actuating a safe switching element of an installation
US10845987B2 (en) System and method of using touch interaction based on location of touch on a touch screen
EP2710435A1 (en) System, method, work station and computer program product for controlling an industrial process
US20160085367A1 (en) Multimode data entry system
US20160085227A1 (en) Device for managing and configuring field devices in an automation installation
US7437337B2 (en) Intuitive and reliable control of operator inputs in software components
JP2015197884A (en) Information terminal and equipment operation method

Legal Events

Date Code Title Description
AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:THAKUR, PAVAN KUMAR SINGH;JINKA, JAGADEESH;GUTTIKONDA, CHAITHANYA;AND OTHERS;SIGNING DATES FROM 20160502 TO 20160505;REEL/FRAME:038550/0707

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: INTELLIGENT PLATFORMS, LLC, VIRGINIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GENERAL ELECTRIC COMPANY;REEL/FRAME:049309/0210

Effective date: 20190131

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4