US20130162603A1 - Electronic device and touch input control method thereof - Google Patents
- Publication number
- US20130162603A1 (application US13/714,313; also published as US 2013/0162603 A1)
- Authority
- US
- United States
- Prior art keywords
- touch
- coordinates
- finger
- user
- touched area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- 1. Technical Field
- The present disclosure relates to an electronic device and a touch input control method thereof.
- 2. Description of Related Art
- Electronic devices with touch screens, for example mobile phones, digital photo frames, and electronic readers (e-readers), are popular. The user inputs information by touching objects such as icons and virtual keyboards displayed on the touch screen. Usually, each of the objects displayed on the touch screen is associated with predefined touch coordinates. When the user touches the touch screen, the electronic device detects the coordinates of the touched portion and compares the touched coordinates with the predefined coordinates of the objects, thereby determining the object that the user has touched. However, because of differing touch habits, such as individual parallax, the particular finger used for touching, and the manner and orientation in which the device is gripped, some deviation occurs between the touched coordinates and the predetermined coordinates of the object when the user touches it, which results in wrong determinations and input errors.
- Therefore, what is needed is an electronic device and a touch input control method thereof to alleviate the limitations described above.
- The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding sections throughout the several views.
- FIG. 1 is a block diagram of an electronic device in accordance with an exemplary embodiment.
- FIG. 2 is a schematic diagram showing the portions touched by a user, displayed on the electronic device of FIG. 1.
- FIGS. 3(a) and 3(b) are schematic diagrams showing the portions touched by a user, displayed on the electronic device of FIG. 1.
- FIG. 4 is a flowchart of a touch input control method for electronic devices, such as the one of FIG. 1, in accordance with the exemplary embodiments.
- FIG. 1 is an exemplary embodiment of a block diagram of an electronic device 100. The electronic device 100 can calibrate the touch position for the user, so that the user is deemed to have accurately touched the objects displayed on the touch screen. The electronic device 100 is a mobile terminal with a touch screen, such as a mobile phone. In alternative embodiments, the electronic device 100 can be another electronic device with a touch screen, such as an electronic reader, a tablet, or a digital photo frame.
- The electronic device 100 includes a storage unit 10, a touch screen 20, and a processor 30. The touch screen 20 generates signals in response to user touches. The user can activate the touch calibration function, do a touch calibration test, and touch the objects displayed on the touch screen 20.
- The storage unit 10 stores a calibration database 12 recording touch calibration data for the fingers of a number of users. The touch calibration data of each finger includes the size of the touched area, the shape of the touched area, the touch offset direction, and the touch offset distance when the finger attempts to touch a single object. The touch offset direction is the offset direction of the touched point relative to the point having the predefined coordinates of the object which the user intends to touch. The touch offset distance is the offset distance between the touched point and the point having the predefined coordinates of the object which the user intends to touch.
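- A minimal sketch (not part of the patent disclosure) of how such a per-finger calibration record might be represented; all names and units are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class FingerCalibration:
    """One per-finger record in the calibration database (illustrative)."""
    finger_id: str            # e.g. "right_forefinger"
    touch_area_size: float    # size of the touched area, e.g. in square millimetres
    touch_area_shape: bytes   # stored image of the touched-area shape
    offset_direction: float   # direction of the touched point relative to the intended point, in degrees
    offset_distance: float    # distance between the touched point and the intended point, in unit lengths

# The calibration database maps each user to the records of his or her commonly used fingers.
calibration_database: dict[str, list[FingerCalibration]] = {}
```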
- When the user activates the touch calibration function of the electronic device 100, a navigation interface is displayed on the touch screen 20, prompting the user to do a touch calibration test for his or her finger or fingers. The electronic device 100 stores the touch calibration data generated during the calibration process in the calibration database 12. A user may be in the habit of using a particular finger or several particular fingers for inputting information via the touch screen 20. Moreover, when a particular user uses the same finger to touch each object displayed on the touch screen 20, the size of the touched area, the shape of the touched area, the touch offset direction, and the touch offset distance are expected to be substantially the same. Thus, before the touch calibration test, the electronic device 100 prompts the user to do the test using his or her commonly used finger(s). If deviations exist when the user touches different objects with the same finger during the test, the electronic device 100 calculates the average value of each type of calibration data and stores the calculated values in the calibration database 12.
- The processor 30 includes a figure identification module 31, a calculation module 32, a determination module 33, a touch control module 34, and a display control module 35.
- The display control module 35 controls the touch screen 20 to display an interface including a number of objects. Each object is associated with predetermined touch coordinates. If the coordinates actually touched are the same as the predetermined coordinates associated with an object, the function or process corresponding to the object is activated. In the embodiment, the object can be a virtual key, a touch icon, or the like.
- The calculation module 32 calculates the coordinates of the actual touch according to the signals transmitted from the touch screen 20.
- The determination module 33 determines whether the coordinates of the actual touch are the same as the predetermined touch coordinates of one of the objects displayed on the touch screen 20. If they are, the determination module 33 issues a normal touch signal. If the coordinates of the actual touch do not match the predetermined touch coordinates of any of the objects, the determination module 33 creates an adjustment signal and transmits the adjustment signal to the touch control module 34, the figure identification module 31, and the calculation module 32.
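- A hedged sketch of this decision, not taken from the patent: the touch coordinates are compared against each object's predetermined coordinates, and either a normal touch signal or an adjustment signal results. The object structure and the optional tolerance are assumptions for illustration.

```python
from collections import namedtuple

DisplayObject = namedtuple("DisplayObject", ["name", "coords"])  # coords = (x, y)

def classify_touch(touch_xy, objects, tolerance=0.0):
    """Return ("normal", obj) if the touch matches an object's predetermined
    coordinates (within an assumed tolerance), otherwise ("adjust", None)."""
    x, y = touch_xy
    for obj in objects:
        ox, oy = obj.coords
        if abs(x - ox) <= tolerance and abs(y - oy) <= tolerance:
            return "normal", obj      # corresponds to the normal touch signal
    return "adjust", None             # corresponds to the adjustment signal
```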
- When the adjustment signal transmitted from the determination module 33 is received, the figure identification module 31 identifies the shape of the portion of the finger which actually makes contact with the touch screen 20 (the shape of the touched area) according to the signals transmitted from the touch screen 20, and the calculation module 32 further calculates the size of the touched area. In the embodiment, the calculation module 32 calculates the size of the touched area according to the resolution of the touch screen 20 and the size of the touch screen 20, which is pre-stored in the storage unit 10. The determination module 33 further determines which finger the user used to touch the touch screen 20 according to the size of the touched area, the shape of the touched area, and the touch calibration data of the user's commonly used finger(s) recorded in the calibration database 12. The determination module 33 also retrieves the touch offset direction and the touch offset distance of the finger so determined from the calibration database 12, and transmits the retrieved data to the touch control module 34.
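- The patent does not spell out a concrete matching rule, so the sketch below simply converts the touched area from pixels to physical units using the screen resolution and size, as described above, and picks the calibrated finger whose recorded area size is closest to the measurement (shape comparison omitted). All names are illustrative.

```python
def touched_area_size_mm2(touched_pixels, screen_res, screen_size_mm):
    """Convert a touched area measured in pixels to square millimetres using
    the screen resolution (in pixels) and the physical screen size (in mm)."""
    res_w, res_h = screen_res
    size_w, size_h = screen_size_mm
    mm2_per_pixel = (size_w / res_w) * (size_h / res_h)
    return touched_pixels * mm2_per_pixel

def identify_finger(measured_size_mm2, user_records):
    """Pick the user's calibrated finger whose recorded touched-area size is
    closest to the measured one (user_records: list of FingerCalibration)."""
    return min(user_records, key=lambda rec: abs(rec.touch_area_size - measured_size_mm2))
```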
- The touch control module 34 processes the coordinates of the touch of the user according to the normal touch signal or the adjustment signal transmitted from the determination module 33, so as to determine the object which the user intended to touch. In this embodiment, if a normal signal is received, the touch control module 34 determines that the object associated with the coordinates of the touch is the touched object. If the adjustment signal is received, the touch control module 34 executes the touch coordinates compensation (explained hereinafter) for the actual touch, and determines the touched object accordingly.
- In this embodiment, the touch control module 34 executes the touch coordinates compensation for the touch according to the touch offset direction and the touch offset distance of the determined finger, as retrieved by the determination module 33. For example, assuming the coordinates of the touch are (x0, y0), the touch offset direction retrieved by the determination module 33 is precisely southwest of the predetermined touch coordinates (where the touch screen is upright and due north is vertically upwards), and the touch offset distance is one unit length leftward in the horizontal direction and one unit length downward in the vertical direction relative to the predetermined touch coordinates, the touch control module 34 applies compensation equal to the total displacement, that is, the touch coordinates compensation is (x0-1, y0-1). For another example, assuming the coordinates of the actual touch are (x0+1, y0+1), the touch offset direction retrieved by the determination module 33 is precisely northeast relative to the predetermined touch coordinates, and the touch offset distance is one unit length rightward in the horizontal direction and one unit length upward in the vertical direction relative to the predetermined touch coordinates, the touch control module 34 likewise applies compensation equal to the total displacement to obtain the compensated touch coordinates.
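- The worked examples above describe the compensation only in words; the following minimal sketch (not part of the patent disclosure) decomposes a recorded offset direction and distance into a displacement and removes it from the touched coordinates. The angle convention and the sign of the correction are assumptions and would have to match however the calibration data is actually recorded.

```python
import math

def compensate(touch_xy, offset_direction_deg, offset_distance):
    """Remove a finger's habitual offset from the touched coordinates.

    The offset is assumed to be recorded as the displacement of the touched
    point relative to the intended point (direction in degrees, 0 = due east,
    counter-clockwise positive; distance in unit lengths).  Subtracting that
    displacement gives an estimate of the point the user intended to touch.
    """
    x, y = touch_xy
    dx = offset_distance * math.cos(math.radians(offset_direction_deg))
    dy = offset_distance * math.sin(math.radians(offset_direction_deg))
    return x - dx, y - dy

# Example: if a finger habitually lands one unit right and one unit up of the
# target (a north-east offset), the compensation shifts the touch one unit
# left and one unit down.
```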
- FIG. 2 shows an image 21 of the shape of the touched area of the thumb of a user, and an image 22 of the shape of the touched area of the forefinger of a user. In the embodiment, the image of the shape of the touched area of the finger is the image on the touch screen 20 of FIG. 2 which reflects the portion of the touch screen 20 actually touched by the finger of the user.
- FIG. 3(a) shows an image of the shape of the touched areas of the right forefinger of a user when the key A, the key S, and the key Z are touched by the right forefinger of the user. The coordinates of the portions actually touched match the predetermined touch coordinates of the key A, the key S, and the key Z individually. In other words, the coordinates of the touches are exactly the predetermined touch coordinates; there is no offset in the touches. FIG. 3(b) shows a different situation, in which a user attempts to touch the key A, the key S, and the key Z, but the touched coordinates do not match the predetermined touch coordinates of the key A, the key S, and the key Z; the actual touches have a touch offset direction and a touch offset distance relative to the predetermined touch coordinates, so compensation is needed. Under this condition, the touch control module 34 compensates the touched coordinates toward the touched coordinates shown in FIG. 3(a) according to the touch offset direction and touch offset distance of the finger of the user, as recorded in the calibration database 12.
- In the embodiment, the storage unit 10 further stores a calibration interface including a number of objects, such as virtual keys, touch icons, and the like. When the touch calibration function of the electronic device 100 is activated by the user, the display control module 35 controls the touch screen 20 to display a dialog box inviting the user to do the touch calibration test. If the user chooses to do the test, the display control module 35 further controls the touch screen 20 to display the calibration interface stored in the storage unit 10, and controls a pop-up dialog box to prompt the user to do the test for the highlighted object. If the user confirms the test, e.g. by selecting the icon “OK” displayed in the dialog box, the display control module 35 controls the objects displayed on the calibration interface to be highlighted in sequence according to a predetermined order. When the user touches the highlighted objects in sequence, the touch screen 20 generates signals in response to the touches. The figure identification module 31 identifies the image of the shape of the touched area of the finger of the user and stores it in the calibration database 12. The calculation module 32 calculates the size of the touched area of the finger, and calculates the coordinates of the touch according to a predefined algorithm and the signals caused by the touch. The calculation module 32 further compares the coordinates of the actual touch with the predefined touch coordinates of the object, so as to determine the touch offset direction and the touch offset distance of the finger. The calculation module 32 then stores the calculated size of the touched area, the touch offset direction, and the touch offset distance of the finger in the calibration database 12. The display control module 35 highlights the next object to guide the user through the test, until the user completes the test for all the objects displayed on the calibration interface. All of the calibration data of the user is stored in the calibration database 12.
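- A hedged sketch of the per-object measurement and the averaging described above (illustrative only; the patent does not give formulas). Each calibration touch yields an area size and an offset relative to the highlighted object, and the per-object results for one finger are averaged before being stored.

```python
import math
from statistics import mean

def measure_offset(touch_xy, target_xy):
    """Offset of the touched point relative to the highlighted object:
    returns (direction in degrees, distance in unit lengths)."""
    dx = touch_xy[0] - target_xy[0]
    dy = touch_xy[1] - target_xy[1]
    return math.degrees(math.atan2(dy, dx)), math.hypot(dx, dy)

def average_calibration(samples):
    """Average the per-object measurements taken during the calibration test.
    Each sample is (area_size, direction_deg, distance); each field is averaged
    separately.  (Naive angle averaging; assumes the directions do not wrap
    around 0/360 degrees.)"""
    sizes, directions, distances = zip(*samples)
    return mean(sizes), mean(directions), mean(distances)
```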
- FIG. 4 shows a flowchart of a touch input control method of the electronic device 100 of FIG. 1. The electronic device 100 includes a touch screen and a storage unit. The touch screen generates signals in response to user touches. The storage unit stores a calibration database 12 recording touch calibration data for the fingers of a number of users. The touch calibration data of each finger includes the shape of the screen area in contact with the finger, the size of the touched area, the touch offset direction, and the touch offset distance when the finger touches a single object. The touch offset direction is the offset direction of the touched point relative to the point having the predefined coordinates of the object which the user intends to touch. The touch offset distance is the offset distance between the touched point and the point having the predefined coordinates of the object which the user intends to touch. The method includes the following steps, each of which is related to the various components of the electronic device 100 (a combined sketch follows step S48):
- In step S41, the display control module 35 controls the touch screen 20 to display an interface including a number of objects according to the command of a user. Each object is associated with predetermined touch coordinates. The object is a virtual key, a touch icon, or the like.
- In step S42, the touch screen 20 generates signals in response to a touch on the touch screen 20 which is an attempt to touch an object.
- In step S43, the calculation module 32 calculates the coordinates of the actually touched portion of the touch screen 20 according to the generated signals.
- In step S44, the determination module 33 determines whether the coordinates of the touch match the predetermined touch coordinates of one of the objects displayed on the touch screen 20. If yes, the process ends; otherwise, the process goes to step S45.
- In step S45, the determination module 33 creates an adjustment signal (touch compensation signal) and transmits the touch compensation signal to the touch control module 34, the figure identification module 31, and the calculation module 32.
- In step S46, the figure identification module 31 identifies the shape of the area touched by the finger on the touch screen 20, and the calculation module 32 calculates the size of the touched area according to the signals from the touch screen 20. In the embodiment, the calculation module 32 calculates the size of the touched area according to the resolution of the touch screen 20 and the size of the touch screen 20.
- In step S47, the determination module 33 determines which finger did the actual touching according to the shape of the touched area, the size of the touched area, and the touch calibration data of the user's commonly used finger(s) recorded in the calibration database 12, and retrieves the touch offset direction and the touch offset distance of the determined finger from the calibration database 12.
- In step S48, the touch control module 34 processes the coordinates of the touch of the user according to the retrieved touch offset direction and the retrieved touch offset distance of the determined finger, so as to determine the object which the user intended to touch.
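- Pulling steps S43 through S48 together, a combined sketch (reusing the illustrative helpers from the earlier sketches; all names and the retry tolerance are assumptions, not part of the patent disclosure):

```python
def handle_touch(touch_xy, objects, user_records, touched_pixels,
                 screen_res, screen_size_mm):
    """Illustrative S43-S48 pipeline: classify the touch, and if it misses every
    object, identify the finger and compensate the coordinates before retrying."""
    signal, obj = classify_touch(touch_xy, objects)                 # S43-S44
    if signal == "normal":
        return obj                                                  # touched object found
    # S45-S46: adjustment signal -> measure the touched area
    size_mm2 = touched_area_size_mm2(touched_pixels, screen_res, screen_size_mm)
    # S47: determine the finger and retrieve its offset data
    finger = identify_finger(size_mm2, user_records)
    # S48: compensate the coordinates and determine the intended object
    corrected = compensate(touch_xy, finger.offset_direction, finger.offset_distance)
    _, obj = classify_touch(corrected, objects, tolerance=0.5)      # assumed tolerance
    return obj                                                      # may be None if still no match
```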
- With such a configuration, after the interface is displayed on the touch screen 20 in response to the operation of the user, if the coordinates of a touch by the user do not match the predetermined touch coordinates of one of the objects displayed on the touch screen 20, an adjustment signal is created, and the finger of the user touching the touch screen 20 is determined according to the shape of the touched area of the finger, the size of the touched area, and the touch calibration data of the user's commonly used finger(s) recorded in the calibration database 12. The touch offset direction and the touch offset distance of the determined finger are also retrieved from the calibration database 12, so as to apply compensation to the coordinates of the touch and to determine the touched object accordingly. Thus, the reliability and accuracy of the touch input are greatly improved.
- Although the present disclosure has been specifically described on the basis of the embodiments thereof, the disclosure is not to be construed as being limited thereto. Various changes or modifications may be made to the embodiments without departing from the scope and spirit of the disclosure.
Claims (13)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110444042.0 | 2011-12-27 | ||
CN201110444042.0A CN103186329B (en) | 2011-12-27 | 2011-12-27 | Electronic equipment and its touch input control method |
CN201110444042 | 2011-12-27 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20130162603A1 (en) | 2013-06-27 |
US9182846B2 US9182846B2 (en) | 2015-11-10 |
Family
ID=48654041
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/714,313 Active 2033-10-05 US9182846B2 (en) | 2011-12-27 | 2012-12-13 | Electronic device and touch input control method for touch coordinate compensation |
Country Status (3)
Country | Link |
---|---|
US (1) | US9182846B2 (en) |
CN (1) | CN103186329B (en) |
TW (1) | TWI547837B (en) |
Cited By (75)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140176509A1 (en) * | 2012-12-21 | 2014-06-26 | Lg Display Co., Ltd. | Method of compensating reference data and touch screen apparatus using the method |
WO2015006776A1 (en) * | 2013-07-12 | 2015-01-15 | Tactual Labs Co. | Reducing control response latency with defined cross-control behavior |
US20150309597A1 (en) * | 2013-05-09 | 2015-10-29 | Kabushiki Kaisha Toshiba | Electronic apparatus, correction method, and storage medium |
US20150339012A1 (en) * | 2013-02-05 | 2015-11-26 | Tencent Technology (Shenzhen) Company Limited | Method used by mobile terminal to return to home screen, mobile terminal and storage medium |
WO2016053239A1 (en) | 2014-09-29 | 2016-04-07 | Hewlett-Packard Development Company, L.P. | Virtual keyboard |
EP3077897A1 (en) * | 2013-12-03 | 2016-10-12 | Microsoft Technology Licensing, LLC | User interface adaptation from an input source identifier change |
US9507500B2 (en) | 2012-10-05 | 2016-11-29 | Tactual Labs Co. | Hybrid systems and methods for low-latency user input processing and feedback |
US20170139535A1 (en) * | 2015-11-12 | 2017-05-18 | Dell Products L.P. | Information Handling System Desktop Surface Display Touch Input Compensation |
US9818171B2 (en) * | 2015-03-26 | 2017-11-14 | Lenovo (Singapore) Pte. Ltd. | Device input and display stabilization |
US20180095596A1 (en) * | 2016-09-30 | 2018-04-05 | Biocatch Ltd. | System, device, and method of estimating force applied to a touch surface |
US10037138B2 (en) | 2012-12-29 | 2018-07-31 | Apple Inc. | Device, method, and graphical user interface for switching between user interfaces |
US10042542B2 (en) | 2012-05-09 | 2018-08-07 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
US10067653B2 (en) | 2015-04-01 | 2018-09-04 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10067645B2 (en) | 2015-03-08 | 2018-09-04 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10073615B2 (en) | 2012-05-09 | 2018-09-11 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US10078442B2 (en) | 2012-12-29 | 2018-09-18 | Apple Inc. | Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US10095391B2 (en) | 2012-05-09 | 2018-10-09 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
US10101887B2 (en) | 2012-12-29 | 2018-10-16 | Apple Inc. | Device, method, and graphical user interface for navigating user interface hierarchies |
US10126930B2 (en) | 2012-05-09 | 2018-11-13 | Apple Inc. | Device, method, and graphical user interface for scrolling nested regions |
US10133412B2 (en) * | 2014-10-07 | 2018-11-20 | General Electric Company | Intuitive touch screen calibration device and method |
US10162452B2 (en) | 2015-08-10 | 2018-12-25 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10168826B2 (en) | 2012-05-09 | 2019-01-01 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US10175864B2 (en) | 2012-05-09 | 2019-01-08 | Apple Inc. | Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity |
US10175757B2 (en) | 2012-05-09 | 2019-01-08 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface |
US10191627B2 (en) | 2012-05-09 | 2019-01-29 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10203868B2 (en) | 2015-08-10 | 2019-02-12 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10222980B2 (en) | 2015-03-19 | 2019-03-05 | Apple Inc. | Touch input cursor manipulation |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10248308B2 (en) | 2015-08-10 | 2019-04-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures |
US10262324B2 (en) | 2010-11-29 | 2019-04-16 | Biocatch Ltd. | System, device, and method of differentiating among users based on user-specific page navigation sequence |
US10275087B1 (en) | 2011-08-05 | 2019-04-30 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10298614B2 (en) * | 2010-11-29 | 2019-05-21 | Biocatch Ltd. | System, device, and method of generating and managing behavioral biometric cookies |
US10303354B2 (en) | 2015-06-07 | 2019-05-28 | Apple Inc. | Devices and methods for navigating between user interfaces |
US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
US10387029B2 (en) | 2015-03-08 | 2019-08-20 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US10397262B2 (en) | 2017-07-20 | 2019-08-27 | Biocatch Ltd. | Device, system, and method of detecting overlay malware |
US10404729B2 (en) | 2010-11-29 | 2019-09-03 | Biocatch Ltd. | Device, method, and system of generating fraud-alerts for cyber-attacks |
US10416800B2 (en) | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
US10437333B2 (en) | 2012-12-29 | 2019-10-08 | Apple Inc. | Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture |
US10474815B2 (en) | 2010-11-29 | 2019-11-12 | Biocatch Ltd. | System, device, and method of detecting malicious automatic script and code injection |
US10496260B2 (en) | 2012-05-09 | 2019-12-03 | Apple Inc. | Device, method, and graphical user interface for pressure-based alteration of controls in a user interface |
US10523680B2 (en) * | 2015-07-09 | 2019-12-31 | Biocatch Ltd. | System, device, and method for detecting a proxy server |
US10579784B2 (en) | 2016-11-02 | 2020-03-03 | Biocatch Ltd. | System, device, and method of secure utilization of fingerprints for user authentication |
US10586036B2 (en) | 2010-11-29 | 2020-03-10 | Biocatch Ltd. | System, device, and method of recovery and resetting of user authentication factor |
US10621585B2 (en) | 2010-11-29 | 2020-04-14 | Biocatch Ltd. | Contextual mapping of web-pages, and generation of fraud-relatedness score-values |
US10620781B2 (en) | 2012-12-29 | 2020-04-14 | Apple Inc. | Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics |
US10685355B2 (en) * | 2016-12-04 | 2020-06-16 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US10694078B1 (en) | 2019-02-19 | 2020-06-23 | Volvo Car Corporation | Motion sickness reduction for in-vehicle displays |
US10719765B2 (en) | 2015-06-25 | 2020-07-21 | Biocatch Ltd. | Conditional behavioral biometrics |
US10728761B2 (en) | 2010-11-29 | 2020-07-28 | Biocatch Ltd. | Method, device, and system of detecting a lie of a user who inputs data |
US10747305B2 (en) | 2010-11-29 | 2020-08-18 | Biocatch Ltd. | Method, system, and device of authenticating identity of a user of an electronic device |
US10776476B2 (en) | 2010-11-29 | 2020-09-15 | Biocatch Ltd. | System, device, and method of visual login |
US10782871B2 (en) | 2012-05-09 | 2020-09-22 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US10834590B2 (en) | 2010-11-29 | 2020-11-10 | Biocatch Ltd. | Method, device, and system of differentiating between a cyber-attacker and a legitimate user |
US10897482B2 (en) | 2010-11-29 | 2021-01-19 | Biocatch Ltd. | Method, device, and system of back-coloring, forward-coloring, and fraud detection |
US10908808B2 (en) | 2012-05-09 | 2021-02-02 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
US10917431B2 (en) | 2010-11-29 | 2021-02-09 | Biocatch Ltd. | System, method, and device of authenticating a user based on selfie image or selfie video |
US10949514B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | Device, system, and method of differentiating among users based on detection of hardware components |
US10949757B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | System, device, and method of detecting user identity based on motor-control loop model |
US10963097B2 (en) * | 2017-09-18 | 2021-03-30 | Lenovo (Beijing) Co., Ltd. | Method, electronic device, and apparatus for touch-region calibration |
US10970394B2 (en) | 2017-11-21 | 2021-04-06 | Biocatch Ltd. | System, device, and method of detecting vishing attacks |
US11023116B2 (en) | 2012-05-09 | 2021-06-01 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US11055395B2 (en) | 2016-07-08 | 2021-07-06 | Biocatch Ltd. | Step-up authentication |
US20210329030A1 (en) * | 2010-11-29 | 2021-10-21 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks |
US11210674B2 (en) | 2010-11-29 | 2021-12-28 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US11223619B2 (en) | 2010-11-29 | 2022-01-11 | Biocatch Ltd. | Device, system, and method of user authentication based on user-specific characteristics of task performance |
US11231831B2 (en) | 2015-06-07 | 2022-01-25 | Apple Inc. | Devices and methods for content preview based on touch input intensity |
US11240424B2 (en) | 2015-06-07 | 2022-02-01 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11269977B2 (en) | 2010-11-29 | 2022-03-08 | Biocatch Ltd. | System, apparatus, and method of collecting and processing data in electronic devices |
US11606353B2 (en) | 2021-07-22 | 2023-03-14 | Biocatch Ltd. | System, device, and method of generating and utilizing one-time passwords |
DE102021212800A1 (en) | 2021-11-15 | 2023-05-17 | Continental Automotive Technologies GmbH | Calibrating a touch-sensitive display |
US20240080339A1 (en) * | 2010-11-29 | 2024-03-07 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI496052B (en) * | 2013-12-13 | 2015-08-11 | Ind Tech Res Inst | Auto calibration system and auto calibration method |
KR101967033B1 (en) * | 2014-04-10 | 2019-04-08 | 아이비코리아 유한회사 | Biometric sensor for touch-enabled device |
CN105094405A (en) * | 2014-05-23 | 2015-11-25 | 中兴通讯股份有限公司 | Method and apparatus for automatically adjusting effective contact |
CN105094404A (en) * | 2014-05-23 | 2015-11-25 | 中兴通讯股份有限公司 | Adaptive effective clicking method and device |
CN105718069B (en) * | 2014-12-02 | 2020-01-31 | 联想(北京)有限公司 | Information processing method and electronic equipment |
JP6055459B2 (en) * | 2014-12-17 | 2016-12-27 | 京セラドキュメントソリューションズ株式会社 | Touch panel device and image processing device |
CN106155502A (en) * | 2015-03-25 | 2016-11-23 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
CN106814958A (en) * | 2015-12-01 | 2017-06-09 | 小米科技有限责任公司 | The touch control method and device of function key |
TWI557620B (en) * | 2015-12-30 | 2016-11-11 | 奕力科技股份有限公司 | Splicing touch screen apparatus and touch detection method for touch screens thereof |
KR102535056B1 (en) * | 2016-08-03 | 2023-05-22 | 삼성전자 주식회사 | An electronic apparautus and mehtod of recognizing a touch in the apparatus |
WO2020047869A1 (en) * | 2018-09-07 | 2020-03-12 | 深圳柔显系统技术有限公司 | Method for determining touch sensing distance, electronic apparatus, and terminal device |
CN109683775B (en) * | 2018-12-12 | 2021-07-06 | 歌尔科技有限公司 | Projection-based interaction method, projection equipment and storage medium |
CN110456978B (en) * | 2019-08-13 | 2021-06-01 | 青度互娱(重庆)科技有限公司 | Touch control method, system, terminal and medium for touch terminal |
CN114637454B (en) * | 2020-12-16 | 2025-03-25 | 北京搜狗科技发展有限公司 | Input method, device and device for input |
CN113341190B (en) * | 2021-06-09 | 2022-10-21 | 深圳市鼎阳科技股份有限公司 | Channel selection method of digital oscilloscope and storage medium |
CN113867562B (en) * | 2021-08-18 | 2022-11-15 | 荣耀终端有限公司 | Touch screen point reporting correction method and device and electronic equipment |
CN115079867A (en) * | 2022-06-17 | 2022-09-20 | 东莞市泰宇达光电科技有限公司 | Input accuracy detection and correction method for capacitive touch screen |
CN118092773A (en) * | 2024-03-01 | 2024-05-28 | 深圳市摩乐吉科技有限公司 | Handwriting recognition method and device, computer equipment and storage medium |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6353434B1 (en) * | 1998-09-08 | 2002-03-05 | Gunze Limited | Input coordinate transformation apparatus for converting coordinates input from a coordinate input device into coordinates in a display coordinate system for displaying images on a display |
US20020070926A1 (en) * | 2000-12-11 | 2002-06-13 | Xerox Corporation | Touchscreen display calibration using results history |
US6456952B1 (en) * | 2000-03-29 | 2002-09-24 | Ncr Coporation | System and method for touch screen environmental calibration |
US20060066590A1 (en) * | 2004-09-29 | 2006-03-30 | Masanori Ozawa | Input device |
US7256772B2 (en) * | 2003-04-08 | 2007-08-14 | Smart Technologies, Inc. | Auto-aligning touch system and method |
US20100220066A1 (en) * | 2009-02-27 | 2010-09-02 | Murphy Kenneth M T | Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device |
US20110032197A1 (en) * | 2009-08-06 | 2011-02-10 | Canon Kabushiki Kaisha | Information processing apparatus and control method of information processing apparatus |
US20110057896A1 (en) * | 2009-09-04 | 2011-03-10 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling mobile terminal |
US20110102334A1 (en) * | 2009-11-04 | 2011-05-05 | Nokia Corporation | Method and apparatus for determining adjusted position for touch input |
US20120287087A1 (en) * | 2010-02-02 | 2012-11-15 | Zte Corporation | Touch screen calibration parameter obtaining method and device |
US20130201155A1 (en) * | 2010-08-12 | 2013-08-08 | Genqing Wu | Finger identification on a touchscreen |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4377343B2 (en) * | 2005-01-31 | 2009-12-02 | 株式会社東海理化電機製作所 | Touch operation input device |
CN102129332A (en) * | 2011-03-07 | 2011-07-20 | 广东威创视讯科技股份有限公司 | Detection method and device of touch points for image recognition |
CN102736759B (en) * | 2011-04-08 | 2017-02-15 | 富泰华工业(深圳)有限公司 | Touch screen and control method thereof |
- 2011
- 2011-12-27 CN CN201110444042.0A patent/CN103186329B/en active Active
- 2011-12-29 TW TW100149656A patent/TWI547837B/en active
- 2012
- 2012-12-13 US US13/714,313 patent/US9182846B2/en active Active
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6353434B1 (en) * | 1998-09-08 | 2002-03-05 | Gunze Limited | Input coordinate transformation apparatus for converting coordinates input from a coordinate input device into coordinates in a display coordinate system for displaying images on a display |
US6456952B1 (en) * | 2000-03-29 | 2002-09-24 | Ncr Coporation | System and method for touch screen environmental calibration |
US20020070926A1 (en) * | 2000-12-11 | 2002-06-13 | Xerox Corporation | Touchscreen display calibration using results history |
US7256772B2 (en) * | 2003-04-08 | 2007-08-14 | Smart Technologies, Inc. | Auto-aligning touch system and method |
US20060066590A1 (en) * | 2004-09-29 | 2006-03-30 | Masanori Ozawa | Input device |
US20100220066A1 (en) * | 2009-02-27 | 2010-09-02 | Murphy Kenneth M T | Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device |
US20110032197A1 (en) * | 2009-08-06 | 2011-02-10 | Canon Kabushiki Kaisha | Information processing apparatus and control method of information processing apparatus |
US20110057896A1 (en) * | 2009-09-04 | 2011-03-10 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling mobile terminal |
US20110102334A1 (en) * | 2009-11-04 | 2011-05-05 | Nokia Corporation | Method and apparatus for determining adjusted position for touch input |
US20120287087A1 (en) * | 2010-02-02 | 2012-11-15 | Zte Corporation | Touch screen calibration parameter obtaining method and device |
US20130201155A1 (en) * | 2010-08-12 | 2013-08-08 | Genqing Wu | Finger identification on a touchscreen |
Cited By (150)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210329030A1 (en) * | 2010-11-29 | 2021-10-21 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks |
US12101354B2 (en) * | 2010-11-29 | 2024-09-24 | Biocatch Ltd. | Device, system, and method of detecting vishing attacks |
US10586036B2 (en) | 2010-11-29 | 2020-03-10 | Biocatch Ltd. | System, device, and method of recovery and resetting of user authentication factor |
US10474815B2 (en) | 2010-11-29 | 2019-11-12 | Biocatch Ltd. | System, device, and method of detecting malicious automatic script and code injection |
US10621585B2 (en) | 2010-11-29 | 2020-04-14 | Biocatch Ltd. | Contextual mapping of web-pages, and generation of fraud-relatedness score-values |
US10404729B2 (en) | 2010-11-29 | 2019-09-03 | Biocatch Ltd. | Device, method, and system of generating fraud-alerts for cyber-attacks |
US20240080339A1 (en) * | 2010-11-29 | 2024-03-07 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks |
US10728761B2 (en) | 2010-11-29 | 2020-07-28 | Biocatch Ltd. | Method, device, and system of detecting a lie of a user who inputs data |
US11838118B2 (en) * | 2010-11-29 | 2023-12-05 | Biocatch Ltd. | Device, system, and method of detecting vishing attacks |
US11580553B2 (en) | 2010-11-29 | 2023-02-14 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US11425563B2 (en) | 2010-11-29 | 2022-08-23 | Biocatch Ltd. | Method, device, and system of differentiating between a cyber-attacker and a legitimate user |
US10747305B2 (en) | 2010-11-29 | 2020-08-18 | Biocatch Ltd. | Method, system, and device of authenticating identity of a user of an electronic device |
US10776476B2 (en) | 2010-11-29 | 2020-09-15 | Biocatch Ltd. | System, device, and method of visual login |
US11330012B2 (en) * | 2010-11-29 | 2022-05-10 | Biocatch Ltd. | System, method, and device of authenticating a user based on selfie image or selfie video |
US11314849B2 (en) | 2010-11-29 | 2022-04-26 | Biocatch Ltd. | Method, device, and system of detecting a lie of a user who inputs data |
US10298614B2 (en) * | 2010-11-29 | 2019-05-21 | Biocatch Ltd. | System, device, and method of generating and managing behavioral biometric cookies |
US10834590B2 (en) | 2010-11-29 | 2020-11-10 | Biocatch Ltd. | Method, device, and system of differentiating between a cyber-attacker and a legitimate user |
US11269977B2 (en) | 2010-11-29 | 2022-03-08 | Biocatch Ltd. | System, apparatus, and method of collecting and processing data in electronic devices |
US11250435B2 (en) | 2010-11-29 | 2022-02-15 | Biocatch Ltd. | Contextual mapping of web-pages, and generation of fraud-relatedness score-values |
US10262324B2 (en) | 2010-11-29 | 2019-04-16 | Biocatch Ltd. | System, device, and method of differentiating among users based on user-specific page navigation sequence |
US11223619B2 (en) | 2010-11-29 | 2022-01-11 | Biocatch Ltd. | Device, system, and method of user authentication based on user-specific characteristics of task performance |
US10897482B2 (en) | 2010-11-29 | 2021-01-19 | Biocatch Ltd. | Method, device, and system of back-coloring, forward-coloring, and fraud detection |
US10917431B2 (en) | 2010-11-29 | 2021-02-09 | Biocatch Ltd. | System, method, and device of authenticating a user based on selfie image or selfie video |
US11210674B2 (en) | 2010-11-29 | 2021-12-28 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US10949514B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | Device, system, and method of differentiating among users based on detection of hardware components |
US10949757B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | System, device, and method of detecting user identity based on motor-control loop model |
US10345961B1 (en) | 2011-08-05 | 2019-07-09 | P4tents1, LLC | Devices and methods for navigating between user interfaces |
US10275087B1 (en) | 2011-08-05 | 2019-04-30 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10649571B1 (en) | 2011-08-05 | 2020-05-12 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10656752B1 (en) | 2011-08-05 | 2020-05-19 | P4tents1, LLC | Gesture-equipped touch screen system, method, and computer program product |
US10664097B1 (en) | 2011-08-05 | 2020-05-26 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10386960B1 (en) | 2011-08-05 | 2019-08-20 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10365758B1 (en) | 2011-08-05 | 2019-07-30 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10540039B1 (en) | 2011-08-05 | 2020-01-21 | P4tents1, LLC | Devices and methods for navigating between user interface |
US10338736B1 (en) | 2011-08-05 | 2019-07-02 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10114546B2 (en) | 2012-05-09 | 2018-10-30 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US11023116B2 (en) | 2012-05-09 | 2021-06-01 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US11068153B2 (en) | 2012-05-09 | 2021-07-20 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US10942570B2 (en) | 2012-05-09 | 2021-03-09 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US10095391B2 (en) | 2012-05-09 | 2018-10-09 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
US10908808B2 (en) | 2012-05-09 | 2021-02-02 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
US11221675B2 (en) | 2012-05-09 | 2022-01-11 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US10884591B2 (en) | 2012-05-09 | 2021-01-05 | Apple Inc. | Device, method, and graphical user interface for selecting object within a group of objects |
US10073615B2 (en) | 2012-05-09 | 2018-09-11 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US11314407B2 (en) | 2012-05-09 | 2022-04-26 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US10782871B2 (en) | 2012-05-09 | 2020-09-22 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US10191627B2 (en) | 2012-05-09 | 2019-01-29 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US10042542B2 (en) | 2012-05-09 | 2018-08-07 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
US10775999B2 (en) | 2012-05-09 | 2020-09-15 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US10969945B2 (en) | 2012-05-09 | 2021-04-06 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
US10775994B2 (en) | 2012-05-09 | 2020-09-15 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
US10996788B2 (en) | 2012-05-09 | 2021-05-04 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US12067229B2 (en) | 2012-05-09 | 2024-08-20 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US11354033B2 (en) | 2012-05-09 | 2022-06-07 | Apple Inc. | Device, method, and graphical user interface for managing icons in a user interface region |
US10175757B2 (en) | 2012-05-09 | 2019-01-08 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface |
US11010027B2 (en) | 2012-05-09 | 2021-05-18 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US10592041B2 (en) | 2012-05-09 | 2020-03-17 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US10175864B2 (en) | 2012-05-09 | 2019-01-08 | Apple Inc. | Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity |
US11947724B2 (en) | 2012-05-09 | 2024-04-02 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US10168826B2 (en) | 2012-05-09 | 2019-01-01 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US10126930B2 (en) | 2012-05-09 | 2018-11-13 | Apple Inc. | Device, method, and graphical user interface for scrolling nested regions |
US10496260B2 (en) | 2012-05-09 | 2019-12-03 | Apple Inc. | Device, method, and graphical user interface for pressure-based alteration of controls in a user interface |
US10481690B2 (en) | 2012-05-09 | 2019-11-19 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for media adjustment operations performed in a user interface |
US12045451B2 (en) | 2012-05-09 | 2024-07-23 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US9507500B2 (en) | 2012-10-05 | 2016-11-29 | Tactual Labs Co. | Hybrid systems and methods for low-latency user input processing and feedback |
US9927959B2 (en) | 2012-10-05 | 2018-03-27 | Tactual Labs Co. | Hybrid systems and methods for low-latency user input processing and feedback |
US9223438B2 (en) * | 2012-12-21 | 2015-12-29 | Lg Display Co., Ltd. | Method of compensating reference data and touch screen apparatus using the method |
US20140176509A1 (en) * | 2012-12-21 | 2014-06-26 | Lg Display Co., Ltd. | Method of compensating reference data and touch screen apparatus using the method |
US10620781B2 (en) | 2012-12-29 | 2020-04-14 | Apple Inc. | Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics |
US10101887B2 (en) | 2012-12-29 | 2018-10-16 | Apple Inc. | Device, method, and graphical user interface for navigating user interface hierarchies |
US12135871B2 (en) | 2012-12-29 | 2024-11-05 | Apple Inc. | Device, method, and graphical user interface for switching between user interfaces |
US10915243B2 (en) | 2012-12-29 | 2021-02-09 | Apple Inc. | Device, method, and graphical user interface for adjusting content selection |
US10185491B2 (en) | 2012-12-29 | 2019-01-22 | Apple Inc. | Device, method, and graphical user interface for determining whether to scroll or enlarge content |
US12050761B2 (en) | 2012-12-29 | 2024-07-30 | Apple Inc. | Device, method, and graphical user interface for transitioning from low power mode |
US10175879B2 (en) | 2012-12-29 | 2019-01-08 | Apple Inc. | Device, method, and graphical user interface for zooming a user interface while performing a drag operation |
US10437333B2 (en) | 2012-12-29 | 2019-10-08 | Apple Inc. | Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture |
US10078442B2 (en) | 2012-12-29 | 2018-09-18 | Apple Inc. | Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity threshold |
US10037138B2 (en) | 2012-12-29 | 2018-07-31 | Apple Inc. | Device, method, and graphical user interface for switching between user interfaces |
US10831342B2 (en) * | 2013-02-05 | 2020-11-10 | Tencent Technology (Shenzhen) Company Limited | Method used by mobile terminal to return to home screen, mobile terminal and storage medium |
US20150339012A1 (en) * | 2013-02-05 | 2015-11-26 | Tencent Technology (Shenzhen) Company Limited | Method used by mobile terminal to return to home screen, mobile terminal and storage medium |
US10345994B2 (en) * | 2013-02-05 | 2019-07-09 | Tencent Technology (Shenzhen) Company Limited | Method used by mobile terminal to return to home screen, mobile terminal and storage medium |
US20150309597A1 (en) * | 2013-05-09 | 2015-10-29 | Kabushiki Kaisha Toshiba | Electronic apparatus, correction method, and storage medium |
US9632615B2 (en) | 2013-07-12 | 2017-04-25 | Tactual Labs Co. | Reducing control response latency with defined cross-control behavior |
WO2015006776A1 (en) * | 2013-07-12 | 2015-01-15 | Tactual Labs Co. | Reducing control response latency with defined cross-control behavior |
EP3077897A1 (en) * | 2013-12-03 | 2016-10-12 | Microsoft Technology Licensing, LLC | User interface adaptation from an input source identifier change |
WO2016053239A1 (en) | 2014-09-29 | 2016-04-07 | Hewlett-Packard Development Company, L.P. | Virtual keyboard |
EP3201737B1 (en) * | 2014-09-29 | 2021-10-27 | Hewlett-Packard Development Company, L.P. | Virtual keyboard |
US10133412B2 (en) * | 2014-10-07 | 2018-11-20 | General Electric Company | Intuitive touch screen calibration device and method |
US10268342B2 (en) | 2015-03-08 | 2019-04-23 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10613634B2 (en) | 2015-03-08 | 2020-04-07 | Apple Inc. | Devices and methods for controlling media presentation |
US10387029B2 (en) | 2015-03-08 | 2019-08-20 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US10268341B2 (en) | 2015-03-08 | 2019-04-23 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10338772B2 (en) | 2015-03-08 | 2019-07-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10180772B2 (en) | 2015-03-08 | 2019-01-15 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10402073B2 (en) | 2015-03-08 | 2019-09-03 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US10860177B2 (en) | 2015-03-08 | 2020-12-08 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
US11112957B2 (en) | 2015-03-08 | 2021-09-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US10067645B2 (en) | 2015-03-08 | 2018-09-04 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US11977726B2 (en) | 2015-03-08 | 2024-05-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US10222980B2 (en) | 2015-03-19 | 2019-03-05 | Apple Inc. | Touch input cursor manipulation |
US10599331B2 (en) | 2015-03-19 | 2020-03-24 | Apple Inc. | Touch input cursor manipulation |
US11550471B2 (en) | 2015-03-19 | 2023-01-10 | Apple Inc. | Touch input cursor manipulation |
US11054990B2 (en) | 2015-03-19 | 2021-07-06 | Apple Inc. | Touch input cursor manipulation |
US9818171B2 (en) * | 2015-03-26 | 2017-11-14 | Lenovo (Singapore) Pte. Ltd. | Device input and display stabilization |
US10152208B2 (en) | 2015-04-01 | 2018-12-11 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10067653B2 (en) | 2015-04-01 | 2018-09-04 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10455146B2 (en) | 2015-06-07 | 2019-10-22 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11231831B2 (en) | 2015-06-07 | 2022-01-25 | Apple Inc. | Devices and methods for content preview based on touch input intensity |
US11835985B2 (en) | 2015-06-07 | 2023-12-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10303354B2 (en) | 2015-06-07 | 2019-05-28 | Apple Inc. | Devices and methods for navigating between user interfaces |
EP3196750B1 (en) * | 2015-06-07 | 2019-12-11 | Apple Inc. | Devices and methods for navigating between user interfaces |
US11681429B2 (en) | 2015-06-07 | 2023-06-20 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10841484B2 (en) | 2015-06-07 | 2020-11-17 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
US10705718B2 (en) | 2015-06-07 | 2020-07-07 | Apple Inc. | Devices and methods for navigating between user interfaces |
US11240424B2 (en) | 2015-06-07 | 2022-02-01 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10719765B2 (en) | 2015-06-25 | 2020-07-21 | Biocatch Ltd. | Conditional behavioral biometrics |
US11238349B2 (en) | 2015-06-25 | 2022-02-01 | Biocatch Ltd. | Conditional behavioural biometrics |
US10834090B2 (en) | 2015-07-09 | 2020-11-10 | Biocatch Ltd. | System, device, and method for detection of proxy server |
US11323451B2 (en) | 2015-07-09 | 2022-05-03 | Biocatch Ltd. | System, device, and method for detection of proxy server |
US10523680B2 (en) * | 2015-07-09 | 2019-12-31 | Biocatch Ltd. | System, device, and method for detecting a proxy server |
US11327648B2 (en) | 2015-08-10 | 2022-05-10 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10884608B2 (en) | 2015-08-10 | 2021-01-05 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10698598B2 (en) | 2015-08-10 | 2020-06-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10248308B2 (en) | 2015-08-10 | 2019-04-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures |
US10754542B2 (en) | 2015-08-10 | 2020-08-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10416800B2 (en) | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
US10203868B2 (en) | 2015-08-10 | 2019-02-12 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10209884B2 (en) | 2015-08-10 | 2019-02-19 | Apple Inc. | Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback |
US10162452B2 (en) | 2015-08-10 | 2018-12-25 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US11182017B2 (en) | 2015-08-10 | 2021-11-23 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10963158B2 (en) | 2015-08-10 | 2021-03-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11740785B2 (en) | 2015-08-10 | 2023-08-29 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US9778792B2 (en) * | 2015-11-12 | 2017-10-03 | Dell Products L.P. | Information handling system desktop surface display touch input compensation |
US20170139535A1 (en) * | 2015-11-12 | 2017-05-18 | Dell Products L.P. | Information Handling System Desktop Surface Display Touch Input Compensation |
US11055395B2 (en) | 2016-07-08 | 2021-07-06 | Biocatch Ltd. | Step-up authentication |
US20180095596A1 (en) * | 2016-09-30 | 2018-04-05 | Biocatch Ltd. | System, device, and method of estimating force applied to a touch surface |
US10198122B2 (en) * | 2016-09-30 | 2019-02-05 | Biocatch Ltd. | System, device, and method of estimating force applied to a touch surface |
US10579784B2 (en) | 2016-11-02 | 2020-03-03 | Biocatch Ltd. | System, device, and method of secure utilization of fingerprints for user authentication |
US10685355B2 (en) * | 2016-12-04 | 2020-06-16 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US10397262B2 (en) | 2017-07-20 | 2019-08-27 | Biocatch Ltd. | Device, system, and method of detecting overlay malware |
US10963097B2 (en) * | 2017-09-18 | 2021-03-30 | Lenovo (Beijing) Co., Ltd. | Method, electronic device, and apparatus for touch-region calibration |
US10970394B2 (en) | 2017-11-21 | 2021-04-06 | Biocatch Ltd. | System, device, and method of detecting vishing attacks |
US10694078B1 (en) | 2019-02-19 | 2020-06-23 | Volvo Car Corporation | Motion sickness reduction for in-vehicle displays |
US11606353B2 (en) | 2021-07-22 | 2023-03-14 | Biocatch Ltd. | System, device, and method of generating and utilizing one-time passwords |
DE102021212800A1 (en) | 2021-11-15 | 2023-05-17 | Continental Automotive Technologies GmbH | Calibrating a touch-sensitive display |
Also Published As
Publication number | Publication date |
---|---|
CN103186329A (en) | 2013-07-03 |
TWI547837B (en) | 2016-09-01 |
CN103186329B (en) | 2017-08-18 |
TW201327303A (en) | 2013-07-01 |
US9182846B2 (en) | 2015-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9182846B2 (en) | | Electronic device and touch input control method for touch coordinate compensation |
US7810247B2 (en) | | Method for entering commands and/or characters for a portable communication device equipped with a tilt sensor |
US10642933B2 (en) | | Method and apparatus for word prediction selection |
US8358277B2 (en) | | Virtual keyboard based activation and dismissal |
US8421756B2 (en) | | Two-thumb qwerty keyboard |
US20090066659A1 (en) | | Computer system with touch screen and separate display screen |
US20130159927A1 (en) | | Electronic device with touch screen and screen unlocking method thereof |
JP5728629B2 (en) | | Information processing apparatus, information processing apparatus control method, program, and information storage medium |
CN106662974A (en) | | Probabilistic touch sensing |
US20160266659A1 (en) | | Method and apparatus for word prediction using the position of a non-typing digit |
US20100033428A1 (en) | | Cursor moving method and apparatus for portable terminal |
US20140104179A1 (en) | | Keyboard Modification to Increase Typing Speed by Gesturing Next Character |
CN110297592B (en) | | Processing method and electronic equipment |
US20140168106A1 (en) | | Apparatus and method for processing handwriting input |
EP2778859B1 (en) | | Method and apparatus for word prediction using the position of a non-typing digit |
KR101919841B1 (en) | | Method and system for calibrating touch error |
US20210048937A1 (en) | | Mobile Device and Method for Improving the Reliability of Touches on Touchscreen |
WO2012169188A1 (en) | | Information device and display control method |
CA2846561C (en) | | Method and apparatus for word prediction selection |
US8531412B1 (en) | | Method and system for processing touch input |
JP5212919B2 (en) | | Information processing device |
EP2778860A1 (en) | | Method and apparatus for word prediction selection |
CN105718069A (en) | | Information processing method and electronic equipment |
KR20090126575A (en) | | A user interactive device having a touch pad, a method of obtaining control data for controlling the user interactive device, and a method of performing user authentication in the user interactive device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: HON HAI PRECISION INDUSTRY CO., LTD., TAIWAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: PENG, LI-XIA; ZHENG, ZHANG-YONG; WANG, FEI; AND OTHERS; REEL/FRAME: 029467/0146. Effective date: 20121212. Owner name: FU TAI HUA INDUSTRY (SHENZHEN) CO., LTD., CHINA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: PENG, LI-XIA; ZHENG, ZHANG-YONG; WANG, FEI; AND OTHERS; REEL/FRAME: 029467/0146. Effective date: 20121212 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |