Patent abstract:
An information processing apparatus, a control method therefor, and a storage medium storing the control method. The information processing apparatus, configured to acquire, one by one, position information of one or more touched points detected on a touch panel at a predetermined detection period, retains the position information of each of the one or more touched points in a holding unit, acquires the number of touched points whose position information is held in the holding unit, determines, at each predetermined detection period, that the latest position information of all touched points whose position information is retained in the holding unit has been specified, in a case where the acquired number of touched points is two or more, and decides the multi-touch operation to be executed in the information processing apparatus after the determination.
Publication number: BR102013004710B1
Application number: R102013004710-4
Filing date: 2013-02-27
Publication date: 2021-06-29
Inventor: Hikaru Ito
Applicant: Canon Kabushiki Kaisha
IPC main classification:
Patent description:

BACKGROUND OF THE INVENTION
Field of the Invention
[0001] The present invention relates to a technique for reducing erroneous operations in an apparatus in which a multi-touch operation is possible.
DESCRIPTION OF RELATED ART
[0002] Touch input devices have recently become widely used. Such devices receive, in response to a user touching a screen with a finger or a pointed instrument, the X and Y coordinate values of the touched position as input values, and perform various types of processing based on those input values.
[0003] In addition, a multi-touch technique, in which an operation is performed by touching a plurality of points on a screen, has recently been developed. Typically, as a multi-touch operation on a touch panel, an operation of moving two points touched by a user toward or away from each other is referred to as a "pinch" operation. Among pinch operations, the operation of moving the two points closer together is called a "pinch-in" operation, by which the displayed image is reduced. On the other hand, the operation of moving the two points apart is referred to as a "pinch-out" operation, by which the displayed image is enlarged.
[0004] In a technique discussed in Japanese Patent Application Laid-Open No. 2011-059952, when at least one of two fingers touching the screen moves, a pinch operation is recognized according to the change in distance between the two fingers, the magnification ratio of an image is changed, and simultaneously the displayed image is scrolled according to the direction of finger movement.
SUMMARY OF THE INVENTION
[0005] The present invention relates to an information processing apparatus, in which a multi-touch operation is possible, capable of reducing erroneous operations performed contrary to the user's intention.
[0006] Further features and aspects of the present invention will become apparent from the following detailed description of the exemplary embodiments with reference to the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features and aspects of the invention and, together with the description, serve to explain the principles of the invention.
[0008] Fig. 1A is a block diagram illustrating an exemplary hardware configuration of an information processing apparatus.
[0009] Fig. 1B is a block diagram illustrating an exemplary function configuration of the information processing apparatus.
[0010] Fig. 2 is a flowchart illustrating a processing flow performed by the information processing apparatus for recognizing a multi-touch operation.
[0011] Fig. 3A is a flowchart illustrating an exemplary touch detection processing performed by the information processing apparatus. Fig. 3B is a flowchart illustrating an example of output processing performed by the information processing apparatus.
[0012] Fig. 4A is a diagram illustrating a multi-drag operation performed by a user. Fig. 4B is a diagram illustrating an exemplary distribution of touched points detected when a user performs a multi-drag operation.
[0013] Figs. 5A to 5F are tables illustrating examples of information held for detected touched points.
[0014] Fig. 6 is a flowchart illustrating a flow for recognizing a multi-touch operation of the information processing apparatus.
[0015] Fig. 7A is a diagram illustrating an exemplary distribution of touched points detected when a user performs a pinch operation. Figs. 7B to 7E are tables illustrating examples of information held for detected touched points.
[0016] Fig. 8 is a block diagram illustrating an exemplary function configuration of the information processing apparatus.
[0017] Fig. 9 is a flowchart illustrating a flow for recognizing a multi-touch operation of the information processing apparatus.
[0018] Fig. 10 is a flowchart illustrating an example flow of motion vector generation processing performed by the information processing apparatus.
[0019] Figs. 11A to 11F are diagrams illustrating exemplary distributions of touched points detected when a user performs touch operations.
DESCRIPTION OF THE EMBODIMENTS
[0020] Various embodiments, features and exemplary aspects of the invention will be described in detail below with reference to the drawings.
[0021] The exemplary embodiments described below are merely examples, and the present invention is not limited thereto.
[0022] In a first exemplary embodiment, a multi-touch state, in which a user touches two points at a time, is handled. An example is described in which the content of an operation performed by a user on an information processing apparatus 100 is determined in response to the completion of an update of the position information of both points, in order to recognize a multi-touch operation that satisfies the user's intent.
[0023] A case is described as an exemplary multi-touch operation in which a user intends to move two touched points in the same direction while keeping the distance between them (herein referred to as a "multi-drag" operation).
[0024] Conventionally, in order to recognize a multi-touch operation, the information on touched points detected by a touch panel is notified one point at a time, and processing is performed based on the position information obtained for each point. Therefore, of two touched points moved by a user to perform a multi-drag operation, at the point in time when the post-movement position of the first point is acquired, the pre-movement position of the second point is still held. As a result, it is determined that the "distance between the two points" has increased by a distance corresponding to the amount of movement of the user's finger.
[0025] Then, at the point in time when the post-movement position of the second point is obtained, the distance between the two points returns to the distance before the movement. As a result, it is determined that the "distance between the two points" has decreased.
[0026] Therefore, when a user intends to perform a multi-drag operation, a pinch-out and a pinch-in are alternately notified based on the results of the determinations described above. As a result, an abnormal display, in which the displayed image is alternately enlarged and reduced contrary to the user's intent, is performed on the device.
[0027] In contrast, in the present exemplary embodiment, the position information of a plurality of touched points constituting a multi-touch operation is obtained and, in response to the completion of an update of the position information of all obtained touched points, the content of the operation performed by the user is determined.
[0028] Fig. 1A is a diagram illustrating an exemplary hardware configuration of an information processing apparatus 100 in accordance with the present exemplary embodiment. A central processing unit (CPU) 101 performs calculations and logical determinations for various types of processing and controls each component connected to a system bus 110.
[0029] The information processing apparatus 100 is provided with memories including a program memory and a data memory. A read-only memory (ROM) 102 is a program memory for storing programs for controlling various types of processing procedures of the CPU 101, described below. A random access memory (RAM) 103 includes a work area for the above-described programs of the CPU 101, a data save area for error processing, and a load area for the above-described programs. The program memory can also be realized by loading programs from a storage device 109 into the RAM 103.
[0030] A hard disk (HD) 104 stores data and programs in accordance with the present exemplary embodiment. In the present exemplary embodiment, the HD 104 stores a plurality of image data. An external storage device 109 connected via an input/output interface (I/F) 107 can be used in place of the HD 104.
[0031] The external storage device 109 can be configured with, for example, a recording medium and an external storage drive for accessing the medium. Such a medium includes, for example, a floppy disk (FD), a compact disc ROM (CD-ROM), a digital versatile disc (DVD), a universal serial bus (USB) memory, a magneto-optical disk, and a flash memory.
[0032] Furthermore, the external storage device 109 can be a server device connected via a network. In the present exemplary embodiment, the necessary information is stored in RAM 103, HD 104, and external storage device 109.
[0033] An input interface (I/F) 105 controls an input device, such as a pointing device, to obtain a signal output from the input device. An output I/F 106 issues a control signal to control the output of the results of various types of processing, described below, to an output device including a display unit, such as a liquid crystal display or a television monitor.
[0034] In the present exemplary embodiment, a touch panel display 108, integrated in information processing apparatus 100, includes a touch panel (input device) and a display device (output device). However, an external device connected to the information processing apparatus 100 can be used, and independent devices can also be used.
[0035] From the touch panel serving as the input device according to the present exemplary embodiment, a signal indicating detection of a point touched or released by a user is notified to the information processing apparatus 100 as a touch event. At that time, the input I/F 105 acquires the touch events, one by one, detected sequentially by scanning the touch detection area.
[0036] In the present exemplary embodiment, a capacitance detection type touch panel is used, and the coordinates of one point in the contact area between the user and the panel surface are specified as a touched point to notify the touch event. However, the touch panel type is not limited to the capacitance detection type.
[0037] Fig. 1B is a function block diagram illustrating a function configuration of the information processing apparatus 100 in accordance with the present exemplary embodiment. The information processing apparatus 100 includes a detection unit 111, a first acquisition unit 112, a determination unit 114, a second acquisition unit 115, a decision unit 116, a pinch control unit 117, and a display control unit 118.
[0038] These function units are implemented by the CPU 101 loading the programs stored in the ROM 102 into the RAM 103 and executing them. Furthermore, in the present exemplary embodiment, a holding unit 113 is configured by the RAM 103, and a storage unit 119 is a function unit of the HD 104. Hereinafter, each component will be described.
[0039] The detection unit 111 sequentially detects the touch events notified by the touch panel monitor 108, one by one, and analyzes them.
[0040] The first acquisition unit 112 acquires information on the touched points detected by the detection unit 111. Then, information indicating the position of a touched point and the moment at which the touch is detected (detection time) is associated with a touched point ID, and then stored in the holding unit 113 for each touched point. Using the ID, the same touched point can be identified.
[0041] Therefore, the first acquisition unit 112 updates the information held in the holding unit 113 every time it acquires touched point information with the same ID. By associating the detection order of touched points with IDs, control becomes easier when a plurality of touched points is detected.
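As a minimal sketch of the holding unit described in the two paragraphs above (the record fields mirror the ID, coordinates, detection time, and flags described in this specification; all names are illustrative assumptions, not the actual implementation):

```python
from dataclasses import dataclass

@dataclass
class TouchedPoint:
    x: float               # X coordinate of the touched position
    y: float               # Y coordinate of the touched position
    time_ms: int           # detection time notified with the touch event
    updated: bool = True   # update flag: position refreshed this period
    moved: bool = False    # movement flag: position changed since the last period

# Holding unit: information is keyed by the touched point ID, so a touch
# event carrying an already-detected ID overwrites the same record.
holding_unit: dict[int, TouchedPoint] = {}

def on_touch_event(point_id: int, x: float, y: float, time_ms: int) -> None:
    previous = holding_unit.get(point_id)
    moved = previous is not None and (x, y) != (previous.x, previous.y)
    holding_unit[point_id] = TouchedPoint(x, y, time_ms, updated=True, moved=moved)
```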
[0042] The determination unit 114 refers to the information held in the holding unit 113 to determine whether the position information of all detected touched points has been updated.
[0043] Based on the information acquired by the first acquisition unit 112, the second acquisition unit 115 calculates the distance between two touched points, the change in that distance before and after the positions of the touched points are updated, and its absolute value.
[0044] Specifically, the second acquisition unit 115 calculates the shortest distance based on the coordinates of the two touched points acquired by the first acquisition unit 112 and retains it. Then, the second acquisition unit 115 performs a comparison operation between the newly calculated distance between the two points after the touched points are updated and the retained distance, to obtain the changed value and its absolute value.
[0045] The decision unit 116 acquires the absolute value of the change in distance between the two points obtained by the second acquisition unit 115 and compares it with a predetermined threshold value. In the present exemplary embodiment, if the acquired changed value is equal to or greater than the predetermined threshold value, the decision unit 116 decides that the operation performed by the user is a pinch operation (first operation). On the other hand, if the acquired changed value is less than the predetermined threshold value, the decision unit 116 decides that the operation performed by the user is a multi-drag operation (second operation).
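The decision rule can be condensed into the following sketch, assuming straight-line distances and the 10-point threshold used later in operation example 1 (function and constant names are hypothetical):

```python
import math

THRESHOLD_A = 10.0  # threshold value "a", in points

def distance(p1: tuple[float, float], p2: tuple[float, float]) -> float:
    # Shortest (straight-line) distance between two touched points
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def decide_operation(previous_distance: float, current_distance: float) -> str:
    # Compare the absolute change in the two-point distance with threshold "a"
    if abs(current_distance - previous_distance) >= THRESHOLD_A:
        return "pinch"       # first operation: the distance changed noticeably
    return "multi-drag"      # second operation: the distance was roughly kept
```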
[0046] If the decision unit 116 decides that a pinch operation has been input, the pinch control unit 117 determines whether a pinch-out or a pinch-in operation has started, based on the changed value acquired by the second acquisition unit 115.
[0047] The display control unit 118 acquires the information processed by the decision unit 116 and generates a display image for display on an output device, which is a touch panel monitor in the present embodiment. On the monitor, before the multi-touch operation is performed by the user, at least one of a plurality of image data stored in the storage unit 119 is displayed.
[0048] If the decision unit 116 decides that the input operation is a pinch operation, the display control unit 118 acquires a parameter from the pinch control unit 117 and generates a display image, produced by enlarging or reducing the image displayed before the user performs the operation. On the other hand, if the decision unit 116 decides that the input operation is a multi-drag operation, in the present exemplary embodiment, the display control unit 118 generates, based on the direction in which the touched points are moved, a display image for displaying a specific image among the plurality of image data stored in the storage unit 119.
[0049] Fig. 2 is a flowchart illustrating a flow for recognizing a multi-touch operation by information processing apparatus 100, in accordance with the present exemplary embodiment.
[0050] The information processing apparatus 100, in accordance with the present exemplary embodiment, starts the processing for recognizing the multi-touch operation in response to the display, on the touch panel monitor 108, of at least one of the image data stored in the storage unit 119.
[0051] First, in step S201, a touched point is acquired. In the present exemplary embodiment, the detection unit 111 acquires the touch events, one by one, detected by scanning the input area 401 of the touch panel, and analyzes them.
[0052] Thereafter, the first acquisition unit 112 holds, in the holding unit 113, the information on each touched point detected in the input area 401. In the present exemplary embodiment, the touched point ID, the position information expressed by coordinates, and the acquisition time are held in the holding unit 113. Details of the processing performed in step S201 will be described below.
[0053] In step S202, the determination unit 114 determines whether a plurality of touched points is detected in the input area 401. In the present exemplary embodiment, since each touched point is identified by an ID, the number of touched points is obtained by referring to the number of IDs among the pieces of information stored in the holding unit 113.
[0054] In the present exemplary embodiment, the information processing apparatus 100 determines whether a multi-touch operation, input using two touched points, is a multi-drag or a pinch operation. Therefore, the description below assumes that up to two touched points are detected in the input area 401.
[0055] If it is determined that two or more touched points have been detected (YES in step S202), processing proceeds to step S203. On the other hand, if it is determined that two or more touched points have not been detected (NO in step S202), processing ends.
[0056] In step S203, the determination unit 114 determines whether the position information of all touched points whose information is held in the holding unit 113 is up to date. In the present exemplary embodiment, in step S201, the first acquisition unit 112 sets an update flag when the touched point information is updated and, in step S203, the determination unit 114 performs the determination based on the update flags.
[0057] The first time step S203 is processed, the result is always "YES". If the position information of all touched points is determined to be up to date (YES in step S203), the update flags are reset, and processing proceeds to step S204. If the position information of all touched points is determined not to be up to date (NO in step S203), processing ends.
[0058] Furthermore, the method of determining whether the touched points are up to date is not limited to this, and other methods can be used. For example, the determination unit 114 can refer to the detection time of each touched point held in the holding unit 113, and perform the determination based on whether the information of all touched points has been updated after the moment at which the processing of step S203 was last determined to be "YES".
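A condensed sketch of the determination in step S203, representing the per-ID update flags as a dictionary (names are illustrative assumptions):

```python
def all_points_updated(update_flags: dict[int, bool]) -> bool:
    # Step S203: the latest position information of every held touched
    # point is specified once every ID carries a set update flag.
    return all(update_flags.values())

def reset_update_flags(update_flags: dict[int, bool]) -> None:
    # The flags are cleared after a "YES" determination so that the
    # next detection period starts fresh.
    for point_id in update_flags:
        update_flags[point_id] = False
```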
[0059] As described above, in the present exemplary embodiment, the latest position information of all touched points is specified by the fact that the position information of all held touched points has been updated.
[0060] In step S204, the decision unit 116 decides whether at least one touched point has moved. In the present exemplary embodiment, in step S201, when new position information is obtained for a touched point having the same ID as an already detected one, a movement flag is set and held in the holding unit 113.
[0061] Therefore, in step S204, the decision unit 116 can decide whether a touched point has moved by checking the movement flags in the holding unit 113. If it is determined that at least one touched point has moved (YES in step S204), the movement flag is reset and processing proceeds to step S206.
[0062] If it is determined that no touched point has moved (NO in step S204), processing proceeds to step S205.
[0063] Furthermore, the method of deciding whether a touched point has moved is not limited to this. For example, the position information before the update can be kept when the information in the holding unit 113 is updated, and the decision unit 116 can decide whether a touched point has moved based on whether the distance between the positions before and after the update is greater than a predetermined distance.
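The alternative movement test just mentioned amounts to comparing the pre-update and post-update positions against a minimum distance; a hypothetical sketch:

```python
import math

PREDETERMINED_DISTANCE = 5.0  # hypothetical minimum movement, in points

def has_moved(before: tuple[float, float], after: tuple[float, float]) -> bool:
    # Treat the point as moved only when it has travelled at least the
    # predetermined distance between the pre-update and post-update positions.
    return math.hypot(after[0] - before[0], after[1] - before[1]) >= PREDETERMINED_DISTANCE
```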
[0064] In step S205, the second acquisition unit 115 acquires the distance between the two touched points. The second acquisition unit 115 calculates the shortest distance between the two touched points based on the information acquired by the first acquisition unit 112, and retains it.
[0065] On the other hand, in step S206, the second acquisition unit 115 acquires the change in distance between the two touched points detected in the input area 401. The second acquisition unit 115 calculates the shortest distance between the two touched points based on the information acquired by the first acquisition unit 112, and retains it.
[0066] Then, the newly calculated distance between the two points after the update and the distance between the two points last calculated and held in step S205 or step S206 are subjected to a comparison calculation to obtain the changed value of the distance and its absolute value.
[0067] Then, in step S207, the decision unit 116 decides whether the absolute value of the change in distance between the two points acquired by the second acquisition unit 115 is less than a threshold value "a".
[0068] Here, "a" is a threshold value for the distance, registered in advance in the information processing apparatus 100. The threshold value "a" is established based on the change in distance between the two points required for the information processing apparatus 100 to recognize that the distance between the two points touched by the user has been extended or reduced for a pinch operation.
[0069] If it is determined that the changed value (absolute value) is less than the threshold value "a" (YES in step S207), the decision unit 116 decides that the operation input by the user is a multi-drag operation, and processing proceeds to step S208. If it is determined that the changed value (absolute value) is equal to or greater than the threshold value "a" (NO in step S207), the decision unit 116 decides that the user input is a pinch operation, and processing proceeds to step S210.
[0070] In step S208, the decision unit 116 notifies the display control unit 118 that a multi-drag operation has been input. In step S209, the processing corresponding to the multi-drag operation is performed. In the present exemplary embodiment, processing for displaying, on the touch panel monitor, specific image data among the plurality of image data stored in the storage unit 119 is performed. Its details are described below.
[0071] On the other hand, in step S210, the pinch control unit 117 determines whether the input pinch-out or pinch-in operation has started. In the present exemplary embodiment, the pinch control unit 117 determines whether the distance between the two touched points is enlarged or reduced based on the changed value between the two points acquired by the second acquisition unit 115.
[0072] Then, the number of times the distance between the two points is enlarged or reduced is counted. In step S210, if the distance between the two touched points is counted as continuously enlarged twice or more, it is determined that the pinch-out operation has started.
[0073] If the distance between the two touched points is counted as continuously reduced twice or more, it is determined that the pinch-in operation has started. On the other hand, if the distance between the two touched points has been enlarged or reduced only once, neither a pinch-out nor a pinch-in operation is determined. In this case, it is determined that the pinch operation has not started.
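The counting rule of step S210 can be sketched as follows (the detector state and names are assumptions for illustration):

```python
from typing import Optional

class PinchStartDetector:
    """Step S210 sketch: a pinch-out (or pinch-in) is considered started
    only after the two-point distance grows (or shrinks) twice in a row."""

    def __init__(self) -> None:
        self.enlarged_streak = 0
        self.reduced_streak = 0

    def update(self, changed_value: float) -> Optional[str]:
        if changed_value > 0:        # the distance between the two points grew
            self.enlarged_streak += 1
            self.reduced_streak = 0
        elif changed_value < 0:      # the distance between the two points shrank
            self.reduced_streak += 1
            self.enlarged_streak = 0
        if self.enlarged_streak >= 2:
            return "pinch-out"
        if self.reduced_streak >= 2:
            return "pinch-in"
        return None                  # a single change: pinch not yet started
```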
[0074] This processing is performed to avoid erroneous operations even when an error occurs in which it is erroneously determined in step S207 that the change in distance between the two points (absolute value) is the threshold value "a" or greater, caused by the conditions of the touch panel or the user's fingers. Thus, the user's intention to perform a pinch operation is more reliably reflected.
[0075] If it is determined that a pinch-out or pinch-in operation has started (YES in step S210), processing proceeds to step S211. On the other hand, if it is determined that neither the pinch-out nor the pinch-in operation has started (NO in step S210), processing ends.
[0076] In step S211, the pinch control unit 117 notifies the display control unit 118 that a pinch-out or pinch-in operation has been input. In step S212, the display control unit 118 outputs the result of the input pinch-out or pinch-in operation.
[0077] In the present exemplary embodiment, when the pinch-out operation is input, processing is performed to display the enlarged display image on the monitor according to the distance by which the user moves the touched points. Likewise, when the pinch-in operation is input, processing is performed to display the reduced display image on the monitor according to the distance by which the user moves the touched points.
[0078] As described above, processing to recognize the multi-touch operation by the information processing apparatus 100 is performed. After the series of processing is completed, processing returns to step S201, and each time a new touch event is notified, the processing described above is performed.
[0079] Fig. 3A is a flowchart illustrating a touched point detection processing flow performed in step S201.
[0080] In step S301, the detection unit 111 determines whether the touch event notified by the touch panel is “TOUCH”.
[0081] In the touch panel used in the present exemplary embodiment, "TOUCH" is emitted as a touch event when the input area 401 is scanned and a touch is detected. When the detected touch is released, "RELEASE" is notified as a touch event.
[0082] If the notified touch event is determined to be "TOUCH" (YES in step S301), processing proceeds to step S303. If the notified touch event is not "TOUCH" (i.e., it is "RELEASE") (NO in step S301), processing proceeds to step S302.
[0083] In step S302, the first acquisition unit 112 deletes, from the information held in the holding unit 113, the information associated with the ID corresponding to the touched point at which "RELEASE" is detected, along with the ID itself.
[0084] On the other hand, in step S303, the detection unit 111 determines whether a touched point with the same ID as the touched point at which "TOUCH" is detected has previously been detected. The detection unit 111 refers to the information held in the holding unit 113 and determines whether the corresponding ID is included.
[0085] If it is determined that a touched point with the same ID was previously detected (YES in step S303), processing proceeds to step S305. On the other hand, if it is determined that a touched point with the same ID has not been detected (NO in step S303), processing proceeds to step S304.
[0086] In step S304, the first acquisition unit 112 newly adds information such as the ID of the touched point at which "TOUCH" is detected, the coordinates, and the detection time to the information held in the holding unit 113. At that time, the holding unit 113 holds information indicating that the touched point is up to date; for example, an update flag is set.
[0087] In step S305, among the information held in the holding unit 113, the coordinate information and the detection time associated with the same ID as the touched point at which "TOUCH" is detected are updated. At that time, the holding unit 113 retains information indicating that the touched point is up to date; for example, an update flag is set.
[0088] Furthermore, when the acquired position information has moved a predetermined distance or more from the position of the touched point with the same ID before the update, information indicating that the touched point has moved is held; for example, a movement flag is set.
[0089] Then, processing returns to the main processing (Fig. 2) in which the multi-touch operation is recognized. In the present exemplary embodiment, the predetermined distance is established in advance based on the shortest distance by which a touched point needs to move for a touch operation to be performed by the user. To discriminate a plurality of operations, a plurality of scaled threshold values can be established. In the present exemplary embodiment, when motion is detected within the predetermined distance, the touched point is treated as not moved, that is, stopped.
[0090] In the present exemplary embodiment, the touch panel notifies "TOUCH" as a touch event when the input area 401 is scanned and a touch is detected, and notifies "RELEASE" as a touch event when the detected touched point is released. However, the events are not limited to these.
[0091] For example, "TOUCH_DOWN", "MOVE", and "TOUCH_UP" can be notified as touch events, respectively, when the input area 401 is newly touched, when movement of a touched point with an already detected ID is detected, and when the touched point is released.
[0092] In this case, when "TOUCH_DOWN" is notified, information is newly added to the holding unit 113 and the update flag is set. When "MOVE" is notified, the information with the same ID is updated and the movement flag is set. When "TOUCH_UP" is notified, the information with the same ID can be deleted.
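Reusing the TouchedPoint record and holding_unit dictionary from the earlier sketch, the three-event variant described in this paragraph could be handled as follows (a sketch under the same assumptions; the event names follow the text, everything else is illustrative):

```python
def handle_event(event_type: str, point_id: int, x: float, y: float, time_ms: int) -> None:
    if event_type == "TOUCH_DOWN":
        # New touch: add the record and set the update flag
        holding_unit[point_id] = TouchedPoint(x, y, time_ms, updated=True)
    elif event_type == "MOVE":
        # Known ID moved: refresh the record, set the update and movement flags
        holding_unit[point_id] = TouchedPoint(x, y, time_ms, updated=True, moved=True)
    elif event_type == "TOUCH_UP":
        # Release: discard the record together with its ID
        holding_unit.pop(point_id, None)
```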
[0093] In this way, the information processing apparatus 100 identifies, using IDs, two or more touched points detected by the touch panel serving as the input device, and manages their information. As a result, the movement of each of the plurality of touched points can be detected. Therefore, the multi-touch operation constituted by those touched points can be recognized.
[0094] Fig. 3B is a flowchart illustrating a flow of the output processing of the result of the multi-drag operation performed in step S209. As an exemplary operation invoked by the multi-drag operation, in the present exemplary embodiment, processing is performed to display specific image data on the touch panel monitor from among the plurality of image data stored in the storage unit 119. The plurality of image data stored in the storage unit 119 is stored in a predetermined order.
[0095] In the present exemplary embodiment, if the multi-drag direction is to the left, an image stored ten sheets after the image data displayed before the user operation is displayed. If the multi-drag direction is to the right, an image stored ten sheets before the image data displayed before the user operation is displayed.
[0096] In step S311, the display control unit 118 acquires the direction of the multi-drag performed by the user. In the present exemplary embodiment, for at least one of the touched points acquired by the first acquisition unit 112, the display control unit 118 determines, from the X coordinates of the position information before and after the update, whether the direction in which the user moves the finger is the negative direction or the positive direction of the X axis.
[0097] In step S312, the display control unit 118 specifies the image data to be displayed on the monitor based on the information indicating the direction acquired in step S311. In the present exemplary embodiment, if the acquired direction is the negative direction of the X axis, among the plurality of image data in the storage unit 119, the image data stored ten sheets after the image data displayed before the user operation is specified.
[0098] Similarly, if the acquired direction is the positive direction of the X axis, among the plurality of image data stored in the storage unit 119, the image data stored ten sheets before the image data displayed before the user operation is specified.
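A sketch of steps S311 to S312, assuming the stored images are kept in an ordered list and indexed by position (the clamping of the index is an assumption; the specification does not state the behavior at the ends of the stored order):

```python
def select_image_index(current_index: int, x_before: float, x_after: float,
                       total_images: int, step: int = 10) -> int:
    # Step S311: the drag direction is taken from the X coordinates of one
    # touched point before and after the update.
    if x_after - x_before < 0:
        # Negative X direction (leftward drag): ten sheets after the current image
        new_index = current_index + step
    else:
        # Positive X direction (rightward drag): ten sheets before the current image
        new_index = current_index - step
    # Step S312: clamp to the stored range and return the specified image index
    return max(0, min(total_images - 1, new_index))
```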
[0099] In step S313, the display control unit 118 generates a display image for displaying the specified image data. Then, in step S314, the display control unit 118 outputs the generated display image to the touch panel monitor 108 (output device). Then, processing ends and returns to the main processing (i.e., the flowchart in Fig. 2) for recognizing the multi-touch operation.
[0100] In addition, the display image output of step S314 can be performed in response to the release of the points touched by the user. In this case, after step S313, the touched point detection processing of Fig. 3A is performed, and processing proceeds to step S314 upon completion of the processing of step S302, in which the held information is deleted for all touched points.
[0101] In this way, the user can designate, to the information processing apparatus 100, the timing for displaying a specific image by releasing the fingers used to input the multi-drag operation on the touch panel.
[0102] The operation invoked by the multi-drag operation is not limited to the example described above. For example, the method for specifying the image data to be displayed is not limited to ten sheets after or ten sheets before, and can be determined arbitrarily. Furthermore, a certain operation of the information processing apparatus 100 can be assigned to a multi-drag operation in the Y-axis direction.
[0103] In general, however, image advancing, in which the displayed image is moved in the X-axis direction by a drag to display other images stored before or after it, is performed in many cases. Therefore, the operation of displaying a specific image by advancing multiple images at once, moving the displayed image in the X-axis direction with a multi-drag using a plurality of fingers, is intuitively understandable and convenient for users.
[0104] Operation example 1, in which a user operates the information processing apparatus 100 according to the first exemplary embodiment, is described in detail below.
[0105] Fig. 4 is a diagram illustrating an exemplary operation of information processing apparatus 100 in accordance with the present exemplary embodiment. Fig. 4A illustrates a state in which a user enters an operation into an input area 401 of information processing apparatus 100. Input area 401 is configured of a touch panel monitor including a touch panel over the monitor. Input area 401 can detect a user tapping on the touch panel.
[0106] In the present exemplary embodiment, the entire input area 401 is used as a coordinate plane with the origin at its lower-left position, and the touch panel detects the coordinates of the position touched by the user. In the present exemplary embodiment, as a specific operation example, the multi-drag operation is input by touching the input area 401 with fingers 402 and 403 and moving the fingers 402 and 403 in the horizontal direction.
[0107] Fig. 4B is a diagram illustrating an exemplary displacement of the positions of the touched points obtained by the information processing apparatus 100 when the user performs the multi-drag operation illustrated in Fig. 4A. The touched points A to F illustrate the coordinates of the positions touched by the user's fingers in the input area 401. The horizontal direction is set to be the X axis, and the vertical direction is set to be the Y axis.
[0108] First, the user's fingers 402 and 403 touch the positions corresponding to touched points A and B in the input area 401. The left-side finger 402 touches the position corresponding to touched point A, and the right-side finger 403 touches the position corresponding to touched point B.
[0109] Then, the user moves both fingers horizontally. The positions touched after moving the two fingers correspond to touched points C and D. The left-side finger 402 touches the position corresponding to touched point C, and the right-side finger 403 touches the position corresponding to touched point D.
[0110] The user then moves both fingers horizontally again. The positions touched after moving the two fingers are touched points E and F. The left-side finger 402 touches the position corresponding to touched point E, and the right-side finger 403 touches the position corresponding to touched point F. Hereinafter, the operation of the information processing apparatus 100 according to the present exemplary embodiment will be described along this series of operations.
[0111] In operation example 1, the input area 401 of the touch panel is scanned every 20 ms, and sequentially detected touch events are notified to the detection unit 111. In addition, the threshold value "a" is set to 10 points in advance.
[0112] First, the scan of the input area 401 is started in a state where the user touches the input area 401, and the touch event of touched point A is notified by the touch panel. In step S201, the detection unit 111 of the information processing apparatus 100 detects that touched point A has been newly touched. Then, the first acquisition unit 112 acquires the information that the ID is 1, the position coordinates are (100, 50), and the detection time is 0 ms, and holds it in the holding unit 113.
[0113] Fig. 5A is a table illustrating an example of the information held in the holding unit 113. In step S202, the determination unit 114 refers to the information held in the holding unit 113 to determine, based on the number of IDs, whether two or more touched points have been detected in the input area 401. At this time, as illustrated in Fig. 5A, only one touched point with ID 1 is detected, not two or more touched points (NO in step S202), and therefore processing ends.
[0114] Then, the touch event of touched point B is notified by the touch panel. In step S201 of the information processing apparatus 100, the first acquisition unit 112 similarly acquires the information that the ID is 2, the position coordinates are (110, 40), and the detection time is 5 ms. At that time, the table held in the holding unit 113 is updated as illustrated in Fig. 5B.
[0115] In step S202, the determination unit 114 determines that the number of touched points is two or more (YES in step S202). In step S203, since the processing is being performed for the first time, the determination unit 114 determines that the information of all detected touched points has been updated (YES in step S203). In step S204, the decision unit 116 decides that no touched point has moved (NO in step S204). Then, in step S205, the distance B illustrated in Fig. 4B, which is the shortest distance between the two touched points, is calculated based on their position coordinates.
[0116] In the present exemplary embodiment, based on the coordinates of touched point A (100, 50) and touched point B (110, 40), the distance between the two points is 14 points. The second acquisition unit 115 stores 14 points in the RAM 103 as the distance between the two points at the initial detection time. Then, processing ends.
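For reference, the 14-point value follows from the straight-line distance between the two coordinate pairs, rounded to whole points (the rounding convention is an assumption of this illustration):

$$\sqrt{(110-100)^2 + (40-50)^2} = \sqrt{10^2 + (-10)^2} = \sqrt{200} \approx 14.1 \approx 14 \text{ points}$$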
[0117] Then, when the input area 401 is scanned after 20 ms, the touch event of touched point C is notified by the touch panel. In step S201, the first acquisition unit 112 of the information processing apparatus 100 acquires the information that the ID is 1, the position coordinates are (200, 50), and the detection time is 20 ms, and the information held in the holding unit 113 is updated, as illustrated in Fig. 5C. At that time, the ID of touched point C is 1, that is, identical to that of touched point A, and therefore the information associated with ID 1 is updated.
[0118] In addition, since the position coordinates are different from those detected last time, the movement flag is set. In step S202, the number of touched points is determined to be two or more. In step S203, since the information of the touched point with ID 1 has been updated, but that of the touched point with ID 2 has not, the determination unit 114 determines that the information of not all touched points has been updated (NO in step S203), and processing ends.
[0119] Then, a touch event of touched point D is notified by the touch panel. In step S201, the information processing apparatus 100 similarly acquires the information that the ID is 2, the position coordinates are (210, 40), and the detection time is 25 ms, and the information held in the holding unit 113 is updated, as illustrated in Fig. 5D. At that time, since the ID of touched point D is 2 and identical to that of touched point B, the information associated with ID 2 is updated.
[0120] Also, since the position coordinates are different from those at the last detection time, the movement flag is set. In step S202, the number of touched points is determined to be two, that is, two or more (YES in step S202). Then, in step S203, since the touched points with IDs 1 and 2 have both been updated, it is determined that the position information of all touched points has been updated (YES in step S203). In step S204, based on the movement flags, it is determined that at least one touched point has moved (YES in step S204).
[0121] In step S206, the second acquisition unit 115 acquires the change in distance between the two touched points and its absolute value. First, based on the coordinates of touched point C (200, 50) and the coordinates of touched point D (210, 40), the distance between the two points (distance D in Fig. 4B) is calculated as 14 points.
[0122] Then, the changed value (absolute value) from the distance between the two points acquired by the second acquisition unit 115 last time and held in the RAM 103 is calculated to be 0 points (= 14 - 14).
[0123] Since the threshold value "a" is 10 points, in step S207, the decision unit 116 decides that the changed value of the distance between the two points is less than the threshold value "a" (YES in step S207). Therefore, in step S208, the multi-drag operation is notified, and in step S209, the output is performed.
[0124] According to operation example 1, since the multi-drag operation is performed in the positive direction of the X axis, among the plurality of image data stored in the storage unit 119, the image data stored ten sheets before the image displayed before the user operation is displayed.
[0125] Then, when the input area 401 is scanned after another 20 ms, a touch event of touched point E is notified by the touch panel.
[0126] In step S201, the first acquisition unit 112 of the information processing apparatus 100 acquires the information that the ID is 1, the position coordinates are (300, 50), and the detection time is 40 ms, and the information held in the holding unit 113 is updated, as illustrated in Fig. 5E. At that time, the ID of touched point E is 1, that is, identical to that of touched point C, and therefore the information associated with ID 1 is updated.
[0127] In addition, since the position has moved, the movement flag is set. In step S202, the number of touched points is determined to be two or more. In step S203, since the information of the touched point with ID 1 has been updated, but that of the touched point with ID 2 has not, the determination unit 114 determines that the information of not all touched points has been updated (NO in step S203), and processing ends.
[0128] Then, a touch event of touched point F is notified by the touch panel. In step S201, the first acquisition unit 112 of the information processing apparatus 100 similarly acquires the information that the ID is 2, the position coordinates are (310, 40), and the detection time is 45 ms, and the information held in the holding unit 113 is updated, as illustrated in Fig. 5F. At that time, the ID of touched point F is 2, that is, identical to that of touched point D, and therefore the information associated with ID 2 is updated.
[0129] Also, since the position coordinates are different from those last detected, the movement flag is set. In step S202, the number of touched points is determined to be two, that is, two or more (YES in step S202). Then, in step S203, since the touched points with IDs 1 and 2 have both been updated, it is determined that the position information of all touched points has been updated (YES in step S203). In step S204, it is determined that at least one touched point has moved (YES in step S204).
[0130] In step S206, the second acquisition unit 115 acquires the change in distance between the two touched points and its absolute value. First, based on the coordinates of touched point E (300, 50) and the coordinates of touched point F (310, 40), the distance between the two points (distance F in Fig. 4B) is calculated to be 14 points. Then, the changed value (absolute value) from the distance between the two points acquired by the second acquisition unit 115 last time and held in the RAM 103 is calculated to be 0 points (= 14 - 14).
[0131] Since the threshold value "a" is 10 points, in step S207, the decision unit 116 decides that the change in distance between the two points is less than the threshold value "a" (YES in step S207). Therefore, in step S208, the multi-drag operation is notified, and in step S209, the output is performed.
[0132] According to operation example 1, since the multi-drag operation is performed in the positive direction of the X axis, among the plurality of image data stored in the storage unit 119, the image data stored a further ten sheets before the image displayed in step S209 is displayed.
[0133] When the image data specified in response to the release of the touched points is to be displayed, even if the multi-drag operation is notified more than once, the display image for displaying the image data specified on the first occasion is kept.
[0134] Therefore, in response to the release of all points touched by the user, the image data stored ten sheets before the image displayed before the user operation is displayed. In this way, the user can reliably display the image data ten sheets before or after the displayed image, regardless of the distance of the multi-drag operation.
[0135] As described above, in the present exemplary embodiment, the position information of the two points touched by the user is acquired and, in response to the completion of updating the position information of all touched points, processing is performed to determine the multi-touch operation input by the user into the information processing apparatus 100.
[0136] At that time, by comparing the change in distance between the two points with the threshold value, it is determined whether the input operation is the multi-drag operation or the pinch operation. In this way, when the user moves the two touched points in the same direction while keeping the distance between them, a pinch operation is not erroneously notified. As a result, the erroneous operation in which the displayed image size is changed contrary to the user's intent is avoided.
[0137] Similarly, when the user inputs another multi-touch operation using a plurality of fingers, such as scrolling or rotating the image, by applying the present exemplary embodiment, it is possible to prevent the operation from being misrecognized as a pinch operation.
[0138] In this way, by determining the multi-touch operation input by the user after the position information of all touched points constituting the multi-touch operation has been updated, it is possible to reduce erroneous operations performed contrary to the user's intention.
[0139] In the present exemplary embodiment, after the decision unit 116 decides that the user input is a pinch operation, in step S210, the pinch control unit 117 determines whether the pinch-out operation or the pinch-in operation has started. The processing performed in step S210 can be omitted.
[0140] In that case, in step S211, the decision unit 116 notifies the display control unit 118 that the pinch operation has started. Then, in step S212, the display control unit 118 performs processing to change the size of the displayed image according to the distance by which the user moves the touched points, based on the changed value between the two points acquired by the second acquisition unit 115.
[0141] However, in step S210, by determining whether the operation has started based on the direction of change and the number of changes in the distance between the two points, the display image can be changed after more correctly determining the operation intended by the user.
[0142] For example, if an error occurs in which the changed value (absolute value) of the distance between the two points is determined to be the threshold value "a" or greater due to the state of the touch panel or the user's fingers, the notification that a pinch operation has been input is not performed. Therefore, an effect of avoiding the erroneous operation in which the display magnification of the displayed image is changed contrary to the user's intention can be obtained.
[0143] In the present exemplary embodiment, processing for two touched points is described; however, even if more than two touched points are detected, similar processing can be performed.
[0144] In that case, for example, only two touched points are selected according to the detection order starting from the first touched point, and the same processing described above is performed. Alternatively, if N touched points (more than two points) are detected, the information of all N touched points is acquired and, in step S203, the subsequent processing steps are performed in response to the completion of the information update of all N touched points.
[0145] In the first exemplary embodiment, the touch panel as an input device notifies the information processing apparatus 100 of a touch event each time the touch panel detects a touched point or the release of a touched point by a user. In comparison, in modification example 1, the touch panel notifies the information processing apparatus 100 of a touch event each time the touch panel detects a point touched at a new position by the user, or the release of a touched point. Such a case will be described below.
[0146] The touch panel used in modification example 1 of the first exemplary embodiment does not notify a touch event while a touched point is stationary. Therefore, the touched point information held by the information processing apparatus 100 is not overwritten.
[0147] Therefore, when the processing for recognizing a multi-touch operation according to the first exemplary embodiment is performed and, for example, the user keeps one of the touched points fixed and moves the other touched point to perform a pinch operation, part of the touched point information remains as it is.
[0148] Consequently, if the determination is made based on the completion of updating all the information held by the information processing apparatus 100, as in the first exemplary embodiment, the multi-touch operation may not be recognized. Therefore, in modification example 1 of the first exemplary embodiment, it is determined that the information of all touched points has been specified based on the elapse of a predetermined time from the detection time previously held in the holding unit 113.
[0149] The hardware configuration and function configuration of the information processing apparatus 100 according to modification example 1 are the same as those of the first exemplary embodiment. Meanwhile, the determination unit 114 of modification example 1 of the first exemplary embodiment refers to a clock included in the information processing apparatus 100 and to the information held in the holding unit 113, and determines that the position information of all touched points has been updated when the predetermined time has elapsed since the previous update was performed.
[0150] Fig. 6 is a flowchart illustrating a flow of processing, performed by the information processing apparatus 100 in modification example 1 of the first exemplary embodiment, for recognizing the multi-touch operation. In the steps with the same numerals as those in Fig. 2, the same processing is performed. Therefore, their details are not described, and the parts different from the first exemplary embodiment will be described.
[0151] In modification example 1 of the first exemplary embodiment, in step S202, if it is determined that there are two or more touched points (YES in step S202), then, in step S601, the determination unit 114 determines whether the position information of all touched points held in the holding unit 113 has been updated. In the first time processing, it is always determined to be "YES".
[0152] Then, if it is determined that all position information held by the holding unit 113 has been updated (YES in step S601), processing proceeds to step S204. On the other hand, if it is determined that not all position information held by the holding unit 113 has been updated (NO in step S601), processing proceeds to step S602.
[0153] In step S602, the determination unit 114 determines whether a predetermined time has elapsed since the earliest update among the touched points detected at that time. Here, the predetermined time is set in advance to a time long enough for the touch panel to complete scanning the entire input area 401.
[0154] For example, a time equal to the scan period of the touch panel is set. If two or more touched points are detected, and enough time for the touch panel to complete an entire scan has elapsed since the earliest of the currently held position information was updated, it is known that no movement has been detected for any touched point for which no touch event is notified. Therefore, it can be considered that the position information of all touched points held in the holding unit 113 at that point in time is the latest.
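A sketch of the time-based determination of step S602, keyed by touched point ID (the scan-period constant and names are illustrative assumptions):

```python
SCAN_PERIOD_MS = 20  # predetermined time: long enough for one full scan

def all_points_specified(detection_times_ms: dict[int, int], now_ms: int) -> bool:
    # Step S602: if even the earliest-updated touched point is at least one
    # scan period old, any point without a new touch event is known to be
    # stationary, so the held positions can be treated as the latest.
    earliest = min(detection_times_ms.values())
    return now_ms - earliest >= SCAN_PERIOD_MS
```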
[0155] If the determination unit 114 determines that the predetermined time has elapsed since the previous update of the position information of the touched points (YES in step S602), processing proceeds to step S204. On the other hand, if the determination unit 114 determines that the predetermined time has not elapsed since the previous update of the position information of the touched points (NO in step S602), processing ends.
[0156] Processing other than that described above is performed similarly to the processing for recognizing the multi-touch operation by the information processing apparatus 100 illustrated in Fig. 2.
[0157] Operation example 2, in which a user operates the information processing apparatus 100 according to modification example 1 of the first exemplary embodiment, is described in detail with reference to Figs. 7A to 7E.
[0158] Fig. 7A is a diagram illustrating an example of the displacements of the positions of the touched points acquired by the information processing apparatus 100 when the user performs a pinch operation. Fig. 7A corresponds to Fig. 4B of the first exemplary embodiment. However, the user keeps touched point A held with the left-side finger 402 of Fig. 4A, and moves only the right-side finger 403 to perform a pinch operation in the right direction (the positive direction of the X axis). As a result, touched point B, touched point D, and touched point F are sequentially detected.
[0159] Figs. 7B to 7E are tables illustrating examples of the information held in the holding unit 113 in response to the detection of each touched point. Figs. 7B to 7E correspond to Figs. 5A to 5F of the first exemplary embodiment. In operation example 2 also, the input area 401 of the touch panel is scanned every 20 ms, and detected touch events are notified to the detection unit 111. The predetermined time used to determine whether the position information of all touched points has been updated is set to 20 ms, and the threshold value "a" is set to 10 points. These values are determined in advance.
[0160] First, the scan of the input area 401 is started in a state where the user touches the input area 401, and the touch event of touched point A is notified by the touch panel. In step S201, the detection unit 111 of the information processing apparatus 100 detects that touched point A has been newly touched. Then, as illustrated in Fig. 7B, the first acquisition unit 112 acquires the information that the ID is 1, the position coordinates are (100, 50), and the detection time is 0 ms, and retains the information in the holding unit 113.
[0161] In step S202, the determination unit 114 refers to the held information to determine, based on the number of IDs, whether the number of touched points detected in the input area 401 is two or more. At this point, as illustrated in Fig. 7B, only one touched point with ID 1 is detected, that is, two or more touched points are not detected (NO in step S202), and processing ends.
[0162] Then the touch event of tapped point B is notified on the touch panel. In step S201, the first acquisition unit 112 of the information processing apparatus 100 similarly acquires information that the ID is 2, the position coordinates are (110, 40), and the detection time is 5 ms. At that time, the table retained in the holding unit 113 is updated as illustrated in Fig. 7C.
[0163] In step S202, the determination unit 114 determines that the number of touched points is two or more. In step S601, since this is the first processing, the determination unit 114 determines that the position information of all detected touched points has been updated (YES in step S601).
[0164] In step S204, the decision unit 116 decides that no touched point has been moved (NO in step S204). Then, in step S205, the distance between the two points is calculated from the position coordinates of the touched points (the distance illustrated in Fig. 7A).
[0165] Here, based on the coordinates (100, 50) of touched point A and the coordinates (110, 40) of touched point B, the distance between the two points is calculated as 14 points. The second acquisition unit 115 stores 14 points in the RAM 103 as the distance between the two points at the detection start time, and processing ends.
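For illustration, the step S205 calculation that yields 14 points can be reproduced with a simple Euclidean distance, assuming the coordinates are expressed in the same unit ("points") used throughout the description:

```python
import math

def distance(p, q):
    # Euclidean distance between two touched points, rounded to whole points.
    return round(math.hypot(q[0] - p[0], q[1] - p[1]))

initial = distance((100, 50), (110, 40))  # touched points A and B
print(initial)  # 14 -> stored in the RAM 103 as the detection-start distance
```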
[0166] Then, while the input area 401 is scanned every 20 ms, only the touch event of touched point D is notified by the touch panel. This is because, in the touch panel of modification example 1, a new touch event for the stationary touched point A is not notified.
[0167] Therefore, in step S201, the first acquisition unit 112 of the information processing apparatus 100 acquires the information that the ID is 2, the position coordinates are (210, 40), and the detection time is 25 ms. Then, the information held in the holding unit 113 is updated as illustrated in Fig. 7D.
[0168] At this time, the ID of touched point D is 2. Since it is the same as that of touched point B, only the information associated with ID 2 is updated. Furthermore, since the position coordinates are different from those at the previous detection time, the motion flag is set. In step S202, since the number of touched points is 2, it is determined that two or more touched points are detected (YES in step S202).
[0169] Then, in step S601, since the touched point with ID 2 has been updated but the touched point with ID 1 has not, it is determined that the position information of all touched points has not been updated (NO in step S601).
[0170] Then, in step S602, it is determined whether 20 ms has elapsed since the previous update of the retained touched-point information was performed. Here, the elapsed time since the previous update (that is, from the time when touched point A was detected) is 25 ms, which exceeds the predetermined time (YES in step S602). Therefore, it can be determined that the position information of all touched points has been updated.
[0171] In step S204, based on the motion flag, it is determined that at least one touched point has been moved (YES in step S204). In step S206, the second acquisition unit 115 acquires the amount of change of the distance between the two touched points and its absolute value.
[0172] Here, based on the coordinates (100, 50) of touched point A and the coordinates (210, 40) of touched point D, the distance between the two points (distance D in Fig. 7A) is calculated as 110 points. The amount of change (absolute value) from the distance between the two points previously acquired by the second acquisition unit 115 and retained in the RAM 103 is calculated as 96 points (= 110 - 14).
[0173] In the present exemplary embodiment, since the threshold value "a" is 10 points, in step S207, the decision unit 116 decides that the amount of change of the distance between the two points is the threshold value "a" or larger (NO in step S207). Therefore, in step S210, it is determined whether a pinch-out or pinch-in operation has been started.
[0174] Here, the amount of change of the distance between the two points acquired by the second acquisition unit 115 is 96 points. Therefore, the number of times the distance between the two points has been enlarged is counted as one. However, since the counted number is less than 2, it is not determined that the pinch-out operation has started (NO in step S210), and processing ends.
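The counting rule applied in step S210 can be sketched as follows; the class name and the policy of resetting the count on a reduction are assumptions, since the description only states that two enlargements are required before a pinch-out is reported:

```python
class PinchControl:
    REQUIRED_COUNT = 2  # enlargements needed before reporting a pinch-out

    def __init__(self):
        self.enlarge_count = 0

    def on_distance_change(self, changed_amount):
        # changed_amount > 0 means the two touched points moved apart.
        if changed_amount > 0:
            self.enlarge_count += 1
        else:
            self.enlarge_count = 0  # assumed policy: reset on a reduction
        return self.enlarge_count >= self.REQUIRED_COUNT

pc = PinchControl()
print(pc.on_distance_change(96))   # False: counted once, not yet reported
print(pc.on_distance_change(100))  # True: second enlargement -> pinch-out starts
```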
[0175] Then, while the input area 401 is scanned every 20 ms, only the touch event of touched point F is notified by the touch panel. This is because, in the touch panel of modification example 1, a new touch event for the stationary touched point A is not notified. Therefore, in step S201, the first acquisition unit 112 of the information processing apparatus 100 acquires the information that the ID is 2, the position coordinates are (310, 40), and the detection time is 45 ms. Then, the information held in the holding unit 113 is updated as illustrated in Fig. 7E. At that time, the ID of touched point F is 2, that is, identical to that of touched point B, and therefore the information associated with ID 2 is updated. Furthermore, since the position coordinates are different from those at the previous detection time, the motion flag is set.
[0176] In step S202, the number of touched points is determined to be two, that is, two or more (YES in step S202). Then, in step S601, since the touched point with ID 2 has been updated but the touched point with ID 1 has not, it is determined that the position information of all touched points has not been updated (NO in step S601).
[0177] Then, in step S602, it is determined whether 20 ms has elapsed since the previous update of the retained touched-point information was performed. Here, the elapsed time since the previous update of touched point A (that is, from the time when touched point A was detected) is 45 ms, which exceeds the predetermined time (YES in step S602). Therefore, it can be determined that the position information of all touched points has been updated.
[0178] In step S204, based on the motion flag, it is determined that at least one touched point has been moved (YES in step S204). In step S206, the second acquisition unit 115 acquires the amount of change of the distance between the two touched points and its absolute value.
[0179] Here, based on the coordinates (100, 50) of touched point A and the coordinates (310, 40) of touched point F, the distance between the two points (distance F in Fig. 7A) is calculated as 210 points. Then, the amount of change (absolute value) from the distance between the two points acquired last time by the second acquisition unit 115 and retained in the RAM 103 is calculated as 100 points (= 210 - 110).
[0180] In the present exemplary embodiment, since the threshold value "a" is 10 points, in step S207, the decision unit 116 decides that the amount of change of the distance between the two points is the threshold value "a" or larger (NO in step S207). Therefore, in step S210, the pinch control unit 117 determines whether a pinch-out operation or a pinch-in operation has been started.
[0181] Here, the amount of change of the distance between the two points acquired by the second acquisition unit 115 is 100 points. Therefore, the number of times the distance between the two points has been enlarged is counted once more, making two in total. In step S210, since the number of times the distance between the two touched points has been enlarged is two or more, the pinch control unit 117 determines that the pinch-out operation has started (YES in step S210).
[0182] In step S211, the pinch control unit 117 notifies the display control unit 118 that a pinch-out operation has been input. Then, in step S212, the result of the pinch operation is output. The display control unit 118 performs the processing to display an enlarged display image based on the result that the distance between the two points has been enlarged from the distance of 14 points to the distance F of 210 points.
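One plausible way for the display control unit 118 to turn this result into an enlargement is a simple ratio of the two distances; this formula is an assumption for illustration, not taken from the description:

```python
def magnification_ratio(initial_distance, current_distance):
    # Scale factor applied to the displayed image (assumed linear mapping).
    return current_distance / initial_distance

print(magnification_ratio(14, 210))  # 15.0 -> the displayed image is enlarged
```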
[0183] As described above, according to modification example 1 of the first exemplary embodiment, when the user touches two points, the multi-touch operation input by the user to the information processing apparatus 100 is determined in response to the elapse of a predetermined time since the previous update of the position information.
[0184] Therefore, even when a touch panel that does not notify a new touch event for a stationary detected touched point is used, it is possible to determine that the position information of all detected touched points has been updated.
[0185] Therefore, even when a pinch operation is performed by moving only one of the two touched points while the user keeps the other point stationary, the size of the display image can be changed in accordance with the user's intention.
In a second exemplary embodiment, a multi-drag operation input by a user is described as an example of the multi-touch operation. In the first exemplary embodiment, whether a multi-drag operation or a pinch operation is input is determined based on whether the amount of change of the distance between the two points, changed by the movement of at least one touched point, is less than a predetermined threshold value, as a result of updating the position information of the two detected points.
[0186] On the other hand, in the second exemplary embodiment, whether a multi-drag operation or a pinch operation is input is determined based on whether the angle difference between the directions of movement, changed by the movement of at least one touched point, is less than a predetermined threshold value, as a result of updating the position information of the two detected points.
[0187] At that time, when the angle difference is small and the touched points of the two fingers are assumed to have moved in almost the same direction, it is determined that a multi-drag operation is input. On the other hand, when the angle difference is large, it is determined that a pinch operation is input.
[0188] Fig. 8 is a function block diagram illustrating a function configuration of the information processing apparatus 100 according to the second exemplary embodiment. Fig. 8 is different from Fig. 1B in that a generation unit 801 and a third acquisition unit 802 are added.
[0189] The generation unit 801 retains, among the information of the touched points acquired by the first acquisition unit 112, the information necessary to generate vectors. The generation unit 801 generates a motion vector indicating a movement direction and a movement amount for each touched point, based on the displacement of the position information of the touched point.
[0190] The third acquisition unit 802 compares the direction components of the motion vectors generated for each touched point by the generation unit 801 to acquire the angle difference. Then, the decision unit 116 according to the present exemplary embodiment decides that the operation performed by the user is a pinch operation (a first operation) if the angle difference of the motion vectors acquired by the third acquisition unit 802 is a predetermined threshold value or greater. On the other hand, if the acquired angle difference is less than the predetermined threshold value, it decides that the operation performed by the user is a multi-drag operation (a second operation).
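The decision rule can be sketched as below, using the 45-degree threshold that appears later in operation example 3; the function name is hypothetical:

```python
THRESHOLD_B_DEG = 45.0  # threshold value "b" used in operation example 3

def decide_operation(angle_difference_deg):
    # Below threshold "b": the fingers moved almost the same way -> multi-drag.
    if angle_difference_deg < THRESHOLD_B_DEG:
        return "multi-drag"  # second operation, handled in step S208
    return "pinch"           # first operation, handled in step S210

print(decide_operation(0.0))    # multi-drag
print(decide_operation(180.0))  # pinch (the fingers moved in opposite directions)
```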
[0191] Fig. 9 is a flowchart illustrating a processing flow performed by the information processing apparatus 100 of the second exemplary embodiment to recognize a multi-touch operation. In the steps with the same numerals as those in Fig. 2, the same processing is performed. Therefore, their details are not described, and only the parts different from the first exemplary embodiment will be described.
[0192] In the second exemplary embodiment, in step S204, if the decision unit 116 decides that at least one touched point has been moved (YES in step S204), processing proceeds to step S901. In step S901, the third acquisition unit 802 acquires the angle difference of the motion vectors. Details of the processing performed in step S901 will be described below.
[0193] Then, in step S902, the decision unit 116 decides whether the angle difference of the two motion vectors generated by the generation unit 801 is less than a threshold value "b". In the present exemplary embodiment, the threshold value "b" is an angle threshold, in degrees, registered in advance in the information processing apparatus 100. It is set as the minimum value required to decide that the two touched points are moved in different directions.
[0194] If the angle difference is the threshold value "b" or greater (NO in step S902), the decision unit 116 decides that the user input is a pinch operation, and processing proceeds to step S210.
[0195] If the angle difference is less than the threshold value "b" (YES in step S902), the decision unit 116 decides that the user input is a multi-drag operation, and processing proceeds to step S208.
[0196] In step S902 of the second exemplary embodiment, if it is determined that the user inputs a pinch operation, then in step S210, the second acquisition unit 115 acquires the distance between the two touched points and compares it with the retained distance. Then, based on whether the distance between the two points is enlarged or reduced, processing similar to that of the first exemplary embodiment is performed.
[0197] Fig. 10 is a flowchart illustrating an exemplary processing flow to acquire the angle difference of the motion vectors of the touched points, performed in step S901. First, in step S1001, the generation unit 801 selects the ID of a touched point to specify one touched point from the plurality of detected touched points.
[0198] In step S1002, the generation unit 801 determines the coordinates of the starting point of the motion vector. In the present exemplary embodiment, the generation unit 801 determines, based on the held position information of the touched point, the position of the touched point with the selected ID detected the first time, or the position detected the last time, as the coordinates of the starting point of the motion vector.
[0199] In step S1003, the generation unit 801 determines the coordinates of the end point of the motion vector. In the present exemplary embodiment, the generation unit 801 determines the last position information of the touched point with the selected ID as the coordinates of the end point of the motion vector, based on the information acquired by the first acquisition unit 112.
[0200] Then, in step S1004, the generation unit 801 acquires information indicating the direction of the motion vector going from the determined starting point to the determined end point. On this occasion, the generation unit 801 generates the motion vector connecting the determined starting point and end point. However, when the starting point and the end point have the same coordinates, the motion vector is defined as a zero vector.
[0201] The generation unit 801 then calculates the angle, in the clockwise direction, indicating the direction of the generated motion vector, using the Y-axis of the input area 401 as a reference. The angle calculation can be performed using, for example, a trigonometric function such as the tangent. The generation unit 801 retains the acquired angle information in association with the ID of the touched point.
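A minimal sketch of this angle calculation, assuming a coordinate system in which the reference Y-axis direction corresponds to the second argument of atan2 (the axis orientation of the input area 401 is an assumption here):

```python
import math

def clockwise_angle_from_y(dx, dy):
    # Direction of motion vector (dx, dy) in degrees, clockwise from the Y-axis.
    return math.degrees(math.atan2(dx, dy)) % 360.0

# A purely rightward movement points at 90 degrees, matching the directions
# of the motion vectors in operation example 3 described below.
print(clockwise_angle_from_y(10, 0))  # 90.0
```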
[0202] In step S1005, the third acquisition unit 802 determines whether the motion vectors of all touched points have been generated. The third acquisition unit 802 performs the determination based on the number of IDs retained in the generation unit 801.
[0203] If the third acquisition unit 802 determines that the vectors of all touched points have been generated (YES in step S1005), processing proceeds to step S1006. On the other hand, if the third acquisition unit 802 determines that the vectors of all touched points have not been generated (NO in step S1005), processing returns to step S1001. In step S1001, the ID of the touched point to be processed next is selected.
[0204] In step S1006, the third acquisition unit 802 acquires the angle difference of the motion vectors of the touched points. The third acquisition unit 802 calculates the difference of the vector angles retained in the generation unit 801, which indicate the directions of the vectors. The obtained angle difference is an absolute value. Then, the processing to calculate the angle difference of the motion vectors ends, and processing proceeds to step S902. The subsequent processing is performed similarly to that of the first exemplary embodiment.
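Steps S1001 to S1006 for the two-point case can be sketched as follows; the dict layout standing in for the held tables, and the handling of a zero vector, are assumptions:

```python
import math

def direction_deg(start, end):
    dx, dy = end[0] - start[0], end[1] - start[1]
    if dx == 0 and dy == 0:
        return None  # zero vector (step S1004): the point did not move
    return math.degrees(math.atan2(dx, dy)) % 360.0

def angle_difference(tracks):
    # tracks: ID -> (starting point, end point), one entry per touched point.
    a, b = [direction_deg(s, e) for s, e in tracks.values()]
    if a is None or b is None:
        return 0.0  # assumed handling when one touched point is stationary
    return abs(a - b)  # absolute value, as stated for step S1006

# Illustrative coordinates: both fingers move straight to the right.
tracks = {1: ((100, 50), (200, 50)), 2: ((110, 40), (210, 40))}
print(angle_difference(tracks))  # 0.0 -> later decided as a multi-drag
```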
[0205] Figs. 11A to 11F are diagrams illustrating exemplary methods of generating the motion vector of a touched point, performed in step S901. Referring to Fig. 11A, a specific example of the motion vectors generated in the present exemplary embodiment is described.
[0206] In Figs. 11A to 11F, p1 to p8 are the coordinates of the positions where the user's finger touches the input area 401, sampled at constant time intervals as the touched point moves. p1 corresponds to the first detected position.
[0207] In the present exemplary embodiment, the touch panel scans the input area 401 every 20 ms. Therefore, the touched point can be sampled at approximately 20 ms intervals. The generation unit 801 generates a motion vector V1a when p2 is detected. Similarly, a motion vector V2a is generated when p3 is detected, and a motion vector V3a is generated when p4 is detected.
[0208] The motion vector generation method is not limited to the method used in the present exemplary embodiment. Referring to Figs. 11B to 11D, other exemplary generation methods are described below.
[0209] Referring to Fig. 11B, a method is described in which, when the length of the generated motion vector exceeds a reference distance L, the detected position of the touched point that is the end point of the motion vector is set as the starting point of the motion vector to be generated in the next processing.
[0210] Here, p1 to p4 are touched points sampled at constant time intervals, and p1 corresponds to the position of the first detected touched point. The generation unit 801 determines the coordinates of p1, detected first, as the starting point of the motion vector. When p2 is detected, the generation unit 801 generates the motion vector going from p1 as the starting point to p2 as the end point, and determines whether the length of the generated motion vector (the movement amount from p1 to p2) exceeds the reference distance L.
[0211] When the length of the generated motion vector exceeds the reference distance L, the last detected position of the touched point is determined as the coordinates of the starting point for the next motion vector generation. Because the length of the generated motion vector (the movement amount from p1 to p2) does not exceed the reference distance L, when p3 is detected, the motion vector V1b going from p1 to p3 is generated. Then, when p3 is detected, the generation unit 801 determines whether the length of the motion vector V1b exceeds the reference distance L.
[0212] Because the length of the motion vector V1b exceeds the reference distance L, the generation unit 801 determines p3 as the starting point of the motion vector to be generated next.
[0213] Then, when p4 is detected, the generation unit 801 generates the motion vector V2b going from p3 as the starting point to p4 as the end point. In this way, the generation unit 801 generates a motion vector for each touched point, and the third acquisition unit 802 acquires the angle difference using the last motion vector generated for each touched point. The generation unit 801 can discard the touched-point information that is no longer required to generate the vectors, and the motion vector information other than the last.
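A sketch of this Fig. 11B scheme; the value of the reference distance L, the coordinates, and the class name are assumptions:

```python
import math

REFERENCE_L = 50.0  # assumed value of the reference distance L

class VectorBuilder:
    def __init__(self, first_point):
        self.start = first_point  # p1 is the first starting point

    def on_point(self, p):
        # Generate the vector from the current starting point to p, and move
        # the starting point forward once the vector grows longer than L.
        vector = (self.start, p)
        if math.dist(self.start, p) > REFERENCE_L:
            self.start = p  # the next vector will start here (as with p3)
        return vector

vb = VectorBuilder((0, 0))   # p1
vb.on_point((30, 0))         # p2: length 30 <= L, the start stays at p1
vb.on_point((60, 0))         # p3: length 60 >  L, p3 becomes the next start
print(vb.on_point((90, 0)))  # ((60, 0), (90, 0)), i.e. V2b from p3 to p4
```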
[0214] Fig. 11C illustrates an example in which the generation unit 801 determines the point at which the movement speed of the touched point becomes a local minimum as the coordinates of the starting point for generating the motion vector.
[0215] Here, p1 to p8 are touched points sampled at constant time intervals. Since the sampling time intervals are constant, the distance between adjacent touched points in Fig. 11C is proportional to the movement speed of the touched point. Therefore, from the displacements of the touched points p1 to p8, it can be seen that the movement speed becomes a local minimum at p6.
[0216] Therefore, the generation unit 801 determines p1 as the starting point of the motion vector in the processing performed while p1 to p6 are detected and, for example, when p6 is detected, the generation unit 801 generates the motion vector V1c. Then, in the processing performed when p7 and subsequent points are detected, the generation unit 801 determines p6 as the starting point and, for example, when p8 is detected, generates the motion vector V2c.
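Because the samples are equally spaced in time, the step length between consecutive samples is proportional to the speed, so a local minimum of the step length marks the new starting point. A minimal detector under that assumption (the coordinates are invented for illustration):

```python
import math

def speed_local_min_indices(points):
    # steps[j] is the distance travelled from points[j] to points[j + 1].
    steps = [math.dist(a, b) for a, b in zip(points, points[1:])]
    return [j + 1 for j in range(1, len(steps) - 1)
            if steps[j - 1] > steps[j] < steps[j + 1]]

# The finger decelerates into the sixth sample (index 5), then speeds up again.
pts = [(0, 0), (30, 0), (55, 0), (75, 0), (90, 0), (95, 0), (115, 0), (145, 0)]
print(speed_local_min_indices(pts))  # [5] -> p6 becomes the next starting point
```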
[0217] However, the position coordinates of the touched points acquired from the touch panel sensor may include variations. For example, the contact area between the user's finger and the touch panel may change during the operation, and the relative position of the coordinates of the specified point within the touched area may change. When the motion vector is generated without taking such variations into account, the direction of the motion vector may depend on the detection variation, and the accuracy of the processing to compare the angle difference with the threshold value "b" may deteriorate.
[0218] To address this problem, referring to Fig. 11D, an example is described in which the motion vector is generated by considering the coordinates of all displacements of the touched point, with certain coordinates as the starting point and the last detected position as the end point.
[0219] In the present exemplary embodiment, the coordinates of p1, which is detected first, are determined as the starting point. As for the coordinates of the end point, the X-coordinate is determined as the X-coordinate of the last detected position of the touched point, and the Y-coordinate is determined as the average value of the Y-coordinates of the positions of the touched points with the same ID detected so far.
[0220] In this way, by generating the motion vectors using the average value of the coordinates of the detected touched point, the accuracy of the processing to compare the angle difference of the motion vectors with the threshold value "b" can be increased. Thus, the multi-touch operation can be recognized correctly.
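A sketch of this smoothing, assuming a mostly horizontal drag so that only the Y-coordinate is averaged, as in the description; the sample values are invented:

```python
def smoothed_vector(history):
    # history: chronological (x, y) positions held for one touched-point ID.
    start = history[0]                                 # first detected position
    end_x = history[-1][0]                             # latest X, taken as-is
    end_y = sum(y for _, y in history) / len(history)  # mean of all Y values
    return start, (end_x, end_y)

history = [(100, 50), (150, 53), (200, 48), (250, 51)]  # jittery Y samples
print(smoothed_vector(history))  # ((100, 50), (250, 50.5))
```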
[0221] The above-described methods of generating motion vectors are just examples. For example, it is also possible to determine the coordinates of the starting point of the motion vector to be generated next based on the angle difference of the directions of the generated motion vectors having become a predetermined threshold value or greater, or on the touched point not having moved for a predetermined period of time or longer. Thus, among the various methods, an appropriate method can be selected in accordance with the calculation capability of the information processing apparatus 100 and the like, and used in the present exemplary embodiment.
[0222] An operation example 3 is described, in which a user operates the information processing apparatus 100 according to the second exemplary embodiment.
[0223] In operation example 3, similarly to operation example 1, a user touches the input area 401 with two fingers 402 and 403 and moves them together in the right direction (a multi-drag operation), as illustrated in Fig. 4A. Then, the touched point of each finger moves as illustrated in Fig. 4B.
[0224] At that time, the information of each touched point acquired by the first acquisition unit 112 is held in the holding unit 113 as the tables illustrated in Figs. 5A to 5F. In operation example 3, the threshold value "b" of the angle is set to 45 degrees.
[0225] The difference of operation example 3 from operation example 1 lies in the processing performed after it is determined in step S204 that at least one touched point has been moved (YES in step S204), in response to the notification of the touch event of touched point D received from the touch panel. In step S901, the generation unit 801 generates, for the touched point whose ID is 1, the motion vector A, with the position of touched point A as the coordinates of the starting point and the position of touched point C as the coordinates of the end point.
[0226] The direction of the motion vector A is 90 degrees when the angle is calculated in the clockwise direction with the Y-axis as a reference. Similarly, the generation unit 801 generates, for the touched point whose ID is 2, the motion vector B, with the position of touched point B as the coordinates of the starting point and the position of touched point D as the coordinates of the end point.
[0227] The direction of the motion vector B is also 90 degrees clockwise with the Y-axis as a reference. Therefore, the third acquisition unit 802 acquires information indicating 0 degrees as the angle difference between the motion vector A and the motion vector B.
[0228] In the present exemplary embodiment, the threshold value "b" is 45 degrees. Therefore, in step S902, the decision unit 116 decides that the 0-degree angle difference acquired by the third acquisition unit 802 is less than the threshold value "b" (YES in step S902). In other words, it is determined that the user inputs a multi-drag operation because the two touched points are moved in the same direction. Therefore, in steps S208 and S209, processing corresponding to the multi-drag operation is performed.
[0229] Similarly, in the processing performed when the touch event of touched point F is notified by the touch panel, the generation unit 801 generates the motion vector C, illustrated in Fig. 4B, for the touched point whose ID is 1, and the motion vector D, illustrated in Fig. 4B, for the touched point whose ID is 2. The direction of each vector is 90 degrees and the angle difference is 0 degrees. Therefore, the decision unit 116 decides that the multi-drag operation is input.
[0230] In the processing to acquire the angle difference performed in step S1006 of the present exemplary embodiment, the calculation result may desirably be an absolute value. By using the absolute value, only the magnitude of the angle difference is taken into account, and the direction of the angle difference is not.
[0231] Also, in order to express the angle difference as an acute angle, if the calculation result is greater than 180 degrees, it is desirable to subtract the calculation result from 360 degrees to obtain the angle difference. By unifying the angle differences into acute angles, the comparison processing with the threshold value remains uniform and the operation remains visually intuitive, so that the user's understanding can be obtained.
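A sketch of this unification; folding anything beyond 180 degrees back keeps the comparison against threshold "b" on the smaller of the two possible angles:

```python
def folded_angle_difference(a_deg, b_deg):
    diff = abs(a_deg - b_deg) % 360.0
    if diff > 180.0:
        diff = 360.0 - diff  # subtract from 360 degrees, as described above
    return diff

print(folded_angle_difference(10.0, 350.0))  # 20.0 rather than 340.0
```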
[0232] Furthermore, in comparing the angle difference of the motion vectors with the angle threshold value, a plurality of threshold values may be set. For example, according to the distance between the starting point and the end point, if the distance is short, a wider angle threshold value is used. In this way, even if the variation of the detected position of the touched point described above occurs, it is possible to prevent the deterioration of the determination accuracy caused by a large error in the acquired angle, that is, in the direction of the motion vector.
[0233] Fig. 11E illustrates the difference in direction between the motion vectors V1e and V1e'. The motion vector V1e is generated when p2 is detected after p1 is detected. The vector V1e' is generated when p2' is detected instead, owing to the variation of the detected position. On the other hand, Fig. 11F illustrates the difference in direction between the motion vectors V2e and V2e'. The motion vector V2e is generated when p3, which is farther away than p2, is detected after p1 is detected. The vector V2e' is generated when p3' is detected instead.
[0234] The variation of the detected position between p3 and p3' is the same as that between p2 and p2'. The difference in directions between V1e and V1e' is greater than that between V2e and V2e'.
[0235] Thus, when the distance from the starting point to the end point is shorter, the variation of the detected position affects the direction of the motion vector more greatly. Therefore, by setting a plurality of threshold values according to the distance from the starting point to the end point, it is possible to determine the input multi-touch operation correctly.
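A sketch of such a plurality of threshold values; the breakpoints and angles below are invented for illustration only:

```python
import math

# (maximum start-to-end distance, angle threshold in degrees): shorter
# vectors are noisier, so they get a wider (more lenient) threshold.
THRESHOLDS = [(20.0, 60.0), (50.0, 45.0), (float("inf"), 30.0)]

def threshold_for(start, end):
    d = math.dist(start, end)
    for max_distance, angle_deg in THRESHOLDS:
        if d <= max_distance:
            return angle_deg

print(threshold_for((0, 0), (10, 0)))   # 60.0: short vector, lenient threshold
print(threshold_for((0, 0), (100, 0)))  # 30.0: long vector, strict threshold
```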
[0236] As described above, in the present exemplary embodiment, when the user touches two points, the multi-touch operation input to the information processing apparatus 100 by the user is determined in response to the completion of the update of the position information of all touched points. At that time, whether a multi-drag operation or a pinch operation is input is determined based on whether the angle difference between the directions of movement, changed by the movement of at least one touched point, is less than a predetermined threshold value, as a result of updating the position information of the two detected points.
[0237] In this way, when the user moves the two touched points in almost the same direction, a pinch operation is not erroneously notified. As a result, the false operation, in which the size of the display image is changed against the user's intention, is avoided. Furthermore, the first exemplary embodiment with its modification examples and the second exemplary embodiment can be implemented in combination.
[0238] Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiments, and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiments. To this end, the program is provided to the computer, for example, via a network or from a recording medium of various types serving as the memory device (e.g., a computer-readable medium). In such a case, the system or apparatus, and the recording medium where the program is stored, are included within the scope of the present invention.
[0239] Although the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications and equivalent structures and functions.
Claims (3)
[0001]
1. Information processing apparatus comprising: a display control means (118) configured to cause a display means (108) to display one of a plurality of images stored in a predetermined order; a detection means (111) configured to sequentially detect, one by one, touch information indicating each of two touched points touched on a touch panel, wherein each piece of the touch information is a signal indicating a detection of a point touched by a user; a first acquisition means (112) configured to acquire, sequentially one by one, information of each of the two touched points indicated by the touch information detected by the detection means, the information including position information and a detection time of each of the two touched points and being associated with an ID of each of the two touched points; a holding unit (113) configured to hold, in a memory, the information of each of the two touched points, wherein, each time the first acquisition means acquires the information of one of the two touched points, the first acquisition means updates the touch information held in the memory (103, 113) with the last detected position of each of the two touched points; the apparatus being characterized by: a determination unit (114) configured to determine, each time the information of one of the two touched points is acquired by the first acquisition means (112), whether two or more touched points are detected by the touch panel and whether the position information of both detected touched points held in the memory (103, 113) has been updated with the last acquired position information; a second acquisition means (115) configured to calculate, in a case where the determination unit (114) determines that two or more touched points are detected by the touch panel and the position information of both touched points held in the memory (113) has been updated with the most recently acquired position information, a distance between the positions of the two touched points based on the position information held in the memory, and to hold the calculated distance in the memory; wherein the second acquisition means (115) is configured to calculate, in a case where it is determined, based on a difference of the position information of each of the two touched points before and after the update by the first acquisition means, that at least one of the two touched points has moved, an amount of change of the distance, based on a comparison between the distance calculated before the update and the distance newly calculated after the update, and an absolute value of the calculated amount of change; and a decision unit (116) configured to decide a type of input to the information processing apparatus (100) using the absolute value of the calculated amount of change of the distance calculated by the second acquisition means (115), the type of input being one of: a first operation, in which the absolute value is equal to or greater than a threshold value and at least one of the two touched points has been moved; and a second operation, in which the absolute value is less than the threshold value, the two touched points are moved in the same direction, and the absolute value is maintained; wherein: in response to the type of input being the first operation, the display control means (118) causes the display means (108) to enlarge or reduce the displayed image; and in response to the type of input being the second operation, the display control means (118) selects, from the plurality of images stored in the predetermined order in a storage unit (119), an image at a position in the predetermined order that is not adjacent to the displayed image, and causes the display means (108) to display the selected image.
[0002]
2. Control method for an information processing apparatus configured to acquire, one by one, position information of one or more touched points detected in accordance with a predetermined detection period on a touch panel, the method comprising: displaying, by a display control means (118), one of a plurality of images stored in a predetermined order; sequentially detecting, by a detection means (111), one by one, touch information indicating each of two touched points touched on the touch panel, wherein each piece of the touch information is a signal indicating a detection of a point touched by a user; acquiring, by a first acquisition means (112), sequentially one by one, information of each of the two touched points indicated by the touch information detected by the detection means, the information including position information and a detection time of each of the two touched points and being associated with an ID of each of the two touched points; holding, in a memory, by a holding unit (113), the information of each of the two touched points, wherein, each time the first acquisition means acquires the information of one of the two touched points, the first acquisition means updates the touch information held in the memory (103, 113) with the last detected position of each of the two touched points; the method being characterized by: determining, by a determination unit (114), each time the information of one of the two touched points is acquired by the first acquisition means (112), whether two or more touched points are detected by the touch panel and whether the position information of both detected touched points held in the memory (103, 113) has been updated with the last acquired position information; calculating, by a second acquisition means (115), in a case where the determination unit (114) determines that two or more touched points are detected by the touch panel and the position information of both touched points held in the memory (113) has been updated with the most recently acquired position information, a distance between the positions of the two touched points based on the position information held in the memory, and holding the calculated distance in the memory; wherein the second acquisition means (115) calculates, in a case where it is determined, based on a difference of the position information of each of the two touched points before and after the update by the first acquisition means, that at least one of the two touched points has moved, an amount of change of the distance, based on a comparison between the distance calculated before the update and the distance newly calculated after the update, and an absolute value of the calculated amount of change; and deciding, by a decision unit (116), a type of input to the information processing apparatus (100) using the absolute value of the calculated amount of change of the distance calculated by the second acquisition means (115), the type of input being one of: a first operation, in which the absolute value is equal to or greater than a threshold value and at least one of the two touched points has been moved; and a second operation, in which the absolute value is less than the threshold value, the two touched points are moved in the same direction, and the absolute value is maintained; wherein: in response to the type of input being the first operation, the display control means (118) causes the display means (108) to enlarge or reduce the displayed image; and in response to the type of input being the second operation, the display control means (118) selects, from the plurality of images stored in the predetermined order in a storage unit (119), an image at a position in the predetermined order that is not adjacent to the displayed image, and causes the display means to display the selected image.
[0003]
3. Computer-readable storage medium, characterized in that it stores instructions which, when read and executed by a computer, cause the computer to perform the method according to claim 2.
Similar technologies:
Publication number | Publication date | Patent title
BR102013004710B1|2021-06-29|INFORMATION PROCESSING APPARATUS, ITS CONTROL METHOD, AND STORAGE MEANS
BRPI0905206B1|2020-02-11|APPARATUS AND METHOD OF PROCESSING INFORMATION
BR102013016792B1|2021-02-23|CONTROL METHOD BASED ON TOUCH AND GESTURE INPUT AND TERMINAL FOR THAT
US10042547B2|2018-08-07|User interface control based on pinch gestures
KR20130127146A|2013-11-22|Method for processing function correspond to multi touch and an electronic device thereof
JP2012247936A|2012-12-13|Information processor, display control method and program
BR112013030414A2|2020-08-11|computer implemented method.
US10241570B2|2019-03-26|Pointing support device, pointing support method, and non-transitory computer-readable recording medium
JP6371475B2|2018-08-08|Eye-gaze input device, eye-gaze input method, and eye-gaze input program
JP2015102943A|2015-06-04|Portable device, screen display program, and screen display method
JP6253284B2|2017-12-27|Information processing apparatus, control method therefor, program, and recording medium
JP2014130450A|2014-07-10|Information processor and control method therefor
JP2015018432A|2015-01-29|Gesture input device
US10514802B2|2019-12-24|Method for controlling display of touchscreen, and mobile device
WO2019019372A1|2019-01-31|Picture operation and control method and device for mobile terminal, mobile terminal, and medium
JP2019096182A|2019-06-20|Electronic device, display method, and program
US8384692B2|2013-02-26|Menu selection method and apparatus using pointing device
US10126856B2|2018-11-13|Information processing apparatus, control method for information processing apparatus, and storage medium
US10599326B2|2020-03-24|Eye motion and touchscreen gestures
KR101426642B1|2014-08-06|Device and method for controlling rotation of three-dimensional object image
JP2015060455A|2015-03-30|Electronic device, control method, and program
CN106406578B|2020-01-21|Information processing apparatus, input control method, and control method for information processing apparatus
WO2016206438A1|2016-12-29|Touch screen control method and device and mobile terminal
JPWO2017183194A1|2018-09-06|Display control device
WO2017074607A1|2017-05-04|Terminating computing applications using a gesture
Patent family:
Publication number | Publication date
CN103294227A|2013-09-11|
US9880673B2|2018-01-30|
US20130222340A1|2013-08-29|
RU2013108886A|2014-09-10|
BR102013004710A2|2015-10-06|
IN2013CH00808A|2015-08-14|
EP2634681A3|2017-04-05|
JP2013178636A|2013-09-09|
EP2634681A2|2013-09-04|
JP5990011B2|2016-09-07|
CN103294227B|2016-11-30|
KR20130098907A|2013-09-05|
RU2568775C2|2015-11-20|
KR101572307B1|2015-11-26|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title

US5952998A|1997-01-15|1999-09-14|Compaq Computer Corporation|Transparent touchpad with flat panel display for personal computers|
JP4803883B2|2000-01-31|2011-10-26|キヤノン株式会社|Position information processing apparatus and method and program thereof.|
US7138983B2|2000-01-31|2006-11-21|Canon Kabushiki Kaisha|Method and apparatus for detecting and interpreting path of designated position|
US8643605B2|2005-11-21|2014-02-04|Core Wireless Licensing S.A.R.L|Gesture based document editor|
US7956847B2|2007-01-05|2011-06-07|Apple Inc.|Gestures for controlling, manipulating, and editing of media files using touch sensitive devices|
US7884805B2|2007-04-17|2011-02-08|Sony Ericsson Mobile Communications Ab|Using touches to transfer information between devices|
KR20090038540A|2007-10-16|2009-04-21|주식회사 현대오토넷|Apparatus and method for changing image position on the screen, and nevigation system using the same|
WO2009133559A2|2008-04-30|2009-11-05|N-Trig Ltd.|Multi-touch detection|
CN101667089B|2008-09-04|2011-08-17|比亚迪股份有限公司|Method and device for identifying touch gestures|
TW201011618A|2008-09-05|2010-03-16|Kye Systems Corp|Optical multi-point touch-to-control method of windows-based interface|
JP2010086230A|2008-09-30|2010-04-15|Sony Corp|Information processing apparatus, information processing method and program|
US8217909B2|2008-12-19|2012-07-10|Cypress Semiconductor Corporation|Multi-finger sub-gesture reporting for a user interface device|
KR101844366B1|2009-03-27|2018-04-02|삼성전자 주식회사|Apparatus and method for recognizing touch gesture|
US20110032192A1|2009-08-04|2011-02-10|General Electric Company|Touch screen control system and method|
JP2011059952A|2009-09-09|2011-03-24|Aisin Aw Co Ltd|Input/output display device|
JP5340868B2|2009-09-24|2013-11-13|パイオニア株式会社|Contact operating device|
US9417787B2|2010-02-12|2016-08-16|Microsoft Technology Licensing, Llc|Distortion effects to indicate location in a movable data collection|
US20110205169A1|2010-02-24|2011-08-25|Primax Electronics Ltd.|Multi-touch input apparatus and its interface method using hybrid resolution based touch data|
US20110225524A1|2010-03-10|2011-09-15|Cifra Christopher G|Multi-Touch Editing in a Graphical Programming Language|
JP5473708B2|2010-03-26|2014-04-16|京セラ株式会社|Portable terminal and display control program|
EP2378403A1|2010-04-19|2011-10-19|Tyco Electronics Services GmbH|Method and device for determining a user's touch gesture|
US8749499B2|2010-06-08|2014-06-10|Sap Ag|Touch screen for bridging multi and/or single touch points to applications|
US8743064B2|2010-09-17|2014-06-03|A.I. Solutions, Inc.|Gesture orbit design|
US9542092B2|2011-02-12|2017-01-10|Microsoft Technology Licensing, Llc|Prediction-based touch contact tracking|
JP2012185647A|2011-03-04|2012-09-27|Sony Corp|Display controller, display control method and program|
CN202142028U|2011-03-31|2012-02-08|比亚迪股份有限公司|Multipoint recognition device of reducing-enlarging motion|
US8368723B1|2011-07-14|2013-02-05|Google Inc.|User input combination of touch and user position|
US20130021261A1|2011-07-18|2013-01-24|Research In Motion Limited|Electronic device and method of controlling same|
US10296205B2|2011-12-12|2019-05-21|Sony Corporation|User interface for controlling a display scale of an image|
US9436300B2|2012-07-10|2016-09-06|Nokia Technologies Oy|Method and apparatus for providing a multimodal user interface track|
US9792017B1|2011-07-12|2017-10-17|Domo, Inc.|Automatic creation of drill paths|
US10001898B1|2011-07-12|2018-06-19|Domo, Inc.|Automated provisioning of relational information for a summary data visualization|
US9202297B1|2011-07-12|2015-12-01|Domo, Inc.|Dynamic expansion of data visualizations|
JP5852514B2|2012-06-13|2016-02-03|株式会社東海理化電機製作所|Touch sensor|
KR102157332B1|2013-03-15|2020-09-17|삼성전자주식회사|Method and apparatus for controlling zoom function in an electronic device|
KR102143760B1|2013-12-02|2020-08-13|엘지디스플레이 주식회사|Touch tracking method|
JP6071866B2|2013-12-18|2017-02-01|キヤノン株式会社|Display control device, display device, imaging system, display control method, and program|
KR102242891B1|2013-12-30|2021-04-21|엘지디스플레이 주식회사|Method for calcurating touch coordinates using gaussian mixture model|
TWI610211B|2014-02-07|2018-01-01|財團法人工業技術研究院|Touching device, processor and touching signal accessing method thereof|
CN103914224B|2014-03-24|2018-03-23|联想有限公司|Information processing method and device|
US9645731B2|2014-03-31|2017-05-09|Xiaomi Inc.|Unlocking method and device using same|
CN103971046B|2014-03-31|2018-10-19|小米科技有限责任公司|A kind of method and apparatus of decryption|
CN104156167B|2014-07-29|2018-01-02|Tcl通讯(宁波)有限公司|A kind of entry and exit method of mobile terminal and its child mode|
CN104679429A|2015-02-12|2015-06-03|深圳市中兴移动通信有限公司|Accidental-touch-prevention method and device|
JP2016206749A|2015-04-16|2016-12-08|シャープ株式会社|Touch input device, image forming apparatus including the same, and touch input method|
JP2016212738A|2015-05-12|2016-12-15|コニカミノルタ株式会社|Information processing apparatus, information processing program, and information processing method|
CN105426042A|2015-11-05|2016-03-23|小米科技有限责任公司|Icon position exchange method and apparatus|
US10761709B2|2018-11-29|2020-09-01|General Electric Company|Computer system and method for changing display of components shown on a display device|
CN112181265A|2019-07-04|2021-01-05|北京小米移动软件有限公司|Touch signal processing method, device and medium|
Legal status:
2015-10-06| B03A| Publication of a patent application or of a certificate of addition of invention [chapter 3.1 patent gazette]|
2018-12-04| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-11-12| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-08-25| B07A| Technical examination (opinion): publication of technical examination (opinion) [chapter 7.1 patent gazette]|
2020-12-22| B06A| Notification to applicant to reply to the report for non-patentability or inadequacy of the application [chapter 6.1 patent gazette]|
2021-04-20| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-06-29| B16A| Patent or certificate of addition of invention granted|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 27/02/2013, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
JP2012041603A|JP5990011B2|2012-02-28|2012-02-28|Information processing apparatus and control method thereof|
JP2012-041603|2012-02-28|