Patent abstract:
METHOD AND DEVICE FOR APPLYING THE INERTIAL MOVEMENT OF A WINDOW OBJECT The present invention relates to a method and device for applying the inertial movement of a window object, to obtain the effect of the inertial movement of a window object based on linear velocity and angular velocity. When an input device releases a window object being dragged, the initial linear velocity and initial angular velocity are obtained, and then the linear velocity deceleration and the angular velocity deceleration are used to obtain the amount of displacement and the amount of rotation corresponding to each processing period; the effect of the inertial movement of the window object is thus simulated after the release of a mouse button or the removal of a finger from a touchscreen. Through the method and the device, the effect of the inertial movement of the window object is realized, thus facilitating touch operation, increasing the ease of use of the terminal software, and improving the user experience.
Publication number: BR112012019980B1
Application number: R112012019980-3
Filing date: 2011-07-06
Publication date: 2021-01-26
Inventors: Min Ren; Qiang Zhang; Shundong Gao
Applicant: Tencent Technology (Shenzhen) Company Limited
IPC main class:
Patent description:

[0001] The present disclosure relates to operating-interface design technology for touchscreen device software, and in particular to a method and device for applying the inertial motion of a window object.
BACKGROUND OF THE INVENTION
[0002] Nowadays, the internet instant messenger has been accepted by most internet users and has become an essential software tool. The internet instant messenger is widely used in users' work, as well as in leisure and everyday entertainment. Thus, there is a high demand for ease of use, stability, security and other aspects of Instant Messaging (IM) software.
[0003] With the development of technology, touch-sensitive products like the iPhone have gradually become popular. Interface designs that use inertia to give operations a realistic, physical feel are gradually becoming more popular as well.
[0004] In existing designs that use inertia to achieve this realistic feel, the user can, through a linear touch operation, scroll the content of a list or a web page with inertia at a linear velocity; however, the inertia effect based on angular velocity, under the condition of rotation, cannot be processed.
SUMMARY OF THE INVENTION
[0005] For this reason, the present disclosure mainly aims to provide a method and a device for applying the inertial movement of a window object, to realize the effect of the inertial movement of a window object based on linear velocity and angular velocity. To achieve this goal, the technical solution of this disclosure is implemented as follows.
[0006] A method for applying the inertial movement of a window object includes: A: monitor the initial touch operation of an input device, record the initial touch coordinate and the touch time, and start a first timer; B: in the process of dragging the window object, periodically trigger, by the first timer, an event to obtain the amount of displacement and the amount of rotation that are generated by a user on a screen through the input device in relation to a previous time point, and send the obtained amount of displacement and amount of rotation to the window object; C: when the user releases the window object, obtain the initial linear velocity and the initial angular velocity of the window object, and start a second timer; D: based on the initial linear velocity and the initial angular velocity, periodically trigger, by the second timer, an event to obtain the current amount of displacement and amount of rotation of the window object according to the linear velocity deceleration and the angular velocity deceleration, and send the obtained current amount of displacement and amount of rotation to the window object; E: when the initial linear velocity and the initial angular velocity corresponding to the current period of the second timer are less than or equal to 0, end the inertial movement process of the window object.
[0007] In addition, in the aforementioned method, the user can drag the window object with one or several touch points through the input device; in the case of multiple touch points, the amount of displacement and the amount of rotation can be obtained based on the two touch points with the smallest device numbers.
[0008] Therefore, the present disclosure further provides a device for applying the inertial movement of a window object, including: a touch point capture unit, configured to capture the operation in which an input device drags the window object on a screen, and to generate the corresponding control information; a drag processing unit, configured to carry out the drag effect of the window object according to the control information sent from the touch point capture unit, to obtain the initial linear velocity and the initial angular velocity when the input device releases the window object, and to send the obtained initial linear velocity and initial angular velocity to an inertia processing unit; and the inertia processing unit, configured to perform the effect of the inertial movement according to the linear velocity deceleration and the angular velocity deceleration, based on the initial linear velocity and initial angular velocity sent from the drag processing unit.
[0009] In addition, the control information generated by the touch point capture unit may include: initial control information, generated when the input device initially touches the window object, which comprises the position, time, device number, and an identity indicating the initial touch of the window object; current control information, generated when the input device moves the window object, which comprises the position, time, device number, and an identity indicating the drag of the window object; and release control information, generated when the input device releases the window object, which comprises the position, time, device number, and an identity indicating the release of the window object.
[0010] In addition, the drag processing unit may include: a first motion processing unit, configured to start a first timer when it receives the initial control information sent from the touch point capture unit, to obtain, according to the current control information sent from the touch point capture unit, the amount of displacement and the amount of rotation of the window object corresponding to the current period of the first timer under activation of the first timer, and to send the obtained amount of displacement and amount of rotation to the window object; and an initial speed processing unit, configured to obtain the initial linear velocity and initial angular velocity after receiving the release control information sent from the touch point capture unit, and to send the obtained initial linear velocity and initial angular velocity to the inertia processing unit.
[0011] With the application of the technical solution provided by the present disclosure, the effect of the inertial movement of the window object based on linear velocity and angular velocity is realized; by giving the interface operation a realistic, physical feel, the operation is facilitated, the ease of use of the terminal software is enhanced, and the user experience is improved.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] Figure 1 shows a structural diagram of a device for realizing the effect of the inertial movement of a window object in accordance with the present disclosure; Figure 2 shows a vector diagram for obtaining the amount of displacement and the amount of rotation under the condition of a single touch point; Figure 3 shows a vector diagram for obtaining the amount of displacement and the amount of rotation under the condition of several touch points; and Figure 4 shows a flow chart of a method for moving a window object according to the present disclosure.
DETAILED DESCRIPTION
[0013] The basic idea of the present disclosure is: to carry out realistic-feel processing while the user moves the window object by means of a mouse or a finger and, after the user finishes the dragging operation, to apply the inertial movement to the moving object through the inertia processing unit, according to the linear velocity and angular velocity at the moment the mouse drag button is released or the finger is lifted.
[0014] The moving object in the present disclosure means an instance of the window object capable of responding to the dragging operation via the mouse or via a finger on the touchscreen. The window object has its own interface attributes, method attributes and data attributes, and can respond to window interface trigger events. Taking the Windows operating system as an example, the window object can be a window for displaying contact information; it has the function of responding to the user's click and move events, and has interface attributes such as width, height, angle of rotation about the center of the window (taken as the center of a circle), horizontal coordinate (CenterX) of the center of the window, and vertical coordinate (CenterY) of the center of the window.
[0015] For convenience, in the following description the operating scenario is that the user moves, with a finger or a mouse, a window object instance (called below just a window object for short) that represents an IM contact; when the user, while moving the window object at a certain speed, quickly raises the finger or releases the mouse button, the window object keeps moving and rotating along the original motion track by the force of inertia, and slowly returns to the static state under the frictional force (deceleration).
[0016] In order to make the objective, technical solution and advantages of this disclosure clearer, the technical solution is described below in detail with reference to the accompanying drawings and embodiments.
[0017] Figure 1 shows a structural diagram of a device for effecting the inertial motion of a window object according to the present disclosure; the device includes a touch point capture unit 101, a drag processing unit 102 and an inertia processing unit 103.
[0018] The touch point capture unit 101 is configured to capture the operation in which an input device drags a window object on a screen, and to generate the corresponding control information.
[0019] The input device includes a touch screen, a mouse, a writing pad and so on, through which the window object can be moved.
[0020] The control information includes: (1) initial control information, generated when the input device initially touches the window object, including the position, time, device number, an identity indicating the initial touch of the window object, and so on; (2) current control information, generated when the input device moves the window object, including the position, time, device number, an identity indicating the drag of the window object, and so on; (3) release control information, generated when the input device releases the window object and the drag of the window object ends, including the position, time, device number, an identity indicating the release of the window object, and so on.
[0021] The touch point capture unit 101 is an interface unit between the device of the present disclosure and the operating system; it monitors the operation of the operating system on the window object in real time, obtains the control information generated by the input device, and sends the control information to the drag processing unit, which performs the effect of dragging the window object.
[0022] For example, when the mouse is pressed and moved, or a finger presses and moves, the touch point capture unit 101 captures the mouse-press or finger-touch event and sends the initial control information to the drag processing unit 102, where the initial control information includes: the touch point position coordinate, the current time, the device number, an identity indicating that the left mouse button is pressed or that a finger touches the screen to prepare the dragging of the window object, and so on. The device number is used to distinguish between different input devices; under the condition of several touch points, for example when the user uses two fingers to touch the window object, the device numbers corresponding to fingers that come into contact with the touch screen at different times are different.
[0023] The drag processing unit 102 is configured to carry out the drag effect of the window object according to the control information sent from the touch point capture unit 101.
[0024] The drag processing unit 102 includes: a first motion processing unit and an initial speed processing unit.
[0025] The first motion processing unit is configured to start a first timer, Timer1, when it receives the initial control information sent from the touch point capture unit 101; to obtain, according to the current control information sent from the touch point capture unit 101, the amount of displacement and the amount of rotation of the window object corresponding to the current period of Timer1, under the activation of Timer1; and to send the obtained amount of displacement and amount of rotation to the window object.
[0026] The initial speed processing unit is configured to obtain the initial linear velocity and initial angular velocity after receiving the release control information sent from the touch point capture unit 101, and to send the obtained initial linear velocity and initial angular velocity to the inertia processing unit.
[0027] The method for carrying out the dragging effect of the window object is specifically: (1) after receiving the initial control information sent from the touch point capture unit 101, the drag processing unit 102 records the position coordinate (X0, Y0) of the initial touch point, the initial touch time T0 and the device number; (2) when the mouse moves after being pressed, or the finger moves after pressing, the drag processing unit 102 records the position (Xi, Yi) of the mouse or finger, the current time Ti and the device number at each interval Δt, obtains the amount of displacement and the amount of rotation relative to the previously recorded time point, and sends them to the window object; the window object then sets its own attribute values to carry out the drag effect; for example, the attributes of the central coordinates of the window object are (CenterX, CenterY), and the rotation angle attribute is Angle; when the amount of displacement and the amount of rotation obtained by the window object during the period Ti are (ΔX, ΔY) and α, respectively, the window object sets its own coordinate center attribute values and rotation angle attribute value as: CenterX = CenterX + ΔX, CenterY = CenterY + ΔY, Angle = Angle + α; by setting the coordinate center attribute values and the rotation angle attribute value, the window object triggers its interface-redrawing function, in order to display the window object at the new coordinate position and new rotation angle; (3) when the input device releases the window object, that is, when the mouse button is released or the fingers are lifted, the touch point capture unit 101 sends the release control information to the drag processing unit 102, and the drag processing unit 102 obtains the initial linear velocity and the initial angular velocity according to the amount of displacement and the amount of rotation obtained during the last period of Timer1 before the release.
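The per-period attribute update described above (CenterX = CenterX + ΔX, CenterY = CenterY + ΔY, Angle = Angle + α) can be sketched as follows. This is a minimal illustration, not the patented implementation; the class and method names are hypothetical.

```python
class WindowObject:
    """Hypothetical window object exposing the interface attributes named in the text."""

    def __init__(self, center_x=0.0, center_y=0.0, angle=0.0):
        self.center_x = center_x  # CenterX: horizontal coordinate of the window center
        self.center_y = center_y  # CenterY: vertical coordinate of the window center
        self.angle = angle        # Angle: rotation about the window center

    def apply_drag_delta(self, dx, dy, alpha):
        # CenterX = CenterX + ΔX, CenterY = CenterY + ΔY, Angle = Angle + α
        self.center_x += dx
        self.center_y += dy
        self.angle += alpha
        # a real implementation would trigger the interface redraw here


win = WindowObject(100.0, 100.0, 0.0)
win.apply_drag_delta(5.0, -3.0, 10.0)  # one Timer1 period's (ΔX, ΔY) and α
```

In a real drag loop, `apply_drag_delta` would be called once per Timer1 period with the deltas computed from successive touch positions.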
[0028] There are two cases, the case of a single touch point and the case of several touch points, when obtaining the amount of displacement, the amount of rotation, the initial linear velocity and the initial angular velocity. (1) As shown in figure 2, in the case of a single touch point, the method for obtaining them includes: the amount of displacement between Ti-1 and Ti is: (ΔX, ΔY) = (Xi - Xi-1, Yi - Yi-1), where (Xi, Yi) is the coordinate vector of the touch point corresponding to the Ti period of Timer1, and (Xi-1, Yi-1) is the coordinate vector of the touch point corresponding to the Ti-1 period of Timer1; the amount of rotation between Ti-1 and Ti is: α = arccos[ui · vi / (|ui| |vi|)], where the coordinates of the center of rotation are the central coordinates of the window object, (Cx, Cy); ui is the vector from the center of rotation to the touch point corresponding to the Ti period of Timer1, that is, ui = (Xi - Cx, Yi - Cy); vi is the vector from the center of rotation to the touch point corresponding to the Ti-1 period of Timer1, that is, vi = (Xi-1 - Cx, Yi-1 - Cy); the initial linear velocity V when the input device releases the window object is: V = (Xn - Xn-1, Yn - Yn-1) / (Tn - Tn-1) = (Xn - Xn-1, Yn - Yn-1) / Δt, where (Xn, Yn) is the coordinate vector of the touch point corresponding to the last period Tn of Timer1 when the input device releases the window object, and (Xn-1, Yn-1) is the coordinate vector of the touch point corresponding to the Tn-1 period; if Δt is defined as a standard time unit, the above equation can be simplified as: V = (Xn - Xn-1, Yn - Yn-1); the initial angular velocity Va when the input device releases the window object is: Va = arccos[un · vn / (|un| |vn|)] / (Tn - Tn-1) = arccos[un · vn / (|un| |vn|)] / Δt, where un is the vector from the center of rotation (Cx, Cy) to the touch point (Xn, Yn) corresponding to the Tn period of Timer1, that is, un = (Xn - Cx, Yn - Cy), and vn is the vector from the center of rotation to the touch point corresponding to the Tn-1 period of Timer1, that is, vn = (Xn-1 - Cx, Yn-1 - Cy).
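The single-touch-point quantities above can be computed roughly as follows. This is a sketch, not the patented implementation; note that `math.acos` of the normalized dot product yields only the magnitude of the rotation (a real implementation would recover its sign, e.g. from the cross product), and the clamping of the cosine to [-1, 1] is an added floating-point safeguard not in the source.

```python
import math

def displacement(p_prev, p_cur):
    """(ΔX, ΔY) = (Xi - Xi-1, Yi - Yi-1) between two successive touch positions."""
    return (p_cur[0] - p_prev[0], p_cur[1] - p_prev[1])

def rotation_amount(p_prev, p_cur, center):
    """α = arccos(u·v / (|u||v|)) with u, v measured from the rotation center."""
    u = (p_cur[0] - center[0], p_cur[1] - center[1])    # ui = (Xi - Cx, Yi - Cy)
    v = (p_prev[0] - center[0], p_prev[1] - center[1])  # vi = (Xi-1 - Cx, Yi-1 - Cy)
    dot = u[0] * v[0] + u[1] * v[1]
    norm = math.hypot(*u) * math.hypot(*v)
    if norm == 0:
        return 0.0  # degenerate case: a touch point on the rotation center
    # clamp against floating-point drift (added safeguard, not from the source)
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# with Δt taken as the standard time unit, the initial linear velocity is
# simply the last displacement: V = (Xn - Xn-1, Yn - Yn-1)
dx, dy = displacement((10.0, 10.0), (14.0, 13.0))
# a quarter turn about the origin: expect π/2 radians
alpha = rotation_amount((1.0, 0.0), (0.0, 1.0), (0.0, 0.0))
```

The same two helpers, applied to the last two recorded positions, give the initial linear and angular velocities when Δt serves as the time unit.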
[0029] If Δt is defined as a standard time unit, the above equation can be simplified as: Va = arccos[un · vn / (|un| |vn|)]. (2) As shown in figure 3, in the case of several touch points, only the two touch points with the smallest device numbers are taken into account; the method for obtaining the amount of displacement, the amount of rotation, the initial linear velocity and the initial angular velocity includes: assuming that the coordinates of the two touch points with the smallest device numbers corresponding to the Ti-1 period of Timer1 are (Xi-1,d0, Yi-1,d0) and (Xi-1,d1, Yi-1,d1), and the coordinates corresponding to the Ti period are (Xi,d0, Yi,d0) and (Xi,d1, Yi,d1), where d0 and d1 are device numbers, then the coordinates of the centers of the vectors formed by the two touch points during the Ti-1 period and the Ti period are, respectively: (CXi, CYi) = ((Xi,d0 + Xi,d1) / 2, (Yi,d0 + Yi,d1) / 2), (CXi-1, CYi-1) = ((Xi-1,d0 + Xi-1,d1) / 2, (Yi-1,d0 + Yi-1,d1) / 2); the amount of displacement between Ti-1 and Ti is: (ΔX, ΔY) = (CXi - CXi-1, CYi - CYi-1); when the input device releases the window object, the initial linear velocity corresponding to Timer1's Tn period is: V = (CXn - CXn-1, CYn - CYn-1) / (Tn - Tn-1) = (CXn - CXn-1, CYn - CYn-1) / Δt, where (CXn, CYn) and (CXn-1, CYn-1) are the coordinates of the centers of the vectors formed by the two touch points with the smallest device numbers during the last period Tn and the period Tn-1 of Timer1 when the input device releases the window object, and Δt is the time interval of Timer1.
[0030] If Δt is defined as a standard time unit, the above equation can be simplified as: V = (CXn - CXn-1, CYn - CYn-1). In the case of multiple touch points, when obtaining the amount of rotation, only the two touch points with the smallest device numbers are taken into account; the point of intersection of the line connecting the two touch points during the Ti-1 period with the line connecting the two touch points during the Ti period serves as the center for obtaining the amount of rotation. The coordinates of the two touch points corresponding to the Ti period are (Xi,d0, Yi,d0) and (Xi,d1, Yi,d1), and the coordinates of the two touch points corresponding to the Ti-1 period are (Xi-1,d0, Yi-1,d0) and (Xi-1,d1, Yi-1,d1); the central coordinate of rotation for obtaining the amount of rotation is: (Cx, Cy) = ((Xi,d0 + Xi,d1) / 2, (Yi,d0 + Yi,d1) / 2); the amount of rotation between Ti-1 and Ti is: αi = arccos[ui · vi / (|ui| |vi|)], where ui is the vector (Xi,d0 - Xi,d1, Yi,d0 - Yi,d1) formed by the two touch points with the smallest device numbers corresponding to the Ti period of Timer1, and vi is the vector (Xi-1,d0 - Xi-1,d1, Yi-1,d0 - Yi-1,d1) formed by the two touch points with the smallest device numbers corresponding to the Ti-1 period of Timer1.
[0031] When the input device releases the window object, the initial angular velocity Va corresponding to the Tn period of Timer1 is: Va = arccos[un · vn / (|un| |vn|)] / (Tn - Tn-1) = arccos[un · vn / (|un| |vn|)] / Δt, where un is the vector (Xn,d0 - Xn,d1, Yn,d0 - Yn,d1) formed by the two touch points with the smallest device numbers corresponding to the last period Tn of Timer1 when the input device releases the window object, and vn is the vector (Xn-1,d0 - Xn-1,d1, Yn-1,d0 - Yn-1,d1) formed by the two touch points with the smallest device numbers corresponding to the Tn-1 period of Timer1.
[0032] If Δt is defined as the standard time unit, the above equation can be simplified as: Va = arccos[un · vn / (|un| |vn|)].
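In the multi-touch case above, the two touch points with the smallest device numbers are reduced to their midpoint for displacement, and to the angle between the inter-point vectors for rotation. A rough sketch (hypothetical helper names; as before, arccos gives only the rotation magnitude and the clamp is an added safeguard):

```python
import math

def midpoint(p0, p1):
    """(CX, CY) = ((X,d0 + X,d1) / 2, (Y,d0 + Y,d1) / 2) for the two touch points."""
    return ((p0[0] + p1[0]) / 2.0, (p0[1] + p1[1]) / 2.0)

def two_point_rotation(prev0, prev1, cur0, cur1):
    """αi = arccos(ui·vi / (|ui||vi|)), with ui, vi the vectors joining the two
    touch points at the Ti and Ti-1 periods respectively."""
    u = (cur0[0] - cur1[0], cur0[1] - cur1[1])
    v = (prev0[0] - prev1[0], prev0[1] - prev1[1])
    dot = u[0] * v[0] + u[1] * v[1]
    norm = math.hypot(*u) * math.hypot(*v)
    if norm == 0:
        return 0.0  # the two touch points coincide
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# displacement of the midpoint between two periods: (ΔX, ΔY) = (CXi - CXi-1, CYi - CYi-1)
c_prev = midpoint((0.0, 0.0), (2.0, 0.0))  # (1.0, 0.0)
c_cur = midpoint((1.0, 1.0), (3.0, 1.0))   # (2.0, 1.0)
delta = (c_cur[0] - c_prev[0], c_cur[1] - c_prev[1])
```

With Δt as the standard time unit, applying the same helpers to the last two recorded periods yields the initial linear and angular velocities directly.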
[0033] The inertia processing unit 103 is configured to carry out the effect of the inertial movement after the input device releases the window object, that is, after the user has finished the drag operation. After receiving the release control information sent from the drag processing unit 102, the inertia processing unit 103 starts a second timer, Timer2, to activate a timer processing function at each time interval Tg; in the timer processing function it obtains, according to the initial linear velocity and initial angular velocity corresponding to the current period of Timer2, the amount of displacement and the amount of rotation for the current period, and then sends them to the window object; the window object then sets its own attributes to achieve the effect of the inertial movement.
[0034] The method for obtaining the initial linear velocity corresponding to the Ti period of Timer2 includes: Vi = Vi-1 - Vdec * Tg, where Vdec is the linear velocity deceleration, which can be predefined by a user or by the system according to the real situation, and which is equivalent to adjusting the friction coefficient of the linear velocity; in the first period of Timer2 after the input device releases the window object, V1 is the initial linear velocity V sent by the drag processing unit 102; Tg is the period of Timer2, and can be predefined by the user or by the system.
[0035] The method for obtaining the initial angular velocity corresponding to the Ti period of Timer2 includes: Vai = Vai-1 - Vadec * Tg, where Vadec is the angular velocity deceleration, which can be predefined by a user or by the system according to the real situation, and which is equivalent to adjusting the friction coefficient of the angular velocity; in the first period of Timer2 after the input device releases the window object, Va1 is the initial angular velocity Va sent by the drag processing unit 102. The amount of displacement corresponding to the Ti period of Timer2 is: (ΔX, ΔY) = (Vi - Vdec * Tg) * Tg = (Vi.X - Vdec * Tg, Vi.Y - Vdec * Tg) * Tg, where Vi is the initial linear velocity vector corresponding to the Ti period of Timer2, and Tg is the time interval of Timer2; the amount of rotation corresponding to the Ti period of Timer2 is: αi = (Vai - Vadec * Tg) * Tg, where Vai is the initial angular velocity corresponding to the Ti period of Timer2.
[0036] If Tg is equal to Δt, and Δt is defined as the standard time unit, the above equations can be simplified as: (ΔX, ΔY) = (Vi.X - Vdec, Vi.Y - Vdec) and αi = Vai - Vadec; that is, if Tg = Δt serves as the common time unit for speed sampling, inertia processing and the updating of the window object's position, the amount of calculation can be greatly reduced.
[0037] When the amount of displacement and the amount of rotation of the window object's inertial movement during the current period, obtained through the above equations, are less than or equal to 0, the inertia process ends.
[0038] After receiving the amount of displacement and the amount of rotation sent by the inertia processing unit 103, the window object defines its own coordinate center attribute value and angle of rotation attribute value as: CenterX = CenterX + ΔX, CenterY = CenterY + ΔY, Angle = Angle + α.
[0039] The function of resetting the interface is triggered by setting the attribute value of the coordinate center and the attribute value of the rotation angle, in order to display the window object in the new coordinate position and new rotation angle, thus realizing the effect of inertial motion.
[0040] Figure 4 shows a flowchart of a method for moving a window object according to the present disclosure, including the following steps: Step 401: monitor the initial touch operation of the input device, and obtain the initial control information.
[0041] In this step, the initial touch operation of the input device is monitored by the touch point capture unit, and the touch point capture unit sends the initial control information to the drag processing unit after detecting the initial touch operation.
[0042] The input device comprises a mouse, a touch screen, a writing pad and so on, which have the function of moving the window object. Different input devices generate different system messages. For example, the system message indicating that the mouse activates the window object to move may be that the left mouse button is pressed and moved, while the system message indicating that the touch screen activates the window object to move is that the fingers press and move. The initial touch operation refers to the operation in which a user initially presses the left mouse button, or a finger initially presses the touch screen, to move the window object. The initial control information includes: the position of the initial touch point, time, device number, button code and so on.
[0043] Step 402: record the initial control information, start a first timer Timer1, and set its time interval to Δt, where Δt is predefined by a user or by the system according to the real situation.
[0044] After receiving the initial control information sent from the touch point capture unit, the drag processing unit records the initial control information and starts Timer1. Timer1 is used to periodically trigger an event function that calculates the amount of displacement (ΔX, ΔY) and the amount of rotation α of the window object, in order to obtain the amount of displacement and the amount of rotation generated on the screen by the input device during the current period.
[0045] Step 403: each time the Timer1 time interval elapses, obtain the current control information, and obtain the amount of displacement and the amount of rotation generated on the screen by the user through the input device during the current period.
[0046] The prerequisite for executing this step is that the user keeps the input device pressed, holding the window object still or dragging it. The touch point capture unit transforms the monitored system messages generated by the input device into current control information (including position, time, device number, and so on) and continuously sends the current control information to the drag processing unit; the drag processing unit updates the locally cached current control information in real time, and records the current control information at each Timer1 time interval, forming a history of the control information. The history reflects the trace, which can also be called a gesture, of the movement of the input device on the screen; from it, the amount of displacement and the amount of rotation of the window object corresponding to the current Timer1 period are obtained.
[0047] Step 404: the window object adjusts its position and the rotation angle on the screen according to the amount of displacement and the amount of rotation obtained.
[0048] After obtaining the amount of displacement and the amount of rotation at the current time point, the drag processing unit sends them to the window object; the window object triggers its window-redrawing method, setting its own coordinate center and the rotation angle attribute value relative to the coordinate center, in order to carry out the drag effect.
[0049] Step 405: determine whether the input device releases the window object, that is, whether the user releases the mouse button or raises his finger to end the dragging operation, if so, perform Step 406, otherwise, perform Step 403.
[0050] In this step, it is the touch point capture unit that monitors whether the input device releases the window object. After the release is detected, the touch point capture unit immediately sends the release control information to the drag processing unit, where the release control information includes: the position at the time of release, the time, the device number, and so on.
[0051] Step 406: when the drag operation is finished, obtain the initial linear velocity V and the initial angular velocity Va corresponding to the last period Tn of Timer1, and stop Timer1.
[0052] The drag processing unit obtains, according to the history of the control information, the initial linear velocity V and initial angular velocity Va corresponding to the last period Tn of Timer1 when the input device releases the window object; the obtaining method is described above, and will not be repeated here; the drag processing unit sends the obtained initial linear speed V and initial angular speed Va to the inertia processing unit.
[0053] Step 407: start a second Timer2 timer, and set the time interval to Tg.
[0054] In this step, after receiving V and Va, the inertia processing unit starts the second timer Timer2, where Timer2 is used to effect the inertial movement after the input device releases the window object.
[0055] Step 408: each time the Timer2 time interval elapses, obtain the current amount of displacement and amount of rotation of the window object according to the linear velocity deceleration Vdec and the angular velocity deceleration Vadec, record the current position, and calculate the initial linear velocity V and the initial angular velocity Va for the next time period.
[0056] In this step, the inertia processing unit obtains, according to the predefined Vdec and Vadec, the amount of displacement and the amount of rotation used to keep the window object moving forward during the current Timer2 interval after the input device releases it; the method of obtaining them is described above and will not be repeated here.
[0057] The initial linear velocity V and the initial angular velocity Va for the next Timer2 time period are obtained as: Vi = Vi-1 - Vdec * Tg; Vai = Vai-1 - Vadec * Tg.
[0058] Step 409: determine whether the inertial movement ends; if so, end the process; if not, perform Step 410.
[0059] Determining whether the inertial movement ends includes: determining whether Vi and Vai obtained in Step 408 are both less than or equal to 0; if so, stop Timer2 and end the inertial movement process. If one value is less than or equal to 0 while the other is still greater than 0, the inertia process must continue for the other value; for example, when Vi is less than or equal to 0 and Vai is greater than 0, in the current period and in subsequent periods of Timer2 the amount of displacement is 0 while the amount of rotation is not. The inertial movement process is not terminated until both Vi and Vai are less than or equal to 0.
[0060] Step 410: the window object adjusts its position and rotation angle on the screen according to the obtained amount of displacement and amount of rotation, in order to achieve the inertia effect; then Step 408 is performed again.
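Steps 408 to 410 can be sketched as a single loop. This is an assumption-laden simplification: the linear velocity is treated here as a scalar speed magnitude (the description applies the same decay to the displacement vector componentwise), the decelerations are assumed positive for whichever speed is nonzero, and the function name is illustrative:

```python
def inertia_steps(v0, va0, vdec, vadec, tg):
    """Simulate the inertial phase (Steps 408-410): on each Timer2 tick,
    emit the displacement and rotation for that period, then decay the
    velocities; the loop ends only when BOTH velocities have reached 0.

    v0, va0:     initial linear speed and angular velocity at release
    vdec, vadec: linear and angular velocity decelerations (positive)
    tg:          Timer2 interval Tg
    """
    v, va = v0, va0
    while v > 0 or va > 0:
        # Displacement and rotation for the current period; a speed that
        # has already decayed to 0 contributes nothing, but the loop keeps
        # running until the other speed is exhausted as well (Step 409).
        ds = max(v - vdec * tg, 0.0) * tg    # (Vi - Vdec*Tg) * Tg
        da = max(va - vadec * tg, 0.0) * tg  # (Vai - Vadec*Tg) * Tg
        yield ds, da   # the window object adjusts position and angle here
        v = max(v - vdec * tg, 0.0)          # Vi = Vi-1 - Vdec*Tg
        va = max(va - vadec * tg, 0.0)       # Vai = Vai-1 - Vadec*Tg
```

With v0 = 100, vdec = 1000 and Tg = 0.02, the speed decays by 20 per tick, yielding five ticks of shrinking displacement before the movement stops.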
[0061] The above description is only a preferred embodiment of the present disclosure and should not be used to limit the scope of protection of the present disclosure.
Claims (5)
[0001]
Method for applying inertial motion to a window object, CHARACTERIZED by the fact that the method comprises:
A: monitor the initial touch operation of an input device, record the initial touch coordinate and touch time, and start a first timer;
B: in the process of dragging the window object, periodically trigger, by the first timer, an event to obtain the amount of displacement and the amount of rotation that are generated by a user on a screen through the input device relative to a previous point in time, and send the obtained amount of displacement and amount of rotation to the window object;
C: when the user releases the window object, obtain the initial linear velocity and the initial angular velocity of the window object and start a second timer;
D: based on the initial linear velocity and the initial angular velocity, periodically trigger, by the second timer, an event to obtain the current amount of displacement and amount of rotation of the window object according to the deceleration of the linear velocity and the deceleration of the angular velocity, and send the obtained current amount of displacement and amount of rotation to the window object;
E: when the initial linear velocity and the initial angular velocity corresponding to the current period of the second timer are less than or equal to 0, terminate the inertial movement process of the window object;
wherein the user drags, with one or several touch points, the window object through the input device; in the case of multiple touch points, the amount of displacement and the amount of rotation are obtained based on the two touch points with the smallest device numbers, where device numbers are used to distinguish the different touch points that are in contact with the touch screen at a given time;
wherein, in the case of a single touch point, obtaining the amount of displacement and the amount of rotation in Step B is as follows: the amount of displacement between Ti-1 and Ti is: (ΔX, ΔY) = (Xi-Xi-1, Yi-Yi-1), where (Xi, Yi) is the coordinate vector of the touch point corresponding to the Ti period of the first timer, and (Xi-1, Yi-1) is the coordinate vector of the touch point corresponding to the Ti-1 period of the first timer; the amount of rotation between Ti-1 and Ti is: αi = arccos[ui·vi / (|ui||vi|)], where the coordinate of the center of rotation is the central coordinate of the window object, (Cx, Cy); ui is the vector between the coordinates of the touch point corresponding to the Ti period of the first timer and the coordinates of the center of rotation, that is, ui = (Xi-Cx, Yi-Cy); vi is the vector between the coordinates of the touch point corresponding to the Ti-1 period of the first timer and the coordinates of the center of rotation, namely, vi = (Xi-1-Cx, Yi-1-Cy);
obtaining the initial linear velocity V and the initial angular velocity Va in Step C is: V = (Xn-Xn-1, Yn-Yn-1) / (Tn-Tn-1) = (Xn-Xn-1, Yn-Yn-1) / Δt, where (Xn, Yn) is the coordinate vector of the touch point corresponding to the last period Tn of the first timer when the input device releases the window object; (Xn-1, Yn-1) is the coordinate vector of the touch point corresponding to the Tn-1 period; Δt is the time interval of the first timer; Va = arccos[un·vn / (|un||vn|)] / (Tn-Tn-1) = arccos[un·vn / (|un||vn|)] / Δt, where un is the vector between the coordinates of the touch point (Xn, Yn) corresponding to the period Tn of the first timer and the coordinates of the center of rotation (Cx, Cy), namely, un = (Xn-Cx, Yn-Cy), when the input device releases the window object; and vn is the vector between the coordinates of the touch point corresponding to the Tn-1 period of the first timer and the coordinates of the center of rotation, namely, vn = (Xn-1-Cx, Yn-1-Cy);
wherein, in the case of multiple touch points, obtaining the amount of displacement and the amount of rotation in Step B is as follows: the amount of displacement between Ti-1 and Ti is: (ΔX, ΔY) = (CXi-CXi-1, CYi-CYi-1), where (CXi, CYi) and (CXi-1, CYi-1) are the central coordinates of the two touch points with the smallest device numbers, respectively for the Ti period and the Ti-1 period of the first timer; the amount of rotation between Ti-1 and Ti is: αi = arccos[ui·vi / (|ui||vi|)], where ui is the vector formed by the two touch points with the smallest device numbers corresponding to the Ti period of the first timer, and vi is the vector formed by the two touch points with the smallest device numbers corresponding to the Ti-1 period of the first timer;
obtaining V and Va in Step C is: V = (CXn-CXn-1, CYn-CYn-1) / (Tn-Tn-1) = (CXn-CXn-1, CYn-CYn-1) / Δt, where (CXn, CYn) and (CXn-1, CYn-1) are the central coordinates of the vectors formed by the two touch points with the smallest device numbers, respectively during the last period Tn and the Tn-1 period of the first timer, when the input device releases the window object; Δt is the time interval of the first timer; Va = arccos[un·vn / (|un||vn|)] / (Tn-Tn-1) = arccos[un·vn / (|un||vn|)] / Δt, where un is the vector formed by the two touch points with the smallest device numbers corresponding to the last period Tn of the first timer when the input device releases the window object, and vn is the vector formed by the two touch points with the smallest device numbers corresponding to the Tn-1 period;
wherein, upon receiving the amount of displacement and the amount of rotation, the window object sets its own coordinate-center attribute value and rotation-angle attribute value as: CenterX = CenterX + ΔX, CenterY = CenterY + ΔY, Angle = Angle + α; wherein the window object triggers its interface-redrawing function upon setting the coordinate-center attribute value and the rotation-angle attribute value, in order to display the window object at a new coordinate position and a new rotation angle and so achieve the effect of inertial motion.
[0002]
Method according to claim 1, CHARACTERIZED by the fact that obtaining the current amount of displacement and amount of rotation of the window object according to the deceleration of the linear velocity and the deceleration of the angular velocity in Step D is: the amount of displacement corresponding to the Ti period of the second timer is: (ΔX, ΔY) = (Vi - Vdec * Tg) * Tg; the amount of rotation corresponding to the Ti period of the second timer is: αi = (Vai - Vadec * Tg) * Tg; where Vdec is the deceleration of the linear velocity and Vadec is the deceleration of the angular velocity; Vi is the initial linear velocity corresponding to the Ti period of the second timer; Vai is the initial angular velocity corresponding to the Ti period of the second timer; Tg is the time interval of the second timer; the initial linear velocity and the initial angular velocity corresponding to the first period of the second timer, after the input device releases the window object, are those obtained in Step C.
[0003]
Device for applying the inertial movement of a window object, CHARACTERIZED by the fact that the device comprises:
a touch point capture unit, configured to capture the operation in which an input device drags the window object on a screen, and to generate the corresponding control information;
a drag processing unit, configured to carry out the drag effect of the window object according to the control information sent from the touch point capture unit, to obtain the initial linear velocity and the initial angular velocity when the input device releases the window object, and to send the obtained initial linear velocity and initial angular velocity to an inertia processing unit; and
the inertia processing unit, configured to carry out the effect of the inertial movement, based on the initial linear velocity and the initial angular velocity sent from the drag processing unit, according to the deceleration of the linear velocity and the deceleration of the angular velocity;
wherein the drag processing unit comprises: a first motion processing unit, configured to start a first timer upon receiving the initial control information sent from the touch point capture unit, to obtain, according to the current control information sent from the touch point capture unit, the amount of displacement and the amount of rotation of the window object corresponding to the current period of the first timer under activation of the first timer, and to send the obtained amount of displacement and amount of rotation to the window object; and an initial velocity processing unit, configured to obtain the initial linear velocity and the initial angular velocity after receiving the release control information sent from the touch point capture unit, and to send the obtained initial linear velocity and initial angular velocity to the inertia processing unit;
wherein the user drags, with one or several touch points, the window object through the input device; in the case of multiple touch points, the amount of displacement and the amount of rotation are obtained based on the two touch points with the smallest device numbers, where the device numbers are used to distinguish the different touch points that are in contact with the touch screen at a given time;
wherein, in the case of a single touch point, the first motion processing unit obtains the amount of displacement and the amount of rotation as follows: the amount of displacement between Ti-1 and Ti is: (ΔX, ΔY) = (Xi-Xi-1, Yi-Yi-1), where (Xi, Yi) is the coordinate vector of the touch point corresponding to the Ti period of the first timer, and (Xi-1, Yi-1) is the coordinate vector of the touch point corresponding to the Ti-1 period of the first timer; the amount of rotation between Ti-1 and Ti is: αi = arccos[ui·vi / (|ui||vi|)], where the coordinate of the center of rotation is the central coordinate of the window object, (Cx, Cy); ui is the vector between the coordinates of the touch point corresponding to the Ti period of the first timer and the coordinates of the center of rotation, that is, ui = (Xi-Cx, Yi-Cy); vi is the vector between the coordinates of the touch point corresponding to the Ti-1 period of the first timer and the coordinates of the center of rotation, namely, vi = (Xi-1-Cx, Yi-1-Cy);
the initial velocity processing unit obtains the initial linear velocity V and the initial angular velocity Va as follows: V = (Xn-Xn-1, Yn-Yn-1) / (Tn-Tn-1) = (Xn-Xn-1, Yn-Yn-1) / Δt, where (Xn, Yn) is the coordinate vector of the touch point corresponding to the last period Tn of the first timer when the input device releases the window object; (Xn-1, Yn-1) is the coordinate vector of the touch point corresponding to the Tn-1 period; Δt is the time interval of the first timer; Va = arccos[un·vn / (|un||vn|)] / (Tn-Tn-1) = arccos[un·vn / (|un||vn|)] / Δt, where un is the vector between the coordinates of the touch point (Xn, Yn) corresponding to the period Tn of the first timer and the coordinates of the center of rotation (Cx, Cy), that is, un = (Xn-Cx, Yn-Cy), when the input device releases the window object; and vn is the vector between the coordinates of the touch point corresponding to the Tn-1 period of the first timer and the coordinates of the center of rotation, namely, vn = (Xn-1-Cx, Yn-1-Cy);
wherein, in the case of multiple touch points, the first motion processing unit obtains the amount of displacement and the amount of rotation as follows: the amount of displacement between Ti-1 and Ti is: (ΔX, ΔY) = (CXi-CXi-1, CYi-CYi-1), where (CXi, CYi) and (CXi-1, CYi-1) are the central coordinates of the two touch points with the smallest device numbers, respectively for the Ti period and the Ti-1 period of the first timer; the amount of rotation between Ti-1 and Ti is: αi = arccos[ui·vi / (|ui||vi|)], where ui is the vector formed by the two touch points with the smallest device numbers corresponding to the Ti period of the first timer, and vi is the vector formed by the two touch points with the smallest device numbers corresponding to the Ti-1 period of the first timer;
the initial velocity processing unit obtains the initial linear velocity V and the initial angular velocity Va as follows: V = (CXn-CXn-1, CYn-CYn-1) / (Tn-Tn-1) = (CXn-CXn-1, CYn-CYn-1) / Δt, where (CXn, CYn) and (CXn-1, CYn-1) are the central coordinates of the vectors formed by the two touch points with the smallest device numbers, respectively during the last period Tn and the Tn-1 period of the first timer, when the input device releases the window object; Δt is the time interval of the first timer; Va = arccos[un·vn / (|un||vn|)] / (Tn-Tn-1) = arccos[un·vn / (|un||vn|)] / Δt, where un is the vector formed by the two touch points with the smallest device numbers corresponding to the last period Tn of the first timer when the input device releases the window object, and vn is the vector formed by the two touch points with the smallest device numbers corresponding to the Tn-1 period;
wherein, after receiving the initial linear velocity and the initial angular velocity sent from the drag processing unit, the inertia processing unit starts a second timer to trigger a timer processing function every time period Tg; in the timer processing function it obtains, according to the initial linear velocity and the initial angular velocity corresponding to the current period of the second timer, the amount of displacement and the amount of rotation during the current period, and then sends the obtained amount of displacement and amount of rotation during the current period to the window object;
wherein, upon receiving the amount of displacement and the amount of rotation, the window object sets its own coordinate-center attribute value and rotation-angle attribute value as: CenterX = CenterX + ΔX, CenterY = CenterY + ΔY, Angle = Angle + α; wherein the window object triggers its interface-redrawing function upon setting the coordinate-center attribute value and the rotation-angle attribute value, in order to display the window object at a new coordinate position and a new rotation angle and so achieve the effect of inertial motion.
[0004]
Device according to claim 3, CHARACTERIZED by the fact that the control information generated by the touch point capture unit comprises: initial control information, generated when the input device initially touches the window object, which comprises the position, the time, the device number, and an identity indicating the initial touch of the window object; current control information, generated when the input device moves the window object, which comprises the position, the time, the device number, and an identity indicating the dragging of the window object; and release control information, generated when the input device releases the window object, which comprises the position, the time, the device number, and an identity indicating the release of the window object.
[0005]
Device according to claim 3, CHARACTERIZED by the fact that the inertia processing unit obtains the current amount of displacement and amount of rotation of the window object according to the deceleration of the linear velocity and the deceleration of the angular velocity as follows: the amount of displacement corresponding to the Ti period of the second timer is: (ΔX, ΔY) = (Vi - Vdec * Tg) * Tg; the amount of rotation corresponding to the Ti period of the second timer is: αi = (Vai - Vadec * Tg) * Tg; where Vdec is the deceleration of the linear velocity and Vadec is the deceleration of the angular velocity; Vi is the initial linear velocity corresponding to the Ti period of the second timer; Vai is the initial angular velocity corresponding to the Ti period of the second timer; Tg is the time interval of the second timer; the initial linear velocity and the initial angular velocity corresponding to the first period of the second timer after the input device releases the window object are those sent respectively by the drag processing unit.
Similar technologies:
Publication number | Publication date | Patent title
BR112012019980B1|2021-01-26|method and device for applying inertial movement of the window object
US9182838B2|2015-11-10|Depth camera-based relative gesture detection
KR101572307B1|2015-11-26|Information processing apparatus, control method thereof, and storage medium
CN103713766B|2017-01-11|Method and system for detecting and handling unintentional touching of a touch screen
TWI494795B|2015-08-01|Multi-touch object inertia simulation
WO2016090888A1|2016-06-16|Method, apparatus and device for moving icon, and non-volatile computer storage medium
US8643616B1|2014-02-04|Cursor positioning on a touch-sensitive display screen
JP2014501996A5|2015-02-26|
EP3195102B1|2019-02-27|Simulating real-time responsiveness for touch displays
JP2016534481A|2016-11-04|System and method for providing a response to user input using information regarding state changes and predictions of future user input
JP2012507077A5|2012-12-06|
CN104364734A|2015-02-18|Remote session control using multi-touch inputs
CN104951213B|2018-06-22|The method for preventing false triggering boundary slip gesture
CN103823630A|2014-05-28|Virtual mouse
US9684445B2|2017-06-20|Mobile gesture reporting and replay with unresponsive gestures identification and analysis
Zhang et al.2014|Ren
US20150253856A1|2015-09-10|Determining pointer and scroll gestures on a touch-sensitive input device
CN104346095B|2018-07-06|A kind of information processing method and electronic equipment
US10234953B1|2019-03-19|Cross-device interaction through user-demonstrated gestures
WO2017096797A1|2017-06-15|Operating assembly control method and system based on motion sensing
CN105808129B|2020-05-22|Method and device for quickly starting software function by using gesture
JP2016515741A|2016-05-30|Pan and scaling detection during multi-finger touch interaction
CN106547339B|2020-03-20|Control method and device of computer equipment
Sakai et al.2012|AccuMotion: intuitive recognition algorithm for new interactions and experiences for the post-PC era
WO2018098960A1|2018-06-07|Method for operating touchscreen device, and touchscreen device
Patent family:
Publication number | Publication date
WO2012003799A1|2012-01-12|
JP2013533550A|2013-08-22|
US8462132B2|2013-06-11|
MY160170A|2017-02-28|
CN102314297A|2012-01-11|
AU2011276797B2|2014-08-14|
JP5655142B2|2015-01-14|
EP2523086A4|2014-12-24|
RU2491610C1|2013-08-27|
KR101458289B1|2014-11-04|
BR112012019980A2|2020-09-24|
EP2523086A1|2012-11-14|
US20120306796A1|2012-12-06|
CN102314297B|2016-04-13|
AU2011276797A1|2012-08-23|
EP2523086B1|2017-06-21|
KR20130008578A|2013-01-22|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

JP2827612B2|1991-10-07|1998-11-25|富士通株式会社|A touch panel device and a method for displaying an object on the touch panel device.|
JP3593827B2|1996-11-26|2004-11-24|ソニー株式会社|Screen scroll control device and scroll control method|
US6539101B1|1998-04-07|2003-03-25|Gerald R. Black|Method for identity verification|
US7783978B1|1999-10-20|2010-08-24|Koninklijke Philips Electronics N.V.|Information processing device|
JP2002013532A|2000-06-28|2002-01-18|Koyo Seiko Co Ltd|Magnetic bearing control system|
US7084854B1|2000-09-28|2006-08-01|Immersion Corporation|Actuator for providing tactile sensations and device for directional tactile sensations|
JP3918700B2|2002-09-26|2007-05-23|株式会社デンソー|Navigation device|
GB0301360D0|2003-01-21|2003-02-19|Scratch Ltd|Golf stroke training device|
US8462109B2|2007-01-05|2013-06-11|Invensense, Inc.|Controlling and accessing content using motion processing on mobile devices|
KR100984596B1|2004-07-30|2010-09-30|애플 인크.|Gestures for touch sensitive input devices|
US7728821B2|2004-08-06|2010-06-01|Touchtable, Inc.|Touch detecting interactive display|
US8264465B2|2004-10-08|2012-09-11|Immersion Corporation|Haptic feedback for button and scrolling action simulation in touch input devices|
US7683883B2|2004-11-02|2010-03-23|Pierre Touma|3D mouse and game controller based on spherical coordinates system and system for use|
JP3888382B2|2005-07-05|2007-02-28|松下電器産業株式会社|Data processing device|
US7834847B2|2005-12-01|2010-11-16|Navisense|Method and system for activating a touchless control|
EP1804154A3|2005-12-27|2012-08-08|Poston Timothy|Computer input device enabling three degrees of freedom and related input and feedback methods|
KR100877829B1|2006-03-21|2009-01-12|엘지전자 주식회사|Terminal with scrolling function and scrolling method thereof|
US9395905B2|2006-04-05|2016-07-19|Synaptics Incorporated|Graphical scroll wheel|
US7469381B2|2007-01-07|2008-12-23|Apple Inc.|List scrolling and document translation, scaling, and rotation on a touch-screen display|
JP4973245B2|2007-03-08|2012-07-11|富士ゼロックス株式会社|Display device and program|
US7832126B2|2007-05-17|2010-11-16|Siemens Industry, Inc.|Systems, devices, and/or methods regarding excavating|
US8432365B2|2007-08-30|2013-04-30|Lg Electronics Inc.|Apparatus and method for providing feedback for three-dimensional touchscreen|
EP2060970A1|2007-11-12|2009-05-20|Research In Motion Limited|User interface for touchscreen device|
US20090262137A1|2008-01-10|2009-10-22|Walker Jay S|Systems and methods for presenting prediction in a broadcast|
US20110102570A1|2008-04-14|2011-05-05|Saar Wilf|Vision based pointing device emulation|
JP2009277117A|2008-05-16|2009-11-26|Kenwood Corp|Navigation device, program, and map scroll method|
KR100914438B1|2008-05-20|2009-08-28|엘지전자 주식회사|Electronic device with touch device and method of executing functions thereof|
US8477103B2|2008-10-26|2013-07-02|Microsoft Corporation|Multi-touch object inertia simulation|
JP5267990B2|2009-02-26|2013-08-21|Necインフロンティア株式会社|Electronic journal data browsing system, electronic journal data browsing method, program thereof, and recording medium|
JP5568970B2|2009-11-30|2014-08-13|ソニー株式会社|Information processing apparatus, information processing method, and program thereof|
JP5381691B2|2009-12-25|2014-01-08|アイシン・エィ・ダブリュ株式会社|Map display device, map display method and program|
CN101751195A|2009-12-30|2010-06-23|中兴通讯股份有限公司|Method and device for controlling move event on resistive touch screen|
JP5230684B2|2010-05-13|2013-07-10|パナソニック株式会社|Electronic device, display method, and program|
JP5800473B2|2010-06-11|2015-10-28|任天堂株式会社|Information processing program, information processing apparatus, information processing system, and information processing method|
JP5241038B2|2010-07-01|2013-07-17|パナソニック株式会社|Electronic device, display control method, and program|
US9857868B2|2011-03-19|2018-01-02|The Board Of Trustees Of The Leland Stanford Junior University|Method and system for ergonomic touch-free interface|
US8840466B2|2011-04-25|2014-09-23|Aquifi, Inc.|Method and system to create three-dimensional mapping in a two-dimensional game|
US8854433B1|2012-02-03|2014-10-07|Aquifi, Inc.|Method and system enabling natural user interface gestures with an electronic system|
US8836768B1|2012-09-04|2014-09-16|Aquifi, Inc.|Method and system enabling natural user interface gestures with user wearable glasses|
CN103365583B|2012-04-06|2016-06-22|宏碁股份有限公司|Electronic installation and the method unlocked thereof|
CN103425706B|2012-05-25|2016-03-23|腾讯科技(深圳)有限公司|Method and the device of element dragging is realized in the webpage of embedded floating frame element|
US9111135B2|2012-06-25|2015-08-18|Aquifi, Inc.|Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera|
US9098739B2|2012-06-25|2015-08-04|Aquifi, Inc.|Systems and methods for tracking human hands using parts based template matching|
US9268424B2|2012-07-18|2016-02-23|Sony Corporation|Mobile client device, operation method, recording medium, and operation system|
CN102833183B|2012-08-16|2017-02-15|上海量明科技发展有限公司|Instant messaging interactive interface moving method, client and system|
CN103677503B|2012-09-14|2015-10-21|腾讯科技(深圳)有限公司|The triggering operation method of interface assembly and device|
CN102929535B|2012-10-09|2018-05-01|中兴通讯股份有限公司|A kind of method and terminal of the control of suspension window position|
US9092665B2|2013-01-30|2015-07-28|Aquifi, Inc|Systems and methods for initializing motion tracking of human hands|
US9129155B2|2013-01-30|2015-09-08|Aquifi, Inc.|Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map|
US9298266B2|2013-04-02|2016-03-29|Aquifi, Inc.|Systems and methods for implementing three-dimensionalgesture based graphical user interfacesthat incorporate gesture reactive interface objects|
CN103246476B|2013-04-27|2016-12-28|华为技术有限公司|The spinning solution of a kind of screen content, device and terminal unit|
CN104182116A|2013-05-23|2014-12-03|中兴通讯股份有限公司|View processing method and view processing terminal|
US9798388B1|2013-07-31|2017-10-24|Aquifi, Inc.|Vibrotactile system to augment 3D input systems|
KR102203810B1|2013-10-01|2021-01-15|삼성전자주식회사|User interfacing apparatus and method using an event corresponding a user input|
CN103595972A|2013-11-28|2014-02-19|深圳英飞拓科技股份有限公司|Remote focusing device real-time browse control method and system|
JP5924555B2|2014-01-06|2016-05-25|コニカミノルタ株式会社|Object stop position control method, operation display device, and program|
US9507417B2|2014-01-07|2016-11-29|Aquifi, Inc.|Systems and methods for implementing head tracking based graphical user interfacesthat incorporate gesture reactive interface objects|
US9619105B1|2014-01-30|2017-04-11|Aquifi, Inc.|Systems and methods for gesture based interaction with viewpoint dependent user interfaces|
KR102205283B1|2014-02-12|2021-01-20|삼성전자주식회사|Electro device executing at least one application and method for controlling thereof|
CN104866166A|2014-02-21|2015-08-26|联想有限公司|Information processing method and electronic equipment|
CN105022567B|2014-04-30|2018-06-26|惠州市德赛西威汽车电子股份有限公司|A kind of method by mouse drag slide handover interface|
CN106462355B|2014-06-11|2020-05-19|三菱电机株式会社|Display control system and display control method|
CN104636027A|2014-08-22|2015-05-20|康佳集团股份有限公司|Method and system for managing tabletop icons based on intelligent television|
CN105045521A|2015-08-31|2015-11-11|网易(杭州)网络有限公司|Touch screen response method and apparatus|
CN105975122A|2016-04-27|2016-09-28|集怡嘉数码科技(深圳)有限公司|Touch track compensation method and apparatus as well as terminal device|
JP6201004B1|2016-06-01|2017-09-20|株式会社ゲオインタラクティブ|User interface program|
CN106325713B|2016-08-17|2019-12-03|厦门印天电子科技有限公司|A kind of inertia moving method of whiteboarding software sliding object|
CN108803926A|2018-05-29|2018-11-13|广东工业大学|A kind of control method of displacement vector and direction vector based on touch point|
CN111083287B|2019-12-16|2020-12-25|深圳集智数字科技有限公司|Visual list sorting method and device|
CN111026313B|2019-12-16|2021-01-22|深圳介子云图空间科技有限公司|Visual list sorting method and device|
Legal status:
2020-09-29| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-12-22| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-01-26| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 06/07/2011, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
CN201010223457.0|2010-07-07|
CN201010223457.0A|CN102314297B|2010-07-07|2010-07-07|A kind of Window object inertia displacement method and implement device|
PCT/CN2011/076909|WO2012003799A1|2010-07-07|2011-07-06|Method and device for window object inertial movement|