Patent abstract:
EDGE GESTURE. This document describes techniques and devices that enable an edge gesture. In some embodiments, these techniques and devices make it possible to select a user interface currently not exposed on a screen using an edge gesture that is easy to use and remember.
Publication number: BR112013030414A2
Application number: R112013030414-6
Filing date: 2011-10-09
Publication date: 2020-08-11
Inventors: Jennifer Nan; Jesse Clay Satterfield; David A. Matthews; Thomas P. Russo; Robert J. Jarrett; Weidong Zhao; Jensen Harris; Chaitanya Dev Sareen
Applicant: Microsoft Corporation
IPC main classification:
Patent description:

Descriptive Report of the Invention Patent for "COMPUTER-IMPLEMENTED METHOD".
BACKGROUND
Conventional techniques for selecting a user interface that is not currently displayed on a screen are often confusing, take up valuable display space, cannot be applied universally across different devices, or provide a poor user experience. Some conventional techniques, for example, allow selection of a user interface through on-screen controls on a taskbar, inside a floating window, or in a window frame. These on-screen controls, however, take up valuable display real estate and can irritate users by requiring them to find and select the correct control. Some other conventional techniques make it possible to select a user interface through hardware, such as hot keys and buttons. At best, these techniques require users to remember which key, key combination, or hardware button to select. Even in this best case, users often accidentally select keys or buttons. In addition, in many cases hardware selection techniques cannot be applied universally, since the hardware in computing devices may vary by device model, generation, vendor, or manufacturer. In such cases, the techniques either will not work or will work differently on different computing devices. This exacerbates the problem for users who need to remember the correct hardware, since many users have multiple devices and so may need to remember different hardware selections for different devices. Furthermore, for many computing devices, hardware selection forces users to engage a computing device outside the user's normal flow of interaction, such as when a touchscreen device requires the user to change his or her mental and physical orientation from screen-based interactions to hardware-based interactions.
SUMMARY
This document describes techniques and devices that enable an edge gesture. In some embodiments, these techniques and devices allow the selection of a user interface currently not exposed on a screen through an edge gesture that is easy to use and remember. This Summary is provided to introduce simplified concepts for enabling an edge gesture that are further described below in the Detailed Description. This Summary is not intended to identify essential features of the claimed subject matter, nor is it intended to be used in determining the scope of the claimed subject matter. Techniques and/or devices that enable an edge gesture are also referred to in this document, separately or together, as the "techniques" as permitted by the context.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments that make an edge gesture possible are described with reference to the following drawings. The same numbers are used throughout the drawings to refer to the same features and components:
Fig. 1 illustrates an exemplary system in which the techniques that enable an edge gesture can be implemented.
Fig. 2 illustrates an exemplary method for enabling edge gestures based on the edge gesture being approximately perpendicular to an edge at which the gesture begins.
Fig. 3 illustrates an exemplary tablet computing device having a touch screen showing an immersive interface.
Fig. 4 illustrates the exemplary immersive interface of Fig. 3 together with exemplary edges.
Fig. 5 illustrates the immersive interface of figures 3 and 4, together with angle-of-variance lines from a perpendicular line and a line from a starting point to a posterior point of a gesture.
Fig. 6 illustrates the edges of the immersive interface shown in Fig. 4 together with two regions on the right edge.
Fig. 7 illustrates an application selection interface presented by a system interface module in response to an edge gesture, over the immersive interface and web page of Fig. 3.
Fig. 8 illustrates an exemplary method for enabling edge gestures, including determining an interface to present based on some factor of the gesture.
Fig. 9 illustrates an exemplary method that makes it possible to expand, or cease the presentation of, a user interface presented in response to an edge gesture, or to present another user interface.
Fig. 10 illustrates a laptop computer with a touch screen having a window-based email interface and two immersive interfaces.
Fig. 11 illustrates the interfaces of Fig. 10 together with two gestures having a starting point, posterior points, and one or more successive points.
Fig. 12 illustrates the window-based email interface of figures 10 and 11 along with an email manipulation interface presented in response to an edge gesture.
Fig. 13 illustrates the interfaces in Fig. 12 together with an additional email options interface presented in response to a gesture determined to have a successive point at a predefined distance from the edge.
Fig. 14 illustrates an exemplary device in which the techniques that enable edge gestures can be implemented.
DETAILED DESCRIPTION
Overview
This document describes techniques and devices that enable an edge gesture. These techniques allow a user to quickly and easily select an interface not currently exposed on the user's device, as well as perform other operations.
Consider a case where the user is watching a movie on a tablet computing device.
Suppose the movie is playing on an immersive interface that takes up the entire screen and that the user would like to check her social networking web page without interrupting the movie.
The techniques and devices described make it possible to select other interfaces through a simple swipe gesture initiated at an edge of her screen.
She can slide her finger from the edge of her screen and drag out a user interface that makes it possible to select her social networking site.
Or suppose instead that she would like to interact with the media application playing the movie in a way not permitted by the immersive interface, such as displaying a menu that enables subtitles or a director's commentary.
She can slide her finger from another edge of her tablet's screen, drag a control menu onto the immersive interface, and select items and commands from this menu quickly and easily.
In both cases, the valuable real estate used to play the movie is not taken up by on-screen controls, nor is the user required to remember and find a hardware button.
Furthermore, no gestures other than one beginning at an edge are used by the techniques in this example, thus allowing the immersive interface to use almost all commonly available gestures.
Furthermore, by considering edge gestures or portions thereof, the techniques do not affect the performance of a gesture or touch input system, since edge gestures can be processed before the entire gesture is complete, avoiding the latency associated with fully processing gestures initiated elsewhere.
These are just two examples of the many ways in which the techniques enable and use edge gestures, others of which are described below.
Exemplary System
Fig. 1 illustrates an exemplary system 100 in which techniques that enable an edge gesture can be incorporated. System 100 includes a computing device 102, which is illustrated with six examples: a laptop computer 104, a tablet computer 106, a smart phone 108, a set-top box 110, a desktop computer 112, and a gaming system 114, although other computing devices and systems, such as servers and netbooks, can also be used. Computing device 102 includes computer processor(s) 116 and computer-readable storage media 118 (media 118). Media 118 includes an operating system 120, a window-based mode module 122, an immersive mode module 124, a system interface module 126, a gesture handler 128, and one or more applications 130, each having one or more application user interfaces 132.
Computing device 102 also includes or has access to one or more screens 134 and input mechanisms 136. Four example screens are illustrated in Fig. 1. Input mechanisms 136 can include gesture-sensitive sensors and devices, such as touch-based sensors and motion-tracking sensors (for example, camera-based), as well as mice (stand-alone or integral with a keyboard), track pads, and microphones with accompanying voice-recognition software, to name a few. Input mechanisms 136 can be separate from or integral with screens 134; integral examples include gesture-sensitive screens with touch-sensitive or motion-sensitive sensors.
The window-based mode module 122 presents application user interfaces 132 through windows having frames. These frames can provide controls through which to interact with an application and/or controls that allow a user to move and resize the window. The immersive mode module 124 provides an environment in which a user can view and interact with one or more of the applications 130. In some embodiments, this environment presents content from, and allows interaction with, applications with little or no window frame, and/or without a need for a user to manage a window frame's layout or primacy over other windows (for example, which window is active or in front) or to manually size or position application user interfaces 132.
This environment can be, but is not required to be, hosted and/or surfaced without using a window-based desktop environment. Thus, in some cases, the immersive mode module 124 presents an immersive environment that is not a window (even one without a substantial frame) and precludes the use of desktop-like displays (for example, a taskbar). Furthermore, in some embodiments this immersive environment is similar to an operating system in that it cannot be closed or uninstalled. Although not required, in some cases this immersive environment makes it possible for applications to use all or almost all of the pixels of a screen. Examples of immersive environments are provided below as part of a description of the techniques, though they are not exhaustive nor intended to limit the techniques described in this document.
The system interface module 126 provides one or more interfaces through which interaction with the operating system 120 is enabled, such as an application launch interface, a start menu, or a system tools or options menu, to name just a few.
Operating system 120, modules 122, 124, and 126, as well as gesture handler 128, can be separate from each other, or combined or integrated in any suitable form.
Exemplary Methods
Fig. 2 describes a method 200 for enabling edge gestures based on the edge gesture being approximately perpendicular to an edge at which the gesture begins. In portions of the following discussion reference may be made to system 100 of Fig. 1, reference to which is made only by way of example.
Block 202 receives a gesture.
This gesture can be received in various parts of a screen, such as through a window-based interface, an immersive interface, or no interface at all.
In addition, this gesture can be made and received in several ways, such as a pointer tracking a movement received through a touch pad, mouse, or roller ball, or a physical movement made with arm(s), finger(s), or a pen received through a motion-sensitive or touch-sensitive mechanism.
In some cases, the gesture is received outside or in the vicinity of a physical edge of the screen (for example, as a finger or a pen meets the edge of the screen) from a touch digitizer, a capacitive touch screen, or a capacitive sensor, to name a few.
As an example, consider Fig. 3, which illustrates a tablet computing device 106. Tablet 106 includes a touch screen 302 shown displaying an immersive interface 304 that includes a web page 306. As part of an ongoing example, at block 202 the gesture handler 128 receives gesture 308 as shown in Fig. 3.
Block 204 determines whether a starting point of the gesture is at an edge.
As noted above, the edge in question can be an edge of a user interface, whether immersive or window-based, or an edge of a screen.
In some cases, of course, an edge of a user interface is also an edge of a screen.
The size of the edge can vary based on various factors of the screen or the interface.
A small screen or interface, for example, may have a smaller edge, in absolute or pixel terms, than a large screen or interface.
A highly sensitive input mechanism allows for a smaller edge as well.
In some cases, an edge may extend past an edge of the display or screen when an input mechanism is capable of receiving a portion of a gesture beyond the display or screen.
Exemplary edges are rectangular and vary between one and twenty pixels in one dimension and the boundary of the interface or screen in the other dimension, although other sizes and shapes, including concave and convex edges, may instead be used.
Continuing the ongoing example, consider Fig. 4, which illustrates the immersive interface 304 and the gesture 308 of Fig. 3, as well as the left edge 402, top edge 404, right edge 406, and bottom edge 408. For visual clarity the web page 306 is not shown. In this example, the dimensions of the interface and the screen are of a moderate size, between that of smart phones and that of many laptop and desktop screens. The edges 402, 404, 406, and 408 have a small dimension of twenty pixels, or about 10-15 millimeters in absolute terms, the area of each shown bounded by a dashed line at twenty pixels from the screen boundary at edge limits 410, 412, 414, and 416, respectively.
The gesture handler 128 determines that gesture 308 has a starting point 418 and that this starting point 418 is on the left edge 402. The gesture handler 128 determines the starting point in this case by receiving data indicating the [X, Y] coordinates in pixels at which gesture 308 begins and comparing the first of these coordinates to the pixels contained in each edge 402-408. The gesture handler 128 can often determine the starting point, and whether it is on an edge, faster than the sample rate, thereby causing little or no performance degradation compared to techniques that simply pass gestures directly to the exposed interface over which a gesture is made.
Returning to method 200, generally, if block 204 determines that the starting point of the gesture is not at an edge, method 200 proceeds along a "No" path to block 206. Block 206 passes the gesture to an exposed user interface, such as the underlying interface on which the gesture was received. Altering the ongoing example, assume gesture 308 was determined not to have a starting point within an edge. In this case, the gesture handler 128 passes the temporarily stored data for gesture 308 to the immersive user interface 304. After passing the gesture, method 200 ends.
If block 204 determines that the starting point of the gesture is at an edge, method 200 proceeds along a "Yes" path to block 208. Optionally, block 204 can determine a length of a portion of the gesture before the method proceeds to block 208. In some cases, determining the length of the gesture portion allows the determination of the starting point to be made before completion of the gesture.
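The edge test performed by block 204 can be pictured with a short sketch. The Python fragment below is an illustrative reading of the description only, not the patented implementation; the twenty-pixel edge width, the edge names, and the helper functions are assumptions drawn from the example above.

```python
# Minimal sketch of the block-204 edge test described above.
# The 20-pixel edge width and all names are illustrative assumptions.

EDGE_WIDTH = 20  # pixels, matching the exemplary edges 402-408

def edge_at_point(x, y, screen_width, screen_height):
    """Return the edge containing the point (x, y), or None."""
    if x < EDGE_WIDTH:
        return "left"
    if x >= screen_width - EDGE_WIDTH:
        return "right"
    if y < EDGE_WIDTH:
        return "top"
    if y >= screen_height - EDGE_WIDTH:
        return "bottom"
    return None

def starts_at_edge(gesture_points, screen_width, screen_height):
    """Block 204: only the first received point of the gesture is tested."""
    x, y = gesture_points[0]
    return edge_at_point(x, y, screen_width, screen_height)
```

Because only the first coordinate pair is inspected, the test can complete long before the gesture does, which is one way to read the statement that the determination is faster than the sample rate and adds little or no latency.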
Block 208 responds to the positive determination of block 204 by determining whether a line from the starting point to a posterior point of the gesture is approximately perpendicular to the edge.
Block 208, in some embodiments, determines the posterior point used.
The gesture handler 128, for example, can determine the posterior point of the gesture based on the posterior point being received a preset distance from the edge or from the starting point, such as past edge limit 410 for edge 402, or twenty pixels from starting point 418, all of Fig. 4. In some other embodiments, the gesture handler 128 determines the posterior point based on it being received a preset time after receipt of the starting point, this amount of time being slightly longer than that generally used by computing device 102 to determine that a gesture is a touch-and-hold or stationary gesture.
For the ongoing embodiment, the gesture handler 128 uses a point of gesture 308 received outside of edge 402, so long as that point is received within a preset time.
If no point is received outside the edge within this preset time, the gesture handler 128 proceeds to block 206 and passes gesture 308 to the immersive interface 304. Using the starting point, block 208 determines whether a line from the starting point to the posterior point of the gesture is approximately perpendicular to the edge.
Various angles of variance can be used in this determination by block 208, such as five, ten, twenty, or thirty degrees.
As an example, consider a variance angle of thirty degrees from perpendicular.
Fig. 5 illustrates this exemplary variance, showing the immersive interface 304, the gesture 308, the left edge 402, the left edge limit 410, and the starting point 418 of figures 3 and 4, together with thirty-degree variance lines 502 from perpendicular line 504. Thus, the gesture handler 128 determines that line 506 from starting point 418 to posterior point 508 (which is at about twenty degrees from perpendicular) is approximately perpendicular based on the exemplary thirty-degree variance lines 502.
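The block-208 test reduces to choosing a posterior point and comparing an angle. The sketch below, again a non-authoritative illustration, uses the exemplary values from the text (a thirty-degree variance, and a posterior point taken once the gesture has traveled a preset distance from the starting point); the function names and the distance threshold are assumptions.

```python
import math

VARIANCE_DEG = 30.0   # exemplary angle of variance, as in Fig. 5
MIN_TRAVEL_PX = 20.0  # assumed distance used to choose the posterior point

def pick_posterior_point(points):
    """Return the first point at least MIN_TRAVEL_PX from the starting point."""
    sx, sy = points[0]
    for x, y in points[1:]:
        if math.hypot(x - sx, y - sy) >= MIN_TRAVEL_PX:
            return (x, y)
    return None  # gesture has not yet traveled far enough

def approximately_perpendicular(start, posterior, edge):
    """Block 208: is the start-to-posterior line within VARIANCE_DEG of
    perpendicular to the named edge? Screen y grows downward."""
    dx = posterior[0] - start[0]
    dy = posterior[1] - start[1]
    if edge == "left":      # perpendicular direction is +x
        deviation = math.degrees(math.atan2(dy, dx))
    elif edge == "right":   # perpendicular direction is -x
        deviation = math.degrees(math.atan2(dy, -dx))
    elif edge == "top":     # perpendicular direction is +y
        deviation = math.degrees(math.atan2(dx, dy))
    else:                   # bottom edge: perpendicular direction is -y
        deviation = math.degrees(math.atan2(dx, -dy))
    return abs(deviation) <= VARIANCE_DEG
```

Line 506 of Fig. 5, at about twenty degrees from perpendicular, would pass this test; a gesture hugging the edge, at close to ninety degrees, would not.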
Generally, if block 208 determines that the line is not approximately perpendicular to the edge, method 200 proceeds along a "No" path to block 206 (for example, a one-finger path is curved). As noted in part above, block 208 can also determine that a later point or other aspect of a gesture disqualifies the gesture.
Examples include when a posterior point is inside the edge, such as due to a stationary, tap, press-and-hold, or up-and-down gesture (for example, to scroll content in the user interface), when the gesture is defined to be a single-input gesture and a second input is received (for example, a first finger starts at an edge, but a second finger then lands anywhere), or if a touch event occurs during or before the gesture (for example, a finger is already making contact elsewhere or contact is received anywhere during the gesture).
If block 208 determines that the line is approximately perpendicular based on a posterior point outside the edge, method 200 proceeds along a "Yes" path to block 210. Block 210 responds to the positive determination of block 208 by passing the gesture to an entity other than the exposed user interface.
This entity is not a user interface through which the gesture was received, assuming the gesture was received through a user interface at all. Block 210 can also determine which entity to pass the gesture to, such as based on the edge, or a region of the edge, at which the starting point of the gesture is received.
Consider Fig. 6, for example, which illustrates the immersive interface 304 and the edges 402, 404, 406, and 408 of Fig. 4, but adds a top region 602 and a bottom region 604 to the right edge 406. A starting point in the top region 602 may result in a different entity (or even the same entity but a different user interface provided in response) than a starting point received in the bottom region 604. Similarly, a starting point at the top edge 404 may result in a different entity or interface than the left edge 402 or bottom edge 408. In some cases, this entity is an application associated with the user interface.
In this case, passing the gesture on to the entity can be effective in getting the application to present a second user interface enabling interaction with the application.
In the movie example above, the entity can be the media player playing the movie, but not the immersive interface displaying the movie.
The media player can then present a second user interface that allows the selection of subtitles or director's commentary, instead of the selections made possible by the interface that displays the movie, such as "pause", "play", and "stop". This capability is permitted in Fig. 1, where one of the applications 130 can include or be capable of presenting more than one application user interface 132. Thus, block 210 can pass the gesture to the system interface module 126, to the one of the applications 130 currently presenting the user interface, or to another of the applications 130, to name just three possibilities.
Concluding the ongoing embodiment, at block 210 the gesture handler 128 passes gesture 308 to the system interface module 126. The system interface module 126 receives the temporarily stored portion of gesture 308 and continues to receive the rest of gesture 308 as it is made by the user.
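Block 210's routing decision can be expressed as a small lookup keyed by the edge, or by a region of the edge, at which the gesture started. The table below is purely illustrative: the particular edge-to-entity assignments and the region split on the right edge are assumptions modeled on Figs. 4 and 6, and the entity labels simply echo the reference numerals used above rather than any real API.

```python
# Illustrative routing for block 210; the assignments are assumptions only.
ROUTES = {
    "left": "system interface module 126",        # e.g., application selection
    "top": "application 130 presenting the UI",   # e.g., the media player's menu
    "bottom": "application 130 presenting the UI",
    ("right", "top region 602"): "system interface module 126",
    ("right", "bottom region 604"): "another application 130",
}

def route_edge_gesture(edge, start_y, screen_height):
    """Pick the entity that receives the gesture, per block 210."""
    if edge == "right":  # the right edge is split into two regions, as in Fig. 6
        region = "top region 602" if start_y < screen_height / 2 else "bottom region 604"
        return ROUTES[(edge, region)]
    return ROUTES[edge]
```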
Fig. 7 illustrates a possible response upon receiving gesture 308, showing an application selection interface 702 presented by the system interface module 126 over the immersive interface 304 and web page 306 of Fig. 3. The application selection interface 702 makes it possible to select several other applications and their respective interfaces through selectable application tiles 704, 706, 708, and 710. The exemplary application selection interface 702 is an immersive user interface presented using the immersive mode module 124, although this is not required.
Presented interfaces, or a list of them, can instead be window-based and presented using the window-based mode module 122. Both modules are illustrated in Fig. 1. Block 210 can also or instead determine to pass the gesture to different entities and/or interfaces based on other factors about the received gesture.
Exemplary factors are described in more detail in method 800 below.
Note that method 200 and the other methods described below can be performed in real time, such as while a gesture is being made and received.
This allows, among other things, a user interface presented in response to a gesture to be presented before the gesture ends.
In addition, the user interface can be presented progressively as the gesture is received.
This allows a user to experience dragging the user interface out from the edge as the gesture is performed, with the user interface appearing to "stick" to the gesture (for example, to the mouse pointer or the finger of the person making the gesture).
Fig. 8 shows a method 800 for enabling edge gestures, including determining an interface to present based on some factor of the gesture.
In the portions of the following discussion reference may be made to system 100 of Fig. 1, reference to which is made as an example only.
Method 800 can act totally or partially separated from, or in conjunction with other methods described in this document.
Block 802 determines that a gesture made through a user interface has a starting point at an edge of the user interface and a posterior point not within the edge.
Block 802 can operate similarly to, or use aspects of, method 200, such as determining a posterior point on which to base the block 802 determination. Block 802 can also act differently.
In one case, for example, block 802 determines that a gesture is a single-finger swipe gesture starting from an edge of an exposed immersive user interface and having a posterior point not at the edge, but without basing this on an angle of the gesture.
Based on this determination, block 802 proceeds to block 804 instead of passing the gesture to the exposed immersive user interface.
Block 804 determines which interface to present based on one or more factors of the gesture. Block 804 can do so based on a final or intermediate length of the gesture, whether the gesture is single-point or multi-point (for example, a single finger or multiple fingers), or a speed of the gesture.
In some cases, two or more factors of a gesture determine which interface to present, such as a drag-and-hold gesture having a drag length and a hold time, or a drag-and-drop gesture having a drag length and a drop position.
Thus, block 804 can determine to present a start menu in response to a multi-finger gesture, an application selection interface in response to a relatively short single-finger gesture, or a system control interface that allows selection to shut down computing device 102 in response to a relatively long single-finger gesture, for example.
To do this, the gesture handler 128 can determine the length of the gesture, its speed, or a number of inputs (for example, fingers). In response, block 806 presents the determined user interface.
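As one concrete reading of block 804, the sketch below maps a few gesture factors to an interface to present, mirroring the start menu / application selection / system control example just given. The thresholds and interface labels are invented for illustration and are not values taken from the patent.

```python
def interface_for_gesture(length_px, num_inputs):
    """Block 804 sketch: choose an interface from gesture factors.
    The 150-pixel threshold and the labels are illustrative assumptions."""
    if num_inputs > 1:
        return "start menu"                        # multi-finger gesture
    if length_px < 150:
        return "application selection interface"   # relatively short, one finger
    return "system control interface"              # relatively long, one finger

# Block 806 would then present whatever interface was determined, for example:
# present(interface_for_gesture(length_px=80, num_inputs=1))
```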
The determined user interface can be any of those mentioned in this document, as well as a completely new view, such as a new page of an e-book, an additional view (for example, a toolbar or navigation bar), or a modified view of a current user interface (showing the text of the current user interface in a different font, color, or highlighting). In some cases, visual or non-visual effects may be presented, such as actions related to a video game or sound effects associated with the current or presented user interface.
Suppose, as an example, that the gesture handler 128 determines, based on a factor of the gesture, to present a user interface that allows interaction with the operating system 120. In response, the system interface module 126 presents this user interface.
The presentation of the user interface can be similar to the manners described in the other methods, such as with a progressive display of the application selection user interface 702 of Fig. 7.
Following method 200 or method 800, in whole or in part, the techniques can proceed to perform method 900 of Fig. 9. Method 900 makes it possible to expand a user interface, present another interface, or cease presenting the user interface presented in response to an edge gesture.
Block 902 receives a successive point of the gesture, after presentation of at least a portion of a second user interface.
As noted in part above, methods 200 and/or 800 are capable of presenting or causing to be presented a second user interface, such as a second user interface for the same application associated with a current user interface, for a different application, or a system user interface.
As an example, consider Fig. 10, which illustrates a laptop computer 104 having a touch screen 1002 displaying a window-based email interface 1004 and two immersive interfaces 1006 and 1008. The window-based email interface 1004 is associated with an application that manages email, which can be remote or local to the laptop computer 104. Fig. 10 also illustrates two gestures, 1010 and 1012. Gesture 1010 proceeds in a straight line while gesture 1012 reverses back (shown with two arrows to indicate two directions).
Fig. 11 illustrates gesture 1010 having a starting point 1102, a posterior point 1104, and a successive point 1106, and gesture 1012 having the same starting point 1102, a posterior point 1108, a first successive point 1110, and a second successive point 1112. Fig. 11 also shows a bottom edge 1114, a posterior point area 1116, and an interface addition area 1118.
Block 904 determines, based on the successive point, whether the gesture includes a reversal, an extension, or neither.
Block 904 can determine a reversal in the direction of the gesture by determining that a successive point is at the edge or is closer to the edge than an earlier point of the gesture. Block 904 can determine that the gesture extends based on the successive point being a preset distance from the edge or from the posterior point. If neither of these is determined to be true, method 900 can repeat blocks 902 and 904 to receive and analyze additional successive points until the gesture ends. If block 904 determines that there is a reversal, method 900 proceeds along the "Reversal" path to block 906. If block 904 determines that the gesture is extended, method 900 proceeds along an "Extension" path to block 908.
In the context of the present example, assume that the gesture handler 128 receives the first successive point 1110 of gesture 1012. The gesture handler 128 then determines that the first successive point is not at edge 1114, is not closer than a previous point of the gesture to edge 1114 (for example, it is no closer than posterior point 1108), and is not a preset distance from the edge or the posterior point, by not being within the interface addition region 1118. In such a case, method 900 returns to block 902.
In a second iteration of block 902, assume that the gesture handler 128 receives the second successive point 1112. In this case, the gesture handler 128 determines that the second successive point 1112 is closer to edge 1114 than successive point 1110, and thus gesture 1012 includes a reversal. The gesture handler 128 then proceeds to block 906 to cease presenting the second user interface previously presented in response to the gesture. As an example, consider Fig. 12, which illustrates an email manipulation interface 1202. In this exemplary case of block 906, the gesture handler 128 causes the email application to cease presenting interface 1202 in response to the reversal of gesture 1012 (removal not shown).
Block 908, however, presents or causes the presentation of a third user interface, or the expansion of the second user interface. In some cases, the presentation of the third user interface causes the second user interface to cease being presented, either by ending its presentation or by hiding the second user interface (for example, by presenting the third user interface over the second user interface). Continuing the ongoing example, see Fig. 13, which illustrates an additional email options interface 1302 presented in response to gesture 1010, determined to have a successive point 1106 a preset distance from the edge, in this case being within the interface addition region 1118 of Fig. 11. This region and the preset distance can be set based on the size of the user interface previously presented in response to the gesture.
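The reversal and extension tests of block 904 amount to two comparisons per successive point. The sketch below assumes the bottom edge of figures 10-13, so distance from the edge is measured along y (which grows downward); the threshold argument and the return labels are assumptions.

```python
def classify_successive_point(point, previous_point, edge_y, extension_px):
    """Block 904 sketch for a bottom edge at y == edge_y.
    Returns 'reversal', 'extension', or None; labels are illustrative."""
    dist = edge_y - point[1]              # current distance from the edge
    prev_dist = edge_y - previous_point[1]
    if dist <= 0 or dist < prev_dist:
        return "reversal"                 # back at the edge, or closer to it
    if dist >= extension_px:
        return "extension"                # reached the interface addition region
    return None                           # keep receiving points (block 902)
```

A "reversal" result drives block 906, which ceases presenting the interface dragged out so far, while "extension" drives block 908, which presents a further interface or expands the current one.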
A user who wants to add additional controls can thus simply extend the gesture beyond the user interface presented in response to an earlier portion of the gesture. Method 900 can be repeated to add additional user interfaces or to expand a presented user interface.
Returning to the exemplary interface 702 of Fig. 7, for example, the gesture handler 128 can continue to add interfaces or controls to interface 702 as gesture 308 extends beyond interface 702, such as by presenting an additional set of selectable application tiles.
If gesture 308 extends beyond the additional tiles, the gesture handler 128 can cause the system interface module 126 to present another interface adjacent to the tiles to enable the user to select controls, such as to suspend, hibernate, switch modes (immersive to window-based and the reverse), or shut down computing device 102.
Although the exemplary user interfaces presented above in response to an edge gesture are opaque, they can also be partially transparent.
This can be useful as it does not obscure the content.
In the movie example described above, a presented user interface can be partially transparent, thereby allowing the movie to be only partially obscured while the user interface is used.
Similarly, in the example of figures 12 and 13, interfaces 1202 and 1302 can be partially transparent, allowing the user to see the text of the email while also selecting a control on one of the interfaces.
The preceding discussion describes methods in which the techniques can enable and use edge gestures.
These methods are shown as sets of blocks that specify the operations performed, but are not necessarily limited to the order shown for performing the operations of the respective blocks.
Aspects of these methods can be implemented in hardware (for example, fixed logic circuitry), firmware, a System-on-Chip (SoC), software, manual processing, or any combination thereof.
A software implementation represents program code that performs specified tasks when executed by a computer processor, such as software, applications, routines, programs, objects, components, data structures, procedures, modules, functions, and the like.
The program code can be stored on one or more computer-readable memory devices, whether local and/or remote to a computer processor.
The methods can also be practiced in a distributed computing environment across multiple computing devices.
Exemplary Device
Fig. 14 illustrates various components of an exemplary device 1400, which can be implemented as any type of client, server, and/or computing device as described with reference to figures 1-13 above to implement the techniques that enable edge gestures.
In embodiments, the device 1400 can be implemented as one or a combination of a wired and/or wireless device, as a form of television client device (for example, a television set-top box, digital video recorder (DVR), etc.), consumer device, computer device, server device, portable computer device, user device, communication device, video processing and/or rendering device, appliance device, gaming device, electronic device, and/or as another type of device.
The device 1400 can also be associated with a user (for example, a person) and/or an entity that operates the device, such that a device describes logical devices that include users, software, firmware, and/or a combination of devices.
The device 1400 includes communication devices 1402 that enable wired and/or wireless communication of device data 1404 (for example, received data, data being received, data scheduled for broadcast, data packets of the data, etc.). The device data 1404 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device.
Media content stored on the device 1400 can include any type of audio, video, and image data.
The device 1400 includes one or more data inputs 1406 through which any type of data, media content, and inputs can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
The device 1400 also includes communication interfaces 1408, which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 1408 provide a connection and/or communication links between the device 1400 and a communication network through which other electronic, computing, and communication devices communicate data with the device 1400.
The device 1400 includes one or more processors 1410 (for example, any of microprocessors, controllers, and the like), which process various computer-executable instructions to control the operation of the device 1400 and to enable the techniques that enable and/or use edge gestures.
Alternatively, or in addition, the device 1400 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits that are generally identified at 1412. Although not shown, the device 1400 can include a system bus or data transfer system that couples the various components within the device.
A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus using any of a variety of bus architectures.
The device 1400 also includes computer-readable storage media 1414, such as one or more memory devices that enable persistent and/or non-transitory data storage (i.e., in contrast to mere signal transmission), examples of which include random access memory (RAM), non-volatile memory (for example, any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device.
A disk storage device can be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewritable compact disc (CD), any type of digital versatile disc (DVD), and so on.
The device 1400 can also include a mass storage media device 1416. Computer-readable storage media 1414 provide data storage mechanisms for storing the device data 1404, as well as various device applications 1418 and any other types of information and/or data related to operational aspects of the device 1400. For example, an operating system 1420 can be maintained as a computer application with the computer-readable storage media 1414 and executed on processors 1410.
The device applications 1418 can include a device manager, such as any form of control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on. The device applications 1418 also include any system components or modules to implement the techniques that use or enable edge gestures. In this example, the device applications 1418 can include the system interface module 126, the gesture handler 128, and the application(s) 130.
CONCLUSION
Although embodiments of techniques and devices enabling an edge gesture have been described in language specific to features and methods, it is to be understood that the subject matter of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are described as exemplary implementations that make it possible to enable and use an edge gesture.
Claims:
Claims (20)
[1]
1. Method implemented by computer characterized by the fact that it comprises the steps of: receiving (202) a gesture (308) made through a gesture-sensitive screen (302, 1002) presenting at least one user interface (132); determining (204) that the gesture (308) has a starting point on an edge (402, 404, 406, 408) of the gesture-sensitive screen (302, 1002) and a posterior point not on the edge (402, 404, 406, 408); and, in response to the determination, passing (210, 206) the gesture (308) to an operating system associated with the gesture-sensitive screen (302, 1002); and presenting an application selection interface (702) containing selectable application blocks associated with a plurality of applications, the application selection interface (702) replacing at least a portion of the at least one application user interface displayed on the gesture-sensitive screen (302, 1002).
[2]
2. Method implemented by computer, according to claim 1, characterized by the fact that the application selection interface (702) is immersive.
[3]
3. Method implemented by computer, according to claim 1, characterized by the fact that it further comprises determining a region from among multiple edge regions (602, 604), and wherein passing (210) the gesture (308) to the operating system is based on the region.
[4]
4. Method implemented by computer, according to claim 1, characterized by the fact that it further comprises determining that the gesture (308) is one of a first length and a second length, and wherein passing the gesture (308) passes the gesture (308) to the operating system in response to determining that the gesture (308) is of the first length.
[5]
5. Method implemented by computer, according to claim 1, characterized by the fact that it further comprises determining whether the gesture (308) is one of a single-finger gesture and a multiple-finger gesture, and wherein passing the gesture (308) to the operating system is responsive to determining whether the gesture (308) is one of the single-finger gesture and the multiple-finger gesture.
[6]
6. Method implemented by computer, according to claim 1, characterized by the fact that the application selection interface (702) is at least partially transparent.
[7]
7. Method implemented by computer, according to claim 1, characterized by the fact that passing the gesture (308) to the operating system is responsive to determining that the edge (402, 404, 406, 408) is a lateral edge of the gesture-sensitive screen (302, 1002).
[8]
8. Method implemented by computer, according to claim 1, characterized by the fact that the step of determining (204) includes determining that a line from the starting point to the posterior point of the gesture (308) is within twenty degrees of a predetermined angle from the edge (402, 404, 406, 408).
[9]
9. Method implemented by computer, according to claim 8, characterized by the fact that it further comprises, before determining that a line from the starting point to the posterior point of the gesture (308) is within twenty degrees of a predetermined angle from the edge (402, 404, 406, 408), determining the posterior point of the gesture (308) based on the posterior point being received at a preset distance from one of the edge (402, 404, 406, 408) and the starting point.
[10]
10. Method implemented by computer, according to claim 8, characterized by the fact that it further comprises, before determining that a line from the starting point to the posterior point of the gesture (308) is within twenty degrees of a predetermined angle from the edge (402, 404, 406, 408), determining the posterior point of the gesture (308) based on the posterior point being received at a preset time after receiving the starting point.
[11]
11. Method implemented by computer, according to claim 1, characterized by the fact that the step of determining (204) includes determining that the gesture (308) is a swipe gesture oriented perpendicular to the edge (402, 404, 406, 408).
[12]
12. Method implemented by computer, according to claim 1, characterized by the fact that passing the gesture (308) causes the application selection interface (702) to be displayed, and further comprising: receiving (902) a second gesture (308) made through the gesture-sensitive screen (302, 1002); determining (904) that the second gesture (308) has a starting point at the application selection interface (702) and a posterior point at the edge (402, 404, 406, 408); and, in response to the determination that the second gesture (308) has a starting point at the application selection interface (702) and a posterior point at the edge (402, 404, 406, 408), causing (906) the application selection interface (702) to cease being displayed.
[13]
13. Method implemented by computer, according to claim 1, characterized by the fact that passing the gesture (308) to the operating system is based on at least one factor of the gesture (308).
[14]
14. Method implemented by computer, according to claim 1, characterized by the fact that it further comprises determining, based on a successive point of the gesture (308), that the gesture (308) includes a reversal, and wherein passing the gesture (308) to the operating system is based on the determination that the gesture (308) includes the reversal.
[15]
15. Method implemented by computer, according to claim 1, characterized by the fact that passing the gesture (308) to the operating system causes the application selection interface (702) to be displayed, and further comprising: determining, based on a successive point of the gesture (308), that the gesture (308) includes a reversal; and, in response to the determination that the gesture (308) includes the reversal, causing the application selection interface (702) to cease being displayed.
[16]
16. Method implemented by computer, according to claim 1, characterized by the fact that it further comprises determining, based on a successive point of the gesture (308), that the gesture (308) extends a preset distance from one of the edge (402, 404, 406, 408) and the posterior point, and wherein passing the gesture (308) to the operating system is based on the determination that the gesture (308) extends the preset distance from one of the edge (402, 404, 406, 408) and the posterior point.
[17]
17. Computer-implemented method characterized by the fact that it comprises the steps of: receiving (202) a gesture (308) made through a gesture-sensitive screen (302, 1002) presenting at least one immersive user interface (304, 1006, 1008); determining (204) whether a starting point of the gesture (308) is received at an edge (402, 404, 406, 408) of the gesture-sensitive screen (302, 1002); in response to determining that the starting point is not at the edge (402, 404, 406, 408) of the gesture-sensitive screen (302, 1002), passing (206) the gesture (308) to one of the at least one immersive user interface (304, 1006, 1008); in response to determining that the starting point is at the edge (402, 404, 406, 408) of the gesture-sensitive screen (302, 1002), determining (208) whether a line from the starting point to a posterior point of the gesture (308) is within a predefined angle of variance from the perpendicular (504) to the edge (402, 404, 406, 408); in response to determining that the line is not within the predefined angle of variance from the perpendicular (504) to the edge (402, 404, 406, 408), passing (206) the gesture (308) to one of the at least one immersive user interface (304, 1006, 1008); and, in response to determining that the line is within the predefined angle of variance, passing the gesture (308) to a system interface module effective to display an application selection interface (702) displayed simultaneously with the at least one immersive user interface (304, 1006, 1008), the application selection interface (702) containing selectable application blocks associated with a plurality of applications, each of the selectable application blocks containing different content in relation to each of the other selectable application blocks, and enabling selection of the selectable application blocks while continuing to display content in the at least one immersive user interface (304, 1006, 1008).
[18]
18. Method implemented by computer, according to claim 17, characterized by the fact that the application selection interface (702) is partially transparent and is displayed over at least one immersive user interface (304, 1006, 1008 ).
[19]
19. Method implemented by computer, according to claim 1, characterized by the fact that the step of determining (204) further comprises determining that the gesture (308) has at least two different factors of the gesture (308), the at least two different factors of the gesture (308) not including a starting point on an edge (402, 404, 406, 408) of the gesture-sensitive screen (302, 1002) and a posterior point not on the edge (402, 404, 406, 408).
[20]
20. Method implemented by computer, according to claim 19, characterized by the fact that the at least two different factors of the gesture (308) are at least two of: a final length of the gesture (308), an intermediate length of the gesture (308), a speed of the gesture (308), a number of entry points of the gesture (308), an extension of the gesture (308), and a reversal in the direction of the gesture (308).
Similar technologies:
Publication number | Publication date | Patent title
BR112013030414A2|2020-08-11|computer implemented method.
RU2604993C2|2016-12-20|Edge gesture
KR102027612B1|2019-10-01|Thumbnail-image selection of applications
US9329774B2|2016-05-03|Switching back to a previously-interacted-with application
AU2017200737B2|2019-01-17|Multi-application environment
US20120304131A1|2012-11-29|Edge gesture
US20170068428A1|2017-03-09|Managing An Immersive Interface in a Multi-Application Immersive Environment
US9104440B2|2015-08-11|Multi-application environment
US20120299968A1|2012-11-29|Managing an immersive interface in a multi-application immersive environment
Patent family:
Publication number | Publication date
KR20140025493A|2014-03-04|
RU2013152636A|2015-06-10|
ZA201307920B|2015-01-28|
CN103562838B|2015-10-07|
EP2715491A1|2014-04-09|
US20120304107A1|2012-11-29|
SG195135A1|2013-12-30|
SG10201604341YA|2016-07-28|
MX342078B|2016-09-13|
NZ618254A|2015-01-30|
CA2836263A1|2012-12-06|
RU2591671C2|2016-07-20|
EP2715491B1|2019-06-12|
JP2014515519A|2014-06-30|
EP2715491A4|2015-02-18|
JP6038898B2|2016-12-07|
HK1193660A1|2014-09-26|
CN103562838A|2014-02-05|
MX2013013921A|2013-12-16|
IL229622D0|2014-01-30|
AU2011369359A1|2013-12-19|
CO6920259A2|2014-04-10|
WO2012166176A1|2012-12-06|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

US5821930A|1992-08-23|1998-10-13|U S West, Inc.|Method and system for generating a working window in a computer system|
US5568604A|1992-12-31|1996-10-22|U S West Technologies, Inc.|Method and system for generating a working window in a computer system|
US6219032B1|1995-12-01|2001-04-17|Immersion Corporation|Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface|
CN1059303C|1994-07-25|2000-12-06|国际商业机器公司|Apparatus and method for marking text on a display screen in a personal communications device|
KR100327209B1|1998-05-12|2002-04-17|윤종용|Software keyboard system using the drawing of stylus and method for recognizing keycode therefor|
JP2000347645A|1999-03-31|2000-12-15|Sony Corp|Image display processing method, image display processing program storage medium, and image display processing device|
JP2002055753A|2000-08-10|2002-02-20|Canon Inc|Information processor, function list display method and storage medium|
US7728821B2|2004-08-06|2010-06-01|Touchtable, Inc.|Touch detecting interactive display|
US7925996B2|2004-11-18|2011-04-12|Microsoft Corporation|Method and system for providing multiple input connecting user interface|
US7561145B2|2005-03-18|2009-07-14|Microsoft Corporation|Systems, methods, and computer-readable media for invoking an electronic ink or handwriting interface|
US7676767B2|2005-06-15|2010-03-09|Microsoft Corporation|Peel back user interface to show hidden functions|
JP2007148783A|2005-11-28|2007-06-14|Buffalo Inc|Device and method for displaying image for computer and medium with image display program recorded thereon|
KR20080041809A|2006-11-08|2008-05-14|삼성전자주식회사|Apparatus and method for controlling display in potable terminal|
US8351989B2|2007-02-23|2013-01-08|Lg Electronics Inc.|Method of displaying menu in a mobile communication terminal|
US11126321B2|2007-09-04|2021-09-21|Apple Inc.|Application menu user interface|
KR101224588B1|2008-01-24|2013-01-22|삼성전자주식회사|Method for providing UI to detect a multi-point stroke and multimedia apparatus thereof|
US8159469B2|2008-05-06|2012-04-17|Hewlett-Packard Development Company, L.P.|User interface for initiating activities in an electronic device|
KR101461954B1|2008-05-08|2014-11-14|엘지전자 주식회사|Terminal and method for controlling the same|
US8296684B2|2008-05-23|2012-10-23|Hewlett-Packard Development Company, L.P.|Navigating among activities in a computing device|
US20090289902A1|2008-05-23|2009-11-26|Synaptics Incorporated|Proximity sensor device and method with subregion based swipethrough data entry|
TW201005599A|2008-07-18|2010-02-01|Asustek Comp Inc|Touch-type mobile computing device and control method of the same|
US8924892B2|2008-08-22|2014-12-30|Fuji Xerox Co., Ltd.|Multiple selection on devices with many gestures|
US20100107067A1|2008-10-27|2010-04-29|Nokia Corporation|Input on touch based user interfaces|
JP4853507B2|2008-10-30|2012-01-11|ソニー株式会社|Information processing apparatus, information processing method, and program|
US8499251B2|2009-01-07|2013-07-30|Microsoft Corporation|Virtual page turn|
CN102150114B|2009-02-06|2014-01-22|松下电器产业株式会社|Image display device|
JP5370374B2|2009-02-13|2013-12-18|富士通モバイルコミュニケーションズ株式会社|Information processing device|
US8836648B2|2009-05-27|2014-09-16|Microsoft Corporation|Touch pull-in gesture|
US8681106B2|2009-06-07|2014-03-25|Apple Inc.|Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface|
US8832585B2|2009-09-25|2014-09-09|Apple Inc.|Device, method, and graphical user interface for manipulating workspace views|
US8957918B2|2009-11-03|2015-02-17|Qualcomm Incorporated|Methods for implementing multi-touch gestures on a single-touch touch surface|
US8438504B2|2010-01-06|2013-05-07|Apple Inc.|Device, method, and graphical user interface for navigating through multiple viewing areas|
US8239785B2|2010-01-27|2012-08-07|Microsoft Corporation|Edge gestures|
US20110191675A1|2010-02-01|2011-08-04|Nokia Corporation|Sliding input user interface|
US20110209098A1|2010-02-19|2011-08-25|Hinckley Kenneth P|On and Off-Screen Gesture Combinations|
US8799827B2|2010-02-19|2014-08-05|Microsoft Corporation|Page manipulations using on and off-screen gestures|
US20110210850A1|2010-02-26|2011-09-01|Phuong K Tran|Touch-screen keyboard with combination keys and directional swipes|
TW201133298A|2010-03-25|2011-10-01|Novatek Microelectronics Corp|Touch sensing method and system using the same|
US9052925B2|2010-04-07|2015-06-09|Apple Inc.|Device, method, and graphical user interface for managing concurrently open software applications|
JP5529616B2|2010-04-09|2014-06-25|株式会社ソニー・コンピュータエンタテインメント|Information processing system, operation input device, information processing device, information processing method, program, and information storage medium|
US20110273379A1|2010-05-05|2011-11-10|Google Inc.|Directional pad on touchscreen|
US20120180001A1|2011-01-06|2012-07-12|Research In Motion Limited|Electronic device and method of controlling same|
US8836648B2|2009-05-27|2014-09-16|Microsoft Corporation|Touch pull-in gesture|
US20120036471A1|2010-08-04|2012-02-09|Misys|Tool bars along lateral edges of a mobile computing device display|
US20120159383A1|2010-12-20|2012-06-21|Microsoft Corporation|Customization of an immersive environment|
US20120159395A1|2010-12-20|2012-06-21|Microsoft Corporation|Application-launching interface for multiple modes|
US8612874B2|2010-12-23|2013-12-17|Microsoft Corporation|Presenting an application change through a tile|
US8689123B2|2010-12-23|2014-04-01|Microsoft Corporation|Application reporting in an application-selectable user interface|
US9015641B2|2011-01-06|2015-04-21|Blackberry Limited|Electronic device and method of providing visual notification of a received communication|
US9471145B2|2011-01-06|2016-10-18|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9423878B2|2011-01-06|2016-08-23|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9465440B2|2011-01-06|2016-10-11|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9477311B2|2011-01-06|2016-10-25|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9766718B2|2011-02-28|2017-09-19|Blackberry Limited|Electronic device and method of displaying information in response to input|
US8893033B2|2011-05-27|2014-11-18|Microsoft Corporation|Application notifications|
US9104307B2|2011-05-27|2015-08-11|Microsoft Technology Licensing, Llc|Multi-application environment|
US9104440B2|2011-05-27|2015-08-11|Microsoft Technology Licensing, Llc|Multi-application environment|
US9658766B2|2011-05-27|2017-05-23|Microsoft Technology Licensing, Llc|Edge gesture|
US9158445B2|2011-05-27|2015-10-13|Microsoft Technology Licensing, Llc|Managing an immersive interface in a multi-application immersive environment|
US20120304132A1|2011-05-27|2012-11-29|Chaitanya Dev Sareen|Switching back to a previously-interacted-with application|
US20130057587A1|2011-09-01|2013-03-07|Microsoft Corporation|Arranging tiles|
US9146670B2|2011-09-10|2015-09-29|Microsoft Technology Licensing, Llc|Progressively indicating new content in an application-selectable user interface|
US8933952B2|2011-09-10|2015-01-13|Microsoft Corporation|Pre-rendering new content for an application-selectable user interface|
US9360940B2|2011-11-14|2016-06-07|Google Inc.|Multi-pane interface|
KR20130054579A|2011-11-17|2013-05-27|삼성전자주식회사|Display apparatus and control method thereof|
JP5850229B2|2011-11-29|2016-02-03|日本精機株式会社|Vehicle control device|
US9223472B2|2011-12-22|2015-12-29|Microsoft Technology Licensing, Llc|Closing applications|
US9128605B2|2012-02-16|2015-09-08|Microsoft Technology Licensing, Llc|Thumbnail-image selection of applications|
US9785291B2|2012-10-11|2017-10-10|Google Inc.|Bezel sensitive touch screen system|
GB201300031D0|2013-01-02|2013-02-13|Canonical Ltd|Ubuntu UX innovations|
KR102091028B1|2013-03-14|2020-04-14|삼성전자 주식회사|Method for providing user's interaction using multi hovering gesture|
US9207849B2|2013-03-29|2015-12-08|Microsoft Technology Licensing, Llc|Start and application navigation|
US9450952B2|2013-05-29|2016-09-20|Microsoft Technology Licensing, Llc|Live tiles without application-code execution|
US9645651B2|2013-09-24|2017-05-09|Microsoft Technology Licensing, Llc|Presentation of a control interface on a touch-enabled device based on a motion or absence thereof|
US9851896B2|2013-12-17|2017-12-26|Google Inc.|Edge swiping gesture for home navigation|
US10747416B2|2014-02-13|2020-08-18|Samsung Electronics Co., Ltd.|User terminal device and method for displaying thereof|
US10866714B2|2014-02-13|2020-12-15|Samsung Electronics Co., Ltd.|User terminal device and method for displaying thereof|
US10712918B2|2014-02-13|2020-07-14|Samsung Electronics Co., Ltd.|User terminal device and displaying method thereof|
CN104951213B|2014-03-27|2018-06-22|原相科技股份有限公司|The method for preventing false triggering boundary slip gesture|
KR20150114160A|2014-04-01|2015-10-12|삼성전자주식회사|Device for Displaying User Interface and Method Thereof|
WO2015149347A1|2014-04-04|2015-10-08|Microsoft Technology Licensing, Llc|Expandable application representation|
KR102107275B1|2014-04-10|2020-05-06|마이크로소프트 테크놀로지 라이센싱, 엘엘씨|Collapsible shell cover for computing device|
EP3129847A4|2014-04-10|2017-04-19|Microsoft Technology Licensing, LLC|Slider cover for computing device|
US10678412B2|2014-07-31|2020-06-09|Microsoft Technology Licensing, Llc|Dynamic joint dividers for application windows|
US10592080B2|2014-07-31|2020-03-17|Microsoft Technology Licensing, Llc|Assisted presentation of application windows|
US10254942B2|2014-07-31|2019-04-09|Microsoft Technology Licensing, Llc|Adaptive sizing and positioning of application windows|
US9576172B2|2014-09-16|2017-02-21|Facebook, Inc.|Systems and methods for simultaneously providing and reading machine-readable codes|
CN106662891B|2014-10-30|2019-10-11|微软技术许可有限责任公司|Multi-configuration input equipment|
KR102342555B1|2014-11-10|2021-12-23|엘지전자 주식회사|Mobile terminal and control method thereof|
KR20170043065A|2015-10-12|2017-04-20|삼성전자주식회사|Portable apparatus and method for displaying a screen|
CN105573545A|2015-11-27|2016-05-11|努比亚技术有限公司|Gesture correction method, apparatus and gesture input processing method|
KR20170076357A|2015-12-24|2017-07-04|삼성전자주식회사|User terminal device, and mode conversion method and sound system for controlling volume of speaker thereof|
US10209821B2|2016-04-05|2019-02-19|Google Llc|Computing devices having swiping interfaces and methods of operating the same|
DE102016208575A1|2016-05-19|2017-11-23|Heidelberger Druckmaschinen Ag|Touchpad with gesture control for wallscreen|
JP2018060241A|2016-09-30|2018-04-12|富士通株式会社|Interest level evaluation program, dialogue program, display program, apparatus, and method|
CN111694485A|2017-05-16|2020-09-22|苹果公司|Device, method and graphical user interface for navigating between user interfaces|
US10203866B2|2017-05-16|2019-02-12|Apple Inc.|Devices, methods, and graphical user interfaces for navigating between user interfaces and interacting with control objects|
US11036387B2|2017-05-16|2021-06-15|Apple Inc.|Devices, methods, and graphical user interfaces for navigating between user interfaces and interacting with control objects|
CN108319414A|2018-01-31|2018-07-24|北京小米移动软件有限公司|interface display method and device|
KR102301903B1|2019-12-09|2021-09-14|네이버 주식회사|Method and apparatus for providing contents associated with side bar using user terminal including a plurality of displays|
Legal status:
2020-09-08| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-09-08| B25A| Requested transfer of rights approved|Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC (US) |
2020-10-20| B08F| Application dismissed because of non-payment of annual fees [chapter 8.6 patent gazette]|Free format text: REGARDING THE 9TH ANNUITY. |
2020-12-22| B11B| Dismissal acc. art. 36, par 1 of ipl - no reply within 90 days to fulfil the necessary requirements|
2021-11-03| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
申请号 | 申请日 | 专利标题
US13/118,221|US20120304107A1|2011-05-27|2011-05-27|Edge gesture|
US13/118,221|2011-05-27|
PCT/US2011/055513|WO2012166176A1|2011-05-27|2011-10-09|Edge gesture|