Apparatus and method for detecting an object based on proximity to the input surface, associated ite
Patent abstract:
APPARATUS AND METHOD FOR PROXIMITY-BASED INPUT. According to an exemplary embodiment of the present invention, a method for controlling display operations is provided, comprising: detecting the presence of an object in the vicinity of an input surface (300), detecting at least one item of information associated with the object (310), detecting distance information about the object's distance from the input surface (320), and controlling the selection of additional information associated with the at least one item of information for display based on the distance information (330).
Publication number: BR112013006616B1
Application number: R112013006616-4
Filing date: 2011-09-06
Publication date: 2021-03-16
Inventors: Juha Arrasvuori; Elina Ollila
Applicant: Nokia Technologies Oy
IPC main classification:
Patent description:
FIELD [001] The present invention relates to an apparatus and a method for proximity-based input. BACKGROUND [002] Touch screens are used in many portable electronic devices, such as PDA (Personal Digital Assistant) devices, tabletops, and mobile devices. Touch screens are operated by a pointing device (or stylus) and/or by a finger. Typically, the devices also comprise conventional buttons for certain operations. [003] Lately, so-called hovering user interfaces have been proposed for devices with touch screens. Hovering generally refers to input by an object, such as a finger, in the vicinity of a user interface input surface, but not touching the input surface. Thus, the space above the surface of the touch screen can also be used as an additional means for user input. For example, the user interface can be arranged to react to a hovering finger by popping up a menu. The desired item in the menu can then be selected by touching the screen. People who use mobile devices generally want more versatile and intuitive ways to interact with their devices. SUMMARY [004] Various aspects of examples of the present invention are set out in the claims. [005] In one aspect, an apparatus is provided, comprising at least one processor and at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processor, cause the apparatus to perform at least the following: detect the presence of an object in close proximity to an input surface, detect at least one item of information associated with the object, detect distance information about the object's distance from the input surface, and control the selection of additional information associated with the at least one item of information for display based on the distance information.
[006] According to one aspect, an apparatus is provided, comprising: a proximity detection system with at least one proximity detector for detecting the presence of an input object in the vicinity of an input surface, and a controller operatively connected to the proximity detection system, the controller being configured to: detect the presence of an object in the vicinity of an input surface, detect at least one item of information associated with the object, detect distance information about the object's distance from the input surface, and control the selection of additional information associated with the at least one item of information for display based on the distance information. [007] In accordance with an aspect of the present invention, an apparatus is provided, comprising: means for detecting the presence of an object in the vicinity of an input surface, means for detecting at least one item of information associated with the object, means for detecting distance information about the object's distance from the input surface, and means for controlling the selection of additional information associated with the at least one item of information for display based on the distance information. [008] In accordance with another aspect of the present invention, a user interface is provided for an electronic device comprising a proximity detection system for detecting the presence of an input object in the vicinity of an input surface, wherein the user interface is configured to: detect at least one item of graphical user interface information associated with the currently detected object position, and display additional information associated with the at least one information item, the amount of additional information displayed being adapted depending on the distance of the input object from the input surface.
[009] According to another aspect, a method is provided, comprising: detecting the presence of an object in the vicinity of an input surface, detecting at least one item of information associated with the object, detecting distance information about the object's distance from the input surface, and controlling the selection of additional information associated with the at least one item of information for display based on the distance information. [010] The invention and various embodiments of the invention provide several advantages, which will become apparent from the detailed description below. BRIEF DESCRIPTION OF THE DRAWINGS [011] For a more complete understanding of the example embodiments of the present invention, reference is now made to the following description taken in connection with the accompanying drawings, in which: Figures 1a and 1b illustrate an example user interface interaction and its control; Figure 2 is a simplified block diagram of a side view of an input device according to an exemplary embodiment of the present invention; Figure 3 illustrates a method according to an exemplary embodiment of the present invention; Figures 4a and 4b illustrate an example of a user interface interaction sequence; Figure 5 illustrates a method according to an exemplary embodiment of the present invention; Figure 6 illustrates entities of a social network system; Figure 7 illustrates an example of a user interface; and Figure 8 illustrates an electronic device according to an exemplary embodiment of the present invention. DETAILED DESCRIPTION [012] Figures 1a and 1b illustrate the general principles of display control based on hovering according to some embodiments. Hovering generally refers to the introduction of an input object 1, such as a finger or a pen, into close proximity to, but not in contact with, an input surface, such as the input surface of a touch screen.
The reference "z" in Figure 1 illustrates the distance of object 1 from the surface of the screen, which can also be referred to as the "hovering distance". The x and y directions are generally substantially parallel to the input surface, and the z direction is substantially normal to the input surface. [013] As further illustrated in Figure 1b, the presence of an object in close proximity is detected 10 and indicated to a display controller 20, in order to select display information and control a display. When such a hovering object 1 is detected, the x, y position of object 1 can be determined 12. Based on the position of object 1, at least one associated information item, for example, a view, a window, or at least one GUI icon for a file, folder, contact entry, link or application, can be detected or selected. The distance (z) of object 1 from the input surface can be estimated 14. [014] Additional information 3 related to the currently associated information item 2 can be displayed. Based on the detected distance information 14, controller 20 can control the selection of additional information, associated with the at least one item of information, for display. This is to be understood broadly, as referring to any type of control operation that causes the display of additional information related to the information item of interest to adapt based on changes in the hovering distance. This may involve, for example, adapting the number of related information items for display, controlling the display of additional information, or controlling the level of detail of the information. The information selected for display can be text, graphic elements, animation, video, or any combination of these, for example. Controller 20 can therefore be arranged to control the display of an additional GUI item 3, such as a viewing window, providing additional information related to the displayed information item 2.
For example, when object 1 is brought closer to the surface of the touch screen, more details about the information item can be displayed, for example, by showing a new application view, a window, or additional information in the current view. When the distance of object 1 is detected to increase, less supplementary information 3 can be displayed. This user interface input arrangement provides an easy and intuitive interaction option for obtaining more information about a GUI element of interest. [015] Figure 2 illustrates an apparatus 100 with one or more input and/or output devices according to an exemplary embodiment. Input devices can, for example, be selected from buttons, switches, sliders, keys or keyboards, navigation pads, touch pads, touch screens, and the like. Output devices can be selected from screens, speakers, and indicators, for example. [016] The apparatus comprises a display 110 and a proximity detection system or unit 120 configured to detect when an input object 1, such as a finger or pen, is brought into close proximity to, but not in contact with, an input surface 112. The input surface 112 can be a surface of a touch screen, or of another input device of the apparatus capable of detecting user input.
User input, such as a particular detected gesture, or even the mere introduction of the input object into the hovering area 140, detected at least in part based on the input object not touching the input surface 112, can be referred to as a hovering input. Such a hovering input is associated with at least one function, for instance selecting a UI item or activating a pop-up menu. [018] The apparatus 100 can be a peripheral device, such as a keyboard or mouse, or can be integrated into an electronic device. Examples of electronic devices include any electronic device, such as computers, media players, wireless communication terminal devices, and so on. [019] In some embodiments, a proximity detection system 120 is provided in an apparatus comprising a touch screen. Thus, the display 110 may be a touch screen 110 comprising a plurality of touch detectors 114 for detecting inputs to the touch screen input surface. [020] In some embodiments, the detection system 120 generates a detection field by means of one or more proximity sensors 122. In an example embodiment, a capacitive proximity detection system is applied, whereby the sensors 122 are capacitive sensing nodes. Disturbances caused by one or more input objects 1 in the detection field are monitored, and the presence of one or more objects is detected based on the detected disturbances. A capacitive detection circuit detects changes in the capacitance above the input surface 112. [021] However, it should be noted that the present features are not limited to the application of any particular type of proximity detection. The proximity detection system 120 can be based on infrared proximity detection, optical shadow detection, acoustic emission detection, ultrasound detection, radar technology, or any other appropriate proximity detection technique.
For example, in the case that the proximity detection system 120 is based on infrared detection, the system comprises one or more emitters sending pulses of infrared radiation. One or more detectors are provided for detecting reflections of that light from nearby objects 1. If the system detects reflected light, an input object is considered to be present. [022] The proximity detection system 120 is coupled to a controller 130. The proximity detection system 120 is configured to provide controller 130 with signals when an input object 1 is detected in the hovering area 140. Based on such input signals, commands, selections and other types of actions can be initiated, usually causing visible, audible and/or tactile feedback for the user. Touch inputs to the touch-sensitive detectors 114 can be signaled, via a control circuit, to controller 130 or to another controller. [023] The proximity detection system 120 can also be arranged to provide information on, or for determining, the x, y position of object 1 to controller 130, to allow determination of a target UI item or area of a hovering object 1. [024] The apparatus 100 can be arranged to estimate the distance 150 of the input object 1 from the input surface 112, that is, the hovering distance, which makes it possible to provide the z coordinate of the location of object 1 in relation to the input surface. The distance can be detected by the proximity detection system 120, or on the basis of a signal provided by the proximity detection system 120. In some embodiments, one or more additional devices are applied to assist in estimating the distance. For example, the apparatus can be configured to detect the distance based on information from at least one of a video camera, a light level sensor, or an ultrasound sensor.
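Paragraph [024] mentions, among other aids, an ultrasound sensor for estimating the hovering distance 150. With ultrasonic time-of-flight ranging, the distance is half the echo round-trip time multiplied by the speed of sound. The following is only an illustrative sketch of that arithmetic, not part of the claimed subject matter; the sensor interface and timing values are assumptions:

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

def distance_from_echo(round_trip_s):
    """Distance in metres derived from an ultrasound echo round-trip time in seconds."""
    # The pulse travels to the object and back, so halve the path length.
    return SPEED_OF_SOUND * round_trip_s / 2.0

# A 0.2 ms round trip corresponds to about 3.4 cm of hovering distance,
# well within a hovering area extending a few centimetres above the surface.
d = distance_from_echo(0.0002)
print(round(d * 100, 2), "cm")  # 3.43 cm
```

In practice such an estimate would be fused with the proximity detection system's own signal, as paragraph [024] suggests.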
[025] Depending on the proximity detection technique applied, the size of the apparatus 100 and of the input surface 112, and the intended user interaction, the hovering area 140 can be arranged to extend from the input surface 112 by a distance selected from a few millimeters up to several tens of centimeters, for example. The proximity detection system 120 can be arranged to detect further parts of the user's hand as well, and the system can be arranged to recognize false inputs and prevent further actions. [026] Controller 130 can also be connected to one or more output devices, such as the touch screen display 110. Controller 130 can be configured to control different application views on display 110. Controller 130 can detect touch inputs and hovering inputs as a function of signals coming from the proximity detection system 120 and the touch-sensitive detectors 114. Controller 130 can then control a display function associated with a detected touch input or hovering input. A wide range of functions is available for selection to be associated with an input detected by the touch-sensitive detection system and/or the proximity detection system 120. Controller 130 can be configured to adapt the associations according to the current operating state of the apparatus 100, a user input, or an application running on the apparatus 100, for example. For instance, associations can be application-specific, menu-specific, view-specific and/or context-specific (where context can be defined based on information obtained from the current usage environment of the apparatus 100). [027] Controller 130 can be configured to control the selection of additional information, in relation to the current x, y position of the input object, for display, depending on the current hovering distance 150.
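Paragraph [027] above has the controller select additional information for display depending on the current hovering distance 150. A minimal sketch of one way such a mapping could look, where the limit values (in millimetres) and level names are purely illustrative assumptions, since the description leaves them to the implementation:

```python
# Map a hovering distance to a level of detail for additional information.
# The thresholds and level names below are made up for illustration only.

DETAIL_LEVELS = [
    (10, "full"),      # closer than 10 mm: full preview window
    (30, "summary"),   # closer than 30 mm: short summary
    (60, "title"),     # closer than 60 mm: title only
]

def detail_for_distance(distance_mm):
    """Return the detail level for a hovering distance, or None outside the hovering area."""
    for limit, level in DETAIL_LEVELS:
        if distance_mm < limit:
            return level
    return None  # object beyond the hovering area: show no additional information

print(detail_for_distance(5))   # full
print(detail_for_distance(45))  # title
print(detail_for_distance(80))  # None
```

Crossing one of the limits while hovering would then trigger or adapt the display of additional information, as described for the limit values in the paragraphs that follow.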
[028] In one embodiment, apparatus 100 can be arranged to estimate the distance 150 between the input object and the input surface 112, and the controller 130 can compare the estimated distance to one or more predetermined limit values. Each of the limit values can be associated with a certain level of detail. For example, the display of a first set of information elements, associated with the at least one item of information, can be controlled in response to receiving information indicating that the object is at a first distance from the input surface. The selection and display of a second set of information elements, associated with the at least one item of information, can then be controlled in response to receiving a signal indicating that the object has approached the input surface beyond a certain limit, i.e. is within a second distance from the input surface. [029] Thus, controller 130 can be arranged to monitor changes in the hovering distance. In response to detecting that the hovering distance has crossed a limit, the display of additional information can be triggered or adapted. Parameter/setting data guiding the operation of controller 130 in selecting additional information can be stored in a memory comprised in, or connected to, controller 130. The user can be provided with an option to set the display of additional information while hovering on or off, and/or to adapt one or more settings that affect the selection of additional information, for example, to affect the sensitivity of the user interface to the introduction or movement of hovering objects. [030] It will be appreciated that the example apparatus 100 may include various other elements that are not discussed in detail here. Although apparatus 100 and controller 130 are represented as a single entity, different features may be implemented in one or more physical or logical entities.
There may be further specific functional modules, for example for carrying out one or more of the blocks described in connection with Figure 3 or 5. In an example variation, the proximity detection system 120 and the input surface 112 are placed at a distance from the display 110, for example, on the side or back (in view of the position of the display) of a handheld electronic device. [031] Figure 3 shows a method for controlling display operations according to an exemplary embodiment. The method can be applied as a control algorithm by the controller 130, for example. In block 300, the presence of an object in the vicinity of an input surface is detected. At least one item of information associated with the currently detected position of the object is detected 310, for example, by determining an associated GUI element on the basis of the current x, y position information of object 1. Distance information about the distance of the object from the input surface is detected 320. For example, an estimate of the distance can be obtained by means of a signal from the proximity detection system 120. The selection of additional information, associated with the at least one information item, for display is controlled 330 in accordance with the distance information. Thus, the amount of information displayed to the user can be adapted depending on the hovering distance. [032] It will be appreciated that various modifications and additions can be made within the scope of the process illustrated in the simplified Figure 3. In an example variation of Figure 3, the distance from the object to the input surface can be monitored continuously (320), regardless of the association (310) of the object with any item of information. Some further example embodiments are illustrated below. [033] In one embodiment, with reference also to Figure 3, one or more additional triggers or conditions are applied before the display of additional information 330.
For example, controller 130 may be arranged to monitor the time during which object 1 is maintained over the area (x, y) associated with the information item, and to trigger the display only in response to detecting the object associated with the information item for a predetermined period of time. Thus, it may be possible to obtain additional information related to a GUI item by keeping a finger hovering over the item. By applying such delay conditions, it becomes possible to avoid, or at least reduce, the hover-triggered display of additional information that the user is not really interested in. [034] In some embodiments, the apparatus is configured, in block 330, to activate the display of, and/or adapt the selection of information for, a preview, recommendation, or complementary information window associated with the information item, on the basis of the estimated distance. Such a window can thus be displayed to provide additional information, with varying levels of detail, about the information item of interest to the user. The window can be static or dynamic. The size of the window and the amount of content in the window can be varied according to the currently detected distance. The window can take many forms. For example, instead of traditional windows and menus, the window can be a bubble. The appearance and/or position of the window can be adapted depending on the time elapsed and/or in response to detected user actions, such as the movement of a finger or a gesture. The window, such as a preview, can be arranged in the same display view as the GUI element being hovered over. However, in some cases, the window may be displayed at the bottom of the screen, on a secondary monitor, or as an overlay. For example, if the apparatus 100 is connected to a projector or has a dual screen, the window can be displayed on the secondary display.
However, it should be noted that there are numerous possibilities for arranging the display of additional information triggered by hovering, and only a few examples are illustrated in this document. [035] In some embodiments, as illustrated in the example of Figure 4a, a folder icon 400 is detected to be associated with the input object. For example, a screen with a set of folders can be displayed, each folder containing several files. A finger is detected in the hovering zone over a currently displayed folder icon for a predetermined period of time. A window 410 providing additional information about the files in the folder can thus be displayed. The number of files (among all the files in the folder) displayed in window 410 can be controlled based on the detected distance of the object from the input surface. The user can obtain a more detailed view 420 of the folder contents by moving his finger towards the icon 400. As illustrated in Figure 4b, an additional view 420 can be displayed in response to detecting the finger closer to the icon 400. [036] In some embodiments, one or more other actions on the associated information item are controlled based on the current distance 150 of object 1 from the input surface 112. This can be done in addition to, or instead of, controlling the selection of additional information associated with the information element for display. For example, an application can be activated, a user prompt can be generated, or audible and/or tactile output can be generated. In one embodiment, one or more information elements among a plurality of currently presented information items are selected based on the hovering distance 150. For example, one of the items in the additional view 3, 410, 420 can be focused and selected based on
the distance. Another example is that a GUI element, among a plurality of GUI elements, is selected based on the estimated distance of the input object from the input surface; for example, the appearance of icon 2 in Figure 1a is changed in response to the object further approaching the input surface. In one embodiment, if the information item currently associated with the input object is a media file, a media player is activated to output a preview and/or playback of the media file. [037] There may be other parameters affecting the selection of information items for display. In some embodiments, the choice of information for display is controlled based on the relevance of the available information items. [038] As illustrated in Figure 5, in block 500, relevance information associated with at least one item of information, or with one or more additional information elements associated with the at least one item of information, is detected or determined. The apparatus 100 may be arranged to generate the relevance information itself, based on metadata associated with the information item and available to the apparatus, or to receive the relevance information from other devices. [039] The information for display 510 can be selected according to the relevance information. Thus, it becomes possible to provide more relevant information to the user, who indicates interest by keeping a finger on top of the information element. This provides a more convenient way to search for relevant information elements, for example, among dozens or hundreds of files in one or more folders. [040] For example, referring also to Figures 4a and 4b, the information elements for the additional views 410, 420 can be selected based on the relevance values of each of the available information items. For example, the records with the highest relevance values among the records in the folder represented by the icon 400 can be displayed in the additional views 410, 420.
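The behaviour of Figures 4a and 4b, combined with the relevance-based selection of paragraph [040], can be sketched as below: the folder contents are ranked by a relevance value, and the hovering distance decides how many of them fit in the preview window. The relevance values and the distance-to-count mapping are illustrative assumptions only:

```python
# Select the files shown in the preview window 410/420: rank the folder's
# contents by relevance and let the hovering distance set how many appear.
# The 15 mm limit and the counts 2/5 are made-up illustrative values.

def files_to_preview(files, distance_mm):
    """Return the names of the most relevant files; closer hovering reveals more."""
    ranked = sorted(files, key=lambda f: f["relevance"], reverse=True)
    count = 5 if distance_mm < 15 else 2
    return [f["name"] for f in ranked[:count]]

folder = [
    {"name": "a.txt", "relevance": 0.2},
    {"name": "b.txt", "relevance": 0.9},
    {"name": "c.txt", "relevance": 0.5},
]

print(files_to_preview(folder, 40))  # ['b.txt', 'c.txt']
print(files_to_preview(folder, 10))  # ['b.txt', 'c.txt', 'a.txt']
```

Moving the finger towards the icon 400 thus expands the preview from window 410 to the more detailed view 420 without any touch input.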
[041] Various methods can be applied to determine the relevance information 500 for an information item or for a set of information elements. In one embodiment, the relevance information can be determined based on an analysis of access history, indicating the usage activity of the information element. For example, the apparatus 100 may be arranged to store access history relating to the user's files stored in, or referred to in, the memory of the apparatus 100. In one embodiment, the relevance information can be updated based on hovering inputs. For example, if a user holds his finger for a long period of time over an information item, the relevance of the information item can be increased. [042] Such access history or relevance information can also be maintained with respect to a set of users defined in the contact information stored on the apparatus. For example, the files (or copies/instances of the files) stored in the folder that the user's friends have accessed the most and/or most recently are defined as the most relevant files in the folder. [043] In one embodiment, the relevance of a file is determined by examining the user's social network. With reference to the example of Figure 6, social networks between people, and their use of files and other information, can be stored by an online service on a database server 600, which the apparatus 100 can be arranged to access through one or more networks 62. Relevance information, or information for determining the relevance of friends' use of files, can also be pushed to the client device during idle time, for example, each night. The relevance of a file can be determined based on recommendations received by the server 630 from the devices of users belonging to the associated social network. Thus, socially recommended content can be provided in a view 410, 420 controlled based on the detected hovering distance 150, and a user can easily locate content items currently of interest to his social network.
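Paragraphs [041] through [043] derive relevance from the user's own access history, from friends' accesses, and from sustained hovering over an item. One possible scoring that combines these signals is sketched below; the weights and the one-second hover threshold are assumptions of this sketch, since the description only states which factors may increase relevance, not how they are combined:

```python
# Score an information item's relevance from its access history.
# The weight 2.0 for friends' accesses and the 1.0 s hover threshold
# are arbitrary illustrative choices, not prescribed by the description.

def relevance(access_count, friend_access_count, hover_seconds):
    """Higher score means the item is ranked as more relevant."""
    score = access_count + 2.0 * friend_access_count  # friends' use weighs more
    if hover_seconds > 1.0:                           # sustained hover boosts the item
        score += hover_seconds
    return score

print(relevance(10, 3, 0.0))  # own accesses plus weighted friend accesses
print(relevance(5, 0, 2.5))   # prolonged hovering raises the score
```

Such scores could be recomputed locally, or refreshed from the server 600 during idle time as paragraph [043] describes.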
[044] As yet another example, in the case that a finger is hovering above a folder containing music files, the apparatus 100 can be arranged to detect which music the user's friends have been listening to the most. The icon of that song can be displayed and, in addition, playback of the song can be started. If the file is a video clip, the apparatus can be arranged to show a preview of it. In another example option, the music or video playback can be arranged to start from the section that the user's friend most recently listened to or watched. In an example related to video game applications, the recent high scores of the user's friends can be displayed next to or above the game application icon. In one embodiment, relevance information associated with an information item is controlled (330) for display as additional information, in response to an object hovering over a graphical interface element of the information item; for example, that Stephen accessed the song whose graphical interface element is being hovered over 15 times. [045] In an example embodiment, apparatus 100 is configured to control the selection of items of additional information on the basis of other properties associated with the movement of the input object 1 in the hovering area 140. For example, apparatus 100 is arranged to detect the speed of movement of object 1. The selection of additional information elements can then be controlled 330 based on the speed of movement of object 1. [046] In one embodiment, the apparatus 100 is arranged to support the use of several fingers to provide touch-based and/or hovering-based inputs simultaneously. For example, one finger can be used to select or focus one item of information, and another finger can be used to select or focus another item of information.
The distances of one or more hovering fingers can be estimated, and the amount of additional information for each of the associated information items can be controlled based on the currently detected distance of the associated finger. For example, the distance of one hovering finger can be used to control the selection of a folder from several available folders, and the distance of another hovering finger can be used to select music from within the chosen folder. [047] Figure 7 illustrates an example of using multiple fingers to obtain previews or recommendations of information associated with GUI items. A user may wish to have recommendations for files stored in a folder 700 that contains several subfolders, in this example music albums, each of which in turn contains several music files. In the example of Figure 7, the available information is additionally pre-selected based on the relevance information. The distance of a finger of the left hand can determine which album folders 702, for example the Top 3 most socially relevant, are selected. The distance of a finger of the right hand can determine which music files 704, for example the Top 5 most socially relevant of the album currently chosen with the finger of the left hand, are selected for viewing. [048] It will be appreciated that the selection assignment between the fingers of the left and right hand, the distance levels, and the number of selectable items can be varied in many ways with respect to the embodiment illustrated in Figure 7, as well as in other embodiments. [049] In another example embodiment, controller 130 is arranged to control a specific function in response to detecting a specific combined use of multiple fingers, such as a specific hovering gesture, or a touch-based pinch or reverse-pinch gesture. Such a specific input can additionally be applied to control display operations, for example, in step 330.
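The two-finger scheme of Figure 7, where one hovering distance chooses among folders and the other chooses a file within the chosen folder, can be sketched as below. Quantising 0 to 60 mm of hover distance into list indices, and the album and track names, are illustrative assumptions of this sketch:

```python
# Two hovering fingers: the left finger's distance selects an album,
# the right finger's distance selects a track inside that album.
# The 60 mm hovering range and the sample albums are made up.

def level(distance_mm, n_choices, max_mm=60):
    """Quantise a hovering distance into one of n_choices indices."""
    idx = int(distance_mm / max_mm * n_choices)
    return min(idx, n_choices - 1)  # clamp at the far edge of the hovering area

albums = {
    "Album A": ["A1", "A2", "A3"],
    "Album B": ["B1", "B2"],
}

def select(left_mm, right_mm):
    """Return the (album, track) pair picked by the two finger distances."""
    names = list(albums)
    album = names[level(left_mm, len(names))]
    tracks = albums[album]
    return album, tracks[level(right_mm, len(tracks))]

print(select(10, 50))  # ('Album A', 'A3')
print(select(45, 10))  # ('Album B', 'B1')
```

Pre-filtering each list to the Top 3 or Top 5 most socially relevant entries, as in paragraph [047], would simply replace the full lists with the ranked subsets.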
[050] In an example embodiment, properties of the input object(s) associated with the detected input are determined and applied to control display operations, for example, in step 330. For example, identifying information about the user's hand, such as the fingers and/or wrist, can be determined to allow distinguishing the device owner's input from that of other users. Some example user-side properties include, but are not limited to: left or right hand, skin color, approximate size (or an indication of an adult's or a child's hand), and finger configuration, such as an indication of which fingers are in a pinch posture and how many fingers are applied. However, other detectable descriptors can also be applied, such as a color scheme, for example a shirt sleeve color, the reading of a heat sensor, the user's voice, and so on. For example, controller 130 may be arranged to detect such properties of the input object on the basis of signal analysis from the proximity detection system 120 and/or one or more further sensors, such as one or more video cameras. [051] In one embodiment, apparatus 100 is configured to detect gestures by one or more objects (separately or in combination) in the hovering area 140. For example, a gesture detection functionality is activated in response to detecting 300 the hovering object 1. Changes in the proximity detection field can then be monitored. A gesture is identified based on the detected changes. An action associated with the identified gesture can then be performed. For example, in response to detecting a gesture resembling the movement of a finger double-clicking a mouse button, a GUI element is selected and a further action is performed for the selected GUI element; for example, a new application window with content from a selected URL is displayed. As another example, the apparatus 100 can be configured to detect the movement of an object forming a circle, and in response to move to a subsequent or previous view.
Hover gestures can be applied to further control the currently associated information item or to display additional information about the information item. For example, in the example of Figure 7, a clockwise rotation gesture with the right-hand finger could be associated with fast-forwarding the music being previewed. The hover gesture functionality can be implemented such that a hover gesture is associated with different functions depending on the object's distance from the input surface 112. [052] The features illustrated above can be applied to different applications and application modes. Some examples of applications for which particular advantages are available include browser applications, device management applications, file management applications, media content applications and related services, map applications, applications using augmented reality interactions, mobile social networking applications, and the like. [053] Figure 8 shows a block diagram of the structure of an electronic device 800 according to an example embodiment. The electronic device may comprise the apparatus 100. Although one embodiment of the electronic device 800 is illustrated and will be described below for example purposes, other types of electronic devices, such as, but not limited to, PDAs, pagers, laptop computers, desktop computers, tablet computers, media players, televisions, gaming devices, cameras, video recorders, positioning devices, electronic books, wearable devices, projector devices, and other types of electronic systems, may employ the present embodiments. [054] Furthermore, the apparatus of an example embodiment need not be the entire electronic device, but may be a component or a set of components of the electronic device in other example embodiments.
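The distance-dependent gesture bindings mentioned at the end of [051], where the same hover gesture maps to different functions at different distances from the input surface 112, can be sketched as follows. The distance ranges and action names are illustrative assumptions.

```python
# Hypothetical sketch: the same gesture ('rotate_cw') triggers different
# functions depending on the hover distance band it occurs in.
# Ranges and action names are assumptions for illustration only.

BINDINGS = [
    (0, 25, "rotate_cw", "fast_forward_track"),  # near the surface
    (25, 60, "rotate_cw", "next_album"),         # farther away
]

def dispatch(gesture, distance_mm, bindings=BINDINGS):
    """Return the action bound to this gesture in this distance band."""
    for lo, hi, name, action in bindings:
        if name == gesture and lo <= distance_mm < hi:
            return action
    return None  # no binding for this gesture/distance combination
```

The same table-driven lookup extends naturally to more gestures and more distance bands.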
For example, the apparatus may be in the form of a chipset or some other kind of hardware module for performing at least some of the functions described above, such as the functions of the controller 130 of Figure 2. [055] A processor 802 is configured to execute instructions and to carry out operations associated with the electronic device 800. The processor 802 may comprise means, such as a digital signal processor device, a microprocessor device, and circuitry, for performing various functions including, for example, one or more of the functions described in conjunction with Figures 1a to 7. The processor 802 may control the reception and processing of input and output data between components of the electronic device 800 by using instructions retrieved from memory. The processor 802 can be implemented on a single chip, multiple chips or multiple electrical components. Some examples of architectures that can be used for the processor 802 include a dedicated or embedded processor, and an ASIC. [056] The processor 802 may comprise functionality to operate one or more computer programs. Computer program code may be stored in a memory 804. The at least one memory and the computer program code may be configured to, with the at least one processor, cause the apparatus to perform at least one embodiment including, for example, control of one or more of the functions described in conjunction with Figures 1a to 7. For example, the processor 802 may be arranged to perform at least part of the functions of the controller 130 of Figure 2. Typically, the processor 802 operates together with an operating system to execute computer code and to produce and use data. [057] By way of example, the memory 804 may include a non-volatile portion, such as EEPROM, flash memory or the like, and a volatile portion, such as a random access memory (RAM) including a cache area for temporary storage of data.
Information for controlling the functionality of the processor 802 may also reside on a removable storage medium and be loaded or installed onto the electronic device 800 when needed. [058] The electronic device 800 may comprise an antenna (or multiple antennas) in operable communication with a transceiver unit 806 that comprises a transmitter and a receiver. The electronic device 800 may operate with one or more air interface standards and communication protocols. By way of illustration, the electronic device 800 may operate in accordance with any of a number of first, second, third and/or fourth generation communication protocols or the like. For example, the electronic device 800 may operate in accordance with wireline protocols, such as Ethernet and digital subscriber line (DSL), with second generation (2G) wireless communication protocols, such as the Global System for Mobile Communications (GSM), with third generation (3G) wireless communication protocols, such as 3G protocols by the Third Generation Partnership Project (3GPP), CDMA2000, wideband CDMA (WCDMA) and time division-synchronous CDMA (TD-SCDMA), with fourth generation (4G) wireless communication protocols, such as 3GPP Long Term Evolution (LTE), with wireless local area network protocols, such as 802.11, with short-range wireless protocols, such as Bluetooth, and/or the like. [059] The user interface of the electronic device 800 may comprise an output device 808, such as a loudspeaker, one or more input devices 810, such as a microphone, a keypad or one or more buttons or actuators, and a display device 812. The electronic device 800 may comprise a stereoscopic display capable of displaying stereoscopic presentations of UI items, that is, UI items perceived by the user as three-dimensional (3D) images. [060] The input device 810 may include a touch sensing device configured to receive input from a user's touch and to send this information to the processor 802.
Such a touch sensing device may also be configured to recognize the position and magnitude of touches on a touch-sensitive surface. The touch sensing device may be based on sensing technologies including, but not limited to, capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, inductive sensing, and optical sensing. Furthermore, the touch sensing device may be based on single-point sensing or multipoint sensing. In one embodiment, the input device is a touchscreen, which is positioned in front of the display 812. [061] The electronic device 800 also comprises a proximity detection system 814, such as the system 120 illustrated above, operatively coupled to the processor 802. The proximity detection system 814 is configured to detect when a pen, finger or other pointer is in close proximity to, but not in contact with, some component of the computer system, including, for example, the housing or I/O devices, such as the touchscreen. [062] The electronic device 800 may also comprise further units and elements not illustrated in Figure 8, such as further interface devices, further sensors, such as an accelerometer sensor, a battery, a media capturing element, such as a video and/or audio camera module, a positioning unit, and a user identity module. [063] In some embodiments, further outputs, such as an audible and/or tactile output, may also be produced by the apparatus 100 on the basis of the detected distance. Thus, the processor 802 may be arranged to control a speaker and/or a tactile output actuator, such as a vibration mechanism, in the electronic device 800 to provide such further alerting output. [064] Embodiments of the present invention may be implemented in software, hardware, application logic, or a combination of software, hardware and application logic.
In an example embodiment, the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, one example of a computer being described and depicted in Figure 8. A computer-readable medium may comprise a non-transitory computer-readable storage medium, which may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer. [065] In an example embodiment, circuitry or user interface circuitry configured to provide at least some of the control functions illustrated above may be provided. As used in this application, the term "circuitry" refers to all of the following: (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry), (b) combinations of circuits and software (and/or firmware), such as (as applicable): (i) a combination of processor(s) or (ii) portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions, and (c) circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present. [066] This definition of "circuitry" applies to all uses of this term in this application, including in any claims.
As a further example, as used in this application, the term "circuitry" also covers an implementation of merely a processor (or multiple processors), or a portion of a processor, and its (or their) accompanying software and/or firmware. [067] If desired, at least some of the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined. [068] Although various aspects of the invention are set out in the independent claims, other aspects of the invention comprise other combinations of features from the described embodiments and/or from the dependent claims together with the features of the independent claims, and not solely the combinations explicitly set out in the claims. [069] It is also noted herein that while the above describes example embodiments of the invention, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications that may be made without departing from the scope of the present invention as defined in the appended claims.
Claims (19) [0001] 1. Apparatus configured to: detect (300) the presence of an object (1) in the vicinity of an input surface (112), detect (310) at least one information item associated with the object (1), and detect (320) distance information about the distance of the object (1) from the input surface (112), characterized in that the apparatus is further configured to: identify a first threshold distance, of at least two threshold distances, with which the distance information is associated, in response to identifying the first threshold distance, identify (510) a first level of additional information from available information associated with the at least one information item, wherein the first level of additional information is identified from the available information on the basis of relevance information, and wherein at least a second threshold distance is associated with a second level of additional information, and cause display of the first level of additional information identified on the basis of relevance. [0002] 2. Apparatus according to claim 1, characterized in that the apparatus is configured to activate the display and/or adapt the amount of information in a preview, a recommendation, or a complementary information window associated with the information item on the basis of the distance information. [0003] 3. Apparatus according to claim 1, characterized in that the apparatus is configured to: determine or detect (500) relevance information associated with the at least one information item, or with one or more further information items associated with the at least one information item. [0004] 4. Apparatus according to claim 3, characterized in that the relevance information of the at least one information item is determined on the basis of checking an access history indicating use of the information item by a user of the apparatus and/or by a selected set of further users defined in contact information stored on the apparatus.
[0005] 5. Apparatus according to claim 3, characterized in that the apparatus is configured to: detect an input object (1) above a graphical user interface element associated with a plurality of information items, determine or detect relevance information associated with the plurality of information items, and select, for display, at least one information item associated with the highest relevance among the plurality of information items on the basis of the relevance information. [0006] 6. Apparatus according to claim 1, characterized in that the apparatus is configured to detect the distance on the basis of information from at least one of a video camera, a light level sensor, or an ultrasound sensor. [0007] 7. Apparatus according to claim 1, characterized in that the apparatus is a mobile communication device comprising a touch-sensitive screen (110). [0008] 8. Apparatus according to claim 1, characterized in that the distance information comprises an estimated distance from the object (1) to the input surface (112). [0009] 9. Apparatus according to claim 1, characterized in that it is further configured to: in response to determining that the distance information is associated with the at least second threshold distance, identify a second level of additional information from the available information associated with the at least one information item, wherein the second level of additional information is identified from the available information on the basis of relevance information different from the relevance associated with the first level of additional information, and cause display of the second level of additional information identified on the basis of relevance. [0010] 10. Apparatus according to claim 1, characterized in that it is further configured to: update relevance information for the at least one information item on the basis of the presence of the object (1). [0011] 11. Apparatus according to claim 1, characterized in that the relevance is based on usage history.
[0012] 12. Apparatus according to claim 1, characterized in that the relevance is based on usage history by users in a social network of a user of the apparatus. [0013] 13. Apparatus according to claim 1, characterized in that the first level of additional information comprises contents of a folder associated with the at least one information item. [0014] 14. Method comprising: detecting (300) the presence of an object (1) in the vicinity of an input surface (112), detecting (310) at least one information item associated with the object (1), and detecting (320) distance information about the distance of the object (1) from the input surface, characterized by comprising: identifying a first threshold distance, of at least two threshold distances, with which the distance information is associated, in response to identifying the first threshold distance, identifying (510) a first level of additional information from available information associated with the at least one information item, wherein the first level of additional information is identified from the available information on the basis of relevance information, and wherein at least a second threshold distance is associated with a second level of additional information, and causing display of the first level of additional information identified on the basis of relevance. [0015] 15. Method according to claim 14, characterized in that the display and/or adaptation of the amount of information is controlled in a preview, a recommendation, or a complementary information window associated with the information item on the basis of the distance information. [0016] 16. Method according to claim 14, characterized in that relevance information associated with the at least one information item, or with one or more further information items associated with the at least one information item, is detected or determined, and selection of additional information for display is controlled in accordance with the relevance information. [0017] 17.
Method according to claim 16, characterized in that the relevance information of the at least one information item is determined on the basis of checking an access history indicating use of the information item by a user of the device and/or by a selected set of further users defined in stored contact information. [0018] 18. Method according to claim 14, characterized in that an input object (1) is detected above a graphical user interface element associated with a plurality of information items, relevance information associated with the plurality of information items is determined or detected, and the at least one information item associated with the highest relevance among the plurality of information items is selected for display on the basis of the relevance information. [0019] 19. Method according to claim 14, characterized in that the distance is detected on the basis of information from at least one of a video camera, a light level sensor, or an ultrasound sensor.
Family patents:
Publication number | Publication date US20120068941A1|2012-03-22| WO2012038589A1|2012-03-29| EP2619647A1|2013-07-31| CN103210366B|2016-11-02| CN103210366A|2013-07-17| KR20130058752A|2013-06-04| US8890818B2|2014-11-18| EP2619647B1|2020-10-21| KR101847754B1|2018-04-10| EP2619647A4|2017-04-05| KR20160062147A|2016-06-01| BR112013006616A2|2016-06-21|
Legal status:
2017-08-01 | B25A | Requested transfer of rights approved | Owner name: NOKIA TECHNOLOGIES OY (FI)
2018-12-26 | B06F | Objections, documents and/or translations needed after an examination request according to art. 34 of the industrial property law
2019-10-01 | B06U | Preliminary requirement: requests with searches performed by other patent offices: suspension of the patent application procedure
2020-10-13 | B06A | Notification to applicant to reply to the report for non-patentability or inadequacy of the application according to art. 36 of the industrial property law
2020-12-29 | B09A | Decision: intention to grant
2021-03-16 | B16A | Patent or certificate of addition of invention granted | Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 06/09/2011, SUBJECT TO THE LEGAL CONDITIONS.
Priority:
Application number | Filing date | Patent title US12/887,539|2010-09-22| US12/887,539|US8890818B2|2010-09-22|Apparatus and method for proximity based input| PCT/FI2011/050762|WO2012038589A1|2011-09-06|Apparatus and method for proximity based input|