Patent Abstract:
System and methods for enhanced remote control functionality. A handheld device has a housing that has an upper surface and a lower surface, a transmitter, a first user interface associated with the upper surface of the housing that has first user interface input elements which, when active, are actuable to cause the transmitter to transmit communications to an external device, and a second user interface associated with the lower surface of the housing which has second user interface input elements which, when active, are actuable to cause the transmitter to transmit communications to an external device. The handheld device is programmed to selectively make one of the first user interface input elements of the first user interface and the second user interface input elements of the second user interface active while making the other of the first user interface input elements of the first user interface and the second user interface input elements of the second user interface inactive.
Publication number: BR112012028953B1
Application number: R112012028953-5
Filing date: 2011-04-12
Publication date: 2020-09-24
Inventors: Arsham Hatambeiki; Jeffrey Kohanek; Pamela Eichler Keiles; Ramzi Ammari
Applicant: Universal Electronics Inc.
IPC main classification:
Patent description:

[0001] This patent application claims the benefit of U.S. Provisional Patent Application No. 61/333,567, filed on May 11, 2010, the disclosure of which is hereby incorporated by reference in its entirety. BACKGROUND
[0002] Control devices, for example, remote controls, for use in issuing commands to entertainment and other appliances, and the features and functionality provided by such control devices, are well known in the art. Traditionally, user input means on such control devices comprise a series of command function keys, each of which may result in the transmission of a specific device command when activated. Increasingly, in the current environment, such control devices are additionally used to interact with displayed menu systems, browse web pages, manipulate pointers, and perform other similar activities, which may require both navigational input, for example, to scroll information displayed on a screen, move a pointer, zoom in or out, etc., and entry of text data, such as URLs, search terms, messages, etc. Although entering text data and performing navigational operations may be possible using conventional control device input mechanisms, for example, by assigning groups of alphabetic characters to numeric keys in the style of a cell phone keypad and by using up, down, left, and right keys, such methods tend to be tedious, and the user experience can often be improved by providing input mechanisms that are better suited to this type of activity. SUMMARY
[0003] In view of these and other needs, the following describes, in general, a system and method for providing improved navigation, text entry, and command functionality in a control device. To this end, in addition to a conventional key matrix for receiving command function inputs as is well known in the art, a control device may be provided with dedicated navigation input means, such as those described in copending U.S. Patent Applications 12/552,761 and 12/645,037, both of which are incorporated herein by reference in their entirety, an alphanumeric keyboard to facilitate text data entry, and, in some embodiments, accelerometer means for capturing motion and/or orientation. In a preferred embodiment described in this document, the control device may be two-sided, with command function keys and navigation input means located on one surface and the alphanumeric keypad keys located on an opposite surface. Alternative arrangements are, however, possible. To improve usability and prevent inadvertent activation of keys, such a control device can be adapted to selectively enable and disable all or a part of the command function keys, keyboard keys, navigation input devices, or other user input means located on either surface, based on the orientation of the control device, user input, target device status, etc. In certain embodiments that incorporate accelerometer means, the accelerometer can be used not only to detect changes in the orientation of the control device for the purposes described above, but also as an additional source of user input, as will be described in greater detail hereinafter. In addition, in those embodiments that comprise touch-sensitive input means, improved decoding methods can be applied to reduce the likelihood of false input due to proximity of the user's finger or irrelevant touches while reaching for a key location.
[0004] A better understanding of the objects, advantages, features, properties, and relationships of the invention claimed hereinafter will be obtained from the following detailed description and accompanying drawings, which present illustrative embodiments and which are indicative of the various ways in which the principles of the invention may be employed. BRIEF DESCRIPTION OF THE DRAWINGS
[0005] For a better understanding of the various aspects of the invention claimed hereinafter, reference may be made to the preferred embodiments shown in the accompanying drawings, in which:
[0006] Figure 1 illustrates an exemplary system in which an exemplary control device according to the present invention can be used;
[0007] Figure 2 illustrates the exemplary control device of Figure 1 in greater detail;
[0008] Figure 3 illustrates a block diagram of exemplary components of the exemplary control device of Figures 1 and 2;
[0009] Figure 4 illustrates, in flowchart form, an exemplary method for the selection of an active input surface of a multi-surface control device;
[0010] Figure 5 illustrates an exemplary system in which a control device equipped with an accelerometer can be used;
[0011] Figure 6 illustrates, in flowchart form, an exemplary implementation of a deferred message queuing and transfer method; and
[0012] Figure 7 illustrates the use of a touch-sensitive surface of a control device for inputting discrete function commands. DETAILED DESCRIPTION
[0013] Turning now to Figure 1, an exemplary system is illustrated in which a control device 100 is configured to control several controllable devices, such as, for example, a television 102 and a set-top box ("STB") 104. As is known in the art, control device 100 may be capable of transmitting commands to the devices, in connection with user activation of various command function keys 106, using any convenient IR, RF, point-to-point, or networked protocol, to cause the devices to perform operational functions. Although illustrated in the context of a television 102 and STB 104, it should be understood that controllable devices may include, but need not be limited to, televisions, VCRs, DVRs, DVD players, cable or satellite set-top boxes ("STBs"), amplifiers, CD players, game consoles, home lighting, curtains, fans, HVAC systems, thermostats, personal computers, etc. As is known in the art, control device 100 may also include keys 110 for use in configuring the operation of control device 100, for example, changing operating modes, selecting active key sets, etc. In a particular illustrative embodiment, in addition to conventional control functionality as is known in the art, the control device 100 can also include an input area 108 for the generation of navigation commands to be transmitted from the control device 100 to one or more devices in response to user interaction with that area, for example, as described in the previously cited U.S. Patent Applications 12/552,761 and 12/645,037.
[0014] Referring now to Figure 2, in an exemplary embodiment, the control device 100 may further comprise multiple surfaces for receiving user input, for example, a first surface 202 comprising command function keys 106 and navigation input means 108, and a second surface 204 located opposite said first surface and comprising an alphanumeric keyboard 206 arranged, by way of example only, in the conventional "qwerty" manner. In such a two-sided embodiment, several explicit or implicit operating methods can be provided to selectively activate and deactivate the input means on each of these surfaces, and these methods can be used either individually or in conjunction with one another, as will become apparent hereinafter. Furthermore, in certain embodiments, and in particular when infrared signaling is used as the method of communication between the control device 100 and the target devices, such as TV 102 and/or STB 104, multiple transmitter or transceiver means can be provided, for example, as shown at 208 (positioned, by way of example only, on a front of the device) and 210 (positioned, by way of example only, on a side of the device), to ensure reliable communication regardless of the orientation of control device 100.
[0015] With reference to Figure 3, for use in controlling the functional operations of one or more devices, the control device 100 may include, as needed for a particular application, a processor 300 coupled with a ROM memory 304; a RAM memory 302; one or more key matrices 326, 328, as required to support the keys 206, 106, 110 on each of the multiple input surfaces (for example, hard keys and/or soft keys, such as a touch-sensitive surface superimposed on a liquid crystal display (LCD) and/or an electroluminescent (EL) display); scroll and/or navigation function input means 324, such as a capacitive or resistive touch sensor, as required to support the navigation input area 108; one or more transmitter circuit(s) and/or transceiver circuit(s) 308, 310 (for example, IR and/or RF); a non-volatile read/write memory 306; means 322 for providing visual and/or audible feedback to the user (for example, one or more LEDs, an LCD display, a speaker, a piezoelectric buzzer, or the like); a power supply 312; an input/output port 320 such as a serial interface, USB port, modem, or Zigbee, WiFi, or Bluetooth transceiver, etc.; one or more means (not shown) for backlighting areas of touchpad 324 and/or key matrices 326, 328; accelerometer means 318 for use in determining movement and/or orientation of control device 100; and clock and timer logic 314 with associated crystal or resonator 316.
[0016] As will be understood by those skilled in the art, some or all of memories 302, 304, 306 may include executable instructions intended to be executed by processor 300 to control the operation of remote control 100 (collectively, the operating software), as well as data used by the operating software to define the control protocols and command values necessary for use in the transmission of command signals to controllable devices (collectively, the command data). In this way, processor 300 can be programmed to control the various electronic components in remote control 100, for example, to monitor an active key matrix 326 or 328, to cause signal transmission, etc. Non-volatile read/write memory 306, for example, an EEPROM, battery-backed RAM, FLASH, Smart Card, memory card, or the like, can additionally be provided to store configuration data and parameters as needed. Although memory 304 is illustrated and described as a ROM memory, memory 304 may also be made up of readable media, such as ROM, FLASH, EEPROM, or the like. Preferably, memories 304 and 306 are non-volatile or battery-backed so that data need not be reloaded after disconnection of power supply 312 as a result of a battery change, etc. In addition, memories 302, 304, and 306 can take the form of a chip, a hard disk, a magnetic disk, an optical disk, and/or the like. Furthermore, it will be appreciated that some or all of the memory devices can be physically combined (for example, a single FLASH memory can be logically divided into different parts to support the functionality of memories 304 and 306, respectively), and/or can be physically embedded in the same IC chip as microprocessor 300 (a so-called "microcontroller"), and, as such, are shown separately in Figure 3 for clarity purposes only.
[0017] To cause the control device 100 to perform an action, the control device 100 can be adapted to be responsive to events, such as sensed user interaction with the key matrices 326, 328, touchpad 324, etc. In response to an event, appropriate operating software instructions in the program memory can be executed. For example, when a command function key or keyboard key is actuated on the control device 100, the operating software of the control device 100 can extract from the command data stored in memory 302, 304, 306 a command value and control protocol corresponding to the actuated function or alphanumeric value and, where necessary, to a current device mode, and will use the extracted command data to transmit to an intended target device, for example, STB 104, an operating command or alphanumeric value in a format recognizable by that device. It will be appreciated that the operating software can not only cause the transmission of commands and/or data to the devices, but can also perform local operations. Although not limiting, local operations that can be performed by the control device 100 may include displaying information or data, favorite channel setup, macro key setup, function key relocation, etc. Examples of local operations can be found in U.S. Patent Nos. 5,481,256, 5,959,751, and 6,014,092.
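The key-event flow described above (event, lookup of a command value and protocol for the current device mode, transmission) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the table contents and the names `COMMAND_DATA`, `transmit`, and `handle_key` are invented for the example.

```python
# Hypothetical command data table: (device_mode, key) -> (protocol, value).
# Real command data libraries are far larger; these entries are illustrative.
COMMAND_DATA = {
    ("STB", "POWER"): ("NEC", 0x48),
    ("TV", "POWER"): ("RC5", 0x0C),
}

def transmit(protocol: str, value: int) -> str:
    # Stand-in for the IR/RF transmitter circuit; returns a trace string
    # instead of driving hardware.
    return f"{protocol}:{value:#04x}"

def handle_key(device_mode: str, key: str):
    """Extract the command value and protocol corresponding to the actuated
    key and the current device mode, then transmit it in a format
    recognizable by the target device."""
    entry = COMMAND_DATA.get((device_mode, key))
    if entry is None:
        return None  # no command assigned to this key in this mode
    protocol, value = entry
    return transmit(protocol, value)
```

For example, `handle_key("STB", "POWER")` looks up the STB-mode power command and hands it to the transmitter, while an unassigned key yields no transmission.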
[0018] In some embodiments, control device 100 can be of the universal type, that is, equipped with a library comprising a multiplicity of command codes and protocols suitable for the transmission of commands or alphanumeric values to various devices. In such cases, to select the command data sets and protocols to be associated with the specific devices to be controlled (hereinafter referred to as the setup procedure), data may be entered into the control device 100 that serves to identify each intended target device by its make and/or model and/or type. The data can typically be entered by activating those keys that are also used to cause the transmission of commands or alphanumeric values to a device, preferably keys that are labeled with numerals. Such data allows the control device 100 to identify the appropriate command data set in the command data library that is to be used to transmit recognizable commands in formats appropriate for such identified devices. The command data library can represent a plurality of controllable devices of different types and manufacture, a plurality of controllable devices of the same type but different manufacture, a plurality of devices of the same manufacture but different type or model, etc., or any combination thereof as appropriate for a given embodiment. In conventional practice, as is well known in the art, such data used to identify an appropriate protocol and command data set can take the form of a numeric setup code (obtained, for example, from a printed list of manufacturer names and/or models with corresponding code numbers, from a support website, etc.). Alternative setup procedures known in the art include scanning bar codes, sequentially transmitting a predetermined command in different formats until a target device response is detected, interacting with a website culminating in the download of command data and/or setup codes to the control device, etc.
Since such methods for configuring a control device to command the operation of specific appliances are well known, they will not be described in further detail in this document. Nevertheless, for additional information regarding setup procedures, the reader may turn, for example, to U.S. Patents 4,959,810, 5,614,906, or 6,225,938, all of the same assignee and incorporated herein by reference in their entirety. Control surface selection.
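The numeric setup code lookup described above can be reduced to a small sketch: the code entered on the numeral keys selects a command data set from the library. The codes and table below are invented purely for illustration and do not correspond to any real setup code list.

```python
# Hypothetical mapping of numeric setup codes to command data sets.
# Real libraries index thousands of make/model combinations.
CODE_LIBRARY = {
    "0178": {"type": "TV", "protocol": "RC5"},
    "0775": {"type": "STB", "protocol": "NEC"},
}

def configure(setup_code: str):
    """Return the command data set identified by the entered setup code,
    or None if the code is not in the library."""
    return CODE_LIBRARY.get(setup_code)
```

An unknown code returns nothing, which in practice would prompt the user to re-enter a code or fall back to another setup procedure such as a scan.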
[0019] When using the exemplary multi-sided control device presented in this document, it may be desirable that only a single input surface (for example, whichever of command and navigation function surface 202 or alphanumeric keyboard surface 204 is currently facing the user) be enabled, that is, active, for receiving user input at any point in time. In this way, the inadvertent issuing of device commands or data values due to user touches on the other input surface of the control device 100, for example, the surface currently facing away from the user, can be inhibited. To provide a transparent and easy-to-use interface, the operating software of the exemplary control device 100 can deploy some or all of the methods described below to determine which surface should be considered currently active.
[0020] Explicit user action: A user can explicitly interact with one or more keys provided on one or more of the surfaces specifically for the purpose of activating a particular one of the surfaces. For example, in the illustrative control device of Figure 2, the alphanumeric keyboard surface 204 can be activated by the key 214, while the command and navigation function surface 202 can be activated by the key 212. As will be appreciated, the activation of one surface may implicitly deactivate the other surface.
[0021] Alternative arrangements are also possible; for example, in other embodiments only a single key may be provided to toggle between surfaces, or keys to activate either surface may be provided on only one of the surfaces. As will be appreciated, because in such alternative embodiments such a key or keys may need to remain active even when the remaining keys on the corresponding surface have become inactive, it is preferred that such key or keys be located in a position that minimizes inadvertent activation(s) by a user.
[0022] Implicit user action: The detection of repeated user activation of a key located on an inactive surface of the control device can trigger the activation of that surface. For example, a first keypress by a user on an inactive surface can be ignored as inadvertent. However, a second press by the user of the same key on the inactive surface within a short period of time, and without intervening activations of other keys, can be interpreted as a signal to activate that inactive surface. In many cases, such an algorithm can be essentially transparent to the user, since a natural reaction upon pressing a key, for example, a device function command, and obtaining no result is to press the same key a second time.
[0023] Host or target device command: In certain embodiments where the control device 100 is in bidirectional communication with a target device, for example, STB 104, the target device can communicate its current operational status to the control device to cause the automatic activation of a particular surface. For example, a user can use navigation input area 108 on surface 202 to move to and select a text entry field displayed by STB 104 on TV 102. Upon selection of the text entry field, STB 104 can initiate a communication with the operating software of the control device to cause it to automatically deactivate the command and navigation surface 202 and activate the alphanumeric keyboard surface 204.
[0024] Internal detection of control device orientation: In embodiments that include accelerometer means 318, the operating software of the control device 100 can use data derived from the accelerometer to track the orientation of the control device 100 and activate whichever surface is currently facing upwards.
[0025] To maximize ease of use while providing optimal functionality, several of the above methods can be used in conjunction with one another and assigned relative priorities. For example, an explicit user action may always override an accelerometer-derived orientation determination. An exemplary method by which the operating software of a control device can properly provide prioritized support for all four surface activation methods described above will now be presented with reference to the flowchart of Figure 4, it being appreciated, however, that this represents only an illustrative set of surface activation methods and priorities and that many other methods and prioritizations are possible in alternative embodiments.
[0026] With reference to Figure 4, in an exemplary embodiment, requests for surface selection can be prioritized in the order: explicit user input (highest), implicit user input, host/target device command, and accelerometer-derived (lowest). Any time a key activation 400 is detected by the operating software of control device 100 (where "key" in this context is understood to broadly include any physical button, touch screen icon, capacitive touch panel input or navigation gesture, etc., without limitation), at step 402 it can first be determined by the operating software of the control device 100 whether the current status of the control device is "awaiting repeat". As will become apparent hereinafter, this status may exist when the control device 100 is in the midst of assessing whether an implicit user surface selection is to be honored. If the current status is not "awaiting repeat", then, at step 406, it can next be determined whether the activated key represents an explicit user selection of an active surface, for example, one of the keys 212 or 214 in the case of the embodiment illustrated in Figure 2. If so, at step 420, the desired active surface can be immediately selected and any other pending lower-priority requests cleared, after which processing is complete. It will be noted, in this context, that the selection of an input surface of a control device can comprise not only the activation of the keys and/or other input means located on that surface, but also any other action(s) necessary to fully effect the change of input surface. For example, in embodiments that use infrared-based communication between a control device 100 and target devices, this may include the activation of the specific IR transmitter or transceiver means appropriate for the current orientation of the control device, for example, one of 208, 210 in the case of the embodiment illustrated in Figure 2.
In other embodiments, the surface selection can also comprise the transmission of an indication of the change to one or more of the target devices, energizing the backlighting of a newly active surface, altering the header data in subsequent command transmissions to be indicative of the surface of origin, etc., as required for a particular implementation.
[0027] If, however, the activated key is not an explicit surface selection key, at step 408 the operating software of the control device 100 can then determine whether the activated key is located on the currently active surface of the control device. If so, at step 416 the key activation is processed, that is, the function or feature associated with that key is performed. Upon completion of step 416, the operating software of the control device 100 can then determine whether a request for a surface change is pending. If not, processing of the current key activation is complete. If a surface change request is pending, it can be processed at step 420, as previously described.
[0028] If, however, at step 408 it is determined that the activated key is not located on the currently active surface, a determination must then be made whether this is an inadvertent keypress or whether it should be treated as a surface selection by implicit user action. Accordingly, consistent with the method described above, the operating software of the control device 100 may wait for a period of time to ascertain whether the same key is activated again without any other intervening user activity. At step 410, the "awaiting repeat" status can then be set, the current key value saved for future reference, and a timer started, after which the processing of the current key activation is complete.
[0029] Returning now to step 402, if it is determined by the operating software of the control device 100 that the current status of the control device is "awaiting repeat", then, at step 404, the present key value can be compared to the key value that was previously saved. If these do not match, no implicit user surface selection has occurred and, accordingly, the "awaiting repeat" status is cleared at step 405, after which processing continues at step 406, as previously described. If, however, the current key value matches the one previously stored in conjunction with the "awaiting repeat" status, an implicit user surface selection has taken place. Accordingly, at step 412, the running timer is stopped and any other pending lower-priority surface change requests are cleared, after which the current key input and the implicit surface change request are processed at steps 416 and 418, as previously described.
[0030] With reference now to step 426, if the timer that was started at step 410 expires due to no additional user input during the timed interval (the interval can be predetermined in certain embodiments, for example, three seconds, or can be externally adjustable in other embodiments, for example, by a user or by a parameter received from a target device), then no implicit user surface selection is in progress and, accordingly, any "awaiting repeat" status is cleared at step 426, after which processing continues at step 418, as previously described, with a check for any pending lower-priority surface change requests.
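The priority scheme of the Figure 4 flowchart can be sketched as a small arbiter: explicit selection takes effect immediately and clears everything pending; while the device is "awaiting repeat", lower-priority requests are deferred (and any requests they outrank are cleared); otherwise the highest-priority pending request wins. The class and method names are illustrative, and this is a simplified model of the flowchart, not a faithful reimplementation of every step.

```python
# Priority ranks from the exemplary embodiment: explicit user input highest,
# accelerometer-derived lowest.
PRIORITY = {"explicit": 3, "implicit": 2, "host": 1, "accelerometer": 0}

class SurfaceArbiter:
    """Sketch of prioritized surface-selection requests (Figure 4 model)."""

    def __init__(self, active="command"):
        self.active = active       # currently active surface
        self.pending = {}          # source -> requested surface
        self.awaiting_repeat = False

    def request(self, source: str, surface: str):
        if source == "explicit":
            # Explicit selection is honored immediately and clears all
            # pending lower-priority requests (step 420 behavior).
            self.pending.clear()
            self.active = surface
            return
        if self.awaiting_repeat:
            # Defer while the implicit-selection test is in progress;
            # clear any pending requests this one outranks (step 432).
            for s in [k for k in self.pending
                      if PRIORITY[k] < PRIORITY[source]]:
                del self.pending[s]
            self.pending[source] = surface
            return
        self.pending[source] = surface
        self._apply_highest_pending()

    def _apply_highest_pending(self):
        if self.pending:
            best = max(self.pending, key=lambda s: PRIORITY[s])
            self.active = self.pending[best]
            self.pending.clear()
```

When the "awaiting repeat" test resolves, the real device would re-check the pending set (the step 418 check); here that corresponds to calling `_apply_highest_pending()` again.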
[0031] In embodiments of control device 100 that incorporate bidirectional communication with a target host device, a request for activation of a particular control surface can be received from a host device, as illustrated at step 428 of Figure 4. In that case, the operating software of the control device 100 can first determine whether an "awaiting repeat" status currently exists. If not, the host request can be processed immediately at step 420, as previously described. If, however, this status is set, then the operating software can defer any surface activation action requested by the host pending the result of the higher-priority test in progress for an implicit user-initiated change. Accordingly, at step 432, any pending lower-priority surface change requests (that is, accelerometer-initiated, in the illustrative embodiment) are cleared, and, at step 434, the pendency of a host request is recorded for future evaluation at step 418, as previously described, and processing is complete.
[0032] In embodiments of the control device 100 that incorporate accelerometer means, one application of the signals generated by the accelerometer can be the determination of the control device orientation, that is, in the illustrative embodiment, which side is currently facing generally upwards, and the automatic activation of the appropriate control surface. As will be appreciated, each prior explicit or implicit user surface selection or other user activity can be used to establish a new baseline control device orientation, and accelerometer signals accumulated thereafter can be used to dynamically monitor deviation from that baseline orientation. Beginning at step 436, when it is determined by the operating software of the control device 100 that such deviation has reached a point at which a different control surface may need to be activated, at step 438 it is first determined whether a host request is currently pending. If so, since in the illustrative embodiment a host request has priority, the accelerometer-derived request is ignored and processing is complete. If no host request is currently pending, at steps 440 and 442 a host device can be interrogated to determine whether a request is currently outstanding and, if so, processing of that host request continues at step 430, as previously described. Steps 440 and 442 are necessary since, as will be described in greater detail hereinafter, a handheld device, such as the exemplary control device, even if capable of bidirectional communication, may not be in constant real-time contact with a target host device. If, however, no host request is determined to be pending, at step 444 it is then determined by the operating software of control device 100 whether an "awaiting repeat" status currently exists.
If not, the accelerometer request can be processed immediately at step 420, as previously described. If, however, this status is set, then the operating software can defer any requested surface activation action pending the result of the test in progress for an implicit user-initiated change. At step 446, the pendency of an accelerometer-initiated request is recorded for future evaluation at step 418, as described above, and processing is complete.
[0033] It will also be noted that the signals derived from the accelerometer can be used to cause the device to indicate to a user, for example, via a message displayed on a display currently facing upwards, a sound, or the like, that the surface that was most recently made active is no longer currently facing generally upwards. Z-axis navigation.
[0034] In certain embodiments, accelerometer means 318 incorporated in a control device 100 can be used for z-axis navigation in a 3-D environment. For example, when navigating the exemplary 3-D program guide 506 illustrated in Figure 5, the movement of a cursor or highlight horizontally or vertically on a page of the guide data displayed on TV 102 can be controlled by finger movement on a navigation input panel 108 using capacitive sensing, such as that described in the previously cited U.S. Patent Applications 12/552,761 and 12/645,037, or other suitable technology, while navigation along the z-axis, for example, from page to page, can be controlled by movements 504 of the control device 100 along that axis, as shown. For this purpose, the operating software of the control device 100 can analyze the data values reported by the accelerometer 318 to distinguish, for example, between a movement 502 that may be indicative of a need to select a new active surface on the control device, which is processed as previously described, and a movement 504 that is indicative of z-axis navigation, which should result in the transmission of navigation commands or data to TV 102 (or STB 104, in the event STB 104 is responsible for the displayed guide image). As will be appreciated, such z-axis navigation transmissions can take the form of individual step commands, for example, page up or page down (which commands can be sent depending on the speed and/or length of the z-axis movement); they can comprise raw motion data for interpretation by the target device, for example, z-axis movements used to manipulate a 3-D object instead of, or in addition to, z-axis movements provided to effect paging; or they can comprise a combination thereof.
In addition, in embodiments of control device 100 capable of bidirectional communication with a target device, such z-axis navigation input, and the form in which it is transmitted to the target device, can be activated/deactivated and configured under target device control. It will also be noted that the accelerometer-detected movements described in this document are presented as illustrations of specific concepts only, with no intention of limiting the general use and application of accelerometer data in a control device. Deferred transfer.
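The distinction drawn above between a movement 502 (surface change) and a movement 504 (z-axis paging) can be sketched as a simple classifier over accumulated accelerometer data. The thresholds, units, and function name below are invented for illustration; a real implementation would filter and integrate raw accelerometer samples rather than receive clean aggregate values.

```python
def classify_motion(dz_accel: float, tilt_deg: float):
    """Classify accumulated accelerometer data into a hypothetical event.

    dz_accel: net z-axis displacement estimate (arbitrary units, + = away
              from the user); tilt_deg: accumulated rotation from the
              baseline orientation. Both thresholds are assumptions.
    """
    if abs(tilt_deg) > 120.0:
        return "surface_change"   # device has been turned over (movement 502)
    if dz_accel > 1.0:
        return "page_down"        # push away: next guide page (movement 504)
    if dz_accel < -1.0:
        return "page_up"          # pull toward user: previous guide page
    return None                   # below threshold: ignore as noise
```

Depending on the embodiment, the returned event could be mapped to a discrete step command (page up/down) or the raw motion data could be forwarded for interpretation by the target device.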
[0035] In the design of systems that support bidirectional communication with battery-powered and/or portable devices such as the exemplary control device of the present invention, it cannot be assumed that the device is continuously present, energized, and capable of receiving communications. In particular, battery-powered devices of the type contemplated in this document generally enter a low-power quiescent state when not in active use. In some cases, this behavior can be addressed by defining a system with a strict requirement that all interactions begin with the transmission of a signal from the battery-powered device, thereby ensuring that it is not in a quiescent state at that moment. However, in applications such as those contemplated in this document, where asynchronous events on a target or host device may result in a requirement to indicate status to, or request a change in state of, a battery-powered control device, alternative provisions must be made. For example, a device such as TV 102 or STB 104 may asynchronously enter a menu display upon reaching the end of playback of a pre-recorded item, and may need to instruct a multi-surface control device 100 with respect to an appropriate input surface to be activated for user response when the control device 100 next becomes active. Accordingly, a deferred transfer queuing method can be deployed in a device capable of engaging in bidirectional communication with control device 100, such a device hereinafter referred to, for convenience, as a host device.
[0036] As an example, with reference to the flowchart of Figure 6, any internal process 600 active in a host device, for example, a menu system, timer, calendar reminder, program guide application, etc., that requires communicating a message or request to control device 100 may, at step 602, cause the host device's operating software to place the message data into a deferred transfer queue. In some embodiments, the data placed in the queue may include a process ID, that is, an identifier for the requesting process; this ID can be echoed in any subsequent response generated by control device 100, thus ensuring correct transfer of that response to the initiating process on the host.
[0037] When a next communication 604 from control device 100 is received by the exemplary host device, at step 606 the host device's operating software can first inspect the contents of the received message to determine whether it comprises an appliance command intended for that host device. If so, at step 614 the appropriate processes or actions required to perform the indicated function are started on the host device, and processing of the received message is complete. Although not illustrated in the flowchart of Figure 6, it will be appreciated that in embodiments in which the command protocol between control device 100 and the host device includes an acknowledgment by the host device of the receipt and/or execution of commands, such acknowledgment can incorporate an indication to control device 100 of the deferred transfer queue status, i.e., "message pending" or "no message pending". If the received message does not comprise a device command, at step 608 the host device's operating software can then determine whether the contents of the received message comprise data to be transferred to a particular process on the host device. If so, at step 616 the process ID returned with the data is examined and the data transferred to the appropriate originating process, after which processing of the received message is complete. If not, at step 610 the host device's operating software can then determine whether the contents of the received message comprise a retrieval request, that is, a request to transfer an item from the deferred transfer queue. If not, processing of the received message is complete. If, however, a retrieval request has been received, the host device's operating software can then determine whether any items are currently present in the deferred transfer queue. If so, at step 618 the next sequential item in the queue is retrieved and transmitted to control device 100. If not, an "empty queue" status message is transmitted to control device 100, after which processing of the received message is complete. It will be appreciated that, while the illustrative method described above implements a simple first-in, first-out deferred transfer queue, in other embodiments various alternative arrangements, such as prioritization by category or process, timestamp, control device request classification, etc., can be implemented as appropriate.
[0038] Referring now to the control device portion of the flowchart of Figure 6, starting at step 650, control device 100 may from time to time interrogate a host device to retrieve messages from its deferred transfer queue. Such interrogation may be requested by any of several possible control device operating software processes, for example and without limitation: upon entering or leaving a low-power quiescent state; after certain user interactions with control surfaces 202, 204; upon events generated by an accelerometer, such as a request to change the control surface (for example, step 440 of Figure 4) or detection of control device 100 being lifted from a surface on which it was resting; etc. At step 652, a retrieval request can be transmitted to a host device and a timeout started. At step 654, if the timeout expires with no response received from the host device, the host is assumed to be out of range, out of battery, or otherwise unresponsive, and at step 662 a "no data" status message is posted to the initiating process and the interrogation process is complete. If a response is received at step 656 from the target host device, at step 658 it can then be determined whether that response comprises a message generated by the host or an "empty queue" response. The appropriate data is then posted to the initiating process at step 660 or 662, and the interrogation process is complete.
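The device-side interrogation sequence (steps 650 through 662) reduces to a single request with a timeout, classified into one of three outcomes. A minimal sketch, assuming a hypothetical `transport.request()` call and an arbitrary timeout value, neither of which is specified by the document:

```python
# Sketch of the control-device interrogation of steps 650-662 (Figure 6).
# The transport object, its request() signature, and the timeout are assumptions.

NO_DATA, EMPTY_QUEUE = "no data", "empty queue"

def interrogate_host(transport, timeout_s=0.5):
    """Send one retrieval request and classify the outcome for the
    initiating process: a host message, an empty queue, or no response."""
    try:
        response = transport.request({"type": "retrieve"}, timeout=timeout_s)  # step 652
    except TimeoutError:
        # Step 654: host out of range, out of battery, or otherwise unresponsive.
        return NO_DATA                                                         # step 662
    if response.get("status") == "empty queue":                                # step 658
        return EMPTY_QUEUE
    # Host-generated message, posted to the initiating process (step 660).
    return response
```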
[0039] In this way, asynchronously generated commands, parameters, and event data can be reliably transferred from a host device to a portable control device operating under battery power conservation constraints. It will be appreciated that the example above is presented as an illustration of one possible method and protocol that can be used to implement a deferred event queue, and that various modifications can be made to it without departing from the spirit of the invention. For example, although a request/response protocol is used to transfer items from the queue in the embodiment illustrated above, other methods, such as a "push" initiated by the host upon detection of the presence of the control device, periodic repeated broadcast until acknowledgment, etc., can be used in alternative embodiments where appropriate. Adjacent key suppression.
[0040] Referring now to Figure 7, when a capacitive or other touch-sensitive surface 108 is employed as an input area on a control device 100, that area may commonly be placed above the natural grip point 702 of control device 100, preferably positioned so that most user input operations can be performed by the thumb of a user 700 without requiring modification of the grip on the device. Accordingly, the reach 706 of the user's thumb when operating the control device will in general be from the bottom up and, at least in the case of a right-handed user, from right to left, as illustrated.
[0041] In certain embodiments, such as, for example, that described in the aforementioned US Patent Application 12/645,037, which is hereby incorporated by reference in its entirety, such a touch-sensitive area can be used as a keyboard input means, either as its sole function or interchangeably with a mode in which it is used as a gesture-based means of generating navigation or other commands to a target device. When used as a keyboard, the touch-sensitive area can be logically divided into several zones, for example, zones one through nine as shown at 720, with each zone assigned a command key value. Detection of a user touch at any location within a zone can result in activation of the assigned key function, for example, transmission of a command to a target device. A disadvantage of using a touch-sensitive area in this way, however, is that, upon reaching for a "key", a user's thumb may inadvertently trigger functions assigned to other regions of the touch-sensitive area due to a bump or grazing touch along the way. For example, a right-handed user reaching to trigger an exemplary "Menu" function 704, corresponding to zone 1, may inadvertently also trigger the "Previous channel" function 708, corresponding to zone 9, and/or the "OK/Info" function, corresponding to zone 5.
[0042] To mitigate false activation of control functions in the manner described above, in an exemplary embodiment an adjacent key suppression method can be implemented. Advantageously, such a method can assume that a user's grip is, more often than not, in a similar position, based on the particular ergonomics of the control device, and accordingly that the origin of the user's thumb reach will be in the same general area, for example, the southeast quadrant of touch-sensitive area 108 in the example shown in Figure 7. When multiple near-simultaneous input values are detected, they can therefore be accorded decoding priorities based on the assumed origin of the user's reach. For example, referring now to zone layout 720 of Figure 7, in certain embodiments the upper zones can be assigned a higher priority than the lower zones, that is, the priority sequence from highest to lowest may simply be zones 1, 2, 3, 4, 5, 6, 7, 8, 9, in that order. Alternatively, in another illustrative embodiment, decoding priorities can be based on distance from the reach origin, with those farther away always receiving the higher priority, that is, the priority sequence from highest to lowest may be zones 1, 2, 4, 3, 7, 5, 6, 8, 9. In cases where the distances to more than one zone are equal (or generally equal), a combined approach can be used, whereby zones at the same distance are also assigned sub-priorities from top to bottom. As will be appreciated, other prioritizations are also possible, for example and without limitation, in control devices that employ a physical layout and/or ergonomics different from those of the illustrative embodiments presented in this document. In addition, the specific set of priorities to be used can be predetermined, or variable based on the operating mode of control device 100, can be selectable as a user preference, can be under the control of a target host device, etc., or a combination thereof as appropriate for a particular embodiment. It will also be appreciated that the device software may prompt a user to establish a home location during a configuration process (for example, the user may be asked to place a hand over the device as if it were in use), whereby the initially detected location is used to establish such zone priorities.
[0043] When the operating software of control device 100 detects multiple inputs on touch surface 108 within a short time interval, for example, 100 milliseconds or such other time as appropriate (it will be appreciated that this time interval can also be an adjustable parameter), these multiple inputs can be evaluated by the control device's operating software against the particular set of priorities in effect, the lower-priority input values discarded, and only the highest-priority input acted upon, that is, the function or feature associated with that "key" executed. Alternatively, in embodiments such as those described in copending US Patent Application 12/645,037, in which user actuation of a touch input is determined by separate means, such as mechanical displacement of the touch surface, the inputs to be evaluated can be those determined to be present at the moment the actuation means is triggered. As an example, referring again to Figure 7, if the operating software of control device 100 determines that both the "OK/Info" function 710 in zone 5 and the "Menu" function 704 in zone 1 were activated within 100 milliseconds of each other (or such other time period as appropriate), then, according to the exemplary priority schemes shown above, the "OK/Info" function input 710 will be discarded and the "Menu" function 704 executed. It will also be noted that, in some embodiments, if it is determined that the input corresponding to the highest-priority zone has been triggered, the function can be executed immediately, without delaying to check whether additional inputs are present, since those would be discarded in any event.
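The suppression just described amounts to collecting the touches that arrive within the evaluation window and keeping only the highest-priority zone among them. A sketch, assuming the distance-based right-handed priority sequence given above, nine zones, and millisecond timestamps; the function and parameter names are illustrative:

```python
# Distance-based decoding priorities for a right-handed (southeast) reach
# origin, highest priority first, per the sequence given in the text.
RIGHT_HAND_PRIORITY = [1, 2, 4, 3, 7, 5, 6, 8, 9]

def suppress_adjacent_keys(touches, window_ms=100, priority=RIGHT_HAND_PRIORITY):
    """Given (zone, timestamp_ms) touch events, keep only the highest-priority
    zone among those arriving within window_ms of the first touch; all
    lower-priority inputs in the window are discarded."""
    if not touches:
        return None
    touches = sorted(touches, key=lambda t: t[1])
    t0 = touches[0][1]
    near_simultaneous = [zone for zone, t in touches if t - t0 <= window_ms]
    rank = {zone: i for i, zone in enumerate(priority)}  # lower index = higher priority
    return min(near_simultaneous, key=rank.__getitem__)
```

With the example from the text, touches in zone 5 ("OK/Info") and zone 1 ("Menu") within 100 ms of each other resolve to zone 1; a touch arriving outside the window is evaluated on its own.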
[0044] It will be noted that, in some embodiments, provision can be made for adjusting zone priorities to adapt control device 100 to the user's handedness. For example, examining the priority sequence 1, 2, 4, 3, 7, 5, 6, 8, 9 mentioned above with reference to exemplary zone layout 720 will reveal that this sequence assumes a user reach originating from the southeast quadrant, that is, a right-handed user. For a left-handed user, an equivalent priority sequence for a southwest origin can be 3, 2, 6, 1, 9, 5, 4, 8, 7. Selection of a right-handed or left-handed operating mode can be performed during initial configuration of control device 100, for example, by entering a particular key sequence on keyboard 106. In embodiments of control device 100 that include accelerometer means 318, as an alternative to entering a key sequence, a user can be instructed to tap the side of the control device with a thumb or index finger and the resulting accelerometer data evaluated to determine the direction of the tap, for example, 510 or 512 as shown in Figure 5. In addition, control devices that support multiple user configurations, selected either automatically using biometric user detection means, such as those described in US Patents 6,256,019 or 7,236,156, both of which are hereby incorporated by reference in their entirety, or by explicit user input, can support multiple sets of priority parameters to accommodate different usage or handedness preferences.
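The left-handed sequence quoted above is the right-handed one reflected across the vertical axis of the three-by-three zone layout, so it can be computed rather than stored. A sketch, assuming zones are numbered 1 through 9 in row-major order as in layout 720:

```python
def mirror_zone(zone):
    """Reflect a zone of a 3x3 row-major grid (1..9) across its vertical
    axis: the left and right columns swap, the middle column is unchanged."""
    row, col = divmod(zone - 1, 3)
    return row * 3 + (2 - col) + 1

RIGHT_HAND_PRIORITY = [1, 2, 4, 3, 7, 5, 6, 8, 9]   # southeast reach origin
LEFT_HAND_PRIORITY = [mirror_zone(z) for z in RIGHT_HAND_PRIORITY]
# Yields 3, 2, 6, 1, 9, 5, 4, 8, 7 - the southwest-origin sequence given above.
```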
[0045] Although various concepts have been described in detail, it will be appreciated by those skilled in the art that various modifications and alternatives to those concepts could be developed in light of the overall teachings of this description. For example, although described in the exemplary context of a control device such as a remote control, it will be appreciated that the principles and concepts described in this document can be applied to any device that receives input and/or emits communications, such as PDAs, portable computers, phones, game controllers, portable gaming devices, computer accessories, and the like. Furthermore, although described in the context of functional modules and illustrated using block diagram format, it should be understood that, unless stated otherwise, one or more of the described functions and/or features may be integrated in a single physical device and/or software module, or one or more functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary for an enabling understanding of the invention. Rather, the actual implementation of such modules would be well within the routine skill of an engineer, given the disclosure herein of the attributes, functionality, and interrelationship of the various functional modules of the system. Therefore, a person skilled in the art, applying ordinary skill, will be able to practice the invention set forth in the claims without undue experimentation. It will be further appreciated that the particular concepts disclosed are meant to be illustrative only and not limiting as to the scope of the invention, which is to be given the full breadth of the appended claims and any equivalents thereof.
[0046] All patents cited in this document are hereby incorporated by reference in their entirety.
Claims

1. Portable device (100), characterized by comprising: a housing having an upper surface (202) and a lower surface (204) opposite the upper surface (202); at least one transmitter (310) carried by the housing; a first user interface (106, 108) associated with the upper surface (202) of the housing, having first user interface input elements which, when active, are actuable to cause the at least one transmitter to transmit communications to an external device; a second user interface (206) associated with the lower surface (204) of the housing, having second user interface input elements which, when active, are actuable to cause the at least one transmitter to transmit communications to an external device; and a processing device (300) having associated programming to selectively make one of the first user interface input elements of the first user interface (106, 108) and the second user interface input elements of the second user interface (206) active while making the other of the first user interface input elements of the first user interface (106, 108) and the second user interface input elements of the second user interface (206) inactive, wherein the processor (300) and associated programming consider received signals, each having an assigned priority, in selectively making one of the first user interface input elements of the first user interface (106, 108) and the second user interface input elements of the second user interface (206) active while making the other of the first user interface input elements of the first user interface (106, 108) and the second user interface input elements of the second user interface (206) inactive, and wherein the received signals comprise: an explicit user input to activate a user input element provided on one or more of the surfaces specifically to activate a particular one of the first user interface and the second user interface; an implicit user input resulting from repeated user activation of a user input element of an inactive user interface; a command from an external device; and an orientation signal from an orientation sensor of the device.

2. Portable device according to claim 1, characterized in that at least one of the first user interface (106, 108) and the second user interface (206) comprises a touch-sensitive input element.

3. Portable device according to claim 1, characterized in that the second user interface input elements comprise an alphanumeric keyboard and the first user interface input elements comprise a remote control keypad.

4. Portable device according to claim 1, characterized in that the at least one transmitter (310) is caused to transmit command data communications in response to an actuation of the first user interface input elements, and the at least one transmitter (310) is caused to transmit alphanumeric data communications in response to an actuation of the second user interface input elements.

5. Portable device according to claim 1, characterized in that it comprises an orientation sensor (318) for supplying the orientation signal to the processor (300).

6. Portable device according to claim 5, characterized in that the orientation sensor (318) comprises an accelerometer.

7. Portable device according to claim 1, characterized in that at least one of the first user interface (106, 108) and the second user interface (206) comprises an alternate input element (212, 214) for providing a selection signal to the processor (300).

8. Portable device according to claim 1, characterized in that it comprises a receiver (310) for receiving a selection signal from an external device.

9. Portable device according to claim 1, characterized in that the processor (300) and the associated programming consider a repeated activation of a user interface input element of the first user interface (106, 108) and the second user interface (206) within a predetermined period of time.

10. Portable device according to claim 1, characterized in that it comprises a first transmitter (310) and a second transmitter (308) spaced from the first transmitter, wherein an actuation of the first user interface input elements causes the first transmitter (310) to transmit communications to an external device and an actuation of the second user interface input elements causes the second transmitter (308) to transmit communications to an external device.

11. Portable device according to claim 10, characterized in that the first transmitter (310) is associated with a front of the housing and the second transmitter (308) is associated with a side of the housing.

12. Portable device according to claim 1, characterized in that the processor (300) operates to generate a notification signal when the active one of the first user interface input elements and the second user interface input elements is oriented away from a user.

13. Portable device according to claim 1, characterized in that the assigned priority has an order, from highest priority to lowest priority, of: explicit user input, implicit user input, a host/target device command, and an orientation signal from an orientation sensor of the device.